Dataset columns: diff (string, 41 to 2.03M chars), msg (string, 1 to 1.5k chars), repo (string, 5 to 40 chars), sha (string, 40 chars), time (string, 20 chars)
mmm a / README . md <nl> ppp b / README . md <nl> EOS . IO currently supports the following operating systems : <nl> 1 . [ Getting Started ] ( # gettingstarted ) <nl> 2 . [ Setting up a build / development environment ] ( # setup ) <nl> 1 . [ Automated build script ] ( # autobuild ) <nl> - 1 . [ Clean install Linux ( Amazon , Fedora , & Ubuntu ) for a local testnet ] ( # autoubuntulocal ) <nl> - 2 . [ Clean install Linux ( Amazon , Fedora , & Ubuntu ) for the public testnet ] ( # autoubuntupublic ) <nl> + 1 . [ Clean install Linux ( Amazon , Centos , Fedora , & Ubuntu ) for a local testnet ] ( # autoubuntulocal ) <nl> + 2 . [ Clean install Linux ( Amazon , Centos , Fedora , & Ubuntu ) for the public testnet ] ( # autoubuntupublic ) <nl> 3 . [ MacOS for a local testnet ] ( # automaclocal ) <nl> 4 . [ MacOS for the public testnet ] ( # automacpublic ) <nl> 3 . [ Building EOS and running a node ] ( # runanode ) <nl> EOS . IO currently supports the following operating systems : <nl> 8 . [ Running EOS in Docker ] ( # docker ) <nl> 9 . [ Manual installation of the dependencies ] ( # manualdep ) <nl> 1 . [ Clean install Amazon 2017 . 09 and higher ] ( # manualdepamazon ) <nl> - 2 . [ Clean install Fedora 25 and higher ] ( # manualdepfedora ) <nl> - 3 . [ Clean install Ubuntu 16 . 04 and higher ] ( # manualdepubuntu ) <nl> - 4 . [ Clean install MacOS Sierra 10 . 12 and higher ] ( # manualdepmacos ) <nl> + 2 . [ Clean install Centos 7 and higher ] ( # manualdepcentos ) <nl> + 3 . [ Clean install Fedora 25 and higher ] ( # manualdepfedora ) <nl> + 4 . [ Clean install Ubuntu 16 . 04 and higher ] ( # manualdepubuntu ) <nl> + 5 . [ Clean install MacOS Sierra 10 . 12 and higher ] ( # manualdepmacos ) <nl> <nl> < a name = " gettingstarted " > < / a > <nl> # # Getting Started <nl> The following instructions detail the process of getting the software , building <nl> <nl> Supported Operating Systems : <nl> 1 . Amazon 2017 . 09 and higher . <nl> - 2 . Fedora 25 and higher ( Fedora 27 recommended ) . <nl> - 3 . Ubuntu 16 . 04 and higher ( Ubuntu 16 . 10 recommended ) . <nl> - 4 . MacOS Darwin 10 . 12 and higher ( MacOS 10 . 13 . x recommended ) . <nl> + 2 . Centos 7 and higher . <nl> + 3 . Fedora 25 and higher ( Fedora 27 recommended ) . <nl> + 4 . Ubuntu 16 . 04 and higher ( Ubuntu 16 . 10 recommended ) . <nl> + 5 . MacOS Darwin 10 . 12 and higher ( MacOS 10 . 13 . x recommended ) . <nl> <nl> - For Amazon , Fedora , Ubuntu & MacOS there is an automated build script that can install all dependencies and builds EOS . <nl> + For Amazon , Centos , Fedora , Ubuntu & MacOS there is an automated build script that can install all dependencies and builds EOS . <nl> We are working on supporting other Linux / Unix distributions in future releases . <nl> <nl> Choose whether you will be building for a local testnet or for the public testnet and jump to the appropriate section below . Clone the EOS repository recursively as described and run eosio_build . sh located in the root ` eos ` folder . <nl> Choose whether you will be building for a local testnet or for the public testne <nl> We strongly recommend following the instructions for building the public testnet version for [ Ubuntu ] ( # autoubuntupublic ) or [ Mac OS X ] ( # automacpublic ) . ` master ` is in pieces on the garage floor while we rebuild this hotrod . This notice will be removed when ` master ` is usable again . Your patience is appreciated . 
<nl> <nl> < a name = " autoubuntulocal " > < / a > <nl> - # # # # : no_entry : Clean install Linux ( Amazon , Fedora & Ubuntu ) for a local testnet : no_entry : <nl> + # # # # : no_entry : Clean install Linux ( Amazon , Centos , Fedora & Ubuntu ) for a local testnet : no_entry : <nl> <nl> ` ` ` bash <nl> git clone https : / / github . com / eosio / eos - - recursive <nl> sudo make install <nl> Now you can proceed to the next step - [ Creating and launching a single - node testnet ] ( # singlenode ) <nl> <nl> < a name = " autoubuntupublic " > < / a > <nl> - # # # # Clean install Linux ( Amazon , Fedora & Ubuntu ) for the public testnet <nl> + # # # # Clean install Linux ( Amazon , Centos , Fedora & Ubuntu ) for the public testnet <nl> <nl> ` ` ` bash <nl> git clone https : / / github . com / eosio / eos - - recursive <nl> Dependencies : <nl> * OpenSSL <nl> * LLVM 4 . 0 <nl> * [ secp256k1 - zkp ( Cryptonomex branch ) ] ( https : / / github . com / cryptonomex / secp256k1 - zkp . git ) <nl> - * [ binaryen ] ( https : / / github . com / WebAssembly / binaryen . git ) <nl> <nl> < a name = " manualdepamazon " > < / a > <nl> # # # Clean install Amazon 2017 . 09 and higher <nl> sudo yum install git gcc72 . x86_64 gcc72 - c + + . x86_64 autoconf automake libtool mak <nl> <nl> ` ` ` <nl> <nl> + Install CMake 3 . 10 . 2 : <nl> + <nl> + ` ` ` bash <nl> + cd ~ <nl> + curl - L - O https : / / cmake . org / files / v3 . 10 / cmake - 3 . 10 . 2 . tar . gz <nl> + tar xf cmake - 3 . 10 . 2 . tar . gz <nl> + rm - f cmake - 3 . 10 . 2 . tar . gz <nl> + ln - s cmake - 3 . 10 . 2 / cmake <nl> + cd cmake <nl> + . / bootstrap <nl> + make <nl> + sudo make install <nl> + ` ` ` <nl> + <nl> Install Boost 1 . 66 : <nl> <nl> ` ` ` bash <nl> make - j $ ( nproc ) <nl> sudo make install <nl> ` ` ` <nl> <nl> - To use the WASM compiler , EOS has an external dependency on [ binaryen ] ( https : / / github . com / WebAssembly / binaryen . git ) : <nl> + By default LLVM and clang do not include the WASM build target , so you will have to build it yourself : <nl> <nl> ` ` ` bash <nl> - cd ~ <nl> - git clone https : / / github . com / WebAssembly / binaryen . git <nl> - cd ~ / binaryen <nl> - git checkout tags / 1 . 37 . 14 <nl> - cmake . & & make <nl> + mkdir ~ / wasm - compiler <nl> + cd ~ / wasm - compiler <nl> + git clone - - depth 1 - - single - branch - - branch release_40 https : / / github . com / llvm - mirror / llvm . git <nl> + cd llvm / tools <nl> + git clone - - depth 1 - - single - branch - - branch release_40 https : / / github . com / llvm - mirror / clang . git <nl> + cd . . <nl> + mkdir build <nl> + cd build <nl> + cmake - G " Unix Makefiles " - DCMAKE_INSTALL_PREFIX = . . - DLLVM_TARGETS_TO_BUILD = - DLLVM_EXPERIMENTAL_TARGETS_TO_BUILD = WebAssembly - DCMAKE_BUILD_TYPE = Release . . / <nl> + make - j $ ( nproc ) <nl> + make install <nl> + ` ` ` <nl> + <nl> + Your environment is set up . Now you can < a href = " # runanode " > build EOS and run a node < / a > . <nl> + <nl> + < a name = " manualdepcentos " > < / a > <nl> + # # # Clean install Centos 7 and higher <nl> + <nl> + Install the development toolkit : <nl> + * Installation on Centos requires installing / enabling the Centos Software Collections <nl> + Repository . <nl> + [ Centos SCL ] ( https : / / wiki . centos . 
org / AdditionalResources / Repositories / SCL ) : <nl> + <nl> + ` ` ` bash <nl> + sudo yum - - enablerepo = extras install centos - release - scl <nl> + sudo yum update <nl> + sudo yum install - y devtoolset - 7 <nl> + scl enable devtoolset - 7 bash <nl> + sudo yum install git autoconf automake libtool make bzip2 \ <nl> + bzip2 - devel . x86_64 openssl - devel . x86_64 gmp - devel . x86_64 \ <nl> + ocaml . x86_64 doxygen libicu - devel . x86_64 python27 - devel . x86_64 \ <nl> + gettext - devel . x86_64 <nl> <nl> ` ` ` <nl> <nl> - Add ` BINARYEN_ROOT ` to your . bash_profile : <nl> + Install CMake 3 . 10 . 2 : <nl> <nl> ` ` ` bash <nl> - echo " export BINARYEN_ROOT = ~ / binaryen " > > ~ / . bash_profile <nl> + cd ~ <nl> + curl - L - O https : / / cmake . org / files / v3 . 10 / cmake - 3 . 10 . 2 . tar . gz <nl> + tar xf cmake - 3 . 10 . 2 . tar . gz <nl> + cd cmake - 3 . 10 . 2 <nl> + . / bootstrap <nl> + make - j $ ( nproc ) <nl> + sudo make install <nl> + ` ` ` <nl> + <nl> + Install Boost 1 . 66 : <nl> + <nl> + ` ` ` bash <nl> + cd ~ <nl> + curl - L https : / / dl . bintray . com / boostorg / release / 1 . 66 . 0 / source / boost_1_66_0 . tar . bz2 > boost_1 . 66 . 0 . tar . bz2 <nl> + tar xf boost_1 . 66 . 0 . tar . bz2 <nl> + echo " export BOOST_ROOT = $ HOME / boost_1_66_0 " > > ~ / . bash_profile <nl> source ~ / . bash_profile <nl> + cd boost_1_66_0 / <nl> + . / bootstrap . sh " - - prefix = $ BOOST_ROOT " <nl> + . / b2 install <nl> + ` ` ` <nl> + <nl> + Install [ secp256k1 - zkp ( Cryptonomex branch ) ] ( https : / / github . com / cryptonomex / secp256k1 - zkp . git ) : <nl> + <nl> + ` ` ` bash <nl> + cd ~ <nl> + git clone https : / / github . com / cryptonomex / secp256k1 - zkp . git <nl> + cd secp256k1 - zkp <nl> + . / autogen . sh <nl> + . / configure <nl> + make - j $ ( nproc ) <nl> + sudo make install <nl> ` ` ` <nl> <nl> By default LLVM and clang do not include the WASM build target , so you will have to build it yourself : <nl> Install the development toolkit : <nl> ` ` ` bash <nl> sudo yum update <nl> sudo yum install git gcc . x86_64 gcc - c + + . x86_64 autoconf automake libtool make cmake . x86_64 \ <nl> - bzip2 bzip2 - devel . x86_64 openssl - devel . x86_64 gmp - devel . x86_64 \ <nl> - libstdc + + - devel . x86_64 python3 - devel . x86_64 libedit . x86_64 \ <nl> - ncurses - devel . x86_64 swig . x86_64 gettext - devel . x86_64 <nl> + bzip2 - devel . x86_64 openssl - devel . x86_64 gmp - devel . x86_64 \ <nl> + libstdc + + - devel . x86_64 python3 - devel . x86_64 libedit . x86_64 \ <nl> + ncurses - devel . x86_64 swig . x86_64 gettext - devel . x86_64 <nl> <nl> ` ` ` <nl> <nl> make - j $ ( nproc ) <nl> sudo make install <nl> ` ` ` <nl> <nl> - To use the WASM compiler , EOS has an external dependency on [ binaryen ] ( https : / / github . com / WebAssembly / binaryen . git ) : <nl> - <nl> - ` ` ` bash <nl> - cd ~ <nl> - git clone https : / / github . com / WebAssembly / binaryen . git <nl> - cd ~ / binaryen <nl> - git checkout tags / 1 . 37 . 14 <nl> - cmake . & & make <nl> - <nl> - ` ` ` <nl> - <nl> - Add ` BINARYEN_ROOT ` to your . bash_profile : <nl> - <nl> - ` ` ` bash <nl> - echo " export BINARYEN_ROOT = ~ / binaryen " > > ~ / . bash_profile <nl> - source ~ / . 
bash_profile <nl> - ` ` ` <nl> - <nl> By default LLVM and clang do not include the WASM build target , so you will have to build it yourself : <nl> <nl> ` ` ` bash <nl> make <nl> sudo make install <nl> ` ` ` <nl> <nl> - To use the WASM compiler , EOS has an external dependency on [ binaryen ] ( https : / / github . com / WebAssembly / binaryen . git ) : <nl> - <nl> - ` ` ` bash <nl> - cd ~ <nl> - git clone https : / / github . com / WebAssembly / binaryen . git <nl> - cd ~ / binaryen <nl> - git checkout tags / 1 . 37 . 14 <nl> - cmake . & & make <nl> - <nl> - ` ` ` <nl> - <nl> - Add ` BINARYEN_ROOT ` to your . bash_profile : <nl> - <nl> - ` ` ` bash <nl> - echo " export BINARYEN_ROOT = ~ / binaryen " > > ~ / . bash_profile <nl> - source ~ / . bash_profile <nl> - ` ` ` <nl> - <nl> By default LLVM and clang do not include the WASM build target , so you will have to build it yourself : <nl> <nl> ` ` ` bash <nl> make - j $ ( sysctl - in machdep . cpu . core_count ) <nl> sudo make install <nl> ` ` ` <nl> <nl> - Install [ binaryen v1 . 37 . 14 ] ( https : / / github . com / WebAssembly / binaryen . git ) : <nl> - <nl> - ` ` ` bash <nl> - cd ~ <nl> - git clone https : / / github . com / WebAssembly / binaryen . git <nl> - cd ~ / binaryen <nl> - git checkout tags / 1 . 37 . 14 <nl> - cmake . & & make - j $ ( sysctl - in machdep . cpu . core_count ) <nl> - ` ` ` <nl> - <nl> - Add ` BINARYEN_ROOT ` to your . bash_profile : <nl> - <nl> - ` ` ` bash <nl> - echo " export BINARYEN_ROOT = ~ / binaryen " > > ~ / . bash_profile <nl> - source ~ / . bash_profile <nl> - ` ` ` <nl> - <nl> Build LLVM and clang for WASM : <nl> <nl> ` ` ` bash <nl> make - j $ ( sysctl - in machdep . cpu . core_count ) <nl> make install <nl> ` ` ` <nl> <nl> - Your environment is set up . Now you can < a href = " # runanode " > build EOS and run a node < / a > . <nl> + Your environment is set up . Now you can < a href = " # runanode " > build EOS and run a node < / a > . <nl> \ No newline at end of file <nl> mmm a / eosio_build . sh <nl> ppp b / eosio_build . sh <nl> <nl> # ! / bin / bash <nl> # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - # This is EOS bootstrapper script for Linux and OS X . <nl> + # This is EOS automated install script for Linux and OS X . <nl> # This file was downloaded from https : / / github . com / EOSIO / eos <nl> # <nl> # Copyright ( c ) 2017 , Respective Authors all rights reserved . <nl> <nl> if [ $ ARCH = = " Linux " ] ; then <nl> <nl> if [ ! - e / etc / os - release ] ; then <nl> - printf " EOSIO currently supports Ubuntu , Red Hat & Fedora Linux only . \ n " <nl> + printf " EOSIO currently supports Amazon , Centos , Fedora & Ubuntu Linux only . \ n " <nl> printf " Please install on the latest version of one of these Linux distributions . \ n " <nl> - printf " https : / / www . ubuntu . com / \ n " <nl> + printf " https : / / aws . amazon . com / amazon - linux - ami / \ n " <nl> printf " https : / / start . fedoraproject . org / en / \ n " <nl> + printf " https : / / www . ubuntu . com / \ n " <nl> printf " Exiting now . \ n " <nl> exit 1 <nl> fi <nl> <nl> OS_NAME = $ ( cat / etc / os - release | grep ^ NAME | cut - d ' = ' - f2 | sed ' s / \ " / / gI ' ) <nl> <nl> case $ OS_NAME in <nl> - " Ubuntu " ) <nl> - FILE = $ { WORK_DIR } / scripts / eosio_build_ubuntu . sh <nl> - CXX_COMPILER = clang + + - 4 . 0 <nl> - C_COMPILER = clang - 4 . 
0 <nl> + " Amazon Linux AMI " ) <nl> + FILE = $ { WORK_DIR } / scripts / eosio_build_amazon . sh <nl> + export CMAKE = $ { HOME } / opt / cmake / bin / cmake <nl> + CXX_COMPILER = g + + <nl> + C_COMPILER = gcc <nl> + export LLVM_DIR = $ { HOME } / opt / wasm / lib / cmake / llvm <nl> ; ; <nl> - " Fedora " ) <nl> - FILE = $ { WORK_DIR } / scripts / eosio_build_fedora . sh <nl> + " CentOS Linux " ) <nl> + FILE = $ { WORK_DIR } / scripts / eosio_build_centos . sh <nl> + export CMAKE = $ { HOME } / opt / cmake / bin / cmake <nl> CXX_COMPILER = g + + <nl> C_COMPILER = gcc <nl> + export LLVM_DIR = $ { HOME } / opt / wasm / lib / cmake / llvm <nl> ; ; <nl> - " Amazon Linux AMI " ) <nl> - FILE = $ { WORK_DIR } / scripts / eosio_build_amazon . sh <nl> - CMAKE = $ { HOME } / opt / cmake / bin / cmake <nl> + " Fedora " ) <nl> + FILE = $ { WORK_DIR } / scripts / eosio_build_fedora . sh <nl> CXX_COMPILER = g + + <nl> C_COMPILER = gcc <nl> export LLVM_DIR = $ { HOME } / opt / wasm / lib / cmake / llvm <nl> ; ; <nl> + " Ubuntu " ) <nl> + FILE = $ { WORK_DIR } / scripts / eosio_build_ubuntu . sh <nl> + CXX_COMPILER = clang + + - 4 . 0 <nl> + C_COMPILER = clang - 4 . 0 <nl> + ; ; <nl> * ) <nl> printf " \ n \ tUnsupported Linux Distribution . Exiting now . \ n \ n " <nl> exit 1 <nl> <nl> export BOOST_ROOT = $ { HOME } / opt / boost_1_66_0 <nl> export OPENSSL_ROOT_DIR = / usr / include / openssl <nl> export OPENSSL_LIBRARIES = / usr / include / openssl <nl> - export WASM_ROOT = $ { HOME } / opt / wasm <nl> + export WASM_ROOT = $ { HOME } / opt / wasm <nl> <nl> . $ FILE <nl> <nl> <nl> <nl> printf " \ n \ n > > > > > > > > ALL dependencies sucessfully found or installed . Installing EOS . IO \ n \ n " <nl> <nl> - # Debug flags <nl> COMPILE_EOS = 1 <nl> COMPILE_CONTRACTS = 1 <nl> - <nl> - # Define default arguments . <nl> CMAKE_BUILD_TYPE = RelWithDebugInfo <nl> <nl> - # Create the build dir <nl> cd $ { WORK_DIR } <nl> mkdir - p $ { BUILD_DIR } <nl> cd $ { BUILD_DIR } <nl> <nl> CMAKE = $ ( which cmake ) <nl> fi <nl> <nl> - # Build EOS <nl> $ CMAKE - DCMAKE_BUILD_TYPE = $ { CMAKE_BUILD_TYPE } - DCMAKE_CXX_COMPILER = $ { CXX_COMPILER } \ <nl> - - DCMAKE_C_COMPILER = $ { C_COMPILER } - DWASM_ROOT = $ { WASM_ROOT } \ <nl> - - DOPENSSL_ROOT_DIR = $ { OPENSSL_ROOT_DIR } \ <nl> + - DCMAKE_C_COMPILER = $ { C_COMPILER } - DWASM_ROOT = $ { WASM_ROOT } \ <nl> + - DOPENSSL_ROOT_DIR = $ { OPENSSL_ROOT_DIR } \ <nl> - DOPENSSL_LIBRARIES = $ { OPENSSL_LIBRARIES } . . <nl> + <nl> if [ $ ? - ne 0 ] ; then <nl> printf " \ n \ t > > > > > > > > > > > > > > > > > > > > CMAKE building EOSIO has exited with the above error . \ n \ n " <nl> exit - 1 <nl> deleted file mode 100644 <nl> index 7ecd9f0ead . . 0000000000 <nl> mmm a / scripts / eosio - build_dep <nl> ppp / dev / null <nl> <nl> - automake , http : / / ftp . gnu . org / gnu / automake / automake - 1 . 15 . tar . gz <nl> - libtool , http : / / gnu . askapache . com / libtool / libtool - 2 . 4 . 6 . tar . gz <nl> - openssl , https : / / www . openssl . org / source / openssl - 1 . 0 . 2n . tar . gz <nl> - LLVM , http : / / releases . llvm . org / 5 . 0 . 1 / llvm - 5 . 0 . 1 . src . tar . xz <nl> - wget , https : / / ftp . gnu . org / gnu / wget / wget - 1 . 19 . 2 . tar . gz <nl> - cmake , https : / / cmake . org / files / v3 . 10 / cmake - 3 . 10 . 1 - Darwin - x86_64 . tar . gz <nl> - boost , https : / / dl . bintray . com / boostorg / release / 1 . 66 . 0 / source / boost_1_66_0 . tar . gz <nl> - gmp , https : / / ftp . gnu . org / gnu / gmp / gmp - 6 . 1 . 2 . tar . 
bz2 <nl> - gettext , https : / / ftp . gnu . org / pub / gnu / gettext / gettext - latest . tar . gz <nl> mmm a / scripts / eosio_build_amazon . sh <nl> ppp b / scripts / eosio_build_amazon . sh <nl> <nl> fi <nl> <nl> if [ $ DISK_AVAIL - lt $ DISK_MIN ] ; then <nl> - printf " \ tYou must have at least 100GB of available storage to install EOSIO . \ n " <nl> + printf " \ tYou must have at least $ { DISK_MIN } GB of available storage to install EOSIO . \ n " <nl> printf " \ texiting now . \ n " <nl> exit 1 <nl> fi <nl> new file mode 100644 <nl> index 0000000000 . . 57321ec07c <nl> mmm / dev / null <nl> ppp b / scripts / eosio_build_centos . sh <nl> <nl> + OS_VER = $ ( cat / etc / os - release | grep VERSION_ID | cut - d ' = ' - f2 | sed ' s / [ ^ 0 - 9 \ . ] / / gI ' | cut - d ' . ' - f1 ) <nl> + <nl> + MEM_MEG = $ ( free - m | grep Mem | tr - s ' ' | cut - d \ - f2 ) <nl> + <nl> + CPU_SPEED = $ ( lscpu | grep " MHz " | tr - s ' ' | cut - d \ - f3 | cut - d ' . ' - f1 ) <nl> + CPU_CORE = $ ( lscpu | grep " ^ CPU ( s ) " | tr - s ' ' | cut - d \ - f2 ) <nl> + <nl> + DISK_TOTAL = ` df - h / | grep / dev | tr - s ' ' | cut - d \ - f2 | sed ' s / [ ^ 0 - 9 ] / / ' ` <nl> + DISK_AVAIL = ` df - h / | grep / dev | tr - s ' ' | cut - d \ - f4 | sed ' s / [ ^ 0 - 9 ] / / ' ` <nl> + <nl> + printf " \ n \ tOS name : $ OS_NAME \ n " <nl> + printf " \ tOS Version : $ { OS_VER } \ n " <nl> + printf " \ tCPU speed : $ { CPU_SPEED } Mhz \ n " <nl> + printf " \ tCPU cores : $ CPU_CORE \ n " <nl> + printf " \ tPhysical Memory : $ MEM_MEG Mgb \ n " <nl> + printf " \ tDisk space total : $ { DISK_TOTAL } G \ n " <nl> + printf " \ tDisk space available : $ { DISK_AVAIL } G \ n " <nl> + <nl> + if [ $ MEM_MEG - lt 4000 ] ; then <nl> + echo " Your system must have 4 or more Gigabytes of physical memory installed . " <nl> + echo " exiting now . " <nl> + exit 1 <nl> + fi <nl> + <nl> + if [ $ OS_VER - lt 7 ] ; then <nl> + echo " You must be running Centos 7 or higher to install EOSIO . " <nl> + echo " exiting now " <nl> + exit 1 <nl> + fi <nl> + <nl> + if [ $ DISK_AVAIL - lt $ DISK_MIN ] ; then <nl> + echo " You must have at least $ { DISK_MIN } GB of available storage to install EOSIO . " <nl> + echo " exiting now " <nl> + exit 1 <nl> + fi <nl> + printf " \ n \ tChecking Yum installation \ n " <nl> + <nl> + YUM = $ ( which yum 2 > / dev / null ) <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ n \ tYum must be installed to compile EOS . IO . \ n " <nl> + printf " \ n \ tExiting now . \ n " <nl> + exit 0 <nl> + fi <nl> + <nl> + printf " \ tYum installation found at $ { YUM } . \ n " <nl> + printf " \ n \ tChecking installation of Centos Software Collections Repository . \ n " <nl> + <nl> + SCL = $ ( which scl 2 > / dev / null ) <nl> + if [ - z $ SCL ] ; then <nl> + printf " \ n \ tThe Centos Software Collections Repository and devtoolset - 7 are required to install EOSIO . \ n " <nl> + printf " \ tDo you wish to install and enable this repository and devtoolset package ? \ n " <nl> + select yn in " Yes " " No " ; do <nl> + case $ yn in <nl> + [ Yy ] * ) <nl> + printf " \ n \ n \ tInstalling SCL . \ n \ n " <nl> + sudo yum - y - - enablerepo = extras install centos - release - scl 2 > / dev / null <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ n \ tCentos Software Collections Repository installation failed . \ n " <nl> + printf " \ n \ tExiting now . \ n " <nl> + exit 1 <nl> + else <nl> + printf " \ n \ tCentos Software Collections Repository installed successfully . 
\ n " <nl> + fi <nl> + printf " \ n \ n \ tInstalling devtoolset - 7 . \ n \ n " <nl> + sudo yum install - y devtoolset - 7 2 > / dev / null <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ n \ tCentos devtoolset - 7 installation failed . \ n " <nl> + printf " \ n \ tExiting now . \ n " <nl> + exit 1 <nl> + else <nl> + printf " \ n \ tCentos devtoolset installed successfully . \ n " <nl> + fi <nl> + break ; ; <nl> + [ Nn ] * ) echo " User aborting installation of required Centos Software Collections Repository , Exiting now . " ; exit ; ; <nl> + * ) echo " Please type 1 for yes or 2 for no . " ; ; <nl> + esac <nl> + done <nl> + else <nl> + printf " \ n \ tCentos Software Collections Repository found . \ n " <nl> + fi <nl> + <nl> + printf " \ n \ tEnabling Centos devtoolset - 7 . \ n " <nl> + source / opt / rh / devtoolset - 7 / enable <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ n \ tUnable to enable Centos devtoolset - 7 at this time . \ n " <nl> + printf " \ n \ tExiting now . \ n " <nl> + exit 1 <nl> + fi <nl> + printf " \ n \ tCentos devtoolset - 7 successfully enabled . \ n " <nl> + <nl> + printf " \ n \ tUpdating YUM repository . \ n " <nl> + <nl> + sudo yum - y update 2 > / dev / null <nl> + <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ n \ tYUM update failed . \ n " <nl> + printf " \ n \ tExiting now . \ n " <nl> + exit 1 <nl> + fi <nl> + <nl> + printf " \ n \ tYUM repository successfully updated . \ n " <nl> + <nl> + DEP_ARRAY = ( git autoconf automake libtool ocaml . x86_64 doxygen libicu - devel . x86_64 bzip2 - devel . x86_64 openssl - devel . x86_64 gmp - devel . x86_64 python - devel . x86_64 gettext - devel . x86_64 ) <nl> + DCOUNT = 0 <nl> + COUNT = 1 <nl> + DISPLAY = " " <nl> + DEP = " " <nl> + <nl> + printf " \ n \ tChecking YUM for installed dependencies . \ n \ n " <nl> + <nl> + for ( ( i = 0 ; i < $ { # DEP_ARRAY [ @ ] } ; i + + ) ) ; <nl> + do <nl> + pkg = $ ( sudo $ YUM info $ { DEP_ARRAY [ $ i ] } 2 > / dev / null | grep Repo | tr - s ' ' | cut - d : - f2 | sed ' s / / / g ' ) <nl> + <nl> + if [ " $ pkg " ! = " installed " ] ; then <nl> + DEP = $ DEP " $ { DEP_ARRAY [ $ i ] } " <nl> + DISPLAY = " $ { DISPLAY } $ { COUNT } . $ { DEP_ARRAY [ $ i ] } \ n \ t " <nl> + printf " \ tPackage $ { DEP_ARRAY [ $ i ] } $ { bldred } NOT $ { txtrst } found . \ n " <nl> + let COUNT + + <nl> + let DCOUNT + + <nl> + else <nl> + printf " \ tPackage $ { DEP_ARRAY [ $ i ] } found . \ n " <nl> + continue <nl> + fi <nl> + done <nl> + <nl> + if [ $ { DCOUNT } - ne 0 ] ; then <nl> + printf " \ n \ tThe following dependencies are required to install EOSIO . \ n " <nl> + printf " \ n \ t $ DISPLAY \ n \ n " <nl> + printf " \ tDo you wish to install these dependencies ? \ n " <nl> + select yn in " Yes " " No " ; do <nl> + case $ yn in <nl> + [ Yy ] * ) <nl> + printf " \ n \ n \ tInstalling dependencies \ n \ n " <nl> + sudo yum - y install $ { DEP } <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ n \ tYUM dependency installation failed . \ n " <nl> + printf " \ n \ tExiting now . \ n " <nl> + exit 1 <nl> + else <nl> + printf " \ n \ tYUM dependencies installed successfully . \ n " <nl> + fi <nl> + break ; ; <nl> + [ Nn ] * ) echo " User aborting installation of required dependencies , Exiting now . " ; exit ; ; <nl> + * ) echo " Please type 1 for yes or 2 for no . " ; ; <nl> + esac <nl> + done <nl> + else <nl> + printf " \ n \ tNo required YUM dependencies to install . \ n " <nl> + fi <nl> + <nl> + printf " \ n \ tChecking for CMAKE . \ n " <nl> + # install CMAKE 3 . 10 . 
2 <nl> + if [ ! - e $ { CMAKE } ] ; then <nl> + printf " \ tInstalling CMAKE \ n " <nl> + mkdir - p $ { HOME } / opt / 2 > / dev / null <nl> + cd $ { HOME } / opt <nl> + curl - L - O https : / / cmake . org / files / v3 . 10 / cmake - 3 . 10 . 2 . tar . gz <nl> + tar xf cmake - 3 . 10 . 2 . tar . gz <nl> + rm - f cmake - 3 . 10 . 2 . tar . gz <nl> + ln - s cmake - 3 . 10 . 2 / cmake <nl> + cd cmake <nl> + . / bootstrap <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ tError running bootstrap for CMAKE . \ n " <nl> + printf " \ tExiting now . \ n \ n " <nl> + exit ; <nl> + fi <nl> + make <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ tError compiling CMAKE . \ n " <nl> + printf " \ tExiting now . \ n \ n " <nl> + exit ; <nl> + fi <nl> + else <nl> + printf " \ tCMAKE found \ n " <nl> + fi <nl> + <nl> + printf " \ n \ tChecking for boost libraries \ n " <nl> + if [ ! - d $ { HOME } / opt / boost_1_66_0 ] ; then <nl> + # install boost <nl> + printf " \ tInstalling boost libraries \ n " <nl> + cd $ { TEMP_DIR } <nl> + curl - L https : / / dl . bintray . com / boostorg / release / 1 . 66 . 0 / source / boost_1_66_0 . tar . bz2 > boost_1 . 66 . 0 . tar . bz2 <nl> + tar xf boost_1 . 66 . 0 . tar . bz2 <nl> + cd boost_1_66_0 / <nl> + . / bootstrap . sh " - - prefix = $ BOOST_ROOT " <nl> + . / b2 install <nl> + rm - rf $ { TEMP_DIR } / boost_1_66_0 / <nl> + rm - f $ { TEMP_DIR } / boost_1 . 66 . 0 . tar . bz2 <nl> + else <nl> + printf " \ tBoost 1 . 66 found at $ { HOME } / opt / boost_1_66_0 \ n " <nl> + fi <nl> + <nl> + printf " \ n \ tChecking for secp256k1 - zkp \ n " <nl> + # install secp256k1 - zkp ( Cryptonomex branch ) <nl> + if [ ! - e / usr / local / lib / libsecp256k1 . a ] ; then <nl> + printf " \ tInstalling secp256k1 - zkp ( Cryptonomex branch ) \ n " <nl> + cd $ { TEMP_DIR } <nl> + git clone https : / / github . com / cryptonomex / secp256k1 - zkp . git <nl> + cd secp256k1 - zkp <nl> + . / autogen . sh <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ tError running autogen for secp256k1 - zkp . \ n " <nl> + printf " \ tExiting now . \ n \ n " <nl> + exit ; <nl> + fi <nl> + . / configure <nl> + make <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ tError compiling secp256k1 - zkp . \ n " <nl> + printf " \ tExiting now . \ n \ n " <nl> + exit ; <nl> + fi <nl> + sudo make install <nl> + rm - rf cd $ { TEMP_DIR } / secp256k1 - zkp <nl> + else <nl> + printf " \ tsecp256k1 found \ n " <nl> + fi <nl> + <nl> + printf " \ n \ tChecking for binaryen \ n " <nl> + if [ ! - d $ { HOME } / opt / binaryen ] ; then <nl> + # Install binaryen v1 . 37 . 14 : <nl> + printf " \ tInstalling binaryen v1 . 37 . 14 : \ n " <nl> + cd $ { TEMP_DIR } <nl> + git clone https : / / github . com / EOSIO / binaryen <nl> + cd binaryen <nl> + git checkout eosio <nl> + $ CMAKE . & & make <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ tError compiling binaryen . \ n " <nl> + printf " \ tExiting now . \ n \ n " <nl> + exit ; <nl> + fi <nl> + mkdir - p $ { HOME } / opt / binaryen / 2 > / dev / null <nl> + mv $ { TEMP_DIR } / binaryen / bin $ { HOME } / opt / binaryen / <nl> + rm - rf $ { TEMP_DIR } / binaryen <nl> + else <nl> + printf " \ tBinaryen found at $ { HOME } / opt / binaryen \ n " <nl> + fi <nl> + <nl> + printf " \ n \ tChecking for LLVM with WASM support . \ n " <nl> + if [ ! 
- d $ { HOME } / opt / wasm / bin ] ; then <nl> + # Build LLVM and clang with EXPERIMENTAL WASM support : <nl> + printf " \ tInstalling LLVM & WASM \ n " <nl> + cd $ { TEMP_DIR } <nl> + mkdir llvm - compiler 2 > / dev / null <nl> + cd llvm - compiler <nl> + git clone - - depth 1 - - single - branch - - branch release_40 https : / / github . com / llvm - mirror / llvm . git <nl> + cd llvm / tools <nl> + git clone - - depth 1 - - single - branch - - branch release_40 https : / / github . com / llvm - mirror / clang . git <nl> + cd . . <nl> + mkdir build 2 > / dev / null <nl> + cd build <nl> + $ CMAKE - G " Unix Makefiles " - DCMAKE_INSTALL_PREFIX = $ { HOME } / opt / wasm \ <nl> + - DLLVM_TARGETS_TO_BUILD = " host " - DLLVM_EXPERIMENTAL_TARGETS_TO_BUILD = WebAssembly \ <nl> + - DLLVM_ENABLE_RTTI = 1 - DCMAKE_BUILD_TYPE = Release . . / <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ tError compiling LLVM and clang with EXPERIMENTAL WASM support . \ n " <nl> + printf " \ tExiting now . \ n \ n " <nl> + exit ; <nl> + fi <nl> + make - j $ ( nproc ) <nl> + if [ $ ? - ne 0 ] ; then <nl> + printf " \ tError compiling LLVM and clang with EXPERIMENTAL WASM support . \ n " <nl> + printf " \ tExiting now . \ n \ n " <nl> + exit ; <nl> + fi <nl> + make install <nl> + rm - rf $ { TEMP_DIR } / llvm - compiler 2 > / dev / null <nl> + else <nl> + printf " \ tWASM found at $ { HOME } / opt / wasm \ n " <nl> + fi <nl> \ No newline at end of file <nl> mmm a / scripts / eosio_build_darwin . sh <nl> ppp b / scripts / eosio_build_darwin . sh <nl> <nl> fi <nl> <nl> if [ $ DISK_AVAIL - lt $ DISK_MIN ] ; then <nl> - echo " You must have at least 100GB of available storage to install EOSIO . " <nl> + echo " You must have at least $ { DISK_MIN } GB of available storage to install EOSIO . " <nl> echo " Exiting now . " <nl> exit 1 <nl> fi <nl> mmm a / scripts / eosio_build_fedora . sh <nl> ppp b / scripts / eosio_build_fedora . sh <nl> <nl> fi <nl> <nl> if [ $ DISK_AVAIL - lt $ DISK_MIN ] ; then <nl> - printf " \ tYou must have at least 100GB of available storage to install EOSIO . \ n " <nl> + printf " \ tYou must have at least $ { DISK_MIN } GB of available storage to install EOSIO . \ n " <nl> printf " \ tExiting now . \ n " <nl> exit 1 <nl> fi <nl> mmm a / scripts / eosio_build_ubuntu . sh <nl> ppp b / scripts / eosio_build_ubuntu . sh <nl> <nl> fi <nl> <nl> if [ $ DISK_AVAIL - lt $ DISK_MIN ] ; then <nl> - printf " \ tYou must have at least 100GB of available storage to install EOSIO . \ n " <nl> + printf " \ tYou must have at least $ { DISK_MIN } GB of available storage to install EOSIO . \ n " <nl> printf " \ tExiting now . \ n " <nl> exit 1 <nl> fi <nl>
msg: Merge pull request from pacificcode/eosio_build_centos
repo: EOSIO/eos
sha: fb77459a66455613e1299eab59b1a7720a994c34
time: 2018-03-05T14:41:58Z
new file mode 100644 <nl> index 000000000000 . . 972bb2edb099 <nl> mmm / dev / null <nl> ppp b / native_mate / LICENSE . chromium <nl> <nl> + / / Copyright 2014 The Chromium Authors . All rights reserved . <nl> + / / <nl> + / / Redistribution and use in source and binary forms , with or without <nl> + / / modification , are permitted provided that the following conditions are <nl> + / / met : <nl> + / / <nl> + / / * Redistributions of source code must retain the above copyright <nl> + / / notice , this list of conditions and the following disclaimer . <nl> + / / * Redistributions in binary form must reproduce the above <nl> + / / copyright notice , this list of conditions and the following disclaimer <nl> + / / in the documentation and / or other materials provided with the <nl> + / / distribution . <nl> + / / * Neither the name of Google Inc . nor the names of its <nl> + / / contributors may be used to endorse or promote products derived from <nl> + / / this software without specific prior written permission . <nl> + / / <nl> + / / THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS <nl> + / / " AS IS " AND ANY EXPRESS OR IMPLIED WARRANTIES , INCLUDING , BUT NOT <nl> + / / LIMITED TO , THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR <nl> + / / A PARTICULAR PURPOSE ARE DISCLAIMED . IN NO EVENT SHALL THE COPYRIGHT <nl> + / / OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT , INDIRECT , INCIDENTAL , <nl> + / / SPECIAL , EXEMPLARY , OR CONSEQUENTIAL DAMAGES ( INCLUDING , BUT NOT <nl> + / / LIMITED TO , PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES ; LOSS OF USE , <nl> + / / DATA , OR PROFITS ; OR BUSINESS INTERRUPTION ) HOWEVER CAUSED AND ON ANY <nl> + / / THEORY OF LIABILITY , WHETHER IN CONTRACT , STRICT LIABILITY , OR TORT <nl> + / / ( INCLUDING NEGLIGENCE OR OTHERWISE ) ARISING IN ANY WAY OUT OF THE USE <nl> + / / OF THIS SOFTWARE , EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE . <nl> new file mode 100644 <nl> index 000000000000 . . e5d056bfed9f <nl> mmm / dev / null <nl> ppp b / native_mate / README . md <nl> <nl> + > A fork of Chromium ' s [ gin library ] [ chromium - gin - lib ] that makes it easier to <nl> + > marshal types between C + + and JavaScript . <nl> + <nl> + # Overview <nl> + <nl> + ` native - mate ` was forked from ` gin ` so that it could be used in <nl> + [ Electron ] [ electron ] without conflicting with Node ' s Environment . It has also <nl> + been extended to allow Electron to create classes in JavaScript . <nl> + <nl> + With the help of Chromium ' s ` base ` library , ` native - mate ` makes writing JS <nl> + bindings very easy , and most of the intricate details of converting V8 types <nl> + to C + + types and back are taken care of auto - magically . In most cases there ' s <nl> + no need to use the raw V8 API to implement an API binding . <nl> + <nl> + For example , here ' s an API binding that doesn ' t use ` native - mate ` : <nl> + <nl> + ` ` ` c + + <nl> + / / static <nl> + void Shell : : OpenItem ( const v8 : : FunctionCallbackInfo < v8 : : Value > & args ) { <nl> + base : : FilePath file_path ; <nl> + if ( ! 
FromV8Arguments ( args , & file_path ) ) <nl> + return node : : ThrowTypeError ( " Bad argument " ) ; <nl> + <nl> + platform_util : : OpenItem ( file_path ) ; <nl> + } <nl> + <nl> + / / static <nl> + void Shell : : Initialize ( v8 : : Handle < v8 : : Object > target ) { <nl> + NODE_SET_METHOD ( target , " openItem " , OpenItem ) ; <nl> + } <nl> + ` ` ` <nl> + <nl> + And here ' s the same API binding using ` native - mate ` : <nl> + <nl> + ` ` ` c + + <nl> + void Initialize ( v8 : : Handle < v8 : : Object > exports ) { <nl> + mate : : Dictionary dict ( v8 : : Isolate : : GetCurrent ( ) , exports ) ; <nl> + dict . SetMethod ( " openItem " , & platform_util : : OpenItem ) ; <nl> + } <nl> + ` ` ` <nl> + <nl> + # Code Structure <nl> + <nl> + * ` converter . h ` - Templatized JS < - > C + + conversion routines for many common C + + <nl> + types . You can define your own by specializing ` Converter ` . <nl> + * ` function_template . h ` - Create JavaScript functions that dispatch to any C + + <nl> + function , member function pointer , or ` base : : Callback ` . <nl> + * ` object_template_builder . h ` - A handy utility for creation of ` v8 : : ObjectTemplate ` . <nl> + * ` wrappable . h ` - Base class for C + + classes that want to be owned by the V8 GC . <nl> + Wrappable objects are automatically deleted when GC discovers that nothing in <nl> + the V8 heap refers to them . This is also an easy way to expose C + + objects to <nl> + JavaScript . <nl> + <nl> + <nl> + [ chromium - gin - lib ] : https : / / code . google . com / p / chromium / codesearch # chromium / src / gin / README . md & sq = package : chromium <nl> + [ electron ] : http : / / electron . atom . io / <nl> new file mode 100644 <nl> index 000000000000 . . a23e6523cac1 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / arguments . cc <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # include " native_mate / arguments . h " <nl> + <nl> + # include " base / strings / stringprintf . h " <nl> + # include " native_mate / converter . h " <nl> + <nl> + namespace mate { <nl> + <nl> + namespace { <nl> + <nl> + std : : string V8TypeAsString ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Value > value ) { <nl> + if ( value . IsEmpty ( ) ) <nl> + return " < empty handle > " ; <nl> + v8 : : MaybeLocal < v8 : : String > details = <nl> + value - > ToDetailString ( isolate - > GetCurrentContext ( ) ) ; <nl> + std : : string result ; <nl> + if ( ! details . IsEmpty ( ) ) <nl> + ConvertFromV8 ( isolate , details . ToLocalChecked ( ) , & result ) ; <nl> + return result ; <nl> + } <nl> + <nl> + } / / namespace <nl> + <nl> + Arguments : : Arguments ( ) <nl> + : isolate_ ( NULL ) , <nl> + info_ ( NULL ) , <nl> + next_ ( 0 ) , <nl> + insufficient_arguments_ ( false ) { <nl> + } <nl> + <nl> + Arguments : : Arguments ( const v8 : : FunctionCallbackInfo < v8 : : Value > & info ) <nl> + : isolate_ ( info . 
GetIsolate ( ) ) , <nl> + info_ ( & info ) , <nl> + next_ ( 0 ) , <nl> + insufficient_arguments_ ( false ) { <nl> + } <nl> + <nl> + Arguments : : ~ Arguments ( ) { <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Value > Arguments : : PeekNext ( ) const { <nl> + if ( next_ > = info_ - > Length ( ) ) <nl> + return v8 : : Local < v8 : : Value > ( ) ; <nl> + return ( * info_ ) [ next_ ] ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Value > Arguments : : ThrowError ( ) const { <nl> + if ( insufficient_arguments_ ) <nl> + return ThrowTypeError ( " Insufficient number of arguments . " ) ; <nl> + <nl> + return ThrowTypeError ( base : : StringPrintf ( <nl> + " Error processing argument at index % d , conversion failure from % s " , <nl> + next_ , V8TypeAsString ( isolate_ , ( * info_ ) [ next_ ] ) . c_str ( ) ) ) ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Value > Arguments : : ThrowError ( const std : : string & message ) const { <nl> + isolate_ - > ThrowException ( v8 : : Exception : : Error ( <nl> + StringToV8 ( isolate_ , message ) ) ) ; <nl> + return v8 : : Undefined ( isolate_ ) ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Value > Arguments : : ThrowTypeError ( <nl> + const std : : string & message ) const { <nl> + isolate_ - > ThrowException ( v8 : : Exception : : TypeError ( <nl> + StringToV8 ( isolate_ , message ) ) ) ; <nl> + return v8 : : Undefined ( isolate_ ) ; <nl> + } <nl> + <nl> + } / / namespace mate <nl> new file mode 100644 <nl> index 000000000000 . . 9198f289d6e1 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / arguments . h <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # ifndef NATIVE_MATE_ARGUMENTS_H_ <nl> + # define NATIVE_MATE_ARGUMENTS_H_ <nl> + <nl> + # include " base / macros . h " <nl> + # include " native_mate / converter . h " <nl> + <nl> + namespace mate { <nl> + <nl> + / / Arguments is a wrapper around v8 : : FunctionCallbackInfo that integrates <nl> + / / with Converter to make it easier to marshall arguments and return values <nl> + / / between V8 and C + + . 
<nl> + class Arguments { <nl> + public : <nl> + Arguments ( ) ; <nl> + explicit Arguments ( const v8 : : FunctionCallbackInfo < v8 : : Value > & info ) ; <nl> + ~ Arguments ( ) ; <nl> + <nl> + v8 : : Local < v8 : : Object > GetHolder ( ) const { <nl> + return info_ - > Holder ( ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool GetHolder ( T * out ) { <nl> + return ConvertFromV8 ( isolate_ , info_ - > Holder ( ) , out ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool GetData ( T * out ) { <nl> + return ConvertFromV8 ( isolate_ , info_ - > Data ( ) , out ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool GetNext ( T * out ) { <nl> + if ( next_ > = info_ - > Length ( ) ) { <nl> + insufficient_arguments_ = true ; <nl> + return false ; <nl> + } <nl> + v8 : : Local < v8 : : Value > val = ( * info_ ) [ next_ ] ; <nl> + bool success = ConvertFromV8 ( isolate_ , val , out ) ; <nl> + if ( success ) <nl> + next_ + + ; <nl> + return success ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool GetRemaining ( std : : vector < T > * out ) { <nl> + if ( next_ > = info_ - > Length ( ) ) { <nl> + insufficient_arguments_ = true ; <nl> + return false ; <nl> + } <nl> + int remaining = info_ - > Length ( ) - next_ ; <nl> + out - > resize ( remaining ) ; <nl> + for ( int i = 0 ; i < remaining ; + + i ) { <nl> + v8 : : Local < v8 : : Value > val = ( * info_ ) [ next_ + + ] ; <nl> + if ( ! ConvertFromV8 ( isolate_ , val , & out - > at ( i ) ) ) <nl> + return false ; <nl> + } <nl> + return true ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Object > GetThis ( ) { <nl> + return info_ - > This ( ) ; <nl> + } <nl> + <nl> + bool IsConstructCall ( ) const { <nl> + return info_ - > IsConstructCall ( ) ; <nl> + } <nl> + <nl> + int Length ( ) const { <nl> + return info_ - > Length ( ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + void Return ( T val ) { <nl> + info_ - > GetReturnValue ( ) . Set ( ConvertToV8 ( isolate_ , val ) ) ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Value > PeekNext ( ) const ; <nl> + <nl> + v8 : : Local < v8 : : Value > ThrowError ( ) const ; <nl> + v8 : : Local < v8 : : Value > ThrowError ( const std : : string & message ) const ; <nl> + v8 : : Local < v8 : : Value > ThrowTypeError ( const std : : string & message ) const ; <nl> + <nl> + v8 : : Isolate * isolate ( ) const { return isolate_ ; } <nl> + <nl> + private : <nl> + v8 : : Isolate * isolate_ ; <nl> + const v8 : : FunctionCallbackInfo < v8 : : Value > * info_ ; <nl> + int next_ ; <nl> + bool insufficient_arguments_ ; <nl> + } ; <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_ARGUMENTS_H_ <nl> new file mode 100644 <nl> index 000000000000 . . 8f165e629d88 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / constructor . h <nl> <nl> + / / This file was GENERATED by command : <nl> + / / pump . py constructor . h . pump <nl> + / / DO NOT EDIT BY HAND ! ! ! <nl> + <nl> + / / Copyright 2014 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # ifndef NATIVE_MATE_WRAPPABLE_CLASS_H_ <nl> + # define NATIVE_MATE_WRAPPABLE_CLASS_H_ <nl> + <nl> + # include " base / bind . h " <nl> + # include " native_mate / function_template . h " <nl> + <nl> + namespace mate { <nl> + <nl> + namespace internal { <nl> + <nl> + / / This set of templates invokes a base : : Callback by converting the Arguments <nl> + / / into native types . 
It relies on the function_template . h to provide helper <nl> + / / templates . <nl> + inline WrappableBase * InvokeFactory ( <nl> + Arguments * args , <nl> + const base : : Callback < WrappableBase * ( ) > & callback ) { <nl> + return callback . Run ( ) ; <nl> + } ; <nl> + <nl> + template < typename P1 > <nl> + inline WrappableBase * InvokeFactory ( <nl> + Arguments * args , <nl> + const base : : Callback < WrappableBase * ( P1 ) > & callback ) { <nl> + typename CallbackParamTraits < P1 > : : LocalType a1 ; <nl> + if ( ! GetNextArgument ( args , 0 , true , & a1 ) ) <nl> + return nullptr ; <nl> + return callback . Run ( a1 ) ; <nl> + } ; <nl> + <nl> + template < typename P1 , typename P2 > <nl> + inline WrappableBase * InvokeFactory ( <nl> + Arguments * args , <nl> + const base : : Callback < WrappableBase * ( P1 , P2 ) > & callback ) { <nl> + typename CallbackParamTraits < P1 > : : LocalType a1 ; <nl> + typename CallbackParamTraits < P2 > : : LocalType a2 ; <nl> + if ( ! GetNextArgument ( args , 0 , true , & a1 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a2 ) ) <nl> + return nullptr ; <nl> + return callback . Run ( a1 , a2 ) ; <nl> + } ; <nl> + <nl> + template < typename P1 , typename P2 , typename P3 > <nl> + inline WrappableBase * InvokeFactory ( <nl> + Arguments * args , <nl> + const base : : Callback < WrappableBase * ( P1 , P2 , P3 ) > & callback ) { <nl> + typename CallbackParamTraits < P1 > : : LocalType a1 ; <nl> + typename CallbackParamTraits < P2 > : : LocalType a2 ; <nl> + typename CallbackParamTraits < P3 > : : LocalType a3 ; <nl> + if ( ! GetNextArgument ( args , 0 , true , & a1 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a2 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a3 ) ) <nl> + return nullptr ; <nl> + return callback . Run ( a1 , a2 , a3 ) ; <nl> + } ; <nl> + <nl> + template < typename P1 , typename P2 , typename P3 , typename P4 > <nl> + inline WrappableBase * InvokeFactory ( <nl> + Arguments * args , <nl> + const base : : Callback < WrappableBase * ( P1 , P2 , P3 , P4 ) > & callback ) { <nl> + typename CallbackParamTraits < P1 > : : LocalType a1 ; <nl> + typename CallbackParamTraits < P2 > : : LocalType a2 ; <nl> + typename CallbackParamTraits < P3 > : : LocalType a3 ; <nl> + typename CallbackParamTraits < P4 > : : LocalType a4 ; <nl> + if ( ! GetNextArgument ( args , 0 , true , & a1 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a2 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a3 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a4 ) ) <nl> + return nullptr ; <nl> + return callback . Run ( a1 , a2 , a3 , a4 ) ; <nl> + } ; <nl> + <nl> + template < typename P1 , typename P2 , typename P3 , typename P4 , typename P5 > <nl> + inline WrappableBase * InvokeFactory ( <nl> + Arguments * args , <nl> + const base : : Callback < WrappableBase * ( P1 , P2 , P3 , P4 , P5 ) > & callback ) { <nl> + typename CallbackParamTraits < P1 > : : LocalType a1 ; <nl> + typename CallbackParamTraits < P2 > : : LocalType a2 ; <nl> + typename CallbackParamTraits < P3 > : : LocalType a3 ; <nl> + typename CallbackParamTraits < P4 > : : LocalType a4 ; <nl> + typename CallbackParamTraits < P5 > : : LocalType a5 ; <nl> + if ( ! GetNextArgument ( args , 0 , true , & a1 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a2 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a3 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a4 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a5 ) ) <nl> + return nullptr ; <nl> + return callback . 
Run ( a1 , a2 , a3 , a4 , a5 ) ; <nl> + } ; <nl> + <nl> + template < typename P1 , typename P2 , typename P3 , typename P4 , typename P5 , <nl> + typename P6 > <nl> + inline WrappableBase * InvokeFactory ( <nl> + Arguments * args , <nl> + const base : : Callback < WrappableBase * ( P1 , P2 , P3 , P4 , P5 , P6 ) > & callback ) { <nl> + typename CallbackParamTraits < P1 > : : LocalType a1 ; <nl> + typename CallbackParamTraits < P2 > : : LocalType a2 ; <nl> + typename CallbackParamTraits < P3 > : : LocalType a3 ; <nl> + typename CallbackParamTraits < P4 > : : LocalType a4 ; <nl> + typename CallbackParamTraits < P5 > : : LocalType a5 ; <nl> + typename CallbackParamTraits < P6 > : : LocalType a6 ; <nl> + if ( ! GetNextArgument ( args , 0 , true , & a1 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a2 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a3 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a4 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a5 ) | | <nl> + ! GetNextArgument ( args , 0 , false , & a6 ) ) <nl> + return nullptr ; <nl> + return callback . Run ( a1 , a2 , a3 , a4 , a5 , a6 ) ; <nl> + } ; <nl> + <nl> + template < typename Sig > <nl> + void InvokeNew ( const base : : Callback < Sig > & factory , <nl> + v8 : : Isolate * isolate , Arguments * args ) { <nl> + if ( ! args - > IsConstructCall ( ) ) { <nl> + args - > ThrowError ( " Requires constructor call " ) ; <nl> + return ; <nl> + } <nl> + <nl> + WrappableBase * object ; <nl> + { <nl> + / / Don ' t continue if the constructor throws an exception . <nl> + v8 : : TryCatch try_catch ( isolate ) ; <nl> + object = internal : : InvokeFactory ( args , factory ) ; <nl> + if ( try_catch . HasCaught ( ) ) { <nl> + try_catch . ReThrow ( ) ; <nl> + return ; <nl> + } <nl> + } <nl> + <nl> + if ( ! object ) <nl> + args - > ThrowError ( ) ; <nl> + <nl> + return ; <nl> + } <nl> + <nl> + } / / namespace internal <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_WRAPPABLE_CLASS_H_ <nl> new file mode 100644 <nl> index 000000000000 . . 260bdf105f60 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / converter . cc <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # include " native_mate / converter . h " <nl> + <nl> + # include " v8 / include / v8 . h " <nl> + <nl> + using v8 : : Array ; <nl> + using v8 : : Boolean ; <nl> + using v8 : : External ; <nl> + using v8 : : Function ; <nl> + using v8 : : Integer ; <nl> + using v8 : : Isolate ; <nl> + using v8 : : Local ; <nl> + using v8 : : Number ; <nl> + using v8 : : Object ; <nl> + using v8 : : String ; <nl> + using v8 : : Value ; <nl> + <nl> + namespace mate { <nl> + <nl> + Local < Value > Converter < bool > : : ToV8 ( Isolate * isolate , bool val ) { <nl> + return v8 : : Boolean : : New ( isolate , val ) ; <nl> + } <nl> + <nl> + bool Converter < bool > : : FromV8 ( Isolate * isolate , Local < Value > val , bool * out ) { <nl> + if ( ! val - > IsBoolean ( ) ) <nl> + return false ; <nl> + * out = val - > BooleanValue ( ) ; <nl> + return true ; <nl> + } <nl> + <nl> + # if ! defined ( OS_LINUX ) & & ! 
defined ( OS_FREEBSD ) <nl> + Local < Value > Converter < unsigned long > : : ToV8 ( Isolate * isolate , <nl> + unsigned long val ) { <nl> + return v8 : : Integer : : New ( isolate , val ) ; <nl> + } <nl> + <nl> + bool Converter < unsigned long > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + unsigned long * out ) { <nl> + if ( ! val - > IsNumber ( ) ) <nl> + return false ; <nl> + * out = val - > IntegerValue ( ) ; <nl> + return true ; <nl> + } <nl> + # endif <nl> + <nl> + Local < Value > Converter < int32_t > : : ToV8 ( Isolate * isolate , int32_t val ) { <nl> + return v8 : : Integer : : New ( isolate , val ) ; <nl> + } <nl> + <nl> + bool Converter < int32_t > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + int32_t * out ) { <nl> + if ( ! val - > IsInt32 ( ) ) <nl> + return false ; <nl> + * out = val - > Int32Value ( ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < uint32_t > : : ToV8 ( Isolate * isolate , uint32_t val ) { <nl> + return v8 : : Integer : : NewFromUnsigned ( isolate , val ) ; <nl> + } <nl> + <nl> + bool Converter < uint32_t > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + uint32_t * out ) { <nl> + if ( ! val - > IsUint32 ( ) ) <nl> + return false ; <nl> + * out = val - > Uint32Value ( ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < int64_t > : : ToV8 ( Isolate * isolate , int64_t val ) { <nl> + return v8 : : Number : : New ( isolate , static_cast < double > ( val ) ) ; <nl> + } <nl> + <nl> + bool Converter < int64_t > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + int64_t * out ) { <nl> + if ( ! val - > IsNumber ( ) ) <nl> + return false ; <nl> + / / Even though IntegerValue returns int64_t , JavaScript cannot represent <nl> + / / the full precision of int64_t , which means some rounding might occur . <nl> + * out = val - > IntegerValue ( ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < uint64_t > : : ToV8 ( Isolate * isolate , uint64_t val ) { <nl> + return v8 : : Number : : New ( isolate , static_cast < double > ( val ) ) ; <nl> + } <nl> + <nl> + bool Converter < uint64_t > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + uint64_t * out ) { <nl> + if ( ! val - > IsNumber ( ) ) <nl> + return false ; <nl> + * out = static_cast < uint64_t > ( val - > IntegerValue ( ) ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < float > : : ToV8 ( Isolate * isolate , float val ) { <nl> + return v8 : : Number : : New ( isolate , val ) ; <nl> + } <nl> + <nl> + bool Converter < float > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + float * out ) { <nl> + if ( ! val - > IsNumber ( ) ) <nl> + return false ; <nl> + * out = static_cast < float > ( val - > NumberValue ( ) ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < double > : : ToV8 ( Isolate * isolate , double val ) { <nl> + return v8 : : Number : : New ( isolate , val ) ; <nl> + } <nl> + <nl> + bool Converter < double > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + double * out ) { <nl> + if ( ! 
val - > IsNumber ( ) ) <nl> + return false ; <nl> + * out = val - > NumberValue ( ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < const char * > : : ToV8 ( <nl> + Isolate * isolate , const char * val ) { <nl> + return v8 : : String : : NewFromUtf8 ( isolate , val ) ; <nl> + } <nl> + <nl> + Local < Value > Converter < base : : StringPiece > : : ToV8 ( <nl> + Isolate * isolate , const base : : StringPiece & val ) { <nl> + return v8 : : String : : NewFromUtf8 ( isolate , <nl> + val . data ( ) , <nl> + v8 : : String : : kNormalString , <nl> + static_cast < uint32_t > ( val . length ( ) ) ) ; <nl> + } <nl> + <nl> + Local < Value > Converter < std : : string > : : ToV8 ( Isolate * isolate , <nl> + const std : : string & val ) { <nl> + return Converter < base : : StringPiece > : : ToV8 ( isolate , val ) ; <nl> + } <nl> + <nl> + bool Converter < std : : string > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + std : : string * out ) { <nl> + if ( ! val - > IsString ( ) ) <nl> + return false ; <nl> + Local < String > str = Local < String > : : Cast ( val ) ; <nl> + int length = str - > Utf8Length ( ) ; <nl> + out - > resize ( length ) ; <nl> + str - > WriteUtf8 ( & ( * out ) [ 0 ] , length , NULL , String : : NO_NULL_TERMINATION ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < Local < Function > > : : ToV8 ( Isolate * isolate , <nl> + Local < Function > val ) { <nl> + return val ; <nl> + } <nl> + <nl> + bool Converter < Local < Function > > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + Local < Function > * out ) { <nl> + if ( ! val - > IsFunction ( ) ) <nl> + return false ; <nl> + * out = Local < Function > : : Cast ( val ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < Local < Object > > : : ToV8 ( Isolate * isolate , <nl> + Local < Object > val ) { <nl> + return val ; <nl> + } <nl> + <nl> + bool Converter < Local < Object > > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + Local < Object > * out ) { <nl> + if ( ! val - > IsObject ( ) ) <nl> + return false ; <nl> + * out = Local < Object > : : Cast ( val ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < Local < String > > : : ToV8 ( Isolate * isolate , <nl> + Local < String > val ) { <nl> + return val ; <nl> + } <nl> + <nl> + bool Converter < Local < String > > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + Local < String > * out ) { <nl> + if ( ! val - > IsString ( ) ) <nl> + return false ; <nl> + * out = Local < String > : : Cast ( val ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < Local < External > > : : ToV8 ( Isolate * isolate , <nl> + Local < External > val ) { <nl> + return val ; <nl> + } <nl> + <nl> + bool Converter < Local < External > > : : FromV8 ( Isolate * isolate , <nl> + v8 : : Local < Value > val , <nl> + Local < External > * out ) { <nl> + if ( ! val - > IsExternal ( ) ) <nl> + return false ; <nl> + * out = Local < External > : : Cast ( val ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < Local < Array > > : : ToV8 ( Isolate * isolate , <nl> + Local < Array > val ) { <nl> + return val ; <nl> + } <nl> + <nl> + bool Converter < Local < Array > > : : FromV8 ( Isolate * isolate , <nl> + v8 : : Local < Value > val , <nl> + Local < Array > * out ) { <nl> + if ( ! 
val - > IsArray ( ) ) <nl> + return false ; <nl> + * out = Local < Array > : : Cast ( val ) ; <nl> + return true ; <nl> + } <nl> + <nl> + Local < Value > Converter < Local < Value > > : : ToV8 ( Isolate * isolate , <nl> + Local < Value > val ) { <nl> + return val ; <nl> + } <nl> + <nl> + bool Converter < Local < Value > > : : FromV8 ( Isolate * isolate , Local < Value > val , <nl> + Local < Value > * out ) { <nl> + * out = val ; <nl> + return true ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : String > StringToSymbol ( v8 : : Isolate * isolate , <nl> + const base : : StringPiece & val ) { <nl> + return v8 : : String : : NewFromUtf8 ( isolate , <nl> + val . data ( ) , <nl> + v8 : : String : : kInternalizedString , <nl> + static_cast < uint32_t > ( val . length ( ) ) ) ; <nl> + } <nl> + <nl> + std : : string V8ToString ( v8 : : Local < v8 : : Value > value ) { <nl> + if ( value . IsEmpty ( ) ) <nl> + return std : : string ( ) ; <nl> + std : : string result ; <nl> + if ( ! ConvertFromV8 ( NULL , value , & result ) ) <nl> + return std : : string ( ) ; <nl> + return result ; <nl> + } <nl> + <nl> + } / / namespace mate <nl> new file mode 100644 <nl> index 000000000000 . . ec3649c19bb7 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / converter . h <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # ifndef NATIVE_MATE_CONVERTER_H_ <nl> + # define NATIVE_MATE_CONVERTER_H_ <nl> + <nl> + # include < map > <nl> + # include < string > <nl> + # include < vector > <nl> + # include < set > <nl> + <nl> + # include " base / strings / string_piece . h " <nl> + # include " v8 / include / v8 . h " <nl> + <nl> + namespace mate { <nl> + <nl> + template < typename KeyType > <nl> + bool SetProperty ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Object > object , <nl> + KeyType key , <nl> + v8 : : Local < v8 : : Value > value ) { <nl> + auto maybe = object - > Set ( isolate - > GetCurrentContext ( ) , key , value ) ; <nl> + return ! maybe . IsNothing ( ) & & maybe . FromJust ( ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + struct ToV8ReturnsMaybe { <nl> + static const bool value = false ; <nl> + } ; <nl> + <nl> + template < typename T , typename Enable = void > <nl> + struct Converter { } ; <nl> + <nl> + template < > <nl> + struct Converter < void * > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , void * val ) { <nl> + return v8 : : Undefined ( isolate ) ; <nl> + } <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < std : : nullptr_t > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , std : : nullptr_t val ) { <nl> + return v8 : : Null ( isolate ) ; <nl> + } <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < bool > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + bool val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + bool * out ) ; <nl> + } ; <nl> + <nl> + # if ! defined ( OS_LINUX ) & & ! 
defined ( OS_FREEBSD ) <nl> + template < > <nl> + struct Converter < unsigned long > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + unsigned long val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + unsigned long * out ) ; <nl> + } ; <nl> + # endif <nl> + <nl> + template < > <nl> + struct Converter < int32_t > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + int32_t val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + int32_t * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < uint32_t > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + uint32_t val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + uint32_t * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < int64_t > { <nl> + / / Warning : JavaScript cannot represent 64 integers precisely . <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + int64_t val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + int64_t * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < uint64_t > { <nl> + / / Warning : JavaScript cannot represent 64 integers precisely . <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + uint64_t val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + uint64_t * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < float > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + float val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + float * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < double > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + double val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + double * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < const char * > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , const char * val ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < base : : StringPiece > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + const base : : StringPiece & val ) ; <nl> + / / No conversion out is possible because StringPiece does not contain storage . 
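A quick usage sketch for the scalar and string specializations above (assumes an entered v8::Isolate with an active v8::HandleScope; the values are illustrative). Note that the "cannot represent 64 integers" warnings refer to 64-bit integers: JavaScript numbers are IEEE-754 doubles, so magnitudes beyond 2^53 lose precision on the way through ToV8.

    // Sketch only: exercising Converter<T> directly.
    v8::Local<v8::Value> v8_int = mate::Converter<int32_t>::ToV8(isolate, 42);
    v8::Local<v8::Value> v8_str =
        mate::Converter<std::string>::ToV8(isolate, "hello");

    int32_t back = 0;
    // FromV8 reports a type mismatch by returning false rather than throwing.
    bool ok = mate::Converter<int32_t>::FromV8(isolate, v8_int, &back);  // ok == true, back == 42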
<nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < std : : string > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + const std : : string & val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + std : : string * out ) ; <nl> + } ; <nl> + <nl> + v8 : : Local < v8 : : String > StringToSymbol ( v8 : : Isolate * isolate , <nl> + const base : : StringPiece & input ) ; <nl> + <nl> + std : : string V8ToString ( v8 : : Local < v8 : : Value > value ) ; <nl> + <nl> + template < > <nl> + struct Converter < v8 : : Local < v8 : : Function > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Function > val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + v8 : : Local < v8 : : Function > * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < v8 : : Local < v8 : : Object > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Object > val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + v8 : : Local < v8 : : Object > * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < v8 : : Local < v8 : : String > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : String > val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + v8 : : Local < v8 : : String > * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < v8 : : Local < v8 : : External > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : External > val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + v8 : : Local < v8 : : External > * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < v8 : : Local < v8 : : Array > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Array > val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + v8 : : Local < v8 : : Array > * out ) ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < v8 : : Local < v8 : : Value > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + v8 : : Local < v8 : : Value > * out ) ; <nl> + } ; <nl> + <nl> + template < typename T > <nl> + struct Converter < std : : vector < T > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + const std : : vector < T > & val ) { <nl> + v8 : : Local < v8 : : Array > result ( <nl> + v8 : : Array : : New ( isolate , static_cast < int > ( val . size ( ) ) ) ) ; <nl> + for ( size_t i = 0 ; i < val . size ( ) ; + + i ) { <nl> + result - > Set ( static_cast < int > ( i ) , Converter < T > : : ToV8 ( isolate , val [ i ] ) ) ; <nl> + } <nl> + return result ; <nl> + } <nl> + <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + std : : vector < T > * out ) { <nl> + if ( ! 
val - > IsArray ( ) ) <nl> + return false ; <nl> + <nl> + std : : vector < T > result ; <nl> + v8 : : Local < v8 : : Array > array ( v8 : : Local < v8 : : Array > : : Cast ( val ) ) ; <nl> + uint32_t length = array - > Length ( ) ; <nl> + for ( uint32_t i = 0 ; i < length ; + + i ) { <nl> + T item ; <nl> + if ( ! Converter < T > : : FromV8 ( isolate , array - > Get ( i ) , & item ) ) <nl> + return false ; <nl> + result . push_back ( item ) ; <nl> + } <nl> + <nl> + out - > swap ( result ) ; <nl> + return true ; <nl> + } <nl> + } ; <nl> + <nl> + template < typename T > <nl> + struct Converter < std : : set < T > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + const std : : set < T > & val ) { <nl> + v8 : : Local < v8 : : Array > result ( <nl> + v8 : : Array : : New ( isolate , static_cast < int > ( val . size ( ) ) ) ) ; <nl> + typename std : : set < T > : : const_iterator it ; <nl> + int i ; <nl> + for ( i = 0 , it = val . begin ( ) ; it ! = val . end ( ) ; + + it , + + i ) <nl> + result - > Set ( i , Converter < T > : : ToV8 ( isolate , * it ) ) ; <nl> + return result ; <nl> + } <nl> + <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + std : : set < T > * out ) { <nl> + if ( ! val - > IsArray ( ) ) <nl> + return false ; <nl> + <nl> + std : : set < T > result ; <nl> + v8 : : Local < v8 : : Array > array ( v8 : : Local < v8 : : Array > : : Cast ( val ) ) ; <nl> + uint32_t length = array - > Length ( ) ; <nl> + for ( uint32_t i = 0 ; i < length ; + + i ) { <nl> + T item ; <nl> + if ( ! Converter < T > : : FromV8 ( isolate , array - > Get ( i ) , & item ) ) <nl> + return false ; <nl> + result . insert ( item ) ; <nl> + } <nl> + <nl> + out - > swap ( result ) ; <nl> + return true ; <nl> + } <nl> + } ; <nl> + <nl> + template < typename T > <nl> + struct Converter < std : : map < std : : string , T > > { <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + std : : map < std : : string , T > * out ) { <nl> + if ( ! val - > IsObject ( ) ) <nl> + return false ; <nl> + <nl> + v8 : : Local < v8 : : Object > dict = val - > ToObject ( ) ; <nl> + v8 : : Local < v8 : : Array > keys = dict - > GetOwnPropertyNames ( ) ; <nl> + for ( uint32_t i = 0 ; i < keys - > Length ( ) ; + + i ) { <nl> + v8 : : Local < v8 : : Value > key = keys - > Get ( i ) ; <nl> + T value ; <nl> + if ( Converter < T > : : FromV8 ( isolate , dict - > Get ( key ) , & value ) ) <nl> + ( * out ) [ V8ToString ( key ) ] = std : : move ( value ) ; <nl> + } <nl> + return true ; <nl> + } <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + const std : : map < std : : string , T > & val ) { <nl> + v8 : : Local < v8 : : Object > result = v8 : : Object : : New ( isolate ) ; <nl> + for ( auto i = val . begin ( ) ; i ! = val . end ( ) ; i + + ) { <nl> + result - > Set ( Converter < T > : : ToV8 ( isolate , i - > first ) , <nl> + Converter < T > : : ToV8 ( isolate , i - > second ) ) ; <nl> + } <nl> + return result ; <nl> + } <nl> + } ; <nl> + <nl> + / / Convenience functions that deduce T . 
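The container specializations compose with the element converters, and the deducing ConvertToV8()/ConvertFromV8() helpers declared next make them convenient to call. A short round-trip sketch (illustrative, same isolate/handle-scope assumptions as above):

    std::vector<int32_t> numbers = {1, 2, 3};
    v8::Local<v8::Value> v8_array = mate::ConvertToV8(isolate, numbers);

    std::vector<int32_t> round_trip;
    if (mate::ConvertFromV8(isolate, v8_array, &round_trip)) {
      // round_trip == {1, 2, 3}; a non-array value would have returned false.
    }

std::map<std::string, T> converts to and from a plain JavaScript object keyed by strings, and std::set<T> round-trips through a JavaScript array in the same way.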
<nl> + template < typename T > <nl> + v8 : : Local < v8 : : Value > ConvertToV8 ( v8 : : Isolate * isolate , const T & input ) { <nl> + return Converter < T > : : ToV8 ( isolate , input ) ; <nl> + } <nl> + <nl> + inline v8 : : Local < v8 : : Value > ConvertToV8 ( v8 : : Isolate * isolate , <nl> + const char * input ) { <nl> + return Converter < const char * > : : ToV8 ( isolate , input ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + v8 : : MaybeLocal < v8 : : Value > ConvertToV8 ( v8 : : Local < v8 : : Context > context , <nl> + const T & input ) { <nl> + return Converter < T > : : ToV8 ( context , input ) ; <nl> + } <nl> + <nl> + template < typename T , bool = ToV8ReturnsMaybe < T > : : value > struct ToV8Traits ; <nl> + <nl> + template < typename T > <nl> + struct ToV8Traits < T , true > { <nl> + static bool TryConvertToV8 ( v8 : : Isolate * isolate , <nl> + const T & input , <nl> + v8 : : Local < v8 : : Value > * output ) { <nl> + auto maybe = ConvertToV8 ( isolate - > GetCurrentContext ( ) , input ) ; <nl> + if ( maybe . IsEmpty ( ) ) <nl> + return false ; <nl> + * output = maybe . ToLocalChecked ( ) ; <nl> + return true ; <nl> + } <nl> + } ; <nl> + <nl> + template < typename T > <nl> + struct ToV8Traits < T , false > { <nl> + static bool TryConvertToV8 ( v8 : : Isolate * isolate , <nl> + const T & input , <nl> + v8 : : Local < v8 : : Value > * output ) { <nl> + * output = ConvertToV8 ( isolate , input ) ; <nl> + return true ; <nl> + } <nl> + } ; <nl> + <nl> + template < typename T > <nl> + bool TryConvertToV8 ( v8 : : Isolate * isolate , <nl> + const T & input , <nl> + v8 : : Local < v8 : : Value > * output ) { <nl> + return ToV8Traits < T > : : TryConvertToV8 ( isolate , input , output ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool ConvertFromV8 ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Value > input , <nl> + T * result ) { <nl> + return Converter < T > : : FromV8 ( isolate , input , result ) ; <nl> + } <nl> + <nl> + inline v8 : : Local < v8 : : String > StringToV8 ( <nl> + v8 : : Isolate * isolate , <nl> + const base : : StringPiece & input ) { <nl> + return ConvertToV8 ( isolate , input ) . As < v8 : : String > ( ) ; <nl> + } <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_CONVERTER_H_ <nl> new file mode 100644 <nl> index 000000000000 . . 3caff1145974 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / dictionary . cc <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # include " native_mate / dictionary . h " <nl> + <nl> + namespace mate { <nl> + <nl> + Dictionary : : Dictionary ( ) <nl> + : isolate_ ( NULL ) { <nl> + } <nl> + <nl> + Dictionary : : Dictionary ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Object > object ) <nl> + : isolate_ ( isolate ) , <nl> + object_ ( object ) { <nl> + } <nl> + <nl> + Dictionary : : ~ Dictionary ( ) { <nl> + } <nl> + <nl> + Dictionary Dictionary : : CreateEmpty ( v8 : : Isolate * isolate ) { <nl> + return Dictionary ( isolate , v8 : : Object : : New ( isolate ) ) ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Object > Dictionary : : GetHandle ( ) const { <nl> + return object_ ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Value > Converter < Dictionary > : : ToV8 ( v8 : : Isolate * isolate , <nl> + Dictionary val ) { <nl> + return val . 
GetHandle ( ) ; <nl> + } <nl> + <nl> + bool Converter < Dictionary > : : FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + Dictionary * out ) { <nl> + if ( ! val - > IsObject ( ) | | val - > IsFunction ( ) ) <nl> + return false ; <nl> + * out = Dictionary ( isolate , v8 : : Local < v8 : : Object > : : Cast ( val ) ) ; <nl> + return true ; <nl> + } <nl> + <nl> + } / / namespace mate <nl> new file mode 100644 <nl> index 000000000000 . . 9e80cb99acfd <nl> mmm / dev / null <nl> ppp b / native_mate / mate / dictionary . h <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # ifndef NATIVE_MATE_DICTIONARY_H_ <nl> + # define NATIVE_MATE_DICTIONARY_H_ <nl> + <nl> + # include " native_mate / converter . h " <nl> + # include " native_mate / object_template_builder . h " <nl> + <nl> + namespace mate { <nl> + <nl> + namespace internal { <nl> + <nl> + / / Returns true if | maybe | is both a value , and that value is true . <nl> + inline bool IsTrue ( v8 : : Maybe < bool > maybe ) { <nl> + return maybe . IsJust ( ) & & maybe . FromJust ( ) ; <nl> + } <nl> + <nl> + } / / namespace internal <nl> + <nl> + / / Dictionary is useful when writing bindings for a function that either <nl> + / / receives an arbitrary JavaScript object as an argument or returns an <nl> + / / arbitrary JavaScript object as a result . For example , Dictionary is useful <nl> + / / when you might use the | dictionary | type in WebIDL : <nl> + / / <nl> + / / http : / / heycam . github . io / webidl / # idl - dictionaries <nl> + / / <nl> + / / WARNING : You cannot retain a Dictionary object in the heap . The underlying <nl> + / / storage for Dictionary is tied to the closest enclosing <nl> + / / v8 : : HandleScope . Generally speaking , you should store a Dictionary <nl> + / / on the stack . <nl> + / / <nl> + class Dictionary { <nl> + public : <nl> + Dictionary ( ) ; <nl> + Dictionary ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Object > object ) ; <nl> + virtual ~ Dictionary ( ) ; <nl> + <nl> + static Dictionary CreateEmpty ( v8 : : Isolate * isolate ) ; <nl> + <nl> + template < typename T > <nl> + bool Get ( const base : : StringPiece & key , T * out ) const { <nl> + / / Check for existence before getting , otherwise this method will always <nl> + / / returns true when T = = v8 : : Local < v8 : : Value > . <nl> + v8 : : Local < v8 : : Context > context = isolate_ - > GetCurrentContext ( ) ; <nl> + v8 : : Local < v8 : : String > v8_key = StringToV8 ( isolate_ , key ) ; <nl> + if ( ! internal : : IsTrue ( GetHandle ( ) - > Has ( context , v8_key ) ) ) <nl> + return false ; <nl> + <nl> + v8 : : Local < v8 : : Value > val ; <nl> + if ( ! GetHandle ( ) - > Get ( context , v8_key ) . 
ToLocal ( & val ) ) <nl> + return false ; <nl> + return ConvertFromV8 ( isolate_ , val , out ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool GetHidden ( const base : : StringPiece & key , T * out ) const { <nl> + v8 : : Local < v8 : : Context > context = isolate_ - > GetCurrentContext ( ) ; <nl> + v8 : : Local < v8 : : Private > privateKey = <nl> + v8 : : Private : : ForApi ( isolate_ , StringToV8 ( isolate_ , key ) ) ; <nl> + v8 : : Local < v8 : : Value > value ; <nl> + v8 : : Maybe < bool > result = <nl> + GetHandle ( ) - > HasPrivate ( context , privateKey ) ; <nl> + if ( internal : : IsTrue ( result ) & & <nl> + GetHandle ( ) - > GetPrivate ( context , privateKey ) . ToLocal ( & value ) ) <nl> + return ConvertFromV8 ( isolate_ , value , out ) ; <nl> + return false ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool Set ( const base : : StringPiece & key , const T & val ) { <nl> + v8 : : Local < v8 : : Value > v8_value ; <nl> + if ( ! TryConvertToV8 ( isolate_ , val , & v8_value ) ) <nl> + return false ; <nl> + v8 : : Maybe < bool > result = <nl> + GetHandle ( ) - > Set ( isolate_ - > GetCurrentContext ( ) , <nl> + StringToV8 ( isolate_ , key ) , <nl> + v8_value ) ; <nl> + return ! result . IsNothing ( ) & & result . FromJust ( ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool SetHidden ( const base : : StringPiece & key , T val ) { <nl> + v8 : : Local < v8 : : Value > v8_value ; <nl> + if ( ! TryConvertToV8 ( isolate_ , val , & v8_value ) ) <nl> + return false ; <nl> + v8 : : Local < v8 : : Context > context = isolate_ - > GetCurrentContext ( ) ; <nl> + v8 : : Local < v8 : : Private > privateKey = <nl> + v8 : : Private : : ForApi ( isolate_ , StringToV8 ( isolate_ , key ) ) ; <nl> + v8 : : Maybe < bool > result = <nl> + GetHandle ( ) - > SetPrivate ( context , privateKey , v8_value ) ; <nl> + return ! result . IsNothing ( ) & & result . FromJust ( ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool SetReadOnly ( const base : : StringPiece & key , T val ) { <nl> + v8 : : Local < v8 : : Value > v8_value ; <nl> + if ( ! TryConvertToV8 ( isolate_ , val , & v8_value ) ) <nl> + return false ; <nl> + v8 : : Maybe < bool > result = <nl> + GetHandle ( ) - > DefineOwnProperty ( isolate_ - > GetCurrentContext ( ) , <nl> + StringToV8 ( isolate_ , key ) , <nl> + v8_value , <nl> + v8 : : ReadOnly ) ; <nl> + return ! result . IsNothing ( ) & & result . FromJust ( ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + bool SetMethod ( const base : : StringPiece & key , const T & callback ) { <nl> + return GetHandle ( ) - > Set ( <nl> + StringToV8 ( isolate_ , key ) , <nl> + CallbackTraits < T > : : CreateTemplate ( isolate_ , callback ) - > GetFunction ( ) ) ; <nl> + } <nl> + <nl> + bool Delete ( const base : : StringPiece & key ) { <nl> + v8 : : Maybe < bool > result = GetHandle ( ) - > Delete ( isolate_ - > GetCurrentContext ( ) , <nl> + StringToV8 ( isolate_ , key ) ) ; <nl> + return ! result . IsNothing ( ) & & result . 
FromJust ( ) ; <nl> + } <nl> + <nl> + bool IsEmpty ( ) const { return isolate ( ) = = NULL ; } <nl> + <nl> + virtual v8 : : Local < v8 : : Object > GetHandle ( ) const ; <nl> + <nl> + v8 : : Isolate * isolate ( ) const { return isolate_ ; } <nl> + <nl> + protected : <nl> + v8 : : Isolate * isolate_ ; <nl> + <nl> + private : <nl> + v8 : : Local < v8 : : Object > object_ ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < Dictionary > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + Dictionary val ) ; <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + Dictionary * out ) ; <nl> + } ; <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_DICTIONARY_H_ <nl> new file mode 100644 <nl> index 000000000000 . . a4bcb1141b74 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / function_template . cc <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # include " native_mate / function_template . h " <nl> + <nl> + namespace mate { <nl> + <nl> + namespace internal { <nl> + <nl> + CallbackHolderBase : : CallbackHolderBase ( v8 : : Isolate * isolate ) <nl> + : v8_ref_ ( isolate , v8 : : External : : New ( isolate , this ) ) { <nl> + v8_ref_ . SetWeak ( this , & CallbackHolderBase : : FirstWeakCallback , <nl> + v8 : : WeakCallbackType : : kParameter ) ; <nl> + } <nl> + <nl> + CallbackHolderBase : : ~ CallbackHolderBase ( ) { <nl> + DCHECK ( v8_ref_ . IsEmpty ( ) ) ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : External > CallbackHolderBase : : GetHandle ( v8 : : Isolate * isolate ) { <nl> + return v8 : : Local < v8 : : External > : : New ( isolate , v8_ref_ ) ; <nl> + } <nl> + <nl> + / / static <nl> + void CallbackHolderBase : : FirstWeakCallback ( <nl> + const v8 : : WeakCallbackInfo < CallbackHolderBase > & data ) { <nl> + data . GetParameter ( ) - > v8_ref_ . Reset ( ) ; <nl> + data . SetSecondPassCallback ( SecondWeakCallback ) ; <nl> + } <nl> + <nl> + / / static <nl> + void CallbackHolderBase : : SecondWeakCallback ( <nl> + const v8 : : WeakCallbackInfo < CallbackHolderBase > & data ) { <nl> + delete data . GetParameter ( ) ; <nl> + } <nl> + <nl> + } / / namespace internal <nl> + <nl> + } / / namespace mate <nl> new file mode 100644 <nl> index 000000000000 . . abbe7b5326e1 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / function_template . h <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # ifndef NATIVE_MATE_FUNCTION_TEMPLATE_H_ <nl> + # define NATIVE_MATE_FUNCTION_TEMPLATE_H_ <nl> + <nl> + # include " base / callback . h " <nl> + # include " base / logging . h " <nl> + # include " native_mate / arguments . h " <nl> + # include " native_mate / wrappable_base . h " <nl> + # include " v8 / include / v8 . 
h " <nl> + <nl> + namespace mate { <nl> + <nl> + enum CreateFunctionTemplateFlags { <nl> + HolderIsFirstArgument = 1 < < 0 , <nl> + } ; <nl> + <nl> + namespace internal { <nl> + <nl> + struct Destroyable { <nl> + static void Destroy ( Arguments * args ) { <nl> + if ( IsDestroyed ( args ) ) <nl> + return ; <nl> + <nl> + v8 : : Local < v8 : : Object > holder = args - > GetHolder ( ) ; <nl> + delete static_cast < WrappableBase * > ( <nl> + holder - > GetAlignedPointerFromInternalField ( 0 ) ) ; <nl> + holder - > SetAlignedPointerInInternalField ( 0 , nullptr ) ; <nl> + } <nl> + static bool IsDestroyed ( Arguments * args ) { <nl> + v8 : : Local < v8 : : Object > holder = args - > GetHolder ( ) ; <nl> + return holder - > InternalFieldCount ( ) = = 0 | | <nl> + holder - > GetAlignedPointerFromInternalField ( 0 ) = = nullptr ; <nl> + } <nl> + } ; <nl> + <nl> + template < typename T > <nl> + struct CallbackParamTraits { <nl> + typedef T LocalType ; <nl> + } ; <nl> + template < typename T > <nl> + struct CallbackParamTraits < const T & > { <nl> + typedef T LocalType ; <nl> + } ; <nl> + template < typename T > <nl> + struct CallbackParamTraits < const T * > { <nl> + typedef T * LocalType ; <nl> + } ; <nl> + <nl> + <nl> + / / CallbackHolder and CallbackHolderBase are used to pass a base : : Callback from <nl> + / / CreateFunctionTemplate through v8 ( via v8 : : FunctionTemplate ) to <nl> + / / DispatchToCallback , where it is invoked . <nl> + <nl> + / / This simple base class is used so that we can share a single object template <nl> + / / among every CallbackHolder instance . <nl> + class CallbackHolderBase { <nl> + public : <nl> + v8 : : Local < v8 : : External > GetHandle ( v8 : : Isolate * isolate ) ; <nl> + <nl> + protected : <nl> + explicit CallbackHolderBase ( v8 : : Isolate * isolate ) ; <nl> + virtual ~ CallbackHolderBase ( ) ; <nl> + <nl> + private : <nl> + static void FirstWeakCallback ( <nl> + const v8 : : WeakCallbackInfo < CallbackHolderBase > & data ) ; <nl> + static void SecondWeakCallback ( <nl> + const v8 : : WeakCallbackInfo < CallbackHolderBase > & data ) ; <nl> + <nl> + v8 : : Global < v8 : : External > v8_ref_ ; <nl> + <nl> + DISALLOW_COPY_AND_ASSIGN ( CallbackHolderBase ) ; <nl> + } ; <nl> + <nl> + template < typename Sig > <nl> + class CallbackHolder : public CallbackHolderBase { <nl> + public : <nl> + CallbackHolder ( v8 : : Isolate * isolate , <nl> + const base : : Callback < Sig > & callback , <nl> + int flags ) <nl> + : CallbackHolderBase ( isolate ) , callback ( callback ) , flags ( flags ) { } <nl> + base : : Callback < Sig > callback ; <nl> + int flags ; <nl> + private : <nl> + virtual ~ CallbackHolder ( ) { } <nl> + <nl> + DISALLOW_COPY_AND_ASSIGN ( CallbackHolder ) ; <nl> + } ; <nl> + <nl> + template < typename T > <nl> + bool GetNextArgument ( Arguments * args , int create_flags , bool is_first , <nl> + T * result ) { <nl> + if ( is_first & & ( create_flags & HolderIsFirstArgument ) ! = 0 ) { <nl> + return args - > GetHolder ( result ) ; <nl> + } else { <nl> + return args - > GetNext ( result ) ; <nl> + } <nl> + } <nl> + <nl> + / / For advanced use cases , we allow callers to request the unparsed Arguments <nl> + / / object and poke around in it directly . 
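In practice this means a bound callback can simply declare a mate::Arguments* parameter (or a v8::Isolate*, per the overload further down) and it is filled from the call context instead of consuming a JavaScript argument. A hedged sketch with an illustrative function name:

    // Reads an optional string argument; ThrowError() raises a JavaScript
    // exception describing the bad or missing argument.
    void Log(mate::Arguments* args) {
      std::string message;
      if (!args->GetNext(&message))
        args->ThrowError();
      // ...
    }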
<nl> + inline bool GetNextArgument ( Arguments * args , int create_flags , bool is_first , <nl> + Arguments * result ) { <nl> + * result = * args ; <nl> + return true ; <nl> + } <nl> + inline bool GetNextArgument ( Arguments * args , int create_flags , bool is_first , <nl> + Arguments * * result ) { <nl> + * result = args ; <nl> + return true ; <nl> + } <nl> + <nl> + / / It ' s common for clients to just need the isolate , so we make that easy . <nl> + inline bool GetNextArgument ( Arguments * args , int create_flags , <nl> + bool is_first , v8 : : Isolate * * result ) { <nl> + * result = args - > isolate ( ) ; <nl> + return true ; <nl> + } <nl> + <nl> + / / Classes for generating and storing an argument pack of integer indices <nl> + / / ( based on well - known " indices trick " , see : http : / / goo . gl / bKKojn ) : <nl> + template < size_t . . . indices > <nl> + struct IndicesHolder { } ; <nl> + <nl> + template < size_t requested_index , size_t . . . indices > <nl> + struct IndicesGenerator { <nl> + using type = typename IndicesGenerator < requested_index - 1 , <nl> + requested_index - 1 , <nl> + indices . . . > : : type ; <nl> + } ; <nl> + template < size_t . . . indices > <nl> + struct IndicesGenerator < 0 , indices . . . > { <nl> + using type = IndicesHolder < indices . . . > ; <nl> + } ; <nl> + <nl> + / / Class template for extracting and storing single argument for callback <nl> + / / at position | index | . <nl> + template < size_t index , typename ArgType > <nl> + struct ArgumentHolder { <nl> + using ArgLocalType = typename CallbackParamTraits < ArgType > : : LocalType ; <nl> + <nl> + ArgLocalType value ; <nl> + bool ok ; <nl> + <nl> + ArgumentHolder ( Arguments * args , int create_flags ) <nl> + : ok ( false ) { <nl> + if ( index = = 0 & & <nl> + ( create_flags & HolderIsFirstArgument ) & & <nl> + Destroyable : : IsDestroyed ( args ) ) { <nl> + args - > ThrowError ( " Object has been destroyed " ) ; <nl> + return ; <nl> + } <nl> + ok = GetNextArgument ( args , create_flags , index = = 0 , & value ) ; <nl> + if ( ! ok ) { <nl> + / / Ideally we would include the expected c + + type in the error <nl> + / / message which we can access via typeid ( ArgType ) . name ( ) <nl> + / / however we compile with no - rtti , which disables typeid . <nl> + args - > ThrowError ( ) ; <nl> + } <nl> + } <nl> + } ; <nl> + <nl> + / / Class template for converting arguments from JavaScript to C + + and running <nl> + / / the callback with them . <nl> + template < typename IndicesType , typename . . . ArgTypes > <nl> + class Invoker { } ; <nl> + <nl> + template < size_t . . . indices , typename . . . ArgTypes > <nl> + class Invoker < IndicesHolder < indices . . . > , ArgTypes . . . > <nl> + : public ArgumentHolder < indices , ArgTypes > . . . { <nl> + public : <nl> + / / Invoker < > inherits from ArgumentHolder < > for each argument . <nl> + / / C + + has always been strict about the class initialization order , <nl> + / / so it is guaranteed ArgumentHolders will be initialized ( and thus , will <nl> + / / extract arguments from Arguments ) in the right order . <nl> + Invoker ( Arguments * args , int create_flags ) <nl> + : ArgumentHolder < indices , ArgTypes > ( args , create_flags ) . . . , args_ ( args ) { <nl> + / / GCC thinks that create_flags is going unused , even though the <nl> + / / expansion above clearly makes use of it . Per jyasskin @ , casting <nl> + / / to void is the commonly accepted way to convince the compiler <nl> + / / that you ' re actually using a parameter / varible . 
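For reference, the indices pack that drives this expansion grows as in the following sketch; each ArgumentHolder<index, ArgType> base then extracts the JavaScript argument at position index:

    // IndicesGenerator<3>::type
    //   -> IndicesGenerator<2, 2>::type
    //   -> IndicesGenerator<1, 1, 2>::type
    //   -> IndicesGenerator<0, 0, 1, 2>::type
    //   -> IndicesHolder<0, 1, 2>
    using Indices = mate::internal::IndicesGenerator<3>::type;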
<nl> + ( void ) create_flags ; <nl> + } <nl> + <nl> + bool IsOK ( ) { <nl> + return And ( ArgumentHolder < indices , ArgTypes > : : ok . . . ) ; <nl> + } <nl> + <nl> + template < typename ReturnType > <nl> + void DispatchToCallback ( base : : Callback < ReturnType ( ArgTypes . . . ) > callback ) { <nl> + v8 : : MicrotasksScope script_scope ( <nl> + args_ - > isolate ( ) , v8 : : MicrotasksScope : : kRunMicrotasks ) ; <nl> + args_ - > Return ( callback . Run ( ArgumentHolder < indices , ArgTypes > : : value . . . ) ) ; <nl> + } <nl> + <nl> + / / In C + + , you can declare the function foo ( void ) , but you can ' t pass a void <nl> + / / expression to foo . As a result , we must specialize the case of Callbacks <nl> + / / that have the void return type . <nl> + void DispatchToCallback ( base : : Callback < void ( ArgTypes . . . ) > callback ) { <nl> + v8 : : MicrotasksScope script_scope ( <nl> + args_ - > isolate ( ) , v8 : : MicrotasksScope : : kRunMicrotasks ) ; <nl> + callback . Run ( ArgumentHolder < indices , ArgTypes > : : value . . . ) ; <nl> + } <nl> + <nl> + private : <nl> + static bool And ( ) { return true ; } <nl> + template < typename . . . T > <nl> + static bool And ( bool arg1 , T . . . args ) { <nl> + return arg1 & & And ( args . . . ) ; <nl> + } <nl> + <nl> + Arguments * args_ ; <nl> + } ; <nl> + <nl> + / / DispatchToCallback converts all the JavaScript arguments to C + + types and <nl> + / / invokes the base : : Callback . <nl> + template < typename Sig > <nl> + struct Dispatcher { } ; <nl> + <nl> + template < typename ReturnType , typename . . . ArgTypes > <nl> + struct Dispatcher < ReturnType ( ArgTypes . . . ) > { <nl> + static void DispatchToCallback ( <nl> + const v8 : : FunctionCallbackInfo < v8 : : Value > & info ) { <nl> + Arguments args ( info ) ; <nl> + v8 : : Local < v8 : : External > v8_holder ; <nl> + args . GetData ( & v8_holder ) ; <nl> + CallbackHolderBase * holder_base = reinterpret_cast < CallbackHolderBase * > ( <nl> + v8_holder - > Value ( ) ) ; <nl> + <nl> + typedef CallbackHolder < ReturnType ( ArgTypes . . . ) > HolderT ; <nl> + HolderT * holder = static_cast < HolderT * > ( holder_base ) ; <nl> + <nl> + using Indices = typename IndicesGenerator < sizeof . . . ( ArgTypes ) > : : type ; <nl> + Invoker < Indices , ArgTypes . . . > invoker ( & args , holder - > flags ) ; <nl> + if ( invoker . IsOK ( ) ) <nl> + invoker . DispatchToCallback ( holder - > callback ) ; <nl> + } <nl> + } ; <nl> + <nl> + } / / namespace internal <nl> + <nl> + <nl> + / / CreateFunctionTemplate creates a v8 : : FunctionTemplate that will create <nl> + / / JavaScript functions that execute a provided C + + function or base : : Callback . <nl> + / / JavaScript arguments are automatically converted via gin : : Converter , as is <nl> + / / the return value of the C + + function , if any . <nl> + / / <nl> + / / NOTE : V8 caches FunctionTemplates for a lifetime of a web page for its own <nl> + / / internal reasons , thus it is generally a good idea to cache the template <nl> + / / returned by this function . Otherwise , repeated method invocations from JS <nl> + / / will create substantial memory leaks . See http : / / crbug . com / 463487 . 
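A minimal sketch of the intended call pattern, assuming a hypothetical free function Add and an entered isolate; per the note above, the returned template should be created once and cached rather than rebuilt for every call site:

    int Add(int a, int b) { return a + b; }

    v8::Local<v8::FunctionTemplate> tmpl =
        mate::CreateFunctionTemplate(isolate, base::Bind(&Add));
    v8::Local<v8::Function> js_add = tmpl->GetFunction();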
<nl> + template < typename Sig > <nl> + v8 : : Local < v8 : : FunctionTemplate > CreateFunctionTemplate ( <nl> + v8 : : Isolate * isolate , const base : : Callback < Sig > callback , <nl> + int callback_flags = 0 ) { <nl> + typedef internal : : CallbackHolder < Sig > HolderT ; <nl> + HolderT * holder = new HolderT ( isolate , callback , callback_flags ) ; <nl> + <nl> + return v8 : : FunctionTemplate : : New ( <nl> + isolate , <nl> + & internal : : Dispatcher < Sig > : : DispatchToCallback , <nl> + ConvertToV8 < v8 : : Local < v8 : : External > > ( isolate , <nl> + holder - > GetHandle ( isolate ) ) ) ; <nl> + } <nl> + <nl> + / / CreateFunctionHandler installs a CallAsFunction handler on the given <nl> + / / object template that forwards to a provided C + + function or base : : Callback . <nl> + template < typename Sig > <nl> + void CreateFunctionHandler ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : ObjectTemplate > tmpl , <nl> + const base : : Callback < Sig > callback , <nl> + int callback_flags = 0 ) { <nl> + typedef internal : : CallbackHolder < Sig > HolderT ; <nl> + HolderT * holder = new HolderT ( isolate , callback , callback_flags ) ; <nl> + tmpl - > SetCallAsFunctionHandler ( & internal : : Dispatcher < Sig > : : DispatchToCallback , <nl> + ConvertToV8 < v8 : : Local < v8 : : External > > ( <nl> + isolate , holder - > GetHandle ( isolate ) ) ) ; <nl> + } <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_FUNCTION_TEMPLATE_H_ <nl> new file mode 100644 <nl> index 000000000000 . . 60bd2348dd6c <nl> mmm / dev / null <nl> ppp b / native_mate / mate / handle . h <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # ifndef NATIVE_MATE_HANDLE_H_ <nl> + # define NATIVE_MATE_HANDLE_H_ <nl> + <nl> + # include " native_mate / converter . h " <nl> + <nl> + namespace mate { <nl> + <nl> + / / You can use mate : : Handle on the stack to retain a mate : : Wrappable object . <nl> + / / Currently we don ' t have a mechanism for retaining a mate : : Wrappable object <nl> + / / in the C + + heap because strong references from C + + to V8 can cause memory <nl> + / / leaks . <nl> + template < typename T > <nl> + class Handle { <nl> + public : <nl> + Handle ( ) : object_ ( NULL ) { } <nl> + <nl> + Handle ( v8 : : Local < v8 : : Object > wrapper , T * object ) <nl> + : wrapper_ ( wrapper ) , <nl> + object_ ( object ) { <nl> + } <nl> + <nl> + bool IsEmpty ( ) const { return ! object_ ; } <nl> + <nl> + void Clear ( ) { <nl> + wrapper_ . Clear ( ) ; <nl> + object_ = NULL ; <nl> + } <nl> + <nl> + T * operator - > ( ) const { return object_ ; } <nl> + v8 : : Local < v8 : : Object > ToV8 ( ) const { return wrapper_ ; } <nl> + T * get ( ) const { return object_ ; } <nl> + <nl> + private : <nl> + v8 : : Local < v8 : : Object > wrapper_ ; <nl> + T * object_ ; <nl> + } ; <nl> + <nl> + template < typename T > <nl> + struct Converter < mate : : Handle < T > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + const mate : : Handle < T > & val ) { <nl> + return val . 
ToV8 ( ) ; <nl> + } <nl> + static bool FromV8 ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Value > val , <nl> + mate : : Handle < T > * out ) { <nl> + T * object = NULL ; <nl> + if ( val - > IsNull ( ) | | val - > IsUndefined ( ) ) { <nl> + * out = mate : : Handle < T > ( ) ; <nl> + return true ; <nl> + } <nl> + if ( ! Converter < T * > : : FromV8 ( isolate , val , & object ) ) { <nl> + return false ; <nl> + } <nl> + * out = mate : : Handle < T > ( val - > ToObject ( ) , object ) ; <nl> + return true ; <nl> + } <nl> + } ; <nl> + <nl> + / / This function is a convenient way to create a handle from a raw pointer <nl> + / / without having to write out the type of the object explicitly . <nl> + template < typename T > <nl> + mate : : Handle < T > CreateHandle ( v8 : : Isolate * isolate , T * object ) { <nl> + return mate : : Handle < T > ( object - > GetWrapper ( ) , object ) ; <nl> + } <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_HANDLE_H_ <nl> new file mode 100644 <nl> index 000000000000 . . c64e38fa1e2c <nl> mmm / dev / null <nl> ppp b / native_mate / mate / object_template_builder . cc <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # include " native_mate / object_template_builder . h " <nl> + <nl> + namespace mate { <nl> + <nl> + ObjectTemplateBuilder : : ObjectTemplateBuilder ( <nl> + v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : ObjectTemplate > templ ) <nl> + : isolate_ ( isolate ) , template_ ( templ ) { <nl> + } <nl> + <nl> + ObjectTemplateBuilder : : ~ ObjectTemplateBuilder ( ) { <nl> + } <nl> + <nl> + ObjectTemplateBuilder & ObjectTemplateBuilder : : SetImpl ( <nl> + const base : : StringPiece & name , v8 : : Local < v8 : : Data > val ) { <nl> + template_ - > Set ( StringToSymbol ( isolate_ , name ) , val ) ; <nl> + return * this ; <nl> + } <nl> + <nl> + ObjectTemplateBuilder & ObjectTemplateBuilder : : SetPropertyImpl ( <nl> + const base : : StringPiece & name , v8 : : Local < v8 : : FunctionTemplate > getter , <nl> + v8 : : Local < v8 : : FunctionTemplate > setter ) { <nl> + template_ - > SetAccessorProperty ( StringToSymbol ( isolate_ , name ) , getter , <nl> + setter ) ; <nl> + return * this ; <nl> + } <nl> + <nl> + ObjectTemplateBuilder & ObjectTemplateBuilder : : MakeDestroyable ( ) { <nl> + SetMethod ( " destroy " , base : : Bind ( internal : : Destroyable : : Destroy ) ) ; <nl> + SetMethod ( " isDestroyed " , base : : Bind ( internal : : Destroyable : : IsDestroyed ) ) ; <nl> + return * this ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : ObjectTemplate > ObjectTemplateBuilder : : Build ( ) { <nl> + v8 : : Local < v8 : : ObjectTemplate > result = template_ ; <nl> + template_ . Clear ( ) ; <nl> + return result ; <nl> + } <nl> + <nl> + } / / namespace mate <nl> new file mode 100644 <nl> index 000000000000 . . 533576f98630 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / object_template_builder . h <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # ifndef NATIVE_MATE_OBJECT_TEMPLATE_BUILDER_H_ <nl> + # define NATIVE_MATE_OBJECT_TEMPLATE_BUILDER_H_ <nl> + <nl> + # include " base / bind . h " <nl> + # include " base / callback . h " <nl> + # include " base / strings / string_piece . 
h " <nl> + # include " native_mate / converter . h " <nl> + # include " native_mate / function_template . h " <nl> + # include " v8 / include / v8 . h " <nl> + <nl> + namespace mate { <nl> + <nl> + namespace { <nl> + <nl> + / / Base template - used only for non - member function pointers . Other types <nl> + / / either go to one of the below specializations , or go here and fail to compile <nl> + / / because of base : : Bind ( ) . <nl> + template < typename T , typename Enable = void > <nl> + struct CallbackTraits { <nl> + static v8 : : Local < v8 : : FunctionTemplate > CreateTemplate ( <nl> + v8 : : Isolate * isolate , T callback ) { <nl> + return CreateFunctionTemplate ( isolate , base : : Bind ( callback ) ) ; <nl> + } <nl> + } ; <nl> + <nl> + / / Specialization for base : : Callback . <nl> + template < typename T > <nl> + struct CallbackTraits < base : : Callback < T > > { <nl> + static v8 : : Local < v8 : : FunctionTemplate > CreateTemplate ( <nl> + v8 : : Isolate * isolate , const base : : Callback < T > & callback ) { <nl> + return CreateFunctionTemplate ( isolate , callback ) ; <nl> + } <nl> + } ; <nl> + <nl> + / / Specialization for member function pointers . We need to handle this case <nl> + / / specially because the first parameter for callbacks to MFP should typically <nl> + / / come from the the JavaScript " this " object the function was called on , not <nl> + / / from the first normal parameter . <nl> + template < typename T > <nl> + struct CallbackTraits < T , typename std : : enable_if < <nl> + std : : is_member_function_pointer < T > : : value > : : type > { <nl> + static v8 : : Local < v8 : : FunctionTemplate > CreateTemplate ( <nl> + v8 : : Isolate * isolate , T callback ) { <nl> + int flags = HolderIsFirstArgument ; <nl> + return CreateFunctionTemplate ( isolate , base : : Bind ( callback ) , flags ) ; <nl> + } <nl> + } ; <nl> + <nl> + / / This specialization allows people to construct function templates directly if <nl> + / / they need to do fancier stuff . <nl> + template < > <nl> + struct CallbackTraits < v8 : : Local < v8 : : FunctionTemplate > > { <nl> + static v8 : : Local < v8 : : FunctionTemplate > CreateTemplate ( <nl> + v8 : : Local < v8 : : FunctionTemplate > templ ) { <nl> + return templ ; <nl> + } <nl> + } ; <nl> + <nl> + } / / namespace <nl> + <nl> + <nl> + / / ObjectTemplateBuilder provides a handy interface to creating <nl> + / / v8 : : ObjectTemplate instances with various sorts of properties . <nl> + class ObjectTemplateBuilder { <nl> + public : <nl> + explicit ObjectTemplateBuilder ( <nl> + v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : ObjectTemplate > templ ) ; <nl> + ~ ObjectTemplateBuilder ( ) ; <nl> + <nl> + / / It ' s against Google C + + style to return a non - const ref , but we take some <nl> + / / poetic license here in order that all calls to Set ( ) can be via the ' . ' <nl> + / / operator and line up nicely . <nl> + template < typename T > <nl> + ObjectTemplateBuilder & SetValue ( const base : : StringPiece & name , T val ) { <nl> + return SetImpl ( name , ConvertToV8 ( isolate_ , val ) ) ; <nl> + } <nl> + <nl> + / / In the following methods , T and U can be function pointer , member function <nl> + / / pointer , base : : Callback , or v8 : : FunctionTemplate . Most clients will want to <nl> + / / use one of the first two options . Also see mate : : CreateFunctionTemplate ( ) <nl> + / / for creating raw function templates . 
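A sketch of the builder in use, with illustrative names (Add is a free function, Counter::count a member function used as a getter); assumes an isolate and a freshly created v8::ObjectTemplate:

    v8::Local<v8::ObjectTemplate> templ =
        mate::ObjectTemplateBuilder(isolate, v8::ObjectTemplate::New(isolate))
            .SetValue("version", std::string("1.0"))
            .SetMethod("add", &Add)
            .SetProperty("count", &Counter::count)
            .Build();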
<nl> + template < typename T > <nl> + ObjectTemplateBuilder & SetMethod ( const base : : StringPiece & name , <nl> + T callback ) { <nl> + return SetImpl ( name , <nl> + CallbackTraits < T > : : CreateTemplate ( isolate_ , callback ) ) ; <nl> + } <nl> + template < typename T > <nl> + ObjectTemplateBuilder & SetProperty ( const base : : StringPiece & name , <nl> + T getter ) { <nl> + return SetPropertyImpl ( <nl> + name , <nl> + CallbackTraits < T > : : CreateTemplate ( isolate_ , getter ) , <nl> + v8 : : Local < v8 : : FunctionTemplate > ( ) ) ; <nl> + } <nl> + template < typename T , typename U > <nl> + ObjectTemplateBuilder & SetProperty ( const base : : StringPiece & name , <nl> + T getter , <nl> + U setter ) { <nl> + return SetPropertyImpl ( <nl> + name , <nl> + CallbackTraits < T > : : CreateTemplate ( isolate_ , getter ) , <nl> + CallbackTraits < U > : : CreateTemplate ( isolate_ , setter ) ) ; <nl> + } <nl> + <nl> + / / Add " destroy " and " isDestroyed " methods . <nl> + ObjectTemplateBuilder & MakeDestroyable ( ) ; <nl> + <nl> + v8 : : Local < v8 : : ObjectTemplate > Build ( ) ; <nl> + <nl> + private : <nl> + ObjectTemplateBuilder & SetImpl ( const base : : StringPiece & name , <nl> + v8 : : Local < v8 : : Data > val ) ; <nl> + ObjectTemplateBuilder & SetPropertyImpl ( <nl> + const base : : StringPiece & name , v8 : : Local < v8 : : FunctionTemplate > getter , <nl> + v8 : : Local < v8 : : FunctionTemplate > setter ) ; <nl> + <nl> + v8 : : Isolate * isolate_ ; <nl> + <nl> + / / ObjectTemplateBuilder should only be used on the stack . <nl> + v8 : : Local < v8 : : ObjectTemplate > template_ ; <nl> + } ; <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_OBJECT_TEMPLATE_BUILDER_H_ <nl> new file mode 100644 <nl> index 000000000000 . . fd68cdced275 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / persistent_dictionary . cc <nl> <nl> + / / Copyright 2014 Cheng Zhao . All rights reserved . <nl> + / / Use of this source code is governed by MIT license that can be found in the <nl> + / / LICENSE file . <nl> + <nl> + # include " native_mate / persistent_dictionary . h " <nl> + <nl> + namespace mate { <nl> + <nl> + PersistentDictionary : : PersistentDictionary ( ) { <nl> + } <nl> + <nl> + PersistentDictionary : : PersistentDictionary ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Object > object ) <nl> + : handle_ ( new RefCountedPersistent < v8 : : Object > ( isolate , object ) ) { <nl> + isolate_ = isolate ; <nl> + } <nl> + <nl> + PersistentDictionary : : ~ PersistentDictionary ( ) { <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Object > PersistentDictionary : : GetHandle ( ) const { <nl> + return handle_ - > NewHandle ( ) ; <nl> + } <nl> + <nl> + bool Converter < PersistentDictionary > : : FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + PersistentDictionary * out ) { <nl> + if ( ! val - > IsObject ( ) ) <nl> + return false ; <nl> + * out = PersistentDictionary ( isolate , v8 : : Local < v8 : : Object > : : Cast ( val ) ) ; <nl> + return true ; <nl> + } <nl> + <nl> + } / / namespace mate <nl> new file mode 100644 <nl> index 000000000000 . . 26c8998632bd <nl> mmm / dev / null <nl> ppp b / native_mate / mate / persistent_dictionary . h <nl> <nl> + / / Copyright 2014 Cheng Zhao . All rights reserved . <nl> + / / Use of this source code is governed by MIT license that can be found in the <nl> + / / LICENSE file . 
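A sketch of day-to-day Dictionary use (the PersistentDictionary declared below behaves identically but keeps the object in a persistent handle, so it may be stored on the heap):

    mate::Dictionary dict = mate::Dictionary::CreateEmpty(isolate);
    dict.Set("name", std::string("native_mate"));

    std::string name;
    if (dict.Get("name", &name)) {
      // name == "native_mate"; Get() returns false when the key is absent or
      // the value cannot be converted to the requested C++ type.
    }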
<nl> + <nl> + # ifndef NATIVE_MATE_PERSISTENT_DICTIONARY_H_ <nl> + # define NATIVE_MATE_PERSISTENT_DICTIONARY_H_ <nl> + <nl> + # include " native_mate / dictionary . h " <nl> + # include " native_mate / scoped_persistent . h " <nl> + <nl> + namespace mate { <nl> + <nl> + / / Like Dictionary , but stores object in persistent handle so you can keep it <nl> + / / safely on heap . <nl> + class PersistentDictionary : public Dictionary { <nl> + public : <nl> + PersistentDictionary ( ) ; <nl> + PersistentDictionary ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Object > object ) ; <nl> + virtual ~ PersistentDictionary ( ) ; <nl> + <nl> + v8 : : Local < v8 : : Object > GetHandle ( ) const override ; <nl> + <nl> + private : <nl> + scoped_refptr < RefCountedPersistent < v8 : : Object > > handle_ ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < PersistentDictionary > { <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + PersistentDictionary * out ) ; <nl> + } ; <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_PERSISTENT_DICTIONARY_H_ <nl> new file mode 100644 <nl> index 000000000000 . . 81b79441e72e <nl> mmm / dev / null <nl> ppp b / native_mate / mate / promise . cc <nl> <nl> + / / Copyright ( c ) 2018 GitHub , Inc . <nl> + / / Use of this source code is governed by the MIT license that can be <nl> + / / found in the LICENSE file . <nl> + <nl> + # include " native_mate / promise . h " <nl> + <nl> + namespace mate { <nl> + <nl> + Promise : : Promise ( ) <nl> + : isolate_ ( NULL ) { <nl> + } <nl> + <nl> + Promise : : Promise ( v8 : : Isolate * isolate ) <nl> + : isolate_ ( isolate ) { <nl> + resolver_ = v8 : : Promise : : Resolver : : New ( isolate ) ; <nl> + } <nl> + <nl> + Promise : : ~ Promise ( ) { <nl> + } <nl> + <nl> + Promise Promise : : Create ( v8 : : Isolate * isolate ) { <nl> + return Promise ( isolate ) ; <nl> + } <nl> + <nl> + Promise Promise : : Create ( ) { <nl> + return Promise : : Create ( v8 : : Isolate : : GetCurrent ( ) ) ; <nl> + } <nl> + <nl> + void Promise : : RejectWithErrorMessage ( const std : : string & string ) { <nl> + v8 : : Local < v8 : : String > error_message = <nl> + v8 : : String : : NewFromUtf8 ( isolate ( ) , string . c_str ( ) ) ; <nl> + v8 : : Local < v8 : : Value > error = v8 : : Exception : : Error ( error_message ) ; <nl> + resolver_ - > Reject ( mate : : ConvertToV8 ( isolate ( ) , error ) ) ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Object > Promise : : GetHandle ( ) const { <nl> + return resolver_ - > GetPromise ( ) ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Value > Converter < Promise > : : ToV8 ( v8 : : Isolate * isolate , <nl> + Promise val ) { <nl> + return val . GetHandle ( ) ; <nl> + } <nl> + <nl> + } / / namespace mate <nl> new file mode 100644 <nl> index 000000000000 . . 225ac6d048f9 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / promise . h <nl> <nl> + / / Copyright ( c ) 2018 GitHub , Inc . <nl> + / / Use of this source code is governed by the MIT license that can be <nl> + / / found in the LICENSE file . <nl> + <nl> + # ifndef NATIVE_MATE_PROMISE_H_ <nl> + # define NATIVE_MATE_PROMISE_H_ <nl> + <nl> + # include " native_mate / converter . 
h " <nl> + <nl> + namespace mate { <nl> + <nl> + class Promise { <nl> + public : <nl> + Promise ( ) ; <nl> + Promise ( v8 : : Isolate * isolate ) ; <nl> + virtual ~ Promise ( ) ; <nl> + <nl> + static Promise Create ( v8 : : Isolate * isolate ) ; <nl> + static Promise Create ( ) ; <nl> + <nl> + v8 : : Isolate * isolate ( ) const { return isolate_ ; } <nl> + <nl> + virtual v8 : : Local < v8 : : Object > GetHandle ( ) const ; <nl> + <nl> + template < typename T > <nl> + void Resolve ( T * value ) { <nl> + resolver_ - > Resolve ( mate : : ConvertToV8 ( isolate ( ) , value ) ) ; <nl> + } <nl> + <nl> + template < typename T > <nl> + void Reject ( T * value ) { <nl> + resolver_ - > Reject ( mate : : ConvertToV8 ( isolate ( ) , value ) ) ; <nl> + } <nl> + <nl> + void RejectWithErrorMessage ( const std : : string & error ) ; <nl> + <nl> + protected : <nl> + v8 : : Isolate * isolate_ ; <nl> + <nl> + private : <nl> + v8 : : Local < v8 : : Promise : : Resolver > resolver_ ; <nl> + } ; <nl> + <nl> + template < > <nl> + struct Converter < Promise > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + Promise val ) ; <nl> + / / TODO ( MarshallOfSound ) : Implement FromV8 to allow promise chaining <nl> + / / in native land <nl> + / / static bool FromV8 ( v8 : : Isolate * isolate , <nl> + / / v8 : : Local < v8 : : Value > val , <nl> + / / Promise * out ) ; <nl> + } ; <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_PROMISE_H_ <nl> new file mode 100644 <nl> index 000000000000 . . 5d9c8fff423a <nl> mmm / dev / null <nl> ppp b / native_mate / mate / scoped_persistent . h <nl> <nl> + / / Copyright 2014 Cheng Zhao . All rights reserved . <nl> + / / Use of this source code is governed by MIT license that can be found in the <nl> + / / LICENSE file . <nl> + <nl> + # ifndef NATIVE_MATE_SCOPED_PERSISTENT_H_ <nl> + # define NATIVE_MATE_SCOPED_PERSISTENT_H_ <nl> + <nl> + # include " base / memory / ref_counted . h " <nl> + # include " native_mate / converter . h " <nl> + # include " v8 / include / v8 . h " <nl> + <nl> + namespace mate { <nl> + <nl> + / / A v8 : : Persistent handle to a V8 value which destroys and clears the <nl> + / / underlying handle on destruction . <nl> + template < typename T > <nl> + class ScopedPersistent { <nl> + public : <nl> + ScopedPersistent ( ) : isolate_ ( v8 : : Isolate : : GetCurrent ( ) ) { } <nl> + <nl> + ScopedPersistent ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Value > handle ) <nl> + : isolate_ ( isolate ) { <nl> + reset ( isolate , v8 : : Local < T > : : Cast ( handle ) ) ; <nl> + } <nl> + <nl> + ~ ScopedPersistent ( ) { <nl> + reset ( ) ; <nl> + } <nl> + <nl> + void reset ( v8 : : Isolate * isolate , v8 : : Local < T > handle ) { <nl> + if ( ! handle . IsEmpty ( ) ) { <nl> + isolate_ = isolate ; <nl> + handle_ . Reset ( isolate , handle ) ; <nl> + } else { <nl> + reset ( ) ; <nl> + } <nl> + } <nl> + <nl> + void reset ( ) { <nl> + handle_ . Reset ( ) ; <nl> + } <nl> + <nl> + bool IsEmpty ( ) const { <nl> + return handle_ . IsEmpty ( ) ; <nl> + } <nl> + <nl> + v8 : : Local < T > NewHandle ( ) const { <nl> + return NewHandle ( isolate_ ) ; <nl> + } <nl> + <nl> + v8 : : Local < T > NewHandle ( v8 : : Isolate * isolate ) const { <nl> + if ( handle_ . IsEmpty ( ) ) <nl> + return v8 : : Local < T > ( ) ; <nl> + return v8 : : Local < T > : : New ( isolate , handle_ ) ; <nl> + } <nl> + <nl> + template < typename P , typename C > <nl> + void SetWeak ( P * parameter , C callback ) { <nl> + handle_ . 
SetWeak ( parameter , callback ) ; <nl> + } <nl> + <nl> + v8 : : Isolate * isolate ( ) const { return isolate_ ; } <nl> + <nl> + private : <nl> + v8 : : Isolate * isolate_ ; <nl> + v8 : : Persistent < T > handle_ ; <nl> + <nl> + DISALLOW_COPY_AND_ASSIGN ( ScopedPersistent ) ; <nl> + } ; <nl> + <nl> + template < typename T > <nl> + class RefCountedPersistent : public ScopedPersistent < T > , <nl> + public base : : RefCounted < RefCountedPersistent < T > > { <nl> + public : <nl> + RefCountedPersistent ( ) { } <nl> + <nl> + RefCountedPersistent ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Value > handle ) <nl> + : ScopedPersistent < T > ( isolate , handle ) { <nl> + } <nl> + <nl> + protected : <nl> + friend class base : : RefCounted < RefCountedPersistent < T > > ; <nl> + <nl> + ~ RefCountedPersistent ( ) { } <nl> + <nl> + private : <nl> + DISALLOW_COPY_AND_ASSIGN ( RefCountedPersistent ) ; <nl> + } ; <nl> + <nl> + template < typename T > <nl> + struct Converter < ScopedPersistent < T > > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , <nl> + const ScopedPersistent < T > & val ) { <nl> + return val . NewHandle ( isolate ) ; <nl> + } <nl> + <nl> + static bool FromV8 ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Value > val , <nl> + ScopedPersistent < T > * out ) { <nl> + v8 : : Local < T > converted ; <nl> + if ( ! Converter < v8 : : Local < T > > : : FromV8 ( isolate , val , & converted ) ) <nl> + return false ; <nl> + <nl> + out - > reset ( isolate , converted ) ; <nl> + return true ; <nl> + } <nl> + } ; <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_SCOPED_PERSISTENT_H_ <nl> new file mode 100644 <nl> index 000000000000 . . 3d5ce44ac0a8 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / wrappable . cc <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # include " native_mate / wrappable . h " <nl> + <nl> + # include " base / logging . h " <nl> + # include " native_mate / dictionary . h " <nl> + # include " native_mate / object_template_builder . h " <nl> + <nl> + namespace mate { <nl> + <nl> + WrappableBase : : WrappableBase ( ) : isolate_ ( nullptr ) { } <nl> + <nl> + WrappableBase : : ~ WrappableBase ( ) { <nl> + if ( wrapper_ . IsEmpty ( ) ) <nl> + return ; <nl> + <nl> + GetWrapper ( ) - > SetAlignedPointerInInternalField ( 0 , nullptr ) ; <nl> + wrapper_ . ClearWeak ( ) ; <nl> + wrapper_ . Reset ( ) ; <nl> + } <nl> + <nl> + v8 : : Local < v8 : : Object > WrappableBase : : GetWrapper ( ) const { <nl> + if ( ! wrapper_ . IsEmpty ( ) ) <nl> + return v8 : : Local < v8 : : Object > : : New ( isolate_ , wrapper_ ) ; <nl> + else <nl> + return v8 : : Local < v8 : : Object > ( ) ; <nl> + } <nl> + <nl> + void WrappableBase : : InitWith ( v8 : : Isolate * isolate , <nl> + v8 : : Local < v8 : : Object > wrapper ) { <nl> + CHECK ( wrapper_ . IsEmpty ( ) ) ; <nl> + isolate_ = isolate ; <nl> + wrapper - > SetAlignedPointerInInternalField ( 0 , this ) ; <nl> + wrapper_ . Reset ( isolate , wrapper ) ; <nl> + wrapper_ . SetWeak ( this , FirstWeakCallback , v8 : : WeakCallbackType : : kParameter ) ; <nl> + <nl> + / / Call object . _init if we have one . <nl> + v8 : : Local < v8 : : Function > init ; <nl> + if ( Dictionary ( isolate , wrapper ) . 
Get ( " _init " , & init ) ) <nl> + init - > Call ( wrapper , 0 , nullptr ) ; <nl> + <nl> + AfterInit ( isolate ) ; <nl> + } <nl> + <nl> + / / static <nl> + void WrappableBase : : FirstWeakCallback ( <nl> + const v8 : : WeakCallbackInfo < WrappableBase > & data ) { <nl> + WrappableBase * wrappable = data . GetParameter ( ) ; <nl> + wrappable - > wrapper_ . Reset ( ) ; <nl> + data . SetSecondPassCallback ( SecondWeakCallback ) ; <nl> + } <nl> + <nl> + / / static <nl> + void WrappableBase : : SecondWeakCallback ( <nl> + const v8 : : WeakCallbackInfo < WrappableBase > & data ) { <nl> + WrappableBase * wrappable = data . GetParameter ( ) ; <nl> + delete wrappable ; <nl> + } <nl> + <nl> + namespace internal { <nl> + <nl> + void * FromV8Impl ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Value > val ) { <nl> + if ( ! val - > IsObject ( ) ) <nl> + return nullptr ; <nl> + v8 : : Local < v8 : : Object > obj = v8 : : Local < v8 : : Object > : : Cast ( val ) ; <nl> + if ( obj - > InternalFieldCount ( ) ! = 1 ) <nl> + return nullptr ; <nl> + return obj - > GetAlignedPointerFromInternalField ( 0 ) ; <nl> + } <nl> + <nl> + } / / namespace internal <nl> + <nl> + } / / namespace mate <nl> new file mode 100644 <nl> index 000000000000 . . 489c5817481b <nl> mmm / dev / null <nl> ppp b / native_mate / mate / wrappable . h <nl> <nl> + / / Copyright 2013 The Chromium Authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE . chromium file . <nl> + <nl> + # ifndef NATIVE_MATE_WRAPPABLE_H_ <nl> + # define NATIVE_MATE_WRAPPABLE_H_ <nl> + <nl> + # include " base / bind . h " <nl> + # include " native_mate / converter . h " <nl> + # include " native_mate / constructor . h " <nl> + # include " gin / per_isolate_data . h " <nl> + <nl> + namespace mate { <nl> + <nl> + namespace internal { <nl> + <nl> + void * FromV8Impl ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Value > val ) ; <nl> + <nl> + } / / namespace internal <nl> + <nl> + template < typename T > <nl> + class Wrappable : public WrappableBase { <nl> + public : <nl> + Wrappable ( ) { } <nl> + <nl> + template < typename Sig > <nl> + static void SetConstructor ( v8 : : Isolate * isolate , <nl> + const base : : Callback < Sig > & constructor ) { <nl> + v8 : : Local < v8 : : FunctionTemplate > templ = CreateFunctionTemplate ( <nl> + isolate , base : : Bind ( & internal : : InvokeNew < Sig > , constructor ) ) ; <nl> + templ - > InstanceTemplate ( ) - > SetInternalFieldCount ( 1 ) ; <nl> + T : : BuildPrototype ( isolate , templ ) ; <nl> + gin : : PerIsolateData : : From ( isolate ) - > SetFunctionTemplate ( <nl> + & kWrapperInfo , templ ) ; <nl> + } <nl> + <nl> + static v8 : : Local < v8 : : FunctionTemplate > GetConstructor ( v8 : : Isolate * isolate ) { <nl> + / / Fill the object template . <nl> + auto data = gin : : PerIsolateData : : From ( isolate ) ; <nl> + auto templ = data - > GetFunctionTemplate ( & kWrapperInfo ) ; <nl> + if ( templ . IsEmpty ( ) ) { <nl> + templ = v8 : : FunctionTemplate : : New ( isolate ) ; <nl> + templ - > InstanceTemplate ( ) - > SetInternalFieldCount ( 1 ) ; <nl> + T : : BuildPrototype ( isolate , templ ) ; <nl> + data - > SetFunctionTemplate ( & kWrapperInfo , templ ) ; <nl> + } <nl> + return templ ; <nl> + } <nl> + <nl> + protected : <nl> + / / Init the class with T : : BuildPrototype . 
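A hypothetical subclass illustrating the BuildPrototype hook that SetConstructor()/GetConstructor() invoke (Counter and its members are made up for the sketch):

    class Counter : public mate::Wrappable<Counter> {
     public:
      static void BuildPrototype(v8::Isolate* isolate,
                                 v8::Local<v8::FunctionTemplate> prototype) {
        mate::ObjectTemplateBuilder(isolate, prototype->PrototypeTemplate())
            .SetMethod("increment", &Counter::Increment);
      }

     private:
      void Increment() { ++count_; }
      int count_ = 0;
    };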
<nl> + void Init ( v8 : : Isolate * isolate ) { <nl> + v8 : : Local < v8 : : FunctionTemplate > templ = GetConstructor ( isolate ) ; <nl> + <nl> + / / | wrapper | may be empty in some extreme cases , e . g . , when <nl> + / / Object . prototype . constructor is overwritten . <nl> + v8 : : Local < v8 : : Object > wrapper ; <nl> + if ( ! templ - > InstanceTemplate ( ) - > NewInstance ( <nl> + isolate - > GetCurrentContext ( ) ) . ToLocal ( & wrapper ) ) { <nl> + / / The current wrappable object will be no longer managed by V8 . Delete <nl> + / / this now . <nl> + delete this ; <nl> + return ; <nl> + } <nl> + InitWith ( isolate , wrapper ) ; <nl> + } <nl> + <nl> + private : <nl> + static gin : : WrapperInfo kWrapperInfo ; <nl> + <nl> + DISALLOW_COPY_AND_ASSIGN ( Wrappable ) ; <nl> + } ; <nl> + <nl> + / / static <nl> + template < typename T > <nl> + gin : : WrapperInfo Wrappable < T > : : kWrapperInfo = { gin : : kEmbedderNativeGin } ; <nl> + <nl> + / / This converter handles any subclass of Wrappable . <nl> + template < typename T > <nl> + struct Converter < T * , <nl> + typename std : : enable_if < <nl> + std : : is_convertible < T * , WrappableBase * > : : value > : : type > { <nl> + static v8 : : Local < v8 : : Value > ToV8 ( v8 : : Isolate * isolate , T * val ) { <nl> + if ( val ) <nl> + return val - > GetWrapper ( ) ; <nl> + else <nl> + return v8 : : Null ( isolate ) ; <nl> + } <nl> + <nl> + static bool FromV8 ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Value > val , T * * out ) { <nl> + * out = static_cast < T * > ( static_cast < WrappableBase * > ( <nl> + internal : : FromV8Impl ( isolate , val ) ) ) ; <nl> + return * out ! = nullptr ; <nl> + } <nl> + } ; <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_WRAPPABLE_H_ <nl> new file mode 100644 <nl> index 000000000000 . . 1c489cc37af3 <nl> mmm / dev / null <nl> ppp b / native_mate / mate / wrappable_base . h <nl> <nl> + # ifndef NATIVE_MATE_WRAPPABLE_BASE_H_ <nl> + # define NATIVE_MATE_WRAPPABLE_BASE_H_ <nl> + <nl> + namespace mate { <nl> + <nl> + namespace internal { <nl> + struct Destroyable ; <nl> + } <nl> + <nl> + / / Wrappable is a base class for C + + objects that have corresponding v8 wrapper <nl> + / / objects . To retain a Wrappable object on the stack , use a gin : : Handle . <nl> + / / <nl> + / / USAGE : <nl> + / / / / my_class . h <nl> + / / class MyClass : Wrappable < MyClass > { <nl> + / / public : <nl> + / / . . . <nl> + / / } ; <nl> + / / <nl> + / / Subclasses should also typically have private constructors and expose a <nl> + / / static Create function that returns a mate : : Handle . Forcing creators through <nl> + / / this static Create function will enforce that clients actually create a <nl> + / / wrapper for the object . If clients fail to create a wrapper for a wrappable <nl> + / / object , the object will leak because we use the weak callback from the <nl> + / / wrapper as the signal to delete the wrapped object . <nl> + class WrappableBase { <nl> + public : <nl> + WrappableBase ( ) ; <nl> + virtual ~ WrappableBase ( ) ; <nl> + <nl> + / / Retrieve the v8 wrapper object cooresponding to this object . <nl> + v8 : : Local < v8 : : Object > GetWrapper ( ) const ; <nl> + <nl> + / / Returns the Isolate this object is created in . <nl> + v8 : : Isolate * isolate ( ) const { return isolate_ ; } <nl> + <nl> + protected : <nl> + / / Called after the " _init " method gets called in JavaScript . 
<nl> + virtual void AfterInit ( v8 : : Isolate * isolate ) { } <nl> + <nl> + / / Bind the C + + class to the JS wrapper . <nl> + / / This method should only be called by classes using Constructor . <nl> + virtual void InitWith ( v8 : : Isolate * isolate , v8 : : Local < v8 : : Object > wrapper ) ; <nl> + <nl> + private : <nl> + friend struct internal : : Destroyable ; <nl> + <nl> + static void FirstWeakCallback ( <nl> + const v8 : : WeakCallbackInfo < WrappableBase > & data ) ; <nl> + static void SecondWeakCallback ( <nl> + const v8 : : WeakCallbackInfo < WrappableBase > & data ) ; <nl> + <nl> + v8 : : Isolate * isolate_ ; <nl> + v8 : : Global < v8 : : Object > wrapper_ ; / / Weak <nl> + <nl> + DISALLOW_COPY_AND_ASSIGN ( WrappableBase ) ; <nl> + } ; <nl> + <nl> + } / / namespace mate <nl> + <nl> + # endif / / NATIVE_MATE_WRAPPABLE_BASE_H_ <nl> new file mode 100644 <nl> index 000000000000 . . 6756a5ad1d08 <nl> mmm / dev / null <nl> ppp b / native_mate / native_mate_files . gypi <nl> <nl> + { <nl> + ' variables ' : { <nl> + ' native_mate_files ' : [ <nl> + ' mate / arguments . cc ' , <nl> + ' mate / arguments . h ' , <nl> + ' mate / constructor . h ' , <nl> + ' mate / converter . cc ' , <nl> + ' mate / converter . h ' , <nl> + ' mate / dictionary . cc ' , <nl> + ' mate / dictionary . h ' , <nl> + ' mate / function_template . cc ' , <nl> + ' mate / function_template . h ' , <nl> + ' mate / handle . h ' , <nl> + ' mate / object_template_builder . cc ' , <nl> + ' mate / object_template_builder . h ' , <nl> + ' mate / persistent_dictionary . cc ' , <nl> + ' mate / persistent_dictionary . h ' , <nl> + ' mate / scoped_persistent . h ' , <nl> + ' mate / wrappable . cc ' , <nl> + ' mate / wrappable . h ' , <nl> + ' mate / wrappable_base . h ' , <nl> + ' mate / promise . h ' , <nl> + ' mate / promise . cc ' , <nl> + ] , <nl> + } , <nl> + } <nl> new file mode 100755 <nl> index 000000000000 . . 5efb653c207d <nl> mmm / dev / null <nl> ppp b / native_mate / script / pump . py <nl> <nl> + # ! / usr / bin / env python <nl> + # <nl> + # Copyright 2008 , Google Inc . <nl> + # All rights reserved . <nl> + # <nl> + # Redistribution and use in source and binary forms , with or without <nl> + # modification , are permitted provided that the following conditions are <nl> + # met : <nl> + # <nl> + # * Redistributions of source code must retain the above copyright <nl> + # notice , this list of conditions and the following disclaimer . <nl> + # * Redistributions in binary form must reproduce the above <nl> + # copyright notice , this list of conditions and the following disclaimer <nl> + # in the documentation and / or other materials provided with the <nl> + # distribution . <nl> + # * Neither the name of Google Inc . nor the names of its <nl> + # contributors may be used to endorse or promote products derived from <nl> + # this software without specific prior written permission . <nl> + # <nl> + # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS <nl> + # " AS IS " AND ANY EXPRESS OR IMPLIED WARRANTIES , INCLUDING , BUT NOT <nl> + # LIMITED TO , THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR <nl> + # A PARTICULAR PURPOSE ARE DISCLAIMED . 
IN NO EVENT SHALL THE COPYRIGHT <nl> + # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT , INDIRECT , INCIDENTAL , <nl> + # SPECIAL , EXEMPLARY , OR CONSEQUENTIAL DAMAGES ( INCLUDING , BUT NOT <nl> + # LIMITED TO , PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES ; LOSS OF USE , <nl> + # DATA , OR PROFITS ; OR BUSINESS INTERRUPTION ) HOWEVER CAUSED AND ON ANY <nl> + # THEORY OF LIABILITY , WHETHER IN CONTRACT , STRICT LIABILITY , OR TORT <nl> + # ( INCLUDING NEGLIGENCE OR OTHERWISE ) ARISING IN ANY WAY OUT OF THE USE <nl> + # OF THIS SOFTWARE , EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE . <nl> + <nl> + " " " pump v0 . 2 . 0 - Pretty Useful for Meta Programming . <nl> + <nl> + A tool for preprocessor meta programming . Useful for generating <nl> + repetitive boilerplate code . Especially useful for writing C + + <nl> + classes , functions , macros , and templates that need to work with <nl> + various number of arguments . <nl> + <nl> + USAGE : <nl> + pump . py SOURCE_FILE <nl> + <nl> + EXAMPLES : <nl> + pump . py foo . cc . pump <nl> + Converts foo . cc . pump to foo . cc . <nl> + <nl> + GRAMMAR : <nl> + CODE : : = ATOMIC_CODE * <nl> + ATOMIC_CODE : : = $ var ID = EXPRESSION <nl> + | $ var ID = [ [ CODE ] ] <nl> + | $ range ID EXPRESSION . . EXPRESSION <nl> + | $ for ID SEPARATOR [ [ CODE ] ] <nl> + | $ ( $ ) <nl> + | $ ID <nl> + | $ ( EXPRESSION ) <nl> + | $ if EXPRESSION [ [ CODE ] ] ELSE_BRANCH <nl> + | [ [ CODE ] ] <nl> + | RAW_CODE <nl> + SEPARATOR : : = RAW_CODE | EMPTY <nl> + ELSE_BRANCH : : = $ else [ [ CODE ] ] <nl> + | $ elif EXPRESSION [ [ CODE ] ] ELSE_BRANCH <nl> + | EMPTY <nl> + EXPRESSION has Python syntax . <nl> + " " " <nl> + <nl> + __author__ = ' wan @ google . com ( Zhanyong Wan ) ' <nl> + <nl> + import os <nl> + import re <nl> + import sys <nl> + <nl> + <nl> + TOKEN_TABLE = [ <nl> + ( re . compile ( r ' \ $ var \ s + ' ) , ' $ var ' ) , <nl> + ( re . compile ( r ' \ $ elif \ s + ' ) , ' $ elif ' ) , <nl> + ( re . compile ( r ' \ $ else \ s + ' ) , ' $ else ' ) , <nl> + ( re . compile ( r ' \ $ for \ s + ' ) , ' $ for ' ) , <nl> + ( re . compile ( r ' \ $ if \ s + ' ) , ' $ if ' ) , <nl> + ( re . compile ( r ' \ $ range \ s + ' ) , ' $ range ' ) , <nl> + ( re . compile ( r ' \ $ [ _A - Za - z ] \ w * ' ) , ' $ id ' ) , <nl> + ( re . compile ( r ' \ $ \ ( \ $ \ ) ' ) , ' $ ( $ ) ' ) , <nl> + ( re . compile ( r ' \ $ ' ) , ' $ ' ) , <nl> + ( re . compile ( r ' \ [ \ [ \ n ? ' ) , ' [ [ ' ) , <nl> + ( re . compile ( r ' \ ] \ ] \ n ? ' ) , ' ] ] ' ) , <nl> + ] <nl> + <nl> + <nl> + class Cursor : <nl> + " " " Represents a position ( line and column ) in a text file . " " " <nl> + <nl> + def __init__ ( self , line = - 1 , column = - 1 ) : <nl> + self . line = line <nl> + self . column = column <nl> + <nl> + def __eq__ ( self , rhs ) : <nl> + return self . line = = rhs . line and self . column = = rhs . column <nl> + <nl> + def __ne__ ( self , rhs ) : <nl> + return not self = = rhs <nl> + <nl> + def __lt__ ( self , rhs ) : <nl> + return self . line < rhs . line or ( <nl> + self . line = = rhs . line and self . column < rhs . column ) <nl> + <nl> + def __le__ ( self , rhs ) : <nl> + return self < rhs or self = = rhs <nl> + <nl> + def __gt__ ( self , rhs ) : <nl> + return rhs < self <nl> + <nl> + def __ge__ ( self , rhs ) : <nl> + return rhs < = self <nl> + <nl> + def __str__ ( self ) : <nl> + if self = = Eof ( ) : <nl> + return ' EOF ' <nl> + else : <nl> + return ' % s ( % s ) ' % ( self . line + 1 , self . 
column ) <nl> + <nl> + def __add__ ( self , offset ) : <nl> + return Cursor ( self . line , self . column + offset ) <nl> + <nl> + def __sub__ ( self , offset ) : <nl> + return Cursor ( self . line , self . column - offset ) <nl> + <nl> + def Clone ( self ) : <nl> + " " " Returns a copy of self . " " " <nl> + <nl> + return Cursor ( self . line , self . column ) <nl> + <nl> + <nl> + # Special cursor to indicate the end - of - file . <nl> + def Eof ( ) : <nl> + " " " Returns the special cursor to denote the end - of - file . " " " <nl> + return Cursor ( - 1 , - 1 ) <nl> + <nl> + <nl> + class Token : <nl> + " " " Represents a token in a Pump source file . " " " <nl> + <nl> + def __init__ ( self , start = None , end = None , value = None , token_type = None ) : <nl> + if start is None : <nl> + self . start = Eof ( ) <nl> + else : <nl> + self . start = start <nl> + if end is None : <nl> + self . end = Eof ( ) <nl> + else : <nl> + self . end = end <nl> + self . value = value <nl> + self . token_type = token_type <nl> + <nl> + def __str__ ( self ) : <nl> + return ' Token @ % s : \ ' % s \ ' type = % s ' % ( <nl> + self . start , self . value , self . token_type ) <nl> + <nl> + def Clone ( self ) : <nl> + " " " Returns a copy of self . " " " <nl> + <nl> + return Token ( self . start . Clone ( ) , self . end . Clone ( ) , self . value , <nl> + self . token_type ) <nl> + <nl> + <nl> + def StartsWith ( lines , pos , string ) : <nl> + " " " Returns True iff the given position in lines starts with ' string ' . " " " <nl> + <nl> + return lines [ pos . line ] [ pos . column : ] . startswith ( string ) <nl> + <nl> + <nl> + def FindFirstInLine ( line , token_table ) : <nl> + best_match_start = - 1 <nl> + for ( regex , token_type ) in token_table : <nl> + m = regex . search ( line ) <nl> + if m : <nl> + # We found regex in lines <nl> + if best_match_start < 0 or m . start ( ) < best_match_start : <nl> + best_match_start = m . start ( ) <nl> + best_match_length = m . end ( ) - m . start ( ) <nl> + best_match_token_type = token_type <nl> + <nl> + if best_match_start < 0 : <nl> + return None <nl> + <nl> + return ( best_match_start , best_match_length , best_match_token_type ) <nl> + <nl> + <nl> + def FindFirst ( lines , token_table , cursor ) : <nl> + " " " Finds the first occurrence of any string in strings in lines . " " " <nl> + <nl> + start = cursor . Clone ( ) <nl> + cur_line_number = cursor . line <nl> + for line in lines [ start . line : ] : <nl> + if cur_line_number = = start . line : <nl> + line = line [ start . column : ] <nl> + m = FindFirstInLine ( line , token_table ) <nl> + if m : <nl> + # We found a regex in line . <nl> + ( start_column , length , token_type ) = m <nl> + if cur_line_number = = start . line : <nl> + start_column + = start . column <nl> + found_start = Cursor ( cur_line_number , start_column ) <nl> + found_end = found_start + length <nl> + return MakeToken ( lines , found_start , found_end , token_type ) <nl> + cur_line_number + = 1 <nl> + # We failed to find str in lines <nl> + return None <nl> + <nl> + <nl> + def SubString ( lines , start , end ) : <nl> + " " " Returns a substring in lines . " " " <nl> + <nl> + if end = = Eof ( ) : <nl> + end = Cursor ( len ( lines ) - 1 , len ( lines [ - 1 ] ) ) <nl> + <nl> + if start > = end : <nl> + return ' ' <nl> + <nl> + if start . line = = end . line : <nl> + return lines [ start . line ] [ start . column : end . column ] <nl> + <nl> + result_lines = ( [ lines [ start . line ] [ start . column : ] ] + <nl> + lines [ start . 
line + 1 : end . line ] + <nl> + [ lines [ end . line ] [ : end . column ] ] ) <nl> + return ' ' . join ( result_lines ) <nl> + <nl> + <nl> + def StripMetaComments ( str ) : <nl> + " " " Strip meta comments from each line in the given string . " " " <nl> + <nl> + # First , completely remove lines containing nothing but a meta <nl> + # comment , including the trailing \ n . <nl> + str = re . sub ( r ' ^ \ s * \ $ \ $ . * \ n ' , ' ' , str ) <nl> + <nl> + # Then , remove meta comments from contentful lines . <nl> + return re . sub ( r ' \ s * \ $ \ $ . * ' , ' ' , str ) <nl> + <nl> + <nl> + def MakeToken ( lines , start , end , token_type ) : <nl> + " " " Creates a new instance of Token . " " " <nl> + <nl> + return Token ( start , end , SubString ( lines , start , end ) , token_type ) <nl> + <nl> + <nl> + def ParseToken ( lines , pos , regex , token_type ) : <nl> + line = lines [ pos . line ] [ pos . column : ] <nl> + m = regex . search ( line ) <nl> + if m and not m . start ( ) : <nl> + return MakeToken ( lines , pos , pos + m . end ( ) , token_type ) <nl> + else : <nl> + print ' ERROR : % s expected at % s . ' % ( token_type , pos ) <nl> + sys . exit ( 1 ) <nl> + <nl> + <nl> + ID_REGEX = re . compile ( r ' [ _A - Za - z ] \ w * ' ) <nl> + EQ_REGEX = re . compile ( r ' = ' ) <nl> + REST_OF_LINE_REGEX = re . compile ( r ' . * ? ( ? = $ | \ $ \ $ ) ' ) <nl> + OPTIONAL_WHITE_SPACES_REGEX = re . compile ( r ' \ s * ' ) <nl> + WHITE_SPACE_REGEX = re . compile ( r ' \ s ' ) <nl> + DOT_DOT_REGEX = re . compile ( r ' \ . \ . ' ) <nl> + <nl> + <nl> + def Skip ( lines , pos , regex ) : <nl> + line = lines [ pos . line ] [ pos . column : ] <nl> + m = re . search ( regex , line ) <nl> + if m and not m . start ( ) : <nl> + return pos + m . end ( ) <nl> + else : <nl> + return pos <nl> + <nl> + <nl> + def SkipUntil ( lines , pos , regex , token_type ) : <nl> + line = lines [ pos . line ] [ pos . column : ] <nl> + m = re . search ( regex , line ) <nl> + if m : <nl> + return pos + m . start ( ) <nl> + else : <nl> + print ( ' ERROR : % s expected on line % s after column % s . ' % <nl> + ( token_type , pos . line + 1 , pos . column ) ) <nl> + sys . exit ( 1 ) <nl> + <nl> + <nl> + def ParseExpTokenInParens ( lines , pos ) : <nl> + def ParseInParens ( pos ) : <nl> + pos = Skip ( lines , pos , OPTIONAL_WHITE_SPACES_REGEX ) <nl> + pos = Skip ( lines , pos , r ' \ ( ' ) <nl> + pos = Parse ( pos ) <nl> + pos = Skip ( lines , pos , r ' \ ) ' ) <nl> + return pos <nl> + <nl> + def Parse ( pos ) : <nl> + pos = SkipUntil ( lines , pos , r ' \ ( | \ ) ' , ' ) ' ) <nl> + if SubString ( lines , pos , pos + 1 ) = = ' ( ' : <nl> + pos = Parse ( pos + 1 ) <nl> + pos = Skip ( lines , pos , r ' \ ) ' ) <nl> + return Parse ( pos ) <nl> + else : <nl> + return pos <nl> + <nl> + start = pos . Clone ( ) <nl> + pos = ParseInParens ( pos ) <nl> + return MakeToken ( lines , start , pos , ' exp ' ) <nl> + <nl> + <nl> + def RStripNewLineFromToken ( token ) : <nl> + if token . value . endswith ( ' \ n ' ) : <nl> + return Token ( token . start , token . end , token . value [ : - 1 ] , token . token_type ) <nl> + else : <nl> + return token <nl> + <nl> + <nl> + def TokenizeLines ( lines , pos ) : <nl> + while True : <nl> + found = FindFirst ( lines , TOKEN_TABLE , pos ) <nl> + if not found : <nl> + yield MakeToken ( lines , pos , Eof ( ) , ' code ' ) <nl> + return <nl> + <nl> + if found . start = = pos : <nl> + prev_token = None <nl> + prev_token_rstripped = None <nl> + else : <nl> + prev_token = MakeToken ( lines , pos , found . 
start , ' code ' ) <nl> + prev_token_rstripped = RStripNewLineFromToken ( prev_token ) <nl> + <nl> + if found . token_type = = ' $ var ' : <nl> + if prev_token_rstripped : <nl> + yield prev_token_rstripped <nl> + yield found <nl> + id_token = ParseToken ( lines , found . end , ID_REGEX , ' id ' ) <nl> + yield id_token <nl> + pos = Skip ( lines , id_token . end , OPTIONAL_WHITE_SPACES_REGEX ) <nl> + <nl> + eq_token = ParseToken ( lines , pos , EQ_REGEX , ' = ' ) <nl> + yield eq_token <nl> + pos = Skip ( lines , eq_token . end , r ' \ s * ' ) <nl> + <nl> + if SubString ( lines , pos , pos + 2 ) ! = ' [ [ ' : <nl> + exp_token = ParseToken ( lines , pos , REST_OF_LINE_REGEX , ' exp ' ) <nl> + yield exp_token <nl> + pos = Cursor ( exp_token . end . line + 1 , 0 ) <nl> + elif found . token_type = = ' $ for ' : <nl> + if prev_token_rstripped : <nl> + yield prev_token_rstripped <nl> + yield found <nl> + id_token = ParseToken ( lines , found . end , ID_REGEX , ' id ' ) <nl> + yield id_token <nl> + pos = Skip ( lines , id_token . end , WHITE_SPACE_REGEX ) <nl> + elif found . token_type = = ' $ range ' : <nl> + if prev_token_rstripped : <nl> + yield prev_token_rstripped <nl> + yield found <nl> + id_token = ParseToken ( lines , found . end , ID_REGEX , ' id ' ) <nl> + yield id_token <nl> + pos = Skip ( lines , id_token . end , OPTIONAL_WHITE_SPACES_REGEX ) <nl> + <nl> + dots_pos = SkipUntil ( lines , pos , DOT_DOT_REGEX , ' . . ' ) <nl> + yield MakeToken ( lines , pos , dots_pos , ' exp ' ) <nl> + yield MakeToken ( lines , dots_pos , dots_pos + 2 , ' . . ' ) <nl> + pos = dots_pos + 2 <nl> + new_pos = Cursor ( pos . line + 1 , 0 ) <nl> + yield MakeToken ( lines , pos , new_pos , ' exp ' ) <nl> + pos = new_pos <nl> + elif found . token_type = = ' $ ' : <nl> + if prev_token : <nl> + yield prev_token <nl> + yield found <nl> + exp_token = ParseExpTokenInParens ( lines , found . end ) <nl> + yield exp_token <nl> + pos = exp_token . end <nl> + elif ( found . token_type = = ' ] ] ' or found . token_type = = ' $ if ' or <nl> + found . token_type = = ' $ elif ' or found . token_type = = ' $ else ' ) : <nl> + if prev_token_rstripped : <nl> + yield prev_token_rstripped <nl> + yield found <nl> + pos = found . end <nl> + else : <nl> + if prev_token : <nl> + yield prev_token <nl> + yield found <nl> + pos = found . end <nl> + <nl> + <nl> + def Tokenize ( s ) : <nl> + " " " A generator that yields the tokens in the given string . " " " <nl> + if s ! = ' ' : <nl> + lines = s . splitlines ( True ) <nl> + for token in TokenizeLines ( lines , Cursor ( 0 , 0 ) ) : <nl> + yield token <nl> + <nl> + <nl> + class CodeNode : <nl> + def __init__ ( self , atomic_code_list = None ) : <nl> + self . atomic_code = atomic_code_list <nl> + <nl> + <nl> + class VarNode : <nl> + def __init__ ( self , identifier = None , atomic_code = None ) : <nl> + self . identifier = identifier <nl> + self . atomic_code = atomic_code <nl> + <nl> + <nl> + class RangeNode : <nl> + def __init__ ( self , identifier = None , exp1 = None , exp2 = None ) : <nl> + self . identifier = identifier <nl> + self . exp1 = exp1 <nl> + self . exp2 = exp2 <nl> + <nl> + <nl> + class ForNode : <nl> + def __init__ ( self , identifier = None , sep = None , code = None ) : <nl> + self . identifier = identifier <nl> + self . sep = sep <nl> + self . code = code <nl> + <nl> + <nl> + class ElseNode : <nl> + def __init__ ( self , else_branch = None ) : <nl> + self . 
else_branch = else_branch <nl> + <nl> + <nl> + class IfNode : <nl> + def __init__ ( self , exp = None , then_branch = None , else_branch = None ) : <nl> + self . exp = exp <nl> + self . then_branch = then_branch <nl> + self . else_branch = else_branch <nl> + <nl> + <nl> + class RawCodeNode : <nl> + def __init__ ( self , token = None ) : <nl> + self . raw_code = token <nl> + <nl> + <nl> + class LiteralDollarNode : <nl> + def __init__ ( self , token ) : <nl> + self . token = token <nl> + <nl> + <nl> + class ExpNode : <nl> + def __init__ ( self , token , python_exp ) : <nl> + self . token = token <nl> + self . python_exp = python_exp <nl> + <nl> + <nl> + def PopFront ( a_list ) : <nl> + head = a_list [ 0 ] <nl> + a_list [ : 1 ] = [ ] <nl> + return head <nl> + <nl> + <nl> + def PushFront ( a_list , elem ) : <nl> + a_list [ : 0 ] = [ elem ] <nl> + <nl> + <nl> + def PopToken ( a_list , token_type = None ) : <nl> + token = PopFront ( a_list ) <nl> + if token_type is not None and token . token_type ! = token_type : <nl> + print ' ERROR : % s expected at % s ' % ( token_type , token . start ) <nl> + print ' ERROR : % s found instead ' % ( token , ) <nl> + sys . exit ( 1 ) <nl> + <nl> + return token <nl> + <nl> + <nl> + def PeekToken ( a_list ) : <nl> + if not a_list : <nl> + return None <nl> + <nl> + return a_list [ 0 ] <nl> + <nl> + <nl> + def ParseExpNode ( token ) : <nl> + python_exp = re . sub ( r ' ( [ _A - Za - z ] \ w * ) ' , r ' self . GetValue ( " \ 1 " ) ' , token . value ) <nl> + return ExpNode ( token , python_exp ) <nl> + <nl> + <nl> + def ParseElseNode ( tokens ) : <nl> + def Pop ( token_type = None ) : <nl> + return PopToken ( tokens , token_type ) <nl> + <nl> + next = PeekToken ( tokens ) <nl> + if not next : <nl> + return None <nl> + if next . token_type = = ' $ else ' : <nl> + Pop ( ' $ else ' ) <nl> + Pop ( ' [ [ ' ) <nl> + code_node = ParseCodeNode ( tokens ) <nl> + Pop ( ' ] ] ' ) <nl> + return code_node <nl> + elif next . token_type = = ' $ elif ' : <nl> + Pop ( ' $ elif ' ) <nl> + exp = Pop ( ' code ' ) <nl> + Pop ( ' [ [ ' ) <nl> + code_node = ParseCodeNode ( tokens ) <nl> + Pop ( ' ] ] ' ) <nl> + inner_else_node = ParseElseNode ( tokens ) <nl> + return CodeNode ( [ IfNode ( ParseExpNode ( exp ) , code_node , inner_else_node ) ] ) <nl> + elif not next . value . strip ( ) : <nl> + Pop ( ' code ' ) <nl> + return ParseElseNode ( tokens ) <nl> + else : <nl> + return None <nl> + <nl> + <nl> + def ParseAtomicCodeNode ( tokens ) : <nl> + def Pop ( token_type = None ) : <nl> + return PopToken ( tokens , token_type ) <nl> + <nl> + head = PopFront ( tokens ) <nl> + t = head . token_type <nl> + if t = = ' code ' : <nl> + return RawCodeNode ( head ) <nl> + elif t = = ' $ var ' : <nl> + id_token = Pop ( ' id ' ) <nl> + Pop ( ' = ' ) <nl> + next = PeekToken ( tokens ) <nl> + if next . token_type = = ' exp ' : <nl> + exp_token = Pop ( ) <nl> + return VarNode ( id_token , ParseExpNode ( exp_token ) ) <nl> + Pop ( ' [ [ ' ) <nl> + code_node = ParseCodeNode ( tokens ) <nl> + Pop ( ' ] ] ' ) <nl> + return VarNode ( id_token , code_node ) <nl> + elif t = = ' $ for ' : <nl> + id_token = Pop ( ' id ' ) <nl> + next_token = PeekToken ( tokens ) <nl> + if next_token . 
token_type = = ' code ' : <nl> + sep_token = next_token <nl> + Pop ( ' code ' ) <nl> + else : <nl> + sep_token = None <nl> + Pop ( ' [ [ ' ) <nl> + code_node = ParseCodeNode ( tokens ) <nl> + Pop ( ' ] ] ' ) <nl> + return ForNode ( id_token , sep_token , code_node ) <nl> + elif t = = ' $ if ' : <nl> + exp_token = Pop ( ' code ' ) <nl> + Pop ( ' [ [ ' ) <nl> + code_node = ParseCodeNode ( tokens ) <nl> + Pop ( ' ] ] ' ) <nl> + else_node = ParseElseNode ( tokens ) <nl> + return IfNode ( ParseExpNode ( exp_token ) , code_node , else_node ) <nl> + elif t = = ' $ range ' : <nl> + id_token = Pop ( ' id ' ) <nl> + exp1_token = Pop ( ' exp ' ) <nl> + Pop ( ' . . ' ) <nl> + exp2_token = Pop ( ' exp ' ) <nl> + return RangeNode ( id_token , ParseExpNode ( exp1_token ) , <nl> + ParseExpNode ( exp2_token ) ) <nl> + elif t = = ' $ id ' : <nl> + return ParseExpNode ( Token ( head . start + 1 , head . end , head . value [ 1 : ] , ' id ' ) ) <nl> + elif t = = ' $ ( $ ) ' : <nl> + return LiteralDollarNode ( head ) <nl> + elif t = = ' $ ' : <nl> + exp_token = Pop ( ' exp ' ) <nl> + return ParseExpNode ( exp_token ) <nl> + elif t = = ' [ [ ' : <nl> + code_node = ParseCodeNode ( tokens ) <nl> + Pop ( ' ] ] ' ) <nl> + return code_node <nl> + else : <nl> + PushFront ( tokens , head ) <nl> + return None <nl> + <nl> + <nl> + def ParseCodeNode ( tokens ) : <nl> + atomic_code_list = [ ] <nl> + while True : <nl> + if not tokens : <nl> + break <nl> + atomic_code_node = ParseAtomicCodeNode ( tokens ) <nl> + if atomic_code_node : <nl> + atomic_code_list . append ( atomic_code_node ) <nl> + else : <nl> + break <nl> + return CodeNode ( atomic_code_list ) <nl> + <nl> + <nl> + def ParseToAST ( pump_src_text ) : <nl> + " " " Convert the given Pump source text into an AST . " " " <nl> + tokens = list ( Tokenize ( pump_src_text ) ) <nl> + code_node = ParseCodeNode ( tokens ) <nl> + return code_node <nl> + <nl> + <nl> + class Env : <nl> + def __init__ ( self ) : <nl> + self . variables = [ ] <nl> + self . ranges = [ ] <nl> + <nl> + def Clone ( self ) : <nl> + clone = Env ( ) <nl> + clone . variables = self . variables [ : ] <nl> + clone . ranges = self . ranges [ : ] <nl> + return clone <nl> + <nl> + def PushVariable ( self , var , value ) : <nl> + # If value looks like an int , store it as an int . <nl> + try : <nl> + int_value = int ( value ) <nl> + if ( ' % s ' % int_value ) = = value : <nl> + value = int_value <nl> + except Exception : <nl> + pass <nl> + self . variables [ : 0 ] = [ ( var , value ) ] <nl> + <nl> + def PopVariable ( self ) : <nl> + self . variables [ : 1 ] = [ ] <nl> + <nl> + def PushRange ( self , var , lower , upper ) : <nl> + self . ranges [ : 0 ] = [ ( var , lower , upper ) ] <nl> + <nl> + def PopRange ( self ) : <nl> + self . ranges [ : 1 ] = [ ] <nl> + <nl> + def GetValue ( self , identifier ) : <nl> + for ( var , value ) in self . variables : <nl> + if identifier = = var : <nl> + return value <nl> + <nl> + print ' ERROR : meta variable % s is undefined . ' % ( identifier , ) <nl> + sys . exit ( 1 ) <nl> + <nl> + def EvalExp ( self , exp ) : <nl> + try : <nl> + result = eval ( exp . python_exp ) <nl> + except Exception , e : <nl> + print ' ERROR : caught exception % s : % s ' % ( e . __class__ . __name__ , e ) <nl> + print ( ' ERROR : failed to evaluate meta expression % s at % s ' % <nl> + ( exp . python_exp , exp . token . start ) ) <nl> + sys . exit ( 1 ) <nl> + return result <nl> + <nl> + def GetRange ( self , identifier ) : <nl> + for ( var , lower , upper ) in self . 
ranges : <nl> + if identifier = = var : <nl> + return ( lower , upper ) <nl> + <nl> + print ' ERROR : range % s is undefined . ' % ( identifier , ) <nl> + sys . exit ( 1 ) <nl> + <nl> + <nl> + class Output : <nl> + def __init__ ( self ) : <nl> + self . string = ' ' <nl> + <nl> + def GetLastLine ( self ) : <nl> + index = self . string . rfind ( ' \ n ' ) <nl> + if index < 0 : <nl> + return ' ' <nl> + <nl> + return self . string [ index + 1 : ] <nl> + <nl> + def Append ( self , s ) : <nl> + self . string + = s <nl> + <nl> + <nl> + def RunAtomicCode ( env , node , output ) : <nl> + if isinstance ( node , VarNode ) : <nl> + identifier = node . identifier . value . strip ( ) <nl> + result = Output ( ) <nl> + RunAtomicCode ( env . Clone ( ) , node . atomic_code , result ) <nl> + value = result . string <nl> + env . PushVariable ( identifier , value ) <nl> + elif isinstance ( node , RangeNode ) : <nl> + identifier = node . identifier . value . strip ( ) <nl> + lower = int ( env . EvalExp ( node . exp1 ) ) <nl> + upper = int ( env . EvalExp ( node . exp2 ) ) <nl> + env . PushRange ( identifier , lower , upper ) <nl> + elif isinstance ( node , ForNode ) : <nl> + identifier = node . identifier . value . strip ( ) <nl> + if node . sep is None : <nl> + sep = ' ' <nl> + else : <nl> + sep = node . sep . value <nl> + ( lower , upper ) = env . GetRange ( identifier ) <nl> + for i in range ( lower , upper + 1 ) : <nl> + new_env = env . Clone ( ) <nl> + new_env . PushVariable ( identifier , i ) <nl> + RunCode ( new_env , node . code , output ) <nl> + if i ! = upper : <nl> + output . Append ( sep ) <nl> + elif isinstance ( node , RawCodeNode ) : <nl> + output . Append ( node . raw_code . value ) <nl> + elif isinstance ( node , IfNode ) : <nl> + cond = env . EvalExp ( node . exp ) <nl> + if cond : <nl> + RunCode ( env . Clone ( ) , node . then_branch , output ) <nl> + elif node . else_branch is not None : <nl> + RunCode ( env . Clone ( ) , node . else_branch , output ) <nl> + elif isinstance ( node , ExpNode ) : <nl> + value = env . EvalExp ( node ) <nl> + output . Append ( ' % s ' % ( value , ) ) <nl> + elif isinstance ( node , LiteralDollarNode ) : <nl> + output . Append ( ' $ ' ) <nl> + elif isinstance ( node , CodeNode ) : <nl> + RunCode ( env . Clone ( ) , node , output ) <nl> + else : <nl> + print ' BAD ' <nl> + print node <nl> + sys . exit ( 1 ) <nl> + <nl> + <nl> + def RunCode ( env , code_node , output ) : <nl> + for atomic_code in code_node . atomic_code : <nl> + RunAtomicCode ( env , atomic_code , output ) <nl> + <nl> + <nl> + def IsSingleLineComment ( cur_line ) : <nl> + return ' / / ' in cur_line <nl> + <nl> + <nl> + def IsInPreprocessorDirective ( prev_lines , cur_line ) : <nl> + if cur_line . lstrip ( ) . startswith ( ' # ' ) : <nl> + return True <nl> + return prev_lines and prev_lines [ - 1 ] . endswith ( ' \ \ ' ) <nl> + <nl> + <nl> + def WrapComment ( line , output ) : <nl> + loc = line . find ( ' / / ' ) <nl> + before_comment = line [ : loc ] . rstrip ( ) <nl> + if before_comment = = ' ' : <nl> + indent = loc <nl> + else : <nl> + output . append ( before_comment ) <nl> + indent = len ( before_comment ) - len ( before_comment . lstrip ( ) ) <nl> + prefix = indent * ' ' + ' / / ' <nl> + max_len = 80 - len ( prefix ) <nl> + comment = line [ loc + 2 : ] . strip ( ) <nl> + segs = [ seg for seg in re . split ( r ' ( \ w + \ W * ) ' , comment ) if seg ! = ' ' ] <nl> + cur_line = ' ' <nl> + for seg in segs : <nl> + if len ( ( cur_line + seg ) . 
rstrip ( ) ) < max_len : <nl> + cur_line + = seg <nl> + else : <nl> + if cur_line . strip ( ) ! = ' ' : <nl> + output . append ( prefix + cur_line . rstrip ( ) ) <nl> + cur_line = seg . lstrip ( ) <nl> + if cur_line . strip ( ) ! = ' ' : <nl> + output . append ( prefix + cur_line . strip ( ) ) <nl> + <nl> + <nl> + def WrapCode ( line , line_concat , output ) : <nl> + indent = len ( line ) - len ( line . lstrip ( ) ) <nl> + prefix = indent * ' ' # Prefix of the current line <nl> + max_len = 80 - indent - len ( line_concat ) # Maximum length of the current line <nl> + new_prefix = prefix + 4 * ' ' # Prefix of a continuation line <nl> + new_max_len = max_len - 4 # Maximum length of a continuation line <nl> + # Prefers to wrap a line after a ' , ' or ' ; ' . <nl> + segs = [ seg for seg in re . split ( r ' ( [ ^ , ; ] + [ , ; ] ? ) ' , line . strip ( ) ) if seg ! = ' ' ] <nl> + cur_line = ' ' # The current line without leading spaces . <nl> + for seg in segs : <nl> + # If the line is still too long , wrap at a space . <nl> + while cur_line = = ' ' and len ( seg . strip ( ) ) > max_len : <nl> + seg = seg . lstrip ( ) <nl> + split_at = seg . rfind ( ' ' , 0 , max_len ) <nl> + output . append ( prefix + seg [ : split_at ] . strip ( ) + line_concat ) <nl> + seg = seg [ split_at + 1 : ] <nl> + prefix = new_prefix <nl> + max_len = new_max_len <nl> + <nl> + if len ( ( cur_line + seg ) . rstrip ( ) ) < max_len : <nl> + cur_line = ( cur_line + seg ) . lstrip ( ) <nl> + else : <nl> + output . append ( prefix + cur_line . rstrip ( ) + line_concat ) <nl> + prefix = new_prefix <nl> + max_len = new_max_len <nl> + cur_line = seg . lstrip ( ) <nl> + if cur_line . strip ( ) ! = ' ' : <nl> + output . append ( prefix + cur_line . strip ( ) ) <nl> + <nl> + <nl> + def WrapPreprocessorDirective ( line , output ) : <nl> + WrapCode ( line , ' \ \ ' , output ) <nl> + <nl> + <nl> + def WrapPlainCode ( line , output ) : <nl> + WrapCode ( line , ' ' , output ) <nl> + <nl> + <nl> + def IsMultiLineIWYUPragma ( line ) : <nl> + return re . search ( r ' / \ * IWYU pragma : ' , line ) <nl> + <nl> + <nl> + def IsHeaderGuardIncludeOrOneLineIWYUPragma ( line ) : <nl> + return ( re . match ( r ' ^ # ( ifndef | define | endif \ s * / / ) \ s * [ \ w_ ] + \ s * $ ' , line ) or <nl> + re . match ( r ' ^ # include \ s ' , line ) or <nl> + # Don ' t break IWYU pragmas , either ; that causes iwyu . py problems . <nl> + re . search ( r ' / / IWYU pragma : ' , line ) ) <nl> + <nl> + <nl> + def WrapLongLine ( line , output ) : <nl> + line = line . rstrip ( ) <nl> + if len ( line ) < = 80 : <nl> + output . append ( line ) <nl> + elif IsSingleLineComment ( line ) : <nl> + if IsHeaderGuardIncludeOrOneLineIWYUPragma ( line ) : <nl> + # The style guide made an exception to allow long header guard lines , <nl> + # includes and IWYU pragmas . <nl> + output . append ( line ) <nl> + else : <nl> + WrapComment ( line , output ) <nl> + elif IsInPreprocessorDirective ( output , line ) : <nl> + if IsHeaderGuardIncludeOrOneLineIWYUPragma ( line ) : <nl> + # The style guide made an exception to allow long header guard lines , <nl> + # includes and IWYU pragmas . <nl> + output . append ( line ) <nl> + else : <nl> + WrapPreprocessorDirective ( line , output ) <nl> + elif IsMultiLineIWYUPragma ( line ) : <nl> + output . append ( line ) <nl> + else : <nl> + WrapPlainCode ( line , output ) <nl> + <nl> + <nl> + def BeautifyCode ( string ) : <nl> + lines = string . 
splitlines ( ) <nl> + output = [ ] <nl> + for line in lines : <nl> + WrapLongLine ( line , output ) <nl> + output2 = [ line . rstrip ( ) for line in output ] <nl> + return ' \ n ' . join ( output2 ) + ' \ n ' <nl> + <nl> + <nl> + def ConvertFromPumpSource ( src_text ) : <nl> + " " " Return the text generated from the given Pump source text . " " " <nl> + ast = ParseToAST ( StripMetaComments ( src_text ) ) <nl> + output = Output ( ) <nl> + RunCode ( Env ( ) , ast , output ) <nl> + return BeautifyCode ( output . string ) <nl> + <nl> + <nl> + def main ( argv ) : <nl> + if len ( argv ) = = 1 : <nl> + print __doc__ <nl> + sys . exit ( 1 ) <nl> + <nl> + file_path = argv [ - 1 ] <nl> + output_str = ConvertFromPumpSource ( file ( file_path , ' r ' ) . read ( ) ) <nl> + if file_path . endswith ( ' . pump ' ) : <nl> + output_file_path = file_path [ : - 5 ] <nl> + else : <nl> + output_file_path = ' - ' <nl> + if output_file_path = = ' - ' : <nl> + print output_str , <nl> + else : <nl> + output_file = file ( output_file_path , ' w ' ) <nl> + output_file . write ( ' / / This file was GENERATED by command : \ n ' ) <nl> + output_file . write ( ' / / % s % s \ n ' % <nl> + ( os . path . basename ( __file__ ) , os . path . basename ( file_path ) ) ) <nl> + output_file . write ( ' / / DO NOT EDIT BY HAND ! ! ! \ n \ n ' ) <nl> + output_file . write ( output_str ) <nl> + output_file . close ( ) <nl> + <nl> + <nl> + if __name__ = = ' __main__ ' : <nl> + main ( sys . argv ) <nl>
Merge ' native_mate ' into ' electron '
electron/electron
d04cdbb367f8dcb91a540869bdb24964dbedd946
2018-06-22T01:32:08Z
mmm a / hphp / runtime / ext / process / ext_process . cpp <nl> ppp b / hphp / runtime / ext / process / ext_process . cpp <nl> static bool pre_proc_open ( const Array & descriptorspec , <nl> return false ; <nl> } <nl> <nl> - static Variant post_proc_open ( const String & cmd , Variant & pipes , <nl> + static Variant post_proc_open ( const String & cmd , Variant & pipes , <nl> const Variant & env , <nl> std : : vector < DescriptorItem > & items , <nl> pid_t child ) { <nl> if ( child < 0 ) { <nl> / * failed to fork ( ) * / <nl> - for ( int i = 0 ; i < ( int ) items . size ( ) ; i + + ) { <nl> - items [ i ] . cleanup ( ) ; <nl> + for ( auto & item : items ) { <nl> + item . cleanup ( ) ; <nl> } <nl> raise_warning ( " fork failed - % s " , folly : : errnoStr ( errno ) . c_str ( ) ) ; <nl> return false ; <nl> static Variant post_proc_open ( const String & cmd , Variant & pipes , <nl> / / previously set to <nl> pipes = Variant ( Array : : Create ( ) ) ; <nl> <nl> - for ( int i = 0 ; i < ( int ) items . size ( ) ; i + + ) { <nl> - Resource f = items [ i ] . dupParent ( ) ; <nl> + for ( auto & item : items ) { <nl> + Resource f = item . dupParent ( ) ; <nl> if ( ! f . isNull ( ) ) { <nl> proc - > pipes . append ( f ) ; <nl> - pipes . toArrRef ( ) . set ( items [ i ] . index , f ) ; <nl> + pipes . toArrRef ( ) . set ( item . index , f ) ; <nl> } <nl> } <nl> return Variant ( std : : move ( proc ) ) ; <nl> Variant HHVM_FUNCTION ( proc_open , <nl> if ( RuntimeOption : : WhitelistExec & & ! check_cmd ( cmd . data ( ) ) ) { <nl> return false ; <nl> } <nl> + if ( cmd . size ( ) ! = strlen ( cmd . c_str ( ) ) ) { <nl> + raise_warning ( " NULL byte detected . Possible attack " ) ; <nl> + return false ; <nl> + } <nl> <nl> std : : vector < DescriptorItem > items ; <nl> <nl> Variant HHVM_FUNCTION ( proc_open , <nl> / / for each name . <nl> <nl> / / Env vars defined in the hdf file go in first <nl> - for ( auto iter = RuntimeOption : : EnvVariables . begin ( ) ; <nl> - iter ! = RuntimeOption : : EnvVariables . end ( ) ; + + iter ) { <nl> - enva . set ( String ( iter - > first ) , String ( iter - > second ) ) ; <nl> + for ( const auto & envvar : RuntimeOption : : EnvVariables ) { <nl> + enva . set ( String ( envvar . first ) , String ( envvar . second ) ) ; <nl> } <nl> <nl> / / global environment overrides the hdf <nl> Variant HHVM_FUNCTION ( proc_open , <nl> / / there is no need to do any locking , because the forking is delegated <nl> / / to the light process <nl> if ( ! pre_proc_open ( descriptorspec , items ) ) return false ; <nl> + const int item_size = items . size ( ) ; <nl> std : : vector < int > created ; <nl> + created . reserve ( item_size ) ; <nl> std : : vector < int > intended ; <nl> - for ( int i = 0 ; i < ( int ) items . size ( ) ; i + + ) { <nl> - created . push_back ( items [ i ] . childend ) ; <nl> - intended . push_back ( items [ i ] . index ) ; <nl> + intended . reserve ( item_size ) ; <nl> + for ( int i = 0 ; i < item_size ; i + + ) { <nl> + const auto & item = items [ i ] ; <nl> + created . push_back ( item . childend ) ; <nl> + intended . push_back ( item . index ) ; <nl> } <nl> <nl> std : : vector < std : : string > envs ; <nl> Variant HHVM_FUNCTION ( proc_open , <nl> / * close those descriptors that we just opened for the parent stuff , <nl> * dup new descriptors into required descriptors and close the original <nl> * cruft * / <nl> - for ( int i = 0 ; i < ( int ) items . size ( ) ; i + + ) { <nl> - items [ i ] . dupChild ( ) ; <nl> + for ( auto & item : items ) { <nl> + item . 
dupChild ( ) ; <nl> } <nl> if ( scwd . length ( ) > 0 & & chdir ( scwd . c_str ( ) ) ) { <nl> / / chdir failed , the working directory remains unchanged <nl> mmm a / hphp / test / slow / ext_process / lwp . php <nl> ppp b / hphp / test / slow / ext_process / lwp . php <nl> function VERIFY ( $ x ) { VS ( $ x ! = false , true ) ; } <nl> VS ( exec ( $ nullbyte , $ nullbyteout ) , " " ) ; <nl> VS ( $ nullbyteout , null ) ; <nl> VS ( shell_exec ( $ nullbyte ) , null ) ; <nl> + $ process = proc_open ( $ nullbyte , array ( ) , $ pipes ) ; <nl> + VS ( $ process , false ) ; <nl> mmm a / hphp / test / slow / ext_process / lwp . php . expectf <nl> ppp b / hphp / test / slow / ext_process / lwp . php . expectf <nl> bool ( true ) <nl> <nl> Warning : NULL byte detected . Possible attack in % s / lwp . php on line 114 <nl> bool ( true ) <nl> + <nl> + Warning : NULL byte detected . Possible attack in % s / lwp . php on line 115 <nl> + bool ( true ) <nl>
add nullbyte detection to proc_open
facebook/hhvm
4bd3774a55dc1a30ecf2df02f618b1dd5571cc36
2015-05-22T01:30:47Z
mmm a / android / sdk / src / main / java / com / taobao / weex / utils / WXLogUtils . java <nl> ppp b / android / sdk / src / main / java / com / taobao / weex / utils / WXLogUtils . java <nl> <nl> private static StringBuilder builder = new StringBuilder ( 50 ) ; <nl> private static HashMap < String , Class > clazzMaps = new HashMap < > ( 2 ) ; <nl> private static JsLogWatcher jsLogWatcher ; <nl> + private static LogWatcher sLogWatcher ; <nl> <nl> static { <nl> clazzMaps . put ( CLAZZ_NAME_DEBUG_TOOL , loadClass ( CLAZZ_NAME_DEBUG_TOOL ) ) ; <nl> private static void log ( String tag , String msg , LogLevel level ) { <nl> writeConsoleLog ( level . getName ( ) , msg ) ; <nl> sendLog ( level , msg ) ; <nl> } <nl> + if ( sLogWatcher ! = null ) { <nl> + sLogWatcher . onLog ( level . getName ( ) , tag , msg ) ; <nl> + } <nl> } <nl> <nl> public static void d ( String msg ) { <nl> public static void d ( String tag , String msg ) { <nl> } <nl> } <nl> sendLog ( LogLevel . DEBUG , tag + " : " + msg ) ; <nl> + log ( tag , msg , LogLevel . DEBUG ) ; <nl> } <nl> } <nl> <nl> public static void setJsLogWatcher ( JsLogWatcher watcher ) { <nl> jsLogWatcher = watcher ; <nl> } <nl> <nl> + public static void setLogWatcher ( LogWatcher watcher ) { <nl> + sLogWatcher = watcher ; <nl> + } <nl> + <nl> public interface JsLogWatcher { <nl> void onJsLog ( int level , String log ) ; <nl> } <nl> + <nl> + public interface LogWatcher { <nl> + void onLog ( String level , String tag , String msg ) ; <nl> + } <nl> } <nl>
+ [ android ] add LogWatcher interface
apache/incubator-weex
3baa9e048928b8ada9ea9f2e88ab363a93e9b36e
2017-10-01T03:09:18Z
mmm a / src / rabit . h <nl> ppp b / src / rabit . h <nl> <nl> * / <nl> # include < string > <nl> # include < vector > <nl> + # include " . / io . h " <nl> # include " . / engine . h " <nl> <nl> / * ! \ brief namespace of rabit * / <nl>
Update rabit . h
dmlc/xgboost
31403a41cd93d281089ed4e96e2df1cf8cd3549c
2014-12-10T05:03:41Z
mmm a / include / swift / Frontend / ModuleInterfaceLoader . h <nl> ppp b / include / swift / Frontend / ModuleInterfaceLoader . h <nl> class ExplicitSwiftModuleLoader : public SerializedModuleLoaderBase { <nl> / / / Information about explicitly specified Swift module files . <nl> struct ExplicitModuleInfo { <nl> / / Path of the . swiftmodule file . <nl> - StringRef modulePath ; <nl> + std : : string modulePath ; <nl> / / Path of the . swiftmoduledoc file . <nl> - StringRef moduleDocPath ; <nl> + std : : string moduleDocPath ; <nl> / / Path of the . swiftsourceinfo file . <nl> - StringRef moduleSourceInfoPath ; <nl> + std : : string moduleSourceInfoPath ; <nl> / / Opened buffer for the . swiftmodule file . <nl> std : : unique_ptr < llvm : : MemoryBuffer > moduleBuffer ; <nl> } ; <nl>
Change ExplicitModuleInfo to have String members instead of StringRef
apple/swift
70585b4f0fab192b8f4194328aa713fd86060c71
2020-07-28T22:57:17Z
mmm a / tensorflow / core / common_runtime / process_function_library_runtime . h <nl> ppp b / tensorflow / core / common_runtime / process_function_library_runtime . h <nl> class ProcessFunctionLibraryRuntime { <nl> bool * is_cross_process ) const ; <nl> <nl> / / Delegates to the local FLR that owns state corresponding to ` handle ` and <nl> - / / tells it to release it . If the ` handle ` isn ' t ' needed at all , the local FLR <nl> + / / tells it to release it . If the ` handle ` isn ' t needed at all , the local FLR <nl> / / might call RemoveHandle on this to get rid of the state owned by the Proc <nl> / / FLR . <nl> / / For multi - device functions , calls ReleaseHandle on local FLRs for each <nl>
extra apostrophe
tensorflow/tensorflow
4df96ca10a03a721f10cb433a78ce935b1ce8e32
2020-10-28T23:59:03Z
mmm a / doc / options . compiler <nl> ppp b / doc / options . compiler <nl> behaviors , so this is recommended to leave as off . <nl> <nl> = EnableEval <nl> <nl> - Default is false . When turned on , eval ( ) is supported , as long as it does not <nl> - declare new functions or classes . <nl> + Default is 0 , eval ( ) will throw a fatal error . When 1 , eval ( ) is supported in <nl> + a limited way , mixed together with compiled program . When 2 , eval ( ) is fully <nl> + supported as an interpreter mode . <nl> <nl> = IncludeRoots <nl> <nl> mmm a / src / hphpi / Makefile <nl> ppp b / src / hphpi / Makefile <nl> PHP_FILES = hphpi . php <nl> <nl> all : $ ( HPHPI ) <nl> <nl> - $ ( HPHPI ) : $ ( HPHP ) $ ( PHP_FILES ) hphpi_build . hdf <nl> + $ ( HPHPI ) : $ ( HPHP ) $ ( PHP_FILES ) <nl> @ echo " Compiling hphpi . . . " <nl> + $ ( V ) $ ( if $ ( OUT_TOP ) , HPHP_LIB = $ ( LIB_DIR ) ) $ ( HPHP ) \ <nl> - t cpp - f exe - - input - dir . \ <nl> - i $ ( PHP_FILES ) - o $ ( OUT_DIR ) gen \ <nl> - - - config hphpi_build . hdf - - log = 1 \ <nl> + - vEnableEval = 2 - - log = 1 \ <nl> - - program = $ ( if $ ( OUT_TOP ) , , $ ( CWD ) / ) $ @ <nl> <nl> clobber : <nl> deleted file mode 100644 <nl> index 3fc00f217a9 . . 00000000000 <nl> mmm a / src / hphpi / hphpi_build . hdf <nl> ppp / dev / null <nl> @ @ - 1 + 0 , 0 @ @ <nl> - EnableEval = 2 <nl>
documented how eval ( ) is supported in compiled program
facebook/hhvm
60f54672dc6e99463e1ffd34fd7e18f8c8bac722
2010-10-18T20:16:43Z
mmm a / src / debug . cc <nl> ppp b / src / debug . cc <nl> void DebugMessageThread : : DebugEvent ( v8 : : DebugEvent event , <nl> return ; <nl> } <nl> <nl> - / / Notify the debugger that a debug event has occurred . <nl> + / / Notify the debugger that a debug event has occurred unless auto continue is <nl> + / / active in which case no event is send . <nl> if ( ! auto_continue ) { <nl> bool success = SetEventJSONFromEvent ( event_data ) ; <nl> if ( ! success ) { <nl> void DebugMessageThread : : DebugEvent ( v8 : : DebugEvent event , <nl> / / Return the result . <nl> SendMessage ( str ) ; <nl> <nl> - / / Return from debug event processing is VM should be running . <nl> + / / Return from debug event processing if either the VM is put into the <nl> + / / runnning state ( through a continue command ) or auto continue is active <nl> + / / and there are no more commands queued . <nl> if ( running | | ( auto_continue & & ! HasCommands ( ) ) ) { <nl> return ; <nl> } <nl>
Missed a few comment changes in r1508 .
v8/v8
6a9d16f40f46586ca574fb37d0deb307976892d9
2009-03-16T07:36:52Z
mmm a / src / ExtendedScintilla . cpp <nl> ppp b / src / ExtendedScintilla . cpp <nl> void ExtendedScintilla : : setLexer ( QsciLexer * lexer ) <nl> { <nl> QsciScintilla : : setLexer ( lexer ) ; <nl> <nl> - / / Set margins to system window theme . setLexer seems to reset these colours . <nl> - setMarginsBackgroundColor ( QPalette ( ) . color ( QPalette : : Active , QPalette : : Window ) ) ; <nl> - setMarginsForegroundColor ( QPalette ( ) . color ( QPalette : : Active , QPalette : : WindowText ) ) ; <nl> - setIndentationGuidesBackgroundColor ( QPalette ( ) . color ( QPalette : : Active , QPalette : : Window ) ) ; <nl> - setIndentationGuidesForegroundColor ( QPalette ( ) . color ( QPalette : : Active , QPalette : : WindowText ) ) ; <nl> + / / Set margins according to settings . setLexer seems to reset these colours . <nl> + / / Use desktop default colors for margins when following desktop style , or the custom colors otherwise . <nl> + switch ( Settings : : getValue ( " General " , " appStyle " ) . toInt ( ) ) { <nl> + case Settings : : FollowDesktopStyle : <nl> + setMarginsBackgroundColor ( QPalette ( ) . color ( QPalette : : Active , QPalette : : Window ) ) ; <nl> + setMarginsForegroundColor ( QPalette ( ) . color ( QPalette : : Active , QPalette : : WindowText ) ) ; <nl> + break ; <nl> + case Settings : : DarkStyle : <nl> + setMarginsBackgroundColor ( QColor ( Settings : : getValue ( " syntaxhighlighter " , " background_colour " ) . toString ( ) ) ) ; <nl> + setMarginsForegroundColor ( QColor ( Settings : : getValue ( " syntaxhighlighter " , " foreground_colour " ) . toString ( ) ) ) ; <nl> + break ; <nl> + } <nl> } <nl> <nl> void ExtendedScintilla : : reloadKeywords ( ) <nl> void ExtendedScintilla : : reloadSettings ( ) <nl> } <nl> void ExtendedScintilla : : reloadLexerSettings ( QsciLexer * lexer ) <nl> { <nl> + QColor foreground ( Settings : : getValue ( " syntaxhighlighter " , " foreground_colour " ) . toString ( ) ) ; <nl> + QColor background ( Settings : : getValue ( " syntaxhighlighter " , " background_colour " ) . toString ( ) ) ; <nl> + <nl> + QFont defaultfont ( Settings : : getValue ( " editor " , " font " ) . toString ( ) ) ; <nl> + defaultfont . setStyleHint ( QFont : : TypeWriter ) ; <nl> + defaultfont . setPointSize ( Settings : : getValue ( " editor " , " fontsize " ) . toInt ( ) ) ; <nl> + <nl> / / Set syntax highlighting settings <nl> if ( lexer ) <nl> { <nl> - QFont defaultfont ( Settings : : getValue ( " editor " , " font " ) . toString ( ) ) ; <nl> - defaultfont . setStyleHint ( QFont : : TypeWriter ) ; <nl> - defaultfont . setPointSize ( Settings : : getValue ( " editor " , " fontsize " ) . toInt ( ) ) ; <nl> lexer - > setFont ( defaultfont ) ; <nl> <nl> - lexer - > setDefaultColor ( QColor ( Settings : : getValue ( " syntaxhighlighter " , " foreground_colour " ) . toString ( ) ) ) ; <nl> - lexer - > setPaper ( QColor ( Settings : : getValue ( " syntaxhighlighter " , " background_colour " ) . toString ( ) ) ) ; <nl> + lexer - > setDefaultPaper ( background ) ; <nl> + lexer - > setDefaultColor ( foreground ) ; <nl> + <nl> + / / This sets the base colors for all the styles <nl> + lexer - > setPaper ( background ) ; <nl> + lexer - > setColor ( foreground ) ; <nl> } <nl> <nl> / / Set font <nl> - QFont font ( Settings : : getValue ( " editor " , " font " ) . toString ( ) ) ; <nl> - font . setStyleHint ( QFont : : TypeWriter ) ; <nl> - font . setPointSize ( Settings : : getValue ( " editor " , " fontsize " ) . 
toInt ( ) ) ; <nl> - setFont ( font ) ; <nl> + setFont ( defaultfont ) ; <nl> <nl> / / Show line numbers <nl> - setMarginsFont ( font ) ; <nl> + setMarginsFont ( defaultfont ) ; <nl> setMarginLineNumbers ( 0 , true ) ; <nl> updateLineNumberAreaWidth ( ) ; <nl> <nl> / / Highlight current line <nl> setCaretLineVisible ( true ) ; <nl> setCaretLineBackgroundColor ( QColor ( Settings : : getValue ( " syntaxhighlighter " , " currentline_colour " ) . toString ( ) ) ) ; <nl> - setCaretForegroundColor ( QColor ( Settings : : getValue ( " syntaxhighlighter " , " foreground_colour " ) . toString ( ) ) ) ; <nl> + setCaretForegroundColor ( foreground ) ; <nl> <nl> / / Set tab width <nl> setTabWidth ( Settings : : getValue ( " editor " , " tabsize " ) . toInt ( ) ) ; <nl> mmm a / src / PreferencesDialog . cpp <nl> ppp b / src / PreferencesDialog . cpp <nl> PreferencesDialog : : PreferencesDialog ( QWidget * parent , Tabs tab ) <nl> <nl> loadSettings ( ) ; <nl> <nl> + connect ( ui - > appStyleCombo , SIGNAL ( currentIndexChanged ( int ) ) , this , SLOT ( adjustColorsToStyle ( int ) ) ) ; <nl> + <nl> / / Avoid different heights due to having check boxes or not <nl> ui - > treeSyntaxHighlighting - > setUniformRowHeights ( true ) ; <nl> <nl> void PreferencesDialog : : saveColorSetting ( QFrame * frame , const QString & settingN <nl> frame - > palette ( ) . color ( frame - > backgroundRole ( ) ) ) ; <nl> } <nl> <nl> + void PreferencesDialog : : adjustColorsToStyle ( int style ) <nl> + { <nl> + Settings : : AppStyle appStyle = static_cast < Settings : : AppStyle > ( style ) ; <nl> + setColorSetting ( ui - > fr_null_fg , Settings : : getDefaultColorValue ( " databrowser " , " null_fg_colour " , appStyle ) ) ; <nl> + setColorSetting ( ui - > fr_null_bg , Settings : : getDefaultColorValue ( " databrowser " , " null_bg_colour " , appStyle ) ) ; <nl> + setColorSetting ( ui - > fr_bin_fg , Settings : : getDefaultColorValue ( " databrowser " , " bin_fg_colour " , appStyle ) ) ; <nl> + setColorSetting ( ui - > fr_bin_bg , Settings : : getDefaultColorValue ( " databrowser " , " bin_bg_colour " , appStyle ) ) ; <nl> + setColorSetting ( ui - > fr_reg_fg , Settings : : getDefaultColorValue ( " databrowser " , " reg_fg_colour " , appStyle ) ) ; <nl> + setColorSetting ( ui - > fr_reg_bg , Settings : : getDefaultColorValue ( " databrowser " , " reg_bg_colour " , appStyle ) ) ; <nl> + <nl> + for ( int i = 0 ; i < ui - > treeSyntaxHighlighting - > topLevelItemCount ( ) ; + + i ) <nl> + { <nl> + QString name = ui - > treeSyntaxHighlighting - > topLevelItem ( i ) - > text ( 0 ) ; <nl> + QColor color = Settings : : getDefaultColorValue ( " syntaxhighlighter " , name + " _colour " , appStyle ) ; <nl> + ui - > treeSyntaxHighlighting - > topLevelItem ( i ) - > setTextColor ( 2 , color ) ; <nl> + ui - > treeSyntaxHighlighting - > topLevelItem ( i ) - > setBackgroundColor ( 2 , color ) ; <nl> + ui - > treeSyntaxHighlighting - > topLevelItem ( i ) - > setText ( 2 , color . name ( ) ) ; <nl> + } <nl> + } <nl> + <nl> void PreferencesDialog : : activateRemoteTab ( bool active ) <nl> { <nl> ui - > tabWidget - > setTabEnabled ( ui - > tabWidget - > indexOf ( ui - > tabRemote ) , active ) ; <nl> mmm a / src / PreferencesDialog . h <nl> ppp b / src / PreferencesDialog . 
h <nl> private slots : <nl> void removeClientCertificate ( ) ; <nl> void chooseRemoteCloneDirectory ( ) ; <nl> void updatePreviewFont ( ) ; <nl> + void adjustColorsToStyle ( int style ) ; <nl> <nl> void on_buttonManageFileExtension_clicked ( ) ; <nl> void on_buttonBox_clicked ( QAbstractButton * button ) ; <nl> mmm a / src / PreferencesDialog . ui <nl> ppp b / src / PreferencesDialog . ui <nl> <nl> < verstretch > 0 < / verstretch > <nl> < / sizepolicy > <nl> < / property > <nl> + < property name = " toolTip " > <nl> + < string > When this value is changed , all the other color preferences are also set to matching colors . < / string > <nl> + < / property > <nl> < property name = " currentIndex " > <nl> < number > 0 < / number > <nl> < / property > <nl> Can be set to 0 for disabling completion . < / string > <nl> < property name = " focusPolicy " > <nl> < enum > Qt : : StrongFocus < / enum > <nl> < / property > <nl> + < property name = " toolTip " > <nl> + < string > Click to set this color < / string > <nl> + < / property > <nl> < property name = " autoFillBackground " > <nl> < bool > true < / bool > <nl> < / property > <nl> Can be set to 0 for disabling completion . < / string > <nl> < property name = " focusPolicy " > <nl> < enum > Qt : : StrongFocus < / enum > <nl> < / property > <nl> + < property name = " toolTip " > <nl> + < string > Click to set this color < / string > <nl> + < / property > <nl> < property name = " autoFillBackground " > <nl> < bool > true < / bool > <nl> < / property > <nl> Can be set to 0 for disabling completion . < / string > <nl> < property name = " focusPolicy " > <nl> < enum > Qt : : StrongFocus < / enum > <nl> < / property > <nl> + < property name = " toolTip " > <nl> + < string > Click to set this color < / string > <nl> + < / property > <nl> < property name = " autoFillBackground " > <nl> < bool > true < / bool > <nl> < / property > <nl> Can be set to 0 for disabling completion . < / string > <nl> < property name = " focusPolicy " > <nl> < enum > Qt : : StrongFocus < / enum > <nl> < / property > <nl> + < property name = " toolTip " > <nl> + < string > Click to set this color < / string > <nl> + < / property > <nl> < property name = " autoFillBackground " > <nl> < bool > true < / bool > <nl> < / property > <nl> Can be set to 0 for disabling completion . < / string > <nl> < property name = " focusPolicy " > <nl> < enum > Qt : : StrongFocus < / enum > <nl> < / property > <nl> + < property name = " toolTip " > <nl> + < string > Click to set this color < / string > <nl> + < / property > <nl> < property name = " autoFillBackground " > <nl> < bool > true < / bool > <nl> < / property > <nl> Can be set to 0 for disabling completion . < / string > <nl> < property name = " focusPolicy " > <nl> < enum > Qt : : StrongFocus < / enum > <nl> < / property > <nl> + < property name = " toolTip " > <nl> + < string > Click to set this color < / string > <nl> + < / property > <nl> < property name = " autoFillBackground " > <nl> < bool > true < / bool > <nl> < / property > <nl> mmm a / src / Settings . cpp <nl> ppp b / src / Settings . cpp <nl> QVariant Settings : : getDefaultValue ( const QString & group , const QString & name ) <nl> return " \ \ " ; <nl> if ( name = = " filter_delay " ) <nl> return 200 ; <nl> - if ( name = = " null_fg_colour " ) <nl> - return QColor ( Qt : : lightGray ) . name ( ) ; <nl> - if ( name = = " null_bg_colour " ) <nl> - return QPalette ( ) . color ( QPalette : : Active , QPalette : : Base ) . 
name ( ) ; <nl> - if ( name = = " reg_fg_colour " ) <nl> - return QPalette ( ) . color ( QPalette : : Active , QPalette : : Text ) . name ( ) ; <nl> - if ( name = = " reg_bg_colour " ) <nl> - return QPalette ( ) . color ( QPalette : : Active , QPalette : : Base ) . name ( ) ; <nl> - if ( name = = " bin_fg_colour " ) <nl> - return QColor ( Qt : : lightGray ) . name ( ) ; <nl> - if ( name = = " bin_bg_colour " ) <nl> - return QPalette ( ) . color ( QPalette : : Active , QPalette : : Base ) . name ( ) ; <nl> + if ( name . right ( 6 ) = = " colour " ) <nl> + return getDefaultColorValue ( group , name , FollowDesktopStyle ) ; <nl> } <nl> <nl> / / syntaxhighlighter ? <nl> QVariant Settings : : getDefaultValue ( const QString & group , const QString & name ) <nl> <nl> / / Colour ? <nl> if ( name . right ( 6 ) = = " colour " ) <nl> - { <nl> - QColor backgroundColour = QPalette ( ) . color ( QPalette : : Active , QPalette : : Base ) ; <nl> - QColor foregroundColour = QPalette ( ) . color ( QPalette : : Active , QPalette : : Text ) ; <nl> - <nl> - if ( name = = " foreground_colour " ) <nl> - return foregroundColour . name ( ) ; <nl> - else if ( name = = " background_colour " ) <nl> - return backgroundColour . name ( ) ; <nl> - <nl> - / / Detect and provide sensible defaults for dark themes <nl> - if ( backgroundColour . value ( ) < foregroundColour . value ( ) ) { <nl> - if ( name = = " keyword_colour " ) <nl> - return QColor ( 82 , 148 , 226 ) . name ( ) ; <nl> - else if ( name = = " function_colour " ) <nl> - return QColor ( Qt : : yellow ) . name ( ) ; <nl> - else if ( name = = " table_colour " ) <nl> - return QColor ( Qt : : cyan ) . name ( ) ; <nl> - else if ( name = = " comment_colour " ) <nl> - return QColor ( Qt : : green ) . name ( ) ; <nl> - else if ( name = = " identifier_colour " ) <nl> - return QColor ( Qt : : magenta ) . name ( ) ; <nl> - else if ( name = = " string_colour " ) <nl> - return QColor ( Qt : : lightGray ) . name ( ) ; <nl> - else if ( name = = " currentline_colour " ) <nl> - return backgroundColour . lighter ( 150 ) . name ( ) ; <nl> - else if ( name = = " background_colour " ) <nl> - return backgroundColour . name ( ) ; <nl> - } else { <nl> - if ( name = = " keyword_colour " ) <nl> - return QColor ( Qt : : darkBlue ) . name ( ) ; <nl> - else if ( name = = " function_colour " ) <nl> - return QColor ( Qt : : blue ) . name ( ) ; <nl> - else if ( name = = " table_colour " ) <nl> - return QColor ( Qt : : darkCyan ) . name ( ) ; <nl> - else if ( name = = " comment_colour " ) <nl> - return QColor ( Qt : : darkGreen ) . name ( ) ; <nl> - else if ( name = = " identifier_colour " ) <nl> - return QColor ( Qt : : darkMagenta ) . name ( ) ; <nl> - else if ( name = = " string_colour " ) <nl> - return QColor ( Qt : : red ) . name ( ) ; <nl> - else if ( name = = " currentline_colour " ) <nl> - return QColor ( 236 , 236 , 245 ) . name ( ) ; <nl> - else if ( name = = " background_colour " ) <nl> - return backgroundColour . name ( ) ; <nl> - } <nl> - } <nl> + return getDefaultColorValue ( group , name , FollowDesktopStyle ) ; <nl> } <nl> <nl> / / editor / font ? 
<nl> QVariant Settings : : getDefaultValue ( const QString & group , const QString & name ) <nl> return QVariant ( ) ; <nl> } <nl> <nl> + QColor Settings : : getDefaultColorValue ( const QString & group , const QString & name , AppStyle style ) <nl> + { <nl> + / / Data Browser / NULL & Binary Fields <nl> + if ( group = = " databrowser " ) <nl> + { <nl> + switch ( style ) { <nl> + case FollowDesktopStyle : <nl> + if ( name = = " null_fg_colour " ) <nl> + return QColor ( Qt : : lightGray ) . name ( ) ; <nl> + if ( name = = " null_bg_colour " ) <nl> + return QPalette ( ) . color ( QPalette : : Active , QPalette : : Base ) . name ( ) ; <nl> + if ( name = = " reg_fg_colour " ) <nl> + return QPalette ( ) . color ( QPalette : : Active , QPalette : : Text ) . name ( ) ; <nl> + if ( name = = " reg_bg_colour " ) <nl> + return QPalette ( ) . color ( QPalette : : Active , QPalette : : Base ) . name ( ) ; <nl> + if ( name = = " bin_fg_colour " ) <nl> + return QColor ( Qt : : lightGray ) . name ( ) ; <nl> + if ( name = = " bin_bg_colour " ) <nl> + return QPalette ( ) . color ( QPalette : : Active , QPalette : : Base ) . name ( ) ; <nl> + case DarkStyle : <nl> + if ( name = = " null_fg_colour " ) <nl> + return QColor ( " # 787878 " ) ; <nl> + if ( name = = " null_bg_colour " ) <nl> + return QColor ( " # 19232D " ) ; <nl> + if ( name = = " reg_fg_colour " ) <nl> + return QColor ( " # F0F0F0 " ) ; <nl> + if ( name = = " reg_bg_colour " ) <nl> + return QColor ( " # 19232D " ) ; <nl> + if ( name = = " bin_fg_colour " ) <nl> + return QColor ( " # 787878 " ) ; <nl> + if ( name = = " bin_bg_colour " ) <nl> + return QColor ( " # 19232D " ) ; <nl> + } <nl> + } <nl> + <nl> + / / syntaxhighlighter ? <nl> + if ( group = = " syntaxhighlighter " ) <nl> + { <nl> + / / Colour ? <nl> + if ( name . right ( 6 ) = = " colour " ) <nl> + { <nl> + QColor backgroundColour ; <nl> + QColor foregroundColour ; <nl> + switch ( style ) { <nl> + case FollowDesktopStyle : <nl> + backgroundColour = QPalette ( ) . color ( QPalette : : Active , QPalette : : Base ) ; <nl> + foregroundColour = QPalette ( ) . color ( QPalette : : Active , QPalette : : Text ) ; <nl> + break ; <nl> + case DarkStyle : <nl> + foregroundColour = QColor ( " # F0F0F0 " ) ; <nl> + backgroundColour = QColor ( " # 19232D " ) ; <nl> + break ; <nl> + } <nl> + if ( name = = " foreground_colour " ) <nl> + return foregroundColour ; <nl> + else if ( name = = " background_colour " ) <nl> + return backgroundColour ; <nl> + <nl> + / / Detect and provide sensible defaults for dark themes <nl> + if ( backgroundColour . value ( ) < foregroundColour . value ( ) ) { <nl> + if ( name = = " keyword_colour " ) <nl> + return QColor ( 82 , 148 , 226 ) ; <nl> + else if ( name = = " function_colour " ) <nl> + return QColor ( Qt : : yellow ) ; <nl> + else if ( name = = " table_colour " ) <nl> + return QColor ( Qt : : cyan ) ; <nl> + else if ( name = = " comment_colour " ) <nl> + return QColor ( Qt : : green ) ; <nl> + else if ( name = = " identifier_colour " ) <nl> + return QColor ( Qt : : magenta ) ; <nl> + else if ( name = = " string_colour " ) <nl> + return QColor ( Qt : : lightGray ) ; <nl> + else if ( name = = " currentline_colour " ) <nl> + return backgroundColour . 
lighter ( 150 ) ; <nl> + } else { <nl> + if ( name = = " keyword_colour " ) <nl> + return QColor ( Qt : : darkBlue ) ; <nl> + else if ( name = = " function_colour " ) <nl> + return QColor ( Qt : : blue ) ; <nl> + else if ( name = = " table_colour " ) <nl> + return QColor ( Qt : : darkCyan ) ; <nl> + else if ( name = = " comment_colour " ) <nl> + return QColor ( Qt : : darkGreen ) ; <nl> + else if ( name = = " identifier_colour " ) <nl> + return QColor ( Qt : : darkMagenta ) ; <nl> + else if ( name = = " string_colour " ) <nl> + return QColor ( Qt : : red ) ; <nl> + else if ( name = = " currentline_colour " ) <nl> + return QColor ( 236 , 236 , 245 ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> + / / Unknown combination of group and name ? Return an invalid QColor ! <nl> + return QColor ( ) ; <nl> + } <nl> + <nl> void Settings : : restoreDefaults ( ) <nl> { <nl> QSettings settings ( QApplication : : organizationName ( ) , QApplication : : organizationName ( ) ) ; <nl> mmm a / src / Settings . h <nl> ppp b / src / Settings . h <nl> class Settings <nl> private : <nl> Settings ( ) { } / / class is fully static <nl> <nl> - / / This works similar to getSettingsValue but returns the default value instead of the value set by the user <nl> + / / This works similar to getValue but returns the default value instead of the value set by the user <nl> static QVariant getDefaultValue ( const QString & group , const QString & name ) ; <nl> <nl> + / / This works similar to getDefaultValue but returns the default color value based on the passed application style <nl> + / / instead of the current palette . <nl> + static QColor getDefaultColorValue ( const QString & group , const QString & name , AppStyle style ) ; <nl> + <nl> / / Cache for storing the settings to avoid repeatedly reading the settings file all the time <nl> static QHash < QString , QVariant > m_hCache ; <nl> } ; <nl>
Update preference colours when the application style is changed
sqlitebrowser/sqlitebrowser
cc67969d73e42b1aeaa9ee89398523005ee8e127
2019-02-27T23:02:47Z
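The sqlitebrowser record above wires the application-style combo box to a slot that resets every colour preference to the default of the chosen style. Below is a minimal, self-contained sketch of that pattern (style-keyed default colours plus a combo-box-driven reset), written in plain C++ rather than Qt; all names are hypothetical, the dark-style hex values mirror the patch, and the follow-desktop values merely stand in for what the real code reads from QPalette.

```cpp
// Minimal sketch (not the sqlitebrowser code): default colours keyed by an
// application style, so a preferences dialog can reset its swatches when the
// style combo box changes. Dark values mirror the patch; the follow-desktop
// values are placeholders for what the real code reads from QPalette.
#include <iostream>
#include <map>
#include <string>
#include <utility>

enum class AppStyle { FollowDesktop = 0, Dark = 1 };

// Default colour (hex string) for a setting under a given style.
std::string defaultColour(const std::string& name, AppStyle style) {
    static const std::map<std::string, std::pair<std::string, std::string>> table = {
        // setting            { follow-desktop, dark }
        { "null_fg_colour", { "#d3d3d3", "#787878" } },
        { "null_bg_colour", { "#ffffff", "#19232d" } },
        { "reg_fg_colour",  { "#000000", "#f0f0f0" } },
        { "reg_bg_colour",  { "#ffffff", "#19232d" } },
    };
    auto it = table.find(name);
    if (it == table.end())
        return {};  // unknown setting: empty, like an invalid QColor
    return style == AppStyle::Dark ? it->second.second : it->second.first;
}

// Slot-like function: called when the style combo box index changes.
void adjustColoursToStyle(int comboIndex) {
    const AppStyle style = static_cast<AppStyle>(comboIndex);
    for (const char* name : { "null_fg_colour", "null_bg_colour",
                              "reg_fg_colour", "reg_bg_colour" })
        std::cout << name << " -> " << defaultColour(name, style) << '\n';
}

int main() {
    adjustColoursToStyle(0);  // follow-desktop defaults
    adjustColoursToStyle(1);  // dark-style defaults
}
```

The real slot additionally repaints the colour swatch frames and the syntax-highlighting tree items, which this sketch omits.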
mmm a / src / compiler / statement / interface_statement . cpp <nl> ppp b / src / compiler / statement / interface_statement . cpp <nl> void InterfaceStatement : : outputCPPImpl ( CodeGenerator & cg , <nl> cg_printf ( " FORWARD_DECLARE_INTERFACE ( % s ) ; \ n " , clsName ) ; <nl> } <nl> } <nl> + if ( m_stmt ) { <nl> + cg . setContext ( CodeGenerator : : CppClassConstantsDecl ) ; <nl> + m_stmt - > outputCPP ( cg , ar ) ; <nl> + cg . setContext ( CodeGenerator : : CppForwardDeclaration ) ; <nl> + } <nl> break ; <nl> case CodeGenerator : : CppDeclaration : <nl> { <nl> void InterfaceStatement : : outputCPPImpl ( CodeGenerator & cg , <nl> } <nl> break ; <nl> case CodeGenerator : : CppImplementation : <nl> - / / do nothing <nl> + { <nl> + if ( m_stmt ) { <nl> + cg . setContext ( CodeGenerator : : CppClassConstantsImpl ) ; <nl> + m_stmt - > outputCPP ( cg , ar ) ; <nl> + cg . setContext ( CodeGenerator : : CppImplementation ) ; <nl> + } <nl> + } <nl> break ; <nl> case CodeGenerator : : CppFFIDecl : <nl> case CodeGenerator : : CppFFIImpl : <nl>
Add class constant declaration and implementation for interfaces
facebook/hhvm
8cc7d34373e9ae678566d6da2ef13d820b227bb4
2011-01-23T23:16:27Z
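The hhvm patch above emits class-constant declarations and definitions for interfaces by temporarily switching the code generator's context before walking the nested statement list, then restoring the previous context. Here is a small sketch of that switch-emit-restore shape; the types below are hypothetical stand-ins, not HPHP's actual CodeGenerator or statement classes.

```cpp
// Hypothetical types standing in for HPHP's CodeGenerator and statement
// classes; only the switch-emit-restore shape of the patch is reproduced.
#include <iostream>

enum class Context {
    ForwardDeclaration, ClassConstantsDecl, Implementation, ClassConstantsImpl
};

struct CodeGen {
    Context context = Context::ForwardDeclaration;
    void setContext(Context c) { context = c; }
};

struct StatementList {
    // What gets emitted depends on the generator's current context.
    void outputCPP(CodeGen& cg) const {
        if (cg.context == Context::ClassConstantsDecl)
            std::cout << "extern const int Iface$$FOO;\n";
        else if (cg.context == Context::ClassConstantsImpl)
            std::cout << "const int Iface$$FOO = 1;\n";
    }
};

// Mirrors the shape of InterfaceStatement::outputCPPImpl in the patch:
// switch context, emit the nested statements, restore the previous context.
void emitInterface(CodeGen& cg, const StatementList* stmt) {
    // Forward-declaration pass now also declares the interface's constants.
    if (stmt) {
        cg.setContext(Context::ClassConstantsDecl);
        stmt->outputCPP(cg);
        cg.setContext(Context::ForwardDeclaration);  // restore
    }
    // Implementation pass now defines them instead of doing nothing.
    if (stmt) {
        cg.setContext(Context::ClassConstantsImpl);
        stmt->outputCPP(cg);
        cg.setContext(Context::Implementation);      // restore
    }
}

int main() {
    CodeGen cg;
    StatementList stmts;
    emitInterface(cg, &stmts);
}
```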
mmm a / Documentation / Books / Users / Transactions / LockingAndIsolation . mdpp <nl> ppp b / Documentation / Books / Users / Transactions / LockingAndIsolation . mdpp <nl> collections are potentially non - repeatable . <nl> <nl> * * EXAMPLES * * : <nl> <nl> - db . _executeTransaction ( { <nl> - collections : { <nl> - read : " users " <nl> - } , <nl> - action : function ( ) { <nl> - / / execute an AQL query that traverses a graph starting at a " users " vertex . <nl> - / / it is yet unknown into which other collections the query will traverse <nl> - db . _createStatement ( { <nl> - query : " FOR t IN TRAVERSAL ( users , connections , " users / 1234 " , " any " , { } ) RETURN t " <nl> - } ) . execute ( ) . toArray ( ) . forEach ( function ( d ) { <nl> - / / . . . <nl> - } ) ; <nl> - } <nl> + ` ` ` js <nl> + db . _executeTransaction ( { <nl> + collections : { <nl> + read : " users " <nl> + } , <nl> + action : function ( ) { <nl> + / / execute an AQL query that traverses a graph starting at a " users " vertex . <nl> + / / it is yet unknown into which other collections the query will traverse <nl> + db . _createStatement ( { <nl> + query : " FOR t IN TRAVERSAL ( users , connections , " users / 1234 " , " any " , { } ) RETURN t " <nl> + } ) . execute ( ) . toArray ( ) . forEach ( function ( d ) { <nl> + / / . . . <nl> } ) ; <nl> - <nl> + } <nl> + } ) ; <nl> + ` ` ` <nl> <nl> This automatic lazy addition of collections to a transaction also introduces the <nl> possibility of deadlocks . Deadlocks may occur if there are concurrent transactions <nl> that try to acquire locks on the same collections lazily . <nl> <nl> To recover from a deadlock state , ArangoDB will give up waiting for a collection <nl> after a configurable amount of time . The wait time can be specified per transaction <nl> - using the optional * lockTimeout * attribute . If no value is specified , some default <nl> + using the optional * lockTimeout * attribute . If no value is specified , some default <nl> value will be applied . <nl> <nl> If ArangoDB was waited at least * lockTimeout * seconds during lock acquisition , it <nl> at least as many lock acquisition attempts as there are collections used in the <nl> transaction . The total lock wait time may thus be much higher than the value of <nl> * lockTimeout * . <nl> <nl> - <nl> To avoid both deadlocks and non - repeatable reads , all collections used in a <nl> transaction should always be specified if known in advance . <nl> <nl> + In order to make a transaction fail when a non - declared collection is used inside <nl> + a transaction for reading , the optional * allowImplicit * sub - attribute of * collections * <nl> + can be set to * false * : <nl> + <nl> + ` ` ` js <nl> + db . _executeTransaction ( { <nl> + collections : { <nl> + read : " users " , <nl> + allowImplicit : false <nl> + } , <nl> + action : function ( ) { <nl> + / / the below query will now fail because the collection connections has not <nl> + / / been specified in the list of collections used by the transaction <nl> + db . _createStatement ( { <nl> + query : " FOR t IN TRAVERSAL ( users , connections , " users / 1234 " , " any " , { } ) RETURN t " <nl> + } ) . execute ( ) . toArray ( ) . forEach ( function ( d ) { <nl> + / / . . . <nl> + } ) ; <nl> + } <nl> + } ) ; <nl> + ` ` ` <nl> + <nl> + The default value for * allowImplicit * is * true * . 
Write - accessing collections that <nl> + have not been declared in the * collections * array is never possible , regardless of <nl> + the value of * allowImplicit * . <nl> + <nl> mmm a / Documentation / Books / Users / Transactions / TransactionInvocation . mdpp <nl> ppp b / Documentation / Books / Users / Transactions / TransactionInvocation . mdpp <nl> db . _executeTransaction ( { <nl> * read * and * write * are optional attributes , and only need to be specified if <nl> the operations inside the transactions demand for it . <nl> <nl> - The contents of * read * or * write * can each be lists with collection names or a <nl> + The contents of * read * or * write * can each be lists arrays collection names or a <nl> single collection name ( as a string ) : <nl> <nl> ` ` ` js <nl> Even without specifying them , it is still possible to read from such collections <nl> from within a transaction , but with relaxed isolation . Please refer to <nl> [ Transactions Locking ] ( . . / Transactions / LockingAndIsolation . md ) for more details . <nl> <nl> + In order to make a transaction fail when a non - declared collection is used inside <nl> + for reading , the optional * allowImplicit * sub - attribute of * collections * can be <nl> + set to * false * : <nl> + <nl> + ` ` ` js <nl> + db . _executeTransaction ( { <nl> + collections : { <nl> + read : " recommendations " , <nl> + allowImplicit : false / * this disallows read access to other collections <nl> + than specified * / <nl> + } , <nl> + action : function ( ) { <nl> + var db = require ( " org / arangodb " ) . db ; <nl> + return db . foobar . toArray ( ) ; / * will fail because db . foobar must not be accessed <nl> + for reading inside this transaction * / <nl> + } <nl> + } ) ; <nl> + ` ` ` <nl> + <nl> + The default value for * allowImplicit * is * true * . Write - accessing collections that <nl> + have not been declared in the * collections * array is never possible , regardless of <nl> + the value of * allowImplicit * . <nl> + <nl> ! SUBSECTION Declaration of data modification and retrieval operations <nl> <nl> All data modification and retrieval operations that are to be executed inside <nl> mmm a / arangod / Utils / ExplicitTransaction . h <nl> ppp b / arangod / Utils / ExplicitTransaction . h <nl> namespace triagens { <nl> std : : vector < std : : string > const & writeCollections , <nl> double lockTimeout , <nl> bool waitForSync , <nl> - bool embed ) <nl> + bool embed , <nl> + bool allowImplicitCollections ) <nl> : Transaction ( new V8TransactionContext ( embed ) , vocbase , 0 ) { <nl> <nl> this - > addHint ( TRI_TRANSACTION_HINT_LOCK_ENTIRELY , false ) ; <nl> namespace triagens { <nl> if ( waitForSync ) { <nl> this - > setWaitForSync ( ) ; <nl> } <nl> + <nl> + this - > setAllowImplicitCollections ( allowImplicitCollections ) ; <nl> <nl> for ( auto const & it : readCollections ) { <nl> this - > addCollection ( it , TRI_TRANSACTION_READ ) ; <nl> mmm a / arangod / Utils / ReplicationTransaction . h <nl> ppp b / arangod / Utils / ReplicationTransaction . 
h <nl> namespace triagens { <nl> TRI_transaction_collection_t * trxCollection = TRI_GetCollectionTransaction ( this - > _trx , cid , TRI_TRANSACTION_WRITE ) ; <nl> <nl> if ( trxCollection = = nullptr ) { <nl> - int res = TRI_AddCollectionTransaction ( this - > _trx , cid , TRI_TRANSACTION_WRITE , 0 , true ) ; <nl> + int res = TRI_AddCollectionTransaction ( this - > _trx , cid , TRI_TRANSACTION_WRITE , 0 , true , true ) ; <nl> <nl> if ( res = = TRI_ERROR_NO_ERROR ) { <nl> res = TRI_EnsureCollectionsTransaction ( this - > _trx ) ; <nl> mmm a / arangod / Utils / Transaction . h <nl> ppp b / arangod / Utils / Transaction . h <nl> namespace triagens { <nl> _hints ( 0 ) , <nl> _timeout ( 0 . 0 ) , <nl> _waitForSync ( false ) , <nl> + _allowImplicitCollections ( true ) , <nl> _isReal ( true ) , <nl> _trx ( nullptr ) , <nl> _vocbase ( vocbase ) , <nl> namespace triagens { <nl> _waitForSync = true ; <nl> } <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief set the allowImplicitCollections property <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + void setAllowImplicitCollections ( bool value ) { <nl> + _allowImplicitCollections = value ; <nl> + } <nl> + <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> / / / @ brief read - or write - lock a collection <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> namespace triagens { <nl> return TRI_ERROR_TRANSACTION_INTERNAL ; <nl> } <nl> <nl> - int res = TRI_LockCollectionTransaction ( trxCollection , type , _nestingLevel ) ; <nl> - <nl> - return res ; <nl> + return TRI_LockCollectionTransaction ( trxCollection , type , _nestingLevel ) ; <nl> } <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> namespace triagens { <nl> return TRI_ERROR_TRANSACTION_INTERNAL ; <nl> } <nl> <nl> - int res = TRI_UnlockCollectionTransaction ( trxCollection , type , _nestingLevel ) ; <nl> - <nl> - return res ; <nl> + return TRI_UnlockCollectionTransaction ( trxCollection , type , _nestingLevel ) ; <nl> } <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> namespace triagens { <nl> return false ; <nl> } <nl> <nl> - bool locked = TRI_IsLockedCollectionTransaction ( trxCollection , type , _nestingLevel ) ; <nl> - <nl> - return locked ; <nl> + return TRI_IsLockedCollectionTransaction ( trxCollection , type , _nestingLevel ) ; <nl> } <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> namespace triagens { <nl> TRI_transaction_type_e type ) { <nl> TRI_ASSERT ( _trx ! = nullptr ) ; <nl> <nl> - int res = TRI_AddCollectionTransaction ( _trx , cid , type , _nestingLevel , false ) ; <nl> + int res = TRI_AddCollectionTransaction ( _trx , cid , type , _nestingLevel , false , _allowImplicitCollections ) ; <nl> <nl> if ( res ! 
= TRI_ERROR_NO_ERROR ) { <nl> return registerError ( res ) ; <nl> namespace triagens { <nl> res = TRI_ERROR_TRANSACTION_INTERNAL ; <nl> } <nl> else { <nl> - res = TRI_AddCollectionTransaction ( _trx , cid , type , _nestingLevel , false ) ; <nl> + res = TRI_AddCollectionTransaction ( _trx , cid , type , _nestingLevel , false , _allowImplicitCollections ) ; <nl> } <nl> <nl> if ( res ! = TRI_ERROR_NO_ERROR ) { <nl> namespace triagens { <nl> <nl> bool _waitForSync ; <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief allow implicit collections for transaction <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + bool _allowImplicitCollections ; <nl> + <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> / / / @ brief whether or not this is a " real " transaction <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> mmm a / arangod / V8Server / v8 - vocbase . cpp <nl> ppp b / arangod / V8Server / v8 - vocbase . cpp <nl> static void JS_Transaction ( const v8 : : FunctionCallbackInfo < v8 : : Value > & args ) { <nl> } <nl> <nl> / / extract collections <nl> - v8 : : Handle < v8 : : Array > collections = v8 : : Handle < v8 : : Array > : : Cast ( object - > Get ( TRI_V8_ASCII_STRING ( " collections " ) ) ) ; <nl> + v8 : : Handle < v8 : : Object > collections = v8 : : Handle < v8 : : Object > : : Cast ( object - > Get ( TRI_V8_ASCII_STRING ( " collections " ) ) ) ; <nl> <nl> if ( collections . IsEmpty ( ) ) { <nl> TRI_V8_THROW_EXCEPTION_PARAMETER ( collectionError ) ; <nl> static void JS_Transaction ( const v8 : : FunctionCallbackInfo < v8 : : Value > & args ) { <nl> bool isValid = true ; <nl> vector < string > readCollections ; <nl> vector < string > writeCollections ; <nl> + <nl> + bool allowImplicitCollections = true ; <nl> + if ( collections - > Has ( TRI_V8_ASCII_STRING ( " allowImplicit " ) ) ) { <nl> + allowImplicitCollections = TRI_ObjectToBoolean ( collections - > Get ( TRI_V8_ASCII_STRING ( " allowImplicit " ) ) ) ; <nl> + } <nl> <nl> / / collections . read <nl> if ( collections - > Has ( TRI_V8_ASCII_STRING ( " read " ) ) ) { <nl> static void JS_Transaction ( const v8 : : FunctionCallbackInfo < v8 : : Value > & args ) { <nl> writeCollections , <nl> lockTimeout , <nl> waitForSync , <nl> - embed ) ; <nl> + embed , <nl> + allowImplicitCollections ) ; <nl> <nl> int res = trx . begin ( ) ; <nl> <nl> mmm a / arangod / VocBase / transaction . cpp <nl> ppp b / arangod / VocBase / transaction . cpp <nl> int TRI_AddCollectionTransaction ( TRI_transaction_t * trx , <nl> TRI_voc_cid_t cid , <nl> TRI_transaction_type_e accessType , <nl> int nestingLevel , <nl> - bool force ) { <nl> + bool force , <nl> + bool allowImplicitCollections ) { <nl> <nl> LOG_TRX ( trx , nestingLevel , " adding collection % llu " , ( unsigned long long ) cid ) ; <nl> <nl> int TRI_AddCollectionTransaction ( TRI_transaction_t * trx , <nl> return TRI_ERROR_TRANSACTION_UNREGISTERED_COLLECTION ; <nl> } <nl> <nl> + if ( accessType = = TRI_TRANSACTION_READ & & ! 
allowImplicitCollections ) { <nl> + return TRI_ERROR_TRANSACTION_UNREGISTERED_COLLECTION ; <nl> + } <nl> + <nl> / / collection was not contained . now create and insert it <nl> trxCollection = CreateCollection ( trx , cid , accessType , nestingLevel ) ; <nl> <nl> mmm a / arangod / VocBase / transaction . h <nl> ppp b / arangod / VocBase / transaction . h <nl> int TRI_AddCollectionTransaction ( TRI_transaction_t * , <nl> TRI_voc_cid_t , <nl> TRI_transaction_type_e , <nl> int , <nl> + bool , <nl> bool ) ; <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> mmm a / js / common / tests / shell - transactions . js <nl> ppp b / js / common / tests / shell - transactions . js <nl> function TransactionsInvocationsSuite ( ) { <nl> } ; <nl> } <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief test suite <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + function TransactionsImplicitCollectionsSuite ( ) { <nl> + ' use strict ' ; <nl> + <nl> + var cn1 = " UnitTestsTransaction1 " ; <nl> + var cn2 = " UnitTestsTransaction2 " ; <nl> + var c1 = null ; <nl> + var c2 = null ; <nl> + <nl> + return { <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief set up <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + setUp : function ( ) { <nl> + db . _drop ( cn1 ) ; <nl> + db . _drop ( cn2 ) ; <nl> + c1 = db . _create ( cn1 ) ; <nl> + c2 = db . _create ( cn2 ) ; <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief tear down <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + tearDown : function ( ) { <nl> + c1 = null ; <nl> + c2 = null ; <nl> + db . _drop ( cn1 ) ; <nl> + db . _drop ( cn2 ) ; <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses implicitly declared collections in AQL <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseInAql : function ( ) { <nl> + var result = db . _executeTransaction ( { <nl> + collections : { allowImplicit : false } , <nl> + action : " function ( params ) { " + <nl> + " return require ( ' internal ' ) . db . _query ( ' FOR i IN @ @ cn1 RETURN i ' , { ' @ cn1 ' : params . cn1 } ) . 
toArray ( ) ; } " , <nl> + params : { cn1 : cn1 } <nl> + } ) ; <nl> + assertEqual ( [ ] , result ) ; <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses implicitly declared collections in AQL <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseImplicitAql : function ( ) { <nl> + var result = db . _executeTransaction ( { <nl> + collections : { allowImplicit : true , read : cn1 } , <nl> + action : " function ( params ) { " + <nl> + " return require ( ' internal ' ) . db . _query ( ' FOR i IN @ @ cn1 RETURN i ' , { ' @ cn1 ' : params . cn1 } ) . toArray ( ) ; } " , <nl> + params : { cn1 : cn1 } <nl> + } ) ; <nl> + assertEqual ( [ ] , result ) ; <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses implicitly declared collections in AQL <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseNoImplicitAql : function ( ) { <nl> + try { <nl> + db . _executeTransaction ( { <nl> + collections : { allowImplicit : false , read : cn2 } , <nl> + action : " function ( params ) { " + <nl> + " return require ( ' internal ' ) . db . _query ( ' FOR i IN @ @ cn1 RETURN i ' , { ' @ cn1 ' : params . cn1 } ) . toArray ( ) ; } " , <nl> + params : { cn1 : cn1 } <nl> + } ) ; <nl> + fail ( ) ; <nl> + } <nl> + catch ( err ) { <nl> + assertEqual ( ERRORS . ERROR_TRANSACTION_UNREGISTERED_COLLECTION . code , err . errorNum ) ; <nl> + } <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an implicitly declared collection for writing <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseForWriting : function ( ) { <nl> + try { <nl> + db . _executeTransaction ( { <nl> + collections : { } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; db . _collection ( params . cn1 ) . truncate ( ) ; } " , <nl> + params : { cn1 : cn1 } <nl> + } ) ; <nl> + fail ( ) ; <nl> + } <nl> + catch ( err ) { <nl> + assertEqual ( ERRORS . ERROR_TRANSACTION_UNREGISTERED_COLLECTION . code , err . errorNum ) ; <nl> + } <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an implicitly declared collection for writing <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseReadForWriting : function ( ) { <nl> + try { <nl> + db . _executeTransaction ( { <nl> + collections : { read : cn1 } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; db . _collection ( params . cn1 ) . 
truncate ( ) ; } " , <nl> + params : { cn1 : cn1 } <nl> + } ) ; <nl> + fail ( ) ; <nl> + } <nl> + catch ( err ) { <nl> + assertEqual ( ERRORS . ERROR_TRANSACTION_UNREGISTERED_COLLECTION . code , err . errorNum ) ; <nl> + } <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an implicitly declared collection for writing <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseOtherForWriting : function ( ) { <nl> + try { <nl> + db . _executeTransaction ( { <nl> + collections : { write : cn2 } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; db . _collection ( params . cn1 ) . truncate ( ) ; } " , <nl> + params : { cn1 : cn1 } <nl> + } ) ; <nl> + fail ( ) ; <nl> + } <nl> + catch ( err ) { <nl> + assertEqual ( ERRORS . ERROR_TRANSACTION_UNREGISTERED_COLLECTION . code , err . errorNum ) ; <nl> + } <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an explicitly declared collection for reading <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseForRead : function ( ) { <nl> + var result = db . _executeTransaction ( { <nl> + collections : { read : cn1 } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; return db . _collection ( params . cn1 ) . toArray ( ) ; } " , <nl> + params : { cn1 : cn1 } <nl> + } ) ; <nl> + <nl> + assertEqual ( [ ] , result ) ; <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an explicitly declared collection for writing <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseForWriteAllowImplicit : function ( ) { <nl> + db . _executeTransaction ( { <nl> + collections : { write : cn1 , allowImplicit : true } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; db . _collection ( params . cn1 ) . truncate ( ) ; } " , <nl> + params : { cn1 : cn1 } <nl> + } ) ; <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an explicitly declared collection for writing <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseForWriteNoAllowImplicit : function ( ) { <nl> + db . _executeTransaction ( { <nl> + collections : { write : cn1 , allowImplicit : false } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; db . _collection ( params . cn1 ) . 
truncate ( ) ; } " , <nl> + params : { cn1 : cn1 } <nl> + } ) ; <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an implicitly declared collection for reading <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseOtherForRead : function ( ) { <nl> + var result = db . _executeTransaction ( { <nl> + collections : { read : cn1 } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; return db . _collection ( params . cn2 ) . toArray ( ) ; } " , <nl> + params : { cn2 : cn2 } <nl> + } ) ; <nl> + <nl> + assertEqual ( [ ] , result ) ; <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an implicitly declared collection for reading <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseOtherForReadAllowImplicit : function ( ) { <nl> + var result = db . _executeTransaction ( { <nl> + collections : { read : cn1 , allowImplicit : true } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; return db . _collection ( params . cn2 ) . toArray ( ) ; } " , <nl> + params : { cn2 : cn2 } <nl> + } ) ; <nl> + <nl> + assertEqual ( [ ] , result ) ; <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an implicitly declared collection for reading <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseOtherForReadNoAllowImplicit : function ( ) { <nl> + try { <nl> + db . _executeTransaction ( { <nl> + collections : { read : cn1 , allowImplicit : false } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; return db . _collection ( params . cn2 ) . toArray ( ) ; } " , <nl> + params : { cn2 : cn2 } <nl> + } ) ; <nl> + fail ( ) ; <nl> + } <nl> + catch ( err ) { <nl> + assertEqual ( ERRORS . ERROR_TRANSACTION_UNREGISTERED_COLLECTION . code , err . errorNum ) ; <nl> + } <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an implicitly declared collection for writing <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseOtherForWriteNoAllowImplicit : function ( ) { <nl> + try { <nl> + db . _executeTransaction ( { <nl> + collections : { read : cn1 , allowImplicit : false } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; db . _collection ( params . cn2 ) . truncate ( ) ; } " , <nl> + params : { cn2 : cn2 } <nl> + } ) ; <nl> + fail ( ) ; <nl> + } <nl> + catch ( err ) { <nl> + assertEqual ( ERRORS . ERROR_TRANSACTION_UNREGISTERED_COLLECTION . code , err . 
errorNum ) ; <nl> + } <nl> + } , <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief uses an implicitly declared collection for writing <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + testUseOtherForWriteAllowImplicit : function ( ) { <nl> + try { <nl> + db . _executeTransaction ( { <nl> + collections : { read : cn1 , allowImplicit : true } , <nl> + action : " function ( params ) { var db = require ( ' internal ' ) . db ; db . _collection ( params . cn2 ) . truncate ( ) ; } " , <nl> + params : { cn2 : cn2 } <nl> + } ) ; <nl> + fail ( ) ; <nl> + } <nl> + catch ( err ) { <nl> + assertEqual ( ERRORS . ERROR_TRANSACTION_UNREGISTERED_COLLECTION . code , err . errorNum ) ; <nl> + } <nl> + } <nl> + <nl> + } ; <nl> + } <nl> + <nl> / / mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> / / - - SECTION - - main <nl> / / mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> function TransactionsInvocationsSuite ( ) { <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> <nl> jsunity . run ( TransactionsInvocationsSuite ) ; <nl> + jsunity . run ( TransactionsImplicitCollectionsSuite ) ; <nl> <nl> return jsunity . done ( ) ; <nl> <nl>
issue: added optional *allowImplicit* sub-attribute for transactions
arangodb/arangodb
4249095456b3cfe04604017d99b0f6ed29a478d6
2015-09-24T13:50:00Z
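The arangodb record above threads a new allowImplicitCollections flag down to TRI_AddCollectionTransaction so that collections not declared in the transaction's collections object can be rejected when they are first touched. The following standalone sketch captures the gate under the semantics the patched documentation describes (implicit reads allowed only when allowImplicit is true, implicit writes never allowed); the Transaction type and addCollection name are hypothetical, not ArangoDB's API.

```cpp
// Standalone sketch of the gate the patch adds; Transaction and addCollection
// are hypothetical names, not ArangoDB's API. Semantics follow the patched
// docs: implicit reads are allowed only when allowImplicit is true, and
// implicit writes are never allowed.
#include <cassert>
#include <set>
#include <string>

enum class AccessType { Read, Write };

struct Transaction {
    std::set<std::string> declared;  // collections named when the trx was set up
    bool allowImplicit = true;       // the new "allowImplicit" option

    // Returns false where the real code returns
    // TRI_ERROR_TRANSACTION_UNREGISTERED_COLLECTION.
    bool addCollection(const std::string& name, AccessType type) const {
        if (declared.count(name) > 0) return true;    // declared up front: always fine
        if (type == AccessType::Write) return false;  // implicit writes: never
        return allowImplicit;                         // implicit reads: only if allowed
    }
};

int main() {
    Transaction trx;
    trx.declared = { "users" };
    trx.allowImplicit = false;

    assert(trx.addCollection("users", AccessType::Read));          // declared collection
    assert(!trx.addCollection("connections", AccessType::Read));   // implicit read rejected
    assert(!trx.addCollection("connections", AccessType::Write));  // implicit write rejected
    return 0;
}
```

The rejected path is what the new shell tests in the same commit assert on via ERROR_TRANSACTION_UNREGISTERED_COLLECTION.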
mmm a / DEPS <nl> ppp b / DEPS <nl> vars = { <nl> <nl> deps = { <nl> ' v8 / build ' : <nl> - Var ( ' chromium_url ' ) + ' / chromium / src / build . git ' + ' @ ' + ' 6455acf1c78ab1f3a9be215adaa0fd7ce8017dea ' , <nl> + Var ( ' chromium_url ' ) + ' / chromium / src / build . git ' + ' @ ' + ' 5a371bcc0efe2cc84f384f14bdf5eaf5fe3e271a ' , <nl> ' v8 / third_party / depot_tools ' : <nl> - Var ( ' chromium_url ' ) + ' / chromium / tools / depot_tools . git ' + ' @ ' + ' 98f1e59b41c6c580cd168ac4456bf27d78c12a95 ' , <nl> + Var ( ' chromium_url ' ) + ' / chromium / tools / depot_tools . git ' + ' @ ' + ' 2b71832f6d8dc74119589992836cf95aeb8a9842 ' , <nl> ' v8 / third_party / icu ' : <nl> Var ( ' chromium_url ' ) + ' / chromium / deps / icu . git ' + ' @ ' + ' b029971f1fc6b20d06887c47c7afebd5881f31ff ' , <nl> ' v8 / third_party / instrumented_libraries ' : <nl> deps = { <nl> ' condition ' : ' checkout_android ' , <nl> } , <nl> ' v8 / third_party / catapult ' : { <nl> - ' url ' : Var ( ' chromium_url ' ) + ' / catapult . git ' + ' @ ' + ' b026043a43f9ce3f37c1cd57269f92cb8bee756c ' , <nl> + ' url ' : Var ( ' chromium_url ' ) + ' / catapult . git ' + ' @ ' + ' ed6fe0f638403e1afd377e38975e4fd430f53432 ' , <nl> ' condition ' : ' checkout_android ' , <nl> } , <nl> ' v8 / third_party / colorama / src ' : { <nl> deps = { <nl> ' condition ' : ' checkout_android ' , <nl> } , <nl> ' v8 / third_party / fuchsia - sdk ' : { <nl> - ' url ' : Var ( ' chromium_url ' ) + ' / chromium / src / third_party / fuchsia - sdk . git ' + ' @ ' + ' bac04339dfb1e41704eca419eb0196028a15b8c3 ' , <nl> + ' url ' : Var ( ' chromium_url ' ) + ' / chromium / src / third_party / fuchsia - sdk . git ' + ' @ ' + ' 29de0c2af139b63a8a59ceeeb732cf4b049f4f0d ' , <nl> ' condition ' : ' checkout_fuchsia ' , <nl> } , <nl> ' v8 / third_party / googletest / src ' : <nl> deps = { <nl> ' dep_type ' : ' cipd ' , <nl> } , <nl> ' v8 / tools / clang ' : <nl> - Var ( ' chromium_url ' ) + ' / chromium / src / tools / clang . git ' + ' @ ' + ' a245b955fe9cd620081ed267fae303c88d033fef ' , <nl> + Var ( ' chromium_url ' ) + ' / chromium / src / tools / clang . git ' + ' @ ' + ' 3041f30dd6b3fa4fb8ca7db6439bed372f4accc0 ' , <nl> ' v8 / tools / luci - go ' : <nl> - Var ( ' chromium_url ' ) + ' / chromium / src / tools / luci - go . git ' + ' @ ' + ' 445d7c4b6a4f10e188edb395b132e3996b127691 ' , <nl> + Var ( ' chromium_url ' ) + ' / chromium / src / tools / luci - go . git ' + ' @ ' + ' 86c09e88368d0eb01a08841b7f959b63330f30f7 ' , <nl> ' v8 / test / wasm - js / data ' : <nl> - Var ( ' chromium_url ' ) + ' / external / github . com / WebAssembly / spec . git ' + ' @ ' + ' 7e3c46a072a13fbfc871a3b78ddf7bd7e50a0ddd ' , <nl> + Var ( ' chromium_url ' ) + ' / external / github . com / WebAssembly / spec . git ' + ' @ ' + ' b0e783867ebc917c2b69a55f55ca5500e298e356 ' , <nl> } <nl> <nl> recursedeps = [ <nl> mmm a / third_party / googletest / BUILD . gn <nl> ppp b / third_party / googletest / BUILD . gn <nl> source_set ( " gtest " ) { <nl> deps = [ ] <nl> <nl> if ( is_fuchsia ) { <nl> - deps + = [ " / / third_party / fuchsia - sdk : fdio " ] <nl> + deps + = [ " / / third_party / fuchsia - sdk / sdk : fdio " ] <nl> } <nl> } <nl> <nl>
Update V8 DEPS.
v8/v8
9929a238ab9ab8441b00fb8c23008aec5c373e53
2018-10-25T07:28:05Z
mmm a / tensorflow / core / ops / compat / ops_history . v0 . pbtxt <nl> ppp b / tensorflow / core / ops / compat / ops_history . v0 . pbtxt <nl> op { <nl> } <nl> } <nl> } <nl> + op { <nl> + name : " SparseSparseMaximum " <nl> + input_arg { <nl> + name : " a_indices " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " a_values " <nl> + type_attr : " T " <nl> + } <nl> + input_arg { <nl> + name : " a_shape " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " b_indices " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " b_values " <nl> + type_attr : " T " <nl> + } <nl> + input_arg { <nl> + name : " b_shape " <nl> + type : DT_INT64 <nl> + } <nl> + output_arg { <nl> + name : " output_indices " <nl> + type : DT_INT64 <nl> + } <nl> + output_arg { <nl> + name : " output_values " <nl> + type_attr : " T " <nl> + } <nl> + attr { <nl> + name : " T " <nl> + type : " type " <nl> + allowed_values { <nl> + list { <nl> + type : DT_FLOAT <nl> + type : DT_DOUBLE <nl> + type : DT_INT32 <nl> + type : DT_INT64 <nl> + type : DT_UINT8 <nl> + type : DT_INT16 <nl> + type : DT_INT8 <nl> + type : DT_UINT16 <nl> + type : DT_HALF <nl> + } <nl> + } <nl> + } <nl> + } <nl> + op { <nl> + name : " SparseSparseMinimum " <nl> + input_arg { <nl> + name : " a_indices " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " a_values " <nl> + type_attr : " T " <nl> + } <nl> + input_arg { <nl> + name : " a_shape " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " b_indices " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " b_values " <nl> + type_attr : " T " <nl> + } <nl> + input_arg { <nl> + name : " b_shape " <nl> + type : DT_INT64 <nl> + } <nl> + output_arg { <nl> + name : " output_indices " <nl> + type : DT_INT64 <nl> + } <nl> + output_arg { <nl> + name : " output_values " <nl> + type_attr : " T " <nl> + } <nl> + attr { <nl> + name : " T " <nl> + type : " type " <nl> + allowed_values { <nl> + list { <nl> + type : DT_FLOAT <nl> + type : DT_DOUBLE <nl> + type : DT_INT64 <nl> + type : DT_INT32 <nl> + type : DT_UINT8 <nl> + type : DT_UINT16 <nl> + type : DT_INT16 <nl> + type : DT_INT8 <nl> + type : DT_COMPLEX64 <nl> + type : DT_COMPLEX128 <nl> + type : DT_QINT8 <nl> + type : DT_QUINT8 <nl> + type : DT_QINT32 <nl> + type : DT_HALF <nl> + } <nl> + } <nl> + } <nl> + } <nl> op { <nl> name : " SparseSplit " <nl> input_arg { <nl> mmm a / tensorflow / core / ops / ops . pbtxt <nl> ppp b / tensorflow / core / ops / ops . pbtxt <nl> op { <nl> summary : " Computes softmax cross entropy cost and gradients to backpropagate . " <nl> description : " Unlike ` SoftmaxCrossEntropyWithLogits ` , this operation does not accept \ na matrix of label probabilities , but rather a single label per row \ nof features . This label is considered to have probability 1 . 0 for the \ ngiven row . \ n \ nInputs are the logits , not probabilities . " <nl> } <nl> + op { <nl> + name : " SparseSparseMaximum " <nl> + input_arg { <nl> + name : " a_indices " <nl> + description : " 2 - D . ` N x R ` matrix with the indices of non - empty values in a \ nSparseTensor , in the canonical lexicographic ordering . " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " a_values " <nl> + description : " 1 - D . ` N ` non - empty values corresponding to ` a_indices ` . " <nl> + type_attr : " T " <nl> + } <nl> + input_arg { <nl> + name : " a_shape " <nl> + description : " 1 - D . Shape of the input SparseTensor . 
" <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " b_indices " <nl> + description : " counterpart to ` a_indices ` for the other operand . " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " b_values " <nl> + description : " counterpart to ` a_values ` for the other operand ; must be of the same dtype . " <nl> + type_attr : " T " <nl> + } <nl> + input_arg { <nl> + name : " b_shape " <nl> + description : " counterpart to ` a_shape ` for the other operand ; the two shapes must be equal . " <nl> + type : DT_INT64 <nl> + } <nl> + output_arg { <nl> + name : " output_indices " <nl> + description : " 2 - D . The indices of the output SparseTensor . " <nl> + type : DT_INT64 <nl> + } <nl> + output_arg { <nl> + name : " output_values " <nl> + description : " 1 - D . The values of the output SparseTensor . " <nl> + type_attr : " T " <nl> + } <nl> + attr { <nl> + name : " T " <nl> + type : " type " <nl> + allowed_values { <nl> + list { <nl> + type : DT_FLOAT <nl> + type : DT_DOUBLE <nl> + type : DT_INT32 <nl> + type : DT_INT64 <nl> + type : DT_UINT8 <nl> + type : DT_INT16 <nl> + type : DT_INT8 <nl> + type : DT_UINT16 <nl> + type : DT_HALF <nl> + } <nl> + } <nl> + } <nl> + summary : " Returns the element - wise max of two SparseTensors . " <nl> + description : " Assumes the two SparseTensors have the same shape , i . e . , no broadcasting . " <nl> + } <nl> + op { <nl> + name : " SparseSparseMinimum " <nl> + input_arg { <nl> + name : " a_indices " <nl> + description : " 2 - D . ` N x R ` matrix with the indices of non - empty values in a \ nSparseTensor , in the canonical lexicographic ordering . " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " a_values " <nl> + description : " 1 - D . ` N ` non - empty values corresponding to ` a_indices ` . " <nl> + type_attr : " T " <nl> + } <nl> + input_arg { <nl> + name : " a_shape " <nl> + description : " 1 - D . Shape of the input SparseTensor . " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " b_indices " <nl> + description : " counterpart to ` a_indices ` for the other operand . " <nl> + type : DT_INT64 <nl> + } <nl> + input_arg { <nl> + name : " b_values " <nl> + description : " counterpart to ` a_values ` for the other operand ; must be of the same dtype . " <nl> + type_attr : " T " <nl> + } <nl> + input_arg { <nl> + name : " b_shape " <nl> + description : " counterpart to ` a_shape ` for the other operand ; the two shapes must be equal . " <nl> + type : DT_INT64 <nl> + } <nl> + output_arg { <nl> + name : " output_indices " <nl> + description : " 2 - D . The indices of the output SparseTensor . " <nl> + type : DT_INT64 <nl> + } <nl> + output_arg { <nl> + name : " output_values " <nl> + description : " 1 - D . The values of the output SparseTensor . " <nl> + type_attr : " T " <nl> + } <nl> + attr { <nl> + name : " T " <nl> + type : " type " <nl> + allowed_values { <nl> + list { <nl> + type : DT_FLOAT <nl> + type : DT_DOUBLE <nl> + type : DT_INT64 <nl> + type : DT_INT32 <nl> + type : DT_UINT8 <nl> + type : DT_UINT16 <nl> + type : DT_INT16 <nl> + type : DT_INT8 <nl> + type : DT_COMPLEX64 <nl> + type : DT_COMPLEX128 <nl> + type : DT_QINT8 <nl> + type : DT_QUINT8 <nl> + type : DT_QINT32 <nl> + type : DT_HALF <nl> + } <nl> + } <nl> + } <nl> + summary : " Returns the element - wise min of two SparseTensors . " <nl> + description : " Assumes the two SparseTensors have the same shape , i . e . , no broadcasting . " <nl> + } <nl> op { <nl> name : " SparseSplit " <nl> input_arg { <nl>
Update ops-related pbtxt files.
tensorflow/tensorflow
0eeeced61ef55b7615690648e7c0cbdb998b54ea
2016-07-02T00:46:57Z
mmm a / src / json . hpp <nl> ppp b / src / json . hpp <nl> <nl> # define _NLOHMANN_JSON <nl> <nl> # include < algorithm > <nl> - # include < cassert > <nl> # include < functional > <nl> # include < initializer_list > <nl> # include < iostream > <nl> mmm a / src / json . hpp . re2c <nl> ppp b / src / json . hpp . re2c <nl> <nl> # define _NLOHMANN_JSON <nl> <nl> # include < algorithm > <nl> - # include < cassert > <nl> # include < functional > <nl> # include < initializer_list > <nl> # include < iostream > <nl> mmm a / test / unit . cpp <nl> ppp b / test / unit . cpp <nl> TEST_CASE ( " deserialization " ) <nl> <nl> TEST_CASE ( " iterator class " ) <nl> { <nl> - SECTION ( " initialization " ) <nl> + SECTION ( " construction " ) <nl> { <nl> - SECTION ( " constructor with object " ) <nl> + SECTION ( " constructor " ) <nl> { <nl> SECTION ( " null " ) <nl> { <nl> TEST_CASE ( " iterator class " ) <nl> json : : iterator it2 ( & j ) ; <nl> it2 = it ; <nl> } <nl> + } <nl> <nl> + SECTION ( " initialization " ) <nl> + { <nl> SECTION ( " set_begin " ) <nl> { <nl> SECTION ( " null " ) <nl> TEST_CASE ( " iterator class " ) <nl> CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > end ( ) ) ; <nl> } <nl> } <nl> + <nl> + SECTION ( " post - decrement " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : iterator it = j . end ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " number " ) <nl> + { <nl> + json j ( 17 ) ; <nl> + json : : iterator it = j . end ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : begin ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( { { " foo " , " bar " } } ) ; <nl> + json : : iterator it = j . end ( ) ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > begin ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( { 1 , 2 , 3 , 4 } ) ; <nl> + json : : iterator it = j . end ( ) ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . 
array - > end ( ) ) ; <nl> + } <nl> + } <nl> + <nl> + SECTION ( " pre - decrement " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : iterator it = j . end ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " number " ) <nl> + { <nl> + json j ( 17 ) ; <nl> + json : : iterator it = j . end ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : begin ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( { { " foo " , " bar " } } ) ; <nl> + json : : iterator it = j . end ( ) ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > begin ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( { 1 , 2 , 3 , 4 } ) ; <nl> + json : : iterator it = j . end ( ) ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> + SECTION ( " comparison " ) <nl> + { <nl> + json j_values = <nl> + { <nl> + nullptr , nullptr , <nl> + 17 , 42 , <nl> + 3 . 14159 , 23 . 
42 , <nl> + " foo " , " bar " , <nl> + true , false , <nl> + { 1 , 2 , 3 } , { " one " , " two " , " three " } , <nl> + { { " first " , 1 } , { " second " , 2 } } , { { " a " , " A " } , { " b " , { " B " } } } <nl> + } ; <nl> + <nl> + SECTION ( " comparison : equal " ) <nl> + { <nl> + std : : vector < std : : vector < bool > > expected = <nl> + { <nl> + { true , false , false , false , false , false , false , false , false , false , false , false , false , false } , <nl> + { false , true , false , false , false , false , false , false , false , false , false , false , false , false } , <nl> + { false , false , true , false , false , false , false , false , false , false , false , false , false , false } , <nl> + { false , false , false , true , false , false , false , false , false , false , false , false , false , false } , <nl> + { false , false , false , false , true , false , false , false , false , false , false , false , false , false } , <nl> + { false , false , false , false , false , true , false , false , false , false , false , false , false , false } , <nl> + { false , false , false , false , false , false , true , false , false , false , false , false , false , false } , <nl> + { false , false , false , false , false , false , false , true , false , false , false , false , false , false } , <nl> + { false , false , false , false , false , false , false , false , true , false , false , false , false , false } , <nl> + { false , false , false , false , false , false , false , false , false , true , false , false , false , false } , <nl> + { false , false , false , false , false , false , false , false , false , false , true , false , false , false } , <nl> + { false , false , false , false , false , false , false , false , false , false , false , true , false , false } , <nl> + { false , false , false , false , false , false , false , false , false , false , false , false , true , false } , <nl> + { false , false , false , false , false , false , false , false , false , false , false , false , false , true } <nl> + } ; <nl> + <nl> + for ( size_t i = 0 ; i < j_values . size ( ) ; + + i ) <nl> + { <nl> + for ( size_t j = 0 ; j < j_values . size ( ) ; + + j ) <nl> + { <nl> + / / check precomputed values <nl> + CHECK ( ( j_values [ i ] . begin ( ) = = j_values [ j ] . begin ( ) ) = = expected [ i ] [ j ] ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> + SECTION ( " comparison : not equal " ) <nl> + { <nl> + for ( size_t i = 0 ; i < j_values . size ( ) ; + + i ) <nl> + { <nl> + for ( size_t j = 0 ; j < j_values . size ( ) ; + + j ) <nl> + { <nl> + / / check definition <nl> + CHECK ( ( j_values [ i ] . begin ( ) ! = j_values [ j ] . begin ( ) ) = = not ( ( j_values [ i ] . begin ( ) = = <nl> + j_values [ j ] . 
begin ( ) ) ) ) ; <nl> + } <nl> + } <nl> + } <nl> + } <nl> + } <nl> + <nl> + TEST_CASE ( " const_iterator class " ) <nl> + { <nl> + SECTION ( " construction " ) <nl> + { <nl> + SECTION ( " constructor " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( json : : value_t : : object ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( json : : value_t : : array ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + } <nl> + } <nl> + <nl> + SECTION ( " copy assignment " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + json : : const_iterator it2 ( & j ) ; <nl> + it2 = it ; <nl> + } <nl> + } <nl> + <nl> + SECTION ( " initialization " ) <nl> + { <nl> + SECTION ( " set_begin " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + it . set_begin ( ) ; <nl> + CHECK ( it = = j . cbegin ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( json : : value_t : : object ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + it . set_begin ( ) ; <nl> + CHECK ( it = = j . cbegin ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( json : : value_t : : array ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + it . set_begin ( ) ; <nl> + CHECK ( it = = j . cbegin ( ) ) ; <nl> + } <nl> + } <nl> + <nl> + SECTION ( " set_end " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + it . set_end ( ) ; <nl> + CHECK ( it = = j . cend ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( json : : value_t : : object ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + it . set_end ( ) ; <nl> + CHECK ( it = = j . cend ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( json : : value_t : : array ) ; <nl> + json : : const_iterator it ( & j ) ; <nl> + it . set_end ( ) ; <nl> + CHECK ( it = = j . cend ( ) ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> + SECTION ( " element access " ) <nl> + { <nl> + SECTION ( " operator * " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK_THROWS_AS ( * it , std : : out_of_range ) ; <nl> + } <nl> + <nl> + SECTION ( " number " ) <nl> + { <nl> + json j ( 17 ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( * it = = json ( 17 ) ) ; <nl> + it = j . cend ( ) ; <nl> + CHECK_THROWS_AS ( * it , std : : out_of_range ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( { { " foo " , " bar " } } ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( * it = = json ( " bar " ) ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( { 1 , 2 , 3 , 4 } ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( * it = = json ( 1 ) ) ; <nl> + } <nl> + } <nl> + <nl> + SECTION ( " operator - > " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it = j . 
cbegin ( ) ; <nl> + CHECK_THROWS_AS ( it - > type_name ( ) , std : : out_of_range ) ; <nl> + } <nl> + <nl> + SECTION ( " number " ) <nl> + { <nl> + json j ( 17 ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it - > type_name ( ) = = " number " ) ; <nl> + it = j . cend ( ) ; <nl> + CHECK_THROWS_AS ( it - > type_name ( ) , std : : out_of_range ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( { { " foo " , " bar " } } ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it - > type_name ( ) = = " string " ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( { 1 , 2 , 3 , 4 } ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it - > type_name ( ) = = " number " ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> + SECTION ( " increment / decrement " ) <nl> + { <nl> + SECTION ( " post - increment " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + it + + ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " number " ) <nl> + { <nl> + json j ( 17 ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : begin ) ; <nl> + it + + ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + it + + ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( { { " foo " , " bar " } } ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > begin ( ) ) ; <nl> + it + + ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > end ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( { 1 , 2 , 3 , 4 } ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + it + + ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it + + ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it + + ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it + + ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > end ( ) ) ; <nl> + } <nl> + } <nl> + <nl> + SECTION ( " pre - increment " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + + + it ; <nl> + CHECK ( it . m_it . 
generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " number " ) <nl> + { <nl> + json j ( 17 ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : begin ) ; <nl> + + + it ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + + + it ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( { { " foo " , " bar " } } ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > begin ( ) ) ; <nl> + + + it ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > end ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( { 1 , 2 , 3 , 4 } ) ; <nl> + json : : const_iterator it = j . cbegin ( ) ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + + + it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + + + it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + + + it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + + + it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > end ( ) ) ; <nl> + } <nl> + } <nl> + <nl> + SECTION ( " post - decrement " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it = j . cend ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " number " ) <nl> + { <nl> + json j ( 17 ) ; <nl> + json : : const_iterator it = j . cend ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : begin ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( { { " foo " , " bar " } } ) ; <nl> + json : : const_iterator it = j . cend ( ) ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > begin ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( { 1 , 2 , 3 , 4 } ) ; <nl> + json : : const_iterator it = j . cend ( ) ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . 
array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + it - - ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + } <nl> + } <nl> + <nl> + SECTION ( " pre - decrement " ) <nl> + { <nl> + SECTION ( " null " ) <nl> + { <nl> + json j ( json : : value_t : : null ) ; <nl> + json : : const_iterator it = j . cend ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " number " ) <nl> + { <nl> + json j ( 17 ) ; <nl> + json : : const_iterator it = j . cend ( ) ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : end ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : begin ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . generic_iterator = = json : : generic_iterator_value : : invalid ) ; <nl> + } <nl> + <nl> + SECTION ( " object " ) <nl> + { <nl> + json j ( { { " foo " , " bar " } } ) ; <nl> + json : : const_iterator it = j . cend ( ) ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . object_iterator = = it . m_object - > m_value . object - > begin ( ) ) ; <nl> + } <nl> + <nl> + SECTION ( " array " ) <nl> + { <nl> + json j ( { 1 , 2 , 3 , 4 } ) ; <nl> + json : : const_iterator it = j . cend ( ) ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + - - it ; <nl> + CHECK ( it . m_it . array_iterator = = it . m_object - > m_value . array - > begin ( ) ) ; <nl> + CHECK ( it . m_it . array_iterator ! = it . m_object - > m_value . array - > end ( ) ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> + SECTION ( " comparison " ) <nl> + { <nl> + json j_values = <nl> + { <nl> + nullptr , nullptr , <nl> + 17 , 42 , <nl> + 3 . 14159 , 23 . 
42 , <nl> + " foo " , " bar " , <nl> + true , false , <nl> + { 1 , 2 , 3 } , { " one " , " two " , " three " } , <nl> + { { " first " , 1 } , { " second " , 2 } } , { { " a " , " A " } , { " b " , { " B " } } } <nl> + } ; <nl> + <nl> + SECTION ( " comparison : equal " ) <nl> + { <nl> + std : : vector < std : : vector < bool > > expected = <nl> + { <nl> + { true , false , false , false , false , false , false , false , false , false , false , false , false , false } , <nl> + { false , true , false , false , false , false , false , false , false , false , false , false , false , false } , <nl> + { false , false , true , false , false , false , false , false , false , false , false , false , false , false } , <nl> + { false , false , false , true , false , false , false , false , false , false , false , false , false , false } , <nl> + { false , false , false , false , true , false , false , false , false , false , false , false , false , false } , <nl> + { false , false , false , false , false , true , false , false , false , false , false , false , false , false } , <nl> + { false , false , false , false , false , false , true , false , false , false , false , false , false , false } , <nl> + { false , false , false , false , false , false , false , true , false , false , false , false , false , false } , <nl> + { false , false , false , false , false , false , false , false , true , false , false , false , false , false } , <nl> + { false , false , false , false , false , false , false , false , false , true , false , false , false , false } , <nl> + { false , false , false , false , false , false , false , false , false , false , true , false , false , false } , <nl> + { false , false , false , false , false , false , false , false , false , false , false , true , false , false } , <nl> + { false , false , false , false , false , false , false , false , false , false , false , false , true , false } , <nl> + { false , false , false , false , false , false , false , false , false , false , false , false , false , true } <nl> + } ; <nl> + <nl> + for ( size_t i = 0 ; i < j_values . size ( ) ; + + i ) <nl> + { <nl> + for ( size_t j = 0 ; j < j_values . size ( ) ; + + j ) <nl> + { <nl> + / / check precomputed values <nl> + CHECK ( ( j_values [ i ] . cbegin ( ) = = j_values [ j ] . cbegin ( ) ) = = expected [ i ] [ j ] ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> + SECTION ( " comparison : not equal " ) <nl> + { <nl> + for ( size_t i = 0 ; i < j_values . size ( ) ; + + i ) <nl> + { <nl> + for ( size_t j = 0 ; j < j_values . size ( ) ; + + j ) <nl> + { <nl> + / / check definition <nl> + CHECK ( ( j_values [ i ] . cbegin ( ) ! = j_values [ j ] . cbegin ( ) ) = = not ( ( j_values [ i ] . cbegin ( ) = = <nl> + j_values [ j ] . cbegin ( ) ) ) ) ; <nl> + } <nl> + } <nl> + } <nl> } <nl> } <nl> <nl>
test cases for iterator classes
nlohmann/json
3f8dc632e2fb83788985d14abacc1f1ea58640c2
2015-02-11T14:29:41Z
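
The commit above is white-box: it checks the iterator's internal union (m_it.generic_iterator for null and primitive values, m_it.object_iterator / m_it.array_iterator for containers) across construction, increment/decrement, and comparison. For orientation, the same semantics can be observed through the public API alone. The following is a minimal sketch, not part of the commit, and it assumes a single-header json.hpp from the same snapshot of nlohmann/json exposing begin()/end()/cbegin()/cend(); the private members m_it and m_object inspected in the tests are not used here.

// Sketch only: public-API view of the iterator behaviour exercised above.
// Assumes json.hpp from this snapshot of nlohmann/json.
#include <cassert>
#include "json.hpp"

using nlohmann::json;

int main()
{
    // arrays: begin()..end() visits each element exactly once
    json arr = {1, 2, 3, 4};
    int count = 0;
    for (json::const_iterator it = arr.cbegin(); it != arr.cend(); ++it)
    {
        ++count;
    }
    assert(count == 4);

    // primitive values behave like a one-element range
    json num = 17;
    json::const_iterator it = num.cbegin();
    assert(*it == json(17));
    ++it;
    assert(it == num.cend());

    // null values expose an empty range: begin() already equals end(),
    // and dereferencing it throws std::out_of_range in these tests
    json nul(json::value_t::null);
    assert(nul.cbegin() == nul.cend());

    // iterators referring to different json values compare unequal,
    // matching the precomputed expectation matrix in the commit
    json other = {1, 2, 3, 4};
    assert(arr.cbegin() != other.cbegin());

    return 0;
}
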
mmm a / cmake / ArangoDBMacros . cmake <nl> ppp b / cmake / ArangoDBMacros . cmake <nl> <nl> - include ( GNUInstallDirs ) <nl> - <nl> - # install the visual studio runtime : <nl> - if ( MSVC ) <nl> - include ( InstallRequiredSystemLibraries ) <nl> - INSTALL ( FILES $ { CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS } DESTINATION bin COMPONENT Libraries ) <nl> - endif ( ) <nl> - <nl> - # etc mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - set ( ETCDIR " " CACHE path " System configuration directory ( defaults to prefix / etc ) " ) <nl> - <nl> - # / etc mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - if ( ETCDIR STREQUAL " " ) <nl> - set ( ETCDIR_NATIVE " $ { CMAKE_INSTALL_PREFIX } / etc / arangodb3 " ) <nl> - set ( ETCDIR_INSTALL " etc / arangodb3 " ) <nl> - else ( ) <nl> - set ( ETCDIR_NATIVE " $ { ETCDIR } / arangodb3 " ) <nl> - set ( ETCDIR_INSTALL " $ { ETCDIR } / arangodb3 " ) <nl> - endif ( ) <nl> - <nl> - # MS stuff mmmmmmmmmmmmmmmmmmmmmmmmmmm <nl> - if ( MSVC ) <nl> - file ( TO_NATIVE_PATH " $ { ETCDIR_INSTALL } " ETCDIR_INSTALL ) <nl> - STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " ETCDIR_ESCAPED " $ { ETCDIR_INSTALL } " ) <nl> - else ( ) <nl> - file ( TO_NATIVE_PATH " $ { ETCDIR_NATIVE } " ETCDIR_NATIVE ) <nl> - STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " ETCDIR_ESCAPED " $ { ETCDIR_NATIVE } " ) <nl> - endif ( ) <nl> - <nl> - add_definitions ( " - D_SYSCONFDIR_ = \ " $ { ETCDIR_ESCAPED } \ " " ) <nl> - <nl> - # / var <nl> - set ( VARDIR " " <nl> - CACHE path <nl> - " System configuration directory ( defaults to prefix / var / arangodb3 ) " <nl> - ) <nl> - <nl> - if ( VARDIR STREQUAL " " ) <nl> - set ( VARDIR_NATIVE " $ { CMAKE_INSTALL_PREFIX } / var " ) <nl> - set ( VARDIR_INSTALL " var " ) <nl> - else ( ) <nl> - set ( VARDIR_NATIVE " $ { VARDIR } " ) <nl> - set ( VARDIR_INSTALL " $ { VARDIR } " ) <nl> - endif ( ) <nl> - <nl> - file ( TO_NATIVE_PATH " $ { VARDIR_NATIVE } " VARDIR_NATIVE ) <nl> - <nl> - # database directory <nl> - FILE ( MAKE_DIRECTORY " $ { PROJECT_BINARY_DIR } / var / lib / arangodb3 " ) <nl> - <nl> - # apps <nl> - FILE ( MAKE_DIRECTORY " $ { PROJECT_BINARY_DIR } / var / lib / arangodb3 - apps " ) <nl> - <nl> - # logs <nl> - FILE ( MAKE_DIRECTORY " $ { PROJECT_BINARY_DIR } / var / log / arangodb " ) <nl> - <nl> - # package <nl> - set ( TRI_PKGDATADIR " $ { CMAKE_INSTALL_PREFIX } / share / arangodb3 " ) <nl> - <nl> - # resources <nl> - set ( TRI_RESOURCEDIR " resources " ) <nl> - <nl> - # sbinaries <nl> - if ( MSVC ) <nl> - set ( ARANGODB_INSTALL_SBIN " bin " ) <nl> - set ( TRI_SBINDIR " $ { CMAKE_INSTALL_PREFIX } / bin " ) <nl> - else ( ) <nl> - set ( ARANGODB_INSTALL_SBIN " sbin " ) <nl> - set ( TRI_SBINDIR " $ { CMAKE_INSTALL_PREFIX } / sbin " ) <nl> - endif ( ) <nl> - <nl> - # MS Windows mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - if ( MSVC ) <nl> - # icon paths <nl> - file ( TO_NATIVE_PATH <nl> - " $ { TRI_RESOURCEDIR } / Icons / arangodb . ico " <nl> - RELATIVE_ARANGO_ICON <nl> - ) <nl> - <nl> - file ( TO_NATIVE_PATH <nl> - " $ { PROJECT_SOURCE_DIR } / Installation / Windows / Icons / arangodb . bmp " <nl> - ARANGO_IMG <nl> - ) <nl> - <nl> - file ( TO_NATIVE_PATH <nl> - " $ { PROJECT_SOURCE_DIR } / Installation / Windows / Icons / arangodb . 
ico " <nl> - ARANGO_ICON <nl> - ) <nl> - <nl> - STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " ARANGO_IMG " $ { ARANGO_IMG } " ) <nl> - STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " ARANGO_ICON " $ { ARANGO_ICON } " ) <nl> - STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " RELATIVE_ARANGO_ICON " $ { RELATIVE_ARANGO_ICON } " ) <nl> - <nl> - # versioning <nl> - set ( CMAKE_MODULE_PATH <nl> - $ { CMAKE_MODULE_PATH } <nl> - $ { PROJECT_SOURCE_DIR } / Installation / Windows / version <nl> - ) <nl> - <nl> - include ( " $ { PROJECT_SOURCE_DIR } / Installation / Windows / version / generate_product_version . cmake " ) <nl> - endif ( ) <nl> - <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - # # INSTALL <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - <nl> - # Global macros mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - macro ( generate_root_config name ) <nl> - FILE ( READ $ { PROJECT_SOURCE_DIR } / etc / arangodb3 / $ { name } . conf . in FileContent ) <nl> - STRING ( REPLACE " @ PKGDATADIR @ " " @ ROOTDIR @ / share / arangodb3 " <nl> - FileContent " $ { FileContent } " ) <nl> - STRING ( REPLACE " @ LOCALSTATEDIR @ " " @ ROOTDIR @ / var " <nl> - FileContent " $ { FileContent } " ) <nl> - STRING ( REPLACE " @ SBINDIR @ " " @ ROOTDIR @ / bin " <nl> - FileContent " $ { FileContent } " ) <nl> - STRING ( REPLACE " @ LIBEXECDIR @ / arangodb3 " " @ ROOTDIR @ / bin " <nl> - FileContent " $ { FileContent } " ) <nl> - STRING ( REPLACE " @ SYSCONFDIR @ " " @ ROOTDIR @ / etc / arangodb3 " <nl> - FileContent " $ { FileContent } " ) <nl> - if ( MSVC ) <nl> - STRING ( REPLACE " @ PROGRAM_SUFFIX @ " " . exe " <nl> - FileContent " $ { FileContent } " ) <nl> - STRING ( REGEX REPLACE " [ \ r \ n ] file = " " \ n # file = " <nl> - FileContent " $ { FileContent } " ) <nl> - endif ( ) <nl> - FILE ( WRITE $ { PROJECT_BINARY_DIR } / etc / arangodb3 / $ { name } . conf " $ { FileContent } " ) <nl> - endmacro ( ) <nl> - <nl> - # generates config file using the configured paths mmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - macro ( generate_path_config name ) <nl> - FILE ( READ $ { PROJECT_SOURCE_DIR } / etc / arangodb3 / $ { name } . conf . in FileContent ) <nl> - STRING ( REPLACE " @ PKGDATADIR @ " " $ { TRI_PKGDATADIR } " <nl> - FileContent " $ { FileContent } " ) <nl> - STRING ( REPLACE " @ LOCALSTATEDIR @ " " $ { VARDIR_NATIVE } " <nl> - FileContent " $ { FileContent } " ) <nl> - FILE ( WRITE $ { PROJECT_BINARY_DIR } / etc / arangodb3 / $ { name } . conf " $ { FileContent } " ) <nl> - endmacro ( ) <nl> - <nl> - # installs a config file mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - macro ( install_config name ) <nl> - if ( MSVC OR ( DARWIN AND NOT HOMEBREW ) ) <nl> - generate_root_config ( $ { name } ) <nl> - else ( ) <nl> - generate_path_config ( $ { name } ) <nl> - endif ( ) <nl> - install ( <nl> - FILES $ { PROJECT_BINARY_DIR } / etc / arangodb3 / $ { name } . 
conf <nl> - DESTINATION $ { ETCDIR_INSTALL } ) <nl> - endmacro ( ) <nl> - <nl> - # installs a readme file converting EOL mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - macro ( install_readme input where output ) <nl> - FILE ( READ $ { PROJECT_SOURCE_DIR } / $ { input } FileContent ) <nl> - STRING ( REPLACE " \ r " " " FileContent " $ { FileContent } " ) <nl> - if ( MSVC ) <nl> - STRING ( REPLACE " \ n " " \ r \ n " FileContent " $ { FileContent } " ) <nl> - endif ( ) <nl> - FILE ( WRITE $ { PROJECT_BINARY_DIR } / $ { output } " $ { FileContent } " ) <nl> - install ( <nl> - FILES $ { PROJECT_BINARY_DIR } / $ { output } <nl> - DESTINATION $ { where } ) <nl> - endmacro ( ) <nl> - <nl> - # installs a link to an executable mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm <nl> - macro ( install_command_alias name where alias ) <nl> - if ( MSVC ) <nl> - add_custom_command ( <nl> - TARGET $ { name } <nl> - POST_BUILD <nl> - COMMAND $ { CMAKE_COMMAND } - E copy $ < TARGET_FILE : $ { name } > <nl> - $ { CMAKE_RUNTIME_OUTPUT_DIRECTORY } / $ < CONFIGURATION > / $ { alias } . exe ) <nl> - install ( <nl> - PROGRAMS $ { CMAKE_RUNTIME_OUTPUT_DIRECTORY } / $ < CONFIGURATION > / $ { alias } . exe <nl> - DESTINATION $ { where } ) <nl> - else ( ) <nl> - add_custom_command ( <nl> - TARGET $ { name } <nl> - POST_BUILD <nl> - COMMAND $ { CMAKE_COMMAND } - E create_symlink $ { name } <nl> - $ { CMAKE_RUNTIME_OUTPUT_DIRECTORY } / $ { alias } ) <nl> - install ( <nl> - PROGRAMS $ { CMAKE_RUNTIME_OUTPUT_DIRECTORY } / $ { alias } <nl> - DESTINATION $ { where } ) <nl> - endif ( ) <nl> - endmacro ( ) <nl> - <nl> - # sub directories mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> - <nl> - # if ( BUILD_STATIC_EXECUTABLES ) <nl> - # set ( CMAKE_EXE_LINKER_FLAGS - static ) <nl> - # set ( CMAKE_FIND_LIBRARY_SUFFIXES . a ) <nl> - # set ( CMAKE_EXE_LINK_DYNAMIC_C_FLAGS ) # remove - Wl , - Bdynamic <nl> - # set ( CMAKE_EXE_LINK_DYNAMIC_CXX_FLAGS ) <nl> - # set ( CMAKE_SHARED_LIBRARY_C_FLAGS ) # remove - fPIC <nl> - # set ( CMAKE_SHARED_LIBRARY_CXX_FLAGS ) <nl> - # set ( CMAKE_SHARED_LIBRARY_LINK_C_FLAGS ) # remove - rdynamic <nl> - # set ( CMAKE_SHARED_LIBRARY_LINK_CXX_FLAGS ) <nl> - # # Maybe this works as well , haven ' t tried yet . <nl> - # # set_property ( GLOBAL PROPERTY TARGET_SUPPORTS_SHARED_LIBS FALSE ) <nl> - # else ( BUILD_STATIC_EXECUTABLES ) <nl> - # # Set RPATH to use for installed targets ; append linker search path <nl> - # set ( CMAKE_INSTALL_RPATH " $ { CMAKE_INSTALL_PREFIX } / $ { LOFAR_LIBDIR } " ) <nl> - # set ( CMAKE_INSTALL_RPATH_USE_LINK_PATH TRUE ) <nl> - # endif ( BUILD_STATIC_EXECUTABLES ) <nl> - <nl> - <nl> - # mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> - # get_cmake_property ( _variableNames VARIABLES ) <nl> - # foreach ( _variableName $ { _variableNames } ) <nl> - # message ( STATUS " $ { _variableName } = $ { $ { _variableName } } " ) <nl> - # endforeach ( ) <nl> - # mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> - <nl> - # install mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - install ( DIRECTORY $ { PROJECT_SOURCE_DIR } / Documentation / man / <nl> - DESTINATION share / man ) <nl> - <nl> - if ( MSVC ) <nl> - install_readme ( README . README . txt ) <nl> - install_readme ( README . md . README . md ) <nl> - install_readme ( README . windows . README . windows . txt ) <nl> - endif ( ) <nl> - <nl> - if ( MSVC ) <nl> - install_readme ( LICENSE . LICENSE . 
txt ) <nl> - install_readme ( LICENSES - OTHER - COMPONENTS . md . LICENSES - OTHER - COMPONENTS . md ) <nl> - else ( ) <nl> - install_readme ( README share / doc / arangodb3 README ) <nl> - install_readme ( README . md share / doc / arangodb3 README . md ) <nl> - install_readme ( LICENSE share / doc / arangodb3 LICENSE ) <nl> - install_readme ( LICENSES - OTHER - COMPONENTS . md share / doc / arangodb3 LICENSES - OTHER - COMPONENTS . md ) <nl> - endif ( ) <nl> - <nl> - # Build package mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - if ( NOT ( MSVC ) ) <nl> - set ( CPACK_SET_DESTDIR ON ) <nl> - endif ( ) <nl> - <nl> - find_program ( DH_INSTALLINIT dh_installinit ) <nl> - find_program ( FAKEROOT fakeroot ) <nl> - <nl> - if ( DH_INSTALLINIT AND FAKEROOT ) <nl> - add_custom_target ( prepare_debian ) <nl> - SET ( DEBIAN_CONTROL_EXTRA_BASENAMES <nl> - postinst <nl> - preinst <nl> - postrm <nl> - prerm <nl> - ) <nl> - SET ( DEBIAN_WORK_DIR " $ { PROJECT_BINARY_DIR } / debian - work " ) <nl> - add_custom_command ( TARGET prepare_debian POST_BUILD <nl> - COMMAND $ { CMAKE_COMMAND } - E <nl> - remove_directory " $ { DEBIAN_WORK_DIR } " <nl> - ) <nl> - foreach ( _DEBIAN_CONTROL_EXTRA_BASENAME $ { DEBIAN_CONTROL_EXTRA_BASENAMES } ) <nl> - SET ( RELATIVE_NAME " debian / $ { _DEBIAN_CONTROL_EXTRA_BASENAME } " ) <nl> - SET ( SRCFILE " $ { PROJECT_SOURCE_DIR } / Installation / $ { RELATIVE_NAME } " ) <nl> - SET ( DESTFILE " $ { DEBIAN_WORK_DIR } / $ { RELATIVE_NAME } " ) <nl> - <nl> - list ( APPEND DEBIAN_CONTROL_EXTRA_SRC " $ { SRCFILE } " ) <nl> - list ( APPEND DEBIAN_CONTROL_EXTRA_DEST " $ { DESTFILE } " ) <nl> - <nl> - add_custom_command ( TARGET prepare_debian POST_BUILD <nl> - COMMAND $ { CMAKE_COMMAND } - E <nl> - copy $ { SRCFILE } $ { DESTFILE } ) <nl> - endforeach ( ) <nl> - <nl> - add_custom_command ( TARGET prepare_debian POST_BUILD <nl> - COMMAND $ { CMAKE_COMMAND } - E <nl> - copy " $ { PROJECT_SOURCE_DIR } / Installation / debian / control " " $ { DEBIAN_WORK_DIR } / debian / control " <nl> - ) <nl> - add_custom_command ( TARGET prepare_debian POST_BUILD <nl> - COMMAND $ { CMAKE_COMMAND } - E <nl> - copy " $ { PROJECT_SOURCE_DIR } / Installation / debian / compat " " $ { DEBIAN_WORK_DIR } / debian / compat " <nl> - ) <nl> - add_custom_command ( TARGET prepare_debian POST_BUILD <nl> - COMMAND fakeroot " $ { DH_INSTALLINIT } " - o 2 > / dev / null <nl> - WORKING_DIRECTORY $ { DEBIAN_WORK_DIR } <nl> - ) <nl> - add_custom_command ( TARGET prepare_debian POST_BUILD <nl> - COMMAND fakeroot dh_installdeb <nl> - WORKING_DIRECTORY $ { DEBIAN_WORK_DIR } <nl> - ) <nl> - endif ( ) <nl> - <nl> - # General <nl> - set ( CPACK_PACKAGE_NAME " arangodb3 " ) <nl> - set ( CPACK_PACKAGE_VENDOR " ArangoDB GmbH " ) <nl> - set ( CPACK_PACKAGE_CONTACT " info @ arangodb . com " ) <nl> - set ( CPACK_PACKAGE_VERSION " $ { ARANGODB_VERSION } " ) <nl> - <nl> - set ( CPACK_RESOURCE_FILE_LICENSE " $ { PROJECT_SOURCE_DIR } / LICENSE " ) <nl> - <nl> - set ( CPACK_STRIP_FILES " ON " ) <nl> - set ( CPACK_DEBIAN_PACKAGE_ARCHITECTURE " amd64 " ) <nl> - set ( CPACK_DEBIAN_PACKAGE_SECTION " database " ) <nl> - set ( CPACK_DEBIAN_PACKAGE_DESCRIPTION " a multi - purpose NoSQL database <nl> - A distributed free and open - source database with a flexible data model for documents , <nl> - graphs , and key - values . Build high performance applications using a convenient <nl> - SQL - like query language or JavaScript extensions . <nl> - . 
<nl> - Copyright : 2014 - 2016 by ArangoDB GmbH <nl> - Copyright : 2012 - 2013 by triAGENS GmbH <nl> - ArangoDB Software <nl> - www . arangodb . com <nl> - " ) <nl> - SET ( CPACK_DEBIAN_PACKAGE_CONFLICTS " arangodb " ) <nl> - set ( CPACK_DEBIAN_PACKAGE_SHLIBDEPS ON ) <nl> - set ( CPACK_DEBIAN_COMPRESSION_TYPE " xz " ) <nl> - set ( CPACK_DEBIAN_PACKAGE_HOMEPAGE " https : / / www . arangodb . com / " ) <nl> - set ( CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA " $ { PROJECT_BINARY_DIR } / debian - work / debian / $ { CPACK_PACKAGE_NAME } / DEBIAN / postinst ; $ { PROJECT_BINARY_DIR } / debian - work / debian / $ { CPACK_PACKAGE_NAME } / DEBIAN / preinst ; $ { PROJECT_BINARY_DIR } / debian - work / debian / $ { CPACK_PACKAGE_NAME } / DEBIAN / postrm ; $ { PROJECT_BINARY_DIR } / debian - work / debian / $ { CPACK_PACKAGE_NAME } / DEBIAN / prerm ; " ) <nl> - set ( CPACK_BUNDLE_NAME " $ { CPACK_PACKAGE_NAME } " ) <nl> - configure_file ( " $ { PROJECT_SOURCE_DIR } / Installation / MacOSX / Bundle / Info . plist . in " " $ { CMAKE_CURRENT_BINARY_DIR } / Info . plist " ) <nl> - set ( CPACK_BUNDLE_PLIST " $ { CMAKE_CURRENT_BINARY_DIR } / Info . plist " ) <nl> - set ( CPACK_BUNDLE_ICON " $ { PROJECT_SOURCE_DIR } / Installation / MacOSX / Bundle / icon . icns " ) <nl> - set ( CPACK_BUNDLE_STARTUP_COMMAND " $ { PROJECT_SOURCE_DIR } / Installation / MacOSX / Bundle / arangodb - cli . sh " ) <nl> - <nl> - # MS installer <nl> - if ( MSVC ) <nl> - set ( CPACK_PACKAGE_NAME " ArangoDB " ) <nl> - set ( CPACK_MODULE_PATH " $ { CMAKE_CURRENT_SOURCE_DIR } / Installation / Windows / Templates " ) <nl> - set ( CPACK_PLUGIN_PATH " $ { CMAKE_CURRENT_SOURCE_DIR } / Installation / Windows / Plugins " ) <nl> - set ( CPACK_NSIS_ENABLE_UNINSTALL_BEFORE_INSTALL 1 ) <nl> - set ( BITS 64 ) <nl> - <nl> - if ( CMAKE_CL_64 ) <nl> - SET ( CPACK_NSIS_INSTALL_ROOT " $ PROGRAMFILES64 " ) <nl> - SET ( BITS 64 ) <nl> - else ( ) <nl> - SET ( CPACK_NSIS_INSTALL_ROOT " $ PROGRAMFILES " ) <nl> - SET ( BITS 32 ) <nl> - endif ( ) <nl> - <nl> - message ( STATUS " ARANGO_IMG : $ { ARANGO_IMG } " ) <nl> - message ( STATUS " ARANGO_ICON : $ { ARANGO_ICON } " ) <nl> - message ( STATUS " RELATIVE_ARANGO_ICON : $ { RELATIVE_ARANGO_ICON } " ) <nl> - <nl> - install ( <nl> - DIRECTORY " $ { PROJECT_SOURCE_DIR } / Installation / Windows / Icons " <nl> - DESTINATION $ { TRI_RESOURCEDIR } ) <nl> - <nl> - set ( CPACK_ARANGODB_NSIS_DEFINES " <nl> - ! define BITS $ { BITS } <nl> - ! define TRI_FRIENDLY_SVC_NAME ' $ { ARANGODB_FRIENDLY_STRING } ' <nl> - ! define TRI_AARDVARK_URL ' http : / / 127 . 0 . 0 . 1 : 8529 ' <nl> - " ) <nl> - <nl> - set ( CPACK_PACKAGE_ICON $ { ARANGO_ICON } ) <nl> - <nl> - set ( CPACK_NSIS_MODIFY_PATH ON ) <nl> - set ( CPACK_NSIS_MUI_ICON $ { ARANGO_ICON } ) <nl> - set ( CPACK_NSIS_MUI_UNIICON $ { ARANGO_ICON } ) <nl> - set ( CPACK_NSIS_INSTALLED_ICON_NAME $ { RELATIVE_ARANGO_ICON } ) <nl> - set ( CPACK_NSIS_DISPLAY_NAME , $ { ARANGODB_DISPLAY_NAME } ) <nl> - set ( CPACK_NSIS_HELP_LINK $ { ARANGODB_HELP_LINK } ) <nl> - set ( CPACK_NSIS_URL_INFO_ABOUT $ { ARANGODB_URL_INFO_ABOUT } ) <nl> - set ( CPACK_NSIS_CONTACT $ { ARANGODB_CONTACT } ) <nl> - endif ( ) <nl> - <nl> - configure_file ( " $ { CMAKE_SOURCE_DIR } / Installation / cmake / CMakeCPackOptions . cmake . in " <nl> - " $ { CMAKE_BINARY_DIR } / CMakeCPackOptions . cmake " @ ONLY ) <nl> - set ( CPACK_PROJECT_CONFIG_FILE " $ { CMAKE_BINARY_DIR } / CMakeCPackOptions . 
cmake " ) <nl> - <nl> - if ( NOT ( MSVC ) ) <nl> - # components <nl> - install ( <nl> - FILES $ { PROJECT_SOURCE_DIR } / Installation / debian / arangodb . init <nl> - PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE <nl> - DESTINATION $ { ETCDIR } / init . d <nl> - RENAME arangodb3 <nl> - COMPONENT debian - extras <nl> - ) <nl> - endif ( ) <nl> - <nl> - # Custom targets mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> - <nl> - # love <nl> - add_custom_target ( love <nl> - COMMENT " ArangoDB loves you . " <nl> - COMMAND " " <nl> - ) <nl> - <nl> - <nl> - # Finally : user cpack <nl> - include ( CPack ) <nl> - <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - # # # @ brief install client - side JavaScript files <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - <nl> - install ( <nl> - DIRECTORY $ { PROJECT_SOURCE_DIR } / js / common $ { PROJECT_SOURCE_DIR } / js / client <nl> - DESTINATION share / arangodb3 / js <nl> - FILES_MATCHING PATTERN " * . js " <nl> - REGEX " ^ . * / common / test - data $ " EXCLUDE <nl> - REGEX " ^ . * / common / tests $ " EXCLUDE <nl> - REGEX " ^ . * / client / tests $ " EXCLUDE ) <nl> - <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - # # # @ brief install server - side JavaScript files <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - <nl> - install ( <nl> - DIRECTORY $ { PROJECT_SOURCE_DIR } / js / actions $ { PROJECT_SOURCE_DIR } / js / apps $ { PROJECT_SOURCE_DIR } / js / contrib $ { PROJECT_SOURCE_DIR } / js / node $ { PROJECT_SOURCE_DIR } / js / server <nl> - DESTINATION share / arangodb3 / js <nl> - REGEX " ^ . * / server / tests $ " EXCLUDE <nl> - REGEX " ^ . 
* / aardvark / APP / node_modules $ " EXCLUDE <nl> - ) <nl> - <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - # # # @ brief install log directory <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - <nl> - install ( <nl> - DIRECTORY $ { PROJECT_BINARY_DIR } / var / log / arangodb <nl> - DESTINATION $ { VARDIR_INSTALL } / log ) <nl> - <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - # # # @ brief install database directory <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - <nl> - install ( <nl> - DIRECTORY $ { PROJECT_BINARY_DIR } / var / lib / arangodb3 <nl> - DESTINATION $ { VARDIR_INSTALL } / lib ) <nl> - <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - # # # @ brief install apps directory <nl> - # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> - <nl> - install ( <nl> - DIRECTORY $ { PROJECT_BINARY_DIR } / var / lib / arangodb3 - apps <nl> - DESTINATION $ { VARDIR_INSTALL } / lib ) <nl> + include ( GNUInstallDirs ) <nl> + <nl> + # install the visual studio runtime : <nl> + if ( MSVC ) <nl> + include ( InstallRequiredSystemLibraries ) <nl> + INSTALL ( FILES $ { CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS } DESTINATION bin COMPONENT Libraries ) <nl> + endif ( ) <nl> + <nl> + # etc mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + set ( ETCDIR " " CACHE path " System configuration directory ( defaults to prefix / etc ) " ) <nl> + <nl> + # / etc mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + if ( ETCDIR STREQUAL " " ) <nl> + set ( ETCDIR_NATIVE " $ { CMAKE_INSTALL_PREFIX } / etc / arangodb3 " ) <nl> + set ( ETCDIR_INSTALL " etc / arangodb3 " ) <nl> + else ( ) <nl> + set ( ETCDIR_NATIVE " $ { ETCDIR } / arangodb3 " ) <nl> + set ( ETCDIR_INSTALL " $ { ETCDIR } / arangodb3 " ) <nl> + endif ( ) <nl> + <nl> + # MS stuff mmmmmmmmmmmmmmmmmmmmmmmmmmm <nl> + if ( MSVC ) <nl> + file ( TO_NATIVE_PATH " $ { ETCDIR_INSTALL } " ETCDIR_INSTALL ) <nl> + STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " ETCDIR_ESCAPED " $ { ETCDIR_INSTALL } " ) <nl> + else ( ) <nl> + file ( TO_NATIVE_PATH " $ { ETCDIR_NATIVE } " ETCDIR_NATIVE ) <nl> + STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " ETCDIR_ESCAPED " $ { ETCDIR_NATIVE } " ) <nl> + endif ( ) <nl> + <nl> + add_definitions ( " - D_SYSCONFDIR_ = \ " $ { ETCDIR_ESCAPED } \ " " ) <nl> + <nl> + # / var <nl> + set ( VARDIR " " <nl> + CACHE path <nl> + " System configuration directory ( defaults to prefix / var / arangodb3 ) " <nl> + ) <nl> + <nl> + if ( VARDIR STREQUAL " " ) <nl> + set ( VARDIR_NATIVE " $ { CMAKE_INSTALL_PREFIX } / var " ) <nl> + set ( VARDIR_INSTALL " var " ) <nl> + else ( ) <nl> + set ( VARDIR_NATIVE " $ { VARDIR } " ) <nl> + set ( VARDIR_INSTALL " $ { VARDIR } " ) <nl> + endif ( ) <nl> + <nl> + file ( TO_NATIVE_PATH " $ { VARDIR_NATIVE } " VARDIR_NATIVE ) <nl> + <nl> + # database directory <nl> + FILE ( MAKE_DIRECTORY " $ { 
PROJECT_BINARY_DIR } / var / lib / arangodb3 " ) <nl> + <nl> + # apps <nl> + FILE ( MAKE_DIRECTORY " $ { PROJECT_BINARY_DIR } / var / lib / arangodb3 - apps " ) <nl> + <nl> + # logs <nl> + FILE ( MAKE_DIRECTORY " $ { PROJECT_BINARY_DIR } / var / log / arangodb " ) <nl> + <nl> + # package <nl> + set ( TRI_PKGDATADIR " $ { CMAKE_INSTALL_PREFIX } / share / arangodb3 " ) <nl> + <nl> + # resources <nl> + set ( TRI_RESOURCEDIR " resources " ) <nl> + <nl> + # sbinaries <nl> + if ( MSVC ) <nl> + set ( ARANGODB_INSTALL_SBIN " bin " ) <nl> + set ( TRI_SBINDIR " $ { CMAKE_INSTALL_PREFIX } / bin " ) <nl> + else ( ) <nl> + set ( ARANGODB_INSTALL_SBIN " sbin " ) <nl> + set ( TRI_SBINDIR " $ { CMAKE_INSTALL_PREFIX } / sbin " ) <nl> + endif ( ) <nl> + <nl> + # MS Windows mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + if ( MSVC ) <nl> + # icon paths <nl> + file ( TO_NATIVE_PATH <nl> + " $ { TRI_RESOURCEDIR } / Icons / arangodb . ico " <nl> + RELATIVE_ARANGO_ICON <nl> + ) <nl> + <nl> + file ( TO_NATIVE_PATH <nl> + " $ { PROJECT_SOURCE_DIR } / Installation / Windows / Icons / arangodb . bmp " <nl> + ARANGO_IMG <nl> + ) <nl> + <nl> + file ( TO_NATIVE_PATH <nl> + " $ { PROJECT_SOURCE_DIR } / Installation / Windows / Icons / arangodb . ico " <nl> + ARANGO_ICON <nl> + ) <nl> + <nl> + STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " ARANGO_IMG " $ { ARANGO_IMG } " ) <nl> + STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " ARANGO_ICON " $ { ARANGO_ICON } " ) <nl> + STRING ( REGEX REPLACE " \ \ \ \ " " \ \ \ \ \ \ \ \ " RELATIVE_ARANGO_ICON " $ { RELATIVE_ARANGO_ICON } " ) <nl> + <nl> + # versioning <nl> + set ( CMAKE_MODULE_PATH <nl> + $ { CMAKE_MODULE_PATH } <nl> + $ { PROJECT_SOURCE_DIR } / Installation / Windows / version <nl> + ) <nl> + <nl> + include ( " $ { PROJECT_SOURCE_DIR } / Installation / Windows / version / generate_product_version . cmake " ) <nl> + endif ( ) <nl> + <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + # # INSTALL <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + <nl> + # Global macros mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + macro ( generate_root_config name ) <nl> + FILE ( READ $ { PROJECT_SOURCE_DIR } / etc / arangodb3 / $ { name } . conf . in FileContent ) <nl> + STRING ( REPLACE " @ PKGDATADIR @ " " @ ROOTDIR @ / share / arangodb3 " <nl> + FileContent " $ { FileContent } " ) <nl> + STRING ( REPLACE " @ LOCALSTATEDIR @ " " @ ROOTDIR @ / var " <nl> + FileContent " $ { FileContent } " ) <nl> + STRING ( REPLACE " @ SBINDIR @ " " @ ROOTDIR @ / bin " <nl> + FileContent " $ { FileContent } " ) <nl> + STRING ( REPLACE " @ LIBEXECDIR @ / arangodb3 " " @ ROOTDIR @ / bin " <nl> + FileContent " $ { FileContent } " ) <nl> + STRING ( REPLACE " @ SYSCONFDIR @ " " @ ROOTDIR @ / etc / arangodb3 " <nl> + FileContent " $ { FileContent } " ) <nl> + if ( MSVC ) <nl> + STRING ( REPLACE " @ PROGRAM_SUFFIX @ " " . exe " <nl> + FileContent " $ { FileContent } " ) <nl> + STRING ( REGEX REPLACE " [ \ r \ n ] file = " " \ n # file = " <nl> + FileContent " $ { FileContent } " ) <nl> + endif ( ) <nl> + FILE ( WRITE $ { PROJECT_BINARY_DIR } / etc / arangodb3 / $ { name } . 
conf " $ { FileContent } " ) <nl> + endmacro ( ) <nl> + <nl> + # generates config file using the configured paths mmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + macro ( generate_path_config name ) <nl> + FILE ( READ $ { PROJECT_SOURCE_DIR } / etc / arangodb3 / $ { name } . conf . in FileContent ) <nl> + STRING ( REPLACE " @ PKGDATADIR @ " " $ { TRI_PKGDATADIR } " <nl> + FileContent " $ { FileContent } " ) <nl> + STRING ( REPLACE " @ LOCALSTATEDIR @ " " $ { VARDIR_NATIVE } " <nl> + FileContent " $ { FileContent } " ) <nl> + FILE ( WRITE $ { PROJECT_BINARY_DIR } / etc / arangodb3 / $ { name } . conf " $ { FileContent } " ) <nl> + endmacro ( ) <nl> + <nl> + # installs a config file mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + macro ( install_config name ) <nl> + if ( MSVC OR ( DARWIN AND NOT HOMEBREW ) ) <nl> + generate_root_config ( $ { name } ) <nl> + else ( ) <nl> + generate_path_config ( $ { name } ) <nl> + endif ( ) <nl> + install ( <nl> + FILES $ { PROJECT_BINARY_DIR } / etc / arangodb3 / $ { name } . conf <nl> + DESTINATION $ { ETCDIR_INSTALL } ) <nl> + endmacro ( ) <nl> + <nl> + # installs a readme file converting EOL mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + macro ( install_readme input where output ) <nl> + FILE ( READ $ { PROJECT_SOURCE_DIR } / $ { input } FileContent ) <nl> + STRING ( REPLACE " \ r " " " FileContent " $ { FileContent } " ) <nl> + if ( MSVC ) <nl> + STRING ( REPLACE " \ n " " \ r \ n " FileContent " $ { FileContent } " ) <nl> + endif ( ) <nl> + FILE ( WRITE $ { PROJECT_BINARY_DIR } / $ { output } " $ { FileContent } " ) <nl> + install ( <nl> + FILES $ { PROJECT_BINARY_DIR } / $ { output } <nl> + DESTINATION $ { where } ) <nl> + endmacro ( ) <nl> + <nl> + # installs a link to an executable mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm <nl> + macro ( install_command_alias name where alias ) <nl> + if ( MSVC ) <nl> + add_custom_command ( <nl> + TARGET $ { name } <nl> + POST_BUILD <nl> + COMMAND $ { CMAKE_COMMAND } - E copy $ < TARGET_FILE : $ { name } > <nl> + $ { CMAKE_RUNTIME_OUTPUT_DIRECTORY } / $ < CONFIGURATION > / $ { alias } . exe ) <nl> + install ( <nl> + PROGRAMS $ { CMAKE_RUNTIME_OUTPUT_DIRECTORY } / $ < CONFIGURATION > / $ { alias } . exe <nl> + DESTINATION $ { where } ) <nl> + else ( ) <nl> + add_custom_command ( <nl> + TARGET $ { name } <nl> + POST_BUILD <nl> + COMMAND $ { CMAKE_COMMAND } - E create_symlink $ { name } <nl> + $ { CMAKE_RUNTIME_OUTPUT_DIRECTORY } / $ { alias } ) <nl> + install ( <nl> + PROGRAMS $ { CMAKE_RUNTIME_OUTPUT_DIRECTORY } / $ { alias } <nl> + DESTINATION $ { where } ) <nl> + endif ( ) <nl> + endmacro ( ) <nl> + <nl> + # sub directories mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> + <nl> + # if ( BUILD_STATIC_EXECUTABLES ) <nl> + # set ( CMAKE_EXE_LINKER_FLAGS - static ) <nl> + # set ( CMAKE_FIND_LIBRARY_SUFFIXES . a ) <nl> + # set ( CMAKE_EXE_LINK_DYNAMIC_C_FLAGS ) # remove - Wl , - Bdynamic <nl> + # set ( CMAKE_EXE_LINK_DYNAMIC_CXX_FLAGS ) <nl> + # set ( CMAKE_SHARED_LIBRARY_C_FLAGS ) # remove - fPIC <nl> + # set ( CMAKE_SHARED_LIBRARY_CXX_FLAGS ) <nl> + # set ( CMAKE_SHARED_LIBRARY_LINK_C_FLAGS ) # remove - rdynamic <nl> + # set ( CMAKE_SHARED_LIBRARY_LINK_CXX_FLAGS ) <nl> + # # Maybe this works as well , haven ' t tried yet . 
<nl> + # # set_property ( GLOBAL PROPERTY TARGET_SUPPORTS_SHARED_LIBS FALSE ) <nl> + # else ( BUILD_STATIC_EXECUTABLES ) <nl> + # # Set RPATH to use for installed targets ; append linker search path <nl> + # set ( CMAKE_INSTALL_RPATH " $ { CMAKE_INSTALL_PREFIX } / $ { LOFAR_LIBDIR } " ) <nl> + # set ( CMAKE_INSTALL_RPATH_USE_LINK_PATH TRUE ) <nl> + # endif ( BUILD_STATIC_EXECUTABLES ) <nl> + <nl> + <nl> + # mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> + # get_cmake_property ( _variableNames VARIABLES ) <nl> + # foreach ( _variableName $ { _variableNames } ) <nl> + # message ( STATUS " $ { _variableName } = $ { $ { _variableName } } " ) <nl> + # endforeach ( ) <nl> + # mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> + <nl> + # install mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + install ( DIRECTORY $ { PROJECT_SOURCE_DIR } / Documentation / man / <nl> + DESTINATION share / man ) <nl> + <nl> + if ( MSVC ) <nl> + install_readme ( README . README . txt ) <nl> + install_readme ( README . md . README . md ) <nl> + install_readme ( README . windows . README . windows . txt ) <nl> + endif ( ) <nl> + <nl> + if ( MSVC ) <nl> + install_readme ( LICENSE . LICENSE . txt ) <nl> + install_readme ( LICENSES - OTHER - COMPONENTS . md . LICENSES - OTHER - COMPONENTS . md ) <nl> + else ( ) <nl> + install_readme ( README share / doc / arangodb3 README ) <nl> + install_readme ( README . md share / doc / arangodb3 README . md ) <nl> + install_readme ( LICENSE share / doc / arangodb3 LICENSE ) <nl> + install_readme ( LICENSES - OTHER - COMPONENTS . md share / doc / arangodb3 LICENSES - OTHER - COMPONENTS . md ) <nl> + endif ( ) <nl> + <nl> + # Build package mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + if ( NOT ( MSVC ) ) <nl> + set ( CPACK_SET_DESTDIR ON ) <nl> + endif ( ) <nl> + <nl> + find_program ( DH_INSTALLINIT dh_installinit ) <nl> + find_program ( FAKEROOT fakeroot ) <nl> + <nl> + if ( DH_INSTALLINIT AND FAKEROOT ) <nl> + add_custom_target ( prepare_debian ) <nl> + SET ( DEBIAN_CONTROL_EXTRA_BASENAMES <nl> + postinst <nl> + preinst <nl> + postrm <nl> + prerm <nl> + ) <nl> + SET ( DEBIAN_WORK_DIR " $ { PROJECT_BINARY_DIR } / debian - work " ) <nl> + add_custom_command ( TARGET prepare_debian POST_BUILD <nl> + COMMAND $ { CMAKE_COMMAND } - E <nl> + remove_directory " $ { DEBIAN_WORK_DIR } " <nl> + ) <nl> + foreach ( _DEBIAN_CONTROL_EXTRA_BASENAME $ { DEBIAN_CONTROL_EXTRA_BASENAMES } ) <nl> + SET ( RELATIVE_NAME " debian / $ { _DEBIAN_CONTROL_EXTRA_BASENAME } " ) <nl> + SET ( SRCFILE " $ { PROJECT_SOURCE_DIR } / Installation / $ { RELATIVE_NAME } " ) <nl> + SET ( DESTFILE " $ { DEBIAN_WORK_DIR } / $ { RELATIVE_NAME } " ) <nl> + <nl> + list ( APPEND DEBIAN_CONTROL_EXTRA_SRC " $ { SRCFILE } " ) <nl> + list ( APPEND DEBIAN_CONTROL_EXTRA_DEST " $ { DESTFILE } " ) <nl> + <nl> + add_custom_command ( TARGET prepare_debian POST_BUILD <nl> + COMMAND $ { CMAKE_COMMAND } - E <nl> + copy $ { SRCFILE } $ { DESTFILE } ) <nl> + endforeach ( ) <nl> + <nl> + add_custom_command ( TARGET prepare_debian POST_BUILD <nl> + COMMAND $ { CMAKE_COMMAND } - E <nl> + copy " $ { PROJECT_SOURCE_DIR } / Installation / debian / control " " $ { DEBIAN_WORK_DIR } / debian / control " <nl> + ) <nl> + add_custom_command ( TARGET prepare_debian POST_BUILD <nl> + COMMAND $ { CMAKE_COMMAND } - E <nl> + copy " $ { PROJECT_SOURCE_DIR } / Installation / debian / compat " " $ { DEBIAN_WORK_DIR } / debian / 
compat " <nl> + ) <nl> + add_custom_command ( TARGET prepare_debian POST_BUILD <nl> + COMMAND fakeroot " $ { DH_INSTALLINIT } " - o 2 > / dev / null <nl> + WORKING_DIRECTORY $ { DEBIAN_WORK_DIR } <nl> + ) <nl> + add_custom_command ( TARGET prepare_debian POST_BUILD <nl> + COMMAND fakeroot dh_installdeb <nl> + WORKING_DIRECTORY $ { DEBIAN_WORK_DIR } <nl> + ) <nl> + endif ( ) <nl> + <nl> + # General <nl> + set ( CPACK_PACKAGE_NAME " arangodb3 " ) <nl> + set ( CPACK_PACKAGE_VENDOR " ArangoDB GmbH " ) <nl> + set ( CPACK_PACKAGE_CONTACT " info @ arangodb . com " ) <nl> + set ( CPACK_PACKAGE_VERSION " $ { ARANGODB_VERSION } " ) <nl> + <nl> + set ( CPACK_RESOURCE_FILE_LICENSE " $ { PROJECT_SOURCE_DIR } / LICENSE " ) <nl> + <nl> + set ( CPACK_STRIP_FILES " ON " ) <nl> + set ( CPACK_DEBIAN_PACKAGE_ARCHITECTURE " amd64 " ) <nl> + set ( CPACK_DEBIAN_PACKAGE_SECTION " database " ) <nl> + set ( CPACK_DEBIAN_PACKAGE_DESCRIPTION " a multi - purpose NoSQL database <nl> + A distributed free and open - source database with a flexible data model for documents , <nl> + graphs , and key - values . Build high performance applications using a convenient <nl> + SQL - like query language or JavaScript extensions . <nl> + . <nl> + Copyright : 2014 - 2016 by ArangoDB GmbH <nl> + Copyright : 2012 - 2013 by triAGENS GmbH <nl> + ArangoDB Software <nl> + www . arangodb . com <nl> + " ) <nl> + SET ( CPACK_DEBIAN_PACKAGE_CONFLICTS " arangodb " ) <nl> + set ( CPACK_DEBIAN_PACKAGE_SHLIBDEPS ON ) <nl> + set ( CPACK_DEBIAN_COMPRESSION_TYPE " xz " ) <nl> + set ( CPACK_DEBIAN_PACKAGE_HOMEPAGE " https : / / www . arangodb . com / " ) <nl> + set ( CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA " $ { PROJECT_BINARY_DIR } / debian - work / debian / $ { CPACK_PACKAGE_NAME } / DEBIAN / postinst ; $ { PROJECT_BINARY_DIR } / debian - work / debian / $ { CPACK_PACKAGE_NAME } / DEBIAN / preinst ; $ { PROJECT_BINARY_DIR } / debian - work / debian / $ { CPACK_PACKAGE_NAME } / DEBIAN / postrm ; $ { PROJECT_BINARY_DIR } / debian - work / debian / $ { CPACK_PACKAGE_NAME } / DEBIAN / prerm ; " ) <nl> + set ( CPACK_BUNDLE_NAME " $ { CPACK_PACKAGE_NAME } " ) <nl> + configure_file ( " $ { PROJECT_SOURCE_DIR } / Installation / MacOSX / Bundle / Info . plist . in " " $ { CMAKE_CURRENT_BINARY_DIR } / Info . plist " ) <nl> + set ( CPACK_BUNDLE_PLIST " $ { CMAKE_CURRENT_BINARY_DIR } / Info . plist " ) <nl> + set ( CPACK_BUNDLE_ICON " $ { PROJECT_SOURCE_DIR } / Installation / MacOSX / Bundle / icon . icns " ) <nl> + set ( CPACK_BUNDLE_STARTUP_COMMAND " $ { PROJECT_SOURCE_DIR } / Installation / MacOSX / Bundle / arangodb - cli . 
sh " ) <nl> + <nl> + # MS installer <nl> + if ( MSVC ) <nl> + set ( CPACK_PACKAGE_NAME " ArangoDB " ) <nl> + set ( CPACK_MODULE_PATH " $ { CMAKE_CURRENT_SOURCE_DIR } / Installation / Windows / Templates " ) <nl> + set ( CPACK_PLUGIN_PATH " $ { CMAKE_CURRENT_SOURCE_DIR } / Installation / Windows / Plugins " ) <nl> + set ( CPACK_NSIS_ENABLE_UNINSTALL_BEFORE_INSTALL 1 ) <nl> + set ( BITS 64 ) <nl> + <nl> + if ( CMAKE_CL_64 ) <nl> + SET ( CPACK_NSIS_INSTALL_ROOT " $ PROGRAMFILES64 " ) <nl> + SET ( BITS 64 ) <nl> + else ( ) <nl> + SET ( CPACK_NSIS_INSTALL_ROOT " $ PROGRAMFILES " ) <nl> + SET ( BITS 32 ) <nl> + endif ( ) <nl> + <nl> + message ( STATUS " ARANGO_IMG : $ { ARANGO_IMG } " ) <nl> + message ( STATUS " ARANGO_ICON : $ { ARANGO_ICON } " ) <nl> + message ( STATUS " RELATIVE_ARANGO_ICON : $ { RELATIVE_ARANGO_ICON } " ) <nl> + <nl> + install ( <nl> + DIRECTORY " $ { PROJECT_SOURCE_DIR } / Installation / Windows / Icons " <nl> + DESTINATION $ { TRI_RESOURCEDIR } ) <nl> + <nl> + set ( CPACK_ARANGODB_NSIS_DEFINES " <nl> + ! define BITS $ { BITS } <nl> + ! define TRI_FRIENDLY_SVC_NAME ' $ { ARANGODB_FRIENDLY_STRING } ' <nl> + ! define TRI_AARDVARK_URL ' http : / / 127 . 0 . 0 . 1 : 8529 ' <nl> + " ) <nl> + <nl> + set ( CPACK_PACKAGE_ICON $ { ARANGO_ICON } ) <nl> + <nl> + set ( CPACK_NSIS_MODIFY_PATH ON ) <nl> + set ( CPACK_NSIS_MUI_ICON $ { ARANGO_ICON } ) <nl> + set ( CPACK_NSIS_MUI_UNIICON $ { ARANGO_ICON } ) <nl> + set ( CPACK_NSIS_INSTALLED_ICON_NAME $ { RELATIVE_ARANGO_ICON } ) <nl> + set ( CPACK_NSIS_DISPLAY_NAME , $ { ARANGODB_DISPLAY_NAME } ) <nl> + set ( CPACK_NSIS_HELP_LINK $ { ARANGODB_HELP_LINK } ) <nl> + set ( CPACK_NSIS_URL_INFO_ABOUT $ { ARANGODB_URL_INFO_ABOUT } ) <nl> + set ( CPACK_NSIS_CONTACT $ { ARANGODB_CONTACT } ) <nl> + endif ( ) <nl> + <nl> + configure_file ( " $ { CMAKE_SOURCE_DIR } / Installation / cmake / CMakeCPackOptions . cmake . in " <nl> + " $ { CMAKE_BINARY_DIR } / CMakeCPackOptions . cmake " @ ONLY ) <nl> + set ( CPACK_PROJECT_CONFIG_FILE " $ { CMAKE_BINARY_DIR } / CMakeCPackOptions . cmake " ) <nl> + <nl> + if ( NOT ( MSVC ) ) <nl> + # components <nl> + install ( <nl> + FILES $ { PROJECT_SOURCE_DIR } / Installation / debian / arangodb . init <nl> + PERMISSIONS OWNER_READ OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE <nl> + DESTINATION $ { ETCDIR } / init . d <nl> + RENAME arangodb3 <nl> + COMPONENT debian - extras <nl> + ) <nl> + endif ( ) <nl> + <nl> + # Custom targets mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - <nl> + <nl> + # love <nl> + add_custom_target ( love <nl> + COMMENT " ArangoDB loves you . " <nl> + COMMAND " " <nl> + ) <nl> + <nl> + <nl> + # Finally : user cpack <nl> + include ( CPack ) <nl> + <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + # # # @ brief install client - side JavaScript files <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + <nl> + install ( <nl> + DIRECTORY $ { PROJECT_SOURCE_DIR } / js / common $ { PROJECT_SOURCE_DIR } / js / client <nl> + DESTINATION share / arangodb3 / js <nl> + FILES_MATCHING PATTERN " * . js " <nl> + REGEX " ^ . * / common / test - data $ " EXCLUDE <nl> + REGEX " ^ . * / common / tests $ " EXCLUDE <nl> + REGEX " ^ . 
* / client / tests $ " EXCLUDE ) <nl> + <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + # # # @ brief install server - side JavaScript files <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + <nl> + install ( <nl> + DIRECTORY $ { PROJECT_SOURCE_DIR } / js / actions $ { PROJECT_SOURCE_DIR } / js / apps $ { PROJECT_SOURCE_DIR } / js / contrib $ { PROJECT_SOURCE_DIR } / js / node $ { PROJECT_SOURCE_DIR } / js / server <nl> + DESTINATION share / arangodb3 / js <nl> + REGEX " ^ . * / server / tests $ " EXCLUDE <nl> + REGEX " ^ . * / aardvark / APP / node_modules $ " EXCLUDE <nl> + ) <nl> + <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + # # # @ brief install log directory <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + <nl> + install ( <nl> + DIRECTORY $ { PROJECT_BINARY_DIR } / var / log / arangodb <nl> + DESTINATION $ { VARDIR_INSTALL } / log ) <nl> + <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + # # # @ brief install database directory <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + <nl> + install ( <nl> + DIRECTORY $ { PROJECT_BINARY_DIR } / var / lib / arangodb3 <nl> + DESTINATION $ { VARDIR_INSTALL } / lib ) <nl> + <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + # # # @ brief install apps directory <nl> + # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> + <nl> + install ( <nl> + DIRECTORY $ { PROJECT_BINARY_DIR } / var / lib / arangodb3 - apps <nl> + DESTINATION $ { VARDIR_INSTALL } / lib ) <nl>
fixed eol
arangodb/arangodb
c1f9534c9bc3876e4c2ce7ba08553ca56cc97017
2016-06-14T13:58:51Z
mmm a / Makefile <nl> ppp b / Makefile <nl> else <nl> $ ( GENDIR ) / examples / pubsub / empty . pb . cc : examples / pubsub / empty . proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> ifeq ( $ ( NO_PROTOC ) , true ) <nl> else <nl> $ ( GENDIR ) / examples / pubsub / label . pb . cc : examples / pubsub / label . proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> ifeq ( $ ( NO_PROTOC ) , true ) <nl> else <nl> $ ( GENDIR ) / examples / pubsub / pubsub . pb . cc : examples / pubsub / pubsub . proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> ifeq ( $ ( NO_PROTOC ) , true ) <nl> else <nl> $ ( GENDIR ) / test / cpp / interop / empty . pb . cc : test / cpp / interop / empty . proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> ifeq ( $ ( NO_PROTOC ) , true ) <nl> else <nl> $ ( GENDIR ) / test / cpp / interop / messages . pb . cc : test / cpp / interop / messages . proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> ifeq ( $ ( NO_PROTOC ) , true ) <nl> else <nl> $ ( GENDIR ) / test / cpp / interop / test . pb . cc : test / cpp / interop / test . 
proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> ifeq ( $ ( NO_PROTOC ) , true ) <nl> else <nl> $ ( GENDIR ) / test / cpp / qps / qpstest . pb . cc : test / cpp / qps / qpstest . proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> ifeq ( $ ( NO_PROTOC ) , true ) <nl> else <nl> $ ( GENDIR ) / test / cpp / util / echo . pb . cc : test / cpp / util / echo . proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> ifeq ( $ ( NO_PROTOC ) , true ) <nl> else <nl> $ ( GENDIR ) / test / cpp / util / echo_duplicate . pb . cc : test / cpp / util / echo_duplicate . proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> ifeq ( $ ( NO_PROTOC ) , true ) <nl> else <nl> $ ( GENDIR ) / test / cpp / util / messages . pb . cc : test / cpp / util / messages . proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> <nl> mmm a / templates / Makefile . template <nl> ppp b / templates / Makefile . template <nl> else <nl> $ ( GENDIR ) / $ { p } . pb . cc : $ { p } . 
proto $ ( PROTOBUF_DEP ) $ ( PROTOC_PLUGINS ) <nl> $ ( E ) " [ PROTOC ] Generating protobuf CC file from $ < " <nl> $ ( Q ) mkdir - p ` dirname $ @ ` <nl> - $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / cpp_plugin $ < <nl> + $ ( Q ) $ ( PROTOC ) - - cpp_out = $ ( GENDIR ) - - grpc_out = $ ( GENDIR ) - - plugin = protoc - gen - grpc = $ ( BINDIR ) / $ ( CONFIG ) / grpc_cpp_plugin $ < <nl> endif <nl> <nl> % endfor <nl>
Fix a bug in Makefile where cpp_plugin name hadn't been updated yet where it
grpc/grpc
850290ff7cd2776de48522ec8e0914171d97d60d
2015-02-19T17:59:44Z
mmm a / atom / renderer / lib / web - view / web - view - attributes . coffee <nl> ppp b / atom / renderer / lib / web - view / web - view - attributes . coffee <nl> class SrcAttribute extends WebViewAttribute <nl> # Navigate to | this . src | . <nl> httpreferrer = @ webViewImpl . attributes [ webViewConstants . ATTRIBUTE_HTTPREFERRER ] . getValue ( ) <nl> urlOptions = if httpreferrer then { httpreferrer } else { } <nl> - remote . getGuestWebContents ( @ webViewImpl . guestInstanceId ) . loadUrl @ getValue ( ) , urlOptions <nl> + <nl> + useragent = @ webViewImpl . attributes [ webViewConstants . ATTRIBUTE_HTTPREFERRER ] . getValue ( ) <nl> + <nl> + guestContents = remote . getGuestWebContents ( @ webViewImpl . guestInstanceId ) <nl> + guestContents . setUserAgent ( guestContents ) if guestContents <nl> + guestContents . loadUrl @ getValue ( ) , urlOptions <nl> <nl> # Attribute specifies HTTP referrer . <nl> class HttpReferrerAttribute extends WebViewAttribute <nl>
Right before navigating, set the user agent
electron/electron
4a8d7c18194570d29d0bcece3f9cdf0269986a7d
2015-05-19T21:27:15Z
new file mode 100644 <nl> index 00000000000 . . ada938a148c <nl> mmm / dev / null <nl> ppp b / contrib / Python / cntk / ops / cntk2 . py <nl> <nl> + # Copyright ( c ) Microsoft . All rights reserved . <nl> + <nl> + # Licensed under the MIT license . See LICENSE . md file in the project root <nl> + # for full license information . <nl> + # = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = <nl> + <nl> + # This file is auto - generated by _fetch_ops . py . <nl> + <nl> + from cntk . graph import ComputationNode , InputComputationNodeBase , ImageInputComputationNodeBase <nl> + <nl> + class Ceil ( ComputationNode ) : <nl> + def __init__ ( self , _ , name = ' CNTK2 . Ceil ' , var_name = None ) : <nl> + super ( Ceil , self ) . __init__ ( params = [ ' _ ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . inputs = [ ' _ ' ] <nl> + self . params_with_defaults = [ ] <nl> + <nl> + class ElementDivide ( ComputationNode ) : <nl> + def __init__ ( self , _ , y , name = ' CNTK2 . ElementDivide ' , var_name = None ) : <nl> + super ( ElementDivide , self ) . __init__ ( params = [ ' _ ' , ' y ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . y = y <nl> + self . inputs = [ ' _ ' , ' y ' ] <nl> + self . params_with_defaults = [ ] <nl> + <nl> + class Round ( ComputationNode ) : <nl> + def __init__ ( self , _ , name = ' CNTK2 . Round ' , var_name = None ) : <nl> + super ( Round , self ) . __init__ ( params = [ ' _ ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . inputs = [ ' _ ' ] <nl> + self . params_with_defaults = [ ] <nl> + <nl> + class DynamicAxis ( ComputationNode ) : <nl> + def __init__ ( self , name = ' CNTK2 . DynamicAxis ' , var_name = None ) : <nl> + super ( DynamicAxis , self ) . __init__ ( params = [ ] , name = name , var_name = var_name ) <nl> + <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ] <nl> + <nl> + class Input ( InputComputationNodeBase ) : <nl> + def __init__ ( self , shape , dynamicAxis = ' ' , tag = ' feature ' , name = ' CNTK2 . Input ' , var_name = None ) : <nl> + super ( Input , self ) . __init__ ( params = [ ' shape ' , ' dynamicAxis ' , ' tag ' ] , name = name , var_name = var_name ) <nl> + self . shape = shape <nl> + self . dynamicAxis = dynamicAxis <nl> + self . tag = tag <nl> + self . params_with_defaults = [ ' dynamicAxis ' , ' tag ' ] <nl> + self . inputs = [ ] <nl> + <nl> + class _Parameter ( ComputationNode ) : <nl> + def __init__ ( self , shape , value = 0 , learningRateMultiplier = 1 . 0 , init = ' uniform ' , initValueScale = 1 , initFromFilePath = ' ' , initFromLiteral = ' ' , initOnCPUOnly = True , randomSeed = - 1 , name = ' CNTK2 . _Parameter ' , var_name = None ) : <nl> + super ( _Parameter , self ) . __init__ ( params = [ ' shape ' , ' value ' , ' learningRateMultiplier ' , ' init ' , ' initValueScale ' , ' initFromFilePath ' , ' initFromLiteral ' , ' initOnCPUOnly ' , ' randomSeed ' ] , name = name , var_name = var_name ) <nl> + self . shape = shape <nl> + self . value = value <nl> + self . learningRateMultiplier = learningRateMultiplier <nl> + self . init = init <nl> + self . initValueScale = initValueScale <nl> + self . initFromFilePath = initFromFilePath <nl> + self . initFromLiteral = initFromLiteral <nl> + self . initOnCPUOnly = initOnCPUOnly <nl> + self . randomSeed = randomSeed <nl> + self . 
params_with_defaults = [ ' value ' , ' learningRateMultiplier ' , ' init ' , ' initValueScale ' , ' initFromFilePath ' , ' initFromLiteral ' , ' initOnCPUOnly ' , ' randomSeed ' ] <nl> + self . inputs = [ ] <nl> + <nl> + class Reshape ( ComputationNode ) : <nl> + def __init__ ( self , _ , shape , beginAxis = 0 , endAxis = 0 , name = ' CNTK2 . Reshape ' , var_name = None ) : <nl> + super ( Reshape , self ) . __init__ ( params = [ ' _ ' , ' shape ' , ' beginAxis ' , ' endAxis ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . shape = shape <nl> + self . beginAxis = beginAxis <nl> + self . endAxis = endAxis <nl> + self . params_with_defaults = [ ' beginAxis ' , ' endAxis ' ] <nl> + self . inputs = [ ' _ ' ] <nl> + <nl> + class Times ( ComputationNode ) : <nl> + def __init__ ( self , x , y , outputRank = 1 , name = ' CNTK2 . Times ' , var_name = None ) : <nl> + super ( Times , self ) . __init__ ( params = [ ' x ' , ' y ' , ' outputRank ' ] , name = name , var_name = var_name ) <nl> + self . x = x <nl> + self . y = y <nl> + self . outputRank = outputRank <nl> + self . params_with_defaults = [ ' outputRank ' ] <nl> + self . inputs = [ ' x ' , ' y ' ] <nl> + <nl> + class Abs ( ComputationNode ) : <nl> + def __init__ ( self , _ , name = ' CNTK2 . Abs ' , var_name = None ) : <nl> + super ( Abs , self ) . __init__ ( params = [ ' _ ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' ] <nl> + <nl> + class Clip ( ComputationNode ) : <nl> + def __init__ ( self , _ , minValue , maxValue , name = ' CNTK2 . Clip ' , var_name = None ) : <nl> + super ( Clip , self ) . __init__ ( params = [ ' _ ' , ' minValue ' , ' maxValue ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . minValue = minValue <nl> + self . maxValue = maxValue <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' minValue ' , ' maxValue ' , ' _ ' ] <nl> + <nl> + class ElementTimes ( ComputationNode ) : <nl> + def __init__ ( self , _ , y , name = ' CNTK2 . ElementTimes ' , var_name = None ) : <nl> + super ( ElementTimes , self ) . __init__ ( params = [ ' _ ' , ' y ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . y = y <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' , ' y ' ] <nl> + <nl> + class Floor ( ComputationNode ) : <nl> + def __init__ ( self , _ , name = ' CNTK2 . Floor ' , var_name = None ) : <nl> + super ( Floor , self ) . __init__ ( params = [ ' _ ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' ] <nl> + <nl> + class Minus ( ComputationNode ) : <nl> + def __init__ ( self , _ , y , name = ' CNTK2 . Minus ' , var_name = None ) : <nl> + super ( Minus , self ) . __init__ ( params = [ ' _ ' , ' y ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . y = y <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' , ' y ' ] <nl> + <nl> + class Plus ( ComputationNode ) : <nl> + def __init__ ( self , _ , y , name = ' CNTK2 . Plus ' , var_name = None ) : <nl> + super ( Plus , self ) . __init__ ( params = [ ' _ ' , ' y ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . y = y <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' , ' y ' ] <nl> + <nl> + class Tanh ( ComputationNode ) : <nl> + def __init__ ( self , _ , name = ' CNTK2 . 
Tanh ' , var_name = None ) : <nl> + super ( Tanh , self ) . __init__ ( params = [ ' _ ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' ] <nl> + <nl> + class FutureValue ( ComputationNode ) : <nl> + def __init__ ( self , _ , shape , timeStep = 1 , defaultHiddenActivation = 0 . 1 , name = ' CNTK2 . FutureValue ' , var_name = None ) : <nl> + super ( FutureValue , self ) . __init__ ( params = [ ' _ ' , ' shape ' , ' timeStep ' , ' defaultHiddenActivation ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . shape = shape <nl> + self . timeStep = timeStep <nl> + self . defaultHiddenActivation = defaultHiddenActivation <nl> + self . params_with_defaults = [ ' timeStep ' , ' defaultHiddenActivation ' ] <nl> + self . inputs = [ ' _ ' ] <nl> + <nl> + class PastValue ( ComputationNode ) : <nl> + def __init__ ( self , _ , shape , timeStep = 1 , defaultHiddenActivation = 0 . 1 , name = ' CNTK2 . PastValue ' , var_name = None ) : <nl> + super ( PastValue , self ) . __init__ ( params = [ ' _ ' , ' shape ' , ' timeStep ' , ' defaultHiddenActivation ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . shape = shape <nl> + self . timeStep = timeStep <nl> + self . defaultHiddenActivation = defaultHiddenActivation <nl> + self . params_with_defaults = [ ' timeStep ' , ' defaultHiddenActivation ' ] <nl> + self . inputs = [ ' _ ' ] <nl> + <nl> + class Relu ( ComputationNode ) : <nl> + def __init__ ( self , _ , name = ' CNTK2 . Relu ' , var_name = None ) : <nl> + super ( Relu , self ) . __init__ ( params = [ ' _ ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' ] <nl> + <nl> + class Sigmoid ( ComputationNode ) : <nl> + def __init__ ( self , _ , name = ' CNTK2 . Sigmoid ' , var_name = None ) : <nl> + super ( Sigmoid , self ) . __init__ ( params = [ ' _ ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' ] <nl> + <nl> + class Softmax ( ComputationNode ) : <nl> + def __init__ ( self , _ , name = ' CNTK2 . Softmax ' , var_name = None ) : <nl> + super ( Softmax , self ) . __init__ ( params = [ ' _ ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' ] <nl> + <nl> + class CrossEntropyWithSoftmax ( ComputationNode ) : <nl> + def __init__ ( self , _ , outProbVectorSequence , name = ' CNTK2 . CrossEntropyWithSoftmax ' , var_name = None ) : <nl> + super ( CrossEntropyWithSoftmax , self ) . __init__ ( params = [ ' _ ' , ' outProbVectorSequence ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . outProbVectorSequence = outProbVectorSequence <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' , ' outProbVectorSequence ' ] <nl> + <nl> + class ErrorPrediction ( ComputationNode ) : <nl> + def __init__ ( self , _ , outVectorSequence , name = ' CNTK2 . ErrorPrediction ' , var_name = None ) : <nl> + super ( ErrorPrediction , self ) . __init__ ( params = [ ' _ ' , ' outVectorSequence ' ] , name = name , var_name = var_name ) <nl> + self . _ = _ <nl> + self . outVectorSequence = outVectorSequence <nl> + self . params_with_defaults = [ ] <nl> + self . inputs = [ ' _ ' , ' outVectorSequence ' ] <nl> + <nl>
Use cntk2 namespace where appropriate
microsoft/CNTK
f4c2e9a1998bbe254b61ea57d09c39b2f6983b2a
2016-04-29T10:56:20Z
mmm a / imgui . cpp <nl> ppp b / imgui . cpp <nl> void ImGui : : SetTooltip ( const char * fmt , . . . ) <nl> / / [ SECTION ] POPUPS <nl> / / mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> <nl> + / / Return true if the popup is open at the current BeginPopup ( ) level of the popup stack <nl> bool ImGui : : IsPopupOpen ( ImGuiID id ) <nl> { <nl> ImGuiContext & g = * GImGui ; <nl> return g . OpenPopupStack . Size > g . BeginPopupStack . Size & & g . OpenPopupStack [ g . BeginPopupStack . Size ] . PopupId = = id ; <nl> } <nl> <nl> + / / Return true if the popup is open at the current BeginPopup ( ) level of the popup stack <nl> bool ImGui : : IsPopupOpen ( const char * str_id ) <nl> { <nl> ImGuiContext & g = * GImGui ; <nl> bool ImGui : : IsPopupOpenAtAnyLevel ( ImGuiID id ) <nl> return false ; <nl> } <nl> <nl> + / / Return true if any popup is open at the current BeginPopup ( ) level of the popup stack <nl> + / / This may be used to e . g . test for another popups already opened in the same frame to handle popups priorities at the same level . <nl> + bool ImGui : : IsAnyPopupOpen ( ) <nl> + { <nl> + ImGuiContext & g = * GImGui ; <nl> + return g . OpenPopupStack . Size > g . BeginPopupStack . Size ; <nl> + } <nl> + <nl> ImGuiWindow * ImGui : : GetTopMostPopupModal ( ) <nl> { <nl> ImGuiContext & g = * GImGui ; <nl> bool ImGui : : OpenPopupContextItem ( const char * str_id , ImGuiMouseButton mouse_butt <nl> } <nl> <nl> / / This is a helper to handle the simplest case of associating one named popup to one given widget . <nl> - / / You may want to handle this on user side if you have specific needs ( e . g . tweaking IsItemHovered ( ) parameters ) . <nl> - / / You can pass a NULL str_id to use the identifier of the last item . <nl> + / / - You can pass a NULL str_id to use the identifier of the last item . <nl> + / / - You may want to handle this on user side if you have specific needs ( e . g . tweaking IsItemHovered ( ) parameters ) . <nl> + / / - This is essentially the same as calling OpenPopupContextItem ( ) + BeginPopupEx ( ) but written to avoid <nl> + / / computing the ID twice because BeginPopupContextXXX functions are called very frequently . <nl> bool ImGui : : BeginPopupContextItem ( const char * str_id , ImGuiMouseButton mouse_button ) <nl> { <nl> ImGuiWindow * window = GImGui - > CurrentWindow ; <nl> bool ImGui : : BeginPopupContextItem ( const char * str_id , ImGuiMouseButton mouse_but <nl> <nl> bool ImGui : : BeginPopupContextWindow ( const char * str_id , ImGuiMouseButton mouse_button , bool also_over_items ) <nl> { <nl> + ImGuiWindow * window = GImGui - > CurrentWindow ; <nl> if ( ! str_id ) <nl> str_id = " window_context " ; <nl> - ImGuiID id = GImGui - > CurrentWindow - > GetID ( str_id ) ; <nl> + ImGuiID id = window - > GetID ( str_id ) ; <nl> if ( IsMouseReleased ( mouse_button ) & & IsWindowHovered ( ImGuiHoveredFlags_AllowWhenBlockedByPopup ) ) <nl> if ( also_over_items | | ! IsAnyItemHovered ( ) ) <nl> OpenPopupEx ( id ) ; <nl> bool ImGui : : BeginPopupContextWindow ( const char * str_id , ImGuiMouseButton mouse_b <nl> <nl> bool ImGui : : BeginPopupContextVoid ( const char * str_id , ImGuiMouseButton mouse_button ) <nl> { <nl> + ImGuiWindow * window = GImGui - > CurrentWindow ; <nl> if ( ! str_id ) <nl> str_id = " void_context " ; <nl> - ImGuiID id = GImGui - > CurrentWindow - > GetID ( str_id ) ; <nl> + ImGuiID id = window - > GetID ( str_id ) ; <nl> if ( IsMouseReleased ( mouse_button ) & & ! 
IsWindowHovered ( ImGuiHoveredFlags_AnyWindow ) ) <nl> OpenPopupEx ( id ) ; <nl> return BeginPopupEx ( id , ImGuiWindowFlags_AlwaysAutoResize | ImGuiWindowFlags_NoTitleBar | ImGuiWindowFlags_NoSavedSettings ) ; <nl> mmm a / imgui . h <nl> ppp b / imgui . h <nl> namespace ImGui <nl> / / ( * 1 ) You can bypass that restriction and detect hovering even when normally blocked by a popup . <nl> / / To do this use the ImGuiHoveredFlags_AllowWhenBlockedByPopup when calling IsItemHovered ( ) or IsWindowHovered ( ) . <nl> / / This is what BeginPopupContextItem ( ) and BeginPopupContextWindow ( ) are doing already , allowing a right - click to reopen another popups without losing the click . <nl> + / / - The BeginPopupContextXXX functions are essentially helpers to do an OpenPopup ( ) in some condition + BeginPopup ( ) . <nl> IMGUI_API void OpenPopup ( const char * str_id ) ; / / call to mark popup as open ( don ' t call every frame ! ) . popups are closed when user click outside , or if CloseCurrentPopup ( ) is called within a BeginPopup ( ) / EndPopup ( ) block . By default , Selectable ( ) / MenuItem ( ) are calling CloseCurrentPopup ( ) . Popup identifiers are relative to the current ID - stack ( so OpenPopup and BeginPopup needs to be at the same level ) . <nl> IMGUI_API bool BeginPopup ( const char * str_id , ImGuiWindowFlags flags = 0 ) ; / / return true if the popup is open , and you can start outputting to it . only call EndPopup ( ) if BeginPopup ( ) returns true ! <nl> IMGUI_API bool BeginPopupContextItem ( const char * str_id = NULL , ImGuiMouseButton mouse_button = 1 ) ; / / helper to open and begin popup when clicked on last item . if you can pass a NULL str_id only if the previous item had an id . If you want to use that on a non - interactive item such as Text ( ) you need to pass in an explicit ID here . read comments in . cpp ! <nl> namespace ImGui <nl> IMGUI_API bool BeginPopupContextVoid ( const char * str_id = NULL , ImGuiMouseButton mouse_button = 1 ) ; / / helper to open and begin popup when clicked in void ( where there are no imgui windows ) . <nl> IMGUI_API bool BeginPopupModal ( const char * name , bool * p_open = NULL , ImGuiWindowFlags flags = 0 ) ; / / modal dialog ( regular window with title bar , block interactions behind the modal window , can ' t close the modal window by clicking outside ) <nl> IMGUI_API void EndPopup ( ) ; / / only call EndPopup ( ) if BeginPopupXXX ( ) returns true ! <nl> - IMGUI_API bool OpenPopupContextItem ( const char * str_id = NULL , ImGuiMouseButton mouse_button = 1 ) ; / / helper to open popup when clicked on last item ( note : actually triggers on the mouse _released_ event to be consistent with popup behaviors ) . return true when just opened . <nl> - IMGUI_API bool IsPopupOpen ( const char * str_id ) ; / / return true if the popup is open at the current begin - ed level of the popup stack . <nl> + IMGUI_API bool OpenPopupContextItem ( const char * str_id = NULL , ImGuiMouseButton mouse_button = 1 ) ; / / helper to open popup when clicked on last item . return true when just opened . ( note : actually triggers on the mouse _released_ event to be consistent with popup behaviors ) <nl> + IMGUI_API bool IsPopupOpen ( const char * str_id ) ; / / return true if the popup is open at the current BeginPopup ( ) level of the popup stack <nl> IMGUI_API void CloseCurrentPopup ( ) ; / / close the popup we have begin - ed into . clicking on a MenuItem or Selectable automatically close the current popup . 
<nl> <nl> / / Columns <nl> mmm a / imgui_demo . cpp <nl> ppp b / imgui_demo . cpp <nl> static void ShowDemoWindowPopups ( ) <nl> if ( ImGui : : TreeNode ( " Context menus " ) ) <nl> { <nl> / / BeginPopupContextItem ( ) is a helper to provide common / simple popup behavior of essentially doing : <nl> - / / if ( IsItemHovered ( ) & & IsMouseReleased ( 0 ) ) <nl> + / / if ( IsItemHovered ( ) & & IsMouseReleased ( ImGuiMouseButton_Right ) ) <nl> / / OpenPopup ( id ) ; <nl> / / return BeginPopup ( id ) ; <nl> / / For more advanced uses you may want to replicate and customize this code . <nl> mmm a / imgui_internal . h <nl> ppp b / imgui_internal . h <nl> namespace ImGui <nl> IMGUI_API void OpenPopupEx ( ImGuiID id ) ; <nl> IMGUI_API void ClosePopupToLevel ( int remaining , bool restore_focus_to_window_under_popup ) ; <nl> IMGUI_API void ClosePopupsOverWindow ( ImGuiWindow * ref_window , bool restore_focus_to_window_under_popup ) ; <nl> - IMGUI_API bool IsPopupOpen ( ImGuiID id ) ; / / Test for id at current popup stack level ( currently begin - ed into ) ; this doesn ' t scan the whole popup stack ! <nl> + IMGUI_API bool IsPopupOpen ( ImGuiID id ) ; / / Test for id at the current BeginPopup ( ) level of the popup stack ( this doesn ' t scan the whole popup stack ! ) <nl> IMGUI_API bool IsPopupOpenAtAnyLevel ( ImGuiID id ) ; <nl> + IMGUI_API bool IsAnyPopupOpen ( ) ; / / Return true if any popup is open at the current BeginPopup ( ) level of the popup stack <nl> IMGUI_API bool BeginPopupEx ( ImGuiID id , ImGuiWindowFlags extra_flags ) ; <nl> IMGUI_API void BeginTooltipEx ( ImGuiWindowFlags extra_flags , ImGuiTooltipFlags tooltip_flags ) ; <nl> IMGUI_API ImGuiWindow * GetTopMostPopupModal ( ) ; <nl>
Popups: Internals: Added IsAnyPopupOpen().
ocornut/imgui
37eb89371b7ee8b0728b07b2ab437c6281bc8261
2020-06-16T16:46:25Z
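A minimal sketch of how the IsAnyPopupOpen() helper added in the commit above might be used from application code; the function lives in imgui_internal.h, and the popup name and menu entry here are illustrative only, not part of the commit.

#include "imgui.h"
#include "imgui_internal.h" // IsAnyPopupOpen() is part of the internal API

static void ShowItemContextMenu()
{
    // Give an already-open popup priority: only open ours when nothing else
    // is open at the current BeginPopup() level of the popup stack.
    if (ImGui::IsItemHovered() && ImGui::IsMouseReleased(ImGuiMouseButton_Right) && !ImGui::IsAnyPopupOpen())
        ImGui::OpenPopup("item_context");

    if (ImGui::BeginPopup("item_context"))
    {
        if (ImGui::MenuItem("Copy")) { /* handle the action */ }
        ImGui::EndPopup();
    }
}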
mmm a / src / google / protobuf / stubs / atomicops . h <nl> ppp b / src / google / protobuf / stubs / atomicops . h <nl> GOOGLE_PROTOBUF_ATOMICOPS_ERROR <nl> # include < google / protobuf / stubs / atomicops_internals_x86_gcc . h > <nl> # elif defined ( GOOGLE_PROTOBUF_ARCH_ARM ) <nl> # include < google / protobuf / stubs / atomicops_internals_arm_gcc . h > <nl> + # elif defined ( GOOGLE_PROTOBUF_ARCH_AARCH64 ) <nl> + # include < google / protobuf / stubs / atomicops_internals_arm64_gcc . h > <nl> # elif defined ( GOOGLE_PROTOBUF_ARCH_ARM_QNX ) <nl> # include < google / protobuf / stubs / atomicops_internals_arm_qnx . h > <nl> # elif defined ( GOOGLE_PROTOBUF_ARCH_MIPS ) <nl> new file mode 100644 <nl> index 0000000000 . . c13cddb7c5 <nl> mmm / dev / null <nl> ppp b / src / google / protobuf / stubs / atomicops_internals_arm64_gcc . h <nl> <nl> + / / Protocol Buffers - Google ' s data interchange format <nl> + / / Copyright 2012 Google Inc . All rights reserved . <nl> + / / http : / / code . google . com / p / protobuf / <nl> + / / <nl> + / / Redistribution and use in source and binary forms , with or without <nl> + / / modification , are permitted provided that the following conditions are <nl> + / / met : <nl> + / / <nl> + / / * Redistributions of source code must retain the above copyright <nl> + / / notice , this list of conditions and the following disclaimer . <nl> + / / * Redistributions in binary form must reproduce the above <nl> + / / copyright notice , this list of conditions and the following disclaimer <nl> + / / in the documentation and / or other materials provided with the <nl> + / / distribution . <nl> + / / * Neither the name of Google Inc . nor the names of its <nl> + / / contributors may be used to endorse or promote products derived from <nl> + / / this software without specific prior written permission . <nl> + / / <nl> + / / THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS <nl> + / / " AS IS " AND ANY EXPRESS OR IMPLIED WARRANTIES , INCLUDING , BUT NOT <nl> + / / LIMITED TO , THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR <nl> + / / A PARTICULAR PURPOSE ARE DISCLAIMED . IN NO EVENT SHALL THE COPYRIGHT <nl> + / / OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT , INDIRECT , INCIDENTAL , <nl> + / / SPECIAL , EXEMPLARY , OR CONSEQUENTIAL DAMAGES ( INCLUDING , BUT NOT <nl> + / / LIMITED TO , PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES ; LOSS OF USE , <nl> + / / DATA , OR PROFITS ; OR BUSINESS INTERRUPTION ) HOWEVER CAUSED AND ON ANY <nl> + / / THEORY OF LIABILITY , WHETHER IN CONTRACT , STRICT LIABILITY , OR TORT <nl> + / / ( INCLUDING NEGLIGENCE OR OTHERWISE ) ARISING IN ANY WAY OUT OF THE USE <nl> + / / OF THIS SOFTWARE , EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE . <nl> + <nl> + / / This file is an internal atomic implementation , use atomicops . h instead . <nl> + <nl> + # ifndef GOOGLE_PROTOBUF_ATOMICOPS_INTERNALS_ARM64_GCC_H_ <nl> + # define GOOGLE_PROTOBUF_ATOMICOPS_INTERNALS_ARM64_GCC_H_ <nl> + <nl> + namespace google { <nl> + namespace protobuf { <nl> + namespace internal { <nl> + <nl> + inline void MemoryBarrier ( ) { <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " dmb ish \ n \ t " / / Data memory barrier . 
<nl> + : : : " memory " <nl> + ) ; / / NOLINT <nl> + } <nl> + <nl> + <nl> + inline Atomic32 NoBarrier_CompareAndSwap ( volatile Atomic32 * ptr , <nl> + Atomic32 old_value , <nl> + Atomic32 new_value ) { <nl> + Atomic32 prev ; <nl> + int32_t temp ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " <nl> + " ldxr % w [ prev ] , % [ ptr ] \ n \ t " / / Load the previous value . <nl> + " cmp % w [ prev ] , % w [ old_value ] \ n \ t " <nl> + " bne 1f \ n \ t " <nl> + " stxr % w [ temp ] , % w [ new_value ] , % [ ptr ] \ n \ t " / / Try to store the new value . <nl> + " cbnz % w [ temp ] , 0b \ n \ t " / / Retry if it did not work . <nl> + " 1 : \ n \ t " <nl> + " clrex \ n \ t " / / In case we didn ' t swap . <nl> + : [ prev ] " = & r " ( prev ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ old_value ] " r " ( old_value ) , <nl> + [ new_value ] " r " ( new_value ) <nl> + : " memory " , " cc " <nl> + ) ; / / NOLINT <nl> + <nl> + return prev ; <nl> + } <nl> + <nl> + inline Atomic32 NoBarrier_AtomicExchange ( volatile Atomic32 * ptr , <nl> + Atomic32 new_value ) { <nl> + Atomic32 result ; <nl> + int32_t temp ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " <nl> + " ldxr % w [ result ] , % [ ptr ] \ n \ t " / / Load the previous value . <nl> + " stxr % w [ temp ] , % w [ new_value ] , % [ ptr ] \ n \ t " / / Try to store the new value . <nl> + " cbnz % w [ temp ] , 0b \ n \ t " / / Retry if it did not work . <nl> + : [ result ] " = & r " ( result ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ new_value ] " r " ( new_value ) <nl> + : " memory " <nl> + ) ; / / NOLINT <nl> + <nl> + return result ; <nl> + } <nl> + <nl> + inline Atomic32 NoBarrier_AtomicIncrement ( volatile Atomic32 * ptr , <nl> + Atomic32 increment ) { <nl> + Atomic32 result ; <nl> + int32_t temp ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " <nl> + " ldxr % w [ result ] , % [ ptr ] \ n \ t " / / Load the previous value . <nl> + " add % w [ result ] , % w [ result ] , % w [ increment ] \ n \ t " <nl> + " stxr % w [ temp ] , % w [ result ] , % [ ptr ] \ n \ t " / / Try to store the result . <nl> + " cbnz % w [ temp ] , 0b \ n \ t " / / Retry on failure . <nl> + : [ result ] " = & r " ( result ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ increment ] " r " ( increment ) <nl> + : " memory " <nl> + ) ; / / NOLINT <nl> + <nl> + return result ; <nl> + } <nl> + <nl> + inline Atomic32 Barrier_AtomicIncrement ( volatile Atomic32 * ptr , <nl> + Atomic32 increment ) { <nl> + MemoryBarrier ( ) ; <nl> + Atomic32 result = NoBarrier_AtomicIncrement ( ptr , increment ) ; <nl> + MemoryBarrier ( ) ; <nl> + <nl> + return result ; <nl> + } <nl> + <nl> + inline Atomic32 Acquire_CompareAndSwap ( volatile Atomic32 * ptr , <nl> + Atomic32 old_value , <nl> + Atomic32 new_value ) { <nl> + Atomic32 prev ; <nl> + int32_t temp ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " <nl> + " ldxr % w [ prev ] , % [ ptr ] \ n \ t " / / Load the previous value . <nl> + " cmp % w [ prev ] , % w [ old_value ] \ n \ t " <nl> + " bne 1f \ n \ t " <nl> + " stxr % w [ temp ] , % w [ new_value ] , % [ ptr ] \ n \ t " / / Try to store the new value . <nl> + " cbnz % w [ temp ] , 0b \ n \ t " / / Retry if it did not work . <nl> + " dmb ish \ n \ t " / / Data memory barrier . 
<nl> + " 1 : \ n \ t " <nl> + / / If the compare failed the ' dmb ' is unnecessary , but we still need a <nl> + / / ' clrex ' . <nl> + " clrex \ n \ t " <nl> + : [ prev ] " = & r " ( prev ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ old_value ] " r " ( old_value ) , <nl> + [ new_value ] " r " ( new_value ) <nl> + : " memory " , " cc " <nl> + ) ; / / NOLINT <nl> + <nl> + return prev ; <nl> + } <nl> + <nl> + inline Atomic32 Release_CompareAndSwap ( volatile Atomic32 * ptr , <nl> + Atomic32 old_value , <nl> + Atomic32 new_value ) { <nl> + Atomic32 prev ; <nl> + int32_t temp ; <nl> + <nl> + MemoryBarrier ( ) ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " <nl> + " ldxr % w [ prev ] , % [ ptr ] \ n \ t " / / Load the previous value . <nl> + " cmp % w [ prev ] , % w [ old_value ] \ n \ t " <nl> + " bne 1f \ n \ t " <nl> + " stxr % w [ temp ] , % w [ new_value ] , % [ ptr ] \ n \ t " / / Try to store the new value . <nl> + " cbnz % w [ temp ] , 0b \ n \ t " / / Retry if it did not work . <nl> + " 1 : \ n \ t " <nl> + / / If the compare failed the we still need a ' clrex ' . <nl> + " clrex \ n \ t " <nl> + : [ prev ] " = & r " ( prev ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ old_value ] " r " ( old_value ) , <nl> + [ new_value ] " r " ( new_value ) <nl> + : " memory " , " cc " <nl> + ) ; / / NOLINT <nl> + <nl> + return prev ; <nl> + } <nl> + <nl> + inline void NoBarrier_Store ( volatile Atomic32 * ptr , Atomic32 value ) { <nl> + * ptr = value ; <nl> + } <nl> + <nl> + inline void Acquire_Store ( volatile Atomic32 * ptr , Atomic32 value ) { <nl> + * ptr = value ; <nl> + MemoryBarrier ( ) ; <nl> + } <nl> + <nl> + inline void Release_Store ( volatile Atomic32 * ptr , Atomic32 value ) { <nl> + MemoryBarrier ( ) ; <nl> + * ptr = value ; <nl> + } <nl> + <nl> + inline Atomic32 NoBarrier_Load ( volatile const Atomic32 * ptr ) { <nl> + return * ptr ; <nl> + } <nl> + <nl> + inline Atomic32 Acquire_Load ( volatile const Atomic32 * ptr ) { <nl> + Atomic32 value = * ptr ; <nl> + MemoryBarrier ( ) ; <nl> + return value ; <nl> + } <nl> + <nl> + inline Atomic32 Release_Load ( volatile const Atomic32 * ptr ) { <nl> + MemoryBarrier ( ) ; <nl> + return * ptr ; <nl> + } <nl> + <nl> + / / 64 - bit versions of the operations . <nl> + / / See the 32 - bit versions for comments . 
<nl> + <nl> + inline Atomic64 NoBarrier_CompareAndSwap ( volatile Atomic64 * ptr , <nl> + Atomic64 old_value , <nl> + Atomic64 new_value ) { <nl> + Atomic64 prev ; <nl> + int32_t temp ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " <nl> + " ldxr % [ prev ] , % [ ptr ] \ n \ t " <nl> + " cmp % [ prev ] , % [ old_value ] \ n \ t " <nl> + " bne 1f \ n \ t " <nl> + " stxr % w [ temp ] , % [ new_value ] , % [ ptr ] \ n \ t " <nl> + " cbnz % w [ temp ] , 0b \ n \ t " <nl> + " 1 : \ n \ t " <nl> + " clrex \ n \ t " <nl> + : [ prev ] " = & r " ( prev ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ old_value ] " r " ( old_value ) , <nl> + [ new_value ] " r " ( new_value ) <nl> + : " memory " , " cc " <nl> + ) ; / / NOLINT <nl> + <nl> + return prev ; <nl> + } <nl> + <nl> + inline Atomic64 NoBarrier_AtomicExchange ( volatile Atomic64 * ptr , <nl> + Atomic64 new_value ) { <nl> + Atomic64 result ; <nl> + int32_t temp ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " <nl> + " ldxr % [ result ] , % [ ptr ] \ n \ t " <nl> + " stxr % w [ temp ] , % [ new_value ] , % [ ptr ] \ n \ t " <nl> + " cbnz % w [ temp ] , 0b \ n \ t " <nl> + : [ result ] " = & r " ( result ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ new_value ] " r " ( new_value ) <nl> + : " memory " <nl> + ) ; / / NOLINT <nl> + <nl> + return result ; <nl> + } <nl> + <nl> + inline Atomic64 NoBarrier_AtomicIncrement ( volatile Atomic64 * ptr , <nl> + Atomic64 increment ) { <nl> + Atomic64 result ; <nl> + int32_t temp ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " <nl> + " ldxr % [ result ] , % [ ptr ] \ n \ t " <nl> + " add % [ result ] , % [ result ] , % [ increment ] \ n \ t " <nl> + " stxr % w [ temp ] , % [ result ] , % [ ptr ] \ n \ t " <nl> + " cbnz % w [ temp ] , 0b \ n \ t " <nl> + : [ result ] " = & r " ( result ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ increment ] " r " ( increment ) <nl> + : " memory " <nl> + ) ; / / NOLINT <nl> + <nl> + return result ; <nl> + } <nl> + <nl> + inline Atomic64 Barrier_AtomicIncrement ( volatile Atomic64 * ptr , <nl> + Atomic64 increment ) { <nl> + MemoryBarrier ( ) ; <nl> + Atomic64 result = NoBarrier_AtomicIncrement ( ptr , increment ) ; <nl> + MemoryBarrier ( ) ; <nl> + <nl> + return result ; <nl> + } <nl> + <nl> + inline Atomic64 Acquire_CompareAndSwap ( volatile Atomic64 * ptr , <nl> + Atomic64 old_value , <nl> + Atomic64 new_value ) { <nl> + Atomic64 prev ; <nl> + int32_t temp ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " <nl> + " ldxr % [ prev ] , % [ ptr ] \ n \ t " <nl> + " cmp % [ prev ] , % [ old_value ] \ n \ t " <nl> + " bne 1f \ n \ t " <nl> + " stxr % w [ temp ] , % [ new_value ] , % [ ptr ] \ n \ t " <nl> + " cbnz % w [ temp ] , 0b \ n \ t " <nl> + " dmb ish \ n \ t " <nl> + " 1 : \ n \ t " <nl> + " clrex \ n \ t " <nl> + : [ prev ] " = & r " ( prev ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ old_value ] " r " ( old_value ) , <nl> + [ new_value ] " r " ( new_value ) <nl> + : " memory " , " cc " <nl> + ) ; / / NOLINT <nl> + <nl> + return prev ; <nl> + } <nl> + <nl> + inline Atomic64 Release_CompareAndSwap ( volatile Atomic64 * ptr , <nl> + Atomic64 old_value , <nl> + Atomic64 new_value ) { <nl> + Atomic64 prev ; <nl> + int32_t temp ; <nl> + <nl> + MemoryBarrier ( ) ; <nl> + <nl> + __asm__ __volatile__ ( / / NOLINT <nl> + " 0 : \ n \ t " 
<nl> + " ldxr % [ prev ] , % [ ptr ] \ n \ t " <nl> + " cmp % [ prev ] , % [ old_value ] \ n \ t " <nl> + " bne 1f \ n \ t " <nl> + " stxr % w [ temp ] , % [ new_value ] , % [ ptr ] \ n \ t " <nl> + " cbnz % w [ temp ] , 0b \ n \ t " <nl> + " 1 : \ n \ t " <nl> + " clrex \ n \ t " <nl> + : [ prev ] " = & r " ( prev ) , <nl> + [ temp ] " = & r " ( temp ) , <nl> + [ ptr ] " + Q " ( * ptr ) <nl> + : [ old_value ] " r " ( old_value ) , <nl> + [ new_value ] " r " ( new_value ) <nl> + : " memory " , " cc " <nl> + ) ; / / NOLINT <nl> + <nl> + return prev ; <nl> + } <nl> + <nl> + inline void NoBarrier_Store ( volatile Atomic64 * ptr , Atomic64 value ) { <nl> + * ptr = value ; <nl> + } <nl> + <nl> + inline void Acquire_Store ( volatile Atomic64 * ptr , Atomic64 value ) { <nl> + * ptr = value ; <nl> + MemoryBarrier ( ) ; <nl> + } <nl> + <nl> + inline void Release_Store ( volatile Atomic64 * ptr , Atomic64 value ) { <nl> + MemoryBarrier ( ) ; <nl> + * ptr = value ; <nl> + } <nl> + <nl> + inline Atomic64 NoBarrier_Load ( volatile const Atomic64 * ptr ) { <nl> + return * ptr ; <nl> + } <nl> + <nl> + inline Atomic64 Acquire_Load ( volatile const Atomic64 * ptr ) { <nl> + Atomic64 value = * ptr ; <nl> + MemoryBarrier ( ) ; <nl> + return value ; <nl> + } <nl> + <nl> + inline Atomic64 Release_Load ( volatile const Atomic64 * ptr ) { <nl> + MemoryBarrier ( ) ; <nl> + return * ptr ; <nl> + } <nl> + <nl> + } / / namespace internal <nl> + } / / namespace protobuf <nl> + } / / namespace google <nl> + <nl> + # endif / / GOOGLE_PROTOBUF_ATOMICOPS_INTERNALS_ARM64_GCC_H_ <nl>
Add Arm64 AtomicOps (patch from rmcilroy@)
protocolbuffers/protobuf
2ca19bd8066821a56f193e7fca47139b25c617ad
2014-03-26T03:05:53Z
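As a hedged illustration of how the Atomic32 primitives defined in the new arm64 header compose, the sketch below builds a tiny spinlock from Acquire_CompareAndSwap and Release_Store; the SpinLock class and example namespace are hypothetical and not part of protobuf.

#include <google/protobuf/stubs/atomicops.h>

namespace example {

using google::protobuf::internal::Atomic32;
using google::protobuf::internal::Acquire_CompareAndSwap;
using google::protobuf::internal::Release_Store;

// Hypothetical spinlock built on the atomicops API; illustration only.
class SpinLock {
 public:
  SpinLock() : state_(0) {}

  void Lock() {
    // Acquire semantics: accesses after Lock() cannot be reordered before
    // the successful compare-and-swap that takes the lock.
    while (Acquire_CompareAndSwap(&state_, 0, 1) != 0) {
      // Spin until the previous holder releases the lock.
    }
  }

  void Unlock() {
    // Release semantics: accesses before Unlock() cannot be reordered after
    // the store that publishes the unlocked state.
    Release_Store(&state_, 0);
  }

 private:
  volatile Atomic32 state_;
};

}  // namespace example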
mmm a / src / builtins / builtins - ic . cc <nl> ppp b / src / builtins / builtins - ic . cc <nl> namespace internal { <nl> } <nl> <nl> IC_BUILTIN ( LoadIC ) <nl> + IC_BUILTIN ( LoadIC_Noninlined ) <nl> IC_BUILTIN ( LoadIC_Uninitialized ) <nl> IC_BUILTIN ( KeyedLoadIC ) <nl> IC_BUILTIN ( LoadICTrampoline ) <nl> mmm a / src / builtins / builtins . h <nl> ppp b / src / builtins / builtins . h <nl> class Isolate ; <nl> \ <nl> / * ICs * / \ <nl> TFS ( LoadIC , LOAD_IC , kNoExtraICState , LoadWithVector , 1 ) \ <nl> + TFS ( LoadIC_Noninlined , BUILTIN , kNoExtraICState , LoadWithVector , 1 ) \ <nl> TFS ( LoadICTrampoline , LOAD_IC , kNoExtraICState , Load , 1 ) \ <nl> TFS ( KeyedLoadIC , KEYED_LOAD_IC , kNoExtraICState , LoadWithVector , 1 ) \ <nl> TFS ( KeyedLoadICTrampoline , KEYED_LOAD_IC , kNoExtraICState , Load , 1 ) \ <nl> mmm a / src / code - factory . cc <nl> ppp b / src / code - factory . cc <nl> Callable CodeFactory : : LoadICInOptimizedCode ( Isolate * isolate ) { <nl> LoadWithVectorDescriptor ( isolate ) ) ; <nl> } <nl> <nl> + / / static <nl> + Callable CodeFactory : : LoadICInOptimizedCode_Noninlined ( Isolate * isolate ) { <nl> + return Callable ( isolate - > builtins ( ) - > LoadIC_Noninlined ( ) , <nl> + LoadWithVectorDescriptor ( isolate ) ) ; <nl> + } <nl> + <nl> / / static <nl> Callable CodeFactory : : LoadGlobalIC ( Isolate * isolate , TypeofMode typeof_mode ) { <nl> return Callable ( <nl> mmm a / src / code - factory . h <nl> ppp b / src / code - factory . h <nl> class V8_EXPORT_PRIVATE CodeFactory final { <nl> static Callable LoadIC ( Isolate * isolate ) ; <nl> static Callable LoadIC_Uninitialized ( Isolate * isolate ) ; <nl> static Callable LoadICInOptimizedCode ( Isolate * isolate ) ; <nl> + static Callable LoadICInOptimizedCode_Noninlined ( Isolate * isolate ) ; <nl> static Callable LoadICProtoArray ( Isolate * isolate , bool throw_if_nonexistent ) ; <nl> static Callable LoadGlobalIC ( Isolate * isolate , TypeofMode typeof_mode ) ; <nl> static Callable LoadGlobalICInOptimizedCode ( Isolate * isolate , <nl> mmm a / src / code - stub - assembler . cc <nl> ppp b / src / code - stub - assembler . cc <nl> Node * CodeStubAssembler : : IsCallableMap ( Node * map ) { <nl> Int32Constant ( 0 ) ) ; <nl> } <nl> <nl> + Node * CodeStubAssembler : : IsDeprecatedMap ( Node * map ) { <nl> + CSA_ASSERT ( this , IsMap ( map ) ) ; <nl> + return IsSetWord32 < Map : : Deprecated > ( LoadMapBitField3 ( map ) ) ; <nl> + } <nl> + <nl> Node * CodeStubAssembler : : IsCallable ( Node * object ) { <nl> return IsCallableMap ( LoadMap ( object ) ) ; <nl> } <nl> mmm a / src / code - stub - assembler . h <nl> ppp b / src / code - stub - assembler . h <nl> class V8_EXPORT_PRIVATE CodeStubAssembler : public compiler : : CodeAssembler { <nl> Node * IsJSReceiverMap ( Node * map ) ; <nl> Node * IsMap ( Node * object ) ; <nl> Node * IsCallableMap ( Node * map ) ; <nl> + Node * IsDeprecatedMap ( Node * map ) ; <nl> Node * IsCallable ( Node * object ) ; <nl> Node * IsBoolean ( Node * object ) ; <nl> Node * IsHeapNumber ( Node * object ) ; <nl> mmm a / src / ic / accessor - assembler . cc <nl> ppp b / src / ic / accessor - assembler . cc <nl> void AccessorAssembler : : HandlePolymorphicCase ( Node * receiver_map , <nl> Comment ( " HandlePolymorphicCase " ) ; <nl> DCHECK_EQ ( MachineRepresentation : : kTagged , var_handler - > rep ( ) ) ; <nl> <nl> + / / Deferred so the unrolled case can omit frame construction in bytecode <nl> + / / handler . 
<nl> + Label loop ( this , Label : : kDeferred ) ; <nl> + <nl> / / Iterate { feedback } array . <nl> const int kEntrySize = 2 ; <nl> <nl> void AccessorAssembler : : HandlePolymorphicCase ( Node * receiver_map , <nl> <nl> Bind ( & next_entry ) ; <nl> } <nl> + Goto ( & loop ) ; <nl> <nl> / / Loop from { unroll_count } * kEntrySize to { length } . <nl> + Bind ( & loop ) ; <nl> Node * init = IntPtrConstant ( unroll_count * kEntrySize ) ; <nl> Node * length = LoadAndUntagFixedArrayBaseLength ( feedback ) ; <nl> BuildFastLoop ( <nl> void AccessorAssembler : : HandleKeyedStorePolymorphicCase ( <nl> <nl> void AccessorAssembler : : HandleLoadICHandlerCase ( <nl> const LoadICParameters * p , Node * handler , Label * miss , <nl> - ElementSupport support_elements ) { <nl> + ExitPoint * exit_point , ElementSupport support_elements ) { <nl> Comment ( " have_handler " ) ; <nl> - ExitPoint direct_exit ( this ) ; <nl> <nl> Variable var_holder ( this , MachineRepresentation : : kTagged , p - > receiver ) ; <nl> Variable var_smi_handler ( this , MachineRepresentation : : kTagged , handler ) ; <nl> void AccessorAssembler : : HandleLoadICHandlerCase ( <nl> Bind ( & if_smi_handler ) ; <nl> { <nl> HandleLoadICSmiHandlerCase ( p , var_holder . value ( ) , var_smi_handler . value ( ) , <nl> - miss , & direct_exit , support_elements ) ; <nl> + miss , exit_point , support_elements ) ; <nl> } <nl> <nl> Bind ( & try_proto_handler ) ; <nl> { <nl> GotoIf ( IsCodeMap ( LoadMap ( handler ) ) , & call_handler ) ; <nl> HandleLoadICProtoHandlerCase ( p , handler , & var_holder , & var_smi_handler , <nl> - & if_smi_handler , miss , & direct_exit , false ) ; <nl> + & if_smi_handler , miss , exit_point , false ) ; <nl> } <nl> <nl> Bind ( & call_handler ) ; <nl> { <nl> typedef LoadWithVectorDescriptor Descriptor ; <nl> - TailCallStub ( Descriptor ( isolate ( ) ) , handler , p - > context , p - > receiver , <nl> - p - > name , p - > slot , p - > vector ) ; <nl> + exit_point - > ReturnCallStub ( Descriptor ( isolate ( ) ) , handler , p - > context , <nl> + p - > receiver , p - > name , p - > slot , p - > vector ) ; <nl> } <nl> } <nl> <nl> void AccessorAssembler : : HandleLoadICSmiHandlerCase ( <nl> Comment ( " property_load " ) ; <nl> } <nl> <nl> - Label constant ( this , Label : : kDeferred ) , field ( this ) , <nl> - normal ( this , Label : : kDeferred ) ; <nl> + Label constant ( this ) , field ( this ) , normal ( this , Label : : kDeferred ) ; <nl> GotoIf ( WordEqual ( handler_kind , IntPtrConstant ( LoadHandler : : kForFields ) ) , <nl> & field ) ; <nl> <nl> void AccessorAssembler : : HandleStoreICProtoHandler ( const StoreICParameters * p , <nl> Node * transition = var_transition . 
value ( ) ; <nl> Node * handler_word = SmiUntag ( smi_handler ) ; <nl> <nl> - GotoIf ( IsSetWord32 < Map : : Deprecated > ( LoadMapBitField3 ( transition ) ) , miss ) ; <nl> + GotoIf ( IsDeprecatedMap ( transition ) , miss ) ; <nl> <nl> Node * handler_kind = DecodeWord < StoreHandler : : KindBits > ( handler_word ) ; <nl> GotoIf ( WordEqual ( handler_kind , IntPtrConstant ( StoreHandler : : kStoreNormal ) ) , <nl> void AccessorAssembler : : GenericPropertyLoad ( Node * receiver , Node * receiver_map , <nl> const LoadICParameters * p , <nl> Label * slow , <nl> UseStubCache use_stub_cache ) { <nl> + ExitPoint direct_exit ( this ) ; <nl> + <nl> Comment ( " key is unique name " ) ; <nl> Label if_found_on_receiver ( this ) , if_property_dictionary ( this ) , <nl> lookup_prototype_chain ( this ) ; <nl> void AccessorAssembler : : GenericPropertyLoad ( Node * receiver , Node * receiver_map , <nl> TryProbeStubCache ( isolate ( ) - > load_stub_cache ( ) , receiver , key , <nl> & found_handler , & var_handler , & stub_cache_miss ) ; <nl> Bind ( & found_handler ) ; <nl> - { HandleLoadICHandlerCase ( p , var_handler . value ( ) , slow ) ; } <nl> + { HandleLoadICHandlerCase ( p , var_handler . value ( ) , slow , & direct_exit ) ; } <nl> <nl> Bind ( & stub_cache_miss ) ; <nl> { <nl> void AccessorAssembler : : TryProbeStubCache ( StubCache * stub_cache , Node * receiver , <nl> <nl> / / / / / / / / / / / / / / / / / / / / Entry points into private implementation ( one per stub ) . <nl> <nl> + void AccessorAssembler : : LoadIC_BytecodeHandler ( const LoadICParameters * p , <nl> + ExitPoint * exit_point ) { <nl> + / / Must be kept in sync with LoadIC . <nl> + <nl> + / / This function is hand - tuned to omit frame construction for common cases , <nl> + / / e . g . : monomorphic field and constant loads through smi handlers . <nl> + / / Polymorphic ICs with a hit in the first two entries also omit frames . <nl> + / / TODO ( jgruber ) : Frame omission is fragile and can be affected by minor <nl> + / / changes in control flow and logic . We currently have no way of ensuring <nl> + / / that no frame is constructed , so it ' s easy to break this optimization by <nl> + / / accident . <nl> + Label stub_call ( this , Label : : kDeferred ) , miss ( this , Label : : kDeferred ) ; <nl> + <nl> + / / Inlined fast path . <nl> + { <nl> + Comment ( " LoadIC_BytecodeHandler_fast " ) ; <nl> + <nl> + Node * recv_map = LoadReceiverMap ( p - > receiver ) ; <nl> + GotoIf ( IsDeprecatedMap ( recv_map ) , & miss ) ; <nl> + <nl> + Variable var_handler ( this , MachineRepresentation : : kTagged ) ; <nl> + Label try_polymorphic ( this ) , if_handler ( this , & var_handler ) ; <nl> + <nl> + Node * feedback = <nl> + TryMonomorphicCase ( p - > slot , p - > vector , recv_map , & if_handler , <nl> + & var_handler , & try_polymorphic ) ; <nl> + <nl> + Bind ( & if_handler ) ; <nl> + HandleLoadICHandlerCase ( p , var_handler . value ( ) , & miss , exit_point ) ; <nl> + <nl> + Bind ( & try_polymorphic ) ; <nl> + { <nl> + GotoIfNot ( WordEqual ( LoadMap ( feedback ) , FixedArrayMapConstant ( ) ) , <nl> + & stub_call ) ; <nl> + HandlePolymorphicCase ( recv_map , feedback , & if_handler , & var_handler , <nl> + & miss , 2 ) ; <nl> + } <nl> + } <nl> + <nl> + Bind ( & stub_call ) ; <nl> + { <nl> + Comment ( " LoadIC_BytecodeHandler_noninlined " ) ; <nl> + <nl> + / / Call into the stub that implements the non - inlined parts of LoadIC . 
<nl> + Callable ic = CodeFactory : : LoadICInOptimizedCode_Noninlined ( isolate ( ) ) ; <nl> + Node * code_target = HeapConstant ( ic . code ( ) ) ; <nl> + exit_point - > ReturnCallStub ( ic . descriptor ( ) , code_target , p - > context , <nl> + p - > receiver , p - > name , p - > slot , p - > vector ) ; <nl> + } <nl> + <nl> + Bind ( & miss ) ; <nl> + { <nl> + Comment ( " LoadIC_BytecodeHandler_miss " ) ; <nl> + <nl> + exit_point - > ReturnCallRuntime ( Runtime : : kLoadIC_Miss , p - > context , <nl> + p - > receiver , p - > name , p - > slot , p - > vector ) ; <nl> + } <nl> + } <nl> + <nl> void AccessorAssembler : : LoadIC ( const LoadICParameters * p ) { <nl> + / / Must be kept in sync with LoadIC_BytecodeHandler . <nl> + <nl> + ExitPoint direct_exit ( this ) ; <nl> + <nl> Variable var_handler ( this , MachineRepresentation : : kTagged ) ; <nl> - Label if_handler ( this , & var_handler ) , try_polymorphic ( this , Label : : kDeferred ) , <nl> - try_megamorphic ( this , Label : : kDeferred ) , <nl> - try_uninitialized ( this , Label : : kDeferred ) , miss ( this , Label : : kDeferred ) ; <nl> + Label if_handler ( this , & var_handler ) , non_inlined ( this , Label : : kDeferred ) , <nl> + try_polymorphic ( this ) , miss ( this , Label : : kDeferred ) ; <nl> <nl> Node * receiver_map = LoadReceiverMap ( p - > receiver ) ; <nl> - GotoIf ( IsSetWord32 < Map : : Deprecated > ( LoadMapBitField3 ( receiver_map ) ) , & miss ) ; <nl> + GotoIf ( IsDeprecatedMap ( receiver_map ) , & miss ) ; <nl> <nl> / / Check monomorphic case . <nl> Node * feedback = <nl> TryMonomorphicCase ( p - > slot , p - > vector , receiver_map , & if_handler , <nl> & var_handler , & try_polymorphic ) ; <nl> Bind ( & if_handler ) ; <nl> - { HandleLoadICHandlerCase ( p , var_handler . value ( ) , & miss ) ; } <nl> + HandleLoadICHandlerCase ( p , var_handler . value ( ) , & miss , & direct_exit ) ; <nl> <nl> Bind ( & try_polymorphic ) ; <nl> { <nl> / / Check polymorphic case . <nl> Comment ( " LoadIC_try_polymorphic " ) ; <nl> GotoIfNot ( WordEqual ( LoadMap ( feedback ) , FixedArrayMapConstant ( ) ) , <nl> - & try_megamorphic ) ; <nl> + & non_inlined ) ; <nl> HandlePolymorphicCase ( receiver_map , feedback , & if_handler , & var_handler , <nl> & miss , 2 ) ; <nl> } <nl> <nl> - Bind ( & try_megamorphic ) ; <nl> + Bind ( & non_inlined ) ; <nl> + LoadIC_Noninlined ( p , receiver_map , feedback , & var_handler , & if_handler , & miss , <nl> + & direct_exit ) ; <nl> + <nl> + Bind ( & miss ) ; <nl> + direct_exit . ReturnCallRuntime ( Runtime : : kLoadIC_Miss , p - > context , p - > receiver , <nl> + p - > name , p - > slot , p - > vector ) ; <nl> + } <nl> + <nl> + void AccessorAssembler : : LoadIC_Noninlined ( const LoadICParameters * p , <nl> + Node * receiver_map , Node * feedback , <nl> + Variable * var_handler , <nl> + Label * if_handler , Label * miss , <nl> + ExitPoint * exit_point ) { <nl> + Label try_uninitialized ( this , Label : : kDeferred ) ; <nl> + <nl> + / / Neither deprecated map nor monomorphic . These cases are handled in the <nl> + / / bytecode handler . <nl> + CSA_ASSERT ( this , Word32BinaryNot ( IsDeprecatedMap ( receiver_map ) ) ) ; <nl> + CSA_ASSERT ( this , <nl> + WordNotEqual ( receiver_map , LoadWeakCellValueUnchecked ( feedback ) ) ) ; <nl> + CSA_ASSERT ( this , WordNotEqual ( LoadMap ( feedback ) , FixedArrayMapConstant ( ) ) ) ; <nl> + DCHECK_EQ ( MachineRepresentation : : kTagged , var_handler - > rep ( ) ) ; <nl> + <nl> { <nl> / / Check megamorphic case . 
<nl> GotoIfNot ( WordEqual ( feedback , LoadRoot ( Heap : : kmegamorphic_symbolRootIndex ) ) , <nl> & try_uninitialized ) ; <nl> <nl> TryProbeStubCache ( isolate ( ) - > load_stub_cache ( ) , p - > receiver , p - > name , <nl> - & if_handler , & var_handler , & miss ) ; <nl> + if_handler , var_handler , miss ) ; <nl> } <nl> + <nl> Bind ( & try_uninitialized ) ; <nl> { <nl> / / Check uninitialized case . <nl> GotoIfNot ( <nl> WordEqual ( feedback , LoadRoot ( Heap : : kuninitialized_symbolRootIndex ) ) , <nl> - & miss ) ; <nl> - TailCallStub ( CodeFactory : : LoadIC_Uninitialized ( isolate ( ) ) , p - > context , <nl> - p - > receiver , p - > name , p - > slot , p - > vector ) ; <nl> - } <nl> - Bind ( & miss ) ; <nl> - { <nl> - TailCallRuntime ( Runtime : : kLoadIC_Miss , p - > context , p - > receiver , p - > name , <nl> - p - > slot , p - > vector ) ; <nl> + miss ) ; <nl> + exit_point - > ReturnCallStub ( CodeFactory : : LoadIC_Uninitialized ( isolate ( ) ) , <nl> + p - > context , p - > receiver , p - > name , p - > slot , <nl> + p - > vector ) ; <nl> } <nl> } <nl> <nl> void AccessorAssembler : : LoadGlobalIC_MissCase ( const LoadICParameters * p , <nl> <nl> void AccessorAssembler : : LoadGlobalIC ( const LoadICParameters * p , <nl> TypeofMode typeof_mode ) { <nl> + / / Must be kept in sync with Interpreter : : BuildLoadGlobal . <nl> + <nl> ExitPoint direct_exit ( this ) ; <nl> <nl> Label try_handler ( this ) , miss ( this ) ; <nl> void AccessorAssembler : : LoadGlobalIC ( const LoadICParameters * p , <nl> } <nl> <nl> void AccessorAssembler : : KeyedLoadIC ( const LoadICParameters * p ) { <nl> + ExitPoint direct_exit ( this ) ; <nl> + <nl> Variable var_handler ( this , MachineRepresentation : : kTagged ) ; <nl> Label if_handler ( this , & var_handler ) , try_polymorphic ( this , Label : : kDeferred ) , <nl> try_megamorphic ( this , Label : : kDeferred ) , <nl> void AccessorAssembler : : KeyedLoadIC ( const LoadICParameters * p ) { <nl> miss ( this , Label : : kDeferred ) ; <nl> <nl> Node * receiver_map = LoadReceiverMap ( p - > receiver ) ; <nl> - GotoIf ( IsSetWord32 < Map : : Deprecated > ( LoadMapBitField3 ( receiver_map ) ) , & miss ) ; <nl> + GotoIf ( IsDeprecatedMap ( receiver_map ) , & miss ) ; <nl> <nl> / / Check monomorphic case . <nl> Node * feedback = <nl> TryMonomorphicCase ( p - > slot , p - > vector , receiver_map , & if_handler , <nl> & var_handler , & try_polymorphic ) ; <nl> Bind ( & if_handler ) ; <nl> - { HandleLoadICHandlerCase ( p , var_handler . value ( ) , & miss , kSupportElements ) ; } <nl> + { <nl> + HandleLoadICHandlerCase ( p , var_handler . value ( ) , & miss , & direct_exit , <nl> + kSupportElements ) ; <nl> + } <nl> <nl> Bind ( & try_polymorphic ) ; <nl> { <nl> void AccessorAssembler : : KeyedLoadIC ( const LoadICParameters * p ) { <nl> GotoIfNot ( WordEqual ( feedback , p - > name ) , & miss ) ; <nl> / / If the name comparison succeeded , we know we have a fixed array with <nl> / / at least one map / handler pair . 
<nl> - Node * offset = ElementOffsetFromIndex ( <nl> - p - > slot , FAST_HOLEY_ELEMENTS , SMI_PARAMETERS , <nl> - FixedArray : : kHeaderSize + kPointerSize - kHeapObjectTag ) ; <nl> - Node * array = Load ( MachineType : : AnyTagged ( ) , p - > vector , offset ) ; <nl> + Node * array = <nl> + LoadFixedArrayElement ( p - > vector , p - > slot , kPointerSize , SMI_PARAMETERS ) ; <nl> HandlePolymorphicCase ( receiver_map , array , & if_handler , & var_handler , & miss , <nl> 1 ) ; <nl> } <nl> void AccessorAssembler : : StoreIC ( const StoreICParameters * p ) { <nl> try_megamorphic ( this , Label : : kDeferred ) , miss ( this , Label : : kDeferred ) ; <nl> <nl> Node * receiver_map = LoadReceiverMap ( p - > receiver ) ; <nl> - GotoIf ( IsSetWord32 < Map : : Deprecated > ( LoadMapBitField3 ( receiver_map ) ) , & miss ) ; <nl> + GotoIf ( IsDeprecatedMap ( receiver_map ) , & miss ) ; <nl> <nl> / / Check monomorphic case . <nl> Node * feedback = <nl> void AccessorAssembler : : KeyedStoreIC ( const StoreICParameters * p , <nl> try_polymorphic_name ( this , Label : : kDeferred ) ; <nl> <nl> Node * receiver_map = LoadReceiverMap ( p - > receiver ) ; <nl> - GotoIf ( IsSetWord32 < Map : : Deprecated > ( LoadMapBitField3 ( receiver_map ) ) , & miss ) ; <nl> + GotoIf ( IsDeprecatedMap ( receiver_map ) , & miss ) ; <nl> <nl> / / Check monomorphic case . <nl> Node * feedback = <nl> void AccessorAssembler : : KeyedStoreIC ( const StoreICParameters * p , <nl> GotoIfNot ( WordEqual ( feedback , p - > name ) , & miss ) ; <nl> / / If the name comparison succeeded , we know we have a FixedArray with <nl> / / at least one map / handler pair . <nl> - Node * offset = ElementOffsetFromIndex ( <nl> - p - > slot , FAST_HOLEY_ELEMENTS , SMI_PARAMETERS , <nl> - FixedArray : : kHeaderSize + kPointerSize - kHeapObjectTag ) ; <nl> - Node * array = Load ( MachineType : : AnyTagged ( ) , p - > vector , offset ) ; <nl> + Node * array = LoadFixedArrayElement ( p - > vector , p - > slot , kPointerSize , <nl> + SMI_PARAMETERS ) ; <nl> HandlePolymorphicCase ( receiver_map , array , & if_handler , & var_handler , <nl> & miss , 1 ) ; <nl> } <nl> void AccessorAssembler : : GenerateLoadIC ( ) { <nl> LoadIC ( & p ) ; <nl> } <nl> <nl> + void AccessorAssembler : : GenerateLoadIC_Noninlined ( ) { <nl> + typedef LoadWithVectorDescriptor Descriptor ; <nl> + <nl> + Node * receiver = Parameter ( Descriptor : : kReceiver ) ; <nl> + Node * name = Parameter ( Descriptor : : kName ) ; <nl> + Node * slot = Parameter ( Descriptor : : kSlot ) ; <nl> + Node * vector = Parameter ( Descriptor : : kVector ) ; <nl> + Node * context = Parameter ( Descriptor : : kContext ) ; <nl> + <nl> + ExitPoint direct_exit ( this ) ; <nl> + Variable var_handler ( this , MachineRepresentation : : kTagged ) ; <nl> + Label if_handler ( this , & var_handler ) , miss ( this , Label : : kDeferred ) ; <nl> + <nl> + Node * receiver_map = LoadReceiverMap ( receiver ) ; <nl> + Node * feedback = LoadFixedArrayElement ( vector , slot , 0 , SMI_PARAMETERS ) ; <nl> + <nl> + LoadICParameters p ( context , receiver , name , slot , vector ) ; <nl> + LoadIC_Noninlined ( & p , receiver_map , feedback , & var_handler , & if_handler , <nl> + & miss , & direct_exit ) ; <nl> + <nl> + Bind ( & if_handler ) ; <nl> + HandleLoadICHandlerCase ( & p , var_handler . value ( ) , & miss , & direct_exit ) ; <nl> + <nl> + Bind ( & miss ) ; <nl> + direct_exit . 
ReturnCallRuntime ( Runtime : : kLoadIC_Miss , context , receiver , name , <nl> + slot , vector ) ; <nl> + } <nl> + <nl> void AccessorAssembler : : GenerateLoadIC_Uninitialized ( ) { <nl> typedef LoadWithVectorDescriptor Descriptor ; <nl> <nl> mmm a / src / ic / accessor - assembler . h <nl> ppp b / src / ic / accessor - assembler . h <nl> class AccessorAssembler : public CodeStubAssembler { <nl> : CodeStubAssembler ( state ) { } <nl> <nl> void GenerateLoadIC ( ) ; <nl> + void GenerateLoadIC_Noninlined ( ) ; <nl> void GenerateLoadIC_Uninitialized ( ) ; <nl> void GenerateLoadField ( ) ; <nl> void GenerateLoadICTrampoline ( ) ; <nl> class AccessorAssembler : public CodeStubAssembler { <nl> ExitPoint * exit_point , Label * miss ) ; <nl> void LoadGlobalIC_MissCase ( const LoadICParameters * p , ExitPoint * exit_point ) ; <nl> <nl> + / / Specialized LoadIC for inlined bytecode handler , hand - tuned to omit frame <nl> + / / construction on common paths . <nl> + void LoadIC_BytecodeHandler ( const LoadICParameters * p , ExitPoint * exit_point ) ; <nl> + <nl> protected : <nl> struct StoreICParameters : public LoadICParameters { <nl> StoreICParameters ( Node * context , Node * receiver , Node * name , Node * value , <nl> class AccessorAssembler : public CodeStubAssembler { <nl> private : <nl> / / Stub generation entry points . <nl> <nl> + / / LoadIC contains the full LoadIC logic , while LoadIC_Noninlined contains <nl> + / / logic not inlined into Ignition bytecode handlers . <nl> void LoadIC ( const LoadICParameters * p ) ; <nl> + void LoadIC_Noninlined ( const LoadICParameters * p , Node * receiver_map , <nl> + Node * feedback , Variable * var_handler , <nl> + Label * if_handler , Label * miss , ExitPoint * exit_point ) ; <nl> + <nl> void LoadIC_Uninitialized ( const LoadICParameters * p ) ; <nl> void LoadICProtoArray ( const LoadICParameters * p , Node * handler , <nl> bool throw_reference_error_if_nonexistent ) ; <nl> class AccessorAssembler : public CodeStubAssembler { <nl> <nl> void HandleLoadICHandlerCase ( <nl> const LoadICParameters * p , Node * handler , Label * miss , <nl> - ElementSupport support_elements = kOnlyProperties ) ; <nl> + ExitPoint * exit_point , ElementSupport support_elements = kOnlyProperties ) ; <nl> <nl> void HandleLoadICSmiHandlerCase ( const LoadICParameters * p , Node * holder , <nl> Node * smi_handler , Label * miss , <nl> mmm a / src / interpreter / interpreter . cc <nl> ppp b / src / interpreter / interpreter . cc <nl> void Interpreter : : DoMov ( InterpreterAssembler * assembler ) { <nl> __ Dispatch ( ) ; <nl> } <nl> <nl> - void Interpreter : : BuildLoadGlobal ( int slot_operand_index , <nl> - int name_operand_index , <nl> - TypeofMode typeof_mode , <nl> - InterpreterAssembler * assembler ) { <nl> + void Interpreter : : BuildLoadGlobalIC ( int slot_operand_index , <nl> + int name_operand_index , <nl> + TypeofMode typeof_mode , <nl> + InterpreterAssembler * assembler ) { <nl> + / / Must be kept in sync with AccessorAssembler : : LoadGlobalIC . <nl> + <nl> / / Load the global via the LoadGlobalIC . 
<nl> Node * feedback_vector = __ LoadFeedbackVector ( ) ; <nl> Node * feedback_slot = __ BytecodeOperandIdx ( slot_operand_index ) ; <nl> void Interpreter : : DoLdaGlobal ( InterpreterAssembler * assembler ) { <nl> static const int kNameOperandIndex = 0 ; <nl> static const int kSlotOperandIndex = 1 ; <nl> <nl> - BuildLoadGlobal ( kSlotOperandIndex , kNameOperandIndex , NOT_INSIDE_TYPEOF , <nl> - assembler ) ; <nl> + BuildLoadGlobalIC ( kSlotOperandIndex , kNameOperandIndex , NOT_INSIDE_TYPEOF , <nl> + assembler ) ; <nl> } <nl> <nl> / / LdaGlobalInsideTypeof < name_index > < slot > <nl> void Interpreter : : DoLdaGlobalInsideTypeof ( InterpreterAssembler * assembler ) { <nl> static const int kNameOperandIndex = 0 ; <nl> static const int kSlotOperandIndex = 1 ; <nl> <nl> - BuildLoadGlobal ( kSlotOperandIndex , kNameOperandIndex , INSIDE_TYPEOF , <nl> - assembler ) ; <nl> + BuildLoadGlobalIC ( kSlotOperandIndex , kNameOperandIndex , INSIDE_TYPEOF , <nl> + assembler ) ; <nl> } <nl> <nl> void Interpreter : : DoStaGlobal ( Callable ic , InterpreterAssembler * assembler ) { <nl> void Interpreter : : DoLdaLookupGlobalSlot ( Runtime : : FunctionId function_id , <nl> ? INSIDE_TYPEOF <nl> : NOT_INSIDE_TYPEOF ; <nl> <nl> - BuildLoadGlobal ( kSlotOperandIndex , kNameOperandIndex , typeof_mode , <nl> - assembler ) ; <nl> + BuildLoadGlobalIC ( kSlotOperandIndex , kNameOperandIndex , typeof_mode , <nl> + assembler ) ; <nl> } <nl> <nl> / / Slow path when we have to call out to the runtime <nl> void Interpreter : : DoStaLookupSlotStrict ( InterpreterAssembler * assembler ) { <nl> DoStaLookupSlot ( LanguageMode : : STRICT , assembler ) ; <nl> } <nl> <nl> + void Interpreter : : BuildLoadIC ( int recv_operand_index , int slot_operand_index , <nl> + int name_operand_index , <nl> + InterpreterAssembler * assembler ) { <nl> + __ Comment ( " BuildLoadIC " ) ; <nl> + <nl> + / / Load vector and slot . <nl> + Node * feedback_vector = __ LoadFeedbackVector ( ) ; <nl> + Node * feedback_slot = __ BytecodeOperandIdx ( slot_operand_index ) ; <nl> + Node * smi_slot = __ SmiTag ( feedback_slot ) ; <nl> + <nl> + / / Load receiver . <nl> + Node * register_index = __ BytecodeOperandReg ( recv_operand_index ) ; <nl> + Node * recv = __ LoadRegister ( register_index ) ; <nl> + <nl> + / / Load the name . <nl> + / / TODO ( jgruber ) : Not needed for monomorphic smi handler constant / field case . <nl> + Node * constant_index = __ BytecodeOperandIdx ( name_operand_index ) ; <nl> + Node * name = __ LoadConstantPoolEntry ( constant_index ) ; <nl> + <nl> + Node * context = __ GetContext ( ) ; <nl> + <nl> + Label done ( assembler ) ; <nl> + Variable var_result ( assembler , MachineRepresentation : : kTagged ) ; <nl> + ExitPoint exit_point ( assembler , & done , & var_result ) ; <nl> + <nl> + AccessorAssembler : : LoadICParameters params ( context , recv , name , smi_slot , <nl> + feedback_vector ) ; <nl> + AccessorAssembler accessor_asm ( assembler - > state ( ) ) ; <nl> + accessor_asm . LoadIC_BytecodeHandler ( & params , & exit_point ) ; <nl> + <nl> + __ Bind ( & done ) ; <nl> + { <nl> + __ SetAccumulator ( var_result . value ( ) ) ; <nl> + __ Dispatch ( ) ; <nl> + } <nl> + } <nl> + <nl> / / LdaNamedProperty < object > < name_index > < slot > <nl> / / <nl> / / Calls the LoadIC at FeedBackVector slot < slot > for < object > and the name at <nl> / / constant pool entry < name_index > . 
<nl> void Interpreter : : DoLdaNamedProperty ( InterpreterAssembler * assembler ) { <nl> - Callable ic = CodeFactory : : LoadICInOptimizedCode ( isolate_ ) ; <nl> - Node * code_target = __ HeapConstant ( ic . code ( ) ) ; <nl> - Node * register_index = __ BytecodeOperandReg ( 0 ) ; <nl> - Node * object = __ LoadRegister ( register_index ) ; <nl> - Node * constant_index = __ BytecodeOperandIdx ( 1 ) ; <nl> - Node * name = __ LoadConstantPoolEntry ( constant_index ) ; <nl> - Node * raw_slot = __ BytecodeOperandIdx ( 2 ) ; <nl> - Node * smi_slot = __ SmiTag ( raw_slot ) ; <nl> - Node * feedback_vector = __ LoadFeedbackVector ( ) ; <nl> - Node * context = __ GetContext ( ) ; <nl> - Node * result = __ CallStub ( ic . descriptor ( ) , code_target , context , object , <nl> - name , smi_slot , feedback_vector ) ; <nl> - __ SetAccumulator ( result ) ; <nl> - __ Dispatch ( ) ; <nl> + static const int kRecvOperandIndex = 0 ; <nl> + static const int kNameOperandIndex = 1 ; <nl> + static const int kSlotOperandIndex = 2 ; <nl> + <nl> + BuildLoadIC ( kRecvOperandIndex , kSlotOperandIndex , kNameOperandIndex , <nl> + assembler ) ; <nl> } <nl> <nl> / / KeyedLoadIC < object > < slot > <nl> mmm a / src / interpreter / interpreter . h <nl> ppp b / src / interpreter / interpreter . h <nl> class Interpreter { <nl> void DoStaLookupSlot ( LanguageMode language_mode , <nl> InterpreterAssembler * assembler ) ; <nl> <nl> - / / Generates code to load a global . <nl> - void BuildLoadGlobal ( int slot_operand_index , int name_operand_index , <nl> - TypeofMode typeof_mode , InterpreterAssembler * assembler ) ; <nl> + / / Generates code to load a global property . <nl> + void BuildLoadGlobalIC ( int slot_operand_index , int name_operand_index , <nl> + TypeofMode typeof_mode , <nl> + InterpreterAssembler * assembler ) ; <nl> + <nl> + / / Generates code to load a property . <nl> + void BuildLoadIC ( int recv_operand_index , int slot_operand_index , <nl> + int name_operand_index , InterpreterAssembler * assembler ) ; <nl> <nl> / / Generates code to prepare the result for ForInPrepare . Cache data <nl> / / are placed into the consecutive series of registers starting at <nl>
[ic] Inline LoadIC into LdaNamedProperty bytecode handler
v8/v8
0bfabaf17484579ff0adc33085ac232b4168636e
2017-03-07T10:21:33Z
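The commit above moves the monomorphic fast path of LoadIC directly into the LdaNamedProperty bytecode handler (LoadIC_BytecodeHandler), leaving everything else to the shared LoadIC_Noninlined stub and, ultimately, the runtime miss handler. Below is a minimal standalone sketch of that control-flow shape, not V8's CodeStubAssembler code; Object, FeedbackSlot, LoadSlow and LoadNamedProperty are hypothetical names used only for illustration.

    #include <cstdio>
    #include <string>
    #include <unordered_map>

    // Toy object model: every object has a "map" (hidden class) id and named fields.
    struct Object {
        int map_id;
        std::unordered_map<std::string, int> fields;
    };

    // Per-call-site feedback: remembers the last seen map and property name.
    struct FeedbackSlot {
        int cached_map_id = -1;   // -1 means uninitialized
        std::string cached_name;
    };

    // Slow path, analogous to the non-inlined stub / runtime miss:
    // does a full lookup and (re)fills the feedback slot.
    static int LoadSlow(const Object& obj, const std::string& name, FeedbackSlot& slot) {
        slot.cached_map_id = obj.map_id;
        slot.cached_name = name;
        auto it = obj.fields.find(name);
        return it != obj.fields.end() ? it->second : 0;
    }

    // Fast path "inlined into the bytecode handler": a single map check,
    // mirroring the monomorphic case handled directly in LoadIC_BytecodeHandler.
    static int LoadNamedProperty(const Object& obj, const std::string& name, FeedbackSlot& slot) {
        if (obj.map_id == slot.cached_map_id && name == slot.cached_name) {
            return obj.fields.at(name);   // monomorphic hit, no call to the shared stub
        }
        return LoadSlow(obj, name, slot); // deferred path: polymorphic/megamorphic/miss
    }

    int main() {
        Object o{/*map_id=*/7, {{"x", 42}}};
        FeedbackSlot slot;
        printf("%d\n", LoadNamedProperty(o, "x", slot));  // first call misses and fills feedback
        printf("%d\n", LoadNamedProperty(o, "x", slot));  // second call takes the inline fast path
        return 0;
    }

The point of the change is exactly the shape of LoadNamedProperty above: the common monomorphic hit never leaves the handler, while the rarer cases stay in shared, non-inlined code.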
mmm a / bson / util / builder . h <nl> ppp b / bson / util / builder . h <nl> namespace mongo { <nl> update $ push ( append ) operation <nl> various db . eval ( ) type operations <nl> * / <nl> - const int BSONObjMaxUserSize = 4 * 1024 * 1024 ; <nl> + const int BSONObjMaxUserSize = 8 * 1024 * 1024 ; <nl> <nl> / * <nl> Sometimeswe we need objects slightly larger - an object in the replication local . oplog <nl>
increase bson size to 8mb SERVER-1918 SERVER-431
mongodb/mongo
b357c3ea89ef9374dd775c326f75d404bebe7f68
2010-10-12T01:56:16Z
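The change above is a single constant bump: the user-visible BSON object limit goes from 4 MB to 8 MB. A hedged sketch of how such a cap is typically enforced in a buffer builder follows; SimpleBufBuilder and kMaxUserSize are illustrative stand-ins, not MongoDB's actual builder API.

    #include <cstddef>
    #include <stdexcept>
    #include <vector>

    // Illustrative limit mirroring BSONObjMaxUserSize after the bump: 8 MB.
    constexpr int kMaxUserSize = 8 * 1024 * 1024;

    class SimpleBufBuilder {
    public:
        // Append raw bytes, refusing to grow past the user-visible object limit.
        void appendBuf(const void* src, std::size_t len) {
            if (buf_.size() + len > static_cast<std::size_t>(kMaxUserSize)) {
                throw std::length_error("object would exceed the maximum user BSON size");
            }
            const char* p = static_cast<const char*>(src);
            buf_.insert(buf_.end(), p, p + len);
        }
        std::size_t len() const { return buf_.size(); }

    private:
        std::vector<char> buf_;
    };

    int main() {
        SimpleBufBuilder b;
        char chunk[1024] = {};
        for (int i = 0; i < 8; ++i) b.appendBuf(chunk, sizeof(chunk));  // well under the cap
        return b.len() == 8 * 1024 ? 0 : 1;
    }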
mmm a / cocos2dx / include / NSMutableDictionary . h <nl> ppp b / cocos2dx / include / NSMutableDictionary . h <nl> class NSMutableDictionary : public NSObject <nl> std : : vector < std : : string > allKeys ( ) <nl> { <nl> std : : vector < std : : string > tRet ; <nl> - NSObjectMapIter it ; <nl> - for ( it = m_Map . begin ( ) ; it ! = m_Map . end ( ) ; + + it ) <nl> + if ( m_Map . size ( ) > 0 ) <nl> { <nl> - tRet . push_back ( it - > first ) ; <nl> + NSObjectMapIter it ; <nl> + for ( it = m_Map . begin ( ) ; it ! = m_Map . end ( ) ; + + it ) <nl> + { <nl> + tRet . push_back ( it - > first ) ; <nl> + } <nl> } <nl> return tRet ; <nl> } <nl> class NSMutableDictionary : public NSObject <nl> std : : vector < std : : string > allKeysForObject ( _ValueT object ) <nl> { <nl> std : : vector < std : : string > tRet ; <nl> - NSObjectMapIter it ; <nl> - for ( it = m_Map . begin ( ) ; it ! = m_Map . end ( ) ; + + it ) <nl> + if ( m_Map . size ( ) > 0 ) <nl> { <nl> - if ( it - > second = = object ) <nl> + NSObjectMapIter it ; <nl> + for ( it = m_Map . begin ( ) ; it ! = m_Map . end ( ) ; + + it ) <nl> { <nl> - tRet . push_back ( it - > first ) ; <nl> + if ( it - > second = = object ) <nl> + { <nl> + tRet . push_back ( it - > first ) ; <nl> + } <nl> } <nl> } <nl> return tRet ; <nl> class NSMutableDictionary : public NSObject <nl> <nl> void removeAllObjects ( ) <nl> { <nl> - NSObjectMapIter it ; <nl> - for ( it = m_Map . begin ( ) ; it ! = m_Map . end ( ) ; + + it ) <nl> + if ( m_Map . size ( ) > 0 ) <nl> { <nl> - it - > second - > release ( ) ; <nl> + NSObjectMapIter it ; <nl> + for ( it = m_Map . begin ( ) ; it ! = m_Map . end ( ) ; + + it ) <nl> + { <nl> + if ( it - > second ) <nl> + { <nl> + it - > second - > release ( ) ; <nl> + } <nl> + } <nl> } <nl> - <nl> m_Map . clear ( ) ; <nl> } <nl> <nl> mmm a / cocos2dx / label_nodes / CCBitmapFontAtlas . cpp <nl> ppp b / cocos2dx / label_nodes / CCBitmapFontAtlas . cpp <nl> namespace cocos2d { <nl> } <nl> std : : string key ( fntFile ) ; <nl> pRet = configurations - > objectForKey ( key ) ; <nl> - if ( pRet = = NULL ) { <nl> + if ( pRet = = NULL ) <nl> + { <nl> pRet = CCBitmapFontConfiguration : : configurationWithFNTFile ( fntFile ) ; <nl> configurations - > setObject ( pRet , key ) ; <nl> } <nl> namespace cocos2d { <nl> <nl> void FNTConfigRemoveCache ( void ) <nl> { <nl> - configurations - > removeAllObjects ( ) ; <nl> + if ( configurations ) <nl> + { <nl> + configurations - > removeAllObjects ( ) ; <nl> + } <nl> } <nl> / / <nl> / / Hash Element <nl>
issue, debug: FNTConfigRemoveCache
cocos2d/cocos2d-x
acef33a5911657892217b28f7dad8a8f2ef28591
2010-08-18T04:03:31Z
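The fix above is purely defensive: skip iteration when the map is empty, null-check stored values before calling release(), and null-check the global configurations cache before clearing it in FNTConfigRemoveCache. A minimal standalone sketch of the same pattern, using hypothetical RefCounted and SimpleDictionary types rather than the real cocos2d classes:

    #include <map>
    #include <string>

    // Hypothetical stand-in for cocos2d's NSObject-style reference counting.
    struct RefCounted {
        int refs = 1;
        void release() { if (--refs == 0) delete this; }
    };

    struct SimpleDictionary {
        std::map<std::string, RefCounted*> entries;

        // Mirrors the patched removeAllObjects(): guard the empty case and
        // only release non-null values before clearing.
        void removeAllObjects() {
            if (!entries.empty()) {
                for (auto& kv : entries) {
                    if (kv.second) kv.second->release();
                }
            }
            entries.clear();
        }
    };

    // Mirrors the patched FNTConfigRemoveCache(): the global cache may not have
    // been created yet, so check before dereferencing it.
    static SimpleDictionary* g_configurations = nullptr;

    void ConfigRemoveCache() {
        if (g_configurations) {
            g_configurations->removeAllObjects();
        }
    }

    int main() {
        ConfigRemoveCache();                 // safe even though the cache is still null
        g_configurations = new SimpleDictionary();
        g_configurations->entries["font.fnt"] = new RefCounted();
        ConfigRemoveCache();                 // releases the entry, leaves an empty cache
        delete g_configurations;
        return 0;
    }

With a standard std::map the size() > 0 guard is not strictly required, since iterating an empty map is well defined; it is kept here only to mirror the shape of the patch.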
mmm a / Examples / SequenceToSequence / CMUDict / Python / Sequence2Sequence . py <nl> ppp b / Examples / SequenceToSequence / CMUDict / Python / Sequence2Sequence . py <nl> def create_reader ( path , is_training ) : <nl> inputAxis = Axis ( ' inputAxis ' ) <nl> labelAxis = Axis ( ' labelAxis ' ) <nl> <nl> - def testit ( r , with_labels = True ) : <nl> - # from cntk . blocks import Constant , Type <nl> - if True : <nl> - # try : <nl> - r . dump ( ) <nl> - if with_labels : <nl> - r . update_signature ( Type ( 3 , dynamic_axes = [ Axis . default_batch_axis ( ) , inputAxis ] ) , <nl> - Type ( 3 , dynamic_axes = [ Axis . default_batch_axis ( ) , labelAxis ] ) ) <nl> - else : <nl> - r . update_signature ( Type ( 3 , dynamic_axes = [ Axis . default_batch_axis ( ) , inputAxis ] ) ) <nl> - r . dump ( ) <nl> - if with_labels : <nl> - res = r . eval ( { r . arguments [ 0 ] : [ [ [ 0 . 9 , 0 . 7 , 0 . 8 ] ] ] , r . arguments [ 1 ] : [ [ [ 0 , 1 , 0 ] ] ] } ) <nl> - else : <nl> - res = r . eval ( { r . arguments [ 0 ] : [ [ [ 0 . 9 , 0 . 7 , 0 . 8 ] ] ] } ) <nl> - print ( res ) <nl> - # except Exception as e : <nl> - print ( e ) <nl> - r . dump ( ) # maybe some updates were already made ? <nl> - pass <nl> - # input ( " hit enter " ) <nl> - exit ( ) <nl> - <nl> # create the s2s model <nl> def create_model ( ) : # : : ( history * , input * ) - > logP ( w ) * <nl> # Embedding : ( input * ) - - > ( embedded_input * ) <nl> def criterion ( input , labels ) : <nl> <nl> def train ( train_reader , valid_reader , vocab , i2w , s2smodel , max_epochs , epoch_size ) : <nl> <nl> - # this is what we train here <nl> - # s2smodel . update_signature ( Type ( input_vocab_dim , dynamic_axes = [ Axis . default_batch_axis ( ) , inputAxis ] ) , <nl> - # Type ( label_vocab_dim , dynamic_axes = [ Axis . default_batch_axis ( ) , Axis ( ' labelAxis1 ' ) ] ) ) <nl> - # BUGBUG : fails with " Currently if an operand of a elementwise operation has any dynamic axes , those must match the dynamic axes of the other operands " <nl> - # Maybe also attributable to a parameter - order mix - up ? <nl> - # Because criterion strips the < s > and therefore s2smodel has a different axis <nl> - # Need to think whether this makes sense . The axes are different for training and testing . <nl> + # Note : We would like to set the signature of ' s2smodel ' ( s2smodel . update_signature ( ) ) , but that will cause <nl> + # an error since the training criterion uses a reduced sequence axis for the labels . <nl> + # This is because it removes the initial < s > symbol . Hence , we must leave the model <nl> + # with unspecified input shapes and axes . <nl> <nl> - # TODO : test this refactoring <nl> + # create the training wrapper for the s2smodel , as well as the criterion function <nl> model_train = create_model_train ( s2smodel ) <nl> criterion = create_criterion_function ( model_train ) <nl> + <nl> + # also wire in a greedy decoder so that we can properly log progress on a validation example <nl> + # This is not used for the actual training process . <nl> model_greedy = create_model_greedy ( s2smodel ) <nl> <nl> - # for this model during training we wire in a greedy decoder so that we can properly sample the validation data <nl> # This does not need to be done in training generally though <nl> # Instantiate the trainer object to drive the model training <nl> minibatch_size = 72 <nl> learner = adam_sgd ( model_train . parameters , <nl> - # learner = momentum_sgd ( model_train . parameters , low_memory = True , <nl> - lr = learning_rate_schedule ( [ 0 . 
005 ] * 2 + [ 0 . 0025 ] * 3 + [ 0 . 00125 ] , UnitType . sample , epoch_size ) , <nl> - momentum = momentum_as_time_constant_schedule ( 1100 ) , <nl> - gradient_clipping_threshold_per_sample = 2 . 3 , <nl> - gradient_clipping_with_truncation = True ) <nl> + lr = learning_rate_schedule ( [ 0 . 005 ] * 2 + [ 0 . 0025 ] * 3 + [ 0 . 00125 ] , UnitType . sample , epoch_size ) , <nl> + momentum = momentum_as_time_constant_schedule ( 1100 ) , <nl> + gradient_clipping_threshold_per_sample = 2 . 3 , <nl> + gradient_clipping_with_truncation = True ) <nl> trainer = Trainer ( None , criterion , learner ) <nl> <nl> # Get minibatches of sequences to train with and perform model training <nl> def train ( train_reader , valid_reader , vocab , i2w , s2smodel , max_epochs , epoch_si <nl> <nl> # print out some useful training information <nl> log_number_of_parameters ( model_train ) ; print ( ) <nl> - # progress_printer = ProgressPrinter ( freq = 30 , tag = ' Training ' ) <nl> progress_printer = ProgressPrinter ( freq = 30 , tag = ' Training ' ) <nl> # progress_printer = ProgressPrinter ( freq = 30 , tag = ' Training ' , log_to_file = os . path . join ( MODEL_DIR , " model_att % d . log " % use_attention ) ) <nl> <nl> def no_op ( input ) : <nl> # log a summary of the stats for the epoch <nl> progress_printer . epoch_summary ( with_metric = True ) <nl> <nl> - # save the model every epoch <nl> - <nl> - # NOTE : we are saving the model with the greedy decoder wired - in . This is NOT necessary and in some <nl> - # cases it would be better to save the model without the decoder to make it easier to wire - in a <nl> - # different decoder such as a beam search decoder . For now we save this one though so it ' s easy to <nl> - # load up and start using . <nl> + # done : save the final model <nl> print ( " Saving final model to ' % s ' " % model_path ( max_epochs ) ) <nl> s2smodel . save ( model_path ( max_epochs ) ) <nl> print ( " % d epochs complete . " % max_epochs ) <nl> def write ( reader , model , vocab , i2w ) : <nl> # test action # <nl> # # # # # # # # # # # # # # # # # # # # # # # <nl> <nl> + # This computes the metric on the test set . <nl> + # Note that this is not decoding ; just predicting words using ground - truth history , like in training . <nl> def test ( reader , s2smodel , num_minibatches = None ) : <nl> <nl> - # we use the test_minibatch ( ) function so need to setup a trainer <nl> - # model_greedy = create_model_greedy ( s2smodel ) <nl> - # @ Function <nl> - # def criterion ( input , labels ) : <nl> - # # labels = x_last <nl> - # # criterion function must drop the < s > from the labels <nl> - # postprocessed_labels = sequence . slice ( labels , 1 , 0 ) # < s > A B C < / s > - - > A B C < / s > <nl> - # z = model_greedy ( input ) # , postprocessed_labels ) <nl> - # ce = cross_entropy_with_softmax ( z , postprocessed_labels ) <nl> - # errs = classification_error ( z , postprocessed_labels ) <nl> - # return ( Function . NamedOutput ( loss = ce ) , Function . NamedOutput ( metric = errs ) ) <nl> - # def lam ( input , labels ) : <nl> - # return model_greedy ( input ) <nl> - # criterion = create_criterion_function ( lambda input , labels : model_greedy ( input ) ) <nl> model_train = create_model_train ( s2smodel ) <nl> criterion = create_criterion_function ( model_train ) <nl> <nl> - # label_sequence = sequence . slice ( find_arg_by_name ( ' raw_labels ' , model ) , 1 , 0 ) <nl> - # lr = learning_rate_schedule ( 0 . 007 , UnitType . 
sample ) <nl> - # momentum = momentum_as_time_constant_schedule ( 1100 ) # BUGBUG : use Evaluator <nl> - <nl> - # BUGBUG : Must do the same as in train ( ) , drop the first token <nl> - # ce = cross_entropy_with_softmax ( model , label_sequence ) <nl> - # errs = classification_error ( model , label_sequence ) <nl> - # trainer = Trainer ( model , ce , errs , [ momentum_sgd ( model . parameters , lr , momentum ) ] ) <nl> evaluator = Evaluator ( None , criterion ) <nl> <nl> test_minibatch_size = 1024 <nl> def test ( reader , s2smodel , num_minibatches = None ) : <nl> mb = reader . next_minibatch ( test_minibatch_size ) <nl> if not mb : # finish when end of test set reached <nl> break <nl> - mb_error = evaluator . test_minibatch ( mb [ train_reader . streams . features ] , mb [ train_reader . streams . labels ] ) <nl> - # mb_error = evaluator . test_minibatch ( { find_arg_by_name ( ' raw_input ' , model ) : mb [ reader . streams . features ] , <nl> - # find_arg_by_name ( ' raw_labels ' , model ) : mb [ reader . streams . labels ] } ) <nl> - num_samples = mb [ train_reader . streams . labels ] . num_samples <nl> + mb_error = evaluator . test_minibatch ( mb [ test_reader . streams . features ] , mb [ test_reader . streams . labels ] ) <nl> + num_samples = mb [ test_reader . streams . labels ] . num_samples <nl> total_error + = mb_error * num_samples <nl> i + = num_samples <nl> <nl> def test ( reader , s2smodel , num_minibatches = None ) : <nl> <nl> # and return the test error <nl> rate = total_error / i <nl> - print ( " error rate of { } % in { } samples " , 100 * rate , i ) <nl> + print ( " error rate of { : . 2f } % in { } samples " . format ( 100 * rate , i ) ) <nl> return rate <nl> <nl> # # # # # # # # # # # # # # # # # # # # # # # # <nl> def translate ( tokens , model_decoding , vocab , i2w , show_attention = False , max_labe <nl> w = [ vdict [ " < s > " ] ] + [ vdict [ c ] for c in tokens ] + [ vdict [ " < / s > " ] ] <nl> except : <nl> print ( ' Input contains an unexpected token . ' ) <nl> - return <nl> + return [ ] <nl> <nl> # convert to one_hot <nl> # TODO : I think we have a function for this now . <nl> def debug_attention ( model , input ) : <nl> # main function boilerplate # <nl> # # # # # # # # # # # # # # # # # # # # # # # # # # # # # <nl> <nl> - # # # BEGIN UNRELATED TEST <nl> - # a test I did for porting a Keras model for Xinying Song <nl> - <nl> - # Configurations <nl> - <nl> - # def merge_helper ( mode , layer_list ) : <nl> - # " " " <nl> - # # Args : <nl> - # mode ( str ) : <nl> - # layer_list ( list [ layer ] ) : OK to have None layers <nl> - # # Returns : <nl> - # layer or None ( if layer_list is empty ) <nl> - # " " " <nl> - # layer_list = [ item for item in layer_list if item is not None ] <nl> - # if len ( layer_list ) > = 2 : <nl> - # return Merge ( mode = mode ) ( layer_list ) <nl> - # elif len ( layer_list ) = = 1 : <nl> - # return layer_list [ 0 ] <nl> - # else : <nl> - # return None <nl> - # def create_regularizer ( ) : <nl> - # if reg_l1 > 1e - 7 and reg_l2 > 1e - 7 : <nl> - # return regularizers . WeightRegularizer ( l1 = reg_l1 , l2 = reg_l2 ) <nl> - # if reg_l1 > 1e - 7 : <nl> - # return regularizers . WeightRegularizer ( l1 = reg_l1 ) <nl> - # if reg_l2 > 1e - 7 : <nl> - # return regularizers . 
WeightRegularizer ( l2 = reg_l2 ) <nl> - # return None <nl> - <nl> - def build_model_xy ( model_save_path = None ) : <nl> - <nl> - rnn_type = ' LSTM ' <nl> - # rnn_type = ' GRU ' <nl> - dnn_hid_size_list = [ 32 , 32 , 32 , 32 , 32 ] <nl> - rnn_hid_size_list = [ 128 , 128 ] <nl> - batch_size = 128 <nl> - reg_l1 = 0 <nl> - reg_l2 = 0 <nl> - dropout = 0 <nl> - batch_normalization = True <nl> - residual_dnn = False <nl> - skip_connection = True <nl> - batch_size_predict = 1024 <nl> - # in cross - validation , we test up to this number and see the upper bound of RNN capability <nl> - # then in train / test , we use the best epoch number obtained by cross - validation <nl> - nb_epoch = 30 <nl> - patience = 5 # no use now because we don ' t do early stopping for now <nl> - if rnn_type = = ' GRU ' : <nl> - rnn_cell = GRU <nl> - elif rnn_type = = " LSTM " : <nl> - rnn_cell = LSTM <nl> - else : <nl> - assert False <nl> - <nl> - # fake some environment <nl> - rnn_dataset_dict = Record ( train = Record ( multiple_input = Record ( <nl> - rnn = Record ( shape = ( 100 , - 123 , 250 ) ) , # ( batch , max_len , dim ) max_len not needed / used by CNTK <nl> - dnn = Record ( shape = ( 100 , 300 ) ) # ( batch , dim ) <nl> - ) ) ) <nl> - model_flags = Record ( # True if input is to be included <nl> - rnn = True , <nl> - dnn = True <nl> - ) <nl> - logger = Record ( <nl> - info = lambda * args : print ( * args ) <nl> - ) <nl> - <nl> - # Build model <nl> - multiple_input = rnn_dataset_dict [ ' train ' ] . multiple_input <nl> - for key in multiple_input . keys ( ) : <nl> - logger . info ( ( key , model_flags [ key ] , multiple_input [ key ] . shape ) ) <nl> - # construct inputs <nl> - rnn_inputs_dict_by_length = { } # a map from length to rnn_inputs <nl> - rnn_input_total_dim_by_length = { } <nl> - rnn_inputs = [ ] # for feeding graph input only <nl> - <nl> - dnn_inputs = [ ] <nl> - dnn_input_total_dim = 0 <nl> - for key in multiple_input . keys ( ) : <nl> - if not key in model_flags or not model_flags [ key ] : <nl> - logger . info ( " Skipping input { 0 } " . format ( key ) ) <nl> - continue <nl> - logger . info ( " Created input { 0 } " . format ( key ) ) <nl> - flds = key . split ( ' - ' ) <nl> - # inp = Input ( shape = multiple_input [ key ] . shape [ 1 : ] , name = key ) <nl> - if flds [ 0 ] = = ' rnn ' : <nl> - inp = Input ( shape = multiple_input [ key ] . shape [ 2 : ] , name = key ) # no explicit length dimension in CNTK <nl> - rnn_inputs . append ( inp ) # for graph input only <nl> - timesteps = multiple_input [ key ] . shape [ 1 ] <nl> - if timesteps not in rnn_inputs_dict_by_length : <nl> - rnn_inputs_dict_by_length [ timesteps ] = ( [ ] , [ ] ) <nl> - rnn_inputs_dict_by_length [ timesteps ] [ 0 ] . append ( inp ) <nl> - if timesteps not in rnn_input_total_dim_by_length : <nl> - rnn_input_total_dim_by_length [ timesteps ] = 0 <nl> - rnn_input_total_dim_by_length [ timesteps ] + = multiple_input [ key ] . shape [ - 1 ] <nl> - elif flds [ 0 ] = = ' dnn ' : <nl> - # inp = Input ( shape = multiple_input [ key ] . shape [ 1 : ] , name = key ) <nl> - inp = Input ( shape = multiple_input [ key ] . shape [ 1 : ] , name = key , dynamic_axes = [ Axis . default_batch_axis ( ) ] ) # dnn input has no sequence dimension <nl> - dnn_input_total_dim + = multiple_input [ key ] . shape [ - 1 ] <nl> - dnn_inputs . append ( inp ) <nl> - <nl> - # construct RNN layers <nl> - rnn_hid_list = [ ] <nl> - # notes : rnn_inputs_dict_by_length . 
keys ( ) currently has only one element <nl> - for timesteps in rnn_inputs_dict_by_length . keys ( ) : <nl> - # rnn_final_input = merge_helper ( ' concat ' , <nl> - rnn_final_input = splice ( * <nl> - rnn_inputs_dict_by_length [ timesteps ] [ 0 ] ) <nl> - if rnn_final_input is not None : <nl> - # rnn_hid = Masking ( mask_value = 0 . ) ( rnn_final_input ) <nl> - rnn_hid = rnn_final_input <nl> - rnn_output_dim = rnn_hid_size_list [ - 1 ] <nl> - rnn_skip_connections = [ ] <nl> - prev_output_dim = rnn_input_total_dim_by_length [ timesteps ] <nl> - if skip_connection : <nl> - # tmp_out = ZeroMaskedEntries ( ) ( rnn_hid ) <nl> - tmp_out = rnn_hid <nl> - # tmp_out = Lambda ( lambda x : x [ : , - 1 , : ] , output_shape = lambda input_shape : ( input_shape [ 0 ] , input_shape [ 2 ] ) ) ( tmp_out ) <nl> - tmp_out = sequence . last ( tmp_out ) <nl> - if prev_output_dim ! = rnn_output_dim : <nl> - tmp_out = Dense ( rnn_output_dim # , <nl> - # W_regularizer = create_regularizer ( ) , # CNTK regularizers work differently <nl> - # b_regularizer = create_regularizer ( ) , <nl> - ) ( tmp_out ) <nl> - rnn_skip_connections . append ( tmp_out ) <nl> - for ( depth , hid_size ) in enumerate ( rnn_hid_size_list ) : <nl> - # rnn_hid = rnn_cell ( hid_size , <nl> - Recurrence_f = Recurrence if depth < len ( rnn_hid_size_list ) - 1 else Fold # CNKT uses two different functions for return_sequences <nl> - cell = rnn_cell ( hid_size ) # rnn_cell is the layer factory ; create the layer so that we can know len ( cell . outputs ) <nl> - rnn_hid = Recurrence_f ( cell > > ( Dropout ( dropout ) , ) * len ( cell . outputs ) # apply Dropout to all outputs <nl> - # return_sequences = ( True if depth < len ( rnn_hid_size_list ) - 1 else False ) , <nl> - # W_regularizer = create_regularizer ( ) , # CNTK regularizers work differently <nl> - # U_regularizer = create_regularizer ( ) , <nl> - # b_regularizer = create_regularizer ( ) , <nl> - # dropout_W = dropout , <nl> - # dropout_U = dropout <nl> - ) ( rnn_hid ) <nl> - pre_output_dim = hid_size <nl> - if skip_connection : <nl> - if depth = = len ( rnn_hid_size_list ) - 1 : <nl> - tmp_out = rnn_hid <nl> - else : <nl> - # tmp_out = ZeroMaskedEntries ( ) ( rnn_hid ) <nl> - tmp_out = rnn_hid <nl> - # tmp_out = Lambda ( lambda x : x [ : , - 1 , : ] , output_shape = lambda input_shape : ( input_shape [ 0 ] , input_shape [ 2 ] ) ) ( tmp_out ) <nl> - tmp_out = sequence . last ( tmp_out ) <nl> - if prev_output_dim ! = rnn_output_dim : <nl> - tmp_out = Dense ( rnn_output_dim # , <nl> - # W_regularizer = create_regularizer ( ) , # CNTK regularizers work differently <nl> - # b_regularizer = create_regularizer ( ) , <nl> - ) ( tmp_out ) <nl> - rnn_skip_connections . append ( tmp_out ) <nl> - if skip_connection : <nl> - # rnn_hid = merge_helper ( ' sum ' , rnn_skip_connections ) <nl> - rnn_hid = plus ( * rnn_skip_connections ) <nl> - else : <nl> - pass <nl> - else : <nl> - rnn_hid = None <nl> - rnn_hid_list . append ( rnn_hid ) <nl> - # construct DNN layers <nl> - # dnn_final_input = merge_helper ( ' concat ' , dnn_inputs ) <nl> - dnn_final_input = splice ( * dnn_inputs ) <nl> - if dnn_final_input is not None : <nl> - dnn_output_dim = dnn_hid_size_list [ - 1 ] <nl> - dnn_hid = dnn_final_input <nl> - dnn_skip_connections = [ ] <nl> - if dropout > 1e - 7 : <nl> - dnn_hid = Dropout ( dropout ) ( dnn_hid ) <nl> - prev_output_dim = dnn_input_total_dim <nl> - if skip_connection : <nl> - tmp_out = dnn_hid <nl> - if prev_output_dim ! 
= dnn_output_dim : <nl> - tmp_out = Dense ( dnn_output_dim # , <nl> - # W_regularizer = create_regularizer ( ) , <nl> - # b_regularizer = create_regularizer ( ) , <nl> - ) ( tmp_out ) <nl> - dnn_skip_connections . append ( tmp_out ) <nl> - for ( depth , hid_size ) in enumerate ( dnn_hid_size_list ) : <nl> - layer_input = dnn_hid <nl> - dnn_hid = Dense ( hid_size # , <nl> - # W_regularizer = create_regularizer ( ) , <nl> - # b_regularizer = create_regularizer ( ) , <nl> - ) ( dnn_hid ) <nl> - if batch_normalization : <nl> - dnn_hid = BatchNormalization ( ) ( dnn_hid ) <nl> - # dnn_hid = Activation ( ' tanh ' ) ( dnn_hid ) <nl> - dnn_hid = tanh ( dnn_hid ) <nl> - if dropout > 1e - 7 : <nl> - dnn_hid = Dropout ( dropout ) ( dnn_hid ) <nl> - if residual_dnn : <nl> - if prev_output_dim ! = hid_size : <nl> - layer_input = Dense ( hid_size # , <nl> - # W_regularizer = create_regularizer ( ) , <nl> - # b_regularizer = create_regularizer ( ) , <nl> - ) ( layer_input ) <nl> - # dnn_hid = Merge ( mode = ' sum ' ) ( [ dnn_hid , layer_input ] ) <nl> - dnn_hid = dnn_hid + layer_input <nl> - prev_output_dim = hid_size <nl> - if skip_connection : <nl> - tmp_out = dnn_hid <nl> - if prev_output_dim ! = dnn_output_dim : <nl> - tmp_out = Dense ( dnn_output_dim # , <nl> - # W_regularizer = create_regularizer ( ) , <nl> - # b_regularizer = create_regularizer ( ) , <nl> - ) ( tmp_out ) <nl> - dnn_skip_connections . append ( tmp_out ) <nl> - if skip_connection : <nl> - # dnn_hid = merge_helper ( ' sum ' , dnn_skip_connections ) <nl> - dnn_hid = plus ( * dnn_skip_connections ) <nl> - else : <nl> - pass <nl> - else : <nl> - dnn_hid = None <nl> - <nl> - # merge RNN with DNN and project to final <nl> - # rnn_dnn_merged = merge_helper ( ' concat ' , rnn_hid_list + [ dnn_hid ] ) <nl> - rnn_dnn_merged = splice ( * rnn_hid_list , dnn_hid ) <nl> - # assert rnn_dnn_merged is not None , " Error ! no inputs found ! " <nl> - # output = Dense ( 1 , W_regularizer = create_regularizer ( ) , <nl> - # b_regularizer = create_regularizer ( ) , ) ( rnn_dnn_merged ) <nl> - output = Dense ( 1 ) ( rnn_dnn_merged ) <nl> - if batch_normalization : <nl> - output = BatchNormalization ( ) ( output ) <nl> - # output = Activation ( ' sigmoid ' ) ( output ) <nl> - output = sigmoid ( output ) <nl> - <nl> - # final specify model <nl> - # model = Model ( input = rnn_inputs + dnn_inputs , output = output ) <nl> - model = output # Model ( input = rnn_inputs + dnn_inputs , output = output ) <nl> - debughelpers . dump_function ( model , " model " ) <nl> - # logger . info ( model . get_config ( ) ) <nl> - logger . info ( " build model completed " ) <nl> - if model_save_path : <nl> - # plot ( model , to_file = model_save_path + ' . png ' ) <nl> - from cntk . graph import plot <nl> - plot ( model , filename = model_save_path + ' . pdf ' , scale = 2 ) <nl> - # plot ( model , filename = model_save_path + ' . svg ' , scale = 2 ) <nl> - logger . info ( ' model graph saved to ' + model_save_path + ' . 
png ' ) <nl> - return model <nl> - <nl> - # # # END UNRELATED TEST <nl> - <nl> - <nl> - <nl> if __name__ = = ' __main__ ' : <nl> <nl> from _cntk_py import set_computation_network_trace_level , set_fixed_random_seed , force_deterministic_algorithms <nl> set_fixed_random_seed ( 1 ) # BUGBUG : has no effect at present # TODO : remove debugging facilities once this all works <nl> <nl> - # build_model_xy ( ' c : / me / xinying_graph ' ) <nl> - <nl> - # W = Parameter ( ( - 1 , 42 ) , init = glorot_uniform ( ) ) <nl> - # @ Function <nl> - # def simple_layer ( x ) : <nl> - # return sigmoid ( x @ W ) <nl> - # simple_layer . update_signature ( 13 ) <nl> - # from cntk . graph import plot <nl> - # plot ( simple_layer , ' c : / work / cntk / simple_layer . pdf ' ) <nl> - # debughelpers . dump_function ( simple_layer ) <nl> - <nl> - # x = placeholder_variable ( ' x ' ) # Function argument in definition <nl> - # s = x * x # Function <nl> - # debughelpers . dump_function ( s ) <nl> - # arg = placeholder_variable ( ' arg ' ) # apply Function to another placeholder <nl> - # y = s . clone ( CloneMethod . share , { x : arg } ) <nl> - # debughelpers . dump_function ( y ) <nl> - # print ( 13 ) <nl> - <nl> - <nl> - # x = placeholder_variable ( ' x ' ) # Function argument <nl> - # h_f = placeholder_variable ( ' h_f ' ) # recurrent forward reference <nl> - # h = sigmoid ( 2 * h_f + 2 * x ) <nl> - # h . replace_placeholders ( { h_f : h } ) # end of Function definition <nl> - # h . replace_placeholders ( { x : input_variable ( 300 ) } ) # Function application <nl> - # debughelpers . dump_function ( h ) <nl> - # print ( 13 ) <nl> - <nl> - # test for multi - input plus ( ) <nl> - # from cntk . ops import plus , element_times , max , min , log_add_exp <nl> - # for op in ( log_add_exp , max , min , plus , element_times ) : <nl> - # s4 = op ( Placeholder ( name = ' a ' ) , Placeholder ( 3 , name = ' b ' ) , Placeholder ( 4 , name = ' c ' ) , Placeholder ( 5 , name = ' d ' ) , name = ' s4 ' ) <nl> - # s4 . dump ( ' s4 ' ) <nl> - # sequence_reduce_max = Fold ( max ) <nl> - # sequence_reduce_max . dump ( ' sequence_reduce_max ' ) <nl> - # TODO : create proper test case for this <nl> - <nl> - <nl> - # L = Dense ( 500 ) <nl> - # L1 = L . clone ( CloneMethod . clone ) <nl> - # x = placeholder_variable ( ) <nl> - # y = L ( x ) + L1 ( x ) <nl> - <nl> - # L = Dense ( 500 ) <nl> - # o = L . outputs <nl> - # sh = L . shape <nl> - # W = L . W <nl> - # w = L . weights <nl> - <nl> - # repro for as_block <nl> - from cntk import placeholder_variable , combine , alias , as_block <nl> - def f ( x , y ) : <nl> - return y - x <nl> - arg_names = [ ' x ' , ' y ' ] <nl> - args = [ placeholder_variable ( name = name ) for name in arg_names ] <nl> - block_args = [ placeholder_variable ( name = arg . name ) for arg in args ] # placeholders inside the BlockFunction <nl> - combined_block_args = combine ( block_args ) # the content of the BlockFunction <nl> - arg_map = list ( zip ( block_args , args ) ) # after wrapping , the block_args map to args <nl> - combined_args = as_block ( composite = combined_block_args , block_arguments_map = arg_map , block_op_name = ' f_parameter_pack ' ) <nl> - funargs = combined_args . outputs # the Python function is called with these instead <nl> - # combined_args = None <nl> - out = f ( * funargs ) <nl> - out_arg_names = [ arg . name for arg in out . arguments ] <nl> - # out = Recurrence ( out , initial_state = 13 . 0 ) <nl> - # out_arg_names = [ arg . name for arg in out . arguments ] <nl> - out = out . 
clone ( CloneMethod . share , { out . arguments [ 0 ] : input_variable ( 1 , name = ' x1 ' ) , out . arguments [ 1 ] : input_variable ( 1 , name = ' y1 ' ) } ) <nl> - out_arg_names = [ arg . name for arg in out . arguments ] <nl> - res = out . eval ( { out . arguments [ 0 ] : [ [ 3 . 0 ] ] , out . arguments [ 1 ] : [ [ 5 . 0 ] ] } ) <nl> - # res = out . eval ( [ [ 3 . 0 ] ] ) <nl> - <nl> # hook up data <nl> - train_reader = create_reader ( os . path . join ( DATA_DIR , TRAINING_DATA ) , True ) <nl> - valid_reader = create_reader ( os . path . join ( DATA_DIR , VALIDATION_DATA ) , True ) <nl> vocab , i2w , w2i = get_vocab ( os . path . join ( DATA_DIR , VOCAB_FILE ) ) <nl> <nl> # create inputs and create model <nl> - model = create_model ( ) <nl> - <nl> - # train <nl> - train ( train_reader , valid_reader , vocab , i2w , model , max_epochs = 10 , epoch_size = 908241 ) <nl> + # model = create_model ( ) <nl> + # <nl> + # # train <nl> + # train_reader = create_reader ( os . path . join ( DATA_DIR , TRAINING_DATA ) , True ) <nl> + # valid_reader = create_reader ( os . path . join ( DATA_DIR , VALIDATION_DATA ) , True ) <nl> + # train ( train_reader , valid_reader , vocab , i2w , model , max_epochs = 10 , epoch_size = 908241 ) <nl> <nl> test_epoch = 10 <nl> + model = Function . load ( model_path ( test_epoch ) ) <nl> <nl> # write <nl> # model = load_model ( " model_epoch0 . cmf " ) <nl> # write ( valid_reader , model , vocab , i2w ) <nl> <nl> # test <nl> - model = Function . load ( model_path ( test_epoch ) ) <nl> test_reader = create_reader ( os . path . join ( DATA_DIR , TESTING_DATA ) , False ) <nl> test ( test_reader , model ) <nl> <nl> # try the model out in an interactive session <nl> - model = Function . load ( model_path ( test_epoch ) ) <nl> interactive_session ( model , vocab , i2w , show_attention = True ) <nl>
cleaned up seq2seq example
microsoft/CNTK
88dc3f7115477130c05a778a53fab278ca7094b4
2017-01-26T23:33:21Z
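Among the cleanups above, test() now reads from the test reader, accumulates a sample-weighted error across minibatches, and formats the result correctly with str.format. The aggregation it performs boils down to total_error += mb_error * num_samples followed by dividing by the total sample count. A small sketch of that arithmetic follows; the Batch struct and aggregate_error function are illustrative stand-ins, not CNTK APIs.

    #include <cstdio>
    #include <vector>

    // Illustrative per-minibatch record: an error rate plus the sample count behind it.
    struct Batch {
        double error_rate;   // fraction of wrong predictions in this minibatch
        int num_samples;     // label samples in this minibatch
    };

    // Sample-weighted aggregation, mirroring:
    //   total_error += mb_error * num_samples;  i += num_samples;  rate = total_error / i;
    double aggregate_error(const std::vector<Batch>& batches) {
        double total_error = 0.0;
        long long total_samples = 0;
        for (const Batch& mb : batches) {
            total_error += mb.error_rate * mb.num_samples;
            total_samples += mb.num_samples;
        }
        return total_samples ? total_error / total_samples : 0.0;
    }

    int main() {
        std::vector<Batch> batches = {{0.10, 100}, {0.20, 300}};
        long long n = 0;
        for (const Batch& mb : batches) n += mb.num_samples;
        double rate = aggregate_error(batches);
        // The commit also fixed the report string so it actually formats its arguments.
        printf("error rate of %.2f%% in %lld samples\n", 100.0 * rate, n);
        return 0;
    }

Weighting by num_samples matters because minibatches differ in size; averaging the per-batch rates directly would over-weight small batches.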
mmm a / fdbcli / fdbcli . actor . cpp <nl> ppp b / fdbcli / fdbcli . actor . cpp <nl> std : : string getDateInfoString ( StatusObjectReader statusObj , std : : string key ) { <nl> std : : string getProcessAddressByServerID ( StatusObjectReader processesMap , std : : string serverID ) { <nl> if ( serverID = = " " ) <nl> return " unknown " ; <nl> - <nl> + <nl> for ( auto proc : processesMap . obj ( ) ) { <nl> try { <nl> StatusArray rolesArray = proc . second . get_obj ( ) [ " roles " ] . get_array ( ) ; <nl> ACTOR Future < int > cli ( CLIOptions opt , LineNoise * plinenoise ) { <nl> <nl> if ( ! opt . exec . present ( ) ) { <nl> if ( opt . initialStatusCheck ) { <nl> - wait ( makeInterruptable ( checkStatus ( Void ( ) , db - > getConnectionFile ( ) ) ) ) ; <nl> + Future < Void > checkStatusF = checkStatus ( Void ( ) , db - > getConnectionFile ( ) ) ; <nl> + Future < Void > checkDDStatusF = checkDataDistributionStatus ( db , true ) ; <nl> + wait ( makeInterruptable ( success ( checkStatusF ) & & success ( checkDDStatusF ) ) ) ; <nl> } <nl> else { <nl> printf ( " \ n " ) ; <nl> ACTOR Future < int > cli ( CLIOptions opt , LineNoise * plinenoise ) { <nl> wait ( makeInterruptable ( printHealthyZone ( db ) ) ) ; <nl> } <nl> else if ( tokens . size ( ) = = 2 & & tokencmp ( tokens [ 1 ] , " off " ) ) { <nl> - wait ( makeInterruptable ( clearHealthyZone ( db ) ) ) ; <nl> + bool clearResult = wait ( makeInterruptable ( clearHealthyZone ( db , true ) ) ) ; <nl> + is_error = ! clearResult ; <nl> } <nl> else if ( tokens . size ( ) = = 4 & & tokencmp ( tokens [ 1 ] , " on " ) ) { <nl> double seconds ; <nl> ACTOR Future < int > cli ( CLIOptions opt , LineNoise * plinenoise ) { <nl> printUsage ( tokens [ 0 ] ) ; <nl> is_error = true ; <nl> } else { <nl> - wait ( makeInterruptable ( setHealthyZone ( db , tokens [ 2 ] , seconds ) ) ) ; <nl> + bool setResult = wait ( makeInterruptable ( setHealthyZone ( db , tokens [ 2 ] , seconds , true ) ) ) ; <nl> + is_error = ! setResult ; <nl> } <nl> } else { <nl> printUsage ( tokens [ 0 ] ) ; <nl> ACTOR Future < int > cli ( CLIOptions opt , LineNoise * plinenoise ) { <nl> } <nl> <nl> if ( tokencmp ( tokens [ 0 ] , " datadistribution " ) ) { <nl> - if ( tokens . size ( ) ! = 2 ) { <nl> - printf ( " Usage : datadistribution < on | off > \ n " ) ; <nl> + if ( tokens . size ( ) ! = 2 & & tokens . size ( ) ! = 3 ) { <nl> + printf ( " Usage : datadistribution < status | on | off | disable < ssfailure | rebalance > | enable " <nl> + " < ssfailure | rebalance > > \ n " ) ; <nl> is_error = true ; <nl> } else { <nl> - if ( tokencmp ( tokens [ 1 ] , " on " ) ) { <nl> + if ( tokencmp ( tokens [ 1 ] , " status " ) ) { <nl> + wait ( makeInterruptable ( checkDataDistributionStatus ( db ) ) ) ; <nl> + } else if ( tokencmp ( tokens [ 1 ] , " on " ) ) { <nl> wait ( success ( setDDMode ( db , 1 ) ) ) ; <nl> - printf ( " Data distribution is enabled \ n " ) ; <nl> - } else if ( tokencmp ( tokens [ 1 ] , " off " ) ) { <nl> + printf ( " Data distribution is turned on . \ n " ) ; <nl> + } else if ( tokencmp ( tokens [ 1 ] , " off " ) ) { <nl> wait ( success ( setDDMode ( db , 0 ) ) ) ; <nl> - printf ( " Data distribution is disabled \ n " ) ; <nl> + printf ( " Data distribution is turned off . 
\ n " ) ; <nl> + } else if ( tokencmp ( tokens [ 1 ] , " disable " ) ) { <nl> + if ( tokencmp ( tokens [ 2 ] , " ssfailure " ) ) { <nl> + bool _ = wait ( makeInterruptable ( setHealthyZone ( db , ignoreSSFailuresZoneString , 0 ) ) ) ; <nl> + printf ( " Data distribution is disabled for storage server failures . \ n " ) ; <nl> + } else if ( tokencmp ( tokens [ 2 ] , " rebalance " ) ) { <nl> + wait ( makeInterruptable ( setDDIgnoreRebalanceSwitch ( db , true ) ) ) ; <nl> + printf ( " Data distribution is disabled for rebalance . \ n " ) ; <nl> + } else { <nl> + printf ( " Usage : datadistribution < status | on | off | disable < ssfailure | rebalance > | enable " <nl> + " < ssfailure | rebalance > > \ n " ) ; <nl> + is_error = true ; <nl> + } <nl> + } else if ( tokencmp ( tokens [ 1 ] , " enable " ) ) { <nl> + if ( tokencmp ( tokens [ 2 ] , " ssfailure " ) ) { <nl> + bool _ = wait ( makeInterruptable ( clearHealthyZone ( db , false , true ) ) ) ; <nl> + printf ( " Data distribution is enabled for storage server failures . \ n " ) ; <nl> + } else if ( tokencmp ( tokens [ 2 ] , " rebalance " ) ) { <nl> + wait ( makeInterruptable ( setDDIgnoreRebalanceSwitch ( db , false ) ) ) ; <nl> + printf ( " Data distribution is enabled for rebalance . \ n " ) ; <nl> + } else { <nl> + printf ( " Usage : datadistribution < status | on | off | disable < ssfailure | rebalance > | enable " <nl> + " < ssfailure | rebalance > > \ n " ) ; <nl> + is_error = true ; <nl> + } <nl> } else { <nl> - printf ( " Usage : datadistribution < on | off > \ n " ) ; <nl> + printf ( " Usage : datadistribution < status | on | off | disable < ssfailure | rebalance > | enable " <nl> + " < ssfailure | rebalance > > \ n " ) ; <nl> is_error = true ; <nl> } <nl> } <nl> mmm a / fdbclient / ManagementAPI . actor . cpp <nl> ppp b / fdbclient / ManagementAPI . actor . cpp <nl> ACTOR Future < ConfigurationResult : : Type > changeConfig ( Database cx , std : : map < std : <nl> if ( ! newConfig . isValid ( ) ) { <nl> return ConfigurationResult : : INVALID_CONFIGURATION ; <nl> } <nl> - <nl> + <nl> if ( newConfig . tLogPolicy - > attributeKeys ( ) . count ( " dcid " ) & & newConfig . regions . size ( ) > 0 ) { <nl> return ConfigurationResult : : REGION_REPLICATION_MISMATCH ; <nl> } <nl> ACTOR Future < vector < AddressExclusion > > getExcludedServers ( Database cx ) { <nl> } <nl> } <nl> <nl> + ACTOR Future < Void > checkDataDistributionStatus ( Database cx , bool printWarningOnly ) { <nl> + state Transaction tr ( cx ) ; <nl> + loop { <nl> + try { <nl> + tr . setOption ( FDBTransactionOptions : : LOCK_AWARE ) ; <nl> + state Future < Optional < Value > > overallSwitchF = tr . get ( dataDistributionModeKey ) ; <nl> + state Future < Optional < Value > > healthyZoneValueF = tr . get ( healthyZoneKey ) ; <nl> + state Future < Optional < Value > > rebalanceDDIgnoreValueF = tr . get ( rebalanceDDIgnoreKey ) ; <nl> + wait ( success ( overallSwitchF ) & & success ( healthyZoneValueF ) & & success ( rebalanceDDIgnoreValueF ) ) ; <nl> + if ( overallSwitchF . get ( ) . present ( ) ) { <nl> + BinaryReader rd ( overallSwitchF . get ( ) . get ( ) , Unversioned ( ) ) ; <nl> + int currentMode ; <nl> + rd > > currentMode ; <nl> + if ( currentMode = = 0 ) { <nl> + printf ( " WARNING : Data distribution is off . \ n " ) ; <nl> + return Void ( ) ; <nl> + } <nl> + } <nl> + if ( ! printWarningOnly ) { <nl> + printf ( " Data distribution is on . \ n " ) ; <nl> + } <nl> + if ( healthyZoneValueF . get ( ) . 
present ( ) ) { <nl> + auto healthyZoneKV = decodeHealthyZoneValue ( healthyZoneValueF . get ( ) . get ( ) ) ; <nl> + if ( healthyZoneKV . first = = ignoreSSFailuresZoneString ) { <nl> + printf ( " WARNING : Data distribution is currently turned on but disabled for all storage server " <nl> + " failures . \ n " ) ; <nl> + } else { <nl> + printf ( " WARNING : Data distribution is currently turned on but zone % s is under maintenance and " <nl> + " will continue for % " PRId64 " seconds . \ n " , <nl> + healthyZoneKV . first . toString ( ) . c_str ( ) , <nl> + ( healthyZoneKV . second - tr . getReadVersion ( ) . get ( ) ) / CLIENT_KNOBS - > CORE_VERSIONSPERSECOND ) ; <nl> + } <nl> + } <nl> + if ( rebalanceDDIgnoreValueF . get ( ) . present ( ) ) { <nl> + printf ( " WARNING : Data distribution is currently turned on but shard size balancing is currently " <nl> + " disabled . \ n " ) ; <nl> + } <nl> + return Void ( ) ; <nl> + } catch ( Error & e ) { <nl> + wait ( tr . onError ( e ) ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> ACTOR Future < Void > printHealthyZone ( Database cx ) { <nl> state Transaction tr ( cx ) ; <nl> loop { <nl> ACTOR Future < Void > printHealthyZone ( Database cx ) { <nl> Optional < Value > val = wait ( tr . get ( healthyZoneKey ) ) ; <nl> if ( ! val . present ( ) | | decodeHealthyZoneValue ( val . get ( ) ) . second < = tr . getReadVersion ( ) . get ( ) ) { <nl> printf ( " No ongoing maintenance . \ n " ) ; <nl> + } else if ( val . present ( ) & & decodeHealthyZoneValue ( val . get ( ) ) . first = = ignoreSSFailuresZoneString ) { <nl> + printf ( " Data distribution has been disabled for all storage server failures in this cluster and thus " <nl> + " maintenance mode is not active . \ n " ) ; <nl> } else { <nl> auto healthyZone = decodeHealthyZoneValue ( val . get ( ) ) ; <nl> printf ( " Maintenance for zone % s will continue for % " PRId64 " seconds . \ n " , healthyZone . first . toString ( ) . c_str ( ) , ( healthyZone . second - tr . getReadVersion ( ) . get ( ) ) / CLIENT_KNOBS - > CORE_VERSIONSPERSECOND ) ; <nl> ACTOR Future < Void > printHealthyZone ( Database cx ) { <nl> } <nl> } <nl> <nl> - ACTOR Future < Void > clearHealthyZone ( Database cx ) { <nl> + ACTOR Future < bool > clearHealthyZone ( Database cx , bool printWarning , bool clearSSFailureZoneString ) { <nl> state Transaction tr ( cx ) ; <nl> loop { <nl> try { <nl> tr . setOption ( FDBTransactionOptions : : LOCK_AWARE ) ; <nl> + tr . setOption ( FDBTransactionOptions : : PRIORITY_SYSTEM_IMMEDIATE ) ; <nl> + Optional < Value > val = wait ( tr . get ( healthyZoneKey ) ) ; <nl> + if ( ! clearSSFailureZoneString & & val . present ( ) & & <nl> + decodeHealthyZoneValue ( val . get ( ) ) . first = = ignoreSSFailuresZoneString ) { <nl> + if ( printWarning ) { <nl> + printf ( " ERROR : Maintenance mode cannot be used while data distribution is disabled for storage " <nl> + " server failures . Use ' datadistribution on ' to reenable data distribution . \ n " ) ; <nl> + } <nl> + return false ; <nl> + } <nl> + <nl> tr . clear ( healthyZoneKey ) ; <nl> wait ( tr . commit ( ) ) ; <nl> - return Void ( ) ; <nl> + return true ; <nl> } catch ( Error & e ) { <nl> wait ( tr . onError ( e ) ) ; <nl> } <nl> } <nl> } <nl> <nl> - ACTOR Future < Void > setHealthyZone ( Database cx , StringRef zoneId , double seconds ) { <nl> + ACTOR Future < bool > setHealthyZone ( Database cx , StringRef zoneId , double seconds , bool printWarning ) { <nl> state Transaction tr ( cx ) ; <nl> loop { <nl> try { <nl> tr . 
setOption ( FDBTransactionOptions : : LOCK_AWARE ) ; <nl> + tr . setOption ( FDBTransactionOptions : : PRIORITY_SYSTEM_IMMEDIATE ) ; <nl> + Optional < Value > val = wait ( tr . get ( healthyZoneKey ) ) ; <nl> + if ( val . present ( ) & & decodeHealthyZoneValue ( val . get ( ) ) . first = = ignoreSSFailuresZoneString ) { <nl> + if ( printWarning ) { <nl> + printf ( " ERROR : Maintenance mode cannot be used while data distribution is disabled for storage " <nl> + " server failures . Use ' datadistribution on ' to reenable data distribution . \ n " ) ; <nl> + } <nl> + return false ; <nl> + } <nl> Version readVersion = wait ( tr . getReadVersion ( ) ) ; <nl> tr . set ( healthyZoneKey , healthyZoneValue ( zoneId , readVersion + ( seconds * CLIENT_KNOBS - > CORE_VERSIONSPERSECOND ) ) ) ; <nl> wait ( tr . commit ( ) ) ; <nl> - return Void ( ) ; <nl> + return true ; <nl> } catch ( Error & e ) { <nl> wait ( tr . onError ( e ) ) ; <nl> } <nl> } <nl> } <nl> <nl> + ACTOR Future < Void > setDDIgnoreRebalanceSwitch ( Database cx , bool ignoreRebalance ) { <nl> + state Transaction tr ( cx ) ; <nl> + loop { <nl> + try { <nl> + tr . setOption ( FDBTransactionOptions : : LOCK_AWARE ) ; <nl> + if ( ignoreRebalance ) { <nl> + tr . set ( rebalanceDDIgnoreKey , LiteralStringRef ( " on " ) ) ; <nl> + } else { <nl> + tr . clear ( rebalanceDDIgnoreKey ) ; <nl> + } <nl> + wait ( tr . commit ( ) ) ; <nl> + return Void ( ) ; <nl> + } catch ( Error & e ) { <nl> + wait ( tr . onError ( e ) ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> ACTOR Future < int > setDDMode ( Database cx , int mode ) { <nl> state Transaction tr ( cx ) ; <nl> state int oldMode = - 1 ; <nl> ACTOR Future < int > setDDMode ( Database cx , int mode ) { <nl> tr . set ( moveKeysLockWriteKey , wrLastWrite . toValue ( ) ) ; <nl> <nl> tr . set ( dataDistributionModeKey , wr . toValue ( ) ) ; <nl> - <nl> + if ( mode ) { <nl> + / / set DDMode to 1 will enable all disabled parts , for instance the SS failure monitors . <nl> + Optional < Value > currentHealthyZoneValue = wait ( tr . get ( healthyZoneKey ) ) ; <nl> + if ( currentHealthyZoneValue . present ( ) & & <nl> + decodeHealthyZoneValue ( currentHealthyZoneValue . get ( ) ) . first = = ignoreSSFailuresZoneString ) { <nl> + / / only clear the key if it is currently being used to disable all SS failure data movement <nl> + tr . clear ( healthyZoneKey ) ; <nl> + } <nl> + tr . clear ( rebalanceDDIgnoreKey ) ; <nl> + } <nl> wait ( tr . commit ( ) ) ; <nl> return oldMode ; <nl> } catch ( Error & e ) { <nl> mmm a / fdbclient / ManagementAPI . actor . h <nl> ppp b / fdbclient / ManagementAPI . actor . 
h <nl> ACTOR Future < int > setDDMode ( Database cx , int mode ) ; <nl> <nl> ACTOR Future < Void > forceRecovery ( Reference < ClusterConnectionFile > clusterFile , Standalone < StringRef > dcId ) ; <nl> <nl> + ACTOR Future < Void > checkDataDistributionStatus ( Database cx , bool printWarningOnly = false ) ; <nl> ACTOR Future < Void > printHealthyZone ( Database cx ) ; <nl> - ACTOR Future < Void > clearHealthyZone ( Database cx ) ; <nl> - ACTOR Future < Void > setHealthyZone ( Database cx , StringRef zoneId , double seconds ) ; <nl> + ACTOR Future < Void > setDDIgnoreRebalanceSwitch ( Database cx , bool ignoreRebalance ) ; <nl> + ACTOR Future < bool > clearHealthyZone ( Database cx , bool printWarning = false , bool clearSSFailureZoneString = false ) ; <nl> + ACTOR Future < bool > setHealthyZone ( Database cx , StringRef zoneId , double seconds , bool printWarning = false ) ; <nl> <nl> ACTOR Future < Void > waitForPrimaryDC ( Database cx , StringRef dcId ) ; <nl> <nl> mmm a / fdbclient / SystemData . cpp <nl> ppp b / fdbclient / SystemData . cpp <nl> const KeyRange serverTagHistoryRangeBefore ( UID serverID , Version version ) { <nl> wr . serializeBytes ( serverTagHistoryKeys . begin ) ; <nl> wr < < serverID ; <nl> version = bigEndian64 ( version ) ; <nl> - <nl> + <nl> Key versionStr = makeString ( 8 ) ; <nl> uint8_t * data = mutateString ( versionStr ) ; <nl> memcpy ( data , & version , 8 ) ; <nl> const Key restoreWorkerKeyFor ( UID const & agentID ) { <nl> } <nl> <nl> const KeyRef healthyZoneKey = LiteralStringRef ( " \ xff \ x02 / healthyZone " ) ; <nl> + const StringRef ignoreSSFailuresZoneString = LiteralStringRef ( " IgnoreSSFailures " ) ; <nl> + const KeyRef rebalanceDDIgnoreKey = LiteralStringRef ( " \ xff \ x02 / rebalanceDDIgnored " ) ; <nl> <nl> const Value healthyZoneValue ( StringRef const & zoneId , Version version ) { <nl> BinaryWriter wr ( IncludeVersion ( ) ) ; <nl> mmm a / fdbclient / SystemData . h <nl> ppp b / fdbclient / SystemData . h <nl> extern const KeyRangeRef restoreWorkersKeys ; <nl> const Key restoreWorkerKeyFor ( UID const & agentID ) ; <nl> <nl> extern const KeyRef healthyZoneKey ; <nl> + extern const StringRef ignoreSSFailuresZoneString ; <nl> + extern const KeyRef rebalanceDDIgnoreKey ; <nl> <nl> const Value healthyZoneValue ( StringRef const & zoneId , Version version ) ; <nl> std : : pair < Key , Version > decodeHealthyZoneValue ( ValueRef const & ) ; <nl> mmm a / fdbserver / DataDistribution . actor . cpp <nl> ppp b / fdbserver / DataDistribution . actor . cpp <nl> ACTOR Future < Reference < InitialDataDistribution > > getInitialDataDistribution ( Dat <nl> server_dc . clear ( ) ; <nl> succeeded = false ; <nl> try { <nl> + <nl> + / / Read healthyZone value which is later used to determine on / off of failure triggered DD <nl> + tr . setOption ( FDBTransactionOptions : : READ_SYSTEM_KEYS ) ; <nl> + tr . setOption ( FDBTransactionOptions : : READ_LOCK_AWARE ) ; <nl> + Optional < Value > val = wait ( tr . get ( healthyZoneKey ) ) ; <nl> + if ( val . present ( ) ) { <nl> + auto p = decodeHealthyZoneValue ( val . get ( ) ) ; <nl> + if ( p . second > tr . getReadVersion ( ) . get ( ) | | p . first = = ignoreSSFailuresZoneString ) { <nl> + result - > initHealthyZoneValue = Optional < Key > ( p . first ) ; <nl> + } else { <nl> + result - > initHealthyZoneValue = Optional < Key > ( ) ; <nl> + } <nl> + } else { <nl> + result - > initHealthyZoneValue = Optional < Key > ( ) ; <nl> + } <nl> + <nl> result - > mode = 1 ; <nl> tr . 
setOption ( FDBTransactionOptions : : PRIORITY_SYSTEM_IMMEDIATE ) ; <nl> Optional < Value > mode = wait ( tr . get ( dataDistributionModeKey ) ) ; <nl> struct DDTeamCollection : ReferenceCounted < DDTeamCollection > { <nl> <nl> std : : vector < DDTeamCollection * > teamCollections ; <nl> AsyncVar < Optional < Key > > healthyZone ; <nl> - Future < Void > clearHealthyZoneFuture ; <nl> + Future < bool > clearHealthyZoneFuture ; <nl> <nl> void resetLocalitySet ( ) { <nl> storageServerSet = Reference < LocalitySet > ( new LocalityMap < UID > ( ) ) ; <nl> struct DDTeamCollection : ReferenceCounted < DDTeamCollection > { <nl> : cx ( cx ) , distributorId ( distributorId ) , lock ( lock ) , output ( output ) , <nl> shardsAffectedByTeamFailure ( shardsAffectedByTeamFailure ) , doBuildTeams ( true ) , lastBuildTeamsFailed ( false ) , teamBuilder ( Void ( ) ) , <nl> badTeamRemover ( Void ( ) ) , redundantMachineTeamRemover ( Void ( ) ) , redundantServerTeamRemover ( Void ( ) ) , <nl> - configuration ( configuration ) , readyToStart ( readyToStart ) , clearHealthyZoneFuture ( Void ( ) ) , <nl> + configuration ( configuration ) , readyToStart ( readyToStart ) , clearHealthyZoneFuture ( true ) , <nl> checkTeamDelay ( delay ( SERVER_KNOBS - > CHECK_TEAM_DELAY , TaskPriority : : DataDistribution ) ) , <nl> initialFailureReactionDelay ( <nl> delayed ( readyToStart , SERVER_KNOBS - > INITIAL_FAILURE_REACTION_DELAY , TaskPriority : : DataDistribution ) ) , <nl> struct DDTeamCollection : ReferenceCounted < DDTeamCollection > { <nl> } <nl> <nl> ACTOR static Future < Void > init ( DDTeamCollection * self , Reference < InitialDataDistribution > initTeams ) { <nl> + self - > healthyZone . set ( initTeams - > initHealthyZoneValue ) ; <nl> / / SOMEDAY : If some servers have teams and not others ( or some servers have more data than others ) and there is an address / locality collision , should <nl> / / we preferentially mark the least used server as undesirable ? <nl> for ( auto i = initTeams - > allServers . begin ( ) ; i ! = initTeams - > allServers . end ( ) ; + + i ) { <nl> ACTOR Future < Void > teamTracker ( DDTeamCollection * self , Reference < TCTeamInfo > tea <nl> rs . keys = shards [ i ] ; <nl> rs . priority = maxPriority ; <nl> <nl> + / / Failed server should not trigger DD if SS failures are set to be ignored <nl> + if ( rs . priority = = PRIORITY_TEAM_UNHEALTHY ) { <nl> + ASSERT_WE_THINK ( ! ( ! badTeam & & self - > healthyZone . get ( ) . present ( ) & & <nl> + ( self - > healthyZone . get ( ) . get ( ) = = ignoreSSFailuresZoneString ) ) ) ; <nl> + } <nl> self - > output . send ( rs ) ; <nl> if ( deterministicRandom ( ) - > random01 ( ) < 0 . 01 ) { <nl> TraceEvent ( " SendRelocateToDDQx100 " , self - > distributorId ) <nl> ACTOR Future < Void > waitHealthyZoneChange ( DDTeamCollection * self ) { <nl> state Future < Void > healthyZoneTimeout = Never ( ) ; <nl> if ( val . present ( ) ) { <nl> auto p = decodeHealthyZoneValue ( val . get ( ) ) ; <nl> - if ( p . second > tr . getReadVersion ( ) . get ( ) ) { <nl> + if ( p . first = = ignoreSSFailuresZoneString ) { <nl> + / / healthyZone is now overloaded for DD diabling purpose , which does not timeout <nl> + TraceEvent ( " DataDistributionDisabledForStorageServerFailuresStart " , self - > distributorId ) ; <nl> + healthyZoneTimeout = Never ( ) ; <nl> + } else if ( p . second > tr . getReadVersion ( ) . get ( ) ) { <nl> double timeoutSeconds = ( p . second - tr . getReadVersion ( ) . 
get ( ) ) / ( double ) SERVER_KNOBS - > VERSIONS_PER_SECOND ; <nl> healthyZoneTimeout = delay ( timeoutSeconds ) ; <nl> if ( self - > healthyZone . get ( ) ! = p . first ) { <nl> TraceEvent ( " MaintenanceZoneStart " , self - > distributorId ) . detail ( " ZoneID " , printable ( p . first ) ) . detail ( " EndVersion " , p . second ) . detail ( " Duration " , timeoutSeconds ) ; <nl> self - > healthyZone . set ( p . first ) ; <nl> } <nl> - } else if ( self - > healthyZone . get ( ) . present ( ) ) { <nl> - TraceEvent ( " MaintenanceZoneEnd " , self - > distributorId ) ; <nl> + } else if ( self - > healthyZone . get ( ) . present ( ) ) { <nl> + / / maintenance hits timeout <nl> + TraceEvent ( " MaintenanceZoneEndTimeout " , self - > distributorId ) ; <nl> self - > healthyZone . set ( Optional < Key > ( ) ) ; <nl> } <nl> } else if ( self - > healthyZone . get ( ) . present ( ) ) { <nl> - TraceEvent ( " MaintenanceZoneEnd " , self - > distributorId ) ; <nl> + / / ` healthyZone ` has been cleared <nl> + if ( self - > healthyZone . get ( ) . get ( ) = = ignoreSSFailuresZoneString ) { <nl> + TraceEvent ( " DataDistributionDisabledForStorageServerFailuresEnd " , self - > distributorId ) ; <nl> + } else { <nl> + TraceEvent ( " MaintenanceZoneEndManualClear " , self - > distributorId ) ; <nl> + } <nl> self - > healthyZone . set ( Optional < Key > ( ) ) ; <nl> } <nl> - <nl> + <nl> state Future < Void > watchFuture = tr . watch ( healthyZoneKey ) ; <nl> wait ( tr . commit ( ) ) ; <nl> wait ( watchFuture | | healthyZoneTimeout ) ; <nl> ACTOR Future < Void > waitForAllDataRemoved ( Database cx , UID serverID , Version add <nl> } <nl> } <nl> <nl> - ACTOR Future < Void > storageServerFailureTracker ( <nl> - DDTeamCollection * self , <nl> - TCServerInfo * server , <nl> - Database cx , <nl> - ServerStatus * status , <nl> - Version addedVersion ) <nl> - { <nl> + ACTOR Future < Void > storageServerFailureTracker ( DDTeamCollection * self , TCServerInfo * server , Database cx , <nl> + ServerStatus * status , Version addedVersion ) { <nl> state StorageServerInterface interf = server - > lastKnownInterface ; <nl> state int targetTeamNumPerServer = ( SERVER_KNOBS - > DESIRED_TEAMS_PER_SERVER * ( self - > configuration . storageTeamSize + 1 ) ) / 2 ; <nl> loop { <nl> - state bool inHealthyZone = self - > healthyZone . get ( ) . present ( ) & & interf . locality . zoneId ( ) = = self - > healthyZone . get ( ) ; <nl> - if ( inHealthyZone ) { <nl> - status - > isFailed = false ; <nl> + state bool inHealthyZone = false ; / / healthChanged actor will be Never ( ) if this flag is true <nl> + if ( self - > healthyZone . get ( ) . present ( ) ) { <nl> + if ( interf . locality . zoneId ( ) = = self - > healthyZone . get ( ) ) { <nl> + status - > isFailed = false ; <nl> + inHealthyZone = true ; <nl> + } else if ( self - > healthyZone . get ( ) . get ( ) = = ignoreSSFailuresZoneString ) { <nl> + / / Ignore all SS failures <nl> + status - > isFailed = false ; <nl> + inHealthyZone = true ; <nl> + TraceEvent ( " SSFailureTracker " , self - > distributorId ) <nl> + . suppressFor ( 1 . 0 ) <nl> + . detail ( " IgnoredFailure " , " BeforeChooseWhen " ) <nl> + . detail ( " ServerID " , interf . id ( ) ) <nl> + . detail ( " Status " , status - > toString ( ) ) ; <nl> + } <nl> } <nl> <nl> if ( self - > server_status . get ( interf . id ( ) ) . initialized ) { <nl> ACTOR Future < Void > storageServerFailureTracker ( <nl> if ( ! status - > isFailed & & ( server - > teams . 
size ( ) < targetTeamNumPerServer | | self - > lastBuildTeamsFailed ) ) { <nl> self - > doBuildTeams = true ; <nl> } <nl> - if ( status - > isFailed & & self - > healthyZone . get ( ) . present ( ) & & self - > clearHealthyZoneFuture . isReady ( ) ) { <nl> - self - > clearHealthyZoneFuture = clearHealthyZone ( self - > cx ) ; <nl> - TraceEvent ( " MaintenanceZoneCleared " , self - > distributorId ) ; <nl> - self - > healthyZone . set ( Optional < Key > ( ) ) ; <nl> + if ( status - > isFailed & & self - > healthyZone . get ( ) . present ( ) ) { <nl> + if ( self - > healthyZone . get ( ) . get ( ) = = ignoreSSFailuresZoneString ) { <nl> + / / Ignore the failed storage server <nl> + TraceEvent ( " SSFailureTracker " , self - > distributorId ) <nl> + . detail ( " IgnoredFailure " , " InsideChooseWhen " ) <nl> + . detail ( " ServerID " , interf . id ( ) ) <nl> + . detail ( " Status " , status - > toString ( ) ) ; <nl> + status - > isFailed = false ; <nl> + } else if ( self - > clearHealthyZoneFuture . isReady ( ) ) { <nl> + self - > clearHealthyZoneFuture = clearHealthyZone ( self - > cx ) ; <nl> + TraceEvent ( " MaintenanceZoneCleared " , self - > distributorId ) ; <nl> + self - > healthyZone . set ( Optional < Key > ( ) ) ; <nl> + } <nl> } <nl> - <nl> - TraceEvent ( " StatusMapChange " , self - > distributorId ) . detail ( " ServerID " , interf . id ( ) ) . detail ( " Status " , status - > toString ( ) ) <nl> - . detail ( " Available " , IFailureMonitor : : failureMonitor ( ) . getState ( interf . waitFailure . getEndpoint ( ) ) . isAvailable ( ) ) ; <nl> } <nl> when ( wait ( status - > isUnhealthy ( ) ? waitForAllDataRemoved ( cx , interf . id ( ) , addedVersion , self ) : Never ( ) ) ) { break ; } <nl> when ( wait ( self - > healthyZone . onChange ( ) ) ) { } <nl> } <nl> } <nl> <nl> - return Void ( ) ; <nl> + return Void ( ) ; / / Don ' t ignore failures <nl> } <nl> <nl> / / Check the status of a storage server . <nl> ACTOR Future < Void > storageServerTracker ( <nl> otherChanges . push_back ( self - > excludedServers . onChange ( addr ) ) ; <nl> otherChanges . push_back ( self - > excludedServers . onChange ( ipaddr ) ) ; <nl> <nl> - failureTracker = storageServerFailureTracker ( self , server , cx , & status , addedVersion ) ; <nl> - <nl> + failureTracker = storageServerFailureTracker ( self , server , cx , & status , addedVersion ) ; <nl> / / We need to recruit new storage servers if the key value store type has changed <nl> if ( hasWrongStoreTypeOrDC ) <nl> self - > restartRecruiting . trigger ( ) ; <nl> ACTOR Future < Void > storageServerTracker ( <nl> <nl> state bool recordTeamCollectionInfo = false ; <nl> choose { <nl> - when ( wait ( failureTracker ) ) { <nl> + when ( wait ( failureTracker ) ) { <nl> / / The server is failed AND all data has been removed from it , so permanently remove it . <nl> TraceEvent ( " StatusMapChange " , self - > distributorId ) . detail ( " ServerID " , server - > id ) . detail ( " Status " , " Removing " ) ; <nl> <nl> mmm a / fdbserver / DataDistribution . actor . h <nl> ppp b / fdbserver / DataDistribution . actor . 
h <nl> struct TeamCollectionInterface { <nl> class ShardsAffectedByTeamFailure : public ReferenceCounted < ShardsAffectedByTeamFailure > { <nl> public : <nl> ShardsAffectedByTeamFailure ( ) { } <nl> - <nl> + <nl> struct Team { <nl> vector < UID > servers ; / / sorted <nl> bool primary ; <nl> class ShardsAffectedByTeamFailure : public ReferenceCounted < ShardsAffectedByTeam <nl> <nl> bool operator < ( const Team & r ) const { <nl> if ( servers = = r . servers ) return primary < r . primary ; <nl> - return servers < r . servers ; <nl> + return servers < r . servers ; <nl> } <nl> bool operator = = ( const Team & r ) const { <nl> return servers = = r . servers & & primary = = r . primary ; <nl> struct InitialDataDistribution : ReferenceCounted < InitialDataDistribution > { <nl> std : : set < vector < UID > > primaryTeams ; <nl> std : : set < vector < UID > > remoteTeams ; <nl> vector < DDShardInfo > shards ; <nl> + Optional < Key > initHealthyZoneValue ; <nl> } ; <nl> <nl> Future < Void > dataDistributionTracker ( <nl> mmm a / fdbserver / DataDistributionQueue . actor . cpp <nl> ppp b / fdbserver / DataDistributionQueue . actor . cpp <nl> struct RelocateData { <nl> mergeWantsNewServers ( rs . keys , rs . priority ) ) , interval ( " QueuedRelocation " ) { } <nl> <nl> static bool mergeWantsNewServers ( KeyRangeRef keys , int priority ) { <nl> - return priority = = PRIORITY_MERGE_SHARD & & <nl> - ( SERVER_KNOBS - > MERGE_ONTO_NEW_TEAM = = 2 | | <nl> - ( SERVER_KNOBS - > MERGE_ONTO_NEW_TEAM = = 1 & & keys . begin . startsWith ( LiteralStringRef ( " \ xff " ) ) ) ) ; <nl> + return priority = = PRIORITY_MERGE_SHARD & & <nl> + ( SERVER_KNOBS - > MERGE_ONTO_NEW_TEAM = = 2 | | <nl> + ( SERVER_KNOBS - > MERGE_ONTO_NEW_TEAM = = 1 & & keys . begin . startsWith ( LiteralStringRef ( " \ xff " ) ) ) ) ; <nl> } <nl> <nl> bool operator > ( const RelocateData & rhs ) const { <nl> struct DDQueueData { <nl> } <nl> <nl> / / If the size of keyServerEntries is large , then just assume we are using all storage servers <nl> - / / Why the size can be large ? <nl> + / / Why the size can be large ? <nl> / / When a shard is inflight and DD crashes , some destination servers may have already got the data . <nl> / / The new DD will treat the destination servers as source servers . So the size can be large . <nl> else { <nl> ACTOR Future < bool > rebalanceTeams ( DDQueueData * self , int priority , Reference < ID <nl> } <nl> <nl> ACTOR Future < Void > BgDDMountainChopper ( DDQueueData * self , int teamCollectionIndex ) { <nl> - state double checkDelay = SERVER_KNOBS - > BG_DD_POLLING_INTERVAL ; <nl> + state double rebalancePollingInterval = SERVER_KNOBS - > BG_REBALANCE_POLLING_INTERVAL ; <nl> state int resetCount = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT ; <nl> + state Transaction tr ( self - > cx ) ; <nl> + state double lastRead = 0 ; <nl> + state bool skipCurrentLoop = false ; <nl> loop { <nl> - wait ( delay ( checkDelay , TaskPriority : : DataDistributionLaunch ) ) ; <nl> - if ( self - > priority_relocations [ PRIORITY_REBALANCE_OVERUTILIZED_TEAM ] < SERVER_KNOBS - > DD_REBALANCE_PARALLELISM ) { <nl> - state Optional < Reference < IDataDistributionTeam > > randomTeam = wait ( brokenPromiseToNever ( self - > teamCollections [ teamCollectionIndex ] . getTeam . getReply ( GetTeamRequest ( true , false , true ) ) ) ) ; <nl> - if ( randomTeam . present ( ) ) { <nl> - if ( randomTeam . 
get ( ) - > getMinFreeSpaceRatio ( ) > SERVER_KNOBS - > FREE_SPACE_RATIO_DD_CUTOFF ) { <nl> - state Optional < Reference < IDataDistributionTeam > > loadedTeam = wait ( brokenPromiseToNever ( self - > teamCollections [ teamCollectionIndex ] . getTeam . getReply ( GetTeamRequest ( true , true , false ) ) ) ) ; <nl> - if ( loadedTeam . present ( ) ) { <nl> - bool moved = wait ( rebalanceTeams ( self , PRIORITY_REBALANCE_OVERUTILIZED_TEAM , loadedTeam . get ( ) , randomTeam . get ( ) , teamCollectionIndex = = 0 ) ) ; <nl> - if ( moved ) { <nl> - resetCount = 0 ; <nl> - } else { <nl> - resetCount + + ; <nl> + try { <nl> + state Future < Void > delayF = delay ( rebalancePollingInterval , TaskPriority : : DataDistributionLaunch ) ; <nl> + if ( ( now ( ) - lastRead ) > SERVER_KNOBS - > BG_REBALANCE_SWITCH_CHECK_INTERVAL ) { <nl> + tr . setOption ( FDBTransactionOptions : : LOCK_AWARE ) ; <nl> + Optional < Value > val = wait ( tr . get ( rebalanceDDIgnoreKey ) ) ; <nl> + lastRead = now ( ) ; <nl> + if ( skipCurrentLoop & & ! val . present ( ) ) { <nl> + / / reset loop interval <nl> + rebalancePollingInterval = SERVER_KNOBS - > BG_REBALANCE_POLLING_INTERVAL ; <nl> + } <nl> + skipCurrentLoop = val . present ( ) ; <nl> + } <nl> + wait ( delayF ) ; <nl> + if ( skipCurrentLoop ) { <nl> + / / set loop interval to avoid busy wait here . <nl> + rebalancePollingInterval = <nl> + std : : max ( rebalancePollingInterval , SERVER_KNOBS - > BG_REBALANCE_SWITCH_CHECK_INTERVAL ) ; <nl> + continue ; <nl> + } <nl> + if ( self - > priority_relocations [ PRIORITY_REBALANCE_OVERUTILIZED_TEAM ] < <nl> + SERVER_KNOBS - > DD_REBALANCE_PARALLELISM ) { <nl> + state Optional < Reference < IDataDistributionTeam > > randomTeam = wait ( brokenPromiseToNever ( <nl> + self - > teamCollections [ teamCollectionIndex ] . getTeam . getReply ( GetTeamRequest ( true , false , true ) ) ) ) ; <nl> + if ( randomTeam . present ( ) ) { <nl> + if ( randomTeam . get ( ) - > getMinFreeSpaceRatio ( ) > SERVER_KNOBS - > FREE_SPACE_RATIO_DD_CUTOFF ) { <nl> + state Optional < Reference < IDataDistributionTeam > > loadedTeam = <nl> + wait ( brokenPromiseToNever ( self - > teamCollections [ teamCollectionIndex ] . getTeam . getReply ( <nl> + GetTeamRequest ( true , true , false ) ) ) ) ; <nl> + if ( loadedTeam . present ( ) ) { <nl> + bool moved = <nl> + wait ( rebalanceTeams ( self , PRIORITY_REBALANCE_OVERUTILIZED_TEAM , loadedTeam . get ( ) , <nl> + randomTeam . 
get ( ) , teamCollectionIndex = = 0 ) ) ; <nl> + if ( moved ) { <nl> + resetCount = 0 ; <nl> + } else { <nl> + resetCount + + ; <nl> + } <nl> } <nl> } <nl> } <nl> } <nl> - } <nl> <nl> - if ( now ( ) - ( * self - > lastLimited ) < SERVER_KNOBS - > BG_DD_SATURATION_DELAY ) { <nl> - checkDelay = std : : min ( SERVER_KNOBS - > BG_DD_MAX_WAIT , checkDelay * SERVER_KNOBS - > BG_DD_INCREASE_RATE ) ; <nl> - } else { <nl> - checkDelay = std : : max ( SERVER_KNOBS - > BG_DD_MIN_WAIT , checkDelay / SERVER_KNOBS - > BG_DD_DECREASE_RATE ) ; <nl> - } <nl> + if ( now ( ) - ( * self - > lastLimited ) < SERVER_KNOBS - > BG_DD_SATURATION_DELAY ) { <nl> + rebalancePollingInterval = std : : min ( SERVER_KNOBS - > BG_DD_MAX_WAIT , <nl> + rebalancePollingInterval * SERVER_KNOBS - > BG_DD_INCREASE_RATE ) ; <nl> + } else { <nl> + rebalancePollingInterval = std : : max ( SERVER_KNOBS - > BG_DD_MIN_WAIT , <nl> + rebalancePollingInterval / SERVER_KNOBS - > BG_DD_DECREASE_RATE ) ; <nl> + } <nl> <nl> - if ( resetCount > = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT & & checkDelay < SERVER_KNOBS - > BG_DD_POLLING_INTERVAL ) { <nl> - checkDelay = SERVER_KNOBS - > BG_DD_POLLING_INTERVAL ; <nl> - resetCount = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT ; <nl> + if ( resetCount > = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT & & <nl> + rebalancePollingInterval < SERVER_KNOBS - > BG_REBALANCE_POLLING_INTERVAL ) { <nl> + rebalancePollingInterval = SERVER_KNOBS - > BG_REBALANCE_POLLING_INTERVAL ; <nl> + resetCount = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT ; <nl> + } <nl> + tr . reset ( ) ; <nl> + } catch ( Error & e ) { <nl> + wait ( tr . onError ( e ) ) ; <nl> } <nl> } <nl> } <nl> <nl> ACTOR Future < Void > BgDDValleyFiller ( DDQueueData * self , int teamCollectionIndex ) { <nl> - state double checkDelay = SERVER_KNOBS - > BG_DD_POLLING_INTERVAL ; <nl> + state double rebalancePollingInterval = SERVER_KNOBS - > BG_REBALANCE_POLLING_INTERVAL ; <nl> state int resetCount = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT ; <nl> + state Transaction tr ( self - > cx ) ; <nl> + state double lastRead = 0 ; <nl> + state bool skipCurrentLoop = false ; <nl> loop { <nl> - wait ( delay ( checkDelay , TaskPriority : : DataDistributionLaunch ) ) ; <nl> - if ( self - > priority_relocations [ PRIORITY_REBALANCE_UNDERUTILIZED_TEAM ] < SERVER_KNOBS - > DD_REBALANCE_PARALLELISM ) { <nl> - state Optional < Reference < IDataDistributionTeam > > randomTeam = wait ( brokenPromiseToNever ( self - > teamCollections [ teamCollectionIndex ] . getTeam . getReply ( GetTeamRequest ( true , false , false ) ) ) ) ; <nl> - if ( randomTeam . present ( ) ) { <nl> - state Optional < Reference < IDataDistributionTeam > > unloadedTeam = wait ( brokenPromiseToNever ( self - > teamCollections [ teamCollectionIndex ] . getTeam . getReply ( GetTeamRequest ( true , true , true ) ) ) ) ; <nl> - if ( unloadedTeam . present ( ) ) { <nl> - if ( unloadedTeam . get ( ) - > getMinFreeSpaceRatio ( ) > SERVER_KNOBS - > FREE_SPACE_RATIO_DD_CUTOFF ) { <nl> - bool moved = wait ( rebalanceTeams ( self , PRIORITY_REBALANCE_UNDERUTILIZED_TEAM , randomTeam . get ( ) , unloadedTeam . get ( ) , teamCollectionIndex = = 0 ) ) ; <nl> - if ( moved ) { <nl> - resetCount = 0 ; <nl> - } else { <nl> - resetCount + + ; <nl> + try { <nl> + state Future < Void > delayF = delay ( rebalancePollingInterval , TaskPriority : : DataDistributionLaunch ) ; <nl> + if ( ( now ( ) - lastRead ) > SERVER_KNOBS - > BG_REBALANCE_SWITCH_CHECK_INTERVAL ) { <nl> + tr . 
setOption ( FDBTransactionOptions : : LOCK_AWARE ) ; <nl> + Optional < Value > val = wait ( tr . get ( rebalanceDDIgnoreKey ) ) ; <nl> + lastRead = now ( ) ; <nl> + if ( skipCurrentLoop & & ! val . present ( ) ) { <nl> + / / reset loop interval <nl> + rebalancePollingInterval = SERVER_KNOBS - > BG_REBALANCE_POLLING_INTERVAL ; <nl> + } <nl> + skipCurrentLoop = val . present ( ) ; <nl> + } <nl> + wait ( delayF ) ; <nl> + if ( skipCurrentLoop ) { <nl> + / / set loop interval to avoid busy wait here . <nl> + rebalancePollingInterval = <nl> + std : : max ( rebalancePollingInterval , SERVER_KNOBS - > BG_REBALANCE_SWITCH_CHECK_INTERVAL ) ; <nl> + continue ; <nl> + } <nl> + if ( self - > priority_relocations [ PRIORITY_REBALANCE_UNDERUTILIZED_TEAM ] < <nl> + SERVER_KNOBS - > DD_REBALANCE_PARALLELISM ) { <nl> + state Optional < Reference < IDataDistributionTeam > > randomTeam = wait ( brokenPromiseToNever ( <nl> + self - > teamCollections [ teamCollectionIndex ] . getTeam . getReply ( GetTeamRequest ( true , false , false ) ) ) ) ; <nl> + if ( randomTeam . present ( ) ) { <nl> + state Optional < Reference < IDataDistributionTeam > > unloadedTeam = wait ( brokenPromiseToNever ( <nl> + self - > teamCollections [ teamCollectionIndex ] . getTeam . getReply ( GetTeamRequest ( true , true , true ) ) ) ) ; <nl> + if ( unloadedTeam . present ( ) ) { <nl> + if ( unloadedTeam . get ( ) - > getMinFreeSpaceRatio ( ) > SERVER_KNOBS - > FREE_SPACE_RATIO_DD_CUTOFF ) { <nl> + bool moved = <nl> + wait ( rebalanceTeams ( self , PRIORITY_REBALANCE_UNDERUTILIZED_TEAM , randomTeam . get ( ) , <nl> + unloadedTeam . get ( ) , teamCollectionIndex = = 0 ) ) ; <nl> + if ( moved ) { <nl> + resetCount = 0 ; <nl> + } else { <nl> + resetCount + + ; <nl> + } <nl> } <nl> } <nl> } <nl> } <nl> - } <nl> <nl> - if ( now ( ) - ( * self - > lastLimited ) < SERVER_KNOBS - > BG_DD_SATURATION_DELAY ) { <nl> - checkDelay = std : : min ( SERVER_KNOBS - > BG_DD_MAX_WAIT , checkDelay * SERVER_KNOBS - > BG_DD_INCREASE_RATE ) ; <nl> - } else { <nl> - checkDelay = std : : max ( SERVER_KNOBS - > BG_DD_MIN_WAIT , checkDelay / SERVER_KNOBS - > BG_DD_DECREASE_RATE ) ; <nl> - } <nl> + if ( now ( ) - ( * self - > lastLimited ) < SERVER_KNOBS - > BG_DD_SATURATION_DELAY ) { <nl> + rebalancePollingInterval = std : : min ( SERVER_KNOBS - > BG_DD_MAX_WAIT , <nl> + rebalancePollingInterval * SERVER_KNOBS - > BG_DD_INCREASE_RATE ) ; <nl> + } else { <nl> + rebalancePollingInterval = std : : max ( SERVER_KNOBS - > BG_DD_MIN_WAIT , <nl> + rebalancePollingInterval / SERVER_KNOBS - > BG_DD_DECREASE_RATE ) ; <nl> + } <nl> <nl> - if ( resetCount > = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT & & checkDelay < SERVER_KNOBS - > BG_DD_POLLING_INTERVAL ) { <nl> - checkDelay = SERVER_KNOBS - > BG_DD_POLLING_INTERVAL ; <nl> - resetCount = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT ; <nl> + if ( resetCount > = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT & & <nl> + rebalancePollingInterval < SERVER_KNOBS - > BG_REBALANCE_POLLING_INTERVAL ) { <nl> + rebalancePollingInterval = SERVER_KNOBS - > BG_REBALANCE_POLLING_INTERVAL ; <nl> + resetCount = SERVER_KNOBS - > DD_REBALANCE_RESET_AMOUNT ; <nl> + } <nl> + tr . reset ( ) ; <nl> + } catch ( Error & e ) { <nl> + wait ( tr . onError ( e ) ) ; <nl> } <nl> } <nl> } <nl> mmm a / fdbserver / Knobs . cpp <nl> ppp b / fdbserver / Knobs . 
cpp <nl> ServerKnobs const * SERVER_KNOBS = new ServerKnobs ( ) ; <nl> # define init ( knob , value ) initKnob ( knob , value , # knob ) <nl> <nl> ServerKnobs : : ServerKnobs ( bool randomize , ClientKnobs * clientKnobs ) { <nl> + / / clang - format off <nl> / / Versions <nl> init ( VERSIONS_PER_SECOND , 1e6 ) ; <nl> init ( MAX_VERSIONS_IN_FLIGHT , 100 * VERSIONS_PER_SECOND ) ; <nl> ServerKnobs : : ServerKnobs ( bool randomize , ClientKnobs * clientKnobs ) { <nl> / / Data distribution queue <nl> init ( HEALTH_POLL_TIME , 1 . 0 ) ; <nl> init ( BEST_TEAM_STUCK_DELAY , 1 . 0 ) ; <nl> - init ( BG_DD_POLLING_INTERVAL , 10 . 0 ) ; <nl> + init ( BG_REBALANCE_POLLING_INTERVAL , 10 . 0 ) ; <nl> + init ( BG_REBALANCE_SWITCH_CHECK_INTERVAL , 5 . 0 ) ; if ( randomize & & BUGGIFY ) BG_REBALANCE_SWITCH_CHECK_INTERVAL = 1 . 0 ; <nl> init ( DD_QUEUE_LOGGING_INTERVAL , 5 . 0 ) ; <nl> init ( RELOCATION_PARALLELISM_PER_SOURCE_SERVER , 2 ) ; if ( randomize & & BUGGIFY ) RELOCATION_PARALLELISM_PER_SOURCE_SERVER = 1 ; <nl> init ( DD_QUEUE_MAX_KEY_SERVERS , 100 ) ; if ( randomize & & BUGGIFY ) DD_QUEUE_MAX_KEY_SERVERS = 1 ; <nl> ServerKnobs : : ServerKnobs ( bool randomize , ClientKnobs * clientKnobs ) { <nl> init ( DURABILITY_LAG_REDUCTION_RATE , 0 . 9999 ) ; <nl> init ( DURABILITY_LAG_INCREASE_RATE , 1 . 001 ) ; <nl> init ( STORAGE_SERVER_LIST_FETCH_TIMEOUT , 20 . 0 ) ; <nl> - <nl> + <nl> / / Storage Metrics <nl> init ( STORAGE_METRICS_AVERAGE_INTERVAL , 120 . 0 ) ; <nl> init ( STORAGE_METRICS_AVERAGE_INTERVAL_PER_KSECONDS , 1000 . 0 / STORAGE_METRICS_AVERAGE_INTERVAL ) ; / / milliHz ! <nl> ServerKnobs : : ServerKnobs ( bool randomize , ClientKnobs * clientKnobs ) { <nl> init ( TIME_KEEPER_DELAY , 10 ) ; <nl> init ( TIME_KEEPER_MAX_ENTRIES , 3600 * 24 * 30 * 6 ) ; if ( randomize & & BUGGIFY ) { TIME_KEEPER_MAX_ENTRIES = 2 ; } <nl> <nl> + / / clang - format on <nl> + <nl> if ( clientKnobs ) <nl> clientKnobs - > IS_ACCEPTABLE_DELAY = clientKnobs - > IS_ACCEPTABLE_DELAY * std : : min ( MAX_READ_TRANSACTION_LIFE_VERSIONS , MAX_WRITE_TRANSACTION_LIFE_VERSIONS ) / ( 5 . 0 * VERSIONS_PER_SECOND ) ; <nl> } <nl> mmm a / fdbserver / Knobs . h <nl> ppp b / fdbserver / Knobs . h <nl> class ServerKnobs : public Knobs { <nl> / / Data distribution queue <nl> double HEALTH_POLL_TIME ; <nl> double BEST_TEAM_STUCK_DELAY ; <nl> - double BG_DD_POLLING_INTERVAL ; <nl> + double BG_REBALANCE_POLLING_INTERVAL ; <nl> + double BG_REBALANCE_SWITCH_CHECK_INTERVAL ; <nl> double DD_QUEUE_LOGGING_INTERVAL ; <nl> double RELOCATION_PARALLELISM_PER_SOURCE_SERVER ; <nl> int DD_QUEUE_MAX_KEY_SERVERS ; <nl> mmm a / fdbserver / Status . actor . cpp <nl> ppp b / fdbserver / Status . actor . 
cpp <nl> struct RolesInfo { <nl> <nl> ACTOR static Future < JsonBuilderObject > processStatusFetcher ( <nl> Reference < AsyncVar < struct ServerDBInfo > > db , std : : vector < WorkerDetails > workers , WorkerEvents pMetrics , <nl> - WorkerEvents mMetrics , WorkerEvents nMetrics , WorkerEvents errors , WorkerEvents traceFileOpenErrors , <nl> + WorkerEvents mMetrics , WorkerEvents nMetrics , WorkerEvents errors , WorkerEvents traceFileOpenErrors , <nl> WorkerEvents programStarts , std : : map < std : : string , std : : vector < JsonBuilderObject > > processIssues , <nl> vector < std : : pair < StorageServerInterface , EventMap > > storageServers , <nl> vector < std : : pair < TLogInterface , EventMap > > tLogs , vector < std : : pair < MasterProxyInterface , EventMap > > proxies , <nl> static JsonBuilderObject clientStatusFetcher ( std : : map < NetworkAddress , std : : pair < <nl> std : : map < Standalone < ClientVersionRef > , ClientStats > supportedVersions ; <nl> std : : map < Key , ClientStats > maxSupportedProtocol ; <nl> <nl> - <nl> + <nl> for ( auto iter = clientStatusMap - > begin ( ) ; iter ! = clientStatusMap - > end ( ) ; + + iter ) { <nl> if ( now ( ) - iter - > second . first < 2 * SERVER_KNOBS - > COORDINATOR_REGISTER_INTERVAL ) { <nl> clientCount + = iter - > second . second . clientCount ; <nl> ACTOR static Future < Void > consistencyCheckStatusFetcher ( Database cx , JsonBuilder <nl> break ; <nl> } catch ( Error & e ) { <nl> if ( e . code ( ) = = error_code_timed_out ) { <nl> - messages - > push_back ( JsonString : : makeMessage ( " consistencycheck_suspendkey_fetch_timeout " , <nl> + messages - > push_back ( JsonString : : makeMessage ( " consistencycheck_suspendkey_fetch_timeout " , <nl> format ( " Timed out trying to fetch ` % s ` from the database . " , printable ( fdbShouldConsistencyCheckBeSuspended ) . c_str ( ) ) . c_str ( ) ) ) ; <nl> break ; <nl> } <nl> struct LoadConfigurationResult { <nl> bool fullReplication ; <nl> Optional < Key > healthyZone ; <nl> double healthyZoneSeconds ; <nl> + bool rebalanceDDIgnored ; <nl> <nl> - LoadConfigurationResult ( ) : fullReplication ( true ) , healthyZoneSeconds ( 0 ) { } <nl> + LoadConfigurationResult ( ) : fullReplication ( true ) , healthyZoneSeconds ( 0 ) , rebalanceDDIgnored ( false ) { } <nl> } ; <nl> <nl> ACTOR static Future < std : : pair < Optional < DatabaseConfiguration > , Optional < LoadConfigurationResult > > > loadConfiguration ( Database cx , JsonBuilderArray * messages , std : : set < std : : string > * status_incomplete_reasons ) { <nl> ACTOR static Future < std : : pair < Optional < DatabaseConfiguration > , Optional < LoadConfi <nl> replicasFutures . push_back ( tr . get ( datacenterReplicasKeyFor ( region . dcId ) ) ) ; <nl> } <nl> state Future < Optional < Value > > healthyZoneValue = tr . get ( healthyZoneKey ) ; <nl> + state Future < Optional < Value > > rebalanceDDIgnored = tr . get ( rebalanceDDIgnoreKey ) ; <nl> <nl> choose { <nl> - when ( wait ( waitForAll ( replicasFutures ) & & success ( healthyZoneValue ) ) ) { <nl> + when ( wait ( waitForAll ( replicasFutures ) & & success ( healthyZoneValue ) & & success ( rebalanceDDIgnored ) ) ) { <nl> int unreplicated = 0 ; <nl> for ( int i = 0 ; i < result . get ( ) . regions . size ( ) ; i + + ) { <nl> if ( ! replicasFutures [ i ] . get ( ) . present ( ) | | decodeDatacenterReplicasValue ( replicasFutures [ i ] . get ( ) . get ( ) ) < result . get ( ) . 
storageTeamSize ) { <nl> ACTOR static Future < std : : pair < Optional < DatabaseConfiguration > , Optional < LoadConfi <nl> res . healthyZoneSeconds = ( healthyZone . second - tr . getReadVersion ( ) . get ( ) ) / CLIENT_KNOBS - > CORE_VERSIONSPERSECOND ; <nl> } <nl> } <nl> + res . rebalanceDDIgnored = rebalanceDDIgnored . get ( ) . present ( ) ; <nl> loadResult = res ; <nl> } <nl> when ( wait ( getConfTimeout ) ) { <nl> ACTOR static Future < JsonBuilderObject > dataStatusFetcher ( WorkerDetails ddWorker , <nl> bool primary = inFlight . getInt ( " Primary " ) ; <nl> int highestPriority = inFlight . getInt ( " HighestPriority " ) ; <nl> <nl> - if ( movingHighestPriority < PRIORITY_TEAM_UNHEALTHY ) { <nl> + if ( movingHighestPriority < PRIORITY_TEAM_REDUNDANT ) { <nl> highestPriority = movingHighestPriority ; <nl> } else if ( partitionsInFlight > 0 ) { <nl> highestPriority = std : : max < int > ( highestPriority , PRIORITY_MERGE_SHARD ) ; <nl> static Future < vector < std : : pair < iface , EventMap > > > getServerMetrics ( vector < iface > <nl> <nl> ACTOR static Future < vector < std : : pair < StorageServerInterface , EventMap > > > getStorageServersAndMetrics ( Database cx , std : : unordered_map < NetworkAddress , WorkerInterface > address_workers ) { <nl> vector < StorageServerInterface > servers = wait ( timeoutError ( getStorageServers ( cx , true ) , 5 . 0 ) ) ; <nl> - vector < std : : pair < StorageServerInterface , EventMap > > results = wait ( getServerMetrics ( servers , address_workers , <nl> - std : : vector < std : : string > { " StorageMetrics " , " ReadLatencyMetrics " } ) ) ; <nl> + vector < std : : pair < StorageServerInterface , EventMap > > results = wait ( <nl> + getServerMetrics ( servers , address_workers , std : : vector < std : : string > { " StorageMetrics " , " ReadLatencyMetrics " } ) ) ; <nl> <nl> return results ; <nl> } <nl> <nl> ACTOR static Future < vector < std : : pair < TLogInterface , EventMap > > > getTLogsAndMetrics ( Reference < AsyncVar < struct ServerDBInfo > > db , std : : unordered_map < NetworkAddress , WorkerInterface > address_workers ) { <nl> vector < TLogInterface > servers = db - > get ( ) . logSystemConfig . allPresentLogs ( ) ; <nl> - vector < std : : pair < TLogInterface , EventMap > > results = wait ( getServerMetrics ( servers , address_workers , <nl> - std : : vector < std : : string > { " TLogMetrics " } ) ) ; <nl> + vector < std : : pair < TLogInterface , EventMap > > results = <nl> + wait ( getServerMetrics ( servers , address_workers , std : : vector < std : : string > { " TLogMetrics " } ) ) ; <nl> <nl> return results ; <nl> } <nl> ACTOR static Future < vector < std : : pair < MasterProxyInterface , EventMap > > > getProxie <nl> } <nl> } <nl> <nl> - vector < std : : pair < MasterProxyInterface , EventMap > > results = wait ( getServerMetrics ( servers , address_workers , <nl> - std : : vector < std : : string > { " GRVLatencyMetrics " , " CommitLatencyMetrics " } ) ) ; <nl> + vector < std : : pair < MasterProxyInterface , EventMap > > results = wait ( getServerMetrics ( <nl> + servers , address_workers , std : : vector < std : : string > { " GRVLatencyMetrics " , " CommitLatencyMetrics " } ) ) ; <nl> <nl> return results ; <nl> } <nl> ACTOR Future < StatusReply > clusterGetStatus ( <nl> if ( loadResult . present ( ) ) { <nl> statusObj [ " full_replication " ] = loadResult . get ( ) . fullReplication ; <nl> if ( loadResult . get ( ) . healthyZone . present ( ) ) { <nl> - statusObj [ " maintenance_zone " ] = loadResult . get ( ) . 
healthyZone . get ( ) . printable ( ) ; <nl> - statusObj [ " maintenance_seconds_remaining " ] = loadResult . get ( ) . healthyZoneSeconds ; <nl> + if ( loadResult . get ( ) . healthyZone . get ( ) ! = ignoreSSFailuresZoneString ) { <nl> + statusObj [ " maintenance_zone " ] = loadResult . get ( ) . healthyZone . get ( ) . printable ( ) ; <nl> + statusObj [ " maintenance_seconds_remaining " ] = loadResult . get ( ) . healthyZoneSeconds ; <nl> + } else { <nl> + statusObj [ " data_distribution_disabled_for_ss_failures " ] = true ; <nl> + } <nl> + } <nl> + if ( loadResult . get ( ) . rebalanceDDIgnored ) { <nl> + statusObj [ " data_distribution_disabled_for_rebalance " ] = true ; <nl> } <nl> } <nl> <nl> ACTOR Future < StatusReply > clusterGetStatus ( <nl> statusObj [ " layers " ] = layers ; <nl> } <nl> <nl> - JsonBuilderObject processStatus = wait ( processStatusFetcher ( db , workers , pMetrics , mMetrics , networkMetrics , <nl> - latestError , traceFileOpenErrors , programStarts , <nl> - processIssues , storageServers , tLogs , proxies , cx , <nl> - configuration , loadResult . present ( ) ? loadResult . get ( ) . healthyZone : Optional < Key > ( ) , <nl> + JsonBuilderObject processStatus = wait ( processStatusFetcher ( db , workers , pMetrics , mMetrics , networkMetrics , <nl> + latestError , traceFileOpenErrors , programStarts , <nl> + processIssues , storageServers , tLogs , proxies , cx , <nl> + configuration , loadResult . present ( ) ? loadResult . get ( ) . healthyZone : Optional < Key > ( ) , <nl> & status_incomplete_reasons ) ) ; <nl> statusObj [ " processes " ] = processStatus ; <nl> statusObj [ " clients " ] = clientStatusFetcher ( clientStatus ) ; <nl> TEST_CASE ( " / status / json / builderPerf " ) { <nl> printf ( " JsonBuilder : % 8lu bytes % - 7 . 5f gen + % - 7 . 5f serialize = % - 7 . 5f \ n " , s . size ( ) , generate , serialize , generate + serialize ) ; <nl> printf ( " json_spirit : % 8lu bytes % - 7 . 5f parse + % - 7 . 5f serialize = % - 7 . 5f \ n " , jsStr . size ( ) , jsParse , jsSerialize , jsParse + jsSerialize ) ; <nl> printf ( " \ n " ) ; <nl> - <nl> + <nl> generated + = generate ; <nl> serialized + = serialize ; <nl> bytes + = s . size ( ) ; <nl> mmm a / fdbserver / workloads / MachineAttrition . actor . cpp <nl> ppp b / fdbserver / workloads / MachineAttrition . actor . cpp <nl> static std : : set < int > const & normalAttritionErrors ( ) { <nl> return s ; <nl> } <nl> <nl> + ACTOR Future < Void > resetHealthyZoneAfter ( Database cx , double duration ) { <nl> + state Transaction tr ( cx ) ; <nl> + state Future < Void > delayF = delay ( duration ) ; <nl> + loop { <nl> + try { <nl> + tr . setOption ( FDBTransactionOptions : : LOCK_AWARE ) ; <nl> + wait ( delayF ) ; <nl> + tr . clear ( healthyZoneKey ) ; <nl> + wait ( tr . commit ( ) ) ; <nl> + return Void ( ) ; <nl> + } catch ( Error & e ) { <nl> + wait ( tr . onError ( e ) ) ; <nl> + } <nl> + } <nl> + } <nl> + <nl> struct MachineAttritionWorkload : TestWorkload { <nl> bool enabled ; <nl> int machinesToKill , machinesToLeave ; <nl> struct MachineAttritionWorkload : TestWorkload { <nl> <nl> / / decide on a machine to kill <nl> state LocalityData targetMachine = self - > machines . back ( ) ; <nl> - <nl> + state Future < Void > resetHealthyZone = Future < Void > ( Void ( ) ) ; <nl> if ( BUGGIFY_WITH_PROB ( 0 . 01 ) ) { <nl> TEST ( true ) ; / / Marked a zone for maintenance before killing it <nl> - wait ( setHealthyZone ( cx , targetMachine . zoneId ( ) . 
get ( ) , deterministicRandom ( ) - > random01 ( ) * 20 ) ) ; <nl> + bool _ = <nl> + wait ( setHealthyZone ( cx , targetMachine . zoneId ( ) . get ( ) , deterministicRandom ( ) - > random01 ( ) * 20 ) ) ; <nl> + / / } <nl> + } else if ( BUGGIFY_WITH_PROB ( 0 . 005 ) ) { <nl> + TEST ( true ) ; / / Disable DD for all storage server failures <nl> + bool _ = wait ( setHealthyZone ( cx , ignoreSSFailuresZoneString , <nl> + 0 ) ) ; / / duration doesn ' t matter since this won ' t timeout <nl> + resetHealthyZone = resetHealthyZoneAfter ( cx , deterministicRandom ( ) - > random01 ( ) * 5 ) ; <nl> } <nl> <nl> TraceEvent ( " Assassination " ) . detail ( " TargetMachine " , targetMachine . toString ( ) ) <nl> struct MachineAttritionWorkload : TestWorkload { <nl> if ( ! self - > replacement ) <nl> self - > machines . pop_back ( ) ; <nl> <nl> - wait ( delay ( meanDelay - delayBeforeKill ) ) ; <nl> + wait ( delay ( meanDelay - delayBeforeKill ) & & resetHealthyZone ) ; <nl> + <nl> delayBeforeKill = deterministicRandom ( ) - > random01 ( ) * meanDelay ; <nl> TraceEvent ( " WorkerKillAfterMeanDelay " ) . detail ( " DelayBeforeKill " , delayBeforeKill ) ; <nl> } <nl>
Merge pull request from dongxinEric/feature/1508/finer-grained-dd-controls
apple/foundationdb
7d7aa27c2d5202edc795bd224e1845eb710799b0
2019-08-01T00:36:20Z
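The FoundationDB change above adds two independent switches that an operator or tool can flip through the ManagementAPI additions: writing the special ignoreSSFailuresZoneString into healthyZoneKey makes data distribution ignore storage-server failures (it never times out on its own), while rebalanceDDIgnoreKey pauses the background rebalancers. Below is a minimal sketch of driving both switches with the functions declared in ManagementAPI.actor.h; the actor name is hypothetical, error handling is omitted, and success() is used only to discard the bool results — this is not code from the PR.

ACTOR Future<Void> runWithDDReactionsPaused(Database cx) {
    // Tell DD to ignore storage server failures; the duration argument is
    // irrelevant for this special zone string, so it never times out.
    wait(success(setHealthyZone(cx, ignoreSSFailuresZoneString, 0)));
    // Independently pause background rebalancing; BgDDMountainChopper and
    // BgDDValleyFiller skip their loops while rebalanceDDIgnoreKey is present.
    wait(setDDIgnoreRebalanceSwitch(cx, true));

    // ... perform the disruptive maintenance work here ...

    // clearSSFailureZoneString=true removes the special zone value, and turning
    // the rebalance switch off resumes background data movement.
    wait(success(clearHealthyZone(cx, false, true)));
    wait(setDDIgnoreRebalanceSwitch(cx, false));
    return Void();
}

As the setDDMode() hunk shows, re-enabling data distribution with mode 1 also clears both keys, so "datadistribution on" acts as a catch-all reset.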
mmm a / src / embind / embind . js <nl> ppp b / src / embind / embind . js <nl> function __embind_register_smart_ptr ( <nl> } ) ; <nl> } <nl> <nl> + function __embind_register_vector ( <nl> + vectorType , <nl> + elementType , <nl> + name , <nl> + constructor , <nl> + destructor , <nl> + length , <nl> + getter , <nl> + setter <nl> + ) { <nl> + name = Pointer_stringify ( name ) ; <nl> + elementType = requireRegisteredType ( elementType , ' vector ' + name ) ; <nl> + <nl> + constructor = FUNCTION_TABLE [ constructor ] ; <nl> + destructor = FUNCTION_TABLE [ destructor ] ; <nl> + length = FUNCTION_TABLE [ length ] ; <nl> + getter = FUNCTION_TABLE [ getter ] ; <nl> + setter = FUNCTION_TABLE [ setter ] ; <nl> + <nl> + registerType ( vectorType , name , { <nl> + name : name , <nl> + fromWireType : function ( ptr ) { <nl> + var arr = [ ] ; <nl> + var n = length ( ptr ) ; <nl> + <nl> + for ( var i = 0 ; i < n ; i + + ) { <nl> + var v = elementType . fromWireType ( getter ( ptr , i ) ) ; <nl> + arr . push ( v ) ; <nl> + } <nl> + <nl> + destructor ( ptr ) ; <nl> + return arr ; <nl> + } , <nl> + toWireType : function ( destructors , o ) { <nl> + var vec = constructor ( ) ; <nl> + for ( var val in o ) { <nl> + setter ( vec , elementType . toWireType ( destructors , o [ val ] ) ) ; <nl> + } <nl> + destructors . push ( destructor ) ; <nl> + destructors . push ( vec ) ; <nl> + return vec ; <nl> + } <nl> + } ) ; <nl> + } <nl> + <nl> function __embind_register_class ( <nl> classType , <nl> pointerType , <nl> mmm a / system / include / emscripten / bind . h <nl> ppp b / system / include / emscripten / bind . h <nl> <nl> # include < stddef . h > <nl> # include < assert . h > <nl> # include < string > <nl> + # include < vector > <nl> # include < type_traits > <nl> # include < emscripten / val . h > <nl> # include < emscripten / wire . h > <nl> namespace emscripten { <nl> GenericFunction destructor , <nl> GenericFunction getPointee ) ; <nl> <nl> + void _embind_register_vector ( <nl> + TYPEID vectorType , <nl> + TYPEID elementType , <nl> + const char * name , <nl> + GenericFunction constructor , <nl> + GenericFunction destructor , <nl> + GenericFunction length , <nl> + GenericFunction getter , <nl> + GenericFunction setter ) ; <nl> + <nl> void _embind_register_class ( <nl> TYPEID classType , <nl> TYPEID pointerType , <nl> namespace emscripten { <nl> setter ( ptr , FieldBinding : : fromWireType ( value ) ) ; <nl> } <nl> } ; <nl> + <nl> + template < typename VectorType > <nl> + struct Vector { <nl> + typedef typename VectorType : : value_type ElementType ; <nl> + typedef internal : : BindingType < ElementType > FieldBinding ; <nl> + typedef typename FieldBinding : : WireType WireType ; <nl> + <nl> + static int length ( <nl> + VectorType * ptr <nl> + ) { <nl> + return ( * ptr ) . size ( ) ; <nl> + } <nl> + <nl> + static WireType getAt ( <nl> + VectorType * ptr , <nl> + int pos <nl> + ) { <nl> + return FieldBinding : : toWireType ( ( * ptr ) . at ( pos ) ) ; <nl> + } <nl> + <nl> + static void push_back ( <nl> + VectorType * ptr , <nl> + WireType val <nl> + ) { <nl> + ( * ptr ) . 
push_back ( FieldBinding : : fromWireType ( val ) ) ; <nl> + } <nl> + } ; <nl> } <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> namespace emscripten { <nl> reinterpret_cast < internal : : GenericFunction > ( & internal : : get_pointee < PointerType > ) ) ; <nl> } <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / VECTORS <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + template < typename VectorType > <nl> + inline void register_vector ( const char * name ) { <nl> + typedef typename VectorType : : value_type ElementType ; <nl> + <nl> + internal : : registerStandardTypes ( ) ; <nl> + internal : : _embind_register_vector ( <nl> + internal : : TypeID < VectorType > : : get ( ) , <nl> + internal : : TypeID < ElementType > : : get ( ) , <nl> + name , <nl> + reinterpret_cast < internal : : GenericFunction > ( & internal : : raw_constructor < VectorType > ) , <nl> + reinterpret_cast < internal : : GenericFunction > ( & internal : : raw_destructor < VectorType > ) , <nl> + reinterpret_cast < internal : : GenericFunction > ( & internal : : Vector < VectorType > : : length ) , <nl> + reinterpret_cast < internal : : GenericFunction > ( & internal : : Vector < VectorType > : : getAt ) , <nl> + reinterpret_cast < internal : : GenericFunction > ( & internal : : Vector < VectorType > : : push_back ) <nl> + ) ; <nl> + } <nl> + <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> / / CLASSES <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl>
Add support for std::vector.
emscripten-core/emscripten
3c5dead6741c0a2f3268b04d1ee9a0262f368785
2013-04-12T11:21:41Z
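The embind addition routes std::vector through three per-type C++ thunks (length, getAt, push_back), so on the JS side a registered vector converts to and from a plain array. A rough sketch of registering two vector types against this revision; init_bindings is a placeholder entry point, since the diff does not show how registration functions are invoked in a user project.

#include <emscripten/bind.h>
#include <string>
#include <vector>

// Hypothetical registration hook; call it from wherever the project sets up
// its embind registrations.
void init_bindings() {
    // fromWireType builds a JS array element by element and then destroys the
    // temporary vector; toWireType does the reverse via push_back.
    emscripten::register_vector<std::vector<int> >("IntVector");
    emscripten::register_vector<std::vector<std::string> >("StringVector");
}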
mmm a / Kodi . xcodeproj / project . pbxproj <nl> ppp b / Kodi . xcodeproj / project . pbxproj <nl> <nl> 7CDAE9040FFCA3520040B25F / * DVDTSCorrection . h * / = { isa = PBXFileReference ; fileEncoding = 4 ; lastKnownFileType = sourcecode . c . h ; path = DVDTSCorrection . h ; sourceTree = " < group > " ; } ; <nl> 7CEBD8A60F33A0D800CAF6AD / * SpecialProtocolDirectory . cpp * / = { isa = PBXFileReference ; fileEncoding = 4 ; lastKnownFileType = sourcecode . cpp . cpp ; path = SpecialProtocolDirectory . cpp ; sourceTree = " < group > " ; } ; <nl> 7CEBD8A70F33A0D800CAF6AD / * SpecialProtocolDirectory . h * / = { isa = PBXFileReference ; fileEncoding = 4 ; lastKnownFileType = sourcecode . c . h ; path = SpecialProtocolDirectory . h ; sourceTree = " < group > " ; } ; <nl> + 7CEE107B1C970BB800E0D426 / * kodi_inputstream_dll . h * / = { isa = PBXFileReference ; fileEncoding = 4 ; lastKnownFileType = sourcecode . c . h ; name = kodi_inputstream_dll . h ; path = " kodi - addon - dev - kit / include / kodi / kodi_inputstream_dll . h " ; sourceTree = " < group > " ; } ; <nl> + 7CEE107C1C970BB800E0D426 / * kodi_inputstream_types . h * / = { isa = PBXFileReference ; fileEncoding = 4 ; lastKnownFileType = sourcecode . c . h ; name = kodi_inputstream_types . h ; path = " kodi - addon - dev - kit / include / kodi / kodi_inputstream_types . h " ; sourceTree = " < group > " ; } ; <nl> + 7CEE107D1C970BB800E0D426 / * libKODI_inputstream . h * / = { isa = PBXFileReference ; fileEncoding = 4 ; lastKnownFileType = sourcecode . c . h ; name = libKODI_inputstream . h ; path = " kodi - addon - dev - kit / include / kodi / libKODI_inputstream . h " ; sourceTree = " < group > " ; } ; <nl> 7CEE587C1B5A3FFB007C2B5A / * AudioDSPSettings . cpp * / = { isa = PBXFileReference ; fileEncoding = 4 ; lastKnownFileType = sourcecode . cpp . cpp ; path = AudioDSPSettings . cpp ; sourceTree = " < group > " ; } ; <nl> 7CF05049190A1D7200222135 / * FFmpeg . cpp * / = { isa = PBXFileReference ; fileEncoding = 4 ; lastKnownFileType = sourcecode . cpp . cpp ; path = FFmpeg . cpp ; sourceTree = " < group > " ; } ; <nl> 7CF0504A190A1D7200222135 / * FFmpeg . h * / = { isa = PBXFileReference ; fileEncoding = 4 ; lastKnownFileType = sourcecode . c . h ; path = FFmpeg . h ; sourceTree = " < group > " ; } ; <nl> <nl> EDE8C70F1C7F618500A86ECC / * kodi_audiodec_dll . h * / , <nl> EDE8C7101C7F618500A86ECC / * kodi_audiodec_types . h * / , <nl> EDE8C7111C7F618500A86ECC / * kodi_audioengine_types . h * / , <nl> + 7CEE107B1C970BB800E0D426 / * kodi_inputstream_dll . h * / , <nl> + 7CEE107C1C970BB800E0D426 / * kodi_inputstream_types . h * / , <nl> 68AE5BA71C92414B00C4D527 / * kodi_peripheral_callbacks . h * / , <nl> 68AE5BA81C92414B00C4D527 / * kodi_peripheral_dll . h * / , <nl> 68AE5BA91C92414B00C4D527 / * kodi_peripheral_types . h * / , <nl> <nl> EDE8C7151C7F618500A86ECC / * libKODI_guilib . h * / , <nl> EDE8C7161C7F618500A86ECC / * libXBMC_addon . h * / , <nl> EDE8C7171C7F618500A86ECC / * libXBMC_codec . h * / , <nl> + 7CEE107D1C970BB800E0D426 / * libKODI_inputstream . h * / , <nl> EDE8C7181C7F618500A86ECC / * libXBMC_pvr . h * / , <nl> EDE8C7191C7F618500A86ECC / * xbmc_addon_cpp_dll . h * / , <nl> EDE8C71A1C7F618500A86ECC / * xbmc_addon_dll . h * / , <nl> mmm a / addons / kodi . inputstream / addon . xml <nl> ppp b / addons / kodi . inputstream / addon . xml <nl> <nl> < ? xml version = " 1 . 0 " encoding = " UTF - 8 " ? > <nl> - < addon id = " kodi . inputstream " version = " 1 . 0 . 
0 " provider - name = " Team Kodi " > <nl> - < backwards - compatibility abi = " 1 . 0 . 0 " / > <nl> + < addon id = " kodi . inputstream " version = " 1 . 0 . 1 " provider - name = " Team Kodi " > <nl> + < backwards - compatibility abi = " 1 . 0 . 1 " / > <nl> < requires > <nl> < import addon = " xbmc . core " version = " 0 . 1 . 0 " / > <nl> < / requires > <nl> mmm a / xbmc / addons / AddonDll . h <nl> ppp b / xbmc / addons / AddonDll . h <nl> namespace ADDON <nl> TheProps * m_pInfo ; <nl> CAddonInterfaces * m_pHelpers ; <nl> bool m_bIsChild ; <nl> + std : : string m_parentLib ; <nl> <nl> private : <nl> TheDll * m_pDll ; <nl> CAddonDll < TheDll , TheStruct , TheProps > : : CAddonDll ( AddonProps props ) <nl> m_pInfo = NULL ; <nl> m_pHelpers = NULL ; <nl> m_needsavedsettings = false ; <nl> + m_parentLib . clear ( ) ; <nl> } <nl> <nl> template < class TheDll , typename TheStruct , typename TheProps > <nl> CAddonDll < TheDll , TheStruct , TheProps > : : CAddonDll ( const CAddonDll < TheDll , TheStr <nl> m_pInfo = rhs . m_pInfo ; <nl> m_pHelpers = rhs . m_pHelpers ; <nl> m_needsavedsettings = rhs . m_needsavedsettings ; <nl> + m_parentLib = rhs . m_parentLib ; <nl> } <nl> <nl> template < class TheDll , typename TheStruct , typename TheProps > <nl> bool CAddonDll < TheDll , TheStruct , TheProps > : : LoadDll ( ) <nl> <nl> XFILE : : CFile : : Copy ( libPath , strFileName ) ; <nl> <nl> + m_parentLib = libPath ; <nl> CLog : : Log ( LOGNOTICE , " ADDON : Loaded virtual child addon % s " , strFileName . c_str ( ) ) ; <nl> } <nl> <nl> mmm a / xbmc / addons / InputStream . cpp <nl> ppp b / xbmc / addons / InputStream . cpp <nl> <nl> # include " utils / log . h " <nl> # include " cores / VideoPlayer / DVDDemuxers / DVDDemux . h " <nl> # include " utils / RegExp . h " <nl> + # include " utils / URIUtils . h " <nl> <nl> namespace ADDON <nl> { <nl> bool CInputStream : : Open ( CFileItem & fileitem ) <nl> props . m_nCountInfoValues + + ; <nl> } <nl> props . m_strURL = fileitem . GetPath ( ) . c_str ( ) ; <nl> + props . m_libFolder = URIUtils : : GetDirectory ( m_parentLib ) . c_str ( ) ; <nl> <nl> bool ret = false ; <nl> try <nl> mmm a / xbmc / addons / kodi - addon - dev - kit / include / kodi / kodi_inputstream_types . h <nl> ppp b / xbmc / addons / kodi - addon - dev - kit / include / kodi / kodi_inputstream_types . h <nl> extern " C " { <nl> const char * m_strKey ; <nl> const char * m_strValue ; <nl> } m_ListItemProperties [ MAX_INFO_COUNT ] ; <nl> + <nl> + const char * m_libFolder ; <nl> } INPUTSTREAM ; <nl> <nl> / * ! <nl>
Merge pull request from FernetMenta/libpath
xbmc/xbmc
2a26ae40715198d2cb447b74aa8d5f544341f5d8
2016-03-14T18:48:05Z
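The Kodi change threads the directory of the parent add-on library through to input-stream add-ons: CAddonDll now remembers m_parentLib, CInputStream::Open() passes its folder as the new INPUTSTREAM::m_libFolder field, and the kodi.inputstream ABI is bumped to 1.0.1. A sketch of how an add-on compiled against the updated header might consume the field; the Open() entry point and the include path are assumptions based on the input-stream add-on API, not something shown in this diff.

#include <string>
#include "kodi/kodi_inputstream_types.h"

// Hypothetical add-on side Open(); the real signature lives in
// kodi_inputstream_dll.h, which is not part of this diff.
bool Open(INPUTSTREAM& props)
{
  // m_libFolder points at the folder containing the (parent) add-on library,
  // handy for locating helper libraries shipped next to it.
  std::string libFolder = props.m_libFolder ? props.m_libFolder : "";
  std::string url = props.m_strURL ? props.m_strURL : "";

  // ... load any bundled helper modules from libFolder, then open url ...
  return !url.empty();
}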
mmm a / xbmc / input / SDLJoystick . cpp <nl> ppp b / xbmc / input / SDLJoystick . cpp <nl> <nl> # include " settings / AdvancedSettings . h " <nl> # include " settings / Setting . h " <nl> # include " utils / log . h " <nl> + # include " utils / StringUtils . h " <nl> <nl> # include < math . h > <nl> <nl> void CJoystick : : Initialize ( ) <nl> continue ; <nl> } <nl> # endif <nl> - <nl> - m_Joysticks . push_back ( joy ) ; <nl> if ( joy ) <nl> { <nl> - m_JoystickNames . push_back ( string ( SDL_JoystickName ( i ) ) ) ; <nl> - CLog : : Log ( LOGNOTICE , " Enabled Joystick : % s " , SDL_JoystickName ( i ) ) ; <nl> - CLog : : Log ( LOGNOTICE , " Details : Total Axis : % d Total Hats : % d Total Buttons : % d " , <nl> - SDL_JoystickNumAxes ( joy ) , SDL_JoystickNumHats ( joy ) , SDL_JoystickNumButtons ( joy ) ) ; <nl> + / / Some ( Microsoft ) Keyboards are recognized as Joysticks by modern kernels <nl> + / / Don ' t enumerate them <nl> + / / https : / / bugs . launchpad . net / ubuntu / + source / linux / + bug / 390959 <nl> + / / NOTICE : Enabled Joystick : Microsoft Wired Keyboard 600 <nl> + / / Details : Total Axis : 37 Total Hats : 0 Total Buttons : 57 <nl> + int num_axis = SDL_JoystickNumAxes ( joy ) ; <nl> + if ( num_axis > 20 & & StringUtils : : FindWords ( SDL_JoystickName ( i ) , " keyboard " ) ! = std : : string : : npos ) <nl> + CLog : : Log ( LOGNOTICE , " Your Joystick is a Keyboard , ignoring it : % s Axis : % d " , SDL_JoystickName ( i ) , num_axis ) ; <nl> + else <nl> + { <nl> + m_JoystickNames . push_back ( string ( SDL_JoystickName ( i ) ) ) ; <nl> + CLog : : Log ( LOGNOTICE , " Enabled Joystick : % s " , SDL_JoystickName ( i ) ) ; <nl> + CLog : : Log ( LOGNOTICE , " Details : Total Axis : % d Total Hats : % d Total Buttons : % d " , <nl> + num_axis , SDL_JoystickNumHats ( joy ) , SDL_JoystickNumButtons ( joy ) ) ; <nl> + m_Joysticks . push_back ( joy ) ; <nl> + } <nl> } <nl> else <nl> { <nl>
Merge pull request from fritsch/microsoft-js
xbmc/xbmc
38c19ba69c5d160e0b647978fb396e6966b3641e
2013-11-10T07:09:55Z
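The SDLJoystick fix works around Linux kernels that expose some Microsoft keyboards as joysticks with dozens of axes (see the linked Launchpad bug): during enumeration, a device is skipped when it reports more than 20 axes and its name contains "keyboard". The same heuristic as a standalone helper, purely for illustration; StringUtils::FindWords is replaced here with a plain case-insensitive substring search, since that utility is not shown in the diff.

#include <algorithm>
#include <cctype>
#include <string>

// Returns true when an SDL-enumerated "joystick" is most likely a keyboard
// (https://bugs.launchpad.net/ubuntu/+source/linux/+bug/390959).
static bool IsProbablyKeyboard(const std::string& name, int numAxes)
{
  std::string lower(name);
  std::transform(lower.begin(), lower.end(), lower.begin(),
                 [](unsigned char c) { return static_cast<char>(std::tolower(c)); });
  // Real game controllers rarely expose more than a handful of axes; the
  // offending keyboards report dozens (37 in the example from the commit).
  return numAxes > 20 && lower.find("keyboard") != std::string::npos;
}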
mmm a / atom / renderer / atom_render_view_observer . cc <nl> ppp b / atom / renderer / atom_render_view_observer . cc <nl> <nl> # include " net / grit / net_resources . h " <nl> # include " third_party / WebKit / public / web / WebDocument . h " <nl> # include " third_party / WebKit / public / web / WebDraggableRegion . h " <nl> + # include " third_party / WebKit / public / web / WebElement . h " <nl> # include " third_party / WebKit / public / web / WebFrame . h " <nl> # include " third_party / WebKit / public / web / WebKit . h " <nl> # include " third_party / WebKit / public / web / WebLocalFrame . h " <nl> AtomRenderViewObserver : : AtomRenderViewObserver ( <nl> content : : RenderView * render_view , <nl> AtomRendererClient * renderer_client ) <nl> : content : : RenderViewObserver ( render_view ) , <nl> - renderer_client_ ( renderer_client ) , <nl> - document_created_ ( false ) { <nl> + renderer_client_ ( renderer_client ) { <nl> / / Initialise resource for directory listing . <nl> net : : NetModule : : SetResourceProvider ( NetResourceProvider ) ; <nl> } <nl> void AtomRenderViewObserver : : EmitIPCEvent ( blink : : WebFrame * frame , <nl> } <nl> } <nl> <nl> - void AtomRenderViewObserver : : DidCreateDocumentElement ( <nl> - blink : : WebLocalFrame * frame ) { <nl> - document_created_ = true ; <nl> - } <nl> - <nl> void AtomRenderViewObserver : : DraggableRegionsChanged ( blink : : WebFrame * frame ) { <nl> blink : : WebVector < blink : : WebDraggableRegion > webregions = <nl> frame - > GetDocument ( ) . DraggableRegions ( ) ; <nl> void AtomRenderViewObserver : : OnDestruct ( ) { <nl> void AtomRenderViewObserver : : OnBrowserMessage ( bool send_to_all , <nl> const base : : string16 & channel , <nl> const base : : ListValue & args ) { <nl> - if ( ! document_created_ ) <nl> - return ; <nl> - <nl> if ( ! render_view ( ) - > GetWebView ( ) ) <nl> return ; <nl> <nl> void AtomRenderViewObserver : : OnBrowserMessage ( bool send_to_all , <nl> if ( ! frame | | frame - > IsWebRemoteFrame ( ) ) <nl> return ; <nl> <nl> + / / Don ' t handle browser messages before document element is created . <nl> + / / When we receive a message from the browser , we try to transfer it <nl> + / / to a web page , and when we do that Blink creates an empty <nl> + / / document element if it hasn ' t been created yet , and it makes our init <nl> + / / script to run while ` window . location ` is still " about : blank " . <nl> + blink : : WebDocument document = frame - > GetDocument ( ) ; <nl> + blink : : WebElement html_element = document . DocumentElement ( ) ; <nl> + if ( html_element . IsNull ( ) ) { <nl> + return ; <nl> + } <nl> + <nl> EmitIPCEvent ( frame , channel , args ) ; <nl> <nl> / / Also send the message to all sub - frames . <nl> mmm a / atom / renderer / atom_render_view_observer . h <nl> ppp b / atom / renderer / atom_render_view_observer . h <nl> class AtomRenderViewObserver : public content : : RenderViewObserver { <nl> <nl> private : <nl> / / content : : RenderViewObserver implementation . <nl> - void DidCreateDocumentElement ( blink : : WebLocalFrame * frame ) override ; <nl> void DraggableRegionsChanged ( blink : : WebFrame * frame ) override ; <nl> bool OnMessageReceived ( const IPC : : Message & message ) override ; <nl> void OnDestruct ( ) override ; <nl> class AtomRenderViewObserver : public content : : RenderViewObserver { <nl> <nl> AtomRendererClient * renderer_client_ ; <nl> <nl> - / / Whether the document object has been created . 
<nl> - bool document_created_ ; <nl> - <nl> DISALLOW_COPY_AND_ASSIGN ( AtomRenderViewObserver ) ; <nl> } ; <nl> <nl>
Remove unused RenderViewObserver methods.
electron/electron
370476c4affc56ed9799068187b1e8fd5d1beb0a
2017-11-24T01:58:16Z
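The Electron change drops the document_created_ bookkeeping and instead asks Blink directly whether the frame already has a document element before forwarding a browser IPC message; touching the document too early would make Blink create an empty one and run the init script while window.location is still "about:blank". The guard, factored into a small helper purely for illustration; the Blink headers are the ones added by the diff.

#include "third_party/WebKit/public/web/WebDocument.h"
#include "third_party/WebKit/public/web/WebElement.h"
#include "third_party/WebKit/public/web/WebFrame.h"

// Returns true once the frame has a document element, i.e. it is safe to emit
// an IPC event to the page without implicitly creating an empty document.
static bool FrameReadyForIPC(blink::WebFrame* frame) {
  blink::WebDocument document = frame->GetDocument();
  blink::WebElement html_element = document.DocumentElement();
  return !html_element.IsNull();
}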
mmm a / CMakeLists . txt <nl> ppp b / CMakeLists . txt <nl> install ( FILES src / icons / $ { PROJECT_NAME } . png <nl> <nl> install ( FILES distri / $ { PROJECT_NAME } . desktop <nl> DESTINATION share / applications / ) <nl> + <nl> + install ( FILES distri / $ { PROJECT_NAME } . desktop . appdata . xml <nl> + DESTINATION share / appdata / ) <nl> endif ( UNIX AND NOT APPLE ) <nl> <nl> # cpack <nl> new file mode 100644 <nl> index 000000000 . . 11a0e99d9 <nl> mmm / dev / null <nl> ppp b / distri / sqlitebrowser . desktop . appdata . xml <nl> <nl> + < ? xml version = " 1 . 0 " encoding = " UTF - 8 " ? > <nl> + < component type = " desktop " > <nl> + < id > sqlitebrowser . desktop < / id > <nl> + < metadata_license > CC0 - 1 . 0 < / metadata_license > <nl> + < project_license > MPL - 2 . 0 and GPL - 3 . 0 + < / project_license > <nl> + < name > DB Browser for SQLite < / name > <nl> + < summary > DB Browser for SQLite is a light GUI editor for SQLite databases < / summary > <nl> + < description > <nl> + < p > DB Browser for SQLite is a high quality , visual , open source tool to create , design , and edit database files compatible with SQLite . < / p > <nl> + < p > It is for users and developers wanting to create databases , search , and edit data . It uses a familiar spreadsheet - like interface , and you don ' t need to learn complicated SQL commands . < / p > <nl> + < p > Controls and wizards are available for users to : < / p > <nl> + < ul > <nl> + < li > Create and compact database files < / li > <nl> + < li > Create , define , modify and delete tables < / li > <nl> + < li > Create , define and delete indexes < / li > <nl> + < li > Browse , edit , add and delete records < / li > <nl> + < li > Search records < / li > <nl> + < li > Import and export records as text < / li > <nl> + < li > Import and export tables from / to CSV files < / li > <nl> + < li > Import and export databases from / to SQL dump files < / li > <nl> + < li > Issue SQL queries and inspect the results < / li > <nl> + < li > Examine a log of all SQL commands issued by the application < / li > <nl> + < / ul > <nl> + < / description > <nl> + < screenshots > <nl> + < screenshot type = " default " > <nl> + < image > https : / / raw . githubusercontent . com / sqlitebrowser / db4s - screenshots / master / v3 . 3 / gnome3_2 - execute . png < / image > <nl> + < caption > DB Browser for SQLite , executing query < / caption > <nl> + < / screenshot > <nl> + < screenshot > <nl> + < image > https : / / raw . githubusercontent . com / sqlitebrowser / db4s - screenshots / master / v3 . 3 / gnome3_1 - plot . png < / image > <nl> + < caption > DB Browser for SQLite , browsing data with plot < / caption > <nl> + < / screenshot > <nl> + < screenshot > <nl> + < image > https : / / raw . githubusercontent . com / sqlitebrowser / db4s - screenshots / master / v3 . 3 / kde413_2 - blob . png < / image > <nl> + < caption > DB Browser for SQLite , browing a blob field < / caption > <nl> + < / screenshot > <nl> + < screenshot > <nl> + < image > https : / / raw . githubusercontent . com / sqlitebrowser / db4s - screenshots / master / v3 . 3 / kde413_1 - create_table . png < / image > <nl> + < caption > DB Browser for SQLite , creating a table < / caption > <nl> + < / screenshot > <nl> + < / screenshots > <nl> + < url type = " homepage " > http : / / sqlitebrowser . org / < / url > <nl> + < url type = " bugtracker " > https : / / github . com / sqlitebrowser / sqlitebrowser / issues < / url > <nl> + < / component > <nl>
Create an AppData file
sqlitebrowser/sqlitebrowser
95416457fd3b52053050b31446462a2cb6b1a5f1
2015-03-26T15:32:10Z
mmm a / flow / error_definitions . h <nl> ppp b / flow / error_definitions . h <nl> ERROR ( backup_duplicate , 2311 , " Backup duplicate request " ) <nl> ERROR ( backup_unneeded , 2312 , " Backup unneeded request " ) <nl> ERROR ( backup_bad_block_size , 2313 , " Backup file block size too small " ) <nl> ERROR ( backup_invalid_url , 2314 , " Backup Container URL invalid " ) <nl> - ERROR ( backup_invalid_info , 2315 , " Backup Container URL invalid " ) <nl> + ERROR ( backup_invalid_info , 2315 , " Backup Container info invalid " ) <nl> ERROR ( backup_cannot_expire , 2316 , " Cannot expire requested data from backup without violating minimum restorability " ) <nl> ERROR ( backup_auth_missing , 2317 , " Cannot find authentication details ( such as a password or secret key ) for the specified Backup Container URL " ) <nl> ERROR ( backup_auth_unreadable , 2318 , " Cannot read or parse one or more sources of authentication information for Backup Container URLs " ) <nl>
Merge pull request from tclinken / fix - backup - invalid - info - message
apple/foundationdb
41732bc21844cc5489e33f5bdfb9a4af5467a3ac
2020-05-07T21:40:56Z
mmm a / bson / bsondemo / bsondemo . cpp <nl> ppp b / bson / bsondemo / bsondemo . cpp <nl> <nl> - / * * @ file bsondemo . cpp * / <nl> + / * * @ file bsondemo . cpp <nl> + <nl> + Example of use of BSON from C + + . <nl> + <nl> + Requires boost ( headers only ) . <nl> + Works headers only ( the parts actually exercised herein that is - some functions require . cpp files ) . <nl> + * / <nl> <nl> # include " . . / bson . h " <nl> # include < iostream > <nl> <nl> using namespace std ; <nl> using namespace bson ; <nl> <nl> + void iter ( bo o ) { <nl> + / * iterator example * / <nl> + cout < < " \ niter ( ) \ n " ; <nl> + for ( bo : : iterator i ( o ) ; i . more ( ) ; ) { <nl> + cout < < ' ' < < i . next ( ) . toString ( ) < < ' \ n ' ; <nl> + } <nl> + } <nl> + <nl> int main ( ) <nl> { <nl> - cout < < " build bits : " < < 8 * sizeof ( char * ) < < endl ; <nl> + cout < < " build bits : " < < 8 * sizeof ( char * ) < < ' \ n ' < < endl ; <nl> <nl> / * a bson object defaults on construction to { } * / <nl> bo empty ; <nl> int main ( ) <nl> x . vals ( strs ) ; <nl> cout < < strs . size ( ) < < " strings , first one : " < < strs [ 0 ] < < endl ; <nl> <nl> + iter ( y ) ; <nl> return 0 ; <nl> } <nl> mmm a / bson / bsoninlines . h <nl> ppp b / bson / bsoninlines . h <nl> <nl> <nl> namespace mongo { <nl> <nl> + inline BSONObjIterator BSONObj : : begin ( ) { <nl> + return BSONObjIterator ( * this ) ; <nl> + } <nl> + <nl> inline BSONObj BSONElement : : embeddedObjectUserCheck ( ) const { <nl> uassert ( 10065 , " invalid parameter : expected an object " , isABSONObj ( ) ) ; <nl> return BSONObj ( value ( ) ) ; <nl> mmm a / bson / bsonobj . h <nl> ppp b / bson / bsonobj . h <nl> namespace mongo { <nl> template < class T > <nl> void vals ( list < T > & ) const ; <nl> <nl> - private : <nl> friend class BSONObjIterator ; <nl> + typedef BSONObjIterator iterator ; <nl> + BSONObjIterator begin ( ) ; <nl> + <nl> + private : <nl> class Holder { <nl> public : <nl> Holder ( const char * objdata ) : <nl> mmm a / bson / bsonobjiterator . h <nl> ppp b / bson / bsonobjiterator . h <nl> namespace mongo { <nl> _pos = jso . objdata ( ) + 4 ; <nl> _theend = jso . objdata ( ) + sz ; <nl> } <nl> - <nl> + <nl> BSONObjIterator ( const char * start , const char * end ) { <nl> _pos = start + 4 ; <nl> _theend = end ; <nl> namespace mongo { <nl> _pos + = e . size ( checkEnd ? ( int ) ( _theend - _pos ) : - 1 ) ; <nl> return e ; <nl> } <nl> + <nl> + void operator + + ( ) { next ( ) ; } <nl> + void operator + + ( int ) { next ( ) ; } <nl> + <nl> + BSONElement operator * ( ) { <nl> + assert ( _pos < _theend ) ; <nl> + return BSONElement ( _pos , - 1 ) ; <nl> + } <nl> + <nl> private : <nl> const char * _pos ; <nl> const char * _theend ; <nl>
bson tweaking
mongodb/mongo
6efbf9af37bb6798773798244a07e9a0993908b4
2010-05-23T22:35:08Z
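The bson commit above exposes `BSONObj::begin()` together with `operator++` and `operator*` on `BSONObjIterator`, alongside the `more()`/`next()` loop demonstrated in `iter()`. Below is a minimal sketch of the new operators in use; it assumes the same headers-only bson setup that bsondemo.cpp relies on, and the function and variable names are illustrative only.

```cpp
// Minimal sketch, assuming the headers-only bson setup used by bsondemo.cpp.
// Shows the iterator operators introduced by this commit; the classic
// more()/next() loop from iter() keeps working unchanged.
#include "../bson.h"
#include <iostream>

using namespace bson;

void dumpFields(bo obj) {
    // bo::iterator is the typedef this commit adds to BSONObj.
    for (bo::iterator it = obj.begin(); it.more(); ++it) {
        std::cout << (*it).toString() << '\n';  // operator* reads the current element
    }                                           // operator++ advances via next()
}
```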
mmm a / tests / eosd_run_test . sh <nl> ppp b / tests / eosd_run_test . sh <nl> verifyErrorCode ( ) <nl> killAll ( ) <nl> { <nl> if [ " $ SERVER " = = " localhost " ] ; then <nl> - $ DIR / programs / launcher / launcher - k 9 - - eosdBinary $ EOSD <nl> + $ DIR / programs / launcher / launcher - k 9 - - eosdBinary $ EOSD - - genesis $ DIR / genesis . json <nl> fi <nl> kill - 9 $ WALLETD_PROC_ID <nl> } <nl> LOG_FILE = eosd_run_test . log <nl> <nl> # eosd <nl> if [ " $ SERVER " = = " localhost " ] ; then <nl> - $ DIR / programs / launcher / launcher - - eosdBinary $ EOSD <nl> + $ DIR / programs / launcher / launcher - - eosdBinary $ EOSD - - genesis $ DIR / genesis . json <nl> verifyErrorCode " launcher " <nl> sleep 60 <nl> count = ` grep - c " generated block " tn_data_00 / stderr . txt ` <nl> if [ $ CODE_HASH ! = 0 ] ; then <nl> fi <nl> <nl> # upload a contract <nl> - INFO = " $ ( $ DIR / programs / eosc / eosc - - host $ SERVER - - port $ PORT - - wallet - port 8899 set contract currency contracts / currency / currency . wast contracts / currency / currency . abi ) " <nl> + INFO = " $ ( $ DIR / programs / eosc / eosc - - host $ SERVER - - port $ PORT - - wallet - port 8899 set contract currency $ DIR / contracts / currency / currency . wast $ DIR / contracts / currency / currency . abi ) " <nl> verifyErrorCode " eosc set contract currency " <nl> count = ` echo $ INFO | grep - c " processed " ` <nl> if [ $ count = = 0 ] ; then <nl> fi <nl> # <nl> <nl> # upload exchange contract <nl> - INFO = " $ ( $ DIR / programs / eosc / eosc - - host $ SERVER - - port $ PORT - - wallet - port 8899 set contract exchange contracts / exchange / exchange . wast contracts / exchange / exchange . abi ) " <nl> + INFO = " $ ( $ DIR / programs / eosc / eosc - - host $ SERVER - - port $ PORT - - wallet - port 8899 set contract exchange $ DIR / contracts / exchange / exchange . wast $ DIR / contracts / exchange / exchange . abi ) " <nl> verifyErrorCode " eosc set contract exchange " <nl> count = ` echo $ INFO | grep - c " processed " ` <nl> if [ $ count = = 0 ] ; then <nl> getTransactionId " $ INFO " <nl> # Verify eosc generates an error , but does not core dump . <nl> # <nl> <nl> - INFO = " $ ( { $ DIR / programs / eosc / eosc - - host $ SERVER - - port $ PORT - - wallet - port 8899 set contract simpledb contracts / simpledb / simpledb . wast contracts / simpledb / simpledb . abi ; } 2 > & 1 ) " <nl> + INFO = " $ ( { $ DIR / programs / eosc / eosc - - host $ SERVER - - port $ PORT - - wallet - port 8899 set contract simpledb $ DIR / contracts / simpledb / simpledb . wast $ DIR / contracts / simpledb / simpledb . abi ; } 2 > & 1 ) " <nl> rc = $ ? <nl> if [ $ rc - eq 0 ] | | [ $ rc - eq 139 ] ; then # 139 SIGSEGV <nl> error " FAILURE - $ 1 returned error code $ rc , should have failed to execute . " <nl>
[ test ] set relative path of genesis . json and contracts
EOSIO/eos
6eac73e8d4a4499f96b730e96156f96c60b72555
2018-01-18T11:53:20Z
mmm a / dlib / dnn / core . h <nl> ppp b / dlib / dnn / core . h <nl> namespace dlib <nl> template < typename solver_type > <nl> void update_parameters ( std : : vector < solver_type > & solvers , double learning_rate ) <nl> { <nl> - subnetwork - > update_parameters ( make_sstack ( solvers ) , learning_rate ) ; <nl> + update_parameters ( make_sstack ( solvers ) , learning_rate ) ; <nl> } <nl> <nl> const tensor & get_parameter_gradient ( <nl> namespace dlib <nl> } <nl> } <nl> <nl> + template < typename solver_type > <nl> + void update_parameters ( std : : vector < solver_type > & solvers , double learning_rate ) <nl> + { <nl> + update_parameters ( make_sstack ( solvers ) , learning_rate ) ; <nl> + } <nl> + <nl> const tensor & get_parameter_gradient ( <nl> ) const { return params_grad ; } <nl> <nl> namespace dlib <nl> subnetwork . update_parameters ( solvers , learning_rate ) ; <nl> } <nl> <nl> + template < typename solver_type > <nl> + void update_parameters ( std : : vector < solver_type > & solvers , double learning_rate ) <nl> + { <nl> + update_parameters ( make_sstack ( solvers ) , learning_rate ) ; <nl> + } <nl> + <nl> const tensor & get_parameter_gradient ( <nl> ) const { return params_grad ; } <nl> <nl> namespace dlib <nl> subnetwork . update_parameters ( solvers . pop ( comp_layers_in_each_group * details . size ( ) ) , learning_rate ) ; <nl> } <nl> <nl> + template < typename solver_type > <nl> + void update_parameters ( std : : vector < solver_type > & solvers , double learning_rate ) <nl> + { <nl> + update_parameters ( make_sstack ( solvers ) , learning_rate ) ; <nl> + } <nl> + <nl> const subnet_type & subnet ( ) const { return subnetwork ; } <nl> subnet_type & subnet ( ) { return subnetwork ; } <nl> <nl> namespace dlib <nl> / / nothing to do <nl> } <nl> <nl> + template < typename solver_type > <nl> + void update_parameters ( std : : vector < solver_type > & solvers , double learning_rate ) <nl> + { <nl> + update_parameters ( make_sstack ( solvers ) , learning_rate ) ; <nl> + } <nl> + <nl> const subnet_type & subnet ( ) const { return input_layer ; } <nl> subnet_type & subnet ( ) { return input_layer ; } <nl> <nl> namespace dlib <nl> subnetwork . update_parameters ( solvers , learning_rate ) ; <nl> } <nl> <nl> + template < typename solver_type > <nl> + void update_parameters ( std : : vector < solver_type > & solvers , double learning_rate ) <nl> + { <nl> + update_parameters ( make_sstack ( solvers ) , learning_rate ) ; <nl> + } <nl> + <nl> const subnet_type & subnet ( ) const { return subnetwork ; } <nl> subnet_type & subnet ( ) { return subnetwork ; } <nl> const loss_details_type & loss_details ( ) const { return loss ; } <nl> namespace dlib <nl> subnetwork . update_parameters ( solvers , learning_rate ) ; <nl> } <nl> <nl> + template < typename solver_type > <nl> + void update_parameters ( std : : vector < solver_type > & solvers , double learning_rate ) <nl> + { <nl> + update_parameters ( make_sstack ( solvers ) , learning_rate ) ; <nl> + } <nl> + <nl> const tensor & get_parameter_gradient ( <nl> ) const { return params_grad ; } <nl> <nl> mmm a / dlib / dnn / core_abstract . h <nl> ppp b / dlib / dnn / core_abstract . h <nl> namespace dlib <nl> - The solvers use the given learning rate . <nl> ! * / <nl> <nl> + template < typename solver_type > <nl> + void update_parameters ( std : : vector < solver_type > & solvers , double learning_rate ) <nl> + { update_parameters ( make_sstack ( solvers ) , learning_rate ) ; } <nl> + / * ! 
<nl> + Convenience method for calling update_parameters ( ) <nl> + ! * / <nl> + <nl> void clean ( <nl> ) ; <nl> / * ! <nl> namespace dlib <nl> - The solvers use the given learning rate . <nl> ! * / <nl> <nl> + template < typename solver_type > <nl> + void update_parameters ( std : : vector < solver_type > & solvers , double learning_rate ) <nl> + { update_parameters ( make_sstack ( solvers ) , learning_rate ) ; } <nl> + / * ! <nl> + Convenience method for calling update_parameters ( ) <nl> + ! * / <nl> + <nl> / / mmmmmmmmmmmm - <nl> <nl> void clean ( <nl>
make update_parameters ( ) a little more uniform
davisking/dlib
c79f64f52db223a57d563d40aa90cc42a73c62c6
2020-03-29T15:19:37Z
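The dlib change above gives each layer class an `update_parameters()` overload taking `std::vector<solver_type>&` and forwarding to the existing sstack-based overload through `make_sstack()`. A minimal sketch of the resulting call site follows; the tiny network type is illustrative only and is not taken from the commit.

```cpp
// Minimal sketch, assuming dlib's DNN module is available; the network type is
// illustrative only. The point is the call site: a plain std::vector of
// solvers can now be passed directly, and the overload added by this commit
// wraps it in make_sstack() internally.
#include <dlib/dnn.h>
#include <vector>

using namespace dlib;
using net_type = loss_multiclass_log<fc<10, input<matrix<float>>>>;

void training_step(net_type& net, std::vector<sgd>& solvers, double learning_rate)
{
    // Before this commit the caller had to write:
    //     net.update_parameters(make_sstack(solvers), learning_rate);
    net.update_parameters(solvers, learning_rate);
}
```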
mmm a / etc / perf . yml <nl> ppp b / etc / perf . yml <nl> modules : <nl> - name : py - tpcc <nl> repo : git @ github . com : 10gen / py - tpcc . git <nl> prefix : . . / . . / src <nl> + branch : master <nl> ref : 52185e96bf28be6608ea65b61ff1d134b7cb3f13 <nl> <nl> # # # <nl> functions : <nl> - command : git . get_project <nl> params : <nl> directory : * src_dir <nl> - revisions : <nl> + revisions : & revisions_list <nl> dsi : $ { dsi_rev } <nl> genny : $ { genny_rev } <nl> signal - processing : $ { signal - processing_rev } <nl> tasks : <nl> - command : git . get_project <nl> params : <nl> directory : * src_dir <nl> - revisions : <nl> - dsi : $ { dsi_rev } <nl> - genny : $ { genny_rev } <nl> - signal - processing : $ { signal - processing_rev } <nl> - linkbench : $ { linkbench_rev } <nl> - linkbench2 : $ { linkbench2_rev } <nl> - workloads : $ { workloads_rev } <nl> - mongo - perf : $ { mongo - perf_rev } <nl> - YCSB : $ { YCSB_rev } <nl> - benchmarks : $ { benchmarks_rev } <nl> - py - tpcc : $ { py - tpcc_rev } <nl> + revisions : * revisions_list <nl> - command : expansions . write <nl> params : <nl> file : . / expansions . yml <nl> tasks : <nl> - command : git . get_project <nl> params : <nl> directory : * src_dir <nl> - revisions : <nl> - dsi : $ { dsi_rev } <nl> - genny : $ { genny_rev } <nl> - signal - processing : $ { signal - processing_rev } <nl> - linkbench : $ { linkbench_rev } <nl> - linkbench2 : $ { linkbench2_rev } <nl> - workloads : $ { workloads_rev } <nl> - mongo - perf : $ { mongo - perf_rev } <nl> - YCSB : $ { YCSB_rev } <nl> - benchmarks : $ { benchmarks_rev } <nl> - py - tpcc : $ { py - tpcc_rev } <nl> + revisions : * revisions_list <nl> - command : expansions . write <nl> params : <nl> file : . / expansions . yml <nl> tasks : <nl> - command : git . get_project <nl> params : <nl> directory : * src_dir <nl> - revisions : <nl> - dsi : $ { dsi_rev } <nl> - genny : $ { genny_rev } <nl> - signal - processing : $ { signal - processing_rev } <nl> - linkbench : $ { linkbench_rev } <nl> - linkbench2 : $ { linkbench2_rev } <nl> - workloads : $ { workloads_rev } <nl> - mongo - perf : $ { mongo - perf_rev } <nl> - YCSB : $ { YCSB_rev } <nl> - benchmarks : $ { benchmarks_rev } <nl> - py - tpcc : $ { py - tpcc_rev } <nl> + revisions : * revisions_list <nl> - command : expansions . write <nl> params : <nl> file : . / expansions . yml <nl> mmm a / etc / system_perf . yml <nl> ppp b / etc / system_perf . yml <nl> modules : <nl> - name : py - tpcc <nl> repo : git @ github . com : 10gen / py - tpcc . git <nl> prefix : . . / . . / src <nl> + branch : master <nl> ref : 52185e96bf28be6608ea65b61ff1d134b7cb3f13 <nl> <nl> # # # <nl> functions : <nl> - command : git . get_project <nl> params : <nl> directory : * src_dir <nl> - revisions : <nl> + revisions : & revisions_list <nl> dsi : $ { dsi_rev } <nl> genny : $ { genny_rev } <nl> signal - processing : $ { signal - processing_rev } <nl> tasks : <nl> - command : git . get_project <nl> params : <nl> directory : * src_dir <nl> - revisions : <nl> - dsi : $ { dsi_rev } <nl> - genny : $ { genny_rev } <nl> - signal - processing : $ { signal - processing_rev } <nl> - linkbench : $ { linkbench_rev } <nl> - linkbench2 : $ { linkbench2_rev } <nl> - workloads : $ { workloads_rev } <nl> - mongo - perf : $ { mongo - perf_rev } <nl> - YCSB : $ { YCSB_rev } <nl> - benchmarks : $ { benchmarks_rev } <nl> - py - tpcc : $ { py - tpcc_rev } <nl> + revisions : * revisions_list <nl> - command : expansions . write <nl> params : <nl> file : . / expansions . 
yml <nl> tasks : <nl> - command : git . get_project <nl> params : <nl> directory : * src_dir <nl> - revisions : <nl> - dsi : $ { dsi_rev } <nl> - genny : $ { genny_rev } <nl> - signal - processing : $ { signal - processing_rev } <nl> - linkbench : $ { linkbench_rev } <nl> - linkbench2 : $ { linkbench2_rev } <nl> - workloads : $ { workloads_rev } <nl> - mongo - perf : $ { mongo - perf_rev } <nl> - YCSB : $ { YCSB_rev } <nl> - benchmarks : $ { benchmarks_rev } <nl> - py - tpcc : $ { py - tpcc_rev } <nl> + revisions : * revisions_list <nl> - command : expansions . write <nl> params : <nl> file : . / expansions . yml <nl> tasks : <nl> - command : git . get_project <nl> params : <nl> directory : * src_dir <nl> - revisions : <nl> - dsi : $ { dsi_rev } <nl> - genny : $ { genny_rev } <nl> - signal - processing : $ { signal - processing_rev } <nl> - linkbench : $ { linkbench_rev } <nl> - linkbench2 : $ { linkbench2_rev } <nl> - workloads : $ { workloads_rev } <nl> - mongo - perf : $ { mongo - perf_rev } <nl> - YCSB : $ { YCSB_rev } <nl> - benchmarks : $ { benchmarks_rev } <nl> - py - tpcc : $ { py - tpcc_rev } <nl> + revisions : * revisions_list <nl> - command : expansions . write <nl> params : <nl> file : . / expansions . yml <nl>
SERVER - 52775 Specify py - tpcc branch in sys perf and reorganize revisions
mongodb/mongo
cfa95825eebad3d8d9de0a4903523fd6a41184eb
2020-11-20T18:52:41Z
mmm a / buildscripts / resmokeconfig / suites / sharding_last_stable_mongos_and_mixed_shards . yml <nl> ppp b / buildscripts / resmokeconfig / suites / sharding_last_stable_mongos_and_mixed_shards . yml <nl> selector : <nl> - jstests / sharding / out_write_concern . js <nl> - jstests / sharding / restart_transactions . js <nl> - jstests / sharding / shard7 . js <nl> + # TODO : SERVER - 38541 remove from blacklist <nl> - jstests / sharding / shard_collection_existing_zones . js <nl> - jstests / sharding / snapshot_cursor_commands_mongos . js <nl> - jstests / sharding / transactions_error_labels . js <nl> mmm a / buildscripts / resmokeconfig / suites / sharding_last_stable_mongos_and_mixed_shards_misc . yml <nl> ppp b / buildscripts / resmokeconfig / suites / sharding_last_stable_mongos_and_mixed_shards_misc . yml <nl> selector : <nl> - jstests / sharding / out_write_concern . js <nl> - jstests / sharding / restart_transactions . js <nl> - jstests / sharding / shard7 . js <nl> + # TODO : SERVER - 38541 remove from blacklist <nl> - jstests / sharding / shard_collection_existing_zones . js <nl> - jstests / sharding / snapshot_cursor_commands_mongos . js <nl> - jstests / sharding / transactions_error_labels . js <nl> mmm a / jstests / sharding / shard_collection_existing_zones . js <nl> ppp b / jstests / sharding / shard_collection_existing_zones . js <nl> <nl> assert . commandWorked ( testDB . runCommand ( { drop : kCollName } ) ) ; <nl> } <nl> <nl> + / * * <nl> + * Tests that a non - empty collection associated with zones can be sharded . <nl> + * / <nl> + function testNonemptyZonedCollection ( ) { <nl> + var shardKey = { x : 1 } ; <nl> + var shards = configDB . shards . find ( ) . toArray ( ) ; <nl> + var testColl = testDB . getCollection ( kCollName ) ; <nl> + var ranges = [ <nl> + { min : { x : 0 } , max : { x : 10 } } , <nl> + { min : { x : 10 } , max : { x : 20 } } , <nl> + { min : { x : 20 } , max : { x : 40 } } <nl> + ] ; <nl> + <nl> + for ( let i = 0 ; i < 40 ; i + + ) { <nl> + assert . writeOK ( testColl . insert ( { x : i } ) ) ; <nl> + } <nl> + <nl> + assert . commandWorked ( testColl . createIndex ( shardKey ) ) ; <nl> + <nl> + for ( let i = 0 ; i < shards . length ; i + + ) { <nl> + assert . commandWorked ( <nl> + mongos . adminCommand ( { addShardToZone : shards [ i ] . _id , zone : zoneName + i } ) ) ; <nl> + assert . commandWorked ( mongos . adminCommand ( { <nl> + updateZoneKeyRange : ns , <nl> + min : ranges [ i ] . min , <nl> + max : ranges [ i ] . max , <nl> + zone : zoneName + i <nl> + } ) ) ; <nl> + } <nl> + <nl> + assert . commandWorked ( mongos . adminCommand ( { shardCollection : ns , key : shardKey } ) ) ; <nl> + <nl> + / / Check that there is initially 1 chunk . <nl> + assert . eq ( 1 , configDB . chunks . count ( { ns : ns } ) ) ; <nl> + <nl> + st . startBalancer ( ) ; <nl> + <nl> + / / Check that the chunks were moved properly . <nl> + assert . soon ( ( ) = > { <nl> + let res = configDB . chunks . count ( { ns : ns } ) ; <nl> + return res = = = 5 ; <nl> + } , ' balancer never ran ' , 10 * 60 * 1000 , 1000 ) ; <nl> + <nl> + assert . commandWorked ( testDB . runCommand ( { drop : kCollName } ) ) ; <nl> + } <nl> + <nl> / / test that shardCollection checks that a zone is associated with a shard . <nl> testShardZoneAssociationValidation ( { x : 1 } , false , false ) ; <nl> <nl> <nl> testChunkSplits ( false ) ; <nl> testChunkSplits ( true ) ; <nl> <nl> + testNonemptyZonedCollection ( ) ; <nl> + <nl> st . 
stop ( ) ; <nl> } ) ( ) ; <nl> mmm a / src / mongo / db / s / config / initial_split_policy . cpp <nl> ppp b / src / mongo / db / s / config / initial_split_policy . cpp <nl> InitialSplitPolicy : : generateShardCollectionInitialZonedChunks ( <nl> const Timestamp & validAfter , <nl> const std : : vector < TagsType > & tags , <nl> const StringMap < std : : vector < ShardId > > & tagToShards , <nl> - const std : : vector < ShardId > & allShardIds ) { <nl> + const std : : vector < ShardId > & allShardIds , <nl> + const bool isEmpty ) { <nl> invariant ( ! allShardIds . empty ( ) ) ; <nl> invariant ( ! tags . empty ( ) ) ; <nl> <nl> InitialSplitPolicy : : generateShardCollectionInitialZonedChunks ( <nl> <nl> std : : vector < ChunkType > chunks ; <nl> <nl> - for ( const auto & tag : tags ) { <nl> - if ( tag . getMinKey ( ) . woCompare ( lastChunkMax ) > 0 ) { <nl> - / / create a chunk for the hole between zones <nl> - const ShardId shardId = allShardIds [ indx + + % allShardIds . size ( ) ] ; <nl> - appendChunk ( nss , lastChunkMax , tag . getMinKey ( ) , & version , validAfter , shardId , & chunks ) ; <nl> - } <nl> - <nl> - / / check that this tag is associated with a shard and if so create a chunk for the zone . <nl> - const auto it = tagToShards . find ( tag . getTag ( ) ) ; <nl> - invariant ( it ! = tagToShards . end ( ) ) ; <nl> - const auto & shardIdsForChunk = it - > second ; <nl> - uassert ( 50973 , <nl> + if ( ! isEmpty ) { <nl> + / / For a non - empty collection , create one chunk on the primary shard and leave it to the <nl> + / / balancer to do the final zone partitioning / rebalancing . <nl> + appendChunk ( nss , <nl> + keyPattern . globalMin ( ) , <nl> + keyPattern . globalMax ( ) , <nl> + & version , <nl> + validAfter , <nl> + allShardIds [ 0 ] , <nl> + & chunks ) ; <nl> + } else { <nl> + for ( const auto & tag : tags ) { <nl> + if ( tag . getMinKey ( ) . woCompare ( lastChunkMax ) > 0 ) { <nl> + / / create a chunk for the hole between zones <nl> + const ShardId shardId = allShardIds [ indx + + % allShardIds . size ( ) ] ; <nl> + appendChunk ( <nl> + nss , lastChunkMax , tag . getMinKey ( ) , & version , validAfter , shardId , & chunks ) ; <nl> + } <nl> + <nl> + / / check that this tag is associated with a shard and if so create a chunk for the zone . <nl> + const auto it = tagToShards . find ( tag . getTag ( ) ) ; <nl> + invariant ( it ! = tagToShards . end ( ) ) ; <nl> + const auto & shardIdsForChunk = it - > second ; <nl> + uassert ( <nl> + 50973 , <nl> str : : stream ( ) <nl> < < " cannot shard collection " <nl> < < nss . ns ( ) <nl> InitialSplitPolicy : : generateShardCollectionInitialZonedChunks ( <nl> < < " which is not associated with a shard . please add this zone to a shard . " , <nl> ! shardIdsForChunk . empty ( ) ) ; <nl> <nl> - appendChunk ( nss , <nl> - tag . getMinKey ( ) , <nl> - tag . getMaxKey ( ) , <nl> - & version , <nl> - validAfter , <nl> - shardIdsForChunk [ 0 ] , <nl> - & chunks ) ; <nl> - lastChunkMax = tag . getMaxKey ( ) ; <nl> - } <nl> + appendChunk ( nss , <nl> + tag . getMinKey ( ) , <nl> + tag . getMaxKey ( ) , <nl> + & version , <nl> + validAfter , <nl> + shardIdsForChunk [ 0 ] , <nl> + & chunks ) ; <nl> + lastChunkMax = tag . getMaxKey ( ) ; <nl> + } <nl> <nl> - if ( lastChunkMax . woCompare ( keyPattern . globalMax ( ) ) < 0 ) { <nl> - / / existing zones do not span to $ maxKey so create a chunk for that <nl> - const ShardId shardId = allShardIds [ indx + + % allShardIds . size ( ) ] ; <nl> - appendChunk ( <nl> - nss , lastChunkMax , keyPattern . 
globalMax ( ) , & version , validAfter , shardId , & chunks ) ; <nl> + if ( lastChunkMax . woCompare ( keyPattern . globalMax ( ) ) < 0 ) { <nl> + / / existing zones do not span to $ maxKey so create a chunk for that <nl> + const ShardId shardId = allShardIds [ indx + + % allShardIds . size ( ) ] ; <nl> + appendChunk ( <nl> + nss , lastChunkMax , keyPattern . globalMax ( ) , & version , validAfter , shardId , & chunks ) ; <nl> + } <nl> } <nl> <nl> log ( ) < < " Created " < < chunks . size ( ) < < " chunk ( s ) for : " < < nss < < " using new epoch " <nl> InitialSplitPolicy : : ShardCollectionConfig InitialSplitPolicy : : createFirstChunks ( <nl> const std : : vector < BSONObj > & splitPoints , <nl> const std : : vector < TagsType > & tags , <nl> const bool distributeInitialChunks , <nl> + const bool isEmpty , <nl> const int numContiguousChunksPerShard ) { <nl> const auto & keyPattern = shardKeyPattern . getKeyPattern ( ) ; <nl> <nl> InitialSplitPolicy : : ShardCollectionConfig InitialSplitPolicy : : createFirstChunks ( <nl> auto primaryShard = <nl> uassertStatusOK ( Grid : : get ( opCtx ) - > shardRegistry ( ) - > getShard ( opCtx , primaryShardId ) ) ; <nl> <nl> - auto result = uassertStatusOK ( primaryShard - > runCommandWithFixedRetryAttempts ( <nl> - opCtx , <nl> - ReadPreferenceSetting { ReadPreference : : PrimaryPreferred } , <nl> - nss . db ( ) . toString ( ) , <nl> - BSON ( " count " < < nss . coll ( ) ) , <nl> - Shard : : RetryPolicy : : kIdempotent ) ) ; <nl> - <nl> - long long numObjects = 0 ; <nl> - uassertStatusOK ( result . commandStatus ) ; <nl> - uassertStatusOK ( bsonExtractIntegerField ( result . response , " n " , & numObjects ) ) ; <nl> - <nl> / / Refresh the balancer settings to ensure the chunk size setting , which is sent as part of <nl> / / the splitVector command and affects the number of chunks returned , has been loaded . <nl> uassertStatusOK ( Grid : : get ( opCtx ) - > getBalancerConfiguration ( ) - > refreshAndCheck ( opCtx ) ) ; <nl> <nl> - if ( numObjects > 0 ) { <nl> + if ( ! isEmpty ) { <nl> finalSplitPoints = uassertStatusOK ( shardutil : : selectChunkSplitPoints ( <nl> opCtx , <nl> primaryShardId , <nl> InitialSplitPolicy : : ShardCollectionConfig InitialSplitPolicy : : createFirstChunks ( <nl> <nl> / / If docs already exist for the collection , must use primary shard , <nl> / / otherwise defer to passed - in distribution option . <nl> - if ( numObjects = = 0 & & distributeInitialChunks ) { <nl> + if ( isEmpty & & distributeInitialChunks ) { <nl> Grid : : get ( opCtx ) - > shardRegistry ( ) - > getAllShardIdsNoReload ( & shardIds ) ; <nl> } else { <nl> shardIds . push_back ( primaryShardId ) ; <nl> InitialSplitPolicy : : ShardCollectionConfig InitialSplitPolicy : : createFirstChunks ( <nl> shardIds , <nl> numContiguousChunksPerShard ) <nl> : InitialSplitPolicy : : generateShardCollectionInitialZonedChunks ( <nl> - nss , shardKeyPattern , validAfter , tags , getTagToShardIds ( opCtx , tags ) , shardIds ) ; <nl> + nss , <nl> + shardKeyPattern , <nl> + validAfter , <nl> + tags , <nl> + getTagToShardIds ( opCtx , tags ) , <nl> + shardIds , <nl> + isEmpty ) ; <nl> <nl> return initialChunks ; <nl> } <nl> mmm a / src / mongo / db / s / config / initial_split_policy . h <nl> ppp b / src / mongo / db / s / config / initial_split_policy . 
h <nl> class InitialSplitPolicy { <nl> const Timestamp & validAfter , <nl> const std : : vector < TagsType > & tags , <nl> const StringMap < std : : vector < ShardId > > & tagToShards , <nl> - const std : : vector < ShardId > & allShardIds ) ; <nl> + const std : : vector < ShardId > & allShardIds , <nl> + const bool isEmpty ) ; <nl> <nl> / * * <nl> * Creates the first chunks for a newly sharded collection . <nl> class InitialSplitPolicy { <nl> const std : : vector < BSONObj > & splitPoints , <nl> const std : : vector < TagsType > & tags , <nl> const bool distributeInitialChunks , <nl> + const bool isEmpty , <nl> const int numContiguousChunksPerShard = 1 ) ; <nl> <nl> / * * <nl> mmm a / src / mongo / db / s / config / initial_split_policy_test . cpp <nl> ppp b / src / mongo / db / s / config / initial_split_policy_test . cpp <nl> class GenerateShardCollectionInitialZonedChunksTest : public GenerateInitialSpli <nl> timeStamp ( ) , <nl> tags , <nl> makeTagToShards ( numShards ) , <nl> - makeShardIds ( numShards ) ) ; <nl> + makeShardIds ( numShards ) , <nl> + true ) ; <nl> const std : : vector < ChunkType > expectedChunks = <nl> makeChunks ( expectedChunkRanges , expectedShardIds ) ; <nl> assertChunkVectorsAreEqual ( expectedChunks , shardCollectionConfig . chunks ) ; <nl> TEST_F ( GenerateShardCollectionInitialZonedChunksTest , ZoneNotAssociatedWithAnySh <nl> <nl> ASSERT_THROWS_CODE ( <nl> InitialSplitPolicy : : generateShardCollectionInitialZonedChunks ( <nl> - nss ( ) , shardKeyPattern ( ) , timeStamp ( ) , tags , tagToShards , makeShardIds ( 1 ) ) , <nl> + nss ( ) , shardKeyPattern ( ) , timeStamp ( ) , tags , tagToShards , makeShardIds ( 1 ) , true ) , <nl> AssertionException , <nl> 50973 ) ; <nl> } <nl> mmm a / src / mongo / db / s / config / sharding_catalog_manager_collection_operations . cpp <nl> ppp b / src / mongo / db / s / config / sharding_catalog_manager_collection_operations . cpp <nl> void ShardingCatalogManager : : shardCollection ( OperationContext * opCtx , <nl> } <nl> <nl> std : : vector < TagsType > tags ; <nl> - const auto initialChunks = InitialSplitPolicy : : createFirstChunks ( <nl> - opCtx , nss , fieldsAndOrder , dbPrimaryShardId , splitPoints , tags , distributeInitialChunks ) ; <nl> + / / Since this code runs on the config server , we cannot guarantee that the collection is still <nl> + / / empty by the time the metadata is written so always assume we are sharding a non - empty <nl> + / / collection . <nl> + bool isEmpty = false ; <nl> + const auto initialChunks = InitialSplitPolicy : : createFirstChunks ( opCtx , <nl> + nss , <nl> + fieldsAndOrder , <nl> + dbPrimaryShardId , <nl> + splitPoints , <nl> + tags , <nl> + distributeInitialChunks , <nl> + isEmpty ) ; <nl> <nl> InitialSplitPolicy : : writeFirstChunksToConfig ( opCtx , initialChunks ) ; <nl> <nl> mmm a / src / mongo / db / s / config / sharding_catalog_manager_shard_collection_test . cpp <nl> ppp b / src / mongo / db / s / config / sharding_catalog_manager_shard_collection_test . cpp <nl> const NamespaceString kNamespace ( " db1 . 
foo " ) ; <nl> <nl> class ShardCollectionTest : public ConfigServerTestFixture { <nl> public : <nl> - void expectCount ( const HostAndPort & receivingHost , <nl> - const NamespaceString & expectedNss , <nl> - const BSONObj & expectedQuery , <nl> - const StatusWith < long long > & response ) { <nl> + void expectSplitVector ( const HostAndPort & shardHost , <nl> + const ShardKeyPattern & keyPattern , <nl> + const BSONObj & splitPoints ) { <nl> onCommand ( [ & ] ( const RemoteCommandRequest & request ) { <nl> - ASSERT_EQUALS ( receivingHost , request . target ) ; <nl> + ASSERT_EQUALS ( shardHost , request . target ) ; <nl> string cmdName = request . cmdObj . firstElement ( ) . fieldName ( ) ; <nl> - <nl> - ASSERT_EQUALS ( " count " , cmdName ) ; <nl> - <nl> - const NamespaceString nss ( request . dbname , request . cmdObj . firstElement ( ) . String ( ) ) ; <nl> - ASSERT_EQUALS ( expectedNss , nss ) ; <nl> - <nl> - if ( expectedQuery . isEmpty ( ) ) { <nl> - auto queryElem = request . cmdObj [ " query " ] ; <nl> - ASSERT_TRUE ( queryElem . eoo ( ) | | queryElem . Obj ( ) . isEmpty ( ) ) ; <nl> - } else { <nl> - ASSERT_BSONOBJ_EQ ( expectedQuery , request . cmdObj [ " query " ] . Obj ( ) ) ; <nl> - } <nl> - <nl> - if ( response . isOK ( ) ) { <nl> - return BSON ( " ok " < < 1 < < " n " < < response . getValue ( ) ) ; <nl> - } <nl> - <nl> - BSONObjBuilder responseBuilder ; <nl> - CommandHelpers : : appendCommandStatusNoThrow ( responseBuilder , response . getStatus ( ) ) ; <nl> - return responseBuilder . obj ( ) ; <nl> + ASSERT_EQUALS ( " splitVector " , cmdName ) ; <nl> + ASSERT_EQUALS ( kNamespace . ns ( ) , <nl> + request . cmdObj [ " splitVector " ] . String ( ) ) ; / / splitVector uses full ns <nl> + <nl> + ASSERT_BSONOBJ_EQ ( keyPattern . toBSON ( ) , request . cmdObj [ " keyPattern " ] . Obj ( ) ) ; <nl> + ASSERT_BSONOBJ_EQ ( keyPattern . getKeyPattern ( ) . globalMin ( ) , request . cmdObj [ " min " ] . Obj ( ) ) ; <nl> + ASSERT_BSONOBJ_EQ ( keyPattern . getKeyPattern ( ) . globalMax ( ) , request . cmdObj [ " max " ] . Obj ( ) ) ; <nl> + ASSERT_EQUALS ( 64 * 1024 * 1024ULL , <nl> + static_cast < uint64_t > ( request . cmdObj [ " maxChunkSizeBytes " ] . numberLong ( ) ) ) ; <nl> + ASSERT_EQUALS ( 0 , request . cmdObj [ " maxSplitPoints " ] . numberLong ( ) ) ; <nl> + ASSERT_EQUALS ( 0 , request . cmdObj [ " maxChunkObjects " ] . numberLong ( ) ) ; <nl> + <nl> + ASSERT_BSONOBJ_EQ ( <nl> + ReadPreferenceSetting ( ReadPreference : : PrimaryPreferred ) . toContainingBSON ( ) , <nl> + rpc : : TrackingMetadata : : removeTrackingData ( request . metadata ) ) ; <nl> + <nl> + return BSON ( " ok " < < 1 < < " splitKeys " < < splitPoints ) ; <nl> } ) ; <nl> } <nl> <nl> TEST_F ( ShardCollectionTest , noInitialChunksOrData ) { <nl> testPrimaryShard ) ; <nl> } ) ; <nl> <nl> - / / Report that no documents exist for the given collection on the primary shard <nl> - expectCount ( shardHost , kNamespace , BSONObj ( ) , 0 ) ; <nl> + / / Respond to the splitVector command sent to the shard to figure out initial split points . <nl> + expectSplitVector ( shardHost , shardKeyPattern , BSONObj ( ) ) ; <nl> <nl> / / Expect the set shard version for that namespace . 
<nl> / / We do not check for a specific ChunkVersion , because we cannot easily know the OID that was <nl> TEST_F ( ShardCollectionTest , withInitialData ) { <nl> BSONObj splitPoint2 = BSON ( " _id " < < 200 ) ; <nl> BSONObj splitPoint3 = BSON ( " _id " < < 300 ) ; <nl> <nl> - ChunkVersion expectedVersion ( 1 , 0 , OID : : gen ( ) ) ; <nl> - <nl> - ChunkType expectedChunk0 ; <nl> - expectedChunk0 . setNS ( kNamespace ) ; <nl> - expectedChunk0 . setShard ( shard . getName ( ) ) ; <nl> - expectedChunk0 . setMin ( keyPattern . getKeyPattern ( ) . globalMin ( ) ) ; <nl> - expectedChunk0 . setMax ( splitPoint0 ) ; <nl> - expectedChunk0 . setVersion ( expectedVersion ) ; <nl> - expectedVersion . incMinor ( ) ; <nl> - <nl> - ChunkType expectedChunk1 ; <nl> - expectedChunk1 . setNS ( kNamespace ) ; <nl> - expectedChunk1 . setShard ( shard . getName ( ) ) ; <nl> - expectedChunk1 . setMin ( splitPoint0 ) ; <nl> - expectedChunk1 . setMax ( splitPoint1 ) ; <nl> - expectedChunk1 . setVersion ( expectedVersion ) ; <nl> - expectedVersion . incMinor ( ) ; <nl> - <nl> - ChunkType expectedChunk2 ; <nl> - expectedChunk2 . setNS ( kNamespace ) ; <nl> - expectedChunk2 . setShard ( shard . getName ( ) ) ; <nl> - expectedChunk2 . setMin ( splitPoint1 ) ; <nl> - expectedChunk2 . setMax ( splitPoint2 ) ; <nl> - expectedChunk2 . setVersion ( expectedVersion ) ; <nl> - expectedVersion . incMinor ( ) ; <nl> - <nl> - ChunkType expectedChunk3 ; <nl> - expectedChunk3 . setNS ( kNamespace ) ; <nl> - expectedChunk3 . setShard ( shard . getName ( ) ) ; <nl> - expectedChunk3 . setMin ( splitPoint2 ) ; <nl> - expectedChunk3 . setMax ( splitPoint3 ) ; <nl> - expectedChunk3 . setVersion ( expectedVersion ) ; <nl> - expectedVersion . incMinor ( ) ; <nl> - <nl> - ChunkType expectedChunk4 ; <nl> - expectedChunk4 . setNS ( kNamespace ) ; <nl> - expectedChunk4 . setShard ( shard . getName ( ) ) ; <nl> - expectedChunk4 . setMin ( splitPoint3 ) ; <nl> - expectedChunk4 . setMax ( keyPattern . getKeyPattern ( ) . globalMax ( ) ) ; <nl> - expectedChunk4 . setVersion ( expectedVersion ) ; <nl> - <nl> - vector < ChunkType > expectedChunks { <nl> - expectedChunk0 , expectedChunk1 , expectedChunk2 , expectedChunk3 , expectedChunk4 } ; <nl> - <nl> BSONObj defaultCollation ; <nl> <nl> / / Now start actually sharding the collection . <nl> TEST_F ( ShardCollectionTest , withInitialData ) { <nl> testPrimaryShard ) ; <nl> } ) ; <nl> <nl> - / / Report that documents exist for the given collection on the primary shard , so that calling <nl> - / / splitVector is required for calculating the initial split points . <nl> - expectCount ( shardHost , kNamespace , BSONObj ( ) , 1000 ) ; <nl> - <nl> - / / Respond to the splitVector command sent to the shard to figure out initial split points <nl> - onCommand ( [ & ] ( const RemoteCommandRequest & request ) { <nl> - ASSERT_EQUALS ( shardHost , request . target ) ; <nl> - string cmdName = request . cmdObj . firstElement ( ) . fieldName ( ) ; <nl> - ASSERT_EQUALS ( " splitVector " , cmdName ) ; <nl> - ASSERT_EQUALS ( kNamespace . ns ( ) , <nl> - request . cmdObj [ " splitVector " ] . String ( ) ) ; / / splitVector uses full ns <nl> - <nl> - ASSERT_BSONOBJ_EQ ( keyPattern . toBSON ( ) , request . cmdObj [ " keyPattern " ] . Obj ( ) ) ; <nl> - ASSERT_BSONOBJ_EQ ( keyPattern . getKeyPattern ( ) . globalMin ( ) , request . cmdObj [ " min " ] . Obj ( ) ) ; <nl> - ASSERT_BSONOBJ_EQ ( keyPattern . getKeyPattern ( ) . globalMax ( ) , request . cmdObj [ " max " ] . 
Obj ( ) ) ; <nl> - ASSERT_EQUALS ( 64 * 1024 * 1024ULL , <nl> - static_cast < uint64_t > ( request . cmdObj [ " maxChunkSizeBytes " ] . numberLong ( ) ) ) ; <nl> - ASSERT_EQUALS ( 0 , request . cmdObj [ " maxSplitPoints " ] . numberLong ( ) ) ; <nl> - ASSERT_EQUALS ( 0 , request . cmdObj [ " maxChunkObjects " ] . numberLong ( ) ) ; <nl> - <nl> - ASSERT_BSONOBJ_EQ ( <nl> - ReadPreferenceSetting ( ReadPreference : : PrimaryPreferred ) . toContainingBSON ( ) , <nl> - rpc : : TrackingMetadata : : removeTrackingData ( request . metadata ) ) ; <nl> - <nl> - return BSON ( " ok " < < 1 < < " splitKeys " <nl> - < < BSON_ARRAY ( splitPoint0 < < splitPoint1 < < splitPoint2 < < splitPoint3 ) ) ; <nl> - } ) ; <nl> + / / Respond to the splitVector command sent to the shard to figure out initial split points . <nl> + expectSplitVector ( shardHost , <nl> + keyPattern , <nl> + BSON_ARRAY ( splitPoint0 < < splitPoint1 < < splitPoint2 < < splitPoint3 ) ) ; <nl> <nl> / / Expect the set shard version for that namespace <nl> / / We do not check for a specific ChunkVersion , because we cannot easily know the OID that was <nl> mmm a / src / mongo / db / s / shardsvr_shard_collection . cpp <nl> ppp b / src / mongo / db / s / shardsvr_shard_collection . cpp <nl> void shardCollection ( OperationContext * opCtx , <nl> const std : : vector < TagsType > & tags , <nl> const bool fromMapReduce , <nl> const ShardId & dbPrimaryShardId , <nl> - const int numContiguousChunksPerShard ) { <nl> + const int numContiguousChunksPerShard , <nl> + const bool isEmpty ) { <nl> const auto shardRegistry = Grid : : get ( opCtx ) - > shardRegistry ( ) ; <nl> <nl> const auto primaryShard = uassertStatusOK ( shardRegistry - > getShard ( opCtx , dbPrimaryShardId ) ) ; <nl> void shardCollection ( OperationContext * opCtx , <nl> splitPoints , <nl> tags , <nl> distributeChunks , <nl> + isEmpty , <nl> numContiguousChunksPerShard ) ; <nl> <nl> / / Create collections on all shards that will receive chunks . We need to do this after we mark <nl> class ShardsvrShardCollectionCommand : public BasicCommand { <nl> <nl> if ( request . getInitialSplitPoints ( ) ) { <nl> finalSplitPoints = std : : move ( * request . getInitialSplitPoints ( ) ) ; <nl> - } else if ( ! tags . empty ( ) ) { <nl> - / / no need to find split points since we will create chunks based on <nl> - / / the existing zones <nl> - uassert ( ErrorCodes : : InvalidOptions , <nl> - str : : stream ( ) < < " found existing zones but the collection is not empty " , <nl> - isEmpty ) ; <nl> - } else { <nl> + } else if ( tags . empty ( ) ) { <nl> InitialSplitPolicy : : calculateHashedSplitPointsForEmptyCollection ( <nl> shardKeyPattern , <nl> isEmpty , <nl> class ShardsvrShardCollectionCommand : public BasicCommand { <nl> tags , <nl> fromMapReduce , <nl> ShardingState : : get ( opCtx ) - > shardId ( ) , <nl> - numContiguousChunksPerShard ) ; <nl> + numContiguousChunksPerShard , <nl> + isEmpty ) ; <nl> <nl> status = Status : : OK ( ) ; <nl> } catch ( const DBException & e ) { <nl>
SERVER - 38392 : remove assertion that we can ' t shard a non - empty collection associated with tags
mongodb/mongo
778f905b2905c00b6f394d8db6e7d12e87d7ad3d
2018-12-11T21:55:57Z
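The sharding commit above threads a new `isEmpty` flag through `createFirstChunks()` and `generateShardCollectionInitialZonedChunks()`: an empty collection still gets one chunk per zone (plus chunks for the gaps), while a non-empty zoned collection now gets a single initial chunk on the primary shard and is rebalanced later, which is what the new `testNonemptyZonedCollection()` jstest exercises. The following is a standalone sketch of that decision only, with deliberately simplified stand-in types; the real code works with ChunkType, TagsType, and ShardId rather than strings.

```cpp
// Standalone sketch of the chunk-generation decision introduced by this
// commit; the types are simplified stand-ins, not mongod code.
#include <iostream>
#include <string>
#include <utility>
#include <vector>

struct Chunk { std::string min, max, shard; };
using ZoneRange = std::pair<std::string, std::string>;  // [min, max) covered by a zone

std::vector<Chunk> initialZonedChunks(bool isEmpty,
                                      const std::vector<ZoneRange>& zones,
                                      const std::vector<std::string>& shards)
{
    std::vector<Chunk> chunks;
    if (!isEmpty) {
        // Non-empty collection: a single chunk spanning the whole key range on
        // the primary shard; the balancer handles the zones afterwards.
        chunks.push_back({"MinKey", "MaxKey", shards.front()});
        return chunks;
    }
    // Empty collection: one chunk per zone range. In the real code each zone's
    // chunk is placed on a shard associated with that zone and extra chunks are
    // created for the gaps between zones; round-robin placement here is a
    // simplification.
    std::size_t i = 0;
    for (const auto& z : zones)
        chunks.push_back({z.first, z.second, shards[i++ % shards.size()]});
    return chunks;
}

int main() {
    for (const auto& c : initialZonedChunks(false, {{"0", "10"}, {"10", "20"}},
                                            {"shard0", "shard1"}))
        std::cout << c.shard << ": [" << c.min << ", " << c.max << ")\n";
}
```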
mmm a / benchmark / utils / DriverUtils . swift <nl> ppp b / benchmark / utils / DriverUtils . swift <nl> func internalMedian ( _ inputs : [ UInt64 ] ) - > UInt64 { <nl> # if SWIFT_RUNTIME_ENABLE_LEAK_CHECKER <nl> <nl> @ _silgen_name ( " swift_leaks_startTrackingObjects " ) <nl> - func startTrackingObjects ( _ : UnsafeMutablePointer < Void > ) - > ( ) <nl> + func startTrackingObjects ( _ : UnsafeMutableRawPointer ) - > ( ) <nl> @ _silgen_name ( " swift_leaks_stopTrackingObjects " ) <nl> - func stopTrackingObjects ( _ : UnsafeMutablePointer < Void > ) - > Int <nl> + func stopTrackingObjects ( _ : UnsafeMutableRawPointer ) - > Int <nl> <nl> # endif <nl> <nl> class SampleRunner { <nl> / / Start the timer . <nl> # if SWIFT_RUNTIME_ENABLE_LEAK_CHECKER <nl> var str = name <nl> - startTrackingObjects ( UnsafeMutablePointer < Void > ( str . _core . startASCII ) ) <nl> + startTrackingObjects ( UnsafeMutableRawPointer ( str . _core . startASCII ) ) <nl> # endif <nl> let start_ticks = mach_absolute_time ( ) <nl> fn ( Int ( num_iters ) ) <nl> / / Stop the timer . <nl> let end_ticks = mach_absolute_time ( ) <nl> # if SWIFT_RUNTIME_ENABLE_LEAK_CHECKER <nl> - stopTrackingObjects ( UnsafeMutablePointer < Void > ( str . _core . startASCII ) ) <nl> + stopTrackingObjects ( UnsafeMutableRawPointer ( str . _core . startASCII ) ) <nl> # endif <nl> <nl> / / Compute the spent time and the scaling factor . <nl> mmm a / docs / proposals / CPointerInteropLanguageModel . rst <nl> ppp b / docs / proposals / CPointerInteropLanguageModel . rst <nl> accept any of the following : <nl> array , and lifetime - extended for the duration of the callee . <nl> <nl> As a special case , when a function is declared as taking an <nl> - ` ` UnsafeMutablePointer < Void > ` ` argument , it can accept the same operands as <nl> + ` ` UnsafeMutableRawPointer ` ` argument , it can accept the same operands as <nl> ` ` UnsafeMutablePointer < T > ` ` for any type T . <nl> <nl> So if you have a function declared : : <nl> You can call it as any of : : <nl> <nl> And if you have a function declared : : <nl> <nl> - func bar ( _ x : UnsafeMutablePointer < Void > ) <nl> + func bar ( _ x : UnsafeMutableRawPointer ) <nl> <nl> You can call it as any of : : <nl> <nl> accept any of the following : <nl> array , and lifetime - extended for the duration of the callee . <nl> <nl> As a special case , when a function is declared as taking an <nl> - ` ` UnsafePointer < Void > ` ` argument , it can accept the same operands as <nl> + ` ` UnsafeRawPointer ` ` argument , it can accept the same operands as <nl> ` ` UnsafePointer < T > ` ` for any type ` ` T ` ` . Pointers to certain integer <nl> types can furthermore interoperate with strings ; see ` Strings ` _ below . <nl> <nl> You can call it as any of : : <nl> <nl> And if you have a function declared : : <nl> <nl> - func zang ( _ x : UnsafePointer < Void > ) <nl> + func zang ( _ x : UnsafeRawPointer ) <nl> <nl> You can call it as any of : : <nl> <nl> You can call it as any of : : <nl> zang ( ints ) <nl> <nl> A type checker limitation prevents array literals from being passed directly <nl> - to ` ` UnsafePointer < Void > ` ` arguments without type annotation . As a <nl> + to ` ` UnsafeRawPointer ` ` arguments without type annotation . As a <nl> workaround , you can bind the array literal to a constant , as above , or <nl> specify the array type with ` ` as ` ` : : <nl> <nl> mmm a / lib / AST / Module . cpp <nl> ppp b / lib / AST / Module . 
cpp <nl> VarDecl * Module : : getDSOHandle ( ) { <nl> if ( DSOHandle ) <nl> return DSOHandle ; <nl> <nl> - auto unsafeMutablePtr = getASTContext ( ) . getUnsafeMutablePointerDecl ( ) ; <nl> - if ( ! unsafeMutablePtr ) <nl> + auto unsafeMutableRawPtr = getASTContext ( ) . getUnsafeMutableRawPointerDecl ( ) ; <nl> + if ( ! unsafeMutableRawPtr ) <nl> return nullptr ; <nl> <nl> - Type arg ; <nl> auto & ctx = getASTContext ( ) ; <nl> - if ( auto voidDecl = ctx . getVoidDecl ( ) ) { <nl> - arg = voidDecl - > getDeclaredInterfaceType ( ) ; <nl> - } else { <nl> - arg = TupleType : : getEmpty ( ctx ) ; <nl> - } <nl> - <nl> - Type type = BoundGenericType : : get ( unsafeMutablePtr , Type ( ) , { arg } ) ; <nl> auto handleVar = new ( ctx ) VarDecl ( / * IsStatic = * / false , / * IsLet = * / false , <nl> SourceLoc ( ) , <nl> ctx . getIdentifier ( " __dso_handle " ) , <nl> - type , Files [ 0 ] ) ; <nl> + unsafeMutableRawPtr - > getDeclaredType ( ) , <nl> + Files [ 0 ] ) ; <nl> handleVar - > setImplicit ( true ) ; <nl> handleVar - > getAttrs ( ) . add ( <nl> new ( ctx ) SILGenNameAttr ( " __dso_handle " , / * Implicit = * / true ) ) ; <nl> mmm a / lib / ClangImporter / ImportType . cpp <nl> ppp b / lib / ClangImporter / ImportType . cpp <nl> namespace { <nl> <nl> ImportResult VisitPointerType ( const clang : : PointerType * type ) { <nl> auto pointeeQualType = type - > getPointeeType ( ) ; <nl> + auto quals = pointeeQualType . getQualifiers ( ) ; <nl> <nl> / / Special case for NSZone * , which has its own Swift wrapper . <nl> if ( const clang : : RecordType * pointee = <nl> namespace { <nl> return { wrapperTy , ImportHint : : OtherPointer } ; <nl> } <nl> } <nl> - <nl> + <nl> + / / Import ' void * ' as ' UnsafeMutableRawPointer ' and ' const void * ' as <nl> + / / ' UnsafeRawPointer ' . This is Swift ' s version of an untyped pointer . Note <nl> + / / that ' Unsafe [ Mutable ] Pointer < T > ' implicitly converts to <nl> + / / ' Unsafe [ Mutable ] RawPointer ' for interoperability . <nl> + if ( pointeeQualType - > isVoidType ( ) ) { <nl> + return { <nl> + ( quals . hasConst ( ) ? Impl . SwiftContext . getUnsafeRawPointerDecl ( ) <nl> + : Impl . SwiftContext . getUnsafeMutableRawPointerDecl ( ) ) <nl> + - > getDeclaredType ( ) , <nl> + ImportHint : : OtherPointer } ; <nl> + } <nl> + <nl> / / All other C pointers to concrete types map to <nl> / / UnsafeMutablePointer < T > or OpaquePointer ( FIXME : , except in <nl> / / parameter position under the pre - <nl> namespace { <nl> } ; <nl> } <nl> <nl> - auto quals = pointeeQualType . getQualifiers ( ) ; <nl> - <nl> if ( quals . hasConst ( ) ) { <nl> return { Impl . getNamedSwiftTypeSpecialization ( Impl . getStdlibModule ( ) , <nl> " UnsafePointer " , <nl> mmm a / lib / IDE / CodeCompletion . cpp <nl> ppp b / lib / IDE / CodeCompletion . cpp <nl> static void addExprKeywords ( CodeCompletionResultSink & Sink ) { <nl> / / Same : Swift . IntegerLiteralType . <nl> AddKeyword ( " # line " , " Int " , CodeCompletionKeywordKind : : pound_line ) ; <nl> AddKeyword ( " # column " , " Int " , CodeCompletionKeywordKind : : pound_column ) ; <nl> - AddKeyword ( " # dsohandle " , " UnsafeMutablePointer < Void > " , CodeCompletionKeywordKind : : pound_dsohandle ) ; <nl> + AddKeyword ( " # dsohandle " , " UnsafeMutableRawPointer " , CodeCompletionKeywordKind : : pound_dsohandle ) ; <nl> } <nl> <nl> static void addAnyTypeKeyword ( CodeCompletionResultSink & Sink ) { <nl> mmm a / lib / PrintAsObjC / PrintAsObjC . cpp <nl> ppp b / lib / PrintAsObjC / PrintAsObjC . 
cpp <nl> class ObjCPrinter : private DeclVisitor < ObjCPrinter > , <nl> MAP ( Bool , " BOOL " , false ) ; <nl> <nl> MAP ( OpaquePointer , " void * " , true ) ; <nl> + MAP ( UnsafeRawPointer , " void const * " , true ) ; <nl> + MAP ( UnsafeMutableRawPointer , " void * " , true ) ; <nl> <nl> Identifier ID_ObjectiveC = ctx . Id_ObjectiveC ; <nl> specialNames [ { ID_ObjectiveC , ctx . getIdentifier ( " ObjCBool " ) } ] <nl> mmm a / lib / Sema / CSGen . cpp <nl> ppp b / lib / Sema / CSGen . cpp <nl> namespace { <nl> return visitLiteralExpr ( expr ) ; <nl> <nl> case MagicIdentifierLiteralExpr : : DSOHandle : { <nl> - / / # dsohandle has type UnsafeMutablePointer < Void > . <nl> + / / # dsohandle has type UnsafeMutableRawPointer . <nl> auto & tc = CS . getTypeChecker ( ) ; <nl> if ( tc . requirePointerArgumentIntrinsics ( expr - > getLoc ( ) ) ) <nl> return nullptr ; <nl> mmm a / stdlib / private / SwiftPrivatePthreadExtras / SwiftPrivatePthreadExtras . swift <nl> ppp b / stdlib / private / SwiftPrivatePthreadExtras / SwiftPrivatePthreadExtras . swift <nl> internal class PthreadBlockContext { <nl> / / / Execute the block , and return an ` UnsafeMutablePointer ` to memory <nl> / / / allocated with ` UnsafeMutablePointer . alloc ` containing the result of the <nl> / / / block . <nl> - func run ( ) - > UnsafeMutablePointer < Void > { fatalError ( " abstract " ) } <nl> + func run ( ) - > UnsafeMutableRawPointer { fatalError ( " abstract " ) } <nl> } <nl> <nl> internal class PthreadBlockContextImpl < Argument , Result > : PthreadBlockContext { <nl> internal class PthreadBlockContextImpl < Argument , Result > : PthreadBlockContext { <nl> super . init ( ) <nl> } <nl> <nl> - override func run ( ) - > UnsafeMutablePointer < Void > { <nl> + override func run ( ) - > UnsafeMutableRawPointer { <nl> let result = UnsafeMutablePointer < Result > . allocate ( capacity : 1 ) <nl> result . initialize ( to : block ( arg ) ) <nl> - return UnsafeMutablePointer ( result ) <nl> + return UnsafeMutableRawPointer ( result ) <nl> } <nl> } <nl> <nl> / / / Entry point for ` pthread_create ` that invokes a block context . <nl> internal func invokeBlockContext ( <nl> - _ contextAsVoidPointer : UnsafeMutablePointer < Void > ? <nl> - ) - > UnsafeMutablePointer < Void > ! { <nl> + _ contextAsVoidPointer : UnsafeMutableRawPointer ? <nl> + ) - > UnsafeMutableRawPointer ! { <nl> / / The context is passed in + 1 ; we ' re responsible for releasing it . <nl> let context = Unmanaged < PthreadBlockContext > <nl> . fromOpaque ( contextAsVoidPointer ! ) <nl> public func _stdlib_pthread_join < Result > ( <nl> _ thread : pthread_t , <nl> _ resultType : Result . Type <nl> ) - > ( CInt , Result ? ) { <nl> - var threadResultPtr : UnsafeMutablePointer < Void > ? = nil <nl> - let result = pthread_join ( thread , & threadResultPtr ) <nl> + var threadResultRawPtr : UnsafeMutableRawPointer ? = nil <nl> + let result = pthread_join ( thread , & threadResultRawPtr ) <nl> if result = = 0 { <nl> - let threadResult = UnsafeMutablePointer < Result > ( threadResultPtr ! ) . pointee <nl> - threadResultPtr ! . deinitialize ( ) <nl> - threadResultPtr ! . deallocate ( capacity : 1 ) <nl> + let threadResultPtr = threadResultRawPtr ! . assumingMemoryBound ( <nl> + to : Result . self ) <nl> + let threadResult = threadResultPtr . pointee <nl> + threadResultPtr . deinitialize ( ) <nl> + threadResultPtr . 
deallocate ( capacity : 1 ) <nl> return ( result , threadResult ) <nl> } else { <nl> return ( result , nil ) <nl> mmm a / stdlib / private / SwiftReflectionTest / SwiftReflectionTest . swift <nl> ppp b / stdlib / private / SwiftReflectionTest / SwiftReflectionTest . swift <nl> public enum InstanceKind : UInt8 { <nl> / / / Represents a section in a loaded image in this process . <nl> internal struct Section { <nl> / / / The absolute start address of the section ' s data in this address space . <nl> - let startAddress : UnsafePointer < Void > <nl> + let startAddress : UnsafeRawPointer <nl> <nl> / / / The size of the section in bytes . <nl> let size : UInt <nl> internal func sendBytes ( ) { <nl> let count = Int ( readUInt ( ) ) <nl> debugLog ( " Parent requested \ ( count ) bytes from \ ( address ) " ) <nl> var totalBytesWritten = 0 <nl> - var pointer = unsafeBitCast ( address , to : UnsafeMutablePointer < Void > . self ) <nl> + var pointer = unsafeBitCast ( address , to : UnsafeMutableRawPointer . self ) <nl> while totalBytesWritten < count { <nl> let bytesWritten = Int ( fwrite ( pointer , 1 , Int ( count ) , stdout ) ) <nl> fflush ( stdout ) <nl> internal func sendSymbolAddress ( ) { <nl> debugLog ( " BEGIN \ ( # function ) " ) ; defer { debugLog ( " END \ ( # function ) " ) } <nl> let name = readLine ( ) ! <nl> name . withCString { <nl> - let handle = unsafeBitCast ( Int ( - 2 ) , to : UnsafeMutablePointer < Void > . self ) <nl> + let handle = unsafeBitCast ( Int ( - 2 ) , to : UnsafeMutableRawPointer . self ) <nl> let symbol = dlsym ( handle , $ 0 ) <nl> let symbolAddress = unsafeBitCast ( symbol , to : UInt . self ) <nl> sendValue ( symbolAddress ) <nl> internal func sendStringLength ( ) { <nl> / / / Send the size of this architecture ' s pointer type . <nl> internal func sendPointerSize ( ) { <nl> debugLog ( " BEGIN \ ( # function ) " ) ; defer { debugLog ( " END \ ( # function ) " ) } <nl> - let pointerSize = UInt8 ( sizeof ( UnsafePointer < Void > . self ) ) <nl> + let pointerSize = UInt8 ( sizeof ( UnsafeRawPointer . self ) ) <nl> sendValue ( pointerSize ) <nl> } <nl> <nl> struct ThickFunction3 { <nl> } <nl> <nl> struct ThickFunctionParts { <nl> - var function : UnsafePointer < Void > <nl> - var context : Optional < UnsafePointer < Void > > <nl> + var function : UnsafeRawPointer <nl> + var context : Optional < UnsafeRawPointer > <nl> } <nl> <nl> / / / Reflect a closure context . The given function must be a Swift - native <nl> mmm a / stdlib / public / Platform / Platform . swift <nl> ppp b / stdlib / public / Platform / Platform . 
swift <nl> internal func _swift_Platform_fcntl ( <nl> internal func _swift_Platform_fcntlPtr ( <nl> _ fd : Int32 , <nl> _ cmd : Int32 , <nl> - _ ptr : UnsafeMutablePointer < Void > <nl> + _ ptr : UnsafeMutableRawPointer <nl> ) - > Int32 <nl> <nl> public func fcntl ( <nl> public func fcntl ( <nl> public func fcntl ( <nl> _ fd : Int32 , <nl> _ cmd : Int32 , <nl> - _ ptr : UnsafeMutablePointer < Void > <nl> + _ ptr : UnsafeMutableRawPointer <nl> ) - > Int32 { <nl> return _swift_Platform_fcntlPtr ( fd , cmd , ptr ) <nl> } <nl> internal func _swift_Platform_ioctl ( <nl> internal func _swift_Platform_ioctlPtr ( <nl> _ fd : CInt , <nl> _ request : UInt , <nl> - _ ptr : UnsafeMutablePointer < Void > <nl> + _ ptr : UnsafeMutableRawPointer <nl> ) - > CInt <nl> <nl> public func ioctl ( <nl> public func ioctl ( <nl> public func ioctl ( <nl> _ fd : CInt , <nl> _ request : UInt , <nl> - _ ptr : UnsafeMutablePointer < Void > <nl> + _ ptr : UnsafeMutableRawPointer <nl> ) - > CInt { <nl> return _swift_Platform_ioctlPtr ( fd , request , ptr ) <nl> } <nl> mmm a / stdlib / public / SDK / CoreAudio / CoreAudio . swift <nl> ppp b / stdlib / public / SDK / CoreAudio / CoreAudio . swift <nl> <nl> <nl> extension UnsafeBufferPointer { <nl> / / / Initialize an ` UnsafeBufferPointer < Element > ` from an ` AudioBuffer ` . <nl> + / / / Binds the the buffer ' s memory type to ` Element ` . <nl> public init ( _ audioBuffer : AudioBuffer ) { <nl> - self . init ( <nl> - start : UnsafePointer < Element > ( audioBuffer . mData ) , <nl> - count : Int ( audioBuffer . mDataByteSize ) / strideof ( Element . self ) ) <nl> + let count = Int ( audioBuffer . mDataByteSize ) / strideof ( Element . self ) <nl> + let elementPtr = audioBuffer . mData ? . bindMemory ( <nl> + to : Element . self , capacity : count ) <nl> + self . init ( start : elementPtr , count : count ) <nl> } <nl> } <nl> <nl> extension UnsafeMutableBufferPointer { <nl> / / / Initialize an ` UnsafeMutableBufferPointer < Element > ` from an <nl> / / / ` AudioBuffer ` . <nl> public init ( _ audioBuffer : AudioBuffer ) { <nl> - self . init ( <nl> - start : UnsafeMutablePointer < Element > ( audioBuffer . mData ) , <nl> - count : Int ( audioBuffer . mDataByteSize ) / strideof ( Element . self ) ) <nl> + let count = Int ( audioBuffer . mDataByteSize ) / strideof ( Element . self ) <nl> + let elementPtr = audioBuffer . mData ? . bindMemory ( <nl> + to : Element . self , capacity : count ) <nl> + self . init ( start : elementPtr , count : count ) <nl> } <nl> } <nl> <nl> extension AudioBuffer { <nl> numberOfChannels : Int <nl> ) { <nl> self . mNumberChannels = UInt32 ( numberOfChannels ) <nl> - self . mData = UnsafeMutablePointer < Void > ( typedBuffer . baseAddress ) <nl> + self . mData = UnsafeMutableRawPointer ( typedBuffer . baseAddress ) <nl> self . mDataByteSize = UInt32 ( typedBuffer . count * strideof ( Element . self ) ) <nl> } <nl> } <nl> extension AudioBufferList { <nl> _precondition ( ablMemory ! = nil , <nl> " failed to allocate memory for an AudioBufferList " ) <nl> <nl> - let abl = UnsafeMutableAudioBufferListPointer ( <nl> - UnsafeMutablePointer < AudioBufferList > ( ablMemory ! ) ) <nl> + let listPtr = ablMemory ! . bindMemory ( to : AudioBufferList . self , capacity : 1 ) <nl> + ( ablMemory ! + strideof ( AudioBufferList . self ) ) . bindMemory ( <nl> + to : AudioBuffer . self , capacity : maximumBuffers ) <nl> + let abl = UnsafeMutableAudioBufferListPointer ( listPtr ) <nl> abl . 
count = maximumBuffers <nl> return abl <nl> } <nl> public struct UnsafeMutableAudioBufferListPointer { <nl> / / AudioBufferList has one AudioBuffer in a " flexible array member " . <nl> / / Position the pointer after that , and skip one AudioBuffer back . This <nl> / / brings us to the start of AudioBuffer array . <nl> - return UnsafeMutablePointer < AudioBuffer > ( unsafeMutablePointer + 1 ) - 1 <nl> + let rawPtr = UnsafeMutableRawPointer ( unsafeMutablePointer + 1 ) <nl> + return rawPtr . assumingMemoryBound ( to : AudioBuffer . self ) - 1 <nl> } <nl> <nl> / / FIXME : the properties ' unsafePointer ' and ' unsafeMutablePointer ' should be <nl> mmm a / stdlib / public / SDK / Dispatch / Data . swift <nl> ppp b / stdlib / public / SDK / Dispatch / Data . swift <nl> public struct DispatchData : RandomAccessCollection , _ObjectiveCBridgeable { <nl> public func withUnsafeBytes < Result , ContentType > ( <nl> body : @ noescape ( UnsafePointer < ContentType > ) throws - > Result ) rethrows - > Result <nl> { <nl> - var ptr : UnsafePointer < Void > ? = nil <nl> + var ptr : UnsafeRawPointer ? = nil <nl> var size = 0 <nl> let data = __dispatch_data_create_map ( __wrapped , & ptr , & size ) <nl> + let contentPtr = ptr ! . bindMemory ( <nl> + to : ContentType . self , capacity : size / strideof ( ContentType . self ) ) <nl> defer { _fixLifetime ( data ) } <nl> - return try body ( UnsafePointer < ContentType > ( ptr ! ) ) <nl> + return try body ( contentPtr ) <nl> } <nl> <nl> public func enumerateBytes ( <nl> block : @ noescape ( buffer : UnsafeBufferPointer < UInt8 > , byteIndex : Int , stop : inout Bool ) - > Void ) <nl> { <nl> - _swift_dispatch_data_apply ( __wrapped ) { ( data : __DispatchData , offset : Int , ptr : UnsafePointer < Void > , size : Int ) in <nl> - let bp = UnsafeBufferPointer ( start : UnsafePointer < UInt8 > ( ptr ) , count : size ) <nl> + _swift_dispatch_data_apply ( __wrapped ) { ( data : __DispatchData , offset : Int , ptr : UnsafeRawPointer , size : Int ) in <nl> + let bytePtr = ptr . bindMemory ( to : UInt8 . self , capacity : size ) <nl> + let bp = UnsafeBufferPointer ( start : bytePtr , count : size ) <nl> var stop = false <nl> block ( buffer : bp , byteIndex : offset , stop : & stop ) <nl> return ! stop <nl> public struct DispatchData : RandomAccessCollection , _ObjectiveCBridgeable { <nl> <nl> private func _copyBytesHelper ( to pointer : UnsafeMutablePointer < UInt8 > , from range : CountableRange < Index > ) { <nl> var copiedCount = 0 <nl> - __dispatch_data_apply ( __wrapped ) { ( data : __DispatchData , offset : Int , ptr : UnsafePointer < Void > , size : Int ) in <nl> + __dispatch_data_apply ( __wrapped ) { ( data : __DispatchData , offset : Int , ptr : UnsafeRawPointer , size : Int ) in <nl> let limit = Swift . min ( ( range . endIndex - range . startIndex ) - copiedCount , size ) <nl> memcpy ( pointer + copiedCount , ptr , limit ) <nl> copiedCount + = limit <nl> public struct DispatchData : RandomAccessCollection , _ObjectiveCBridgeable { <nl> var offset = 0 <nl> let subdata = __dispatch_data_copy_region ( __wrapped , index , & offset ) <nl> <nl> - var ptr : UnsafePointer < Void > ? = nil <nl> + var ptr : UnsafeRawPointer ? = nil <nl> var size = 0 <nl> let map = __dispatch_data_create_map ( subdata , & ptr , & size ) <nl> defer { _fixLifetime ( map ) } <nl> <nl> - let pptr = UnsafePointer < UInt8 > ( ptr ! ) <nl> - return pptr [ index - offset ] <nl> + return ptr ! . load ( fromByteOffset : index - offset , as : UInt8 . 
self ) <nl> } <nl> <nl> public subscript ( bounds : Range < Int > ) - > RandomAccessSlice < DispatchData > { <nl> public struct DispatchDataIterator : IteratorProtocol , Sequence { <nl> <nl> / / / Create an iterator over the given DispatchData <nl> public init ( _data : DispatchData ) { <nl> - var ptr : UnsafePointer < Void > ? <nl> + var ptr : UnsafeRawPointer ? <nl> self . _count = 0 <nl> self . _data = __dispatch_data_create_map ( <nl> _data as __DispatchData , & ptr , & self . _count ) <nl> - self . _ptr = UnsafePointer ( ptr ) <nl> + self . _ptr = ptr <nl> self . _position = _data . startIndex <nl> <nl> / / The only time we expect a ' nil ' pointer is when the data is empty . <nl> public struct DispatchDataIterator : IteratorProtocol , Sequence { <nl> / / / element exists . <nl> public mutating func next ( ) - > DispatchData . _Element ? { <nl> if _position = = _count { return nil } <nl> - let element = _ptr [ _position ] <nl> + let element = _ptr . load ( fromByteOffset : _position , as : UInt8 . self ) <nl> _position = _position + 1 <nl> return element <nl> } <nl> <nl> internal let _data : __DispatchData <nl> - internal var _ptr : UnsafePointer < UInt8 > ! <nl> + internal var _ptr : UnsafeRawPointer ! <nl> internal var _count : Int <nl> internal var _position : DispatchData . Index <nl> } <nl> extension DispatchData { <nl> } <nl> } <nl> <nl> - typealias _swift_data_applier = @ convention ( block ) @ noescape ( __DispatchData , Int , UnsafePointer < Void > , Int ) - > Bool <nl> + typealias _swift_data_applier = @ convention ( block ) @ noescape ( __DispatchData , Int , UnsafeRawPointer , Int ) - > Bool <nl> <nl> @ _silgen_name ( " _swift_dispatch_data_apply " ) <nl> internal func _swift_dispatch_data_apply ( _ data : __DispatchData , _ block : _swift_data_applier ) <nl> mmm a / stdlib / public / SDK / Dispatch / Private . swift <nl> ppp b / stdlib / public / SDK / Dispatch / Private . swift <nl> func dispatch_write ( _ fd : Int32 , _ data : __DispatchData , _ queue : DispatchQueue , <nl> } <nl> <nl> @ available ( * , unavailable , renamed : " DispatchData . init ( bytes : ) " ) <nl> - public func dispatch_data_create ( _ buffer : UnsafePointer < Void > , _ size : Int , _ queue : DispatchQueue ? , _ destructor : ( ( ) - > Void ) ? ) - > __DispatchData <nl> + public func dispatch_data_create ( _ buffer : UnsafeRawPointer , _ size : Int , _ queue : DispatchQueue ? , _ destructor : ( ( ) - > Void ) ? ) - > __DispatchData <nl> { <nl> fatalError ( ) <nl> } <nl> public func dispatch_data_get_size ( _ data : __DispatchData ) - > Int <nl> } <nl> <nl> @ available ( * , unavailable , renamed : " DispatchData . withUnsafeBytes ( self : body : ) " ) <nl> - public func dispatch_data_create_map ( _ data : __DispatchData , _ buffer_ptr : UnsafeMutablePointer < UnsafePointer < Void > ? > ? , _ size_ptr : UnsafeMutablePointer < Int > ? ) - > __DispatchData <nl> + public func dispatch_data_create_map ( _ data : __DispatchData , _ buffer_ptr : UnsafeMutablePointer < UnsafeRawPointer ? > ? , _ size_ptr : UnsafeMutablePointer < Int > ? ) - > __DispatchData <nl> { <nl> fatalError ( ) <nl> } <nl> public func dispatch_data_create_subrange ( _ data : __DispatchData , _ offset : Int , <nl> } <nl> <nl> @ available ( * , unavailable , renamed : " DispatchData . 
enumerateBytes ( self : block : ) " ) <nl> - public func dispatch_data_apply ( _ data : __DispatchData , _ applier : ( __DispatchData , Int , UnsafePointer < Void > , Int ) - > Bool ) - > Bool <nl> + public func dispatch_data_apply ( _ data : __DispatchData , _ applier : ( __DispatchData , Int , UnsafeRawPointer , Int ) - > Bool ) - > Bool <nl> { <nl> fatalError ( ) <nl> } <nl> public func dispatch_barrier_sync ( _ queue : DispatchQueue , _ block : @ noescape ( ) <nl> } <nl> <nl> @ available ( * , unavailable , renamed : " DispatchQueue . setSpecific ( self : key : value : ) " ) <nl> - public func dispatch_queue_set_specific ( _ queue : DispatchQueue , _ key : UnsafePointer < Void > , _ context : UnsafeMutablePointer < Void > ? , _ destructor : ( @ convention ( c ) ( UnsafeMutablePointer < Void > ? ) - > Void ) ? ) <nl> + public func dispatch_queue_set_specific ( _ queue : DispatchQueue , _ key : UnsafeRawPointer , _ context : UnsafeMutableRawPointer ? , _ destructor : ( @ convention ( c ) ( UnsafeMutableRawPointer ? ) - > Void ) ? ) <nl> { <nl> fatalError ( ) <nl> } <nl> <nl> @ available ( * , unavailable , renamed : " DispatchQueue . getSpecific ( self : key : ) " ) <nl> - public func dispatch_queue_get_specific ( _ queue : DispatchQueue , _ key : UnsafePointer < Void > ) - > UnsafeMutablePointer < Void > ? <nl> + public func dispatch_queue_get_specific ( _ queue : DispatchQueue , _ key : UnsafeRawPointer ) - > UnsafeMutableRawPointer ? <nl> { <nl> fatalError ( ) <nl> } <nl> <nl> @ available ( * , unavailable , renamed : " DispatchQueue . getSpecific ( key : ) " ) <nl> - public func dispatch_get_specific ( _ key : UnsafePointer < Void > ) - > UnsafeMutablePointer < Void > ? <nl> + public func dispatch_get_specific ( _ key : UnsafeRawPointer ) - > UnsafeMutableRawPointer ? <nl> { <nl> fatalError ( ) <nl> } <nl> mmm a / stdlib / public / SDK / Dispatch / Queue . swift <nl> ppp b / stdlib / public / SDK / Dispatch / Queue . swift <nl> public extension DispatchQueue { <nl> } <nl> } <nl> <nl> - private func _destructDispatchSpecificValue ( ptr : UnsafeMutablePointer < Void > ? ) { <nl> + private func _destructDispatchSpecificValue ( ptr : UnsafeMutableRawPointer ? ) { <nl> if let p = ptr { <nl> Unmanaged < AnyObject > . fromOpaque ( p ) . release ( ) <nl> } <nl> mmm a / stdlib / public / SDK / Foundation / Data . swift <nl> ppp b / stdlib / public / SDK / Foundation / Data . 
swift <nl> <nl> @ _exported import Foundation / / Clang module <nl> <nl> @ _silgen_name ( " __NSDataInvokeDeallocatorVM " ) <nl> - internal func __NSDataInvokeDeallocatorVM ( _ mem : UnsafeMutablePointer < Void > , _ length : Int ) - > Void <nl> + internal func __NSDataInvokeDeallocatorVM ( _ mem : UnsafeMutableRawPointer , _ length : Int ) - > Void <nl> <nl> @ _silgen_name ( " __NSDataInvokeDeallocatorUnmap " ) <nl> - internal func __NSDataInvokeDeallocatorUnmap ( _ mem : UnsafeMutablePointer < Void > , _ length : Int ) - > Void <nl> + internal func __NSDataInvokeDeallocatorUnmap ( _ mem : UnsafeMutableRawPointer , _ length : Int ) - > Void <nl> <nl> @ _silgen_name ( " __NSDataInvokeDeallocatorFree " ) <nl> - internal func __NSDataInvokeDeallocatorFree ( _ mem : UnsafeMutablePointer < Void > , _ length : Int ) - > Void <nl> + internal func __NSDataInvokeDeallocatorFree ( _ mem : UnsafeMutableRawPointer , _ length : Int ) - > Void <nl> <nl> @ _silgen_name ( " _NSWriteDataToFile_Swift " ) <nl> internal func _NSWriteDataToFile_Swift ( url : URL , data : NSData , options : UInt , error : NSErrorPointer ) - > Bool <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> / / / A custom deallocator . <nl> case custom ( ( UnsafeMutablePointer < UInt8 > , Int ) - > Void ) <nl> <nl> - fileprivate var _deallocator : ( ( UnsafeMutablePointer < Void > , Int ) - > Void ) ? { <nl> + fileprivate var _deallocator : ( ( UnsafeMutableRawPointer , Int ) - > Void ) ? { <nl> switch self { <nl> case . virtualMemory : <nl> return { __NSDataInvokeDeallocatorVM ( $ 0 , $ 1 ) } <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> return nil <nl> case . custom ( let b ) : <nl> return { ( ptr , len ) in <nl> - b ( UnsafeMutablePointer < UInt8 > ( ptr ) , len ) <nl> + / / Bind memory to UInt8 since that is what the public deallocation function expects . <nl> + let bytePtr = ptr . bindMemory ( to : UInt8 . self , capacity : len ) <nl> + b ( bytePtr , len ) <nl> } <nl> } <nl> } <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> } <nl> <nl> <nl> - private func _getUnsafeBytesPointer ( ) - > UnsafePointer < Void > { <nl> + private func _getUnsafeBytesPointer ( ) - > UnsafeRawPointer { <nl> return _mapUnmanaged { return $ 0 . bytes } <nl> } <nl> <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> public func withUnsafeBytes < ResultType , ContentType > ( _ body : @ noescape ( UnsafePointer < ContentType > ) throws - > ResultType ) rethrows - > ResultType { <nl> let bytes = _getUnsafeBytesPointer ( ) <nl> defer { _fixLifetime ( self ) } <nl> - return try body ( UnsafePointer ( bytes ) ) <nl> + let contentPtr = bytes . bindMemory ( to : ContentType . self , capacity : count / strideof ( ContentType . self ) ) <nl> + return try body ( contentPtr ) <nl> } <nl> <nl> - private mutating func _getUnsafeMutableBytesPointer ( ) - > UnsafeMutablePointer < Void > { <nl> + private mutating func _getUnsafeMutableBytesPointer ( ) - > UnsafeMutableRawPointer { <nl> return _applyUnmanagedMutation { <nl> return $ 0 . 
mutableBytes <nl> } <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> public mutating func withUnsafeMutableBytes < ResultType , ContentType > ( _ body : @ noescape ( UnsafeMutablePointer < ContentType > ) throws - > ResultType ) rethrows - > ResultType { <nl> let mutableBytes = _getUnsafeMutableBytesPointer ( ) <nl> defer { _fixLifetime ( self ) } <nl> - return try body ( UnsafeMutablePointer ( mutableBytes ) ) <nl> + let contentPtr = mutableBytes . bindMemory ( to : ContentType . self , capacity : count / strideof ( ContentType . self ) ) <nl> + return try body ( UnsafeMutablePointer ( contentPtr ) ) <nl> } <nl> <nl> / / MARK : - <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> _mapUnmanaged { <nl> $ 0 . enumerateBytes { ( ptr , range , stop ) in <nl> var stopv = false <nl> - block ( buffer : UnsafeBufferPointer ( start : UnsafePointer < UInt8 > ( ptr ) , count : range . length ) , byteIndex : range . length , stop : & stopv ) <nl> + let bytePtr = ptr . bindMemory ( to : UInt8 . self , capacity : range . length ) <nl> + block ( buffer : UnsafeBufferPointer ( start : bytePtr , count : range . length ) , byteIndex : range . length , stop : & stopv ) <nl> if stopv { <nl> stop . pointee = true <nl> } <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> } <nl> <nl> @ available ( * , unavailable , message : " use withUnsafeBytes instead " ) <nl> - public var bytes : UnsafePointer < Void > { fatalError ( ) } <nl> + public var bytes : UnsafeRawPointer { fatalError ( ) } <nl> <nl> @ available ( * , unavailable , message : " use withUnsafeMutableBytes instead " ) <nl> - public var mutableBytes : UnsafeMutablePointer < Void > { fatalError ( ) } <nl> + public var mutableBytes : UnsafeMutableRawPointer { fatalError ( ) } <nl> <nl> / / / Returns ` true ` if the two ` Data ` arguments are equal . <nl> public static func = = ( d1 : Data , d2 : Data ) - > Bool { <nl> extension _SwiftNSData { <nl> } <nl> <nl> @ objc ( bytes ) <nl> - var bytes : UnsafePointer < Void > { <nl> + var bytes : UnsafeRawPointer { <nl> return _mapUnmanaged { $ 0 . bytes } <nl> } <nl> <nl> extension _SwiftNSData { <nl> } <nl> <nl> @ objc ( getBytes : length : ) <nl> - func getBytes ( _ buffer : UnsafeMutablePointer < Void > , length : Int ) { <nl> + func getBytes ( _ buffer : UnsafeMutableRawPointer , length : Int ) { <nl> return _mapUnmanaged { $ 0 . getBytes ( buffer , length : length ) } <nl> } <nl> <nl> @ objc ( getBytes : range : ) <nl> - func getBytes ( _ buffer : UnsafeMutablePointer < Void > , range : NSRange ) { <nl> + func getBytes ( _ buffer : UnsafeMutableRawPointer , range : NSRange ) { <nl> return _mapUnmanaged { $ 0 . getBytes ( buffer , range : range ) } <nl> } <nl> <nl> extension _SwiftNSData { <nl> } <nl> <nl> @ objc ( enumerateByteRangesUsingBlock : ) <nl> - func enumerateByteRanges ( using block : @ noescape ( UnsafePointer < Void > , NSRange , UnsafeMutablePointer < ObjCBool > ) - > Void ) { <nl> + func enumerateByteRanges ( using block : @ noescape ( UnsafeRawPointer , NSRange , UnsafeMutablePointer < ObjCBool > ) - > Void ) { <nl> return _mapUnmanaged { $ 0 . enumerateBytes ( block ) } <nl> } <nl> <nl> mmm a / stdlib / public / SDK / Foundation / Foundation . swift <nl> ppp b / stdlib / public / SDK / Foundation / Foundation . swift <nl> extension NSString : ExpressibleByStringLiteral { <nl> var immutableResult : NSString <nl> if value . 
hasPointerRepresentation { <nl> immutableResult = NSString ( <nl> - bytesNoCopy : UnsafeMutablePointer < Void > ( value . utf8Start ) , <nl> + bytesNoCopy : UnsafeMutableRawPointer ( mutating : value . utf8Start ) , <nl> length : Int ( value . utf8CodeUnitCount ) , <nl> encoding : value . isASCII ? String . Encoding . ascii . rawValue : String . Encoding . utf8 . rawValue , <nl> freeWhenDone : false ) ! <nl> mmm a / stdlib / public / SDK / Foundation / IndexPath . swift <nl> ppp b / stdlib / public / SDK / Foundation / IndexPath . swift <nl> public struct IndexPath : ReferenceConvertible , Equatable , Hashable , MutableColl <nl> if count = = 0 { <nl> _indexes = [ ] <nl> } else { <nl> - var ptr = UnsafeMutablePointer < Element > ( malloc ( count * sizeof ( Element . self ) ) ) <nl> + var ptr = malloc ( count * sizeof ( Element . self ) ) <nl> defer { free ( ptr ) } <nl> + <nl> + let elementPtr = ptr ! . bindMemory ( to : Element . self , capacity : count ) <nl> + nsIndexPath . getIndexes ( elementPtr , range : NSMakeRange ( 0 , count ) ) <nl> <nl> - nsIndexPath . getIndexes ( ptr ! , range : NSMakeRange ( 0 , count ) ) <nl> - <nl> - let buffer = UnsafeBufferPointer ( start : ptr , count : count ) <nl> + let buffer = UnsafeBufferPointer ( start : elementPtr , count : count ) <nl> _indexes = buffer . map { $ 0 } <nl> } <nl> } <nl> mmm a / stdlib / public / SDK / Foundation / NSError . swift <nl> ppp b / stdlib / public / SDK / Foundation / NSError . swift <nl> internal func NS_Swift_performErrorRecoverySelector ( <nl> delegate : AnyObject ? , <nl> selector : Selector , <nl> success : ObjCBool , <nl> - contextInfo : UnsafeMutablePointer < Void > ? ) <nl> + contextInfo : UnsafeMutableRawPointer ? ) <nl> <nl> / / / Class that implements the informal protocol <nl> / / / NSErrorRecoveryAttempting , which is used by NSError when it <nl> class _NSErrorRecoveryAttempter { <nl> optionIndex recoveryOptionIndex : Int , <nl> delegate : AnyObject ? , <nl> didRecoverSelector : Selector , <nl> - contextInfo : UnsafeMutablePointer < Void > ? ) { <nl> + contextInfo : UnsafeMutableRawPointer ? ) { <nl> let error = nsError as Error as ! RecoverableError <nl> error . attemptRecovery ( optionIndex : recoveryOptionIndex ) { success in <nl> NS_Swift_performErrorRecoverySelector ( <nl> mmm a / stdlib / public / SDK / Foundation / NSStringAPI . swift <nl> ppp b / stdlib / public / SDK / Foundation / NSStringAPI . swift <nl> extension String { <nl> / / / in a given encoding , and optionally frees the buffer . WARNING : <nl> / / / this initializer is not memory - safe ! <nl> public init ? ( <nl> - bytesNoCopy bytes : UnsafeMutablePointer < Void > , length : Int , <nl> + bytesNoCopy bytes : UnsafeMutableRawPointer , length : Int , <nl> encoding : Encoding , freeWhenDone flag : Bool <nl> ) { <nl> if let ns = NSString ( <nl> mmm a / stdlib / public / core / ArrayBuffer . swift <nl> ppp b / stdlib / public / core / ArrayBuffer . swift <nl> extension _ArrayBuffer { <nl> / / / A value that identifies the storage used by the buffer . Two <nl> / / / buffers address the same elements when they have the same <nl> / / / identity and count . <nl> - public var identity : UnsafePointer < Void > { <nl> + public var identity : UnsafeRawPointer { <nl> if _isNative { <nl> return _native . identity <nl> } <nl> mmm a / stdlib / public / core / ArrayBufferProtocol . swift <nl> ppp b / stdlib / public / core / ArrayBufferProtocol . 
swift <nl> public protocol _ArrayBufferProtocol <nl> / / / A value that identifies the storage used by the buffer . Two <nl> / / / buffers address the same elements when they have the same <nl> / / / identity and count . <nl> - var identity : UnsafePointer < Void > { get } <nl> + var identity : UnsafeRawPointer { get } <nl> <nl> var startIndex : Int { get } <nl> } <nl> mmm a / stdlib / public / core / Arrays . swift . gyb <nl> ppp b / stdlib / public / core / Arrays . swift . gyb <nl> extension $ { Self } : CustomStringConvertible , CustomDebugStringConvertible { <nl> extension $ { Self } { <nl> @ _versioned <nl> @ _transparent <nl> - internal func _cPointerArgs ( ) - > ( AnyObject ? , UnsafePointer < Void > ? ) { <nl> + internal func _cPointerArgs ( ) - > ( AnyObject ? , UnsafeRawPointer ? ) { <nl> let p = _baseAddressIfContiguous <nl> if _fastPath ( p ! = nil | | isEmpty ) { <nl> - return ( _owner , UnsafePointer ( p ) ) <nl> + return ( _owner , UnsafeRawPointer ( p ) ) <nl> } <nl> let n = ContiguousArray ( self . _buffer ) . _buffer <nl> - return ( n . owner , UnsafePointer ( n . firstElementAddress ) ) <nl> + return ( n . owner , UnsafeRawPointer ( n . firstElementAddress ) ) <nl> } <nl> } <nl> <nl> mmm a / stdlib / public / core / BridgeObjectiveC . swift <nl> ppp b / stdlib / public / core / BridgeObjectiveC . swift <nl> public func = = < Pointee > ( <nl> @ _fixed_layout <nl> internal struct _CocoaFastEnumerationStackBuf { <nl> / / Clang uses 16 pointers . So do we . <nl> - internal var _item0 : UnsafePointer < Void > ? <nl> - internal var _item1 : UnsafePointer < Void > ? <nl> - internal var _item2 : UnsafePointer < Void > ? <nl> - internal var _item3 : UnsafePointer < Void > ? <nl> - internal var _item4 : UnsafePointer < Void > ? <nl> - internal var _item5 : UnsafePointer < Void > ? <nl> - internal var _item6 : UnsafePointer < Void > ? <nl> - internal var _item7 : UnsafePointer < Void > ? <nl> - internal var _item8 : UnsafePointer < Void > ? <nl> - internal var _item9 : UnsafePointer < Void > ? <nl> - internal var _item10 : UnsafePointer < Void > ? <nl> - internal var _item11 : UnsafePointer < Void > ? <nl> - internal var _item12 : UnsafePointer < Void > ? <nl> - internal var _item13 : UnsafePointer < Void > ? <nl> - internal var _item14 : UnsafePointer < Void > ? <nl> - internal var _item15 : UnsafePointer < Void > ? <nl> + internal var _item0 : UnsafeRawPointer ? <nl> + internal var _item1 : UnsafeRawPointer ? <nl> + internal var _item2 : UnsafeRawPointer ? <nl> + internal var _item3 : UnsafeRawPointer ? <nl> + internal var _item4 : UnsafeRawPointer ? <nl> + internal var _item5 : UnsafeRawPointer ? <nl> + internal var _item6 : UnsafeRawPointer ? <nl> + internal var _item7 : UnsafeRawPointer ? <nl> + internal var _item8 : UnsafeRawPointer ? <nl> + internal var _item9 : UnsafeRawPointer ? <nl> + internal var _item10 : UnsafeRawPointer ? <nl> + internal var _item11 : UnsafeRawPointer ? <nl> + internal var _item12 : UnsafeRawPointer ? <nl> + internal var _item13 : UnsafeRawPointer ? <nl> + internal var _item14 : UnsafeRawPointer ? <nl> + internal var _item15 : UnsafeRawPointer ? <nl> <nl> @ _transparent <nl> internal var count : Int { <nl> internal struct _CocoaFastEnumerationStackBuf { <nl> _item15 = _item0 <nl> <nl> _sanityCheck ( sizeofValue ( self ) > = <nl> - sizeof ( Optional < UnsafePointer < Void > > . self ) * count ) <nl> + sizeof ( Optional < UnsafeRawPointer > . self ) * count ) <nl> } <nl> } <nl> <nl> mmm a / stdlib / public / core / Builtin . 
swift <nl> ppp b / stdlib / public / core / Builtin . swift <nl> internal func _isClassOrObjCExistential < T > ( _ x : T . Type ) - > Bool { <nl> / / / not much you can do with this other than use it to identify the <nl> / / / object . <nl> @ _transparent <nl> - public func unsafeAddress ( of object : AnyObject ) - > UnsafePointer < Void > { <nl> - return UnsafePointer ( Builtin . bridgeToRawPointer ( object ) ) <nl> + public func unsafeAddress ( of object : AnyObject ) - > UnsafeRawPointer { <nl> + return UnsafeRawPointer ( Builtin . bridgeToRawPointer ( object ) ) <nl> } <nl> <nl> @ available ( * , unavailable , renamed : " unsafeAddress ( of : ) " ) <nl> - public func unsafeAddressOf ( _ object : AnyObject ) - > UnsafePointer < Void > { <nl> + public func unsafeAddressOf ( _ object : AnyObject ) - > UnsafeRawPointer { <nl> Builtin . unreachable ( ) <nl> } <nl> <nl> mmm a / stdlib / public / core / CTypes . swift <nl> ppp b / stdlib / public / core / CTypes . swift <nl> extension OpaquePointer : CustomDebugStringConvertible { <nl> <nl> extension Int { <nl> public init ( bitPattern pointer : OpaquePointer ? ) { <nl> - self . init ( bitPattern : UnsafePointer < Void > ( pointer ) ) <nl> + self . init ( bitPattern : UnsafeRawPointer ( pointer ) ) <nl> } <nl> } <nl> <nl> extension UInt { <nl> public init ( bitPattern pointer : OpaquePointer ? ) { <nl> - self . init ( bitPattern : UnsafePointer < Void > ( pointer ) ) <nl> + self . init ( bitPattern : UnsafeRawPointer ( pointer ) ) <nl> } <nl> } <nl> <nl> extension OpaquePointer : Equatable { <nl> / / / The corresponding Swift type to ` va_list ` in imported C APIs . <nl> @ _fixed_layout <nl> public struct CVaListPointer { <nl> - var value : UnsafeMutablePointer < Void > <nl> + var value : UnsafeMutableRawPointer <nl> <nl> public / / @ testable <nl> - init ( _fromUnsafeMutablePointer from : UnsafeMutablePointer < Void > ) { <nl> + init ( _fromUnsafeMutablePointer from : UnsafeMutableRawPointer ) { <nl> value = from <nl> } <nl> } <nl> extension CVaListPointer : CustomDebugStringConvertible { <nl> } <nl> <nl> func _memcpy ( <nl> - dest destination : UnsafeMutablePointer < Void > , <nl> - src : UnsafeMutablePointer < Void > , <nl> + dest destination : UnsafeMutableRawPointer , <nl> + src : UnsafeMutableRawPointer , <nl> size : UInt <nl> ) { <nl> let dest = destination . _rawValue <nl> mmm a / stdlib / public / core / ContiguousArrayBuffer . swift <nl> ppp b / stdlib / public / core / ContiguousArrayBuffer . swift <nl> struct _ContiguousArrayBuffer < Element > : _ArrayBufferProtocol { <nl> / / / <nl> / / / Two buffers address the same elements when they have the same <nl> / / / identity and count . <nl> - public var identity : UnsafePointer < Void > { <nl> - return UnsafePointer ( firstElementAddress ) <nl> + public var identity : UnsafeRawPointer { <nl> + return UnsafeRawPointer ( firstElementAddress ) <nl> } <nl> <nl> / / / Returns ` true ` iff we have storage for elements of the given <nl> mmm a / stdlib / public / core / HashedCollections . swift . gyb <nl> ppp b / stdlib / public / core / HashedCollections . swift . gyb <nl> final internal class _Native $ { Self } StorageOwner < $ { TypeParametersDecl } > <nl> @ objc <nl> internal required init ( <nl> objects : UnsafePointer < AnyObject ? > , <nl> - forKeys : UnsafePointer < Void > , <nl> + forKeys : UnsafeRawPointer , <nl> count : Int <nl> ) { <nl> _sanityCheckFailure ( " don ' t call this designated initializer " ) <nl> mmm a / stdlib / public / core / Pointer . 
swift <nl> ppp b / stdlib / public / core / Pointer . swift <nl> func _convertConstArrayToPointerArgument < <nl> validPointer = ToPointer ( addr . _rawValue ) <nl> } else { <nl> let lastAlignedValue = ~ ( alignof ( FromElement . self ) - 1 ) <nl> - let lastAlignedPointer = UnsafePointer < Void > ( bitPattern : lastAlignedValue ) ! <nl> + let lastAlignedPointer = UnsafeRawPointer ( bitPattern : lastAlignedValue ) ! <nl> validPointer = ToPointer ( lastAlignedPointer . _rawValue ) <nl> } <nl> return ( owner , validPointer ) <nl> mmm a / stdlib / public / core / Runtime . swift . gyb <nl> ppp b / stdlib / public / core / Runtime . swift . gyb <nl> import SwiftShims <nl> @ _transparent <nl> public / / @ testable <nl> func _stdlib_atomicCompareExchangeStrongPtrImpl ( <nl> - object target : UnsafeMutablePointer < UnsafeMutablePointer < Void > ? > , <nl> - expected : UnsafeMutablePointer < UnsafeMutablePointer < Void > ? > , <nl> - desired : UnsafeMutablePointer < Void > ? ) - > Bool { <nl> + object target : UnsafeMutablePointer < UnsafeMutableRawPointer ? > , <nl> + expected : UnsafeMutablePointer < UnsafeMutableRawPointer ? > , <nl> + desired : UnsafeMutableRawPointer ? ) - > Bool { <nl> <nl> / / We use Builtin . Word here because Builtin . RawPointer can ' t be nil . <nl> let ( oldValue , won ) = Builtin . cmpxchg_seqcst_seqcst_Word ( <nl> target . _rawValue , <nl> UInt ( bitPattern : expected . pointee ) . _builtinWordValue , <nl> UInt ( bitPattern : desired ) . _builtinWordValue ) <nl> - expected . pointee = UnsafeMutablePointer ( bitPattern : Int ( oldValue ) ) <nl> + expected . pointee = UnsafeMutableRawPointer ( bitPattern : Int ( oldValue ) ) <nl> return Bool ( won ) <nl> } <nl> <nl> public / / @ testable <nl> func _stdlib_atomicInitializeARCRef ( <nl> object target : UnsafeMutablePointer < AnyObject ? > , <nl> desired : AnyObject ) - > Bool { <nl> - var expected : UnsafeMutablePointer < Void > ? = nil <nl> + var expected : UnsafeMutableRawPointer ? = nil <nl> let desiredPtr = Unmanaged . passRetained ( desired ) . toOpaque ( ) <nl> let wonRace = _stdlib_atomicCompareExchangeStrongPtrImpl ( <nl> object : UnsafeMutablePointer ( target ) , <nl> func _uint64ToString ( <nl> func _rawPointerToString ( _ value : Builtin . RawPointer ) - > String { <nl> var result = _uint64ToString ( <nl> UInt64 ( <nl> - UInt ( bitPattern : UnsafePointer < Void > ( value ) ) ) , <nl> + UInt ( bitPattern : UnsafeRawPointer ( value ) ) ) , <nl> radix : 16 , <nl> uppercase : false <nl> ) <nl> - for _ in 0 . . < ( 2 * sizeof ( UnsafePointer < Void > . self ) - result . utf16 . count ) { <nl> + for _ in 0 . . < ( 2 * sizeof ( UnsafeRawPointer . self ) - result . utf16 . count ) { <nl> result = " 0 " + result <nl> } <nl> return " 0x " + result <nl> mmm a / stdlib / public / core / ShadowProtocols . swift <nl> ppp b / stdlib / public / core / ShadowProtocols . swift <nl> public protocol _NSDictionaryCore : <nl> / / The designated initializer of ` NSDictionary ` . <nl> init ( <nl> objects : UnsafePointer < AnyObject ? > , <nl> - forKeys : UnsafePointer < Void > , count : Int ) <nl> + forKeys : UnsafeRawPointer , count : Int ) <nl> <nl> var count : Int { get } <nl> <nl> mmm a / stdlib / public / core / SliceBuffer . swift <nl> ppp b / stdlib / public / core / SliceBuffer . swift <nl> struct _SliceBuffer < Element > : _ArrayBufferProtocol , RandomAccessCollection { <nl> / / / A value that identifies the storage used by the buffer . 
Two <nl> / / / buffers address the same elements when they have the same <nl> / / / identity and count . <nl> - public var identity : UnsafePointer < Void > { <nl> - return UnsafePointer ( firstElementAddress ) <nl> + public var identity : UnsafeRawPointer { <nl> + return UnsafeRawPointer ( firstElementAddress ) <nl> } <nl> <nl> / / / An object that keeps the elements stored in this buffer alive . <nl> mmm a / stdlib / public / core / Unmanaged . swift <nl> ppp b / stdlib / public / core / Unmanaged . swift <nl> public struct Unmanaged < Instance : AnyObject > { <nl> / / / <nl> / / / let str : CFString = Unmanaged . fromOpaque ( ptr ) . takeUnretainedValue ( ) <nl> @ _transparent <nl> - public static func fromOpaque ( _ value : UnsafePointer < Void > ) - > Unmanaged { <nl> + public static func fromOpaque ( _ value : UnsafeRawPointer ) - > Unmanaged { <nl> return Unmanaged ( _private : unsafeBitCast ( value , to : Instance . self ) ) <nl> } <nl> <nl> public struct Unmanaged < Instance : AnyObject > { <nl> / / / let bits = Unmanaged . passUnretained ( str0 ) <nl> / / / let ptr = bits . toOpaque ( ) <nl> @ _transparent <nl> - public func toOpaque ( ) - > UnsafeMutablePointer < Void > { <nl> - return unsafeBitCast ( _value , to : UnsafeMutablePointer < Void > . self ) <nl> + public func toOpaque ( ) - > UnsafeMutableRawPointer { <nl> + return unsafeBitCast ( _value , to : UnsafeMutableRawPointer . self ) <nl> } <nl> <nl> / / / Create an unmanaged reference with an unbalanced retain . <nl> public struct Unmanaged < Instance : AnyObject > { <nl> <nl> extension Unmanaged { <nl> @ available ( * , unavailable , <nl> - message : " use ' fromOpaque ( _ : UnsafePointer < Void > ) ' instead " ) <nl> + message : " use ' fromOpaque ( _ : UnsafeRawPointer ) ' instead " ) <nl> public static func fromOpaque ( _ value : OpaquePointer ) - > Unmanaged { <nl> Builtin . unreachable ( ) <nl> } <nl> <nl> @ available ( * , unavailable , <nl> - message : " use ' toOpaque ( ) - > UnsafePointer < Void > ' instead " ) <nl> + message : " use ' toOpaque ( ) - > UnsafeRawPointer ' instead " ) <nl> public func toOpaque ( ) - > OpaquePointer { <nl> Builtin . unreachable ( ) <nl> } <nl> mmm a / stdlib / public / core / UnsafeRawPointer . swift . gyb <nl> ppp b / stdlib / public / core / UnsafeRawPointer . swift . gyb <nl> public struct Unsafe $ { Mutable } RawPointer : Strideable , Hashable , _Pointer { <nl> / / / Implements conformance to the public protocol ` _Pointer ` . <nl> public let _rawValue : Builtin . RawPointer <nl> <nl> + / / Construct $ { a_Self } from another $ { a_Self } . <nl> + / / FIXME : Why is this necessary ? <nl> + @ _transparent <nl> + public init ( _ other : Unsafe $ { Mutable } RawPointer ) { <nl> + self = other <nl> + } <nl> + <nl> / / / Convert a builtin raw pointer to $ { a_Self } . <nl> @ _transparent <nl> public init ( _ _rawValue : Builtin . RawPointer ) { <nl> public struct Unsafe $ { Mutable } RawPointer : Strideable , Hashable , _Pointer { <nl> / / / - Warning : Binding memory to a type is potentially undefined if the <nl> / / / memory is ever accessed as an unrelated type . <nl> @ _transparent <nl> + @ discardableResult <nl> public func bindMemory < T > ( to type : T . Type , capacity count : Int ) <nl> - > Unsafe $ { Mutable } Pointer < T > { <nl> Builtin . bindMemory ( _rawValue , count . _builtinWordValue , type ) <nl> mmm a / stdlib / public / core / VarArgs . swift <nl> ppp b / stdlib / public / core / VarArgs . swift <nl> final internal class _VaListBuilder { <nl> header . 
overflow_arg_area <nl> = storage . _baseAddress + _x86_64RegisterSaveWords <nl> return CVaListPointer ( <nl> - _fromUnsafeMutablePointer : UnsafeMutablePointer < Void > ( <nl> + _fromUnsafeMutablePointer : UnsafeMutableRawPointer ( <nl> Builtin . addressof ( & self . header ) ) ) <nl> } <nl> <nl> mmm a / test / 1_stdlib / ErrorBridged . swift <nl> ppp b / test / 1_stdlib / ErrorBridged . swift <nl> extension MySwiftCustomizedError : RecoverableError { <nl> optionIndex recoveryOptionIndex : Int , <nl> delegate : AnyObject ? , <nl> didRecoverSelector : Selector , <nl> - contextInfo : UnsafeMutablePointer < Void > ? ) <nl> + contextInfo : UnsafeMutableRawPointer ? ) <nl> <nl> @ objc ( attemptRecoveryFromError : optionIndex : ) <nl> func attemptRecovery ( fromError nsError : Error , <nl> extension MySwiftCustomizedError : RecoverableError { <nl> <nl> class RecoveryDelegate { <nl> let expectedSuccess : Bool <nl> - let expectedContextInfo : UnsafeMutablePointer < Void > ? <nl> + let expectedContextInfo : UnsafeMutableRawPointer ? <nl> var called = false <nl> <nl> init ( expectedSuccess : Bool , <nl> - expectedContextInfo : UnsafeMutablePointer < Void > ? ) { <nl> + expectedContextInfo : UnsafeMutableRawPointer ? ) { <nl> self . expectedSuccess = expectedSuccess <nl> self . expectedContextInfo = expectedContextInfo <nl> } <nl> <nl> - @ objc func recover ( success : Bool , contextInfo : UnsafeMutablePointer < Void > ? ) { <nl> + @ objc func recover ( success : Bool , contextInfo : UnsafeMutableRawPointer ? ) { <nl> expectEqual ( expectedSuccess , success ) <nl> expectEqual ( expectedContextInfo , contextInfo ) <nl> called = true <nl> mmm a / test / 1_stdlib / KVO . swift <nl> ppp b / test / 1_stdlib / KVO . swift <nl> class Observer : NSObject { <nl> override func observeValue ( forKeyPath : String ? , <nl> of obj : Any ? , <nl> change : Dictionary < NSKeyValueChangeKey , Any > ? , <nl> - context : UnsafeMutablePointer < Void > ? ) { <nl> + context : UnsafeMutableRawPointer ? ) { <nl> target ! . print ( ) <nl> } <nl> } <nl> class ObserverKVO : NSObject { <nl> override func observeValue ( forKeyPath : String ? , <nl> of obj : Any ? , <nl> change : Dictionary < NSKeyValueChangeKey , Any > ? , <nl> - context : UnsafeMutablePointer < Void > ? ) { <nl> + context : UnsafeMutableRawPointer ? ) { <nl> if context = = & kvoContext { <nl> target ! . print ( ) <nl> } <nl> mmm a / test / 1_stdlib / POSIX . swift <nl> ppp b / test / 1_stdlib / POSIX . swift <nl> POSIXTests . test ( " ioctl ( CInt , UInt , CInt ) : fail " ) { <nl> <nl> # if os ( Linux ) <nl> / / Successful creation of a socket and listing interfaces <nl> - POSIXTests . test ( " ioctl ( CInt , UInt , UnsafeMutablePointer < Void > ) : listing interfaces success " ) { <nl> + POSIXTests . test ( " ioctl ( CInt , UInt , UnsafeMutableRawPointer ) : listing interfaces success " ) { <nl> / / Create a socket <nl> let sock = socket ( PF_INET , 1 , 0 ) <nl> expectGT ( Int ( sock ) , 0 ) <nl> POSIXTests . test ( " fcntl ( CInt , CInt , CInt ) : block and unblocking sockets success " ) <nl> expectEqual ( 0 , rc ) <nl> } <nl> <nl> - POSIXTests . test ( " fcntl ( CInt , CInt , UnsafeMutablePointer < Void > ) : locking and unlocking success " ) { <nl> + POSIXTests . test ( " fcntl ( CInt , CInt , UnsafeMutableRawPointer ) : locking and unlocking success " ) { <nl> / / Create the file and add data to it . . . <nl> var fd = open ( fn , O_CREAT | O_WRONLY , 0o666 ) <nl> expectGT ( Int ( fd ) , 0 ) <nl> mmm a / test / 1_stdlib / Renames . 
swift <nl> ppp b / test / 1_stdlib / Renames . swift <nl> func _UnicodeScalar ( s : UnicodeScalar ) { <nl> } <nl> <nl> func _Unmanaged < T > ( x : Unmanaged < T > , p : OpaquePointer ) { <nl> - _ = Unmanaged < T > . fromOpaque ( p ) / / expected - error { { ' fromOpaque ' is unavailable : use ' fromOpaque ( _ : UnsafePointer < Void > ) ' instead } } { { none } } <nl> - let _ : OpaquePointer = x . toOpaque ( ) / / expected - error { { ' toOpaque ( ) ' is unavailable : use ' toOpaque ( ) - > UnsafePointer < Void > ' instead } } { { none } } <nl> + _ = Unmanaged < T > . fromOpaque ( p ) / / expected - error { { ' fromOpaque ' is unavailable : use ' fromOpaque ( _ : UnsafeRawPointer ) ' instead } } { { none } } <nl> + let _ : OpaquePointer = x . toOpaque ( ) / / expected - error { { ' toOpaque ( ) ' is unavailable : use ' toOpaque ( ) - > UnsafeRawPointer ' instead } } { { none } } <nl> } <nl> <nl> func _UnsafeBufferPointer ( ) { <nl> mmm a / test / 1_stdlib / Runtime . swift . gyb <nl> ppp b / test / 1_stdlib / Runtime . swift . gyb <nl> Reflection . test ( " ObjectIdentifier / CustomDebugStringConvertible " ) { <nl> expectEqual ( String ( reflecting : oi1 ) , String ( reflecting : oi1 ) ) <nl> expectNotEqual ( String ( reflecting : oi1 ) , String ( reflecting : oi2 ) ) <nl> <nl> - let p1 = UnsafePointer < Void > ( bitPattern : UInt ( bitPattern : oi1 ) ) ! <nl> + let p1 = UnsafeRawPointer ( bitPattern : UInt ( bitPattern : oi1 ) ) ! <nl> expectPrinted ( " ObjectIdentifier ( \ ( p1 ) ) " , oi1 ) <nl> - let p2 = UnsafePointer < Void > ( bitPattern : Int ( bitPattern : oi1 ) ) ! <nl> + let p2 = UnsafeRawPointer ( bitPattern : Int ( bitPattern : oi1 ) ) ! <nl> expectPrinted ( " ObjectIdentifier ( \ ( p2 ) ) " , oi1 ) <nl> <nl> } <nl> mmm a / test / 1_stdlib / TestData . swift <nl> ppp b / test / 1_stdlib / TestData . swift <nl> class TestData : TestDataSuper { <nl> class AllOnesData : NSMutableData { <nl> <nl> private var _length : Int <nl> - var _pointer : UnsafeMutableBufferPointer < Void > ? { <nl> + var _pointer : UnsafeMutableBufferPointer < UInt8 > ? { <nl> willSet { <nl> if let p = _pointer { free ( p . baseAddress ) } <nl> } <nl> class TestData : TestDataSuper { <nl> memmove ( newBuffer , ptr . baseAddress , _length ) <nl> memset ( newBuffer + _length , 1 , newValue - _length ) <nl> } <nl> - _pointer = UnsafeMutableBufferPointer ( start : newBuffer , count : newValue ) <nl> + let bytePtr = newBuffer . bindMemory ( to : UInt8 . self , capacity : newValue ) <nl> + _pointer = UnsafeMutableBufferPointer ( start : bytePtr , count : newValue ) <nl> } else { <nl> _length = newValue <nl> } <nl> } <nl> } <nl> <nl> - override var bytes : UnsafePointer < Void > { <nl> + override var bytes : UnsafeRawPointer { <nl> if let d = _pointer { <nl> - return UnsafePointer ( d . baseAddress ! ) <nl> + return UnsafeRawPointer ( d . baseAddress ! ) <nl> } else { <nl> / / Need to allocate the buffer now . <nl> / / It doesn ' t matter if the buffer is uniquely referenced or not here . <nl> let buffer = malloc ( length ) <nl> memset ( buffer , 1 , length ) <nl> - let result = UnsafeMutableBufferPointer ( start : buffer , count : length ) <nl> + let bytePtr = buffer ! . bindMemory ( to : UInt8 . self , capacity : length ) <nl> + let result = UnsafeMutableBufferPointer ( start : bytePtr , count : length ) <nl> _pointer = result <nl> - return UnsafePointer ( result . baseAddress ! ) <nl> + return UnsafeRawPointer ( result . baseAddress ! 
) <nl> } <nl> } <nl> <nl> - override var mutableBytes : UnsafeMutablePointer < Void > { <nl> + override var mutableBytes : UnsafeMutableRawPointer { <nl> let newBufferLength = _length <nl> let newBuffer = malloc ( newBufferLength ) <nl> if let ptr = _pointer { <nl> class TestData : TestDataSuper { <nl> / / Set new data to 1s <nl> memset ( newBuffer , 1 , newBufferLength ) <nl> } <nl> - <nl> - let result = UnsafeMutableBufferPointer ( start : newBuffer , count : newBufferLength ) <nl> + let bytePtr = newBuffer ! . bindMemory ( to : UInt8 . self , capacity : newBufferLength ) <nl> + let result = UnsafeMutableBufferPointer ( start : bytePtr , count : newBufferLength ) <nl> _pointer = result <nl> _length = newBufferLength <nl> - return result . baseAddress ! <nl> + return UnsafeMutableRawPointer ( result . baseAddress ! ) <nl> } <nl> <nl> - override func getBytes ( _ buffer : UnsafeMutablePointer < Void > , length : Int ) { <nl> + override func getBytes ( _ buffer : UnsafeMutableRawPointer , length : Int ) { <nl> if let d = _pointer { <nl> / / Get the real data from the buffer <nl> memmove ( buffer , d . baseAddress , length ) <nl> class TestData : TestDataSuper { <nl> / / Scope the data to a block to control lifecycle <nl> do { <nl> let buffer = malloc ( 16 ) ! <nl> - var data = Data ( bytesNoCopy : UnsafeMutablePointer < UInt8 > ( buffer ) , count : 16 , deallocator : . custom ( { ( ptr , size ) in <nl> + let bytePtr = buffer . bindMemory ( to : UInt8 . self , capacity : 16 ) <nl> + var data = Data ( bytesNoCopy : bytePtr , count : 16 , deallocator : . custom ( { ( ptr , size ) in <nl> deallocatorCalled = true <nl> - free ( UnsafeMutablePointer < Void > ( ptr ) ) <nl> + free ( UnsafeMutableRawPointer ( ptr ) ) <nl> } ) ) <nl> / / Use the data <nl> data [ 0 ] = 1 <nl> class TestData : TestDataSuper { <nl> func testCopyBytes ( ) { <nl> let c = 10 <nl> let underlyingBuffer = malloc ( c * strideof ( UInt16 . self ) ) ! <nl> - let buffer = UnsafeMutableBufferPointer < UInt16 > ( start : UnsafeMutablePointer < UInt16 > ( underlyingBuffer ) , count : c ) <nl> + let u16Ptr = underlyingBuffer . bindMemory ( to : UInt16 . self , capacity : c ) <nl> + let buffer = UnsafeMutableBufferPointer < UInt16 > ( start : u16Ptr , count : c ) <nl> <nl> buffer [ 0 ] = 0 <nl> buffer [ 1 ] = 0 <nl> class TestData : TestDataSuper { <nl> <nl> let count = 1 < < 24 <nl> let randomMemory = malloc ( count ) ! <nl> - let ptr = UnsafeMutablePointer < UInt8 > ( randomMemory ) ! <nl> + let ptr = randomMemory . bindMemory ( to : UInt8 . self , capacity : count ) <nl> let data = Data ( bytesNoCopy : ptr , count : count , deallocator : . free ) <nl> do { <nl> try data . write ( to : url ) <nl> class TestData : TestDataSuper { <nl> / / equal size <nl> let underlyingBuffer = malloc ( 6 * strideof ( MyStruct . self ) ) ! <nl> defer { free ( underlyingBuffer ) } <nl> - <nl> - let buffer = UnsafeMutableBufferPointer < MyStruct > ( start : UnsafeMutablePointer < MyStruct > ( underlyingBuffer ) , count : 6 ) <nl> + <nl> + let ptr = underlyingBuffer . bindMemory ( to : MyStruct . self , capacity : 6 ) <nl> + let buffer = UnsafeMutableBufferPointer < MyStruct > ( start : ptr , count : 6 ) <nl> <nl> let byteCount = data . copyBytes ( to : buffer ) <nl> expectEqual ( 6 * strideof ( MyStruct . self ) , byteCount ) <nl> class TestData : TestDataSuper { <nl> / / undersized <nl> let underlyingBuffer = malloc ( 3 * strideof ( MyStruct . self ) ) ! 
<nl> defer { free ( underlyingBuffer ) } <nl> - <nl> - let buffer = UnsafeMutableBufferPointer < MyStruct > ( start : UnsafeMutablePointer < MyStruct > ( underlyingBuffer ) , count : 3 ) <nl> + <nl> + let ptr = underlyingBuffer . bindMemory ( to : MyStruct . self , capacity : 3 ) <nl> + let buffer = UnsafeMutableBufferPointer < MyStruct > ( start : ptr , count : 3 ) <nl> <nl> let byteCount = data . copyBytes ( to : buffer ) <nl> expectEqual ( 3 * strideof ( MyStruct . self ) , byteCount ) <nl> class TestData : TestDataSuper { <nl> let underlyingBuffer = malloc ( 12 * strideof ( MyStruct . self ) ) ! <nl> defer { free ( underlyingBuffer ) } <nl> <nl> - let buffer = UnsafeMutableBufferPointer < MyStruct > ( start : UnsafeMutablePointer < MyStruct > ( underlyingBuffer ) , count : 6 ) <nl> + let ptr = underlyingBuffer . bindMemory ( to : MyStruct . self , capacity : 6 ) <nl> + let buffer = UnsafeMutableBufferPointer < MyStruct > ( start : ptr , count : 6 ) <nl> <nl> let byteCount = data . copyBytes ( to : buffer ) <nl> expectEqual ( 6 * strideof ( MyStruct . self ) , byteCount ) <nl> mmm a / test / 1_stdlib / Unmanaged . swift <nl> ppp b / test / 1_stdlib / Unmanaged . swift <nl> UnmanagedTests . test ( " Opaque " ) { <nl> let opaquePtr = Unmanaged . passUnretained ( ref ) . toOpaque ( ) <nl> <nl> let unknownPtr = Int ( bitPattern : opaquePtr ) <nl> - let voidPtr = UnsafePointer < Void > ( bitPattern : unknownPtr ) <nl> + let voidPtr = UnsafeRawPointer ( bitPattern : unknownPtr ) <nl> expectNotEmpty ( voidPtr , " toOpaque must not return null pointer " ) <nl> <nl> let unmanaged = Unmanaged < Foobar > . fromOpaque ( voidPtr ! ) <nl> mmm a / test / ClangModules / cfuncs_parse . swift <nl> ppp b / test / ClangModules / cfuncs_parse . swift <nl> func test_pointer ( ) { <nl> param_const_pointer ( ia ) <nl> param_const_pointer ( [ 1 , 2 , 3 ] ) <nl> <nl> - param_void_pointer ( nil as UnsafeMutablePointer < Void > ? ) <nl> + param_void_pointer ( nil as UnsafeMutableRawPointer ? ) <nl> param_void_pointer ( nil as UnsafeMutablePointer < CInt > ? ) <nl> param_void_pointer ( nil as UnsafeMutablePointer < CFloat > ? ) <nl> param_void_pointer ( & i ) <nl> func test_pointer ( ) { <nl> param_void_pointer ( & f ) <nl> param_void_pointer ( & fa ) <nl> <nl> - param_const_void_pointer ( nil as UnsafeMutablePointer < Void > ? ) <nl> + param_const_void_pointer ( nil as UnsafeMutableRawPointer ? ) <nl> param_const_void_pointer ( nil as UnsafeMutablePointer < CInt > ? ) <nl> param_const_void_pointer ( nil as UnsafeMutablePointer < CFloat > ? ) <nl> - param_const_void_pointer ( nil as UnsafePointer < Void > ? ) <nl> + param_const_void_pointer ( nil as UnsafeRawPointer ? ) <nl> param_const_void_pointer ( nil as UnsafePointer < CInt > ? ) <nl> param_const_void_pointer ( nil as UnsafePointer < CFloat > ? ) <nl> param_const_void_pointer ( & i ) <nl> mmm a / test / ClangModules / ctypes_parse . swift <nl> ppp b / test / ClangModules / ctypes_parse . swift <nl> func testFunctionPointers ( ) { <nl> useFunctionPointer ( wrapper . a ) <nl> _ = wrapper . b as ( @ convention ( c ) ( CInt ) - > CInt ) <nl> <nl> - var anotherFP : @ convention ( c ) ( CInt , CLong , UnsafeMutablePointer < Void > ? ) - > Void <nl> + var anotherFP : @ convention ( c ) ( CInt , CLong , UnsafeMutableRawPointer ? ) - > Void <nl> = getFunctionPointer2 ( ) <nl> <nl> useFunctionPointer2 ( anotherFP ) <nl> - anotherFP = fp / / expected - error { { cannot assign value of type ' fptr ? 
' to type ' @ convention ( c ) ( CInt , CLong , UnsafeMutablePointer < Void > ? ) - > Void ' } } <nl> + anotherFP = fp / / expected - error { { cannot assign value of type ' fptr ? ' to type ' @ convention ( c ) ( CInt , CLong , UnsafeMutableRawPointer ? ) - > Void ' } } <nl> } <nl> <nl> func testStructDefaultInit ( ) { <nl> mmm a / test / ClangModules / cvars_parse . swift <nl> ppp b / test / ClangModules / cvars_parse . swift <nl> func getPI ( ) - > Float { <nl> <nl> func testPointers ( ) { <nl> let cp = globalConstPointer <nl> - cp . abcde ( ) / / expected - error { { value of type ' UnsafePointer < Void > ? ' has no member ' abcde ' } } <nl> + cp . abcde ( ) / / expected - error { { value of type ' UnsafeRawPointer ? ' has no member ' abcde ' } } <nl> let mp = globalPointer <nl> - mp . abcde ( ) / / expected - error { { value of type ' UnsafeMutablePointer < Void > ? ' has no member ' abcde ' } } <nl> + mp . abcde ( ) / / expected - error { { value of type ' UnsafeMutableRawPointer ? ' has no member ' abcde ' } } <nl> } <nl> mmm a / test / ClangModules / objc_bridging . swift <nl> ppp b / test / ClangModules / objc_bridging . swift <nl> func foo ( ) { <nl> DummyClass ( ) . setProperty . onlyOnSet ( ) <nl> } <nl> <nl> - func allocateMagic ( _ zone : NSZone ) - > UnsafeMutablePointer < Void > { <nl> + func allocateMagic ( _ zone : NSZone ) - > UnsafeMutableRawPointer { <nl> return allocate ( zone ) <nl> } <nl> <nl> mmm a / test / ClangModules / objc_ir . swift <nl> ppp b / test / ClangModules / objc_ir . swift <nl> func pointerProperties ( _ obj : PointerWrapper ) { <nl> / / CHECK : load i8 * , i8 * * @ " \ 01L_selector ( setVoidPtr : ) " <nl> / / CHECK : load i8 * , i8 * * @ " \ 01L_selector ( setIntPtr : ) " <nl> / / CHECK : load i8 * , i8 * * @ " \ 01L_selector ( setIdPtr : ) " <nl> - obj . voidPtr = nil as UnsafeMutablePointer ? <nl> + obj . voidPtr = nil as UnsafeMutableRawPointer ? <nl> obj . intPtr = nil as UnsafeMutablePointer ? <nl> obj . idPtr = nil as AutoreleasingUnsafeMutablePointer ? <nl> } <nl> mmm a / test / ClangModules / serialization - sil . swift <nl> ppp b / test / ClangModules / serialization - sil . swift <nl> public func testPartialApply ( _ obj : Test ) { <nl> if let curried2 = obj . innerPointer { <nl> / / CHECK : dynamic_method_br [ [ CURRIED2_OBJ : % . + ] ] : $ @ opened ( [ [ CURRIED2_EXISTENTIAL : . + ] ] ) Test , # Test . innerPointer ! 1 . foreign , [ [ CURRIED2_TRUE : [ ^ , ] + ] ] , [ [ CURRIED2_FALSE : [ ^ , ] + ] ] <nl> / / CHECK : [ [ CURRIED2_FALSE ] ] : <nl> - / / CHECK : [ [ CURRIED2_TRUE ] ] ( [ [ CURRIED2_METHOD : % . + ] ] : $ @ convention ( objc_method ) ( @ opened ( [ [ CURRIED2_EXISTENTIAL ] ] ) Test ) - > @ unowned_inner_pointer UnsafeMutablePointer < ( ) > ) : <nl> - / / CHECK : [ [ CURRIED2_PARTIAL : % . + ] ] = partial_apply [ [ CURRIED2_METHOD ] ] ( [ [ CURRIED2_OBJ ] ] ) : $ @ convention ( objc_method ) ( @ opened ( [ [ CURRIED2_EXISTENTIAL ] ] ) Test ) - > @ unowned_inner_pointer UnsafeMutablePointer < ( ) > <nl> - / / CHECK : [ [ CURRIED2_THUNK : % . + ] ] = function_ref @ _TTRXFo__dGSpT___XFo_iT__iGSpT___ : $ @ convention ( thin ) ( @ in ( ) , @ owned @ callee_owned ( ) - > UnsafeMutablePointer < ( ) > ) - > @ out UnsafeMutablePointer < ( ) > <nl> - / / CHECK : = partial_apply [ [ CURRIED2_THUNK ] ] ( [ [ CURRIED2_PARTIAL ] ] ) : $ @ convention ( thin ) ( @ in ( ) , @ owned @ callee_owned ( ) - > UnsafeMutablePointer < ( ) > ) - > @ out UnsafeMutablePointer < ( ) > <nl> + / / CHECK : [ [ CURRIED2_TRUE ] ] ( [ [ CURRIED2_METHOD : % . 
+ ] ] : $ @ convention ( objc_method ) ( @ opened ( [ [ CURRIED2_EXISTENTIAL ] ] ) Test ) - > @ unowned_inner_pointer UnsafeMutableRawPointer ) : <nl> + / / CHECK : [ [ CURRIED2_PARTIAL : % . + ] ] = partial_apply [ [ CURRIED2_METHOD ] ] ( [ [ CURRIED2_OBJ ] ] ) : $ @ convention ( objc_method ) ( @ opened ( [ [ CURRIED2_EXISTENTIAL ] ] ) Test ) - > @ unowned_inner_pointer UnsafeMutableRawPointer <nl> + / / CHECK : [ [ CURRIED2_THUNK : % . + ] ] = function_ref @ _TTRXFo__dSv_XFo_iT__iSv_ : $ @ convention ( thin ) ( @ in ( ) , @ owned @ callee_owned ( ) - > UnsafeMutableRawPointer ) - > @ out UnsafeMutableRawPointer <nl> + / / CHECK : = partial_apply [ [ CURRIED2_THUNK ] ] ( [ [ CURRIED2_PARTIAL ] ] ) : $ @ convention ( thin ) ( @ in ( ) , @ owned @ callee_owned ( ) - > UnsafeMutableRawPointer ) - > @ out UnsafeMutableRawPointer <nl> curried2 ( ) <nl> } <nl> if let prop1 = obj . normalObjectProp { <nl> public func testPartialApply ( _ obj : Test ) { <nl> if let prop2 = obj . innerPointerProp { <nl> / / CHECK : dynamic_method_br [ [ PROP2_OBJ : % . + ] ] : $ @ opened ( [ [ PROP2_EXISTENTIAL : . + ] ] ) Test , # Test . innerPointerProp ! getter . 1 . foreign , [ [ PROP2_TRUE : [ ^ , ] + ] ] , [ [ PROP2_FALSE : [ ^ , ] + ] ] <nl> / / CHECK : [ [ PROP2_FALSE ] ] : <nl> - / / CHECK : [ [ PROP2_TRUE ] ] ( [ [ PROP2_METHOD : % . + ] ] : $ @ convention ( objc_method ) ( @ opened ( [ [ PROP2_EXISTENTIAL ] ] ) Test ) - > @ unowned_inner_pointer UnsafeMutablePointer < ( ) > ) : <nl> - / / CHECK : [ [ PROP2_PARTIAL : % . + ] ] = partial_apply [ [ PROP2_METHOD ] ] ( [ [ PROP2_OBJ ] ] ) : $ @ convention ( objc_method ) ( @ opened ( [ [ PROP2_EXISTENTIAL ] ] ) Test ) - > @ unowned_inner_pointer UnsafeMutablePointer < ( ) > <nl> - / / CHECK : = apply [ [ PROP2_PARTIAL ] ] ( ) : $ @ callee_owned ( ) - > UnsafeMutablePointer < ( ) > <nl> + / / CHECK : [ [ PROP2_TRUE ] ] ( [ [ PROP2_METHOD : % . + ] ] : $ @ convention ( objc_method ) ( @ opened ( [ [ PROP2_EXISTENTIAL ] ] ) Test ) - > @ unowned_inner_pointer UnsafeMutableRawPointer ) : <nl> + / / CHECK : [ [ PROP2_PARTIAL : % . + ] ] = partial_apply [ [ PROP2_METHOD ] ] ( [ [ PROP2_OBJ ] ] ) : $ @ convention ( objc_method ) ( @ opened ( [ [ PROP2_EXISTENTIAL ] ] ) Test ) - > @ unowned_inner_pointer UnsafeMutableRawPointer <nl> + / / CHECK : = apply [ [ PROP2_PARTIAL ] ] ( ) : $ @ callee_owned ( ) - > UnsafeMutableRawPointer <nl> _ = prop2 <nl> } <nl> } / / CHECK : { { ^ } $ } } <nl> mmm a / test / Constraints / diagnostics . swift <nl> ppp b / test / Constraints / diagnostics . 
swift <nl> _ = - UnaryOp ( ) / / expected - error { { unary operator ' - ' cannot be applied to an op <nl> <nl> / / < rdar : / / problem / 23433271 > Swift compiler segfault in failure diagnosis <nl> func f23433271 ( _ x : UnsafePointer < Int > ) { } <nl> - func segfault23433271 ( _ a : UnsafeMutablePointer < Void > ) { <nl> - f23433271 ( a [ 0 ] ) / / expected - error { { cannot convert value of type ' Void ' ( aka ' ( ) ' ) to expected argument type ' UnsafePointer < Int > ' } } <nl> + func segfault23433271 ( _ a : UnsafeMutableRawPointer ) { <nl> + f23433271 ( a [ 0 ] ) / / expected - error { { type ' UnsafeMutableRawPointer ' has no subscript members } } <nl> } <nl> <nl> / / < rdar : / / problem / 22058555 > crash in cs diags in withCString <nl> extension Foo23752537 { <nl> <nl> <nl> / / < rdar : / / problem / 22276040 > QoI : not great error message with " withUnsafePointer " sametype constraints <nl> - func read2 ( _ p : UnsafeMutablePointer < Void > , maxLength : Int ) { } <nl> + func read2 ( _ p : UnsafeMutableRawPointer , maxLength : Int ) { } <nl> func read < T : Integer > ( ) - > T ? { <nl> var buffer : T <nl> let n = withUnsafePointer ( & buffer ) { ( p ) in <nl> - read2 ( UnsafePointer ( p ) , maxLength : sizeof ( T ) ) / / expected - error { { cannot convert value of type ' UnsafePointer < _ > ' to expected argument type ' UnsafeMutablePointer < Void > ' ( aka ' UnsafeMutablePointer < ( ) > ' ) } } <nl> + read2 ( UnsafePointer ( p ) , maxLength : sizeof ( T ) ) / / expected - error { { cannot convert value of type ' UnsafePointer < _ > ' to expected argument type ' UnsafeMutableRawPointer ' } } <nl> } <nl> } <nl> <nl> mmm a / test / Constraints / lvalues . swift <nl> ppp b / test / Constraints / lvalues . swift <nl> takeArrayRef ( [ " asdf " , " 1234 " ] ) / / expected - error { { contextual type ' inout Array < S <nl> <nl> / / < rdar : / / problem / 19835413 > Reference to value from array changed <nl> func rdar19835413 ( ) { <nl> - func f1 ( _ p : UnsafeMutablePointer < Void > ) { } <nl> + func f1 ( _ p : UnsafeMutableRawPointer ) { } <nl> func f2 ( _ a : [ Int ] , i : Int , pi : UnsafeMutablePointer < Int > ) { <nl> var a = a <nl> f1 ( & a ) <nl> mmm a / test / Generics / slice_test . swift <nl> ppp b / test / Generics / slice_test . swift <nl> func testslice ( _ s : Array < Int > ) { <nl> _ = s [ 0 . . . 1 ] <nl> } <nl> <nl> - @ _silgen_name ( " malloc " ) func c_malloc ( _ size : Int ) - > UnsafeMutablePointer < Void > <nl> - @ _silgen_name ( " free " ) func c_free ( _ p : UnsafeMutablePointer < Void > ) <nl> + @ _silgen_name ( " malloc " ) func c_malloc ( _ size : Int ) - > UnsafeMutableRawPointer <nl> + @ _silgen_name ( " free " ) func c_free ( _ p : UnsafeMutableRawPointer ) <nl> <nl> class Vector < T > { <nl> var length : Int <nl> class Vector < T > { <nl> if length = = capacity { <nl> let newcapacity = capacity * 2 + 2 <nl> let size = Int ( Builtin . sizeof ( T . self ) ) <nl> - let newbase = UnsafeMutablePointer < T > ( c_malloc ( newcapacity * size ) ) <nl> + let newbase = UnsafeMutablePointer < T > ( c_malloc ( newcapacity * size ) <nl> + . bindMemory ( to : T . self , capacity : newcapacity ) ) <nl> for i in 0 . . < length { <nl> ( newbase + i ) . initialize ( to : ( base + i ) . move ( ) ) <nl> } <nl> mmm a / test / IRGen / objc_pointers . swift <nl> ppp b / test / IRGen / objc_pointers . 
swift <nl> <nl> import Foundation <nl> <nl> @ objc class Foo : NSObject { <nl> - / / CHECK : define internal void @ _TToFC13objc_pointers3Foo16pointerArgumentsfTGSpSi_1yGSpT__1zGSPSi_1wGVs33AutoreleasingUnsafeMutablePointerGSqS0____T_ ( % 0 * , i8 * , i64 * , i8 * , i64 * , % 0 * * ) <nl> + / / CHECK : define internal void @ _TToFC13objc_pointers3Foo16pointerArgumentsfTGSpSi_1ySv1zGSPSi_1wGVs33AutoreleasingUnsafeMutablePointerGSqS0____T_ ( % 0 * , i8 * , i64 * , i8 * , i64 * , % 0 * * ) <nl> @ objc func pointerArguments ( _ x : UnsafeMutablePointer < Int > , <nl> - y : UnsafeMutablePointer < Void > , <nl> + y : UnsafeMutableRawPointer , <nl> z : UnsafePointer < Int > , <nl> w : AutoreleasingUnsafeMutablePointer < Foo ? > ) { } <nl> } <nl> mmm a / test / Interpreter / SDK / KVO . swift <nl> ppp b / test / Interpreter / SDK / KVO . swift <nl> class Observer : NSObject { <nl> model . number = 42 <nl> } <nl> <nl> - override func observeValue ( forKeyPath keyPath : String ? , of object : Any ? , change : [ NSKeyValueChangeKey : Any ] ? , context : UnsafeMutablePointer < Void > ? ) { <nl> + override func observeValue ( forKeyPath keyPath : String ? , of object : Any ? , change : [ NSKeyValueChangeKey : Any ] ? , context : UnsafeMutableRawPointer ? ) { <nl> if context ! = & kvoContext { <nl> / / FIXME : we shouldn ' t need to unwrap these here , but it doesn ' t work on <nl> / / older SDKs where these are non - optional types . <nl> mmm a / test / Interpreter / SDK / Reflection_KVO . swift <nl> ppp b / test / Interpreter / SDK / Reflection_KVO . swift <nl> class ValueObserver : NSObject { <nl> observedValue . removeObserver ( self , forKeyPath : " amount " ) <nl> } <nl> <nl> - override func observeValue ( forKeyPath keyPath : String ? , of object : Any ? , change : [ NSKeyValueChangeKey : Any ] ? , context : UnsafeMutablePointer < Void > ? ) { <nl> + override func observeValue ( forKeyPath keyPath : String ? , of object : Any ? , change : [ NSKeyValueChangeKey : Any ] ? , context : UnsafeMutableRawPointer ? ) { <nl> if context = = & observeContext { <nl> if let change_ = change { <nl> if let amount = change_ [ . newKey ] as ? Int { <nl> mmm a / test / Interpreter / SDK / autolinking . swift <nl> ppp b / test / Interpreter / SDK / autolinking . swift <nl> if global ( ) ! = 42 { <nl> <nl> # else <nl> <nl> - let RTLD_DEFAULT = UnsafeMutablePointer < Void > ( bitPattern : - 2 ) <nl> + let RTLD_DEFAULT = UnsafeMutableRawPointer ( bitPattern : - 2 ) <nl> if dlsym ( RTLD_DEFAULT , " global " ) = = nil { <nl> print ( String ( cString : dlerror ( ) ) ) <nl> exit ( EXIT_FAILURE ) <nl> mmm a / test / Interpreter / SDK / c_pointers . swift <nl> ppp b / test / Interpreter / SDK / c_pointers . swift <nl> puts ( s ) <nl> <nl> var unsorted = [ 3 , 14 , 15 , 9 , 2 , 6 , 5 ] <nl> qsort ( & unsorted , unsorted . count , sizeofValue ( unsorted [ 0 ] ) ) { a , b in <nl> - return Int32 ( UnsafePointer < Int > ( a ! ) . pointee - UnsafePointer < Int > ( b ! ) . pointee ) <nl> + return Int32 ( a ! . load ( as : Int . self ) - b ! . load ( as : Int . self ) ) <nl> } <nl> / / CHECK - NEXT : [ 2 , 3 , 5 , 6 , 9 , 14 , 15 ] <nl> print ( unsorted ) <nl> mmm a / test / Interpreter / SDK / objc_inner_pointer . swift <nl> ppp b / test / Interpreter / SDK / objc_inner_pointer . swift <nl> autoreleasepool { <nl> repeat { <nl> let data = NSData ( bytes : [ 2 , 3 , 5 , 7 ] as [ UInt8 ] , length : 4 ) <nl> hangCanary ( data ) <nl> - bytes = UnsafeMutablePointer < UInt8 > ( data . 
bytes ) <nl> + bytes = UnsafeMutablePointer < UInt8 > ( data . bytes . assumingMemoryBound ( to : UInt8 . self ) ) <nl> } while false / / CHECK - NOT : died <nl> print ( bytes [ 0 ] ) / / CHECK : 2 <nl> print ( bytes [ 1 ] ) / / CHECK - NEXT : 3 <nl> autoreleasepool { <nl> let data = NSData ( bytes : [ 11 , 13 , 17 , 19 ] as [ UInt8 ] , length : 4 ) <nl> hangCanary ( data ) <nl> let dataAsAny : AnyObject = data <nl> - bytes = UnsafeMutablePointer < UInt8 > ( dataAsAny . bytes ! ) <nl> + bytes = UnsafeMutablePointer < UInt8 > ( dataAsAny . bytes ! . assumingMemoryBound ( to : UInt8 . self ) ) <nl> } while false / / CHECK - NOT : died <nl> print ( bytes [ 0 ] ) / / CHECK : 11 <nl> print ( bytes [ 1 ] ) / / CHECK - NEXT : 13 <nl> mmm a / test / Interpreter / errors_imported . swift <nl> ppp b / test / Interpreter / errors_imported . swift <nl> ErrorHandlingTests . test ( " pointerFailure " ) { <nl> ErrorHandlingTests . test ( " pointerSuccess " ) { <nl> do { <nl> var pointer = try TestingNSError . maybeThrow ( false ) <nl> - expectType ( UnsafeMutablePointer < Void > . self , & pointer ) <nl> + expectType ( UnsafeMutableRawPointer . self , & pointer ) <nl> expectEqual ( UnsafeMutablePointer ( bitPattern : 42 ) ! , pointer ) <nl> } catch { <nl> expectUnreachableCatch ( error ) <nl> mmm a / test / Parse / pointer_conversion . swift . gyb <nl> ppp b / test / Parse / pointer_conversion . swift . gyb <nl> class C { } <nl> class D { } <nl> <nl> func takesMutablePointer ( _ x : UnsafeMutablePointer < Int > $ { suffix } ) { } <nl> - func takesMutableVoidPointer ( _ x : UnsafeMutablePointer < Void > $ { suffix } ) { } <nl> + func takesMutableVoidPointer ( _ x : UnsafeMutableRawPointer $ { suffix } ) { } <nl> func takesMutableRawPointer ( _ x : UnsafeMutableRawPointer $ { suffix } ) { } <nl> func takesMutableInt8Pointer ( _ x : UnsafeMutablePointer < Int8 > $ { suffix } ) { } <nl> func takesMutableArrayPointer ( _ x : UnsafeMutablePointer < [ Int ] > $ { suffix } ) { } <nl> func takesMutableArrayPointer ( _ x : UnsafeMutablePointer < [ Int ] > $ { suffix } ) { } <nl> func takesConstPointer ( _ x : UnsafePointer < Int > $ { suffix } ) - > Character { return " x " } <nl> func takesConstInt8Pointer ( _ x : UnsafePointer < Int8 > $ { suffix } ) { } <nl> func takesConstUInt8Pointer ( _ x : UnsafePointer < UInt8 > $ { suffix } ) { } <nl> - func takesConstVoidPointer ( _ x : UnsafePointer < Void > $ { suffix } ) { } <nl> + func takesConstVoidPointer ( _ x : UnsafeRawPointer $ { suffix } ) { } <nl> func takesConstRawPointer ( _ x : UnsafeRawPointer $ { suffix } ) { } <nl> <nl> func mutablePointerArguments ( _ p : UnsafeMutablePointer < Int > , <nl> func mutableVoidPointerArguments ( _ p : UnsafeMutablePointer < Int > , <nl> <nl> takesMutableVoidPointer ( p ) <nl> takesMutableVoidPointer ( fp ) <nl> - takesMutableVoidPointer ( cp ) / / expected - error { { cannot convert value of type ' UnsafePointer < Int > ' to expected argument type ' UnsafeMutablePointer < Void > $ { suffix } ' } } <nl> + takesMutableVoidPointer ( cp ) / / expected - error { { cannot convert value of type ' UnsafePointer < Int > ' to expected argument type ' UnsafeMutableRawPointer $ { suffix } ' } } <nl> var i : Int = 0 <nl> var f : Float = 0 <nl> takesMutableVoidPointer ( & i ) <nl> takesMutableVoidPointer ( & f ) <nl> - takesMutableVoidPointer ( i ) / / expected - error { { cannot convert value of type ' Int ' to expected argument type ' UnsafeMutablePointer < Void > $ { suffix } ' } } <nl> - takesMutableVoidPointer ( f ) / / expected - 
error { { cannot convert value of type ' Float ' to expected argument type ' UnsafeMutablePointer < Void > $ { suffix } ' } } <nl> + takesMutableVoidPointer ( i ) / / expected - error { { cannot convert value of type ' Int ' to expected argument type ' UnsafeMutableRawPointer $ { suffix } ' } } <nl> + takesMutableVoidPointer ( f ) / / expected - error { { cannot convert value of type ' Float ' to expected argument type ' UnsafeMutableRawPointer $ { suffix } ' } } <nl> var ii : [ Int ] = [ 0 , 1 , 2 ] <nl> var dd : [ CInt ] = [ 1 , 2 , 3 ] <nl> var ff : [ Int ] = [ 0 , 1 , 2 ] <nl> takesMutableVoidPointer ( & ii ) <nl> takesMutableVoidPointer ( & dd ) <nl> takesMutableVoidPointer ( & ff ) <nl> - takesMutableVoidPointer ( ii ) / / expected - error { { cannot convert value of type ' [ Int ] ' to expected argument type ' UnsafeMutablePointer < Void > $ { suffix } ' } } <nl> - takesMutableVoidPointer ( ff ) / / expected - error { { cannot convert value of type ' [ Int ] ' to expected argument type ' UnsafeMutablePointer < Void > $ { suffix } ' } } <nl> + takesMutableVoidPointer ( ii ) / / expected - error { { cannot convert value of type ' [ Int ] ' to expected argument type ' UnsafeMutableRawPointer $ { suffix } ' } } <nl> + takesMutableVoidPointer ( ff ) / / expected - error { { cannot convert value of type ' [ Int ] ' to expected argument type ' UnsafeMutableRawPointer $ { suffix } ' } } <nl> <nl> / / We don ' t allow these conversions outside of function arguments . <nl> - var x : UnsafeMutablePointer < Void > = & i / / expected - error { { cannot convert value of type ' inout Int ' to specified type ' UnsafeMutablePointer < Void > ' } } <nl> - x = p / / expected - error { { cannot assign value of type ' UnsafeMutablePointer < Int > ' to type ' UnsafeMutablePointer < Void > ' } } <nl> - x = & ii / / expected - error { { cannot assign value of type ' inout [ Int ] ' ( aka ' inout Array < Int > ' ) to type ' UnsafeMutablePointer < Void > ' } } <nl> + var x : UnsafeMutableRawPointer = & i / / expected - error { { cannot convert value of type ' inout Int ' to specified type ' UnsafeMutableRawPointer ' } } <nl> + x = p / / expected - error { { cannot assign value of type ' UnsafeMutablePointer < Int > ' to type ' UnsafeMutableRawPointer ' } } <nl> + x = & ii / / expected - error { { cannot assign value of type ' inout [ Int ] ' ( aka ' inout Array < Int > ' ) to type ' UnsafeMutableRawPointer ' } } <nl> _ = x <nl> } <nl> <nl> func constVoidPointerArguments ( _ p : UnsafeMutablePointer < Int > , <nl> takesConstVoidPointer ( ff ) <nl> <nl> / / TODO : These two should be accepted , tracked by rdar : / / 17444930 . <nl> - takesConstVoidPointer ( [ 0 , 1 , 2 ] ) / / expected - error { { cannot convert value of type ' Int ' to expected element type ' ( ) ' } } <nl> - takesConstVoidPointer ( [ 0 . 0 , 1 . 0 , 2 . 0 ] ) / / expected - error { { cannot convert value of type ' Double ' to expected element type ' ( ) ' } } <nl> + takesConstVoidPointer ( [ 0 , 1 , 2 ] ) / / expected - error { { contextual type ' UnsafeRawPointer ' cannot be used with array literal } } <nl> + takesConstVoidPointer ( [ 0 . 0 , 1 . 0 , 2 . 0 ] ) / / expected - error { { contextual type ' UnsafeRawPointer ' cannot be used with array literal } } <nl> <nl> / / We don ' t allow these conversions outside of function arguments . 
<nl> - var x : UnsafePointer < Void > = & i / / expected - error { { cannot convert value of type ' inout Int ' to specified type ' UnsafePointer < Void > ' ( aka ' UnsafePointer < ( ) > ' ) } } <nl> - x = ii / / expected - error { { cannot assign value of type ' [ Int ] ' to type ' UnsafePointer < Void > ' ( aka ' UnsafePointer < ( ) > ' ) } } <nl> - x = p / / expected - error { { cannot assign value of type ' UnsafeMutablePointer < Int > ' to type ' UnsafePointer < Void > ' ( aka ' UnsafePointer < ( ) > ' ) } } <nl> - x = fp / / expected - error { { cannot assign value of type ' UnsafeMutablePointer < Float > ' to type ' UnsafePointer < Void > ' ( aka ' UnsafePointer < ( ) > ' ) } } <nl> - x = cp / / expected - error { { cannot assign value of type ' UnsafePointer < Int > ' to type ' UnsafePointer < Void > ' ( aka ' UnsafePointer < ( ) > ' ) } } <nl> - x = cfp / / expected - error { { cannot assign value of type ' UnsafePointer < Float > ' to type ' UnsafePointer < Void > ' ( aka ' UnsafePointer < ( ) > ' ) } } <nl> + var x : UnsafeRawPointer = & i / / expected - error { { cannot convert value of type ' inout Int ' to specified type ' UnsafeRawPointer ' } } <nl> + x = ii / / expected - error { { cannot assign value of type ' [ Int ] ' to type ' UnsafeRawPointer ' } } <nl> + x = p / / expected - error { { cannot assign value of type ' UnsafeMutablePointer < Int > ' to type ' UnsafeRawPointer ' } } <nl> + x = fp / / expected - error { { cannot assign value of type ' UnsafeMutablePointer < Float > ' to type ' UnsafeRawPointer ' } } <nl> + x = cp / / expected - error { { cannot assign value of type ' UnsafePointer < Int > ' to type ' UnsafeRawPointer ' } } <nl> + x = cfp / / expected - error { { cannot assign value of type ' UnsafePointer < Float > ' to type ' UnsafeRawPointer ' } } <nl> _ = x <nl> } <nl> <nl> func stringArguments ( _ s : String ) { <nl> takesConstUInt8Pointer ( s ) <nl> takesConstPointer ( s ) / / expected - error { { cannot convert value of type ' String ' to expected argument type ' UnsafePointer < Int > $ { suffix } ' } } <nl> <nl> - takesMutableVoidPointer ( s ) / / expected - error { { cannot convert value of type ' String ' to expected argument type ' UnsafeMutablePointer < Void > $ { suffix } ' } } <nl> + takesMutableVoidPointer ( s ) / / expected - error { { cannot convert value of type ' String ' to expected argument type ' UnsafeMutableRawPointer $ { suffix } ' } } <nl> takesMutableRawPointer ( s ) / / expected - error { { cannot convert value of type ' String ' to expected argument type ' UnsafeMutableRawPointer $ { suffix } ' } } <nl> takesMutableInt8Pointer ( s ) / / expected - error { { cannot convert value of type ' String ' to expected argument type ' UnsafeMutablePointer < Int8 > $ { suffix } ' } } <nl> takesMutableInt8Pointer ( & s ) / / expected - error { { cannot convert value of type ' String ' to expected argument type ' Int8 ' } } <nl> func f19478919 ( ) { <nl> GLKProject ( & viewport ) / / expected - error { { cannot convert value of type ' Int ' to expected argument type ' Int32 ' } } <nl> <nl> func GLKProjectUP ( _ a : UnsafePointer < Int32 > ) { } <nl> - func UP_Void ( _ a : UnsafePointer < Void > ) { } <nl> - func UMP_Void ( _ a : UnsafeMutablePointer < Void > ) { } <nl> + func UP_Void ( _ a : UnsafeRawPointer ) { } <nl> + func UMP_Void ( _ a : UnsafeMutableRawPointer ) { } <nl> UP_Void ( & viewport ) <nl> UMP_Void ( & viewport ) <nl> <nl> mmm a / test / Parse / pointer_conversion_objc . swift . 
gyb <nl> ppp b / test / Parse / pointer_conversion_objc . swift . gyb <nl> class C { } <nl> class D { } <nl> <nl> func takesMutablePointer ( _ x : UnsafeMutablePointer < Int > $ { suffix } ) { } <nl> - func takesMutableVoidPointer ( _ x : UnsafeMutablePointer < Void > $ { suffix } ) { } <nl> + func takesMutableVoidPointer ( _ x : UnsafeMutableRawPointer $ { suffix } ) { } <nl> @ discardableResult <nl> func takesConstPointer ( _ x : UnsafePointer < Int > $ { suffix } ) - > Character { return " x " } <nl> - func takesConstVoidPointer ( _ x : UnsafePointer < Void > $ { suffix } ) { } <nl> + func takesConstVoidPointer ( _ x : UnsafeRawPointer $ { suffix } ) { } <nl> <nl> func takesAutoreleasingPointer ( _ x : AutoreleasingUnsafeMutablePointer < C > $ { suffix } ) { } <nl> <nl> func pointerArgumentsObjC ( ap : AutoreleasingUnsafeMutablePointer < Int > , <nl> afp : AutoreleasingUnsafeMutablePointer < Float > ) { <nl> takesMutablePointer ( ap ) / / expected - error { { cannot convert value of type ' AutoreleasingUnsafeMutablePointer < Int > ' to expected argument type ' UnsafeMutablePointer < Int > $ { suffix } ' } } <nl> - takesMutableVoidPointer ( ap ) / / expected - error { { cannot convert value of type ' AutoreleasingUnsafeMutablePointer < Int > ' to expected argument type ' UnsafeMutablePointer < Void > $ { suffix } ' } } <nl> + takesMutableVoidPointer ( ap ) / / expected - error { { cannot convert value of type ' AutoreleasingUnsafeMutablePointer < Int > ' to expected argument type ' UnsafeMutableRawPointer $ { suffix } ' } } <nl> takesConstPointer ( ap ) <nl> takesConstVoidPointer ( ap ) <nl> takesConstVoidPointer ( afp ) <nl> <nl> - var x : UnsafePointer < Void > <nl> - x = ap / / expected - error { { cannot assign value of type ' AutoreleasingUnsafeMutablePointer < Int > ' to type ' UnsafePointer < Void > ' ( aka ' UnsafePointer < ( ) > ' ) } } <nl> + var x : UnsafeRawPointer <nl> + x = ap / / expected - error { { cannot assign value of type ' AutoreleasingUnsafeMutablePointer < Int > ' to type ' UnsafeRawPointer ' } } <nl> _ = x <nl> } <nl> <nl> mmm a / test / PrintAsObjC / classes . swift <nl> ppp b / test / PrintAsObjC / classes . swift <nl> typealias AliasForNSRect = NSRect <nl> func testNested ( _ a : UnsafeMutablePointer < UnsafeMutablePointer < Int > > ) { } <nl> <nl> func testBridging ( _ a : UnsafePointer < Int > , b : UnsafeMutablePointer < Int > , c : AutoreleasingUnsafeMutablePointer < Methods > ) { } <nl> - func testBridgingVoid ( _ a : UnsafeMutablePointer < Void > , b : UnsafePointer < Void > ) { } <nl> + func testBridgingVoid ( _ a : UnsafeMutableRawPointer , b : UnsafeRawPointer ) { } <nl> <nl> func testBridgingOptionality ( _ a : UnsafePointer < Int > ? , b : UnsafeMutablePointer < Int > ! , c : AutoreleasingUnsafeMutablePointer < Methods ? > ? ) { } <nl> } <nl> mmm a / test / SILGen / default_arguments . swift <nl> ppp b / test / SILGen / default_arguments . 
swift <nl> func testTakeDefaultArgUnnamed ( _ i : Int ) { <nl> takeDefaultArgUnnamed ( i ) <nl> } <nl> <nl> - func takeDSOHandle ( _ handle : UnsafeMutablePointer < Void > = # dsohandle ) { } <nl> + func takeDSOHandle ( _ handle : UnsafeMutableRawPointer = # dsohandle ) { } <nl> <nl> / / CHECK - LABEL : sil hidden @ _TF17default_arguments13testDSOHandleFT_T_ <nl> func testDSOHandle ( ) { <nl> - / / CHECK : [ [ DSO_HANDLE : % [ 0 - 9 ] + ] ] = global_addr @ __dso_handle : $ * UnsafeMutablePointer < ( ) > <nl> + / / CHECK : [ [ DSO_HANDLE : % [ 0 - 9 ] + ] ] = global_addr @ __dso_handle : $ * UnsafeMutableRawPointer <nl> takeDSOHandle ( ) <nl> } <nl> <nl> mmm a / test / SILGen / dso_handle . swift <nl> ppp b / test / SILGen / dso_handle . swift <nl> <nl> / / RUN : % target - swift - frontend - Xllvm - sil - full - demangle - emit - silgen % s | FileCheck % s <nl> <nl> - / / CHECK : sil_global hidden_external @ __dso_handle : $ UnsafeMutablePointer < ( ) > <nl> + / / CHECK : sil_global hidden_external @ __dso_handle : $ UnsafeMutableRawPointer <nl> <nl> / / CHECK - LABEL : sil @ main : $ @ convention ( c ) <nl> / / CHECK : bb0 <nl> - / / CHECK : [ [ DSO : % [ 0 - 9 ] + ] ] = global_addr @ __dso_handle : $ * UnsafeMutablePointer < ( ) > <nl> + / / CHECK : [ [ DSO : % [ 0 - 9 ] + ] ] = global_addr @ __dso_handle : $ * UnsafeMutableRawPointer <nl> / / CHECK : load [ [ DSO ] ] <nl> <nl> - / / CHECK - LABEL : sil hidden @ _TIF10dso_handle14printDSOHandleFT3dsoGSpT___GSpT__A_ <nl> - / / CHECK : [ [ DSO : % [ 0 - 9 ] + ] ] = global_addr @ __dso_handle : $ * UnsafeMutablePointer < ( ) > <nl> + / / CHECK - LABEL : sil hidden @ _TIF10dso_handle14printDSOHandleFT3dsoSv_SvA_ <nl> + / / CHECK : [ [ DSO : % [ 0 - 9 ] + ] ] = global_addr @ __dso_handle : $ * UnsafeMutableRawPointer <nl> / / CHECK : load [ [ DSO ] ] <nl> - func printDSOHandle ( dso : UnsafeMutablePointer < Void > = # dsohandle ) - > UnsafeMutablePointer < Void > { <nl> + func printDSOHandle ( dso : UnsafeMutableRawPointer = # dsohandle ) - > UnsafeMutableRawPointer { <nl> print ( dso ) <nl> } <nl> <nl> mmm a / test / SILGen / lying_about_optional_return . swift <nl> ppp b / test / SILGen / lying_about_optional_return . swift <nl> <nl> func optionalChainingForeignFunctionTypeProperties ( a : SomeCallbacks ? ) { <nl> / / CHECK : enum $ Optional < ( ) > , # Optional . some ! enumelt . 1 , { { % . * } } : $ ( ) <nl> let _ : ( ) ? = voidReturning ( ) <nl> - / / CHECK : unchecked_trivial_bit_cast { { % . * } } : $ UnsafeMutablePointer < ( ) > to $ Optional < UnsafeMutablePointer < ( ) > > <nl> - let _ : UnsafeMutablePointer < Void > ? = voidPointerReturning ( ) <nl> + / / CHECK : unchecked_trivial_bit_cast { { % . * } } : $ UnsafeMutableRawPointer to $ Optional < UnsafeMutableRawPointer > <nl> + let _ : UnsafeMutableRawPointer ? = voidPointerReturning ( ) <nl> / / CHECK : unchecked_trivial_bit_cast { { % . * } } : $ UnsafeMutablePointer < Int8 > to $ Optional < UnsafeMutablePointer < Int8 > > <nl> let _ : UnsafeMutablePointer < Int8 > ? = pointerReturning ( ) <nl> / / CHECK : unchecked_trivial_bit_cast { { % . * } } : $ UnsafePointer < Int8 > to $ Optional < UnsafePointer < Int8 > > <nl> func optionalChainingForeignFunctionTypeProperties ( a : SomeCallbacks ? ) { <nl> <nl> / / CHECK : enum $ Optional < ( ) > , # Optional . some ! enumelt . 1 , { { % . * } } : $ ( ) <nl> a ? . voidReturning ( ) <nl> - / / CHECK : unchecked_trivial_bit_cast { { % . 
* } } : $ UnsafeMutablePointer < ( ) > to $ Optional < UnsafeMutablePointer < ( ) > > <nl> + / / CHECK : unchecked_trivial_bit_cast { { % . * } } : $ UnsafeMutableRawPointer to $ Optional < UnsafeMutableRawPointer > <nl> a ? . voidPointerReturning ( ) <nl> / / CHECK : unchecked_trivial_bit_cast { { % . * } } : $ UnsafeMutablePointer < Int8 > to $ Optional < UnsafeMutablePointer < Int8 > > <nl> a ? . pointerReturning ( ) <nl> mmm a / test / SILGen / objc_currying . swift <nl> ppp b / test / SILGen / objc_currying . swift <nl> func curry_bridged ( _ x : CurryTest ) - > ( String ! ) - > String ! { <nl> / / CHECK : strong_release % 1 <nl> / / CHECK : return { { % . * } } : $ ImplicitlyUnwrappedOptional < String > <nl> <nl> - func curry_returnsInnerPointer ( _ x : CurryTest ) - > ( ) - > UnsafeMutablePointer < Void > ! { <nl> + func curry_returnsInnerPointer ( _ x : CurryTest ) - > ( ) - > UnsafeMutableRawPointer ! { <nl> return x . returnsInnerPointer <nl> } <nl> - / / CHECK - LABEL : sil hidden @ _TF13objc_currying25curry_returnsInnerPointerFCSo9CurryTestFT_GSQGSpT___ : $ @ convention ( thin ) ( @ owned CurryTest ) - > @ owned @ callee_owned ( ) - > ImplicitlyUnwrappedOptional < UnsafeMutablePointer < ( ) > > { <nl> - / / CHECK : [ [ THUNK : % . * ] ] = function_ref [ [ THUNK_RETURNSINNERPOINTER : @ _TTOFCSo9CurryTest19returnsInnerPointerFT_GSQGSpT___ ] ] <nl> + / / CHECK - LABEL : sil hidden @ _TF13objc_currying25curry_returnsInnerPointerFCSo9CurryTestFT_GSQSv_ : $ @ convention ( thin ) ( @ owned CurryTest ) - > @ owned @ callee_owned ( ) - > ImplicitlyUnwrappedOptional < UnsafeMutableRawPointer > { <nl> + / / CHECK : [ [ THUNK : % . * ] ] = function_ref [ [ THUNK_RETURNSINNERPOINTER : @ _TTOFCSo9CurryTest19returnsInnerPointerFT_GSQSv_ ] ] <nl> / / CHECK : [ [ FN : % . * ] ] = apply [ [ THUNK ] ] ( % 0 ) <nl> / / CHECK : return [ [ FN ] ] <nl> <nl> - / / CHECK : sil shared [ thunk ] [ [ THUNK_RETURNSINNERPOINTER ] ] : $ @ convention ( thin ) ( @ owned CurryTest ) - > @ owned @ callee_owned ( ) - > ImplicitlyUnwrappedOptional < UnsafeMutablePointer < ( ) > > <nl> - / / CHECK : [ [ THUNK : % . * ] ] = function_ref [ [ THUNK_RETURNSINNERPOINTER_2 : @ _TTOFCSo9CurryTest19returnsInnerPointerfT_GSQGSpT___ ] ] <nl> + / / CHECK : sil shared [ thunk ] [ [ THUNK_RETURNSINNERPOINTER ] ] : $ @ convention ( thin ) ( @ owned CurryTest ) - > @ owned @ callee_owned ( ) - > ImplicitlyUnwrappedOptional < UnsafeMutableRawPointer > <nl> + / / CHECK : [ [ THUNK : % . * ] ] = function_ref [ [ THUNK_RETURNSINNERPOINTER_2 : @ _TTOFCSo9CurryTest19returnsInnerPointerfT_GSQSv_ ] ] <nl> / / CHECK : [ [ FN : % . * ] ] = partial_apply [ [ THUNK ] ] ( % 0 ) <nl> / / CHECK : return [ [ FN ] ] <nl> <nl> - / / CHECK : sil shared [ thunk ] @ _TTOFCSo9CurryTest19returnsInnerPointerfT_GSQGSpT___ : $ @ convention ( method ) ( @ guaranteed CurryTest ) - > ImplicitlyUnwrappedOptional < UnsafeMutablePointer < ( ) > > <nl> + / / CHECK : sil shared [ thunk ] @ _TTOFCSo9CurryTest19returnsInnerPointerfT_GSQSv_ : $ @ convention ( method ) ( @ guaranteed CurryTest ) - > ImplicitlyUnwrappedOptional < UnsafeMutableRawPointer > <nl> / / CHECK : bb0 ( [ [ ARG1 : % . * ] ] : <nl> / / CHECK : strong_retain [ [ ARG1 ] ] <nl> / / CHECK : [ [ METHOD : % . * ] ] = class_method [ volatile ] % 0 : $ CurryTest , # CurryTest . returnsInnerPointer ! 1 . foreign <nl> - / / CHECK : [ [ RES : % . 
* ] ] = apply [ [ METHOD ] ] ( % 0 ) : $ @ convention ( objc_method ) ( CurryTest ) - > @ unowned_inner_pointer ImplicitlyUnwrappedOptional < UnsafeMutablePointer < ( ) > > <nl> + / / CHECK : [ [ RES : % . * ] ] = apply [ [ METHOD ] ] ( % 0 ) : $ @ convention ( objc_method ) ( CurryTest ) - > @ unowned_inner_pointer ImplicitlyUnwrappedOptional < UnsafeMutableRawPointer > <nl> / / CHECK : autorelease_value % 0 <nl> / / CHECK : return [ [ RES ] ] <nl> <nl> func curry_returnsSelf_AnyObject ( _ x : AnyObject ) - > ( ) - > AnyObject ! { <nl> return x . returnsSelf ! <nl> } <nl> <nl> - / / CHECK - LABEL : sil hidden @ _TF13objc_currying35curry_returnsInnerPointer_AnyObjectFPs9AnyObject_FT_GSQGSpT___ <nl> + / / CHECK - LABEL : sil hidden @ _TF13objc_currying35curry_returnsInnerPointer_AnyObjectFPs9AnyObject_FT_GSQSv_ <nl> / / CHECK : dynamic_method_br [ [ SELF : % . * ] ] : $ @ opened ( { { . * } } ) AnyObject , # CurryTest . returnsInnerPointer ! 1 . foreign , [ [ HAS_METHOD : bb [ 0 - 9 ] + ] ] <nl> - / / CHECK : [ [ HAS_METHOD ] ] ( [ [ METHOD : % . * ] ] : $ @ convention ( objc_method ) ( @ opened ( { { . * } } ) AnyObject ) - > @ unowned_inner_pointer ImplicitlyUnwrappedOptional < UnsafeMutablePointer < ( ) > > ) : <nl> + / / CHECK : [ [ HAS_METHOD ] ] ( [ [ METHOD : % . * ] ] : $ @ convention ( objc_method ) ( @ opened ( { { . * } } ) AnyObject ) - > @ unowned_inner_pointer ImplicitlyUnwrappedOptional < UnsafeMutableRawPointer > ) : <nl> / / CHECK : [ [ PA : % . * ] ] = partial_apply [ [ METHOD ] ] ( [ [ SELF ] ] ) <nl> - / / CHECK : [ [ PA ] ] { { . * } } @ owned @ callee_owned ( ) - > ImplicitlyUnwrappedOptional < UnsafeMutablePointer < ( ) > > <nl> + / / CHECK : [ [ PA ] ] { { . * } } @ owned @ callee_owned ( ) - > ImplicitlyUnwrappedOptional < UnsafeMutableRawPointer > <nl> <nl> - func curry_returnsInnerPointer_AnyObject ( _ x : AnyObject ) - > ( ) - > UnsafeMutablePointer < Void > ! { <nl> + func curry_returnsInnerPointer_AnyObject ( _ x : AnyObject ) - > ( ) - > UnsafeMutableRawPointer ! { <nl> return x . returnsInnerPointer ! <nl> } <nl> mmm a / test / SILGen / objc_ownership_conventions . swift <nl> ppp b / test / SILGen / objc_ownership_conventions . swift <nl> func maybeApplyBlock ( _ f : ( @ convention ( block ) ( Gizmo ) - > Gizmo ) ? , x : Gizmo ) - > G <nl> return f ? ( x ) <nl> } <nl> <nl> - func useInnerPointer ( _ p : UnsafeMutablePointer < Void > ) { } <nl> + func useInnerPointer ( _ p : UnsafeMutableRawPointer ) { } <nl> <nl> / / Handle inner - pointer methods by autoreleasing self after the call . <nl> / / CHECK - LABEL : sil hidden @ _TF26objc_ownership_conventions18innerPointerMethod <nl> / / CHECK : [ [ USE : % . * ] ] = function_ref @ _TF26objc_ownership_conventions15useInnerPointer <nl> - / / CHECK : [ [ METHOD : % . * ] ] = class_method [ volatile ] % 0 : $ Gizmo , # Gizmo . getBytes ! 1 . foreign : ( Gizmo ) - > ( ) - > UnsafeMutablePointer < ( ) > , $ @ convention ( objc_method ) ( Gizmo ) - > @ unowned_inner_pointer UnsafeMutablePointer < ( ) > <nl> + / / CHECK : [ [ METHOD : % . * ] ] = class_method [ volatile ] % 0 : $ Gizmo , # Gizmo . getBytes ! 1 . foreign : ( Gizmo ) - > ( ) - > UnsafeMutableRawPointer , $ @ convention ( objc_method ) ( Gizmo ) - > @ unowned_inner_pointer UnsafeMutableRawPointer <nl> / / CHECK : strong_retain % 0 <nl> / / CHECK : [ [ PTR : % . 
* ] ] = apply [ [ METHOD ] ] ( % 0 ) <nl> / / CHECK : autorelease_value % 0 <nl> func innerPointerMethod ( _ g : Gizmo ) { <nl> <nl> / / CHECK - LABEL : sil hidden @ _TF26objc_ownership_conventions20innerPointerProperty <nl> / / CHECK : [ [ USE : % . * ] ] = function_ref @ _TF26objc_ownership_conventions15useInnerPointer <nl> - / / CHECK : [ [ METHOD : % . * ] ] = class_method [ volatile ] % 0 : $ Gizmo , # Gizmo . innerProperty ! getter . 1 . foreign : ( Gizmo ) - > ( ) - > UnsafeMutablePointer < ( ) > , $ @ convention ( objc_method ) ( Gizmo ) - > @ unowned_inner_pointer UnsafeMutablePointer < ( ) > <nl> + / / CHECK : [ [ METHOD : % . * ] ] = class_method [ volatile ] % 0 : $ Gizmo , # Gizmo . innerProperty ! getter . 1 . foreign : ( Gizmo ) - > ( ) - > UnsafeMutableRawPointer , $ @ convention ( objc_method ) ( Gizmo ) - > @ unowned_inner_pointer UnsafeMutableRawPointer <nl> / / CHECK : strong_retain % 0 <nl> / / CHECK : [ [ PTR : % . * ] ] = apply [ [ METHOD ] ] ( % 0 ) <nl> / / CHECK : autorelease_value % 0 <nl> mmm a / test / SILGen / pointer_conversion . swift <nl> ppp b / test / SILGen / pointer_conversion . swift <nl> import Foundation <nl> <nl> func takesMutablePointer ( _ x : UnsafeMutablePointer < Int > ) { } <nl> func takesConstPointer ( _ x : UnsafePointer < Int > ) { } <nl> - func takesMutableVoidPointer ( _ x : UnsafeMutablePointer < Void > ) { } <nl> - func takesConstVoidPointer ( _ x : UnsafePointer < Void > ) { } <nl> + func takesMutableVoidPointer ( _ x : UnsafeMutableRawPointer ) { } <nl> + func takesConstVoidPointer ( _ x : UnsafeRawPointer ) { } <nl> func takesMutableRawPointer ( _ x : UnsafeMutableRawPointer ) { } <nl> func takesConstRawPointer ( _ x : UnsafeRawPointer ) { } <nl> <nl> func pointerToPointer ( _ mp : UnsafeMutablePointer < Int > , <nl> takesMutableVoidPointer ( mp ) <nl> / / CHECK : [ [ TAKES_MUTABLE_VOID_POINTER : % . * ] ] = function_ref @ _TF18pointer_conversion23takesMutableVoidPointer <nl> / / CHECK : [ [ CONVERT : % . * ] ] = function_ref @ _TFs32_convertPointerToPointerArgument <nl> - / / CHECK : apply [ [ CONVERT ] ] < UnsafeMutablePointer < Int > , UnsafeMutablePointer < ( ) > > <nl> + / / CHECK : apply [ [ CONVERT ] ] < UnsafeMutablePointer < Int > , UnsafeMutableRawPointer > <nl> / / CHECK : apply [ [ TAKES_MUTABLE_VOID_POINTER ] ] <nl> <nl> takesMutableRawPointer ( mp ) <nl> func pointerToPointer ( _ mp : UnsafeMutablePointer < Int > , <nl> / / CHECK : apply [ [ TAKES_CONST_POINTER ] ] <nl> <nl> takesConstVoidPointer ( mp ) <nl> - / / CHECK : [ [ TAKES_CONST_VOID_POINTER : % . * ] ] = function_ref @ _TF18pointer_conversion21takesConstVoidPointerFGSPT__T_ <nl> + / / CHECK : [ [ TAKES_CONST_VOID_POINTER : % . * ] ] = function_ref @ _TF18pointer_conversion21takesConstVoidPointerFSVT_ <nl> / / CHECK : [ [ CONVERT : % . * ] ] = function_ref @ _TFs32_convertPointerToPointerArgument <nl> - / / CHECK : apply [ [ CONVERT ] ] < UnsafeMutablePointer < Int > , UnsafePointer < ( ) > > <nl> + / / CHECK : apply [ [ CONVERT ] ] < UnsafeMutablePointer < Int > , UnsafeRawPointer > <nl> / / CHECK : apply [ [ TAKES_CONST_VOID_POINTER ] ] <nl> <nl> takesConstRawPointer ( mp ) <nl> func pointerToPointer ( _ mp : UnsafeMutablePointer < Int > , <nl> / / CHECK : apply [ [ TAKES_CONST_POINTER ] ] ( [ [ CP ] ] ) <nl> <nl> takesConstVoidPointer ( cp ) <nl> - / / CHECK : [ [ TAKES_CONST_VOID_POINTER : % . * ] ] = function_ref @ _TF18pointer_conversion21takesConstVoidPointerFGSPT__T_ <nl> + / / CHECK : [ [ TAKES_CONST_VOID_POINTER : % . 
* ] ] = function_ref @ _TF18pointer_conversion21takesConstVoidPointerFSVT_ <nl> / / CHECK : [ [ CONVERT : % . * ] ] = function_ref @ _TFs32_convertPointerToPointerArgument <nl> - / / CHECK : apply [ [ CONVERT ] ] < UnsafePointer < Int > , UnsafePointer < ( ) > > <nl> + / / CHECK : apply [ [ CONVERT ] ] < UnsafePointer < Int > , UnsafeRawPointer > <nl> / / CHECK : apply [ [ TAKES_CONST_VOID_POINTER ] ] <nl> <nl> takesConstRawPointer ( cp ) <nl> func arrayToPointer ( ) { <nl> / / CHECK - LABEL : sil hidden @ _TF18pointer_conversion15stringToPointerFSST_ <nl> func stringToPointer ( _ s : String ) { <nl> takesConstVoidPointer ( s ) <nl> - / / CHECK : [ [ TAKES_CONST_VOID_POINTER : % . * ] ] = function_ref @ _TF18pointer_conversion21takesConstVoidPointerFGSPT__T_ <nl> + / / CHECK : [ [ TAKES_CONST_VOID_POINTER : % . * ] ] = function_ref @ _TF18pointer_conversion21takesConstVoidPointerFSV <nl> / / CHECK : [ [ CONVERT_STRING : % . * ] ] = function_ref @ _TFs40_convertConstStringToUTF8PointerArgument <nl> - / / CHECK : [ [ OWNER : % . * ] ] = apply [ [ CONVERT_STRING ] ] < UnsafePointer < ( ) > > ( [ [ POINTER_BUF : % [ 0 - 9 ] * ] ] , <nl> + / / CHECK : [ [ OWNER : % . * ] ] = apply [ [ CONVERT_STRING ] ] < UnsafeRawPointer > ( [ [ POINTER_BUF : % [ 0 - 9 ] * ] ] , <nl> / / CHECK : [ [ POINTER : % . * ] ] = load [ [ POINTER_BUF ] ] <nl> / / CHECK : apply [ [ TAKES_CONST_VOID_POINTER ] ] ( [ [ POINTER ] ] ) <nl> / / CHECK : release_value [ [ OWNER ] ] <nl> mmm a / test / attr / attr_objc . swift <nl> ppp b / test / attr / attr_objc . swift <nl> class infer_instanceVar1 { <nl> var var_UnsafeMutablePointer11 : UnsafeMutablePointer < PlainProtocol > <nl> var var_UnsafeMutablePointer12 : UnsafeMutablePointer < AnyObject > <nl> var var_UnsafeMutablePointer13 : UnsafeMutablePointer < AnyObject . Type > <nl> - var var_UnsafeMutablePointer100 : UnsafeMutablePointer < ( ) > <nl> - var var_UnsafeMutablePointer101 : UnsafeMutablePointer < Void > <nl> + var var_UnsafeMutablePointer100 : UnsafeMutableRawPointer <nl> + var var_UnsafeMutablePointer101 : UnsafeMutableRawPointer <nl> var var_UnsafeMutablePointer102 : UnsafeMutablePointer < ( Int , Int ) > <nl> / / CHECK - LABEL : @ objc var var_UnsafeMutablePointer1 : UnsafeMutablePointer < Int > <nl> / / CHECK - LABEL : @ objc var var_UnsafeMutablePointer2 : UnsafeMutablePointer < Bool > <nl> class infer_instanceVar1 { <nl> / / CHECK - LABEL : { { ^ } } var var_UnsafeMutablePointer11 : UnsafeMutablePointer < PlainProtocol > <nl> / / CHECK - LABEL : @ objc var var_UnsafeMutablePointer12 : UnsafeMutablePointer < AnyObject > <nl> / / CHECK - LABEL : var var_UnsafeMutablePointer13 : UnsafeMutablePointer < AnyObject . Type > <nl> - / / CHECK - LABEL : { { ^ } } @ objc var var_UnsafeMutablePointer100 : UnsafeMutablePointer < ( ) > <nl> - / / CHECK - LABEL : { { ^ } } @ objc var var_UnsafeMutablePointer101 : UnsafeMutablePointer < Void > <nl> + / / CHECK - LABEL : { { ^ } } @ objc var var_UnsafeMutablePointer100 : UnsafeMutableRawPointer <nl> + / / CHECK - LABEL : { { ^ } } @ objc var var_UnsafeMutablePointer101 : UnsafeMutableRawPointer <nl> / / CHECK - LABEL : { { ^ } } var var_UnsafeMutablePointer102 : UnsafeMutablePointer < ( Int , Int ) > <nl> <nl> var var_Optional1 : Class_ObjC1 ? <nl> mmm a / validation - test / IDE / complete_from_cocoa . swift <nl> ppp b / validation - test / IDE / complete_from_cocoa . 
swift <nl> import Cocoa <nl> func testUnqualified ( ) { <nl> # ^ T1 ^ # <nl> / / T1 : Begin completions <nl> - / / T1 - DAG : Decl [ FreeFunction ] / OtherModule [ CoreFoundation . CFArray ] : CFArrayCreate ( { # ( allocator ) : CFAllocator ! # } , { # ( values ) : UnsafeMutablePointer < UnsafePointer < Void > ? > ! # } , { # ( numValues ) : CFIndex # } , { # ( callBacks ) : UnsafePointer < CFArrayCallBacks > ! # } ) [ # CFArray ! # ] { { ; name = . + $ } } <nl> + / / T1 - DAG : Decl [ FreeFunction ] / OtherModule [ CoreFoundation . CFArray ] : CFArrayCreate ( { # ( allocator ) : CFAllocator ! # } , { # ( values ) : UnsafeMutablePointer < UnsafeRawPointer ? > ! # } , { # ( numValues ) : CFIndex # } , { # ( callBacks ) : UnsafePointer < CFArrayCallBacks > ! # } ) [ # CFArray ! # ] { { ; name = . + $ } } <nl> / / T1 - DAG : Decl [ FreeFunction ] / OtherModule [ CoreFoundation . CFArray ] : CFArrayGetCount ( { # ( theArray ) : CFArray ! # } ) [ # CFIndex # ] { { ; name = . + $ } } <nl> / / T1 - DAG : Decl [ Class ] / OtherModule [ ObjectiveC . NSObject ] : NSObject [ # NSObject # ] { { ; name = . + $ } } <nl> / / T1 : End completions <nl> mmm a / validation - test / IDE / complete_from_cocoa_2 . swift <nl> ppp b / validation - test / IDE / complete_from_cocoa_2 . swift <nl> import Cocoa <nl> func testQualifiedWithDot ( ) { <nl> Cocoa . # ^ T1 ^ # <nl> / / T1 : Begin completions <nl> - / / T1 - DAG : Decl [ FreeFunction ] / OtherModule [ CoreFoundation . CFArray ] : CFArrayCreate ( { # ( allocator ) : CFAllocator ! # } , { # ( values ) : UnsafeMutablePointer < UnsafePointer < Void > ? > ! # } , { # ( numValues ) : CFIndex # } , { # ( callBacks ) : UnsafePointer < CFArrayCallBacks > ! # } ) [ # CFArray ! # ] { { ; name = . + $ } } <nl> + / / T1 - DAG : Decl [ FreeFunction ] / OtherModule [ CoreFoundation . CFArray ] : CFArrayCreate ( { # ( allocator ) : CFAllocator ! # } , { # ( values ) : UnsafeMutablePointer < UnsafeRawPointer ? > ! # } , { # ( numValues ) : CFIndex # } , { # ( callBacks ) : UnsafePointer < CFArrayCallBacks > ! # } ) [ # CFArray ! # ] { { ; name = . + $ } } <nl> / / T1 - DAG : Decl [ FreeFunction ] / OtherModule [ CoreFoundation . CFArray ] : CFArrayGetCount ( { # ( theArray ) : CFArray ! # } ) [ # CFIndex # ] { { ; name = . + $ } } <nl> / / T1 - DAG : Decl [ Class ] / OtherModule [ ObjectiveC . NSObject ] : NSObject [ # NSObject # ] { { ; name = . + $ } } <nl> / / T1 : End completions <nl> func testQualifiedWithDot ( ) { <nl> func testQualifiedWithoutDot ( ) { <nl> Cocoa # ^ T2 ^ # <nl> / / T2 : Begin completions <nl> - / / T2 - DAG : Decl [ FreeFunction ] / OtherModule [ CoreFoundation . CFArray ] : . CFArrayCreate ( { # ( allocator ) : CFAllocator ! # } , { # ( values ) : UnsafeMutablePointer < UnsafePointer < Void > ? > ! # } , { # ( numValues ) : CFIndex # } , { # ( callBacks ) : UnsafePointer < CFArrayCallBacks > ! # } ) [ # CFArray ! # ] { { ; name = . + $ } } <nl> + / / T2 - DAG : Decl [ FreeFunction ] / OtherModule [ CoreFoundation . CFArray ] : . CFArrayCreate ( { # ( allocator ) : CFAllocator ! # } , { # ( values ) : UnsafeMutablePointer < UnsafeRawPointer ? > ! # } , { # ( numValues ) : CFIndex # } , { # ( callBacks ) : UnsafePointer < CFArrayCallBacks > ! # } ) [ # CFArray ! # ] { { ; name = . + $ } } <nl> / / T2 - DAG : Decl [ FreeFunction ] / OtherModule [ CoreFoundation . CFArray ] : . CFArrayGetCount ( { # ( theArray ) : CFArray ! # } ) [ # CFIndex # ] { { ; name = . + $ } } <nl> / / T2 - DAG : Decl [ Class ] / OtherModule [ ObjectiveC . NSObject ] : . 
NSObject [ # NSObject # ] { { ; name = . + $ } } <nl> / / T2 : End completions <nl> mmm a / validation - test / stdlib / ArrayNew . swift . gyb <nl> ppp b / validation - test / stdlib / ArrayNew . swift . gyb <nl> all_array_types = [ ' ContiguousArray ' , ' ArraySlice ' , ' Array ' ] <nl> } % <nl> <nl> extension Array { <nl> - var identity : UnsafePointer < Void > { <nl> + var identity : UnsafeRawPointer { <nl> return self . _buffer . identity <nl> } <nl> } <nl> <nl> extension ArraySlice { <nl> - var identity : UnsafePointer < Void > { <nl> + var identity : UnsafeRawPointer { <nl> return self . _buffer . identity <nl> } <nl> } <nl> <nl> extension ContiguousArray { <nl> - var identity : UnsafePointer < Void > { <nl> + var identity : UnsafeRawPointer { <nl> return self . _buffer . identity <nl> } <nl> } <nl> mmm a / validation - test / stdlib / Arrays . swift . gyb <nl> ppp b / validation - test / stdlib / Arrays . swift . gyb <nl> all_array_types = [ ' ContiguousArray ' , ' ArraySlice ' , ' Array ' ] <nl> } % <nl> <nl> extension Array { <nl> - var identity : UnsafePointer < Void > { <nl> + var identity : UnsafeRawPointer { <nl> return self . _buffer . identity <nl> } <nl> } <nl> <nl> extension ArraySlice { <nl> - var identity : UnsafePointer < Void > { <nl> + var identity : UnsafeRawPointer { <nl> return self . _buffer . identity <nl> } <nl> } <nl> <nl> extension ContiguousArray { <nl> - var identity : UnsafePointer < Void > { <nl> + var identity : UnsafeRawPointer { <nl> return self . _buffer . identity <nl> } <nl> } <nl> mmm a / validation - test / stdlib / CoreAudio . swift <nl> ppp b / validation - test / stdlib / CoreAudio . swift <nl> CoreAudioTestSuite . test ( " UnsafeBufferPointer . init ( _ : AudioBuffer ) " ) { <nl> do { <nl> let audioBuffer = AudioBuffer ( <nl> mNumberChannels : 2 , mDataByteSize : 1024 , <nl> - mData : UnsafeMutablePointer < Void > ( bitPattern : 0x1234_5678 ) ) <nl> + mData : UnsafeMutableRawPointer ( bitPattern : 0x1234_5678 ) ) <nl> let result : UnsafeBufferPointer < Float > = UnsafeBufferPointer ( audioBuffer ) <nl> - expectEqual ( <nl> - UnsafePointer < Float > ( audioBuffer . mData ! ) , <nl> - result . baseAddress ) <nl> + expectEqual ( audioBuffer . mData , UnsafeRawPointer ( result . baseAddress ! ) ) <nl> expectEqual ( 256 , result . count ) <nl> } <nl> } <nl> CoreAudioTestSuite . test ( " UnsafeMutableBufferPointer . init ( _ : AudioBuffer ) " ) { <nl> do { <nl> let audioBuffer = AudioBuffer ( <nl> mNumberChannels : 2 , mDataByteSize : 1024 , <nl> - mData : UnsafeMutablePointer < Void > ( bitPattern : 0x1234_5678 ) ) <nl> + mData : UnsafeMutableRawPointer ( bitPattern : 0x1234_5678 ) ) <nl> let result : UnsafeMutableBufferPointer < Float > = <nl> UnsafeMutableBufferPointer ( audioBuffer ) <nl> - expectEqual ( <nl> - UnsafeMutablePointer < Float > ( audioBuffer . mData ! ) , <nl> - result . baseAddress ) <nl> + expectEqual ( audioBuffer . mData ! , UnsafeMutableRawPointer ( result . baseAddress ! ) ) <nl> expectEqual ( 256 , result . count ) <nl> } <nl> } <nl> CoreAudioTestSuite . test ( " UnsafeMutableAudioBufferListPointer . subscript ( _ : Int ) " ) <nl> / / Test getter . <nl> let audioBuffer = AudioBuffer ( <nl> mNumberChannels : 2 , mDataByteSize : 1024 , <nl> - mData : UnsafeMutablePointer < Void > ( bitPattern : 0x1234_5678 ) ) <nl> + mData : UnsafeMutableRawPointer ( bitPattern : 0x1234_5678 ) ) <nl> <nl> UnsafeMutablePointer < AudioBuffer > ( <nl> UnsafeMutablePointer < UInt8 > ( ablPtr ) + ablHeaderSize <nl> CoreAudioTestSuite . 
test ( " UnsafeMutableAudioBufferListPointer . subscript ( _ : Int ) " ) <nl> / / Test setter . <nl> let audioBuffer = AudioBuffer ( <nl> mNumberChannels : 5 , mDataByteSize : 256 , <nl> - mData : UnsafeMutablePointer < Void > ( bitPattern : 0x8765_4321 as UInt ) ) <nl> + mData : UnsafeMutableRawPointer ( bitPattern : 0x8765_4321 as UInt ) ) <nl> <nl> ablPtrWrapper . count = 2 <nl> ablPtrWrapper [ 1 ] = audioBuffer <nl> CoreAudioTestSuite . test ( " UnsafeMutableAudioBufferListPointer / Collection " ) { <nl> for i in 0 . . < 16 { <nl> let audioBuffer = AudioBuffer ( <nl> mNumberChannels : UInt32 ( 2 + i ) , mDataByteSize : UInt32 ( 1024 * i ) , <nl> - mData : UnsafeMutablePointer < Void > ( bitPattern : 0x1234_5678 + i * 10 ) ) <nl> + mData : UnsafeMutableRawPointer ( bitPattern : 0x1234_5678 + i * 10 ) ) <nl> <nl> ablPtrWrapper [ i ] = audioBuffer <nl> expected . append ( audioBuffer ) <nl> mmm a / validation - test / stdlib / NewArray . swift . gyb <nl> ppp b / validation - test / stdlib / NewArray . swift . gyb <nl> func printSequence < T : Sequence > ( _ x : T ) { <nl> print ( " ] " ) <nl> } <nl> <nl> - typealias BufferID = UnsafePointer < Void > ? <nl> + typealias BufferID = UnsafeRawPointer ? <nl> <nl> func bufferID < T : _ArrayProtocol > ( _ x : T ) - > BufferID { <nl> return x . _buffer . identity <nl> mmm a / validation - test / stdlib / SceneKit . swift <nl> ppp b / validation - test / stdlib / SceneKit . swift <nl> import SceneKit <nl> var SceneKitTests = TestSuite ( " SceneKit " ) <nl> <nl> func bytesFromNSData ( _ data : NSData ) - > [ UInt8 ] { <nl> - return Array ( UnsafeBufferPointer ( <nl> - start : UnsafePointer < UInt8 > ( data . bytes ) , <nl> - count : data . length ) ) <nl> + let bytePtr = data . bytes . bindMemory ( to : UInt8 . self , capacity : data . length ) <nl> + return Array ( UnsafeBufferPointer ( start : bytePtr , count : data . length ) ) <nl> } <nl> <nl> func floatsFromNSData ( _ data : NSData ) - > [ Float ] { <nl> - return Array ( UnsafeBufferPointer ( <nl> - start : UnsafePointer < Float > ( data . bytes ) , <nl> - count : data . length / sizeof ( Float ) ) ) <nl> + let floatPtr = data . bytes . bindMemory ( to : Float . self , capacity : data . length ) <nl> + return Array ( UnsafeBufferPointer ( start : floatPtr , count : data . length / sizeof ( Float ) ) ) <nl> } <nl> <nl> if # available ( iOS 8 . 0 , * ) { <nl>
Migrate from UnsafePointer < Void > to UnsafeRawPointer . ( )
apple/swift
ece0951924421b12c88cd65a653a4dd1c69d30c4
2016-07-26T09:18:21Z
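Editor's illustrative sketch (not part of the apple/swift commit recorded above): the migration replaces UnsafeMutablePointer<Void> / UnsafePointer<Void> with UnsafeMutableRawPointer / UnsafeRawPointer, so a raw pointer no longer carries an element type and callers obtain a typed view explicitly. The helper name fillAscending below is hypothetical; the APIs shown (assumingMemoryBound(to:), the inout-array-to-raw-pointer argument conversion) are the ones exercised by the updated tests.

```swift
import Foundation

// Reading: NSData.bytes is an UnsafeRawPointer after the migration; assume a
// bound type before indexing instead of constructing UnsafePointer<UInt8>(data.bytes).
let data = NSData(bytes: [2, 3, 5, 7] as [UInt8], length: 4)
let bytes = data.bytes.assumingMemoryBound(to: UInt8.self)
print(bytes[0], bytes[1]) // 2 3

// Writing: signatures that used UnsafeMutablePointer<Void> now take
// UnsafeMutableRawPointer; the callee picks the typed view it needs.
func fillAscending(_ p: UnsafeMutableRawPointer, count: Int) {
    let typed = p.assumingMemoryBound(to: UInt8.self)
    for i in 0..<count { typed[i] = UInt8(i) }
}

var storage = [UInt8](repeating: 0, count: 4)
fillAscending(&storage, count: storage.count) // inout array converts to a raw pointer argument
print(storage) // [0, 1, 2, 3]
```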
mmm a / xbmc / Application . cpp <nl> ppp b / xbmc / Application . cpp <nl> bool CApplication : : SetupNetwork ( ) <nl> <nl> bool CApplication : : Create ( ) <nl> { <nl> + m_ServiceManager . reset ( new CServiceManager ( ) ) ; <nl> + if ( ! m_ServiceManager - > Init1 ( ) ) <nl> + { <nl> + return false ; <nl> + } <nl> + <nl> SetupNetwork ( ) ; <nl> Preflight ( ) ; <nl> <nl> bool CApplication : : Create ( ) <nl> / / set avutil callback <nl> av_log_set_callback ( ff_avutil_log ) ; <nl> <nl> - m_ServiceManager . reset ( new CServiceManager ( ) ) ; <nl> - if ( ! m_ServiceManager - > Init1 ( ) ) <nl> - { <nl> - return false ; <nl> - } <nl> - <nl> g_powerManager . Initialize ( ) ; <nl> <nl> / / Load the AudioEngine before settings as they need to query the engine <nl>
ServiceManager : move first wave of init before init of Network
xbmc/xbmc
dea55cdf54eba7655dcc9f9912eae879c8f9de73
2016-03-21T06:33:18Z
mmm a / tests / nightly / model_backwards_compatibility_check / common . py <nl> ppp b / tests / nightly / model_backwards_compatibility_check / common . py <nl> <nl> - # ! / usr / bin / env python <nl> + # ! / usr / bin / env python3 <nl> <nl> # Licensed to the Apache Software Foundation ( ASF ) under one <nl> # or more contributor license agreements . See the NOTICE file <nl> <nl> import re <nl> from mxnet . test_utils import assert_almost_equal <nl> <nl> - try : <nl> - cmp # Python 2 <nl> - except NameError : <nl> - # See : https : / / docs . python . org / 3 . 0 / whatsnew / 3 . 0 . html # ordering - comparisons <nl> - def cmp ( x , y ) : # Python 3 <nl> - return ( x > y ) - ( x < y ) <nl> + <nl> + def cmp ( x , y ) : # Python 3 <nl> + return ( x > y ) - ( x < y ) <nl> <nl> # Set fixed random seeds . <nl> mx . random . seed ( 7 ) <nl> mmm a / tests / nightly / model_backwards_compatibility_check / model_backward_compat_checker . sh <nl> ppp b / tests / nightly / model_backwards_compatibility_check / model_backward_compat_checker . sh <nl> echo ` pwd ` <nl> <nl> echo ' = = = = = = = = = = = = = = = = = = = = = = = = = = ' <nl> export MXNET_ENFORCE_DETERMINISM = 1 <nl> - python model_backwards_compat_inference . py <nl> + python3 model_backwards_compat_inference . py <nl> mmm a / tests / nightly / model_backwards_compatibility_check / model_backwards_compat_inference . py <nl> ppp b / tests / nightly / model_backwards_compatibility_check / model_backwards_compat_inference . py <nl> <nl> - # ! / usr / bin / env python <nl> + # ! / usr / bin / env python3 <nl> <nl> # Licensed to the Apache Software Foundation ( ASF ) under one <nl> # or more contributor license agreements . See the NOTICE file <nl> mmm a / tests / nightly / model_backwards_compatibility_check / model_backwards_compat_train . py <nl> ppp b / tests / nightly / model_backwards_compatibility_check / model_backwards_compat_train . py <nl> <nl> - # ! / usr / bin / env python <nl> + # ! / usr / bin / env python3 <nl> <nl> # Licensed to the Apache Software Foundation ( ASF ) under one <nl> # or more contributor license agreements . See the NOTICE file <nl>
Fix MBCC inference ( )
apache/incubator-mxnet
5098dbe75fa8f86d688667a02d27f9154cf7c924
2020-02-25T17:37:25Z
mmm a / modules / mono / csharp_script . cpp <nl> ppp b / modules / mono / csharp_script . cpp <nl> String CSharpLanguage : : _get_indentation ( ) const { <nl> <nl> Vector < ScriptLanguage : : StackInfo > CSharpLanguage : : debug_get_current_stack_info ( ) { <nl> <nl> + # ifdef DEBUG_ENABLED <nl> / / Printing an error here will result in endless recursion , so we must be careful <nl> <nl> if ( ! gdmono - > is_runtime_initialized ( ) | | ! GDMono : : get_singleton ( ) - > get_api_assembly ( ) | | ! GDMonoUtils : : mono_cache . corlib_cache_updated ) <nl> Vector < ScriptLanguage : : StackInfo > CSharpLanguage : : debug_get_current_stack_info ( ) <nl> si = stack_trace_get_info ( stack_trace ) ; <nl> <nl> return si ; <nl> + # else <nl> + return Vector < StackInfo > ( ) ; <nl> + # endif <nl> } <nl> <nl> + # ifdef DEBUG_ENABLED <nl> Vector < ScriptLanguage : : StackInfo > CSharpLanguage : : stack_trace_get_info ( MonoObject * p_stack_trace ) { <nl> <nl> / / Printing an error here could result in endless recursion , so we must be careful <nl> Vector < ScriptLanguage : : StackInfo > CSharpLanguage : : stack_trace_get_info ( MonoObjec <nl> <nl> return si ; <nl> } <nl> + # endif <nl> <nl> void CSharpLanguage : : frame ( ) { <nl> <nl> bool CSharpScript : : _update_exports ( ) { <nl> return false ; <nl> } <nl> <nl> + # ifdef TOOLS_ENABLED <nl> bool CSharpScript : : _get_member_export ( GDMonoClass * p_class , GDMonoClassMember * p_member , PropertyInfo & r_prop_info , bool & r_exported ) { <nl> <nl> StringName name = p_member - > get_name ( ) ; <nl> bool CSharpScript : : _get_member_export ( GDMonoClass * p_class , GDMonoClassMember * p <nl> <nl> return true ; <nl> } <nl> + # endif <nl> <nl> void CSharpScript : : _clear ( ) { <nl> <nl> mmm a / modules / mono / csharp_script . h <nl> ppp b / modules / mono / csharp_script . h <nl> class CSharpScript : public Script { <nl> void _clear ( ) ; <nl> <nl> bool _update_exports ( ) ; <nl> + # ifdef TOOLS_ENABLED <nl> bool _get_member_export ( GDMonoClass * p_class , GDMonoClassMember * p_member , PropertyInfo & r_prop_info , bool & r_exported ) ; <nl> + # endif <nl> <nl> CSharpInstance * _create_instance ( const Variant * * p_args , int p_argcount , Object * p_owner , bool p_isref , Variant : : CallError & r_error ) ; <nl> Variant _new ( const Variant * * p_args , int p_argcount , Variant : : CallError & r_error ) ; <nl> class CSharpLanguage : public ScriptLanguage { <nl> virtual void * alloc_instance_binding_data ( Object * p_object ) ; <nl> virtual void free_instance_binding_data ( void * p_data ) ; <nl> <nl> + # ifdef DEBUG_ENABLED <nl> Vector < StackInfo > stack_trace_get_info ( MonoObject * p_stack_trace ) ; <nl> + # endif <nl> <nl> CSharpLanguage ( ) ; <nl> ~ CSharpLanguage ( ) ; <nl>
Mono : Fix build errors with tools = no and target = release
godotengine/godot
0c3bbcaa0085c579daa9dcba4c3ac60626b07413
2018-01-27T17:44:04Z
mmm a / vnpy / app / algo_trading / genus . py <nl> ppp b / vnpy / app / algo_trading / genus . py <nl> <nl> - from datetime import datetime , timedelta <nl> + from datetime import datetime <nl> from typing import Any <nl> from dataclasses import dataclass <nl> import pytz <nl> + from pathlib import Path <nl> <nl> import quickfix as fix <nl> <nl> def __init__ ( self , client : " GenusClient " ) : <nl> " " " " " " <nl> super ( ) . __init__ ( ) <nl> <nl> - self . client = client <nl> + self . client : " GenusClient " = client <nl> <nl> - self . callbacks = { <nl> + self . callbacks : Dict [ int , callable ] = { <nl> fix . MsgType_NewOrderSingle : self . new_child_order , <nl> fix . MsgType_OrderCancelRequest : self . cancel_child_order , <nl> } <nl> <nl> - self . exec_id = 0 <nl> + self . exec_id : int = 0 <nl> self . child_orders : Dict [ str , GenusChildOrder ] = { } <nl> self . genus_vt_map : Dict [ str , str ] = { } <nl> <nl> + self . seq_num : int = 0 <nl> + <nl> def onCreate ( self , session_id : int ) : <nl> " " " " " " <nl> self . session_id = session_id <nl> def toApp ( self , message : fix . Message , session_id : int ) : <nl> " " " " " " <nl> print ( " to app " , session_id ) <nl> <nl> + self . seq_num = get_field_value ( message , fix . MsgSeqNum ( ) ) <nl> + <nl> def fromAdmin ( self , message : fix . Message , session_id : int ) : <nl> " " " " " " <nl> print ( " from admin " , session_id ) <nl> + self . update_seq_num ( message ) <nl> <nl> def fromApp ( self , message : fix . Message , session_id : int ) : <nl> " " " " " " <nl> - print ( " from app " ) <nl> + self . update_seq_num ( message ) <nl> <nl> header = message . getHeader ( ) <nl> - <nl> msg_type = get_field_value ( header , fix . MsgType ( ) ) <nl> callback = self . callbacks . get ( msg_type , None ) <nl> if callback : <nl> callback ( message ) <nl> <nl> + def update_seq_num ( self , message : fix . Message ) : <nl> + " " " " " " <nl> + seq_num : int = get_field_value ( message , fix . MsgSeqNum ( ) ) <nl> + self . seq_num = seq_num <nl> + <nl> + session : fix . Session = fix . Session . lookupSession ( self . session_id ) <nl> + session . setNextSenderMsgSeqNum ( self . seq_num + 1 ) <nl> + <nl> def new_child_order ( self , message : fix . Message ) : <nl> " " " " " " <nl> child_order = GenusChildOrder ( <nl> def init ( self ) : <nl> " " " " " " <nl> self . register_event ( ) <nl> <nl> + app_path = Path ( __file__ ) . parent <nl> + <nl> # For child app <nl> - child_settings = fix . SessionSettings ( " genus_child . cfg " ) <nl> + child_settings = fix . SessionSettings ( str ( app_path . joinpath ( " genus_child . cfg " ) ) ) <nl> child_settings . setString ( " SocketAcceptHost " , SETTINGS [ " genus . child_host " ] ) <nl> child_settings . setString ( " SocketAcceptPort " , SETTINGS [ " genus . child_port " ] ) <nl> child_settings . setString ( " SenderCompID " , SETTINGS [ " genus . child_sender " ] ) <nl> def init ( self ) : <nl> self . child_socket . start ( ) <nl> <nl> # For parent app <nl> - parent_settings = fix . SessionSettings ( " genus_parent . cfg " ) <nl> + parent_settings = fix . SessionSettings ( str ( app_path . joinpath ( " genus_parent . cfg " ) ) ) <nl> parent_settings . setString ( " SocketConnectHost " , SETTINGS [ " genus . parent_host " ] ) <nl> parent_settings . setString ( " SocketConnectPort " , SETTINGS [ " genus . parent_port " ] ) <nl> parent_settings . setString ( " SenderCompID " , SETTINGS [ " genus . parent_sender " ] ) <nl>
[ Mod ] auto update child app msg seq num
vnpy/vnpy
63a3241ca067148e54c566d45adee61d6b87b508
2020-08-10T02:36:39Z
mmm a / trunk / src / app / srs_app_config . cpp <nl> ppp b / trunk / src / app / srs_app_config . cpp <nl> using namespace _srs_internal ; <nl> # define SRS_CONF_PERFER_FALSE ( conf_arg ) conf_arg = = " on " <nl> # define SRS_CONF_PERFER_TRUE ( conf_arg ) conf_arg ! = " off " <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / default consts values <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + # define SRS_CONF_DEFAULT_PID_FILE " . / objs / srs . pid " <nl> + # define SRS_CONF_DEFAULT_LOG_FILE " . / objs / srs . log " <nl> + # define SRS_CONF_DEFAULT_LOG_LEVEL " trace " <nl> + # define SRS_CONF_DEFAULT_LOG_TANK_CONSOLE " console " <nl> + # define SRS_CONF_DEFAULT_COFNIG_FILE " conf / srs . conf " <nl> + # define SRS_CONF_DEFAULT_FF_LOG_DIR " . / objs " <nl> + # define SRS_CONF_DEFAULT_UTC_TIME false <nl> + <nl> + # define SRS_CONF_DEFAULT_MAX_CONNECTIONS 1000 <nl> + # define SRS_CONF_DEFAULT_HLS_PATH " . / objs / nginx / html " <nl> + # define SRS_CONF_DEFAULT_HLS_M3U8_FILE " [ app ] / [ stream ] . m3u8 " <nl> + # define SRS_CONF_DEFAULT_HLS_TS_FILE " [ app ] / [ stream ] - [ seq ] . ts " <nl> + # define SRS_CONF_DEFAULT_HLS_TS_FLOOR false <nl> + # define SRS_CONF_DEFAULT_HLS_FRAGMENT 10 <nl> + # define SRS_CONF_DEFAULT_HLS_TD_RATIO 1 . 5 <nl> + # define SRS_CONF_DEFAULT_HLS_AOF_RATIO 2 . 0 <nl> + # define SRS_CONF_DEFAULT_HLS_WINDOW 60 <nl> + # define SRS_CONF_DEFAULT_HLS_ON_ERROR_IGNORE " ignore " <nl> + # define SRS_CONF_DEFAULT_HLS_ON_ERROR_DISCONNECT " disconnect " <nl> + # define SRS_CONF_DEFAULT_HLS_ON_ERROR_CONTINUE " continue " <nl> + # define SRS_CONF_DEFAULT_HLS_ON_ERROR SRS_CONF_DEFAULT_HLS_ON_ERROR_IGNORE <nl> + # define SRS_CONF_DEFAULT_HLS_STORAGE " disk " <nl> + # define SRS_CONF_DEFAULT_HLS_MOUNT " [ vhost ] / [ app ] / [ stream ] . m3u8 " <nl> + # define SRS_CONF_DEFAULT_HLS_ACODEC " aac " <nl> + # define SRS_CONF_DEFAULT_HLS_VCODEC " h264 " <nl> + # define SRS_CONF_DEFAULT_HLS_CLEANUP true <nl> + # define SRS_CONF_DEFAULT_HLS_WAIT_KEYFRAME true <nl> + # define SRS_CONF_DEFAULT_HLS_NB_NOTIFY 64 <nl> + # define SRS_CONF_DEFAULT_DVR_PATH " . / objs / nginx / html / [ app ] / [ stream ] . [ timestamp ] . flv " <nl> + # define SRS_CONF_DEFAULT_DVR_PLAN_SESSION " session " <nl> + # define SRS_CONF_DEFAULT_DVR_PLAN_SEGMENT " segment " <nl> + # define SRS_CONF_DEFAULT_DVR_PLAN_APPEND " append " <nl> + # define SRS_CONF_DEFAULT_DVR_PLAN SRS_CONF_DEFAULT_DVR_PLAN_SESSION <nl> + # define SRS_CONF_DEFAULT_DVR_DURATION 30 <nl> + # define SRS_CONF_DEFAULT_TIME_JITTER " full " <nl> + # define SRS_CONF_DEFAULT_ATC_AUTO true <nl> + # define SRS_CONF_DEFAULT_MIX_CORRECT false <nl> + / / in seconds , the paused queue length . <nl> + # define SRS_CONF_DEFAULT_PAUSED_LENGTH 10 <nl> + / / the interval in seconds for bandwidth check <nl> + # define SRS_CONF_DEFAULT_BANDWIDTH_INTERVAL 30 <nl> + / / the interval in seconds for bandwidth check <nl> + # define SRS_CONF_DEFAULT_BANDWIDTH_LIMIT_KBPS 1000 <nl> + <nl> + # define SRS_CONF_DEFAULT_HTTP_MOUNT " [ vhost ] / " <nl> + # define SRS_CONF_DEFAULT_HTTP_REMUX_MOUNT " [ vhost ] / [ app ] / [ stream ] . 
flv " <nl> + # define SRS_CONF_DEFAULT_HTTP_DIR SRS_CONF_DEFAULT_HLS_PATH <nl> + # define SRS_CONF_DEFAULT_HTTP_AUDIO_FAST_CACHE 0 <nl> + <nl> + # define SRS_CONF_DEFAULT_HTTP_STREAM_PORT " 8080 " <nl> + # define SRS_CONF_DEFAULT_HTTP_API_PORT " 1985 " <nl> + # define SRS_CONF_DEFAULT_HTTP_API_CROSSDOMAIN true <nl> + <nl> + # define SRS_CONF_DEFAULT_HTTP_HEAETBEAT_ENABLED false <nl> + # define SRS_CONF_DEFAULT_HTTP_HEAETBEAT_INTERVAL 9 . 9 <nl> + # define SRS_CONF_DEFAULT_HTTP_HEAETBEAT_URL " http : / / " SRS_CONSTS_LOCALHOST " : 8085 / api / v1 / servers " <nl> + # define SRS_CONF_DEFAULT_HTTP_HEAETBEAT_SUMMARIES false <nl> + <nl> + # define SRS_CONF_DEFAULT_SECURITY_ENABLED false <nl> + <nl> + # define SRS_CONF_DEFAULT_STREAM_CASTER_ENABLED false <nl> + # define SRS_CONF_DEFAULT_STREAM_CASTER_MPEGTS_OVER_UDP " mpegts_over_udp " <nl> + # define SRS_CONF_DEFAULT_STREAM_CASTER_RTSP " rtsp " <nl> + # define SRS_CONF_DEFAULT_STREAM_CASTER_FLV " flv " <nl> + <nl> + # define SRS_CONF_DEFAULT_STATS_NETWORK_DEVICE_INDEX 0 <nl> + <nl> + # define SRS_CONF_DEFAULT_PITHY_PRINT_MS 10000 <nl> + <nl> + # define SRS_CONF_DEFAULT_INGEST_TYPE_FILE " file " <nl> + # define SRS_CONF_DEFAULT_INGEST_TYPE_STREAM " stream " <nl> + <nl> + # define SRS_CONF_DEFAULT_TRANSCODE_IFORMAT " flv " <nl> + # define SRS_CONF_DEFAULT_TRANSCODE_OFORMAT " flv " <nl> + <nl> + # define SRS_CONF_DEFAULT_EDGE_MODE false <nl> + # define SRS_CONF_DEFAULT_EDGE_TOKEN_TRAVERSE false <nl> + # define SRS_CONF_DEFAULT_EDGE_TRANSFORM_VHOST " [ vhost ] " <nl> + <nl> + / / hds default value <nl> + # define SRS_CONF_DEFAULT_HDS_PATH " . / objs / nginx / html " <nl> + # define SRS_CONF_DEFAULT_HDS_WINDOW ( 60 ) <nl> + # define SRS_CONF_DEFAULT_HDS_FRAGMENT ( 10 ) <nl> + <nl> / / ' \ n ' <nl> - # define SRS_LF ( char ) 0x0a <nl> + # define SRS_LF ( char ) SRS_CONSTS_LF <nl> <nl> / / ' \ r ' <nl> - # define SRS_CR ( char ) 0x0d <nl> + # define SRS_CR ( char ) SRS_CONSTS_CR <nl> <nl> bool is_common_space ( char ch ) <nl> { <nl> bool srs_directive_equals ( SrsConfDirective * a , SrsConfDirective * b ) <nl> return true ; <nl> } <nl> <nl> + bool srs_config_hls_is_on_error_ignore ( string strategy ) <nl> + { <nl> + return strategy = = SRS_CONF_DEFAULT_HLS_ON_ERROR_IGNORE ; <nl> + } <nl> + <nl> + bool srs_config_hls_is_on_error_continue ( string strategy ) <nl> + { <nl> + return strategy = = SRS_CONF_DEFAULT_HLS_ON_ERROR_CONTINUE ; <nl> + } <nl> + <nl> + bool srs_config_ingest_is_file ( string type ) <nl> + { <nl> + return type = = SRS_CONF_DEFAULT_INGEST_TYPE_FILE ; <nl> + } <nl> + <nl> + bool srs_config_ingest_is_stream ( string type ) <nl> + { <nl> + return type = = SRS_CONF_DEFAULT_INGEST_TYPE_STREAM ; <nl> + } <nl> + <nl> + bool srs_config_dvr_is_plan_segment ( string plan ) <nl> + { <nl> + return plan = = SRS_CONF_DEFAULT_DVR_PLAN_SEGMENT ; <nl> + } <nl> + <nl> + bool srs_config_dvr_is_plan_session ( string plan ) <nl> + { <nl> + return plan = = SRS_CONF_DEFAULT_DVR_PLAN_SESSION ; <nl> + } <nl> + <nl> + bool srs_config_dvr_is_plan_append ( string plan ) <nl> + { <nl> + return plan = = SRS_CONF_DEFAULT_DVR_PLAN_APPEND ; <nl> + } <nl> mmm a / trunk / src / app / srs_app_config . hpp <nl> ppp b / trunk / src / app / srs_app_config . hpp <nl> CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE . <nl> <nl> # include < srs_app_reload . 
hpp > <nl> <nl> - / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> - / / default consts values <nl> - / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> - # define SRS_CONF_DEFAULT_PID_FILE " . / objs / srs . pid " <nl> - # define SRS_CONF_DEFAULT_LOG_FILE " . / objs / srs . log " <nl> - # define SRS_CONF_DEFAULT_LOG_LEVEL " trace " <nl> - # define SRS_CONF_DEFAULT_LOG_TANK_CONSOLE " console " <nl> - # define SRS_CONF_DEFAULT_COFNIG_FILE " conf / srs . conf " <nl> - # define SRS_CONF_DEFAULT_FF_LOG_DIR " . / objs " <nl> - # define SRS_CONF_DEFAULT_UTC_TIME false <nl> - <nl> - # define SRS_CONF_DEFAULT_MAX_CONNECTIONS 1000 <nl> - # define SRS_CONF_DEFAULT_HLS_PATH " . / objs / nginx / html " <nl> - # define SRS_CONF_DEFAULT_HLS_M3U8_FILE " [ app ] / [ stream ] . m3u8 " <nl> - # define SRS_CONF_DEFAULT_HLS_TS_FILE " [ app ] / [ stream ] - [ seq ] . ts " <nl> - # define SRS_CONF_DEFAULT_HLS_TS_FLOOR false <nl> - # define SRS_CONF_DEFAULT_HLS_FRAGMENT 10 <nl> - # define SRS_CONF_DEFAULT_HLS_TD_RATIO 1 . 5 <nl> - # define SRS_CONF_DEFAULT_HLS_AOF_RATIO 2 . 0 <nl> - # define SRS_CONF_DEFAULT_HLS_WINDOW 60 <nl> - # define SRS_CONF_DEFAULT_HLS_ON_ERROR_IGNORE " ignore " <nl> - # define SRS_CONF_DEFAULT_HLS_ON_ERROR_DISCONNECT " disconnect " <nl> - # define SRS_CONF_DEFAULT_HLS_ON_ERROR_CONTINUE " continue " <nl> - # define SRS_CONF_DEFAULT_HLS_ON_ERROR SRS_CONF_DEFAULT_HLS_ON_ERROR_IGNORE <nl> - # define SRS_CONF_DEFAULT_HLS_STORAGE " disk " <nl> - # define SRS_CONF_DEFAULT_HLS_MOUNT " [ vhost ] / [ app ] / [ stream ] . m3u8 " <nl> - # define SRS_CONF_DEFAULT_HLS_ACODEC " aac " <nl> - # define SRS_CONF_DEFAULT_HLS_VCODEC " h264 " <nl> - # define SRS_CONF_DEFAULT_HLS_CLEANUP true <nl> - # define SRS_CONF_DEFAULT_HLS_WAIT_KEYFRAME true <nl> - # define SRS_CONF_DEFAULT_HLS_NB_NOTIFY 64 <nl> - # define SRS_CONF_DEFAULT_DVR_PATH " . / objs / nginx / html / [ app ] / [ stream ] . [ timestamp ] . flv " <nl> - # define SRS_CONF_DEFAULT_DVR_PLAN_SESSION " session " <nl> - # define SRS_CONF_DEFAULT_DVR_PLAN_SEGMENT " segment " <nl> - # define SRS_CONF_DEFAULT_DVR_PLAN_APPEND " append " <nl> - # define SRS_CONF_DEFAULT_DVR_PLAN SRS_CONF_DEFAULT_DVR_PLAN_SESSION <nl> - # define SRS_CONF_DEFAULT_DVR_DURATION 30 <nl> - # define SRS_CONF_DEFAULT_TIME_JITTER " full " <nl> - # define SRS_CONF_DEFAULT_ATC_AUTO true <nl> - # define SRS_CONF_DEFAULT_MIX_CORRECT false <nl> - / / in seconds , the paused queue length . <nl> - # define SRS_CONF_DEFAULT_PAUSED_LENGTH 10 <nl> - / / the interval in seconds for bandwidth check <nl> - # define SRS_CONF_DEFAULT_BANDWIDTH_INTERVAL 30 <nl> - / / the interval in seconds for bandwidth check <nl> - # define SRS_CONF_DEFAULT_BANDWIDTH_LIMIT_KBPS 1000 <nl> - <nl> - # define SRS_CONF_DEFAULT_HTTP_MOUNT " [ vhost ] / " <nl> - # define SRS_CONF_DEFAULT_HTTP_REMUX_MOUNT " [ vhost ] / [ app ] / [ stream ] . flv " <nl> - # define SRS_CONF_DEFAULT_HTTP_DIR SRS_CONF_DEFAULT_HLS_PATH <nl> - # define SRS_CONF_DEFAULT_HTTP_AUDIO_FAST_CACHE 0 <nl> - <nl> - # define SRS_CONF_DEFAULT_HTTP_STREAM_PORT " 8080 " <nl> - # define SRS_CONF_DEFAULT_HTTP_API_PORT " 1985 " <nl> - # define SRS_CONF_DEFAULT_HTTP_API_CROSSDOMAIN true <nl> - <nl> - # define SRS_CONF_DEFAULT_HTTP_HEAETBEAT_ENABLED false <nl> - # define SRS_CONF_DEFAULT_HTTP_HEAETBEAT_INTERVAL 9 . 
9 <nl> - # define SRS_CONF_DEFAULT_HTTP_HEAETBEAT_URL " http : / / " SRS_CONSTS_LOCALHOST " : 8085 / api / v1 / servers " <nl> - # define SRS_CONF_DEFAULT_HTTP_HEAETBEAT_SUMMARIES false <nl> - <nl> - # define SRS_CONF_DEFAULT_SECURITY_ENABLED false <nl> - <nl> - # define SRS_CONF_DEFAULT_STREAM_CASTER_ENABLED false <nl> - # define SRS_CONF_DEFAULT_STREAM_CASTER_MPEGTS_OVER_UDP " mpegts_over_udp " <nl> - # define SRS_CONF_DEFAULT_STREAM_CASTER_RTSP " rtsp " <nl> - # define SRS_CONF_DEFAULT_STREAM_CASTER_FLV " flv " <nl> - <nl> - # define SRS_CONF_DEFAULT_STATS_NETWORK_DEVICE_INDEX 0 <nl> - <nl> - # define SRS_CONF_DEFAULT_PITHY_PRINT_MS 10000 <nl> - <nl> - # define SRS_CONF_DEFAULT_INGEST_TYPE_FILE " file " <nl> - # define SRS_CONF_DEFAULT_INGEST_TYPE_STREAM " stream " <nl> - <nl> - # define SRS_CONF_DEFAULT_TRANSCODE_IFORMAT " flv " <nl> - # define SRS_CONF_DEFAULT_TRANSCODE_OFORMAT " flv " <nl> - <nl> - # define SRS_CONF_DEFAULT_EDGE_MODE false <nl> - # define SRS_CONF_DEFAULT_EDGE_TOKEN_TRAVERSE false <nl> - # define SRS_CONF_DEFAULT_EDGE_TRANSFORM_VHOST " [ vhost ] " <nl> - <nl> - / / hds default value <nl> - # define SRS_CONF_DEFAULT_HDS_PATH " . / objs / nginx / html " <nl> - # define SRS_CONF_DEFAULT_HDS_WINDOW ( 60 ) <nl> - # define SRS_CONF_DEFAULT_HDS_FRAGMENT ( 10 ) <nl> - <nl> namespace _srs_internal <nl> { <nl> class SrsConfigBuffer ; <nl> namespace _srs_internal <nl> / * * <nl> * deep compare directive . <nl> * / <nl> - bool srs_directive_equals ( SrsConfDirective * a , SrsConfDirective * b ) ; <nl> + extern bool srs_directive_equals ( SrsConfDirective * a , SrsConfDirective * b ) ; <nl> + <nl> + / * * <nl> + * helper utilities , used for compare the consts values . <nl> + * / <nl> + extern bool srs_config_hls_is_on_error_ignore ( std : : string strategy ) ; <nl> + extern bool srs_config_hls_is_on_error_continue ( std : : string strategy ) ; <nl> + extern bool srs_config_ingest_is_file ( std : : string type ) ; <nl> + extern bool srs_config_ingest_is_stream ( std : : string type ) ; <nl> + extern bool srs_config_dvr_is_plan_segment ( std : : string plan ) ; <nl> + extern bool srs_config_dvr_is_plan_session ( std : : string plan ) ; <nl> + extern bool srs_config_dvr_is_plan_append ( std : : string plan ) ; <nl> <nl> / / global config <nl> extern SrsConfig * _srs_config ; <nl> mmm a / trunk / src / app / srs_app_dvr . cpp <nl> ppp b / trunk / src / app / srs_app_dvr . cpp <nl> int SrsDvrPlan : : on_reap_segment ( ) <nl> SrsDvrPlan * SrsDvrPlan : : create_plan ( string vhost ) <nl> { <nl> std : : string plan = _srs_config - > get_dvr_plan ( vhost ) ; <nl> - if ( plan = = SRS_CONF_DEFAULT_DVR_PLAN_SEGMENT ) { <nl> + if ( srs_config_dvr_is_plan_segment ( plan ) ) { <nl> return new SrsDvrSegmentPlan ( ) ; <nl> - } else if ( plan = = SRS_CONF_DEFAULT_DVR_PLAN_SESSION ) { <nl> + } else if ( srs_config_dvr_is_plan_session ( plan ) ) { <nl> return new SrsDvrSessionPlan ( ) ; <nl> - } else if ( plan = = SRS_CONF_DEFAULT_DVR_PLAN_APPEND ) { <nl> + } else if ( srs_config_dvr_is_plan_append ( plan ) ) { <nl> return new SrsDvrAppendPlan ( ) ; <nl> } else { <nl> srs_error ( " invalid dvr plan = % s , vhost = % s " , plan . c_str ( ) , vhost . c_str ( ) ) ; <nl> mmm a / trunk / src / app / srs_app_ingest . cpp <nl> ppp b / trunk / src / app / srs_app_ingest . 
cpp <nl> int SrsIngester : : initialize_ffmpeg ( SrsFFMPEG * ffmpeg , SrsConfDirective * vhost , S <nl> return ret ; <nl> } <nl> <nl> - if ( input_type = = SRS_CONF_DEFAULT_INGEST_TYPE_FILE ) { <nl> + if ( srs_config_ingest_is_file ( input_type ) ) { <nl> std : : string input_url = _srs_config - > get_ingest_input_url ( ingest ) ; <nl> if ( input_url . empty ( ) ) { <nl> ret = ERROR_ENCODER_NO_INPUT ; <nl> int SrsIngester : : initialize_ffmpeg ( SrsFFMPEG * ffmpeg , SrsConfDirective * vhost , S <nl> if ( ( ret = ffmpeg - > initialize ( input_url , output , log_file ) ) ! = ERROR_SUCCESS ) { <nl> return ret ; <nl> } <nl> - } else if ( input_type = = SRS_CONF_DEFAULT_INGEST_TYPE_STREAM ) { <nl> + } else if ( srs_config_ingest_is_stream ( input_type ) ) { <nl> std : : string input_url = _srs_config - > get_ingest_input_url ( ingest ) ; <nl> if ( input_url . empty ( ) ) { <nl> ret = ERROR_ENCODER_NO_INPUT ; <nl> mmm a / trunk / src / app / srs_app_source . cpp <nl> ppp b / trunk / src / app / srs_app_source . cpp <nl> int SrsSource : : on_audio_imp ( SrsSharedPtrMessage * msg ) <nl> / / apply the error strategy for hls . <nl> / / @ see https : / / github . com / simple - rtmp - server / srs / issues / 264 <nl> std : : string hls_error_strategy = _srs_config - > get_hls_on_error ( _req - > vhost ) ; <nl> - if ( hls_error_strategy = = SRS_CONF_DEFAULT_HLS_ON_ERROR_IGNORE ) { <nl> + if ( srs_config_hls_is_on_error_ignore ( hls_error_strategy ) ) { <nl> srs_warn ( " hls process audio message failed , ignore and disable hls . ret = % d " , ret ) ; <nl> <nl> / / unpublish , ignore ret . <nl> int SrsSource : : on_audio_imp ( SrsSharedPtrMessage * msg ) <nl> <nl> / / ignore . <nl> ret = ERROR_SUCCESS ; <nl> - } else if ( hls_error_strategy = = SRS_CONF_DEFAULT_HLS_ON_ERROR_CONTINUE ) { <nl> + } else if ( srs_config_hls_is_on_error_continue ( hls_error_strategy ) ) { <nl> / / compare the sequence header with audio , continue when it ' s actually an sequence header . <nl> if ( ret = = ERROR_HLS_DECODE_ERROR & & cache_sh_audio & & cache_sh_audio - > size = = msg - > size ) { <nl> srs_warn ( " the audio is actually a sequence header , ignore this packet . " ) ; <nl> int SrsSource : : on_video_imp ( SrsSharedPtrMessage * msg ) <nl> / / apply the error strategy for hls . <nl> / / @ see https : / / github . com / simple - rtmp - server / srs / issues / 264 <nl> std : : string hls_error_strategy = _srs_config - > get_hls_on_error ( _req - > vhost ) ; <nl> - if ( hls_error_strategy = = SRS_CONF_DEFAULT_HLS_ON_ERROR_IGNORE ) { <nl> + if ( srs_config_hls_is_on_error_ignore ( hls_error_strategy ) ) { <nl> srs_warn ( " hls process video message failed , ignore and disable hls . ret = % d " , ret ) ; <nl> <nl> / / unpublish , ignore ret . <nl> int SrsSource : : on_video_imp ( SrsSharedPtrMessage * msg ) <nl> <nl> / / ignore . <nl> ret = ERROR_SUCCESS ; <nl> - } else if ( hls_error_strategy = = SRS_CONF_DEFAULT_HLS_ON_ERROR_CONTINUE ) { <nl> + } else if ( srs_config_hls_is_on_error_continue ( hls_error_strategy ) ) { <nl> / / compare the sequence header with video , continue when it ' s actually an sequence header . <nl> if ( ret = = ERROR_HLS_DECODE_ERROR & & cache_sh_video & & cache_sh_video - > size = = msg - > size ) { <nl> srs_warn ( " the video is actually a sequence header , ignore this packet . " ) ; <nl>
refine config default values of srs, prepare to move each default value to functions.
ossrs/srs
f39faa78bb48b58e90cfa38eed27ad84655d988c
2015-07-06T03:11:59Z
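A minimal sketch (not the actual SRS code, whose helper bodies are not shown in this diff) of the refactoring direction the commit above describes: scattered comparisons against `SRS_CONF_DEFAULT_*` macros are replaced by named predicate helpers, so callers such as `SrsDvrPlan::create_plan` no longer need the macro header. The literal values and helper implementations below are assumptions.

```cpp
// Pattern sketch: keep the default literals in one translation unit and expose
// intent-revealing predicates instead of raw string comparisons.
#include <string>

// Assumed literals; in SRS the real defaults live inside the config source file.
static const std::string kDvrPlanSession = "session";
static const std::string kDvrPlanSegment = "segment";
static const std::string kDvrPlanAppend  = "append";

bool srs_config_dvr_is_plan_session(std::string plan) { return plan == kDvrPlanSession; }
bool srs_config_dvr_is_plan_segment(std::string plan) { return plan == kDvrPlanSegment; }
bool srs_config_dvr_is_plan_append(std::string plan)  { return plan == kDvrPlanAppend; }

// Caller side (as in the diff above), now free of SRS_CONF_DEFAULT_* macros:
//   if (srs_config_dvr_is_plan_segment(plan)) { return new SrsDvrSegmentPlan(); }
```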
mmm a / Code / Sandbox / Plugins / EditorAudioControlsEditor / EditorPortAudio / EditorImpl . cpp <nl> ppp b / Code / Sandbox / Plugins / EditorAudioControlsEditor / EditorPortAudio / EditorImpl . cpp <nl> <nl> # include < CryAudio / IProfileData . h > <nl> # include < CrySerialization / IArchiveHost . h > <nl> # include < CrySerialization / ClassFactory . h > <nl> - # include < CryAudio / IAudioInterfacesCommonData . h > <nl> <nl> namespace ACE <nl> { <nl> mmm a / Code / Sandbox / Plugins / EditorAudioControlsEditor / EditorSDLMixer / EditorImpl . cpp <nl> ppp b / Code / Sandbox / Plugins / EditorAudioControlsEditor / EditorSDLMixer / EditorImpl . cpp <nl> <nl> # include < CryAudio / IProfileData . h > <nl> # include < CrySerialization / IArchiveHost . h > <nl> # include < CrySerialization / ClassFactory . h > <nl> - # include < CryAudio / IAudioInterfacesCommonData . h > <nl> <nl> namespace ACE <nl> { <nl> mmm a / Code / Sandbox / Plugins / EditorAudioControlsEditor / EditorWwise / EditorImpl . cpp <nl> ppp b / Code / Sandbox / Plugins / EditorAudioControlsEditor / EditorWwise / EditorImpl . cpp <nl> <nl> # include < CryAudio / IProfileData . h > <nl> # include < CrySystem / ISystem . h > <nl> # include < CryCore / CryCrc32 . h > <nl> - # include < SystemTypes . h > <nl> # include < CryCore / StlUtils . h > <nl> # include < CrySerialization / IArchiveHost . h > <nl> <nl> mmm a / Code / Sandbox / Plugins / EditorAudioControlsEditor / SystemControlsModel . h <nl> ppp b / Code / Sandbox / Plugins / EditorAudioControlsEditor / SystemControlsModel . h <nl> enum class EDataRole <nl> <nl> namespace AudioModelUtils <nl> { <nl> - void GetAssetsFromIndices ( QModelIndexList const & list , std : : vector < CSystemLibrary * > & outLibraries , std : : vector < CSystemFolder * > & outFolders , std : : vector < CSystemControl * > & outControls ) ; <nl> + void GetAssetsFromIndices ( QModelIndexList const & list , std : : vector < CSystemLibrary * > & outLibraries , std : : vector < CSystemFolder * > & outFolders , std : : vector < CSystemControl * > & outControls ) ; <nl> CSystemAsset * GetAssetFromIndex ( QModelIndex const & index ) ; <nl> } / / namespace AudioModelUtils <nl> <nl> mmm a / Code / Sandbox / Plugins / EditorAudioControlsEditor / SystemControlsWidget . cpp <nl> ppp b / Code / Sandbox / Plugins / EditorAudioControlsEditor / SystemControlsWidget . 
cpp <nl> CSystemControlsWidget : : CSystemControlsWidget ( CSystemAssetsManager * pAssetsManage <nl> m_pTreeView - > setDragDropMode ( QAbstractItemView : : DragDrop ) ; <nl> m_pTreeView - > setDefaultDropAction ( Qt : : MoveAction ) ; <nl> m_pTreeView - > setSelectionMode ( QAbstractItemView : : ExtendedSelection ) ; <nl> + m_pTreeView - > setSelectionBehavior ( QAbstractItemView : : SelectRows ) ; <nl> m_pTreeView - > setContextMenuPolicy ( Qt : : CustomContextMenu ) ; <nl> m_pTreeView - > setModel ( m_pFilterProxyModel ) ; <nl> m_pTreeView - > sortByColumn ( 0 , Qt : : AscendingOrder ) ; <nl> + m_pTreeView - > viewport ( ) - > installEventFilter ( this ) ; <nl> m_pTreeView - > installEventFilter ( this ) ; <nl> pSplitter - > addWidget ( m_pTreeView ) ; <nl> <nl> bool CSystemControlsWidget : : eventFilter ( QObject * pObject , QEvent * pEvent ) <nl> } <nl> } <nl> } <nl> + else if ( pEvent - > type ( ) = = QEvent : : Drop ) <nl> + { <nl> + m_pTreeView - > selectionModel ( ) - > clearSelection ( ) ; <nl> + m_pTreeView - > selectionModel ( ) - > clearCurrentIndex ( ) ; <nl> + } <nl> + <nl> return QWidget : : eventFilter ( pObject , pEvent ) ; <nl> } <nl> <nl> CSystemAsset * CSystemControlsWidget : : GetSelectedAsset ( ) const <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> void CSystemControlsWidget : : SelectNewAsset ( QModelIndex const & parent , int const row ) <nl> { <nl> - if ( m_isCreatedFromMenu ) <nl> + if ( ! CAudioControlsEditorPlugin : : GetAssetsManager ( ) - > IsLoading ( ) ) <nl> { <nl> QModelIndex const & assetIndex = m_pFilterProxyModel - > mapFromSource ( m_pMountingProxyModel - > index ( row , 0 , parent ) ) ; <nl> - m_pTreeView - > setCurrentIndex ( assetIndex ) ; <nl> - m_pTreeView - > edit ( assetIndex ) ; <nl> - m_isCreatedFromMenu = false ; <nl> - } <nl> - else if ( ! CAudioControlsEditorPlugin : : GetAssetsManager ( ) - > IsLoading ( ) ) <nl> - { <nl> - QModelIndex const & parentIndex = m_pFilterProxyModel - > mapFromSource ( parent ) ; <nl> - m_pTreeView - > expand ( parentIndex ) ; <nl> - m_pTreeView - > setCurrentIndex ( parentIndex ) ; <nl> + <nl> + if ( m_isCreatedFromMenu ) <nl> + { <nl> + m_pTreeView - > setCurrentIndex ( assetIndex ) ; <nl> + m_pTreeView - > edit ( assetIndex ) ; <nl> + m_isCreatedFromMenu = false ; <nl> + } <nl> + else <nl> + { <nl> + QModelIndex const & parentIndex = m_pFilterProxyModel - > mapFromSource ( parent ) ; <nl> + m_pTreeView - > expand ( parentIndex ) ; <nl> + m_pTreeView - > selectionModel ( ) - > select ( assetIndex , QItemSelectionModel : : Select | QItemSelectionModel : : Rows ) ; <nl> + <nl> + if ( m_pTreeView - > selectionModel ( ) - > selectedRows ( ) . size ( ) = = 1 ) <nl> + { <nl> + m_pTreeView - > setCurrentIndex ( assetIndex ) ; <nl> + } <nl> + else <nl> + { <nl> + m_pTreeView - > scrollTo ( assetIndex ) ; <nl> + } <nl> + } <nl> } <nl> } <nl> <nl> mmm a / Code / Sandbox / Plugins / EditorAudioControlsEditor / common / IEditorImpl . h <nl> ppp b / Code / Sandbox / Plugins / EditorAudioControlsEditor / common / IEditorImpl . h <nl> <nl> # pragma once <nl> <nl> # include " SystemTypes . h " <nl> + <nl> # include < CrySystem / XML / IXml . h > <nl> # include < CryAudio / IAudioInterfacesCommonData . h > <nl> <nl> <nl> namespace ACE <nl> { <nl> class CImplItem ; <nl> - class CImplConnection ; <nl> <nl> struct IImplSettings <nl> { <nl>
!XT (Audio) On drag and drop, clear selection and select dropped items
CRYTEK/CRYENGINE
349fa8948bdb20f4eb0f4f8e610a921b36e9f7e7
2017-10-23T15:53:59Z
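A hedged sketch of the Qt pattern the commit above applies in CSystemControlsWidget: install an event filter on the tree view and on its viewport (drop events are delivered to the viewport), and clear the current selection when a drop arrives so that only the freshly dropped items end up selected. Class and member names here are illustrative, not the CryEngine originals.

```cpp
#include <QEvent>
#include <QItemSelectionModel>
#include <QTreeView>
#include <QWidget>

class ControlsWidget : public QWidget
{
public:
	explicit ControlsWidget(QTreeView* pTreeView)
		: m_pTreeView(pTreeView)
	{
		// Filter events on both the view and its viewport; drops land on the viewport.
		m_pTreeView->installEventFilter(this);
		m_pTreeView->viewport()->installEventFilter(this);
	}

protected:
	bool eventFilter(QObject* pObject, QEvent* pEvent) override
	{
		if (pEvent->type() == QEvent::Drop && m_pTreeView->selectionModel() != nullptr)
		{
			// Drop the stale selection; the model/view code then selects the dropped rows.
			m_pTreeView->selectionModel()->clearSelection();
			m_pTreeView->selectionModel()->clearCurrentIndex();
		}
		return QWidget::eventFilter(pObject, pEvent);
	}

private:
	QTreeView* m_pTreeView;
};
```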
mmm a / src / flag - definitions . h <nl> ppp b / src / flag - definitions . h <nl> DEFINE_IMPLICATION ( es_staging , harmony_tailcalls ) <nl> / / Features that are complete ( but still behind - - harmony / es - staging flag ) . <nl> # define HARMONY_STAGED ( V ) \ <nl> V ( harmony_regexp_lookbehind , " harmony regexp lookbehind " ) \ <nl> + V ( harmony_instanceof , " harmony instanceof support " ) \ <nl> V ( harmony_object_values_entries , " harmony Object . values / Object . entries " ) \ <nl> V ( harmony_object_own_property_descriptors , \ <nl> " harmony Object . getOwnPropertyDescriptors ( ) " ) \ <nl> DEFINE_IMPLICATION ( es_staging , harmony_tailcalls ) <nl> # define HARMONY_SHIPPING ( V ) \ <nl> V ( harmony_array_prototype_values , " harmony Array . prototype . values " ) \ <nl> V ( harmony_function_name , " harmony Function name inference " ) \ <nl> - V ( harmony_instanceof , " harmony instanceof support " ) \ <nl> V ( harmony_iterator_close , " harmony iterator finalization " ) \ <nl> V ( harmony_regexps , " harmony regular expression extensions " ) \ <nl> V ( harmony_unicode_regexps , " harmony unicode regexps " ) \ <nl>
Revert of [es6] Ship new ES6 instanceof operator semantics. (patchset id: 1 of https://codereview.chromium.org/1820903002/)
v8/v8
3521b37df2b24261ae5fb4f73404bdaab2f58009
2016-03-22T16:04:52Z
mmm a / tensorflow / python / distribute / custom_training_loop_test . py <nl> ppp b / tensorflow / python / distribute / custom_training_loop_test . py <nl> <nl> from tensorflow . python . util import nest <nl> <nl> <nl> - class InputIterationTest ( test . TestCase , parameterized . TestCase ) : <nl> + def get_dataset_from_tensor_slices ( inp_array ) : <nl> + dataset = dataset_ops . DatasetV2 . from_tensor_slices ( inp_array ) <nl> + # TODO ( b / 138326910 ) : Remove Dataset V1 version once bug resolved . <nl> + if not tf2 . enabled ( ) : <nl> + dataset = dataset_ops . Dataset . from_tensor_slices ( inp_array ) <nl> + return dataset <nl> + <nl> + <nl> + class AssertFlattenedMixin ( object ) : <nl> + " " " Mixin for specialized asserts . " " " <nl> + <nl> + def assert_equal_flattened ( self , expected_results , actual_results ) : <nl> + " " " Asserts that flattened results are equal . <nl> + <nl> + Due to the number of replicas in the strategy , the output may have a <nl> + different structure and needs to be flattened for comparison . <nl> + <nl> + Args : <nl> + expected_results : The results expected as a result of a computation . <nl> + actual_results : The actual results of a computation . <nl> + " " " <nl> + self . assertEqual ( len ( expected_results ) , len ( actual_results ) ) <nl> + <nl> + for i , expected_result in enumerate ( expected_results ) : <nl> + final_result = [ ] <nl> + actual_result = actual_results [ i ] <nl> + for val in actual_result : <nl> + final_result . extend ( val . numpy ( ) ) <nl> + self . assertAllEqual ( expected_result , final_result ) <nl> + <nl> + <nl> + class InputIterationTest ( test . TestCase , parameterized . TestCase , <nl> + AssertFlattenedMixin ) : <nl> <nl> @ combinations . generate ( <nl> combinations . combine ( <nl> def computation ( x ) : <nl> distribution = strategy_combinations . strategies_minus_tpu , <nl> mode = [ " eager " ] ) ) <nl> def testFullEager ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> <nl> def train_step ( data ) : <nl> return math_ops . square ( data ) <nl> def train_step ( data ) : <nl> output = distribution . experimental_local_results ( <nl> distribution . experimental_run_v2 ( train_step , args = ( x , ) ) ) <nl> results . append ( output ) <nl> - self . _assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> + self . assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> <nl> @ combinations . generate ( <nl> combinations . combine ( <nl> def train_step ( data ) : <nl> mode = [ " eager " ] <nl> ) ) <nl> def testStepInFunction ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> <nl> @ def_function . function <nl> def train_step ( data ) : <nl> def train_step ( data ) : <nl> output = distribution . experimental_local_results ( <nl> distribution . experimental_run_v2 ( train_step , args = ( x , ) ) ) <nl> results . append ( output ) <nl> - self . _assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> + self . assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> <nl> @ combinations . generate ( <nl> combinations . 
combine ( <nl> def train_step ( data ) : <nl> mode = [ " eager " ] <nl> ) ) <nl> def testRunInFunction ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> <nl> def train_step ( data ) : <nl> return math_ops . square ( data ) <nl> def f_train_step ( input_data ) : <nl> for x in dist_dataset : <nl> output = f_train_step ( x ) <nl> results . append ( output ) <nl> - self . _assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> + self . assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> <nl> @ combinations . generate ( <nl> combinations . combine ( <nl> def f_train_step ( input_data ) : <nl> ] , <nl> mode = [ " eager " ] ) ) <nl> def testNestedOutput ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 0 , 1 , 2 , 3 ] ) . batch ( 2 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 0 , 1 , 2 , 3 ] ) . batch ( 2 ) <nl> input_iterator = iter ( distribution . experimental_distribute_dataset ( dataset ) ) <nl> <nl> @ def_function . function <nl> def computation ( x ) : <nl> mode = [ " eager " ] <nl> ) ) <nl> def testRunInFunctionAutoGraphApplication ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> <nl> def train_step ( data ) : <nl> return math_ops . square ( data ) <nl> def f_train_step ( input_data ) : <nl> for x in dist_dataset : <nl> output = f_train_step ( x ) <nl> results . append ( output ) <nl> - self . _assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> + self . assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> <nl> @ combinations . generate ( <nl> combinations . combine ( <nl> def f_train_step ( dist_dataset ) : <nl> <nl> return number_of_steps , product_of_means <nl> <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> dist_dataset = distribution . experimental_distribute_dataset ( dataset ) <nl> <nl> number_of_steps , product_of_means = f_train_step ( dist_dataset ) <nl> def train ( dataset ) : <nl> mode = [ " eager " ] <nl> ) ) <nl> def testDynamicShapes ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . ] ) . batch ( 4 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . ] ) . batch ( 4 ) <nl> input_iterator = iter ( distribution . experimental_distribute_dataset ( dataset ) ) <nl> <nl> @ def_function . function <nl> def computation ( x ) : <nl> mode = [ " eager " ] <nl> ) ) <nl> def testDynamicShapesWithGetNextOutsideFunction ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . ] ) . batch ( 4 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . ] ) . batch ( 4 ) <nl> input_iterator = iter ( distribution . experimental_distribute_dataset ( dataset ) ) <nl> <nl> @ def_function . function <nl> def computation ( x ) : <nl> mode = [ " eager " ] <nl> ) ) <nl> def testStrategyReduceWithDynamicShapes ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . ] ) . 
batch ( 4 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . ] ) . batch ( 4 ) <nl> input_iterator = iter ( distribution . experimental_distribute_dataset ( dataset ) ) <nl> <nl> @ def_function . function <nl> def run ( iterator ) : <nl> mode = [ " eager " ] <nl> ) ) <nl> def testStrategyReduceWithDynamicShapesRank2 ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( <nl> + dataset = get_dataset_from_tensor_slices ( <nl> [ [ 1 . , 1 . ] , [ 1 . , 1 . ] , [ 1 . , 1 . ] ] ) . batch ( 4 ) <nl> input_iterator = iter ( distribution . experimental_distribute_dataset ( dataset ) ) <nl> <nl> def run ( iterator ) : <nl> mode = [ " eager " ] <nl> ) ) <nl> def testDynamicShapesWithSizeOp ( self , distribution ) : <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . ] ) . batch ( 4 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . ] ) . batch ( 4 ) <nl> input_iterator = iter ( distribution . experimental_distribute_dataset ( dataset ) ) <nl> <nl> @ def_function . function <nl> def computation ( x ) : <nl> ) ) <nl> def testDynamicShapesWithFirstReplicaNotMaximumShape ( self , distribution ) : <nl> def dataset_fn ( _ ) : <nl> - dataset1 = self . _get_dataset_from_tensor_slices ( [ [ 1 . , 2 . ] , [ 1 . , 2 . ] ] ) <nl> - dataset2 = self . _get_dataset_from_tensor_slices ( [ [ 1 . , 2 . , 3 . ] , <nl> - [ 1 . , 2 . , 3 . ] ] ) <nl> + dataset1 = get_dataset_from_tensor_slices ( [ [ 1 . , 2 . ] , [ 1 . , 2 . ] ] ) <nl> + dataset2 = get_dataset_from_tensor_slices ( [ [ 1 . , 2 . , 3 . ] , <nl> + [ 1 . , 2 . , 3 . ] ] ) <nl> dataset = dataset1 . concatenate ( dataset2 ) <nl> dataset = dataset . batch ( 2 , drop_remainder = True ) <nl> return dataset <nl> def testDatasetDistributeEvenlyDivisibleDrop ( self , distribution ) : <nl> # drop_remainder = True on the dataset , then DistributedIterator will use a <nl> # different ( and more efficient ) code path which avoids some control flow <nl> # ops . <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . ] ) . batch ( <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . ] ) . batch ( <nl> 2 , drop_remainder = True ) <nl> input_iterator = iter ( distribution . experimental_distribute_dataset ( dataset ) ) <nl> <nl> def testDatasetDistributeEvenlyDivisibleDrop ( self , distribution ) : <nl> def testDatasetDistributeNotDivisibleDrop ( self , distribution ) : <nl> # If each batch is not evenly divisible by the number of workers , <nl> # the remainder will be dropped . <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . ] ) . batch ( <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . ] ) . batch ( <nl> 1 , drop_remainder = True ) <nl> input_iterator = iter ( distribution . experimental_distribute_dataset ( dataset ) ) <nl> <nl> def testDatasetDistributeEvenlyDivisibleNoDrop ( self , distribution ) : <nl> # Setting drop_remainder = False on the dataset causes DistributedIterator <nl> # to use get_next_as_optional ( ) , even if the batched dataset is evenly <nl> # divisible by the number of workers . <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . ] ) . batch ( <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . ] ) . batch ( <nl> 2 , drop_remainder = False ) <nl> input_iterator = iter ( distribution . experimental_distribute_dataset ( dataset ) ) <nl> <nl> def train ( dataset ) : <nl> results . append ( output ) <nl> return results <nl> <nl> - dataset = self . 
_get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> dist_dataset = distribution . experimental_distribute_dataset ( dataset ) <nl> results = train ( dist_dataset ) <nl> - self . _assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> + self . assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> <nl> @ combinations . generate ( <nl> combinations . combine ( <nl> def f_train_step ( input_data ) : <nl> return distribution . experimental_local_results ( <nl> distribution . experimental_run_v2 ( train_step , args = ( input_data , ) ) ) <nl> <nl> - dataset = self . _get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> dist_dataset = distribution . experimental_distribute_dataset ( dataset ) <nl> iterator = iter ( dist_dataset ) <nl> results = [ ] <nl> def f_train_step ( input_data ) : <nl> for _ in range ( 2 ) : <nl> output = f_train_step ( next ( iterator ) ) <nl> results . append ( output ) <nl> - self . _assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> + self . assert_equal_flattened ( [ [ 25 . , 36 . ] , [ 49 . , 64 . ] ] , results ) <nl> <nl> - def _get_dataset_from_tensor_slices ( self , inp_array ) : <nl> - dataset = dataset_ops . DatasetV2 . from_tensor_slices ( inp_array ) <nl> - # TODO ( b / 138326910 ) : Remove Dataset V1 version once bug resolved . <nl> - if not tf2 . enabled ( ) : <nl> - dataset = dataset_ops . Dataset . from_tensor_slices ( inp_array ) <nl> - return dataset <nl> <nl> - def _assert_equal_flattened ( self , expected_results , actual_results ) : <nl> - " " " Asserts that flattened results are equal . <nl> + class GradientTapeTest ( test . TestCase , parameterized . TestCase , <nl> + AssertFlattenedMixin ) : <nl> <nl> - Due to the number of replicas in the strategy , the output may have a <nl> - different structure and needs to be flattened for comparison . <nl> + @ combinations . generate ( <nl> + combinations . combine ( <nl> + distribution = strategy_combinations . all_strategies , <nl> + mode = [ " eager " ] <nl> + ) ) <nl> + def testStepInFunctionGradient ( self , distribution ) : <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> <nl> - Args : <nl> - expected_results : The results expected as a result of a computation . <nl> - actual_results : The actual results of a computation . <nl> - " " " <nl> - self . assertEqual ( len ( expected_results ) , len ( actual_results ) ) <nl> + @ def_function . function <nl> + def train_step ( x ) : <nl> + def computation ( x ) : <nl> + return math_ops . square ( x ) <nl> + with backprop . GradientTape ( ) as tape : <nl> + tape . watch ( x ) # Manually watch non - variable tensors . <nl> + y = computation ( x ) <nl> + grads = tape . gradient ( y , x ) <nl> + return grads <nl> <nl> - for i , expected_result in enumerate ( expected_results ) : <nl> - final_result = [ ] <nl> - actual_result = actual_results [ i ] <nl> - for val in actual_result : <nl> - final_result . extend ( val . numpy ( ) ) <nl> - self . assertAllEqual ( expected_result , final_result ) <nl> + dist_dataset = distribution . experimental_distribute_dataset ( dataset ) <nl> + results = [ ] <nl> + for x in dist_dataset : <nl> + output = distribution . experimental_local_results ( <nl> + distribution . 
experimental_run_v2 ( train_step , args = ( x , ) ) ) <nl> + results . append ( output ) <nl> + self . assert_equal_flattened ( [ [ 10 . , 12 . ] , [ 14 . , 16 . ] ] , results ) <nl> + <nl> + @ combinations . generate ( <nl> + combinations . combine ( <nl> + distribution = strategy_combinations . all_strategies , <nl> + mode = [ " eager " ] <nl> + ) ) <nl> + def testRunInFunctionGradient ( self , distribution ) : <nl> + dataset = get_dataset_from_tensor_slices ( [ 5 . , 6 . , 7 . , 8 . ] ) . batch ( 2 ) <nl> <nl> + @ def_function . function <nl> + def run ( x ) : <nl> + def train_step ( x ) : <nl> + def computation ( x ) : <nl> + return math_ops . square ( x ) <nl> + with backprop . GradientTape ( ) as tape : <nl> + tape . watch ( x ) # Manually watch non - variable tensors . <nl> + y = computation ( x ) <nl> + grads = tape . gradient ( y , x ) <nl> + return grads <nl> + return distribution . experimental_local_results ( <nl> + distribution . experimental_run_v2 ( train_step , args = ( x , ) ) ) <nl> <nl> - class GradientTapeTest ( test . TestCase , parameterized . TestCase ) : <nl> + dist_dataset = distribution . experimental_distribute_dataset ( dataset ) <nl> + results = [ ] <nl> + for x in dist_dataset : <nl> + output = run ( x ) <nl> + results . append ( output ) <nl> + self . assert_equal_flattened ( [ [ 10 . , 12 . ] , [ 14 . , 16 . ] ] , results ) <nl> <nl> @ combinations . generate ( <nl> combinations . combine ( <nl>
Add gradient/backprop tests for custom training loops for dist strategies.
tensorflow/tensorflow
847c879de70cb443baafe2ceedbaa040333a00de
2020-01-17T21:48:33Z
mmm a / arangod / MRServer / ApplicationMR . cpp <nl> ppp b / arangod / MRServer / ApplicationMR . cpp <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> - / / / @ brief MR enigne configuration <nl> + / / / @ brief MR engine configuration <nl> / / / <nl> / / / @ file <nl> / / / <nl> <nl> # include " ApplicationMR . h " <nl> <nl> # include " Basics / ConditionLocker . h " <nl> + # include " Basics / ReadLocker . h " <nl> + # include " Basics / WriteLocker . h " <nl> # include " Logger / Logger . h " <nl> # include " MRServer / mr - actions . h " <nl> # include " VocBase / vocbase . h " <nl> namespace { <nl> public : <nl> MRGcThread ( ApplicationMR * applicationMR ) <nl> : Thread ( " mr - gc " ) , <nl> - _applicationMR ( applicationMR ) { <nl> + _applicationMR ( applicationMR ) , <nl> + _lock ( ) , <nl> + _lastGcStamp ( TRI_microtime ( ) ) { <nl> } <nl> <nl> public : <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief collect garbage in an endless loop ( main functon of GC thread ) <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> void run ( ) { <nl> _applicationMR - > collectGarbage ( ) ; <nl> } <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief get the timestamp of the last GC <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + double getLastGcStamp ( ) { <nl> + READ_LOCKER ( _lock ) ; <nl> + return _lastGcStamp ; <nl> + } <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief set the global GC timestamp <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + void updateGcStamp ( double value ) { <nl> + WRITE_LOCKER ( _lock ) ; <nl> + } <nl> + <nl> private : <nl> ApplicationMR * _applicationMR ; <nl> + ReadWriteLock _lock ; <nl> + double _lastGcStamp ; <nl> } ; <nl> <nl> } <nl> ApplicationMR : : ApplicationMR ( string const & binaryPath ) <nl> _startupModules ( ) , <nl> _actionPath ( ) , <nl> _gcInterval ( 1000 ) , <nl> + _gcFrequency ( 10 . 0 ) , <nl> _startupLoader ( ) , <nl> _actionLoader ( ) , <nl> _vocbase ( 0 ) , <nl> ApplicationMR : : MRContext * ApplicationMR : : enterContext ( ) { <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> <nl> void ApplicationMR : : exitContext ( MRContext * context ) { <nl> + MRGcThread * gc = dynamic_cast < MRGcThread * > ( _gcThread ) ; <nl> + assert ( gc ! = 0 ) ; <nl> + double lastGc = gc - > getLastGcStamp ( ) ; <nl> + <nl> + + context - > _dirt ; <nl> <nl> { <nl> CONDITION_LOCKER ( guard , _contextCondition ) ; <nl> <nl> - if ( context - > _dirt < _gcInterval ) { <nl> - _freeContexts . 
push_back ( context ) ; <nl> + if ( context - > _lastGcStamp + _gcFrequency < lastGc ) { <nl> + LOGGER_TRACE < < " periodic gc interval reached " ; <nl> + _dirtyContexts . push_back ( context ) ; <nl> } <nl> - else { <nl> + else if ( context - > _dirt > = _gcInterval ) { <nl> + LOGGER_TRACE < < " maximum number of requests reached " ; <nl> _dirtyContexts . push_back ( context ) ; <nl> } <nl> + else { <nl> + _freeContexts . push_back ( context ) ; <nl> + } <nl> <nl> guard . broadcast ( ) ; <nl> } <nl> void ApplicationMR : : exitContext ( MRContext * context ) { <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> <nl> void ApplicationMR : : collectGarbage ( ) { <nl> + MRGcThread * gc = dynamic_cast < MRGcThread * > ( _gcThread ) ; <nl> + assert ( gc ! = 0 ) ; <nl> + uint64_t waitTime = ( uint64_t ) ( _gcFrequency * 1000 . 0 * 1000 . 0 ) ; <nl> + <nl> while ( _stopping = = 0 ) { <nl> MRContext * context = 0 ; <nl> + bool gotSignal = false ; <nl> <nl> { <nl> CONDITION_LOCKER ( guard , _contextCondition ) ; <nl> <nl> if ( _dirtyContexts . empty ( ) ) { <nl> - guard . wait ( ) ; <nl> + gotSignal = guard . wait ( waitTime ) ; <nl> } <nl> <nl> if ( ! _dirtyContexts . empty ( ) ) { <nl> context = _dirtyContexts . back ( ) ; <nl> _dirtyContexts . pop_back ( ) ; <nl> } <nl> + <nl> + if ( context = = 0 & & ! gotSignal ) { <nl> + / / we did not find a dirty context <nl> + / / so we ' ll pop one of the free contexts and clean it up <nl> + context = _freeContexts . back ( ) ; <nl> + if ( context ! = 0 ) { <nl> + _freeContexts . pop_back ( ) ; <nl> + } <nl> + } <nl> } <nl> <nl> + / / update last gc time <nl> + double lastGc = TRI_microtime ( ) ; <nl> + gc - > updateGcStamp ( lastGc ) ; <nl> + <nl> if ( context ! = 0 ) { <nl> LOGGER_TRACE < < " collecting MR garbage " ; <nl> <nl> / / TODO <nl> <nl> context - > _dirt = 0 ; <nl> + context - > _lastGcStamp = lastGc ; <nl> <nl> { <nl> CONDITION_LOCKER ( guard , _contextCondition ) ; <nl> void ApplicationMR : : disableActions ( ) { <nl> <nl> void ApplicationMR : : setupOptions ( map < string , basics : : ProgramOptionsDescription > & options ) { <nl> options [ " RUBY Options : help - admin " ] <nl> - ( " ruby . gc - interval " , & _gcInterval , " Ruby garbage collection interval ( each x requests ) " ) <nl> + ( " ruby . gc - interval " , & _gcInterval , " Ruby request - based garbage collection interval ( each x requests ) " ) <nl> + ( " ruby . gc - frequency " , & _gcFrequency , " Ruby time - based garbage collection frequency ( each x seconds ) " ) <nl> ; <nl> <nl> options [ " RUBY Options : help - admin " ] <nl> bool ApplicationMR : : prepareMRInstance ( size_t i ) { <nl> } <nl> } <nl> <nl> + context - > _lastGcStamp = TRI_microtime ( ) ; <nl> + <nl> / / and return from the context <nl> LOGGER_TRACE < < " initialised MR context # " < < i ; <nl> <nl> mmm a / arangod / MRServer / ApplicationMR . h <nl> ppp b / arangod / MRServer / ApplicationMR . 
h <nl> namespace triagens { <nl> <nl> struct MRContext { <nl> MR_state_t * _mrs ; <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief number of requests since last GC of the context <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> size_t _dirt ; <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief timestamp of last GC for the context <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + double _lastGcStamp ; <nl> + <nl> } ; <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> namespace triagens { <nl> string _actionPath ; <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> - / / / @ brief JavaScript garbage collection interval ( each x requests ) <nl> + / / / @ brief MRuby garbage collection interval ( each x requests ) <nl> / / / <nl> / / / @ CMDOPT { - - ruby . gc - interval @ CA { interval } } <nl> / / / <nl> / / / Specifies the interval ( approximately in number of requests ) that the <nl> - / / / garbage collection for JavaScript objects will be run in each thread . <nl> + / / / garbage collection for MRuby objects will be run in each thread . <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> <nl> uint64_t _gcInterval ; <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief MRuby garbage collection frequency ( each x seconds ) <nl> + / / / <nl> + / / / @ CMDOPT { - - ruby . gc - frequency @ CA { frequency } } <nl> + / / / <nl> + / / / Specifies the frequency in seconds for the automatic garbage collection of <nl> + / / / MRuby objects . This setting is useful to have the garbage collection <nl> + / / / still work in periods with no or little numbers of requests . <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + double _gcFrequency ; <nl> + <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> / / / @ brief MR startup loader <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> mmm a / arangod / V8Server / ApplicationV8 . cpp <nl> ppp b / arangod / V8Server / ApplicationV8 . 
cpp <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> - / / / @ brief V8 enigne configuration <nl> + / / / @ brief V8 engine configuration <nl> / / / <nl> / / / @ file <nl> / / / <nl> <nl> # include " ApplicationV8 . h " <nl> <nl> # include " Basics / ConditionLocker . h " <nl> + # include " Basics / ReadLocker . h " <nl> + # include " Basics / WriteLocker . h " <nl> # include " Logger / Logger . h " <nl> # include " V8 / v8 - conv . h " <nl> # include " V8 / v8 - shell . h " <nl> namespace { <nl> public : <nl> V8GcThread ( ApplicationV8 * applicationV8 ) <nl> : Thread ( " v8 - gc " ) , <nl> - _applicationV8 ( applicationV8 ) { <nl> + _applicationV8 ( applicationV8 ) , <nl> + _lock ( ) , <nl> + _lastGcStamp ( TRI_microtime ( ) ) { <nl> } <nl> <nl> public : <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief collect garbage in an endless loop ( main functon of GC thread ) <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> void run ( ) { <nl> _applicationV8 - > collectGarbage ( ) ; <nl> } <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief get the timestamp of the last GC <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + double getLastGcStamp ( ) { <nl> + READ_LOCKER ( _lock ) ; <nl> + return _lastGcStamp ; <nl> + } <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief set the global GC timestamp <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + void updateGcStamp ( double value ) { <nl> + WRITE_LOCKER ( _lock ) ; <nl> + _lastGcStamp = value ; <nl> + } <nl> + <nl> private : <nl> ApplicationV8 * _applicationV8 ; <nl> + ReadWriteLock _lock ; <nl> + double _lastGcStamp ; <nl> } ; <nl> <nl> } <nl> ApplicationV8 : : ApplicationV8 ( string const & binaryPath ) <nl> _actionPath ( ) , <nl> _useActions ( true ) , <nl> _gcInterval ( 1000 ) , <nl> + _gcFrequency ( 10 . 0 ) , <nl> _startupLoader ( ) , <nl> _actionLoader ( ) , <nl> _vocbase ( 0 ) , <nl> ApplicationV8 : : V8Context * ApplicationV8 : : enterContext ( ) { <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> <nl> void ApplicationV8 : : exitContext ( V8Context * context ) { <nl> + V8GcThread * gc = dynamic_cast < V8GcThread * > ( _gcThread ) ; <nl> + assert ( gc ! 
= 0 ) ; <nl> + double lastGc = gc - > getLastGcStamp ( ) ; <nl> + <nl> context - > _context - > Exit ( ) ; <nl> context - > _isolate - > Exit ( ) ; <nl> delete context - > _locker ; <nl> - <nl> + <nl> + + context - > _dirt ; <nl> <nl> { <nl> CONDITION_LOCKER ( guard , _contextCondition ) ; <nl> <nl> - if ( context - > _dirt < _gcInterval ) { <nl> - _freeContexts . push_back ( context ) ; <nl> + if ( context - > _lastGcStamp + _gcFrequency < lastGc ) { <nl> + LOGGER_TRACE < < " periodic gc interval reached " ; <nl> + _dirtyContexts . push_back ( context ) ; <nl> } <nl> - else { <nl> + else if ( context - > _dirt > = _gcInterval ) { <nl> + LOGGER_TRACE < < " maximum number of requests reached " ; <nl> _dirtyContexts . push_back ( context ) ; <nl> } <nl> + else { <nl> + _freeContexts . push_back ( context ) ; <nl> + } <nl> <nl> guard . broadcast ( ) ; <nl> } <nl> void ApplicationV8 : : exitContext ( V8Context * context ) { <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> <nl> void ApplicationV8 : : collectGarbage ( ) { <nl> + V8GcThread * gc = dynamic_cast < V8GcThread * > ( _gcThread ) ; <nl> + assert ( gc ! = 0 ) ; <nl> + uint64_t waitTime = ( uint64_t ) ( _gcFrequency * 1000 . 0 * 1000 . 0 ) ; <nl> + <nl> while ( _stopping = = 0 ) { <nl> V8Context * context = 0 ; <nl> + bool gotSignal = false ; <nl> <nl> { <nl> CONDITION_LOCKER ( guard , _contextCondition ) ; <nl> <nl> if ( _dirtyContexts . empty ( ) ) { <nl> - guard . wait ( ) ; <nl> + gotSignal = guard . wait ( waitTime ) ; <nl> } <nl> <nl> if ( ! _dirtyContexts . empty ( ) ) { <nl> context = _dirtyContexts . back ( ) ; <nl> _dirtyContexts . pop_back ( ) ; <nl> } <nl> + <nl> + if ( context = = 0 & & ! gotSignal ) { <nl> + / / we did not find a dirty context <nl> + / / so we ' ll pop one of the free contexts and clean it up <nl> + context = _freeContexts . back ( ) ; <nl> + if ( context ! = 0 ) { <nl> + _freeContexts . pop_back ( ) ; <nl> + } <nl> + } <nl> } <nl> + <nl> + / / update last gc time <nl> + double lastGc = TRI_microtime ( ) ; <nl> + gc - > updateGcStamp ( lastGc ) ; <nl> <nl> if ( context ! = 0 ) { <nl> LOGGER_TRACE < < " collecting V8 garbage " ; <nl> void ApplicationV8 : : collectGarbage ( ) { <nl> delete context - > _locker ; <nl> <nl> context - > _dirt = 0 ; <nl> + context - > _lastGcStamp = lastGc ; <nl> <nl> { <nl> CONDITION_LOCKER ( guard , _contextCondition ) ; <nl> void ApplicationV8 : : disableActions ( ) { <nl> <nl> void ApplicationV8 : : setupOptions ( map < string , basics : : ProgramOptionsDescription > & options ) { <nl> options [ " JAVASCRIPT Options : help - admin " ] <nl> - ( " javascript . gc - interval " , & _gcInterval , " JavaScript garbage collection interval ( each x requests ) " ) <nl> + ( " javascript . gc - interval " , & _gcInterval , " JavaScript request - based garbage collection interval ( each x requests ) " ) <nl> + ( " javascript . gc - frequency " , & _gcFrequency , " JavaScript time - based garbage collection frequency ( each x seconds ) " ) <nl> ; <nl> <nl> options [ " JAVASCRIPT Options : help - admin " ] <nl> bool ApplicationV8 : : prepareV8Instance ( size_t i ) { <nl> context - > _context - > Exit ( ) ; <nl> context - > _isolate - > Exit ( ) ; <nl> delete context - > _locker ; <nl> + <nl> + context - > _lastGcStamp = TRI_microtime ( ) ; <nl> <nl> LOGGER_TRACE < < " initialised V8 context # " < < i ; <nl> <nl> mmm a / arangod / V8Server / ApplicationV8 . 
h <nl> ppp b / arangod / V8Server / ApplicationV8 . h <nl> namespace triagens { <nl> v8 : : Persistent < v8 : : Context > _context ; <nl> v8 : : Isolate * _isolate ; <nl> v8 : : Locker * _locker ; <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief number of requests since last GC of the context <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> size_t _dirt ; <nl> + <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief timestamp of last GC for the context <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + double _lastGcStamp ; <nl> + <nl> } ; <nl> <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> namespace triagens { <nl> <nl> uint64_t _gcInterval ; <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief JavaScript garbage collection frequency ( each x seconds ) <nl> + / / / <nl> + / / / @ CMDOPT { - - javascript . gc - frequency @ CA { frequency } } <nl> + / / / <nl> + / / / Specifies the frequency in seconds for the automatic garbage collection of <nl> + / / / JavaScript objects . This setting is useful to have the garbage collection <nl> + / / / still work in periods with no or little numbers of requests . <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + double _gcFrequency ; <nl> + <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> / / / @ brief V8 startup loader <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> mmm a / lib / Basics / ConditionLocker . cpp <nl> ppp b / lib / Basics / ConditionLocker . 
cpp <nl> void ConditionLocker : : wait ( ) { <nl> _conditionVariable - > wait ( ) ; <nl> } <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief waits for an event to occur , with a timeout <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + bool ConditionLocker : : wait ( uint64_t delay ) { <nl> + return _conditionVariable - > wait ( delay ) ; <nl> + } <nl> + <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> / / / @ brief broadcasts an event <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> mmm a / lib / Basics / ConditionLocker . h <nl> ppp b / lib / Basics / ConditionLocker . h <nl> namespace triagens { <nl> <nl> void wait ( ) ; <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief waits for an event to occur , using a timeout <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + bool wait ( uint64_t ) ; <nl> + <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> / / / @ brief broadcasts an event <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl>
issue
arangodb/arangodb
a667f633d577cf06f452d0b08111bf7d013fa397
2012-09-04T20:30:44Z
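An illustrative sketch, simplified from the commit above, of the dual garbage-collection trigger it introduces for the V8 and MRuby contexts: a context is queued for collection either when it has served a configured number of requests since its last GC, or when its last GC is older than a configured number of seconds, and the global last-GC timestamp is guarded by a read/write lock. `std::shared_mutex` stands in for the project's ReadWriteLock; names and thresholds are assumptions.

```cpp
#include <cstdint>
#include <mutex>
#include <shared_mutex>

struct Context {
  uint64_t dirt = 0;         // requests handled since this context's last GC
  double lastGcStamp = 0.0;  // time of this context's last GC
};

class GcScheduler {
 public:
  GcScheduler(uint64_t gcInterval, double gcFrequency)
      : _gcInterval(gcInterval), _gcFrequency(gcFrequency) {}

  // Called when a request leaves a context: decide if it should be queued as dirty.
  bool needsCollection(Context const& ctx) const {
    double lastGlobalGc = lastGcStamp();
    return ctx.lastGcStamp + _gcFrequency < lastGlobalGc ||  // periodic interval reached
           ctx.dirt >= _gcInterval;                          // request threshold reached
  }

  double lastGcStamp() const {
    std::shared_lock<std::shared_mutex> lock(_lock);
    return _lastGcStamp;
  }

  void updateGcStamp(double value) {
    std::unique_lock<std::shared_mutex> lock(_lock);
    _lastGcStamp = value;
  }

 private:
  uint64_t _gcInterval;             // request-based threshold (e.g. 1000 requests)
  double _gcFrequency;              // time-based threshold in seconds (e.g. 10.0)
  mutable std::shared_mutex _lock;  // protects _lastGcStamp (read/write lock)
  double _lastGcStamp = 0.0;
};
```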
mmm a / libraries / Hash / examples / sha1 . ino <nl> ppp b / libraries / Hash / examples / sha1 . ino <nl> void setup ( ) { <nl> } <nl> <nl> void loop ( ) { <nl> - uint8_t hash [ 20 ] ; <nl> - const uint8_t test [ ] = " test " ; <nl> <nl> - sha1 ( ( uint8_t * ) & test [ 0 ] , sizeof ( test ) - 1 , & hash [ 0 ] ) ; <nl> + / / usage as String <nl> + / / SHA1 : a94a8fe5ccb19ba61c4c0873d391e987982fbbd3 <nl> + <nl> + Serial . print ( " SHA1 : " ) ; <nl> + Serial . println ( sha1 ( " abc " ) ) ; <nl> + <nl> + / / usage as ptr <nl> + / / SHA1 : a94a8fe5ccb19ba61c4c0873d391e987982fbbd3 <nl> + uint8_t hash [ 20 ] ; <nl> + sha1 ( " abc " , & hash [ 0 ] ) ; <nl> <nl> - / / SHA1 : A94A8FE5CCB19BA61C4C0873D391E987982FBBD3 <nl> Serial . print ( " SHA1 : " ) ; <nl> for ( uint16_t i = 0 ; i < 20 ; i + + ) { <nl> - Serial . printf ( " % 02X " , hash [ i ] ) ; <nl> + Serial . printf ( " % 02x " , hash [ i ] ) ; <nl> } <nl> Serial . println ( ) ; <nl> <nl> mmm a / libraries / Hash / src / Hash . cpp <nl> ppp b / libraries / Hash / src / Hash . cpp <nl> <nl> # include " Hash . h " <nl> <nl> extern " C " { <nl> - # include " sha1 / sha1 . h " <nl> + # include " sha1 / sha1 . h " <nl> } <nl> <nl> / * * <nl> void sha1 ( uint8_t * data , uint32_t size , uint8_t hash [ 20 ] ) { <nl> # endif <nl> } <nl> <nl> + void sha1 ( char * data , uint32_t size , uint8_t hash [ 20 ] ) { <nl> + sha1 ( ( uint8_t * ) data , size , hash ) ; <nl> + } <nl> + <nl> void sha1 ( const uint8_t * data , uint32_t size , uint8_t hash [ 20 ] ) { <nl> sha1 ( ( uint8_t * ) data , size , hash ) ; <nl> } <nl> + <nl> + void sha1 ( const char * data , uint32_t size , uint8_t hash [ 20 ] ) { <nl> + sha1 ( ( uint8_t * ) data , size , hash ) ; <nl> + } <nl> + <nl> + void sha1 ( String data , uint8_t hash [ 20 ] ) { <nl> + sha1 ( data . c_str ( ) , data . length ( ) , hash ) ; <nl> + } <nl> + <nl> + String sha1 ( uint8_t * data , uint32_t size ) { <nl> + uint8_t hash [ 20 ] ; <nl> + String hashStr = " " ; <nl> + <nl> + sha1 ( & data [ 0 ] , size , & hash [ 0 ] ) ; <nl> + <nl> + for ( uint16_t i = 0 ; i < 20 ; i + + ) { <nl> + String hex = String ( hash [ i ] , HEX ) ; <nl> + if ( hex . length ( ) < 2 ) { <nl> + hex = " 0 " + hex ; <nl> + } <nl> + hashStr + = hex ; <nl> + } <nl> + <nl> + return hashStr ; <nl> + } <nl> + <nl> + String sha1 ( char * data , uint32_t size ) { <nl> + return sha1 ( ( uint8_t * ) data , size ) ; <nl> + } <nl> + <nl> + String sha1 ( const uint8_t * data , uint32_t size ) { <nl> + return sha1 ( ( uint8_t * ) data , size ) ; <nl> + } <nl> + <nl> + String sha1 ( const char * data , uint32_t size ) { <nl> + return sha1 ( ( uint8_t * ) data , size ) ; <nl> + } <nl> + <nl> + String sha1 ( String data ) { <nl> + return sha1 ( data . c_str ( ) , data . length ( ) ) ; <nl> + } <nl> + <nl> mmm a / libraries / Hash / src / Hash . h <nl> ppp b / libraries / Hash / src / Hash . 
h <nl> <nl> / / # define DEBUG_SHA1 <nl> <nl> void sha1 ( uint8_t * data , uint32_t size , uint8_t hash [ 20 ] ) ; <nl> + void sha1 ( char * data , uint32_t size , uint8_t hash [ 20 ] ) ; <nl> void sha1 ( const uint8_t * data , uint32_t size , uint8_t hash [ 20 ] ) ; <nl> + void sha1 ( const char * data , uint32_t size , uint8_t hash [ 20 ] ) ; <nl> + void sha1 ( String data , uint8_t hash [ 20 ] ) ; <nl> + <nl> + String sha1 ( uint8_t * data , uint32_t size ) ; <nl> + String sha1 ( char * data , uint32_t size ) ; <nl> + String sha1 ( uint8_t * data , uint32_t size ) ; <nl> + String sha1 ( const char * data , uint32_t size ) ; <nl> + String sha1 ( String data ) ; <nl> <nl> # endif / * HASH_H_ * / <nl>
add more functions to handle sha1 in different cases
esp8266/Arduino
4b786d0df69581cfd83eb86554cf57b3067308d9
2015-05-20T17:41:40Z
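A standalone sketch of the convenience layer the commit above adds to the Hash library: formatting a raw 20-byte SHA-1 digest as a lowercase, zero-padded hex string. `std::string` stands in for the Arduino String class, and the digest is assumed to come from the library's existing sha1(data, size, hash) routine.

```cpp
#include <cstdint>
#include <cstdio>
#include <string>

// Render a 20-byte SHA-1 digest as 40 lowercase hex characters.
std::string toHexDigest(const uint8_t hash[20]) {
  std::string out;
  char buf[3];
  for (int i = 0; i < 20; ++i) {
    std::snprintf(buf, sizeof(buf), "%02x", hash[i]);  // zero-padded, lowercase
    out += buf;
  }
  return out;
}
```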
mmm a / src / compiler / js - native - context - specialization . cc <nl> ppp b / src / compiler / js - native - context - specialization . cc <nl> Node * JSNativeContextSpecialization : : BuildExtendPropertiesBackingStore ( <nl> <nl> bool JSNativeContextSpecialization : : CanTreatHoleAsUndefined ( <nl> MapHandles const & receiver_maps ) { <nl> - / / Check if the array prototype chain is intact . <nl> - if ( ! isolate ( ) - > IsFastArrayConstructorPrototypeChainIntact ( ) ) return false ; <nl> - <nl> - / / Make sure both the initial Array and Object prototypes are stable . <nl> - Handle < JSObject > initial_array_prototype ( <nl> - native_context ( ) - > initial_array_prototype ( ) , isolate ( ) ) ; <nl> - Handle < JSObject > initial_object_prototype ( <nl> - native_context ( ) - > initial_object_prototype ( ) , isolate ( ) ) ; <nl> - if ( ! initial_array_prototype - > map ( ) - > is_stable ( ) | | <nl> - ! initial_object_prototype - > map ( ) - > is_stable ( ) ) { <nl> - return false ; <nl> - } <nl> - <nl> - / / Check if all { receiver_maps } either have the initial Array . prototype <nl> - / / or the initial Object . prototype as their prototype , as those are <nl> - / / guarded by the array protector cell . <nl> - for ( Handle < Map > map : receiver_maps ) { <nl> - if ( map - > prototype ( ) ! = * initial_array_prototype & & <nl> - map - > prototype ( ) ! = * initial_object_prototype ) { <nl> + / / Check if all { receiver_maps } either have one of the initial Array . prototype <nl> + / / or Object . prototype objects as their prototype ( in any of the current <nl> + / / native contexts , as the global Array protector works isolate - wide ) . <nl> + for ( Handle < Map > receiver_map : receiver_maps ) { <nl> + DisallowHeapAllocation no_gc ; <nl> + Object * const receiver_prototype = receiver_map - > prototype ( ) ; <nl> + if ( ! isolate ( ) - > IsInAnyContext ( receiver_prototype , <nl> + Context : : INITIAL_ARRAY_PROTOTYPE_INDEX ) & & <nl> + ! isolate ( ) - > IsInAnyContext ( receiver_prototype , <nl> + Context : : INITIAL_OBJECT_PROTOTYPE_INDEX ) ) { <nl> return false ; <nl> } <nl> } <nl> <nl> - / / Install code dependencies on the prototype maps . <nl> - for ( Handle < Map > map : receiver_maps ) { <nl> - dependencies ( ) - > AssumePrototypeMapsStable ( map , initial_object_prototype ) ; <nl> - } <nl> + / / Check if the array prototype chain is intact . <nl> + if ( ! isolate ( ) - > IsFastArrayConstructorPrototypeChainIntact ( ) ) return false ; <nl> <nl> / / Install code dependency on the array protector cell . <nl> dependencies ( ) - > AssumePropertyCell ( factory ( ) - > array_protector ( ) ) ; <nl> new file mode 100644 <nl> index 00000000000 . . 74b702b228c <nl> mmm / dev / null <nl> ppp b / test / mjsunit / regress / regress - 6607 - 1 . js <nl> <nl> + / / Copyright 2017 the V8 project authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE file . <nl> + <nl> + / / Flags : - - allow - natives - syntax - - opt <nl> + <nl> + function get ( a , i ) { <nl> + return a [ i ] ; <nl> + } <nl> + <nl> + get ( [ 1 , , 3 ] , 0 ) ; <nl> + get ( [ 1 , , 3 ] , 2 ) ; <nl> + % OptimizeFunctionOnNextCall ( get ) ; <nl> + get ( [ 1 , , 3 ] , 0 ) ; <nl> + assertOptimized ( get ) ; <nl> + <nl> + / / This unrelated change to the Array . prototype should be fine . <nl> + Array . prototype . unrelated = 1 ; <nl> + assertOptimized ( get ) ; <nl> new file mode 100644 <nl> index 00000000000 . . 
cfb00098459 <nl> mmm / dev / null <nl> ppp b / test / mjsunit / regress / regress - 6607 - 2 . js <nl> <nl> + / / Copyright 2017 the V8 project authors . All rights reserved . <nl> + / / Use of this source code is governed by a BSD - style license that can be <nl> + / / found in the LICENSE file . <nl> + <nl> + / / Flags : - - allow - natives - syntax - - opt <nl> + <nl> + function get ( a , i ) { <nl> + return a [ i ] ; <nl> + } <nl> + <nl> + get ( [ 1 , , 3 ] , 0 ) ; <nl> + get ( [ 1 , , 3 ] , 2 ) ; <nl> + % OptimizeFunctionOnNextCall ( get ) ; <nl> + get ( [ 1 , , 3 ] , 0 ) ; <nl> + assertOptimized ( get ) ; <nl> + <nl> + / / This unrelated change to the Object . prototype should be fine . <nl> + Object . prototype . unrelated = 1 ; <nl> + assertOptimized ( get ) ; <nl>
[turbofan] Fix CanTreatHoleAsUndefined check.
v8/v8
e1e35df329ea9d1e2ae152deb669ecca47638d28
2017-07-18T16:29:29Z
mmm a / DEPS <nl> ppp b / DEPS <nl> vars = { <nl> <nl> deps = { <nl> ' v8 / build ' : <nl> - Var ( ' chromium_url ' ) + ' / chromium / src / build . git ' + ' @ ' + ' 49671d3af749f02346a8fe246c56189e69e3b791 ' , <nl> + Var ( ' chromium_url ' ) + ' / chromium / src / build . git ' + ' @ ' + ' 277ad4304168986653055957a3cf2c647cf10eb9 ' , <nl> ' v8 / third_party / depot_tools ' : <nl> - Var ( ' chromium_url ' ) + ' / chromium / tools / depot_tools . git ' + ' @ ' + ' cb629a482b3d3c13e46a66031ba4c0cc3679d200 ' , <nl> + Var ( ' chromium_url ' ) + ' / chromium / tools / depot_tools . git ' + ' @ ' + ' f170af48e4490633334a300bbcb65d50fab09537 ' , <nl> ' v8 / third_party / icu ' : <nl> - Var ( ' chromium_url ' ) + ' / chromium / deps / icu . git ' + ' @ ' + ' b029971f1fc6b20d06887c47c7afebd5881f31ff ' , <nl> + Var ( ' chromium_url ' ) + ' / chromium / deps / icu . git ' + ' @ ' + ' 42d5027992a0946942839b8821765e1512afbc21 ' , <nl> ' v8 / third_party / instrumented_libraries ' : <nl> - Var ( ' chromium_url ' ) + ' / chromium / src / third_party / instrumented_libraries . git ' + ' @ ' + ' a90cbf3b4216430a437991fb53ede8e048dea454 ' , <nl> + Var ( ' chromium_url ' ) + ' / chromium / src / third_party / instrumented_libraries . git ' + ' @ ' + ' a959e4f0cb643003f2d75d179cede449979e3e77 ' , <nl> ' v8 / buildtools ' : <nl> Var ( ' chromium_url ' ) + ' / chromium / buildtools . git ' + ' @ ' + ' 13a00f110ef910a25763346d6538b60f12845656 ' , <nl> ' v8 / base / trace_event / common ' : <nl> deps = { <nl> ' condition ' : ' checkout_android ' , <nl> } , <nl> ' v8 / third_party / catapult ' : { <nl> - ' url ' : Var ( ' chromium_url ' ) + ' / catapult . git ' + ' @ ' + ' 36a23a7b2851af59ed8734145c92a2bb2eb243f2 ' , <nl> + ' url ' : Var ( ' chromium_url ' ) + ' / catapult . git ' + ' @ ' + ' 5e1c1c293b07ef04a247dd8dff50972d207663a4 ' , <nl> ' condition ' : ' checkout_android ' , <nl> } , <nl> ' v8 / third_party / colorama / src ' : { <nl> mmm a / test / test262 / test262 . status <nl> ppp b / test / test262 / test262 . status <nl> <nl> # Some of these are related to v8 : 4361 in being visible side effects from Intl . <nl> ' intl402 / DateTimeFormat / prototype / resolvedOptions / hourCycle ' : [ FAIL ] , <nl> <nl> + # TODO ( jshin ) : Started failing after ICU 63 . 1 update . <nl> + ' intl402 / Segmenter / prototype / segment / segment - line - following - modes ' : [ FAIL ] , <nl> + <nl> # https : / / bugs . chromium . org / p / v8 / issues / detail ? id = 7833 <nl> ' built - ins / Atomics / wait / cannot - suspend - throws ' : [ SKIP ] , <nl> ' built - ins / Atomics / wait / undefined - index - defaults - to - zero ' : [ SKIP ] , <nl>
Update V8 DEPS .
v8/v8
c11c8b26cd846b1165d153779879ddca82a70e0a
2018-10-31T17:17:21Z
mmm a / x64_dbg_gui / Project / Src / BasicView / AbstractTableView . cpp <nl> ppp b / x64_dbg_gui / Project / Src / BasicView / AbstractTableView . cpp <nl> void AbstractTableView : : mouseMoveEvent ( QMouseEvent * event ) <nl> repaint ( ) ; <nl> } <nl> default : <nl> - break ; <nl> + break ; <nl> } <nl> } <nl> <nl> void AbstractTableView : : vertSliderActionSlot ( int action ) <nl> switch ( action ) <nl> { <nl> case QAbstractSlider : : SliderNoAction : <nl> - break ; <nl> + break ; <nl> case QAbstractSlider : : SliderSingleStepAdd : <nl> wDelta = 1 ; <nl> - break ; <nl> + break ; <nl> case QAbstractSlider : : SliderSingleStepSub : <nl> wDelta = - 1 ; <nl> - break ; <nl> + break ; <nl> case QAbstractSlider : : SliderPageStepAdd : <nl> wDelta = 30 ; <nl> - break ; <nl> + break ; <nl> case QAbstractSlider : : SliderPageStepSub : <nl> wDelta = - 30 ; <nl> - break ; <nl> + break ; <nl> case QAbstractSlider : : SliderToMinimum : <nl> case QAbstractSlider : : SliderToMaximum : <nl> case QAbstractSlider : : SliderMove : <nl> void AbstractTableView : : vertSliderActionSlot ( int action ) <nl> # else <nl> wDelta = wSliderPos - mTableOffset ; <nl> # endif <nl> - break ; <nl> + break ; <nl> default : <nl> - break ; <nl> + break ; <nl> } <nl> <nl> / / Call the hook ( Usefull for disassembly ) <nl> mmm a / x64_dbg_gui / Project / Src / BasicView / XBytesLineEdit . cpp <nl> ppp b / x64_dbg_gui / Project / Src / BasicView / XBytesLineEdit . cpp <nl> <nl> - <nl> # include " XBytesLineEdit . h " <nl> # include < QClipboard > <nl> # include < QApplication > <nl> void XBytesLineEdit : : autoMask ( QString content ) <nl> int remainder = len % 2 ; <nl> int parts = ( len - remainder ) / 2 ; <nl> <nl> - if ( parts ! = mParts ) { <nl> + if ( parts ! = mParts ) <nl> + { <nl> QString m ( " HH " ) ; <nl> int backupPosition = cursorPosition ( ) ; <nl> setInputMask ( m . repeated ( parts + 1 ) ) ; <nl> void XBytesLineEdit : : autoMask ( QString content ) <nl> } <nl> } <nl> <nl> - void XBytesLineEdit : : paste ( ) { <nl> + void XBytesLineEdit : : paste ( ) <nl> + { <nl> <nl> QString rawClipboardText = QApplication : : clipboard ( ) - > text ( ) . replace ( ' ' , " " ) . toUpper ( ) ; <nl> <nl> void XBytesLineEdit : : paste ( ) { <nl> <nl> } <nl> <nl> - void XBytesLineEdit : : copy ( ) { <nl> + void XBytesLineEdit : : copy ( ) <nl> + { <nl> / / copy whole content <nl> QApplication : : clipboard ( ) - > setText ( text ( ) ) ; <nl> / / QApplication : : clipboard ( ) - > setText ( selectedText ( ) . replace ( ' ' , " " ) . toUpper ( ) ) ; <nl> } <nl> <nl> - void XBytesLineEdit : : cut ( ) { <nl> + void XBytesLineEdit : : cut ( ) <nl> + { <nl> / / prevent cutting <nl> } <nl> <nl> QString XBytesLineEdit : : text ( ) <nl> <nl> <nl> <nl> - void XBytesLineEdit : : keyPressEvent ( QKeyEvent * event ) { <nl> - if ( event - > matches ( QKeySequence : : Paste ) ) { <nl> + void XBytesLineEdit : : keyPressEvent ( QKeyEvent * event ) <nl> + { <nl> + if ( event - > matches ( QKeySequence : : Paste ) ) <nl> + { <nl> paste ( ) ; <nl> - } else if ( event - > matches ( QKeySequence : : Copy ) ) { <nl> + } <nl> + else if ( event - > matches ( QKeySequence : : Copy ) ) <nl> + { <nl> copy ( ) ; <nl> - } else if ( event - > matches ( QKeySequence : : Cut ) ) { <nl> + } <nl> + else if ( event - > matches ( QKeySequence : : Cut ) ) <nl> + { <nl> / / prevent cutting <nl> } <nl> - else { <nl> + else <nl> + { <nl> return QLineEdit : : keyPressEvent ( event ) ; <nl> } <nl> } <nl> mmm a / x64_dbg_gui / Project / Src / Gui / CPUDump . 
cpp <nl> ppp b / x64_dbg_gui / Project / Src / Gui / CPUDump . cpp <nl> CPUDump : : CPUDump ( QWidget * parent ) : HexDump ( parent ) <nl> { <nl> case ViewHexAscii : <nl> hexAsciiSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewHexUnicode : <nl> hexUnicodeSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewTextAscii : <nl> textAsciiSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewTextUnicode : <nl> textUnicodeSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewIntegerSignedShort : <nl> integerSignedShortSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewIntegerSignedLong : <nl> integerSignedLongSlot ( ) ; <nl> - break ; <nl> + break ; <nl> # ifdef _WIN64 <nl> case ViewIntegerSignedLongLong : <nl> integerSignedLongLongSlot ( ) ; <nl> - break ; <nl> + break ; <nl> # endif / / _WIN64 <nl> case ViewIntegerUnsignedShort : <nl> integerUnsignedShortSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewIntegerUnsignedLong : <nl> integerUnsignedLongSlot ( ) ; <nl> - break ; <nl> + break ; <nl> # ifdef _WIN64 <nl> case ViewIntegerUnsignedLongLong : <nl> integerUnsignedLongLongSlot ( ) ; <nl> - break ; <nl> + break ; <nl> # endif / / _WIN64 <nl> case ViewIntegerHexShort : <nl> integerHexShortSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewIntegerHexLong : <nl> integerHexLongSlot ( ) ; <nl> - break ; <nl> + break ; <nl> # ifdef _WIN64 <nl> case ViewIntegerHexLongLong : <nl> integerHexLongLongSlot ( ) ; <nl> - break ; <nl> + break ; <nl> # endif / / _WIN64 <nl> case ViewFloatFloat : <nl> floatFloatSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewFloatDouble : <nl> floatDoubleSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewFloatLongDouble : <nl> floatLongDoubleSlot ( ) ; <nl> - break ; <nl> + break ; <nl> case ViewAddress : <nl> addressSlot ( ) ; <nl> - break ; <nl> + break ; <nl> default : <nl> hexAsciiSlot ( ) ; <nl> - break ; <nl> + break ; <nl> } <nl> <nl> connect ( Bridge : : getBridge ( ) , SIGNAL ( dumpAt ( int_t ) ) , this , SLOT ( printDumpAt ( int_t ) ) ) ; <nl> mmm a / x64_dbg_gui / Project / Src / Gui / CloseDialog . h <nl> ppp b / x64_dbg_gui / Project / Src / Gui / CloseDialog . h <nl> <nl> <nl> # include < QDialog > <nl> <nl> - namespace Ui { <nl> + namespace Ui <nl> + { <nl> class CloseDialog ; <nl> } <nl> <nl>
GUI : AStyle
x64dbg/x64dbg
78e90c5ad162784d211fcf8031b7168648ab414c
2014-07-05T01:02:14Z
mmm a / src / wasm / baseline / ppc / liftoff - assembler - ppc . h <nl> ppp b / src / wasm / baseline / ppc / liftoff - assembler - ppc . h <nl> void LiftoffAssembler : : emit_f64x2_neg ( LiftoffRegister dst , <nl> bailout ( kUnsupportedArchitecture , " emit_f64x2neg " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_f64x2_sqrt ( LiftoffRegister dst , <nl> + LiftoffRegister src ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f64x2sqrt " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_f64x2_add ( LiftoffRegister dst , LiftoffRegister lhs , <nl> LiftoffRegister rhs ) { <nl> bailout ( kUnsupportedArchitecture , " emit_f64x2add " ) ; <nl> void LiftoffAssembler : : emit_f64x2_mul ( LiftoffRegister dst , LiftoffRegister lhs , <nl> bailout ( kUnsupportedArchitecture , " emit_f64x2mul " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_f64x2_div ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f64x2div " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_f32x4_splat ( LiftoffRegister dst , <nl> LiftoffRegister src ) { <nl> bailout ( kUnsupportedArchitecture , " emit_f32x4_splat " ) ; <nl> void LiftoffAssembler : : emit_f32x4_neg ( LiftoffRegister dst , <nl> bailout ( kUnsupportedArchitecture , " emit_f32x4neg " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_f32x4_sqrt ( LiftoffRegister dst , <nl> + LiftoffRegister src ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f32x4sqrt " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_f32x4_add ( LiftoffRegister dst , LiftoffRegister lhs , <nl> LiftoffRegister rhs ) { <nl> bailout ( kUnsupportedArchitecture , " emit_f32x4add " ) ; <nl> void LiftoffAssembler : : emit_f32x4_mul ( LiftoffRegister dst , LiftoffRegister lhs , <nl> bailout ( kUnsupportedArchitecture , " emit_f32x4mul " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_f32x4_div ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f32x4div " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_i64x2_splat ( LiftoffRegister dst , <nl> LiftoffRegister src ) { <nl> bailout ( kUnsupportedArchitecture , " emit_i64x2splat " ) ; <nl> void LiftoffAssembler : : emit_i8x16_max_u ( LiftoffRegister dst , <nl> bailout ( kUnsupportedArchitecture , " emit_i8x16_max_u " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_i8x16_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_i8x16_eq " ) ; <nl> + } <nl> + <nl> + void LiftoffAssembler : : emit_i16x8_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_i16x8_eq " ) ; <nl> + } <nl> + <nl> + void LiftoffAssembler : : emit_i32x4_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_i32x4_eq " ) ; <nl> + } <nl> + <nl> + void LiftoffAssembler : : emit_f32x4_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f32x4_eq " ) ; <nl> + } <nl> + <nl> + void LiftoffAssembler : : emit_f64x2_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f64x2_eq " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_i8x16_rounding_average_u ( LiftoffRegister dst , <nl> LiftoffRegister lhs , 
<nl> LiftoffRegister rhs ) { <nl> mmm a / src / wasm / baseline / s390 / liftoff - assembler - s390 . h <nl> ppp b / src / wasm / baseline / s390 / liftoff - assembler - s390 . h <nl> void LiftoffAssembler : : emit_f64x2_neg ( LiftoffRegister dst , <nl> bailout ( kUnsupportedArchitecture , " emit_f64x2neg " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_f64x2_sqrt ( LiftoffRegister dst , <nl> + LiftoffRegister src ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f64x2sqrt " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_f64x2_add ( LiftoffRegister dst , LiftoffRegister lhs , <nl> LiftoffRegister rhs ) { <nl> bailout ( kUnsupportedArchitecture , " emit_f64x2add " ) ; <nl> void LiftoffAssembler : : emit_f64x2_mul ( LiftoffRegister dst , LiftoffRegister lhs , <nl> bailout ( kUnsupportedArchitecture , " emit_f64x2mul " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_f64x2_div ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f64x2div " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_f32x4_splat ( LiftoffRegister dst , <nl> LiftoffRegister src ) { <nl> bailout ( kUnsupportedArchitecture , " emit_f32x4_splat " ) ; <nl> void LiftoffAssembler : : emit_f32x4_neg ( LiftoffRegister dst , <nl> bailout ( kUnsupportedArchitecture , " emit_f32x4neg " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_f32x4_sqrt ( LiftoffRegister dst , <nl> + LiftoffRegister src ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f32x4sqrt " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_f32x4_add ( LiftoffRegister dst , LiftoffRegister lhs , <nl> LiftoffRegister rhs ) { <nl> bailout ( kUnsupportedArchitecture , " emit_f32x4add " ) ; <nl> void LiftoffAssembler : : emit_f32x4_mul ( LiftoffRegister dst , LiftoffRegister lhs , <nl> bailout ( kUnsupportedArchitecture , " emit_f32x4mul " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_f32x4_div ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f32x4div " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_i64x2_splat ( LiftoffRegister dst , <nl> LiftoffRegister src ) { <nl> bailout ( kUnsupportedArchitecture , " emit_i64x2splat " ) ; <nl> void LiftoffAssembler : : emit_i8x16_max_u ( LiftoffRegister dst , <nl> bailout ( kUnsupportedArchitecture , " emit_i8x16_max_u " ) ; <nl> } <nl> <nl> + void LiftoffAssembler : : emit_i8x16_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_i8x16_eq " ) ; <nl> + } <nl> + <nl> + void LiftoffAssembler : : emit_i16x8_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_i16x8_eq " ) ; <nl> + } <nl> + <nl> + void LiftoffAssembler : : emit_i32x4_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_i32x4_eq " ) ; <nl> + } <nl> + <nl> + void LiftoffAssembler : : emit_f32x4_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f32x4_eq " ) ; <nl> + } <nl> + <nl> + void LiftoffAssembler : : emit_f64x2_eq ( LiftoffRegister dst , LiftoffRegister lhs , <nl> + LiftoffRegister rhs ) { <nl> + bailout ( kUnsupportedArchitecture , " emit_f64x2_eq " ) ; <nl> + } <nl> + <nl> void LiftoffAssembler : : emit_i8x16_rounding_average_u ( 
LiftoffRegister dst , <nl> LiftoffRegister lhs , <nl> LiftoffRegister rhs ) { <nl>
PPC / s390 : [ wasm - simd ] [ liftoff ] Implement eq on x64 and ia32
v8/v8
2a96e262187e66b2f92f2ab2ecd94dc4eab47df8
2020-04-15T15:59:39Z
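A minimal sketch of the stub pattern the PPC/s390 change above relies on (illustrative names, not V8's actual LiftoffAssembler API): each newly added SIMD opcode first gets a per-architecture stub that records an "unsupported architecture" bailout, so the baseline compiler falls back to the optimizing tier instead of miscompiling until the real instruction sequence is implemented.

    #include <cstdio>
    #include <string>

    // Placeholder register type standing in for LiftoffRegister.
    struct Reg { int code; };

    class BaselineAssembler {
     public:
      // Remember why baseline compilation gave up; the caller then retries
      // the whole function in the optimizing compiler.
      void bailout(const char* reason) { bailout_reason_ = reason; }
      const std::string& bailout_reason() const { return bailout_reason_; }

      // Stub for an opcode this architecture does not implement yet.
      void emit_f64x2_sqrt(Reg dst, Reg src) {
        (void)dst; (void)src;
        bailout("emit_f64x2sqrt");
      }

     private:
      std::string bailout_reason_;
    };

    int main() {
      BaselineAssembler assm;
      assm.emit_f64x2_sqrt(Reg{0}, Reg{1});
      std::printf("fell back because: %s\n", assm.bailout_reason().c_str());
    }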
mmm a / src / cascadia / TerminalApp / App . cpp <nl> ppp b / src / cascadia / TerminalApp / App . cpp <nl> namespace winrt : : TerminalApp : : implementation <nl> const int focusedTabIndex = _GetFocusedTabIndex ( ) ; <nl> auto focusedTab = _tabs [ focusedTabIndex ] ; <nl> <nl> - const auto canSplit = splitType = = Pane : : SplitState : : Horizontal ? focusedTab - > CanAddHorizontalSplit ( ) : <nl> - focusedTab - > CanAddVerticalSplit ( ) ; <nl> + const auto canSplit = focusedTab - > CanSplitPane ( splitType ) ; <nl> <nl> if ( ! canSplit ) <nl> { <nl> namespace winrt : : TerminalApp : : implementation <nl> / / Hookup our event handlers to the new terminal <nl> _RegisterTerminalEvents ( newControl , focusedTab ) ; <nl> <nl> - return splitType = = Pane : : SplitState : : Horizontal ? focusedTab - > AddHorizontalSplit ( realGuid , newControl ) : <nl> - focusedTab - > AddVerticalSplit ( realGuid , newControl ) ; <nl> + focusedTab - > SplitPane ( splitType , realGuid , newControl ) ; <nl> } <nl> <nl> / / Method Description : <nl> mmm a / src / cascadia / TerminalApp / Pane . cpp <nl> ppp b / src / cascadia / TerminalApp / Pane . cpp <nl> bool Pane : : ResizePane ( const Direction & direction ) <nl> { <nl> return _Resize ( direction ) ; <nl> } <nl> - else <nl> + <nl> + / / If neither of our children were the focused leaf , then recurse into <nl> + / / our children and see if they can handle the resize . <nl> + / / For each child , if it has a focused descendant , try having that child <nl> + / / handle the resize . <nl> + / / If the child wasn ' t able to handle the resize , it ' s possible that <nl> + / / there were no descendants with a separator the correct direction . If <nl> + / / our separator _is_ the correct direction , then we should be the pane <nl> + / / to resize . Otherwise , just return false , as we couldn ' t handle it <nl> + / / either . <nl> + if ( ( ! _firstChild - > _IsLeaf ( ) ) & & _firstChild - > _HasFocusedChild ( ) ) <nl> { <nl> - / / If neither of our children were the focused leaf , then recurse into <nl> - / / our children and see if they can handle the resize . <nl> - / / For each child , if it has a focused descendant , try having that child <nl> - / / handle the resize . <nl> - / / If the child wasn ' t able to handle the resize , it ' s possible that <nl> - / / there were no descendants with a separator the correct direction . If <nl> - / / our separator _is_ the correct direction , then we should be the pane <nl> - / / to resize . Otherwise , just return false , as we couldn ' t handle it <nl> - / / either . <nl> - if ( ( ! _firstChild - > _IsLeaf ( ) ) & & _firstChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - return _firstChild - > ResizePane ( direction ) | | _Resize ( direction ) ; <nl> - } <nl> - else if ( ( ! _secondChild - > _IsLeaf ( ) ) & & _secondChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - return _secondChild - > ResizePane ( direction ) | | _Resize ( direction ) ; <nl> - } <nl> + return _firstChild - > ResizePane ( direction ) | | _Resize ( direction ) ; <nl> } <nl> + <nl> + if ( ( ! 
_secondChild - > _IsLeaf ( ) ) & & _secondChild - > _HasFocusedChild ( ) ) <nl> + { <nl> + return _secondChild - > ResizePane ( direction ) | | _Resize ( direction ) ; <nl> + } <nl> + <nl> return false ; <nl> } <nl> <nl> bool Pane : : NavigateFocus ( const Direction & direction ) <nl> { <nl> return _NavigateFocus ( direction ) ; <nl> } <nl> - else <nl> + <nl> + / / If neither of our children were the focused leaf , then recurse into <nl> + / / our children and see if they can handle the focus move . <nl> + / / For each child , if it has a focused descendant , try having that child <nl> + / / handle the focus move . <nl> + / / If the child wasn ' t able to handle the focus move , it ' s possible that <nl> + / / there were no descendants with a separator the correct direction . If <nl> + / / our separator _is_ the correct direction , then we should be the pane <nl> + / / to move focus into our other child . Otherwise , just return false , as <nl> + / / we couldn ' t handle it either . <nl> + if ( ( ! _firstChild - > _IsLeaf ( ) ) & & _firstChild - > _HasFocusedChild ( ) ) <nl> { <nl> - / / If neither of our children were the focused leaf , then recurse into <nl> - / / our children and see if they can handle the focus move . <nl> - / / For each child , if it has a focused descendant , try having that child <nl> - / / handle the focus move . <nl> - / / If the child wasn ' t able to handle the focus move , it ' s possible that <nl> - / / there were no descendants with a separator the correct direction . If <nl> - / / our separator _is_ the correct direction , then we should be the pane <nl> - / / to move focus into our other child . Otherwise , just return false , as <nl> - / / we couldn ' t handle it either . <nl> - if ( ( ! _firstChild - > _IsLeaf ( ) ) & & _firstChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - return _firstChild - > NavigateFocus ( direction ) | | _NavigateFocus ( direction ) ; <nl> - } <nl> - else if ( ( ! _secondChild - > _IsLeaf ( ) ) & & _secondChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - return _secondChild - > NavigateFocus ( direction ) | | _NavigateFocus ( direction ) ; <nl> - } <nl> + return _firstChild - > NavigateFocus ( direction ) | | _NavigateFocus ( direction ) ; <nl> } <nl> + <nl> + if ( ( ! _secondChild - > _IsLeaf ( ) ) & & _secondChild - > _HasFocusedChild ( ) ) <nl> + { <nl> + return _secondChild - > NavigateFocus ( direction ) | | _NavigateFocus ( direction ) ; <nl> + } <nl> + <nl> return false ; <nl> } <nl> <nl> std : : shared_ptr < Pane > Pane : : GetFocusedPane ( ) <nl> { <nl> return _lastFocused ? shared_from_this ( ) : nullptr ; <nl> } <nl> - else <nl> + <nl> + auto firstFocused = _firstChild - > GetFocusedPane ( ) ; <nl> + if ( firstFocused ! = nullptr ) <nl> { <nl> - auto firstFocused = _firstChild - > GetFocusedPane ( ) ; <nl> - if ( firstFocused ! = nullptr ) <nl> - { <nl> - return firstFocused ; <nl> - } <nl> - return _secondChild - > GetFocusedPane ( ) ; <nl> + return firstFocused ; <nl> } <nl> + return _secondChild - > GetFocusedPane ( ) ; <nl> } <nl> <nl> / / Method Description : <nl> void Pane : : _ApplySplitDefinitions ( ) <nl> } <nl> <nl> / / Method Description : <nl> - / / - Determines whether the pane can be split vertically <nl> + / / - Determines whether the pane can be split <nl> / / Arguments : <nl> / / - splitType : what type of split we want to create . <nl> / / Return Value : <nl> - / / - True if the pane can be split vertically . False otherwise . 
<nl> - bool Pane : : CanSplitVertical ( ) <nl> + / / - True if the pane can be split . False otherwise . <nl> + bool Pane : : CanSplit ( SplitState splitType ) <nl> { <nl> - if ( ! _IsLeaf ( ) ) <nl> + if ( _IsLeaf ( ) ) <nl> { <nl> - if ( _firstChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - return _firstChild - > CanSplitVertical ( ) ; <nl> - } <nl> - else if ( _secondChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - return _secondChild - > CanSplitVertical ( ) ; <nl> - } <nl> - <nl> - return false ; <nl> + return _CanSplit ( splitType ) ; <nl> } <nl> <nl> - return _CanSplit ( SplitState : : Vertical ) ; <nl> - } <nl> - <nl> - / / Method Description : <nl> - / / - Vertically split the focused pane in our tree of panes , and place the given <nl> - / / TermControl into the newly created pane . If we ' re the focused pane , then <nl> - / / we ' ll create two new children , and place them side - by - side in our Grid . <nl> - / / Arguments : <nl> - / / - profile : The profile GUID to associate with the newly created pane . <nl> - / / - control : A TermControl to use in the new pane . <nl> - / / Return Value : <nl> - / / - < none > <nl> - void Pane : : SplitVertical ( const GUID & profile , const TermControl & control ) <nl> - { <nl> - / / If we ' re not the leaf , recurse into our children to split them . <nl> - if ( ! _IsLeaf ( ) ) <nl> + if ( _firstChild - > _HasFocusedChild ( ) ) <nl> { <nl> - if ( _firstChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - _firstChild - > SplitVertical ( profile , control ) ; <nl> - } <nl> - else if ( _secondChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - _secondChild - > SplitVertical ( profile , control ) ; <nl> - } <nl> - <nl> - return ; <nl> + return _firstChild - > CanSplit ( splitType ) ; <nl> } <nl> <nl> - _Split ( SplitState : : Vertical , profile , control ) ; <nl> - } <nl> - <nl> - / / Method Description : <nl> - / / - Determines whether the pane can be split horizontally <nl> - / / Arguments : <nl> - / / - splitType : what type of split we want to create . <nl> - / / Return Value : <nl> - / / - True if the pane can be split horizontally . False otherwise . <nl> - bool Pane : : CanSplitHorizontal ( ) <nl> - { <nl> - if ( ! _IsLeaf ( ) ) <nl> + if ( _secondChild - > _HasFocusedChild ( ) ) <nl> { <nl> - if ( _firstChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - return _firstChild - > CanSplitHorizontal ( ) ; <nl> - } <nl> - else if ( _secondChild - > _HasFocusedChild ( ) ) <nl> - { <nl> - return _secondChild - > CanSplitHorizontal ( ) ; <nl> - } <nl> - <nl> - return false ; <nl> + return _secondChild - > CanSplit ( splitType ) ; <nl> } <nl> <nl> - return _CanSplit ( SplitState : : Horizontal ) ; <nl> + return false ; <nl> } <nl> <nl> / / Method Description : <nl> - / / - Horizontally split the focused pane in our tree of panes , and place the given <nl> + / / - Split the focused pane in our tree of panes , and place the given <nl> / / TermControl into the newly created pane . If we ' re the focused pane , then <nl> / / we ' ll create two new children , and place them side - by - side in our Grid . <nl> / / Arguments : <nl> + / / - splitType : what type of split we want to create . <nl> / / - profile : The profile GUID to associate with the newly created pane . <nl> / / - control : A TermControl to use in the new pane . 
<nl> / / Return Value : <nl> / / - < none > <nl> - void Pane : : SplitHorizontal ( const GUID & profile , const TermControl & control ) <nl> + void Pane : : Split ( SplitState splitType , const GUID & profile , const TermControl & control ) <nl> { <nl> if ( ! _IsLeaf ( ) ) <nl> { <nl> if ( _firstChild - > _HasFocusedChild ( ) ) <nl> { <nl> - _firstChild - > SplitHorizontal ( profile , control ) ; <nl> + _firstChild - > Split ( splitType , profile , control ) ; <nl> } <nl> else if ( _secondChild - > _HasFocusedChild ( ) ) <nl> { <nl> - _secondChild - > SplitHorizontal ( profile , control ) ; <nl> + _secondChild - > Split ( splitType , profile , control ) ; <nl> } <nl> <nl> return ; <nl> } <nl> <nl> - _Split ( SplitState : : Horizontal , profile , control ) ; <nl> + _Split ( splitType , profile , control ) ; <nl> } <nl> <nl> / / Method Description : <nl> void Pane : : SplitHorizontal ( const GUID & profile , const TermControl & control ) <nl> / / - True if the pane can be split . False otherwise . <nl> bool Pane : : _CanSplit ( SplitState splitType ) <nl> { <nl> - const bool changeWidth = _splitState = = SplitState : : Vertical ; <nl> - <nl> const Size actualSize { gsl : : narrow_cast < float > ( _root . ActualWidth ( ) ) , <nl> gsl : : narrow_cast < float > ( _root . ActualHeight ( ) ) } ; <nl> <nl> Size Pane : : _GetMinSize ( ) const <nl> { <nl> return _control . MinimumSize ( ) ; <nl> } <nl> - else <nl> - { <nl> - const auto firstSize = _firstChild - > _GetMinSize ( ) ; <nl> - const auto secondSize = _secondChild - > _GetMinSize ( ) ; <nl> - const auto newWidth = firstSize . Width + secondSize . Width + ( _splitState = = SplitState : : Vertical ? PaneSeparatorSize : 0 ) ; <nl> - const auto newHeight = firstSize . Height + secondSize . Height + ( _splitState = = SplitState : : Horizontal ? PaneSeparatorSize : 0 ) ; <nl> - return { newWidth , newHeight } ; <nl> - } <nl> + <nl> + const auto firstSize = _firstChild - > _GetMinSize ( ) ; <nl> + const auto secondSize = _secondChild - > _GetMinSize ( ) ; <nl> + const auto newWidth = firstSize . Width + secondSize . Width + ( _splitState = = SplitState : : Vertical ? PaneSeparatorSize : 0 ) ; <nl> + const auto newHeight = firstSize . Height + secondSize . Height + ( _splitState = = SplitState : : Horizontal ? PaneSeparatorSize : 0 ) ; <nl> + return { newWidth , newHeight } ; <nl> } <nl> <nl> DEFINE_EVENT ( Pane , Closed , _closedHandlers , ConnectionClosedEventArgs ) ; <nl> mmm a / src / cascadia / TerminalApp / Pane . h <nl> ppp b / src / cascadia / TerminalApp / Pane . h <nl> class Pane : public std : : enable_shared_from_this < Pane > <nl> bool ResizePane ( const winrt : : TerminalApp : : Direction & direction ) ; <nl> bool NavigateFocus ( const winrt : : TerminalApp : : Direction & direction ) ; <nl> <nl> - bool CanSplitHorizontal ( ) ; <nl> - void SplitHorizontal ( const GUID & profile , const winrt : : Microsoft : : Terminal : : TerminalControl : : TermControl & control ) ; <nl> - <nl> - bool CanSplitVertical ( ) ; <nl> - void SplitVertical ( const GUID & profile , const winrt : : Microsoft : : Terminal : : TerminalControl : : TermControl & control ) ; <nl> + bool CanSplit ( SplitState splitType ) ; <nl> + void Split ( SplitState splitType , const GUID & profile , const winrt : : Microsoft : : Terminal : : TerminalControl : : TermControl & control ) ; <nl> <nl> void Close ( ) ; <nl> <nl> mmm a / src / cascadia / TerminalApp / Tab . cpp <nl> ppp b / src / cascadia / TerminalApp / Tab . 
cpp <nl> void Tab : : Scroll ( const int delta ) <nl> } <nl> <nl> / / Method Description : <nl> - / / - Determines whether the focused pane has sufficient space to be split vertically . <nl> - / / Return Value : <nl> - / / - True if the focused pane can be split horizontally . False otherwise . <nl> - bool Tab : : CanAddVerticalSplit ( ) <nl> - { <nl> - return _rootPane - > CanSplitVertical ( ) ; <nl> - } <nl> - <nl> - / / Method Description : <nl> - / / - Vertically split the focused pane in our tree of panes , and place the <nl> - / / given TermControl into the newly created pane . <nl> + / / - Determines whether the focused pane has sufficient space to be split . <nl> / / Arguments : <nl> - / / - profile : The profile GUID to associate with the newly created pane . <nl> - / / - control : A TermControl to use in the new pane . <nl> - / / Return Value : <nl> - / / - < none > <nl> - void Tab : : AddVerticalSplit ( const GUID & profile , TermControl & control ) <nl> - { <nl> - _rootPane - > SplitVertical ( profile , control ) ; <nl> - } <nl> - <nl> - / / Method Description : <nl> - / / - Determines whether the focused pane has sufficient space to be split horizontally . <nl> + / / - splitType : The type of split we want to create . <nl> / / Return Value : <nl> - / / - True if the focused pane can be split horizontally . False otherwise . <nl> - bool Tab : : CanAddHorizontalSplit ( ) <nl> + / / - True if the focused pane can be split . False otherwise . <nl> + bool Tab : : CanSplitPane ( Pane : : SplitState splitType ) <nl> { <nl> - return _rootPane - > CanSplitHorizontal ( ) ; <nl> + return _rootPane - > CanSplit ( splitType ) ; <nl> } <nl> <nl> / / Method Description : <nl> - / / - Horizontally split the focused pane in our tree of panes , and place the <nl> + / / - Split the focused pane in our tree of panes , and place the <nl> / / given TermControl into the newly created pane . <nl> / / Arguments : <nl> + / / - splitType : The type of split we want to create . <nl> / / - profile : The profile GUID to associate with the newly created pane . <nl> / / - control : A TermControl to use in the new pane . <nl> / / Return Value : <nl> / / - < none > <nl> - void Tab : : AddHorizontalSplit ( const GUID & profile , TermControl & control ) <nl> + void Tab : : SplitPane ( Pane : : SplitState splitType , const GUID & profile , TermControl & control ) <nl> { <nl> - _rootPane - > SplitHorizontal ( profile , control ) ; <nl> + _rootPane - > Split ( splitType , profile , control ) ; <nl> } <nl> <nl> / / Method Description : <nl> mmm a / src / cascadia / TerminalApp / Tab . h <nl> ppp b / src / cascadia / TerminalApp / Tab . h <nl> class Tab <nl> void SetFocused ( const bool focused ) ; <nl> <nl> void Scroll ( const int delta ) ; <nl> - bool CanAddVerticalSplit ( ) ; <nl> - void AddVerticalSplit ( const GUID & profile , winrt : : Microsoft : : Terminal : : TerminalControl : : TermControl & control ) ; <nl> - bool CanAddHorizontalSplit ( ) ; <nl> - void AddHorizontalSplit ( const GUID & profile , winrt : : Microsoft : : Terminal : : TerminalControl : : TermControl & control ) ; <nl> + <nl> + bool CanSplitPane ( Pane : : SplitState splitType ) ; <nl> + void SplitPane ( Pane : : SplitState splitType , const GUID & profile , winrt : : Microsoft : : Terminal : : TerminalControl : : TermControl & control ) ; <nl> <nl> void UpdateFocus ( ) ; <nl> void UpdateIcon ( const winrt : : hstring iconPath ) ; <nl>
Clean up Pane ( )
microsoft/terminal
f4294b17d7f6eaf60ea72d75e1004a10633571ea
2019-08-28T14:40:16Z
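The Pane cleanup above folds the duplicated CanSplitVertical/CanSplitHorizontal and SplitVertical/SplitHorizontal pairs into single methods parameterized by the split direction. A minimal sketch of that shape, using hypothetical stand-in types rather than the Terminal code itself:

    #include <cstdio>
    #include <memory>

    enum class SplitState { Vertical, Horizontal };

    struct Pane {
      bool isLeaf = true;
      bool focused = false;                 // last-focused leaf
      std::shared_ptr<Pane> first, second;  // children when not a leaf

      bool HasFocusedChild() const {
        if (isLeaf) return focused;
        return first->HasFocusedChild() || second->HasFocusedChild();
      }

      // One method instead of CanSplitVertical()/CanSplitHorizontal():
      // recurse toward the focused leaf, then check that leaf.
      bool CanSplit(SplitState splitType) const {
        if (isLeaf) return CanSplitLeaf(splitType);
        if (first->HasFocusedChild()) return first->CanSplit(splitType);
        if (second->HasFocusedChild()) return second->CanSplit(splitType);
        return false;
      }

      bool CanSplitLeaf(SplitState) const { return true; }  // size check elided
    };

    int main() {
      auto root = std::make_shared<Pane>();
      root->focused = true;
      std::printf("can split: %d\n", root->CanSplit(SplitState::Horizontal));
    }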
mmm a / src / core / hle / service / am / applets / software_keyboard . cpp <nl> ppp b / src / core / hle / service / am / applets / software_keyboard . cpp <nl> void SoftwareKeyboard : : ExecuteInteractive ( ) { <nl> <nl> switch ( request ) { <nl> case Request : : Calc : { <nl> - broker . PushNormalDataFromApplet ( <nl> - std : : make_shared < IStorage > ( std : : move ( std : : vector < u8 > { 1 } ) ) ) ; <nl> + broker . PushNormalDataFromApplet ( std : : make_shared < IStorage > ( std : : vector < u8 > { 1 } ) ) ; <nl> broker . SignalStateChanged ( ) ; <nl> break ; <nl> } <nl>
Merge pull request from lioncash / pessimizing2
yuzu-emu/yuzu
e6f9231ef031933066feef4eb4de25f26fdc2b9f
2020-08-15T06:13:44Z
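The yuzu change above drops a std::move wrapped around a freshly constructed temporary. A small, self-contained illustration of the general pattern (generic types, not the yuzu sources): std::move on a prvalue is at best redundant, and when the temporary would otherwise initialize a by-value parameter directly, it forces an extra move construction instead of in-place construction (guaranteed elision since C++17).

    #include <cstdint>
    #include <utility>
    #include <vector>

    using u8 = std::uint8_t;

    // Stand-in for IStorage: takes its buffer by value and moves it into place.
    struct Storage {
      explicit Storage(std::vector<u8> data) : bytes(std::move(data)) {}
      std::vector<u8> bytes;
    };

    int main() {
      // Pessimizing: std::move turns the prvalue into an xvalue, so `data` is
      // move-constructed from a materialized temporary.
      Storage a{std::move(std::vector<u8>{1})};

      // Preferred (the shape of the fix): pass the temporary directly and let
      // it become `data` with no intermediate move.
      Storage b{std::vector<u8>{1}};

      return static_cast<int>(a.bytes.size() - b.bytes.size());
    }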
mmm a / src / clustering / administration / main / serve . cc <nl> ppp b / src / clustering / administration / main / serve . cc <nl> try { <nl> } else { <nl> rassert ( ports . port = = connectivity_cluster_run . get_port ( ) ) ; <nl> } <nl> - printf ( " Listening for intracluster traffic on port % d . . . \ n " , ports . port ) ; <nl> + logSTDOUT ( " Listening for intracluster traffic on port % d . . . \ n " , ports . port ) ; <nl> <nl> auto_reconnector_t auto_reconnector ( <nl> & connectivity_cluster , <nl> try { <nl> / / TODO : Pardon me what , but is this how we fail here ? <nl> guarantee ( ports . http_port < 65536 ) ; <nl> <nl> - printf ( " Starting up administrative HTTP server on port % d . . . \ n " , ports . http_port ) ; <nl> + logSTDOUT ( " Starting up administrative HTTP server on port % d . . . \ n " , ports . http_port ) ; <nl> administrative_http_server_manager_t administrative_http_interface ( <nl> ports . http_port , <nl> & mailbox_manager , <nl> try { <nl> machine_id , <nl> web_assets ) ; <nl> <nl> - printf ( " Server started ; send SIGINT to stop . \ n " ) ; <nl> + logSTDOUT ( " Server started ; send SIGINT to stop . \ n " ) ; <nl> <nl> stop_cond - > wait_lazily_unordered ( ) ; <nl> <nl> - printf ( " Server got SIGINT ; shutting down . . . \ n " ) ; <nl> + logSTDOUT ( " Server got SIGINT ; shutting down . . . \ n " ) ; <nl> } <nl> <nl> cond_t non_interruptor ; <nl> try { <nl> return true ; <nl> <nl> } catch ( address_in_use_exc_t e ) { <nl> - printf ( " % s . Cannot bind to cluster port . Exiting . \ n " , e . what ( ) ) ; <nl> + logSTDOUT ( " % s . Cannot bind to cluster port . Exiting . \ n " , e . what ( ) ) ; <nl> exit ( 1 ) ; <nl> } <nl> <nl>
Modified serve . cc to log output in addition to printing it to stdout .
rethinkdb/rethinkdb
a402d2c620787d3df326f326fbf7b05e6c56c45c
2012-07-25T23:35:02Z
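The RethinkDB change above replaces bare printf calls in serve.cc with logSTDOUT so startup messages reach the server log as well as the console. A rough sketch of what such a helper can look like (illustrative only; the real logSTDOUT lives in RethinkDB's logging layer and differs in detail):

    #include <cstdarg>
    #include <cstdio>

    // Print to stdout for the operator and append the same line to a log file,
    // so messages survive even when stdout is not being captured.
    static void logSTDOUT(const char* fmt, ...) {
      va_list args;

      va_start(args, fmt);
      std::vfprintf(stdout, fmt, args);
      va_end(args);

      va_start(args, fmt);  // a va_list cannot be reused; start it again
      if (FILE* log = std::fopen("rethinkdb.log", "a")) {  // path is illustrative
        std::vfprintf(log, fmt, args);
        std::fclose(log);
      }
      va_end(args);
    }

    int main() {
      logSTDOUT("Listening for intracluster traffic on port %d...\n", 29015);
      logSTDOUT("Server started; send SIGINT to stop.\n");
    }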
mmm a / . jenkins / pytorch / macos - build . sh <nl> ppp b / . jenkins / pytorch / macos - build . sh <nl> if [ [ " $ { JOB_BASE_NAME } " = = * cuda9 . 2 * ] ] ; then <nl> export CUDA_HOME = / Developer / NVIDIA / CUDA - $ { CUDA_VERSION } <nl> export NO_CUDA = 0 <nl> <nl> - # We need to do this for install_name_tool to be found <nl> - export PATH = / Applications / Xcode9 . app / Contents / Developer / Toolchains / XcodeDefault . xctoolchain / usr / bin : $ PATH <nl> + # Eigen gives " explicit specialization of class must precede its first use " error <nl> + # when compiling with Xcode 9 . 1 toolchain , so we have to use Xcode 8 . 2 toolchain instead . <nl> + export DEVELOPER_DIR = / Library / Developer / CommandLineTools <nl> + else <nl> + export DEVELOPER_DIR = / Applications / Xcode9 . app / Contents / Developer <nl> fi <nl> <nl> export MACOSX_DEPLOYMENT_TARGET = 10 . 9 <nl> mmm a / CMakeLists . txt <nl> ppp b / CMakeLists . txt <nl> if ( APPLE ) <nl> set ( CMAKE_FIND_APPBUNDLE LAST ) <nl> endif ( ) <nl> <nl> + # Get clang version on macOS <nl> + if ( APPLE ) <nl> + EXECUTE_PROCESS ( COMMAND $ { CMAKE_CXX_COMPILER } - - version OUTPUT_VARIABLE clang_full_version_string ) <nl> + string ( REGEX REPLACE " Apple LLVM version ( [ 0 - 9 ] + \ \ . [ 0 - 9 ] + ) . * " " \ \ 1 " CLANG_VERSION_STRING $ { clang_full_version_string } ) <nl> + MESSAGE ( STATUS " CLANG_VERSION_STRING : " $ { CLANG_VERSION_STRING } ) <nl> + endif ( ) <nl> + <nl> # mmm [ Options . <nl> # Note to developers : if you add an option below , make sure you also add it to <nl> # cmake / Summary . cmake so that the summary prints out the option values . <nl> if ( NOT MSVC ) <nl> set ( CMAKE_CXX_FLAGS " $ { CMAKE_CXX_FLAGS } - Wno - unused - variable " ) <nl> set ( CMAKE_CXX_FLAGS " $ { CMAKE_CXX_FLAGS } - Wno - unused - function " ) <nl> set ( CMAKE_CXX_FLAGS " $ { CMAKE_CXX_FLAGS } - Wno - unused - result " ) <nl> - if ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 7 . 0 ) <nl> + if ( ( APPLE AND ( NOT ( " $ { CLANG_VERSION_STRING } " VERSION_LESS " 9 . 0 " ) ) ) <nl> + OR ( CMAKE_COMPILER_IS_GNUCXX AND ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 7 . 0 ) ) ) <nl> set ( CMAKE_CXX_FLAGS " $ { CMAKE_CXX_FLAGS } - faligned - new " ) <nl> endif ( ) <nl> if ( $ ENV { WERROR } ) <nl>
Use clang 8 to build CUDA in macOS CI ( )
pytorch/pytorch
1f02ebd323f855d4bea66a0f4726a46d58ae38f5
2018-06-12T02:45:40Z
mmm a / INSTALL . GIT . md <nl> ppp b / INSTALL . GIT . md <nl> <nl> # autotools ( LINUX / UNIX , msys . . . ) <nl> <nl> - <nl> - If you have cloned Tesseract from Github , you must generate <nl> + If you have cloned Tesseract from GitHub , you must generate <nl> the configure script . <nl> <nl> If you have tesseract 3 . 0x installation in your system , please remove it <nl> So , the steps for making Tesseract are : <nl> <nl> You need to install at least English language data file to TESSDATA_PREFIX <nl> directory . All language data files can be retrieved from git repository : <nl> + <nl> $ git clone https : / / github . com / tesseract - ocr / tessdata . git tesseract - ocr . tessdata <nl> + <nl> ( Repository it huge - more that 1 . 2Gb . You do not need to download <nl> all languages ) <nl> <nl> To compile ScrollView . jar you need to download piccolo2d - core - 3 . 0 . jar <nl> and [ piccolo2d - extras - 3 . 0 . jar ] ( http : / / search . maven . org / # search | ga | 1 | g % 3A % 22org . piccolo2d % 22 ) and place them to tesseract / java . <nl> <nl> - Than run : <nl> + Then run : <nl> + <nl> $ make ScrollView . jar <nl> <nl> and follow instruction on [ Viewer Debugging wiki ] ( https : / / github . com / tesseract - ocr / tesseract / wiki / ViewerDebugging ) . <nl>
Small improvements for documentation
tesseract-ocr/tesseract
7847860a1ee23d8671ec49236a27e74494b9d6ef
2015-12-04T09:50:16Z
mmm a / doc / tutorials / dask . rst <nl> ppp b / doc / tutorials / dask . rst <nl> easy management of distributed workers and excels handling large distributed dat <nl> workflows . The implementation in XGBoost originates from ` dask - xgboost <nl> < https : / / github . com / dask / dask - xgboost > ` _ with some extended functionalities and a <nl> different interface . Right now it is still under construction and may change ( with proper <nl> - warnings ) in the future . <nl> + warnings ) in the future . The tutorial here focus on basic usage of dask with CPU tree <nl> + algorithm . For an overview of GPU based training and internal working , see ` A New , <nl> + Official Dask API for XGBoost <nl> + < https : / / medium . com / rapids - ai / a - new - official - dask - api - for - xgboost - e8b10f3d1eb7 > ` _ . <nl> <nl> * * * * * * * * * * * * <nl> Requirements <nl>
[ DOC ] Mention dask blog post in doc . [ skip ci ] ( )
dmlc/xgboost
529b5c2cfddac859bc4266678bd99079aca4e9ff
2020-06-14T05:00:19Z
mmm a / src / mongo / SConscript <nl> ppp b / src / mongo / SConscript <nl> serverOnlyFiles = [ " db / curop . cpp " , <nl> " db / repl / rs . cpp " , <nl> " db / repl / consensus . cpp " , <nl> " db / repl / rs_initiate . cpp " , <nl> + " db / repl / repl_coordinator_legacy . cpp " , <nl> " db / repl / repl_set_health_poll_task . cpp " , <nl> " db / repl / repl_set_impl . cpp " , <nl> " db / repl / replset_commands . cpp " , <nl> env . Library ( " serveronly " , serverOnlyFiles , <nl> " db / exec / working_set " , <nl> " db / exec / exec " , <nl> " db / query / query " , <nl> + " db / repl / repl_coordinator_global " , <nl> ' db / storage / extent ' , <nl> ' db / structure / record_store ' , <nl> ' db / structure / record_store_v1 ' , <nl> env . Library ( " alltools " , <nl> " signal_handlers " , <nl> " $ BUILD_DIR / mongo / util / options_parser / options_parser " , <nl> " $ BUILD_DIR / mongo / db / auth / authmocks " , <nl> - " $ BUILD_DIR / mongo / db / auth / authmongod " ] ) <nl> + " $ BUILD_DIR / mongo / db / auth / authmongod " , <nl> + " $ BUILD_DIR / mongo / db / repl / repl_coordinator_global " , <nl> + " $ BUILD_DIR / mongo / db / repl / replmocks " ] ) <nl> <nl> normalTools = [ " dump " , " restore " , " export " , " import " , " stat " , " top " , " oplog " ] <nl> env . Alias ( " tools " , [ " # / $ { PROGPREFIX } mongo " + name + " $ { PROGSUFFIX } " for name in normalTools ] ) <nl> test = testEnv . Install ( <nl> " mocklib " , <nl> " db / exec / mock_stage " , <nl> " $ BUILD_DIR / mongo / db / auth / authmocks " , <nl> - " $ BUILD_DIR / mongo / db / query / query " ] ) ) <nl> + " $ BUILD_DIR / mongo / db / query / query " , <nl> + " $ BUILD_DIR / mongo / db / repl / repl_coordinator_global " , <nl> + " $ BUILD_DIR / mongo / db / repl / replmocks " ] ) ) <nl> <nl> if len ( testEnv . subst ( ' $ PROGSUFFIX ' ) ) : <nl> testEnv . Alias ( " dbtest " , " # / $ { PROGPREFIX } dbtest $ { PROGSUFFIX } " ) <nl> mmm a / src / mongo / db / db . cpp <nl> ppp b / src / mongo / db / db . cpp <nl> <nl> # include " mongo / db / query / internal_plans . h " <nl> # include " mongo / db / range_deleter_service . h " <nl> # include " mongo / db / repair_database . h " <nl> + # include " mongo / db / repl / repl_coordinator_global . h " <nl> + # include " mongo / db / repl / repl_coordinator_legacy . h " <nl> # include " mongo / db / repl / repl_settings . h " <nl> # include " mongo / db / repl / repl_start . h " <nl> # include " mongo / db / repl / rs . h " <nl> MONGO_INITIALIZER ( SetGlobalConfigExperiment ) ( InitializerContext * context ) { <nl> return Status : : OK ( ) ; <nl> } <nl> <nl> + MONGO_INITIALIZER_GENERAL ( CreateReplicationManager , <nl> + MONGO_NO_PREREQUISITES , <nl> + MONGO_NO_DEPENDENTS ) ( InitializerContext * context ) { <nl> + repl : : setGlobalReplicationCoordinator ( new repl : : LegacyReplicationCoordinator ( ) ) ; <nl> + return Status : : OK ( ) ; <nl> + } <nl> + <nl> # ifdef MONGO_SSL <nl> MONGO_INITIALIZER_GENERAL ( setSSLManagerType , <nl> MONGO_NO_PREREQUISITES , <nl> mmm a / src / mongo / db / repl / SConscript <nl> ppp b / src / mongo / db / repl / SConscript <nl> env . Library ( ' replication_executor ' , <nl> env . CppUnitTest ( ' replication_executor_test ' , <nl> ' replication_executor_test . cpp ' , <nl> LIBDEPS = [ ' replication_executor ' ] ) <nl> + <nl> + env . Library ( ' repl_coordinator_interface ' , <nl> + ' repl_coordinator . cpp ' ) <nl> + <nl> + env . Library ( ' repl_coordinator_global ' , <nl> + ' repl_coordinator_global . 
cpp ' , <nl> + LIBDEPS = [ ' repl_coordinator_interface ' ] ) <nl> + <nl> + env . Library ( ' replmocks ' , <nl> + ' repl_coordinator_mock . cpp ' , <nl> + LIBDEPS = [ ' repl_coordinator_interface ' ] ) <nl> new file mode 100644 <nl> index 000000000000 . . 0987f4999762 <nl> mmm / dev / null <nl> ppp b / src / mongo / db / repl / repl_coordinator . cpp <nl> <nl> + / * * <nl> + * Copyright ( C ) 2014 MongoDB Inc . <nl> + * <nl> + * This program is free software : you can redistribute it and / or modify <nl> + * it under the terms of the GNU Affero General Public License , version 3 , <nl> + * as published by the Free Software Foundation . <nl> + * <nl> + * This program is distributed in the hope that it will be useful , <nl> + * but WITHOUT ANY WARRANTY ; without even the implied warranty of <nl> + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the <nl> + * GNU Affero General Public License for more details . <nl> + * <nl> + * You should have received a copy of the GNU Affero General Public License <nl> + * along with this program . If not , see < http : / / www . gnu . org / licenses / > . <nl> + * <nl> + * As a special exception , the copyright holders give permission to link the <nl> + * code of portions of this program with the OpenSSL library under certain <nl> + * conditions as described in each individual source file and distribute <nl> + * linked combinations including the program with the OpenSSL library . You <nl> + * must comply with the GNU Affero General Public License in all respects for <nl> + * all of the code used other than as permitted herein . If you modify file ( s ) <nl> + * with this exception , you may extend this exception to your version of the <nl> + * file ( s ) , but you are not obligated to do so . If you do not wish to do so , <nl> + * delete this exception statement from your version . If you delete this <nl> + * exception statement from all source files in the program , then also delete <nl> + * it in the license file . <nl> + * / <nl> + <nl> + # include " mongo / platform / basic . h " <nl> + <nl> + # include " mongo / db / repl / repl_coordinator . h " <nl> + <nl> + namespace mongo { <nl> + namespace repl { <nl> + <nl> + ReplicationCoordinator : : ReplicationCoordinator ( ) { } <nl> + ReplicationCoordinator : : ~ ReplicationCoordinator ( ) { } <nl> + <nl> + } / / namespace repl <nl> + } / / namespace mongo <nl> new file mode 100644 <nl> index 000000000000 . . 3616be4bf2ef <nl> mmm / dev / null <nl> ppp b / src / mongo / db / repl / repl_coordinator . h <nl> <nl> + / * * <nl> + * Copyright ( C ) 2014 MongoDB Inc . <nl> + * <nl> + * This program is free software : you can redistribute it and / or modify <nl> + * it under the terms of the GNU Affero General Public License , version 3 , <nl> + * as published by the Free Software Foundation . <nl> + * <nl> + * This program is distributed in the hope that it will be useful , <nl> + * but WITHOUT ANY WARRANTY ; without even the implied warranty of <nl> + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the <nl> + * GNU Affero General Public License for more details . <nl> + * <nl> + * You should have received a copy of the GNU Affero General Public License <nl> + * along with this program . If not , see < http : / / www . gnu . org / licenses / > . 
<nl> + * <nl> + * As a special exception , the copyright holders give permission to link the <nl> + * code of portions of this program with the OpenSSL library under certain <nl> + * conditions as described in each individual source file and distribute <nl> + * linked combinations including the program with the OpenSSL library . You <nl> + * must comply with the GNU Affero General Public License in all respects for <nl> + * all of the code used other than as permitted herein . If you modify file ( s ) <nl> + * with this exception , you may extend this exception to your version of the <nl> + * file ( s ) , but you are not obligated to do so . If you do not wish to do so , <nl> + * delete this exception statement from your version . If you delete this <nl> + * exception statement from all source files in the program , then also delete <nl> + * it in the license file . <nl> + * / <nl> + <nl> + # pragma once <nl> + <nl> + # include < boost / date_time / posix_time / posix_time_types . hpp > <nl> + <nl> + # include " mongo / base / disallow_copying . h " <nl> + # include " mongo / base / status . h " <nl> + <nl> + namespace mongo { <nl> + <nl> + struct HostAndPort ; <nl> + class IndexDescriptor ; <nl> + class NamespaceString ; <nl> + class OpTime ; <nl> + struct WriteConcernOptions ; <nl> + <nl> + namespace repl { <nl> + <nl> + struct MemberState ; <nl> + <nl> + / * * <nl> + * The ReplicationCoordinator is responsible for coordinating the interaction of replication <nl> + * with the rest of the system . The public methods on ReplicationCoordinator are the public <nl> + * API that the replication subsystem presents to the rest of the codebase . <nl> + * / <nl> + class ReplicationCoordinator { <nl> + MONGO_DISALLOW_COPYING ( ReplicationCoordinator ) ; <nl> + <nl> + public : <nl> + <nl> + typedef boost : : posix_time : : milliseconds Milliseconds ; <nl> + <nl> + virtual ~ ReplicationCoordinator ( ) ; <nl> + <nl> + / * * <nl> + * Does any initial bookkeeping needed to start replication , and instructs the other <nl> + * components of the replication system to start up whatever threads and do whatever <nl> + * initialization they need . <nl> + * / <nl> + virtual void startReplication ( ) = 0 ; <nl> + <nl> + / * * <nl> + * Does whatever cleanup is required to stop replication , including instructing the other <nl> + * components of the replication system to shut down and stop any threads they are using , <nl> + * blocking until all replication - related shutdown tasks are complete . <nl> + * / <nl> + virtual void shutdown ( ) = 0 ; <nl> + <nl> + / * * <nl> + * Returns true if it is safe to shut down the server now . Currently the only time this <nl> + * can be false is if this node is primary and there are no secondaries within 10 seconds <nl> + * of our optime . <nl> + * / <nl> + virtual bool isShutdownOkay ( ) const = 0 ; <nl> + <nl> + / * * <nl> + * Returns true if this node is configured to be a member of a replica set or master / slave <nl> + * setup . <nl> + * / <nl> + virtual bool isReplEnabled ( ) const = 0 ; <nl> + <nl> + / * * <nl> + * Returns the current replica set state of this node ( PRIMARY , SECONDARY , STARTUP , etc ) . <nl> + * / <nl> + virtual const MemberState & getCurrentMemberState ( ) const = 0 ; <nl> + <nl> + / * * <nl> + * Blocks the calling thread for up to " timeout " millis , or until " ts " has been replicated <nl> + * to at least a set of nodes that satisfies the writeConcern , whichever comes first . 
Will <nl> + * return a Status with ErrorCodes : : ExceededTimeLimit if the timeout is reached before the <nl> + * data has been sufficiently replicated . <nl> + * / <nl> + virtual Status awaitReplication ( const OpTime & ts , <nl> + const WriteConcernOptions & writeConcern , <nl> + Milliseconds timeout ) = 0 ; <nl> + <nl> + / * * <nl> + * TODO a way to trigger an action on replication of a given operation <nl> + * / <nl> + / / handle_t onReplication ( OpTime ts , writeConcern , callbackFunction ) ; / / TODO <nl> + <nl> + / * * <nl> + * Returns true if it is valid for this node to accept writes on the given collection . <nl> + * Currently this is true only if this node is Primary , master in master / slave , <nl> + * a standalone , or is writing to the local database . <nl> + * / <nl> + virtual bool canAcceptWritesFor ( const NamespaceString & collection ) = 0 ; <nl> + <nl> + / * * <nl> + * Returns true if it is valid for this node to serve reads on the given collection . <nl> + * / <nl> + virtual bool canServeReadsFor ( const NamespaceString & collection ) = 0 ; <nl> + <nl> + / * * <nl> + * Returns true if this node should ignore unique index constraints on new documents . <nl> + * Currently this is needed for nodes in STARTUP2 , RECOVERING , and ROLLBACK states . <nl> + * / <nl> + virtual bool shouldIgnoreUniqueIndex ( const IndexDescriptor * idx ) = 0 ; <nl> + <nl> + / * * <nl> + * Updates our internal tracking of the last OpTime applied for the given member of the <nl> + * set identified by " member " . Also updates all bookkeeping related to tracking what the <nl> + * last OpTime applied by all tag groups that " member " is a part of . This is called when <nl> + * secondaries notify the member they are syncing from of their progress in replication . <nl> + * This information is used by awaitReplication to satisfy write concerns . It is * not * <nl> + * used in elections , we maintain a separate view of member optimes in the topology <nl> + * coordinator based on incoming heartbeat messages , which is used in elections . <nl> + * / <nl> + virtual Status setLastOptime ( const HostAndPort & member , const OpTime & ts ) = 0 ; <nl> + <nl> + protected : <nl> + <nl> + ReplicationCoordinator ( ) ; <nl> + <nl> + } ; <nl> + <nl> + } / / namespace repl <nl> + } / / namespace mongo <nl> new file mode 100644 <nl> index 000000000000 . . 862fc387ab9e <nl> mmm / dev / null <nl> ppp b / src / mongo / db / repl / repl_coordinator_global . cpp <nl> <nl> + / * * <nl> + * Copyright ( C ) 2014 MongoDB Inc . <nl> + * <nl> + * This program is free software : you can redistribute it and / or modify <nl> + * it under the terms of the GNU Affero General Public License , version 3 , <nl> + * as published by the Free Software Foundation . <nl> + * <nl> + * This program is distributed in the hope that it will be useful , <nl> + * but WITHOUT ANY WARRANTY ; without even the implied warranty of <nl> + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the <nl> + * GNU Affero General Public License for more details . <nl> + * <nl> + * You should have received a copy of the GNU Affero General Public License <nl> + * along with this program . If not , see < http : / / www . gnu . org / licenses / > . 
<nl> + * <nl> + * As a special exception , the copyright holders give permission to link the <nl> + * code of portions of this program with the OpenSSL library under certain <nl> + * conditions as described in each individual source file and distribute <nl> + * linked combinations including the program with the OpenSSL library . You <nl> + * must comply with the GNU Affero General Public License in all respects for <nl> + * all of the code used other than as permitted herein . If you modify file ( s ) <nl> + * with this exception , you may extend this exception to your version of the <nl> + * file ( s ) , but you are not obligated to do so . If you do not wish to do so , <nl> + * delete this exception statement from your version . If you delete this <nl> + * exception statement from all source files in the program , then also delete <nl> + * it in the license file . <nl> + * / <nl> + <nl> + # include " mongo / platform / basic . h " <nl> + <nl> + # include " mongo / db / repl / repl_coordinator_global . h " <nl> + <nl> + namespace mongo { <nl> + namespace repl { <nl> + <nl> + namespace { <nl> + ReplicationCoordinator * coordinator = NULL ; <nl> + } / / namespace <nl> + <nl> + ReplicationCoordinator * getGlobalReplicationCoordinator ( ) { <nl> + return coordinator ; <nl> + } <nl> + <nl> + void setGlobalReplicationCoordinator ( ReplicationCoordinator * newCoordinator ) { <nl> + coordinator = newCoordinator ; <nl> + } <nl> + <nl> + } / / namespace repl <nl> + } / / namespace mongo <nl> new file mode 100644 <nl> index 000000000000 . . 66398079edc3 <nl> mmm / dev / null <nl> ppp b / src / mongo / db / repl / repl_coordinator_global . h <nl> <nl> + / * * <nl> + * Copyright ( C ) 2014 MongoDB Inc . <nl> + * <nl> + * This program is free software : you can redistribute it and / or modify <nl> + * it under the terms of the GNU Affero General Public License , version 3 , <nl> + * as published by the Free Software Foundation . <nl> + * <nl> + * This program is distributed in the hope that it will be useful , <nl> + * but WITHOUT ANY WARRANTY ; without even the implied warranty of <nl> + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the <nl> + * GNU Affero General Public License for more details . <nl> + * <nl> + * You should have received a copy of the GNU Affero General Public License <nl> + * along with this program . If not , see < http : / / www . gnu . org / licenses / > . <nl> + * <nl> + * As a special exception , the copyright holders give permission to link the <nl> + * code of portions of this program with the OpenSSL library under certain <nl> + * conditions as described in each individual source file and distribute <nl> + * linked combinations including the program with the OpenSSL library . You <nl> + * must comply with the GNU Affero General Public License in all respects for <nl> + * all of the code used other than as permitted herein . If you modify file ( s ) <nl> + * with this exception , you may extend this exception to your version of the <nl> + * file ( s ) , but you are not obligated to do so . If you do not wish to do so , <nl> + * delete this exception statement from your version . If you delete this <nl> + * exception statement from all source files in the program , then also delete <nl> + * it in the license file . <nl> + * / <nl> + <nl> + # pragma once <nl> + <nl> + # include " mongo / db / repl / repl_coordinator . 
h " <nl> + <nl> + namespace mongo { <nl> + namespace repl { <nl> + <nl> + ReplicationCoordinator * getGlobalReplicationCoordinator ( ) ; <nl> + void setGlobalReplicationCoordinator ( ReplicationCoordinator * coordinator ) ; <nl> + <nl> + } / / namespace repl <nl> + } / / namespace mongo <nl> new file mode 100644 <nl> index 000000000000 . . ea7ac2a053a7 <nl> mmm / dev / null <nl> ppp b / src / mongo / db / repl / repl_coordinator_legacy . cpp <nl> <nl> + / * * <nl> + * Copyright ( C ) 2014 MongoDB Inc . <nl> + * <nl> + * This program is free software : you can redistribute it and / or modify <nl> + * it under the terms of the GNU Affero General Public License , version 3 , <nl> + * as published by the Free Software Foundation . <nl> + * <nl> + * This program is distributed in the hope that it will be useful , <nl> + * but WITHOUT ANY WARRANTY ; without even the implied warranty of <nl> + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the <nl> + * GNU Affero General Public License for more details . <nl> + * <nl> + * You should have received a copy of the GNU Affero General Public License <nl> + * along with this program . If not , see < http : / / www . gnu . org / licenses / > . <nl> + * <nl> + * As a special exception , the copyright holders give permission to link the <nl> + * code of portions of this program with the OpenSSL library under certain <nl> + * conditions as described in each individual source file and distribute <nl> + * linked combinations including the program with the OpenSSL library . You <nl> + * must comply with the GNU Affero General Public License in all respects for <nl> + * all of the code used other than as permitted herein . If you modify file ( s ) <nl> + * with this exception , you may extend this exception to your version of the <nl> + * file ( s ) , but you are not obligated to do so . If you do not wish to do so , <nl> + * delete this exception statement from your version . If you delete this <nl> + * exception statement from all source files in the program , then also delete <nl> + * it in the license file . <nl> + * / <nl> + <nl> + # include " mongo / platform / basic . h " <nl> + <nl> + # include " mongo / db / repl / repl_coordinator_legacy . h " <nl> + <nl> + # include " mongo / base / status . h " <nl> + # include " mongo / util / assert_util . 
h " / / TODO : remove along with invariant from getCurrentMemberState <nl> + <nl> + namespace mongo { <nl> + namespace repl { <nl> + <nl> + LegacyReplicationCoordinator : : LegacyReplicationCoordinator ( ) { } <nl> + LegacyReplicationCoordinator : : ~ LegacyReplicationCoordinator ( ) { } <nl> + <nl> + void LegacyReplicationCoordinator : : startReplication ( ) { <nl> + / / TODO <nl> + } <nl> + <nl> + void LegacyReplicationCoordinator : : shutdown ( ) { <nl> + / / TODO <nl> + } <nl> + <nl> + bool LegacyReplicationCoordinator : : isShutdownOkay ( ) const { <nl> + / / TODO <nl> + return false ; <nl> + } <nl> + <nl> + bool LegacyReplicationCoordinator : : isReplEnabled ( ) const { <nl> + / / TODO <nl> + return false ; <nl> + } <nl> + <nl> + const MemberState & LegacyReplicationCoordinator : : getCurrentMemberState ( ) const { <nl> + / / TODO <nl> + invariant ( false ) ; <nl> + } <nl> + <nl> + Status LegacyReplicationCoordinator : : awaitReplication ( const OpTime & ts , <nl> + const WriteConcernOptions & writeConcern , <nl> + Milliseconds timeout ) { <nl> + / / TODO <nl> + return Status : : OK ( ) ; <nl> + } <nl> + <nl> + bool LegacyReplicationCoordinator : : canAcceptWritesFor ( const NamespaceString & collection ) { <nl> + / / TODO <nl> + return false ; <nl> + } <nl> + <nl> + bool LegacyReplicationCoordinator : : canServeReadsFor ( const NamespaceString & collection ) { <nl> + / / TODO <nl> + return false ; <nl> + } <nl> + <nl> + bool LegacyReplicationCoordinator : : shouldIgnoreUniqueIndex ( const IndexDescriptor * idx ) { <nl> + / / TODO <nl> + return false ; <nl> + } <nl> + <nl> + Status LegacyReplicationCoordinator : : setLastOptime ( const HostAndPort & member , <nl> + const OpTime & ts ) { <nl> + / / TODO <nl> + return Status : : OK ( ) ; <nl> + } <nl> + <nl> + } / / namespace repl <nl> + } / / namespace mongo <nl> new file mode 100644 <nl> index 000000000000 . . 6b1ccb3b76a4 <nl> mmm / dev / null <nl> ppp b / src / mongo / db / repl / repl_coordinator_legacy . h <nl> <nl> + / * * <nl> + * Copyright ( C ) 2014 MongoDB Inc . <nl> + * <nl> + * This program is free software : you can redistribute it and / or modify <nl> + * it under the terms of the GNU Affero General Public License , version 3 , <nl> + * as published by the Free Software Foundation . <nl> + * <nl> + * This program is distributed in the hope that it will be useful , <nl> + * but WITHOUT ANY WARRANTY ; without even the implied warranty of <nl> + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the <nl> + * GNU Affero General Public License for more details . <nl> + * <nl> + * You should have received a copy of the GNU Affero General Public License <nl> + * along with this program . If not , see < http : / / www . gnu . org / licenses / > . <nl> + * <nl> + * As a special exception , the copyright holders give permission to link the <nl> + * code of portions of this program with the OpenSSL library under certain <nl> + * conditions as described in each individual source file and distribute <nl> + * linked combinations including the program with the OpenSSL library . You <nl> + * must comply with the GNU Affero General Public License in all respects for <nl> + * all of the code used other than as permitted herein . If you modify file ( s ) <nl> + * with this exception , you may extend this exception to your version of the <nl> + * file ( s ) , but you are not obligated to do so . If you do not wish to do so , <nl> + * delete this exception statement from your version . 
If you delete this <nl> + * exception statement from all source files in the program , then also delete <nl> + * it in the license file . <nl> + * / <nl> + <nl> + # pragma once <nl> + <nl> + # include " mongo / base / status . h " <nl> + # include " mongo / db / repl / repl_coordinator . h " <nl> + <nl> + namespace mongo { <nl> + namespace repl { <nl> + <nl> + / * * <nl> + * An implementation of ReplicationCoordinator that simply delegates to existing code . <nl> + * / <nl> + class LegacyReplicationCoordinator : public ReplicationCoordinator { <nl> + MONGO_DISALLOW_COPYING ( LegacyReplicationCoordinator ) ; <nl> + <nl> + public : <nl> + <nl> + LegacyReplicationCoordinator ( ) ; <nl> + virtual ~ LegacyReplicationCoordinator ( ) ; <nl> + <nl> + virtual void startReplication ( ) ; <nl> + <nl> + virtual void shutdown ( ) ; <nl> + <nl> + virtual bool isShutdownOkay ( ) const ; <nl> + <nl> + virtual bool isReplEnabled ( ) const ; <nl> + <nl> + virtual const MemberState & getCurrentMemberState ( ) const ; <nl> + <nl> + virtual Status awaitReplication ( const OpTime & ts , <nl> + const WriteConcernOptions & writeConcern , <nl> + Milliseconds timeout ) ; <nl> + <nl> + <nl> + virtual bool canAcceptWritesFor ( const NamespaceString & collection ) ; <nl> + <nl> + virtual bool canServeReadsFor ( const NamespaceString & collection ) ; <nl> + <nl> + virtual bool shouldIgnoreUniqueIndex ( const IndexDescriptor * idx ) ; <nl> + <nl> + virtual Status setLastOptime ( const HostAndPort & member , const OpTime & ts ) ; <nl> + } ; <nl> + <nl> + } / / namespace repl <nl> + } / / namespace mongo <nl> new file mode 100644 <nl> index 000000000000 . . 33f53be96728 <nl> mmm / dev / null <nl> ppp b / src / mongo / db / repl / repl_coordinator_mock . cpp <nl> <nl> + / * * <nl> + * Copyright ( C ) 2014 MongoDB Inc . <nl> + * <nl> + * This program is free software : you can redistribute it and / or modify <nl> + * it under the terms of the GNU Affero General Public License , version 3 , <nl> + * as published by the Free Software Foundation . <nl> + * <nl> + * This program is distributed in the hope that it will be useful , <nl> + * but WITHOUT ANY WARRANTY ; without even the implied warranty of <nl> + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the <nl> + * GNU Affero General Public License for more details . <nl> + * <nl> + * You should have received a copy of the GNU Affero General Public License <nl> + * along with this program . If not , see < http : / / www . gnu . org / licenses / > . <nl> + * <nl> + * As a special exception , the copyright holders give permission to link the <nl> + * code of portions of this program with the OpenSSL library under certain <nl> + * conditions as described in each individual source file and distribute <nl> + * linked combinations including the program with the OpenSSL library . You <nl> + * must comply with the GNU Affero General Public License in all respects for <nl> + * all of the code used other than as permitted herein . If you modify file ( s ) <nl> + * with this exception , you may extend this exception to your version of the <nl> + * file ( s ) , but you are not obligated to do so . If you do not wish to do so , <nl> + * delete this exception statement from your version . If you delete this <nl> + * exception statement from all source files in the program , then also delete <nl> + * it in the license file . <nl> + * / <nl> + <nl> + # include " mongo / platform / basic . h " <nl> + <nl> + # include " mongo / db / repl / repl_coordinator_mock . 
h " <nl> + <nl> + # include " mongo / base / status . h " <nl> + # include " mongo / util / assert_util . h " <nl> + <nl> + namespace mongo { <nl> + namespace repl { <nl> + <nl> + ReplicationCoordinatorMock : : ReplicationCoordinatorMock ( ) { } <nl> + ReplicationCoordinatorMock : : ~ ReplicationCoordinatorMock ( ) { } <nl> + <nl> + void ReplicationCoordinatorMock : : startReplication ( ) { <nl> + / / TODO <nl> + } <nl> + <nl> + void ReplicationCoordinatorMock : : shutdown ( ) { <nl> + / / TODO <nl> + } <nl> + <nl> + bool ReplicationCoordinatorMock : : isShutdownOkay ( ) const { <nl> + / / TODO <nl> + return false ; <nl> + } <nl> + <nl> + bool ReplicationCoordinatorMock : : isReplEnabled ( ) const { <nl> + return false ; <nl> + } <nl> + <nl> + const MemberState & ReplicationCoordinatorMock : : getCurrentMemberState ( ) const { <nl> + / / TODO <nl> + invariant ( false ) ; <nl> + } <nl> + <nl> + Status ReplicationCoordinatorMock : : awaitReplication ( const OpTime & ts , <nl> + const WriteConcernOptions & writeConcern , <nl> + Milliseconds timeout ) { <nl> + / / TODO <nl> + return Status : : OK ( ) ; <nl> + } <nl> + <nl> + bool ReplicationCoordinatorMock : : canAcceptWritesFor ( const NamespaceString & collection ) { <nl> + / / TODO <nl> + return false ; <nl> + } <nl> + <nl> + bool ReplicationCoordinatorMock : : canServeReadsFor ( const NamespaceString & collection ) { <nl> + / / TODO <nl> + return false ; <nl> + } <nl> + <nl> + bool ReplicationCoordinatorMock : : shouldIgnoreUniqueIndex ( const IndexDescriptor * idx ) { <nl> + / / TODO <nl> + return false ; <nl> + } <nl> + <nl> + Status ReplicationCoordinatorMock : : setLastOptime ( const HostAndPort & member , <nl> + const OpTime & ts ) { <nl> + / / TODO <nl> + return Status : : OK ( ) ; <nl> + } <nl> + <nl> + } / / namespace repl <nl> + } / / namespace mongo <nl> new file mode 100644 <nl> index 000000000000 . . af60cf422126 <nl> mmm / dev / null <nl> ppp b / src / mongo / db / repl / repl_coordinator_mock . h <nl> <nl> + / * * <nl> + * Copyright ( C ) 2014 MongoDB Inc . <nl> + * <nl> + * This program is free software : you can redistribute it and / or modify <nl> + * it under the terms of the GNU Affero General Public License , version 3 , <nl> + * as published by the Free Software Foundation . <nl> + * <nl> + * This program is distributed in the hope that it will be useful , <nl> + * but WITHOUT ANY WARRANTY ; without even the implied warranty of <nl> + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the <nl> + * GNU Affero General Public License for more details . <nl> + * <nl> + * You should have received a copy of the GNU Affero General Public License <nl> + * along with this program . If not , see < http : / / www . gnu . org / licenses / > . <nl> + * <nl> + * As a special exception , the copyright holders give permission to link the <nl> + * code of portions of this program with the OpenSSL library under certain <nl> + * conditions as described in each individual source file and distribute <nl> + * linked combinations including the program with the OpenSSL library . You <nl> + * must comply with the GNU Affero General Public License in all respects for <nl> + * all of the code used other than as permitted herein . If you modify file ( s ) <nl> + * with this exception , you may extend this exception to your version of the <nl> + * file ( s ) , but you are not obligated to do so . If you do not wish to do so , <nl> + * delete this exception statement from your version . 
If you delete this <nl> + * exception statement from all source files in the program , then also delete <nl> + * it in the license file . <nl> + * / <nl> + <nl> + # pragma once <nl> + <nl> + # include " mongo / base / status . h " <nl> + # include " mongo / db / repl / repl_coordinator . h " <nl> + <nl> + namespace mongo { <nl> + namespace repl { <nl> + <nl> + / * * <nl> + * A mock ReplicationCoordinator . Currently it is extremely simple and exists solely to link <nl> + * into dbtests . <nl> + * / <nl> + class ReplicationCoordinatorMock : public ReplicationCoordinator { <nl> + MONGO_DISALLOW_COPYING ( ReplicationCoordinatorMock ) ; <nl> + <nl> + public : <nl> + <nl> + ReplicationCoordinatorMock ( ) ; <nl> + virtual ~ ReplicationCoordinatorMock ( ) ; <nl> + <nl> + virtual void startReplication ( ) ; <nl> + <nl> + virtual void shutdown ( ) ; <nl> + <nl> + virtual bool isShutdownOkay ( ) const ; <nl> + <nl> + virtual bool isReplEnabled ( ) const ; <nl> + <nl> + virtual const MemberState & getCurrentMemberState ( ) const ; <nl> + <nl> + virtual Status awaitReplication ( const OpTime & ts , <nl> + const WriteConcernOptions & writeConcern , <nl> + Milliseconds timeout ) ; <nl> + <nl> + <nl> + virtual bool canAcceptWritesFor ( const NamespaceString & collection ) ; <nl> + <nl> + virtual bool canServeReadsFor ( const NamespaceString & collection ) ; <nl> + <nl> + virtual bool shouldIgnoreUniqueIndex ( const IndexDescriptor * idx ) ; <nl> + <nl> + virtual Status setLastOptime ( const HostAndPort & member , const OpTime & ts ) ; <nl> + } ; <nl> + <nl> + } / / namespace repl <nl> + } / / namespace mongo <nl> mmm a / src / mongo / db / repl / repl_start . cpp <nl> ppp b / src / mongo / db / repl / repl_start . cpp <nl> <nl> <nl> # include " mongo / db / repl / master_slave . h " <nl> # include " mongo / db / repl / oplog . h " <nl> + # include " mongo / db / repl / repl_coordinator_global . h " <nl> # include " mongo / db / repl / repl_settings . h " <nl> # include " mongo / db / repl / rs . h " <nl> # include " mongo / stdx / functional . h " <nl> namespace repl { <nl> replSet = true ; <nl> ReplSetCmdline * replSetCmdline = new ReplSetCmdline ( replSettings . replSet ) ; <nl> boost : : thread t ( stdx : : bind ( & startReplSets , replSetCmdline ) ) ; <nl> - <nl> - return ; <nl> + } else { <nl> + startMasterSlave ( ) ; <nl> } <nl> <nl> - startMasterSlave ( ) ; <nl> + getGlobalReplicationCoordinator ( ) - > startReplication ( ) ; <nl> } <nl> <nl> } / / namespace repl <nl> mmm a / src / mongo / dbtests / dbtests . cpp <nl> ppp b / src / mongo / dbtests / dbtests . cpp <nl> <nl> # include " mongo / db / auth / authorization_manager . h " <nl> # include " mongo / db / auth / authorization_manager_global . h " <nl> # include " mongo / db / auth / authz_manager_external_state_mock . h " <nl> + # include " mongo / db / repl / repl_coordinator_global . h " <nl> + # include " mongo / db / repl / repl_coordinator_mock . h " <nl> # include " mongo / dbtests / dbtests . h " <nl> # include " mongo / dbtests / framework . h " <nl> # include " mongo / util / exception_filter_win32 . 
h " <nl> int dbtestsMain ( int argc , char * * argv , char * * envp ) { <nl> static StaticObserver StaticObserver ; <nl> setWindowsUnhandledExceptionFilter ( ) ; <nl> setGlobalAuthorizationManager ( new AuthorizationManager ( new AuthzManagerExternalStateMock ( ) ) ) ; <nl> + repl : : setGlobalReplicationCoordinator ( new repl : : ReplicationCoordinatorMock ( ) ) ; <nl> Command : : testCommandsEnabled = 1 ; <nl> mongo : : runGlobalInitializersOrDie ( argc , argv , envp ) ; <nl> StartupTest : : runTests ( ) ; <nl> mmm a / src / mongo / tools / restore . cpp <nl> ppp b / src / mongo / tools / restore . cpp <nl> namespace { <nl> const char * OPLOG_SENTINEL = " $ oplog " ; / / compare by ptr not strcmp <nl> } <nl> <nl> - MONGO_INITIALIZER_WITH_PREREQUISITES ( RestoreAuthExternalState , ( " ToolAuthExternalState " ) ) ( <nl> + MONGO_INITIALIZER_WITH_PREREQUISITES ( RestoreAuthExternalState , ( " ToolMocks " ) ) ( <nl> InitializerContext * context ) { <nl> / / Give restore the mongod implementation of AuthorizationManager so that it can run <nl> / / the _mergeAuthzCollections command directly against the data files <nl> mmm a / src / mongo / tools / tool . cpp <nl> ppp b / src / mongo / tools / tool . cpp <nl> <nl> # include " mongo / db / auth / authz_manager_external_state_mock . h " <nl> # include " mongo / db / client . h " <nl> # include " mongo / db / json . h " <nl> + # include " mongo / db / repl / repl_coordinator_global . h " <nl> + # include " mongo / db / repl / repl_coordinator_mock . h " <nl> # include " mongo / db / storage_options . h " <nl> # include " mongo / db / storage / mmap_v1 / dur . h " <nl> # include " mongo / platform / posix_fadvise . h " <nl> namespace mongo { <nl> delete _conn ; <nl> } <nl> <nl> - MONGO_INITIALIZER ( ToolAuthExternalState ) ( InitializerContext * ) { <nl> + MONGO_INITIALIZER ( ToolMocks ) ( InitializerContext * ) { <nl> setGlobalAuthorizationManager ( new AuthorizationManager ( <nl> new AuthzManagerExternalStateMock ( ) ) ) ; <nl> + repl : : setGlobalReplicationCoordinator ( new repl : : ReplicationCoordinatorMock ( ) ) ; <nl> return Status : : OK ( ) ; <nl> } <nl> <nl>
SERVER - 14135 Initial skeleton of ReplicationCoordinator interface
mongodb/mongo
a156d9a1bd205367c5b4dd8d12e6246090fafbbd
2014-06-06T15:13:18Z
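The record above introduces a process-wide ReplicationCoordinator pointer with free get/set functions, plus a mock implementation that tests and tools install at startup. Below is a minimal self-contained C++ sketch of that same get/set-global-with-mock pattern; the names (`Coordinator`, `MockCoordinator`, `setGlobalCoordinator`) are illustrative and are not the actual `mongo::repl` API.

```cpp
#include <cassert>
#include <iostream>

// Abstract interface, analogous to ReplicationCoordinator.
class Coordinator {
public:
    virtual ~Coordinator() = default;
    virtual void startReplication() = 0;
    virtual bool isReplEnabled() const = 0;
};

// Stub implementation, analogous to ReplicationCoordinatorMock: it exists so
// code that calls through the global can link and run in tests.
class MockCoordinator : public Coordinator {
public:
    void startReplication() override { /* no-op for tests */ }
    bool isReplEnabled() const override { return false; }
};

namespace {
Coordinator* globalCoordinator = nullptr;  // file-local, like the anonymous namespace in the commit
}

Coordinator* getGlobalCoordinator() { return globalCoordinator; }
void setGlobalCoordinator(Coordinator* c) { globalCoordinator = c; }

int main() {
    static MockCoordinator mock;
    setGlobalCoordinator(&mock);               // test/tool setup installs the mock
    assert(getGlobalCoordinator() != nullptr);
    getGlobalCoordinator()->startReplication();
    std::cout << std::boolalpha << getGlobalCoordinator()->isReplEnabled() << "\n";
}
```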
mmm a / templates / lua - template - runtime / frameworks / runtime - src / Classes / Runtime . cpp <nl> ppp b / templates / lua - template - runtime / frameworks / runtime - src / Classes / Runtime . cpp <nl> THE SOFTWARE . <nl> # include " lua_debugger . h " <nl> # include " CCLuaEngine . h " <nl> # include " cocos2d . h " <nl> - # include " CCFontFNT . h " <nl> # include " json / document . h " <nl> # include " json / filestream . h " <nl> # include " json / stringbuffer . h " <nl> using namespace std ; <nl> using namespace cocos2d ; <nl> <nl> std : : string g_resourcePath ; <nl> - rapidjson : : Document g_filecfgjson ; <nl> + static rapidjson : : Document g_filecfgjson ; <nl> <nl> extern string getIPAddress ( ) ; <nl> const char * getRuntimeVersion ( ) <nl>
add static to variable
cocos2d/cocos2d-x
1fadba7de1f5e14cd57fc53e75b5bb233275353e
2014-04-11T02:41:35Z
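The one-line change above marks a file-scope `rapidjson::Document` as `static`, giving it internal linkage. A short C++ sketch of why that matters is below; the variable names are illustrative, not the cocos2d runtime code.

```cpp
// `static` (or an anonymous namespace) gives a file-scope variable internal
// linkage, so the symbol cannot collide with a same-named global defined in
// another translation unit of the same binary.
static int g_configVersion = 1;   // internal linkage: only this .cpp sees it

namespace {
int g_cachedValue = 41;           // anonymous namespace: same effect, usually preferred in C++
}

// Another .cpp could define its own g_configVersion or g_cachedValue without
// causing a duplicate-symbol / one-definition-rule error at link time.
int readConfigVersion() { return g_configVersion + g_cachedValue; }

int main() { return readConfigVersion() == 42 ? 0 : 1; }
```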
mmm a / src / builtins / builtins . h <nl> ppp b / src / builtins / builtins . h <nl> class Builtins { <nl> return kAllBuiltinsAreIsolateIndependent ; <nl> } <nl> <nl> - / / Wasm runtime stubs are treated specially by wasm . To guarantee reachability <nl> - / / through near jumps , their code is completely copied into a fresh off - heap <nl> - / / area . <nl> static bool IsWasmRuntimeStub ( int index ) ; <nl> <nl> / / Initializes the table of builtin entry points based on the current contents <nl> mmm a / src / heap / heap . cc <nl> ppp b / src / heap / heap . cc <nl> void Heap : : IterateStrongRoots ( RootVisitor * v , VisitMode mode ) { <nl> isolate_ - > IterateDeferredHandles ( v ) ; <nl> v - > Synchronize ( VisitorSynchronization : : kHandleScope ) ; <nl> <nl> - / / Iterate over the builtin code objects and code stubs in the <nl> - / / heap . Note that it is not necessary to iterate over code objects <nl> - / / on scavenge collections . <nl> + / / Iterate over the builtin code objects in the heap . Note that it is not <nl> + / / necessary to iterate over code objects on scavenge collections . <nl> if ( ! isMinorGC ) { <nl> IterateBuiltins ( v ) ; <nl> v - > Synchronize ( VisitorSynchronization : : kBuiltins ) ; <nl> - <nl> - / / The dispatch table is set up directly from the builtins using <nl> - / / IntitializeDispatchTable so there is no need to iterate to create it . <nl> - if ( mode ! = VISIT_FOR_SERIALIZATION ) { <nl> - / / Currently we iterate the dispatch table to update pointers to possibly <nl> - / / moved Code objects for bytecode handlers . <nl> - / / TODO ( v8 : 6666 ) : Remove iteration once builtins are embedded ( and thus <nl> - / / immovable ) in every build configuration . <nl> - isolate_ - > interpreter ( ) - > IterateDispatchTable ( v ) ; <nl> - v - > Synchronize ( VisitorSynchronization : : kDispatchTable ) ; <nl> - } <nl> } <nl> <nl> / / Iterate over global handles . <nl> void Heap : : IterateBuiltins ( RootVisitor * v ) { <nl> FullObjectSlot ( builtin_address ( i ) ) ) ; <nl> } <nl> <nl> - / / The entry table does not need to be updated if all builtins are embedded . <nl> + / / The entry table doesn ' t need to be updated since all builtins are embedded . <nl> STATIC_ASSERT ( Builtins : : AllBuiltinsAreIsolateIndependent ( ) ) ; <nl> } <nl> <nl> mmm a / src / interpreter / interpreter . cc <nl> ppp b / src / interpreter / interpreter . cc <nl> Code Interpreter : : GetBytecodeHandler ( Bytecode bytecode , <nl> <nl> void Interpreter : : SetBytecodeHandler ( Bytecode bytecode , <nl> OperandScale operand_scale , Code handler ) { <nl> + DCHECK ( handler . is_off_heap_trampoline ( ) ) ; <nl> DCHECK ( handler . kind ( ) = = Code : : BYTECODE_HANDLER ) ; <nl> size_t index = GetDispatchTableIndex ( bytecode , operand_scale ) ; <nl> dispatch_table_ [ index ] = handler . InstructionStart ( ) ; <nl> size_t Interpreter : : GetDispatchTableIndex ( Bytecode bytecode , <nl> kEntriesPerOperandScale ; <nl> } <nl> <nl> - void Interpreter : : IterateDispatchTable ( RootVisitor * v ) { <nl> - if ( ! isolate_ - > serializer_enabled ( ) & & isolate_ - > embedded_blob ( ) ! = nullptr ) { <nl> - / / If we ' re not generating a snapshot , then every bytecode handler will be <nl> - / / off - heap , so there ' s no point iterating over them . 
<nl> - # ifdef DEBUG <nl> - for ( int i = 0 ; i < kDispatchTableSize ; i + + ) { <nl> - Address code_entry = dispatch_table_ [ i ] ; <nl> - CHECK ( code_entry = = kNullAddress | | <nl> - InstructionStream : : PcIsOffHeap ( isolate_ , code_entry ) ) ; <nl> - } <nl> - # endif / / DEBUG <nl> - return ; <nl> - } <nl> - <nl> - for ( int i = 0 ; i < kDispatchTableSize ; i + + ) { <nl> - Address code_entry = dispatch_table_ [ i ] ; <nl> - / / Skip over off - heap bytecode handlers since they will never move . <nl> - if ( InstructionStream : : PcIsOffHeap ( isolate_ , code_entry ) ) continue ; <nl> - <nl> - / / TODO ( jkummerow ) : Would it hurt to simply do : <nl> - / / if ( code_entry = = kNullAddress ) continue ; <nl> - Code code ; <nl> - if ( code_entry ! = kNullAddress ) { <nl> - code = Code : : GetCodeFromTargetAddress ( code_entry ) ; <nl> - } <nl> - Code old_code = code ; <nl> - v - > VisitRootPointer ( Root : : kDispatchTable , nullptr , FullObjectSlot ( & code ) ) ; <nl> - if ( code ! = old_code ) { <nl> - dispatch_table_ [ i ] = code . entry ( ) ; <nl> - } <nl> - } <nl> - } <nl> - <nl> int Interpreter : : InterruptBudget ( ) { <nl> return FLAG_interrupt_budget ; <nl> } <nl> mmm a / src / interpreter / interpreter . h <nl> ppp b / src / interpreter / interpreter . h <nl> class Interpreter { <nl> void SetBytecodeHandler ( Bytecode bytecode , OperandScale operand_scale , <nl> Code handler ) ; <nl> <nl> - / / GC support . <nl> - void IterateDispatchTable ( RootVisitor * v ) ; <nl> - <nl> / / Disassembler support . <nl> V8_EXPORT_PRIVATE const char * LookupNameOfBytecodeHandler ( const Code code ) ; <nl> <nl> mmm a / src / objects / visitors . h <nl> ppp b / src / objects / visitors . h <nl> class CodeDataContainer ; <nl> V ( kDebug , " ( Debugger ) " ) \ <nl> V ( kCompilationCache , " ( Compilation cache ) " ) \ <nl> V ( kHandleScope , " ( Handle scope ) " ) \ <nl> - V ( kDispatchTable , " ( Dispatch table ) " ) \ <nl> V ( kBuiltins , " ( Builtins ) " ) \ <nl> V ( kGlobalHandles , " ( Global handles ) " ) \ <nl> V ( kEternalHandles , " ( Eternal handles ) " ) \ <nl>
[ heap ] Do not visit the dispatch table
v8/v8
802a86a4df59bab03ef37a761d39aa310d6efd44
2019-11-06T15:28:53Z
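The record above removes the GC fix-up pass over the interpreter's dispatch table: once every bytecode handler is embedded off-heap (and therefore immovable), the table of raw instruction addresses no longer needs to be visited as roots. The sketch below illustrates that invariant only; the types and the `isOffHeap` predicate are placeholders, not V8's.

```cpp
#include <array>
#include <cassert>
#include <cstdint>

using Address = uintptr_t;
constexpr int kNumBytecodes = 4;

struct DispatchTable {
    std::array<Address, kNumBytecodes> entries{};

    // Placeholder predicate; in the real engine this asks whether the address
    // lies inside the embedded (immovable) code blob.
    static bool isOffHeap(Address a) { return a != 0; }

    void set(int bytecode, Address handlerEntry) {
        // Mirrors the DCHECK added in SetBytecodeHandler: only accept handlers
        // that live outside the movable heap. With that guarantee, a GC pass
        // over `entries` (the removed IterateDispatchTable) has nothing to do.
        assert(isOffHeap(handlerEntry));
        entries[bytecode] = handlerEntry;
    }
};

int main() {
    DispatchTable table;
    table.set(0, 0x1000);
    table.set(1, 0x2000);
    return table.entries[0] == 0x1000 ? 0 : 1;
}
```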
mmm a / include / v8 . h <nl> ppp b / include / v8 . h <nl> <nl> # include < stddef . h > <nl> # include < stdint . h > <nl> # include < stdio . h > <nl> + # include < memory > <nl> # include < utility > <nl> # include < vector > <nl> <nl> class V8_EXPORT Proxy : public Object { <nl> static void CheckCast ( Value * obj ) ; <nl> } ; <nl> <nl> + class V8_EXPORT WasmCompiledModule : public Object { <nl> + public : <nl> + typedef std : : pair < std : : unique_ptr < const uint8_t [ ] > , size_t > SerializedModule ; <nl> + <nl> + SerializedModule Serialize ( ) ; <nl> + static MaybeLocal < WasmCompiledModule > Deserialize ( <nl> + Isolate * isolate , const SerializedModule serialized_data ) ; <nl> + V8_INLINE static WasmCompiledModule * Cast ( Value * obj ) ; <nl> + <nl> + private : <nl> + WasmCompiledModule ( ) ; <nl> + static void CheckCast ( Value * obj ) ; <nl> + } ; <nl> <nl> # ifndef V8_ARRAY_BUFFER_INTERNAL_FIELD_COUNT <nl> / / The number of required internal fields can be defined by embedder . <nl> Proxy * Proxy : : Cast ( v8 : : Value * value ) { <nl> return static_cast < Proxy * > ( value ) ; <nl> } <nl> <nl> + WasmCompiledModule * WasmCompiledModule : : Cast ( v8 : : Value * value ) { <nl> + # ifdef V8_ENABLE_CHECKS <nl> + CheckCast ( value ) ; <nl> + # endif <nl> + return static_cast < WasmCompiledModule * > ( value ) ; <nl> + } <nl> <nl> Promise : : Resolver * Promise : : Resolver : : Cast ( v8 : : Value * value ) { <nl> # ifdef V8_ENABLE_CHECKS <nl> mmm a / src / api . cc <nl> ppp b / src / api . cc <nl> <nl> # include " src / runtime - profiler . h " <nl> # include " src / runtime / runtime . h " <nl> # include " src / simulator . h " <nl> + # include " src / snapshot / code - serializer . h " <nl> # include " src / snapshot / natives . h " <nl> # include " src / snapshot / snapshot . h " <nl> # include " src / startup - data - util . h " <nl> <nl> # include " src / v8threads . h " <nl> # include " src / version . h " <nl> # include " src / vm - state - inl . h " <nl> + # include " src / wasm / wasm - module . h " <nl> <nl> namespace v8 { <nl> <nl> bool Value : : IsNumber ( ) const { <nl> bool Value : : IsProxy ( ) const { return Utils : : OpenHandle ( this ) - > IsJSProxy ( ) ; } <nl> <nl> bool Value : : IsWebAssemblyCompiledModule ( ) const { <nl> - return Utils : : OpenHandle ( this ) - > IsWebAssemblyCompiledModule ( ) ; <nl> + i : : Handle < i : : Object > obj = Utils : : OpenHandle ( this ) ; <nl> + if ( ! 
obj - > IsJSObject ( ) ) return false ; <nl> + i : : Handle < i : : JSObject > js_obj = i : : Handle < i : : JSObject > : : cast ( obj ) ; <nl> + return js_obj - > GetIsolate ( ) - > native_context ( ) - > wasm_module_constructor ( ) = = <nl> + js_obj - > map ( ) - > GetConstructor ( ) ; <nl> } <nl> <nl> # define VALUE_IS_SPECIFIC_TYPE ( Type , Class ) \ <nl> void v8 : : Proxy : : CheckCast ( Value * that ) { <nl> " Could not convert to proxy " ) ; <nl> } <nl> <nl> + void v8 : : WasmCompiledModule : : CheckCast ( Value * that ) { <nl> + Utils : : ApiCheck ( that - > IsWebAssemblyCompiledModule ( ) , <nl> + " v8 : : WasmCompiledModule : : Cast " , <nl> + " Could not convert to wasm compiled module " ) ; <nl> + } <nl> <nl> void v8 : : ArrayBuffer : : CheckCast ( Value * that ) { <nl> i : : Handle < i : : Object > obj = Utils : : OpenHandle ( that ) ; <nl> MaybeLocal < Proxy > Proxy : : New ( Local < Context > context , Local < Object > local_target , <nl> RETURN_ESCAPED ( result ) ; <nl> } <nl> <nl> + WasmCompiledModule : : SerializedModule WasmCompiledModule : : Serialize ( ) { <nl> + i : : Handle < i : : JSObject > obj = <nl> + i : : Handle < i : : JSObject > : : cast ( Utils : : OpenHandle ( this ) ) ; <nl> + i : : Handle < i : : FixedArray > compiled_part = <nl> + i : : handle ( i : : FixedArray : : cast ( obj - > GetInternalField ( 0 ) ) ) ; <nl> + std : : unique_ptr < i : : ScriptData > script_data = <nl> + i : : WasmCompiledModuleSerializer : : SerializeWasmModule ( obj - > GetIsolate ( ) , <nl> + compiled_part ) ; <nl> + size_t size = static_cast < size_t > ( script_data - > length ( ) ) ; <nl> + script_data . release ( ) ; <nl> + return { std : : unique_ptr < const uint8_t [ ] > ( script_data - > data ( ) ) , size } ; <nl> + } <nl> + <nl> + MaybeLocal < WasmCompiledModule > WasmCompiledModule : : Deserialize ( <nl> + Isolate * isolate , <nl> + const WasmCompiledModule : : SerializedModule serialized_data ) { <nl> + int size = static_cast < int > ( serialized_data . second ) ; <nl> + i : : ScriptData sc ( serialized_data . first . get ( ) , size ) ; <nl> + i : : Isolate * i_isolate = reinterpret_cast < i : : Isolate * > ( isolate ) ; <nl> + i : : MaybeHandle < i : : FixedArray > maybe_compiled_part = <nl> + i : : WasmCompiledModuleSerializer : : DeserializeWasmModule ( i_isolate , & sc ) ; <nl> + i : : Handle < i : : FixedArray > compiled_part ; <nl> + if ( ! maybe_compiled_part . ToHandle ( & compiled_part ) ) { <nl> + return MaybeLocal < WasmCompiledModule > ( ) ; <nl> + } <nl> + return Local < WasmCompiledModule > : : Cast ( Utils : : ToLocal ( <nl> + i : : wasm : : CreateCompiledModuleObject ( i_isolate , compiled_part ) ) ) ; <nl> + } <nl> + <nl> / / static <nl> v8 : : ArrayBuffer : : Allocator * v8 : : ArrayBuffer : : Allocator : : NewDefaultAllocator ( ) { <nl> return new ArrayBufferAllocator ( ) ; <nl> mmm a / src / objects - inl . h <nl> ppp b / src / objects - inl . h <nl> bool HeapObject : : IsFixedArrayBase ( ) const { <nl> return IsFixedArray ( ) | | IsFixedDoubleArray ( ) | | IsFixedTypedArrayBase ( ) ; <nl> } <nl> <nl> - bool HeapObject : : IsWebAssemblyCompiledModule ( ) const { <nl> - if ( ! IsJSObject ( ) ) return false ; <nl> - return GetIsolate ( ) - > native_context ( ) - > wasm_module_constructor ( ) = = <nl> - this - > map ( ) - > GetConstructor ( ) ; <nl> - } <nl> - <nl> bool HeapObject : : IsFixedArray ( ) const { <nl> InstanceType instance_type = map ( ) - > instance_type ( ) ; <nl> return instance_type = = FIXED_ARRAY_TYPE | | <nl> mmm a / src / objects . 
cc <nl> ppp b / src / objects . cc <nl> <nl> # include " src / prototype . h " <nl> # include " src / regexp / jsregexp . h " <nl> # include " src / safepoint - table . h " <nl> + # include " src / snapshot / code - serializer . h " <nl> # include " src / source - position - table . h " <nl> # include " src / string - builder . h " <nl> # include " src / string - search . h " <nl> Handle < JSArrayBuffer > JSTypedArray : : GetBuffer ( ) { <nl> return MaterializeArrayBuffer ( self ) ; <nl> } <nl> <nl> - std : : pair < std : : unique_ptr < const byte > , size_t > <nl> - WebAssemblyCompiledModule : : Serialize ( ) { <nl> - / / TODO ( mtrofin ) : tie to the internal serialization API <nl> - return { std : : unique_ptr < const byte > ( ) , 0 } ; <nl> - } <nl> - <nl> - MaybeHandle < WebAssemblyCompiledModule > WebAssemblyCompiledModule : : Deserialize ( <nl> - Isolate * isolate , const byte * data , size_t size ) { <nl> - / / TODO ( mtrofin ) : tie to the internal serialization API <nl> - return MaybeHandle < WebAssemblyCompiledModule > ( ) ; <nl> - } <nl> - <nl> Handle < PropertyCell > PropertyCell : : InvalidateEntry ( <nl> Handle < GlobalDictionary > dictionary , int entry ) { <nl> Isolate * isolate = dictionary - > GetIsolate ( ) ; <nl> mmm a / src / objects . h <nl> ppp b / src / objects . h <nl> template < class C > inline bool Is ( Object * obj ) ; <nl> V ( JSWeakMap ) \ <nl> V ( JSWeakSet ) \ <nl> V ( JSRegExp ) \ <nl> - V ( WebAssemblyCompiledModule ) \ <nl> V ( HashTable ) \ <nl> V ( Dictionary ) \ <nl> V ( UnseededNumberDictionary ) \ <nl> class JSMessageObject : public JSObject { <nl> kSize > BodyDescriptor ; <nl> } ; <nl> <nl> - / / A compiled web assembly module . <nl> - class WebAssemblyCompiledModule : public JSObject { <nl> - public : <nl> - / / Serialize the compiled module . The returned buffer is owned by <nl> - / / the caller , who may simply leave the return value drop out of <nl> - / / scope , once done processing the bytes . <nl> - / / TODO ( mtrofin ) : to avoid increased memory pressure , we should <nl> - / / explore a caller - provided segmented memory design . <nl> - std : : pair < std : : unique_ptr < const byte > , size_t > Serialize ( ) ; <nl> - <nl> - / / Deserialize a compiled module . The buffer is owned by the caller and may <nl> - / / be released after deserialization returns . <nl> - static MaybeHandle < WebAssemblyCompiledModule > Deserialize ( Isolate * isolate , <nl> - const byte * data , <nl> - size_t size ) ; <nl> - <nl> - private : <nl> - DISALLOW_IMPLICIT_CONSTRUCTORS ( WebAssemblyCompiledModule ) ; <nl> - } ; <nl> - <nl> / / Regular expressions <nl> / / The regular expression holds a single reference to a FixedArray in <nl> / / the kDataOffset field . <nl>
[ wasm ] serialization : updated external APIs .
v8/v8
d29bb4bfab5775e1922a367c872b9ef1b5e5a812
2016-08-10T06:35:42Z
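The record above defines `WasmCompiledModule::Serialize()` as returning an owned byte buffer (`std::pair<std::unique_ptr<const uint8_t[]>, size_t>`) and `Deserialize()` as only reading from such a buffer. The sketch below shows that ownership contract on a toy type; `ToyModule` is illustrative and is not the V8 API, but `SerializedModule` has the same shape as in the diff.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <memory>
#include <utility>

using SerializedModule = std::pair<std::unique_ptr<const uint8_t[]>, size_t>;

struct ToyModule {
    uint32_t version = 7;

    // The returned buffer is owned by the caller, matching the commit's
    // documented contract ("owned by the caller ... may simply drop it").
    SerializedModule Serialize() const {
        uint8_t* raw = new uint8_t[sizeof(version)];
        std::memcpy(raw, &version, sizeof(version));
        return {std::unique_ptr<const uint8_t[]>(static_cast<const uint8_t*>(raw)),
                sizeof(version)};
    }

    // Deserialization only reads the caller-owned bytes.
    static bool Deserialize(const SerializedModule& data, ToyModule* out) {
        if (data.second != sizeof(out->version)) return false;  // reject malformed input
        std::memcpy(&out->version, data.first.get(), data.second);
        return true;
    }
};

int main() {
    ToyModule original;
    SerializedModule wire = original.Serialize();

    ToyModule restored;
    assert(ToyModule::Deserialize(wire, &restored));
    return restored.version == original.version ? 0 : 1;
}
```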
mmm a / js / client / modules / @ arangodb / inspector . js <nl> ppp b / js / client / modules / @ arangodb / inspector . js <nl> function getServerData ( arango ) { <nl> agencyConfig = arango . GET ( ' _api / agency / config ' ) ; <nl> } <nl> const status = arango . GET ( ' _admin / status ' ) ; <nl> + const time = require ( ' internal ' ) . time ( ) ; <nl> + <nl> var tmp = executeExternalAndWait ( <nl> + ' / bin / bash ' , [ ' - c ' , ' date - u " + % Y - % m - % d % H : % M : % S % Z " | tee / tmp / inspector - date . out > / dev / null ' ] ) ; <nl> + const date = fs . readFileSync ( ' / tmp / inspector - date . out ' , ' utf8 ' ) . slice ( 0 , - 1 ) ; <nl> + tmp = executeExternalAndWait ( <nl> ' / bin / bash ' , [ ' - c ' , ' dmesg | tee / tmp / inspector - dmesg . out > / dev / null ' ] ) ; <nl> - const dmesg = fs . readFileSync ( ' / tmp / inspector - dmesg . out ' , ' utf8 ' ) ; <nl> + const dmesg = fs . readFileSync ( ' / tmp / inspector - dmesg . out ' , ' utf8 ' ) . slice ( 0 , - 1 ) ; <nl> tmp = executeExternalAndWait ( <nl> ' / bin / bash ' , [ ' - c ' , ' df - h | tee / tmp / inspector - df . out > / dev / null ' ] ) ; <nl> - const df = fs . readFileSync ( ' / tmp / inspector - df . out ' , ' utf8 ' ) ; <nl> + const df = fs . readFileSync ( ' / tmp / inspector - df . out ' , ' utf8 ' ) . slice ( 0 , - 1 ) ; <nl> tmp = executeExternalAndWait ( <nl> ' / bin / bash ' , [ ' - c ' , ' cat / proc / meminfo | tee / tmp / inspector - meminfo . out > / dev / null ' ] ) ; <nl> - const meminfo = fs . readFileSync ( ' / tmp / inspector - meminfo . out ' , ' utf8 ' ) ; <nl> + const meminfo = fs . readFileSync ( ' / tmp / inspector - meminfo . out ' , ' utf8 ' ) . slice ( 0 , - 1 ) ; <nl> tmp = executeExternalAndWait ( <nl> ' / bin / bash ' , [ ' - c ' , ' uptime | tee / tmp / inspector - uptime . out > / dev / null ' ] ) ; <nl> - const uptime = fs . readFileSync ( ' / tmp / inspector - uptime . out ' , ' utf8 ' ) ; <nl> + const uptime = fs . readFileSync ( ' / tmp / inspector - uptime . out ' , ' utf8 ' ) . slice ( 0 , - 1 ) ; <nl> tmp = executeExternalAndWait ( <nl> ' / bin / bash ' , [ ' - c ' , ' uname - a | tee / tmp / inspector - uname . out > / dev / null ' ] ) ; <nl> - const uname = fs . readFileSync ( ' / tmp / inspector - uname . out ' , ' utf8 ' ) ; <nl> + const uname = fs . readFileSync ( ' / tmp / inspector - uname . out ' , ' utf8 ' ) . slice ( 0 , - 1 ) ; <nl> var top ; <nl> if ( status . pid ! = = undefined ) { <nl> tmp = executeExternalAndWait ( <nl> ' / bin / bash ' , [ ' - c ' , ' top - b - H - p ' + status . pid + ' - n 1 | tee / tmp / inspector - top . out > / dev / null ' ] ) ; <nl> - top = fs . readFileSync ( ' / tmp / inspector - top . out ' , ' utf8 ' ) ; <nl> + top = fs . readFileSync ( ' / tmp / inspector - top . out ' , ' utf8 ' ) . slice ( 0 , - 1 ) ; <nl> } <nl> <nl> var local = { } ; <nl> function getServerData ( arango ) { <nl> report [ server ] = { <nl> version : version , log : log , dmesg : dmesg , statistics : statistics , <nl> status : status , df : df , uptime : uptime , uname : uname , meminfo : meminfo , <nl> - local : local } ; <nl> + local : local , date : date , time : time } ; <nl> <nl> if ( agencyConfig ! = = undefined ) { <nl> report [ server ] . config = agencyConfig ; <nl> new file mode 100755 <nl> index 00000000000 . . 6fa851c512b <nl> mmm / dev / null <nl> ppp b / scripts / unpackInspectorReport . sh <nl> <nl> + # ! / bin / bash <nl> + <nl> + filename = arango - inspector . 
json <nl> + outdir = arango - inspector <nl> + <nl> + if [ [ $ # > 1 ] ] ; then <nl> + echo " * * Error * * - usage : unpackInspectorReport [ filename ] " <nl> + exit 1 <nl> + elif [ [ $ # = = 1 ] ] ; then <nl> + filename = $ 1 <nl> + fi <nl> + <nl> + if [ - f $ filename ] ; then <nl> + <nl> + # check json validity <nl> + if jq - e . > / dev / null 2 > & 1 < < < " $ json_string " ; then <nl> + mkdir arango - inspector <nl> + if [ [ $ ? - ne 0 ] ] ; then # target directory exists <nl> + echo " * * Error * * - failed to create directory structure " <nl> + exit 1 <nl> + fi <nl> + <nl> + # dump agency <nl> + echo - n " writing agency dump . . . " <nl> + cat $ filename | jq . agency | tee $ outdir / agency . json > / dev / null <nl> + echo done <nl> + <nl> + # dump <nl> + echo - n " writing agency analysis . . . " <nl> + cat $ filename | jq . analysis | tee $ outdir / agency - analysis . json > / dev / null <nl> + echo done <nl> + <nl> + # servers <nl> + echo " writing servers . . . " <nl> + for i in $ ( cat $ filename | jq . servers | jq ' keys [ ] ' ) ; do <nl> + name = $ ( echo $ i | sed s / \ " / / g ) <nl> + mkdir $ outdir / $ name <nl> + echo - n " writing $ i . . . " <nl> + for j in $ ( cat $ filename | jq . servers [ $ i ] | jq ' keys [ ] ' ) ; do <nl> + what = $ ( echo $ j | sed s / \ " / / g ) <nl> + cat $ filename | jq - r . servers [ $ i ] [ $ j ] | tee $ outdir / $ name / $ what > / dev / null <nl> + done <nl> + echo done <nl> + done <nl> + echo " . . . done " <nl> + <nl> + else # invalid json <nl> + echo " * * Error * * - failed to parse JSON , or got false / null " <nl> + fi <nl> + else <nl> + echo " * * Error * * - file $ filename does not exit " <nl> + fi <nl> + <nl> + echo " The report was unpacked successfully . " <nl>
unpacker for inspector reports ( )
arangodb/arangodb
10888f33bb8ac36f265615220e441b5b1cbe617e
2018-06-07T11:41:55Z
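Part of the record above trims the trailing newline from each captured command output (the JavaScript `.slice(0, -1)`) before embedding it in the JSON report. A tiny C++ equivalent of that cleanup is sketched below, purely for illustration; the sample string is made up.

```cpp
#include <iostream>
#include <string>

// Strip a single trailing newline, as the inspector now does for the output
// of dmesg, df, uptime, uname, etc. before storing it in the report.
std::string chompTrailingNewline(std::string text) {
    if (!text.empty() && text.back() == '\n') text.pop_back();
    return text;
}

int main() {
    std::string uname = "Linux build-host 5.4.0 x86_64\n";  // typical captured output
    std::cout << '"' << chompTrailingNewline(uname) << '"' << "\n";  // no stray newline inside the quotes
}
```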
mmm a / tensorflow / workspace . bzl <nl> ppp b / tensorflow / workspace . bzl <nl> def tf_repositories ( path_prefix = " " , tf_repo_name = " " ) : <nl> ) <nl> <nl> # Check out LLVM and MLIR from llvm - project . <nl> - LLVM_COMMIT = " 0581c0b0eeba03da590d1176a4580cf9b9e8d1e3 " <nl> - LLVM_SHA256 = " 9d93364e8ecd080258a2d2a113383387b91e5f6f2b662b48897cde8c47c178b6 " <nl> + LLVM_COMMIT = " 4225e7fa34febac6da8c9151bd69f998a6a1d7df " <nl> + LLVM_SHA256 = " 8643272edab941b3608a0c9445ffadfd5bd39ee647f0e61d818649591a1638e0 " <nl> LLVM_URLS = [ <nl> " https : / / storage . googleapis . com / mirror . tensorflow . org / github . com / llvm / llvm - project / archive / { commit } . tar . gz " . format ( commit = LLVM_COMMIT ) , <nl> " https : / / github . com / llvm / llvm - project / archive / { commit } . tar . gz " . format ( commit = LLVM_COMMIT ) , <nl> mmm a / third_party / mlir / BUILD <nl> ppp b / third_party / mlir / BUILD <nl> cc_library ( <nl> " : LinalgToStandard " , <nl> " : LinalgTransforms " , <nl> " : NVVMDialect " , <nl> + " : OpenACCDialect " , <nl> " : OpenMPDialect " , <nl> " : QuantOps " , <nl> " : QuantPassIncGen " , <nl> cc_binary ( <nl> ] , <nl> ) <nl> <nl> + # # OpenACC dialect <nl> + <nl> + gentbl ( <nl> + name = " OpenACCOpsIncGen " , <nl> + strip_include_prefix = " include " , <nl> + tbl_outs = [ <nl> + ( <nl> + " - gen - dialect - decls - dialect = acc " , <nl> + " include / mlir / Dialect / OpenACC / OpenACCOpsDialect . h . inc " , <nl> + ) , <nl> + ( <nl> + " - gen - op - decls " , <nl> + " include / mlir / Dialect / OpenACC / OpenACCOps . h . inc " , <nl> + ) , <nl> + ( <nl> + " - gen - op - defs " , <nl> + " include / mlir / Dialect / OpenACC / OpenACCOps . cpp . inc " , <nl> + ) , <nl> + ( <nl> + " - gen - enum - decls " , <nl> + " include / mlir / Dialect / OpenACC / OpenACCOpsEnums . h . inc " , <nl> + ) , <nl> + ( <nl> + " - gen - enum - defs " , <nl> + " include / mlir / Dialect / OpenACC / OpenACCOpsEnums . cpp . inc " , <nl> + ) , <nl> + ( <nl> + " - gen - op - doc " , <nl> + " g3doc / Dialects / OpenACC / OpenACCOps . md " , <nl> + ) , <nl> + ] , <nl> + tblgen = " : mlir - tblgen " , <nl> + td_file = " include / mlir / Dialect / OpenACC / OpenACCOps . td " , <nl> + td_srcs = [ <nl> + " : OpBaseTdFiles " , <nl> + " : OmpCommonTdGen " , <nl> + ] , <nl> + ) <nl> + <nl> + cc_library ( <nl> + name = " OpenACCDialect " , <nl> + srcs = glob ( <nl> + [ <nl> + " lib / Dialect / OpenACC / IR / * . cpp " , <nl> + " lib / Dialect / OpenACC / IR / * . h " , <nl> + ] , <nl> + ) , <nl> + hdrs = glob ( [ <nl> + " include / mlir / Dialect / OpenACC / * . h " , <nl> + ] ) , <nl> + includes = [ " include " ] , <nl> + deps = [ <nl> + " : IR " , <nl> + " : OpenACCOpsIncGen " , <nl> + " : StandardOps " , <nl> + " @ llvm - project / / llvm : Support " , <nl> + ] , <nl> + ) <nl> + <nl> # # OpenMP dialect <nl> gentbl ( <nl> name = " OmpCommonTdGen " , <nl> cc_library ( <nl> ] , <nl> ) <nl> <nl> + # # QuantOps dialect <nl> filegroup ( <nl> name = " QuantizationOpsTdFiles " , <nl> srcs = [ <nl> filegroup ( <nl> ] , <nl> ) <nl> <nl> - # # QuantOps dialect <nl> gentbl ( <nl> name = " QuantOpsIncGen " , <nl> strip_include_prefix = " include " , <nl>
Integrate LLVM at llvm / llvm - project @ 4225e7fa34fe
tensorflow/tensorflow
c257a5d21025e32ac6f954e50c51d8657ee62fb3
2020-08-17T12:51:11Z
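The record above adds Bazel `gentbl()` rules whose `-gen-enum-decls` / `-gen-enum-defs` outputs are C++ declarations and definitions that dialect code includes. The snippet below hand-writes an equivalent declaration/definition pair for a made-up enum, only to illustrate the general shape such generated code takes; it is not the real `OpenACCOpsEnums.h.inc` content and uses invented names.

```cpp
#include <iostream>
#include <optional>
#include <string>

// --- roughly what an "-gen-enum-decls" output declares ---------------------
enum class ReductionOp { Add, Mul, Max };
std::string stringifyReductionOp(ReductionOp op);
std::optional<ReductionOp> symbolizeReductionOp(const std::string& name);

// --- roughly what an "-gen-enum-defs" output defines ------------------------
std::string stringifyReductionOp(ReductionOp op) {
    switch (op) {
        case ReductionOp::Add: return "add";
        case ReductionOp::Mul: return "mul";
        case ReductionOp::Max: return "max";
    }
    return "";
}

std::optional<ReductionOp> symbolizeReductionOp(const std::string& name) {
    if (name == "add") return ReductionOp::Add;
    if (name == "mul") return ReductionOp::Mul;
    if (name == "max") return ReductionOp::Max;
    return std::nullopt;
}

int main() {
    std::cout << stringifyReductionOp(ReductionOp::Mul) << "\n";   // prints "mul"
    std::cout << symbolizeReductionOp("max").has_value() << "\n";  // prints 1
}
```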
mmm a / include / swift / AST / Attr . def <nl> ppp b / include / swift / AST / Attr . def <nl> SIMPLE_DECL_ATTR ( _weakLinked , WeakLinked , <nl> OnNominalType | OnAssociatedType | OnFunc | OnAccessor | OnVar | <nl> OnSubscript | OnConstructor | OnEnumElement | OnExtension | UserInaccessible , <nl> 75 ) <nl> - SIMPLE_DECL_ATTR ( _frozen , Frozen , <nl> - OnEnum | <nl> + SIMPLE_DECL_ATTR ( frozen , Frozen , <nl> + OnEnum | OnStruct | <nl> UserInaccessible , <nl> 76 ) <nl> + DECL_ATTR_ALIAS ( _frozen , Frozen ) <nl> SIMPLE_DECL_ATTR ( _forbidSerializingReference , ForbidSerializingReference , <nl> OnAnyDecl | <nl> LongAttribute | RejectByParser | UserInaccessible | NotSerialized , <nl> mmm a / include / swift / AST / Decl . h <nl> ppp b / include / swift / AST / Decl . h <nl> class VarDecl : public AbstractStorageDecl { <nl> / / / exposed to clients . <nl> / / / There ' s a very narrow case when we would : if the decl is an instance <nl> / / / member with an initializer expression and the parent type is <nl> - / / / @ _fixed_layout and resides in a resilient module . <nl> + / / / @ frozen and resides in a resilient module . <nl> bool isInitExposedToClients ( ) const ; <nl> <nl> / / / Is this a special debugger variable ? <nl> mmm a / include / swift / AST / DiagnosticsSema . def <nl> ppp b / include / swift / AST / DiagnosticsSema . def <nl> WARNING ( pattern_type_not_usable_from_inline_warn , none , <nl> " % select { % select { variable | constant } 0 | property } 1 " <nl> " should be ' @ usableFromInline ' or public " , <nl> ( bool , bool ) ) <nl> - ERROR ( pattern_type_not_usable_from_inline_fixed_layout , none , <nl> - " type referenced from a stored property in a ' @ _fixed_layout ' struct must " <nl> + ERROR ( pattern_type_not_usable_from_inline_frozen , none , <nl> + " type referenced from a stored property in a ' @ frozen ' struct must " <nl> " be ' @ usableFromInline ' or public " , <nl> ( / * ignored * / bool , / * ignored * / bool ) ) <nl> ERROR ( pattern_type_not_usable_from_inline_inferred , none , <nl> WARNING ( pattern_type_not_usable_from_inline_inferred_warn , none , <nl> " with inferred type % 2 " <nl> " should be ' @ usableFromInline ' or public " , <nl> ( bool , bool , Type ) ) <nl> - ERROR ( pattern_type_not_usable_from_inline_inferred_fixed_layout , none , <nl> + ERROR ( pattern_type_not_usable_from_inline_inferred_frozen , none , <nl> " type referenced from a stored property with inferred type % 2 in a " <nl> - " ' @ _fixed_layout ' struct must be ' @ usableFromInline ' or public " , <nl> + " ' @ frozen ' struct must be ' @ usableFromInline ' or public " , <nl> ( / * ignored * / bool , / * ignored * / bool , Type ) ) <nl> <nl> ERROR ( pattern_binds_no_variables , none , <nl> ERROR ( fixed_layout_attr_on_internal_type , <nl> " % select { private | fileprivate | internal | % error | % error } 1 " , <nl> ( DeclName , AccessLevel ) ) <nl> <nl> + WARNING ( fixed_layout_struct , <nl> + none , " ' @ frozen ' attribute is now used for fixed - layout structs " , ( ) ) <nl> + <nl> + ERROR ( frozen_attr_on_internal_type , <nl> + none , " ' @ frozen ' attribute can only be applied to ' @ usableFromInline ' " <nl> + " or public declarations , but % 0 is " <nl> + " % select { private | fileprivate | internal | % error | % error } 1 " , <nl> + ( DeclName , AccessLevel ) ) <nl> + <nl> ERROR ( usable_from_inline_attr_with_explicit_access , <nl> none , " ' @ usableFromInline ' attribute can only be applied to internal " <nl> " declarations , but % 0 is % select { private | fileprivate 
| % error | public | open } 1 " , <nl> ERROR ( usable_from_inline_attr_in_protocol , none , <nl> " an ' @ inlinable ' function | " \ <nl> " an ' @ _alwaysEmitIntoClient ' function | " \ <nl> " a default argument value | " \ <nl> - " a property initializer in a ' @ _fixed_layout ' type } " <nl> + " a property initializer in a ' @ frozen ' type } " <nl> <nl> # define DECL_OR_ACCESSOR " % select { % 0 | % 0 for } " <nl> <nl> mmm a / lib / AST / Decl . cpp <nl> ppp b / lib / AST / Decl . cpp <nl> bool VarDecl : : isInitExposedToClients ( ) const { <nl> if ( ! parent ) return false ; <nl> if ( ! hasInitialValue ( ) ) return false ; <nl> if ( isStatic ( ) ) return false ; <nl> - return parent - > getAttrs ( ) . hasAttribute < FixedLayoutAttr > ( ) ; <nl> + return parent - > getAttrs ( ) . hasAttribute < FrozenAttr > ( ) | | <nl> + parent - > getAttrs ( ) . hasAttribute < FixedLayoutAttr > ( ) ; <nl> } <nl> <nl> / / / Check whether the given type representation will be <nl> bool NominalTypeDecl : : isFormallyResilient ( ) const { <nl> / * treatUsableFromInlineAsPublic = * / true ) . isPublic ( ) ) <nl> return false ; <nl> <nl> - / / Check for an explicit @ _fixed_layout or @ _frozen attribute . <nl> + / / Check for an explicit @ _fixed_layout or @ frozen attribute . <nl> if ( getAttrs ( ) . hasAttribute < FixedLayoutAttr > ( ) | | <nl> getAttrs ( ) . hasAttribute < FrozenAttr > ( ) ) { <nl> return false ; <nl> mmm a / lib / ClangImporter / ImportDecl . cpp <nl> ppp b / lib / ClangImporter / ImportDecl . cpp <nl> namespace { <nl> errorWrapper - > setAddedImplicitInitializers ( ) ; <nl> errorWrapper - > setAccess ( AccessLevel : : Public ) ; <nl> errorWrapper - > getAttrs ( ) . add ( <nl> - new ( Impl . SwiftContext ) FixedLayoutAttr ( / * IsImplicit * / true ) ) ; <nl> + new ( Impl . SwiftContext ) FrozenAttr ( / * IsImplicit * / true ) ) ; <nl> <nl> StringRef nameForMangling ; <nl> ClangImporterSynthesizedTypeAttr : : Kind relatedEntityKind ; <nl> mmm a / lib / IRGen / StructLayout . cpp <nl> ppp b / lib / IRGen / StructLayout . cpp <nl> StructLayout : : StructLayout ( IRGenModule & IGM , <nl> <nl> assert ( typeToFill = = nullptr | | Ty = = typeToFill ) ; <nl> <nl> - / / If the struct is not @ _fixed_layout , it will have a dynamic <nl> + / / If the struct is not @ frozen , it will have a dynamic <nl> / / layout outside of its resilience domain . <nl> if ( decl ) { <nl> if ( IGM . isResilient ( decl , ResilienceExpansion : : Minimal ) ) <nl> mmm a / lib / SIL / SILDeclRef . cpp <nl> ppp b / lib / SIL / SILDeclRef . cpp <nl> SILLinkage SILDeclRef : : getLinkage ( ForDefinition_t forDefinition ) const { <nl> if ( isStoredPropertyInitializer ( ) ) { <nl> / / Three cases : <nl> / / <nl> - / / 1 ) Type is formally @ _fixed_layout . Root initializers can be declared <nl> - / / @ inlinable . The property initializer must only reference <nl> + / / 1 ) Type is formally @ _fixed_layout / @ frozen . Root initializers can be <nl> + / / declared @ inlinable . The property initializer must only reference <nl> / / public symbols , and is serialized , so we give it PublicNonABI linkage . <nl> / / <nl> - / / 2 ) Type is not formally @ _fixed_layout and the module is not resilient . <nl> - / / Root initializers can be declared @ inlinable . This is the annoying <nl> - / / case . We give the initializer public linkage if the type is public . <nl> + / / 2 ) Type is not formally @ _fixed_layout / @ frozen and the module is not <nl> + / / resilient . Root initializers can be declared @ inlinable . 
This is the <nl> + / / annoying case . We give the initializer public linkage if the type is <nl> + / / public . <nl> / / <nl> / / 3 ) Type is resilient . The property initializer is never public because <nl> / / root initializers cannot be @ inlinable . <nl> IsSerialized_t SILDeclRef : : isSerialized ( ) const { <nl> } <nl> <nl> / / Stored property initializers are inlinable if the type is explicitly <nl> - / / marked as @ _fixed_layout . <nl> + / / marked as @ frozen . <nl> if ( isStoredPropertyInitializer ( ) ) { <nl> auto * nominal = cast < NominalTypeDecl > ( d - > getDeclContext ( ) ) ; <nl> auto scope = <nl> mmm a / lib / Sema / TypeCheckAccess . cpp <nl> ppp b / lib / Sema / TypeCheckAccess . cpp <nl> class UsableFromInlineChecker : public AccessControlCheckerBase , <nl> auto * parentStruct = dyn_cast < StructDecl > ( PBD - > getDeclContext ( ) ) ; <nl> if ( ! parentStruct ) <nl> return nullptr ; <nl> - if ( ! parentStruct - > getAttrs ( ) . hasAttribute < FixedLayoutAttr > ( ) | | <nl> + if ( ! ( parentStruct - > getAttrs ( ) . hasAttribute < FrozenAttr > ( ) | | <nl> + parentStruct - > getAttrs ( ) . hasAttribute < FixedLayoutAttr > ( ) ) | | <nl> PBD - > isStatic ( ) | | ! PBD - > hasStorage ( ) ) { <nl> return nullptr ; <nl> } <nl> class UsableFromInlineChecker : public AccessControlCheckerBase , <nl> auto diagID = diag : : pattern_type_not_usable_from_inline_inferred ; <nl> if ( fixedLayoutStructContext ) { <nl> diagID = <nl> - diag : : pattern_type_not_usable_from_inline_inferred_fixed_layout ; <nl> + diag : : pattern_type_not_usable_from_inline_inferred_frozen ; <nl> } else if ( ! TC . Context . isSwiftVersionAtLeast ( 5 ) ) { <nl> diagID = diag : : pattern_type_not_usable_from_inline_inferred_warn ; <nl> } <nl> class UsableFromInlineChecker : public AccessControlCheckerBase , <nl> DowngradeToWarning downgradeToWarning ) { <nl> auto diagID = diag : : pattern_type_not_usable_from_inline ; <nl> if ( fixedLayoutStructContext ) <nl> - diagID = diag : : pattern_type_not_usable_from_inline_fixed_layout ; <nl> + diagID = diag : : pattern_type_not_usable_from_inline_frozen ; <nl> else if ( ! TC . Context . isSwiftVersionAtLeast ( 5 ) ) <nl> diagID = diag : : pattern_type_not_usable_from_inline_warn ; <nl> auto diag = TC . diagnose ( TP - > getLoc ( ) , diagID , anyVar - > isLet ( ) , <nl> mmm a / lib / Sema / TypeCheckAttr . cpp <nl> ppp b / lib / Sema / TypeCheckAttr . cpp <nl> void AttributeChecker : : visitSpecializeAttr ( SpecializeAttr * attr ) { <nl> } <nl> <nl> void AttributeChecker : : visitFixedLayoutAttr ( FixedLayoutAttr * attr ) { <nl> + if ( isa < StructDecl > ( D ) ) { <nl> + TC . diagnose ( attr - > getLocation ( ) , diag : : fixed_layout_struct ) <nl> + . fixItReplace ( attr - > getRange ( ) , " @ frozen " ) ; <nl> + } <nl> + <nl> auto * VD = cast < ValueDecl > ( D ) ; <nl> <nl> if ( VD - > getFormalAccess ( ) < AccessLevel : : Public & & <nl> void AttributeChecker : : visitImplementsAttr ( ImplementsAttr * attr ) { <nl> } <nl> <nl> void AttributeChecker : : visitFrozenAttr ( FrozenAttr * attr ) { <nl> - auto * ED = cast < EnumDecl > ( D ) ; <nl> + if ( auto * ED = dyn_cast < EnumDecl > ( D ) ) { <nl> + if ( ! ED - > getModuleContext ( ) - > isResilient ( ) ) { <nl> + diagnoseAndRemoveAttr ( attr , diag : : enum_frozen_nonresilient , attr ) ; <nl> + return ; <nl> + } <nl> <nl> - if ( ! 
ED - > getModuleContext ( ) - > isResilient ( ) ) { <nl> - diagnoseAndRemoveAttr ( attr , diag : : enum_frozen_nonresilient , attr ) ; <nl> - return ; <nl> + if ( ED - > getFormalAccess ( ) < AccessLevel : : Public & & <nl> + ! ED - > getAttrs ( ) . hasAttribute < UsableFromInlineAttr > ( ) ) { <nl> + diagnoseAndRemoveAttr ( attr , diag : : enum_frozen_nonpublic , attr ) ; <nl> + return ; <nl> + } <nl> } <nl> <nl> - if ( ED - > getFormalAccess ( ) < AccessLevel : : Public & & <nl> - ! ED - > getAttrs ( ) . hasAttribute < UsableFromInlineAttr > ( ) ) { <nl> - diagnoseAndRemoveAttr ( attr , diag : : enum_frozen_nonpublic , attr ) ; <nl> + auto * VD = cast < ValueDecl > ( D ) ; <nl> + <nl> + if ( VD - > getFormalAccess ( ) < AccessLevel : : Public & & <nl> + ! VD - > getAttrs ( ) . hasAttribute < UsableFromInlineAttr > ( ) ) { <nl> + diagnoseAndRemoveAttr ( attr , diag : : frozen_attr_on_internal_type , <nl> + VD - > getFullName ( ) , VD - > getFormalAccess ( ) ) ; <nl> } <nl> } <nl> <nl> mmm a / stdlib / private / StdlibUnittest / StdlibCoreExtras . swift <nl> ppp b / stdlib / private / StdlibUnittest / StdlibCoreExtras . swift <nl> public func _isStdlibDebugConfiguration ( ) - > Bool { <nl> # endif <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct LinearCongruentialGenerator : RandomNumberGenerator { <nl> <nl> @ usableFromInline <nl> mmm a / stdlib / public / Darwin / ARKit / ARKit . swift <nl> ppp b / stdlib / public / Darwin / ARKit / ARKit . swift <nl> extension ARCamera { <nl> / * * <nl> A value describing the camera ' s tracking state . <nl> * / <nl> - @ _frozen <nl> + @ frozen <nl> public enum TrackingState { <nl> public enum Reason { <nl> / * * Tracking is limited due to initialization in progress . * / <nl> mmm a / stdlib / public / Darwin / CoreGraphics / CGFloat . swift . gyb <nl> ppp b / stdlib / public / Darwin / CoreGraphics / CGFloat . swift . gyb <nl> word_bits = int ( CMAKE_SIZEOF_VOID_P ) * 8 <nl> @ _exported import CoreGraphics <nl> import Darwin <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct CGFloat { <nl> # if arch ( i386 ) | | arch ( arm ) <nl> / / / The native type used to store the CGFloat , which is Float on <nl> mmm a / stdlib / public / Darwin / Dispatch / Dispatch . swift <nl> ppp b / stdlib / public / Darwin / Dispatch / Dispatch . swift <nl> public struct DispatchQoS : Equatable { <nl> } <nl> <nl> / / / <nl> - @ _frozen <nl> + @ frozen <nl> public enum DispatchTimeoutResult { <nl> case success <nl> case timedOut <nl> mmm a / stdlib / public / Darwin / Foundation / Data . swift <nl> ppp b / stdlib / public / Darwin / Foundation / Data . swift <nl> internal class __NSSwiftData : NSData { <nl> # endif <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessCollection , MutableCollection , RangeReplaceableCollection , MutableDataProtocol , ContiguousBytes { <nl> public typealias ReferenceType = NSData <nl> <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> / / A small inline buffer of bytes suitable for stack - allocation of small data . <nl> / / Inlinability strategy : everything here should be inlined for direct operation on the stack wherever possible . 
<nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct InlineData { <nl> # if arch ( x86_64 ) | | arch ( arm64 ) | | arch ( s390x ) | | arch ( powerpc64 ) | | arch ( powerpc64le ) <nl> @ usableFromInline typealias Buffer = ( UInt8 , UInt8 , UInt8 , UInt8 , UInt8 , UInt8 , UInt8 , UInt8 , <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> / / A buffer of bytes too large to fit in an InlineData , but still small enough to fit a storage pointer + range in two words . <nl> / / Inlinability strategy : everything here should be easily inlinable as large _DataStorage methods should not inline into here . <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct InlineSlice { <nl> / / * * * WARNING * * * <nl> / / These ivars are specifically laid out so that they cause the enum _Representation to be 16 bytes on 64 bit platforms . This means we _MUST_ have the class type thing last <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> / / A buffer of bytes whose range is too large to fit in a signle word . Used alongside a RangeReference to make it fit into _Representation ' s two - word size . <nl> / / Inlinability strategy : everything here should be easily inlinable as large _DataStorage methods should not inline into here . <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct LargeSlice { <nl> / / * * * WARNING * * * <nl> / / These ivars are specifically laid out so that they cause the enum _Representation to be 16 bytes on 64 bit platforms . This means we _MUST_ have the class type thing last <nl> public struct Data : ReferenceConvertible , Equatable , Hashable , RandomAccessColl <nl> / / The actual storage for Data ' s various representations . <nl> / / Inlinability strategy : almost everything should be inlinable as forwarding the underlying implementations . ( Inlining can also help avoid retain - release traffic around pulling values out of enums . ) <nl> @ usableFromInline <nl> - @ _frozen <nl> + @ frozen <nl> internal enum _Representation { <nl> case empty <nl> case inline ( InlineData ) <nl> mmm a / stdlib / public / Darwin / ObjectiveC / ObjectiveC . swift <nl> ppp b / stdlib / public / Darwin / ObjectiveC / ObjectiveC . swift <nl> import _SwiftObjectiveCOverlayShims <nl> / / / On 64 - bit iOS , the Objective - C BOOL type is a typedef of C / C + + <nl> / / / bool . Elsewhere , it is " signed char " . The Clang importer imports it as <nl> / / / ObjCBool . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ObjCBool : ExpressibleByBooleanLiteral { <nl> # if os ( macOS ) | | ( os ( iOS ) & & ( arch ( i386 ) | | arch ( arm ) ) ) <nl> / / On OS X and 32 - bit iOS , Objective - C ' s BOOL type is a " signed char " . <nl> func _convertObjCBoolToBool ( _ x : ObjCBool ) - > Bool { <nl> / / / convert between C strings and selectors . <nl> / / / <nl> / / / The compiler has special knowledge of this type . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Selector : ExpressibleByStringLiteral { <nl> var ptr : OpaquePointer <nl> <nl> extension Selector : CustomReflectable { <nl> / / NSZone <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct NSZone { <nl> var pointer : OpaquePointer <nl> } <nl> mmm a / stdlib / public / Platform / Platform . swift <nl> ppp b / stdlib / public / Platform / Platform . 
swift <nl> public var noErr : OSStatus { return 0 } <nl> / / / Foundation . <nl> / / / <nl> / / / The C type is a typedef for ` unsigned char ` . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct DarwinBoolean : ExpressibleByBooleanLiteral { <nl> @ usableFromInline var _value : UInt8 <nl> <nl> mmm a / stdlib / public / core / ASCII . swift <nl> ppp b / stdlib / public / core / ASCII . swift <nl> <nl> / / <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> extension Unicode { <nl> - @ _frozen <nl> + @ frozen <nl> public enum ASCII { } <nl> } <nl> <nl> extension Unicode . ASCII : Unicode . Encoding { <nl> return encode ( FromEncoding . decode ( content ) ) <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Parser { <nl> @ inlinable <nl> public init ( ) { } <nl> mmm a / stdlib / public / core / Algorithm . swift <nl> ppp b / stdlib / public / core / Algorithm . swift <nl> public func max < T : Comparable > ( _ x : T , _ y : T , _ z : T , _ rest : T . . . ) - > T { <nl> / / / } <nl> / / / / / Prints " 0 : foo " <nl> / / / / / Prints " 1 : bar " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct EnumeratedSequence < Base : Sequence > { <nl> @ usableFromInline <nl> internal var _base : Base <nl> extension EnumeratedSequence { <nl> / / / <nl> / / / To create an instance , call <nl> / / / ` enumerated ( ) . makeIterator ( ) ` on a sequence or collection . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator { <nl> @ usableFromInline <nl> internal var _base : Base . Iterator <nl> mmm a / stdlib / public / core / AnyHashable . swift <nl> ppp b / stdlib / public / core / AnyHashable . swift <nl> internal struct _ConcreteHashableBox < Base : Hashable > : _AnyHashableBox { <nl> / / / print ( descriptions [ AnyHashable ( 43 ) ] ) / / prints " nil " <nl> / / / print ( descriptions [ AnyHashable ( Int8 ( 43 ) ) ] ! ) / / prints " an Int8 " <nl> / / / print ( descriptions [ AnyHashable ( Set ( [ " a " , " b " ] ) ) ] ! ) / / prints " a set of strings " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct AnyHashable { <nl> internal var _box : _AnyHashableBox <nl> <nl> mmm a / stdlib / public / core / Array . swift <nl> ppp b / stdlib / public / core / Array . swift <nl> <nl> / / / - Note : The ` ContiguousArray ` and ` ArraySlice ` types are not bridged ; <nl> / / / instances of those types always have a contiguous block of memory as <nl> / / / their storage . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Array < Element > : _DestructorSafeContainer { <nl> # if _runtime ( _ObjC ) <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / ArrayBody . swift <nl> ppp b / stdlib / public / core / ArrayBody . swift <nl> <nl> <nl> import SwiftShims <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline <nl> internal struct _ArrayBody { <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / ArrayBuffer . swift <nl> ppp b / stdlib / public / core / ArrayBuffer . swift <nl> internal typealias _ArrayBridgeStorage <nl> = _BridgeStorage < __ContiguousArrayStorageBase > <nl> <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _ArrayBuffer < Element > : _ArrayBufferProtocol { <nl> <nl> / / / Create an empty buffer . <nl> mmm a / stdlib / public / core / ArrayShared . swift <nl> ppp b / stdlib / public / core / ArrayShared . 
swift <nl> <nl> <nl> / / / This type is used as a result of the _checkSubscript call to associate the <nl> / / / call with the array access call it guards . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct _DependenceToken { <nl> @ inlinable <nl> public init ( ) { <nl> mmm a / stdlib / public / core / ArraySlice . swift <nl> ppp b / stdlib / public / core / ArraySlice . swift <nl> <nl> / / / - Note : To safely reference the starting and ending indices of a slice , <nl> / / / always use the ` startIndex ` and ` endIndex ` properties instead of <nl> / / / specific values . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ArraySlice < Element > : _DestructorSafeContainer { <nl> @ usableFromInline <nl> internal typealias _Buffer = _SliceBuffer < Element > <nl> mmm a / stdlib / public / core / Bitset . swift <nl> ppp b / stdlib / public / core / Bitset . swift <nl> <nl> / / / Because ` _UnsafeBitset ` implements a flat bit vector , it isn ' t suitable for <nl> / / / holding arbitrarily large integers . The maximal element a bitset can store <nl> / / / is fixed at its initialization . <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline / / @ testable <nl> internal struct _UnsafeBitset { <nl> @ usableFromInline <nl> extension _UnsafeBitset : Sequence { <nl> } <nl> <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct Iterator : IteratorProtocol { <nl> @ usableFromInline <nl> internal let bitset : _UnsafeBitset <nl> extension _UnsafeBitset : Sequence { <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> <nl> extension _UnsafeBitset { <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline <nl> internal struct Word { <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / Bool . swift <nl> ppp b / stdlib / public / core / Bool . swift <nl> <nl> / / / bridged into Swift as ` Bool ` . The single ` Bool ` type in Swift guarantees <nl> / / / that functions , methods , and properties imported from C and Objective - C <nl> / / / have a consistent type interface . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Bool { <nl> @ usableFromInline <nl> internal var _value : Builtin . Int1 <nl> mmm a / stdlib / public / core / BridgeObjectiveC . swift <nl> ppp b / stdlib / public / core / BridgeObjectiveC . swift <nl> public func _getBridgedNonVerbatimObjectiveCType < T > ( _ : T . Type ) - > Any . Type ? <nl> / / / This type does not carry an owner pointer unlike the other C * Pointer types <nl> / / / because it only needs to reference the results of inout conversions , which <nl> / / / already have writeback - scoped lifetime . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct AutoreleasingUnsafeMutablePointer < Pointee / * TODO : class * / > <nl> : _Pointer { <nl> <nl> mmm a / stdlib / public / core / BridgeStorage . swift <nl> ppp b / stdlib / public / core / BridgeStorage . swift <nl> <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> import SwiftShims <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline <nl> internal struct _BridgeStorage < NativeClass : AnyObject > { <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / CTypes . swift <nl> ppp b / stdlib / public / core / CTypes . 
swift <nl> public typealias CBool = Bool <nl> / / / <nl> / / / Opaque pointers are used to represent C pointers to types that <nl> / / / cannot be represented in Swift , such as incomplete struct types . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct OpaquePointer { <nl> @ usableFromInline <nl> internal var _rawValue : Builtin . RawPointer <nl> extension UInt { <nl> <nl> / / / A wrapper around a C ` va_list ` pointer . <nl> # if arch ( arm64 ) & & ! ( os ( macOS ) | | os ( iOS ) | | os ( tvOS ) | | os ( watchOS ) | | os ( Windows ) ) <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct CVaListPointer { <nl> @ usableFromInline / / unsafe - performance <nl> internal var _value : ( __stack : UnsafeMutablePointer < Int > ? , <nl> extension CVaListPointer : CustomDebugStringConvertible { <nl> <nl> # else <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct CVaListPointer { <nl> @ usableFromInline / / unsafe - performance <nl> internal var _value : UnsafeMutableRawPointer <nl> mmm a / stdlib / public / core / Character . swift <nl> ppp b / stdlib / public / core / Character . swift <nl> <nl> / / / [ glossary ] : http : / / www . unicode . org / glossary / <nl> / / / [ clusters ] : http : / / www . unicode . org / glossary / # extended_grapheme_cluster <nl> / / / [ scalars ] : http : / / www . unicode . org / glossary / # unicode_scalar_value <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Character { <nl> @ usableFromInline <nl> internal var _str : String <nl> mmm a / stdlib / public / core / ClosedRange . swift <nl> ppp b / stdlib / public / core / ClosedRange . swift <nl> <nl> / / / ` Stride ` types , they cannot be used as the bounds of a countable range . If <nl> / / / you need to iterate over consecutive floating - point values , see the <nl> / / / ` stride ( from : through : by : ) ` function . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ClosedRange < Bound : Comparable > { <nl> / / / The range ' s lower bound . <nl> public let lowerBound : Bound <nl> where Bound : Strideable , Bound . Stride : SignedInteger { <nl> } <nl> <nl> extension ClosedRange where Bound : Strideable , Bound . Stride : SignedInteger { <nl> - @ _frozen / / FIXME ( resilience ) <nl> + @ frozen / / FIXME ( resilience ) <nl> public enum Index { <nl> case pastEnd <nl> case inRange ( Bound ) <nl> mmm a / stdlib / public / core / CocoaArray . swift <nl> ppp b / stdlib / public / core / CocoaArray . swift <nl> import SwiftShims <nl> / / / ` Collection ` conformance . Why not make ` _NSArrayCore ` conform directly ? <nl> / / / It ' s a class , and I don ' t want to pay for the dynamic dispatch overhead . <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _CocoaArrayWrapper : RandomAccessCollection { <nl> @ usableFromInline <nl> typealias Indices = Range < Int > <nl> mmm a / stdlib / public / core / Collection . swift <nl> ppp b / stdlib / public / core / Collection . swift <nl> <nl> / / / } <nl> / / / / / Prints " 15 . 0 " <nl> / / / / / Prints " 20 . 0 " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct IndexingIterator < Elements : Collection > { <nl> @ usableFromInline <nl> internal let _elements : Elements <nl> mmm a / stdlib / public / core / CollectionDifference . swift <nl> ppp b / stdlib / public / core / CollectionDifference . swift <nl> <nl> @ available ( macOS 9999 , iOS 9999 , tvOS 9999 , watchOS 9999 , * ) / / FIXME ( availability - 5 . 
1 ) <nl> public struct CollectionDifference < ChangeElement > { <nl> / / / A single change to a collection . <nl> - @ _frozen <nl> + @ frozen <nl> public enum Change { <nl> / / / An insertion . <nl> / / / <nl> extension CollectionDifference : Collection { <nl> public typealias Element = Change <nl> <nl> / / / The position of a collection difference . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Index { <nl> / / Opaque index type is isomorphic to Int <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / CollectionOfOne . swift <nl> ppp b / stdlib / public / core / CollectionOfOne . swift <nl> <nl> / / / let toAdd = 100 <nl> / / / let b = a + CollectionOfOne ( toAdd ) <nl> / / / / / b = = [ 1 , 2 , 3 , 4 , 100 ] <nl> - @ _fixed_layout / / trivial - implementation <nl> + @ frozen / / trivial - implementation <nl> public struct CollectionOfOne < Element > { <nl> @ usableFromInline / / trivial - implementation <nl> internal var _element : Element <nl> extension CollectionOfOne { <nl> / / / An iterator that produces one or zero instances of an element . <nl> / / / <nl> / / / ` IteratorOverOne ` is the iterator for the ` CollectionOfOne ` type . <nl> - @ _fixed_layout / / trivial - implementation <nl> + @ frozen / / trivial - implementation <nl> public struct Iterator { <nl> @ usableFromInline / / trivial - implementation <nl> internal var _elements : Element ? <nl> mmm a / stdlib / public / core / CommandLine . swift <nl> ppp b / stdlib / public / core / CommandLine . swift <nl> <nl> import SwiftShims <nl> <nl> / / / Command - line arguments for the current process . <nl> - @ _frozen / / namespace <nl> + @ frozen / / namespace <nl> public enum CommandLine { <nl> / / / The backing static variable for argument count may come either from the <nl> / / / entry point or it may need to be computed e . g . if we ' re in the REPL . <nl> mmm a / stdlib / public / core / ContiguousArray . swift <nl> ppp b / stdlib / public / core / ContiguousArray . swift <nl> <nl> / / / <nl> / / / For more information about using arrays , see ` Array ` and ` ArraySlice ` , with <nl> / / / which ` ContiguousArray ` shares most properties and methods . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ContiguousArray < Element > : _DestructorSafeContainer { <nl> @ usableFromInline <nl> internal typealias _Buffer = _ContiguousArrayBuffer < Element > <nl> mmm a / stdlib / public / core / ContiguousArrayBuffer . swift <nl> ppp b / stdlib / public / core / ContiguousArrayBuffer . swift <nl> internal final class _ContiguousArrayStorage < <nl> } <nl> <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _ContiguousArrayBuffer < Element > : _ArrayBufferProtocol { <nl> <nl> / / / Make a buffer with uninitialized elements . After using this <nl> internal func _copyCollectionToContiguousArray < <nl> / / / element - by - element . The type is unsafe because it cannot be deinitialized <nl> / / / until the buffer has been finalized by a call to ` finish ` . <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _UnsafePartiallyInitializedContiguousArrayBuffer < Element > { <nl> @ usableFromInline <nl> internal var result : _ContiguousArrayBuffer < Element > <nl> mmm a / stdlib / public / core / DebuggerSupport . swift <nl> ppp b / stdlib / public / core / DebuggerSupport . 
swift <nl> <nl> <nl> import SwiftShims <nl> <nl> - @ _frozen / / namespace <nl> + @ frozen / / namespace <nl> public enum _DebuggerSupport { <nl> private enum CollectionStatus { <nl> case notACollection <nl> mmm a / stdlib / public / core / Dictionary . swift <nl> ppp b / stdlib / public / core / Dictionary . swift <nl> <nl> / / / ` NSDictionary ` and ` Dictionary ` share buffer using the same copy - on - write <nl> / / / optimization that is used when two instances of ` Dictionary ` share <nl> / / / buffer . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Dictionary < Key : Hashable , Value > { <nl> / / / The element type of a dictionary : a tuple containing an individual <nl> / / / key - value pair . <nl> extension Dictionary { <nl> } <nl> <nl> / / / A view of a dictionary ' s keys . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Keys <nl> : Collection , Equatable , <nl> CustomStringConvertible , CustomDebugStringConvertible { <nl> extension Dictionary { <nl> } <nl> <nl> / / / A view of a dictionary ' s values . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Values <nl> : MutableCollection , CustomStringConvertible , CustomDebugStringConvertible { <nl> public typealias Element = Value <nl> extension Dictionary { <nl> } <nl> <nl> extension Dictionary . Keys { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator : IteratorProtocol { <nl> @ usableFromInline <nl> internal var _base : Dictionary < Key , Value > . Iterator <nl> extension Dictionary . Keys { <nl> } <nl> <nl> extension Dictionary . Values { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator : IteratorProtocol { <nl> @ usableFromInline <nl> internal var _base : Dictionary < Key , Value > . Iterator <nl> extension Dictionary : CustomStringConvertible , CustomDebugStringConvertible { <nl> } <nl> <nl> @ usableFromInline <nl> - @ _frozen <nl> + @ frozen <nl> internal enum _MergeError : Error { <nl> case keyCollision <nl> } <nl> extension Dictionary { <nl> / / / 2 . Subscripting with an index , yielding a key - value pair : <nl> / / / <nl> / / / ( k , v ) = d [ i ] <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Index { <nl> / / Index for native dictionary is efficient . Index for bridged NSDictionary <nl> / / is not , because neither NSEnumerator nor fast enumeration support moving <nl> extension Dictionary { <nl> / / safe to copy the state . So , we cannot implement Index that is a value <nl> / / type for bridged NSDictionary in terms of Cocoa enumeration facilities . <nl> <nl> - @ _frozen <nl> + @ frozen <nl> @ usableFromInline <nl> internal enum _Variant { <nl> case native ( _HashTable . Index ) <nl> extension Dictionary . Index : Hashable { <nl> <nl> extension Dictionary { <nl> / / / An iterator over the members of a ` Dictionary < Key , Value > ` . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator { <nl> / / Dictionary has a separate IteratorProtocol and Index because of <nl> / / efficiency and implementability reasons . <nl> extension Dictionary { <nl> / / IteratorProtocol , which is being consumed as iteration proceeds . <nl> <nl> @ usableFromInline <nl> - @ _frozen <nl> + @ frozen <nl> internal enum _Variant { <nl> case native ( _NativeDictionary < Key , Value > . Iterator ) <nl> # if _runtime ( _ObjC ) <nl> mmm a / stdlib / public / core / DictionaryBridging . swift <nl> ppp b / stdlib / public / core / DictionaryBridging . 
swift <nl> final internal class _SwiftDeferredNSDictionary < Key : Hashable , Value > <nl> / / classes , so it was renamed . The old names must not be used in the new <nl> / / runtime . <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct __CocoaDictionary { <nl> @ usableFromInline <nl> internal let object : AnyObject <nl> extension __CocoaDictionary { <nl> } <nl> <nl> extension __CocoaDictionary { <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline <nl> internal struct Index { <nl> internal var _storage : Builtin . BridgeObject <nl> mmm a / stdlib / public / core / DictionaryBuilder . swift <nl> ppp b / stdlib / public / core / DictionaryBuilder . swift <nl> <nl> / / / <nl> / / / Using a builder can be faster than inserting members into an empty <nl> / / / ` Dictionary ` . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public / / SPI ( Foundation ) <nl> struct _DictionaryBuilder < Key : Hashable , Value > { <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / DictionaryVariant . swift <nl> ppp b / stdlib / public / core / DictionaryVariant . swift <nl> internal protocol _DictionaryBuffer { <nl> <nl> extension Dictionary { <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _Variant { <nl> @ usableFromInline <nl> internal var object : _BridgeStorage < __RawDictionaryStorage > <nl> mmm a / stdlib / public / core / DropWhile . swift <nl> ppp b / stdlib / public / core / DropWhile . swift <nl> <nl> <nl> / / / A sequence whose elements consist of the elements that follow the initial <nl> / / / consecutive elements of some base sequence that satisfy a given predicate . <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct LazyDropWhileSequence < Base : Sequence > { <nl> public typealias Element = Base . Element <nl> <nl> extension LazyDropWhileSequence { <nl> / / / This is the associated iterator for the ` LazyDropWhileSequence ` , <nl> / / / ` LazyDropWhileCollection ` , and ` LazyDropWhileBidirectionalCollection ` <nl> / / / types . <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct Iterator { <nl> public typealias Element = Base . Element <nl> <nl> mmm a / stdlib / public / core / EmptyCollection . swift <nl> ppp b / stdlib / public / core / EmptyCollection . swift <nl> <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> <nl> / / / A collection whose element type is ` Element ` but that is always empty . <nl> - @ _fixed_layout / / trivial - implementation <nl> + @ frozen / / trivial - implementation <nl> public struct EmptyCollection < Element > { <nl> / / no properties <nl> <nl> public struct EmptyCollection < Element > { <nl> <nl> extension EmptyCollection { <nl> / / / An iterator that never produces an element . <nl> - @ _fixed_layout / / trivial - implementation <nl> + @ frozen / / trivial - implementation <nl> public struct Iterator { <nl> / / no properties <nl> <nl> mmm a / stdlib / public / core / ExistentialCollection . swift . gyb <nl> ppp b / stdlib / public / core / ExistentialCollection . swift . gyb <nl> internal func _abstract ( <nl> / / / This iterator forwards its ` next ( ) ` method to an arbitrary underlying <nl> / / / iterator having the same ` Element ` type , hiding the specifics of the <nl> / / / underlying ` IteratorProtocol ` . 
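The `AnyIterator` changes just below follow the same closure-backed erasure idiom as `_ClosureBasedIterator`; a minimal, hypothetical sketch of that idiom (simplified, not the stdlib implementation):

@frozen
public struct ClosureIterator<Element>: IteratorProtocol {
    @usableFromInline
    internal let _next: () -> Element?

    @inlinable
    public init(_ next: @escaping () -> Element?) {
        self._next = next
    }

    @inlinable
    public mutating func next() -> Element? {
        return _next()
    }
}

// Usage: counts down from 3, then finishes.
var n = 3
var countdown = ClosureIterator<Int> {
    defer { n -= 1 }
    return n > 0 ? n : nil
}
while let value = countdown.next() { print(value) }  // 3, 2, 1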
<nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct AnyIterator < Element > { <nl> @ usableFromInline <nl> internal let _box : _AnyIteratorBoxBase < Element > <nl> extension AnyIterator : IteratorProtocol { <nl> extension AnyIterator : Sequence { } <nl> <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _ClosureBasedIterator < Element > : IteratorProtocol { <nl> @ inlinable <nl> internal init ( _ body : @ escaping ( ) - > Element ? ) { <nl> internal final class _ $ { Kind } Box < S : $ { Kind } > : _Any $ { Kind } Box < S . Element > <nl> % end <nl> <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _ClosureBasedSequence < Iterator : IteratorProtocol > { <nl> @ usableFromInline <nl> internal var _makeUnderlyingIterator : ( ) - > Iterator <nl> extension _ClosureBasedSequence : Sequence { <nl> / / / An instance of ` AnySequence ` forwards its operations to an underlying base <nl> / / / sequence having the same ` Element ` type , hiding the specifics of the <nl> / / / underlying sequence . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct AnySequence < Element > { <nl> @ usableFromInline <nl> internal let _box : _AnySequenceBox < Element > <nl> internal final class _IndexBox < BaseIndex : Comparable > : _AnyIndexBox { <nl> } <nl> <nl> / / / A wrapper over an underlying index that hides the specific underlying type . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct AnyIndex { <nl> @ usableFromInline <nl> internal var _box : _AnyIndexBox <nl> protocol _AnyCollectionProtocol : Collection { <nl> / / / An ` $ { Self } ` instance forwards its operations to a base collection having the <nl> / / / same ` Element ` type , hiding the specifics of the underlying <nl> / / / collection . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct $ { Self } < Element > { <nl> @ usableFromInline <nl> internal let _box : _ $ { Self } Box < Element > <nl> mmm a / stdlib / public / core / Filter . swift <nl> ppp b / stdlib / public / core / Filter . swift <nl> <nl> / / / <nl> / / / - Note : ` s . lazy . filter { . . . } ` , for an arbitrary sequence ` s ` , <nl> / / / is a ` LazyFilterSequence ` . <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct LazyFilterSequence < Base : Sequence > { <nl> @ usableFromInline / / lazy - performance <nl> internal var _base : Base <nl> extension LazyFilterSequence { <nl> / / / <nl> / / / - Note : This is the associated ` Iterator ` of ` LazyFilterSequence ` <nl> / / / and ` LazyFilterCollection ` . <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct Iterator { <nl> / / / The underlying iterator whose elements are being filtered . <nl> public var base : Base . Iterator { return _base } <nl> mmm a / stdlib / public / core / Flatten . swift <nl> ppp b / stdlib / public / core / Flatten . swift <nl> <nl> / / / * ` s . joined ( ) ` does not create new storage <nl> / / / * ` s . joined ( ) . map ( f ) ` maps eagerly and returns a new array <nl> / / / * ` s . lazy . joined ( ) . map ( f ) ` maps lazily and returns a ` LazyMapSequence ` <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct FlattenSequence < Base : Sequence > where Base . Element : Sequence { <nl> <nl> @ usableFromInline / / lazy - performance <nl> public struct FlattenSequence < Base : Sequence > where Base . 
Element : Sequence { <nl> } <nl> <nl> extension FlattenSequence { <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct Iterator { <nl> @ usableFromInline / / lazy - performance <nl> internal var _base : Base . Iterator <nl> public typealias FlattenCollection < T : Collection > = FlattenSequence < T > where T . E <nl> <nl> extension FlattenSequence where Base : Collection , Base . Element : Collection { <nl> / / / A position in a FlattenCollection <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct Index { <nl> / / / The position in the outer collection of collections . <nl> @ usableFromInline / / lazy - performance <nl> mmm a / stdlib / public / core / FloatingPoint . swift <nl> ppp b / stdlib / public / core / FloatingPoint . swift <nl> public protocol FloatingPoint : SignedNumeric , Strideable , Hashable <nl> } <nl> <nl> / / / The sign of a floating - point value . <nl> - @ _frozen / / FIXME ( sil - serialize - all ) <nl> + @ frozen <nl> public enum FloatingPointSign : Int { <nl> / / / The sign for a positive value . <nl> case plus <nl> public enum FloatingPointSign : Int { <nl> } <nl> <nl> / / / The IEEE 754 floating - point classes . <nl> - @ _frozen / / FIXME ( sil - serialize - all ) <nl> + @ frozen <nl> public enum FloatingPointClassification { <nl> / / / A signaling NaN ( " not a number " ) . <nl> / / / <nl> mmm a / stdlib / public / core / FloatingPointTypes . swift . gyb <nl> ppp b / stdlib / public / core / FloatingPointTypes . swift . gyb <nl> else : <nl> % end <nl> <nl> $ { SelfDocComment } <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct $ { Self } { <nl> public / / @ testable <nl> var _value : Builtin . FPIEEE $ { bits } <nl> extension $ { Self } : BinaryFloatingPoint { <nl> } <nl> % else : <nl> / / Internal implementation details of x86 Float80 <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline <nl> internal struct _Representation { <nl> @ usableFromInline <nl> internal struct _ $ { Self } AnyHashableBox : _AnyHashableBox { <nl> # else <nl> <nl> $ { SelfDocComment } <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ available ( * , unavailable , message : " Float80 is only available on non - Windows x86 targets . " ) <nl> public struct $ { Self } { <nl> / / / Creates a value initialized to zero . <nl> mmm a / stdlib / public / core / HashTable . swift <nl> ppp b / stdlib / public / core / HashTable . swift <nl> internal protocol _HashTableDelegate { <nl> } <nl> <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _HashTable { <nl> @ usableFromInline <nl> internal typealias Word = _UnsafeBitset . Word <nl> extension _HashTable { <nl> } <nl> <nl> extension _HashTable { <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline <nl> internal struct Bucket { <nl> @ usableFromInline <nl> extension _HashTable . Bucket : Comparable { <nl> <nl> extension _HashTable { <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct Index { <nl> @ usableFromInline <nl> let bucket : Bucket <nl> extension _HashTable . Index : Comparable { <nl> <nl> extension _HashTable : Sequence { <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct Iterator : IteratorProtocol { <nl> @ usableFromInline <nl> let hashTable : _HashTable <nl> mmm a / stdlib / public / core / Hasher . swift <nl> ppp b / stdlib / public / core / Hasher . 
swift <nl> extension Hasher { <nl> / / / trailing bytes , while the most significant 8 bits hold the count of bytes <nl> / / / appended so far , modulo 256 . The count of bytes currently stored in the <nl> / / / buffer is in the lower three bits of the byte count . ) <nl> - / / FIXME : Remove @ usableFromInline and @ _fixed_layout once Hasher is resilient . <nl> + / / FIXME : Remove @ usableFromInline and @ frozen once Hasher is resilient . <nl> / / rdar : / / problem / 38549901 <nl> - @ usableFromInline @ _fixed_layout <nl> + @ usableFromInline @ frozen <nl> internal struct _TailBuffer { <nl> / / msb lsb <nl> / / + mmmmmmmmm + mmmmmm - + mmmmmm - + mmmmmm - + mmmmmm - + mmmmmm - + mmmmmm - + mmmmmm - + <nl> extension Hasher { <nl> } <nl> <nl> extension Hasher { <nl> - / / FIXME : Remove @ usableFromInline and @ _fixed_layout once Hasher is resilient . <nl> + / / FIXME : Remove @ usableFromInline and @ frozen once Hasher is resilient . <nl> / / rdar : / / problem / 38549901 <nl> - @ usableFromInline @ _fixed_layout <nl> + @ usableFromInline @ frozen <nl> internal struct _Core { <nl> private var _buffer : _TailBuffer <nl> private var _state : Hasher . _State <nl> extension Hasher { <nl> / / / different values on every new execution of your program . The hash <nl> / / / algorithm implemented by ` Hasher ` may itself change between any two <nl> / / / versions of the standard library . <nl> - @ _fixed_layout / / FIXME : Should be resilient ( rdar : / / problem / 38549901 ) <nl> + @ frozen / / FIXME : Should be resilient ( rdar : / / problem / 38549901 ) <nl> public struct Hasher { <nl> internal var _core : _Core <nl> <nl> mmm a / stdlib / public / core / Indices . swift <nl> ppp b / stdlib / public / core / Indices . swift <nl> <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> <nl> / / / A collection of indices for an arbitrary collection <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct DefaultIndices < Elements : Collection > { <nl> @ usableFromInline <nl> internal var _elements : Elements <nl> mmm a / stdlib / public / core / IntegerTypes . swift . gyb <nl> ppp b / stdlib / public / core / IntegerTypes . swift . gyb <nl> def unsafeOperationComment ( operator ) : <nl> / / / $ { Article } $ { bits } - bit $ { ' ' if signed else ' un ' } signed integer value <nl> / / / type . <nl> % end <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct $ { Self } <nl> : FixedWidthInteger , $ { Unsigned } Integer , <nl> _ExpressibleByBuiltinIntegerLiteral { <nl> $ { assignmentOperatorComment ( x . operator , True ) } <nl> } <nl> <nl> / / / A type that represents the words of this integer . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Words : RandomAccessCollection { <nl> public typealias Indices = Range < Int > <nl> public typealias SubSequence = Slice < $ { Self } . Words > <nl> mmm a / stdlib / public / core / Join . swift <nl> ppp b / stdlib / public / core / Join . swift <nl> <nl> <nl> / / / A sequence that presents the elements of a base sequence of sequences <nl> / / / concatenated using a given separator . <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct JoinedSequence < Base : Sequence > where Base . Element : Sequence { <nl> <nl> public typealias Element = Base . Element . Element <nl> public struct JoinedSequence < Base : Sequence > where Base . 
Element : Sequence { <nl> extension JoinedSequence { <nl> / / / An iterator that presents the elements of the sequences traversed <nl> / / / by a base iterator , concatenated using a given separator . <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct Iterator { <nl> @ usableFromInline / / lazy - performance <nl> internal var _base : Base . Iterator <nl> extension JoinedSequence { <nl> @ usableFromInline / / lazy - performance <nl> internal var _separator : ContiguousArray < Element > . Iterator ? <nl> <nl> - @ _frozen / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> @ usableFromInline / / lazy - performance <nl> internal enum _JoinIteratorState { <nl> case start <nl> mmm a / stdlib / public / core / KeyValuePairs . swift <nl> ppp b / stdlib / public / core / KeyValuePairs . swift <nl> <nl> / / / let pairs = IntPairs ( [ 1 : 2 , 1 : 1 , 3 : 4 , 2 : 1 ] ) <nl> / / / print ( pairs . elements ) <nl> / / / / / Prints " [ ( 1 , 2 ) , ( 1 , 1 ) , ( 3 , 4 ) , ( 2 , 1 ) ] " <nl> - @ _fixed_layout / / trivial - implementation <nl> + @ frozen / / trivial - implementation <nl> public struct KeyValuePairs < Key , Value > : ExpressibleByDictionaryLiteral { <nl> @ usableFromInline / / trivial - implementation <nl> internal let _elements : [ ( Key , Value ) ] <nl> mmm a / stdlib / public / core / LazySequence . swift <nl> ppp b / stdlib / public / core / LazySequence . swift <nl> extension LazySequenceProtocol where Elements : LazySequenceProtocol { <nl> / / / implemented lazily . <nl> / / / <nl> / / / - See also : ` LazySequenceProtocol ` <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct LazySequence < Base : Sequence > { <nl> @ usableFromInline <nl> internal var _base : Base <nl> mmm a / stdlib / public / core / ManagedBuffer . swift <nl> ppp b / stdlib / public / core / ManagedBuffer . swift <nl> extension ManagedBuffer { <nl> / / / } <nl> / / / } <nl> / / / <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ManagedBufferPointer < Header , Element > { <nl> <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / Map . swift <nl> ppp b / stdlib / public / core / Map . swift <nl> <nl> / / / ` Sequence ` passed through a transform function returning ` Element ` . <nl> / / / These elements are computed lazily , each time they ' re read , by <nl> / / / calling the transform function on a base element . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct LazyMapSequence < Base : Sequence , Element > { <nl> <nl> public typealias Elements = LazyMapSequence <nl> public struct LazyMapSequence < Base : Sequence , Element > { <nl> } <nl> <nl> extension LazyMapSequence { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator { <nl> @ usableFromInline <nl> internal var _base : Base . Iterator <nl> mmm a / stdlib / public / core / MemoryLayout . swift <nl> ppp b / stdlib / public / core / MemoryLayout . swift <nl> <nl> / / / let pointPointer = UnsafeMutableRawPointer . allocate ( <nl> / / / bytes : count * MemoryLayout < Point > . stride , <nl> / / / alignedTo : MemoryLayout < Point > . alignment ) <nl> - @ _frozen / / namespace <nl> + @ frozen / / namespace <nl> public enum MemoryLayout < T > { <nl> / / / The contiguous memory footprint of ` T ` , in bytes . <nl> / / / <nl> mmm a / stdlib / public / core / MigrationSupport . swift <nl> ppp b / stdlib / public / core / MigrationSupport . 
swift <nl> extension Zip2Sequence { <nl> @ available ( swift , deprecated : 4 . 2 , message : " PlaygroundQuickLook will be removed in a future Swift version . For customizing how types are presented in playgrounds , use CustomPlaygroundDisplayConvertible instead . " ) <nl> public typealias PlaygroundQuickLook = _PlaygroundQuickLook <nl> <nl> - @ _frozen / / rdar : / / problem / 38719739 - needed by LLDB <nl> + @ frozen / / rdar : / / problem / 38719739 - needed by LLDB <nl> public enum _PlaygroundQuickLook { <nl> case text ( String ) <nl> case int ( Int64 ) <nl> mmm a / stdlib / public / core / NativeDictionary . swift <nl> ppp b / stdlib / public / core / NativeDictionary . swift <nl> <nl> / / / A wrapper around __RawDictionaryStorage that provides most of the <nl> / / / implementation of Dictionary . <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _NativeDictionary < Key : Hashable , Value > { <nl> @ usableFromInline <nl> internal typealias Element = ( key : Key , value : Value ) <nl> extension _NativeDictionary { / / High - level operations <nl> <nl> extension _NativeDictionary : Sequence { <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct Iterator { <nl> / / The iterator is iterating over a frozen view of the collection state , so <nl> / / it keeps its own reference to the dictionary . <nl> mmm a / stdlib / public / core / NativeSet . swift <nl> ppp b / stdlib / public / core / NativeSet . swift <nl> <nl> / / / A wrapper around __RawSetStorage that provides most of the <nl> / / / implementation of Set . <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _NativeSet < Element : Hashable > { <nl> / / / See the comments on __RawSetStorage and its subclasses to understand why we <nl> / / / store an untyped storage here . <nl> extension _NativeSet { / / Deletion <nl> <nl> extension _NativeSet : Sequence { <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct Iterator { <nl> / / The iterator is iterating over a frozen view of the collection state , so <nl> / / it keeps its own reference to the set . <nl> mmm a / stdlib / public / core / ObjectIdentifier . swift <nl> ppp b / stdlib / public / core / ObjectIdentifier . swift <nl> <nl> / / / <nl> / / / In Swift , only class instances and metatypes have unique identities . There <nl> / / / is no notion of identity for structs , enums , functions , or tuples . <nl> - @ _fixed_layout / / trivial - implementation <nl> + @ frozen / / trivial - implementation <nl> public struct ObjectIdentifier { <nl> @ usableFromInline / / trivial - implementation <nl> internal let _value : Builtin . RawPointer <nl> mmm a / stdlib / public / core / Optional . swift <nl> ppp b / stdlib / public / core / Optional . swift <nl> <nl> / / / <nl> / / / Unconditionally unwrapping a ` nil ` instance with ` ! ` triggers a runtime <nl> / / / error . <nl> - @ _frozen <nl> + @ frozen <nl> public enum Optional < Wrapped > : ExpressibleByNilLiteral { <nl> / / The compiler has special knowledge of Optional < Wrapped > , including the fact <nl> / / that it is an ` enum ` with cases named ` none ` and ` some ` . <nl> extension Optional : Hashable where Wrapped : Hashable { <nl> <nl> / / Enable pattern matching against the nil literal , even if the element type <nl> / / isn ' t equatable . 
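A hedged illustration of why enums such as `Optional`, `FloatingPointSign`, and `DispatchTimeoutResult` carry this attribute: a `@frozen` public enum promises its case set is complete, so clients of a library compiled with library evolution can switch over it exhaustively without an `@unknown default` clause. The `TrafficLight` type below is hypothetical.

// In the library (compiled with -enable-library-evolution):
@frozen
public enum TrafficLight {
    case red, yellow, green
}

// In a client module: no `@unknown default` is required, because a
// frozen enum can never grow new cases in a future library version.
public func stopTime(for light: TrafficLight) -> Int {
    switch light {
    case .red:    return 30
    case .yellow: return 5
    case .green:  return 0
    }
}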
<nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct _OptionalNilComparisonType : ExpressibleByNilLiteral { <nl> / / / Create an instance initialized with ` nil ` . <nl> @ _transparent <nl> mmm a / stdlib / public / core / Policy . swift <nl> ppp b / stdlib / public / core / Policy . swift <nl> <nl> / / / func crashAndBurn ( ) - > Never { <nl> / / / fatalError ( " Something very , very bad happened " ) <nl> / / / } <nl> - @ _frozen <nl> + @ frozen <nl> public enum Never { } <nl> <nl> extension Never : Error { } <nl> mmm a / stdlib / public / core / PrefixWhile . swift <nl> ppp b / stdlib / public / core / PrefixWhile . swift <nl> <nl> <nl> / / / A sequence whose elements consist of the initial consecutive elements of <nl> / / / some base sequence that satisfy a given predicate . <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct LazyPrefixWhileSequence < Base : Sequence > { <nl> public typealias Element = Base . Element <nl> <nl> extension LazyPrefixWhileSequence { <nl> / / / This is the associated iterator for the ` LazyPrefixWhileSequence ` , <nl> / / / ` LazyPrefixWhileCollection ` , and ` LazyPrefixWhileBidirectionalCollection ` <nl> / / / types . <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct Iterator { <nl> public typealias Element = Base . Element <nl> <nl> public typealias LazyPrefixWhileCollection < T : Collection > = LazyPrefixWhileSeque <nl> extension LazyPrefixWhileCollection { <nl> / / / A position in the base collection of a ` LazyPrefixWhileCollection ` or the <nl> / / / end of that collection . <nl> - @ _frozen / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> @ usableFromInline <nl> internal enum _IndexRepresentation { <nl> case index ( Base . Index ) <nl> extension LazyPrefixWhileCollection { <nl> <nl> / / / A position in a ` LazyPrefixWhileCollection ` or <nl> / / / ` LazyPrefixWhileBidirectionalCollection ` instance . <nl> - @ _fixed_layout / / lazy - performance <nl> + @ frozen / / lazy - performance <nl> public struct Index { <nl> / / / The position corresponding to ` self ` in the underlying collection . <nl> @ usableFromInline / / lazy - performance <nl> mmm a / stdlib / public / core / Random . swift <nl> ppp b / stdlib / public / core / Random . swift <nl> extension RandomNumberGenerator { <nl> / / / - Apple platforms use ` arc4random_buf ( 3 ) ` . <nl> / / / - Linux platforms use ` getrandom ( 2 ) ` when available ; otherwise , they read <nl> / / / from ` / dev / urandom ` . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct SystemRandomNumberGenerator : RandomNumberGenerator { <nl> / / / Creates a new instance of the system ' s default random number generator . <nl> @ inlinable <nl> mmm a / stdlib / public / core / Range . swift <nl> ppp b / stdlib / public / core / Range . swift <nl> extension RangeExpression { <nl> / / / ` Stride ` types , they cannot be used as the bounds of a countable range . If <nl> / / / you need to iterate over consecutive floating - point values , see the <nl> / / / ` stride ( from : to : by : ) ` function . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Range < Bound : Comparable > { <nl> / / / The range ' s lower bound . <nl> / / / <nl> extension Range : Encodable where Bound : Encodable { <nl> / / / let numbers = [ 10 , 20 , 30 , 40 , 50 , 60 , 70 ] <nl> / / / print ( numbers [ . . 
< 3 ] ) <nl> / / / / / Prints " [ 10 , 20 , 30 ] " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct PartialRangeUpTo < Bound : Comparable > { <nl> public let upperBound : Bound <nl> <nl> extension PartialRangeUpTo : Encodable where Bound : Encodable { <nl> / / / let numbers = [ 10 , 20 , 30 , 40 , 50 , 60 , 70 ] <nl> / / / print ( numbers [ . . . 3 ] ) <nl> / / / / / Prints " [ 10 , 20 , 30 , 40 ] " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct PartialRangeThrough < Bound : Comparable > { <nl> public let upperBound : Bound <nl> <nl> extension PartialRangeThrough : Encodable where Bound : Encodable { <nl> / / / ` Bound ` . For example , iterating over an instance of <nl> / / / ` PartialRangeFrom < Int > ` traps when the sequence ' s next value would be <nl> / / / above ` Int . max ` . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct PartialRangeFrom < Bound : Comparable > { <nl> public let lowerBound : Bound <nl> <nl> extension PartialRangeFrom : Sequence <nl> public typealias Element = Bound <nl> <nl> / / / The iterator for a ` PartialRangeFrom ` instance . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator : IteratorProtocol { <nl> @ usableFromInline <nl> internal var _current : Bound <nl> extension Comparable { <nl> / / / let word2 = " grisly " <nl> / / / let changes = countLetterChanges ( word1 [ . . . ] , word2 [ . . . ] ) <nl> / / / / / changes = = 2 <nl> - @ _frozen / / namespace <nl> + @ frozen / / namespace <nl> public enum UnboundedRange_ { <nl> / / FIXME : replace this with a computed var named ` . . . ` when the language makes <nl> / / that possible . <nl> mmm a / stdlib / public / core / Repeat . swift <nl> ppp b / stdlib / public / core / Repeat . swift <nl> <nl> / / / / / " Humperdinck " <nl> / / / / / " Humperdinck " <nl> / / / / / " Humperdinck " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Repeated < Element > { <nl> / / / The number of elements in this collection . <nl> public let count : Int <nl> mmm a / stdlib / public / core / Result . swift <nl> ppp b / stdlib / public / core / Result . swift <nl> <nl> <nl> / / / A value that represents either a success or a failure , including an <nl> / / / associated value in each case . <nl> - @ _frozen <nl> + @ frozen <nl> public enum Result < Success , Failure : Error > { <nl> / / / A success , storing a ` Success ` value . <nl> case success ( Success ) <nl> mmm a / stdlib / public / core / Reverse . swift <nl> ppp b / stdlib / public / core / Reverse . swift <nl> extension MutableCollection where Self : BidirectionalCollection { <nl> / / / * ` c . reversed ( ) ` does not create new storage <nl> / / / * ` c . reversed ( ) . map ( f ) ` maps eagerly and returns a new array <nl> / / / * ` c . lazy . reversed ( ) . map ( f ) ` maps lazily and returns a ` LazyMapCollection ` <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ReversedCollection < Base : BidirectionalCollection > { <nl> public let _base : Base <nl> <nl> public struct ReversedCollection < Base : BidirectionalCollection > { <nl> <nl> extension ReversedCollection { <nl> / / An iterator that can be much faster than the iterator of a reversed slice . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator { <nl> @ usableFromInline <nl> internal let _base : Base <nl> extension ReversedCollection : Sequence { <nl> extension ReversedCollection { <nl> / / / An index that traverses the same positions as an underlying index , <nl> / / / with inverted traversal direction . 
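Several of the flips in this section (`CommandLine`, `MemoryLayout`, `_DebuggerSupport`, `UnboundedRange_`) are marked `// namespace`: caseless enums that exist only to scope static members and can never be instantiated. A minimal sketch of the idiom with a hypothetical `ByteSize` namespace:

@frozen  // namespace: no cases, so no values of this type can ever exist
public enum ByteSize {
    public static let kilobyte = 1 << 10
    public static let megabyte = 1 << 20
    public static let gigabyte = 1 << 30
}

// Usage:
let bufferSize = 4 * ByteSize.kilobyte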
<nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Index { <nl> / / / The position after this position in the underlying collection . <nl> / / / <nl> mmm a / stdlib / public / core / SIMDVector . swift <nl> ppp b / stdlib / public / core / SIMDVector . swift <nl> where Scalar : BinaryFloatingPoint , Scalar . RawSignificand : FixedWidthInteger { <nl> } <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct SIMDMask < Storage > : SIMD <nl> where Storage : SIMD , <nl> Storage . Scalar : FixedWidthInteger & SignedInteger { <nl> mmm a / stdlib / public / core / SIMDVectorTypes . swift . gyb <nl> ppp b / stdlib / public / core / SIMDVectorTypes . swift . gyb <nl> ordinalPositions = [ ' first ' , ' second ' , ' third ' , ' fourth ' ] <nl> % for n in vectorscalarCounts : <nl> % storageN = 4 if n = = 3 else n <nl> / / / A vector of $ { spelledNumbers [ n ] } scalar values . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct SIMD $ { n } < Scalar > : SIMD where Scalar : SIMDScalar { <nl> <nl> public var _storage : Scalar . SIMD $ { storageN } Storage <nl> extension $ { Self } : SIMDScalar { <nl> % for n in storagescalarCounts : <nl> % bytes = n * self_type . bits / 8 <nl> / / / Storage for a vector of $ { spelledNumbers [ n ] } integers . <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ _alignment ( $ { bytes if bytes < = 16 else 16 } ) <nl> public struct SIMD $ { n } Storage : SIMDStorage { <nl> <nl> extension $ { Self } : SIMDScalar { <nl> % for n in storagescalarCounts : <nl> % bytes = n * bits / 8 <nl> / / / Storage for a vector of $ { spelledNumbers [ n ] } floating - point values . <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ _alignment ( $ { bytes if bytes < = 16 else 16 } ) <nl> public struct SIMD $ { n } Storage : SIMDStorage { <nl> <nl> mmm a / stdlib / public / core / Sequence . swift <nl> ppp b / stdlib / public / core / Sequence . swift <nl> extension Sequence where Self . Iterator = = Self { <nl> / / / ` Base ` iterator before possibly returning the first available element . <nl> / / / <nl> / / / The underlying iterator ' s sequence may be infinite . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct DropFirstSequence < Base : Sequence > { <nl> @ usableFromInline <nl> internal let _base : Base <nl> extension DropFirstSequence : Sequence { <nl> / / / ` Base ` iterator . <nl> / / / <nl> / / / The underlying iterator ' s sequence may be infinite . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct PrefixSequence < Base : Sequence > { <nl> @ usableFromInline <nl> internal var _base : Base <nl> public struct PrefixSequence < Base : Sequence > { <nl> } <nl> <nl> extension PrefixSequence { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator { <nl> @ usableFromInline <nl> internal var _base : Base . Iterator <nl> extension PrefixSequence : Sequence { <nl> / / / ` Base ` iterator before possibly returning the first available element . <nl> / / / <nl> / / / The underlying iterator ' s sequence may be infinite . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct DropWhileSequence < Base : Sequence > { <nl> public typealias Element = Base . Element <nl> <nl> public struct DropWhileSequence < Base : Sequence > { <nl> } <nl> <nl> extension DropWhileSequence { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator { <nl> @ usableFromInline <nl> internal var _iterator : Base . Iterator <nl> extension Sequence { <nl> / / / given just an iterator ` i ` : <nl> / / / <nl> / / / for x in IteratorSequence ( i ) { . . . 
} <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct IteratorSequence < Base : IteratorProtocol > { <nl> @ usableFromInline <nl> internal var _base : Base <nl> mmm a / stdlib / public / core / Set . swift <nl> ppp b / stdlib / public / core / Set . swift <nl> <nl> / / / unspecified . The instances of ` NSSet ` and ` Set ` share buffer using the <nl> / / / same copy - on - write optimization that is used when two instances of ` Set ` <nl> / / / share buffer . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Set < Element : Hashable > { <nl> @ usableFromInline <nl> internal var _variant : _Variant <nl> extension Set { <nl> <nl> extension Set { <nl> / / / The position of an element in a set . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Index { <nl> / / Index for native buffer is efficient . Index for bridged NSSet is <nl> / / not , because neither NSEnumerator nor fast enumeration support moving <nl> extension Set { <nl> / / safe to copy the state . So , we cannot implement Index that is a value <nl> / / type for bridged NSSet in terms of Cocoa enumeration facilities . <nl> <nl> - @ _frozen <nl> + @ frozen <nl> @ usableFromInline <nl> internal enum _Variant { <nl> case native ( _HashTable . Index ) <nl> extension Set . Index : Hashable { <nl> <nl> extension Set { <nl> / / / An iterator over the members of a ` Set < Element > ` . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator { <nl> / / Set has a separate IteratorProtocol and Index because of efficiency <nl> / / and implementability reasons . <nl> extension Set { <nl> / / IteratorProtocol , which is being consumed as iteration proceeds . <nl> <nl> @ usableFromInline <nl> - @ _frozen <nl> + @ frozen <nl> internal enum _Variant { <nl> case native ( _NativeSet < Element > . Iterator ) <nl> # if _runtime ( _ObjC ) <nl> mmm a / stdlib / public / core / SetBridging . swift <nl> ppp b / stdlib / public / core / SetBridging . swift <nl> final internal class _SwiftDeferredNSSet < Element : Hashable > <nl> / / classes , so it was renamed . The old names must not be used in the new <nl> / / runtime . <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct __CocoaSet { <nl> @ usableFromInline <nl> internal let object : AnyObject <nl> extension __CocoaSet : _SetBuffer { <nl> } <nl> <nl> extension __CocoaSet { <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline <nl> internal struct Index { <nl> internal var _storage : Builtin . BridgeObject <nl> mmm a / stdlib / public / core / SetBuilder . swift <nl> ppp b / stdlib / public / core / SetBuilder . swift <nl> <nl> / / / <nl> / / / Using a builder can be faster than inserting members into an empty <nl> / / / ` Set ` . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public / / SPI ( Foundation ) <nl> struct _SetBuilder < Element : Hashable > { <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / SetVariant . swift <nl> ppp b / stdlib / public / core / SetVariant . swift <nl> internal protocol _SetBuffer { <nl> <nl> extension Set { <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> internal struct _Variant { <nl> @ usableFromInline <nl> internal var object : _BridgeStorage < __RawSetStorage > <nl> mmm a / stdlib / public / core / SipHash . swift <nl> ppp b / stdlib / public / core / SipHash . 
swift <nl> <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> <nl> extension Hasher { <nl> - / / FIXME : Remove @ usableFromInline and @ _fixed_layout once Hasher is resilient . <nl> + / / FIXME : Remove @ usableFromInline and @ frozen once Hasher is resilient . <nl> / / rdar : / / problem / 38549901 <nl> - @ usableFromInline @ _fixed_layout <nl> + @ usableFromInline @ frozen <nl> internal struct _State { <nl> / / " somepseudorandomlygeneratedbytes " <nl> private var v0 : UInt64 = 0x736f6d6570736575 <nl> mmm a / stdlib / public / core / Slice . swift <nl> ppp b / stdlib / public / core / Slice . swift <nl> <nl> / / / collection type , don ' t use ` Slice ` as its subsequence type . Instead , <nl> / / / define your own subsequence type that takes your index invalidation <nl> / / / requirements into account . <nl> - @ _fixed_layout / / generic - performance <nl> + @ frozen / / generic - performance <nl> public struct Slice < Base : Collection > { <nl> public var _startIndex : Base . Index <nl> public var _endIndex : Base . Index <nl> mmm a / stdlib / public / core / SliceBuffer . swift <nl> ppp b / stdlib / public / core / SliceBuffer . swift <nl> <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> <nl> / / / Buffer type for ` ArraySlice < Element > ` . <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline <nl> internal struct _SliceBuffer < Element > <nl> : _ArrayBufferProtocol , <nl> mmm a / stdlib / public / core / SmallString . swift <nl> ppp b / stdlib / public / core / SmallString . swift <nl> <nl> / / ↑ ↑ <nl> / / first ( leftmost ) code unit discriminator ( incl . count ) <nl> / / <nl> - @ _fixed_layout @ usableFromInline <nl> + @ frozen @ usableFromInline <nl> internal struct _SmallString { <nl> @ usableFromInline <nl> internal typealias RawBitPattern = ( UInt64 , UInt64 ) <nl> mmm a / stdlib / public / core / StaticString . swift <nl> ppp b / stdlib / public / core / StaticString . swift <nl> <nl> / / / commonly used ` String ` type . A static string can store its value as a <nl> / / / pointer to an ASCII code unit sequence , as a pointer to a UTF - 8 code unit <nl> / / / sequence , or as a single Unicode scalar value . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct StaticString <nl> : _ExpressibleByBuiltinUnicodeScalarLiteral , <nl> _ExpressibleByBuiltinExtendedGraphemeClusterLiteral , <nl> mmm a / stdlib / public / core / Stride . swift <nl> ppp b / stdlib / public / core / Stride . swift <nl> extension Strideable where Self : FloatingPoint , Self = = Stride { <nl> } <nl> <nl> / / / An iterator for a ` StrideTo ` instance . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct StrideToIterator < Element : Strideable > { <nl> @ usableFromInline <nl> internal let _start : Element <nl> extension StrideToIterator : IteratorProtocol { <nl> / / / A sequence of values formed by striding over a half - open interval . <nl> / / / <nl> / / / Use the ` stride ( from : to : by : ) ` function to create ` StrideTo ` instances . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct StrideTo < Element : Strideable > { <nl> @ usableFromInline <nl> internal let _start : Element <nl> public func stride < T > ( <nl> } <nl> <nl> / / / An iterator for a ` StrideThrough ` instance . 
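Many of the internal types in this diff (the hash-table buffers, `Hasher._State` just above, the string internals below) pair `@frozen` with `@usableFromInline`. This matches the check added in the type-checker hunk at the top of this section: a non-public declaration may only be frozen if inlinable client code can actually see it. A hedged sketch with hypothetical names:

@usableFromInline
@frozen
internal struct _Counter {
    @usableFromInline
    internal var value: Int

    @inlinable
    internal init(startingAt value: Int) {
        self.value = value
    }

    @inlinable
    internal mutating func bump() {
        value += 1
    }
}

// Without @usableFromInline, the same declaration would be rejected with the
// frozen_attr_on_internal_type diagnostic emitted by the checker hunk above.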
<nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct StrideThroughIterator < Element : Strideable > { <nl> @ usableFromInline <nl> internal let _start : Element <nl> extension StrideThroughIterator : IteratorProtocol { <nl> / / / <nl> / / / Use the ` stride ( from : through : by : ) ` function to create ` StrideThrough ` <nl> / / / instances . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct StrideThrough < Element : Strideable > { <nl> @ usableFromInline <nl> internal let _start : Element <nl> mmm a / stdlib / public / core / String . swift <nl> ppp b / stdlib / public / core / String . swift <nl> internal func unimplemented_utf8_32bit ( <nl> / / / [ clusters ] : http : / / www . unicode . org / glossary / # extended_grapheme_cluster <nl> / / / [ scalars ] : http : / / www . unicode . org / glossary / # unicode_scalar_value <nl> / / / [ equivalence ] : http : / / www . unicode . org / glossary / # canonical_equivalent <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct String { <nl> public / / @ SPI ( Foundation ) <nl> var _guts : _StringGuts <nl> mmm a / stdlib / public / core / StringCharacterView . swift <nl> ppp b / stdlib / public / core / StringCharacterView . swift <nl> extension String : BidirectionalCollection { <nl> } <nl> <nl> extension String { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator : IteratorProtocol { <nl> @ usableFromInline <nl> internal var _guts : _StringGuts <nl> mmm a / stdlib / public / core / StringComparison . swift <nl> ppp b / stdlib / public / core / StringComparison . swift <nl> private func _findBoundary ( <nl> } <nl> } <nl> <nl> - @ _frozen <nl> + @ frozen <nl> @ usableFromInline <nl> internal enum _StringComparisonResult { <nl> case equal <nl> mmm a / stdlib / public / core / StringGuts . swift <nl> ppp b / stdlib / public / core / StringGuts . swift <nl> import SwiftShims <nl> / / StringGuts is a parameterization over String ' s representations . It provides <nl> / / functionality and guidance for efficiently working with Strings . <nl> / / <nl> - @ _fixed_layout <nl> + @ frozen <nl> public / / SPI ( corelibs - foundation ) <nl> struct _StringGuts { <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / StringIndex . swift <nl> ppp b / stdlib / public / core / StringIndex . swift <nl> the default value being ` 0 ` . <nl> * / <nl> extension String { <nl> / / / A position of a character or code unit in a string . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Index { <nl> @ usableFromInline <nl> internal var _rawBits : UInt64 <nl> mmm a / stdlib / public / core / StringInterpolation . swift <nl> ppp b / stdlib / public / core / StringInterpolation . swift <nl> <nl> / / / <nl> / / / ` DefaultStringInterpolation ` extensions should add only ` mutating ` members <nl> / / / and should not copy ` self ` or capture it in an escaping closure . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct DefaultStringInterpolation : StringInterpolationProtocol { <nl> / / / The string contents accumulated by this instance . <nl> @ usableFromInline <nl> mmm a / stdlib / public / core / StringObject . swift <nl> ppp b / stdlib / public / core / StringObject . swift <nl> <nl> ` _discriminator ` ' s ` b6 ` . 
<nl> * / <nl> <nl> - @ _fixed_layout @ usableFromInline <nl> + @ frozen @ usableFromInline <nl> internal struct _StringObject { <nl> / / Namespace to hold magic numbers <nl> - @ usableFromInline @ _frozen <nl> + @ usableFromInline @ frozen <nl> enum Nibbles { } <nl> <nl> / / Abstract the count and performance - flags containing word <nl> - @ _fixed_layout @ usableFromInline <nl> + @ frozen @ usableFromInline <nl> struct CountAndFlags { <nl> @ usableFromInline <nl> var _storage : UInt64 <nl> internal struct _StringObject { <nl> } <nl> <nl> # if arch ( i386 ) | | arch ( arm ) <nl> - @ usableFromInline @ _frozen <nl> + @ usableFromInline @ frozen <nl> internal enum Variant { <nl> case immortal ( UInt ) <nl> case native ( AnyObject ) <nl> mmm a / stdlib / public / core / StringSwitch . swift <nl> ppp b / stdlib / public / core / StringSwitch . swift <nl> func _findStringSwitchCase ( <nl> return - 1 <nl> } <nl> <nl> - @ _fixed_layout / / needs known size for static allocation <nl> + @ frozen / / needs known size for static allocation <nl> public / / used by COMPILER_INTRINSIC <nl> struct _OpaqueStringSwitchCache { <nl> var a : Builtin . Word <nl> mmm a / stdlib / public / core / StringUTF16View . swift <nl> ppp b / stdlib / public / core / StringUTF16View . swift <nl> extension String { <nl> / / / print ( snowy [ range ] ) <nl> / / / } <nl> / / / / / Prints " Let it snow ! " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UTF16View { <nl> @ usableFromInline <nl> internal var _guts : _StringGuts <nl> extension String . UTF16View : BidirectionalCollection { <nl> } <nl> <nl> extension String . UTF16View { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator : IteratorProtocol { <nl> @ usableFromInline <nl> internal var _guts : _StringGuts <nl> mmm a / stdlib / public / core / StringUTF8View . swift <nl> ppp b / stdlib / public / core / StringUTF8View . swift <nl> extension String { <nl> / / / / / Prints " - 17 " <nl> / / / print ( String ( s1 . utf8 . prefix ( 15 ) ) ) <nl> / / / / / Prints " They call me ' B " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UTF8View { <nl> @ usableFromInline <nl> internal var _guts : _StringGuts <nl> mmm a / stdlib / public / core / StringUnicodeScalarView . swift <nl> ppp b / stdlib / public / core / StringUnicodeScalarView . swift <nl> extension String { <nl> / / / print ( asciiPrefix ) <nl> / / / } <nl> / / / / / Prints " My favorite emoji is " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UnicodeScalarView { <nl> @ usableFromInline <nl> internal var _guts : _StringGuts <nl> extension String . UnicodeScalarView : BidirectionalCollection { <nl> } <nl> <nl> extension String . UnicodeScalarView { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator : IteratorProtocol { <nl> @ usableFromInline <nl> internal var _guts : _StringGuts <nl> mmm a / stdlib / public / core / Substring . swift <nl> ppp b / stdlib / public / core / Substring . swift <nl> extension String { <nl> / / / when there is no other reference to the original string . Storing <nl> / / / substrings may , therefore , prolong the lifetime of string data that is <nl> / / / no longer otherwise accessible , which can appear to be memory leakage . 
<nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Substring { <nl> @ usableFromInline <nl> internal var _slice : Slice < String > <nl> extension Substring : LosslessStringConvertible { <nl> } <nl> <nl> extension Substring { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UTF8View { <nl> @ usableFromInline <nl> internal var _slice : Slice < String . UTF8View > <nl> extension String { <nl> } <nl> } <nl> extension Substring { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UTF16View { <nl> @ usableFromInline <nl> internal var _slice : Slice < String . UTF16View > <nl> extension String { <nl> } <nl> } <nl> extension Substring { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UnicodeScalarView { <nl> @ usableFromInline <nl> internal var _slice : Slice < String . UnicodeScalarView > <nl> mmm a / stdlib / public / core / UIntBuffer . swift <nl> ppp b / stdlib / public / core / UIntBuffer . swift <nl> <nl> / / 255 elements . <nl> / / <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct _UIntBuffer < Element : UnsignedInteger & FixedWidthInteger > { <nl> public typealias Storage = UInt32 <nl> public var _storage : Storage <nl> public struct _UIntBuffer < Element : UnsignedInteger & FixedWidthInteger > { <nl> extension _UIntBuffer : Sequence { <nl> public typealias SubSequence = Slice < _UIntBuffer > <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator : IteratorProtocol , Sequence { <nl> @ inlinable <nl> @ inline ( __always ) <nl> extension _UIntBuffer : Sequence { <nl> } <nl> <nl> extension _UIntBuffer : Collection { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Index : Comparable { <nl> @ usableFromInline <nl> internal var bitOffset : UInt8 <nl> mmm a / stdlib / public / core / UTF16 . swift <nl> ppp b / stdlib / public / core / UTF16 . swift <nl> <nl> / / <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> extension Unicode { <nl> - @ _frozen <nl> + @ frozen <nl> public enum UTF16 { <nl> case _swift3Buffer ( Unicode . UTF16 . ForwardParser ) <nl> } <nl> extension Unicode . UTF16 : Unicode . Encoding { <nl> return encode ( FromEncoding . decode ( content ) ) <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ForwardParser { <nl> public typealias _Buffer = _UIntBuffer < UInt16 > <nl> @ inlinable <nl> extension Unicode . UTF16 : Unicode . Encoding { <nl> public var _buffer : _Buffer <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ReverseParser { <nl> public typealias _Buffer = _UIntBuffer < UInt16 > <nl> @ inlinable <nl> mmm a / stdlib / public / core / UTF32 . swift <nl> ppp b / stdlib / public / core / UTF32 . swift <nl> <nl> / / <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> extension Unicode { <nl> - @ _frozen <nl> + @ frozen <nl> public enum UTF32 { <nl> case _swift3Codec <nl> } <nl> extension Unicode . UTF32 : Unicode . Encoding { <nl> return EncodedScalar ( source . value ) <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Parser { <nl> @ inlinable <nl> public init ( ) { } <nl> mmm a / stdlib / public / core / UTF8 . swift <nl> ppp b / stdlib / public / core / UTF8 . 
swift <nl> <nl> / / <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> extension Unicode { <nl> - @ _frozen <nl> + @ frozen <nl> public enum UTF8 { <nl> case _swift3Buffer ( Unicode . UTF8 . ForwardParser ) <nl> } <nl> extension Unicode . UTF8 : _UnicodeEncoding { <nl> return encode ( FromEncoding . decode ( content ) ) <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ForwardParser { <nl> public typealias _Buffer = _UIntBuffer < UInt8 > <nl> @ inline ( __always ) <nl> extension Unicode . UTF8 : _UnicodeEncoding { <nl> public var _buffer : _Buffer <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ReverseParser { <nl> public typealias _Buffer = _UIntBuffer < UInt8 > <nl> @ inline ( __always ) <nl> mmm a / stdlib / public / core / UnfoldSequence . swift <nl> ppp b / stdlib / public / core / UnfoldSequence . swift <nl> public typealias UnfoldFirstSequence < T > = UnfoldSequence < T , ( T ? , Bool ) > <nl> / / / <nl> / / / Instances of ` UnfoldSequence ` are created with the functions <nl> / / / ` sequence ( first : next : ) ` and ` sequence ( state : next : ) ` . <nl> - @ _fixed_layout / / generic - performance <nl> + @ frozen / / generic - performance <nl> public struct UnfoldSequence < Element , State > : Sequence , IteratorProtocol { <nl> @ inlinable / / generic - performance <nl> public mutating func next ( ) - > Element ? { <nl> mmm a / stdlib / public / core / Unicode . swift <nl> ppp b / stdlib / public / core / Unicode . swift <nl> import SwiftShims <nl> / / / Each ` UnicodeDecodingResult ` instance can represent a Unicode scalar value , <nl> / / / an indication that no more Unicode scalars are available , or an indication <nl> / / / of a decoding error . <nl> - @ _frozen <nl> + @ frozen <nl> public enum UnicodeDecodingResult : Equatable { <nl> / / / A decoded Unicode scalar value . <nl> case scalarValue ( Unicode . Scalar ) <nl> public func transcode < Input , InputEncoding , OutputEncoding > ( <nl> } <nl> <nl> / / / A namespace for Unicode utilities . <nl> - @ _frozen <nl> + @ frozen <nl> public enum Unicode { } <nl> <nl> mmm a / stdlib / public / core / UnicodeParser . swift <nl> ppp b / stdlib / public / core / UnicodeParser . swift <nl> <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> extension Unicode { <nl> / / / The result of attempting to parse a ` T ` from some input . <nl> - @ _frozen <nl> + @ frozen <nl> public enum ParseResult < T > { <nl> / / / A ` T ` was parsed successfully <nl> case valid ( T ) <nl> mmm a / stdlib / public / core / UnicodeScalar . swift <nl> ppp b / stdlib / public / core / UnicodeScalar . swift <nl> extension Unicode { <nl> / / / let airplane = Unicode . Scalar ( 9992 ) <nl> / / / print ( airplane ) <nl> / / / / / Prints " ✈ ︎ " <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Scalar { <nl> @ inlinable <nl> internal init ( _value : UInt32 ) { <nl> extension Unicode . Scalar : Comparable { <nl> } <nl> <nl> extension Unicode . Scalar { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UTF16View { <nl> @ inlinable <nl> internal init ( value : Unicode . Scalar ) { <nl> extension Unicode . Scalar . UTF16View : RandomAccessCollection { <nl> <nl> extension Unicode . Scalar { <nl> @ available ( macOS 9999 , iOS 9999 , tvOS 9999 , watchOS 9999 , * ) <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UTF8View { <nl> @ inlinable <nl> internal init ( value : Unicode . 
Scalar ) { <nl> mmm a / stdlib / public / core / Unmanaged . swift <nl> ppp b / stdlib / public / core / Unmanaged . swift <nl> <nl> / / / <nl> / / / When you use this type , you become partially responsible for <nl> / / / keeping the object alive . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Unmanaged < Instance : AnyObject > { <nl> @ usableFromInline <nl> internal unowned ( unsafe ) var _value : Instance <nl> mmm a / stdlib / public / core / UnsafeBufferPointer . swift . gyb <nl> ppp b / stdlib / public / core / UnsafeBufferPointer . swift . gyb <nl> <nl> / / FIXME : rdar : / / 18157434 - until this is fixed , this has to be fixed layout <nl> / / to avoid a hang in Foundation , which has the following setup : <nl> / / struct A { struct B { let x : UnsafeMutableBufferPointer < . . . > } let b : B } <nl> - @ _fixed_layout / / unsafe - performance <nl> + @ frozen / / unsafe - performance <nl> public struct Unsafe $ { Mutable } BufferPointer < Element > { <nl> <nl> @ usableFromInline <nl> public struct Unsafe $ { Mutable } BufferPointer < Element > { <nl> extension UnsafeBufferPointer { <nl> / / / An iterator for the elements in the buffer referenced by an <nl> / / / ` UnsafeBufferPointer ` or ` UnsafeMutableBufferPointer ` instance . <nl> - @ _fixed_layout / / unsafe - performance <nl> + @ frozen / / unsafe - performance <nl> public struct Iterator { <nl> @ usableFromInline <nl> internal var _position , _end : UnsafePointer < Element > ? <nl> mmm a / stdlib / public / core / UnsafePointer . swift <nl> ppp b / stdlib / public / core / UnsafePointer . swift <nl> <nl> / / / var number = 5 <nl> / / / let numberPointer = UnsafePointer < Int > ( & number ) <nl> / / / / / Accessing ' numberPointer ' is undefined behavior . <nl> - @ _fixed_layout / / unsafe - performance <nl> + @ frozen / / unsafe - performance <nl> public struct UnsafePointer < Pointee > : _Pointer { <nl> <nl> / / / A type that represents the distance between two pointers . <nl> public struct UnsafePointer < Pointee > : _Pointer { <nl> / / / var number = 5 <nl> / / / let numberPointer = UnsafeMutablePointer < Int > ( & number ) <nl> / / / / / Accessing ' numberPointer ' is undefined behavior . <nl> - @ _fixed_layout / / unsafe - performance <nl> + @ frozen / / unsafe - performance <nl> public struct UnsafeMutablePointer < Pointee > : _Pointer { <nl> <nl> / / / A type that represents the distance between two pointers . <nl> mmm a / stdlib / public / core / UnsafeRawBufferPointer . swift . gyb <nl> ppp b / stdlib / public / core / UnsafeRawBufferPointer . swift . gyb <nl> <nl> / / / <nl> / / / destBytes [ 0 . . < n ] = someBytes [ n . . < ( n + n ) ] <nl> % end <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Unsafe $ { Mutable } RawBufferPointer { <nl> @ usableFromInline <nl> internal let _position , _end : Unsafe $ { Mutable } RawPointer ? <nl> public struct Unsafe $ { Mutable } RawBufferPointer { <nl> % if not mutable : <nl> extension UnsafeRawBufferPointer { <nl> / / / An iterator over the bytes viewed by a raw buffer pointer . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator { <nl> @ usableFromInline <nl> internal var _position , _end : UnsafeRawPointer ? <nl> mmm a / stdlib / public / core / UnsafeRawPointer . swift <nl> ppp b / stdlib / public / core / UnsafeRawPointer . swift <nl> <nl> / / / var number = 5 <nl> / / / let numberPointer = UnsafeRawPointer ( & number ) <nl> / / / / / Accessing ' numberPointer ' is undefined behavior . 
<nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UnsafeRawPointer : _Pointer { <nl> <nl> public typealias Pointee = UInt8 <nl> extension UnsafeRawPointer : Strideable { <nl> / / / var number = 5 <nl> / / / let numberPointer = UnsafeMutableRawPointer ( & number ) <nl> / / / / / Accessing ' numberPointer ' is undefined behavior . <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UnsafeMutableRawPointer : _Pointer { <nl> <nl> public typealias Pointee = UInt8 <nl> mmm a / stdlib / public / core / ValidUTF8Buffer . swift <nl> ppp b / stdlib / public / core / ValidUTF8Buffer . swift <nl> <nl> / / 0xFF <nl> / / <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct _ValidUTF8Buffer { <nl> public typealias Element = Unicode . UTF8 . CodeUnit <nl> <nl> public struct _ValidUTF8Buffer { <nl> extension _ValidUTF8Buffer : Sequence { <nl> public typealias SubSequence = Slice < _ValidUTF8Buffer > <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator : IteratorProtocol , Sequence { <nl> @ inlinable <nl> public init ( _ x : _ValidUTF8Buffer ) { _biasedBits = x . _biasedBits } <nl> extension _ValidUTF8Buffer : Sequence { <nl> } <nl> <nl> extension _ValidUTF8Buffer : Collection { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Index : Comparable { <nl> @ usableFromInline <nl> internal var _biasedBits : UInt32 <nl> mmm a / stdlib / public / core / VarArgs . swift <nl> ppp b / stdlib / public / core / VarArgs . swift <nl> extension Float80 : CVarArg , _CVarArgAligned { <nl> @ usableFromInline / / c - abi <nl> final internal class __VaListBuilder { <nl> # if arch ( x86_64 ) | | arch ( s390x ) <nl> - @ _fixed_layout / / c - abi <nl> + @ frozen / / c - abi <nl> @ usableFromInline <nl> internal struct Header { <nl> @ inlinable / / c - abi <nl> mmm a / stdlib / public / core / Zip . swift <nl> ppp b / stdlib / public / core / Zip . swift <nl> public func zip < Sequence1 , Sequence2 > ( <nl> / / / / / Prints " two : 2 <nl> / / / / / Prints " three : 3 " <nl> / / / / / Prints " four : 4 " <nl> - @ _fixed_layout / / generic - performance <nl> + @ frozen / / generic - performance <nl> public struct Zip2Sequence < Sequence1 : Sequence , Sequence2 : Sequence > { <nl> @ usableFromInline / / generic - performance <nl> internal let _sequence1 : Sequence1 <nl> public struct Zip2Sequence < Sequence1 : Sequence , Sequence2 : Sequence > { <nl> <nl> extension Zip2Sequence { <nl> / / / An iterator for ` Zip2Sequence ` . <nl> - @ _fixed_layout / / generic - performance <nl> + @ frozen / / generic - performance <nl> public struct Iterator { <nl> @ usableFromInline / / generic - performance <nl> internal var _baseStream1 : Sequence1 . Iterator <nl> mmm a / test / ClangImporter / enum - exhaustivity . swift <nl> ppp b / test / ClangImporter / enum - exhaustivity . swift <nl> <nl> / / CHECK - NEXT : case B <nl> / / CHECK - NEXT : { { ^ } $ } } <nl> <nl> - / / Note that we don ' t print ' @ _frozen ' here yet . <nl> + / / Note that we don ' t print ' @ frozen ' here yet . <nl> / / CHECK - LABEL : { { ^ } } enum ExhaustiveEnum : { { . + } } { <nl> / / CHECK : case A <nl> / / CHECK - NEXT : case B <nl> mmm a / test / Compatibility / exhaustive_switch . swift <nl> ppp b / test / Compatibility / exhaustive_switch . 
swift <nl> public enum NonExhaustivePayload { <nl> case a ( Int ) , b ( Bool ) <nl> } <nl> <nl> - @ _frozen public enum TemporalProxy { <nl> + @ frozen public enum TemporalProxy { <nl> case seconds ( Int ) <nl> case milliseconds ( Int ) <nl> case microseconds ( Int ) <nl> mmm a / test / DebugInfo / patternvars . swift <nl> ppp b / test / DebugInfo / patternvars . swift <nl> <nl> / / RUN : % target - swift - frontend % s - emit - ir - g - o - | % FileCheck % s <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct UnicodeScalar { <nl> var _value : UInt32 <nl> public var value : UInt32 { return _value } <nl> mmm a / test / FixCode / fixits - switch - nonfrozen . swift <nl> ppp b / test / FixCode / fixits - switch - nonfrozen . swift <nl> public enum NonExhaustivePayload { <nl> case a ( Int ) , b ( Bool ) <nl> } <nl> <nl> - @ _frozen public enum TemporalProxy { <nl> + @ frozen public enum TemporalProxy { <nl> case seconds ( Int ) <nl> case milliseconds ( Int ) <nl> case microseconds ( Int ) <nl> mmm a / test / FixCode / fixits - switch - nonfrozen . swift . result <nl> ppp b / test / FixCode / fixits - switch - nonfrozen . swift . result <nl> public enum NonExhaustivePayload { <nl> case a ( Int ) , b ( Bool ) <nl> } <nl> <nl> - @ _frozen public enum TemporalProxy { <nl> + @ frozen public enum TemporalProxy { <nl> case seconds ( Int ) <nl> case milliseconds ( Int ) <nl> case microseconds ( Int ) <nl> mmm a / test / Frontend / sil - merge - partial - modules . swift <nl> ppp b / test / Frontend / sil - merge - partial - modules . swift <nl> public func inlinableFunction ( ) { <nl> fn ( ) <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Rectangle : Shape { <nl> @ inlinable <nl> public func draw ( ) { <nl> mmm a / test / IRGen / Inputs / type_layout_dumper_other . swift <nl> ppp b / test / IRGen / Inputs / type_layout_dumper_other . swift <nl> public class SomeClass { } <nl> <nl> public protocol SomeProtocol { } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ConcreteFragileStruct { <nl> var field : Int32 <nl> <nl> public struct NestedResilientStruct { } <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct NonDependentFragileStruct < T : AnyObject > { <nl> var field : T <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct DependentFragileStruct < T > { <nl> var field : T <nl> } <nl> mmm a / test / IRGen / class_resilience . swift <nl> ppp b / test / IRGen / class_resilience . swift <nl> extension ResilientGenericOutsideParent { <nl> / / to their best - known value and made non - constant if that value might <nl> / / disagree with the dynamic value . <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Empty { } <nl> <nl> public class ClassWithEmptyThenResilient { <nl> mmm a / test / IRGen / class_resilience_objc . swift <nl> ppp b / test / IRGen / class_resilience_objc . swift <nl> func testConstantIndirectFieldAccess < T > ( _ o : GenericObjCSubclass < T > ) { <nl> o . field = 10 <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Empty { } <nl> <nl> public class ClassWithEmptyThenResilient : DummyClass { <nl> mmm a / test / IRGen / enum_resilience . swift <nl> ppp b / test / IRGen / enum_resilience . 
swift <nl> public struct Reference { <nl> public var n : Class <nl> } <nl> <nl> - @ _frozen public enum Either { <nl> + @ frozen public enum Either { <nl> case Left ( Reference ) <nl> case Right ( Reference ) <nl> } <nl> enum InternalEither { <nl> case Right ( Reference ) <nl> } <nl> <nl> - @ _fixed_layout public struct ReferenceFast { <nl> + @ frozen public struct ReferenceFast { <nl> public var n : Class <nl> } <nl> <nl> - @ _frozen public enum EitherFast { <nl> + @ frozen public enum EitherFast { <nl> case Left ( ReferenceFast ) <nl> case Right ( ReferenceFast ) <nl> } <nl> mmm a / test / Inputs / resilient_enum . swift <nl> ppp b / test / Inputs / resilient_enum . swift <nl> <nl> import resilient_struct <nl> <nl> / / Fixed - layout enum with resilient members <nl> - @ _frozen public enum SimpleShape { <nl> + @ frozen public enum SimpleShape { <nl> case KleinBottle <nl> case Triangle ( Size ) <nl> } <nl> <nl> / / Fixed - layout enum with resilient members <nl> - @ _frozen public enum Shape { <nl> + @ frozen public enum Shape { <nl> case Point <nl> case Rect ( Size ) <nl> case RoundedRect ( Size , Size ) <nl> } <nl> <nl> / / Fixed - layout enum with indirect resilient members <nl> - @ _frozen public enum FunnyShape { <nl> + @ frozen public enum FunnyShape { <nl> indirect case Parallelogram ( Size ) <nl> indirect case Trapezoid ( Size ) <nl> } <nl> <nl> - @ _frozen public enum FullyFixedLayout { <nl> + @ frozen public enum FullyFixedLayout { <nl> case noPayload <nl> case hasPayload ( Int ) <nl> } <nl> public struct Color { <nl> } <nl> } <nl> <nl> - @ _frozen public enum CustomColor { <nl> + @ frozen public enum CustomColor { <nl> case Black <nl> case White <nl> case Custom ( Color ) <nl> mmm a / test / Inputs / resilient_struct . swift <nl> ppp b / test / Inputs / resilient_struct . swift <nl> <nl> / / Fixed - layout struct <nl> - @ _fixed_layout public struct Point { <nl> + @ frozen public struct Point { <nl> public var x : Int / / read - write stored property <nl> public let y : Int / / read - only stored property <nl> <nl> public struct Size { <nl> } <nl> <nl> / / Fixed - layout struct with resilient members <nl> - @ _fixed_layout public struct Rectangle { <nl> + @ frozen public struct Rectangle { <nl> public let p : Point <nl> public let s : Size <nl> public let color : Int <nl> public struct ResilientDouble { <nl> } <nl> } <nl> <nl> - @ _fixed_layout public struct ResilientLayoutRuntimeTest { <nl> + @ frozen public struct ResilientLayoutRuntimeTest { <nl> public let b1 : ResilientBool <nl> public let i : ResilientInt <nl> public let b2 : ResilientBool <nl> mmm a / test / Interpreter / class_resilience . swift <nl> ppp b / test / Interpreter / class_resilience . swift <nl> ResilientClassTestSuite . test ( " TypeByName " ) { <nl> = = ChildOfOutsideParentWithResilientStoredProperty . self ) <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Empty { } <nl> <nl> / / rdar : / / 48031465 <nl> mmm a / test / ParseableInterface / Conformances . swiftinterface <nl> ppp b / test / ParseableInterface / Conformances . swiftinterface <nl> extension MyProto { <nl> / / NEGATIVE - MODULE - NOT : sil_default_witness_table { { . 
+ } } MyProto <nl> <nl> <nl> - @ _fixed_layout / / allow conformance devirtualization <nl> + @ frozen / / allow conformance devirtualization <nl> public struct FullStructImpl : MyProto { <nl> public init ( ) <nl> public func method ( ) <nl> public struct FullStructImpl : MyProto { <nl> / / CHECK : function_ref @ $ s12Conformances14FullStructImplVyS2icig <nl> / / CHECK : end sil function ' $ s16ConformancesUser8testFullSiyF ' <nl> <nl> - @ _fixed_layout / / allow conformance devirtualization <nl> + @ frozen / / allow conformance devirtualization <nl> public struct OpaqueStructImpl : MyProto { } <nl> <nl> / / CHECK - LABEL : sil @ $ s16ConformancesUser10testOpaqueSiyF <nl> mmm a / test / ParseableInterface / attrs . swift <nl> ppp b / test / ParseableInterface / attrs . swift <nl> <nl> / / CHECK : @ _effects ( readnone ) public func illiterate ( ) { { $ } } <nl> @ _effects ( readnone ) public func illiterate ( ) { } <nl> <nl> - / / CHECK - LABEL : @ _fixed_layout public struct Point { <nl> - @ _fixed_layout public struct Point { <nl> + / / CHECK - LABEL : @ frozen public struct Point { <nl> + @ frozen public struct Point { <nl> / / CHECK - NEXT : public var x : Int <nl> public var x : Int <nl> / / CHECK - NEXT : public var y : Int <nl> mmm a / test / ParseableInterface / fixed - layout - property - initializers . swift <nl> ppp b / test / ParseableInterface / fixed - layout - property - initializers . swift <nl> <nl> / / RUN : % target - swift - frontend - emit - module - o % t / TestResilient . swiftmodule - enable - library - evolution % t - resilient . swiftinterface - disable - objc - attr - requires - foundation - module <nl> / / RUN : % target - swift - frontend - emit - module - o / dev / null - merge - modules % t / TestResilient . swiftmodule - module - name TestResilient - enable - library - evolution - emit - parseable - module - interface - path - | % FileCheck % s - - check - prefix FROMMODULE - - check - prefix RESILIENT - - check - prefix COMMON <nl> <nl> - / / COMMON : @ _fixed_layout public struct MyStruct { <nl> - @ _fixed_layout <nl> + / / COMMON : @ frozen public struct MyStruct { <nl> + @ frozen <nl> public struct MyStruct { <nl> / / COMMON : public var publicVar : [ [ BOOL : ( Swift \ . ) ? Bool ] ] = false <nl> public var publicVar : Bool = false <nl> mmm a / test / ParseableInterface / lazy - vars . swift <nl> ppp b / test / ParseableInterface / lazy - vars . swift <nl> <nl> / / RUN : % target - swift - frontend - build - module - from - parseable - interface % t / TestResilient . swiftinterface - o % t / TestResilient . swiftmodule <nl> / / RUN : % target - swift - frontend - emit - module - o / dev / null - merge - modules - emit - parseable - module - interface - path - % t / TestResilient . swiftmodule - module - name TestResilient | % FileCheck % s - - check - prefix CHECK - - check - prefix RESILIENT <nl> <nl> - / / CHECK : @ _fixed_layout public struct HasLazyVarsFixedLayout { <nl> + / / CHECK : @ frozen public struct HasLazyVarsFixedLayout { <nl> / / CHECK - NEXT : public var foo : [ [ INT : ( Swift \ . ) ? Int ] ] { <nl> / / CHECK - NEXT : mutating get <nl> / / CHECK - NEXT : set <nl> <nl> / / CHECK - NOT : private var bar <nl> / / CHECK : private var $ __lazy_storage_ $ _bar : [ [ INT ] ] ? <nl> / / CHECK - NEXT : } <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct HasLazyVarsFixedLayout { <nl> public lazy var foo : Int = 0 <nl> private lazy var bar : Int = 0 <nl> mmm a / test / ParseableInterface / stored - properties . 
swift <nl> ppp b / test / ParseableInterface / stored - properties . swift <nl> public struct HasStoredProperties { <nl> / / COMMON : } <nl> } <nl> <nl> - / / COMMON : @ _fixed_layout public struct BagOfVariables { <nl> - @ _fixed_layout <nl> + / / COMMON : @ frozen public struct BagOfVariables { <nl> + @ frozen <nl> public struct BagOfVariables { <nl> / / COMMON : private let hidden : [ [ INT ] ] = 0 <nl> private let hidden : Int = 0 <nl> public struct BagOfVariables { <nl> / / COMMON : } <nl> } <nl> <nl> - / / COMMON : @ _fixed_layout public struct HasStoredPropertiesFixedLayout { <nl> - @ _fixed_layout <nl> + / / COMMON : @ frozen public struct HasStoredPropertiesFixedLayout { <nl> + @ frozen <nl> public struct HasStoredPropertiesFixedLayout { <nl> / / COMMON : public var simpleStoredMutable : [ [ BAGOFVARIABLES : . * BagOfVariables ] ] <nl> public var simpleStoredMutable : BagOfVariables <nl> mmm a / test / PrintAsObjC / enums - frozen . swift <nl> ppp b / test / PrintAsObjC / enums - frozen . swift <nl> <nl> import Foundation <nl> <nl> / / CHECK - LABEL : typedef SWIFT_ENUM ( NSInteger , FrozenEnum , closed ) { <nl> - @ objc @ _frozen public enum FrozenEnum : Int { <nl> + @ objc @ frozen public enum FrozenEnum : Int { <nl> case yes <nl> case no <nl> } <nl> mmm a / test / Prototypes / UnicodeDecoders . swift <nl> ppp b / test / Prototypes / UnicodeDecoders . swift <nl> extension Unicode . Scalar { <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> <nl> extension Unicode { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public / / @ testable <nl> struct _ParsingIterator < <nl> CodeUnitIterator : IteratorProtocol , <nl> mmm a / test / SIL / Serialization / Inputs / nontransparent . swift <nl> ppp b / test / SIL / Serialization / Inputs / nontransparent . swift <nl> public enum Optional < T > { <nl> case some ( T ) <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct B { <nl> @ inlinable <nl> public func amIConfused ( ) { } <nl> public struct B { <nl> public init ( ) { } <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct A { <nl> public var b : B <nl> <nl> mmm a / test / SIL / Serialization / Inputs / shared_function_serialization_input . swift <nl> ppp b / test / SIL / Serialization / Inputs / shared_function_serialization_input . swift <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct X { <nl> @ inlinable <nl> public init ( ) { } <nl> mmm a / test / SIL / Serialization / Inputs / specializer_input . swift <nl> ppp b / test / SIL / Serialization / Inputs / specializer_input . swift <nl> <nl> <nl> public typealias Int = Builtin . Int32 <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Container < V > { <nl> @ inlinable <nl> @ inline ( never ) <nl> mmm a / test / SILGen / fixed_layout_attribute . swift <nl> ppp b / test / SILGen / fixed_layout_attribute . swift <nl> public struct NonFixedStruct { <nl> / / CHECK : function_ref @ $ s22fixed_layout_attribute6globalSivau <nl> / / CHECK : return <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct FixedStruct { <nl> public var storedProperty = global <nl> } <nl> struct AnotherInternalStruct { <nl> <nl> / / Static properties in fixed - layout type is still resilient <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct HasStaticProperty { <nl> public static var staticProperty : Int = 0 <nl> } <nl> mmm a / test / SILGen / inlinable_attribute . swift <nl> ppp b / test / SILGen / inlinable_attribute . 
swift <nl> public class MyCls { <nl> / / CHECK - LABEL : sil non_abi [ transparent ] [ serialized ] [ ossa ] @ $ s19inlinable_attribute15HasInitializersV1xSivpfi : $ @ convention ( thin ) ( ) - > Int <nl> / / CHECK - LABEL : sil non_abi [ transparent ] [ serialized ] [ ossa ] @ $ s19inlinable_attribute15HasInitializersV1ySivpfi : $ @ convention ( thin ) ( ) - > Int <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct HasInitializers { <nl> public let x = 1234 <nl> internal let y = 4321 <nl> mmm a / test / Sema / Inputs / exhaustive_switch_testable_helper . swift <nl> ppp b / test / Sema / Inputs / exhaustive_switch_testable_helper . swift <nl> <nl> - @ _frozen public enum FrozenEnum { <nl> + @ frozen public enum FrozenEnum { <nl> case a , b , c <nl> } <nl> <nl> mmm a / test / Sema / exhaustive_switch . swift <nl> ppp b / test / Sema / exhaustive_switch . swift <nl> enum MyNever { } <nl> func ~ = ( _ : MyNever , _ : MyNever ) - > Bool { return true } <nl> func myFatalError ( ) - > MyNever { fatalError ( ) } <nl> <nl> - @ _frozen public enum UninhabitedT4 < A > { <nl> + @ frozen public enum UninhabitedT4 < A > { <nl> case x ( A ) <nl> } <nl> <nl> public enum NonExhaustivePayload { <nl> case a ( Int ) , b ( Bool ) <nl> } <nl> <nl> - @ _frozen public enum TemporalProxy { <nl> + @ frozen public enum TemporalProxy { <nl> case seconds ( Int ) <nl> case milliseconds ( Int ) <nl> case microseconds ( Int ) <nl> mmm a / test / Sema / implementation - only - import - library - evolution . swift <nl> ppp b / test / Sema / implementation - only - import - library - evolution . swift <nl> public class PublicClassStoredProperties { <nl> <nl> / / MARK : Frozen types <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct FrozenPublicStructStoredProperties { <nl> public var publiclyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> internal var internallyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> public struct FrozenPublicStructStoredProperties { <nl> @ usableFromInline internal var computedUFIIsNot : BadStruct ? { return nil } / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> @ usableFromInline internal struct FrozenUFIStructStoredProperties { <nl> @ usableFromInline var publiclyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> internal var internallyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> public struct FrozenPublicStructStoredProperties { <nl> @ usableFromInline internal var computedUFIIsNot : BadStruct ? { return nil } / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> } <nl> <nl> + @ _fixed_layout <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + public struct FixedLayoutPublicStructStoredProperties { <nl> + public var publiclyBad : BadStruct ? 
/ / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + internal var internallyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + private var privatelyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + private let letIsLikeVar : [ BadStruct ] = [ ] / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + <nl> + private var computedIsOkay : BadStruct ? { return nil } / / okay <nl> + private static var staticIsOkay : BadStruct ? / / okay <nl> + @ usableFromInline internal var computedUFIIsNot : BadStruct ? { return nil } / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + } <nl> + <nl> + @ _fixed_layout <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + @ usableFromInline internal struct FixedLayoutUFIStructStoredProperties { <nl> + @ usableFromInline var publiclyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + internal var internallyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + private var privatelyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + private let letIsLikeVar : [ BadStruct ] = [ ] / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + <nl> + private var computedIsOkay : BadStruct ? { return nil } / / okay <nl> + private static var staticIsOkay : BadStruct ? / / okay <nl> + @ usableFromInline internal var computedUFIIsNot : BadStruct ? { return nil } / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> + } <nl> + <nl> @ _fixed_layout <nl> public class FrozenPublicClassStoredProperties { <nl> public var publiclyBad : BadStruct ? / / expected - error { { cannot use struct ' BadStruct ' here ; ' BADLibrary ' has been imported as implementation - only } } <nl> mmm a / test / Serialization / Inputs / def_always_inline . swift <nl> ppp b / test / Serialization / Inputs / def_always_inline . swift <nl> <nl> return x <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct AlwaysInlineInitStruct { <nl> @ usableFromInline <nl> var x : Bool <nl> mmm a / test / Serialization / Inputs / def_enum . swift <nl> ppp b / test / Serialization / Inputs / def_enum . swift <nl> public enum Breakfast < Champions > : Int { <nl> case Coffee <nl> } <nl> <nl> - @ _frozen public enum Exhaustive { } <nl> + @ frozen public enum Exhaustive { } <nl> mmm a / test / Serialization / Inputs / def_noinline . swift <nl> ppp b / test / Serialization / Inputs / def_noinline . swift <nl> <nl> return x <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct NoInlineInitStruct { <nl> @ usableFromInline <nl> var x : Bool <nl> mmm a / test / Serialization / Inputs / def_transparent . swift <nl> ppp b / test / Serialization / Inputs / def_transparent . 
swift <nl> public func do_switch ( u u : MaybePair ) { <nl> e ( ) <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Wrapper { <nl> public var value : Int32 <nl> <nl> mmm a / test / Serialization / attr - invalid . swift <nl> ppp b / test / Serialization / attr - invalid . swift <nl> <nl> / / CHECK - RESILIENT : Frozen_DECL_ATTR <nl> / / CHECK - NON - RESILIENT - NOT : Frozen_DECL_ATTR <nl> <nl> - @ _frozen / / expected - warning { { @ _frozen has no effect without - enable - library - evolution } } <nl> + @ frozen / / expected - warning { { @ frozen has no effect without - enable - library - evolution } } <nl> public enum SomeEnum { <nl> case x <nl> } <nl> mmm a / test / Serialization / early - serialization . swift <nl> ppp b / test / Serialization / early - serialization . swift <nl> <nl> / / - it happens before the performance inlining and thus preserves @ _semantics functions <nl> / / - it happens after generic specialization <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Int { <nl> @ inlinable <nl> public init ( ) { } <nl> public func userOfSemanticsAnnotatedFunc ( _ a : Array < Int > ) - > Int { <nl> return a . _getCapacity ( ) <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Array < T > { <nl> @ inlinable <nl> public init ( ) { } <nl> mmm a / test / Serialization / enum . swift <nl> ppp b / test / Serialization / enum . swift <nl> <nl> <nl> / / CHECK - NOT : UnknownCode <nl> <nl> - / / CHECK - PRINT - DAG : @ _frozen enum Exhaustive { <nl> + / / CHECK - PRINT - DAG : @ frozen enum Exhaustive { <nl> <nl> import def_enum <nl> <nl> mmm a / test / SourceKit / Indexing / rdar_21602898 . swift . response <nl> ppp b / test / SourceKit / Indexing / rdar_21602898 . swift . response <nl> <nl> ] , <nl> key . attributes : [ <nl> { <nl> - key . attribute : source . decl . attribute . _fixed_layout <nl> + key . attribute : source . decl . attribute . frozen <nl> } <nl> ] <nl> } , <nl> mmm a / test / SourceKit / InterfaceGen / gen_stdlib . swift <nl> ppp b / test / SourceKit / InterfaceGen / gen_stdlib . swift <nl> var x : Int <nl> / / CHECK1 - NEXT : < Group > Math / Integers < / Group > <nl> / / CHECK1 - NEXT : / < interface - gen > { { $ } } <nl> / / CHECK1 - NEXT : SYSTEM <nl> - / / CHECK1 - NEXT : < Declaration > struct Int : < Type usr = " s : s17FixedWidthIntegerP " > FixedWidthInteger < / Type > { { . * } } < Type usr = " s : SZ " > SignedInteger < / Type > { { . * } } < / Declaration > <nl> + / / CHECK1 - NEXT : < Declaration > @ frozen struct Int : < Type usr = " s : s17FixedWidthIntegerP " > FixedWidthInteger < / Type > { { . * } } < Type usr = " s : SZ " > SignedInteger < / Type > { { . * } } < / Declaration > <nl> <nl> / / RUN : % sourcekitd - test - req = module - groups - module Swift | % FileCheck - check - prefix = GROUP1 % s <nl> / / GROUP1 : < GROUPS > <nl> mmm a / test / api - digester / Inputs / cake . swift <nl> ppp b / test / api - digester / Inputs / cake . swift <nl> <nl> public protocol P1 { } <nl> public protocol P2 { } <nl> public protocol P3 : P2 , P1 { } <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct S1 : P1 { <nl> public static func foo1 ( ) { } <nl> mutating public func foo2 ( ) { } <nl> public extension Int { <nl> public func foo ( ) { } <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct fixedLayoutStruct { <nl> public var a = 1 <nl> private var b = 2 { didSet { } willSet ( value ) { } } <nl> mmm a / test / api - digester / Inputs / cake1 . 
swift <nl> ppp b / test / api - digester / Inputs / cake1 . swift <nl> public class C5 { <nl> <nl> public struct C6 { } <nl> <nl> - @ _frozen <nl> + @ frozen <nl> public enum IceKind { } <nl> <nl> public protocol P1 { } <nl> public extension P1 where Self : P2 { <nl> func P1Constraint ( ) { } <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct fixedLayoutStruct { <nl> public var b = 2 <nl> public func foo ( ) { } <nl> public struct fixedLayoutStruct { <nl> } <nl> <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> struct fixedLayoutStruct2 { <nl> public private ( set ) var NoLongerWithFixedBinaryOrder = 1 <nl> public var BecomeFixedBinaryOrder : Int { return 1 } <nl> } <nl> <nl> - @ _frozen <nl> + @ frozen <nl> public enum FrozenKind { <nl> case Unchanged <nl> case Fixed <nl> mmm a / test / api - digester / Inputs / cake2 . swift <nl> ppp b / test / api - digester / Inputs / cake2 . swift <nl> public class C5 { <nl> public dynamic func dy_foo ( ) { } <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct C6 { } <nl> <nl> public enum IceKind { } <nl> public extension P1 { <nl> func P1Constraint ( ) { } <nl> } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct fixedLayoutStruct { <nl> public var a = 1 <nl> public func OKChange ( ) { } <nl> public struct fixedLayoutStruct { <nl> } <nl> <nl> @ usableFromInline <nl> - @ _fixed_layout <nl> + @ frozen <nl> struct fixedLayoutStruct2 { <nl> public var NoLongerWithFixedBinaryOrder : Int { return 1 } <nl> public var BecomeFixedBinaryOrder = 1 <nl> } <nl> <nl> - @ _frozen <nl> + @ frozen <nl> public enum FrozenKind { <nl> case Unchanged <nl> case Rigid <nl> mmm a / test / api - digester / Inputs / stdlib - stable - abi . json <nl> ppp b / test / api - digester / Inputs / stdlib - stable - abi . 
json <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s16_DependenceTokenV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s13_UnsafeBitsetV8IteratorV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Sb " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " genericSig " : " < τ_0_0 where τ_0_0 : AnyObject > " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " usr " : " s : SJ " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s18_CocoaArrayWrapperV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Collection > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Comparable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig 
" : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] <nl> } , <nl> <nl> " usr " : " s : s13OpaquePointerV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s14CVaListPointerV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] <nl> } , <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s17__CocoaDictionaryV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " 
genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence , τ_0_0 . Element : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Collection , τ_0_0 . Element : Collection > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence , τ_0_0 . Element : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s11AnyHashableV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s6HasherV11_TailBufferV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] <nl> } , <nl> <nl> " usr " : " s : s6HasherV5_CoreV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] <nl> } , <nl> <nl> " usr " : " s : s6HasherV6_StateV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] <nl> } <nl> <nl> " usr " : " s : s6HasherV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s10_HashTableV5IndexV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " usr " : " s : s10_HashTableV8IteratorV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " usr " : " s : s10_HashTableV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Collection > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where 
τ_0_0 : Sequence , τ_0_0 . Element : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence , τ_0_0 . Element : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " usr " : " s : SO " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s26_OptionalNilComparisonTypeV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Collection > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s27SystemRandomNumberGeneratorV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " 
FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Comparable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Comparable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Comparable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Strideable , τ_0_0 . Stride : SignedInteger > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Comparable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : BidirectionalCollection > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : BidirectionalCollection > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : BidirectionalCollection > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : IteratorProtocol > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " 
Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] <nl> } , <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s10__CocoaSetV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Hashable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Collection > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s12StaticStringV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Strideable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Strideable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Strideable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : Strideable > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SS5IndexV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SS8IteratorV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SS17UnicodeScalarViewV8IteratorV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SS17UnicodeScalarViewV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SS9UTF16ViewV8IteratorV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " 
Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SS9UTF16ViewV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SS8UTF8ViewV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SS " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s11_StringGutsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " usr " : " s : s26DefaultStringInterpolationV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s24_OpaqueStringSwitchCacheV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " usr " : " s : Ss8UTF8ViewV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Ss9UTF16ViewV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Ss17UnicodeScalarViewV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Ss " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : FixedWidthInteger , τ_0_0 : UnsignedInteger > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : FixedWidthInteger , τ_0_0 : UnsignedInteger > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : FixedWidthInteger , τ_0_0 : UnsignedInteger > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : AnyObject > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> 
<nl> " usr " : " s : SV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Sv " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s7UnicodeO5ASCIIO6ParserV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s7UnicodeO6ScalarV9UTF16ViewV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s7UnicodeO6ScalarV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s7UnicodeO5UTF16O13ForwardParserV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s7UnicodeO5UTF16O13ReverseParserV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s7UnicodeO4UTF8O13ForwardParserV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s7UnicodeO4UTF8O13ReverseParserV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s7UnicodeO5UTF32O6ParserV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s16_ValidUTF8BufferV8IteratorV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s16_ValidUTF8BufferV5IndexV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s16_ValidUTF8BufferV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " genericSig " : " < τ_0_0 > " , <nl> " isABIPlaceholder " : true , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : SIMD , τ_0_0 . Scalar : FixedWidthInteger , τ_0_0 . 
Scalar : SignedInteger > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Sequence , τ_0_1 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 , τ_0_1 where τ_0_0 : Sequence , τ_0_1 : Sequence > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Sf " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Sd " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " UsableFromInline " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " usr 
" : " s : s7Float80V " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s5UInt8V5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s5UInt8V " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s4Int8V5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s4Int8V " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s6UInt16V5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " 
declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s6UInt16V " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s5Int16V5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s5Int16V " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s6UInt32V5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s6UInt32V " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " 
FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s5Int32V5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s5Int32V " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s6UInt64V5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s6UInt64V " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s5Int64V5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> 
" conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s5Int64V " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Su5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Su " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Si5WordsV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> " Alignment " , <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Si " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " 
genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : Sw " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SW8IteratorV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : SW " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : IteratorProtocol > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " , <nl> + " Frozen " , <nl> " UsableFromInline " <nl> ] , <nl> " conformances " : [ <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : s8AnyIndexV " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : SIMDScalar > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : SIMDScalar > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : SIMDScalar > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : SIMDScalar > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : 
" < τ_0_0 where τ_0_0 : SIMDScalar > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : SIMDScalar > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " moduleName " : " Swift " , <nl> " genericSig " : " < τ_0_0 where τ_0_0 : SIMDScalar > " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> mmm a / test / api - digester / Outputs / Cake - abi . txt <nl> ppp b / test / api - digester / Outputs / Cake - abi . txt <nl> cake1 : Func ownershipChange ( _ : _ : ) has parameter 1 changing from Shared to Owned <nl> <nl> / * Decl Attribute changes * / <nl> cake1 : Class C5 is now without @ objc <nl> - cake1 : Enum IceKind is now without @ _frozen <nl> + cake1 : Enum IceKind is now without @ frozen <nl> cake1 : Func C1 . foo1 ( ) is now not static <nl> cake1 : Func C5 . dy_foo ( ) is now with dynamic <nl> cake1 : Func FinalFuncContainer . NewFinalFunc ( ) is now with final <nl> cake1 : Func HasMutatingMethodClone . foo ( ) has self access kind changing from Muta <nl> cake1 : Func S1 . foo1 ( ) has self access kind changing from NonMutating to Mutating <nl> cake1 : Func S1 . foo3 ( ) is now static <nl> cake1 : Func _NoResilientClass . NoLongerFinalFunc ( ) is now without final <nl> - cake1 : Struct C6 is now with @ _fixed_layout <nl> + cake1 : Struct C6 is now with @ frozen <nl> cake1 : Var C1 . CIIns1 changes from weak to strong <nl> cake1 : Var C1 . CIIns2 changes from strong to weak <nl> cake1 : Var GlobalLetChangedToVar changes from let to var <nl> mmm a / test / api - digester / Outputs / cake - abi . json <nl> ppp b / test / api - digester / Outputs / cake - abi . json <nl> <nl> " usr " : " s : 4cake2S1V " , <nl> " moduleName " : " cake " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : 4cake17fixedLayoutStructV " , <nl> " moduleName " : " cake " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " usr " : " s : Si " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " isExternal " : true , <nl> " conformances " : [ <nl> mmm a / test / api - digester / Outputs / cake . json <nl> ppp b / test / api - digester / Outputs / cake . json <nl> <nl> " usr " : " s : 4cake2S1V " , <nl> " moduleName " : " cake " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " conformances " : [ <nl> { <nl> <nl> " usr " : " s : 4cake17fixedLayoutStructV " , <nl> " moduleName " : " cake " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] <nl> } , <nl> { <nl> <nl> " usr " : " s : Si " , <nl> " moduleName " : " Swift " , <nl> " declAttributes " : [ <nl> - " FixedLayout " <nl> + " Frozen " <nl> ] , <nl> " isExternal " : true , <nl> " conformances " : [ <nl> mmm a / test / attr / attr_fixed_layout . swift <nl> ppp b / test / attr / attr_fixed_layout . swift <nl> <nl> / / RUN : not % target - swift - frontend - typecheck - swift - version 4 . 
2 - dump - ast % s - enable - testing | % FileCheck - - check - prefix = RESILIENCE - OFF % s <nl> <nl> / / <nl> - / / Public types with @ _fixed_layout are always fixed layout <nl> + / / Public types with @ frozen are always fixed layout <nl> / / <nl> <nl> / / RESILIENCE - ON : struct_decl { { . * } } " Point " interface type = ' Point . Type ' access = public non - resilient <nl> / / RESILIENCE - OFF : struct_decl { { . * } } " Point " interface type = ' Point . Type ' access = public non - resilient <nl> - @ _fixed_layout public struct Point { <nl> + @ frozen public struct Point { <nl> + let x , y : Int <nl> + } <nl> + <nl> + / / RESILIENCE - ON : struct_decl { { . * } } " FixedPoint " interface type = ' FixedPoint . Type ' access = public non - resilient <nl> + / / RESILIENCE - OFF : struct_decl { { . * } } " FixedPoint " interface type = ' FixedPoint . Type ' access = public non - resilient <nl> + @ _fixed_layout public struct FixedPoint { <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> let x , y : Int <nl> } <nl> <nl> / / RESILIENCE - ON : enum_decl { { . * } } " ChooseYourOwnAdventure " interface type = ' ChooseYourOwnAdventure . Type ' access = public non - resilient <nl> / / RESILIENCE - OFF : enum_decl { { . * } } " ChooseYourOwnAdventure " interface type = ' ChooseYourOwnAdventure . Type ' access = public non - resilient <nl> - @ _frozen public enum ChooseYourOwnAdventure { <nl> + @ frozen public enum ChooseYourOwnAdventure { <nl> case JumpIntoRabbitHole <nl> case EatMushroom <nl> } <nl> public struct Size { <nl> <nl> / / RESILIENCE - ON : struct_decl { { . * } } " UsableFromInlineStruct " interface type = ' UsableFromInlineStruct . Type ' access = internal non - resilient <nl> / / RESILIENCE - OFF : struct_decl { { . * } } " UsableFromInlineStruct " interface type = ' UsableFromInlineStruct . Type ' access = internal non - resilient <nl> - @ _fixed_layout @ usableFromInline struct UsableFromInlineStruct { } <nl> + @ frozen @ usableFromInline struct UsableFromInlineStruct { } <nl> + <nl> + / / RESILIENCE - ON : struct_decl { { . * } } " UsableFromInlineFixedStruct " interface type = ' UsableFromInlineFixedStruct . Type ' access = internal non - resilient <nl> + / / RESILIENCE - OFF : struct_decl { { . * } } " UsableFromInlineFixedStruct " interface type = ' UsableFromInlineFixedStruct . Type ' access = internal non - resilient <nl> + @ _fixed_layout @ usableFromInline struct UsableFromInlineFixedStruct { } <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> <nl> / / RESILIENCE - ON : enum_decl { { . * } } " TaxCredit " interface type = ' TaxCredit . Type ' access = public resilient <nl> / / RESILIENCE - OFF : enum_decl { { . * } } " TaxCredit " interface type = ' TaxCredit . 
Type ' access = public non - resilient <nl> struct Rectangle { <nl> / / Diagnostics <nl> / / <nl> <nl> - @ _fixed_layout struct InternalStruct { / / expected - note * { { declared here } } <nl> - / / expected - error @ - 1 { { ' @ _fixed_layout ' attribute can only be applied to ' @ usableFromInline ' or public declarations , but ' InternalStruct ' is internal } } <nl> + @ frozen struct InternalStruct { / / expected - note * { { declared here } } <nl> + / / expected - error @ - 1 { { ' @ frozen ' attribute can only be applied to ' @ usableFromInline ' or public declarations , but ' InternalStruct ' is internal } } <nl> + <nl> + @ frozen public struct NestedStruct { } <nl> + } <nl> + <nl> + @ _fixed_layout struct FixedInternalStruct { / / expected - note * { { declared here } } <nl> + / / expected - error @ - 1 { { ' @ _fixed_layout ' attribute can only be applied to ' @ usableFromInline ' or public declarations , but ' FixedInternalStruct ' is internal } } <nl> + / / expected - warning @ - 2 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> <nl> @ _fixed_layout public struct NestedStruct { } <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + } <nl> + <nl> + @ frozen fileprivate struct FileprivateStruct { } <nl> + / / expected - error @ - 1 { { ' @ frozen ' attribute can only be applied to ' @ usableFromInline ' or public declarations , but ' FileprivateStruct ' is fileprivate } } <nl> + <nl> + @ _fixed_layout fileprivate struct FixedFileprivateStruct { } <nl> + / / expected - error @ - 1 { { ' @ _fixed_layout ' attribute can only be applied to ' @ usableFromInline ' or public declarations , but ' FixedFileprivateStruct ' is fileprivate } } <nl> + / / expected - warning @ - 2 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + <nl> + @ frozen private struct PrivateStruct { } / / expected - note * { { declared here } } <nl> + / / expected - error @ - 1 { { ' @ frozen ' attribute can only be applied to ' @ usableFromInline ' or public declarations , but ' PrivateStruct ' is private } } <nl> + <nl> + @ _fixed_layout private struct FixedPrivateStruct { } / / expected - note * { { declared here } } <nl> + / / expected - error @ - 1 { { ' @ _fixed_layout ' attribute can only be applied to ' @ usableFromInline ' or public declarations , but ' FixedPrivateStruct ' is private } } <nl> + / / expected - warning @ - 2 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + <nl> + <nl> + @ frozen public struct BadFields1 { <nl> + private var field : PrivateStruct / / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> + } <nl> + <nl> + @ _fixed_layout public struct FixedBadFields1 { <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + private var field : PrivateStruct / / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> } <nl> <nl> - @ _fixed_layout fileprivate struct FileprivateStruct { } <nl> - / / expected - error @ - 1 { { ' @ _fixed_layout ' attribute can only be applied to ' @ usableFromInline ' or public declarations , but ' FileprivateStruct ' is fileprivate } } <nl> + @ frozen public struct BadFields2 { <nl> + private var field : PrivateStruct ? 
/ / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> + } <nl> <nl> - @ _fixed_layout private struct PrivateStruct { } / / expected - note * { { declared here } } <nl> - / / expected - error @ - 1 { { ' @ _fixed_layout ' attribute can only be applied to ' @ usableFromInline ' or public declarations , but ' PrivateStruct ' is private } } <nl> + @ _fixed_layout public struct FixedBadFields2 { <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + private var field : PrivateStruct ? / / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> + } <nl> <nl> + @ frozen public struct BadFields3 { <nl> + internal var field : InternalStruct ? / / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> + } <nl> <nl> - @ _fixed_layout public struct BadFields1 { <nl> - private var field : PrivateStruct / / expected - error { { type referenced from a stored property in a ' @ _fixed_layout ' struct must be ' @ usableFromInline ' or public } } <nl> + @ _fixed_layout public struct FixedBadFields3 { <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + internal var field : InternalStruct ? / / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> } <nl> <nl> - @ _fixed_layout public struct BadFields2 { <nl> - private var field : PrivateStruct ? / / expected - error { { type referenced from a stored property in a ' @ _fixed_layout ' struct must be ' @ usableFromInline ' or public } } <nl> + @ frozen @ usableFromInline struct BadFields4 { <nl> + internal var field : InternalStruct ? / / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> } <nl> <nl> - @ _fixed_layout public struct BadFields3 { <nl> - internal var field : InternalStruct ? / / expected - error { { type referenced from a stored property in a ' @ _fixed_layout ' struct must be ' @ usableFromInline ' or public } } <nl> + @ _fixed_layout @ usableFromInline struct FixedBadFields4 { <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + internal var field : InternalStruct ? / / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> } <nl> <nl> - @ _fixed_layout @ usableFromInline struct BadFields4 { <nl> - internal var field : InternalStruct ? / / expected - error { { type referenced from a stored property in a ' @ _fixed_layout ' struct must be ' @ usableFromInline ' or public } } <nl> + @ frozen public struct BadFields5 { <nl> + private var field : PrivateStruct ? { / / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> + didSet { } <nl> + <nl> + <nl> + } <nl> } <nl> <nl> - @ _fixed_layout public struct BadFields5 { <nl> - private var field : PrivateStruct ? 
{ / / expected - error { { type referenced from a stored property in a ' @ _fixed_layout ' struct must be ' @ usableFromInline ' or public } } <nl> + @ _fixed_layout public struct FixedBadFields5 { <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + private var field : PrivateStruct ? { / / expected - error { { type referenced from a stored property in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> didSet { } <nl> + <nl> + <nl> } <nl> } <nl> <nl> / / expected - warning @ + 1 { { the result of a ' @ usableFromInline ' function should be ' @ usableFromInline ' or public } } <nl> @ usableFromInline func notReallyUsableFromInline ( ) - > InternalStruct ? { return nil } <nl> - @ _fixed_layout public struct BadFields6 { <nl> - private var field = notReallyUsableFromInline ( ) / / expected - error { { type referenced from a stored property with inferred type ' InternalStruct ? ' in a ' @ _fixed_layout ' struct must be ' @ usableFromInline ' or public } } <nl> + @ frozen public struct BadFields6 { <nl> + private var field = notReallyUsableFromInline ( ) / / expected - error { { type referenced from a stored property with inferred type ' InternalStruct ? ' in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> + } <nl> + <nl> + / / expected - warning @ + 1 { { the result of a ' @ usableFromInline ' function should be ' @ usableFromInline ' or public } } <nl> + @ usableFromInline func notReallyUsableFromInlineFixed ( ) - > FixedInternalStruct ? { return nil } <nl> + @ _fixed_layout public struct FrozenBadFields6 { <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> + private var field = notReallyUsableFromInlineFixed ( ) / / expected - error { { type referenced from a stored property with inferred type ' FixedInternalStruct ? ' in a ' @ frozen ' struct must be ' @ usableFromInline ' or public } } <nl> + } <nl> + <nl> + @ frozen public struct OKFields { <nl> + private var publicTy : Size <nl> + internal var ufiTy : UsableFromInlineStruct ? <nl> + <nl> + internal static var staticProp : InternalStruct ? <nl> + <nl> + private var computed : PrivateStruct ? { return nil } <nl> } <nl> <nl> - @ _fixed_layout public struct OKFields { <nl> + @ _fixed_layout public struct FixedOKFields { <nl> + / / expected - warning @ - 1 { { ' @ frozen ' attribute is now used for fixed - layout structs } } <nl> private var publicTy : Size <nl> internal var ufiTy : UsableFromInlineStruct ? <nl> <nl> mmm a / test / attr / attr_inlinable . swift <nl> ppp b / test / attr / attr_inlinable . swift <nl> public struct PublicResilientStructWithInit { <nl> private func privateIntReturningFunc ( ) - > Int { return 0 } <nl> internal func internalIntReturningFunc ( ) - > Int { return 0 } <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct PublicFixedStructWithInit { <nl> - var x = internalGlobal / / expected - error { { let ' internalGlobal ' is internal and cannot be referenced from a property initializer in a ' @ _fixed_layout ' type } } <nl> + var x = internalGlobal / / expected - error { { let ' internalGlobal ' is internal and cannot be referenced from a property initializer in a ' @ frozen ' type } } <nl> var y = publicGlobal / / OK <nl> static var z = privateIntReturningFunc ( ) / / OK <nl> static var a = internalIntReturningFunc ( ) / / OK <nl> mmm a / test / decl / enum / frozen - nonresilient . 
swift <nl> ppp b / test / decl / enum / frozen - nonresilient . swift <nl> <nl> / / RUN : % target - typecheck - verify - swift <nl> <nl> - @ _frozen public enum Exhaustive { } / / expected - warning { { @ _frozen has no effect without - enable - library - evolution } } { { 1 - 10 = } } <nl> + @ frozen public enum Exhaustive { } / / expected - warning { { @ frozen has no effect without - enable - library - evolution } } { { 1 - 9 = } } <nl> <nl> - @ _frozen enum NotPublic { } / / expected - warning { { @ _frozen has no effect without - enable - library - evolution } } { { 1 - 10 = } } <nl> + @ frozen enum NotPublic { } / / expected - warning { { @ frozen has no effect without - enable - library - evolution } } { { 1 - 9 = } } <nl> mmm a / test / decl / enum / frozen . swift <nl> ppp b / test / decl / enum / frozen . swift <nl> <nl> / / RUN : % target - typecheck - verify - swift - enable - library - evolution <nl> <nl> - @ _frozen public enum Exhaustive { } / / no - warning <nl> + @ frozen public enum Exhaustive { } / / no - warning <nl> <nl> - @ _frozen enum NotPublic { } / / expected - warning { { @ _frozen has no effect on non - public enums } } { { 1 - 10 = } } <nl> + @ frozen enum NotPublic { } / / expected - warning { { @ frozen has no effect on non - public enums } } { { 1 - 9 = } } <nl> <nl> internal enum Outer { <nl> - @ _frozen public enum ButThisIsOK { } / / no - warning <nl> + @ frozen public enum ButThisIsOK { } / / no - warning <nl> } <nl> <nl> - @ _frozen @ usableFromInline enum NotPublicButVersioned { } / / no - warning <nl> + @ frozen @ usableFromInline enum NotPublicButVersioned { } / / no - warning <nl> + <nl> + @ frozen enum DeprecationWarning { } / / expected - warning { { @ frozen has no effect on non - public enums } } { { 1 - 9 = } } <nl> + <nl> + @ _frozen public enum UnderscoredFrozen { } <nl> mmm a / test / decl / init / resilience - cross - module . swift <nl> ppp b / test / decl / init / resilience - cross - module . swift <nl> <nl> import resilient_struct <nl> import resilient_protocol <nl> <nl> - / / Size is not @ _fixed_layout , so we cannot define a new designated initializer <nl> + / / Size is not @ frozen , so we cannot define a new designated initializer <nl> extension Size { <nl> init ( ww : Int , hh : Int ) { <nl> self . w = ww <nl> mmm a / test / decl / init / resilience . swift <nl> ppp b / test / decl / init / resilience . swift <nl> <nl> / / RUN : % target - swift - frontend - typecheck - swift - version 4 % s <nl> / / RUN : % target - swift - frontend - typecheck - swift - version 5 % s <nl> <nl> - / / Animal is not @ _fixed_layout , so we cannot define an @ inlinable <nl> + / / Animal is not @ frozen , so we cannot define an @ inlinable <nl> / / designated initializer <nl> public struct Animal { <nl> public let name : String / / expected - note 2 { { declared here } } <nl> mmm a / test / sil - opt / sil - opt . swift <nl> ppp b / test / sil - opt / sil - opt . swift <nl> <nl> @ _silgen_name ( " unknown " ) <nl> public func unknown ( ) - > ( ) <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct X { <nl> @ inlinable <nl> public func test ( ) { <nl> mmm a / validation - test / Evolution / Inputs / backward_deploy_struct . swift <nl> ppp b / validation - test / Evolution / Inputs / backward_deploy_struct . 
swift <nl> public func getVersion ( ) - > Int { <nl> } <nl> } <nl> <nl> - @ _weakLinked @ _fixed_layout public struct FixedLayoutStruct { <nl> + @ _weakLinked @ frozen public struct FixedLayoutStruct { <nl> public init ( ) { } <nl> <nl> public func fn ( _ x : Int ) { } <nl> mmm a / validation - test / Evolution / Inputs / enum_change_size . swift <nl> ppp b / validation - test / Evolution / Inputs / enum_change_size . swift <nl> public struct ChangeSize { <nl> # endif <nl> } <nl> <nl> - @ _frozen public enum SingletonEnum { <nl> + @ frozen public enum SingletonEnum { <nl> case X ( ChangeSize ) <nl> } <nl> <nl> public func getSingletonEnumValues ( _ c : ChangeSize ) <nl> return [ . X ( c ) , nil ] <nl> } <nl> <nl> - @ _frozen public enum SinglePayloadEnum { <nl> + @ frozen public enum SinglePayloadEnum { <nl> case X ( ChangeSize ) <nl> case Y <nl> case Z <nl> public func getSinglePayloadEnumValues ( _ c : ChangeSize ) <nl> return [ . X ( c ) , . Y , . Z , nil ] <nl> } <nl> <nl> - @ _frozen public enum MultiPayloadEnum { <nl> + @ frozen public enum MultiPayloadEnum { <nl> case X ( ChangeSize ) <nl> case Y ( ChangeSize ) <nl> case Z <nl> mmm a / validation - test / Evolution / Inputs / struct_add_initializer . swift <nl> ppp b / validation - test / Evolution / Inputs / struct_add_initializer . swift <nl> public func getVersion ( ) - > Int { <nl> <nl> # if BEFORE <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct AddInitializer { <nl> public var x : Int <nl> <nl> public struct AddInitializer { <nl> <nl> # else <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct AddInitializer { <nl> public var x : Int = 0 <nl> <nl> mmm a / validation - test / Evolution / Inputs / struct_change_size . swift <nl> ppp b / validation - test / Evolution / Inputs / struct_change_size . swift <nl> public struct ChangeSize { <nl> private var _version : T <nl> } <nl> <nl> - @ _fixed_layout public struct ChangeFieldOffsetsOfFixedLayout { <nl> + @ frozen public struct ChangeFieldOffsetsOfFixedLayout { <nl> public init ( major : Int32 , minor : Int32 , patch : Int32 ) { <nl> self . major = ChangeSize ( version : major ) <nl> self . minor = ChangeSize ( version : minor ) <nl> mmm a / validation - test / Evolution / Inputs / struct_change_stored_to_computed_static . swift <nl> ppp b / validation - test / Evolution / Inputs / struct_change_stored_to_computed_static . swift <nl> <nl> <nl> # if BEFORE <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ChangeStoredToComputed { <nl> public static var celsius : Int = 0 <nl> <nl> public struct ChangeStoredToComputed { <nl> <nl> # else <nl> <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ChangeStoredToComputed { <nl> public static var celsius : Int { <nl> get { <nl> mmm a / validation - test / Evolution / Inputs / struct_fixed_layout_add_conformance . swift <nl> ppp b / validation - test / Evolution / Inputs / struct_fixed_layout_add_conformance . swift <nl> public func getVersion ( ) - > Int { <nl> # endif <nl> } <nl> <nl> - @ _fixed_layout public struct AddConformance { <nl> + @ frozen public struct AddConformance { <nl> public init ( ) { <nl> x = 0 <nl> y = 0 <nl> mmm a / validation - test / Evolution / Inputs / struct_fixed_layout_remove_conformance . swift <nl> ppp b / validation - test / Evolution / Inputs / struct_fixed_layout_remove_conformance . 
swift <nl> <nl> <nl> - @ _fixed_layout public struct RemoveConformance { <nl> + @ frozen public struct RemoveConformance { <nl> public init ( ) { <nl> x = 0 <nl> y = 0 <nl> mmm a / validation - test / compiler_crashers_2_fixed / 0109 - sr4737 . swift <nl> ppp b / validation - test / compiler_crashers_2_fixed / 0109 - sr4737 . swift <nl> extension UnicodeScalar { <nl> } <nl> } <nl> / / = = = mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - = = = / / <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct _UIntBuffer < <nl> Storage : UnsignedInteger & FixedWidthInteger , <nl> Element : UnsignedInteger & FixedWidthInteger <nl> public struct _UIntBuffer < <nl> } <nl> <nl> extension _UIntBuffer : Sequence { <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct Iterator : IteratorProtocol , Sequence { <nl> @ inline ( __always ) <nl> public init ( _ x : _UIntBuffer ) { _impl = x } <nl> extension Unicode . DefaultScalarView : Collection { <nl> / / This should go in the standard library ; see <nl> / / https : / / github . com / apple / swift / pull / 9074 and <nl> / / https : / / bugs . swift . org / browse / SR - 4721 <nl> - @ _fixed_layout <nl> + @ frozen <nl> public struct ReverseIndexingIterator < <nl> Elements : BidirectionalCollection <nl> > : IteratorProtocol , Sequence { <nl> mmm a / validation - test / compiler_crashers_fixed / 28660 - false - encountered - error - in - diagnostic - text . swift <nl> ppp b / validation - test / compiler_crashers_fixed / 28660 - false - encountered - error - in - diagnostic - text . swift <nl> <nl> / / See https : / / swift . org / CONTRIBUTORS . txt for the list of Swift project authors <nl> <nl> / / RUN : not % target - swift - frontend % s - emit - ir <nl> - class C { @ _fixed_layout public struct P <nl> + class C { @ frozen public struct P <nl>
De-underscore @frozen, apply it to structs ()
apple/swift
e9d4687e31a3ae8e90604d3b15bf8b241479c211
2019-05-31T00:55:37Z
mmm a / hphp / util / thread - local . h <nl> ppp b / hphp / util / thread - local . h <nl> inline uintptr_t tlsBase ( ) { <nl> / / assembler warnings of unknown importance about incorrect section <nl> / / types <nl> / / <nl> + / / __thread on cygwin and mingw uses pthreads emulation not native tls so <nl> + / / the emulation for thread local must be used as well <nl> + / / <nl> / / So we use __thread on gcc , icc and clang , unless we are on OSX . On OSX , we <nl> / / use our own emulation . Use the DECLARE_THREAD_LOCAL ( ) and <nl> / / IMPLEMENT_THREAD_LOCAL ( ) macros to access either __thread or the emulation <nl> / / as appropriate . <nl> <nl> - # if ! defined ( NO_TLS ) & & ! defined ( __APPLE__ ) & & \ <nl> - ( ( __llvm__ & & __clang__ ) | | \ <nl> - __GNUC__ > 4 | | ( __GNUC__ = = 4 & & __GNUC_MINOR__ > 3 ) | | \ <nl> + # if ! defined ( NO_TLS ) & & ! defined ( __APPLE__ ) & & \ <nl> + ! defined ( __CYGWIN__ ) & & ! defined ( __MINGW__ ) & & \ <nl> + ( ( __llvm__ & & __clang__ ) | | \ <nl> + __GNUC__ > 4 | | ( __GNUC__ = = 4 & & __GNUC_MINOR__ > 3 ) | | \ <nl> __INTEL_COMPILER ) <nl> # define USE_GCC_FAST_TLS <nl> # endif <nl>
Don't use fast tls on mingw and cygwin
facebook/hhvm
b65f3ae3aa3c2bdc8fe53f8689920c3541eb2f7d
2014-08-15T17:00:24Z
mmm a / scene / main / viewport . cpp <nl> ppp b / scene / main / viewport . cpp <nl> void Viewport : : _gui_call_input ( Control * p_control , const Ref < InputEvent > & p_inpu <nl> Control * control = Object : : cast_to < Control > ( ci ) ; <nl> if ( control ) { <nl> <nl> - control - > emit_signal ( SceneStringNames : : get_singleton ( ) - > gui_input , ev ) ; / / signal should be first , so it ' s possible to override an event ( and then accept it ) <nl> + if ( control - > data . mouse_filter ! = Control : : MOUSE_FILTER_IGNORE ) { <nl> + control - > emit_signal ( SceneStringNames : : get_singleton ( ) - > gui_input , ev ) ; / / signal should be first , so it ' s possible to override an event ( and then accept it ) <nl> + } <nl> if ( gui . key_event_accepted ) <nl> break ; <nl> if ( ! control - > is_inside_tree ( ) ) <nl> break ; <nl> - control - > call_multilevel ( SceneStringNames : : get_singleton ( ) - > _gui_input , ev ) ; <nl> + <nl> + if ( control - > data . mouse_filter ! = Control : : MOUSE_FILTER_IGNORE ) { <nl> + control - > call_multilevel ( SceneStringNames : : get_singleton ( ) - > _gui_input , ev ) ; <nl> + } <nl> <nl> if ( ! control - > is_inside_tree ( ) | | control - > is_set_as_toplevel ( ) ) <nl> break ; <nl>
Do not allow controls in ignore mouse to get focus via their children, fixes
godotengine/godot
3331ececc4c03cc2d112dcf266dc01781540551c
2018-11-16T16:47:21Z
mmm a / html5 / render / vue / components / slider / slideMixin . js <nl> ppp b / html5 / render / vue / components / slider / slideMixin . js <nl> export default { <nl> const nextElm = this . _cells [ nextIndex ] . elm <nl> const currentElm = this . _cells [ this . currentIndex ] . elm <nl> <nl> + / / put current slide on the top . <nl> + currentElm . style . zIndex = 1 <nl> + <nl> / / clone prevCell if there are only tow slides . <nl> if ( this . _cells . length = = = 2 ) { <nl> this . _clonePrev & & removeClone ( this . _clonePrev , lastPrev ) <nl>
* [html5] fix slide's sequence error.
apache/incubator-weex
30dda1a731986b6bb43df27488e1f876ac2ab6d4
2017-03-23T07:31:24Z
mmm a / Tools / make_binary_drop_linux <nl> ppp b / Tools / make_binary_drop_linux <nl> CopyFilesFromList $ includePath20 includeFiles20 [ @ ] $ baseIncludePath <nl> # Copy Examples <nl> echo " Copying Examples . . . " > & 3 <nl> cp - r Examples $ baseDropPath <nl> - # Remove CPPEvalV2Client examples until V2 binaries are included in the binary drop <nl> - rm - rf $ baseDropPath / Examples / Evaluation / CPPEvalV2Client <nl> + # Include CPPEvalV2Client examples in 2 . 0 Beta drop <nl> + # rm - rf $ baseDropPath / Examples / Evaluation / CPPEvalV2Client <nl> <nl> # Copy Scripts ( Scripts folder from the root of the Repo ) <nl> echo " Copying Scripts . . . " > & 3 <nl>
Add CPPEvalV2Client examples to Linux drop script
microsoft/CNTK
8c1fdfe81de19d3bdffaa47c900eb411f9bc3e8d
2016-10-20T14:56:08Z
mmm a / arangod / Aql / AqlItemBlock . h <nl> ppp b / arangod / Aql / AqlItemBlock . h <nl> namespace triagens { <nl> _data [ index * _nrRegs + varNr ] . erase ( ) ; <nl> } <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + / / / @ brief eraseValue , erase the current value of a register not freeing it <nl> + / / / this is used if the value is stolen and later released from elsewhere <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> + void eraseAll ( ) { <nl> + size_t const n = _data . size ( ) ; <nl> + for ( size_t i = 0 ; i < n ; + + i ) { <nl> + _data [ i ] . erase ( ) ; <nl> + } <nl> + } <nl> + <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> / / / @ brief getDocumentCollection <nl> / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> mmm a / arangod / Aql / ExecutionBlock . h <nl> ppp b / arangod / Aql / ExecutionBlock . h <nl> namespace triagens { <nl> <nl> size_t firstRow ; <nl> size_t lastRow ; <nl> + bool rowsAreValid ; <nl> + <nl> + AggregatorGroup ( ) <nl> + : firstRow ( 0 ) , <nl> + lastRow ( 0 ) , <nl> + rowsAreValid ( false ) { <nl> + } <nl> <nl> ~ AggregatorGroup ( ) { <nl> reset ( ) ; <nl> namespace triagens { <nl> void initialize ( size_t capacity ) { <nl> groupValues . reserve ( capacity ) ; <nl> collections . reserve ( capacity ) ; <nl> - <nl> + <nl> for ( size_t i = 0 ; i < capacity ; + + i ) { <nl> groupValues [ i ] = AqlValue ( ) ; <nl> collections [ i ] = nullptr ; <nl> namespace triagens { <nl> delete ( * it ) ; <nl> } <nl> groupBlocks . clear ( ) ; <nl> + groupValues [ 0 ] . erase ( ) ; <nl> + } <nl> + <nl> + void setFirstRow ( size_t value ) { <nl> + firstRow = value ; <nl> + rowsAreValid = true ; <nl> + } <nl> + <nl> + void setLastRow ( size_t value ) { <nl> + lastRow = value ; <nl> + rowsAreValid = true ; <nl> } <nl> <nl> + void addValues ( AqlItemBlock const * src , <nl> + RegisterId groupRegister ) { <nl> + if ( groupRegister = = 0 ) { <nl> + / / nothing to do <nl> + return ; <nl> + } <nl> + <nl> + if ( rowsAreValid ) { <nl> + / / emit group details <nl> + TRI_ASSERT ( firstRow < = lastRow ) ; <nl> + <nl> + auto block = src - > slice ( firstRow , lastRow + 1 ) ; <nl> + try { <nl> + groupBlocks . push_back ( block ) ; <nl> + } <nl> + catch ( . . . ) { <nl> + delete block ; <nl> + throw ; <nl> + } <nl> + } <nl> + <nl> + firstRow = lastRow = 0 ; <nl> + / / the next statement ensures we don ' t add the same value ( row ) twice <nl> + rowsAreValid = false ; <nl> + } <nl> + <nl> + <nl> } ; <nl> <nl> / / mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> namespace triagens { <nl> if ( _done ) { <nl> return nullptr ; <nl> } <nl> - <nl> + <nl> if ( _buffer . empty ( ) ) { <nl> - if ( ! ExecutionBlock : : getBlock ( DefaultBatchSize , DefaultBatchSize ) ) { <nl> + if ( ! 
ExecutionBlock : : getBlock ( atLeast , atMost ) ) { <nl> _done = true ; <nl> return nullptr ; <nl> } <nl> namespace triagens { <nl> size_t const curRegs = cur - > getNrRegs ( ) ; <nl> <nl> auto res = new AqlItemBlock ( atMost , _varOverview - > nrRegs [ _depth ] ) ; <nl> + <nl> TRI_ASSERT ( curRegs < = res - > getNrRegs ( ) ) ; <nl> <nl> inheritRegisters ( cur , res , _pos ) ; <nl> namespace triagens { <nl> <nl> while ( j < atMost ) { <nl> / / read the next input tow <nl> - <nl> + <nl> bool newGroup = false ; <nl> if ( _currentGroup . groupValues [ 0 ] . isEmpty ( ) ) { <nl> / / we never had any previous group <nl> namespace triagens { <nl> if ( newGroup ) { <nl> if ( ! _currentGroup . groupValues [ 0 ] . isEmpty ( ) ) { <nl> / / need to emit the current group first <nl> + <nl> emitGroup ( cur , res , j ) ; <nl> + <nl> / / increase output row count <nl> + + j ; <nl> + <nl> + if ( j = = atMost ) { <nl> + / / output is full <nl> + <nl> + / / do NOT advance input pointer <nl> + return res ; <nl> + } <nl> } <nl> <nl> + / / still space left in the output to create a new group <nl> + <nl> / / construct the new group <nl> size_t i = 0 ; <nl> for ( auto it = _aggregateRegisters . begin ( ) ; it ! = _aggregateRegisters . end ( ) ; + + it ) { <nl> namespace triagens { <nl> + + i ; <nl> } <nl> <nl> - _currentGroup . firstRow = _pos ; <nl> + _currentGroup . setFirstRow ( _pos ) ; <nl> } <nl> - <nl> - _currentGroup . lastRow = _pos ; <nl> + <nl> + _currentGroup . setLastRow ( _pos ) ; <nl> <nl> if ( + + _pos > = cur - > size ( ) ) { <nl> _buffer . pop_front ( ) ; <nl> _pos = 0 ; <nl> - <nl> - bool hasMore = ExecutionBlock : : getBlock ( DefaultBatchSize , DefaultBatchSize ) ; <nl> + <nl> + bool hasMore = ! _buffer . empty ( ) ; <nl> + if ( ! hasMore ) { <nl> + hasMore = ExecutionBlock : : getBlock ( atLeast , atMost ) ; <nl> + } <nl> <nl> if ( ! hasMore ) { <nl> + / / no more input . we ' re done <nl> try { <nl> + / / emit last buffered group <nl> emitGroup ( cur , res , j ) ; <nl> + + j ; <nl> delete cur ; <nl> namespace triagens { <nl> <nl> / / hasMore <nl> <nl> + / / move over the last group details into the group before we delete the block <nl> + _currentGroup . addValues ( cur , _groupRegister ) ; <nl> + <nl> delete cur ; <nl> cur = _buffer . front ( ) ; <nl> - _currentGroup . firstRow = 0 ; <nl> - _currentGroup . lastRow = 0 ; <nl> } <nl> } <nl> <nl> + <nl> TRI_ASSERT ( j > 0 ) ; <nl> res - > shrink ( j ) ; <nl> <nl> namespace triagens { <nl> void emitGroup ( AqlItemBlock const * cur , <nl> AqlItemBlock * res , <nl> size_t row ) { <nl> + <nl> size_t i = 0 ; <nl> for ( auto it = _aggregateRegisters . begin ( ) ; it ! = _aggregateRegisters . end ( ) ; + + it ) { <nl> res - > setValue ( row , ( * it ) . first , _currentGroup . groupValues [ i ] ) ; <nl> namespace triagens { <nl> } <nl> <nl> if ( _groupRegister > 0 ) { <nl> - / / emit group details <nl> - TRI_ASSERT ( _currentGroup . firstRow < = _currentGroup . lastRow ) ; <nl> + / / set the group values <nl> + _currentGroup . addValues ( cur , _groupRegister ) ; <nl> <nl> - auto block = cur - > slice ( _currentGroup . firstRow , _currentGroup . lastRow + 1 ) ; <nl> - try { <nl> - _currentGroup . groupBlocks . push_back ( block ) ; <nl> - } <nl> - catch ( . . . ) { <nl> - delete block ; <nl> - throw ; <nl> - } <nl> - <nl> - / / finally set the group details <nl> res - > setValue ( row , _groupRegister , AqlValue : : CreateFromBlocks ( _currentGroup . 
groupBlocks , _variableNames ) ) ; <nl> - <nl> - / / and reset the group so a new one can start <nl> - _currentGroup . reset ( ) ; <nl> } <nl> + <nl> + / / reset the group so a new one can start <nl> + _currentGroup . reset ( ) ; <nl> } <nl> <nl> private : <nl> mmm a / arangod / Aql / V8Expression . cpp <nl> ppp b / arangod / Aql / V8Expression . cpp <nl> AqlValue V8Expression : : execute ( AQL_TRANSACTION_V8 * trx , <nl> THROW_ARANGO_EXCEPTION ( TRI_ERROR_OUT_OF_MEMORY ) ; <nl> } <nl> <nl> - return AqlValue ( new triagens : : basics : : Json ( TRI_UNKNOWN_MEM_ZONE , json ) ) ; <nl> + try { <nl> + auto j = new triagens : : basics : : Json ( TRI_UNKNOWN_MEM_ZONE , json ) ; <nl> + return AqlValue ( j ) ; <nl> + } <nl> + catch ( . . . ) { <nl> + / / prevent memleak <nl> + TRI_FreeJson ( TRI_UNKNOWN_MEM_ZONE , json ) ; <nl> + throw ; <nl> + } <nl> } <nl> <nl> / / mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl>
rework of COLLECT
arangodb/arangodb
7df456a1dd1f35e4f2d3a9b89690f7b4b54d95d2
2014-08-11T18:06:08Z
mmm a / fdbserver / VersionedBTree . actor . cpp <nl> ppp b / fdbserver / VersionedBTree . actor . cpp <nl> struct RedwoodMetrics { <nl> } <nl> <nl> struct Level { <nl> - unsigned int btreePageReads ; <nl> - unsigned int btreePageReadExt ; <nl> - unsigned int btreePageWrite ; <nl> - unsigned int btreePageWriteExt ; <nl> - unsigned int btreePageCommitStart ; <nl> - unsigned int btreePageModify ; <nl> - unsigned int btreePageModifyExt ; <nl> + unsigned int pageRead ; <nl> + unsigned int pageReadExt ; <nl> + unsigned int pageBuild ; <nl> + unsigned int pageBuildExt ; <nl> + unsigned int pageCommitStart ; <nl> + unsigned int pageModify ; <nl> + unsigned int pageModifyExt ; <nl> unsigned int lazyClearRequeue ; <nl> unsigned int lazyClearRequeueExt ; <nl> unsigned int lazyClearFree ; <nl> unsigned int lazyClearFreeExt ; <nl> - double writeRecPct ; <nl> - double writeFillPct ; <nl> - unsigned int writeItemCount ; <nl> - double updateRecPct ; <nl> - double updateFillPct ; <nl> - unsigned int updateItemCount ; <nl> + double buildStoredPct ; <nl> + double buildFillPct ; <nl> + unsigned int buildItemCount ; <nl> + double modifyStoredPct ; <nl> + double modifyFillPct ; <nl> + unsigned int modifyItemCount ; <nl> } ; <nl> <nl> Level levels [ btreeLevels ] ; <nl> struct RedwoodMetrics { <nl> for ( int i = 0 ; i < btreeLevels ; + + i ) { <nl> auto & level = levels [ i ] ; <nl> std : : pair < const char * , unsigned int > metrics [ ] = { <nl> - { " PageWrite " , level . btreePageWrite } , <nl> - { " PageWriteExt " , level . btreePageWriteExt } , <nl> - { " PageModify " , level . btreePageModify } , <nl> - { " PageModifyExt " , level . btreePageModifyExt } , <nl> + { " PageBuild " , level . pageBuild } , <nl> + { " PageBuildExt " , level . pageBuildExt } , <nl> + { " PageModify " , level . pageModify } , <nl> + { " PageModifyExt " , level . pageModifyExt } , <nl> { " " , 0 } , <nl> - { " PageRead " , level . btreePageReads } , <nl> - { " PageReadExt " , level . btreePageReadExt } , <nl> - { " PageCommitStart " , level . btreePageCommitStart } , <nl> + { " PageRead " , level . pageRead } , <nl> + { " PageReadExt " , level . pageReadExt } , <nl> + { " PageCommitStart " , level . pageCommitStart } , <nl> { " " , 0 } , <nl> { " LazyClearInt " , level . lazyClearRequeue } , <nl> { " LazyClearIntExt " , level . lazyClearRequeueExt } , <nl> { " LazyClear " , level . lazyClearFree } , <nl> { " LazyClearExt " , level . lazyClearFreeExt } , <nl> { " " , 0 } , <nl> - { " - WriteAvgCount " , level . btreePageWrite ? level . writeItemCount / level . btreePageWrite : 0 } , <nl> - { " - WriteAvgFillPct " , level . btreePageWrite ? level . writeFillPct / level . btreePageWrite * 100 : 0 } , <nl> - { " - WriteAvgRecPct " , level . btreePageWrite ? level . writeRecPct / level . btreePageWrite * 100 : 0 } , <nl> + { " - BldAvgCount " , level . pageBuild ? level . buildItemCount / level . pageBuild : 0 } , <nl> + { " - BldAvgFillPct " , level . pageBuild ? level . buildFillPct / level . pageBuild * 100 : 0 } , <nl> + { " - BldAvgStoredPct " , level . pageBuild ? level . buildStoredPct / level . pageBuild * 100 : 0 } , <nl> { " " , 0 } , <nl> - { " - ModAvgCount " , level . btreePageModify ? level . updateItemCount / level . btreePageModify : 0 } , <nl> - { " - ModAvgFillPct " , level . btreePageModify ? level . updateFillPct / level . btreePageModify * 100 : 0 } , <nl> - { " - ModAvgRecPct " , level . btreePageModify ? level . updateRecPct / level . btreePageModify * 100 : 0 } <nl> + { " - ModAvgCount " , level . pageModify ? 
level . modifyItemCount / level . pageModify : 0 } , <nl> + { " - ModAvgFillPct " , level . pageModify ? level . modifyFillPct / level . pageModify * 100 : 0 } , <nl> + { " - ModAvgStoredPct " , level . pageModify ? level . modifyStoredPct / level . pageModify * 100 : 0 } <nl> } ; <nl> <nl> if ( s ! = nullptr ) { <nl> class VersionedBTree : public IVersionedStore { <nl> } <nl> <nl> auto & metrics = g_redwoodMetrics . level ( btPage - > height ) ; <nl> - metrics . btreePageWrite + = 1 ; <nl> - metrics . btreePageWriteExt + = blockCount - 1 ; <nl> - metrics . writeFillPct + = ( double ) written / capacity ; <nl> - metrics . writeRecPct + = ( double ) btPage - > kvBytes / capacity ; <nl> - metrics . writeItemCount + = btPage - > tree ( ) . numItems ; <nl> + metrics . pageBuild + = 1 ; <nl> + metrics . pageBuildExt + = blockCount - 1 ; <nl> + metrics . buildFillPct + = ( double ) written / capacity ; <nl> + metrics . buildStoredPct + = ( double ) btPage - > kvBytes / capacity ; <nl> + metrics . buildItemCount + = btPage - > tree ( ) . numItems ; <nl> <nl> / / Create chunked pages <nl> / / TODO : Avoid copying page bytes , but this is not trivial due to how pager checksums are currently handled . <nl> class VersionedBTree : public IVersionedStore { <nl> debug_printf ( " readPage ( ) op = readComplete % s @ % " PRId64 " \ n " , toString ( id ) . c_str ( ) , snapshot - > getVersion ( ) ) ; <nl> const BTreePage * pTreePage = ( const BTreePage * ) page - > begin ( ) ; <nl> auto & metrics = g_redwoodMetrics . level ( pTreePage - > height ) ; <nl> - metrics . btreePageReads + = 1 ; <nl> - metrics . btreePageReadExt + = ( id . size ( ) - 1 ) ; <nl> + metrics . pageRead + = 1 ; <nl> + metrics . pageReadExt + = ( id . size ( ) - 1 ) ; <nl> <nl> if ( ! forLazyClear & & page - > userData = = nullptr ) { <nl> debug_printf ( " readPage ( ) Creating Reader for % s @ % " PRId64 " lower = % s upper = % s \ n " , toString ( id ) . c_str ( ) , <nl> class VersionedBTree : public IVersionedStore { <nl> <nl> / / Update activity counts <nl> auto & metrics = g_redwoodMetrics . level ( ( ( const BTreePage * ) page - > begin ( ) ) - > height ) ; <nl> - metrics . btreePageWrite + = 1 ; <nl> - metrics . btreePageWriteExt + = ( newID . size ( ) - 1 ) ; <nl> + metrics . pageBuild + = 1 ; <nl> + metrics . pageBuildExt + = ( newID . size ( ) - 1 ) ; <nl> <nl> return newID ; <nl> } <nl> class VersionedBTree : public IVersionedStore { <nl> / / Page was updated in - place through edits and written to maybeNewID <nl> void updatedInPlace ( BTreePageIDRef maybeNewID , BTreePage * btPage , int capacity ) { <nl> auto & metrics = g_redwoodMetrics . level ( btPage - > height ) ; <nl> - metrics . btreePageModify + = 1 ; <nl> - metrics . btreePageModify + = ( maybeNewID . size ( ) - 1 ) ; <nl> - metrics . updateFillPct + = ( double ) btPage - > size ( ) / capacity ; <nl> - metrics . updateRecPct + = ( double ) btPage - > kvBytes / capacity ; <nl> - metrics . updateItemCount + = btPage - > tree ( ) . numItems ; <nl> + metrics . pageModify + = 1 ; <nl> + metrics . pageModify + = ( maybeNewID . size ( ) - 1 ) ; <nl> + metrics . modifyFillPct + = ( double ) btPage - > size ( ) / capacity ; <nl> + metrics . modifyStoredPct + = ( double ) btPage - > kvBytes / capacity ; <nl> + metrics . modifyItemCount + = btPage - > tree ( ) . numItems ; <nl> <nl> / / The boundaries can ' t have changed , but the child page link may have . <nl> if ( maybeNewID ! 
= decodeLowerBound - > getChildPage ( ) ) { <nl> class VersionedBTree : public IVersionedStore { <nl> wait ( readPage ( snapshot , rootID , update - > decodeLowerBound , update - > decodeUpperBound ) ) ; <nl> state BTreePage * btPage = ( BTreePage * ) page - > begin ( ) ; <nl> ASSERT ( isLeaf = = btPage - > isLeaf ( ) ) ; <nl> - g_redwoodMetrics . level ( btPage - > height ) . btreePageCommitStart + = 1 ; <nl> + g_redwoodMetrics . level ( btPage - > height ) . pageCommitStart + = 1 ; <nl> <nl> / / TODO : Decide if it is okay to update if the subtree boundaries are expanded . It can result in <nl> / / records in a DeltaTree being outside its decode boundary range , which isn ' t actually invalid <nl>
Renamed some metrics to be clearer. Instead of "write" and "update", the terms are now "build" and "modify", which are less confusing and better represent what they mean in terms of DeltaTree operations.
apple/foundationdb
0d6f2269383a2a5ff7086139837b49e1498b2dc0
2020-05-19T10:31:03Z
mmm a / examples / siamese / convert_mnist_siamese_data . cpp <nl> ppp b / examples / siamese / convert_mnist_siamese_data . cpp <nl> void convert_dataset ( const char * image_filename , const char * label_filename , <nl> } <nl> <nl> delete db ; <nl> - delete pixels ; <nl> + delete [ ] pixels ; <nl> } <nl> <nl> int main ( int argc , char * * argv ) { <nl>
Fix memory leak in convert_mnist_siamese_data.
BVLC/caffe
e8f96f58aa6b64726f62f7304964d1c0a82b5c38
2015-09-01T23:14:43Z
mmm a / cocos / ui / UISlider . cpp <nl> ppp b / cocos / ui / UISlider . cpp <nl> void Slider : : setPercent ( int percent ) <nl> _slidBallRenderer - > setPosition ( dis , _contentSize . height / 2 . 0f ) ; <nl> if ( _scale9Enabled ) <nl> { <nl> - _progressBarRenderer - > setPreferredSize ( Size ( dis , _progressBarTextureSize . height ) ) ; <nl> + _progressBarRenderer - > setPreferredSize ( Size ( dis , _contentSize . height ) ) ; <nl> } <nl> else <nl> { <nl>
fix slider scale9 render
cocos2d/cocos2d-x
94e35d8d95f37b1c4a0789342d458930bfb6ca7a
2015-01-22T07:53:33Z
mmm a / lib / IDE / CodeCompletion . cpp <nl> ppp b / lib / IDE / CodeCompletion . cpp <nl> class CompletionLookup final : public swift : : VisibleDeclConsumer { <nl> Lookup . unboxType ( ReturnType ) ; <nl> } <nl> <nl> - static bool getPositionInTupleExpr ( DeclContext & DC , Expr * Target , <nl> - TupleExpr * Tuple , unsigned & Pos , <nl> - bool & HasName ) { <nl> - auto & SM = DC . getASTContext ( ) . SourceMgr ; <nl> - Pos = 0 ; <nl> - for ( auto E : Tuple - > getElements ( ) ) { <nl> - if ( SM . isBeforeInBuffer ( E - > getEndLoc ( ) , Target - > getStartLoc ( ) ) ) { <nl> - Pos + + ; <nl> - continue ; <nl> - } <nl> - HasName = ! Tuple - > getElementName ( Pos ) . empty ( ) ; <nl> - return true ; <nl> - } <nl> - return false ; <nl> - } <nl> - <nl> void addArgNameCompletionResults ( ArrayRef < StringRef > Names ) { <nl> for ( auto Name : Names ) { <nl> CodeCompletionResultBuilder Builder ( Sink , <nl> class CompletionLookup final : public swift : : VisibleDeclConsumer { <nl> <nl> using FunctionParams = ArrayRef < AnyFunctionType : : Param > ; <nl> <nl> - static void collectArgumentExpectation ( unsigned Position , bool HasName , <nl> - ArrayRef < FunctionParams > Candidates , <nl> - SourceLoc Loc , <nl> - std : : vector < Type > & ExpectedTypes , <nl> - std : : vector < StringRef > & ExpectedNames ) { <nl> - SmallPtrSet < TypeBase * , 4 > seenTypes ; <nl> - SmallPtrSet < const char * , 4 > seenNames ; <nl> - <nl> - for ( auto Params : Candidates ) { <nl> - if ( Position > = Params . size ( ) ) { <nl> - continue ; <nl> - } <nl> - const auto & Ele = Params [ Position ] ; <nl> - if ( Ele . hasLabel ( ) & & ! HasName ) { <nl> - if ( seenNames . insert ( Ele . getLabel ( ) . get ( ) ) . second ) <nl> - ExpectedNames . push_back ( Ele . getLabel ( ) . str ( ) ) ; <nl> - } else { <nl> - if ( seenTypes . insert ( Ele . getType ( ) . getPointer ( ) ) . second ) <nl> - ExpectedTypes . push_back ( Ele . getType ( ) ) ; <nl> - } <nl> - } <nl> - } <nl> - <nl> - bool lookupArgCompletionsAtPosition ( unsigned Position , bool HasName , <nl> - ArrayRef < FunctionParams > Candidates , <nl> - SourceLoc Loc ) { <nl> - std : : vector < Type > ExpectedTypes ; <nl> - std : : vector < StringRef > ExpectedNames ; <nl> - collectArgumentExpectation ( Position , HasName , Candidates , Loc , ExpectedTypes , <nl> - ExpectedNames ) ; <nl> - addArgNameCompletionResults ( ExpectedNames ) ; <nl> - if ( ! ExpectedTypes . empty ( ) ) { <nl> - setExpectedTypes ( ExpectedTypes ) ; <nl> - getValueCompletionsInDeclContext ( Loc , DefaultFilter ) ; <nl> - } <nl> - return true ; <nl> - } <nl> - <nl> - static bool isPotentialSignatureMatch ( ArrayRef < Type > TupleEles , <nl> - ArrayRef < Type > ExprTypes , <nl> - DeclContext * DC ) { <nl> - / / Not likely to be a match if users provide more arguments than expected . <nl> - if ( ExprTypes . size ( ) > = TupleEles . size ( ) ) <nl> - return false ; <nl> - for ( unsigned I = 0 ; I < ExprTypes . size ( ) ; + + I ) { <nl> - auto Ty = ExprTypes [ I ] ; <nl> - if ( Ty & & ! Ty - > is < ErrorType > ( ) ) { <nl> - if ( ! isConvertibleTo ( Ty , TupleEles [ I ] , * DC ) ) { <nl> - return false ; <nl> - } <nl> - } <nl> - } <nl> - return true ; <nl> - } <nl> - <nl> - static void removeUnlikelyOverloads ( SmallVectorImpl < Type > & PossibleArgTypes , <nl> - ArrayRef < Type > TupleEleTypes , <nl> - DeclContext * DC ) { <nl> - for ( auto It = PossibleArgTypes . begin ( ) ; It ! = PossibleArgTypes . 
end ( ) ; ) { <nl> - llvm : : SmallVector < Type , 3 > ExpectedTypes ; <nl> - if ( isa < TupleType > ( ( * It ) . getPointer ( ) ) ) { <nl> - auto Elements = ( * It ) - > getAs < TupleType > ( ) - > getElements ( ) ; <nl> - for ( auto Ele : Elements ) <nl> - ExpectedTypes . push_back ( Ele . getType ( ) ) ; <nl> - } else { <nl> - ExpectedTypes . push_back ( * It ) ; <nl> - } <nl> - if ( isPotentialSignatureMatch ( ExpectedTypes , TupleEleTypes , DC ) ) { <nl> - + + It ; <nl> - } else { <nl> - PossibleArgTypes . erase ( It ) ; <nl> - } <nl> - } <nl> - } <nl> - <nl> - static bool collectionInputTypes ( DeclContext & DC , CallExpr * callExpr , <nl> - SmallVectorImpl < FunctionParams > & candidates ) { <nl> + static bool <nl> + collectPossibleParamLists ( DeclContext & DC , CallExpr * callExpr , <nl> + SmallVectorImpl < FunctionParams > & candidates ) { <nl> auto * fnExpr = callExpr - > getFn ( ) ; <nl> <nl> if ( auto type = fnExpr - > getType ( ) ) { <nl> class CompletionLookup final : public swift : : VisibleDeclConsumer { <nl> return ! candidates . empty ( ) ; <nl> } <nl> <nl> - static bool collectPossibleArgTypes ( DeclContext & DC , CallExpr * CallE , <nl> - Expr * CCExpr , <nl> - SmallVectorImpl < FunctionParams > & Candidates , <nl> - unsigned & Position , bool & HasName ) { <nl> - if ( ! collectionInputTypes ( DC , CallE , Candidates ) ) <nl> - return false ; <nl> - <nl> - if ( auto * tuple = dyn_cast < TupleExpr > ( CallE - > getArg ( ) ) ) { <nl> - for ( unsigned i = 0 , n = tuple - > getNumElements ( ) ; i ! = n ; + + i ) { <nl> - if ( isa < CodeCompletionExpr > ( tuple - > getElement ( i ) ) ) { <nl> - HasName = ! tuple - > getElementName ( i ) . empty ( ) ; <nl> - Position = i ; <nl> - return true ; <nl> - } <nl> - } <nl> - <nl> - return getPositionInTupleExpr ( DC , CCExpr , tuple , Position , HasName ) ; <nl> - } else if ( isa < ParenExpr > ( CallE - > getArg ( ) ) ) { <nl> + static bool getPositionInArgs ( DeclContext & DC , Expr * Args , Expr * CCExpr , <nl> + unsigned & Position , bool & HasName ) { <nl> + if ( isa < ParenExpr > ( Args ) ) { <nl> HasName = false ; <nl> Position = 0 ; <nl> return true ; <nl> } <nl> <nl> + auto * tuple = dyn_cast < TupleExpr > ( Args ) ; <nl> + if ( ! tuple ) <nl> + return false ; <nl> + <nl> + for ( unsigned i = 0 , n = tuple - > getNumElements ( ) ; i ! = n ; + + i ) { <nl> + if ( isa < CodeCompletionExpr > ( tuple - > getElement ( i ) ) ) { <nl> + HasName = ! tuple - > getElementName ( i ) . empty ( ) ; <nl> + Position = i ; <nl> + return true ; <nl> + } <nl> + } <nl> + auto & SM = DC . getASTContext ( ) . SourceMgr ; <nl> + for ( unsigned i = 0 , n = tuple - > getNumElements ( ) ; i ! = n ; + + i ) { <nl> + if ( SM . isBeforeInBuffer ( tuple - > getElement ( i ) - > getEndLoc ( ) , <nl> + CCExpr - > getStartLoc ( ) ) ) <nl> + continue ; <nl> + HasName = ! tuple - > getElementName ( i ) . empty ( ) ; <nl> + Position = i ; <nl> + return true ; <nl> + } <nl> return false ; <nl> } <nl> <nl> class CompletionLookup final : public swift : : VisibleDeclConsumer { <nl> collectArgumentExpectation ( DeclContext & DC , CallExpr * CallE , Expr * CCExpr , <nl> std : : vector < Type > & ExpectedTypes , <nl> std : : vector < StringRef > & ExpectedNames ) { <nl> + / / Collect parameter lists for possible func decls . <nl> SmallVector < FunctionParams , 4 > Candidates ; <nl> + if ( ! collectPossibleParamLists ( DC , CallE , Candidates ) ) <nl> + return false ; <nl> + <nl> + / / Determine the position of code completion token in call argument . 
<nl> unsigned Position ; <nl> bool HasName ; <nl> - if ( collectPossibleArgTypes ( DC , CallE , CCExpr , Candidates , Position , <nl> - HasName ) ) { <nl> - collectArgumentExpectation ( Position , HasName , Candidates , <nl> - CCExpr - > getStartLoc ( ) , ExpectedTypes , <nl> - ExpectedNames ) ; <nl> - return ! ExpectedTypes . empty ( ) | | ! ExpectedNames . empty ( ) ; <nl> + if ( ! getPositionInArgs ( DC , CallE - > getArg ( ) , CCExpr , Position , HasName ) ) <nl> + return false ; <nl> + <nl> + / / Collect possible types at the position . <nl> + { <nl> + SmallPtrSet < TypeBase * , 4 > seenTypes ; <nl> + SmallPtrSet < Identifier , 4 > seenNames ; <nl> + for ( auto Params : Candidates ) { <nl> + if ( Position > = Params . size ( ) ) <nl> + continue ; <nl> + const auto & Param = Params [ Position ] ; <nl> + if ( Param . hasLabel ( ) & & ! HasName ) { <nl> + if ( seenNames . insert ( Param . getLabel ( ) ) . second ) <nl> + ExpectedNames . push_back ( Param . getLabel ( ) . str ( ) ) ; <nl> + } else { <nl> + if ( seenTypes . insert ( Param . getType ( ) . getPointer ( ) ) . second ) <nl> + ExpectedTypes . push_back ( Param . getType ( ) ) ; <nl> + } <nl> + } <nl> } <nl> - return false ; <nl> + return ! ExpectedTypes . empty ( ) | | ! ExpectedNames . empty ( ) ; <nl> } <nl> <nl> bool getCallArgCompletions ( DeclContext & DC , CallExpr * CallE , Expr * CCExpr ) { <nl> - SmallVector < FunctionParams , 4 > PossibleTypes ; <nl> - unsigned Position ; <nl> - bool HasName ; <nl> - bool hasPossibleArgTypes = collectPossibleArgTypes ( DC , CallE , CCExpr , <nl> - PossibleTypes , Position , <nl> - HasName ) ; <nl> - bool hasCompletions = lookupArgCompletionsAtPosition ( Position , HasName , <nl> - PossibleTypes , <nl> - CCExpr - > getStartLoc ( ) ) ; <nl> - <nl> - return hasPossibleArgTypes & & hasCompletions ; <nl> + std : : vector < Type > ExpectedTypes ; <nl> + std : : vector < StringRef > ExpectedNames ; <nl> + if ( ! collectArgumentExpectation ( DC , CallE , CCExpr , ExpectedTypes , <nl> + ExpectedNames ) ) <nl> + return false ; <nl> + <nl> + addArgNameCompletionResults ( ExpectedNames ) ; <nl> + if ( ! ExpectedTypes . empty ( ) ) { <nl> + setExpectedTypes ( ExpectedTypes ) ; <nl> + getValueCompletionsInDeclContext ( CCExpr - > getStartLoc ( ) , DefaultFilter ) ; <nl> + } <nl> + <nl> + return true ; <nl> } <nl> <nl> void getTypeContextEnumElementCompletions ( SourceLoc Loc ) { <nl> mmm a / utils / update_checkout / update - checkout - config . json <nl> ppp b / utils / update_checkout / update - checkout - config . json <nl> <nl> " stable - next " , " upstream " , <nl> " next - upstream " , " upstream - with - swift " ] , <nl> " repos " : { <nl> - " llvm " : " swift - 5 . 0 - branch " , <nl> - " clang " : " swift - 5 . 0 - branch " , <nl> - " compiler - rt " : " swift - 5 . 0 - branch " , <nl> + " llvm " : " upstream - with - swift " , <nl> + " clang " : " upstream - with - swift " , <nl> + " compiler - rt " : " upstream - with - swift " , <nl> " swift " : " master - next " , <nl> " lldb " : " upstream - with - swift " , <nl> " cmark " : " master " , <nl>
Merge remote-tracking branch 'origin/master' into master-next
apple/swift
b378e9d1f144301a2a8006fabf9c0b5bf9ff4de4
2018-08-07T16:29:31Z
mmm a / lib / AST / CMakeLists . txt <nl> ppp b / lib / AST / CMakeLists . txt <nl> add_swift_library ( swiftAST STATIC <nl> COMPONENT_DEPENDS <nl> bitreader bitwriter irreader debuginfoDWARF <nl> profiledata instrumentation object objcarcopts mc mcparser <nl> - bitreader bitwriter ipo option core support $ { LLVM_TARGETS_TO_BUILD } <nl> + bitreader bitwriter lto ipo option core support $ { LLVM_TARGETS_TO_BUILD } <nl> ) <nl> <nl> if ( NOT SWIFT_BUILT_STANDALONE ) <nl>
AST: fix dependencies
apple/swift
02c52833dbe287b39a0d07dbc61e0eb2a74c6772
2016-08-15T18:33:22Z
mmm a / . travis . yml <nl> ppp b / . travis . yml <nl> matrix : <nl> sources : [ ' ubuntu - toolchain - r - test ' ] <nl> packages : [ ' g + + - 4 . 9 ' , ' ninja - build ' ] <nl> before_script : <nl> - - pip install - - user requests [ security ] cpp - coveralls <nl> + - pip install - - user httplib2 cpp - coveralls <nl> after_success : <nl> - coveralls - - build - root test - - include include / nlohmann - - gcov ' gcov - 4 . 9 ' - - gcov - options ' \ - lp ' <nl> env : <nl>
:alembic: try to fix SSL issue
nlohmann/json
697305819f6a74c2116b5dc8592e611fcbc820fa
2020-05-11T10:37:24Z
mmm a / src / video_core / engines / maxwell_3d . cpp <nl> ppp b / src / video_core / engines / maxwell_3d . cpp <nl> std : : vector < Texture : : FullTextureInfo > Maxwell3D : : GetStageTextures ( Regs : : ShaderSt <nl> std : : memcpy ( & tex_info . tic , & tic_entry , sizeof ( tic_entry ) ) ; <nl> <nl> / / Load the TSC data <nl> - if ( tex_handle . tsc_id ! = 0 ) { <nl> - auto tsc_entry = GetTSCEntry ( tex_handle . tsc_id ) ; <nl> - / / TODO ( Subv ) : Workaround for BitField ' s move constructor being deleted . <nl> - std : : memcpy ( & tex_info . tsc , & tsc_entry , sizeof ( tsc_entry ) ) ; <nl> - } <nl> + auto tsc_entry = GetTSCEntry ( tex_handle . tsc_id ) ; <nl> + / / TODO ( Subv ) : Workaround for BitField ' s move constructor being deleted . <nl> + std : : memcpy ( & tex_info . tsc , & tsc_entry , sizeof ( tsc_entry ) ) ; <nl> <nl> textures . push_back ( tex_info ) ; <nl> } <nl> Texture : : FullTextureInfo Maxwell3D : : GetStageTexture ( Regs : : ShaderStage stage , <nl> std : : memcpy ( & tex_info . tic , & tic_entry , sizeof ( tic_entry ) ) ; <nl> <nl> / / Load the TSC data <nl> - if ( tex_handle . tsc_id ! = 0 ) { <nl> - auto tsc_entry = GetTSCEntry ( tex_handle . tsc_id ) ; <nl> - / / TODO ( Subv ) : Workaround for BitField ' s move constructor being deleted . <nl> - std : : memcpy ( & tex_info . tsc , & tsc_entry , sizeof ( tsc_entry ) ) ; <nl> - } <nl> + auto tsc_entry = GetTSCEntry ( tex_handle . tsc_id ) ; <nl> + / / TODO ( Subv ) : Workaround for BitField ' s move constructor being deleted . <nl> + std : : memcpy ( & tex_info . tsc , & tsc_entry , sizeof ( tsc_entry ) ) ; <nl> <nl> return tex_info ; <nl> } <nl>
maxwell_3d: Allow sampler handles with TSC id zero
yuzu-emu/yuzu
04e68e973829ddfb262f0539627ec7fe56424fb2
2019-02-03T07:58:40Z
mmm a / Marlin / src / inc / Version . h <nl> ppp b / Marlin / src / inc / Version . h <nl> <nl> * version was tagged . <nl> * / <nl> # ifndef STRING_DISTRIBUTION_DATE <nl> - # define STRING_DISTRIBUTION_DATE " 2020 - 08 - 20 " <nl> + # define STRING_DISTRIBUTION_DATE " 2020 - 08 - 21 " <nl> # endif <nl> <nl> / * * <nl>
[cron] Bump distribution date (2020-08-21)
MarlinFirmware/Marlin
16e1dbbb216d79535bcc9d95a74d865fa08c23c2
2020-08-21T00:11:45Z
deleted file mode 100644 <nl> index 484f74cf009 . . 00000000000 <nl> mmm a / toolsrc / include / PostBuildLint_OutdatedDynamicCrt . h <nl> ppp / dev / null <nl> <nl> - # pragma once <nl> - # include < vector > <nl> - # include < regex > <nl> - <nl> - namespace vcpkg : : PostBuildLint <nl> - { <nl> - struct OutdatedDynamicCrt <nl> - { <nl> - / / Old CPP <nl> - static const OutdatedDynamicCrt MSVCP100_DLL ; <nl> - static const OutdatedDynamicCrt MSVCP100D_DLL ; <nl> - static const OutdatedDynamicCrt MSVCP110_DLL ; <nl> - static const OutdatedDynamicCrt MSVCP110_WIN_DLL ; <nl> - static const OutdatedDynamicCrt MSVCP120_DLL ; <nl> - static const OutdatedDynamicCrt MSVCP120_CLR0400_DLL ; <nl> - static const OutdatedDynamicCrt MSVCP60_DLL ; <nl> - static const OutdatedDynamicCrt MSVCP_WIN_DLL ; <nl> - <nl> - / / Old C <nl> - static const OutdatedDynamicCrt MSVCR100_DLL ; <nl> - static const OutdatedDynamicCrt MSVCR100D_DLL ; <nl> - static const OutdatedDynamicCrt MSVCR100_CLR0400_DLL ; <nl> - static const OutdatedDynamicCrt MSVCR110_DLL ; <nl> - static const OutdatedDynamicCrt MSVCR120_DLL ; <nl> - static const OutdatedDynamicCrt MSVCR120_CLR0400_DLL ; <nl> - static const OutdatedDynamicCrt MSVCRT_DLL ; <nl> - static const OutdatedDynamicCrt MSVCRT20_DLL ; <nl> - static const OutdatedDynamicCrt MSVCRT40_DLL ; <nl> - <nl> - static const std : : vector < OutdatedDynamicCrt > & values ( ) <nl> - { <nl> - static const std : : vector < OutdatedDynamicCrt > v = { <nl> - MSVCP100_DLL , MSVCP100D_DLL , <nl> - MSVCP110_DLL , MSVCP110_WIN_DLL , <nl> - MSVCP120_DLL , MSVCP120_CLR0400_DLL , <nl> - MSVCP60_DLL , <nl> - MSVCP_WIN_DLL , <nl> - <nl> - MSVCR100_DLL , MSVCR100D_DLL , MSVCR100_CLR0400_DLL , <nl> - MSVCR110_DLL , <nl> - MSVCR120_DLL , MSVCR120_CLR0400_DLL , <nl> - MSVCRT_DLL , MSVCRT20_DLL , MSVCRT40_DLL <nl> - } ; <nl> - return v ; <nl> - } <nl> - <nl> - OutdatedDynamicCrt ( ) = delete ; <nl> - <nl> - std : : regex crt_regex ( ) const ; <nl> - const std : : string & toString ( ) const ; <nl> - <nl> - private : <nl> - explicit OutdatedDynamicCrt ( const std : : string & dll_name , const std : : string & crt_regex_as_string ) <nl> - : m_dll_name ( dll_name ) , m_crt_regex_as_string ( crt_regex_as_string ) { } <nl> - <nl> - std : : string m_dll_name ; <nl> - std : : string m_crt_regex_as_string ; <nl> - } ; <nl> - } <nl> mmm a / toolsrc / src / PostBuildLint . cpp <nl> ppp b / toolsrc / src / PostBuildLint . cpp <nl> <nl> # include " coff_file_reader . h " <nl> # include " PostBuildLint_BuildInfo . h " <nl> # include " PostBuildLint_BuildType . h " <nl> - # include " PostBuildLint_OutdatedDynamicCrt . h " <nl> <nl> namespace vcpkg : : PostBuildLint <nl> { <nl> namespace vcpkg : : PostBuildLint <nl> ERROR_DETECTED = 1 <nl> } ; <nl> <nl> + struct OutdatedDynamicCrt <nl> + { <nl> + std : : string name ; <nl> + std : : regex regex ; <nl> + } ; <nl> + <nl> + const std : : vector < OutdatedDynamicCrt > & get_outdated_dynamic_crts ( ) <nl> + { <nl> + static const std : : vector < OutdatedDynamicCrt > v = { <nl> + { " msvcp100 . dll " , std : : regex ( R " ( msvcp100 \ . dll ) " ) } , <nl> + { " msvcp100d . dll " , std : : regex ( R " ( msvcp100d \ . dll ) " ) } , <nl> + { " msvcp110 . dll " , std : : regex ( R " ( msvcp110 \ . dll ) " ) } , <nl> + { " msvcp110_win . dll " , std : : regex ( R " ( msvcp110_win \ . dll ) " ) } , <nl> + { " msvcp120 . dll " , std : : regex ( R " ( msvcp120 \ . dll ) " ) } , <nl> + { " msvcp120_clr0400 . dll " , std : : regex ( R " ( msvcp120_clr0400 \ . 
dll ) " ) } , <nl> + { " msvcp60 . dll " , std : : regex ( R " ( msvcp60 \ . dll ) " ) } , <nl> + { " msvcp60 . dll " , std : : regex ( R " ( msvcp60 \ . dll ) " ) } , <nl> + <nl> + { " msvcr100 . dll " , std : : regex ( R " ( msvcr100 \ . dll ) " ) } , <nl> + { " msvcr100d . dll " , std : : regex ( R " ( msvcr100d \ . dll ) " ) } , <nl> + { " msvcr100_clr0400 . dll " , std : : regex ( R " ( msvcr100_clr0400 \ . dll ) " ) } , <nl> + { " msvcr110 . dll " , std : : regex ( R " ( msvcr110 \ . dll ) " ) } , <nl> + { " msvcr120 . dll " , std : : regex ( R " ( msvcr120 \ . dll ) " ) } , <nl> + { " msvcr120_clr0400 . dll " , std : : regex ( R " ( msvcr120_clr0400 \ . dll ) " ) } , <nl> + { " msvcrt . dll " , std : : regex ( R " ( msvcrt \ . dll ) " ) } , <nl> + { " msvcrt20 . dll " , std : : regex ( R " ( msvcrt20 \ . dll ) " ) } , <nl> + { " msvcrt40 . dll " , std : : regex ( R " ( msvcrt40 \ . dll ) " ) } <nl> + } ; <nl> + <nl> + return v ; <nl> + } <nl> + <nl> static lint_status check_for_files_in_include_directory ( const fs : : path & package_dir ) <nl> { <nl> const fs : : path include_dir = package_dir / " include " ; <nl> namespace vcpkg : : PostBuildLint <nl> <nl> static lint_status check_outdated_crt_linkage_of_dlls ( const std : : vector < fs : : path > & dlls , const fs : : path dumpbin_exe ) <nl> { <nl> - const std : : vector < OutdatedDynamicCrt > & outdated_crts = OutdatedDynamicCrt : : values ( ) ; <nl> + const std : : vector < OutdatedDynamicCrt > & outdated_crts = get_outdated_dynamic_crts ( ) ; <nl> <nl> std : : vector < OutdatedDynamicCrt_and_file > dlls_with_outdated_crt ; <nl> <nl> namespace vcpkg : : PostBuildLint <nl> <nl> for ( const OutdatedDynamicCrt & outdated_crt : outdated_crts ) <nl> { <nl> - if ( std : : regex_search ( ec_data . output . cbegin ( ) , ec_data . output . cend ( ) , outdated_crt . crt_regex ( ) ) ) <nl> + if ( std : : regex_search ( ec_data . output . cbegin ( ) , ec_data . output . cend ( ) , outdated_crt . regex ) ) <nl> { <nl> dlls_with_outdated_crt . push_back ( { dll , outdated_crt } ) ; <nl> break ; <nl> namespace vcpkg : : PostBuildLint <nl> System : : println ( " " ) ; <nl> for ( const OutdatedDynamicCrt_and_file btf : dlls_with_outdated_crt ) <nl> { <nl> - System : : println ( " % s : % s " , btf . file . generic_string ( ) , btf . outdated_crt . toString ( ) ) ; <nl> + System : : println ( " % s : % s " , btf . file . generic_string ( ) , btf . outdated_crt . name ) ; <nl> } <nl> System : : println ( " " ) ; <nl> <nl> namespace vcpkg : : PostBuildLint <nl> <nl> switch ( linkage_type_value_of ( build_info . library_linkage ) ) <nl> { <nl> - case LinkageType : : DYNAMIC : <nl> - { <nl> - const std : : vector < fs : : path > debug_dlls = Files : : recursive_find_files_with_extension_in_dir ( debug_bin_dir , " . dll " ) ; <nl> - const std : : vector < fs : : path > release_dlls = Files : : recursive_find_files_with_extension_in_dir ( release_bin_dir , " . dll " ) ; <nl> + case LinkageType : : DYNAMIC : <nl> + { <nl> + const std : : vector < fs : : path > debug_dlls = Files : : recursive_find_files_with_extension_in_dir ( debug_bin_dir , " . dll " ) ; <nl> + const std : : vector < fs : : path > release_dlls = Files : : recursive_find_files_with_extension_in_dir ( release_bin_dir , " . 
dll " ) ; <nl> <nl> - error_count + = check_matching_debug_and_release_binaries ( debug_dlls , release_dlls ) ; <nl> + error_count + = check_matching_debug_and_release_binaries ( debug_dlls , release_dlls ) ; <nl> <nl> - error_count + = check_lib_files_are_available_if_dlls_are_available ( build_info . policies , debug_libs . size ( ) , debug_dlls . size ( ) , debug_lib_dir ) ; <nl> - error_count + = check_lib_files_are_available_if_dlls_are_available ( build_info . policies , release_libs . size ( ) , release_dlls . size ( ) , release_lib_dir ) ; <nl> + error_count + = check_lib_files_are_available_if_dlls_are_available ( build_info . policies , debug_libs . size ( ) , debug_dlls . size ( ) , debug_lib_dir ) ; <nl> + error_count + = check_lib_files_are_available_if_dlls_are_available ( build_info . policies , release_libs . size ( ) , release_dlls . size ( ) , release_lib_dir ) ; <nl> <nl> - std : : vector < fs : : path > dlls ; <nl> - dlls . insert ( dlls . cend ( ) , debug_dlls . cbegin ( ) , debug_dlls . cend ( ) ) ; <nl> - dlls . insert ( dlls . cend ( ) , release_dlls . cbegin ( ) , release_dlls . cend ( ) ) ; <nl> + std : : vector < fs : : path > dlls ; <nl> + dlls . insert ( dlls . cend ( ) , debug_dlls . cbegin ( ) , debug_dlls . cend ( ) ) ; <nl> + dlls . insert ( dlls . cend ( ) , release_dlls . cbegin ( ) , release_dlls . cend ( ) ) ; <nl> <nl> - error_count + = check_exports_of_dlls ( dlls , dumpbin_exe ) ; <nl> - error_count + = check_uwp_bit_of_dlls ( spec . target_triplet ( ) . system ( ) , dlls , dumpbin_exe ) ; <nl> - error_count + = check_dll_architecture ( spec . target_triplet ( ) . architecture ( ) , dlls ) ; <nl> + error_count + = check_exports_of_dlls ( dlls , dumpbin_exe ) ; <nl> + error_count + = check_uwp_bit_of_dlls ( spec . target_triplet ( ) . system ( ) , dlls , dumpbin_exe ) ; <nl> + error_count + = check_dll_architecture ( spec . target_triplet ( ) . architecture ( ) , dlls ) ; <nl> <nl> - error_count + = check_outdated_crt_linkage_of_dlls ( dlls , dumpbin_exe ) ; <nl> - break ; <nl> - } <nl> - case LinkageType : : STATIC : <nl> - { <nl> - std : : vector < fs : : path > dlls ; <nl> - Files : : recursive_find_files_with_extension_in_dir ( package_dir , " . dll " , & dlls ) ; <nl> - error_count + = check_no_dlls_present ( dlls ) ; <nl> + error_count + = check_outdated_crt_linkage_of_dlls ( dlls , dumpbin_exe ) ; <nl> + break ; <nl> + } <nl> + case LinkageType : : STATIC : <nl> + { <nl> + std : : vector < fs : : path > dlls ; <nl> + Files : : recursive_find_files_with_extension_in_dir ( package_dir , " . dll " , & dlls ) ; <nl> + error_count + = check_no_dlls_present ( dlls ) ; <nl> <nl> - error_count + = check_bin_folders_are_not_present_in_static_build ( package_dir ) ; <nl> + error_count + = check_bin_folders_are_not_present_in_static_build ( package_dir ) ; <nl> <nl> - error_count + = check_crt_linkage_of_libs ( BuildType : : value_of ( ConfigurationType : : DEBUG , linkage_type_value_of ( build_info . crt_linkage ) ) , debug_libs , dumpbin_exe ) ; <nl> - error_count + = check_crt_linkage_of_libs ( BuildType : : value_of ( ConfigurationType : : RELEASE , linkage_type_value_of ( build_info . crt_linkage ) ) , release_libs , dumpbin_exe ) ; <nl> - break ; <nl> - } <nl> - case LinkageType : : UNKNOWN : <nl> - { <nl> - error_count + = 1 ; <nl> - System : : println ( System : : color : : warning , " Unknown library_linkage architecture : [ % s ] " , build_info . 
library_linkage ) ; <nl> - break ; <nl> - } <nl> - default : <nl> - Checks : : unreachable ( ) ; <nl> + error_count + = check_crt_linkage_of_libs ( BuildType : : value_of ( ConfigurationType : : DEBUG , linkage_type_value_of ( build_info . crt_linkage ) ) , debug_libs , dumpbin_exe ) ; <nl> + error_count + = check_crt_linkage_of_libs ( BuildType : : value_of ( ConfigurationType : : RELEASE , linkage_type_value_of ( build_info . crt_linkage ) ) , release_libs , dumpbin_exe ) ; <nl> + break ; <nl> + } <nl> + case LinkageType : : UNKNOWN : <nl> + { <nl> + error_count + = 1 ; <nl> + System : : println ( System : : color : : warning , " Unknown library_linkage architecture : [ % s ] " , build_info . library_linkage ) ; <nl> + break ; <nl> + } <nl> + default : <nl> + Checks : : unreachable ( ) ; <nl> } <nl> # if 0 <nl> error_count + = check_no_subdirectories ( package_dir / " lib " ) ; <nl> deleted file mode 100644 <nl> index 67965cd936e . . 00000000000 <nl> mmm a / toolsrc / src / PostBuildLint_OutdatedDynamicCrt . cpp <nl> ppp / dev / null <nl> <nl> - # include " pch . h " <nl> - # include " PostBuildLint_OutdatedDynamicCrt . h " <nl> - <nl> - namespace vcpkg : : PostBuildLint <nl> - { <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCP100_DLL = OutdatedDynamicCrt ( " msvcp100 . dll " , R " ( msvcp100 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCP100D_DLL = OutdatedDynamicCrt ( " msvcp100d . dll " , R " ( msvcp100d \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCP110_DLL = OutdatedDynamicCrt ( " msvcp110 . dll " , R " ( msvcp110 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCP110_WIN_DLL = OutdatedDynamicCrt ( " msvcp110_win . dll " , R " ( msvcp110_win \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCP120_DLL = OutdatedDynamicCrt ( " msvcp120 . dll " , R " ( msvcp120 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCP120_CLR0400_DLL = OutdatedDynamicCrt ( " msvcp120_clr0400 . dll " , R " ( msvcp120_clr0400 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCP60_DLL = OutdatedDynamicCrt ( " msvcp60 . dll " , R " ( msvcp60 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCP_WIN_DLL = OutdatedDynamicCrt ( " msvcp60 . dll " , R " ( msvcp60 \ . dll ) " ) ; ; <nl> - <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCR100_DLL = OutdatedDynamicCrt ( " msvcr100 . dll " , R " ( msvcr100 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCR100D_DLL = OutdatedDynamicCrt ( " msvcr100d . dll " , R " ( msvcr100d \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCR100_CLR0400_DLL = OutdatedDynamicCrt ( " msvcr100_clr0400 . dll " , R " ( msvcr100_clr0400 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCR110_DLL = OutdatedDynamicCrt ( " msvcr110 . dll " , R " ( msvcr110 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCR120_DLL = OutdatedDynamicCrt ( " msvcr120 . dll " , R " ( msvcr120 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCR120_CLR0400_DLL = OutdatedDynamicCrt ( " msvcr120_clr0400 . dll " , R " ( msvcr120_clr0400 \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCRT_DLL = OutdatedDynamicCrt ( " msvcrt . dll " , R " ( msvcrt \ . dll ) " ) ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCRT20_DLL = OutdatedDynamicCrt ( " msvcrt20 . 
dll " , R " ( msvcrt20 \ . dll ) " ) ; ; <nl> - const OutdatedDynamicCrt OutdatedDynamicCrt : : MSVCRT40_DLL = OutdatedDynamicCrt ( " msvcrt40 . dll " , R " ( msvcrt40 \ . dll ) " ) ; ; <nl> - <nl> - std : : regex OutdatedDynamicCrt : : crt_regex ( ) const <nl> - { <nl> - const std : : regex r ( this - > m_crt_regex_as_string , std : : regex_constants : : icase ) ; <nl> - return r ; <nl> - } <nl> - <nl> - const std : : string & OutdatedDynamicCrt : : toString ( ) const <nl> - { <nl> - return this - > m_dll_name ; <nl> - } <nl> - } <nl> mmm a / toolsrc / vcpkglib / vcpkglib . vcxproj <nl> ppp b / toolsrc / vcpkglib / vcpkglib . vcxproj <nl> <nl> < ClInclude Include = " . . \ include \ PostBuildLint_BuildType . h " / > <nl> < ClInclude Include = " . . \ include \ PostBuildLint_ConfigurationType . h " / > <nl> < ClInclude Include = " . . \ include \ PostBuildLint_LinkageType . h " / > <nl> - < ClInclude Include = " . . \ include \ PostBuildLint_OutdatedDynamicCrt . h " / > <nl> < ClInclude Include = " . . \ include \ SourceParagraph . h " / > <nl> < ClInclude Include = " . . \ include \ StatusParagraph . h " / > <nl> < ClInclude Include = " . . \ include \ StatusParagraphs . h " / > <nl> <nl> < ClCompile Include = " . . \ src \ PostBuildLint . cpp " / > <nl> < ClCompile Include = " . . \ src \ PostBuildLint_ConfigurationType . cpp " / > <nl> < ClCompile Include = " . . \ src \ PostBuildLint_LinkageType . cpp " / > <nl> - < ClCompile Include = " . . \ src \ PostBuildLint_OutdatedDynamicCrt . cpp " / > <nl> < ClCompile Include = " . . \ src \ PostBuildLint_BuildType . cpp " / > <nl> < ClCompile Include = " . . \ src \ Stopwatch . cpp " / > <nl> < ClCompile Include = " . . \ src \ vcpkglib . cpp " / > <nl> mmm a / toolsrc / vcpkglib / vcpkglib . vcxproj . filters <nl> ppp b / toolsrc / vcpkglib / vcpkglib . vcxproj . filters <nl> <nl> < ClCompile Include = " . . \ src \ PostBuildLint_ConfigurationType . cpp " > <nl> < Filter > Source Files < / Filter > <nl> < / ClCompile > <nl> - < ClCompile Include = " . . \ src \ PostBuildLint_OutdatedDynamicCrt . cpp " > <nl> - < Filter > Source Files < / Filter > <nl> - < / ClCompile > <nl> < ClCompile Include = " . . \ src \ PostBuildLint_BuildType . cpp " > <nl> < Filter > Source Files < / Filter > <nl> < / ClCompile > <nl> <nl> < ClInclude Include = " . . \ include \ PostBuildLint_BuildType . h " > <nl> < Filter > Header Files < / Filter > <nl> < / ClInclude > <nl> - < ClInclude Include = " . . \ include \ PostBuildLint_OutdatedDynamicCrt . h " > <nl> - < Filter > Header Files < / Filter > <nl> - < / ClInclude > <nl> < / ItemGroup > <nl> < / Project > <nl> \ No newline at end of file <nl>
Remove OutdatedDynamicCrt enum. Replace with vector<struct>
microsoft/vcpkg
d36a1b7cb0b8be59e7826a7a699d9951e91abc2c
2017-02-08T06:57:37Z
mmm a / src / MainWindow . ui <nl> ppp b / src / MainWindow . ui <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / widget > <nl> < / item > <nl> < item row = " 2 " column = " 0 " > <nl> + < widget class = " QLabel " name = " labelPragmaCaseSensitiveLike " > <nl> + < property name = " text " > <nl> + < string > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; https : / / www . sqlite . org / pragma . html # pragma_case_sensitive_like & quot ; & gt ; Case Sensitive Like & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> + < / property > <nl> + < property name = " openExternalLinks " > <nl> + < bool > true < / bool > <nl> + < / property > <nl> + < property name = " buddy " > <nl> + < cstring > checkboxPragmaCaseSensitiveLike < / cstring > <nl> + < / property > <nl> + < / widget > <nl> + < / item > <nl> + < item row = " 2 " column = " 1 " > <nl> + < widget class = " QCheckBox " name = " checkboxPragmaCaseSensitiveLike " > <nl> + < property name = " toolTip " > <nl> + < string > Warning : this pragma is not readable and this value has been inferred . Writing the pragma might overwrite a redefined LIKE provided by an SQLite extension . < / string > <nl> + < / property > <nl> + < property name = " text " > <nl> + < string / > <nl> + < / property > <nl> + < / widget > <nl> + < / item > <nl> + < item row = " 3 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaCheckpointFullFsync " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_checkpoint_fullfsync & quot ; & gt ; Checkpoint Full FSYNC & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 2 " column = " 1 " > <nl> + < item row = " 3 " column = " 1 " > <nl> < widget class = " QCheckBox " name = " checkboxPragmaCheckpointFullFsync " > <nl> < property name = " text " > <nl> < string / > <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 3 " column = " 0 " > <nl> + < item row = " 4 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaForeignKeys " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . 
html # pragma_foreign_keys & quot ; & gt ; Foreign Keys & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 3 " column = " 1 " > <nl> + < item row = " 4 " column = " 1 " > <nl> < widget class = " QCheckBox " name = " checkboxPragmaForeignKeys " > <nl> < property name = " text " > <nl> < string / > <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 4 " column = " 0 " > <nl> + < item row = " 5 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaFullFsync " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_fullfsync & quot ; & gt ; Full FSYNC & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 4 " column = " 1 " > <nl> + < item row = " 5 " column = " 1 " > <nl> < widget class = " QCheckBox " name = " checkboxPragmaFullFsync " > <nl> < property name = " text " > <nl> < string / > <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 5 " column = " 0 " > <nl> + < item row = " 6 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaIgnoreCheckConstraints " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_ignore_check_constraints & quot ; & gt ; Ignore Check Constraints & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 5 " column = " 1 " > <nl> + < item row = " 6 " column = " 1 " > <nl> < widget class = " QCheckBox " name = " checkboxPragmaIgnoreCheckConstraints " > <nl> < property name = " text " > <nl> < string / > <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 6 " column = " 0 " > <nl> + < item row = " 7 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaJournalMode " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . 
html # pragma_journal_mode & quot ; & gt ; Journal Mode & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 6 " column = " 1 " > <nl> + < item row = " 7 " column = " 1 " > <nl> < widget class = " QComboBox " name = " comboboxPragmaJournalMode " > <nl> < item > <nl> < property name = " text " > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / item > <nl> < / widget > <nl> < / item > <nl> - < item row = " 7 " column = " 0 " > <nl> + < item row = " 8 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelJournalSizeLimit " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_journal_size_limit & quot ; & gt ; Journal Size Limit & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 7 " column = " 1 " > <nl> + < item row = " 8 " column = " 1 " > <nl> < widget class = " QSpinBox " name = " spinPragmaJournalSizeLimit " > <nl> < property name = " minimum " > <nl> < number > - 1 < / number > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 8 " column = " 0 " > <nl> + < item row = " 9 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaLockingMode " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_locking_mode & quot ; & gt ; Locking Mode & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 8 " column = " 1 " > <nl> + < item row = " 9 " column = " 1 " > <nl> < widget class = " QComboBox " name = " comboboxPragmaLockingMode " > <nl> < item > <nl> < property name = " text " > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / item > <nl> < / widget > <nl> < / item > <nl> - < item row = " 9 " column = " 0 " > <nl> + < item row = " 10 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaMaxPageCount " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . 
html # pragma_max_page_count & quot ; & gt ; Max Page Count & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 9 " column = " 1 " > <nl> + < item row = " 10 " column = " 1 " > <nl> < widget class = " QSpinBox " name = " spinPragmaMaxPageCount " > <nl> < property name = " maximum " > <nl> < number > 2000000000 < / number > <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 10 " column = " 0 " > <nl> + < item row = " 11 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaPageSize " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_page_size & quot ; & gt ; Page Size & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 10 " column = " 1 " > <nl> + < item row = " 11 " column = " 1 " > <nl> < widget class = " QSpinBox " name = " spinPragmaPageSize " > <nl> < property name = " minimum " > <nl> < number > 512 < / number > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 11 " column = " 0 " > <nl> + < item row = " 12 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaRecursiveTriggers " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_recursive_triggers & quot ; & gt ; Recursive Triggers & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 11 " column = " 1 " > <nl> + < item row = " 12 " column = " 1 " > <nl> < widget class = " QCheckBox " name = " checkboxPragmaRecursiveTriggers " > <nl> < property name = " text " > <nl> < string / > <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 12 " column = " 0 " > <nl> + < item row = " 13 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaSecureDelete " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . 
html # pragma_secure_delete & quot ; & gt ; Secure Delete & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 12 " column = " 1 " > <nl> + < item row = " 13 " column = " 1 " > <nl> < widget class = " QCheckBox " name = " checkboxPragmaSecureDelete " > <nl> < property name = " text " > <nl> < string / > <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 13 " column = " 0 " > <nl> + < item row = " 14 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaSynchronous " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_synchronous & quot ; & gt ; Synchronous & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 13 " column = " 1 " > <nl> + < item row = " 14 " column = " 1 " > <nl> < widget class = " QComboBox " name = " comboboxPragmaSynchronous " > <nl> < item > <nl> < property name = " text " > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / item > <nl> < / widget > <nl> < / item > <nl> - < item row = " 14 " column = " 0 " > <nl> + < item row = " 15 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaTempStore " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_temp_store & quot ; & gt ; Temp Store & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 14 " column = " 1 " > <nl> + < item row = " 15 " column = " 1 " > <nl> < widget class = " QComboBox " name = " comboboxPragmaTempStore " > <nl> < item > <nl> < property name = " text " > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / item > <nl> < / widget > <nl> < / item > <nl> - < item row = " 15 " column = " 0 " > <nl> + < item row = " 16 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaUserVersion " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . 
html # pragma_user_version & quot ; & gt ; User Version & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 15 " column = " 1 " > <nl> + < item row = " 16 " column = " 1 " > <nl> < widget class = " QSpinBox " name = " spinPragmaUserVersion " > <nl> < property name = " maximum " > <nl> < number > 2147483647 < / number > <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 16 " column = " 0 " > <nl> + < item row = " 17 " column = " 0 " > <nl> < widget class = " QLabel " name = " labelPragmaWalAutoCheckpoint " > <nl> < property name = " text " > <nl> < string notr = " true " > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; http : / / www . sqlite . org / pragma . html # pragma_wal_autocheckpoint & quot ; & gt ; WAL Auto Checkpoint & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> You can drag SQL statements from an object row and drop them into other applicat <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 16 " column = " 1 " > <nl> + < item row = " 17 " column = " 1 " > <nl> < widget class = " QSpinBox " name = " spinPragmaWalAutoCheckpoint " > <nl> < property name = " maximum " > <nl> < number > 10000 < / number > <nl> < / property > <nl> < / widget > <nl> < / item > <nl> - < item row = " 17 " column = " 0 " > <nl> - < widget class = " QLabel " name = " labelPragmaCaseSensitiveLike " > <nl> - < property name = " text " > <nl> - < string > & lt ; html & gt ; & lt ; head / & gt ; & lt ; body & gt ; & lt ; p & gt ; & lt ; a href = & quot ; https : / / www . sqlite . org / pragma . html # pragma_case_sensitive_like & quot ; & gt ; Case Sensitive Like & lt ; / a & gt ; & lt ; / p & gt ; & lt ; / body & gt ; & lt ; / html & gt ; < / string > <nl> - < / property > <nl> - < property name = " openExternalLinks " > <nl> - < bool > true < / bool > <nl> - < / property > <nl> - < property name = " buddy " > <nl> - < cstring > checkboxPragmaCaseSensitiveLike < / cstring > <nl> - < / property > <nl> - < / widget > <nl> - < / item > <nl> - < item row = " 17 " column = " 1 " > <nl> - < widget class = " QCheckBox " name = " checkboxPragmaCaseSensitiveLike " > <nl> - < property name = " toolTip " > <nl> - < string > Warning : this pragma is not readable and this value has been inferred . Writing the pragma might overwrite a redefined LIKE provided by an SQLite extension . < / string > <nl> - < / property > <nl> - < property name = " text " > <nl> - < string / > <nl> - < / property > <nl> - < / widget > <nl> - < / item > <nl> < / layout > <nl> < / widget > <nl> < / widget > <nl> You can drag SQL statements from the Schema column and drop them into the SQL ed <nl> < tabstop > scrollareaPragmas < / tabstop > <nl> < tabstop > comboboxPragmaAutoVacuum < / tabstop > <nl> < tabstop > checkboxPragmaAutomaticIndex < / tabstop > <nl> + < tabstop > checkboxPragmaCaseSensitiveLike < / tabstop > <nl> < tabstop > checkboxPragmaCheckpointFullFsync < / tabstop > <nl> < tabstop > checkboxPragmaForeignKeys < / tabstop > <nl> < tabstop > checkboxPragmaFullFsync < / tabstop > <nl>
Order PRAGMAs in the Edit Pragma tab alphabetically
sqlitebrowser/sqlitebrowser
8f7fc07604913bc082907d14a48dc722f2e881d1
2018-08-09T12:16:34Z
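Note on the hunk above: besides re-ordering the rows, it carries a tooltip saying the pragma "is not readable and this value has been inferred". That matches SQLite's documented behaviour for case_sensitive_like, which has a write form only, so a GUI has to remember the value it last wrote rather than query it. A minimal sketch against the sqlite3 C API (the in-memory database and the whole program are illustrative, not taken from the application in this diff):

```cpp
#include <sqlite3.h>
#include <cstdio>

// Minimal sketch: PRAGMA case_sensitive_like is write-only in SQLite.
// "PRAGMA case_sensitive_like;" returns no row, so the checkbox in the
// diff above has to infer the value instead of reading it back.
int main()
{
    sqlite3* db = nullptr;
    if (sqlite3_open(":memory:", &db) != SQLITE_OK)
        return 1;

    // Writing the pragma works and changes LIKE behaviour immediately.
    sqlite3_exec(db, "PRAGMA case_sensitive_like = 1;", nullptr, nullptr, nullptr);

    // Reading it back yields zero result rows - there is nothing to display.
    sqlite3_stmt* stmt = nullptr;
    sqlite3_prepare_v2(db, "PRAGMA case_sensitive_like;", -1, &stmt, nullptr);
    int rc = sqlite3_step(stmt);
    std::printf("query returned a row: %s\n", rc == SQLITE_ROW ? "yes" : "no");

    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}
```

This also explains the warning about extensions: writing the pragma re-installs the built-in LIKE implementation, which would silently replace a LIKE overridden by a loaded extension.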
mmm a / dbms / include / DB / Functions / FunctionsCoding . h <nl> ppp b / dbms / include / DB / Functions / FunctionsCoding . h <nl> class FunctionHex : public IFunction <nl> if ( byte = = 0 & & ! was_nonzero & & offset ) <nl> continue ; <nl> <nl> + was_nonzero = true ; <nl> + <nl> * ( out + + ) = digit [ byte > > 4 ] ; <nl> * ( out + + ) = digit [ byte & 15 ] ; <nl> } <nl>
clickhouse : fixed function hex [ # CONV - 6788 ] .
ClickHouse/ClickHouse
934566de4acfdf0d3c3ab568a4ec2733d903ff4e
2013-03-07T12:11:44Z
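Why the one-line fix above matters: the loop is meant to skip only *leading* zero bytes, but without setting `was_nonzero` every zero byte is skipped, so a value such as 0x0000A000 would print as "A0" instead of "A000". A standalone sketch of the corrected logic (the function name, signature and fixed 32-bit width are illustrative; only the skip condition and the added assignment mirror the patched loop):

```cpp
#include <cstdint>
#include <cstdio>
#include <string>

// Standalone sketch of the leading-zero suppression the fix restores:
// zero bytes are skipped only *before* the first non-zero byte, and the
// least significant byte (offset == 0) is always printed.
static std::string hexUInt(uint32_t value)
{
    static const char digit[] = "0123456789ABCDEF";
    std::string out;
    bool was_nonzero = false;

    for (int offset = 24; offset >= 0; offset -= 8)
    {
        uint8_t byte = static_cast<uint8_t>(value >> offset);

        // Skip only the leading zero bytes (but keep at least the last byte).
        if (byte == 0 && !was_nonzero && offset)
            continue;

        was_nonzero = true;  // the assignment the commit above adds

        out += digit[byte >> 4];
        out += digit[byte & 15];
    }
    return out;
}

int main()
{
    std::printf("%s\n", hexUInt(0x0000A000u).c_str());  // A000
    std::printf("%s\n", hexUInt(0u).c_str());           // 00
    return 0;
}
```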
mmm a / cocos2dx / cocoa / CCObject . h <nl> ppp b / cocos2dx / cocoa / CCObject . h <nl> typedef int ( CCObject : : * SEL_Compare ) ( CCObject * ) ; <nl> # define compare_selector ( _SELECTOR ) ( SEL_Compare ) ( & _SELECTOR ) <nl> <nl> / / new callbacks based on C + + 11 <nl> - # define CALLBACK_0 ( __selector__ , __target__ , . . . ) std : : bind ( & __selector__ , __target__ , # # __VA_ARGS__ ) <nl> - # define CALLBACK_1 ( __selector__ , __target__ , . . . ) std : : bind ( & __selector__ , __target__ , std : : placeholders : : _1 , # # __VA_ARGS__ ) <nl> - # define CALLBACK_2 ( __selector__ , __target__ , . . . ) std : : bind ( & __selector__ , __target__ , std : : placeholders : : _1 , std : : placeholders : : _2 , # # __VA_ARGS__ ) <nl> + # define CC_CALLBACK_0 ( __selector__ , __target__ , . . . ) std : : bind ( & __selector__ , __target__ , # # __VA_ARGS__ ) <nl> + # define CC_CALLBACK_1 ( __selector__ , __target__ , . . . ) std : : bind ( & __selector__ , __target__ , std : : placeholders : : _1 , # # __VA_ARGS__ ) <nl> + # define CC_CALLBACK_2 ( __selector__ , __target__ , . . . ) std : : bind ( & __selector__ , __target__ , std : : placeholders : : _1 , std : : placeholders : : _2 , # # __VA_ARGS__ ) <nl> <nl> / / end of base_nodes group <nl> / / / @ } <nl> mmm a / cocos2dx / layers_scenes_transitions_nodes / CCLayer . cpp <nl> ppp b / cocos2dx / layers_scenes_transitions_nodes / CCLayer . cpp <nl> void CCLayer : : setKeyboardEnabled ( bool enabled ) <nl> CCDirector * pDirector = CCDirector : : sharedDirector ( ) ; <nl> if ( enabled ) <nl> { <nl> - pDirector - > getKeyboardDispatcher ( ) - > setKeyPressDelegate ( CALLBACK_1 ( CCLayer : : keyPressed , this ) ) ; <nl> - pDirector - > getKeyboardDispatcher ( ) - > setKeyReleaseDelegate ( CALLBACK_1 ( CCLayer : : keyReleased , this ) ) ; <nl> + pDirector - > getKeyboardDispatcher ( ) - > setKeyPressDelegate ( CC_CALLBACK_1 ( CCLayer : : keyPressed , this ) ) ; <nl> + pDirector - > getKeyboardDispatcher ( ) - > setKeyReleaseDelegate ( CC_CALLBACK_1 ( CCLayer : : keyReleased , this ) ) ; <nl> } <nl> else <nl> { <nl> mmm a / samples / Cpp / TestCpp / Classes / ActionsTest / ActionsTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / ActionsTest / ActionsTest . cpp <nl> void ActionSequence2 : : onEnter ( ) <nl> CCPlace : : create ( ccp ( 200 , 200 ) ) , <nl> CCShow : : create ( ) , <nl> CCMoveBy : : create ( 1 , ccp ( 100 , 0 ) ) , <nl> - / / CALLBACK_0 = = std : : bind ( & function , instance , . . . ) <nl> - CCCallFunc : : create ( CALLBACK_0 ( ActionSequence2 : : callback1 , this ) ) , <nl> - CCCallFunc : : create ( CALLBACK_0 ( ActionSequence2 : : callback2 , this , m_grossini ) ) , <nl> - CCCallFunc : : create ( CALLBACK_0 ( ActionSequence2 : : callback3 , this , m_grossini , ( void * ) 0xbebabeba ) ) , <nl> + / / CC_CALLBACK_0 = = std : : bind ( & function , instance , . . . ) <nl> + CCCallFunc : : create ( CC_CALLBACK_0 ( ActionSequence2 : : callback1 , this ) ) , <nl> + CCCallFunc : : create ( CC_CALLBACK_0 ( ActionSequence2 : : callback2 , this , m_grossini ) ) , <nl> + CCCallFunc : : create ( CC_CALLBACK_0 ( ActionSequence2 : : callback3 , this , m_grossini , ( void * ) 0xbebabeba ) ) , <nl> NULL ) ; <nl> <nl> m_grossini - > runAction ( action ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / BaseTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / BaseTest . 
cpp <nl> void BaseTest : : onEnter ( ) <nl> } <nl> <nl> / / add menu <nl> - / / CALLBACK_1 = = std : : bind ( function_ptr , instance , std : : placeholders : : _1 , . . . ) <nl> - CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_pPathB1 , s_pPathB2 , CALLBACK_1 ( BaseTest : : backCallback , this ) ) ; <nl> - CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_pPathR1 , s_pPathR2 , CALLBACK_1 ( BaseTest : : restartCallback , this ) ) ; <nl> - CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_pPathF1 , s_pPathF2 , CALLBACK_1 ( BaseTest : : nextCallback , this ) ) ; <nl> + / / CC_CALLBACK_1 = = std : : bind ( function_ptr , instance , std : : placeholders : : _1 , . . . ) <nl> + CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_pPathB1 , s_pPathB2 , CC_CALLBACK_1 ( BaseTest : : backCallback , this ) ) ; <nl> + CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_pPathR1 , s_pPathR2 , CC_CALLBACK_1 ( BaseTest : : restartCallback , this ) ) ; <nl> + CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_pPathF1 , s_pPathF2 , CC_CALLBACK_1 ( BaseTest : : nextCallback , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item1 , item2 , item3 , NULL ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / Box2DTestBed / Box2dView . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / Box2DTestBed / Box2dView . cpp <nl> bool MenuLayer : : initWithEntryID ( int entryId ) <nl> addChild ( label , 1 ) ; <nl> label - > setPosition ( ccp ( visibleOrigin . x + visibleSize . width / 2 , visibleOrigin . y + visibleSize . height - 50 ) ) ; <nl> <nl> - CCMenuItemImage * item1 = CCMenuItemImage : : create ( " Images / b1 . png " , " Images / b2 . png " , CALLBACK_1 ( MenuLayer : : backCallback , this ) ) ; <nl> - CCMenuItemImage * item2 = CCMenuItemImage : : create ( " Images / r1 . png " , " Images / r2 . png " , CALLBACK_1 ( MenuLayer : : restartCallback , this ) ) ; <nl> - CCMenuItemImage * item3 = CCMenuItemImage : : create ( " Images / f1 . png " , " Images / f2 . png " , CALLBACK_1 ( MenuLayer : : nextCallback , this ) ) ; <nl> + CCMenuItemImage * item1 = CCMenuItemImage : : create ( " Images / b1 . png " , " Images / b2 . png " , CC_CALLBACK_1 ( MenuLayer : : backCallback , this ) ) ; <nl> + CCMenuItemImage * item2 = CCMenuItemImage : : create ( " Images / r1 . png " , " Images / r2 . png " , CC_CALLBACK_1 ( MenuLayer : : restartCallback , this ) ) ; <nl> + CCMenuItemImage * item3 = CCMenuItemImage : : create ( " Images / f1 . png " , " Images / f2 . png " , CC_CALLBACK_1 ( MenuLayer : : nextCallback , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item1 , item2 , item3 , NULL ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / BugsTest / Bug - 1159 . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / BugsTest / Bug - 1159 . cpp <nl> bool Bug1159Layer : : init ( ) <nl> sprite_b - > setPosition ( ccp ( s . width / 2 , s . height / 2 ) ) ; <nl> addChild ( sprite_b ) ; <nl> <nl> - CCMenuItemLabel * label = CCMenuItemLabel : : create ( CCLabelTTF : : create ( " Flip Me " , " Helvetica " , 24 ) , CALLBACK_1 ( Bug1159Layer : : callBack , this ) ) ; <nl> + CCMenuItemLabel * label = CCMenuItemLabel : : create ( CCLabelTTF : : create ( " Flip Me " , " Helvetica " , 24 ) , CC_CALLBACK_1 ( Bug1159Layer : : callBack , this ) ) ; <nl> CCMenu * menu = CCMenu : : create ( label , NULL ) ; <nl> menu - > setPosition ( ccp ( s . width - 200 . 0f , 50 . 0f ) ) ; <nl> addChild ( menu ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / BugsTest / Bug - 422 . 
cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / BugsTest / Bug - 422 . cpp <nl> void Bug422Layer : : reset ( ) <nl> removeChild ( node , false ) ; <nl> / / [ self removeChildByTag : localtag - 1 cleanup : NO ] ; <nl> <nl> - CCMenuItem * item1 = CCMenuItemFont : : create ( " One " , CALLBACK_1 ( Bug422Layer : : menuCallback , this ) ) ; <nl> + CCMenuItem * item1 = CCMenuItemFont : : create ( " One " , CC_CALLBACK_1 ( Bug422Layer : : menuCallback , this ) ) ; <nl> CCLog ( " MenuItemFont : % p " , item1 ) ; <nl> - CCMenuItem * item2 = CCMenuItemFont : : create ( " Two " , CALLBACK_1 ( Bug422Layer : : menuCallback , this ) ) ; <nl> + CCMenuItem * item2 = CCMenuItemFont : : create ( " Two " , CC_CALLBACK_1 ( Bug422Layer : : menuCallback , this ) ) ; <nl> CCMenu * menu = CCMenu : : create ( item1 , item2 , NULL ) ; <nl> menu - > alignItemsVertically ( ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / BugsTest / Bug - 458 / Bug - 458 . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / BugsTest / Bug - 458 / Bug - 458 . cpp <nl> bool Bug458Layer : : init ( ) <nl> / / [ question setContentSize : CGSizeMake ( 50 , 50 ) ] ; <nl> / / [ question2 setContentSize : CGSizeMake ( 50 , 50 ) ] ; <nl> <nl> - CCMenuItemSprite * sprite = CCMenuItemSprite : : create ( question2 , question , CALLBACK_1 ( Bug458Layer : : selectAnswer , this ) ) ; <nl> + CCMenuItemSprite * sprite = CCMenuItemSprite : : create ( question2 , question , CC_CALLBACK_1 ( Bug458Layer : : selectAnswer , this ) ) ; <nl> CCLayerColor * layer = CCLayerColor : : create ( ccc4 ( 0 , 0 , 255 , 255 ) , 100 , 100 ) ; <nl> question - > release ( ) ; <nl> question2 - > release ( ) ; <nl> <nl> CCLayerColor * layer2 = CCLayerColor : : create ( ccc4 ( 255 , 0 , 0 , 255 ) , 100 , 100 ) ; <nl> - CCMenuItemSprite * sprite2 = CCMenuItemSprite : : create ( layer , layer2 , CALLBACK_1 ( Bug458Layer : : selectAnswer , this ) ) ; <nl> + CCMenuItemSprite * sprite2 = CCMenuItemSprite : : create ( layer , layer2 , CC_CALLBACK_1 ( Bug458Layer : : selectAnswer , this ) ) ; <nl> CCMenu * menu = CCMenu : : create ( sprite , sprite2 , NULL ) ; <nl> menu - > alignItemsVerticallyWithPadding ( 100 ) ; <nl> menu - > setPosition ( ccp ( size . width / 2 , size . height / 2 ) ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / BugsTest / Bug - 914 . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / BugsTest / Bug - 914 . cpp <nl> bool Bug914Layer : : init ( ) <nl> <nl> / / create and initialize a Label <nl> CCLabelTTF * label = CCLabelTTF : : create ( " Hello World " , " Marker Felt " , 64 ) ; <nl> - CCMenuItem * item1 = CCMenuItemFont : : create ( " restart " , CALLBACK_1 ( Bug914Layer : : restart , this ) ) ; <nl> + CCMenuItem * item1 = CCMenuItemFont : : create ( " restart " , CC_CALLBACK_1 ( Bug914Layer : : restart , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item1 , NULL ) ; <nl> menu - > alignItemsVertically ( ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / BugsTest / BugsTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / BugsTest / BugsTest . 
cpp <nl> void BugsTestBaseLayer : : onEnter ( ) <nl> <nl> CCMenuItemFont : : setFontName ( " Arial " ) ; <nl> CCMenuItemFont : : setFontSize ( 24 ) ; <nl> - CCMenuItemFont * pMainItem = CCMenuItemFont : : create ( " Back " , CALLBACK_1 ( BugsTestBaseLayer : : backCallback , this ) ) ; <nl> + CCMenuItemFont * pMainItem = CCMenuItemFont : : create ( " Back " , CC_CALLBACK_1 ( BugsTestBaseLayer : : backCallback , this ) ) ; <nl> pMainItem - > setPosition ( ccp ( VisibleRect : : rightBottom ( ) . x - 50 , VisibleRect : : rightBottom ( ) . y + 25 ) ) ; <nl> CCMenu * pMenu = CCMenu : : create ( pMainItem , NULL ) ; <nl> pMenu - > setPosition ( CCPointZero ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / ChipmunkTest / ChipmunkTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / ChipmunkTest / ChipmunkTest . cpp <nl> ChipmunkTestLayer : : ChipmunkTestLayer ( ) <nl> <nl> / / menu for debug layer <nl> CCMenuItemFont : : setFontSize ( 18 ) ; <nl> - CCMenuItemFont * item = CCMenuItemFont : : create ( " Toggle debug " , CALLBACK_1 ( ChipmunkTestLayer : : toggleDebugCallback , this ) ) ; <nl> + CCMenuItemFont * item = CCMenuItemFont : : create ( " Toggle debug " , CC_CALLBACK_1 ( ChipmunkTestLayer : : toggleDebugCallback , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item , NULL ) ; <nl> this - > addChild ( menu ) ; <nl> void ChipmunkTestLayer : : update ( float delta ) <nl> <nl> void ChipmunkTestLayer : : createResetButton ( ) <nl> { <nl> - CCMenuItemImage * reset = CCMenuItemImage : : create ( " Images / r1 . png " , " Images / r2 . png " , CALLBACK_1 ( ChipmunkTestLayer : : reset , this ) ) ; <nl> + CCMenuItemImage * reset = CCMenuItemImage : : create ( " Images / r1 . png " , " Images / r2 . png " , CC_CALLBACK_1 ( ChipmunkTestLayer : : reset , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( reset , NULL ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / CocosDenshionTest / CocosDenshionTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / CocosDenshionTest / CocosDenshionTest . cpp <nl> m_nSoundId ( 0 ) <nl> / / # else <nl> CCLabelTTF * label = CCLabelTTF : : create ( testItems [ i ] . c_str ( ) , " Arial " , 24 ) ; <nl> / / # endif <nl> - CCMenuItemLabel * pMenuItem = CCMenuItemLabel : : create ( label , CALLBACK_1 ( CocosDenshionTest : : menuCallback , this ) ) ; <nl> + CCMenuItemLabel * pMenuItem = CCMenuItemLabel : : create ( label , CC_CALLBACK_1 ( CocosDenshionTest : : menuCallback , this ) ) ; <nl> <nl> m_pItmeMenu - > addChild ( pMenuItem , i + 10000 ) ; <nl> pMenuItem - > setPosition ( ccp ( VisibleRect : : center ( ) . x , ( VisibleRect : : top ( ) . y - ( i + 1 ) * LINE_SPACE ) ) ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / ExtensionsTest / CocosBuilderTest / TimelineCallbackTest / TimelineCallbackLayerLoader . h <nl> ppp b / samples / Cpp / TestCpp / Classes / ExtensionsTest / CocosBuilderTest / TimelineCallbackTest / TimelineCallbackLayerLoader . h <nl> <nl> - # ifndef _TIMELINECALLBACK_TESTLAYERLOADER_H_ <nl> - # define _TIMELINECALLBACK_TESTLAYERLOADER_H_ <nl> + # ifndef _TIMELINECC_CALLBACK_TESTLAYERLOADER_H_ <nl> + # define _TIMELINECC_CALLBACK_TESTLAYERLOADER_H_ <nl> <nl> # include " TimelineCallbackTestLayer . 
h " <nl> <nl> class TimelineCallbackTestLayerLoader : public cocos2d : : extension : : CCLayerLoader <nl> CCB_VIRTUAL_NEW_AUTORELEASE_CREATECCNODE_METHOD ( TimelineCallbackTestLayer ) ; <nl> } ; <nl> <nl> - # endif / * _TIMELINECALLBACK_TESTLAYERLOADER_H_ * / <nl> + # endif / * _TIMELINECC_CALLBACK_TESTLAYERLOADER_H_ * / <nl> mmm a / samples / Cpp / TestCpp / Classes / ExtensionsTest / ControlExtensionTest / CCControlScene . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / ExtensionsTest / ControlExtensionTest / CCControlScene . cpp <nl> bool CCControlScene : : init ( ) <nl> { <nl> if ( CCLayer : : init ( ) ) <nl> { <nl> - CCMenuItemFont * pBackItem = CCMenuItemFont : : create ( " Back " , CALLBACK_1 ( CCControlScene : : toExtensionsMainLayer , this ) ) ; <nl> + CCMenuItemFont * pBackItem = CCMenuItemFont : : create ( " Back " , CC_CALLBACK_1 ( CCControlScene : : toExtensionsMainLayer , this ) ) ; <nl> pBackItem - > setPosition ( ccp ( VisibleRect : : rightBottom ( ) . x - 50 , VisibleRect : : rightBottom ( ) . y + 25 ) ) ; <nl> CCMenu * pBackMenu = CCMenu : : create ( pBackItem , NULL ) ; <nl> pBackMenu - > setPosition ( CCPointZero ) ; <nl> bool CCControlScene : : init ( ) <nl> addChild ( m_pSceneTitleLabel , 1 ) ; <nl> <nl> / / Add the menu <nl> - CCMenuItemImage * item1 = CCMenuItemImage : : create ( " Images / b1 . png " , " Images / b2 . png " , CALLBACK_1 ( CCControlScene : : previousCallback , this ) ) ; <nl> - CCMenuItemImage * item2 = CCMenuItemImage : : create ( " Images / r1 . png " , " Images / r2 . png " , CALLBACK_1 ( CCControlScene : : restartCallback , this ) ) ; <nl> - CCMenuItemImage * item3 = CCMenuItemImage : : create ( " Images / f1 . png " , " Images / f2 . png " , CALLBACK_1 ( CCControlScene : : nextCallback , this ) ) ; <nl> + CCMenuItemImage * item1 = CCMenuItemImage : : create ( " Images / b1 . png " , " Images / b2 . png " , CC_CALLBACK_1 ( CCControlScene : : previousCallback , this ) ) ; <nl> + CCMenuItemImage * item2 = CCMenuItemImage : : create ( " Images / r1 . png " , " Images / r2 . png " , CC_CALLBACK_1 ( CCControlScene : : restartCallback , this ) ) ; <nl> + CCMenuItemImage * item3 = CCMenuItemImage : : create ( " Images / f1 . png " , " Images / f2 . png " , CC_CALLBACK_1 ( CCControlScene : : nextCallback , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item1 , item3 , item2 , NULL ) ; <nl> menu - > setPosition ( CCPointZero ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / ExtensionsTest / EditBoxTest / EditBoxTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / ExtensionsTest / EditBoxTest / EditBoxTest . cpp <nl> EditBoxTest : : EditBoxTest ( ) <nl> addChild ( m_pTTFShowEditReturn ) ; <nl> <nl> / / Back Menu <nl> - CCMenuItemFont * itemBack = CCMenuItemFont : : create ( " Back " , CALLBACK_1 ( EditBoxTest : : toExtensionsMainLayer , this ) ) ; <nl> + CCMenuItemFont * itemBack = CCMenuItemFont : : create ( " Back " , CC_CALLBACK_1 ( EditBoxTest : : toExtensionsMainLayer , this ) ) ; <nl> itemBack - > setPosition ( ccp ( visibleOrigin . x + visibleSize . width - 50 , visibleOrigin . y + 25 ) ) ; <nl> CCMenu * menuBack = CCMenu : : create ( itemBack , NULL ) ; <nl> menuBack - > setPosition ( CCPointZero ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / ExtensionsTest / NetworkTest / HttpClientTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / ExtensionsTest / NetworkTest / HttpClientTest . 
cpp <nl> HttpClientTest : : HttpClientTest ( ) <nl> <nl> / / Get <nl> CCLabelTTF * labelGet = CCLabelTTF : : create ( " Test Get " , " Arial " , 22 ) ; <nl> - CCMenuItemLabel * itemGet = CCMenuItemLabel : : create ( labelGet , CALLBACK_1 ( HttpClientTest : : onMenuGetTestClicked , this ) ) ; <nl> + CCMenuItemLabel * itemGet = CCMenuItemLabel : : create ( labelGet , CC_CALLBACK_1 ( HttpClientTest : : onMenuGetTestClicked , this ) ) ; <nl> itemGet - > setPosition ( ccp ( winSize . width / 2 , winSize . height - MARGIN - SPACE ) ) ; <nl> menuRequest - > addChild ( itemGet ) ; <nl> <nl> / / Post <nl> CCLabelTTF * labelPost = CCLabelTTF : : create ( " Test Post " , " Arial " , 22 ) ; <nl> - CCMenuItemLabel * itemPost = CCMenuItemLabel : : create ( labelPost , CALLBACK_1 ( HttpClientTest : : onMenuPostTestClicked , this ) ) ; <nl> + CCMenuItemLabel * itemPost = CCMenuItemLabel : : create ( labelPost , CC_CALLBACK_1 ( HttpClientTest : : onMenuPostTestClicked , this ) ) ; <nl> itemPost - > setPosition ( ccp ( winSize . width / 2 , winSize . height - MARGIN - 2 * SPACE ) ) ; <nl> menuRequest - > addChild ( itemPost ) ; <nl> <nl> / / Post Binary <nl> CCLabelTTF * labelPostBinary = CCLabelTTF : : create ( " Test Post Binary " , " Arial " , 22 ) ; <nl> - CCMenuItemLabel * itemPostBinary = CCMenuItemLabel : : create ( labelPostBinary , CALLBACK_1 ( HttpClientTest : : onMenuPostBinaryTestClicked , this ) ) ; <nl> + CCMenuItemLabel * itemPostBinary = CCMenuItemLabel : : create ( labelPostBinary , CC_CALLBACK_1 ( HttpClientTest : : onMenuPostBinaryTestClicked , this ) ) ; <nl> itemPostBinary - > setPosition ( ccp ( winSize . width / 2 , winSize . height - MARGIN - 3 * SPACE ) ) ; <nl> menuRequest - > addChild ( itemPostBinary ) ; <nl> <nl> / / Put <nl> CCLabelTTF * labelPut = CCLabelTTF : : create ( " Test Put " , " Arial " , 22 ) ; <nl> - CCMenuItemLabel * itemPut = CCMenuItemLabel : : create ( labelPut , CALLBACK_1 ( HttpClientTest : : onMenuPutTestClicked , this ) ) ; <nl> + CCMenuItemLabel * itemPut = CCMenuItemLabel : : create ( labelPut , CC_CALLBACK_1 ( HttpClientTest : : onMenuPutTestClicked , this ) ) ; <nl> itemPut - > setPosition ( ccp ( winSize . width / 2 , winSize . height - MARGIN - 4 * SPACE ) ) ; <nl> menuRequest - > addChild ( itemPut ) ; <nl> <nl> / / Delete <nl> CCLabelTTF * labelDelete = CCLabelTTF : : create ( " Test Delete " , " Arial " , 22 ) ; <nl> - CCMenuItemLabel * itemDelete = CCMenuItemLabel : : create ( labelDelete , CALLBACK_1 ( HttpClientTest : : onMenuDeleteTestClicked , this ) ) ; <nl> + CCMenuItemLabel * itemDelete = CCMenuItemLabel : : create ( labelDelete , CC_CALLBACK_1 ( HttpClientTest : : onMenuDeleteTestClicked , this ) ) ; <nl> itemDelete - > setPosition ( ccp ( winSize . width / 2 , winSize . height - MARGIN - 5 * SPACE ) ) ; <nl> menuRequest - > addChild ( itemDelete ) ; <nl> <nl> HttpClientTest : : HttpClientTest ( ) <nl> addChild ( m_labelStatusCode ) ; <nl> <nl> / / Back Menu <nl> - CCMenuItemFont * itemBack = CCMenuItemFont : : create ( " Back " , CALLBACK_1 ( HttpClientTest : : toExtensionsMainLayer , this ) ) ; <nl> + CCMenuItemFont * itemBack = CCMenuItemFont : : create ( " Back " , CC_CALLBACK_1 ( HttpClientTest : : toExtensionsMainLayer , this ) ) ; <nl> itemBack - > setPosition ( ccp ( VisibleRect : : rightBottom ( ) . x - 50 , VisibleRect : : rightBottom ( ) . 
y + 25 ) ) ; <nl> CCMenu * menuBack = CCMenu : : create ( itemBack , NULL ) ; <nl> menuBack - > setPosition ( CCPointZero ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / ExtensionsTest / NotificationCenterTest / NotificationCenterTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / ExtensionsTest / NotificationCenterTest / NotificationCenterTest . cpp <nl> NotificationCenterTest : : NotificationCenterTest ( ) <nl> { <nl> CCSize s = CCDirector : : sharedDirector ( ) - > getWinSize ( ) ; <nl> <nl> - CCMenuItemFont * pBackItem = CCMenuItemFont : : create ( " Back " , CALLBACK_1 ( NotificationCenterTest : : toExtensionsMainLayer , this ) ) ; <nl> + CCMenuItemFont * pBackItem = CCMenuItemFont : : create ( " Back " , CC_CALLBACK_1 ( NotificationCenterTest : : toExtensionsMainLayer , this ) ) ; <nl> pBackItem - > setPosition ( ccp ( VisibleRect : : rightBottom ( ) . x - 50 , VisibleRect : : rightBottom ( ) . y + 25 ) ) ; <nl> CCMenu * pBackMenu = CCMenu : : create ( pBackItem , NULL ) ; <nl> pBackMenu - > setPosition ( CCPointZero ) ; <nl> NotificationCenterTest : : NotificationCenterTest ( ) <nl> CCLabelTTF * label2 = CCLabelTTF : : create ( " switch on " , " Marker Felt " , 26 ) ; <nl> CCMenuItemLabel * item1 = CCMenuItemLabel : : create ( label1 ) ; <nl> CCMenuItemLabel * item2 = CCMenuItemLabel : : create ( label2 ) ; <nl> - CCMenuItemToggle * item = CCMenuItemToggle : : createWithCallback ( CALLBACK_1 ( NotificationCenterTest : : toggleSwitch , this ) , item1 , item2 , NULL ) ; <nl> + CCMenuItemToggle * item = CCMenuItemToggle : : createWithCallback ( CC_CALLBACK_1 ( NotificationCenterTest : : toggleSwitch , this ) , item1 , item2 , NULL ) ; <nl> / / turn on <nl> item - > setSelectedIndex ( 1 ) ; <nl> CCMenu * menu = CCMenu : : create ( item , NULL ) ; <nl> NotificationCenterTest : : NotificationCenterTest ( ) <nl> CCLabelTTF * label2 = CCLabelTTF : : create ( " connected " , " Marker Felt " , 26 ) ; <nl> CCMenuItemLabel * item1 = CCMenuItemLabel : : create ( label1 ) ; <nl> CCMenuItemLabel * item2 = CCMenuItemLabel : : create ( label2 ) ; <nl> - CCMenuItemToggle * item = CCMenuItemToggle : : createWithCallback ( CALLBACK_1 ( NotificationCenterTest : : connectToSwitch , this ) , item1 , item2 , NULL ) ; <nl> + CCMenuItemToggle * item = CCMenuItemToggle : : createWithCallback ( CC_CALLBACK_1 ( NotificationCenterTest : : connectToSwitch , this ) , item1 , item2 , NULL ) ; <nl> item - > setTag ( kTagConnect + i ) ; <nl> item - > setPosition ( ccp ( light - > getPosition ( ) . x , light - > getPosition ( ) . y + 50 ) ) ; <nl> menuConnect - > addChild ( item , 0 ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / ExtensionsTest / TableViewTest / TableViewTestScene . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / ExtensionsTest / TableViewTest / TableViewTestScene . cpp <nl> bool TableViewTestLayer : : init ( ) <nl> tableView - > reloadData ( ) ; <nl> <nl> / / Back Menu <nl> - CCMenuItemFont * itemBack = CCMenuItemFont : : create ( " Back " , CALLBACK_1 ( TableViewTestLayer : : toExtensionsMainLayer , this ) ) ; <nl> + CCMenuItemFont * itemBack = CCMenuItemFont : : create ( " Back " , CC_CALLBACK_1 ( TableViewTestLayer : : toExtensionsMainLayer , this ) ) ; <nl> itemBack - > setPosition ( ccp ( VisibleRect : : rightBottom ( ) . x - 50 , VisibleRect : : rightBottom ( ) . y + 25 ) ) ; <nl> CCMenu * menuBack = CCMenu : : create ( itemBack , NULL ) ; <nl> menuBack - > setPosition ( CCPointZero ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / LabelTest / LabelTest . 
cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / LabelTest / LabelTest . cpp <nl> LabelTTFTest : : LabelTTFTest ( ) <nl> <nl> CCMenuItemFont : : setFontSize ( 30 ) ; <nl> CCMenu * menu = CCMenu : : create ( <nl> - CCMenuItemFont : : create ( " Left " , CALLBACK_1 ( LabelTTFTest : : setAlignmentLeft , this ) ) , <nl> - CCMenuItemFont : : create ( " Center " , CALLBACK_1 ( LabelTTFTest : : setAlignmentCenter , this ) ) , <nl> - CCMenuItemFont : : create ( " Right " , CALLBACK_1 ( LabelTTFTest : : setAlignmentRight , this ) ) , <nl> + CCMenuItemFont : : create ( " Left " , CC_CALLBACK_1 ( LabelTTFTest : : setAlignmentLeft , this ) ) , <nl> + CCMenuItemFont : : create ( " Center " , CC_CALLBACK_1 ( LabelTTFTest : : setAlignmentCenter , this ) ) , <nl> + CCMenuItemFont : : create ( " Right " , CC_CALLBACK_1 ( LabelTTFTest : : setAlignmentRight , this ) ) , <nl> NULL ) ; <nl> menu - > alignItemsVerticallyWithPadding ( 4 ) ; <nl> menu - > setPosition ( ccp ( 50 , s . height / 2 - 20 ) ) ; <nl> this - > addChild ( menu ) ; <nl> <nl> menu = CCMenu : : create ( <nl> - CCMenuItemFont : : create ( " Top " , CALLBACK_1 ( LabelTTFTest : : setAlignmentTop , this ) ) , <nl> - CCMenuItemFont : : create ( " Middle " , CALLBACK_1 ( LabelTTFTest : : setAlignmentMiddle , this ) ) , <nl> - CCMenuItemFont : : create ( " Bottom " , CALLBACK_1 ( LabelTTFTest : : setAlignmentBottom , this ) ) , <nl> + CCMenuItemFont : : create ( " Top " , CC_CALLBACK_1 ( LabelTTFTest : : setAlignmentTop , this ) ) , <nl> + CCMenuItemFont : : create ( " Middle " , CC_CALLBACK_1 ( LabelTTFTest : : setAlignmentMiddle , this ) ) , <nl> + CCMenuItemFont : : create ( " Bottom " , CC_CALLBACK_1 ( LabelTTFTest : : setAlignmentBottom , this ) ) , <nl> NULL ) ; <nl> menu - > alignItemsVerticallyWithPadding ( 4 ) ; <nl> menu - > setPosition ( ccp ( s . width - 50 , s . 
height / 2 - 20 ) ) ; <nl> BitmapFontMultiLineAlignment : : BitmapFontMultiLineAlignment ( ) <nl> this - > m_pArrowsShouldRetain - > retain ( ) ; <nl> <nl> CCMenuItemFont : : setFontSize ( 20 ) ; <nl> - CCMenuItemFont * longSentences = CCMenuItemFont : : create ( " Long Flowing Sentences " , CALLBACK_1 ( BitmapFontMultiLineAlignment : : stringChanged , this ) ) ; <nl> - CCMenuItemFont * lineBreaks = CCMenuItemFont : : create ( " Short Sentences With Intentional Line Breaks " , CALLBACK_1 ( BitmapFontMultiLineAlignment : : stringChanged , this ) ) ; <nl> - CCMenuItemFont * mixed = CCMenuItemFont : : create ( " Long Sentences Mixed With Intentional Line Breaks " , CALLBACK_1 ( BitmapFontMultiLineAlignment : : stringChanged , this ) ) ; <nl> + CCMenuItemFont * longSentences = CCMenuItemFont : : create ( " Long Flowing Sentences " , CC_CALLBACK_1 ( BitmapFontMultiLineAlignment : : stringChanged , this ) ) ; <nl> + CCMenuItemFont * lineBreaks = CCMenuItemFont : : create ( " Short Sentences With Intentional Line Breaks " , CC_CALLBACK_1 ( BitmapFontMultiLineAlignment : : stringChanged , this ) ) ; <nl> + CCMenuItemFont * mixed = CCMenuItemFont : : create ( " Long Sentences Mixed With Intentional Line Breaks " , CC_CALLBACK_1 ( BitmapFontMultiLineAlignment : : stringChanged , this ) ) ; <nl> CCMenu * stringMenu = CCMenu : : create ( longSentences , lineBreaks , mixed , NULL ) ; <nl> stringMenu - > alignItemsVertically ( ) ; <nl> <nl> BitmapFontMultiLineAlignment : : BitmapFontMultiLineAlignment ( ) <nl> <nl> CCMenuItemFont : : setFontSize ( 30 ) ; <nl> <nl> - CCMenuItemFont * left = CCMenuItemFont : : create ( " Left " , CALLBACK_1 ( BitmapFontMultiLineAlignment : : alignmentChanged , this ) ) ; <nl> - CCMenuItemFont * center = CCMenuItemFont : : create ( " Center " , CALLBACK_1 ( BitmapFontMultiLineAlignment : : alignmentChanged , this ) ) ; <nl> - CCMenuItemFont * right = CCMenuItemFont : : create ( " Right " , CALLBACK_1 ( BitmapFontMultiLineAlignment : : alignmentChanged , this ) ) ; <nl> + CCMenuItemFont * left = CCMenuItemFont : : create ( " Left " , CC_CALLBACK_1 ( BitmapFontMultiLineAlignment : : alignmentChanged , this ) ) ; <nl> + CCMenuItemFont * center = CCMenuItemFont : : create ( " Center " , CC_CALLBACK_1 ( BitmapFontMultiLineAlignment : : alignmentChanged , this ) ) ; <nl> + CCMenuItemFont * right = CCMenuItemFont : : create ( " Right " , CC_CALLBACK_1 ( BitmapFontMultiLineAlignment : : alignmentChanged , this ) ) ; <nl> CCMenu * alignmentMenu = CCMenu : : create ( left , center , right , NULL ) ; <nl> alignmentMenu - > alignItemsHorizontallyWithPadding ( alignmentItemPadding ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / LayerTest / LayerTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / LayerTest / LayerTest . 
cpp <nl> LayerGradient : : LayerGradient ( ) <nl> CCLabelTTF * label2 = CCLabelTTF : : create ( " Compressed Interpolation : Disabled " , " Marker Felt " , 26 ) ; <nl> CCMenuItemLabel * item1 = CCMenuItemLabel : : create ( label1 ) ; <nl> CCMenuItemLabel * item2 = CCMenuItemLabel : : create ( label2 ) ; <nl> - CCMenuItemToggle * item = CCMenuItemToggle : : createWithCallback ( CALLBACK_1 ( LayerGradient : : toggleItem , this ) , item1 , item2 , NULL ) ; <nl> + CCMenuItemToggle * item = CCMenuItemToggle : : createWithCallback ( CC_CALLBACK_1 ( LayerGradient : : toggleItem , this ) , item1 , item2 , NULL ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item , NULL ) ; <nl> addChild ( menu ) ; <nl> void LayerIgnoreAnchorPointPos : : onEnter ( ) <nl> CCSize lsize = l - > getContentSize ( ) ; <nl> child - > setPosition ( ccp ( lsize . width / 2 , lsize . height / 2 ) ) ; <nl> <nl> - CCMenuItemFont * item = CCMenuItemFont : : create ( " Toggle ignore anchor point " , CALLBACK_1 ( LayerIgnoreAnchorPointPos : : onToggle , this ) ) ; <nl> + CCMenuItemFont * item = CCMenuItemFont : : create ( " Toggle ignore anchor point " , CC_CALLBACK_1 ( LayerIgnoreAnchorPointPos : : onToggle , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item , NULL ) ; <nl> this - > addChild ( menu ) ; <nl> void LayerIgnoreAnchorPointRot : : onEnter ( ) <nl> CCSize lsize = l - > getContentSize ( ) ; <nl> child - > setPosition ( ccp ( lsize . width / 2 , lsize . height / 2 ) ) ; <nl> <nl> - CCMenuItemFont * item = CCMenuItemFont : : create ( " Toogle ignore anchor point " , CALLBACK_1 ( LayerIgnoreAnchorPointRot : : onToggle , this ) ) ; <nl> + CCMenuItemFont * item = CCMenuItemFont : : create ( " Toogle ignore anchor point " , CC_CALLBACK_1 ( LayerIgnoreAnchorPointRot : : onToggle , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item , NULL ) ; <nl> this - > addChild ( menu ) ; <nl> void LayerIgnoreAnchorPointScale : : onEnter ( ) <nl> CCSize lsize = l - > getContentSize ( ) ; <nl> child - > setPosition ( ccp ( lsize . width / 2 , lsize . height / 2 ) ) ; <nl> <nl> - CCMenuItemFont * item = CCMenuItemFont : : create ( " Toogle ignore anchor point " , CALLBACK_1 ( LayerIgnoreAnchorPointScale : : onToggle , this ) ) ; <nl> + CCMenuItemFont * item = CCMenuItemFont : : create ( " Toogle ignore anchor point " , CC_CALLBACK_1 ( LayerIgnoreAnchorPointScale : : onToggle , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item , NULL ) ; <nl> this - > addChild ( menu ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / MenuTest / MenuTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / MenuTest / MenuTest . 
cpp <nl> MenuLayerMainMenu : : MenuLayerMainMenu ( ) <nl> CCSprite * spriteSelected = CCSprite : : create ( s_MenuItem , CCRectMake ( 0 , 23 * 1 , 115 , 23 ) ) ; <nl> CCSprite * spriteDisabled = CCSprite : : create ( s_MenuItem , CCRectMake ( 0 , 23 * 0 , 115 , 23 ) ) ; <nl> <nl> - CCMenuItemSprite * item1 = CCMenuItemSprite : : create ( spriteNormal , spriteSelected , spriteDisabled , CALLBACK_1 ( MenuLayerMainMenu : : menuCallback , this ) ) ; <nl> + CCMenuItemSprite * item1 = CCMenuItemSprite : : create ( spriteNormal , spriteSelected , spriteDisabled , CC_CALLBACK_1 ( MenuLayerMainMenu : : menuCallback , this ) ) ; <nl> <nl> / / Image Item <nl> - CCMenuItem * item2 = CCMenuItemImage : : create ( s_SendScore , s_PressSendScore , CALLBACK_1 ( MenuLayerMainMenu : : menuCallback2 , this ) ) ; <nl> + CCMenuItem * item2 = CCMenuItemImage : : create ( s_SendScore , s_PressSendScore , CC_CALLBACK_1 ( MenuLayerMainMenu : : menuCallback2 , this ) ) ; <nl> <nl> / / Label Item ( LabelAtlas ) <nl> CCLabelAtlas * labelAtlas = CCLabelAtlas : : create ( " 0123456789 " , " fonts / labelatlas . png " , 16 , 24 , ' . ' ) ; <nl> - CCMenuItemLabel * item3 = CCMenuItemLabel : : create ( labelAtlas , CALLBACK_1 ( MenuLayerMainMenu : : menuCallbackDisabled , this ) ) ; <nl> + CCMenuItemLabel * item3 = CCMenuItemLabel : : create ( labelAtlas , CC_CALLBACK_1 ( MenuLayerMainMenu : : menuCallbackDisabled , this ) ) ; <nl> item3 - > setDisabledColor ( ccc3 ( 32 , 32 , 64 ) ) ; <nl> item3 - > setColor ( ccc3 ( 200 , 200 , 255 ) ) ; <nl> <nl> MenuLayerMainMenu : : MenuLayerMainMenu ( ) <nl> <nl> / / Label Item ( CCLabelBMFont ) <nl> CCLabelBMFont * label = CCLabelBMFont : : create ( " configuration " , " fonts / bitmapFontTest3 . fnt " ) ; <nl> - CCMenuItemLabel * item5 = CCMenuItemLabel : : create ( label , CALLBACK_1 ( MenuLayerMainMenu : : menuCallbackConfig , this ) ) ; <nl> + CCMenuItemLabel * item5 = CCMenuItemLabel : : create ( label , CC_CALLBACK_1 ( MenuLayerMainMenu : : menuCallbackConfig , this ) ) ; <nl> <nl> / / Testing issue # 500 <nl> item5 - > setScale ( 0 . 8f ) ; <nl> <nl> / / Events <nl> CCMenuItemFont : : setFontName ( " Marker Felt " ) ; <nl> - CCMenuItemFont * item6 = CCMenuItemFont : : create ( " Priority Test " , CALLBACK_1 ( MenuLayerMainMenu : : menuCallbackPriorityTest , this ) ) ; <nl> + CCMenuItemFont * item6 = CCMenuItemFont : : create ( " Priority Test " , CC_CALLBACK_1 ( MenuLayerMainMenu : : menuCallbackPriorityTest , this ) ) ; <nl> <nl> / / Bugs Item <nl> - CCMenuItemFont * item7 = CCMenuItemFont : : create ( " Bugs " , CALLBACK_1 ( MenuLayerMainMenu : : menuCallbackBugsTest , this ) ) ; <nl> + CCMenuItemFont * item7 = CCMenuItemFont : : create ( " Bugs " , CC_CALLBACK_1 ( MenuLayerMainMenu : : menuCallbackBugsTest , this ) ) ; <nl> <nl> / / Font Item <nl> - CCMenuItemFont * item8 = CCMenuItemFont : : create ( " Quit " , CALLBACK_1 ( MenuLayerMainMenu : : onQuit , this ) ) ; <nl> + CCMenuItemFont * item8 = CCMenuItemFont : : create ( " Quit " , CC_CALLBACK_1 ( MenuLayerMainMenu : : onQuit , this ) ) ; <nl> <nl> - CCMenuItemFont * item9 = CCMenuItemFont : : create ( " Remove menu item when moving " , CALLBACK_1 ( MenuLayerMainMenu : : menuMovingCallback , this ) ) ; <nl> + CCMenuItemFont * item9 = CCMenuItemFont : : create ( " Remove menu item when moving " , CC_CALLBACK_1 ( MenuLayerMainMenu : : menuMovingCallback , this ) ) ; <nl> <nl> CCActionInterval * color_action = CCTintBy : : create ( 0 . 
5f , 0 , - 255 , - 255 ) ; <nl> CCActionInterval * color_back = color_action - > reverse ( ) ; <nl> MenuLayer2 : : MenuLayer2 ( ) <nl> { <nl> for ( int i = 0 ; i < 2 ; i + + ) <nl> { <nl> - CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_PlayNormal , s_PlaySelect , CALLBACK_1 ( MenuLayer2 : : menuCallback , this ) ) ; <nl> - CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_HighNormal , s_HighSelect , CALLBACK_1 ( MenuLayer2 : : menuCallbackOpacity , this ) ) ; <nl> - CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_AboutNormal , s_AboutSelect , CALLBACK_1 ( MenuLayer2 : : menuCallbackAlign , this ) ) ; <nl> + CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_PlayNormal , s_PlaySelect , CC_CALLBACK_1 ( MenuLayer2 : : menuCallback , this ) ) ; <nl> + CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_HighNormal , s_HighSelect , CC_CALLBACK_1 ( MenuLayer2 : : menuCallbackOpacity , this ) ) ; <nl> + CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_AboutNormal , s_AboutSelect , CC_CALLBACK_1 ( MenuLayer2 : : menuCallbackAlign , this ) ) ; <nl> <nl> item1 - > setScaleX ( 1 . 5f ) ; <nl> item2 - > setScaleX ( 0 . 5f ) ; <nl> MenuLayer4 : : MenuLayer4 ( ) <nl> title1 - > setEnabled ( false ) ; <nl> CCMenuItemFont : : setFontName ( " Marker Felt " ) ; <nl> CCMenuItemFont : : setFontSize ( 34 ) ; <nl> - CCMenuItemToggle * item1 = CCMenuItemToggle : : createWithCallback ( CALLBACK_1 ( MenuLayer4 : : menuCallback , this ) , <nl> + CCMenuItemToggle * item1 = CCMenuItemToggle : : createWithCallback ( CC_CALLBACK_1 ( MenuLayer4 : : menuCallback , this ) , <nl> CCMenuItemFont : : create ( " On " ) , <nl> CCMenuItemFont : : create ( " Off " ) , <nl> NULL ) ; <nl> MenuLayer4 : : MenuLayer4 ( ) <nl> title2 - > setEnabled ( false ) ; <nl> CCMenuItemFont : : setFontName ( " Marker Felt " ) ; <nl> CCMenuItemFont : : setFontSize ( 34 ) ; <nl> - CCMenuItemToggle * item2 = CCMenuItemToggle : : createWithCallback ( CALLBACK_1 ( MenuLayer4 : : menuCallback , this ) , <nl> + CCMenuItemToggle * item2 = CCMenuItemToggle : : createWithCallback ( CC_CALLBACK_1 ( MenuLayer4 : : menuCallback , this ) , <nl> CCMenuItemFont : : create ( " On " ) , <nl> CCMenuItemFont : : create ( " Off " ) , <nl> NULL ) ; <nl> MenuLayer4 : : MenuLayer4 ( ) <nl> title3 - > setEnabled ( false ) ; <nl> CCMenuItemFont : : setFontName ( " Marker Felt " ) ; <nl> CCMenuItemFont : : setFontSize ( 34 ) ; <nl> - CCMenuItemToggle * item3 = CCMenuItemToggle : : createWithCallback ( CALLBACK_1 ( MenuLayer4 : : menuCallback , this ) , <nl> + CCMenuItemToggle * item3 = CCMenuItemToggle : : createWithCallback ( CC_CALLBACK_1 ( MenuLayer4 : : menuCallback , this ) , <nl> CCMenuItemFont : : create ( " High " ) , <nl> CCMenuItemFont : : create ( " Low " ) , <nl> NULL ) ; <nl> MenuLayer4 : : MenuLayer4 ( ) <nl> title4 - > setEnabled ( false ) ; <nl> CCMenuItemFont : : setFontName ( " Marker Felt " ) ; <nl> CCMenuItemFont : : setFontSize ( 34 ) ; <nl> - CCMenuItemToggle * item4 = CCMenuItemToggle : : createWithCallback ( CALLBACK_1 ( MenuLayer4 : : menuCallback , this ) , <nl> + CCMenuItemToggle * item4 = CCMenuItemToggle : : createWithCallback ( CC_CALLBACK_1 ( MenuLayer4 : : menuCallback , this ) , <nl> CCMenuItemFont : : create ( " Off " ) , <nl> NULL ) ; <nl> <nl> MenuLayer4 : : MenuLayer4 ( ) <nl> CCMenuItemFont : : setFontSize ( 34 ) ; <nl> <nl> CCLabelBMFont * label = CCLabelBMFont : : create ( " go back " , " fonts / bitmapFontTest3 . 
fnt " ) ; <nl> - CCMenuItemLabel * back = CCMenuItemLabel : : create ( label , CALLBACK_1 ( MenuLayer4 : : backCallback , this ) ) ; <nl> + CCMenuItemLabel * back = CCMenuItemLabel : : create ( label , CC_CALLBACK_1 ( MenuLayer4 : : backCallback , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( <nl> title1 , title2 , <nl> MenuLayerPriorityTest : : MenuLayerPriorityTest ( ) <nl> / / Menu 1 <nl> CCMenuItemFont : : setFontName ( " Marker Felt " ) ; <nl> CCMenuItemFont : : setFontSize ( 18 ) ; <nl> - CCMenuItemFont * item1 = CCMenuItemFont : : create ( " Return to Main Menu " , CALLBACK_1 ( MenuLayerPriorityTest : : menuCallback , this ) ) ; <nl> + CCMenuItemFont * item1 = CCMenuItemFont : : create ( " Return to Main Menu " , CC_CALLBACK_1 ( MenuLayerPriorityTest : : menuCallback , this ) ) ; <nl> CCMenuItemFont * item2 = CCMenuItemFont : : create ( " Disable menu for 5 seconds " , [ & ] ( CCObject * sender ) { <nl> m_pMenu1 - > setEnabled ( false ) ; <nl> CCDelayTime * wait = CCDelayTime : : create ( 5 ) ; <nl> void MenuLayerPriorityTest : : menuCallback ( CCObject * pSender ) <nl> / / BugsTest <nl> BugsTest : : BugsTest ( ) <nl> { <nl> - CCMenuItemFont * issue1410 = CCMenuItemFont : : create ( " Issue 1410 " , CALLBACK_1 ( BugsTest : : issue1410MenuCallback , this ) ) ; <nl> - CCMenuItemFont * issue1410_2 = CCMenuItemFont : : create ( " Issue 1410 # 2 " , CALLBACK_1 ( BugsTest : : issue1410v2MenuCallback , this ) ) ; <nl> - CCMenuItemFont * back = CCMenuItemFont : : create ( " Back " , CALLBACK_1 ( BugsTest : : backMenuCallback , this ) ) ; <nl> + CCMenuItemFont * issue1410 = CCMenuItemFont : : create ( " Issue 1410 " , CC_CALLBACK_1 ( BugsTest : : issue1410MenuCallback , this ) ) ; <nl> + CCMenuItemFont * issue1410_2 = CCMenuItemFont : : create ( " Issue 1410 # 2 " , CC_CALLBACK_1 ( BugsTest : : issue1410v2MenuCallback , this ) ) ; <nl> + CCMenuItemFont * back = CCMenuItemFont : : create ( " Back " , CC_CALLBACK_1 ( BugsTest : : backMenuCallback , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( issue1410 , issue1410_2 , back , NULL ) ; <nl> addChild ( menu ) ; <nl> RemoveMenuItemWhenMove : : RemoveMenuItemWhenMove ( ) <nl> item = CCMenuItemFont : : create ( " item 1 " ) ; <nl> item - > retain ( ) ; <nl> <nl> - CCMenuItemFont * back = CCMenuItemFont : : create ( " go back " , CALLBACK_1 ( RemoveMenuItemWhenMove : : goBack , this ) ) ; <nl> + CCMenuItemFont * back = CCMenuItemFont : : create ( " go back " , CC_CALLBACK_1 ( RemoveMenuItemWhenMove : : goBack , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item , back , NULL ) ; <nl> addChild ( menu ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / MotionStreakTest / MotionStreakTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / MotionStreakTest / MotionStreakTest . cpp <nl> void MotionStreakTest : : onEnter ( ) <nl> <nl> CCSize s = CCDirector : : sharedDirector ( ) - > getWinSize ( ) ; <nl> <nl> - CCMenuItemToggle * itemMode = CCMenuItemToggle : : createWithCallback ( CALLBACK_1 ( MotionStreakTest : : modeCallback , this ) , <nl> + CCMenuItemToggle * itemMode = CCMenuItemToggle : : createWithCallback ( CC_CALLBACK_1 ( MotionStreakTest : : modeCallback , this ) , <nl> CCMenuItemFont : : create ( " Use High Quality Mode " ) , <nl> CCMenuItemFont : : create ( " Use Fast Mode " ) , <nl> NULL ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / ParticleTest / ParticleTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / ParticleTest / ParticleTest . 
cpp <nl> void ParticleDemo : : onEnter ( void ) <nl> <nl> CCSize s = CCDirector : : sharedDirector ( ) - > getWinSize ( ) ; <nl> <nl> - CCMenuItemToggle * item4 = CCMenuItemToggle : : createWithCallback ( CALLBACK_1 ( ParticleDemo : : toggleCallback , this ) , <nl> + CCMenuItemToggle * item4 = CCMenuItemToggle : : createWithCallback ( CC_CALLBACK_1 ( ParticleDemo : : toggleCallback , this ) , <nl> CCMenuItemFont : : create ( " Free Movement " ) , <nl> CCMenuItemFont : : create ( " Relative Movement " ) , <nl> CCMenuItemFont : : create ( " Grouped Movement " ) , <nl> mmm a / samples / Cpp / TestCpp / Classes / PerformanceTest / PerformanceParticleTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / PerformanceTest / PerformanceParticleTest . cpp <nl> void ParticleMainScene : : initWithSubTest ( int asubtest , int particles ) <nl> { <nl> char str [ 10 ] = { 0 } ; <nl> sprintf ( str , " % d " , i ) ; <nl> - CCMenuItemFont * itemFont = CCMenuItemFont : : create ( str , CALLBACK_1 ( ParticleMainScene : : testNCallback , this ) ) ; <nl> + CCMenuItemFont * itemFont = CCMenuItemFont : : create ( str , CC_CALLBACK_1 ( ParticleMainScene : : testNCallback , this ) ) ; <nl> itemFont - > setTag ( i ) ; <nl> pSubMenu - > addChild ( itemFont , 10 ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / PerformanceTest / PerformanceSpriteTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / PerformanceTest / PerformanceSpriteTest . cpp <nl> void SpriteMainScene : : initWithSubTest ( int asubtest , int nNodes ) <nl> quantityNodes = 0 ; <nl> <nl> CCMenuItemFont : : setFontSize ( 65 ) ; <nl> - CCMenuItemFont * decrease = CCMenuItemFont : : create ( " - " , CALLBACK_1 ( SpriteMainScene : : onDecrease , this ) ) ; <nl> + CCMenuItemFont * decrease = CCMenuItemFont : : create ( " - " , CC_CALLBACK_1 ( SpriteMainScene : : onDecrease , this ) ) ; <nl> decrease - > setColor ( ccc3 ( 0 , 200 , 20 ) ) ; <nl> - CCMenuItemFont * increase = CCMenuItemFont : : create ( " + " , CALLBACK_1 ( SpriteMainScene : : onIncrease , this ) ) ; <nl> + CCMenuItemFont * increase = CCMenuItemFont : : create ( " + " , CC_CALLBACK_1 ( SpriteMainScene : : onIncrease , this ) ) ; <nl> increase - > setColor ( ccc3 ( 0 , 200 , 20 ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( decrease , increase , NULL ) ; <nl> void SpriteMainScene : : initWithSubTest ( int asubtest , int nNodes ) <nl> { <nl> char str [ 10 ] = { 0 } ; <nl> sprintf ( str , " % d " , i ) ; <nl> - CCMenuItemFont * itemFont = CCMenuItemFont : : create ( str , CALLBACK_1 ( SpriteMainScene : : testNCallback , this ) ) ; <nl> + CCMenuItemFont * itemFont = CCMenuItemFont : : create ( str , CC_CALLBACK_1 ( SpriteMainScene : : testNCallback , this ) ) ; <nl> itemFont - > setTag ( i ) ; <nl> pSubMenu - > addChild ( itemFont , 10 ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / PerformanceTest / PerformanceTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / PerformanceTest / PerformanceTest . cpp <nl> void PerformBasicLayer : : onEnter ( ) <nl> <nl> CCMenuItemFont : : setFontName ( " Arial " ) ; <nl> CCMenuItemFont : : setFontSize ( 24 ) ; <nl> - CCMenuItemFont * pMainItem = CCMenuItemFont : : create ( " Back " , CALLBACK_1 ( PerformBasicLayer : : toMainLayer , this ) ) ; <nl> + CCMenuItemFont * pMainItem = CCMenuItemFont : : create ( " Back " , CC_CALLBACK_1 ( PerformBasicLayer : : toMainLayer , this ) ) ; <nl> pMainItem - > setPosition ( ccp ( VisibleRect : : rightBottom ( ) . x - 50 , VisibleRect : : rightBottom ( ) . 
y + 25 ) ) ; <nl> CCMenu * pMenu = CCMenu : : create ( pMainItem , NULL ) ; <nl> pMenu - > setPosition ( CCPointZero ) ; <nl> <nl> if ( m_bControlMenuVisible ) <nl> { <nl> - CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_pPathB1 , s_pPathB2 , CALLBACK_1 ( PerformBasicLayer : : backCallback , this ) ) ; <nl> - CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_pPathR1 , s_pPathR2 , CALLBACK_1 ( PerformBasicLayer : : restartCallback , this ) ) ; <nl> - CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_pPathF1 , s_pPathF2 , CALLBACK_1 ( PerformBasicLayer : : nextCallback , this ) ) ; <nl> + CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_pPathB1 , s_pPathB2 , CC_CALLBACK_1 ( PerformBasicLayer : : backCallback , this ) ) ; <nl> + CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_pPathR1 , s_pPathR2 , CC_CALLBACK_1 ( PerformBasicLayer : : restartCallback , this ) ) ; <nl> + CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_pPathF1 , s_pPathF2 , CC_CALLBACK_1 ( PerformBasicLayer : : nextCallback , this ) ) ; <nl> item1 - > setPosition ( ccp ( VisibleRect : : center ( ) . x - item2 - > getContentSize ( ) . width * 2 , VisibleRect : : bottom ( ) . y + item2 - > getContentSize ( ) . height / 2 ) ) ; <nl> item2 - > setPosition ( ccp ( VisibleRect : : center ( ) . x , VisibleRect : : bottom ( ) . y + item2 - > getContentSize ( ) . height / 2 ) ) ; <nl> item3 - > setPosition ( ccp ( VisibleRect : : center ( ) . x + item2 - > getContentSize ( ) . width * 2 , VisibleRect : : bottom ( ) . y + item2 - > getContentSize ( ) . height / 2 ) ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / RenderTextureTest / RenderTextureTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / RenderTextureTest / RenderTextureTest . cpp <nl> RenderTextureSave : : RenderTextureSave ( ) <nl> <nl> / / Save Image menu <nl> CCMenuItemFont : : setFontSize ( 16 ) ; <nl> - CCMenuItem * item1 = CCMenuItemFont : : create ( " Save Image " , CALLBACK_1 ( RenderTextureSave : : saveImage , this ) ) ; <nl> - CCMenuItem * item2 = CCMenuItemFont : : create ( " Clear " , CALLBACK_1 ( RenderTextureSave : : clearImage , this ) ) ; <nl> + CCMenuItem * item1 = CCMenuItemFont : : create ( " Save Image " , CC_CALLBACK_1 ( RenderTextureSave : : saveImage , this ) ) ; <nl> + CCMenuItem * item2 = CCMenuItemFont : : create ( " Clear " , CC_CALLBACK_1 ( RenderTextureSave : : clearImage , this ) ) ; <nl> CCMenu * menu = CCMenu : : create ( item1 , item2 , NULL ) ; <nl> this - > addChild ( menu ) ; <nl> menu - > alignItemsVertically ( ) ; <nl> RenderTextureTargetNode : : RenderTextureTargetNode ( ) <nl> scheduleUpdate ( ) ; <nl> <nl> / / Toggle clear on / off <nl> - CCMenuItemFont * item = CCMenuItemFont : : create ( " Clear On / Off " , CALLBACK_1 ( RenderTextureTargetNode : : touched , this ) ) ; <nl> + CCMenuItemFont * item = CCMenuItemFont : : create ( " Clear On / Off " , CC_CALLBACK_1 ( RenderTextureTargetNode : : touched , this ) ) ; <nl> CCMenu * menu = CCMenu : : create ( item , NULL ) ; <nl> addChild ( menu ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / SceneTest / SceneTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / SceneTest / SceneTest . 
cpp <nl> enum <nl> <nl> SceneTestLayer1 : : SceneTestLayer1 ( ) <nl> { <nl> - CCMenuItemFont * item1 = CCMenuItemFont : : create ( " Test pushScene " , CALLBACK_1 ( SceneTestLayer1 : : onPushScene , this ) ) ; <nl> - CCMenuItemFont * item2 = CCMenuItemFont : : create ( " Test pushScene w / transition " , CALLBACK_1 ( SceneTestLayer1 : : onPushSceneTran , this ) ) ; <nl> - CCMenuItemFont * item3 = CCMenuItemFont : : create ( " Quit " , CALLBACK_1 ( SceneTestLayer1 : : onQuit , this ) ) ; <nl> + CCMenuItemFont * item1 = CCMenuItemFont : : create ( " Test pushScene " , CC_CALLBACK_1 ( SceneTestLayer1 : : onPushScene , this ) ) ; <nl> + CCMenuItemFont * item2 = CCMenuItemFont : : create ( " Test pushScene w / transition " , CC_CALLBACK_1 ( SceneTestLayer1 : : onPushSceneTran , this ) ) ; <nl> + CCMenuItemFont * item3 = CCMenuItemFont : : create ( " Quit " , CC_CALLBACK_1 ( SceneTestLayer1 : : onQuit , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item1 , item2 , item3 , NULL ) ; <nl> menu - > alignItemsVertically ( ) ; <nl> SceneTestLayer2 : : SceneTestLayer2 ( ) <nl> { <nl> m_timeCounter = 0 ; <nl> <nl> - CCMenuItemFont * item1 = CCMenuItemFont : : create ( " replaceScene " , CALLBACK_1 ( SceneTestLayer2 : : onReplaceScene , this ) ) ; <nl> - CCMenuItemFont * item2 = CCMenuItemFont : : create ( " replaceScene w / transition " , CALLBACK_1 ( SceneTestLayer2 : : onReplaceSceneTran , this ) ) ; <nl> - CCMenuItemFont * item3 = CCMenuItemFont : : create ( " Go Back " , CALLBACK_1 ( SceneTestLayer2 : : onGoBack , this ) ) ; <nl> + CCMenuItemFont * item1 = CCMenuItemFont : : create ( " replaceScene " , CC_CALLBACK_1 ( SceneTestLayer2 : : onReplaceScene , this ) ) ; <nl> + CCMenuItemFont * item2 = CCMenuItemFont : : create ( " replaceScene w / transition " , CC_CALLBACK_1 ( SceneTestLayer2 : : onReplaceSceneTran , this ) ) ; <nl> + CCMenuItemFont * item3 = CCMenuItemFont : : create ( " Go Back " , CC_CALLBACK_1 ( SceneTestLayer2 : : onGoBack , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item1 , item2 , item3 , NULL ) ; <nl> menu - > alignItemsVertically ( ) ; <nl> bool SceneTestLayer3 : : init ( ) <nl> { <nl> CCSize s = CCDirector : : sharedDirector ( ) - > getWinSize ( ) ; <nl> <nl> - CCMenuItemFont * item0 = CCMenuItemFont : : create ( " Touch to pushScene ( self ) " , CALLBACK_1 ( SceneTestLayer3 : : item0Clicked , this ) ) ; <nl> - CCMenuItemFont * item1 = CCMenuItemFont : : create ( " Touch to popScene " , CALLBACK_1 ( SceneTestLayer3 : : item1Clicked , this ) ) ; <nl> - CCMenuItemFont * item2 = CCMenuItemFont : : create ( " Touch to popToRootScene " , CALLBACK_1 ( SceneTestLayer3 : : item2Clicked , this ) ) ; <nl> - CCMenuItemFont * item3 = CCMenuItemFont : : create ( " Touch to popToSceneStackLevel ( 2 ) " , CALLBACK_1 ( SceneTestLayer3 : : item3Clicked , this ) ) ; <nl> + CCMenuItemFont * item0 = CCMenuItemFont : : create ( " Touch to pushScene ( self ) " , CC_CALLBACK_1 ( SceneTestLayer3 : : item0Clicked , this ) ) ; <nl> + CCMenuItemFont * item1 = CCMenuItemFont : : create ( " Touch to popScene " , CC_CALLBACK_1 ( SceneTestLayer3 : : item1Clicked , this ) ) ; <nl> + CCMenuItemFont * item2 = CCMenuItemFont : : create ( " Touch to popToRootScene " , CC_CALLBACK_1 ( SceneTestLayer3 : : item2Clicked , this ) ) ; <nl> + CCMenuItemFont * item3 = CCMenuItemFont : : create ( " Touch to popToSceneStackLevel ( 2 ) " , CC_CALLBACK_1 ( SceneTestLayer3 : : item3Clicked , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item0 , item1 , item2 , item3 , NULL ) ; <nl> this - 
> addChild ( menu ) ; <nl> mmm a / samples / Cpp / TestCpp / Classes / Texture2dTest / Texture2dTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / Texture2dTest / Texture2dTest . cpp <nl> void TextureMemoryAlloc : : onEnter ( ) <nl> <nl> CCMenuItemFont : : setFontSize ( 24 ) ; <nl> <nl> - CCMenuItem * item1 = CCMenuItemFont : : create ( " PNG " , CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> + CCMenuItem * item1 = CCMenuItemFont : : create ( " PNG " , CC_CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> item1 - > setTag ( 0 ) ; <nl> <nl> - CCMenuItem * item2 = CCMenuItemFont : : create ( " RGBA8 " , CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> + CCMenuItem * item2 = CCMenuItemFont : : create ( " RGBA8 " , CC_CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> item2 - > setTag ( 1 ) ; <nl> <nl> - CCMenuItem * item3 = CCMenuItemFont : : create ( " RGB8 " , CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> + CCMenuItem * item3 = CCMenuItemFont : : create ( " RGB8 " , CC_CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> item3 - > setTag ( 2 ) ; <nl> <nl> - CCMenuItem * item4 = CCMenuItemFont : : create ( " RGBA4 " , CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> + CCMenuItem * item4 = CCMenuItemFont : : create ( " RGBA4 " , CC_CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> item4 - > setTag ( 3 ) ; <nl> <nl> - CCMenuItem * item5 = CCMenuItemFont : : create ( " A8 " , CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> + CCMenuItem * item5 = CCMenuItemFont : : create ( " A8 " , CC_CALLBACK_1 ( TextureMemoryAlloc : : updateImage , this ) ) ; <nl> item5 - > setTag ( 4 ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item1 , item2 , item3 , item4 , item5 , NULL ) ; <nl> void TextureMemoryAlloc : : onEnter ( ) <nl> <nl> addChild ( menu ) ; <nl> <nl> - CCMenuItemFont * warmup = CCMenuItemFont : : create ( " warm up texture " , CALLBACK_1 ( TextureMemoryAlloc : : changeBackgroundVisible , this ) ) ; <nl> + CCMenuItemFont * warmup = CCMenuItemFont : : create ( " warm up texture " , CC_CALLBACK_1 ( TextureMemoryAlloc : : changeBackgroundVisible , this ) ) ; <nl> <nl> CCMenu * menu2 = CCMenu : : create ( warmup , NULL ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / TransitionsTest / TransitionsTest . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / TransitionsTest / TransitionsTest . 
cpp <nl> TestLayer1 : : TestLayer1 ( void ) <nl> addChild ( label ) ; <nl> <nl> / / menu <nl> - CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_pPathB1 , s_pPathB2 , CALLBACK_1 ( TestLayer1 : : backCallback , this ) ) ; <nl> - CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_pPathR1 , s_pPathR2 , CALLBACK_1 ( TestLayer1 : : restartCallback , this ) ) ; <nl> - CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_pPathF1 , s_pPathF2 , CALLBACK_1 ( TestLayer1 : : nextCallback , this ) ) ; <nl> + CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_pPathB1 , s_pPathB2 , CC_CALLBACK_1 ( TestLayer1 : : backCallback , this ) ) ; <nl> + CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_pPathR1 , s_pPathR2 , CC_CALLBACK_1 ( TestLayer1 : : restartCallback , this ) ) ; <nl> + CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_pPathF1 , s_pPathF2 , CC_CALLBACK_1 ( TestLayer1 : : nextCallback , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item1 , item2 , item3 , NULL ) ; <nl> <nl> TestLayer2 : : TestLayer2 ( ) <nl> addChild ( label ) ; <nl> <nl> / / menu <nl> - CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_pPathB1 , s_pPathB2 , CALLBACK_1 ( TestLayer2 : : backCallback , this ) ) ; <nl> - CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_pPathR1 , s_pPathR2 , CALLBACK_1 ( TestLayer2 : : restartCallback , this ) ) ; <nl> - CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_pPathF1 , s_pPathF2 , CALLBACK_1 ( TestLayer2 : : nextCallback , this ) ) ; <nl> + CCMenuItemImage * item1 = CCMenuItemImage : : create ( s_pPathB1 , s_pPathB2 , CC_CALLBACK_1 ( TestLayer2 : : backCallback , this ) ) ; <nl> + CCMenuItemImage * item2 = CCMenuItemImage : : create ( s_pPathR1 , s_pPathR2 , CC_CALLBACK_1 ( TestLayer2 : : restartCallback , this ) ) ; <nl> + CCMenuItemImage * item3 = CCMenuItemImage : : create ( s_pPathF1 , s_pPathF2 , CC_CALLBACK_1 ( TestLayer2 : : nextCallback , this ) ) ; <nl> <nl> CCMenu * menu = CCMenu : : create ( item1 , item2 , item3 , NULL ) ; <nl> <nl> mmm a / samples / Cpp / TestCpp / Classes / controller . cpp <nl> ppp b / samples / Cpp / TestCpp / Classes / controller . cpp <nl> TestController : : TestController ( ) <nl> : m_tBeginPos ( CCPointZero ) <nl> { <nl> / / add close menu <nl> - CCMenuItemImage * pCloseItem = CCMenuItemImage : : create ( s_pPathClose , s_pPathClose , CALLBACK_1 ( TestController : : closeCallback , this ) ) ; <nl> + CCMenuItemImage * pCloseItem = CCMenuItemImage : : create ( s_pPathClose , s_pPathClose , CC_CALLBACK_1 ( TestController : : closeCallback , this ) ) ; <nl> CCMenu * pMenu = CCMenu : : create ( pCloseItem , NULL ) ; <nl> <nl> pMenu - > setPosition ( CCPointZero ) ; <nl> TestController : : TestController ( ) <nl> / / # else <nl> CCLabelTTF * label = CCLabelTTF : : create ( g_aTestNames [ i ] . test_name , " Arial " , 24 ) ; <nl> / / # endif <nl> - CCMenuItemLabel * pMenuItem = CCMenuItemLabel : : create ( label , CALLBACK_1 ( TestController : : menuCallback , this ) ) ; <nl> + CCMenuItemLabel * pMenuItem = CCMenuItemLabel : : create ( label , CC_CALLBACK_1 ( TestController : : menuCallback , this ) ) ; <nl> <nl> m_pItemMenu - > addChild ( pMenuItem , i + 10000 ) ; <nl> pMenuItem - > setPosition ( ccp ( VisibleRect : : center ( ) . x , ( VisibleRect : : top ( ) . y - ( i + 1 ) * LINE_SPACE ) ) ) ; <nl>
Adds CC prefix to CALLBACK()
cocos2d/cocos2d-x
8a6d33a212f7a6cc087cba5e286771aa9329f7ac
2013-06-14T03:36:43Z
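The row above records a mechanical rename of cocos2d-x's callback-binding macro from CALLBACK_1 to CC_CALLBACK_1 across the TestCpp samples. As a minimal sketch (not part of the commit; the layer and method names are invented for illustration), the snippet below shows how the renamed macro is typically used to bind a member function to a CC-prefixed menu item, matching the call sites touched by the patch.

```cpp
// Minimal sketch: binding a member function with the renamed CC_CALLBACK_1 macro.
// Assumes the cocos2d-x headers and CC-prefixed classes used in the patch;
// BackLayer/onBack are hypothetical names, not code from the commit.
#include "cocos2d.h"

USING_NS_CC;

class BackLayer : public CCLayer
{
public:
    virtual bool init()
    {
        if (!CCLayer::init())
            return false;

        CCMenuItemFont::setFontName("Marker Felt");
        CCMenuItemFont::setFontSize(24);

        // CC_CALLBACK_1 binds onBack to this layer with one placeholder argument,
        // so tapping the item calls onBack(sender).
        CCMenuItemFont* back = CCMenuItemFont::create("Back", CC_CALLBACK_1(BackLayer::onBack, this));

        CCMenu* menu = CCMenu::create(back, NULL);
        menu->setPosition(CCPointZero);
        addChild(menu);
        return true;
    }

    void onBack(CCObject* sender)
    {
        CCDirector::sharedDirector()->popScene();
    }
};
```

The rename changes no behavior; per the commit message it only adds the CC prefix used by the framework's other macros.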
mmm a / UnitTests / HttpInterface / api - import - spec . rb <nl> ppp b / UnitTests / HttpInterface / api - import - spec . rb <nl> <nl> body = " { \ " sample \ " : \ " garbage \ " } " <nl> doc = ArangoDB . log_post ( " # { prefix } - self - contained - nonexist " , cmd , : body = > body ) <nl> <nl> - doc . code . should eq ( 400 ) <nl> + doc . code . should eq ( 404 ) <nl> doc . parsed_response [ ' error ' ] . should eq ( true ) <nl> doc . parsed_response [ ' errorNum ' ] . should eq ( 1203 ) <nl> end <nl> <nl> body = " [ \ " name \ " ] \ n " <nl> doc = ArangoDB . log_post ( " # { prefix } - data - nonexist " , cmd , : body = > body ) <nl> <nl> - doc . code . should eq ( 400 ) <nl> + doc . code . should eq ( 404 ) <nl> doc . parsed_response [ ' error ' ] . should eq ( true ) <nl> doc . parsed_response [ ' errorNum ' ] . should eq ( 1203 ) <nl> end <nl> mmm a / arangod / RestHandler / RestImportHandler . cpp <nl> ppp b / arangod / RestHandler / RestImportHandler . cpp <nl> <nl> <nl> # include " Basics / StringUtils . h " <nl> # include " BasicsC / string - buffer . h " <nl> + # include " BasicsC / strings . h " <nl> # include " Rest / HttpRequest . h " <nl> # include " Rest / JsonContainer . h " <nl> # include " VocBase / simple - collection . h " <nl> bool RestImportHandler : : createByArray ( ) { <nl> <nl> if ( res ! = TRI_ERROR_NO_ERROR ) { <nl> releaseCollection ( ) ; <nl> - <nl> - generateError ( HttpResponse : : BAD , <nl> - res , <nl> - " Could not use collection " ) ; <nl> + <nl> + / / error is already generated by useCollection ! <nl> return false ; <nl> } <nl> <nl> bool RestImportHandler : : createByList ( ) { <nl> TRI_FreeJson ( TRI_UNKNOWN_MEM_ZONE , keys ) ; <nl> } <nl> <nl> - generateError ( HttpResponse : : BAD , <nl> - res , <nl> - " Could not use collection " ) ; <nl> + / / error is already generated by useCollection ! <nl> return false ; <nl> } <nl> <nl> void RestImportHandler : : generateDocumentsCreated ( size_t numCreated , size_t numE <nl> TRI_json_t * RestImportHandler : : parseJsonLine ( const string & line ) { <nl> char * errmsg = 0 ; <nl> TRI_json_t * json = TRI_Json2String ( TRI_UNKNOWN_MEM_ZONE , line . c_str ( ) , & errmsg ) ; <nl> + <nl> + if ( errmsg ! = 0 ) { <nl> + / / must free this error message , otherwise we ' ll have a memleak <nl> + TRI_FreeString ( TRI_UNKNOWN_MEM_ZONE , errmsg ) ; <nl> + } <nl> return json ; <nl> } <nl> <nl>
issue: fixed memleaks in /_api/import
arangodb/arangodb
0c822a8da199646dc00ed614491f53099c51c82a
2012-08-15T12:24:59Z
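The ArangoDB row above plugs a leak in the /_api/import handler: TRI_Json2String can hand back an allocated error message that parseJsonLine previously never released, and the import spec now expects 404 (collection not found) instead of 400. The sketch below illustrates the same caller-owns-the-error-string convention in isolation; parse_line and dup_cstr are invented stand-ins for illustration, not ArangoDB APIs.

```cpp
// Minimal sketch of the ownership rule behind the fix: a C-style parser that
// allocates an error message on failure, which the caller must free on every path.
#include <cstdio>
#include <cstdlib>
#include <cstring>

static char* dup_cstr(const char* s)
{
    char* copy = static_cast<char*>(std::malloc(std::strlen(s) + 1));
    if (copy != NULL)
        std::strcpy(copy, s);
    return copy;
}

// Hypothetical stand-in for a TRI_Json2String-like call: returns NULL on failure
// and, in that case, allocates a message that becomes the caller's responsibility.
static char* parse_line(const char* line, char** errmsg)
{
    if (line == NULL || line[0] != '{')
    {
        *errmsg = dup_cstr("expecting a JSON object");
        return NULL;
    }
    return dup_cstr(line);  // pretend this is the parsed document
}

int main()
{
    char* errmsg = NULL;
    char* doc = parse_line("garbage", &errmsg);

    if (doc == NULL)
        std::fprintf(stderr, "parse error: %s\n", errmsg != NULL ? errmsg : "unknown");

    // The point of the fix: release the error message whether or not parsing
    // succeeded, instead of leaking it on the error path.
    if (errmsg != NULL)
        std::free(errmsg);
    std::free(doc);  // free(NULL) is a no-op, so this is safe either way
    return 0;
}
```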
mmm a / 3rdparty / include / opencl / 1 . 2 / CL / cl_platform . h <nl> ppp b / 3rdparty / include / opencl / 1 . 2 / CL / cl_platform . h <nl> typedef unsigned int cl_GLenum ; <nl> / * Define alignment keys * / <nl> # if defined ( __GNUC__ ) <nl> # define CL_ALIGNED ( _x ) __attribute__ ( ( aligned ( _x ) ) ) <nl> - # elif defined ( _WIN32 ) & & ( _MSC_VER ) <nl> + # elif defined ( _WIN32 ) & & defined ( _MSC_VER ) <nl> / * Alignment keys neutered on windows because MSVC can ' t swallow function arguments with alignment requirements * / <nl> / * http : / / msdn . microsoft . com / en - us / library / 373ak2y1 % 28VS . 71 % 29 . aspx * / <nl> / * # include < crtdefs . h > * / <nl> mmm a / apps / traincascade / imagestorage . cpp <nl> ppp b / apps / traincascade / imagestorage . cpp <nl> bool CvCascadeImageReader : : NegReader : : nextImg ( ) <nl> _offset . x = std : : min ( ( int ) round % winSize . width , src . cols - winSize . width ) ; <nl> _offset . y = std : : min ( ( int ) round / winSize . width , src . rows - winSize . height ) ; <nl> if ( ! src . empty ( ) & & src . type ( ) = = CV_8UC1 <nl> - & & offset . x > = 0 & & offset . y > = 0 ) <nl> + & & _offset . x > = 0 & & _offset . y > = 0 ) <nl> break ; <nl> } <nl> <nl> mmm a / doc / tutorials / introduction / crosscompilation / arm_crosscompile_with_cmake . rst <nl> ppp b / doc / tutorials / introduction / crosscompilation / arm_crosscompile_with_cmake . rst <nl> Building OpenCV <nl> Enable hardware optimizations <nl> mmmmmmmmmmmmmmmmmmmmmmmmmmm - - <nl> <nl> - Depending on target platfrom architecture different instruction sets can be used . By default <nl> + Depending on target platform architecture different instruction sets can be used . By default <nl> compiler generates code for armv5l without VFPv3 and NEON extensions . Add ` ` - DUSE_VFPV3 = ON ` ` <nl> to cmake command line to enable code generation for VFPv3 and ` ` - DUSE_NEON = ON ` ` for using <nl> NEON SIMD extensions . <nl> mmm a / modules / cudaimgproc / src / color . cpp <nl> ppp b / modules / cudaimgproc / src / color . cpp <nl> void cv : : cuda : : cvtColor ( InputArray src , OutputArray dst , int code , int dcn , Stre <nl> <nl> void cv : : cuda : : demosaicing ( InputArray _src , OutputArray _dst , int code , int dcn , Stream & stream ) <nl> { <nl> + CV_Assert ( ! _src . empty ( ) ) ; <nl> + <nl> switch ( code ) <nl> { <nl> case cv : : COLOR_BayerBG2GRAY : case cv : : COLOR_BayerGB2GRAY : case cv : : COLOR_BayerRG2GRAY : case cv : : COLOR_BayerGR2GRAY : <nl> mmm a / modules / cudaimgproc / test / test_color . cpp <nl> ppp b / modules / cudaimgproc / test / test_color . cpp <nl> struct Demosaicing : testing : : TestWithParam < cv : : cuda : : DeviceInfo > <nl> CUDA_TEST_P ( Demosaicing , BayerBG2BGR ) <nl> { <nl> cv : : Mat img = readImage ( " stereobm / aloe - L . png " ) ; <nl> + ASSERT_FALSE ( img . empty ( ) ) < < " Can ' t load input image " ; <nl> <nl> cv : : Mat_ < uchar > src ; <nl> mosaic ( img , src , cv : : Point ( 1 , 1 ) ) ; <nl> CUDA_TEST_P ( Demosaicing , BayerBG2BGR ) <nl> CUDA_TEST_P ( Demosaicing , BayerGB2BGR ) <nl> { <nl> cv : : Mat img = readImage ( " stereobm / aloe - L . png " ) ; <nl> + ASSERT_FALSE ( img . empty ( ) ) < < " Can ' t load input image " ; <nl> <nl> cv : : Mat_ < uchar > src ; <nl> mosaic ( img , src , cv : : Point ( 0 , 1 ) ) ; <nl> CUDA_TEST_P ( Demosaicing , BayerGB2BGR ) <nl> CUDA_TEST_P ( Demosaicing , BayerRG2BGR ) <nl> { <nl> cv : : Mat img = readImage ( " stereobm / aloe - L . png " ) ; <nl> + ASSERT_FALSE ( img . 
empty ( ) ) < < " Can ' t load input image " ; <nl> <nl> cv : : Mat_ < uchar > src ; <nl> mosaic ( img , src , cv : : Point ( 0 , 0 ) ) ; <nl> CUDA_TEST_P ( Demosaicing , BayerRG2BGR ) <nl> CUDA_TEST_P ( Demosaicing , BayerGR2BGR ) <nl> { <nl> cv : : Mat img = readImage ( " stereobm / aloe - L . png " ) ; <nl> + ASSERT_FALSE ( img . empty ( ) ) < < " Can ' t load input image " ; <nl> <nl> cv : : Mat_ < uchar > src ; <nl> mosaic ( img , src , cv : : Point ( 1 , 0 ) ) ; <nl> CUDA_TEST_P ( Demosaicing , BayerGR2BGR ) <nl> CUDA_TEST_P ( Demosaicing , BayerBG2BGR_MHT ) <nl> { <nl> cv : : Mat img = readImage ( " stereobm / aloe - L . png " ) ; <nl> + ASSERT_FALSE ( img . empty ( ) ) < < " Can ' t load input image " ; <nl> <nl> cv : : Mat_ < uchar > src ; <nl> mosaic ( img , src , cv : : Point ( 1 , 1 ) ) ; <nl> CUDA_TEST_P ( Demosaicing , BayerBG2BGR_MHT ) <nl> CUDA_TEST_P ( Demosaicing , BayerGB2BGR_MHT ) <nl> { <nl> cv : : Mat img = readImage ( " stereobm / aloe - L . png " ) ; <nl> + ASSERT_FALSE ( img . empty ( ) ) < < " Can ' t load input image " ; <nl> <nl> cv : : Mat_ < uchar > src ; <nl> mosaic ( img , src , cv : : Point ( 0 , 1 ) ) ; <nl> CUDA_TEST_P ( Demosaicing , BayerGB2BGR_MHT ) <nl> CUDA_TEST_P ( Demosaicing , BayerRG2BGR_MHT ) <nl> { <nl> cv : : Mat img = readImage ( " stereobm / aloe - L . png " ) ; <nl> + ASSERT_FALSE ( img . empty ( ) ) < < " Can ' t load input image " ; <nl> <nl> cv : : Mat_ < uchar > src ; <nl> mosaic ( img , src , cv : : Point ( 0 , 0 ) ) ; <nl> CUDA_TEST_P ( Demosaicing , BayerRG2BGR_MHT ) <nl> CUDA_TEST_P ( Demosaicing , BayerGR2BGR_MHT ) <nl> { <nl> cv : : Mat img = readImage ( " stereobm / aloe - L . png " ) ; <nl> + ASSERT_FALSE ( img . empty ( ) ) < < " Can ' t load input image " ; <nl> <nl> cv : : Mat_ < uchar > src ; <nl> mosaic ( img , src , cv : : Point ( 1 , 0 ) ) ; <nl> mmm a / modules / features2d / src / orb . cpp <nl> ppp b / modules / features2d / src / orb . cpp <nl> static void computeOrbDescriptor ( const KeyPoint & kpt , <nl> float x , y ; <nl> int ix , iy ; <nl> # if 1 <nl> - # define GET_VALUE ( idx ) \ <nl> - ( x = pattern [ idx ] . x * a - pattern [ idx ] . y * b , \ <nl> - y = pattern [ idx ] . x * b + pattern [ idx ] . y * a , \ <nl> - ix = cvRound ( x ) , \ <nl> - iy = cvRound ( y ) , \ <nl> - * ( center + iy * step + ix ) ) <nl> + # define GET_VALUE ( idx ) \ <nl> + ( x = pattern [ idx ] . x * a - pattern [ idx ] . y * b , \ <nl> + y = pattern [ idx ] . x * b + pattern [ idx ] . y * a , \ <nl> + ix = cvRound ( x ) , \ <nl> + iy = cvRound ( y ) , \ <nl> + * ( center + iy * step + ix ) ) <nl> # else <nl> # define GET_VALUE ( idx ) \ <nl> ( x = pattern [ idx ] . x * a - pattern [ idx ] . y * b , \ <nl> mmm a / modules / highgui / src / window_gtk . cpp <nl> ppp b / modules / highgui / src / window_gtk . cpp <nl> static gboolean icvOnMouse ( GtkWidget * widget , GdkEvent * event , gpointer user_da <nl> / / image origin is not necessarily at ( 0 , 0 ) <nl> int x0 = ( widget - > allocation . width - image_widget - > scaled_image - > cols ) / 2 ; <nl> int y0 = ( widget - > allocation . height - image_widget - > scaled_image - > rows ) / 2 ; <nl> - pt . x = cvRound ( ( ( pt32f . x - x0 ) * image_widget - > original_image - > cols ) / <nl> + pt . x = cvFloor ( ( ( pt32f . x - x0 ) * image_widget - > original_image - > cols ) / <nl> image_widget - > scaled_image - > cols ) ; <nl> - pt . y = cvRound ( ( ( pt32f . y - y0 ) * image_widget - > original_image - > rows ) / <nl> + pt . y = cvFloor ( ( ( pt32f . 
y - y0 ) * image_widget - > original_image - > rows ) / <nl> image_widget - > scaled_image - > rows ) ; <nl> } <nl> else { <nl> mmm a / modules / imgproc / src / samplers . cpp <nl> ppp b / modules / imgproc / src / samplers . cpp <nl> adjustRect ( const uchar * src , size_t src_step , int pix_size , <nl> rect . x = win_size . width ; <nl> } <nl> <nl> - if ( ip . x + win_size . width < src_size . width ) <nl> + if ( ip . x < src_size . width - win_size . width ) <nl> rect . width = win_size . width ; <nl> else <nl> { <nl> adjustRect ( const uchar * src , size_t src_step , int pix_size , <nl> else <nl> rect . y = - ip . y ; <nl> <nl> - if ( ip . y + win_size . height < src_size . height ) <nl> + if ( ip . y < src_size . height - win_size . height ) <nl> rect . height = win_size . height ; <nl> else <nl> { <nl> void getRectSubPix_Cn_ ( const _Tp * src , size_t src_step , Size src_size , <nl> src_step / = sizeof ( src [ 0 ] ) ; <nl> dst_step / = sizeof ( dst [ 0 ] ) ; <nl> <nl> - if ( 0 < = ip . x & & ip . x + win_size . width < src_size . width & & <nl> - 0 < = ip . y & & ip . y + win_size . height < src_size . height ) <nl> + if ( 0 < = ip . x & & ip . x < src_size . width - win_size . width & & <nl> + 0 < = ip . y & & ip . y < src_size . height - win_size . height ) <nl> { <nl> / / extracted rectangle is totally inside the image <nl> src + = ip . y * src_step + ip . x * cn ; <nl> mmm a / modules / ocl / doc / data_structures . rst <nl> ppp b / modules / ocl / doc / data_structures . rst <nl> OpenCV C + + 1 - D or 2 - D dense array class : : <nl> / / ! returns true if oclMatrix data is NULL <nl> bool empty ( ) const ; <nl> <nl> - / / ! returns pointer to y - th row <nl> - uchar * ptr ( int y = 0 ) ; <nl> - const uchar * ptr ( int y = 0 ) const ; <nl> - <nl> - / / ! template version of the above method <nl> - template < typename _Tp > _Tp * ptr ( int y = 0 ) ; <nl> - template < typename _Tp > const _Tp * ptr ( int y = 0 ) const ; <nl> - <nl> / / ! matrix transposition <nl> oclMat t ( ) const ; <nl> <nl> mmm a / modules / ocl / include / opencv2 / ocl . hpp <nl> ppp b / modules / ocl / include / opencv2 / ocl . hpp <nl> namespace cv <nl> / / ! returns true if oclMatrix data is NULL <nl> bool empty ( ) const ; <nl> <nl> - / / ! returns pointer to y - th row <nl> - uchar * ptr ( int y = 0 ) ; <nl> - const uchar * ptr ( int y = 0 ) const ; <nl> - <nl> - / / ! template version of the above method <nl> - template < typename _Tp > _Tp * ptr ( int y = 0 ) ; <nl> - template < typename _Tp > const _Tp * ptr ( int y = 0 ) const ; <nl> - <nl> / / ! matrix transposition <nl> oclMat t ( ) const ; <nl> <nl> mmm a / modules / ocl / include / opencv2 / ocl / matrix_operations . hpp <nl> ppp b / modules / ocl / include / opencv2 / ocl / matrix_operations . hpp <nl> namespace cv <nl> return data = = 0 ; <nl> } <nl> <nl> - <nl> - <nl> - inline uchar * oclMat : : ptr ( int y ) <nl> - { <nl> - CV_DbgAssert ( ( unsigned ) y < ( unsigned ) rows ) ; <nl> - CV_Error ( Error : : GpuNotSupported , " This function hasn ' t been supported yet . \ n " ) ; <nl> - return data + step * y ; <nl> - } <nl> - <nl> - inline const uchar * oclMat : : ptr ( int y ) const <nl> - { <nl> - CV_DbgAssert ( ( unsigned ) y < ( unsigned ) rows ) ; <nl> - CV_Error ( Error : : GpuNotSupported , " This function hasn ' t been supported yet . 
\ n " ) ; <nl> - return data + step * y ; <nl> - } <nl> - <nl> - template < typename _Tp > inline _Tp * oclMat : : ptr ( int y ) <nl> - { <nl> - CV_DbgAssert ( ( unsigned ) y < ( unsigned ) rows ) ; <nl> - CV_Error ( Error : : GpuNotSupported , " This function hasn ' t been supported yet . \ n " ) ; <nl> - return ( _Tp * ) ( data + step * y ) ; <nl> - } <nl> - <nl> - template < typename _Tp > inline const _Tp * oclMat : : ptr ( int y ) const <nl> - { <nl> - CV_DbgAssert ( ( unsigned ) y < ( unsigned ) rows ) ; <nl> - CV_Error ( Error : : GpuNotSupported , " This function hasn ' t been supported yet . \ n " ) ; <nl> - return ( const _Tp * ) ( data + step * y ) ; <nl> - } <nl> - <nl> inline oclMat oclMat : : t ( ) const <nl> { <nl> oclMat tmp ; <nl> mmm a / modules / ocl / perf / perf_moments . cpp <nl> ppp b / modules / ocl / perf / perf_moments . cpp <nl> PERF_TEST_P ( MomentsFixture , Moments , <nl> Mat src ( srcSize , type ) , dst ( 7 , 1 , CV_64F ) ; <nl> randu ( src , 0 , 255 ) ; <nl> <nl> - oclMat src_d ( src ) ; <nl> cv : : Moments mom ; <nl> if ( RUN_OCL_IMPL ) <nl> { <nl> + oclMat src_d ( src ) ; <nl> OCL_TEST_CYCLE ( ) mom = cv : : ocl : : ocl_moments ( src_d , binaryImage ) ; <nl> } <nl> else if ( RUN_PLAIN_IMPL ) <nl> mmm a / modules / ocl / src / brute_force_matcher . cpp <nl> ppp b / modules / ocl / src / brute_force_matcher . cpp <nl> void cv : : ocl : : BruteForceMatcher_OCL_base : : matchCollection ( const oclMat & query , c <nl> ensureSizeIsEnough ( 1 , nQuery , CV_32S , imgIdx ) ; <nl> ensureSizeIsEnough ( 1 , nQuery , CV_32F , distance ) ; <nl> <nl> - matchDispatcher ( query , ( const oclMat * ) trainCollection . ptr ( ) , trainCollection . cols , masks , trainIdx , imgIdx , distance , distType ) ; <nl> + matchDispatcher ( query , & trainCollection , trainCollection . cols , masks , trainIdx , imgIdx , distance , distType ) ; <nl> <nl> return ; <nl> } <nl> mmm a / modules / ocl / src / cl_operations . cpp <nl> ppp b / modules / ocl / src / cl_operations . cpp <nl> void openCLFree ( void * devPtr ) <nl> } <nl> # else <nl> / / TODO FIXIT Attach clReleaseMemObject call to event completion callback <nl> - Context * ctx = Context : : getContext ( ) ; <nl> - clFinish ( getClCommandQueue ( ctx ) ) ; <nl> + / / TODO 2013 / 12 / 04 Disable workaround <nl> + / / Context * ctx = Context : : getContext ( ) ; <nl> + / / clFinish ( getClCommandQueue ( ctx ) ) ; <nl> # endif <nl> openCLSafeCall ( clReleaseMemObject ( data . mainBuffer ) ) ; <nl> } <nl> mmm a / modules / ocl / src / opencl / haarobjectdetect . cl <nl> ppp b / modules / ocl / src / opencl / haarobjectdetect . 
cl <nl> typedef struct __attribute__ ( ( aligned ( 128 ) ) ) GpuHidHaarTreeNode <nl> GpuHidHaarTreeNode ; <nl> <nl> <nl> - typedef struct __attribute__ ( ( aligned ( 32 ) ) ) GpuHidHaarClassifier <nl> - { <nl> - int count __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - GpuHidHaarTreeNode * node __attribute__ ( ( aligned ( 8 ) ) ) ; <nl> - float * alpha __attribute__ ( ( aligned ( 8 ) ) ) ; <nl> - } <nl> - GpuHidHaarClassifier ; <nl> + / / typedef struct __attribute__ ( ( aligned ( 32 ) ) ) GpuHidHaarClassifier <nl> + / / { <nl> + / / int count __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / GpuHidHaarTreeNode * node __attribute__ ( ( aligned ( 8 ) ) ) ; <nl> + / / float * alpha __attribute__ ( ( aligned ( 8 ) ) ) ; <nl> + / / } <nl> + / / GpuHidHaarClassifier ; <nl> <nl> <nl> typedef struct __attribute__ ( ( aligned ( 64 ) ) ) GpuHidHaarStageClassifier <nl> typedef struct __attribute__ ( ( aligned ( 64 ) ) ) GpuHidHaarStageClassifier <nl> GpuHidHaarStageClassifier ; <nl> <nl> <nl> - typedef struct __attribute__ ( ( aligned ( 64 ) ) ) GpuHidHaarClassifierCascade <nl> - { <nl> - int count __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int is_stump_based __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int has_tilted_features __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int is_tree __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int pq0 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int pq1 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int pq2 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int pq3 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int p0 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int p1 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int p2 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int p3 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - float inv_window_area __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - } GpuHidHaarClassifierCascade ; <nl> + / / typedef struct __attribute__ ( ( aligned ( 64 ) ) ) GpuHidHaarClassifierCascade <nl> + / / { <nl> + / / int count __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int is_stump_based __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int has_tilted_features __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int is_tree __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int pq0 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int pq1 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int pq2 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int pq3 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int p0 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int p1 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int p2 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int p3 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / float inv_window_area __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / } GpuHidHaarClassifierCascade ; <nl> <nl> <nl> # ifdef PACKED_CLASSIFIER <nl> __kernel void gpuRunHaarClassifierCascadePacked ( <nl> for ( int stageloop = start_stage ; ( stageloop < end_stage ) & & result ; stageloop + + ) <nl> { / / iterate until candidate is exist <nl> float stage_sum = 0 . 0f ; <nl> - int2 stageinfo = * ( global int2 * ) ( stagecascadeptr + stageloop ) ; <nl> - float stagethreshold = as_float ( stageinfo . 
y ) ; <nl> + __global GpuHidHaarStageClassifier * stageinfo = ( __global GpuHidHaarStageClassifier * ) <nl> + ( ( __global uchar * ) stagecascadeptr + stageloop * sizeof ( GpuHidHaarStageClassifier ) ) ; <nl> + int stagecount = stageinfo - > count ; <nl> + float stagethreshold = stageinfo - > threshold ; <nl> int lcl_off = ( lid_y * DATA_SIZE_X ) + ( lid_x ) ; <nl> - for ( int nodeloop = 0 ; nodeloop < stageinfo . x ; nodecounter + + , nodeloop + + ) <nl> + for ( int nodeloop = 0 ; nodeloop < stagecount ; nodecounter + + , nodeloop + + ) <nl> { <nl> / / simple macro to extract shorts from int <nl> # define M0 ( _t ) ( ( _t ) & 0xFFFF ) <nl> __kernel void __attribute__ ( ( reqd_work_group_size ( 8 , 8 , 1 ) ) ) gpuRunHaarClassifierCa <nl> variance_norm_factor = variance_norm_factor * correction - mean * mean ; <nl> variance_norm_factor = variance_norm_factor > = 0 . f ? sqrt ( variance_norm_factor ) : 1 . f ; <nl> <nl> - for ( int stageloop = start_stage ; ( stageloop < split_stage ) & & result ; stageloop + + ) <nl> + for ( int stageloop = start_stage ; ( stageloop < split_stage ) & & result ; stageloop + + ) <nl> { <nl> float stage_sum = 0 . f ; <nl> - int2 stageinfo = * ( global int2 * ) ( stagecascadeptr + stageloop ) ; <nl> - float stagethreshold = as_float ( stageinfo . y ) ; <nl> - for ( int nodeloop = 0 ; nodeloop < stageinfo . x ; ) <nl> + __global GpuHidHaarStageClassifier * stageinfo = ( __global GpuHidHaarStageClassifier * ) <nl> + ( ( __global uchar * ) stagecascadeptr + stageloop * sizeof ( GpuHidHaarStageClassifier ) ) ; <nl> + int stagecount = stageinfo - > count ; <nl> + float stagethreshold = stageinfo - > threshold ; <nl> + for ( int nodeloop = 0 ; nodeloop < stagecount ; ) <nl> { <nl> - __global GpuHidHaarTreeNode * currentnodeptr = ( nodeptr + nodecounter ) ; <nl> + __global GpuHidHaarTreeNode * currentnodeptr = ( __global GpuHidHaarTreeNode * ) <nl> + ( ( ( __global uchar * ) nodeptr ) + nodecounter * sizeof ( GpuHidHaarTreeNode ) ) ; <nl> <nl> int4 info1 = * ( __global int4 * ) ( & ( currentnodeptr - > p [ 0 ] [ 0 ] ) ) ; <nl> int4 info2 = * ( __global int4 * ) ( & ( currentnodeptr - > p [ 1 ] [ 0 ] ) ) ; <nl> __kernel void __attribute__ ( ( reqd_work_group_size ( 8 , 8 , 1 ) ) ) gpuRunHaarClassifierCa <nl> # endif <nl> } <nl> <nl> - result = ( stage_sum > = stagethreshold ) ; <nl> + result = ( stage_sum > = stagethreshold ) ? 1 : 0 ; <nl> } <nl> if ( factor < 2 ) <nl> { <nl> __kernel void __attribute__ ( ( reqd_work_group_size ( 8 , 8 , 1 ) ) ) gpuRunHaarClassifierCa <nl> lclcount [ 0 ] = 0 ; <nl> barrier ( CLK_LOCAL_MEM_FENCE ) ; <nl> <nl> - int2 stageinfo = * ( global int2 * ) ( stagecascadeptr + stageloop ) ; <nl> - float stagethreshold = as_float ( stageinfo . y ) ; <nl> + / / int2 stageinfo = * ( global int2 * ) ( stagecascadeptr + stageloop ) ; <nl> + __global GpuHidHaarStageClassifier * stageinfo = ( __global GpuHidHaarStageClassifier * ) <nl> + ( ( __global uchar * ) stagecascadeptr + stageloop * sizeof ( GpuHidHaarStageClassifier ) ) ; <nl> + int stagecount = stageinfo - > count ; <nl> + float stagethreshold = stageinfo - > threshold ; <nl> <nl> int perfscale = queuecount > 4 ? 3 : 2 ; <nl> int queuecount_loop = ( queuecount + ( 1 < < perfscale ) - 1 ) > > perfscale ; <nl> int lcl_compute_win = lcl_sz > > perfscale ; <nl> int lcl_compute_win_id = ( lcl_id > > ( 6 - perfscale ) ) ; <nl> - int lcl_loops = ( stageinfo . 
x + lcl_compute_win - 1 ) > > ( 6 - perfscale ) ; <nl> + int lcl_loops = ( stagecount + lcl_compute_win - 1 ) > > ( 6 - perfscale ) ; <nl> int lcl_compute_id = lcl_id - ( lcl_compute_win_id < < ( 6 - perfscale ) ) ; <nl> for ( int queueloop = 0 ; queueloop < queuecount_loop ; queueloop + + ) <nl> { <nl> __kernel void __attribute__ ( ( reqd_work_group_size ( 8 , 8 , 1 ) ) ) gpuRunHaarClassifierCa <nl> float part_sum = 0 . f ; <nl> const int stump_factor = STUMP_BASED ? 1 : 2 ; <nl> int root_offset = 0 ; <nl> - for ( int lcl_loop = 0 ; lcl_loop < lcl_loops & & tempnodecounter < stageinfo . x ; ) <nl> + for ( int lcl_loop = 0 ; lcl_loop < lcl_loops & & tempnodecounter < stagecount ; ) <nl> { <nl> - __global GpuHidHaarTreeNode * currentnodeptr = <nl> - nodeptr + ( nodecounter + tempnodecounter ) * stump_factor + root_offset ; <nl> + __global GpuHidHaarTreeNode * currentnodeptr = ( __global GpuHidHaarTreeNode * ) <nl> + ( ( ( __global uchar * ) nodeptr ) + sizeof ( GpuHidHaarTreeNode ) * ( ( nodecounter + tempnodecounter ) * stump_factor + root_offset ) ) ; <nl> <nl> int4 info1 = * ( __global int4 * ) ( & ( currentnodeptr - > p [ 0 ] [ 0 ] ) ) ; <nl> int4 info2 = * ( __global int4 * ) ( & ( currentnodeptr - > p [ 1 ] [ 0 ] ) ) ; <nl> __kernel void __attribute__ ( ( reqd_work_group_size ( 8 , 8 , 1 ) ) ) gpuRunHaarClassifierCa <nl> <nl> queuecount = lclcount [ 0 ] ; <nl> barrier ( CLK_LOCAL_MEM_FENCE ) ; <nl> - nodecounter + = stageinfo . x ; <nl> + nodecounter + = stagecount ; <nl> } / / end for ( int stageloop = splitstage ; stageloop < endstage & & queuecount > 0 ; stageloop + + ) <nl> <nl> if ( lcl_id < queuecount ) <nl> mmm a / modules / ocl / src / opencl / haarobjectdetect_scaled2 . cl <nl> ppp b / modules / ocl / src / opencl / haarobjectdetect_scaled2 . 
cl <nl> typedef struct __attribute__ ( ( aligned ( 128 ) ) ) GpuHidHaarTreeNode <nl> int right __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> } <nl> GpuHidHaarTreeNode ; <nl> - typedef struct __attribute__ ( ( aligned ( 32 ) ) ) GpuHidHaarClassifier <nl> - { <nl> - int count __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - GpuHidHaarTreeNode * node __attribute__ ( ( aligned ( 8 ) ) ) ; <nl> - float * alpha __attribute__ ( ( aligned ( 8 ) ) ) ; <nl> - } <nl> - GpuHidHaarClassifier ; <nl> + / / typedef struct __attribute__ ( ( aligned ( 32 ) ) ) GpuHidHaarClassifier <nl> + / / { <nl> + / / int count __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / GpuHidHaarTreeNode * node __attribute__ ( ( aligned ( 8 ) ) ) ; <nl> + / / float * alpha __attribute__ ( ( aligned ( 8 ) ) ) ; <nl> + / / } <nl> + / / GpuHidHaarClassifier ; <nl> typedef struct __attribute__ ( ( aligned ( 64 ) ) ) GpuHidHaarStageClassifier <nl> { <nl> int count __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> typedef struct __attribute__ ( ( aligned ( 64 ) ) ) GpuHidHaarStageClassifier <nl> int reserved3 __attribute__ ( ( aligned ( 8 ) ) ) ; <nl> } <nl> GpuHidHaarStageClassifier ; <nl> - typedef struct __attribute__ ( ( aligned ( 64 ) ) ) GpuHidHaarClassifierCascade <nl> - { <nl> - int count __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int is_stump_based __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int has_tilted_features __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int is_tree __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int pq0 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int pq1 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int pq2 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int pq3 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int p0 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int p1 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int p2 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - int p3 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - float inv_window_area __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> - } GpuHidHaarClassifierCascade ; <nl> + / / typedef struct __attribute__ ( ( aligned ( 64 ) ) ) GpuHidHaarClassifierCascade <nl> + / / { <nl> + / / int count __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int is_stump_based __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int has_tilted_features __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int is_tree __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int pq0 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int pq1 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int pq2 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int pq3 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int p0 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int p1 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int p2 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / int p3 __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / float inv_window_area __attribute__ ( ( aligned ( 4 ) ) ) ; <nl> + / / } GpuHidHaarClassifierCascade ; <nl> <nl> __kernel void gpuRunHaarClassifierCascade_scaled2 ( <nl> - global GpuHidHaarStageClassifier * stagecascadeptr , <nl> + global GpuHidHaarStageClassifier * stagecascadeptr_ , <nl> global int4 * info , <nl> - global GpuHidHaarTreeNode * nodeptr , <nl> + global GpuHidHaarTreeNode * nodeptr_ , <nl> global const int * restrict sum , <nl> - global const float * restrict sqsum , <nl> + global const float * restrict sqsum , <nl> global int4 * candidate , <nl> const int rows , <nl> const int cols , <nl> __kernel void gpuRunHaarClassifierCascade_scaled2 ( <nl> int max_idx = rows * cols - 1 ; <nl> for ( int 
scalei = 0 ; scalei < loopcount ; scalei + + ) <nl> { <nl> - int4 scaleinfo1 ; <nl> - scaleinfo1 = info [ scalei ] ; <nl> + int4 scaleinfo1 = info [ scalei ] ; <nl> int grpnumperline = ( scaleinfo1 . y & 0xffff0000 ) > > 16 ; <nl> int totalgrp = scaleinfo1 . y & 0xffff ; <nl> float factor = as_float ( scaleinfo1 . w ) ; <nl> __kernel void gpuRunHaarClassifierCascade_scaled2 ( <nl> for ( int stageloop = start_stage ; ( stageloop < end_stage ) & & result ; stageloop + + ) <nl> { <nl> float stage_sum = 0 . f ; <nl> - int stagecount = stagecascadeptr [ stageloop ] . count ; <nl> + __global GpuHidHaarStageClassifier * stageinfo = ( __global GpuHidHaarStageClassifier * ) <nl> + ( ( ( __global uchar * ) stagecascadeptr_ ) + stageloop * sizeof ( GpuHidHaarStageClassifier ) ) ; <nl> + int stagecount = stageinfo - > count ; <nl> for ( int nodeloop = 0 ; nodeloop < stagecount ; ) <nl> { <nl> - __global GpuHidHaarTreeNode * currentnodeptr = ( nodeptr + nodecounter ) ; <nl> + __global GpuHidHaarTreeNode * currentnodeptr = ( __global GpuHidHaarTreeNode * ) <nl> + ( ( ( __global uchar * ) nodeptr_ ) + nodecounter * sizeof ( GpuHidHaarTreeNode ) ) ; <nl> int4 info1 = * ( __global int4 * ) ( & ( currentnodeptr - > p [ 0 ] [ 0 ] ) ) ; <nl> int4 info2 = * ( __global int4 * ) ( & ( currentnodeptr - > p [ 1 ] [ 0 ] ) ) ; <nl> int4 info3 = * ( __global int4 * ) ( & ( currentnodeptr - > p [ 2 ] [ 0 ] ) ) ; <nl> float4 w = * ( __global float4 * ) ( & ( currentnodeptr - > weight [ 0 ] ) ) ; <nl> - float3 alpha3 = * ( __global float3 * ) ( & ( currentnodeptr - > alpha [ 0 ] ) ) ; <nl> + float3 alpha3 = * ( __global float3 * ) ( & ( currentnodeptr - > alpha [ 0 ] ) ) ; <nl> float nodethreshold = w . w * variance_norm_factor ; <nl> <nl> info1 . x + = p_offset ; <nl> __kernel void gpuRunHaarClassifierCascade_scaled2 ( <nl> sum [ clamp ( mad24 ( info3 . w , step , info3 . x ) , 0 , max_idx ) ] <nl> + sum [ clamp ( mad24 ( info3 . w , step , info3 . z ) , 0 , max_idx ) ] ) * w . z ; <nl> <nl> - bool passThres = classsum > = nodethreshold ; <nl> + bool passThres = ( classsum > = nodethreshold ) ? 1 : 0 ; <nl> <nl> # if STUMP_BASED <nl> stage_sum + = passThres ? alpha3 . y : alpha3 . x ; <nl> __kernel void gpuRunHaarClassifierCascade_scaled2 ( <nl> } <nl> # endif <nl> } <nl> - result = ( int ) ( stage_sum > = stagecascadeptr [ stageloop ] . threshold ) ; <nl> + <nl> + result = ( stage_sum > = stageinfo - > threshold ) ? 
1 : 0 ; <nl> } <nl> <nl> barrier ( CLK_LOCAL_MEM_FENCE ) ; <nl> __kernel void gpuRunHaarClassifierCascade_scaled2 ( <nl> } <nl> } <nl> } <nl> - __kernel void gpuscaleclassifier ( global GpuHidHaarTreeNode * orinode , global GpuHidHaarTreeNode * newnode , float scale , float weight_scale , int nodenum ) <nl> + __kernel void gpuscaleclassifier ( global GpuHidHaarTreeNode * orinode , global GpuHidHaarTreeNode * newnode , float scale , float weight_scale , const int nodenum ) <nl> { <nl> - int counter = get_global_id ( 0 ) ; <nl> + const int counter = get_global_id ( 0 ) ; <nl> int tr_x [ 3 ] , tr_y [ 3 ] , tr_h [ 3 ] , tr_w [ 3 ] , i = 0 ; <nl> - GpuHidHaarTreeNode t1 = * ( orinode + counter ) ; <nl> + GpuHidHaarTreeNode t1 = * ( __global GpuHidHaarTreeNode * ) <nl> + ( ( ( __global uchar * ) orinode ) + counter * sizeof ( GpuHidHaarTreeNode ) ) ; <nl> + __global GpuHidHaarTreeNode * pNew = ( __global GpuHidHaarTreeNode * ) <nl> + ( ( ( __global uchar * ) newnode ) + ( counter + nodenum ) * sizeof ( GpuHidHaarTreeNode ) ) ; <nl> <nl> # pragma unroll <nl> for ( i = 0 ; i < 3 ; i + + ) <nl> __kernel void gpuscaleclassifier ( global GpuHidHaarTreeNode * orinode , global GpuH <nl> } <nl> <nl> t1 . weight [ 0 ] = - ( t1 . weight [ 1 ] * tr_h [ 1 ] * tr_w [ 1 ] + t1 . weight [ 2 ] * tr_h [ 2 ] * tr_w [ 2 ] ) / ( tr_h [ 0 ] * tr_w [ 0 ] ) ; <nl> - counter + = nodenum ; <nl> <nl> # pragma unroll <nl> for ( i = 0 ; i < 3 ; i + + ) <nl> { <nl> - newnode [ counter ] . p [ i ] [ 0 ] = tr_x [ i ] ; <nl> - newnode [ counter ] . p [ i ] [ 1 ] = tr_y [ i ] ; <nl> - newnode [ counter ] . p [ i ] [ 2 ] = tr_x [ i ] + tr_w [ i ] ; <nl> - newnode [ counter ] . p [ i ] [ 3 ] = tr_y [ i ] + tr_h [ i ] ; <nl> - newnode [ counter ] . weight [ i ] = t1 . weight [ i ] * weight_scale ; <nl> + pNew - > p [ i ] [ 0 ] = tr_x [ i ] ; <nl> + pNew - > p [ i ] [ 1 ] = tr_y [ i ] ; <nl> + pNew - > p [ i ] [ 2 ] = tr_x [ i ] + tr_w [ i ] ; <nl> + pNew - > p [ i ] [ 3 ] = tr_y [ i ] + tr_h [ i ] ; <nl> + pNew - > weight [ i ] = t1 . weight [ i ] * weight_scale ; <nl> } <nl> <nl> - newnode [ counter ] . left = t1 . left ; <nl> - newnode [ counter ] . right = t1 . right ; <nl> - newnode [ counter ] . threshold = t1 . threshold ; <nl> - newnode [ counter ] . alpha [ 0 ] = t1 . alpha [ 0 ] ; <nl> - newnode [ counter ] . alpha [ 1 ] = t1 . alpha [ 1 ] ; <nl> - newnode [ counter ] . alpha [ 2 ] = t1 . alpha [ 2 ] ; <nl> + pNew - > left = t1 . left ; <nl> + pNew - > right = t1 . right ; <nl> + pNew - > threshold = t1 . threshold ; <nl> + pNew - > alpha [ 0 ] = t1 . alpha [ 0 ] ; <nl> + pNew - > alpha [ 1 ] = t1 . alpha [ 1 ] ; <nl> + pNew - > alpha [ 2 ] = t1 . alpha [ 2 ] ; <nl> } <nl> mmm a / modules / ocl / src / opencl / imgproc_threshold . cl <nl> ppp b / modules / ocl / src / opencl / imgproc_threshold . cl <nl> __kernel void threshold ( __global const T * restrict src , int src_offset , int src <nl> VT vthresh = ( VT ) ( thresh ) ; <nl> <nl> # ifdef THRESH_BINARY <nl> - VT vecValue = sdata > vthresh ? max_val : ( VT ) ( 0 ) ; <nl> + VT vecValue = sdata > vthresh ? ( VT ) max_val : ( VT ) ( 0 ) ; <nl> # elif defined THRESH_BINARY_INV <nl> - VT vecValue = sdata > vthresh ? ( VT ) ( 0 ) : max_val ; <nl> + VT vecValue = sdata > vthresh ? ( VT ) ( 0 ) : ( VT ) max_val ; <nl> # elif defined THRESH_TRUNC <nl> - VT vecValue = sdata > vthresh ? thresh : sdata ; <nl> + VT vecValue = sdata > vthresh ? ( VT ) thresh : sdata ; <nl> # elif defined THRESH_TOZERO <nl> VT vecValue = sdata > vthresh ? 
sdata : ( VT ) ( 0 ) ; <nl> # elif defined THRESH_TOZERO_INV <nl> mmm a / platforms / android / service / engine / AndroidManifest . xml <nl> ppp b / platforms / android / service / engine / AndroidManifest . xml <nl> <nl> < ? xml version = " 1 . 0 " encoding = " utf - 8 " ? > <nl> < manifest xmlns : android = " http : / / schemas . android . com / apk / res / android " <nl> package = " org . opencv . engine " <nl> - android : versionCode = " 214 @ ANDROID_PLATFORM_VERSION_CODE @ " <nl> - android : versionName = " 2 . 14 " > <nl> + android : versionCode = " 216 @ ANDROID_PLATFORM_VERSION_CODE @ " <nl> + android : versionName = " 2 . 16 " > <nl> <nl> < uses - sdk android : minSdkVersion = " @ ANDROID_NATIVE_API_LEVEL @ " / > <nl> < uses - feature android : name = " android . hardware . touchscreen " android : required = " false " / > <nl> mmm a / platforms / android / service / engine / jni / NativeService / PackageInfo . cpp <nl> ppp b / platforms / android / service / engine / jni / NativeService / PackageInfo . cpp <nl> inline string JoinPlatform ( int platform ) <nl> return result ; <nl> } <nl> <nl> - inline int SplitPlatfrom ( const vector < string > & features ) <nl> + inline int SplitPlatform ( const vector < string > & features ) <nl> { <nl> int result = 0 ; <nl> <nl> InstallPath ( install_path ) <nl> return ; <nl> } <nl> <nl> - Platform = SplitPlatfrom ( features ) ; <nl> + Platform = SplitPlatform ( features ) ; <nl> if ( PLATFORM_UNKNOWN ! = Platform ) <nl> { <nl> switch ( Platform ) <nl> mmm a / platforms / android / service / engine / jni / Tests / HardwareDetectionTest . cpp <nl> ppp b / platforms / android / service / engine / jni / Tests / HardwareDetectionTest . cpp <nl> TEST ( CpuID , CheckVFPv3 ) <nl> EXPECT_TRUE ( cpu_id & FEATURES_HAS_VFPv3 ) ; <nl> } <nl> <nl> - TEST ( PlatfromDetector , CheckTegra ) <nl> + TEST ( PlatformDetector , CheckTegra ) <nl> { <nl> EXPECT_NE ( PLATFORM_UNKNOWN , DetectKnownPlatforms ( ) ) ; <nl> } <nl> mmm a / platforms / android / service / engine / src / org / opencv / engine / manager / ManagerActivity . java <nl> ppp b / platforms / android / service / engine / src / org / opencv / engine / manager / ManagerActivity . java <nl> public void onClick ( DialogInterface dialog , int which ) { <nl> mInstalledPackageView . setAdapter ( mInstalledPacksAdapter ) ; <nl> <nl> TextView HardwarePlatformView = ( TextView ) findViewById ( R . id . HardwareValue ) ; <nl> - int Platfrom = HardwareDetector . DetectKnownPlatforms ( ) ; <nl> + int Platform = HardwareDetector . DetectKnownPlatforms ( ) ; <nl> int CpuId = HardwareDetector . GetCpuID ( ) ; <nl> <nl> - if ( HardwareDetector . PLATFORM_UNKNOWN ! = Platfrom ) <nl> + if ( HardwareDetector . PLATFORM_UNKNOWN ! = Platform ) <nl> { <nl> - if ( HardwareDetector . PLATFORM_TEGRA = = Platfrom ) <nl> + if ( HardwareDetector . PLATFORM_TEGRA = = Platform ) <nl> { <nl> HardwarePlatformView . setText ( " Tegra " ) ; <nl> } <nl> - else if ( HardwareDetector . PLATFORM_TEGRA2 = = Platfrom ) <nl> + else if ( HardwareDetector . PLATFORM_TEGRA2 = = Platform ) <nl> { <nl> HardwarePlatformView . setText ( " Tegra 2 " ) ; <nl> } <nl> - else if ( HardwareDetector . PLATFORM_TEGRA3 = = Platfrom ) <nl> + else if ( HardwareDetector . PLATFORM_TEGRA3 = = Platform ) <nl> { <nl> HardwarePlatformView . setText ( " Tegra 3 " ) ; <nl> } <nl> - else if ( HardwareDetector . PLATFORM_TEGRA4i = = Platfrom ) <nl> + else if ( HardwareDetector . PLATFORM_TEGRA4i = = Platform ) <nl> { <nl> HardwarePlatformView . 
setText ( " Tegra 4i " ) ; <nl> } <nl> - else if ( HardwareDetector . PLATFORM_TEGRA4 = = Platfrom ) <nl> + else if ( HardwareDetector . PLATFORM_TEGRA4 = = Platform ) <nl> { <nl> HardwarePlatformView . setText ( " Tegra 4 " ) ; <nl> } <nl> mmm a / platforms / android / service / readme . txt <nl> ppp b / platforms / android / service / readme . txt <nl> manually using adb tool : <nl> <nl> . . code - block : : sh <nl> <nl> - adb install OpenCV - 2 . 4 . 7 - android - sdk / apk / OpenCV_2 . 4 . 7_Manager_2 . 14_ < platform > . apk <nl> + adb install OpenCV - 2 . 4 . 7 . 1 - android - sdk / apk / OpenCV_2 . 4 . 7 . 1_Manager_2 . 15_ < platform > . apk <nl> <nl> Use the table below to determine proper OpenCV Manager package for your device : <nl> <nl> - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - + <nl> - | Hardware Platform | Android ver . | Package name | <nl> - + = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = + = = = = = = = = = = = = = = + = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = + <nl> - | armeabi - v7a ( ARMv7 - A + NEON ) | > = 2 . 3 | OpenCV_2 . 4 . 7_Manager_2 . 14_armv7a - neon . apk | <nl> - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - + <nl> - | armeabi - v7a ( ARMv7 - A + NEON ) | = 2 . 2 | OpenCV_2 . 4 . 7_Manager_2 . 14_armv7a - neon - android8 . apk | <nl> - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - + <nl> - | armeabi ( ARMv5 , ARMv6 ) | > = 2 . 3 | OpenCV_2 . 4 . 7_Manager_2 . 14_armeabi . apk | <nl> - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - + <nl> - | Intel x86 | > = 2 . 3 | OpenCV_2 . 4 . 7_Manager_2 . 14_x86 . apk | <nl> - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - + <nl> - | MIPS | > = 2 . 3 | OpenCV_2 . 4 . 7_Manager_2 . 14_mips . apk | <nl> - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm - + <nl> + + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + <nl> + | Hardware Platform | Android ver . | Package name | <nl> + + = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = + = = = = = = = = = = = = = = + = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = + <nl> + | armeabi - v7a ( ARMv7 - A + NEON ) | > = 2 . 3 | OpenCV_2 . 4 . 7 . 1_Manager_2 . 15_armv7a - neon . apk | <nl> + + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + <nl> + | armeabi - v7a ( ARMv7 - A + NEON ) | = 2 . 2 | OpenCV_2 . 4 . 7 . 1_Manager_2 . 15_armv7a - neon - android8 . apk | <nl> + + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + <nl> + | armeabi ( ARMv5 , ARMv6 ) | > = 2 . 3 | OpenCV_2 . 4 . 7 . 1_Manager_2 . 15_armeabi . apk | <nl> + + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + <nl> + | Intel x86 | > = 2 . 3 | OpenCV_2 . 4 . 7 . 1_Manager_2 . 15_x86 . apk | <nl> + + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + <nl> + | MIPS | > = 2 . 3 | OpenCV_2 . 4 . 7 . 1_Manager_2 . 
15_mips . apk | <nl> + + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + mmmmmmmmmmmm - - + mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm + <nl> mmm a / samples / cpp / CMakeLists . txt <nl> ppp b / samples / cpp / CMakeLists . txt <nl> if ( BUILD_EXAMPLES AND OCV_DEPENDENCIES_FOUND ) <nl> ocv_include_directories ( " $ { OpenCV_SOURCE_DIR } / modules / cudafilters / include " ) <nl> endif ( ) <nl> <nl> + if ( HAVE_opencv_ocl ) <nl> + ocv_include_directories ( " $ { OpenCV_SOURCE_DIR } / modules / ocl / include " ) <nl> + endif ( ) <nl> + <nl> if ( CMAKE_COMPILER_IS_GNUCXX AND NOT ENABLE_NOISY_WARNINGS ) <nl> set ( CMAKE_C_FLAGS " $ { CMAKE_C_FLAGS } - Wno - unused - function " ) <nl> endif ( ) <nl> if ( BUILD_EXAMPLES AND OCV_DEPENDENCIES_FOUND ) <nl> target_link_libraries ( $ { the_target } opencv_cudaarithm opencv_cudafilters ) <nl> endif ( ) <nl> <nl> + if ( HAVE_opencv_ocl ) <nl> + target_link_libraries ( $ { the_target } opencv_ocl ) <nl> + endif ( ) <nl> + <nl> set_target_properties ( $ { the_target } PROPERTIES <nl> OUTPUT_NAME " cpp - $ { sample_kind } - $ { name } " <nl> PROJECT_LABEL " ( $ { sample_KIND } ) $ { name } " ) <nl> mmm a / samples / cpp / bagofwords_classification . cpp <nl> ppp b / samples / cpp / bagofwords_classification . cpp <nl> <nl> + # include " opencv2 / opencv_modules . hpp " <nl> # include " opencv2 / highgui / highgui . hpp " <nl> # include " opencv2 / imgproc / imgproc . hpp " <nl> # include " opencv2 / features2d / features2d . hpp " <nl> # include " opencv2 / nonfree / nonfree . hpp " <nl> # include " opencv2 / ml / ml . hpp " <nl> + # ifdef HAVE_OPENCV_OCL <nl> + # define _OCL_SVM_ 1 / / select whether using ocl : : svm method or not , default is using <nl> + # include " opencv2 / ocl / ocl . hpp " <nl> + # endif <nl> <nl> # include < fstream > <nl> # include < iostream > <nl> static void setSVMTrainAutoParams ( CvParamGrid & c_grid , CvParamGrid & gamma_grid , <nl> degree_grid . step = 0 ; <nl> } <nl> <nl> + # if defined HAVE_OPENCV_OCL & & _OCL_SVM_ <nl> + static void trainSVMClassifier ( cv : : ocl : : CvSVM_OCL & svm , const SVMTrainParamsExt & svmParamsExt , const string & objClassName , VocData & vocData , <nl> + Ptr < BOWImgDescriptorExtractor > & bowExtractor , const Ptr < FeatureDetector > & fdetector , <nl> + const string & resPath ) <nl> + # else <nl> static void trainSVMClassifier ( CvSVM & svm , const SVMTrainParamsExt & svmParamsExt , const string & objClassName , VocData & vocData , <nl> Ptr < BOWImgDescriptorExtractor > & bowExtractor , const Ptr < FeatureDetector > & fdetector , <nl> const string & resPath ) <nl> + # endif <nl> { <nl> / * first check if a previously trained svm for the current class has been saved to file * / <nl> string svmFilename = resPath + svmsDir + " / " + objClassName + " . xml . 
gz " ; <nl> static void trainSVMClassifier ( CvSVM & svm , const SVMTrainParamsExt & svmParamsEx <nl> } <nl> } <nl> <nl> + # if defined HAVE_OPENCV_OCL & & _OCL_SVM_ <nl> + static void computeConfidences ( cv : : ocl : : CvSVM_OCL & svm , const string & objClassName , VocData & vocData , <nl> + Ptr < BOWImgDescriptorExtractor > & bowExtractor , const Ptr < FeatureDetector > & fdetector , <nl> + const string & resPath ) <nl> + # else <nl> static void computeConfidences ( CvSVM & svm , const string & objClassName , VocData & vocData , <nl> Ptr < BOWImgDescriptorExtractor > & bowExtractor , const Ptr < FeatureDetector > & fdetector , <nl> const string & resPath ) <nl> + # endif <nl> { <nl> cout < < " * * * CALCULATING CONFIDENCES FOR CLASS " < < objClassName < < " * * * " < < endl ; <nl> cout < < " CALCULATING BOW VECTORS FOR TEST SET OF " < < objClassName < < " . . . " < < endl ; <nl> int main ( int argc , char * * argv ) <nl> for ( size_t classIdx = 0 ; classIdx < objClasses . size ( ) ; + + classIdx ) <nl> { <nl> / / Train a classifier on train dataset <nl> + # if defined HAVE_OPENCV_OCL & & _OCL_SVM_ <nl> + cv : : ocl : : CvSVM_OCL svm ; <nl> + # else <nl> CvSVM svm ; <nl> + # endif <nl> trainSVMClassifier ( svm , svmTrainParamsExt , objClasses [ classIdx ] , vocData , <nl> bowExtractor , featureDetector , resPath ) ; <nl> <nl> mmm a / samples / cpp / points_classifier . cpp <nl> ppp b / samples / cpp / points_classifier . cpp <nl> <nl> + # include " opencv2 / opencv_modules . hpp " <nl> # include " opencv2 / core / core . hpp " <nl> # include " opencv2 / ml / ml . hpp " <nl> # include " opencv2 / highgui / highgui . hpp " <nl> + # ifdef HAVE_OPENCV_OCL <nl> + # define _OCL_KNN_ 1 / / select whether using ocl : : KNN method or not , default is using <nl> + # define _OCL_SVM_ 1 / / select whether using ocl : : svm method or not , default is using <nl> + # include " opencv2 / ocl / ocl . hpp " <nl> + # endif <nl> <nl> # include < stdio . h > <nl> <nl> static void find_decision_boundary_KNN ( int K ) <nl> prepare_train_data ( trainSamples , trainClasses ) ; <nl> <nl> / / learn classifier <nl> + # if defined HAVE_OPENCV_OCL & & _OCL_KNN_ <nl> + cv : : ocl : : KNearestNeighbour knnClassifier ; <nl> + Mat temp , result ; <nl> + knnClassifier . train ( trainSamples , trainClasses , temp , false , K ) ; <nl> + cv : : ocl : : oclMat testSample_ocl , reslut_ocl ; <nl> + # else <nl> CvKNearest knnClassifier ( trainSamples , trainClasses , Mat ( ) , false , K ) ; <nl> + # endif <nl> <nl> Mat testSample ( 1 , 2 , CV_32FC1 ) ; <nl> for ( int y = 0 ; y < img . rows ; y + = testStep ) <nl> static void find_decision_boundary_KNN ( int K ) <nl> { <nl> testSample . at < float > ( 0 ) = ( float ) x ; <nl> testSample . at < float > ( 1 ) = ( float ) y ; <nl> + # if defined HAVE_OPENCV_OCL & & _OCL_KNN_ <nl> + testSample_ocl . upload ( testSample ) ; <nl> + <nl> + knnClassifier . find_nearest ( testSample_ocl , K , reslut_ocl ) ; <nl> + <nl> + reslut_ocl . download ( result ) ; <nl> + int response = saturate_cast < int > ( result . at < float > ( 0 ) ) ; <nl> + circle ( imgDst , Point ( x , y ) , 1 , classColors [ response ] ) ; <nl> + # else <nl> <nl> int response = ( int ) knnClassifier . 
find_nearest ( testSample , K ) ; <nl> circle ( imgDst , Point ( x , y ) , 1 , classColors [ response ] ) ; <nl> + # endif <nl> } <nl> } <nl> } <nl> static void find_decision_boundary_SVM ( CvSVMParams params ) <nl> prepare_train_data ( trainSamples , trainClasses ) ; <nl> <nl> / / learn classifier <nl> + # if defined HAVE_OPENCV_OCL & & _OCL_SVM_ <nl> + cv : : ocl : : CvSVM_OCL svmClassifier ( trainSamples , trainClasses , Mat ( ) , Mat ( ) , params ) ; <nl> + # else <nl> CvSVM svmClassifier ( trainSamples , trainClasses , Mat ( ) , Mat ( ) , params ) ; <nl> + # endif <nl> <nl> Mat testSample ( 1 , 2 , CV_32FC1 ) ; <nl> for ( int y = 0 ; y < img . rows ; y + = testStep ) <nl> static void find_decision_boundary_SVM ( CvSVMParams params ) <nl> for ( int i = 0 ; i < svmClassifier . get_support_vector_count ( ) ; i + + ) <nl> { <nl> const float * supportVector = svmClassifier . get_support_vector ( i ) ; <nl> - circle ( imgDst , Point ( supportVector [ 0 ] , supportVector [ 1 ] ) , 5 , Scalar ( 255 , 255 , 255 ) , - 1 ) ; <nl> + circle ( imgDst , Point ( saturate_cast < int > ( supportVector [ 0 ] ) , saturate_cast < int > ( supportVector [ 1 ] ) ) , 5 , CV_RGB ( 255 , 255 , 255 ) , - 1 ) ; <nl> } <nl> <nl> } <nl> mmm a / samples / ocl / facedetect . cpp <nl> ppp b / samples / ocl / facedetect . cpp <nl> <nl> # include < iostream > <nl> # include < stdio . h > <nl> <nl> + # if defined ( _MSC_VER ) & & ( _MSC_VER > = 1700 ) <nl> + # include < thread > <nl> + # endif <nl> <nl> using namespace std ; <nl> using namespace cv ; <nl> # define LOOP_NUM 1 <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / single - threading faces detecting / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + <nl> const static Scalar colors [ ] = { CV_RGB ( 0 , 0 , 255 ) , <nl> CV_RGB ( 0 , 128 , 255 ) , <nl> CV_RGB ( 0 , 255 , 255 ) , <nl> const static Scalar colors [ ] = { CV_RGB ( 0 , 0 , 255 ) , <nl> <nl> int64 work_begin = 0 ; <nl> int64 work_end = 0 ; <nl> - string outputName ; <nl> + string inputName , outputName , cascadeName ; <nl> <nl> static void workBegin ( ) <nl> { <nl> static void Draw ( Mat & img , vector < Rect > & faces , double scale ) ; <nl> / / Else if will return ( total diff of each cpu and gpu rects covered pixels ) / ( total cpu rects covered pixels ) <nl> double checkRectSimilarity ( Size sz , vector < Rect > & cpu_rst , vector < Rect > & gpu_rst ) ; <nl> <nl> - int main ( int argc , const char * * argv ) <nl> + static int facedetect_one_thread ( bool useCPU , double scale ) <nl> { <nl> - const char * keys = <nl> - " { h help | false | print help message } " <nl> - " { i input | | specify input image } " <nl> - " { t template | haarcascade_frontalface_alt . xml | " <nl> - " specify template file path } " <nl> - " { c scale | 1 . 0 | scale image } " <nl> - " { s use_cpu | false | use cpu or gpu to process the image } " <nl> - " { o output | facedetect_output . jpg | " <nl> - " specify output image save path ( only works when input is images ) } " ; <nl> - <nl> - CommandLineParser cmd ( argc , argv , keys ) ; <nl> - if ( cmd . get < bool > ( " help " ) ) <nl> - { <nl> - cout < < " Usage : facedetect [ options ] " < < endl ; <nl> - cout < < " Available options : " < < endl ; <nl> - cmd . printMessage ( ) ; <nl> - return EXIT_SUCCESS ; <nl> - } <nl> - <nl> CvCapture * capture = 0 ; <nl> Mat frame , frameCopy0 , frameCopy , image ; <nl> <nl> - bool useCPU = cmd . get < bool > ( " s " ) ; <nl> - string inputName = cmd . 
get < string > ( " i " ) ; <nl> - outputName = cmd . get < string > ( " o " ) ; <nl> - string cascadeName = cmd . get < string > ( " t " ) ; <nl> - double scale = cmd . get < double > ( " c " ) ; <nl> ocl : : OclCascadeClassifier cascade ; <nl> CascadeClassifier cpu_cascade ; <nl> <nl> if ( ! cascade . load ( cascadeName ) | | ! cpu_cascade . load ( cascadeName ) ) <nl> { <nl> - cout < < " ERROR : Could not load classifier cascade " < < endl ; <nl> + cout < < " ERROR : Could not load classifier cascade : " < < cascadeName < < endl ; <nl> return EXIT_FAILURE ; <nl> } <nl> <nl> int main ( int argc , const char * * argv ) <nl> } <nl> <nl> cvDestroyWindow ( " result " ) ; <nl> + std : : cout < < " single - threaded sample has finished " < < std : : endl ; <nl> return 0 ; <nl> } <nl> <nl> + / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / detectfaces with multithreading / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / <nl> + # if defined ( _MSC_VER ) & & ( _MSC_VER > = 1700 ) <nl> + <nl> + # define MAX_THREADS 10 <nl> + <nl> + static void detectFaces ( std : : string fileName ) <nl> + { <nl> + ocl : : OclCascadeClassifier cascade ; <nl> + if ( ! cascade . load ( cascadeName ) ) <nl> + { <nl> + std : : cout < < " ERROR : Could not load classifier cascade : " < < cascadeName < < std : : endl ; <nl> + return ; <nl> + } <nl> + <nl> + Mat img = imread ( fileName , CV_LOAD_IMAGE_COLOR ) ; <nl> + if ( img . empty ( ) ) <nl> + { <nl> + std : : cout < < " cann ' t open file " + fileName < < std : : endl ; <nl> + return ; <nl> + } <nl> + <nl> + ocl : : oclMat d_img ; <nl> + d_img . upload ( img ) ; <nl> + <nl> + std : : vector < Rect > oclfaces ; <nl> + cascade . detectMultiScale ( d_img , oclfaces , 1 . 1 , 3 , 0 | CV_HAAR_SCALE_IMAGE , Size ( 30 , 30 ) , Size ( 0 , 0 ) ) ; <nl> + <nl> + for ( unsigned int i = 0 ; i < oclfaces . size ( ) ; i + + ) <nl> + rectangle ( img , Point ( oclfaces [ i ] . x , oclfaces [ i ] . y ) , Point ( oclfaces [ i ] . x + oclfaces [ i ] . width , oclfaces [ i ] . y + oclfaces [ i ] . height ) , colors [ i % 8 ] , 3 ) ; <nl> + <nl> + std : : string : : size_type pos = outputName . rfind ( ' . ' ) ; <nl> + std : : string outputNameTid = outputName + ' - ' + std : : to_string ( _threadid ) ; <nl> + if ( pos = = std : : string : : npos ) <nl> + { <nl> + std : : cout < < " Invalid output file name : " < < outputName < < std : : endl ; <nl> + } <nl> + else <nl> + { <nl> + outputNameTid = outputName . substr ( 0 , pos ) + " _ " + std : : to_string ( _threadid ) + outputName . substr ( pos ) ; <nl> + imwrite ( outputNameTid , img ) ; <nl> + } <nl> + imshow ( outputNameTid , img ) ; <nl> + waitKey ( 0 ) ; <nl> + } <nl> + <nl> + static void facedetect_multithreading ( int nthreads ) <nl> + { <nl> + int thread_number = MAX_THREADS < nthreads ? MAX_THREADS : nthreads ; <nl> + std : : vector < std : : thread > threads ; <nl> + for ( int i = 0 ; i < thread_number ; i + + ) <nl> + threads . push_back ( std : : thread ( detectFaces , inputName ) ) ; <nl> + for ( int i = 0 ; i < thread_number ; i + + ) <nl> + threads [ i ] . join ( ) ; <nl> + } <nl> + # endif <nl> + <nl> + int main ( int argc , const char * * argv ) <nl> + { <nl> + <nl> + const char * keys = <nl> + " { h help | false | print help message } " <nl> + " { i input | | specify input image } " <nl> + " { t template | haarcascade_frontalface_alt . xml | " <nl> + " specify template file path } " <nl> + " { c scale | 1 . 
0 | scale image } " <nl> + " { s use_cpu | false | use cpu or gpu to process the image } " <nl> + " { o output | facedetect_output . jpg | " <nl> + " specify output image save path ( only works when input is images ) } " <nl> + " { n thread_num | 1 | set number of threads > = 1 } " ; <nl> + <nl> + CommandLineParser cmd ( argc , argv , keys ) ; <nl> + if ( cmd . has ( " help " ) ) <nl> + { <nl> + cout < < " Usage : facedetect [ options ] " < < endl ; <nl> + cout < < " Available options : " < < endl ; <nl> + cmd . printMessage ( ) ; <nl> + return EXIT_SUCCESS ; <nl> + } <nl> + bool useCPU = cmd . get < bool > ( " s " ) ; <nl> + inputName = cmd . get < string > ( " i " ) ; <nl> + outputName = cmd . get < string > ( " o " ) ; <nl> + cascadeName = cmd . get < string > ( " t " ) ; <nl> + double scale = cmd . get < double > ( " c " ) ; <nl> + int n = cmd . get < int > ( " n " ) ; <nl> + <nl> + if ( n > 1 ) <nl> + { <nl> + # if defined ( _MSC_VER ) & & ( _MSC_VER > = 1700 ) <nl> + std : : cout < < " multi - threaded sample is running " < < std : : endl ; <nl> + facedetect_multithreading ( n ) ; <nl> + std : : cout < < " multi - threaded sample has finished " < < std : : endl ; <nl> + return 0 ; <nl> + # else <nl> + std : : cout < < " std : : thread is not supported , running a single - threaded version " < < std : : endl ; <nl> + # endif <nl> + } <nl> + if ( n < 0 ) <nl> + std : : cout < < " incorrect number of threads : " < < n < < " , running a single - threaded version " < < std : : endl ; <nl> + else <nl> + std : : cout < < " single - threaded sample is running " < < std : : endl ; <nl> + return facedetect_one_thread ( useCPU , scale ) ; <nl> + <nl> + } <nl> + <nl> void detect ( Mat & img , vector < Rect > & faces , <nl> ocl : : OclCascadeClassifier & cascade , <nl> double scale ) <nl>
Merge pull request from SpecLad: merge-2.4
opencv/opencv
90c28d254f06127e3bc5e8ad0cdfc3ee8590da16
2013-12-10T10:39:35Z
mmm a / BUILD <nl> ppp b / BUILD <nl> cc_library ( <nl> " src / google / protobuf / api . pb . cc " , <nl> " src / google / protobuf / compiler / importer . cc " , <nl> " src / google / protobuf / compiler / parser . cc " , <nl> - " src / google / protobuf / compiler / plugin . pb . cc " , <nl> " src / google / protobuf / descriptor . cc " , <nl> " src / google / protobuf / descriptor . pb . cc " , <nl> " src / google / protobuf / descriptor_database . cc " , <nl> cc_library ( <nl> " src / google / protobuf / compiler / objectivec / objectivec_primitive_field . cc " , <nl> " src / google / protobuf / compiler / php / php_generator . cc " , <nl> " src / google / protobuf / compiler / plugin . cc " , <nl> + " src / google / protobuf / compiler / plugin . pb . cc " , <nl> " src / google / protobuf / compiler / python / python_generator . cc " , <nl> " src / google / protobuf / compiler / ruby / ruby_generator . cc " , <nl> " src / google / protobuf / compiler / subprocess . cc " , <nl> mmm a / cmake / libprotobuf . cmake <nl> ppp b / cmake / libprotobuf . cmake <nl> set ( libprotobuf_files <nl> $ { protobuf_source_dir } / src / google / protobuf / api . pb . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / compiler / importer . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / compiler / parser . cc <nl> - $ { protobuf_source_dir } / src / google / protobuf / compiler / plugin . pb . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / descriptor . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / descriptor . pb . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / descriptor_database . cc <nl> mmm a / cmake / libprotoc . cmake <nl> ppp b / cmake / libprotoc . cmake <nl> set ( libprotoc_files <nl> $ { protobuf_source_dir } / src / google / protobuf / compiler / objectivec / objectivec_primitive_field . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / compiler / php / php_generator . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / compiler / plugin . cc <nl> + $ { protobuf_source_dir } / src / google / protobuf / compiler / plugin . pb . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / compiler / python / python_generator . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / compiler / ruby / ruby_generator . cc <nl> $ { protobuf_source_dir } / src / google / protobuf / compiler / subprocess . cc <nl> mmm a / src / Makefile . am <nl> ppp b / src / Makefile . am <nl> libprotobuf_la_SOURCES = \ <nl> google / protobuf / io / zero_copy_stream_impl . cc \ <nl> google / protobuf / compiler / importer . cc \ <nl> google / protobuf / compiler / parser . cc \ <nl> - google / protobuf / compiler / plugin . pb . cc \ <nl> google / protobuf / util / delimited_message_util . cc \ <nl> google / protobuf / util / field_comparator . cc \ <nl> google / protobuf / util / field_mask_util . cc \ <nl> libprotoc_la_SOURCES = \ <nl> google / protobuf / compiler / code_generator . cc \ <nl> google / protobuf / compiler / command_line_interface . cc \ <nl> google / protobuf / compiler / plugin . cc \ <nl> + google / protobuf / compiler / plugin . pb . cc \ <nl> google / protobuf / compiler / subprocess . cc \ <nl> google / protobuf / compiler / subprocess . h \ <nl> google / protobuf / compiler / zip_writer . cc \ <nl>
Merge pull request from xfxyjwf/pluginpb
protocolbuffers/protobuf
8e44a86facd0f42af7a3c0c47f8133f78f037269
2018-03-09T18:50:23Z
mmm a / dbms / tests / clickhouse - test <nl> ppp b / dbms / tests / clickhouse - test <nl> def remove_control_characters ( s ) : <nl> <nl> def run_single_test ( args , ext , server_logs_level , client_options , case_file , stdout_file , stderr_file ) : <nl> <nl> + # print ( client_options ) <nl> + <nl> params = { <nl> ' client ' : args . client_with_database , <nl> ' logs_level ' : server_logs_level , <nl> def run_single_test ( args , ext , server_logs_level , client_options , case_file , std <nl> ' stderr ' : stderr_file , <nl> } <nl> <nl> + pattern = ' { test } > { stdout } 2 > { stderr } ' <nl> + <nl> if ext = = ' . sql ' : <nl> - pattern = " { client } - - send_logs_level = { logs_level } - - testmode - - multiquery < { test } > { stdout } 2 > { stderr } " <nl> - command = pattern . format ( * * params ) <nl> - else : <nl> - command = " { test } > { stdout } 2 > { stderr } " . format ( * * params ) <nl> + pattern = " { client } - - send_logs_level = { logs_level } - - testmode - - multiquery { options } < " + pattern <nl> + <nl> + command = pattern . format ( * * params ) <nl> + # print ( command ) <nl> <nl> - proc = Popen ( command , shell = True ) <nl> + proc = Popen ( command , shell = True , env = os . environ ) <nl> start_time = datetime . now ( ) <nl> while ( datetime . now ( ) - start_time ) . total_seconds ( ) < args . timeout and proc . poll ( ) is None : <nl> sleep ( 0 . 01 ) <nl> def find_binary ( name ) : <nl> <nl> <nl> def get_additional_client_options ( args ) : <nl> - return ' ' . join ( ' - - ' + option for option in args . client_option ) <nl> + if args . client_option : <nl> + return ' ' . join ( ' - - ' + option for option in args . client_option ) <nl> + <nl> + return ' ' <nl> + <nl> + <nl> + def get_additional_client_options_url ( args ) : <nl> + if args . client_option : <nl> + return ' & ' . join ( args . client_option ) <nl> + <nl> + return ' ' <nl> <nl> <nl> if __name__ = = ' __main__ ' : <nl> if __name__ = = ' __main__ ' : <nl> args . client + = ' - - database = ' + os . getenv ( " CLICKHOUSE_DATABASE " ) <nl> <nl> if args . client_option : <nl> - os . environ [ ' CLICKHOUSE_CLIENT_OPT ' ] + = ' ' + get_additional_client_options ( args ) <nl> + # Set options for client <nl> + if ' CLICKHOUSE_CLIENT_OPT ' in os . environ : <nl> + os . environ [ ' CLICKHOUSE_CLIENT_OPT ' ] + = ' ' <nl> + else : <nl> + os . environ [ ' CLICKHOUSE_CLIENT_OPT ' ] = ' ' <nl> + <nl> + os . environ [ ' CLICKHOUSE_CLIENT_OPT ' ] + = get_additional_client_options ( args ) <nl> + <nl> + # Set options for curl <nl> + if ' CLICKHOUSE_URL_PARAMS ' in os . environ : <nl> + os . environ [ ' CLICKHOUSE_URL_PARAMS ' ] + = ' & ' <nl> + else : <nl> + os . environ [ ' CLICKHOUSE_URL_PARAMS ' ] = ' ' <nl> + <nl> + os . environ [ ' CLICKHOUSE_URL_PARAMS ' ] + = get_additional_client_options_url ( args ) <nl> + <nl> <nl> args . client_with_database = args . client <nl> if not args . database : <nl> mmm a / dbms / tests / queries / 0_stateless / 00039_inserts_through_http . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00039_inserts_through_http . sh <nl> echo ' DROP TABLE IF EXISTS long_insert ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_U <nl> echo ' CREATE TABLE long_insert ( a String ) ENGINE = Memory ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL } - d @ - <nl> for string_size in 1 10 100 1000 10000 100000 1000000 ; do <nl> # LC_ALL = C is needed because otherwise Perl will bark on bad tuned environment . <nl> - LC_ALL = C perl - we ' for my $ letter ( " a " . . 
" z " ) { print ( ( $ letter x ' $ string_size ' ) . " \ n " ) } ' | $ { CLICKHOUSE_CURL } - sSg " $ { CLICKHOUSE_URL } ? query = INSERT + INTO + long_insert + FORMAT + TabSeparated " - - data - binary @ - <nl> + LC_ALL = C perl - we ' for my $ letter ( " a " . . " z " ) { print ( ( $ letter x ' $ string_size ' ) . " \ n " ) } ' | $ { CLICKHOUSE_CURL } - sSg " $ { CLICKHOUSE_URL } & query = INSERT + INTO + long_insert + FORMAT + TabSeparated " - - data - binary @ - <nl> echo ' SELECT substring ( a , 1 , 1 ) AS c , length ( a ) AS l FROM long_insert ORDER BY c , l ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL } - d @ - <nl> done <nl> mmm a / dbms / tests / queries / 0_stateless / 00366_multi_statements . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00366_multi_statements . sh <nl> $ CLICKHOUSE_CLIENT - n - - query = " SELECT * FROM t_00366 " <nl> $ CLICKHOUSE_CLIENT - n - - query = " INSERT INTO t_00366 VALUES " < < < " ( 4 ) , ( 5 ) , ( 6 ) " <nl> $ CLICKHOUSE_CLIENT - n - - query = " SELECT * FROM t_00366 " <nl> <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - d " SELECT 1 " <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - d " SELECT 1 ; " <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - d " SELECT 1 ; " <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - d " SELECT 1 ; " <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - d " SELECT 1 " <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - d " SELECT 1 ; " <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - d " SELECT 1 ; " <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - d " SELECT 1 ; " <nl> <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - d " SELECT 1 ; S " 2 > & 1 | grep - o ' Syntax error ' <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - d " SELECT 1 ; SELECT 2 " 2 > & 1 | grep - o ' Syntax error ' <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - d " SELECT 1 ; SELECT 2 ; " 2 > & 1 | grep - o ' Syntax error ' <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - d " SELECT 1 ; SELECT 2 ; SELECT " 2 > & 1 | grep - o ' Syntax error ' <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - d " SELECT 1 ; S " 2 > & 1 | grep - o ' Syntax error ' <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - d " SELECT 1 ; SELECT 2 " 2 > & 1 | grep - o ' Syntax error ' <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - d " SELECT 1 ; SELECT 2 ; " 2 > & 1 | grep - o ' Syntax error ' <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - d " SELECT 1 ; SELECT 2 ; SELECT " 2 > & 1 | grep - o ' Syntax error ' <nl> <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - d " INSERT INTO t_00366 VALUES ( 1 ) , ( 2 ) , ( 3 ) ; " <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - d " INSERT INTO t_00366 VALUES ( 1 ) , ( 2 ) , ( 3 ) ; " <nl> $ CLICKHOUSE_CLIENT - - query = " SELECT * FROM t_00366 " <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT " - d " INTO t_00366 VALUES ( 4 ) , ( 5 ) , ( 6 ) ; " <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = INSERT " - d " INTO t_00366 VALUES ( 4 ) , ( 5 ) , ( 6 ) ; " <nl> $ CLICKHOUSE_CLIENT - - query = " SELECT * FROM t_00366 " <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT + INTO + t_00366 + VALUES " - d " ( 7 ) , ( 8 ) , ( 9 ) " <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & 
query = INSERT + INTO + t_00366 + VALUES " - d " ( 7 ) , ( 8 ) , ( 9 ) " <nl> $ CLICKHOUSE_CLIENT - - query = " SELECT * FROM t_00366 " <nl> <nl> $ CLICKHOUSE_CLIENT - n - - query = " DROP TABLE t_00366 ; " <nl> mmm a / dbms / tests / queries / 0_stateless / 00386_long_in_pk . python <nl> ppp b / dbms / tests / queries / 0_stateless / 00386_long_in_pk . python <nl> import requests <nl> import os <nl> <nl> def main ( ) : <nl> - url = os . environ [ ' CLICKHOUSE_URL_PARAMS ' ] <nl> + url = os . environ [ ' CLICKHOUSE_URL ' ] <nl> <nl> for q in gen_queries ( ) : <nl> resp = requests . post ( url , data = q ) <nl> mmm a / dbms / tests / queries / 0_stateless / 00485_http_insert_format . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00485_http_insert_format . sh <nl> CURDIR = $ ( cd " $ ( dirname " $ { BASH_SOURCE [ 0 ] } " ) " & & pwd ) <nl> $ CLICKHOUSE_CLIENT - - query = " DROP TABLE IF EXISTS format " <nl> $ CLICKHOUSE_CLIENT - - query = " CREATE TABLE format ( s String , x FixedString ( 3 ) ) ENGINE = Memory " <nl> <nl> - echo - ne ' \ tABC \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT + INTO + format + FORMAT + TabSeparated " - - data - binary @ - <nl> - echo - ne ' INSERT INTO format FORMAT TabSeparated \ n \ tDEF \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - - data - binary @ - <nl> - echo - ne ' INSERT INTO format FORMAT TabSeparated hello \ tGHI \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - - data - binary @ - <nl> - echo - ne ' INSERT INTO format FORMAT TabSeparated \ r \ n \ tJKL \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - - data - binary @ - <nl> - echo - ne ' INSERT INTO format FORMAT TabSeparated \ t \ r \ n \ tMNO \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - - data - binary @ - <nl> - echo - ne ' INSERT INTO format FORMAT TabSeparated \ t \ t \ thello \ tPQR \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } " - - data - binary @ - <nl> + echo - ne ' \ tABC \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = INSERT + INTO + format + FORMAT + TabSeparated " - - data - binary @ - <nl> + echo - ne ' INSERT INTO format FORMAT TabSeparated \ n \ tDEF \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - - data - binary @ - <nl> + echo - ne ' INSERT INTO format FORMAT TabSeparated hello \ tGHI \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - - data - binary @ - <nl> + echo - ne ' INSERT INTO format FORMAT TabSeparated \ r \ n \ tJKL \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - - data - binary @ - <nl> + echo - ne ' INSERT INTO format FORMAT TabSeparated \ t \ r \ n \ tMNO \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - - data - binary @ - <nl> + echo - ne ' INSERT INTO format FORMAT TabSeparated \ t \ t \ thello \ tPQR \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } " - - data - binary @ - <nl> <nl> $ CLICKHOUSE_CLIENT - - query = " SELECT * FROM format ORDER BY s , x FORMAT JSONEachRow " <nl> $ CLICKHOUSE_CLIENT - - query = " DROP TABLE format " <nl> mmm a / dbms / tests / queries / 0_stateless / 00564_enum_order . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00564_enum_order . sh <nl> <nl> CURDIR = $ ( cd " $ ( dirname " $ { BASH_SOURCE [ 0 ] } " ) " & & pwd ) <nl> . $ CURDIR / . . / shell_config . 
sh <nl> <nl> - $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL_PARAMS " - d " DROP TABLE IF EXISTS enum " ; <nl> - $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL_PARAMS " - d " CREATE TABLE enum ( x Enum8 ( ' a ' = 1 , ' bcdefghijklmno ' = 0 ) ) ENGINE = Memory " ; <nl> - $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL_PARAMS " - d " INSERT INTO enum VALUES ( ' a ' ) " ; <nl> - $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL_PARAMS " - d " SELECT * FROM enum " ; <nl> - $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL_PARAMS " - d " DROP TABLE enum " ; <nl> + $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL " - d " DROP TABLE IF EXISTS enum " ; <nl> + $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL " - d " CREATE TABLE enum ( x Enum8 ( ' a ' = 1 , ' bcdefghijklmno ' = 0 ) ) ENGINE = Memory " ; <nl> + $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL " - d " INSERT INTO enum VALUES ( ' a ' ) " ; <nl> + $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL " - d " SELECT * FROM enum " ; <nl> + $ CLICKHOUSE_CURL - sS " $ CLICKHOUSE_URL " - d " DROP TABLE enum " ; <nl> mmm a / dbms / tests / queries / 0_stateless / 00565_enum_order . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00565_enum_order . sh <nl> QUERY = ' INSERT INTO ` test_log ` ( ` date ` , ` datetime ` , ` path ` , ` gtid ` , ` query_serial ` <nl> ` new_fields ` . ` is_null ` , ` record_source_type ` , ` record_source_timestamp ` , ` deleted ` ) FORMAT TabSeparated ' <nl> QUERY = " $ ( tr - d ' \ n ' < < < " $ QUERY " ) " <nl> echo $ QUERY <nl> - URL = $ ( python - c ' print " ' $ { CLICKHOUSE_URL_PARAMS } ' & query = " + __import__ ( " urllib " ) . quote ( " ' " $ QUERY " ' " ) ' ) <nl> + URL = $ ( python - c ' print " ' $ { CLICKHOUSE_URL } ' & query = " + __import__ ( " urllib " ) . quote ( " ' " $ QUERY " ' " ) ' ) <nl> <nl> set + e <nl> for i in 1 2 3 ; do <nl> mmm a / dbms / tests / queries / 0_stateless / 00598_create_as_select_http . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00598_create_as_select_http . sh <nl> CURDIR = $ ( cd " $ ( dirname " $ { BASH_SOURCE [ 0 ] } " ) " & & pwd ) <nl> set - e - o pipefail <nl> <nl> $ CLICKHOUSE_CLIENT - - query = " DROP TABLE IF EXISTS test_00598 " <nl> - $ CLICKHOUSE_CURL - sS - d ' CREATE TABLE test_00598 ENGINE = Memory AS SELECT 1 ' $ CLICKHOUSE_URL_PARAMS <nl> + $ CLICKHOUSE_CURL - sS - d ' CREATE TABLE test_00598 ENGINE = Memory AS SELECT 1 ' $ CLICKHOUSE_URL <nl> $ CLICKHOUSE_CLIENT - - query = " SELECT * FROM test_00598 " <nl> $ CLICKHOUSE_CLIENT - - query = " DROP TABLE test_00598 " <nl> mmm a / dbms / tests / queries / 0_stateless / 00612_http_max_query_size . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00612_http_max_query_size . sh <nl> echo " select ' 1 ' " | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL } / ? max_query_size = 10 <nl> echo - <nl> echo " select ' 11 ' " | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL } / ? 
max_query_size = 10 - d @ - 2 > & 1 | grep - o " Max query size exceeded " <nl> <nl> - echo ' drop table if exists tab_00612_1 ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL_PARAMS } - d @ - <nl> - echo ' create table tab_00612_1 ( key UInt64 , val UInt64 ) engine = MergeTree order by key ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL_PARAMS } - d @ - <nl> - echo ' into tab_00612_1 values ( 1 , 1 ) , ( 2 , 2 ) , ( 3 , 3 ) , ( 4 , 4 ) , ( 5 , 5 ) ' | $ { CLICKHOUSE_CURL } - sSg " $ { CLICKHOUSE_URL_PARAMS } & max_query_size = 30 & query = insert " - d @ - <nl> - echo ' select val from tab_00612_1 order by val ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL_PARAMS } - d @ - <nl> - echo ' drop table tab_00612_1 ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL_PARAMS } - d @ - <nl> + echo ' drop table if exists tab_00612_1 ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL } - d @ - <nl> + echo ' create table tab_00612_1 ( key UInt64 , val UInt64 ) engine = MergeTree order by key ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL } - d @ - <nl> + echo ' into tab_00612_1 values ( 1 , 1 ) , ( 2 , 2 ) , ( 3 , 3 ) , ( 4 , 4 ) , ( 5 , 5 ) ' | $ { CLICKHOUSE_CURL } - sSg " $ { CLICKHOUSE_URL } & max_query_size = 30 & query = insert " - d @ - <nl> + echo ' select val from tab_00612_1 order by val ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL } - d @ - <nl> + echo ' drop table tab_00612_1 ' | $ { CLICKHOUSE_CURL } - sSg $ { CLICKHOUSE_URL } - d @ - <nl> <nl> echo " <nl> import requests <nl> mmm a / dbms / tests / queries / 0_stateless / 00719_insert_block_without_column . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00719_insert_block_without_column . sh <nl> $ { CLICKHOUSE_CLIENT } - - query " create table squashed_numbers ( SomeID UInt64 , Diff <nl> # port = $ { CLICKHOUSE_PORT_HTTP } <nl> # url = " $ { CLICKHOUSE_PORT_HTTP_PROTO } : / / $ address : $ port / " <nl> <nl> - $ { CLICKHOUSE_CURL } - sS - - data - binary " @ $ { CLICKHOUSE_TMP } / test_squashing_block_without_column . out " " $ { CLICKHOUSE_URL_PARAMS } & query = insert % 20into % 20squashed_numbers % 20format % 20Native " <nl> + $ { CLICKHOUSE_CURL } - sS - - data - binary " @ $ { CLICKHOUSE_TMP } / test_squashing_block_without_column . out " " $ { CLICKHOUSE_URL } & query = insert % 20into % 20squashed_numbers % 20format % 20Native " <nl> <nl> $ { CLICKHOUSE_CLIENT } - - query " select ' Still alive ' " <nl> mmm a / dbms / tests / queries / 0_stateless / 00728_json_each_row_parsing . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00728_json_each_row_parsing . sh <nl> cur_name = $ { BASH_SOURCE [ 0 ] } <nl> $ { CLICKHOUSE_CLIENT } - - query = " DROP TABLE IF EXISTS json_parse ; " <nl> $ { CLICKHOUSE_CLIENT } - - query = " CREATE TABLE json_parse ( aaa String , bbb String ) ENGINE = Memory ; " <nl> <nl> - for n in { 1 . . 1000000 } ; do echo ' { " aaa " : " aaa " , " bbb " : " bbb " } ' ; done | curl - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT % 20INTO % 20json_parse % 20FORMAT % 20JSONEachRow " - 0 - - data - binary @ - <nl> + for n in { 1 . . 
1000000 } ; do echo ' { " aaa " : " aaa " , " bbb " : " bbb " } ' ; done | curl - sS " $ { CLICKHOUSE_URL } & query = INSERT % 20INTO % 20json_parse % 20FORMAT % 20JSONEachRow " - 0 - - data - binary @ - <nl> <nl> $ { CLICKHOUSE_CLIENT } - - query = " SELECT count ( ) FROM json_parse ; " <nl> $ { CLICKHOUSE_CLIENT } - - query = " DROP TABLE json_parse ; " <nl> mmm a / dbms / tests / queries / 0_stateless / 00834_cancel_http_readonly_queries_on_client_close . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00834_cancel_http_readonly_queries_on_client_close . sh <nl> <nl> CURDIR = $ ( cd " $ ( dirname " $ { BASH_SOURCE [ 0 ] } " ) " & & pwd ) <nl> . $ CURDIR / . . / shell_config . sh <nl> <nl> - $ { CLICKHOUSE_CURL } - - max - time 1 - sS " $ { CLICKHOUSE_URL_PARAMS } & query_id = cancel_http_readonly_queries_on_client_close & cancel_http_readonly_queries_on_client_close = 1 & query = SELECT + count ( ) + FROM + system . numbers " 2 > & 1 | grep - cF ' curl : ( 28 ) ' <nl> + $ { CLICKHOUSE_CURL } - - max - time 1 - sS " $ { CLICKHOUSE_URL } & query_id = cancel_http_readonly_queries_on_client_close & cancel_http_readonly_queries_on_client_close = 1 & query = SELECT + count ( ) + FROM + system . numbers " 2 > & 1 | grep - cF ' curl : ( 28 ) ' <nl> <nl> for i in { 1 . . 10 } <nl> do <nl> - $ { CLICKHOUSE_CURL } - sS - - data " SELECT count ( ) FROM system . processes WHERE query_id = ' cancel_http_readonly_queries_on_client_close ' " " $ { CLICKHOUSE_URL_PARAMS } " | grep ' 0 ' & & break <nl> + $ { CLICKHOUSE_CURL } - sS - - data " SELECT count ( ) FROM system . processes WHERE query_id = ' cancel_http_readonly_queries_on_client_close ' " " $ { CLICKHOUSE_URL } " | grep ' 0 ' & & break <nl> sleep 0 . 2 <nl> done <nl> mmm a / dbms / tests / queries / 0_stateless / 00851_http_insert_json_defaults . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00851_http_insert_json_defaults . sh <nl> CURDIR = $ ( cd " $ ( dirname " $ { BASH_SOURCE [ 0 ] } " ) " & & pwd ) <nl> $ CLICKHOUSE_CLIENT - - query = " DROP TABLE IF EXISTS defaults " <nl> $ CLICKHOUSE_CLIENT - - query = " CREATE TABLE defaults ( x UInt32 , y UInt32 , a DEFAULT x + y , b Float32 DEFAULT round ( log ( 1 + x + y ) , 5 ) , c UInt32 DEFAULT 42 , e MATERIALIZED x + y , f ALIAS x + y ) ENGINE = Memory " <nl> <nl> - echo - ne ' { " x " : 1 , " y " : 1 } \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT % 20INTO % 20defaults % 20FORMAT % 20JSONEachRow % 20SETTINGS % 20input_format_defaults_for_omitted_fields = 1 " - - data - binary @ - <nl> - echo - ne ' { " x " : 2 , " y " : 2 , " c " : 2 } \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT + INTO + defaults + FORMAT + JSONEachRow + SETTINGS + input_format_defaults_for_omitted_fields = 1 " - - data - binary @ - <nl> + echo - ne ' { " x " : 1 , " y " : 1 } \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = INSERT % 20INTO % 20defaults % 20FORMAT % 20JSONEachRow % 20SETTINGS % 20input_format_defaults_for_omitted_fields = 1 " - - data - binary @ - <nl> + echo - ne ' { " x " : 2 , " y " : 2 , " c " : 2 } \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = INSERT + INTO + defaults + FORMAT + JSONEachRow + SETTINGS + input_format_defaults_for_omitted_fields = 1 " - - data - binary @ - <nl> echo - ne ' { " x " : 3 , " y " : 3 , " a " : 3 , " b " : 3 , " c " : 3 } \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } ? 
database = $ { CLICKHOUSE_DATABASE } & query = INSERT + INTO + defaults + FORMAT + JSONEachRow + SETTINGS + input_format_defaults_for_omitted_fields = 1 " - - data - binary @ - <nl> echo - ne ' { " x " : 4 } { " y " : 5 , " c " : 5 } { " a " : 6 , " b " : 6 , " c " : 6 } \ n ' | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } ? database = $ { CLICKHOUSE_DATABASE } & query = INSERT + INTO + defaults + FORMAT + JSONEachRow + SETTINGS + input_format_defaults_for_omitted_fields = 1 " - - data - binary @ - <nl> <nl> mmm a / dbms / tests / queries / 0_stateless / 00952_input_function . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 00952_input_function . sh <nl> $ { CLICKHOUSE_CLIENT } - - query = " SELECT * FROM input_function_table_1 FORMAT CSV " <nl> <nl> $ { CLICKHOUSE_CLIENT } - - query = " DROP TABLE IF EXISTS input_function_table_2 " <nl> $ { CLICKHOUSE_CLIENT } - - query = " CREATE TABLE input_function_table_2 ( a String , b Date , c Int32 , d Int16 ) ENGINE = Memory ( ) " <nl> - cat $ { CLICKHOUSE_TMP } / data_for_input_function . csv | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT % 20INTO % 20input_function_table_2 % 20 % 28a % 2C % 20b % 2C % 20c % 29 % 20SELECT % 20a % 2C % 20b % 2C % 20c % 2Ac % 20FROM % 20input % 28 % 27a % 20String % 2C % 20b % 20Int32 % 2C % 20c % 20Int32 % 27 % 29 % 20FORMAT % 20CSV " - - data - binary @ - <nl> + cat $ { CLICKHOUSE_TMP } / data_for_input_function . csv | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = INSERT % 20INTO % 20input_function_table_2 % 20 % 28a % 2C % 20b % 2C % 20c % 29 % 20SELECT % 20a % 2C % 20b % 2C % 20c % 2Ac % 20FROM % 20input % 28 % 27a % 20String % 2C % 20b % 20Int32 % 2C % 20c % 20Int32 % 27 % 29 % 20FORMAT % 20CSV " - - data - binary @ - <nl> $ { CLICKHOUSE_CLIENT } - - query = " SELECT * FROM input_function_table_2 FORMAT CSV " <nl> <nl> $ { CLICKHOUSE_CLIENT } - - query = " DROP TABLE IF EXISTS input_function_table_3 " <nl> $ { CLICKHOUSE_CLIENT } - - query = " SELECT * FROM input_function_table_3 FORMAT CSV " <nl> <nl> $ { CLICKHOUSE_CLIENT } - - query = " DROP TABLE IF EXISTS input_function_table_4 " <nl> $ { CLICKHOUSE_CLIENT } - - query = " CREATE TABLE input_function_table_4 ( a String , b Date , c Int32 , d Int16 ) ENGINE = Memory ( ) " <nl> - cat $ { CLICKHOUSE_TMP } / data_for_input_function . csv | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT % 20INTO % 20input_function_table_4 % 20 % 28a % 2C % 20b % 2C % 20c % 29 % 20SELECT % 20 % 2A % 20FROM % 20 % 28SELECT % 20s % 2C % 20b % 2C % 20c % 2Ac % 20FROM % 20input % 28 % 27s % 20String % 2C % 20b % 20Int32 % 2C % 20c % 20Int32 % 27 % 29 % 20JOIN % 20input_function_table_1 % 20ON % 20s % 3Dinput_function_table_1 . a % 29 % 20FORMAT % 20CSV " - - data - binary @ - <nl> + cat $ { CLICKHOUSE_TMP } / data_for_input_function . csv | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = INSERT % 20INTO % 20input_function_table_4 % 20 % 28a % 2C % 20b % 2C % 20c % 29 % 20SELECT % 20 % 2A % 20FROM % 20 % 28SELECT % 20s % 2C % 20b % 2C % 20c % 2Ac % 20FROM % 20input % 28 % 27s % 20String % 2C % 20b % 20Int32 % 2C % 20c % 20Int32 % 27 % 29 % 20JOIN % 20input_function_table_1 % 20ON % 20s % 3Dinput_function_table_1 . 
a % 29 % 20FORMAT % 20CSV " - - data - binary @ - <nl> $ { CLICKHOUSE_CLIENT } - - query = " SELECT * FROM input_function_table_4 FORMAT CSV " <nl> <nl> <nl> $ { CLICKHOUSE_CLIENT } - - query = " SELECT count ( ) FROM input_function_table_5 FORMAT <nl> <nl> $ { CLICKHOUSE_CLIENT } - - query = " DROP TABLE IF EXISTS input_function_table_6 " <nl> $ { CLICKHOUSE_CLIENT } - - query = " CREATE TABLE input_function_table_6 ( a String , b Date , c Int32 , d Int16 ) ENGINE = Memory ( ) " <nl> - cat $ { CLICKHOUSE_TMP } / data_for_input_function . csv | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT % 20INTO % 20input_function_table_6 % 20 % 28a % 2C % 20b % 2C % 20c % 29 % 20SELECT % 20a % 2C % 20b % 2C % 20c % 2Ac % 20FROM % 20input % 28 % 27a % 20String % 2C % 20b % 20Int32 % 2C % 20c % 20Int32 % 27 % 29 % 20FORMAT % 20CSV & max_block_size = 1000 " - - data - binary @ - <nl> + cat $ { CLICKHOUSE_TMP } / data_for_input_function . csv | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = INSERT % 20INTO % 20input_function_table_6 % 20 % 28a % 2C % 20b % 2C % 20c % 29 % 20SELECT % 20a % 2C % 20b % 2C % 20c % 2Ac % 20FROM % 20input % 28 % 27a % 20String % 2C % 20b % 20Int32 % 2C % 20c % 20Int32 % 27 % 29 % 20FORMAT % 20CSV & max_block_size = 1000 " - - data - binary @ - <nl> $ { CLICKHOUSE_CLIENT } - - query = " SELECT count ( ) FROM input_function_table_6 FORMAT CSV " <nl> <nl> <nl> mmm a / dbms / tests / queries / 0_stateless / 01010_low_cardinality_and_native_http . sh <nl> ppp b / dbms / tests / queries / 0_stateless / 01010_low_cardinality_and_native_http . sh <nl> $ CLICKHOUSE_CLIENT - - query = " create table tab_str ( x String ) engine = MergeTree o <nl> $ CLICKHOUSE_CLIENT - - query = " create table tab_str_lc ( x LowCardinality ( String ) ) engine = MergeTree order by tuple ( ) " ; <nl> $ CLICKHOUSE_CLIENT - - query = " insert into tab_str values ( ' abc ' ) " ; <nl> <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = select + x + from + tab_str + format + Native " | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT + INTO + tab_str_lc + FORMAT + Native " - - data - binary @ - <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = select + x + from + tab_str + format + Native " | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = INSERT + INTO + tab_str_lc + FORMAT + Native " - - data - binary @ - <nl> <nl> $ CLICKHOUSE_CLIENT - - query = " select x from tab_str_lc " ; <nl> <nl> - $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = select + x + from + tab_str_lc + format + Native " | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL_PARAMS } & query = INSERT + INTO + tab_str + FORMAT + Native " - - data - binary @ - <nl> + $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = select + x + from + tab_str_lc + format + Native " | $ { CLICKHOUSE_CURL } - sS " $ { CLICKHOUSE_URL } & query = INSERT + INTO + tab_str + FORMAT + Native " - - data - binary @ - <nl> <nl> $ CLICKHOUSE_CLIENT - - query = " select ' mmm - ' " ; <nl> $ CLICKHOUSE_CLIENT - - query = " select x from tab_str " ; <nl> mmm a / dbms / tests / queries / shell_config . sh <nl> ppp b / dbms / tests / queries / shell_config . 
sh <nl> export CLICKHOUSE_PORT_HTTPS = $ { CLICKHOUSE_PORT_HTTPS : = " 8443 " } <nl> export CLICKHOUSE_PORT_HTTP_PROTO = $ { CLICKHOUSE_PORT_HTTP_PROTO : = " http " } <nl> <nl> # Add database to url params <nl> - if [ - z " $ CLICKHOUSE_URL_PARAMS " ] <nl> + if [ - n " $ { CLICKHOUSE_URL_PARAMS } " ] <nl> then <nl> - export CLICKHOUSE_URL_PARAMS = $ { CLICKHOUSE_URL_PARAMS : = " $ { CLICKHOUSE_URL_PARAMS } & database = $ { CLICKHOUSE_DATABASE } " } <nl> + export CLICKHOUSE_URL_PARAMS = " $ { CLICKHOUSE_URL_PARAMS } & database = $ { CLICKHOUSE_DATABASE } " <nl> else <nl> - export CLICKHOUSE_URL_PARAMS = $ { CLICKHOUSE_URL_PARAMS : = " database = $ { CLICKHOUSE_DATABASE } " } <nl> + export CLICKHOUSE_URL_PARAMS = " database = $ { CLICKHOUSE_DATABASE } " <nl> fi <nl> <nl> export CLICKHOUSE_URL = $ { CLICKHOUSE_URL : = " $ { CLICKHOUSE_PORT_HTTP_PROTO } : / / $ { CLICKHOUSE_HOST } : $ { CLICKHOUSE_PORT_HTTP } / " } <nl> export CLICKHOUSE_URL_HTTPS = $ { CLICKHOUSE_URL_HTTPS : = " https : / / $ { CLICKHOUSE_HOST } : $ { CLICKHOUSE_PORT_HTTPS } / " } <nl> <nl> # Add url params to url <nl> - if [ - z " $ CLICKHOUSE_URL_PARAMS " ] <nl> + if [ - n " $ { CLICKHOUSE_URL_PARAMS } " ] <nl> then <nl> - export CLICKHOUSE_URL = $ { CLICKHOUSE_URL : = " $ { CLICKHOUSE_URL } ? $ { $ CLICKHOUSE_URL_PARAMS } " } <nl> - export CLICKHOUSE_URL_HTTPS = $ { CLICKHOUSE_URL_HTTPS : = " $ { CLICKHOUSE_URL_HTTPS } ? $ { $ CLICKHOUSE_URL_PARAMS } " } <nl> + export CLICKHOUSE_URL = " $ { CLICKHOUSE_URL } ? $ { CLICKHOUSE_URL_PARAMS } " <nl> + export CLICKHOUSE_URL_HTTPS = " $ { CLICKHOUSE_URL_HTTPS } ? $ { CLICKHOUSE_URL_PARAMS } " <nl> fi <nl> <nl> export CLICKHOUSE_PORT_INTERSERVER = $ { CLICKHOUSE_PORT_INTERSERVER : = ` $ { CLICKHOUSE_EXTRACT_CONFIG } - - try - - key = interserver_http_port 2 > / dev / null ` } 2 > / dev / null <nl>
Update url params in shell_config and tests.
ClickHouse/ClickHouse
10366f79ae180274303e6778029754a3942aeed1
2019-10-11T13:34:26Z
mmm a / lib / AST / ASTWalker . cpp <nl> ppp b / lib / AST / ASTWalker . cpp <nl> class Traversal : public ASTVisitor < Traversal , Expr * , Stmt * , <nl> Expr * sub = doIt ( objcStringLiteral ) ; <nl> if ( ! sub ) return nullptr ; <nl> E - > setObjCStringLiteralExpr ( sub ) ; <nl> - return E ; <nl> } <nl> <nl> auto components = E - > getComponents ( ) ; <nl> new file mode 100644 <nl> index 000000000000 . . 2405032541af <nl> mmm / dev / null <nl> ppp b / test / Index / index_keypaths . swift <nl> <nl> + / / RUN : % target - swift - ide - test - print - indexed - symbols - source - filename % s | % FileCheck % s <nl> + / / REQUIRES : objc_interop <nl> + <nl> + struct MyStruct { <nl> + struct Inner { <nl> + let myProp = 1 <nl> + } <nl> + } <nl> + <nl> + class MyClass { <nl> + class Inner { <nl> + @ objc var myProp = 1 <nl> + } <nl> + } <nl> + <nl> + let a = \ MyStruct . Inner . myProp <nl> + / / CHECK : [ [ @ LINE - 1 ] ] : 25 | { { . * } } | myProp <nl> + / / CHECK : [ [ @ LINE - 2 ] ] : 10 | { { . * } } | MyStruct <nl> + / / CHECK : [ [ @ LINE - 3 ] ] : 19 | { { . * } } | Inner <nl> + let b : KeyPath < MyStruct . Inner , Int > = \ . myProp <nl> + / / CHECK : [ [ @ LINE - 1 ] ] : 41 | { { . * } } | myProp <nl> + let c = \ MyClass . Inner . myProp <nl> + / / CHECK : [ [ @ LINE - 1 ] ] : 24 | { { . * } } | myProp <nl> + / / CHECK : [ [ @ LINE - 2 ] ] : 10 | { { . * } } | MyClass <nl> + / / CHECK : [ [ @ LINE - 3 ] ] : 18 | { { . * } } | Inner <nl> + let d : KeyPath < MyClass . Inner , Int > = \ . myProp <nl> + / / CHECK : [ [ @ LINE - 1 ] ] : 40 | { { . * } } | myProp <nl>
Indexing Swift @objc key paths
apple/swift
32c593553dad3d389afb9e3bbd45ae89acf285bf
2018-10-18T20:49:24Z
diff - - git a / Dynamic Programming / Fibonacci_Bottom_Up . cpp b / Dynamic Programming / Fibonacci_Bottom_Up . cpp <nl> new file mode 100644 <nl> index 0000000000 . . cbd0912e8f <nl> mmm / dev / null <nl> ppp b / Dynamic Programming / Fibonacci_Bottom_Up . cpp <nl> <nl> + # include < bits / stdc + + . h > <nl> + using namespace std ; <nl> + int fib ( int n ) { <nl> + int res [ n + 1 ] ; <nl> + res [ 0 ] = 0 ; res [ 1 ] = 1 ; <nl> + for ( int i = 2 ; i < = n ; i + + ) { <nl> + res [ i ] = res [ i - 1 ] + res [ i - 2 ] ; <nl> + } <nl> + return res [ n ] ; <nl> + } <nl> + int main ( int argc , char const * argv [ ] ) <nl> + { <nl> + int n ; <nl> + cout < < " Enter n : " ; <nl> + cin > > n ; <nl> + cout < < " Fibonacci number is " ; <nl> + cout < < fib ( n ) < < endl ; <nl> + return 0 ; <nl> + } <nl> \ No newline at end of file <nl> diff - - git a / Dynamic Programming / Fibonacci_Top_Down . cpp b / Dynamic Programming / Fibonacci_Top_Down . cpp <nl> new file mode 100644 <nl> index 0000000000 . . 6578818929 <nl> mmm / dev / null <nl> ppp b / Dynamic Programming / Fibonacci_Top_Down . cpp <nl> <nl> + # include < bits / stdc + + . h > <nl> + using namespace std ; <nl> + int arr [ 1000000 ] ; <nl> + int fib ( int n ) { <nl> + if ( arr [ n ] = = - 1 ) { <nl> + if ( n < = 1 ) <nl> + arr [ n ] = n ; <nl> + else <nl> + arr [ n ] = fib ( n - 1 ) + fib ( n - 2 ) ; <nl> + } <nl> + return arr [ n ] ; <nl> + } <nl> + int main ( int argc , char const * argv [ ] ) <nl> + { <nl> + int n ; <nl> + cout < < " Enter n : " ; <nl> + cin > > n ; <nl> + for ( int i = 0 ; i < n + 1 ; + + i ) <nl> + { <nl> + arr [ i ] = - 1 ; <nl> + } <nl> + cout < < " Fibonacci number is " < < fib ( n ) < < endl ; <nl> + return 0 ; <nl> + } <nl> \ No newline at end of file <nl>
Add DP implementations
TheAlgorithms/C-Plus-Plus
4465288f60cc8fea1caf62f7d2ad1c17aee6bd16
2016-11-25T13:06:34Z
mmm a / include / swift / Driver / Options . td <nl> ppp b / include / swift / Driver / Options . td <nl> def O : Joined < [ " - " ] , " O " > , Group < O_Group > , Flags < [ FrontendOption ] > ; <nl> def Ofast : Flag < [ " - " ] , " Ofast " > , Group < O_Group > , Flags < [ FrontendOption ] > ; <nl> <nl> / / Assert configuration identifiers . <nl> - <nl> - def AssertConfig_Group : OptionGroup < " < Assert configuration > " > ; <nl> - <nl> - def AssertConfig : JoinedOrSeparate < [ " - " ] , " AssertConfig " > , <nl> - Group < AssertConfig_Group > , Flags < [ FrontendOption ] > ; <nl> - def AssertConfigEq : Joined < [ " - " ] , " AssertConfig = " > , <nl> - Group < AssertConfig_Group > , Alias < AssertConfig > , Flags < [ FrontendOption ] > ; <nl> + def AssertConfig : JoinedOrSeparate < [ " - " ] , " AssertConfig " > , <nl> + Flags < [ FrontendOption ] > , <nl> + HelpText < " Specify the assert_configuration replacement . " <nl> + " Possible values are Debug , Release , Replacement . " > ; <nl> + def AssertConfig_EQ : Joined < [ " - " ] , " AssertConfig = " > , <nl> + Flags < [ FrontendOption ] > , Alias < AssertConfig > ; <nl> <nl> / / File types <nl> <nl> mmm a / lib / Driver / Tools . cpp <nl> ppp b / lib / Driver / Tools . cpp <nl> static void addCommonFrontendArgs ( const ToolChain & TC , <nl> inputArgs . AddAllArgs ( arguments , options : : OPT_I ) ; <nl> inputArgs . AddAllArgs ( arguments , options : : OPT_F ) ; <nl> inputArgs . AddLastArg ( arguments , options : : OPT_nostdimport ) ; <nl> + inputArgs . AddLastArg ( arguments , options : : OPT_AssertConfig ) ; <nl> <nl> inputArgs . AddLastArg ( arguments , options : : OPT_g ) ; <nl> inputArgs . AddLastArg ( arguments , options : : OPT_resource_dir ) ; <nl> mmm a / lib / Frontend / CompilerInvocation . cpp <nl> ppp b / lib / Frontend / CompilerInvocation . cpp <nl> static bool ParseSILArgs ( SILOptions & Opts , ArgList & Args , <nl> } <nl> <nl> / / Parse the build configuration identifier . <nl> - if ( const Arg * A = Args . getLastArg ( OPT_AssertConfig_Group ) ) { <nl> + if ( const Arg * A = Args . getLastArg ( OPT_AssertConfig ) ) { <nl> / / We currently understand build configuration up to 3 of which we only use <nl> / / 0 and 1 in the standard library . <nl> StringRef Configuration = A - > getValue ( ) ; <nl>
Fix the driver's handling of the AssertConfig option
apple/swift
3ba7225aa3944bae9d7439c62ed2994eb38626fe
2014-05-05T18:21:57Z
new file mode 100644 <nl> index 00000000000 . . e69de29bb2d <nl> new file mode 100644 <nl> index 00000000000 . . a6e80ce2b08 <nl> mmm / dev / null <nl> ppp b / dbms / tests / integration / test_inconsistent_parts_after_clone_replica / configs / remote_servers . xml <nl> <nl> + < yandex > <nl> + < remote_servers > <nl> + < test_cluster > <nl> + < shard > <nl> + < internal_replication > true < / internal_replication > <nl> + < replica > <nl> + < default_database > shard_0 < / default_database > <nl> + < host > node1 < / host > <nl> + < port > 9000 < / port > <nl> + < / replica > <nl> + < replica > <nl> + < default_database > shard_0 < / default_database > <nl> + < host > node2 < / host > <nl> + < port > 9000 < / port > <nl> + < / replica > <nl> + < / shard > <nl> + < / test_cluster > <nl> + < / remote_servers > <nl> + < / yandex > <nl> new file mode 100644 <nl> index 00000000000 . . c1513798189 <nl> mmm / dev / null <nl> ppp b / dbms / tests / integration / test_inconsistent_parts_after_clone_replica / test . py <nl> <nl> + import pytest <nl> + <nl> + from helpers . cluster import ClickHouseCluster <nl> + from helpers . network import PartitionManager <nl> + from helpers . test_tools import assert_eq_with_retry <nl> + <nl> + <nl> + def fill_nodes ( nodes , shard ) : <nl> + for node in nodes : <nl> + node . query ( <nl> + ' ' ' <nl> + CREATE DATABASE test ; <nl> + CREATE TABLE test_table ( date Date , id UInt32 ) <nl> + ENGINE = ReplicatedMergeTree ( ' / clickhouse / tables / test { shard } / replicated ' , ' { replica } ' ) <nl> + ORDER BY id PARTITION BY toYYYYMM ( date ) <nl> + SETTINGS min_replicated_logs_to_keep = 3 , max_replicated_logs_to_keep = 5 , cleanup_delay_period = 0 , cleanup_delay_period_random_add = 0 ; <nl> + ' ' ' . format ( shard = shard , replica = node . name ) ) <nl> + <nl> + <nl> + cluster = ClickHouseCluster ( __file__ ) <nl> + node1 = cluster . add_instance ( ' node1 ' , main_configs = [ ' configs / remote_servers . xml ' ] , with_zookeeper = True ) <nl> + node2 = cluster . add_instance ( ' node2 ' , main_configs = [ ' configs / remote_servers . xml ' ] , with_zookeeper = True ) <nl> + <nl> + @ pytest . fixture ( scope = " module " ) <nl> + def start_cluster ( ) : <nl> + try : <nl> + cluster . start ( ) <nl> + fill_nodes ( [ node1 , node2 ] , 1 ) <nl> + yield cluster <nl> + except Exception as ex : <nl> + print ex <nl> + finally : <nl> + cluster . shutdown ( ) <nl> + <nl> + <nl> + def test_inconsistent_parts_if_drop_while_replica_not_active ( start_cluster ) : <nl> + with PartitionManager ( ) as pm : <nl> + # insert into all replicas <nl> + node1 . query ( " INSERT INTO test_table VALUES ( ' 2019 - 08 - 16 ' , 100 ) " ) <nl> + assert_eq_with_retry ( node2 , " SELECT count ( * ) FROM test_table " , node1 . query ( " SELECT count ( * ) FROM test_table " ) ) <nl> + <nl> + # disable network on the first replica <nl> + pm . partition_instances ( node1 , node2 ) <nl> + pm . drop_instance_zk_connections ( node1 ) <nl> + <nl> + # drop all parts on the second replica <nl> + node2 . query_with_retry ( " ALTER TABLE test_table DROP PARTITION 201908 " ) <nl> + assert_eq_with_retry ( node2 , " SELECT count ( * ) FROM test_table " , " 0 " ) <nl> + <nl> + # insert into the second replica <nl> + # DROP_RANGE will be removed from the replication log and the first replica will be lost <nl> + for i in range ( 100 ) : <nl> + node2 . query ( " INSERT INTO test_table VALUES ( ' 2019 - 08 - 16 ' , { } ) " . 
format ( i ) ) <nl> + <nl> + # the first replica will be cloned from the second <nl> + pm . heal_all ( ) <nl> + assert_eq_with_retry ( node1 , " SELECT count ( * ) FROM test_table " , node2 . query ( " SELECT count ( * ) FROM test_table " ) ) <nl> + <nl> + <nl>
add test
ClickHouse/ClickHouse
8bbcecf3b1c58aa3220d36ee09ea8ac38ab7e872
2019-08-16T16:15:36Z
mmm a / test / core / http / BUILD <nl> ppp b / test / core / http / BUILD <nl> <nl> # See the License for the specific language governing permissions and <nl> # limitations under the License . <nl> <nl> + load ( " / / bazel : grpc_build_system . bzl " , " grpc_cc_library " , " grpc_cc_test " , " grpc_cc_binary " ) <nl> + <nl> + licenses ( [ " notice " ] ) # Apache v2 <nl> + <nl> + load ( " / / test / core / util : grpc_fuzzer . bzl " , " grpc_fuzzer " ) <nl> + <nl> + grpc_fuzzer ( <nl> + name = " response_fuzzer " , <nl> + srcs = [ " response_fuzzer . c " ] , <nl> + language = " C " , <nl> + corpus = " response_corpus " , <nl> + deps = [ <nl> + " / / : gpr " , <nl> + " / / : grpc " , <nl> + " / / test / core / util : grpc_test_util " , <nl> + ] , <nl> + ) <nl> + <nl> + grpc_fuzzer ( <nl> + name = " request_fuzzer " , <nl> + srcs = [ " request_fuzzer . c " ] , <nl> + language = " C " , <nl> + corpus = " request_corpus " , <nl> + deps = [ <nl> + " / / : gpr " , <nl> + " / / : grpc " , <nl> + " / / test / core / util : grpc_test_util " , <nl> + ] , <nl> + ) <nl> + <nl> + # Copyright 2017 gRPC authors . <nl> + # <nl> + # Licensed under the Apache License , Version 2 . 0 ( the " License " ) ; <nl> + # you may not use this file except in compliance with the License . <nl> + # You may obtain a copy of the License at <nl> + # <nl> + # http : / / www . apache . org / licenses / LICENSE - 2 . 0 <nl> + # <nl> + # Unless required by applicable law or agreed to in writing , software <nl> + # distributed under the License is distributed on an " AS IS " BASIS , <nl> + # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND , either express or implied . <nl> + # See the License for the specific language governing permissions and <nl> + # limitations under the License . <nl> + <nl> licenses ( [ " notice " ] ) # Apache v2 <nl> <nl> load ( " / / test / core / util : grpc_fuzzer . bzl " , " grpc_fuzzer " ) <nl>
fix test/core/http/BUILD
grpc/grpc
0bf41fc88de57fc2064dea3055ba845dfc67952b
2017-06-08T09:29:25Z
mmm a / DEPS <nl> ppp b / DEPS <nl> vars = { <nl> <nl> deps = { <nl> " v8 / build " : <nl> - Var ( " chromium_url " ) + " / chromium / src / build . git " + " @ " + " 7321edc3e835447b4ec75732ab683fb70ad1a5fd " , <nl> + Var ( " chromium_url " ) + " / chromium / src / build . git " + " @ " + " 52f7afeca991d96d68cf0507e20dbdd5b845691f " , <nl> " v8 / tools / gyp " : <nl> Var ( " chromium_url " ) + " / external / gyp . git " + " @ " + " e7079f0e0e14108ab0dba58728ff219637458563 " , <nl> " v8 / third_party / icu " : <nl> deps = { <nl> " v8 / test / test262 / harness " : <nl> Var ( " chromium_url " ) + " / external / github . com / test262 - utils / test262 - harness - py . git " + " @ " + " cbd968f54f7a95c6556d53ba852292a4c49d11d8 " , <nl> " v8 / tools / clang " : <nl> - Var ( " chromium_url " ) + " / chromium / src / tools / clang . git " + " @ " + " 53bdedc7a79ddb513dddeb3aea99935893c2143d " , <nl> + Var ( " chromium_url " ) + " / chromium / src / tools / clang . git " + " @ " + " 7e1360625bed71368431620ef003c75fb60fde96 " , <nl> } <nl> <nl> deps_os = { <nl> deps_os = { <nl> " v8 / third_party / android_tools " : <nl> Var ( " chromium_url " ) + " / android_tools . git " + " @ " + " b43a6a289a7588b1769814f04dd6c7d7176974cc " , <nl> " v8 / third_party / catapult " : <nl> - Var ( ' chromium_url ' ) + " / external / github . com / catapult - project / catapult . git " + " @ " + " 19565fdb148afc5fc752516f395f715f5d27c1f1 " , <nl> + Var ( ' chromium_url ' ) + " / external / github . com / catapult - project / catapult . git " + " @ " + " c69690acc34b8be0d85596bc5ad40fce7502817a " , <nl> } , <nl> " win " : { <nl> " v8 / third_party / cygwin " : <nl>
Update V8 DEPS.
v8/v8
6e232fde8348373a3e6afcb59f77c5b6ff2aa8ff
2016-12-13T04:35:53Z
--- a/examples/server/eof_server.php
+++ b/examples/server/eof_server.php

    'package_eof' => "\r\n\r\n",
    'open_eof_check' => true,
    'worker_num' => 4,
-   'dispatch_mode' => 1,
+   'dispatch_mode' => 2,
    'package_max_length' => 1024 * 1024 * 2, // 2M
));
// $serv->on('connect', function ($serv, $fd) {
--- a/include/Server.h
+++ b/include/Server.h
int swFactoryProcess_create(swFactory *factory, int writer_num, int worker_num);
int swFactoryProcess_start(swFactory *factory);
int swFactoryProcess_shutdown(swFactory *factory);
int swFactoryProcess_end(swFactory *factory, swDataHead *event);
-int swFactoryProcess_worker_excute(swFactory *factory, swEventData *task);

int swFactoryThread_create(swFactory *factory, int writer_num);
int swFactoryThread_start(swFactory *factory);
--- a/include/swoole.h
+++ b/include/swoole.h
typedef struct
     */
    uint16_t pipe_round;

+   /**
+    * pipe_worker
+    */
+   int pipe_fd;
+
    swString **buffer_input;

} swWorkerG;
--- a/src/factory/FactoryProcess.c
+++ b/src/factory/FactoryProcess.c
int swFactoryProcess_start(swFactory *factory)
{
    return SW_OK;
}

-int swFactoryProcess_worker_excute(swFactory *factory, swEventData *task)
+static sw_inline int swFactoryProcess_worker_excute(swFactory *factory, swEventData *task)
{
    swServer *serv = factory->ptr;
    swString *package = NULL;
static int swFactoryProcess_worker_onPipeReceive(swReactor *reactor, swEvent *ev
    swEventData task;
    swServer *serv = reactor->ptr;
    swFactory *factory = &serv->factory;
+   int ret;

+   read_from_pipe:
    if (read(event->fd, &task, sizeof(task)) > 0)
    {
-       return swFactoryProcess_worker_excute(factory, &task);
+       ret = swFactoryProcess_worker_excute(factory, &task);
+       if (task.info.type == SW_EVENT_PACKAGE_START)
+       {
+           goto read_from_pipe;
+       }
+       return ret;
    }
    return SW_ERR;
}
--- a/src/network/Buffer.c
+++ b/src/network/Buffer.c
int swBuffer_free(swBuffer *buffer)
 */
int swBuffer_append(swBuffer *buffer, void *data, uint32_t size)
{
-   swBuffer_trunk *trunk = swBuffer_new_trunk(buffer, SW_TRUNK_DATA, size);
-   if (trunk == NULL)
-   {
-       return SW_ERR;
-   }
+   swBuffer_trunk *trunk = swBuffer_new_trunk(buffer, SW_TRUNK_DATA, size);
+   if (trunk == NULL)
+   {
+       return SW_ERR;
+   }

-   buffer->length += size;
-   trunk->length = size;
+   buffer->length += size;
+   trunk->length = size;

-   memcpy(trunk->store.ptr, data, trunk->length);

+   memcpy(trunk->store.ptr, data, trunk->length);

-   swTraceLog(SW_TRACE_BUFFER, "trunk_n=%d|size=%d|trunk_len=%d|trunk=%p", buffer->trunk_num, size,
-       trunk->length, trunk);
+   swTraceLog(SW_TRACE_BUFFER, "trunk_n=%d|size=%d|trunk_len=%d|trunk=%p", buffer->trunk_num, size,
+       trunk->length, trunk);

-   return SW_OK;
+   return SW_OK;
}

/**
Optimize big package transmission
swoole/swoole-src
bb6e41b1fa5c1ee3cb07ec6ff20df7cf9e2350f9
2014-09-03T12:36:48Z
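The FactoryProcess.c hunk above is the heart of this commit: when the chunk just read from the worker pipe is a SW_EVENT_PACKAGE_START, the handler jumps back and keeps reading until the rest of the big package has been consumed. The following C++ sketch shows the same drain-the-pipe pattern in isolation; the chunk layout, handle_chunk() and drain_pipe() names are hypothetical stand-ins for swoole's swEventData and swFactoryProcess_worker_excute(), not the project's actual API.

    #include <unistd.h>

    // Hypothetical chunk layout; swoole's real swEventData carries more metadata.
    enum ChunkType { CHUNK_NORMAL, CHUNK_PACKAGE_START, CHUNK_PACKAGE_END };

    struct Chunk {
        ChunkType type;
        char data[8192];
    };

    // Stand-in for swFactoryProcess_worker_excute(): the real code buffers the
    // chunk and dispatches the request once the whole package has arrived.
    static int handle_chunk(const Chunk& chunk) {
        (void)chunk;
        return 0;
    }

    // Same idea as the goto-based read_from_pipe loop in the patch: after a
    // PACKAGE_START chunk, stay on the pipe until the package is drained.
    static int drain_pipe(int pipe_fd) {
        Chunk chunk;
        int ret = -1;
        do {
            if (read(pipe_fd, &chunk, sizeof(chunk)) <= 0)
                return -1;
            ret = handle_chunk(chunk);
        } while (chunk.type == CHUNK_PACKAGE_START);
        return ret;
    }

    int main() {
        int fds[2];
        if (pipe(fds) != 0)
            return 1;
        Chunk start{CHUNK_PACKAGE_START, {}};
        Chunk end{CHUNK_PACKAGE_END, {}};
        (void)write(fds[1], &start, sizeof(start));  // big package: start chunk...
        (void)write(fds[1], &end, sizeof(end));      // ...followed by its tail
        return drain_pipe(fds[0]) == 0 ? 0 : 1;      // both chunks handled in one call
    }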
--- a/dbms/src/DataStreams/AddingDefaultBlockOutputStream.cpp
+++ b/dbms/src/DataStreams/AddingDefaultBlockOutputStream.cpp
void AddingDefaultBlockOutputStream::write(const DB::Block & block)
{
    Block res = block;

-   /// Computes explicitly specified values (in column_defaults) by default.
-   /** @todo if somehow block does not contain values for implicitly-defaulted columns that are prerequisites
     *  for explicitly-defaulted ones, exception will be thrown during evaluating such columns
     *  (implicitly-defaulted columns are evaluated on the line after following one. */
-   evaluateMissingDefaults(res, required_columns, column_defaults, context);
-
    /// Adds not specified default values.
    /// @todo this may be moved before `evaluateMissingDefaults` with passing {required_columns - explicitly-defaulted columns}
    if (!only_explicit_column_defaults)
void AddingDefaultBlockOutputStream::write(const DB::Block & block)

    for (const auto & requested_column : required_columns)
    {
-       if (res.has(requested_column.name))
+       const auto it = column_defaults.find(requested_column.name);
+       if (res.has(requested_column.name) || it != column_defaults.end())
            continue;

        ColumnWithTypeAndName column_to_add;
void AddingDefaultBlockOutputStream::write(const DB::Block & block)
    }
}

+   /// Computes explicitly specified values (in column_defaults) by default.
+   /** @todo if somehow block does not contain values for implicitly-defaulted columns that are prerequisites
+    *  for explicitly-defaulted ones, exception will be thrown during evaluating such columns
+    *  (implicitly-defaulted columns are evaluated on the line after following one. */
+   evaluateMissingDefaults(res, required_columns, column_defaults, context);
+
    output->write(res);
}

new file mode 100644
index 00000000000..bc42226d38f
--- /dev/null
+++ b/dbms/tests/queries/0_stateless/00564_initial_column_values_with_default_expression.reference
+1  0  h264           2018-02-03  0-h264
+1  0  h264           2018-02-03  0-h264
+2  0  h264  CONTENT  2018-02-03  0-h264CONTENT
new file mode 100644
index 00000000000..8ac087c6dee
--- /dev/null
+++ b/dbms/tests/queries/0_stateless/00564_initial_column_values_with_default_expression.sql
+DROP TABLE IF EXISTS test.test;
+
+CREATE TABLE IF NOT EXISTS test.test (id UInt32, track UInt8, codec String, content String, rdate Date DEFAULT '2018-02-03', track_id String DEFAULT concat(concat(concat(toString(track), '-'), codec), content)) ENGINE = MergeTree(rdate, (id, track_id), 8192);
+
+INSERT INTO test.test (id, track, codec) VALUES (1, 0, 'h264');
+
+SELECT * FROM test.test ORDER BY id;
+
+INSERT INTO test.test (id, track, codec, content) VALUES (2, 0, 'h264', 'CONTENT');
+
+SELECT * FROM test.test ORDER BY id;
+
+DROP TABLE IF EXISTS test.test;
Merge pull request from zhang2014/fix/ISSUES-67
ClickHouse/ClickHouse
48a29d6474189247cec17225ab570f3f190a2a21
2018-02-06T18:41:43Z
--- a/.travis.yml
+++ b/.travis.yml
matrix:
      env: LLVM_VERSION=3.8.1
      compiler: clang

+   - os: linux
+     env: LLVM_VERSION=3.9.0
+     compiler: clang
+
    #####################
    # installation step #
    #####################
Clang 3.9.0 has been released
nlohmann/json
9639f0dfb3fc926903e3f8db990073725b6c9a06
2016-09-02T22:37:45Z
--- a/.gitignore
+++ b/.gitignore
d8
d8_g
shell
shell_g
+/_*
/build/Debug
/build/gyp
/build/ipch/
/build/Release
+/hydrogen.cfg
/obj
/out
+/perf.data
+/perf.data.old
/test/cctest/cctest.status2
/test/message/message.status2
/test/mjsunit/mjsunit.status2
shell_g
/tools/oom_dump/oom_dump.o
/tools/visual_studio/Debug
/tools/visual_studio/Release
+/v8.log.ll
/xcodebuild
TAGS
*.Makefile
GTAGS
GRTAGS
GSYMS
GPATH
-/_*
Add common artifacts to .gitignore.
v8/v8
d16ca488fa5e0021ac563602fb562299c020e281
2013-07-31T12:38:43Z
--- a/src/init.cpp
+++ b/src/init.cpp
bool AppInitParameterInteraction()

    // also see: InitParameterInteraction()

-   if (!fs::is_directory(GetBlocksDir(false))) {
+   if (!fs::is_directory(GetBlocksDir())) {
        return InitError(strprintf(_("Specified blocks directory \"%s\" does not exist."), gArgs.GetArg("-blocksdir", "").c_str()));
    }

--- a/src/util/system.cpp
+++ b/src/util/system.cpp
fs::path GetDefaultDataDir()
#endif
}

-static fs::path g_blocks_path_cached;
static fs::path g_blocks_path_cache_net_specific;
static fs::path pathCached;
static fs::path pathCachedNetSpecific;
static CCriticalSection csPathCached;

-const fs::path &GetBlocksDir(bool fNetSpecific)
+const fs::path &GetBlocksDir()
{

    LOCK(csPathCached);

-   fs::path &path = fNetSpecific ? g_blocks_path_cache_net_specific : g_blocks_path_cached;
+   fs::path &path = g_blocks_path_cache_net_specific;

    // This can be called during exceptions by LogPrintf(), so we cache the
    // value so we don't have to do memory allocations after that.
const fs::path &GetBlocksDir(bool fNetSpecific)
    } else {
        path = GetDataDir(false);
    }
-   if (fNetSpecific)
-       path /= BaseParams().DataDir();

+   path /= BaseParams().DataDir();
    path /= "blocks";
    fs::create_directories(path);
    return path;
void ClearDatadirCache()

    pathCached = fs::path();
    pathCachedNetSpecific = fs::path();
-   g_blocks_path_cached = fs::path();
    g_blocks_path_cache_net_specific = fs::path();
}

--- a/src/util/system.h
+++ b/src/util/system.h
void ReleaseDirectoryLocks();

bool TryCreateDirectories(const fs::path& p);
fs::path GetDefaultDataDir();
-const fs::path &GetBlocksDir(bool fNetSpecific = true);
+// The blocks directory is always net specific.
+const fs::path &GetBlocksDir();
const fs::path &GetDataDir(bool fNetSpecific = true);
void ClearDatadirCache();
fs::path GetConfigFile(const std::string &confPath);
--- a/test/functional/feature_blocksdir.py
+++ b/test/functional/feature_blocksdir.py
def set_test_params(self):

    def run_test(self):
        self.stop_node(0)
+       assert os.path.isdir(os.path.join(self.nodes[0].datadir, "regtest", "blocks"))
+       assert not os.path.isdir(os.path.join(self.nodes[0].datadir, "blocks"))
        shutil.rmtree(self.nodes[0].datadir)
        initialize_datadir(self.options.tmpdir, 0)
        self.log.info("Starting with nonexistent blocksdir...")
Merge: utils and libraries: Make 'blocksdir' always net specific
bitcoin/bitcoin
64ee94356fb4b3ceda57c68d40ce192fc62c209e
2019-01-16T12:40:27Z
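After this change GetBlocksDir() keeps a single cache entry because the directory it returns is always net specific. The sketch below is a deliberately simplified, hypothetical version of that caching pattern (std::filesystem and std::mutex instead of Bitcoin Core's fs wrapper and CCriticalSection, and a temp directory standing in for GetDataDir()/BaseParams().DataDir()); it illustrates the lazy, lock-protected path cache, not the project's actual implementation.

    #include <filesystem>
    #include <iostream>
    #include <mutex>

    namespace fs = std::filesystem;

    static fs::path g_blocks_path_cache_net_specific;
    static std::mutex g_path_mutex;

    const fs::path& GetBlocksDir() {
        std::lock_guard<std::mutex> lock(g_path_mutex);
        fs::path& path = g_blocks_path_cache_net_specific;
        if (!path.empty())
            return path;                               // cached on a previous call
        path = fs::temp_directory_path() / "datadir";  // stand-in for GetDataDir(false)
        path /= "regtest";                             // stand-in for BaseParams().DataDir()
        path /= "blocks";                              // always net specific now
        fs::create_directories(path);
        return path;
    }

    int main() {
        std::cout << GetBlocksDir() << "\n";  // e.g. "/tmp/datadir/regtest/blocks"
    }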
--- a/src/python/grpcio_tests/tests/unit/_logging_test.py
+++ b/src/python/grpcio_tests/tests/unit/_logging_test.py

import functools
import sys

+
def patch_stderr(f):
+
    @functools.wraps(f)
    def _impl(*args, **kwargs):
        old_stderr = sys.stderr
def _impl(*args, **kwargs):
            f(args, kwargs)
        finally:
            sys.stderr = old_stderr
+
    return _impl

+
class LoggingTest(unittest.TestCase):

    def test_logger_not_occupied(self):
def test_handler_found(self):
            reload_module(logging)
            logging.basicConfig()
            reload_module(grpc)
-           self.assertFalse("No handlers could be found" in sys.stderr.getvalue())
+           self.assertFalse(
+               "No handlers could be found" in sys.stderr.getvalue())
        finally:
            reload_module(logging)

+
if __name__ == '__main__':
    unittest.main(verbosity=2)
Format code
grpc/grpc
acc72c0835eb84a79e779ede7a0383bddd8feef5
2018-10-31T23:47:56Z
--- a/xbmc/filesystem/Directory.cpp
+++ b/xbmc/filesystem/Directory.cpp
bool CDirectory::Create(const CStdString& strPath)
  return false;
}

-bool CDirectory::Exists(const CStdString& strPath)
+bool CDirectory::Exists(const CStdString& strPath, bool bUseCache /* = true */)
{
  try
  {
+   if (bUseCache)
+   {
+     bool bPathInCache;
+     if (g_directoryCache.FileExists(strPath, bPathInCache))
+       return true;
+     if (bPathInCache)
+       return false;
+   }
    CStdString realPath = URIUtils::SubstitutePath(strPath);
    auto_ptr<IDirectory> pDirectory(CDirectoryFactory::Create(realPath));
    if (pDirectory.get())
--- a/xbmc/filesystem/Directory.h
+++ b/xbmc/filesystem/Directory.h
class CDirectory
                , bool allowThreads = false);

  static bool Create(const CStdString& strPath);
- static bool Exists(const CStdString& strPath);
+ static bool Exists(const CStdString& strPath, bool bUseCache = true);
  static bool Remove(const CStdString& strPath);

  /*! \brief Filter files that act like directories from the list, replacing them with their directory counterparts
--- a/xbmc/filesystem/DirectoryCache.cpp
+++ b/xbmc/filesystem/DirectoryCache.cpp
bool CDirectoryCache::FileExists(const CStdString& strFile, bool& bInCache)
  CSingleLock lock(m_cs);
  bInCache = false;

+ CStdString strPath(strFile);
+ URIUtils::RemoveSlashAtEnd(strPath);
  CStdString storedPath;
- URIUtils::GetDirectory(strFile, storedPath);
+ URIUtils::GetDirectory(strPath, storedPath);
  URIUtils::RemoveSlashAtEnd(storedPath);

  ciCache i = m_cache.find(storedPath);
Merge pull request from ulion/check_dir_exists_use_dircache
xbmc/xbmc
33a0d638ab30c5d3f01fa07a27971fd1addfe844
2013-04-08T07:07:41Z
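The point of this patch is to answer CDirectory::Exists() from the in-memory directory cache when possible and only fall back to the VFS backend on a miss. Below is a rough, self-contained C++ sketch of that cache-first lookup; the map-based cache and the std::filesystem fallback are hypothetical simplifications, not XBMC's g_directoryCache or IDirectory.

    #include <filesystem>
    #include <map>
    #include <set>
    #include <string>

    // Hypothetical stand-in for g_directoryCache: parent directory -> cached entries.
    std::map<std::string, std::set<std::string>> g_dirCache;

    // Mirrors the shape of CDirectoryCache::FileExists: report whether the parent
    // directory is cached at all, and if so whether the entry is present in it.
    bool CacheFileExists(const std::string& path, bool& inCache) {
        const std::string parent = path.substr(0, path.find_last_of('/'));
        const auto it = g_dirCache.find(parent);
        inCache = (it != g_dirCache.end());
        return inCache && it->second.count(path) != 0;
    }

    bool DirectoryExists(const std::string& path, bool useCache = true) {
        if (useCache) {
            bool inCache = false;
            if (CacheFileExists(path, inCache))
                return true;   // cache hit: the directory is known to exist
            if (inCache)
                return false;  // parent is cached but the entry is absent
        }
        // Cache miss: fall back to the slow path (the real code asks the VFS backend).
        return std::filesystem::is_directory(path);
    }

    int main() {
        g_dirCache["/media"] = {"/media/movies"};
        return DirectoryExists("/media/movies") ? 0 : 1;  // answered from the cache
    }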
--- a/tools/emterpretify.py
+++ b/tools/emterpretify.py
def post_process_code(code):
  asm.staticbump += 1
  stack_start = len(mem_init)
  asm.staticbump += EMT_STACK_MAX
+ while asm.staticbump % 8 != 0:
+   asm.staticbump += 1

  open(out_mem_file, 'wb').write(''.join(map(chr, mem_init)))

ensure the static bump is properly aligned in emterpreter
emscripten-core/emscripten
8b7a8caccab19f0f934011942de12db846cbf25f
2014-12-31T22:23:29Z
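The added loop rounds asm.staticbump up to the next multiple of 8 one byte at a time. For reference, the closed-form bit trick commonly used for such alignment gives the same result; the snippet below (plain C++, not emscripten code) checks the two forms against each other.

    #include <cassert>
    #include <cstdint>

    // Round n up to the next multiple of 8 in one step.
    constexpr uint32_t AlignUp8(uint32_t n) {
        return (n + 7u) & ~7u;  // same result as `while (n % 8 != 0) ++n;`
    }

    int main() {
        for (uint32_t n = 0; n < 64; ++n) {
            uint32_t loop = n;
            while (loop % 8 != 0) ++loop;  // the loop form used in the patch
            assert(loop == AlignUp8(n));   // both round up to a multiple of 8
        }
        return 0;
    }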