//
// Generated by class-dump 3.5 (64 bit) (Debug version compiled Oct 15 2018 10:31:50).
//
// class-dump is Copyright (C) 1997-1998, 2000-2001, 2004-2015 by Steve Nygard.
//
#import <AddressBookCore/ABGroup.h>
@interface ABSubscribedGroup : ABGroup
{
}
+ (BOOL)_isPublicRecord;
+ (id)nts_Groups;
+ (id)nts_GroupsWithAddressBook:(id)arg1;
+ (id)nts_GroupsAtRemoteLocation:(id)arg1;
+ (id)nts_GroupsAtRemoteLocation:(id)arg1 withAddressBook:(id)arg2;
+ (id)builtInProperties;
+ (id)searchElementForProperty:(id)arg1 label:(id)arg2 key:(id)arg3 value:(id)arg4 comparison:(long long)arg5;
- (void)markRecordWithGroupSubscription:(id)arg1;
- (BOOL)isSubscribed;
- (BOOL)isPublishable;
- (id)init;
@end
---
layout: "docs"
page_title: "Commands: KV Get"
sidebar_current: "docs-commands-kv-get"
---
# Consul KV Get
Command: `consul kv get`
The `kv get` command is used to retrieve the value from Consul's KV
store at the given key name. If no key exists with that name, an error is
returned. If a key exists with that name but has no data, nothing is returned.
If the name or prefix is omitted, it defaults to "" which is the root of the
KV store.
## Usage
Usage: `consul kv get [options] [KEY_OR_PREFIX]`
#### API Options
<%= partial "docs/commands/http_api_options_client" %>
<%= partial "docs/commands/http_api_options_server" %>
#### KV Get Options
* `-base64` - Base64 encode the value. The default value is false.
* `-detailed` - Provide additional metadata about the key in addition to the
value such as the ModifyIndex and any flags that may have been set on the key.
The default value is false.
* `-keys` - List keys which start with the given prefix, but not their values.
This is especially useful if you only need the key names themselves. This
option is commonly combined with the -separator option. The default value is
false.
* `-recurse` - Recursively look at all keys prefixed with the given path. The
default value is false.
* `-separator=<string>` - String to use as a separator between keys. The default
value is "/", but this option is only taken into account when paired with the
-keys flag.
## Examples
To retrieve the value for the key named "redis/config/connections" in the
KV store:
```
$ consul kv get redis/config/connections
5
```
This will return the original, raw value stored in Consul. To view detailed
information about the key, specify the "-detailed" flag. This will output all
known metadata about the key including ModifyIndex and any user-supplied
flags:
```
$ consul kv get -detailed redis/config/connections
CreateIndex 336
Flags 0
Key redis/config/connections
LockIndex 0
ModifyIndex 336
Session -
Value 5
```
If the key with the given name does not exist, an error is returned:
```
$ consul kv get not-a-real-key
Error! No key exists at: not-a-real-key
```
To treat the path as a prefix and list all keys which start with the given
prefix, specify the "-recurse" flag:
```
$ consul kv get -recurse redis/
redis/config/connections:5
redis/config/cpu:128
redis/config/memory:512
```
Or list detailed information about all pairs under a prefix:
```
$ consul kv get -recurse -detailed redis
CreateIndex 336
Flags 0
Key redis/config/connections
LockIndex 0
ModifyIndex 336
Session -
Value 5
CreateIndex 472
Flags 0
Key redis/config/cpu
LockIndex 0
ModifyIndex 472
Session -
Value 128
CreateIndex 471
Flags 0
Key redis/config/memory
LockIndex 0
ModifyIndex 471
Session -
Value 512
```
To just list the keys which start with the specified prefix, use the "-keys"
option instead. This is more performant and results in a smaller payload:
```
$ consul kv get -keys redis/config/
redis/config/connections
redis/config/cpu
redis/config/memory
```
By default, the `-keys` operation uses a separator of "/", meaning it will not
recurse beyond that separator. You can choose a different separator by setting
`-separator="<string>"`.
```
$ consul kv get -keys -separator="c" redis
redis/c
```
Alternatively, you can disable the separator altogether by setting it to the
empty string:
```
$ consul kv get -keys -separator="" redis
redis/config/connections
redis/config/cpu
redis/config/memory
```
To list all keys at the root, simply omit the prefix parameter:
```
$ consul kv get -keys
memcached/
redis/
```
> From: [eLinux.org](http://eLinux.org/Leapster_Explorer:_Over_Clock "http://eLinux.org/Leapster_Explorer:_Over_Clock")
# Leapster Explorer: Over Clock
## Contents
- [1 Summary](#summary)
- [2 Hardware Needed](#hardware-needed)
- [3 Temporarily](#temporarily)
- [4 Permanent](#permanent)
- [5 Test Results](#test-results)
## Summary
This tutorial shows you how to overclock or underclock your Leapster
Explorer.
## Hardware Needed
[Console
Access](http://eLinux.org/LeapFrog_Pollux_Platform:_Console_Access "LeapFrog Pollux Platform: Console Access")
## Temporarily
To set the CPU frequency, run these commands (example: 532.5MHz):

    # echo 532500000 > /usr/speed
    # cat /usr/speed > /sys/devices/platform/lf1000-gpio/cpu_freq_in_hz
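The value written is the clock frequency in hertz. As a convenience, a small shell helper (the name `mhz_to_hz` is ours, purely illustrative; it does not touch the sysfs path) converts a MHz figure to the integer that cpu_freq_in_hz expects:

```shell
# Illustrative helper: convert a MHz figure (halves allowed, e.g. 532.5)
# to the integer Hz value written to cpu_freq_in_hz.
mhz_to_hz() {
  awk -v m="$1" 'BEGIN { printf "%d\n", m * 1000000 }'
}

mhz_to_hz 532.5   # prints 532500000
```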
## Permanent
Once you have tested your overclock, are sure that it is stable, and
haven't made any typos, you can make the change permanent.
There is a line in /etc/init.d/rcS that checks for
/flags/cpu_freq_in_hz and automatically copies its contents to
/sys/devices/platform/lf1000-gpio/cpu_freq_in_hz.
It is important that you test your chosen CPU frequency first using the
procedure in the Temporarily section above, or you might be left with an
unbootable device.

    # echo 532500000 > /flags/cpu_freq_in_hz
Now just reboot, and your Leapster Explorer will automatically boot at
the faster speed. Again, **this is dangerous**, so test it first with
the above method. I cannot stress this enough.
## Test Results
GrizzlyAdams was able to get his to run at any arbitrary speed between
180MHz and 580MHz. At 164MHz the display went blank and the serial port
stopped responding. At 600MHz a kernel panic occurred.
[Categories](http://eLinux.org/Special:Categories "Special:Categories"):
- [Leapster
Explorer](http://eLinux.org/Category:Leapster_Explorer "Category:Leapster Explorer")
- [LeapFrog Pollux
Platform](http://eLinux.org/index.php?title=Category:LeapFrog_Pollux_Platform&action=edit&redlink=1 "Category:LeapFrog Pollux Platform (page does not exist)")
########################################################################
##
## Copyright (C) 1994-2020 The Octave Project Developers
##
## See the file COPYRIGHT.md in the top-level directory of this
## distribution or <https://octave.org/copyright/>.
##
## This file is part of Octave.
##
## Octave is free software: you can redistribute it and/or modify it
## under the terms of the GNU General Public License as published by
## the Free Software Foundation, either version 3 of the License, or
## (at your option) any later version.
##
## Octave is distributed in the hope that it will be useful, but
## WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
## GNU General Public License for more details.
##
## You should have received a copy of the GNU General Public License
## along with Octave; see the file COPYING. If not, see
## <https://www.gnu.org/licenses/>.
##
########################################################################
## -*- texinfo -*-
## @deftypefn {} {} cstrcat (@var{s1}, @var{s2}, @dots{})
## Return a string containing all the arguments concatenated horizontally
## with trailing white space preserved.
##
## For example:
##
## @example
## @group
## cstrcat ("ab ", "cd")
## @result{} "ab cd"
## @end group
## @end example
##
## @example
## @group
## s = [ "ab"; "cde" ];
## cstrcat (s, s, s)
## @result{} "ab ab ab "
## "cdecdecde"
## @end group
## @end example
## @seealso{strcat, char, strvcat}
## @end deftypefn
function st = cstrcat (varargin)

  if (nargin == 0)
    ## Special because if varargin is empty, iscellstr still returns
    ## true but then "[varargin{:}]" would be of class double.
    st = "";
  elseif (iscellstr (varargin))
    st = [varargin{:}];
  else
    error ("cstrcat: arguments must be character strings");
  endif

endfunction
## Test the dimensionality
## 1d
%!assert (cstrcat ("ab ", "ab "), "ab ab ")
## 2d
%!assert (cstrcat (["ab ";"cde"], ["ab ";"cde"]), ["ab ab ";"cdecde"])
%!assert (cstrcat ("foo", "bar"), "foobar")
%!assert (cstrcat (["a "; "bb"], ["foo"; "bar"]), ["a foo"; "bbbar"])
%!assert (cstrcat (), "")
## Test input validation
%!error cstrcat (1, 2)
module GHC.IO.Handle ( module Exports ) where
import GHC.Types
import "base" GHC.IO.Handle as Exports
////////////////////////////////////////////////////////////////////////////////////////////////////////
// Part of Injectable Generic Camera System
// Copyright(c) 2019, Frans Bouma
// All rights reserved.
// https://github.com/FransBouma/InjectableGenericCameraSystem
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met :
//
// * Redistributions of source code must retain the above copyright notice, this
// list of conditions and the following disclaimer.
//
// * Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and / or other materials provided with the distribution.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
// DISCLAIMED.IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
// FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
// DAMAGES(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
// SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
// CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
// OR TORT(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
////////////////////////////////////////////////////////////////////////////////////////////////////////
#pragma once
namespace IGCS::InputHooker
{
void setInputHooks();
void setXInputHook(bool enableHook);
}
<?xml version="1.0" encoding="UTF-8"?>
<project basedir="../../" default="deploy">
<property name="git.status" value=""/>
<property name="git.currentbranch" value=""/>
<target name="check-git-branch-status">
<exec command="git status -s -b" outputProperty="git.currentbranch" />
<echo msg="${git.currentbranch}"/>
<if>
<contains string="${git.currentbranch}" substring="${head}"/>
<then>
<echo>On branch ${head}</echo>
</then>
<else>
<fail message="-Dhead=${head} arg did not match ${git.currentbranch}"/>
</else>
</if>
<exec command="git status -s" outputProperty="git.status" />
<if>
<equals arg1="${git.status}" arg2="" trim="true"/>
<then>
<echo>working directory clean</echo>
</then>
<else>
<echo>${git.status}</echo>
<fail message="Working directory isn't clean." />
</else>
</if>
</target>
<property name="version.changelog" value=""/>
<property name="version.version" value=""/>
<target name="check-changelog-version">
<exec executable="fgrep" outputProperty="version.changelog">
<arg value="${new.version} ("/>
<arg value="${project.basedir}/CHANGELOG.md"/>
</exec>
<if>
<equals arg1="${version.changelog}" arg2="" trim="true"/>
<then>
<fail message="${new.version} not mentioned in CHANGELOG"/>
</then>
</if>
<exec executable="fgrep" outputProperty="version.version">
<arg value="const VERSION = '${new.version}'"/>
<arg value="${project.basedir}/src/Guzzle/Common/Version.php"/>
</exec>
<if>
<equals arg1="${version.version}" arg2="" trim="true"/>
<then>
<fail message="${new.version} not mentioned in Guzzle\Common\Version"/>
</then>
</if>
<echo>ChangeLog Match: ${version.changelog}</echo>
<echo>Guzzle\Common\Version Match: ${version.version}</echo>
</target>
<target name="help" description="HELP AND REMINDERS about what you can do with this project">
<echo>releasing: phing -Dnew.version=3.0.x -Dhead=master release</echo>
<echo>--</echo>
<exec command="phing -l" passthru="true"/>
</target>
<target name="release" depends="check-changelog-version,check-git-branch-status"
description="tag, subtree split, package, deploy: Use: phing -Dnew.version=[TAG] -Dhead=[BRANCH] release">
<if>
<isset property="new.version" />
<then>
<if>
<contains string="${new.version}" substring="v" casesensitive="false" />
<then>
<fail message="Please specify version as [0-9].[0-9].[0-9]. (I'll add v for you.)"/>
</then>
<else>
<echo>BEGINNING RELEASE FOR ${new.version}</echo>
<!-- checkout the specified branch -->
<!-- <gitcheckout repository="${repo.dir}" branchname="${head}" gitPath="${cmd.git}" /> -->
<!-- Ensure that the tag exists -->
<!-- push the tag up so subsplit will get it -->
<!--gitpush repository="${repo.dir}" tags="true" gitPath="${cmd.git}" /-->
<!-- now do the subsplits -->
<guzzlesubsplit
repository="${repo.dir}"
remote="${guzzle.remote}"
heads="${head}"
tags="v${new.version}"
base="src"
subIndicatorFile="composer.json"
gitPath="${cmd.git}" />
<!-- Copy .md files into the PEAR package -->
<copy file="${repo.dir}/LICENSE" tofile=".subsplit/src/Guzzle/LICENSE.md" />
<copy file="${repo.dir}/README.md" tofile=".subsplit/src/Guzzle/README.md" />
<copy file="${repo.dir}/CHANGELOG.md" tofile=".subsplit/src/Guzzle/CHANGELOG.md" />
<!-- and now the pear packages -->
<guzzlepear
version="${new.version}"
makephar="true"
/>
</else>
</if>
</then>
<else>
<echo>Tip: to create a new release, do: phing -Dnew.version=[TAG] -Dhead=[BRANCH] release</echo>
</else>
</if>
</target>
<target name="pear-channel">
<guzzlepear version="${new.version}" deploy="true" makephar="true" />
</target>
<target name="package-phar" description="Create a phar with an autoloader">
<pharpackage
destfile="${dir.output}/guzzle.phar"
basedir="${project.basedir}/.subsplit"
stub="phar-stub.php"
signature="md5">
<fileset dir="${project.basedir}/.subsplit">
<include name="src/**/*.php" />
<include name="src/**/*.pem" />
<include name="vendor/symfony/class-loader/Symfony/Component/ClassLoader/UniversalClassLoader.php" />
<include name="vendor/symfony/event-dispatcher/**/*.php" />
<include name="vendor/doctrine/common/lib/Doctrine/Common/Cache/*.php" />
<include name="vendor/monolog/monolog/src/**/*.php" />
</fileset>
<metadata>
<element name="author" value="Michael Dowling" />
</metadata>
</pharpackage>
</target>
</project>
Episode #0
January 1st, 1970: Description.
— Olga Alexashenko
— Vadim Makeev
— Maria Prosvirnina
— Alexey Simonenko
Contents
00:00:00 Topic
Listen:
Google Podcasts — https://podcasts.google.com/?feed=aHR0cHM6Ly93ZWItc3RhbmRhcmRzLnJ1L3BvZGNhc3QvZmVlZC8
Apple Podcasts — https://podcasts.apple.com/podcast/id1080500016
VK — https://vk.com/podcasts-32017543
Yandex.Music — https://music.yandex.ru/album/6245956
Spotify — https://open.spotify.com/show/3rzAcADjpBpXt73L0epTjV
YouTube — https://www.youtube.com/playlist?list=PLMBnwIwFEFHcwuevhsNXkFTcadeX5R1Go
SoundCloud — https://soundcloud.com/web-standards
RSS — https://web-standards.ru/podcast/feed/
Read:
Twitter — https://twitter.com/webstandards_ru
VK — https://vk.com/webstandards_ru
Facebook — https://www.facebook.com/webstandardsru
Telegram — https://t.me/webstandards_ru
Support:
Patreon — https://www.patreon.com/webstandards_ru
Yandex.Money — https://money.yandex.ru/to/41001119329753
PayPal — https://www.paypal.me/pepelsbey
Discuss: http://slack.web-standards.ru/
---
Episode #0
January 1st, 1970: Description.
— Olga Alexashenko
— Vadim Makeev
— Maria Prosvirnina
— Alexey Simonenko
Topics
00:00:00 Topic
Listen:
iTunes — https://podcasts.apple.com/podcast/id1080500016
VK — https://vk.com/podcasts-32017543
Yandex.Music — https://music.yandex.ru/album/6245956
Spotify — https://open.spotify.com/show/3rzAcADjpBpXt73L0epTjV
YouTube — https://www.youtube.com/playlist?list=PLMBnwIwFEFHcwuevhsNXkFTcadeX5R1Go
SoundCloud — https://soundcloud.com/web-standards
RSS — https://web-standards.ru/podcast/feed/
Read:
Twitter — https://twitter.com/webstandards_ru
VK — https://vk.com/webstandards_ru
Facebook — https://www.facebook.com/webstandardsru
Telegram — https://t.me/webstandards_ru
Support:
Patreon — https://www.patreon.com/webstandards_ru
Yandex.Money — https://money.yandex.ru/to/41001119329753
PayPal — https://www.paypal.me/pepelsbey
Discuss in Slack — http://slack.web-standards.ru/
@startuml
sprite $owm_601 [48x48/16] {
000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000
0000000000000000016ACDDB830000000000000000000000
00000000000000009FFFFFFFFFC300000000000000000000
000000000000003EFFFFFFFFFFFF80000000000000000000
00000000000004FFFFE85446BFFFFA000000000000000000
0000000000002FFFF700000003DFFF800000000000000000
000000000000CFFF40000000000BFFF30000000000000000
000000000005FFF5000000000000DFFB0000000000000000
00000000000BFFB00000000000004FFF1000000000000000
00000000002FFF500000000000000DFF9641000000000000
000000004CFFFF0000000000000008FFFFFFC40000000000
00000009FFFFFE0000000000000006FFFFFFFFA000000000
000000BFFFFEC70000000000000002AAABEFFFFC00000000
000008FFFD40000000000000000000000003CFFF90000000
00002FFFB0000000000000000000000000000AFFF3000000
00009FFE100000000000000000000000000000DFFA000000
0000EFF60000000000000000000000000000005FFF000000
0001FFF20000000000000000000000000000000FFF200000
0002FFF00000000000000000000000000000000EFF300000
0001FFF10000000000000000000000000000000FFF200000
0000FFF40000000000000000000000000000002FFF000000
0000BFFB0000000000000045000000000000009FFC000000
00005FFF60000000000004FF70000000000004FFF6000000
00000CFFF7000000000005FF9000000000005FFFD0000000
000001EFFFD740001200008810001200037CFFFF20000000
0000002EFFFFF002FF6000000003FF500FFFFFE300000000
00000001AFFFF004FFA000000006FF800FFFFB2000000000
00000000027BC000AC2000000000AB200CB8200000000000
00000000000000000000009A100000000000000000000000
0000000000000000000006FF900000000000000000000000
0000000000000000000004FF700000000000000000000000
000000000000000057000034000067000000000000000000
0000000000000003FF9000000005FF800000000000000000
0000000000000003FF8000000005FF700000000000000000
000000000000000046000000000056000000000000000000
0000000000000000000001CD300000000000000000000000
0000000000000000000006FF900000000000000000000000
0000000000000000000002FF400000000000000000000000
000000000000000000000001000000000000000000000000
000000000000000000000000000000000000000000000000
}
!define WEATHER_OWM_601(_alias) ENTITY(rectangle,black,owm_601,_alias,WEATHER OWM_601)
!define WEATHER_OWM_601(_alias, _label) ENTITY(rectangle,black,owm_601,_label, _alias,WEATHER OWM_601)
!define WEATHER_OWM_601(_alias, _label, _shape) ENTITY(_shape,black,owm_601,_label, _alias,WEATHER OWM_601)
!define WEATHER_OWM_601(_alias, _label, _shape, _color) ENTITY(_shape,_color,owm_601,_label, _alias,WEATHER OWM_601)
skinparam folderBackgroundColor<<WEATHER OWM_601>> White
@enduml
/*!
* Nodeunit
* Copyright (c) 2010 Caolan McMahon
* MIT Licensed
*/
/**
* Module dependencies
*/
var async = require('../deps/async'),
    types = require('./types'),
    utils = require('./utils'),
    core = require('./core'),
    reporters = require('./reporters'),
    assert = require('./assert'),
    path = require('path'),
    events = require('events');
/**
* Export sub-modules.
*/
exports.types = types;
exports.utils = utils;
exports.reporters = reporters;
exports.assert = assert;
// backwards compatibility
exports.testrunner = {
    run: function () {
        console.log(
            'WARNING: nodeunit.testrunner is going to be deprecated, please ' +
            'use nodeunit.reporters.default instead!'
        );
        return reporters['default'].run.apply(this, arguments);
    }
};
/**
* Export all core functions
*/
for (var k in core) {
    exports[k] = core[k];
}
/**
* Load modules from paths array and run all exported tests in series. If a path
* is a directory, load all supported file types inside it as modules. This only
* reads 1 level deep in the directory and does not recurse through
* sub-directories.
*
* @param {Array} paths
* @param {Object} opt
* @api public
*/
exports.runFiles = function (paths, opt) {
    var all_assertions = [];
    var options = types.options(opt);
    var start = new Date().getTime();

    if (!paths.length) {
        return options.done(types.assertionList(all_assertions));
    }

    utils.modulePaths(paths, function (err, files) {
        if (err) throw err;
        async.concatSeries(files, function (file, cb) {
            var name = path.basename(file);
            exports.runModule(name, require(file), options, cb);
        },
        function (err, all_assertions) {
            var end = new Date().getTime();
            exports.done();
            options.done(types.assertionList(all_assertions, end - start));
        });
    }, options.recursive);
};
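// runFiles leans on async.concatSeries to run one module at a time and
// flatten the results. A minimal sketch of that pattern in plain JavaScript
// follows; the concatSeries below is our own reimplementation for
// illustration, not the bundled async library's.

```javascript
// Apply an async iterator to items strictly one at a time,
// concatenating each result array into one flat list.
function concatSeries(items, iterator, done) {
    var results = [];
    var i = 0;
    (function next(err) {
        if (err) return done(err);
        if (i >= items.length) return done(null, results);
        iterator(items[i++], function (err, res) {
            if (!err) results = results.concat(res);
            next(err);
        });
    })();
}

var flattened;
concatSeries([1, 2, 3], function (n, cb) {
    cb(null, [n, n * 10]);
}, function (err, all) {
    flattened = all;
});
console.log(flattened); // [ 1, 10, 2, 20, 3, 30 ]
```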
/* Export all prototypes from events.EventEmitter */
var label;
for (label in events.EventEmitter.prototype) {
    exports[label] = events.EventEmitter.prototype[label];
}
/* Emit event 'complete' on completion of a test suite. */
exports.complete = function (name, assertions) {
    exports.emit('complete', name, assertions);
};
/* Emit event 'complete' on completion of all tests. */
exports.done = function () {
    exports.emit('done');
};
module.exports = exports;
/* zggbal.f -- translated by f2c (version 20061008).
You must link the resulting object file with libf2c:
on Microsoft Windows system, link with libf2c.lib;
on Linux or Unix systems, link with .../path/to/libf2c.a -lm
or, if you install libf2c.a in a standard place, with -lf2c -lm
-- in that order, at the end of the command line, as in
cc *.o -lf2c -lm
Source for libf2c is in /netlib/f2c/libf2c.zip, e.g.,
http://www.netlib.org/f2c/libf2c.zip
*/
#include "f2c.h"
#include "blaswrap.h"
/* Table of constant values */
static integer c__1 = 1;
static doublereal c_b36 = 10.;
static doublereal c_b72 = .5;
/* Subroutine */ int zggbal_(char *job, integer *n, doublecomplex *a, integer
*lda, doublecomplex *b, integer *ldb, integer *ilo, integer *ihi,
doublereal *lscale, doublereal *rscale, doublereal *work, integer *
info)
{
/* System generated locals */
integer a_dim1, a_offset, b_dim1, b_offset, i__1, i__2, i__3, i__4;
doublereal d__1, d__2, d__3;
/* Builtin functions */
double d_lg10(doublereal *), d_imag(doublecomplex *), z_abs(doublecomplex
*), d_sign(doublereal *, doublereal *), pow_di(doublereal *,
integer *);
/* Local variables */
integer i__, j, k, l, m;
doublereal t;
integer jc;
doublereal ta, tb, tc;
integer ir;
doublereal ew;
integer it, nr, ip1, jp1, lm1;
doublereal cab, rab, ewc, cor, sum;
integer nrp2, icab, lcab;
doublereal beta, coef;
integer irab, lrab;
doublereal basl, cmax;
extern doublereal ddot_(integer *, doublereal *, integer *, doublereal *,
integer *);
doublereal coef2, coef5, gamma, alpha;
extern /* Subroutine */ int dscal_(integer *, doublereal *, doublereal *,
integer *);
extern logical lsame_(char *, char *);
doublereal sfmin, sfmax;
integer iflow;
extern /* Subroutine */ int daxpy_(integer *, doublereal *, doublereal *,
integer *, doublereal *, integer *);
integer kount;
extern /* Subroutine */ int zswap_(integer *, doublecomplex *, integer *,
doublecomplex *, integer *);
extern doublereal dlamch_(char *);
doublereal pgamma;
extern /* Subroutine */ int xerbla_(char *, integer *), zdscal_(
integer *, doublereal *, doublecomplex *, integer *);
integer lsfmin;
extern integer izamax_(integer *, doublecomplex *, integer *);
integer lsfmax;
/* -- LAPACK routine (version 3.2) -- */
/* Univ. of Tennessee, Univ. of California Berkeley and NAG Ltd.. */
/* November 2006 */
/* .. Scalar Arguments .. */
/* .. */
/* .. Array Arguments .. */
/* .. */
/* Purpose */
/* ======= */
/* ZGGBAL balances a pair of general complex matrices (A,B). This */
/* involves, first, permuting A and B by similarity transformations to */
/* isolate eigenvalues in the first 1 to ILO-1 and last IHI+1 to N */
/* elements on the diagonal; and second, applying a diagonal similarity */
/* transformation to rows and columns ILO to IHI to make the rows */
/* and columns as close in norm as possible. Both steps are optional. */
/* Balancing may reduce the 1-norm of the matrices, and improve the */
/* accuracy of the computed eigenvalues and/or eigenvectors in the */
/* generalized eigenvalue problem A*x = lambda*B*x. */
/* Arguments */
/* ========= */
/* JOB (input) CHARACTER*1 */
/* Specifies the operations to be performed on A and B: */
/* = 'N': none: simply set ILO = 1, IHI = N, LSCALE(I) = 1.0 */
/* and RSCALE(I) = 1.0 for i=1,...,N; */
/* = 'P': permute only; */
/* = 'S': scale only; */
/* = 'B': both permute and scale. */
/* N (input) INTEGER */
/* The order of the matrices A and B. N >= 0. */
/* A (input/output) COMPLEX*16 array, dimension (LDA,N) */
/* On entry, the input matrix A. */
/* On exit, A is overwritten by the balanced matrix. */
/* If JOB = 'N', A is not referenced. */
/* LDA (input) INTEGER */
/* The leading dimension of the array A. LDA >= max(1,N). */
/* B (input/output) COMPLEX*16 array, dimension (LDB,N) */
/* On entry, the input matrix B. */
/* On exit, B is overwritten by the balanced matrix. */
/* If JOB = 'N', B is not referenced. */
/* LDB (input) INTEGER */
/* The leading dimension of the array B. LDB >= max(1,N). */
/* ILO (output) INTEGER */
/* IHI (output) INTEGER */
/* ILO and IHI are set to integers such that on exit */
/* A(i,j) = 0 and B(i,j) = 0 if i > j and */
/* j = 1,...,ILO-1 or i = IHI+1,...,N. */
/* If JOB = 'N' or 'S', ILO = 1 and IHI = N. */
/* LSCALE (output) DOUBLE PRECISION array, dimension (N) */
/* Details of the permutations and scaling factors applied */
/* to the left side of A and B. If P(j) is the index of the */
/* row interchanged with row j, and D(j) is the scaling factor */
/* applied to row j, then */
/* LSCALE(j) = P(j) for J = 1,...,ILO-1 */
/* = D(j) for J = ILO,...,IHI */
/* = P(j) for J = IHI+1,...,N. */
/* The order in which the interchanges are made is N to IHI+1, */
/* then 1 to ILO-1. */
/* RSCALE (output) DOUBLE PRECISION array, dimension (N) */
/* Details of the permutations and scaling factors applied */
/* to the right side of A and B. If P(j) is the index of the */
/* column interchanged with column j, and D(j) is the scaling */
/* factor applied to column j, then */
/* RSCALE(j) = P(j) for J = 1,...,ILO-1 */
/* = D(j) for J = ILO,...,IHI */
/* = P(j) for J = IHI+1,...,N. */
/* The order in which the interchanges are made is N to IHI+1, */
/* then 1 to ILO-1. */
/* WORK (workspace) DOUBLE PRECISION array, dimension (lwork) */
/* lwork must be at least max(1,6*N) when JOB = 'S' or 'B', and */
/* at least 1 when JOB = 'N' or 'P'. */
/* INFO (output) INTEGER */
/* = 0: successful exit */
/* < 0: if INFO = -i, the i-th argument had an illegal value. */
/* Further Details */
/* =============== */
/* See R.C. WARD, Balancing the generalized eigenvalue problem, */
/* SIAM J. Sci. Stat. Comp. 2 (1981), 141-152. */
/* ===================================================================== */
/* .. Parameters .. */
/* .. */
/* .. Local Scalars .. */
/* .. */
/* .. External Functions .. */
/* .. */
/* .. External Subroutines .. */
/* .. */
/* .. Intrinsic Functions .. */
/* .. */
/* .. Statement Functions .. */
/* .. */
/* .. Statement Function definitions .. */
/* .. */
/* .. Executable Statements .. */
/* Test the input parameters */
/* Parameter adjustments */
a_dim1 = *lda;
a_offset = 1 + a_dim1;
a -= a_offset;
b_dim1 = *ldb;
b_offset = 1 + b_dim1;
b -= b_offset;
--lscale;
--rscale;
--work;
/* Function Body */
*info = 0;
if (! lsame_(job, "N") && ! lsame_(job, "P") && ! lsame_(job, "S")
&& ! lsame_(job, "B")) {
*info = -1;
} else if (*n < 0) {
*info = -2;
} else if (*lda < max(1,*n)) {
*info = -4;
} else if (*ldb < max(1,*n)) {
*info = -6;
}
if (*info != 0) {
i__1 = -(*info);
xerbla_("ZGGBAL", &i__1);
return 0;
}
/* Quick return if possible */
if (*n == 0) {
*ilo = 1;
*ihi = *n;
return 0;
}
if (*n == 1) {
*ilo = 1;
*ihi = *n;
lscale[1] = 1.;
rscale[1] = 1.;
return 0;
}
if (lsame_(job, "N")) {
*ilo = 1;
*ihi = *n;
i__1 = *n;
for (i__ = 1; i__ <= i__1; ++i__) {
lscale[i__] = 1.;
rscale[i__] = 1.;
/* L10: */
}
return 0;
}
k = 1;
l = *n;
if (lsame_(job, "S")) {
goto L190;
}
goto L30;
/* Permute the matrices A and B to isolate the eigenvalues. */
/* Find row with one nonzero in columns 1 through L */
L20:
l = lm1;
if (l != 1) {
goto L30;
}
rscale[1] = 1.;
lscale[1] = 1.;
goto L190;
L30:
lm1 = l - 1;
for (i__ = l; i__ >= 1; --i__) {
i__1 = lm1;
for (j = 1; j <= i__1; ++j) {
jp1 = j + 1;
i__2 = i__ + j * a_dim1;
i__3 = i__ + j * b_dim1;
if (a[i__2].r != 0. || a[i__2].i != 0. || (b[i__3].r != 0. || b[
i__3].i != 0.)) {
goto L50;
}
/* L40: */
}
j = l;
goto L70;
L50:
i__1 = l;
for (j = jp1; j <= i__1; ++j) {
i__2 = i__ + j * a_dim1;
i__3 = i__ + j * b_dim1;
if (a[i__2].r != 0. || a[i__2].i != 0. || (b[i__3].r != 0. || b[
i__3].i != 0.)) {
goto L80;
}
/* L60: */
}
j = jp1 - 1;
L70:
m = l;
iflow = 1;
goto L160;
L80:
;
}
goto L100;
/* Find column with one nonzero in rows K through N */
L90:
++k;
L100:
i__1 = l;
for (j = k; j <= i__1; ++j) {
i__2 = lm1;
for (i__ = k; i__ <= i__2; ++i__) {
ip1 = i__ + 1;
i__3 = i__ + j * a_dim1;
i__4 = i__ + j * b_dim1;
if (a[i__3].r != 0. || a[i__3].i != 0. || (b[i__4].r != 0. || b[
i__4].i != 0.)) {
goto L120;
}
/* L110: */
}
i__ = l;
goto L140;
L120:
i__2 = l;
for (i__ = ip1; i__ <= i__2; ++i__) {
i__3 = i__ + j * a_dim1;
i__4 = i__ + j * b_dim1;
if (a[i__3].r != 0. || a[i__3].i != 0. || (b[i__4].r != 0. || b[
i__4].i != 0.)) {
goto L150;
}
/* L130: */
}
i__ = ip1 - 1;
L140:
m = k;
iflow = 2;
goto L160;
L150:
;
}
goto L190;
/* Permute rows M and I */
L160:
lscale[m] = (doublereal) i__;
if (i__ == m) {
goto L170;
}
i__1 = *n - k + 1;
zswap_(&i__1, &a[i__ + k * a_dim1], lda, &a[m + k * a_dim1], lda);
i__1 = *n - k + 1;
zswap_(&i__1, &b[i__ + k * b_dim1], ldb, &b[m + k * b_dim1], ldb);
/* Permute columns M and J */
L170:
rscale[m] = (doublereal) j;
if (j == m) {
goto L180;
}
zswap_(&l, &a[j * a_dim1 + 1], &c__1, &a[m * a_dim1 + 1], &c__1);
zswap_(&l, &b[j * b_dim1 + 1], &c__1, &b[m * b_dim1 + 1], &c__1);
L180:
switch (iflow) {
case 1: goto L20;
case 2: goto L90;
}
L190:
*ilo = k;
*ihi = l;
if (lsame_(job, "P")) {
i__1 = *ihi;
for (i__ = *ilo; i__ <= i__1; ++i__) {
lscale[i__] = 1.;
rscale[i__] = 1.;
/* L195: */
}
return 0;
}
if (*ilo == *ihi) {
return 0;
}
/* Balance the submatrix in rows ILO to IHI. */
nr = *ihi - *ilo + 1;
i__1 = *ihi;
for (i__ = *ilo; i__ <= i__1; ++i__) {
rscale[i__] = 0.;
lscale[i__] = 0.;
work[i__] = 0.;
work[i__ + *n] = 0.;
work[i__ + (*n << 1)] = 0.;
work[i__ + *n * 3] = 0.;
work[i__ + (*n << 2)] = 0.;
work[i__ + *n * 5] = 0.;
/* L200: */
}
/* Compute right side vector in resulting linear equations */
basl = d_lg10(&c_b36);
i__1 = *ihi;
for (i__ = *ilo; i__ <= i__1; ++i__) {
i__2 = *ihi;
for (j = *ilo; j <= i__2; ++j) {
i__3 = i__ + j * a_dim1;
if (a[i__3].r == 0. && a[i__3].i == 0.) {
ta = 0.;
goto L210;
}
i__3 = i__ + j * a_dim1;
d__3 = (d__1 = a[i__3].r, abs(d__1)) + (d__2 = d_imag(&a[i__ + j *
a_dim1]), abs(d__2));
ta = d_lg10(&d__3) / basl;
L210:
i__3 = i__ + j * b_dim1;
if (b[i__3].r == 0. && b[i__3].i == 0.) {
tb = 0.;
goto L220;
}
i__3 = i__ + j * b_dim1;
d__3 = (d__1 = b[i__3].r, abs(d__1)) + (d__2 = d_imag(&b[i__ + j *
b_dim1]), abs(d__2));
tb = d_lg10(&d__3) / basl;
L220:
work[i__ + (*n << 2)] = work[i__ + (*n << 2)] - ta - tb;
work[j + *n * 5] = work[j + *n * 5] - ta - tb;
/* L230: */
}
/* L240: */
}
coef = 1. / (doublereal) (nr << 1);
coef2 = coef * coef;
coef5 = coef2 * .5;
nrp2 = nr + 2;
beta = 0.;
it = 1;
/* Start generalized conjugate gradient iteration */
L250:
gamma = ddot_(&nr, &work[*ilo + (*n << 2)], &c__1, &work[*ilo + (*n << 2)]
, &c__1) + ddot_(&nr, &work[*ilo + *n * 5], &c__1, &work[*ilo + *
n * 5], &c__1);
ew = 0.;
ewc = 0.;
i__1 = *ihi;
for (i__ = *ilo; i__ <= i__1; ++i__) {
ew += work[i__ + (*n << 2)];
ewc += work[i__ + *n * 5];
/* L260: */
}
/* Computing 2nd power */
d__1 = ew;
/* Computing 2nd power */
d__2 = ewc;
/* Computing 2nd power */
d__3 = ew - ewc;
gamma = coef * gamma - coef2 * (d__1 * d__1 + d__2 * d__2) - coef5 * (
d__3 * d__3);
if (gamma == 0.) {
goto L350;
}
if (it != 1) {
beta = gamma / pgamma;
}
t = coef5 * (ewc - ew * 3.);
tc = coef5 * (ew - ewc * 3.);
dscal_(&nr, &beta, &work[*ilo], &c__1);
dscal_(&nr, &beta, &work[*ilo + *n], &c__1);
daxpy_(&nr, &coef, &work[*ilo + (*n << 2)], &c__1, &work[*ilo + *n], &
c__1);
daxpy_(&nr, &coef, &work[*ilo + *n * 5], &c__1, &work[*ilo], &c__1);
i__1 = *ihi;
for (i__ = *ilo; i__ <= i__1; ++i__) {
work[i__] += tc;
work[i__ + *n] += t;
/* L270: */
}
/* Apply matrix to vector */
i__1 = *ihi;
for (i__ = *ilo; i__ <= i__1; ++i__) {
kount = 0;
sum = 0.;
i__2 = *ihi;
for (j = *ilo; j <= i__2; ++j) {
i__3 = i__ + j * a_dim1;
if (a[i__3].r == 0. && a[i__3].i == 0.) {
goto L280;
}
++kount;
sum += work[j];
L280:
i__3 = i__ + j * b_dim1;
if (b[i__3].r == 0. && b[i__3].i == 0.) {
goto L290;
}
++kount;
sum += work[j];
L290:
;
}
work[i__ + (*n << 1)] = (doublereal) kount * work[i__ + *n] + sum;
/* L300: */
}
i__1 = *ihi;
for (j = *ilo; j <= i__1; ++j) {
kount = 0;
sum = 0.;
i__2 = *ihi;
for (i__ = *ilo; i__ <= i__2; ++i__) {
i__3 = i__ + j * a_dim1;
if (a[i__3].r == 0. && a[i__3].i == 0.) {
goto L310;
}
++kount;
sum += work[i__ + *n];
L310:
i__3 = i__ + j * b_dim1;
if (b[i__3].r == 0. && b[i__3].i == 0.) {
goto L320;
}
++kount;
sum += work[i__ + *n];
L320:
;
}
work[j + *n * 3] = (doublereal) kount * work[j] + sum;
/* L330: */
}
sum = ddot_(&nr, &work[*ilo + *n], &c__1, &work[*ilo + (*n << 1)], &c__1)
+ ddot_(&nr, &work[*ilo], &c__1, &work[*ilo + *n * 3], &c__1);
alpha = gamma / sum;
/* Determine correction to current iteration */
cmax = 0.;
i__1 = *ihi;
for (i__ = *ilo; i__ <= i__1; ++i__) {
cor = alpha * work[i__ + *n];
if (abs(cor) > cmax) {
cmax = abs(cor);
}
lscale[i__] += cor;
cor = alpha * work[i__];
if (abs(cor) > cmax) {
cmax = abs(cor);
}
rscale[i__] += cor;
/* L340: */
}
if (cmax < .5) {
goto L350;
}
d__1 = -alpha;
daxpy_(&nr, &d__1, &work[*ilo + (*n << 1)], &c__1, &work[*ilo + (*n << 2)]
, &c__1);
d__1 = -alpha;
daxpy_(&nr, &d__1, &work[*ilo + *n * 3], &c__1, &work[*ilo + *n * 5], &
c__1);
pgamma = gamma;
++it;
if (it <= nrp2) {
goto L250;
}
/* End generalized conjugate gradient iteration */
L350:
sfmin = dlamch_("S");
sfmax = 1. / sfmin;
lsfmin = (integer) (d_lg10(&sfmin) / basl + 1.);
lsfmax = (integer) (d_lg10(&sfmax) / basl);
i__1 = *ihi;
for (i__ = *ilo; i__ <= i__1; ++i__) {
i__2 = *n - *ilo + 1;
irab = izamax_(&i__2, &a[i__ + *ilo * a_dim1], lda);
rab = z_abs(&a[i__ + (irab + *ilo - 1) * a_dim1]);
i__2 = *n - *ilo + 1;
irab = izamax_(&i__2, &b[i__ + *ilo * b_dim1], ldb);
/* Computing MAX */
d__1 = rab, d__2 = z_abs(&b[i__ + (irab + *ilo - 1) * b_dim1]);
rab = max(d__1,d__2);
d__1 = rab + sfmin;
lrab = (integer) (d_lg10(&d__1) / basl + 1.);
ir = (integer) (lscale[i__] + d_sign(&c_b72, &lscale[i__]));
/* Computing MIN */
i__2 = max(ir,lsfmin), i__2 = min(i__2,lsfmax), i__3 = lsfmax - lrab;
ir = min(i__2,i__3);
lscale[i__] = pow_di(&c_b36, &ir);
icab = izamax_(ihi, &a[i__ * a_dim1 + 1], &c__1);
cab = z_abs(&a[icab + i__ * a_dim1]);
icab = izamax_(ihi, &b[i__ * b_dim1 + 1], &c__1);
/* Computing MAX */
d__1 = cab, d__2 = z_abs(&b[icab + i__ * b_dim1]);
cab = max(d__1,d__2);
d__1 = cab + sfmin;
lcab = (integer) (d_lg10(&d__1) / basl + 1.);
jc = (integer) (rscale[i__] + d_sign(&c_b72, &rscale[i__]));
/* Computing MIN */
i__2 = max(jc,lsfmin), i__2 = min(i__2,lsfmax), i__3 = lsfmax - lcab;
jc = min(i__2,i__3);
rscale[i__] = pow_di(&c_b36, &jc);
/* L360: */
}
/* Row scaling of matrices A and B */
i__1 = *ihi;
for (i__ = *ilo; i__ <= i__1; ++i__) {
i__2 = *n - *ilo + 1;
zdscal_(&i__2, &lscale[i__], &a[i__ + *ilo * a_dim1], lda);
i__2 = *n - *ilo + 1;
zdscal_(&i__2, &lscale[i__], &b[i__ + *ilo * b_dim1], ldb);
/* L370: */
}
/* Column scaling of matrices A and B */
i__1 = *ihi;
for (j = *ilo; j <= i__1; ++j) {
zdscal_(ihi, &rscale[j], &a[j * a_dim1 + 1], &c__1);
zdscal_(ihi, &rscale[j], &b[j * b_dim1 + 1], &c__1);
/* L380: */
}
return 0;
/* End of ZGGBAL */
} /* zggbal_ */
package time // import "github.com/docker/docker/api/types/time"
import (
"fmt"
"math"
"strconv"
"strings"
"time"
)
// These are additional predefined layouts for use in Time.Format and Time.Parse
// with --since and --until parameters for `docker logs` and `docker events`
const (
rFC3339Local = "2006-01-02T15:04:05" // RFC3339 with local timezone
rFC3339NanoLocal = "2006-01-02T15:04:05.999999999" // RFC3339Nano with local timezone
dateWithZone = "2006-01-02Z07:00" // RFC3339 with time at 00:00:00
dateLocal = "2006-01-02" // RFC3339 with local timezone and time at 00:00:00
)
// GetTimestamp tries to parse given string as golang duration,
// then RFC3339 time and finally as a Unix timestamp. If
// any of these were successful, it returns a Unix timestamp
// as string otherwise returns the given value back.
// In case of duration input, the returned timestamp is computed
// as the given reference time minus the amount of the duration.
func GetTimestamp(value string, reference time.Time) (string, error) {
if d, err := time.ParseDuration(value); value != "0" && err == nil {
return strconv.FormatInt(reference.Add(-d).Unix(), 10), nil
}
var format string
	// if the string has a Z, a +, or three dashes, use time.Parse; otherwise use time.ParseInLocation
parseInLocation := !(strings.ContainsAny(value, "zZ+") || strings.Count(value, "-") == 3)
if strings.Contains(value, ".") {
if parseInLocation {
format = rFC3339NanoLocal
} else {
format = time.RFC3339Nano
}
} else if strings.Contains(value, "T") {
// we want the number of colons in the T portion of the timestamp
tcolons := strings.Count(value, ":")
// if parseInLocation is off and we have a +/- zone offset (not Z) then
// there will be an extra colon in the input for the tz offset subtract that
// colon from the tcolons count
if !parseInLocation && !strings.ContainsAny(value, "zZ") && tcolons > 0 {
tcolons--
}
if parseInLocation {
switch tcolons {
case 0:
format = "2006-01-02T15"
case 1:
format = "2006-01-02T15:04"
default:
format = rFC3339Local
}
} else {
switch tcolons {
case 0:
format = "2006-01-02T15Z07:00"
case 1:
format = "2006-01-02T15:04Z07:00"
default:
format = time.RFC3339
}
}
} else if parseInLocation {
format = dateLocal
} else {
format = dateWithZone
}
var t time.Time
var err error
if parseInLocation {
t, err = time.ParseInLocation(format, value, time.FixedZone(reference.Zone()))
} else {
t, err = time.Parse(format, value)
}
if err != nil {
// if there is a `-` then it's an RFC3339 like timestamp
if strings.Contains(value, "-") {
return "", err // was probably an RFC3339 like timestamp but the parser failed with an error
}
if _, _, err := parseTimestamp(value); err != nil {
return "", fmt.Errorf("failed to parse value as time or duration: %q", value)
}
return value, nil // unix timestamp in and out case (meaning: the value passed at the command line is already in the right format for passing to the server)
}
return fmt.Sprintf("%d.%09d", t.Unix(), int64(t.Nanosecond())), nil
}
// ParseTimestamps returns seconds and nanoseconds from a timestamp that has the
// format "%d.%09d" (time.Unix(), int64(time.Nanosecond())).
// If the incoming nanosecond portion is longer or shorter than 9 digits it is
// converted to nanoseconds. The expectation is that the seconds and
// nanoseconds will be used to create a time variable. For example:
// seconds, nanoseconds, err := ParseTimestamps("1136073600.000000001", 0)
// if err == nil since := time.Unix(seconds, nanoseconds)
// returns seconds as def(aultSeconds) if value == ""
func ParseTimestamps(value string, def int64) (int64, int64, error) {
if value == "" {
return def, 0, nil
}
return parseTimestamp(value)
}
func parseTimestamp(value string) (int64, int64, error) {
sa := strings.SplitN(value, ".", 2)
s, err := strconv.ParseInt(sa[0], 10, 64)
if err != nil {
return s, 0, err
}
if len(sa) != 2 {
return s, 0, nil
}
n, err := strconv.ParseInt(sa[1], 10, 64)
if err != nil {
return s, n, err
}
// should already be in nanoseconds but just in case convert n to nanoseconds
n = int64(float64(n) * math.Pow(float64(10), float64(9-len(sa[1]))))
return s, n, nil
}
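The fractional-digit scaling at the end of `parseTimestamp` (a fraction with d digits is multiplied by 10^(9-d) to yield nanoseconds) can be sketched as follows; this is an illustrative Python re-implementation, not part of the package:

```python
def parse_timestamp(value):
    # Split "seconds.fraction" on the first dot, mirroring strings.SplitN
    # in the Go code above.
    seconds_part, _, frac_part = value.partition(".")
    seconds = int(seconds_part)
    if not frac_part:
        return seconds, 0
    # A fraction with d digits is scaled by 10**(9 - d), so ".5" becomes
    # 500000000 ns and a full 9-digit fraction is taken as-is.
    nanos = int(frac_part) * 10 ** (9 - len(frac_part))
    return seconds, nanos

print(parse_timestamp("1136073600.000000001"))  # (1136073600, 1)
print(parse_timestamp("1136073600.5"))          # (1136073600, 500000000)
```

Note that integer arithmetic is used here, which sidesteps the float rounding that the `math.Pow` conversion in the Go original could in principle introduce.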
<!DOCTYPE html>
<html lang="en">
<head>
<meta http-equiv="refresh" content="0;URL=../../../../libc/unix/notbsd/linux/constant._SC_COLL_WEIGHTS_MAX.html">
</head>
<body>
<p>Redirecting to <a href="../../../../libc/unix/notbsd/linux/constant._SC_COLL_WEIGHTS_MAX.html">../../../../libc/unix/notbsd/linux/constant._SC_COLL_WEIGHTS_MAX.html</a>...</p>
<script>location.replace("../../../../libc/unix/notbsd/linux/constant._SC_COLL_WEIGHTS_MAX.html" + location.search + location.hash);</script>
</body>
</html>
#
# Project Wok
#
# Copyright IBM Corp, 2015-2017
#
# Code derived from Project Kimchi
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import gettext
_ = gettext.gettext
messages = {
'WOKAPI0002E': _('Delete is not allowed for %(resource)s'),
'WOKAPI0003E': _('%(resource)s does not implement update method'),
'WOKAPI0005E': _('Create is not allowed for %(resource)s'),
'WOKAPI0006E': _('Unable to parse JSON request'),
'WOKAPI0007E': _('This API only supports JSON'),
    'WOKAPI0008E': _('Parameters do not match the requirements in the schema: %(err)s'),
'WOKAPI0009E': _("You don't have permission to perform this operation."),
'WOKASYNC0001E': _('Unable to find task id: %(id)s'),
'WOKASYNC0002E': _('There is no callback to execute the kill task process.'),
    'WOKASYNC0003E': _("Timeout of %(seconds)s seconds expired while running task '%(task)s'."),
'WOKASYNC0004E': _('Unable to kill task due error: %(err)s'),
'WOKAUTH0001E': _("Authentication failed for user '%(username)s'. [Error code: %(code)s]"),
'WOKAUTH0002E': _('You are not authorized to access Wok. Please, login first.'),
'WOKAUTH0003E': _('Specify username to login into Wok.'),
    'WOKAUTH0004E': _('You have failed to log in too many times. Please wait for %(seconds)s seconds to try again.'),
'WOKAUTH0005E': _('Invalid LDAP configuration: %(item)s : %(value)s'),
'WOKAUTH0006E': _('Specify password to login into Wok.'),
'WOKAUTH0007E': _('You need to specify username and password to login into Wok.'),
    'WOKAUTH0008E': _('The username or password you entered is incorrect. Please try again.'),
'WOKLOG0001E': _('Invalid filter parameter. Filter parameters allowed: %(filters)s'),
'WOKLOG0002E': _('Creation of log file failed: %(err)s'),
'WOKNOT0001E': _('Unable to find notification %(id)s'),
'WOKNOT0002E': _('Unable to delete notification %(id)s: %(message)s'),
'WOKOBJST0001E': _('Unable to find %(item)s in datastore'),
'WOKUTILS0002E': _("Timeout while running command '%(cmd)s' after %(seconds)s seconds"),
'WOKUTILS0004E': _("Invalid data value '%(value)s'"),
'WOKUTILS0005E': _("Invalid data unit '%(unit)s'"),
'WOKCONFIG0001I': _('WoK is going to restart. Existing WoK connections will be closed.'),
'WOKPLUGIN0001E': _('Unable to find plug-in %(name)s'),
# These messages (ending with L) are for user log purposes
'WOKASYNC0001L': _("Successfully completed task '%(target_uri)s'"),
'WOKASYNC0002L': _("Failed to complete task '%(target_uri)s'"),
'WOKCOL0001L': _('Request made on collection'),
'WOKCONFIG0001L': _('Wok reload'),
'WOKRES0001L': _('Request made on resource'),
'WOKROOT0001L': _("User '%(username)s' login"),
'WOKROOT0002L': _("User '%(username)s' logout"),
'WOKPLUGIN0001L': _('Enable plug-in %(ident)s.'),
'WOKPLUGIN0002L': _('Disable plug-in %(ident)s.'),
}
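Each message above relies on Python's dict-based %-formatting with named placeholders. A minimal sketch of how a caller would render one (the resource path and task id below are made up for illustration):

```python
# Minimal illustration of rendering Wok-style messages whose bodies use
# %(name)s placeholders; the codes and templates come from the table above.
messages = {
    'WOKAPI0002E': 'Delete is not allowed for %(resource)s',
    'WOKASYNC0001E': 'Unable to find task id: %(id)s',
}

def format_message(code, **params):
    # %-formatting against a dict substitutes each %(key)s placeholder.
    return messages[code] % params

print(format_message('WOKAPI0002E', resource='/tasks/1'))
print(format_message('WOKASYNC0001E', id=42))
```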
<div class="apiDetail">
<div>
<h2><span>Boolean / Function(treeId, treeNodes, targetNode)</span><span class="path">setting.edit.drag.</span>prev</h2>
		<h3>Overview<span class="h3_info">[ depends on the <span class="highlight_green">jquery.ztree.exedit</span> extension js ]</span></h3>
<div class="desc">
<p></p>
<div class="longdesc">
				<p>When dragging onto a target node, controls whether the dragged nodes may be moved to the position before the target node. <span class="highlight_red">[takes effect when setting.edit.enable = true]</span></p>
				<p class="highlight_red">When the drag target is the root, prev and next are not triggered; only inner is triggered.</p>
				<p class="highlight_red">The main purpose of this feature is to impose appropriate restrictions on dragging (the helper arrows); it needs to be used together with next and inner to achieve the full functionality.</p>
				<p>Default value: true</p>
</div>
</div>
		<h3>Boolean format</h3>
<div class="desc">
			<p> true / false mean that moving to the position before the target node is allowed / not allowed, respectively</p>
</div>
		<h3>Function parameters</h3>
<div class="desc">
<h4><b>treeId</b><span>String</span></h4>
			<p>The <b class="highlight_red">treeId</b> of the zTree, provided for the user's convenience (when dragging between multiple trees, it is the treeId of the tree containing the target node)</p>
<h4 class="topLine"><b>treeNodes</b><span>Array(JSON)</span></h4>
			<p>Collection of JSON data objects of the nodes being dragged</p>
<h4 class="topLine"><b>targetNode</b><span>JSON</span></h4>
			<p>JSON data object of the target node of the drag operation</p>
			<h4 class="topLine"><b>Return value</b><span>Boolean</span></h4>
			<p>The return value follows the same rules as the Boolean format</p>
</div>
		<h3>setting & function examples</h3>
		<h4>1. Forbid dragging to the position before any node</h4>
<pre xmlns=""><code>var setting = {
edit: {
enable: true,
drag: {
prev: false,
next: true,
inner: true
}
}
};
......</code></pre>
		<h4>2. Forbid dragging to the position before a parent node</h4>
<pre xmlns=""><code>function canPrev(treeId, nodes, targetNode) {
return !targetNode.isParent;
}
var setting = {
edit: {
enable: true,
drag: {
prev: canPrev,
next: true,
inner: true
}
}
};
......</code></pre>
</div>
</div>
---
author: dansimp
ms.author: dansimp
ms.date: 10/02/2018
ms.reviewer:
audience: itpro
manager: dansimp
ms.prod: edge
ms.topic: include
---
By default, Microsoft Edge shows the localhost IP address while making calls using the WebRTC protocol. Enabling this policy hides localhost IP addresses.
{ Options form for the lazarus package manager
Copyright (C) 2011 Darius Blaszyk
This library is free software; you can redistribute it and/or modify it
under the terms of the GNU Library General Public License as published by
the Free Software Foundation; either version 2 of the License, or (at your
option) any later version with the following modification:
As a special exception, the copyright holders of this library give you
permission to link this library with independent modules to produce an
  executable, regardless of the license terms of these independent modules, and
to copy and distribute the resulting executable under terms of your choice,
provided that you also meet, for each linked independent module, the terms
and conditions of the license of that module. An independent module is a
module which is not derived from or based on this library. If you modify
this library, you may extend this exception to your version of the library,
but you are not obligated to do so. If you do not wish to do so, delete this
exception statement from your version.
This program is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE. See the GNU Library General Public License
for more details.
You should have received a copy of the GNU Library General Public License
along with this library; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street - Fifth Floor, Boston, MA 02110-1335, USA.
}
unit fppkg_optionsfrm;
{$mode objfpc}{$H+}
interface
uses
SysUtils, Forms, Dialogs, ComCtrls, ButtonPanel, StdCtrls, ExtCtrls, Buttons,
pkgoptions, pkgglobals;
type
TPkgColumn = record
Name: string;
Visible: boolean;
end;
{ TPackageManagerOption }
TPackageManagerOption = class(TObject)
private
FPkgColumnCount: integer;
FPkgColumns: array of TPkgColumn;
FVerbosity: TLogLevels;
function GetPkgColumns(index: integer): TPkgColumn;
procedure SetPkgColumnCount(const AValue: integer);
procedure SetPkgColumns(index: integer; const AValue: TPkgColumn);
procedure AddPkgColumn(Name: string; Visible: boolean);
procedure SetVerbosity(const AValue: TLogLevels);
public
constructor Create;
destructor Destroy; override;
property PkgColumns[index: integer]: TPkgColumn read GetPkgColumns write SetPkgColumns;
property PkgColumnCount: integer read FPkgColumnCount write SetPkgColumnCount;
function PkgColumnByName(AName: string): integer;
property Verbosity: TLogLevels read FVerbosity write SetVerbosity;
end;
{ TOptionsForm }
TOptionsForm = class(TForm)
Button1: TButton;
CheckBox5: TCheckBox;
CheckBox6: TCheckBox;
RemoveFromVisibleColumnsButton: TSpeedButton;
VisibleColumnsLabel: TLabel;
VisibleColumnsListBox: TListBox;
AvailableColumnsLabel: TLabel;
AvailableColumnsListBox: TListBox;
AddToVisibleColumnsButton: TSpeedButton;
lblMiddle: TLabel;
UserInterfaceTabSheet: TTabSheet;
VerbosityCheckGroup: TCheckGroup;
ComboBox1: TComboBox;
ComboBox2: TComboBox;
CompilerConfigCheckBox2: TCheckBox;
CompilerOptionsButton: TButton;
ButtonPanel1: TButtonPanel;
CheckBox1: TCheckBox;
CheckBox2: TCheckBox;
CheckBox3: TCheckBox;
CheckBox4: TCheckBox;
CompilerConfigCheckBox: TCheckBox;
CompilerConfigCheckBox1: TCheckBox;
CompilerConfigEdit: TEdit;
CompilerOptionsButton1: TButton;
CompilerOptionsEdit: TEdit;
CompilerOptionsEdit1: TEdit;
CompilerOptionsGroupBox1: TGroupBox;
Edit1: TEdit;
GlobalListView: TListView;
CompilerListView: TListView;
FPMakeListView: TListView;
FPMakePageControl: TPageControl;
GlobalTabSheet: TTabSheet;
CompilerTabSheet: TTabSheet;
FPMakeTabSheet: TTabSheet;
GroupBox1: TGroupBox;
CompilerOptionsGroupBox: TGroupBox;
Label1: TLabel;
Label2: TLabel;
Label3: TLabel;
OpenDialog: TOpenDialog;
ConfigTabSheet: TTabSheet;
TabSheet1: TTabSheet;
procedure AddToVisibleColumnsButtonClick(Sender: TObject);
procedure Button1Click(Sender: TObject);
procedure CancelButtonClick(Sender: TObject);
procedure FormCreate(Sender: TObject);
procedure FormShow(Sender: TObject);
procedure OKButtonClick(Sender: TObject);
procedure RemoveFromVisibleColumnsButtonClick(Sender: TObject);
private
procedure SetupColumnVisibility;
public
end;
var
OptionsForm: TOptionsForm;
LazPkgOptions: TPackageManagerOption;
implementation
{$R *.lfm}
uses
fppkg_const;
{ TPackageManagerOption }
function TPackageManagerOption.GetPkgColumns(index: integer): TPkgColumn;
begin
Result := FPkgColumns[index];
end;
procedure TPackageManagerOption.SetPkgColumnCount(const AValue: integer);
begin
if FPkgColumnCount=AValue then exit;
FPkgColumnCount:=AValue;
SetLength(FPkgColumns,FPkgColumnCount);
end;
procedure TPackageManagerOption.SetPkgColumns(index: integer;
const AValue: TPkgColumn);
begin
FPkgColumns[index] := AValue;
end;
constructor TPackageManagerOption.Create;
begin
PkgColumnCount := 0;
AddPkgColumn('Name', True);
AddPkgColumn('State', True);
AddPkgColumn('Version', True);
AddPkgColumn('Info', True);
AddPkgColumn('Description', False);
AddPkgColumn('Keywords', False);
AddPkgColumn('Category', False);
AddPkgColumn('Support', False);
{ AddPkgColumn('Author', False);
AddPkgColumn('License', False);
AddPkgColumn('HomepageURL', False);
AddPkgColumn('DownloadURL', False);
AddPkgColumn('FileName', False);
AddPkgColumn('Email', False);
AddPkgColumn('OS', False);
AddPkgColumn('CPU', False);
}
Verbosity := DefaultLogLevels;
end;
destructor TPackageManagerOption.Destroy;
begin
inherited Destroy;
end;
function TPackageManagerOption.PkgColumnByName(AName: string): integer;
var
i: integer;
begin
for i := 0 to PkgColumnCount - 1 do
if FPkgColumns[i].Name = AName then
begin
Result := i;
Exit;
end;
end;
procedure TPackageManagerOption.AddPkgColumn(Name: string;
Visible: boolean);
begin
PkgColumnCount := PkgColumnCount + 1;
FPkgColumns[PkgColumnCount-1].Name := Name;
FPkgColumns[PkgColumnCount-1].Visible := Visible;
end;
procedure TPackageManagerOption.SetVerbosity(const AValue: TLogLevels);
begin
if FVerbosity=AValue then exit;
FVerbosity:=AValue;
LogLevels := AValue;
end;
{ TOptionsForm }
procedure TOptionsForm.FormCreate(Sender: TObject);
procedure AddListItem(LV: TListView; ACaption, AValue: string);
var
li: TListItem;
begin
li := LV.Items.Add;
li.Caption:= ACaption;
li.SubItems.Add(AValue);
end;
begin
Caption := rsFppkgOptions;
//global
{ AddListItem(GlobalListView, rsRemoteMirrorsURL, GlobalOptions.GlobalSection.RemoteMirrorsURL);
AddListItem(GlobalListView, rsRemoteRepository, GlobalOptions.GlobalSection.RemoteRepository);
AddListItem(GlobalListView, rsLocalRepository, GlobalOptions.GlobalSection.LocalRepository);
AddListItem(GlobalListView, rsBuildDirectory, GlobalOptions.GlobalSection.BuildDir);
AddListItem(GlobalListView, rsArchivesDirectory, GlobalOptions.GlobalSection.ArchivesDir);
AddListItem(GlobalListView, rsCompilerConfigDirectory, GlobalOptions.GlobalSection.CompilerConfigDir);
AddListItem(GlobalListView, rsDefaultCompilerConfig, GlobalOptions.GlobalSection.CompilerConfig);
AddListItem(GlobalListView, rsFpmakeCompilerConfig, GlobalOptions.GlobalSection.FPMakeCompilerConfig);
AddListItem(GlobalListView, rsDownloader, GlobalOptions.GlobalSection.Downloader);
AddListItem(GlobalListView, rsCustomFpmakeOptions, GlobalOptions.GlobalSection.CustomFPMakeOptions);
//compiler
AddListItem(CompilerListView, rsCompiler, CompilerOptions.Compiler);
AddListItem(CompilerListView, rsCompilerTarget, CompilerOptions.CompilerTarget);
AddListItem(CompilerListView, rsCompilerVersion, CompilerOptions.CompilerVersion);
AddListItem(CompilerListView, rsGlobalPrefix, CompilerOptions.GlobalPrefix);
AddListItem(CompilerListView, rsLocalPrefix, CompilerOptions.LocalPrefix);
AddListItem(CompilerListView, rsGlobalInstallDir, CompilerOptions.GlobalInstallDir);
AddListItem(CompilerListView, rsLocalInstallDir, CompilerOptions.LocalInstallDir);
AddListItem(CompilerListView, rsOptions, CompilerOptions.Options.DelimitedText);
}
//fpmake
// Load FPMake compiler config, this is normally the same config as above
{ AddListItem(FPMakeListView, rsCompiler, FPMakeCompilerOptions.Compiler);
AddListItem(FPMakeListView, rsCompilerTarget, FPMakeCompilerOptions.CompilerTarget);
AddListItem(FPMakeListView, rsCompilerVersion, FPMakeCompilerOptions.CompilerVersion);
AddListItem(FPMakeListView, rsGlobalPrefix, FPMakeCompilerOptions.GlobalPrefix);
AddListItem(FPMakeListView, rsLocalPrefix, FPMakeCompilerOptions.LocalPrefix);
AddListItem(FPMakeListView, rsGlobalInstallDir, FPMakeCompilerOptions.GlobalInstallDir);
AddListItem(FPMakeListView, rsLocalInstallDir, FPMakeCompilerOptions.LocalInstallDir);
AddListItem(FPMakeListView, rsOptions, FPMakeCompilerOptions.Options.DelimitedText);
}
FPMakePageControl.ActivePage := ConfigTabSheet;
end;
procedure TOptionsForm.FormShow(Sender: TObject);
begin
SetupColumnVisibility;
//setup verbosity
with VerbosityCheckGroup do
begin
Checked[Items.IndexOf('Error')] := llError in LazPkgOptions.Verbosity;
Checked[Items.IndexOf('Warning')] := llWarning in LazPkgOptions.Verbosity;
Checked[Items.IndexOf('Info')] := llInfo in LazPkgOptions.Verbosity;
Checked[Items.IndexOf('Commands')] := llCommands in LazPkgOptions.Verbosity;
Checked[Items.IndexOf('Debug')] := llDebug in LazPkgOptions.Verbosity;
Checked[Items.IndexOf('Progress')] := llProgres in LazPkgOptions.Verbosity;
end;
end;
procedure TOptionsForm.OKButtonClick(Sender: TObject);
begin
//save the data to settings file
//save verbosity
LazPkgOptions.Verbosity := [];
with VerbosityCheckGroup do
begin
if Checked[Items.IndexOf('Error')] then
LazPkgOptions.Verbosity := LazPkgOptions.Verbosity + [llError];
if Checked[Items.IndexOf('Warning')] then
LazPkgOptions.Verbosity := LazPkgOptions.Verbosity + [llWarning];
if Checked[Items.IndexOf('Info')] then
LazPkgOptions.Verbosity := LazPkgOptions.Verbosity + [llInfo];
if Checked[Items.IndexOf('Commands')] then
LazPkgOptions.Verbosity := LazPkgOptions.Verbosity + [llCommands];
if Checked[Items.IndexOf('Debug')] then
LazPkgOptions.Verbosity := LazPkgOptions.Verbosity + [llDebug];
if Checked[Items.IndexOf('Progress')] then
LazPkgOptions.Verbosity := LazPkgOptions.Verbosity + [llProgres];
end;
Close;
end;
procedure TOptionsForm.RemoveFromVisibleColumnsButtonClick(Sender: TObject);
var
i: integer;
pkg: TPkgColumn;
c: integer;
begin
i := 0;
c := -1;
while i < VisibleColumnsListBox.Items.Count do
begin
if VisibleColumnsListBox.Selected[i] then
c := i;
Inc(i);
end;
if c <> -1 then
begin
i := LazPkgOptions.PkgColumnByName(VisibleColumnsListBox.Items[c]);
pkg := LazPkgOptions.PkgColumns[i];
pkg.Visible := False;
LazPkgOptions.PkgColumns[i] := pkg;
SetupColumnVisibility;
end;
end;
procedure TOptionsForm.SetupColumnVisibility;
var
i: integer;
begin
VisibleColumnsListBox.Clear;
AvailableColumnsListBox.Clear;
for i := 0 to LazPkgOptions.PkgColumnCount - 1 do
begin
if LazPkgOptions.PkgColumns[i].Visible then
VisibleColumnsListBox.Items.Add(LazPkgOptions.PkgColumns[i].Name)
else
AvailableColumnsListBox.Items.Add(LazPkgOptions.PkgColumns[i].Name)
end;
end;
procedure TOptionsForm.Button1Click(Sender: TObject);
begin
if OpenDialog.Execute then
CompilerConfigEdit.Text := OpenDialog.FileName;
end;
procedure TOptionsForm.AddToVisibleColumnsButtonClick(Sender: TObject);
var
i: integer;
pkg: TPkgColumn;
c: integer;
begin
i := 0;
c := -1;
while i < AvailableColumnsListBox.Items.Count do
begin
if AvailableColumnsListBox.Selected[i] then
c := i;
Inc(i);
end;
if c <> -1 then
begin
i := LazPkgOptions.PkgColumnByName(AvailableColumnsListBox.Items[c]);
pkg := LazPkgOptions.PkgColumns[i];
pkg.Visible := True;
LazPkgOptions.PkgColumns[i] := pkg;
SetupColumnVisibility;
end;
end;
procedure TOptionsForm.CancelButtonClick(Sender: TObject);
begin
Close;
end;
initialization
LazPkgOptions := TPackageManagerOption.Create;
finalization
FreeAndNil(LazPkgOptions);
end.
// -------------- Configuration --------------
// CloudForms
def opentlc_creds = 'b93d2da4-c2b7-45b5-bf3b-ee2c08c6368e'
def opentlc_admin_creds = '73b84287-8feb-478a-b1f2-345fd0a1af47'
def cf_uri = 'https://labs.opentlc.com'
def cf_group = 'opentlc-access-cicd'
// IMAP
def imap_creds = 'd8762f05-ca66-4364-adf2-bc3ce1dca16c'
def imap_server = 'imap.gmail.com'
// Notifications
def notification_email = '[email protected]'
def rocketchat_hook = '5d28935e-f7ca-4b11-8b8e-d7a7161a013a'
// SSH key
def ssh_creds = '15e1788b-ed3c-4b18-8115-574045f32ce4'
// Admin host ssh location is in a credential too
def ssh_admin_host = 'admin-host-na'
// state variables
def guid=''
def ssh_location = ''
// Catalog items
def choices = [
'OPENTLC OpenShift Labs / OpenShift Implementation 3.11 Lab',
].join("\n")
def ocprelease_choice = [
'3.11.59',
].join("\n")
def region_choice = [
'na',
'emea',
'latam',
'apac',
].join("\n")
def nodes_choice = [
'3',
'1',
'2',
'4',
'5',
'6',
'7',
'8',
].join("\n")
pipeline {
agent any
options {
buildDiscarder(logRotator(daysToKeepStr: '30'))
}
parameters {
booleanParam(
defaultValue: false,
description: 'wait for user input before deleting the environment',
name: 'confirm_before_delete'
)
choice(
choices: choices,
description: 'Catalog item',
name: 'catalog_item',
)
choice(
choices: ocprelease_choice,
            description: 'OpenShift release',
name: 'ocprelease',
)
choice(
choices: nodes_choice,
description: 'Number of Nodes',
name: 'nodes',
)
choice(
choices: region_choice,
description: 'Region',
name: 'region',
)
}
stages {
stage('order from CF') {
environment {
uri = "${cf_uri}"
credentials = credentials("${opentlc_creds}")
DEBUG = 'true'
}
            /* This step uses the order_svc_guid.sh script to order
a service from CloudForms */
steps {
git url: 'https://github.com/redhat-gpte-devopsautomation/cloudforms-oob'
script {
def catalog = params.catalog_item.split(' / ')[0].trim()
def item = params.catalog_item.split(' / ')[1].trim()
def ocprelease = params.ocprelease.trim()
def nodes = params.nodes.trim()
def region = params.region.trim()
echo "'${catalog}' '${item}'"
guid = sh(
returnStdout: true,
script: """
./opentlc/order_svc_guid.sh \
-c '${catalog}' \
-i '${item}' \
-G '${cf_group}' \
-d 'status=t,expiration=7,runtime=8,ocprelease=${ocprelease},region=${region},nodes=${nodes},check=t,quotacheck=t'
"""
).trim()
echo "GUID is '${guid}'"
}
}
}
/* Skip this step because sometimes the completed email arrives
before the 'has started' email
stage('Wait for first email') {
environment {
credentials=credentials("${imap_creds}")
}
steps {
sh """./tests/jenkins/downstream/poll_email.py \
--server '${imap_server}' \
--guid ${guid} \
--timeout 20 \
--filter 'has started'"""
}
}
*/
stage('Wait for last email and parse SSH location') {
environment {
credentials=credentials("${imap_creds}")
}
steps {
git url: 'https://github.com/redhat-cop/agnosticd',
branch: 'development'
script {
email = sh(
returnStdout: true,
script: """
./tests/jenkins/downstream/poll_email.py \
--server '${imap_server}' \
--guid ${guid} \
--timeout 90 \
--filter 'has completed'
"""
).trim()
try {
def m = email =~ /<pre>. *ssh -i [^ ]+ *([^ <]+?) *<\/pre>/
ssh_location = m[0][1]
echo "ssh_location = '${ssh_location}'"
} catch(Exception ex) {
echo "Could not parse email:"
echo email
echo ex.toString()
throw ex
}
}
}
}
stage('SSH') {
steps {
withCredentials([
sshUserPrivateKey(
credentialsId: ssh_creds,
keyFileVariable: 'ssh_key',
usernameVariable: 'ssh_username')
]) {
sh "ssh -o StrictHostKeyChecking=no -i ${ssh_key} ${ssh_location} w"
sh "ssh -o StrictHostKeyChecking=no -i ${ssh_key} ${ssh_location} oc version"
}
}
}
stage('Confirm before retiring') {
when {
expression {
return params.confirm_before_delete
}
}
steps {
input "Continue ?"
}
}
stage('Retire service from CF') {
environment {
uri = "${cf_uri}"
credentials = credentials("${opentlc_creds}")
admin_credentials = credentials("${opentlc_admin_creds}")
DEBUG = 'true'
}
/* This step uses the delete_svc_guid.sh script to retire
the service from CloudForms */
steps {
git 'https://github.com/redhat-gpte-devopsautomation/cloudforms-oob'
sh "./opentlc/delete_svc_guid.sh '${guid}'"
}
post {
failure {
withCredentials([usernameColonPassword(credentialsId: imap_creds, variable: 'credentials')]) {
mail(
subject: "${env.JOB_NAME} (${env.BUILD_NUMBER}) failed retiring for GUID=${guid}",
body: "It appears that ${env.BUILD_URL} is failing, somebody should do something about that.\nMake sure GUID ${guid} is destroyed.",
to: "${notification_email}",
replyTo: "${notification_email}",
from: credentials.split(':')[0]
)
}
withCredentials([string(credentialsId: rocketchat_hook, variable: 'HOOK_URL')]) {
sh(
"""
curl -H 'Content-Type: application/json' \
-X POST '${HOOK_URL}' \
-d '{\"username\": \"jenkins\", \"icon_url\": \"https://dev-sfo01.opentlc.com/static/81c91982/images/headshot.png\", \"text\": \"@here :rage: ${env.JOB_NAME} (${env.BUILD_NUMBER}) failed retiring ${guid}.\"}'\
""".trim()
)
}
}
}
}
stage('Wait for deletion email') {
steps {
git url: 'https://github.com/redhat-cop/agnosticd',
branch: 'development'
withCredentials([usernameColonPassword(credentialsId: imap_creds, variable: 'credentials')]) {
sh """./tests/jenkins/downstream/poll_email.py \
--guid ${guid} \
--timeout 20 \
--server '${imap_server}' \
--filter 'has been deleted'"""
}
}
}
}
post {
failure {
git 'https://github.com/redhat-gpte-devopsautomation/cloudforms-oob'
/* retire in case of failure */
withCredentials(
[
usernameColonPassword(credentialsId: opentlc_creds, variable: 'credentials'),
usernameColonPassword(credentialsId: opentlc_admin_creds, variable: 'admin_credentials')
]
) {
sh """
export uri="${cf_uri}"
export DEBUG=true
./opentlc/delete_svc_guid.sh '${guid}'
"""
}
/* Print ansible logs */
withCredentials([
string(credentialsId: ssh_admin_host, variable: 'ssh_admin'),
sshUserPrivateKey(
credentialsId: ssh_creds,
keyFileVariable: 'ssh_key',
usernameVariable: 'ssh_username')
]) {
sh("""
ssh -o StrictHostKeyChecking=no -i ${ssh_key} ${ssh_admin} \
"find deployer_logs -name '*${guid}*log' | xargs cat"
""".trim()
)
}
withCredentials([usernameColonPassword(credentialsId: imap_creds, variable: 'credentials')]) {
mail(
subject: "${env.JOB_NAME} (${env.BUILD_NUMBER}) failed GUID=${guid}",
body: "It appears that ${env.BUILD_URL} is failing, somebody should do something about that.",
to: "${notification_email}",
replyTo: "${notification_email}",
from: credentials.split(':')[0]
)
}
withCredentials([string(credentialsId: rocketchat_hook, variable: 'HOOK_URL')]) {
sh(
"""
curl -H 'Content-Type: application/json' \
-X POST '${HOOK_URL}' \
-d '{\"username\": \"jenkins\", \"icon_url\": \"https://dev-sfo01.opentlc.com/static/81c91982/images/headshot.png\", \"text\": \"@here :rage: ${env.JOB_NAME} (${env.BUILD_NUMBER}) failed GUID=${guid}. It appears that ${env.BUILD_URL}/console is failing, somebody should do something about that.\"}'\
""".trim()
)
}
}
fixed {
withCredentials([string(credentialsId: rocketchat_hook, variable: 'HOOK_URL')]) {
sh(
"""
curl -H 'Content-Type: application/json' \
-X POST '${HOOK_URL}' \
-d '{\"username\": \"jenkins\", \"icon_url\": \"https://dev-sfo01.opentlc.com/static/81c91982/images/headshot.png\", \"text\": \"@here :smile: ${env.JOB_NAME} is now FIXED, see ${env.BUILD_URL}/console\"}'\
""".trim()
)
}
}
}
}
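The 'Wait for deletion email' stage above shells out to `poll_email.py` with a `--timeout`; the retry-until-deadline pattern such a script relies on can be sketched as follows (hypothetical helper, not the real poll_email.py):

```python
import time

def poll_until(check, timeout_s, interval_s=1.0, sleep=time.sleep):
    """Re-run `check` until it returns True or the deadline passes."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if check():
            return True
        sleep(interval_s)
    return False

# Deterministic demo: the check succeeds on the third attempt.
attempts = []

def check():
    attempts.append(1)
    return len(attempts) >= 3

assert poll_until(check, timeout_s=10, interval_s=0, sleep=lambda s: None)
assert len(attempts) == 3
```

In the pipeline, a non-zero exit from the polling script is what trips the `failure` handlers.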
| {
"pile_set_name": "Github"
} |
#ifdef USE_CUDNN
#include <vector>
#include "thrust/device_vector.h"
#include "caffe/layers/cudnn_softmax_layer.hpp"
namespace caffe {
template <typename Dtype>
void CuDNNSoftmaxLayer<Dtype>::LayerSetUp(const vector<Blob<Dtype>*>& bottom,
const vector<Blob<Dtype>*>& top) {
SoftmaxLayer<Dtype>::LayerSetUp(bottom, top);
// Initialize CUDNN.
CUDNN_CHECK(cudnnCreate(&handle_));
cudnn::createTensor4dDesc<Dtype>(&bottom_desc_);
cudnn::createTensor4dDesc<Dtype>(&top_desc_);
handles_setup_ = true;
}
template <typename Dtype>
void CuDNNSoftmaxLayer<Dtype>::Reshape(const vector<Blob<Dtype>*>& bottom,
const vector<Blob<Dtype>*>& top) {
SoftmaxLayer<Dtype>::Reshape(bottom, top);
int N = this->outer_num_;
int K = bottom[0]->shape(this->softmax_axis_);
int H = this->inner_num_;
int W = 1;
cudnn::setTensor4dDesc<Dtype>(&bottom_desc_, N, K, H, W);
cudnn::setTensor4dDesc<Dtype>(&top_desc_, N, K, H, W);
}
template <typename Dtype>
CuDNNSoftmaxLayer<Dtype>::~CuDNNSoftmaxLayer() {
// Check that handles have been setup before destroying.
if (!handles_setup_) { return; }
cudnnDestroyTensorDescriptor(bottom_desc_);
cudnnDestroyTensorDescriptor(top_desc_);
cudnnDestroy(handle_);
}
INSTANTIATE_CLASS(CuDNNSoftmaxLayer);
} // namespace caffe
#endif
import Button from "./components/Button";
import SubmitButton from "./components/SubmitButton";
export {
Button,
SubmitButton
}
<?php
// Logging implementation below
interface ILogHandler
{
public function write($msg);
}
class CLogFileHandler implements ILogHandler
{
private $handle = null;
public function __construct($file = '')
{
$this->handle = fopen($file,'a');
}
	public function write($msg)
	{
		if ($this->handle) {
			fwrite($this->handle, $msg, 4096);
		}
	}
	public function __destruct()
	{
		if ($this->handle) {
			fclose($this->handle);
		}
	}
}
class Log
{
private $handler = null;
private $level = 15;
private static $instance = null;
private function __construct(){}
private function __clone(){}
public static function Init($handler = null,$level = 15)
{
if(!self::$instance instanceof self)
{
self::$instance = new self();
self::$instance->__setHandle($handler);
self::$instance->__setLevel($level);
}
return self::$instance;
}
private function __setHandle($handler){
$this->handler = $handler;
}
private function __setLevel($level)
{
$this->level = $level;
}
public static function DEBUG($msg)
{
self::$instance->write(1, $msg);
}
public static function WARN($msg)
{
self::$instance->write(4, $msg);
}
public static function ERROR($msg)
{
$debugInfo = debug_backtrace();
$stack = "[";
foreach($debugInfo as $key => $val){
if(array_key_exists("file", $val)){
$stack .= ",file:" . $val["file"];
}
if(array_key_exists("line", $val)){
$stack .= ",line:" . $val["line"];
}
if(array_key_exists("function", $val)){
$stack .= ",function:" . $val["function"];
}
}
$stack .= "]";
self::$instance->write(8, $stack . $msg);
}
public static function INFO($msg)
{
self::$instance->write(2, $msg);
}
private function getLevelStr($level)
{
switch ($level)
{
case 1:
return 'debug';
break;
case 2:
return 'info';
break;
case 4:
return 'warn';
break;
case 8:
return 'error';
break;
			default:
				return 'unknown';
}
}
protected function write($level,$msg)
{
if(($level & $this->level) == $level )
{
$msg = '['.date('Y-m-d H:i:s').']['.$this->getLevelStr($level).'] '.$msg."\n";
$this->handler->write($msg);
}
}
}
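The write() guard above, `($level & $this->level) == $level`, treats the configured level as a bitmask — DEBUG=1, INFO=2, WARN=4, ERROR=8, so the default of 15 enables all four. A minimal Python sketch of the same filtering test:

```python
# Bitmask log filtering, mirroring `($level & $this->level) == $level`
# in the PHP Log class above. Level values follow the PHP code:
# DEBUG=1, INFO=2, WARN=4, ERROR=8, so a mask of 15 enables everything.
DEBUG, INFO, WARN, ERROR = 1, 2, 4, 8

def should_write(enabled_mask, level):
    # A message is written only if its level bit is set in the mask.
    return (level & enabled_mask) == level

assert should_write(15, ERROR)
assert should_write(DEBUG | INFO, INFO)
assert not should_write(DEBUG | INFO, ERROR)
```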
/*
* Copyright (c) 2016-2020 VMware, Inc. All Rights Reserved.
* This software is released under MIT license.
* The full license information can be found in LICENSE in the root directory of this project.
*/
import { registerElementSafely } from '@clr/core/internal';
import { CdsTestDropdown } from './test-dropdown.element.js';
registerElementSafely('cds-test-dropdown', CdsTestDropdown);
declare global {
interface HTMLElementTagNameMap {
'cds-test-dropdown': CdsTestDropdown;
}
}
module namespace foaf = "http://www.w3.org/TestModules/foaf";
import module namespace ddl = "http://zorba.io/modules/store/static/collections/ddl";
import module namespace dml = "http://zorba.io/modules/store/static/collections/dml";
import module namespace index_ddl = "http://zorba.io/modules/store/static/indexes/ddl";
import module namespace index_dml = "http://zorba.io/modules/store/static/indexes/dml";
declare namespace ann = "http://zorba.io/annotations";
declare variable $foaf:network := xs:QName("foaf:network");
declare collection foaf:network as object()*;
(:
Create and populate the collection, and then create the indexes
:)
declare %ann:sequential function foaf:create-db()
{
ddl:create($foaf:network);
dml:insert($foaf:network, (
{
"name" : "James T. Kirk",
"age" : 30,
"gender" : "male",
"friends" : [ "Mister Spock", "Scotty", "Jean-Luc Picard"]
},
{
"name" : "Jean-Luc Picard",
"age" : 40,
"gender" : "male",
"friends" : [ "James T. Kirk", "Lieutenant Commander Data", "Beverly Crusher" ]
},
{
"name" : "Beverly Crusher",
"age" : 38,
"gender" : "female",
"friends" : [ "Jean-Luc Picard", "Ensign Crusher" ]
},
{
"name" : "Lieutenant Commander Data",
"age" : 100,
"gender" : "positronic matrix",
"friends" : [ "Geordi La Forge" ]
}
));
};
################################################################################
# Created by Horst Hunger 2008-05-07 #
# #
# Wrapper for 64 bit machines #
################################################################################
--source include/have_64bit.inc
--source suite/sys_vars/inc/min_examined_row_limit_basic.inc
# gas comment with ``gnu'' style quotes
/* some directive tests */
.byte 0xff
.byte 1, 2, 3
.short 1, 2, 3
.word 1, 2, 3
.long 1, 2, 3
.int 1, 2, 3
.align 8
.byte 1
/* .align 16, 0x90 gas is too clever for us with 0x90 fill */
.align 16, 0x91 /* 0x91 tests the non-clever behaviour */
.skip 3
.skip 15, 0x90
.string "hello\0world"
/* some label tests */
movl %eax, %ebx
L1:
movl %eax, %ebx
mov 0x10000, %eax
L2:
movl $L2 - L1, %ecx
var1:
nop ; nop ; nop ; nop
mov var1, %eax
/* instruction tests */
movl %eax, %ebx
mov 0x10000, %eax
mov 0x10000, %ax
mov 0x10000, %al
mov %al, 0x10000
mov $1, %edx
mov $1, %dx
mov $1, %dl
movb $2, 0x100(%ebx,%edx,2)
movw $2, 0x100(%ebx,%edx,2)
movl $2, 0x100(%ebx,%edx,2)
movl %eax, 0x100(%ebx,%edx,2)
movl 0x100(%ebx,%edx,2), %edx
movw %ax, 0x100(%ebx,%edx,2)
mov %eax, 0x12(,%edx,2)
mov %cr3, %edx
mov %ecx, %cr3
movl %cr3, %eax
movl %tr3, %eax
movl %db3, %ebx
movl %dr6, %eax
movl %fs, %ecx
movl %ebx, %fs
movsbl 0x1000, %eax
movsbw 0x1000, %ax
movswl 0x1000, %eax
movzbl 0x1000, %eax
movzbw 0x1000, %ax
movzwl 0x1000, %eax
movzb 0x1000, %eax
movzb 0x1000, %ax
pushl %eax
pushw %ax
push %eax
push %cs
push %gs
push $1
push $100
popl %eax
popw %ax
pop %eax
pop %ds
pop %fs
xchg %eax, %ecx
xchg %edx, %eax
xchg %bx, 0x10000
xchg 0x10000, %ebx
xchg 0x10000, %dl
in $100, %al
in $100, %ax
in $100, %eax
in %dx, %al
in %dx, %ax
in %dx, %eax
inb %dx
inw %dx
inl %dx
out %al, $100
out %ax, $100
out %eax, $100
/* NOTE: gas is bugged here, so size must be added */
outb %al, %dx
outw %ax, %dx
outl %eax, %dx
leal 0x1000(%ebx), %ecx
lea 0x1000(%ebx), %ecx
les 0x2000, %eax
lds 0x2000, %ebx
lfs 0x2000, %ecx
lgs 0x2000, %edx
lss 0x2000, %edx
addl $0x123, %eax
add $0x123, %ebx
addl $0x123, 0x100
addl $0x123, 0x100(%ebx)
addl $0x123, 0x100(%ebx,%edx,2)
addl $0x123, 0x100(%esp)
addl $0x123, (3*8)(%esp)
addl $0x123, (%ebp)
addl $0x123, (%esp)
cmpl $0x123, (%esp)
add %eax, (%ebx)
add (%ebx), %eax
or %dx, (%ebx)
or (%ebx), %si
add %cl, (%ebx)
add (%ebx), %dl
inc %edx
incl 0x10000
incb 0x10000
dec %dx
test $1, %al
test $1, %cl
testl $1, 0x1000
testb $1, 0x1000
testw $1, 0x1000
test %eax, %ebx
test %eax, 0x1000
test 0x1000, %edx
not %edx
notw 0x10000
notl 0x10000
notb 0x10000
neg %edx
negw 0x10000
negl 0x10000
negb 0x10000
imul %ecx
mul %edx
mulb %cl
imul %eax, %ecx
imul 0x1000, %cx
imul $10, %eax, %ecx
imul $10, %ax, %cx
imul $10, %eax
imul $0x1100000, %eax
imul $1, %eax
idivw 0x1000
div %ecx
div %bl
div %ecx, %eax
shl %edx
shl $10, %edx
shl %cl, %edx
shld $1, %eax, %edx
shld %cl, %eax, %edx
shld %eax, %edx
shrd $1, %eax, %edx
shrd %cl, %eax, %edx
shrd %eax, %edx
L4:
call 0x1000
call L4
call *%eax
call *0x1000
call func1
.global L5,L6
L5:
L6:
lcall $0x100, $0x1000
jmp 0x1000
jmp *%eax
jmp *0x1000
ljmp $0x100, $0x1000
ret
retl
ret $10
retl $10
lret
lret $10
enter $1234, $10
L3:
jo 0x1000
jnp 0x1001
jne 0x1002
jg 0x1003
jo L3
jnp L3
jne L3
jg L3
loopne L3
loopnz L3
loope L3
loopz L3
loop L3
jecxz L3
seto %al
setnp 0x1000
setl 0xaaaa
setg %dl
fadd
fadd %st(1), %st
fadd %st(0), %st(1)
fadd %st(3)
fmul %st(0),%st(0)
fmul %st(0),%st(1)
faddp %st(5)
faddp
faddp %st(1), %st
fadds 0x1000
fiadds 0x1002
faddl 0x1004
fiaddl 0x1006
fmul
fmul %st(1), %st
fmul %st(3)
fmulp %st(5)
fmulp
fmulp %st(1), %st
fmuls 0x1000
fimuls 0x1002
fmull 0x1004
fimull 0x1006
fsub
fsub %st(1), %st
fsub %st(3)
fsubp %st(5)
fsubp
fsubp %st(1), %st
fsubs 0x1000
fisubs 0x1002
fsubl 0x1004
fisubl 0x1006
fsubr
fsubr %st(1), %st
fsubr %st(3)
fsubrp %st(5)
fsubrp
fsubrp %st(1), %st
fsubrs 0x1000
fisubrs 0x1002
fsubrl 0x1004
fisubrl 0x1006
fdiv
fdiv %st(1), %st
fdiv %st(3)
fdivp %st(5)
fdivp
fdivp %st(1), %st
fdivs 0x1000
fidivs 0x1002
fdivl 0x1004
fidivl 0x1006
fcom %st(3)
fcoms 0x1000
ficoms 0x1002
fcoml 0x1004
ficoml 0x1006
fcomp %st(5)
fcomp
fcompp
fcomps 0x1000
ficomps 0x1002
fcompl 0x1004
ficompl 0x1006
fld %st(5)
fldl 0x1000
flds 0x1002
fildl 0x1004
fst %st(4)
fstp %st(6)
fstpt 0x1006
fbstp 0x1008
fxch
fxch %st(4)
fucom %st(6)
fucomp %st(3)
fucompp
finit
fninit
fldcw 0x1000
fnstcw 0x1002
fstcw 0x1002
fnstsw 0x1004
fnstsw (%eax)
fstsw 0x1004
fstsw (%eax)
fnclex
fclex
fnstenv 0x1000
fstenv 0x1000
fldenv 0x1000
fnsave 0x1002
fsave 0x1000
frstor 0x1000
ffree %st(7)
ffreep %st(6)
ftst
fxam
fld1
fldl2t
fldl2e
fldpi
fldlg2
fldln2
fldz
f2xm1
fyl2x
fptan
fpatan
fxtract
fprem1
fdecstp
fincstp
fprem
fyl2xp1
fsqrt
fsincos
frndint
fscale
fsin
fcos
fchs
fabs
fnop
fwait
bswap %edx
xadd %ecx, %edx
xaddb %dl, 0x1000
xaddw %ax, 0x1000
xaddl %eax, 0x1000
cmpxchg %ecx, %edx
cmpxchgb %dl, 0x1000
cmpxchgw %ax, 0x1000
cmpxchgl %eax, 0x1000
invlpg 0x1000
cmpxchg8b 0x1002
fcmovb %st(5), %st
fcmove %st(5), %st
fcmovbe %st(5), %st
fcmovu %st(5), %st
fcmovnb %st(5), %st
fcmovne %st(5), %st
fcmovnbe %st(5), %st
fcmovnu %st(5), %st
fcomi %st(5), %st
fucomi %st(5), %st
fcomip %st(5), %st
fucomip %st(5), %st
cmovo 0x1000, %eax
cmovs 0x1000, %eax
cmovns %edx, %edi
int $3
int $0x10
pusha
popa
clc
cld
cli
clts
cmc
lahf
sahf
pushfl
popfl
pushf
popf
stc
std
sti
aaa
aas
daa
das
aad
aam
cbw
cwd
cwde
cdq
cbtw
cwtd
cwtl
cltd
leave
int3
into
iret
rsm
hlt
wait
nop
/* XXX: handle prefixes */
#if 0
aword
addr16
#endif
lock
rep
repe
repz
repne
repnz
nop
lock ;negl (%eax)
wait ;pushf
rep ;stosb
repe ;lodsb
repz ;cmpsb
repne;movsb
repnz;outsb
/* handle one-line prefix + ops */
lock negl (%eax)
wait pushf
rep stosb
repe lodsb
repz cmpsb
repne movsb
repnz outsb
invd
wbinvd
cpuid
wrmsr
rdtsc
rdmsr
rdpmc
ud2
emms
movd %edx, %mm3
movd 0x1000, %mm2
movd %mm4, %ecx
movd %mm5, 0x1000
movq 0x1000, %mm2
movq %mm4, 0x1000
pand 0x1000, %mm3
pand %mm4, %mm5
psllw $1, %mm6
psllw 0x1000, %mm7
psllw %mm2, %mm7
xlat
cmpsb
scmpw
insl
outsw
lodsb
slodl
movsb
movsl
smovb
scasb
sscaw
stosw
sstol
bsf 0x1000, %ebx
bsr 0x1000, %ebx
bt %edx, 0x1000
btl $2, 0x1000
btc %edx, 0x1000
btcl $2, 0x1000
btr %edx, 0x1000
btrl $2, 0x1000
bts %edx, 0x1000
btsl $2, 0x1000
boundl %edx, 0x10000
boundw %bx, 0x1000
arpl %bx, 0x1000
lar 0x1000, %eax
lgdt 0x1000
lidt 0x1000
lldt 0x1000
lmsw 0x1000
lsl 0x1000, %ecx
ltr 0x1000
sgdt 0x1000
sidt 0x1000
sldt 0x1000
smsw 0x1000
str 0x1000
verr 0x1000
verw 0x1000
push %ds
pushw %ds
pushl %ds
pop %ds
popw %ds
popl %ds
fxsave 1(%ebx)
fxrstor 1(%ecx)
pushl $1
pushw $1
push $1
#ifdef __ASSEMBLER__ // should be defined, for S files
inc %eax
#endif
ft1: ft2: ft3: ft4: ft5: ft6: ft7: ft8: ft9:
xor %eax, %eax
ret
.type ft1,STT_FUNC
.type ft2,@STT_FUNC
.type ft3,%STT_FUNC
.type ft4,"STT_FUNC"
.type ft5,function
.type ft6,@function
.type ft7,%function
.type ft8,"function"
pause
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.apache.deltaspike.data.impl.criteria.selection.strings;
import org.apache.deltaspike.data.impl.criteria.selection.SingularAttributeSelection;
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Path;
import javax.persistence.criteria.Selection;
import javax.persistence.metamodel.SingularAttribute;
public class Trim<P> extends SingularAttributeSelection<P, String>
{
private final CriteriaBuilder.Trimspec trimspec;
public Trim(SingularAttribute<? super P, String> attribute)
{
super(attribute);
this.trimspec = CriteriaBuilder.Trimspec.BOTH;
}
public Trim(CriteriaBuilder.Trimspec trimspec, SingularAttribute<? super P, String> attribute)
{
super(attribute);
this.trimspec = trimspec;
}
@Override
public <R> Selection<String> toSelection(CriteriaQuery<R> query, CriteriaBuilder builder, Path<? extends P> path)
{
return builder.trim(this.trimspec, path.get(getAttribute()));
}
}
//===--- BasicGOTAndStubsBuilder.h - Generic GOT/Stub creation --*- C++ -*-===//
//
// Part of the LLVM Project, under the Apache License v2.0 with LLVM Exceptions.
// See https://llvm.org/LICENSE.txt for license information.
// SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
//
//===----------------------------------------------------------------------===//
//
// A base for simple GOT and stub creation.
//
//===----------------------------------------------------------------------===//
#ifndef LLVM_LIB_EXECUTIONENGINE_JITLINK_BASICGOTANDSTUBSBUILDER_H
#define LLVM_LIB_EXECUTIONENGINE_JITLINK_BASICGOTANDSTUBSBUILDER_H
#include "llvm/ExecutionEngine/JITLink/JITLink.h"
namespace llvm {
namespace jitlink {
template <typename BuilderImpl> class BasicGOTAndStubsBuilder {
public:
BasicGOTAndStubsBuilder(LinkGraph &G) : G(G) {}
void run() {
// We're going to be adding new blocks, but we don't want to iterate over
// the newly added ones, so just copy the existing blocks out.
std::vector<Block *> Blocks(G.blocks().begin(), G.blocks().end());
for (auto *B : Blocks)
for (auto &E : B->edges())
if (impl().isGOTEdge(E))
impl().fixGOTEdge(E, getGOTEntrySymbol(E.getTarget()));
else if (impl().isExternalBranchEdge(E))
impl().fixExternalBranchEdge(E, getStubSymbol(E.getTarget()));
}
protected:
Symbol &getGOTEntrySymbol(Symbol &Target) {
assert(Target.hasName() && "GOT edge cannot point to anonymous target");
auto GOTEntryI = GOTEntries.find(Target.getName());
// Build the entry if it doesn't exist.
if (GOTEntryI == GOTEntries.end()) {
auto &GOTEntry = impl().createGOTEntry(Target);
GOTEntryI =
GOTEntries.insert(std::make_pair(Target.getName(), &GOTEntry)).first;
}
assert(GOTEntryI != GOTEntries.end() && "Could not get GOT entry symbol");
return *GOTEntryI->second;
}
Symbol &getStubSymbol(Symbol &Target) {
assert(Target.hasName() &&
"External branch edge can not point to an anonymous target");
auto StubI = Stubs.find(Target.getName());
if (StubI == Stubs.end()) {
auto &StubSymbol = impl().createStub(Target);
StubI = Stubs.insert(std::make_pair(Target.getName(), &StubSymbol)).first;
}
    assert(StubI != Stubs.end() && "Could not get stub symbol");
return *StubI->second;
}
LinkGraph &G;
private:
BuilderImpl &impl() { return static_cast<BuilderImpl &>(*this); }
DenseMap<StringRef, Symbol *> GOTEntries;
DenseMap<StringRef, Symbol *> Stubs;
};
} // end namespace jitlink
} // end namespace llvm
#endif // LLVM_LIB_EXECUTIONENGINE_JITLINK_BASICGOTANDSTUBSBUILDER_H
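getGOTEntrySymbol and getStubSymbol above both follow a create-on-first-lookup pattern: check the map, build the entry only if it is missing, then return the cached pointer. A Python sketch of that pattern (illustrative names, standing in for the `DenseMap<StringRef, Symbol *>` caches):

```python
# Create-on-first-lookup memoization, the same shape as
# getGOTEntrySymbol/getStubSymbol above.
entries = {}
creations = []

def get_or_create(name):
    if name not in entries:
        creations.append(name)   # the entry is built exactly once
        entries[name] = object()
    return entries[name]

a = get_or_create("foo")
b = get_or_create("foo")
assert a is b                    # repeated lookups reuse the entry
assert creations == ["foo"]
```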
from ._version import version_info, __version__ # noqa: F401 imported but unused
def get_include(user=False):
import os
d = os.path.dirname(__file__)
if os.path.exists(os.path.join(d, "include")):
# Package is installed
return os.path.join(d, "include")
else:
# Package is from a source directory
return os.path.join(os.path.dirname(d), "include")
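get_include() above distinguishes an installed package (which carries its own include/ directory) from a source checkout (where include/ sits beside the package). The branch logic can be exercised in isolation like this (temporary directories stand in for real layouts; names are made up for the demonstration):

```python
import os
import tempfile

# Mirror of get_include() above: prefer an "include" directory inside
# the package (installed layout), otherwise fall back to a sibling
# "include" next to the package (source-tree layout).
def resolve_include(package_dir):
    candidate = os.path.join(package_dir, "include")
    if os.path.exists(candidate):
        return candidate
    return os.path.join(os.path.dirname(package_dir), "include")

root = tempfile.mkdtemp()
pkg = os.path.join(root, "pkg")
os.makedirs(os.path.join(pkg, "include"))          # installed layout
assert resolve_include(pkg) == os.path.join(pkg, "include")

src_pkg = os.path.join(root, "srctree", "pkg")
os.makedirs(src_pkg)                               # no in-package include
assert resolve_include(src_pkg) == os.path.join(root, "srctree", "include")
```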
{
"kind": "ImageStream",
"apiVersion": "v1",
"metadata": {
"name": "jboss-datagrid73-openshift",
"creationTimestamp": null,
"annotations": {
"openshift.io/display-name": "Red Hat JBoss Data Grid 7.3",
"openshift.io/provider-display-name": "Red Hat, Inc.",
"version": "1.4"
}
},
"spec": {
"lookupPolicy": {
"local": false
},
"tags": [
{
"name": "1.0",
"annotations": {
"description": "Red Hat JBoss Data Grid 7.3 image",
"iconClass": "icon-datagrid",
"openshift.io/display-name": "Red Hat JBoss Data Grid 7.3",
"supports": "datagrid:7.3",
"tags": "datagrid,jboss,hidden",
"version": "1.0"
},
"from": {
"kind": "DockerImage",
"name": "registry.redhat.io/jboss-datagrid-7/datagrid73-openshift:1.0"
},
"generation": null,
"importPolicy": {},
"referencePolicy": {
"type": "Local"
}
},
{
"name": "1.1",
"annotations": {
"description": "Red Hat JBoss Data Grid 7.3 image",
"iconClass": "icon-datagrid",
"openshift.io/display-name": "Red Hat JBoss Data Grid 7.3",
"supports": "datagrid:7.3",
"tags": "datagrid,jboss,hidden",
"version": "1.1"
},
"from": {
"kind": "DockerImage",
"name": "registry.redhat.io/jboss-datagrid-7/datagrid73-openshift:1.1"
},
"generation": null,
"importPolicy": {},
"referencePolicy": {
"type": "Local"
}
},
{
"name": "1.2",
"annotations": {
"description": "Red Hat JBoss Data Grid 7.3 image",
"iconClass": "icon-datagrid",
"openshift.io/display-name": "Red Hat JBoss Data Grid 7.3",
"supports": "datagrid:7.3",
"tags": "datagrid,jboss,hidden",
"version": "1.2"
},
"from": {
"kind": "DockerImage",
"name": "registry.redhat.io/jboss-datagrid-7/datagrid73-openshift:1.2"
},
"generation": null,
"importPolicy": {},
"referencePolicy": {
"type": "Local"
}
},
{
"name": "1.3",
"annotations": {
"description": "Red Hat JBoss Data Grid 7.3 image",
"iconClass": "icon-datagrid",
"openshift.io/display-name": "Red Hat JBoss Data Grid 7.3",
"supports": "datagrid:7.3",
"tags": "datagrid,jboss,hidden",
"version": "1.3"
},
"from": {
"kind": "DockerImage",
"name": "registry.redhat.io/jboss-datagrid-7/datagrid73-openshift:1.3"
},
"generation": null,
"importPolicy": {},
"referencePolicy": {
"type": "Local"
}
},
{
"name": "1.4",
"annotations": {
"description": "Red Hat JBoss Data Grid 7.3 image",
"iconClass": "icon-datagrid",
"openshift.io/display-name": "Red Hat JBoss Data Grid 7.3",
"supports": "datagrid:7.3",
"tags": "datagrid,jboss,hidden",
"version": "1.4"
},
"from": {
"kind": "DockerImage",
"name": "registry.redhat.io/jboss-datagrid-7/datagrid73-openshift:1.4"
},
"generation": null,
"importPolicy": {},
"referencePolicy": {
"type": "Local"
}
}
]
},
"status": {
"dockerImageRepository": ""
}
}
.\" Copyright (c) 2019 The DragonFly Project
.\" All rights reserved.
.\"
.\" Redistribution and use in source and binary forms, with or without
.\" modification, are permitted provided that the following conditions
.\" are met:
.\" 1. Redistributions of source code must retain the above copyright
.\" notice, this list of conditions and the following disclaimer.
.\" 2. Redistributions in binary form must reproduce the above copyright
.\" notice, this list of conditions and the following disclaimer in the
.\" documentation and/or other materials provided with the distribution.
.\"
.\" THIS SOFTWARE IS PROVIDED BY THE AUTHORS AND CONTRIBUTORS ``AS IS'' AND
.\" ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
.\" IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
.\" ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHORS OR CONTRIBUTORS BE LIABLE
.\" FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
.\" DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
.\" OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
.\" HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
.\" LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
.\" OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
.\" SUCH DAMAGE.
.\"
.Dd March 25, 2019
.Dt MOUNT_FUSEFS 8
.Os
.Sh NAME
.Nm mount_fusefs
.Nd mount a FUSE file system
.Sh SYNOPSIS
.Nm
.Op Fl o Ar options
.Op Fl h
.Ar special
.Ar node
.Sh DESCRIPTION
The
.Nm
utility mounts a
.Nm FUSE
file system backed by
.Ar special
file at mount point
.Ar node .
.Sh SEE ALSO
.Xr mount 8
.Sh HISTORY
The
.Nm
utility first appeared in
.Dx 5.5 .
.Sh AUTHORS
.An Tomohiro Kusumi Aq Mt [email protected]
import { Observable } from '../Observable';
/**
* We need this JSDoc comment for affecting ESDoc.
* @extends {Ignored}
* @hide true
*/
export class RangeObservable extends Observable {
constructor(start, count, scheduler) {
super();
this.start = start;
this._count = count;
this.scheduler = scheduler;
}
/**
* Creates an Observable that emits a sequence of numbers within a specified
* range.
*
* <span class="informal">Emits a sequence of numbers in a range.</span>
*
* <img src="./img/range.png" width="100%">
*
* `range` operator emits a range of sequential integers, in order, where you
* select the `start` of the range and its `length`. By default, uses no
* IScheduler and just delivers the notifications synchronously, but may use
* an optional IScheduler to regulate those deliveries.
*
* @example <caption>Emits the numbers 1 to 10</caption>
* var numbers = Rx.Observable.range(1, 10);
* numbers.subscribe(x => console.log(x));
*
* @see {@link timer}
* @see {@link interval}
*
* @param {number} [start=0] The value of the first integer in the sequence.
* @param {number} [count=0] The number of sequential integers to generate.
* @param {Scheduler} [scheduler] A {@link IScheduler} to use for scheduling
* the emissions of the notifications.
* @return {Observable} An Observable of numbers that emits a finite range of
* sequential integers.
* @static true
* @name range
* @owner Observable
*/
static create(start = 0, count = 0, scheduler) {
return new RangeObservable(start, count, scheduler);
}
static dispatch(state) {
const { start, index, count, subscriber } = state;
if (index >= count) {
subscriber.complete();
return;
}
subscriber.next(start);
if (subscriber.closed) {
return;
}
state.index = index + 1;
state.start = start + 1;
this.schedule(state);
}
/** @deprecated internal use only */ _subscribe(subscriber) {
let index = 0;
let start = this.start;
const count = this._count;
const scheduler = this.scheduler;
if (scheduler) {
return scheduler.schedule(RangeObservable.dispatch, 0, {
index, count, start, subscriber
});
}
else {
do {
if (index++ >= count) {
subscriber.complete();
break;
}
subscriber.next(start++);
if (subscriber.closed) {
break;
}
} while (true);
}
}
}
//# sourceMappingURL=RangeObservable.js.map
<!DOCTYPE html>
<html>
<head>
<script src="../libraries/RGraph.common.core.js" ></script>
<script src="../libraries/RGraph.common.dynamic.js" ></script>
<script src="../libraries/RGraph.bar.js" ></script>
<title>A bar chart with horizontal gradients</title>
<link rel="stylesheet" href="demos.css" type="text/css" media="screen" />
<meta name="robots" content="noindex,nofollow" />
    <meta name="description" content="An example of a rotating bar chart made by using CSS3 3D transformations" />
</head>
<body>
<h1>A rotating video/Bar chart using CSS3 3D transformations</h1>
<p>
Use Google Chrome to see the video (a WebM video). <a href="http://www.rgraph.net/javascript-charts" rel="nofollow">There's also an
example of 3D transforms on this page</a>.
</p>
<script>
/**
* If the browser is Chrome the element that spins is a WebM video. If not then it's an RGraph Bar chart.
*/
if (RGraph.ISCHROME) {
document.write('<video id="myElement" src="../video/video.webm" controls autoplay loop ></video>');
} else {
document.write('<canvas id="myElement" width="600" height="250">[No canvas support]</canvas>');
var bar = new RGraph.Bar({
id: 'myElement',
data: [4,8,6,8,7],
options: {
labels: ['John','Fred','George','Paul','Ringo']
}
}).draw();
}
/**
* Initially the x/y/z angles are all zero
*/
x = 0;
y = 0;
z = 0;
/**
* This is the spin function that gets called repeatedly and sets the appropriate CSS3 values.
* It calls itself again at the end after a small delay.
*/
mySpinFunc = function ()
{
/**
* Set the appropriate CSS3 properties for WebKit browsers
*/
document.getElementById("myElement").style.WebkitTransform = 'rotate3d(1,0,0, ' + x + 'deg) rotate3d(0,1,0, ' + y + 'deg) rotate3d(0,0,1, ' + z + 'deg)';
/**
* Set the unprefixed CSS3 properties (for Firefox, MSIE 10 etc)
*/
document.getElementById("myElement").style.transform = 'rotate3d(1,0,0, ' + x + 'deg) rotate3d(0,1,0, ' + y + 'deg) rotate3d(0,0,1, ' + z + 'deg)';
/**
* Increment the X/Y/Z angles
*/
x += 3;
y += 3;
z += 3;
/**
* Call ourselves again after a small delay
*/
setTimeout(mySpinFunc, 50);
}
mySpinFunc();
</script>
<p></p>
    This goes in the document's header:
<pre class="code">
<script src="RGraph.common.core.js"></script>
<script src="RGraph.common.dynamic.js"></script>
<script src="RGraph.bar.js"></script>
</pre>
Put this where you want the chart to show up:
<pre class="code">
<canvas id="cvs" width="600" height="250">
[No canvas support]
</canvas>
</pre>
This is the code that generates the chart:
<pre class="code">
<script>
<span>/**
* If the browser is Chrome the element that spins is a WebM video. If not then it's an RGraph Bar chart.
*/</span>
if (RGraph.ISCHROME) {
document.write('<video id="myElement" src="../video/video.webm" controls autoplay loop ></video>');
} else {
document.write('<canvas id="myElement" width="600" height="250">[No canvas support]</canvas>');
var bar = new RGraph.Bar({
id: 'myElement',
data: [4,8,6,8,7],
options: {
labels: ['John','Fred','George','Paul','Ringo']
}
}).draw();
}
<span>/**
* Initially the x/y/z angles are all zero
*/</span>
x = 0;
y = 0;
z = 0;
<span>/**
* This is the spin function that gets called repeatedly and sets the appropriate CSS3 values.
* It calls itself again at the end after a small delay.
*/</span>
mySpinFunc = function ()
{
<span>/**
* Set the appropriate CSS3 properties for WebKit browsers
*/</span>
document.getElementById("myElement").style.WebkitTransform = 'rotate3d(1,0,0, ' + x + 'deg) rotate3d(0,1,0, ' + y + 'deg) rotate3d(0,0,1, ' + z + 'deg)';
<span>/**
* Set the unprefixed CSS3 properties (for Firefox, MSIE 10 etc)
*/</span>
document.getElementById("myElement").style.transform = 'rotate3d(1,0,0, ' + x + 'deg) rotate3d(0,1,0, ' + y + 'deg) rotate3d(0,0,1, ' + z + 'deg)';
<span>/**
* Increment the X/Y/Z angles
*/</span>
x += 3;
y += 3;
z += 3;
<span>/**
* Call ourselves again after a small delay
*/</span>
setTimeout(mySpinFunc, 50);
}
mySpinFunc();
</script>
</pre>
<p>
<a href="https://www.facebook.com/sharer/sharer.php?u=http://www.rgraph.net" target="_blank" onclick="window.open('https://www.facebook.com/sharer/sharer.php?u=http://www.rgraph.net', null, 'top=50,left=50,width=600,height=368'); return false"><img src="../images/facebook-large.png" width="200" height="43" alt="Share on Facebook" border="0" title="Visit the RGraph Facebook page" /></a>
<a href="https://twitter.com/_rgraph" target="_blank" onclick="window.open('https://twitter.com/_rgraph', null, 'top=50,left=50,width=700,height=400'); return false"><img src="../images/twitter-large.png" width="200" height="43" alt="Share on Twitter" border="0" title="Mention RGraph on Twitter" /></a>
</p>
<p>
<a href="./">« Back</a>
</p>
</body>
</html>
# coding: utf8
{
'"update" is an optional expression like "field1=\'newvalue\'". You cannot update or delete the results of a JOIN': '"update" é uma expressão opcional como "campo1=\'novovalor\'". Você não pode atualizar ou apagar os resultados de um JOIN',
'%Y-%m-%d': '%d-%m-%Y ',
'%Y-%m-%d %H:%M:%S': '%d-%m-%Y %H:%M:%S',
'%s rows deleted': '%s linhas apagadas',
'%s rows updated': '%s linhas atualizadas',
'About': 'About',
'Access Control': 'Access Control',
'Ajax Recipes': 'Ajax Recipes',
'Available databases and tables': 'Bancos de dados e tabelas disponíveis',
'Buy this book': 'Buy this book',
'Cannot be empty': 'Não pode ser vazio',
'Check to delete': 'Marque para apagar',
'Client IP': 'Client IP',
'Community': 'Community',
'Controller': 'Controlador',
'Copyright': 'Copyright',
'Current request': 'Requisição atual',
'Current response': 'Resposta atual',
'Current session': 'Sessão atual',
'DB Model': 'Modelo BD',
'Database': 'Banco de dados',
'Delete:': 'Apagar:',
'Demo': 'Demo',
'Deployment Recipes': 'Deployment Recipes',
'Description': 'Description',
'Documentation': 'Documentation',
'Download': 'Download',
'E-mail': 'E-mail',
'Edit': 'Editar',
'Edit This App': 'Edit This App',
'Edit current record': 'Editar o registro atual',
'Errors': 'Errors',
'FAQ': 'FAQ',
'First name': 'First name',
'Forms and Validators': 'Forms and Validators',
'Free Applications': 'Free Applications',
'Group ID': 'Group ID',
'Groups': 'Groups',
'Hello World': 'Olá Mundo',
'Home': 'Home',
'Import/Export': 'Importar/Exportar',
'Index': 'Início',
'Internal State': 'Estado Interno',
'Introduction': 'Introduction',
'Invalid Query': 'Consulta Inválida',
'Invalid email': 'Invalid email',
'Last name': 'Last name',
'Layout': 'Layout',
'Layouts': 'Layouts',
'Live chat': 'Live chat',
'Login': 'Autentique-se',
'Lost Password': 'Esqueceu sua senha?',
'Main Menu': 'Menu Principal',
'Menu Model': 'Modelo de Menu',
'Name': 'Name',
'New Record': 'Novo Registro',
'No databases in this application': 'Sem bancos de dados nesta aplicação',
'Origin': 'Origin',
'Other Recipes': 'Other Recipes',
'Overview': 'Overview',
'Password': 'Password',
'Plugins': 'Plugins',
'Powered by': 'Powered by',
'Preface': 'Preface',
'Python': 'Python',
'Query:': 'Consulta:',
'Quick Examples': 'Quick Examples',
'Recipes': 'Recipes',
'Record ID': 'Record ID',
'Register': 'Registre-se',
'Registration key': 'Registration key',
'Reset Password key': 'Reset Password key',
'Resources': 'Resources',
'Role': 'Role',
'Rows in table': 'Linhas na tabela',
'Rows selected': 'Linhas selecionadas',
'Semantic': 'Semantic',
'Services': 'Services',
'Stylesheet': 'Stylesheet',
'Support': 'Support',
'Sure you want to delete this object?': 'Está certo(a) que deseja apagar esse objeto?',
'Table name': 'Table name',
'The "query" is a condition like "db.table1.field1==\'value\'". Something like "db.table1.field1==db.table2.field2" results in a SQL JOIN.': 'Uma "consulta" é uma condição como "db.tabela1.campo1==\'valor\'". Expressões como "db.tabela1.campo1==db.tabela2.campo2" resultam em um JOIN SQL.',
'The Core': 'The Core',
'The Views': 'The Views',
'The output of the file is a dictionary that was rendered by the view': 'The output of the file is a dictionary that was rendered by the view',
'This App': 'This App',
'This is a copy of the scaffolding application': 'This is a copy of the scaffolding application',
'Timestamp': 'Timestamp',
'Twitter': 'Twitter',
'Update:': 'Atualizar:',
'Use (...)&(...) for AND, (...)|(...) for OR, and ~(...) for NOT to build more complex queries.': 'Use (...)&(...) para AND, (...)|(...) para OR, e ~(...) para NOT para construir consultas mais complexas.',
'User ID': 'User ID',
'User Voice': 'User Voice',
'Videos': 'Videos',
'View': 'Visualização',
'Web2py': 'Web2py',
'Welcome': 'Welcome',
'Welcome %s': 'Bem vindo %s',
'Welcome to web2py': 'Bem vindo ao web2py',
'Which called the function': 'Which called the function',
'You are successfully running web2py': 'You are successfully running web2py',
'You are successfully running web2py.': 'You are successfully running web2py.',
'You can modify this application and adapt it to your needs': 'You can modify this application and adapt it to your needs',
'You visited the url': 'You visited the url',
'appadmin is disabled because insecure channel': 'Administração desativada devido ao canal inseguro',
'cache': 'cache',
'change password': 'modificar senha',
'click here for online examples': 'clique aqui para ver alguns exemplos',
'click here for the administrative interface': 'clique aqui para acessar a interface administrativa',
'customize me!': 'Personalize-me!',
'data uploaded': 'dados enviados',
'database': 'banco de dados',
'database %s select': 'Selecionar banco de dados %s',
'db': 'bd',
'design': 'design',
'documentation': 'documentation',
'done!': 'concluído!',
'edit profile': 'editar perfil',
'export as csv file': 'exportar como um arquivo csv',
'insert new': 'inserir novo',
'insert new %s': 'inserir novo %s',
'invalid request': 'requisição inválida',
'located in the file': 'located in the file',
'login': 'Entrar',
'logout': 'Sair',
'lost password?': 'lost password?',
'new record inserted': 'novo registro inserido',
'next 100 rows': 'próximas 100 linhas',
'or import from csv file': 'ou importar de um arquivo csv',
'previous 100 rows': '100 linhas anteriores',
'record': 'registro',
'record does not exist': 'registro não existe',
'record id': 'id do registro',
'register': 'Registre-se',
'selected': 'selecionado',
'state': 'estado',
'table': 'tabela',
'unable to parse csv file': 'não foi possível analisar arquivo csv',
}
| {
"pile_set_name": "Github"
} |
#include "stdafx.h"
#include "DX11Utils.h"
#include <string>
#include <sstream>
#include <iostream>
#include <fstream>
#include <shlwapi.h>
#include "SDKmisc.h"
void AddDefinitionToMacro( D3D_SHADER_MACRO source[9], D3D_SHADER_MACRO dest[10], D3D_SHADER_MACRO &addMacro )
{
for (UINT j = 0;; j++) {
assert(j < 9);
if (source[j].Name == 0) {
dest[j] = addMacro;
dest[j+1] = source[j];
break;
}
dest[j] = source[j];
}
}
double GetTime()
{
unsigned __int64 pf;
QueryPerformanceFrequency( (LARGE_INTEGER *)&pf );
double freq_ = 1.0 / (double)pf;
unsigned __int64 val;
QueryPerformanceCounter( (LARGE_INTEGER *)&val );
return (val) * freq_;
}
bool OpenFileDialog(OPENFILENAME& ofn)
{
ofn.lStructSize = sizeof(ofn);
ofn.hwndOwner = DXUTGetHWND();
ofn.hInstance = GetModuleHandle(NULL);
ofn.Flags |= OFN_FILEMUSTEXIST | OFN_PATHMUSTEXIST | OFN_NOCHANGEDIR;
static WCHAR lastDir[MAX_PATH] = L"";
ofn.lpstrInitialDir = lastDir;
if(lastDir[0] == L'\0') {
WCHAR baseDir[MAX_PATH] = L"";
GetCurrentDirectory(MAX_PATH, baseDir);
PathCombine(lastDir, baseDir, L"Media");
}
BOOL isWindowed = DXUTIsWindowed();
WINDOWPLACEMENT oldPlacement;
if(isWindowed==FALSE)
{
DXUTToggleFullScreen();
ZeroMemory(&oldPlacement, sizeof(oldPlacement));
oldPlacement.length = sizeof(oldPlacement);
GetWindowPlacement(DXUTGetHWND(), &oldPlacement);
WINDOWPLACEMENT tmpPlacement = oldPlacement;
tmpPlacement.showCmd = SW_SHOWMAXIMIZED;
SetWindowPlacement(DXUTGetHWND(), &tmpPlacement);
}
BOOL result = GetOpenFileName(&ofn);
if(isWindowed==FALSE)
{
SetWindowPlacement(DXUTGetHWND(), &oldPlacement);
DXUTToggleFullScreen();
}
if(result != FALSE)
{
for (UINT i = 0; i < MAX_PATH; i++) {
lastDir[i] = ofn.lpstrFile[i];
if (ofn.lpstrFile[i] == '\0') break;
}
PathRemoveFileSpec(lastDir);
}
return result != FALSE;
}
static const bool s_bUsePreCompiledShaders = true;
static bool b_CompiledShaderDirectoryCreated = false;
HRESULT CompileShaderFromFile( WCHAR* szFileName, LPCSTR szEntryPoint, LPCSTR szShaderModel, ID3DBlob** ppBlobOut, const D3D_SHADER_MACRO* pDefines, DWORD pCompilerFlags)
{
HRESULT hr = S_OK;
std::wstring compiledShaderDirectory(L"CompiledShaders/");
if (!b_CompiledShaderDirectoryCreated) {
CreateDirectory(compiledShaderDirectory.c_str(), NULL);
b_CompiledShaderDirectoryCreated = true;
}
std::wstring compiledFilename(compiledShaderDirectory);
std::wstring fileName(szFileName);
unsigned int found = (unsigned int)fileName.find_last_of(L"/\\");
fileName = fileName.substr(found+1);
compiledFilename.append(fileName);
compiledFilename.push_back('.');
std::string entryPoint(szEntryPoint);
unsigned int oldLen = (unsigned int)compiledFilename.length();
compiledFilename.resize(entryPoint.length() + oldLen);
std::copy(entryPoint.begin(), entryPoint.end(), compiledFilename.begin()+oldLen);
if (pDefines) {
compiledFilename.push_back('.');
for (unsigned int i = 0; pDefines[i].Name != NULL; i++) {
std::string name(pDefines[i].Name);
if (name[0] == '\"') name[0] = 'x';
if (name[name.length()-1] == '\"') name[name.length()-1] = 'x';
std::string def(pDefines[i].Definition);
if (def[0] == '\"') def[0] = 'x';
if (def[def.length()-1] == '\"') def[def.length()-1] = 'x';
oldLen = (unsigned int)compiledFilename.length();
compiledFilename.resize(oldLen + name.length() + def.length());
std::copy(name.begin(), name.end(), compiledFilename.begin()+oldLen);
std::copy(def.begin(), def.end(), compiledFilename.begin()+name.length()+oldLen);
}
}
if (pCompilerFlags) {
compiledFilename.push_back('.');
std::wstringstream ss;
ss << pCompilerFlags;
std::wstring cf;
ss >> cf;
compiledFilename.append(cf);
}
compiledFilename.push_back('.');
compiledFilename.push_back('p');
HANDLE hFindShader;
HANDLE hFindCompiled;
WIN32_FIND_DATA findData_shader;
WIN32_FIND_DATA findData_compiled;
hFindShader = FindFirstFile(szFileName, &findData_shader);
hFindCompiled = FindFirstFile(compiledFilename.c_str(), &findData_compiled);
if (!s_bUsePreCompiledShaders || hFindCompiled == INVALID_HANDLE_VALUE || CompareFileTime(&findData_shader.ftLastWriteTime, &findData_compiled.ftLastWriteTime) > 0) {
// find the file
WCHAR str[MAX_PATH];
V_RETURN( DXUTFindDXSDKMediaFileCch( str, MAX_PATH, szFileName ) );
DWORD dwShaderFlags = 0;
dwShaderFlags |= D3DCOMPILE_ENABLE_STRICTNESS;
//dwShaderFlags |= D3DCOMPILE_OPTIMIZATION_LEVEL3;
//dwShaderFlags |= D3DCOMPILE_PARTIAL_PRECISION;
#if defined( DEBUG ) || defined( _DEBUG )
// Set the D3DCOMPILE_DEBUG flag to embed debug information in the shaders.
// Setting this flag improves the shader debugging experience, but still allows
// the shaders to be optimized and to run exactly the way they will run in
// the release configuration of this program.
dwShaderFlags |= D3DCOMPILE_DEBUG;
//dwShaderFlags |= D3DXSHADER_SKIPOPTIMIZATION;
//dwShaderFlags |= D3DCOMPILE_WARNINGS_ARE_ERRORS;
//dwShaderFlags |= D3DCOMPILE_SKIP_OPTIMIZATION;
//dwShaderFlags |= D3DCOMPILE_IEEE_STRICTNESS;
#endif
dwShaderFlags |= pCompilerFlags;
ID3DBlob* pErrorBlob;
hr = D3DX11CompileFromFile( str, pDefines, NULL, szEntryPoint, szShaderModel,
dwShaderFlags, 0, NULL, ppBlobOut, &pErrorBlob, NULL );
if( FAILED(hr) )
{
if( pErrorBlob != NULL )
OutputDebugStringA( (char*)pErrorBlob->GetBufferPointer() );
SAFE_RELEASE( pErrorBlob );
return hr;
}
SAFE_RELEASE( pErrorBlob );
std::ofstream compiledFile(compiledFilename.c_str(), std::ios::out | std::ios::binary);
compiledFile.write((char*)(*ppBlobOut)->GetBufferPointer(), (*ppBlobOut)->GetBufferSize());
compiledFile.close();
} else {
std::ifstream compiledFile(compiledFilename.c_str(), std::ios::in | std::ios::binary);
assert(compiledFile.is_open());
unsigned int size_data = findData_compiled.nFileSizeLow;
V_RETURN(D3DCreateBlob(size_data,ppBlobOut));
compiledFile.read((char*)(*ppBlobOut)->GetBufferPointer(), size_data);
compiledFile.close();
}
return hr;
}
void* CreateAndCopyToDebugBuf( ID3D11Device* pDevice, ID3D11DeviceContext* pd3dImmediateContext, ID3D11Buffer* pBuffer, bool returnCPUMemory )
{
ID3D11Buffer* debugbuf = NULL;
D3D11_BUFFER_DESC desc;
ZeroMemory( &desc, sizeof(desc) );
pBuffer->GetDesc( &desc );
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.Usage = D3D11_USAGE_STAGING;
desc.BindFlags = 0;
desc.MiscFlags = 0;
pDevice->CreateBuffer(&desc, NULL, &debugbuf);
pd3dImmediateContext->CopyResource( debugbuf, pBuffer );
INT *cpuMemory = new INT[desc.ByteWidth/sizeof(UINT)];
D3D11_MAPPED_SUBRESOURCE mappedResource;
pd3dImmediateContext->Map(debugbuf, D3D11CalcSubresource(0,0,0), D3D11_MAP_READ, 0, &mappedResource);
memcpy((void*)cpuMemory, (void*)mappedResource.pData, desc.ByteWidth);
pd3dImmediateContext->Unmap( debugbuf, 0 );
for(unsigned int i = 0; i<desc.ByteWidth/sizeof(UINT); i++)
{
if(cpuMemory[i] != 0)
{
//std::cout << "test" << std::endl;
}
}
SAFE_RELEASE(debugbuf);
if (!returnCPUMemory) {
SAFE_DELETE_ARRAY(cpuMemory);
return NULL;
} else {
return (void*)cpuMemory;
}
}
void* CreateAndCopyToDebugTexture2D( ID3D11Device* pDevice, ID3D11DeviceContext* pd3dImmediateContext, ID3D11Texture2D* pBufferTex, bool returnCPUMemory )
{
ID3D11Texture2D* debugtex = NULL;
D3D11_TEXTURE2D_DESC desc;
ZeroMemory( &desc, sizeof(desc) );
pBufferTex->GetDesc( &desc );
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.Usage = D3D11_USAGE_STAGING;
desc.BindFlags = 0;
desc.MiscFlags = 0;
pDevice->CreateTexture2D(&desc, NULL, &debugtex);
pd3dImmediateContext->CopyResource( debugtex, pBufferTex );
FLOAT *cpuMemory = new FLOAT[desc.Height * desc.Width];
D3D11_MAPPED_SUBRESOURCE mappedResource;
pd3dImmediateContext->Map(debugtex, D3D11CalcSubresource(0,0,0), D3D11_MAP_READ, 0, &mappedResource);
memcpy((void*)cpuMemory, (void*)mappedResource.pData, desc.Height * desc.Width * sizeof(FLOAT));
pd3dImmediateContext->Unmap( debugtex, 0 );
SAFE_RELEASE(debugtex);
if (!returnCPUMemory) {
SAFE_DELETE_ARRAY(cpuMemory);
return NULL;
} else {
return (void*)cpuMemory;
}
}
void* GetResourceData(ID3D11Resource* pResource, ResourceDesc& rdesc )
{
ID3D11Device *pDevice = DXUTGetD3D11Device();
ID3D11DeviceContext *pd3dImmediateContext = DXUTGetD3D11DeviceContext();
D3D11_RESOURCE_DIMENSION rType;
pResource->GetType(&rType);
assert(rType == D3D11_RESOURCE_DIMENSION_TEXTURE2D); //don't support any other types yet - tbd
ID3D11Texture2D* debugtex = NULL;
if (rType == D3D11_RESOURCE_DIMENSION_TEXTURE2D) {
ID3D11Texture2D* texture = (ID3D11Texture2D*)pResource;
D3D11_TEXTURE2D_DESC desc;
ZeroMemory( &desc, sizeof(desc) );
texture->GetDesc( &desc );
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.Usage = D3D11_USAGE_STAGING;
desc.BindFlags = 0;
desc.MiscFlags = 0;
pDevice->CreateTexture2D(&desc, NULL, &debugtex);
rdesc.m_Dimension = 2;
rdesc.m_Size.x = desc.Width;
rdesc.m_Size.y = desc.Height;
rdesc.m_Size.z = 1;
rdesc.m_OriginalFormat = desc.Format;
if (desc.Format == DXGI_FORMAT_R32_FLOAT ||
desc.Format == DXGI_FORMAT_R32_SINT ||
desc.Format == DXGI_FORMAT_R32_UINT) {
rdesc.m_ElementsPerEntry = 1;
rdesc.m_ElementSizeInBytes = 4;
} else if (
desc.Format == DXGI_FORMAT_R8G8B8A8_UINT ||
desc.Format == DXGI_FORMAT_R8G8B8A8_SINT ||
desc.Format == DXGI_FORMAT_R8G8B8A8_SNORM ||
desc.Format == DXGI_FORMAT_R8G8B8A8_UNORM) {
rdesc.m_ElementsPerEntry = 4;
rdesc.m_ElementSizeInBytes = 1;
} else {
assert(false);
}
}
pd3dImmediateContext->CopyResource( debugtex, pResource );
void *cpuMemory = new CHAR[rdesc.m_Size.x * rdesc.m_Size.y * rdesc.m_Size.z * rdesc.m_ElementsPerEntry * rdesc.m_ElementSizeInBytes];
D3D11_MAPPED_SUBRESOURCE mappedResource;
pd3dImmediateContext->Map(debugtex, D3D11CalcSubresource(0,0,0), D3D11_MAP_READ, 0, &mappedResource);
memcpy((void*)cpuMemory, (void*)mappedResource.pData, rdesc.m_Size.x * rdesc.m_Size.y * rdesc.m_Size.z * rdesc.m_ElementsPerEntry * rdesc.m_ElementSizeInBytes);
pd3dImmediateContext->Unmap( debugtex, 0 );
SAFE_RELEASE(debugtex);
//SAFE_DELETE_ARRAY(cpuMemory);
return cpuMemory;
}
void SetRSCulling (ID3D11RasterizerState** rs, D3D11_CULL_MODE cm) {
ID3D11Device* pd3dDevice = DXUTGetD3D11Device();
D3D11_RASTERIZER_DESC RasterDesc;
ZeroMemory( &RasterDesc, sizeof(D3D11_RASTERIZER_DESC) );
(*rs)->GetDesc(&RasterDesc);
RasterDesc.CullMode = cm;
SAFE_RELEASE(*rs);
pd3dDevice->CreateRasterizerState(&RasterDesc, rs);
}
void SetRSDrawing(ID3D11RasterizerState** rs, D3D11_FILL_MODE fm) {
ID3D11Device* pd3dDevice = DXUTGetD3D11Device();
D3D11_RASTERIZER_DESC RasterDesc;
ZeroMemory( &RasterDesc, sizeof(D3D11_RASTERIZER_DESC) );
(*rs)->GetDesc(&RasterDesc);
RasterDesc.FillMode = fm;
SAFE_RELEASE(*rs);
pd3dDevice->CreateRasterizerState(&RasterDesc, rs);
}
/*
void SaveCamera( CModelViewerCamera &camera, const char* filename )
{
std::ofstream file;
file.open(filename);
if (file.is_open()) {
file.precision(15);
file << camera.GetEyePt()->x << " " << camera.GetEyePt()->y << " " << camera.GetEyePt()->z << std::endl;
file << camera.GetLookAtPt()->x << " " << camera.GetLookAtPt()->y << " " << camera.GetLookAtPt()->z << std::endl;
file << camera.GetViewQuat().x << " " << camera.GetViewQuat().y << " " << camera.GetViewQuat().z << " " << camera.GetViewQuat().w << std::endl;
file << camera.GetWorldQuat().x << " " << camera.GetWorldQuat().y << " " << camera.GetWorldQuat().z << " " << camera.GetWorldQuat().w << std::endl;
}
file.close();
return;
}
void LoadCamera( CModelViewerCamera &camera, const char* filename )
{
std::ifstream file;
file.open(filename);
if (file.is_open() && file.good()) {
D3DXVECTOR3 eye, lookAt;
D3DXQUATERNION world, view;
file >> eye.x >> eye.y >> eye.z;
file >> lookAt.x >> lookAt.y >> lookAt.z;
file >> view.x >> view.y >> view.z >> view.w;
file >> world.x >> world.y >> world.z >> world.w;
camera.SetViewParams(&eye, &lookAt);
camera.SetViewQuat(view);
camera.SetWorldQuat(world);
}
file.close();
return;
}
*/
from yowsup.layers import YowProtocolLayer
from .protocolentities import ImageDownloadableMediaMessageProtocolEntity
from .protocolentities import AudioDownloadableMediaMessageProtocolEntity
from .protocolentities import VideoDownloadableMediaMessageProtocolEntity
from .protocolentities import DocumentDownloadableMediaMessageProtocolEntity
from .protocolentities import StickerDownloadableMediaMessageProtocolEntity
from .protocolentities import LocationMediaMessageProtocolEntity
from .protocolentities import ContactMediaMessageProtocolEntity
from .protocolentities import ResultRequestUploadIqProtocolEntity
from .protocolentities import MediaMessageProtocolEntity
from .protocolentities import ExtendedTextMediaMessageProtocolEntity
from yowsup.layers.protocol_iq.protocolentities import IqProtocolEntity, ErrorIqProtocolEntity
import logging
logger = logging.getLogger(__name__)
class YowMediaProtocolLayer(YowProtocolLayer):
def __init__(self):
handleMap = {
"message": (self.recvMessageStanza, self.sendMessageEntity),
"iq": (self.recvIq, self.sendIq)
}
super(YowMediaProtocolLayer, self).__init__(handleMap)
def __str__(self):
return "Media Layer"
def sendMessageEntity(self, entity):
if entity.getType() == "media":
self.entityToLower(entity)
def recvMessageStanza(self, node):
if node.getAttributeValue("type") == "media":
mediaNode = node.getChild("proto")
if mediaNode.getAttributeValue("mediatype") == "image":
entity = ImageDownloadableMediaMessageProtocolEntity.fromProtocolTreeNode(node)
self.toUpper(entity)
elif mediaNode.getAttributeValue("mediatype") == "sticker":
entity = StickerDownloadableMediaMessageProtocolEntity.fromProtocolTreeNode(node)
self.toUpper(entity)
elif mediaNode.getAttributeValue("mediatype") in ("audio", "ptt"):
entity = AudioDownloadableMediaMessageProtocolEntity.fromProtocolTreeNode(node)
self.toUpper(entity)
elif mediaNode.getAttributeValue("mediatype") in ("video", "gif"):
entity = VideoDownloadableMediaMessageProtocolEntity.fromProtocolTreeNode(node)
self.toUpper(entity)
elif mediaNode.getAttributeValue("mediatype") == "location":
entity = LocationMediaMessageProtocolEntity.fromProtocolTreeNode(node)
self.toUpper(entity)
elif mediaNode.getAttributeValue("mediatype") == "contact":
entity = ContactMediaMessageProtocolEntity.fromProtocolTreeNode(node)
self.toUpper(entity)
elif mediaNode.getAttributeValue("mediatype") == "document":
entity = DocumentDownloadableMediaMessageProtocolEntity.fromProtocolTreeNode(node)
self.toUpper(entity)
elif mediaNode.getAttributeValue("mediatype") == "url":
entity = ExtendedTextMediaMessageProtocolEntity.fromProtocolTreeNode(node)
self.toUpper(entity)
else:
logger.warn("Unsupported mediatype: %s, will send receipts" % mediaNode.getAttributeValue("mediatype"))
self.toLower(MediaMessageProtocolEntity.fromProtocolTreeNode(node).ack(True).toProtocolTreeNode())
def sendIq(self, entity):
"""
:type entity: IqProtocolEntity
"""
if entity.getType() == IqProtocolEntity.TYPE_SET and entity.getXmlns() == "w:m":
#media upload!
self._sendIq(entity, self.onRequestUploadSuccess, self.onRequestUploadError)
def recvIq(self, node):
"""
:type node: ProtocolTreeNode
"""
def onRequestUploadSuccess(self, resultNode, requestUploadEntity):
self.toUpper(ResultRequestUploadIqProtocolEntity.fromProtocolTreeNode(resultNode))
def onRequestUploadError(self, errorNode, requestUploadEntity):
self.toUpper(ErrorIqProtocolEntity.fromProtocolTreeNode(errorNode))
/*
* Copyright 2016 Red Hat, Inc. and/or its affiliates.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.drools.testcoverage.functional.parser
declare DroolsTest
arg: int
end
rule "test"
when
DroolsTest( arg == (1 + 1) );
// REMOVE THE COMMENTED PARTS BELOW TO GET THE EXCEPTION:
(or eval(true);
eval(true);
)
then
System.out.println( "it works!" );
end
/**
* Copyright (c) 2016 Tino Reichardt
* All rights reserved.
*
* You can contact the author at:
* - zstdmt source repository: https://github.com/mcmilk/zstdmt
*
* This source code is licensed under both the BSD-style license (found in the
* LICENSE file in the root directory of this source tree) and the GPLv2 (found
* in the COPYING file in the root directory of this source tree).
* You may select, at your option, one of the above-listed licenses.
*/
#ifndef THREADING_H_938743
#define THREADING_H_938743
#include "debug.h"
#if defined (__cplusplus)
extern "C" {
#endif
#if defined(ZSTD_MULTITHREAD) && defined(_WIN32)
/**
* Windows minimalist Pthread Wrapper, based on :
* http://www.cse.wustl.edu/~schmidt/win32-cv-1.html
*/
#ifdef WINVER
# undef WINVER
#endif
#define WINVER 0x0600
#ifdef _WIN32_WINNT
# undef _WIN32_WINNT
#endif
#define _WIN32_WINNT 0x0600
#ifndef WIN32_LEAN_AND_MEAN
# define WIN32_LEAN_AND_MEAN
#endif
#undef ERROR /* reported already defined on VS 2015 (Rich Geldreich) */
#include <windows.h>
#undef ERROR
#define ERROR(name) ZSTD_ERROR(name)
/* mutex */
#define ZSTD_pthread_mutex_t CRITICAL_SECTION
#define ZSTD_pthread_mutex_init(a, b) ((void)(b), InitializeCriticalSection((a)), 0)
#define ZSTD_pthread_mutex_destroy(a) DeleteCriticalSection((a))
#define ZSTD_pthread_mutex_lock(a) EnterCriticalSection((a))
#define ZSTD_pthread_mutex_unlock(a) LeaveCriticalSection((a))
/* condition variable */
#define ZSTD_pthread_cond_t CONDITION_VARIABLE
#define ZSTD_pthread_cond_init(a, b) ((void)(b), InitializeConditionVariable((a)), 0)
#define ZSTD_pthread_cond_destroy(a) ((void)(a))
#define ZSTD_pthread_cond_wait(a, b) SleepConditionVariableCS((a), (b), INFINITE)
#define ZSTD_pthread_cond_signal(a) WakeConditionVariable((a))
#define ZSTD_pthread_cond_broadcast(a) WakeAllConditionVariable((a))
/* ZSTD_pthread_create() and ZSTD_pthread_join() */
typedef struct {
HANDLE handle;
void* (*start_routine)(void*);
void* arg;
} ZSTD_pthread_t;
int ZSTD_pthread_create(ZSTD_pthread_t* thread, const void* unused,
void* (*start_routine) (void*), void* arg);
int ZSTD_pthread_join(ZSTD_pthread_t thread, void** value_ptr);
/**
* add here more wrappers as required
*/
#elif defined(ZSTD_MULTITHREAD) /* posix assumed ; need a better detection method */
/* === POSIX Systems === */
# include <pthread.h>
#if DEBUGLEVEL < 1
#define ZSTD_pthread_mutex_t pthread_mutex_t
#define ZSTD_pthread_mutex_init(a, b) pthread_mutex_init((a), (b))
#define ZSTD_pthread_mutex_destroy(a) pthread_mutex_destroy((a))
#define ZSTD_pthread_mutex_lock(a) pthread_mutex_lock((a))
#define ZSTD_pthread_mutex_unlock(a) pthread_mutex_unlock((a))
#define ZSTD_pthread_cond_t pthread_cond_t
#define ZSTD_pthread_cond_init(a, b) pthread_cond_init((a), (b))
#define ZSTD_pthread_cond_destroy(a) pthread_cond_destroy((a))
#define ZSTD_pthread_cond_wait(a, b) pthread_cond_wait((a), (b))
#define ZSTD_pthread_cond_signal(a) pthread_cond_signal((a))
#define ZSTD_pthread_cond_broadcast(a) pthread_cond_broadcast((a))
#define ZSTD_pthread_t pthread_t
#define ZSTD_pthread_create(a, b, c, d) pthread_create((a), (b), (c), (d))
#define ZSTD_pthread_join(a, b) pthread_join((a),(b))
#else /* DEBUGLEVEL >= 1 */
/* Debug implementation of threading.
* In this implementation we use pointers for mutexes and condition variables.
* This way, if we forget to init/destroy them the program will crash or ASAN
* will report leaks.
*/
#define ZSTD_pthread_mutex_t pthread_mutex_t*
int ZSTD_pthread_mutex_init(ZSTD_pthread_mutex_t* mutex, pthread_mutexattr_t const* attr);
int ZSTD_pthread_mutex_destroy(ZSTD_pthread_mutex_t* mutex);
#define ZSTD_pthread_mutex_lock(a) pthread_mutex_lock(*(a))
#define ZSTD_pthread_mutex_unlock(a) pthread_mutex_unlock(*(a))
#define ZSTD_pthread_cond_t pthread_cond_t*
int ZSTD_pthread_cond_init(ZSTD_pthread_cond_t* cond, pthread_condattr_t const* attr);
int ZSTD_pthread_cond_destroy(ZSTD_pthread_cond_t* cond);
#define ZSTD_pthread_cond_wait(a, b) pthread_cond_wait(*(a), *(b))
#define ZSTD_pthread_cond_signal(a) pthread_cond_signal(*(a))
#define ZSTD_pthread_cond_broadcast(a) pthread_cond_broadcast(*(a))
#define ZSTD_pthread_t pthread_t
#define ZSTD_pthread_create(a, b, c, d) pthread_create((a), (b), (c), (d))
#define ZSTD_pthread_join(a, b) pthread_join((a),(b))
#endif
#else /* ZSTD_MULTITHREAD not defined */
/* No multithreading support */
typedef int ZSTD_pthread_mutex_t;
#define ZSTD_pthread_mutex_init(a, b) ((void)(a), (void)(b), 0)
#define ZSTD_pthread_mutex_destroy(a) ((void)(a))
#define ZSTD_pthread_mutex_lock(a) ((void)(a))
#define ZSTD_pthread_mutex_unlock(a) ((void)(a))
typedef int ZSTD_pthread_cond_t;
#define ZSTD_pthread_cond_init(a, b) ((void)(a), (void)(b), 0)
#define ZSTD_pthread_cond_destroy(a) ((void)(a))
#define ZSTD_pthread_cond_wait(a, b) ((void)(a), (void)(b))
#define ZSTD_pthread_cond_signal(a) ((void)(a))
#define ZSTD_pthread_cond_broadcast(a) ((void)(a))
/* do not use ZSTD_pthread_t */
#endif /* ZSTD_MULTITHREAD */
#if defined (__cplusplus)
}
#endif
#endif /* THREADING_H_938743 */
{
"name": "auth",
"scripts": {
"start": "serve . --single"
},
"dependencies": {
"serve": "latest"
}
}
var baseGetTag = require('./_baseGetTag'),
isObjectLike = require('./isObjectLike');
/** `Object#toString` result references. */
var boolTag = '[object Boolean]';
/**
* Checks if `value` is classified as a boolean primitive or object.
*
* @static
* @memberOf _
* @since 0.1.0
* @category Lang
* @param {*} value The value to check.
* @returns {boolean} Returns `true` if `value` is a boolean, else `false`.
* @example
*
* _.isBoolean(false);
* // => true
*
* _.isBoolean(null);
* // => false
*/
function isBoolean(value) {
return value === true || value === false ||
(isObjectLike(value) && baseGetTag(value) == boolTag);
}
module.exports = isBoolean;
// Delete all of the text from the combo box's edit control and copy it
// to the clipboard.
m_MyComboBox.SetEditSel(0, -1);
m_MyComboBox.Cut();
<?php
/*********************************************************************************
* This file is part of Sentrifugo.
* Copyright (C) 2014 Sapplica
*
* Sentrifugo is free software: you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Sentrifugo is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with Sentrifugo. If not, see <http://www.gnu.org/licenses/>.
*
* Sentrifugo Support <[email protected]>
********************************************************************************/
?>
<?php if (count($this->messages)) {?>
<div id="dispmsgbankaccount" class="settingssuccess">
<?php
foreach ($this->messages as $message)
{
$flag = array();
$flag = array_keys($message);
echo "<div id='messageData' class='ml-alert-1-$flag[0]'><div style='display:block;'><span class='style-1-icon $flag[0]'></span>";
echo $message[$flag[0]];
echo "</div></div>";
}
?>
</div>
<?php } ?>
<span id="error_message"></span>
<?php
if($this->call == 'ajaxcall'){
$div = ''; $endDiv = '';
} else {
$div = '<div id="grid_'.$this->dataArray[0]['objectname'].'" class="all-grid-control">'; $endDiv = '</div>';
}
echo $div.'<div id="msg" style="display:none;font-style: oblique; font-weight: bold; color: green;"></div>'.$this->grid($this->dataArray[0]).$endDiv;
?>
<script type='text/javascript'>
$(document).ready(function(){
<?php if (count($this->messages)) {?>
setTimeout(function(){
$('#dispmsgbankaccount').fadeOut('slow');
},3000);
<?php } ?>
})
</script>
<?xml version='1.0' encoding='ISO-8859-1' ?>
<!DOCTYPE toc
PUBLIC
"-//Sun Microsystems Inc.//DTD
JavaHelp TOC Version 2.0//EN"
"http://java.sun.com/products/javahelp/toc_2_0.dtd">
<toc version="2.0" categoryclosedimage="folder" categoryopenimage="folder" topicimage="doc">
<tocitem text="General">
<tocitem text="Abstracted Metadata" target="AbstractedMetadata"/>
</tocitem>
<tocitem text="Sentinel Toolbox Application">
</tocitem>
</toc>
| {
"pile_set_name": "Github"
} |
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windows variants
if not "%OS%" == "Windows_NT" goto win9xME_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega
package com.yyydjk.sliderlayoutdemo;
import android.app.Application;
import android.test.ApplicationTestCase;
/**
* <a href="http://d.android.com/tools/testing/testing_android.html">Testing Fundamentals</a>
*/
public class ApplicationTest extends ApplicationTestCase<Application> {
public ApplicationTest() {
super(Application.class);
}
}
import {Injectable} from '@angular/core';
import {
AnonymousLocalUser,
AnonymousRemoteUser,
PairwiseSessionLite,
RegisteredRemoteUser,
Transport
} from '../../crypto/castle';
import {SessionService} from '../session.service';
import {AccountDatabaseService} from './account-database.service';
import {CastleService} from './castle.service';
import {PotassiumService} from './potassium.service';
/**
* Castle instance for an anonymous user.
*/
@Injectable()
export class AnonymousCastleService extends CastleService {
/** @inheritDoc */
public async init (sessionService: SessionService) : Promise<void> {
const transport = new Transport(sessionService);
const handshakeState = await sessionService.handshakeState(
undefined,
undefined,
sessionService.sharedSecret !== undefined ? undefined : true
);
const localUser = new AnonymousLocalUser(
this.potassiumService,
handshakeState,
sessionService.sharedSecret
);
const remoteUser =
sessionService.sharedSecret !== undefined ?
new AnonymousRemoteUser(
this.potassiumService,
handshakeState,
sessionService.sharedSecret,
sessionService.remoteUsername
) :
new RegisteredRemoteUser(
this.accountDatabaseService,
false,
sessionService.remoteUsername
);
this.pairwiseSession.resolve(
new PairwiseSessionLite(
undefined,
undefined,
this.potassiumService,
transport,
localUser,
remoteUser,
handshakeState
)
);
}
/** @inheritDoc */
public spawn () : AnonymousCastleService {
return new AnonymousCastleService(
this.accountDatabaseService,
this.potassiumService
);
}
constructor (
/** @ignore */
private readonly accountDatabaseService: AccountDatabaseService,
/** @ignore */
private readonly potassiumService: PotassiumService
) {
super();
}
}
| {
"pile_set_name": "Github"
} |
using System;
using System.Collections.Generic;
using System.Linq;
using Merchello.Core.Models;
using Merchello.Core.Persistence;
using Merchello.Core.Services;
using Merchello.Tests.Base.Respositories;
using NUnit.Framework;
namespace Merchello.Tests.UnitTests.Services
{
using Moq;
using Umbraco.Core.Logging;
using Umbraco.Core.Persistence.SqlSyntax;
[TestFixture]
public class SettingsServiceTests
{
private IStoreSettingService _storeSettingService;
[SetUp]
public void Init()
{
var logger = Logger.CreateWithDefaultLog4NetConfiguration();
var syntax = new Mock<ISqlSyntaxProvider>().Object;
_storeSettingService = new StoreSettingService(logger, syntax);
}
[Test]
public void Can_Retrieve_A_List_Of_Countries_Without_Duplicates()
{
//// Arrange
//// Act
var countries = _storeSettingService.GetAllCountries();
foreach (var country in countries.OrderBy(x =>x.CountryCode))
{
Console.WriteLine("{0} {1}", country.CountryCode, country.Name);
}
var distinctCodes = countries.Select(x => x.CountryCode).Distinct();
//// Assert
Assert.AreEqual(countries.Count(), distinctCodes.Count());
}
///// <summary>
///// Test verifies that the service correctly removes countries when passed an array of country codes
///// </summary>
//[Test]
//public void Can_Retrieve_A_RegionList_That_Excludes_Countries()
//{
// //// Arrange
// var excludes = new[] {"SA", "DK"};
// //// Act
// var regions = _storeSettingService.GetAllCountries(excludes);
// //// Assert
// Assert.IsTrue(regions.Any());
// Assert.IsFalse(regions.Contains(new Country("SA")));
// Assert.IsFalse(regions.Contains(new Country("DK")));
//}
/// <summary>
/// Test verifies that the service correctly returns the RegionInfo for Denmark given the country code DK
/// </summary>
[Test]
public void Can_Retrieve_Denmark_Region_By_DK_Code()
{
//// Arrange
const string countryCode = "DK";
//// Act
var denmark = _storeSettingService.GetCountryByCode(countryCode);
//// Assert
Assert.NotNull(denmark);
Assert.AreEqual(countryCode, denmark.CountryCode);
}
/// <summary>
/// Test verifies that a collection of currency codes can be created
/// </summary>
[Test]
public void Can_Create_A_Collection_Of_Currency_Codes()
{
//// Arrange
//// Act
var currencies = _storeSettingService.GetAllCurrencies().OrderBy(x => x.Name);
foreach (var currency in currencies)
{
Console.WriteLine("{0} {1} {2}", currency.CurrencyCode, currency.Symbol, currency.Name);
}
//// Assert
Assert.IsTrue(currencies.Any());
}
}
} | {
"pile_set_name": "Github"
} |
// Copyright 2015 The etcd Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package types
import "strconv"
// ID represents a generic identifier which is canonically
// stored as a uint64 but is typically represented as a
// base-16 string for input/output
type ID uint64
func (i ID) String() string {
return strconv.FormatUint(uint64(i), 16)
}
// IDFromString attempts to create an ID from a base-16 string.
func IDFromString(s string) (ID, error) {
i, err := strconv.ParseUint(s, 16, 64)
return ID(i), err
}
// IDSlice implements the sort interface
type IDSlice []ID
func (p IDSlice) Len() int { return len(p) }
func (p IDSlice) Less(i, j int) bool { return uint64(p[i]) < uint64(p[j]) }
func (p IDSlice) Swap(i, j int) { p[i], p[j] = p[j], p[i] }
| {
"pile_set_name": "Github"
} |
cs
el
fi
fr
ga-IE
he
hu
it
nb-NO
nl
pl
ro
sv-SE
ru
| {
"pile_set_name": "Github"
} |
; Scheme 9 from Empty Space, Function Library
; By Nils M Holm, 2010
; Placed in the Public Domain
;
; (r4rs-syntax-objects) ==> list
; (s9fes-syntax-objects) ==> list
; (r4rs-procedures) ==> list
; (s9fes-procedures) ==> list
; (s9fes-extension-procedures) ==> list
; (s9fes-extension-symbols) ==> list
;
; (load-from-library "symbols.scm")
;
; Return lists of symbols bound to the corresponding type of object.
; Note: only the R4RS symbols defined in S9fES are included here.
; Caveat Utilitor.
;
; Example: (s9fes-syntax-objects) ==> ()
(define (r4rs-syntax-objects)
'(=> and begin case cond define define-syntax delay do else if
lambda let let* letrec quote quasiquote or set! syntax-rules
unquote unquote-splicing))
(define (s9fes-syntax-objects)
'())
(define (r4rs-procedures)
'(* + - / < <= = > >= abs acos append apply asin assoc assq assv
atan boolean? caaaar caaadr caaar caadar caaddr caadr caar
cadaar cadadr cadar caddar cadddr caddr cadr
call-with-current-continuation call-with-input-file
call-with-output-file call/cc car cdaaar cdaadr cdaar cdadar
cdaddr cdadr cdar cddaar cddadr cddar cdddar cddddr cdddr cddr
cdr ceiling char->integer char-alphabetic? char-ci<=? char-ci<?
char-ci=? char-ci>=? char-ci>? char-downcase char-lower-case?
char-numeric? char-upcase char-upper-case? char-whitespace?
char<=? char<? char=? char>=? char>? char? close-input-port
close-output-port cons cos current-input-port current-output-port
display eof-object? eq? equal? eqv? even? exact->inexact exact?
exp expt floor for-each force gcd inexact->exact inexact?
input-port? integer->char integer? lcm length list list->string
list->vector list-ref list-tail list? load log make-string
make-vector map max member memq memv min modulo negative? newline
not null? number->string number? odd? open-input-file
open-output-file output-port? pair? peek-char port? positive?
procedure? quotient read read-char real? remainder reverse
round set-car! set-cdr! sin sqrt string string->list string->number
string->symbol string-append string-ci<=? string-ci<? string-ci=?
string-ci>=? string-ci>? string-copy string-fill! string-length
string-ref string-set! string<=? string<? string=? string>=?
string>? string? substring symbol->string symbol? tan truncate
unquote unquote-splicing vector vector->list vector-fill!
vector-length vector-ref vector-set! vector? with-input-from-file
with-output-to-file write write-char zero?))
(define (s9fes-procedures)
'(delete-file error file-exists? fold-left fold-right gensym
load-from-library locate-file macro-expand macro-expand-1 print
require-extension reverse! set-input-port! set-output-port! stats
symbols vector-append vector-copy void ** *extensions*
*library-path* *loading*))
(define (s9fes-extension-procedures)
'(sys:access sys:catch-errors sys:chdir sys:change-mode sys:chmod
sys:chown sys:close sys:command-line sys:creat sys:dup sys:dup2
sys:errno sys:execv sys:exit sys:fileno sys:flush sys:fork
sys:get-magic-value sys:getcwd sys:getenv sys:getgid sys:getgrgid
sys:getgrnam sys:getpgid sys:getpid sys:getpwent sys:getpwnam
sys:getpwuid sys:gettimeofday sys:getuid sys:group-name
sys:group-gid sys:kill sys:lchmod sys:lchown sys:lutimes sys:link
sys:lock sys:lseek sys:lstat sys:lstat-atime sys:lstat-ctime
sys:lstat-dev sys:lstat-gid sys:lstat-ino sys:lstat-mode
sys:lstat-mtime sys:lstat-name sys:lstat-nlink sys:lstat-size
sys:lstat-uid sys:make-input-port sys:make-output-port sys:mkdir
sys:notify sys:open sys:pipe sys:read sys:readdir sys:readlink
sys:rename sys:rmdir sys:select sys:setgid sys:setpgid sys:setuid
sys:stat sys:lstat-block-dev? sys:lstat-char-dev? sys:lstat-directory?
sys:lstat-pipe? sys:lstat-regular? sys:lstat-socket?
sys:lstat-symlink? sys:stat-atime sys:stat-block-dev?
sys:stat-char-dev? sys:stat-ctime sys:stat-dev sys:stat-directory?
sys:stat-gid sys:stat-ino sys:stat-mode sys:stat-mtime sys:stat-name
sys:stat-nlink sys:stat-pipe? sys:stat-regular? sys:stat-size
sys:stat-socket? sys:stat-uid sys:strerror sys:symlink sys:system
sys:time sys:umask sys:unlink sys:unlock sys:user-gecos
sys:user-gid sys:user-home sys:user-shell sys:user-name
sys:user-uid sys:utimes sys:wait sys:waitpid sys:write curs:addch
curs:addstr curs:attroff curs:attron curs:attrset curs:beep
curs:cbreak curs:clear curs:clearok curs:clrtobot curs:clrtoeol
curs:cols curs:cursoff curs:curson curs:delch curs:deleteln
curs:echo curs:endwin curs:flash curs:flushinp curs:getch
curs:getyx curs:idlok curs:inch curs:insch curs:initscr
curs:insertln curs:keypad curs:get-magic-value curs:lines
curs:move curs:mvaddch curs:mvaddstr curs:mvcur curs:mvdelch
curs:mvgetch curs:mvinch curs:mvinsch curs:nl curs:nocbreak
curs:nodelay curs:noecho curs:nonl curs:noraw curs:raw curs:refresh
curs:resetty curs:savetty curs:scroll curs:scrollok curs:standend
curs:standout curs:unctrl curs:ungetch))
(define (s9fes-extension-symbols)
'(sys:access-f-ok sys:access-r-ok sys:access-w-ok sys:access-x-ok
sys:read+write sys:read-only sys:s-irgrp sys:s-iroth sys:s-irusr
sys:s-irwxg sys:s-irwxo sys:s-irwxu sys:s-isgid sys:s-isuid
sys:s-isvtx sys:s-iwgrp sys:s-iwoth sys:s-iwusr sys:s-ixgrp
sys:s-ixoth sys:s-ixusr sys:seek-cur sys:seek-end sys:seek-set
sys:sigabrt sys:sigalrm sys:sigbus sys:sigemt sys:sigfpe
sys:sighup sys:sigill sys:sigint sys:sigkill sys:sigpipe
sys:sigquit sys:sigsegv sys:sigsys sys:sigterm sys:sigtrap
sys:write-only sys:inet-accept sys:inet-connect sys:inet-getpeername
sys:inet-listen curs:attr-normal curs:attr-standout curs:attr-underline
curs:attr-bold curs:key-backspace curs:key-up curs:key-down
curs:key-left curs:key-right curs:key-home curs:key-eol
curs:key-ppage curs:key-npage curs:key-dc curs:key-ic curs:key-end
curs:color-black curs:color-blue curs:color-green curs:color-cyan
curs:color-red curs:color-magenta curs:color-yellow curs:color-gray))
| {
"pile_set_name": "Github"
} |
/*
* MobiCore Driver API.
*
* The MobiCore (MC) Driver API provides access functions to the MobiCore
* runtime environment and the contained Trustlets.
*
* <-- Copyright Giesecke & Devrient GmbH 2009 - 2012 -->
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* 3. The name of the author may not be used to endorse or promote
* products derived from this software without specific prior
* written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS
* OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
* WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
* DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
* GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
* INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
* WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
* NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
* SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
#ifndef _MOBICORE_DRIVER_API_H_
#define _MOBICORE_DRIVER_API_H_
#define __MC_CLIENT_LIB_API
#include "mcuuid.h"
/*
* Return values of MobiCore driver functions.
*/
enum mc_result {
/* Function call succeeded. */
MC_DRV_OK = 0,
/* No notification available. */
MC_DRV_NO_NOTIFICATION = 1,
/* Error during notification on communication level. */
MC_DRV_ERR_NOTIFICATION = 2,
/* Function not implemented. */
MC_DRV_ERR_NOT_IMPLEMENTED = 3,
/* No more resources available. */
MC_DRV_ERR_OUT_OF_RESOURCES = 4,
/* Driver initialization failed. */
MC_DRV_ERR_INIT = 5,
/* Unknown error. */
MC_DRV_ERR_UNKNOWN = 6,
/* The specified device is unknown. */
MC_DRV_ERR_UNKNOWN_DEVICE = 7,
/* The specified session is unknown.*/
MC_DRV_ERR_UNKNOWN_SESSION = 8,
/* The specified operation is not allowed. */
MC_DRV_ERR_INVALID_OPERATION = 9,
/* The response header from the MC is invalid. */
MC_DRV_ERR_INVALID_RESPONSE = 10,
/* Function call timed out. */
MC_DRV_ERR_TIMEOUT = 11,
/* Can not allocate additional memory. */
MC_DRV_ERR_NO_FREE_MEMORY = 12,
/* Free memory failed. */
MC_DRV_ERR_FREE_MEMORY_FAILED = 13,
/* Still some open sessions pending. */
MC_DRV_ERR_SESSION_PENDING = 14,
/* MC daemon not reachable */
MC_DRV_ERR_DAEMON_UNREACHABLE = 15,
/* The device file of the kernel module could not be opened. */
MC_DRV_ERR_INVALID_DEVICE_FILE = 16,
/* Invalid parameter. */
MC_DRV_ERR_INVALID_PARAMETER = 17,
/* Unspecified error from Kernel Module*/
MC_DRV_ERR_KERNEL_MODULE = 18,
/* Error during mapping of additional bulk memory to session. */
MC_DRV_ERR_BULK_MAPPING = 19,
/* Error during unmapping of additional bulk memory to session. */
MC_DRV_ERR_BULK_UNMAPPING = 20,
/* Notification received, exit code available. */
MC_DRV_INFO_NOTIFICATION = 21,
/* Set up of NWd connection failed. */
MC_DRV_ERR_NQ_FAILED = 22
};
/*
* Driver control command.
*/
enum mc_driver_ctrl {
/* Return the driver version */
MC_CTRL_GET_VERSION = 1
};
/*
* Structure of Session Handle, includes the Session ID and the Device ID the
* Session belongs to.
* The session handle will be used for session-based MobiCore communication.
* It will be passed to calls which address a communication end point in the
* MobiCore environment.
*/
struct mc_session_handle {
uint32_t session_id; /* MobiCore session ID */
uint32_t device_id; /* Device ID the session belongs to */
};
/*
* Information structure about additional mapped Bulk buffer between the
* Trustlet Connector (NWd) and the Trustlet (SWd). This structure is
* initialized from a Trustlet Connector by calling mc_map().
* In order to use the memory within a Trustlet the Trustlet Connector has to
* inform the Trustlet with the content of this structure via the TCI.
*/
struct mc_bulk_map {
/* The virtual address of the Bulk buffer regarding the address space
* of the Trustlet, already includes a possible offset! */
void *secure_virt_addr;
uint32_t secure_virt_len; /* Length of the mapped Bulk buffer */
};
/* The default device ID */
#define MC_DEVICE_ID_DEFAULT 0
/* Wait infinite for a response of the MC. */
#define MC_INFINITE_TIMEOUT ((int32_t)(-1))
/* Do not wait for a response of the MC. */
#define MC_NO_TIMEOUT 0
/* TCI/DCI must not exceed 1MiB */
#define MC_MAX_TCI_LEN 0x100000
/**
* mc_open_device() - Open a new connection to a MobiCore device.
* @device_id: Identifier for the MobiCore device to be used.
* MC_DEVICE_ID_DEFAULT refers to the default device.
*
* Initializes all device specific resources required to communicate with a
* MobiCore instance located on the specified device in the system. If the
* device does not exist the function will return MC_DRV_ERR_UNKNOWN_DEVICE.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
* MC_DRV_ERR_INVALID_OPERATION: device already opened
* MC_DRV_ERR_DAEMON_UNREACHABLE: problems with daemon
* MC_DRV_ERR_UNKNOWN_DEVICE: device_id unknown
* MC_DRV_ERR_INVALID_DEVICE_FILE: kernel module under /dev/mobicore
* cannot be opened
*/
__MC_CLIENT_LIB_API enum mc_result mc_open_device(uint32_t device_id);
/**
* mc_close_device() - Close the connection to a MobiCore device.
* @device_id: Identifier for the MobiCore device.
*
* When closing a device, active sessions have to be closed beforehand.
* Resources associated with the device will be released.
* The device may be opened again after it has been closed.
*
* MC_DEVICE_ID_DEFAULT refers to the default device.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
* MC_DRV_ERR_UNKNOWN_DEVICE: device id is invalid
* MC_DRV_ERR_SESSION_PENDING: a session is still open
* MC_DRV_ERR_DAEMON_UNREACHABLE: problems with daemon occur
*/
__MC_CLIENT_LIB_API enum mc_result mc_close_device(uint32_t device_id);
/**
* mc_open_session() - Open a new session to a Trustlet.
* @session: On success, the session data will be returned
* @uuid: UUID of the Trustlet to be opened
* @tci: TCI buffer for communicating with the Trustlet
* @tci_len: Length of the TCI buffer. Maximum allowed value
* is MC_MAX_TCI_LEN
*
* The Trustlet with the given UUID has to be available in the flash filesystem.
*
* Write MCP open message to buffer and notify MobiCore about the availability
* of a new command.
*
 * Waits until the MobiCore responds with the new session ID (stored in the MCP
* buffer).
*
* Note that session.device_id has to be the device id of an opened device.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
 * MC_DRV_ERR_INVALID_PARAMETER:	session parameter is invalid
* MC_DRV_ERR_UNKNOWN_DEVICE: device id is invalid
* MC_DRV_ERR_DAEMON_UNREACHABLE: problems with daemon socket occur
* MC_DRV_ERR_NQ_FAILED: daemon returns an error
*/
__MC_CLIENT_LIB_API enum mc_result mc_open_session(
struct mc_session_handle *session, const struct mc_uuid_t *uuid,
uint8_t *tci, uint32_t tci_len);
/**
* mc_close_session() - Close a Trustlet session.
* @session: Session to be closed.
*
* Closes the specified MobiCore session. The call will block until the
* session has been closed.
*
* Device device_id has to be opened in advance.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
 * MC_DRV_ERR_INVALID_PARAMETER:	session parameter is invalid
* MC_DRV_ERR_UNKNOWN_SESSION: session id is invalid
* MC_DRV_ERR_UNKNOWN_DEVICE: device id of session is invalid
* MC_DRV_ERR_DAEMON_UNREACHABLE: problems with daemon occur
* MC_DRV_ERR_INVALID_DEVICE_FILE: daemon cannot open Trustlet file
*/
__MC_CLIENT_LIB_API enum mc_result mc_close_session(
struct mc_session_handle *session);
/**
* mc_notify() - Notify a session.
* @session: The session to be notified.
*
* Notifies the session end point about available message data.
* If the session parameter is correct, notify will always succeed.
* Corresponding errors can only be received by mc_wait_notification().
*
* A session has to be opened in advance.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
 * MC_DRV_ERR_INVALID_PARAMETER:	session parameter is invalid
* MC_DRV_ERR_UNKNOWN_SESSION: session id is invalid
* MC_DRV_ERR_UNKNOWN_DEVICE: device id of session is invalid
*/
__MC_CLIENT_LIB_API enum mc_result mc_notify(struct mc_session_handle *session);
/**
* mc_wait_notification() - Wait for a notification.
* @session: The session the notification should correspond to.
* @timeout: Time in milliseconds to wait
* (MC_NO_TIMEOUT : direct return, > 0 : milliseconds,
* MC_INFINITE_TIMEOUT : wait infinitely)
*
* Wait for a notification issued by the MobiCore for a specific session.
* The timeout parameter specifies the number of milliseconds the call will wait
* for a notification.
*
* If the caller passes 0 as timeout value the call will immediately return.
* If timeout value is below 0 the call will block until a notification for the
* session has been received.
*
*
 * Caller has to trust the other side to send a notification to wake it up
* again.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
* MC_DRV_ERR_TIMEOUT: no notification arrived in time
* MC_DRV_INFO_NOTIFICATION: a problem with the session was
* encountered. Get more details with
* mc_get_session_error_code()
* MC_DRV_ERR_NOTIFICATION: a problem with the socket occurred
 * MC_DRV_ERR_INVALID_PARAMETER:	a parameter is invalid
* MC_DRV_ERR_UNKNOWN_SESSION: session id is invalid
* MC_DRV_ERR_UNKNOWN_DEVICE: device id of session is invalid
*/
__MC_CLIENT_LIB_API enum mc_result mc_wait_notification(
struct mc_session_handle *session, int32_t timeout);
/**
* mc_malloc_wsm() - Allocate a block of world shared memory (WSM).
* @device_id: The ID of an opened device to retrieve the WSM from.
* @align: The alignment (number of pages) of the memory block
* (e.g. 0x00000001 for 4kb).
* @len: Length of the block in bytes.
* @wsm: Virtual address of the world shared memory block.
* @wsm_flags: Platform specific flags describing the memory to
* be allocated.
*
* The MC driver allocates a contiguous block of memory which can be used as
* WSM.
* This implicates that the allocated memory is aligned according to the
* alignment parameter.
*
* Always returns a buffer of size WSM_SIZE aligned to 4K.
*
* Align and wsm_flags are currently ignored
*
* Return codes:
* MC_DRV_OK: operation completed successfully
 * MC_DRV_ERR_INVALID_PARAMETER:	a parameter is invalid
* MC_DRV_ERR_UNKNOWN_DEVICE: device id is invalid
* MC_DRV_ERR_NO_FREE_MEMORY: no more contiguous memory is
* available in this size or for this
* process
*/
__MC_CLIENT_LIB_API enum mc_result mc_malloc_wsm(
uint32_t device_id,
uint32_t align,
uint32_t len,
uint8_t **wsm,
uint32_t wsm_flags
);
/**
* mc_free_wsm() - Free a block of world shared memory (WSM).
* @device_id: The ID to which the given address belongs
* @wsm: Address of WSM block to be freed
*
* The MC driver will free a block of world shared memory (WSM) previously
* allocated with mc_malloc_wsm(). The caller has to assure that the address
* handed over to the driver is a valid WSM address.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
 * MC_DRV_ERR_INVALID_PARAMETER:	a parameter is invalid
* MC_DRV_ERR_UNKNOWN_DEVICE: when device id is invalid
* MC_DRV_ERR_FREE_MEMORY_FAILED: on failure
*/
__MC_CLIENT_LIB_API enum mc_result mc_free_wsm(uint32_t device_id,
uint8_t *wsm);
/**
 * mc_map() - Map additional bulk buffer between a Trustlet Connector (TLC)
* and the Trustlet (TL) for a session
* @session: Session handle with information of the device_id and
* the session_id. The given buffer is mapped to the
* session specified in the sessionHandle
* @buf: Virtual address of a memory portion (relative to TLC)
* to be shared with the Trustlet, already includes a
* possible offset!
* @len: length of buffer block in bytes.
* @map_info: Information structure about the mapped Bulk buffer
* between the TLC (NWd) and the TL (SWd).
*
* Memory allocated in user space of the TLC can be mapped as additional
 * communication channel (besides TCI) to the Trustlet. Limitations of the
* Trustlet memory structure apply: only 6 chunks can be mapped with a maximum
* chunk size of 1 MiB each.
*
* It is up to the application layer (TLC) to inform the Trustlet
* about the additional mapped bulk memory.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
 * MC_DRV_ERR_INVALID_PARAMETER:	a parameter is invalid
* MC_DRV_ERR_UNKNOWN_SESSION: session id is invalid
* MC_DRV_ERR_UNKNOWN_DEVICE: device id of session is invalid
* MC_DRV_ERR_DAEMON_UNREACHABLE: problems with daemon occur
 * MC_DRV_ERR_BULK_MAPPING:	buf is already used as a bulk buffer or
* when registering the buffer failed
*/
__MC_CLIENT_LIB_API enum mc_result mc_map(
struct mc_session_handle *session, void *buf, uint32_t len,
struct mc_bulk_map *map_info);
/**
* mc_unmap() - Remove additional mapped bulk buffer between Trustlet Connector
* (TLC) and the Trustlet (TL) for a session
* @session: Session handle with information of the device_id and
* the session_id. The given buffer is unmapped from the
* session specified in the sessionHandle.
* @buf: Virtual address of a memory portion (relative to TLC)
* shared with the TL, already includes a possible offset!
* @map_info: Information structure about the mapped Bulk buffer
* between the TLC (NWd) and the TL (SWd)
*
* The bulk buffer will immediately be unmapped from the session context.
*
* The application layer (TLC) must inform the TL about unmapping of the
* additional bulk memory before calling mc_unmap!
*
* The clientlib currently ignores the len field in map_info.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
 * MC_DRV_ERR_INVALID_PARAMETER:	a parameter is invalid
* MC_DRV_ERR_UNKNOWN_SESSION: session id is invalid
* MC_DRV_ERR_UNKNOWN_DEVICE: device id of session is invalid
* MC_DRV_ERR_DAEMON_UNREACHABLE: problems with daemon occur
* MC_DRV_ERR_BULK_UNMAPPING: buf was not registered earlier
* or when unregistering failed
*/
__MC_CLIENT_LIB_API enum mc_result mc_unmap(
struct mc_session_handle *session, void *buf,
struct mc_bulk_map *map_info);
/**
* mc_driver_ctrl() - Execute driver specific command.
* @param: Command ID of the command to be executed
* @data: Command data and response depending on command
* @len: Length of the data block
*
* Can be used to execute driver specific commands. Besides the control command
* MC_CTRL_GET_VERSION commands are implementation specific.
*
* Please refer to the corresponding specification of the driver manufacturer.
*
* Return codes:
* MC_DRV_ERR_NOT_IMPLEMENTED.
*/
__MC_CLIENT_LIB_API enum mc_result mc_driver_ctrl(
enum mc_driver_ctrl param, uint8_t *data, uint32_t len);
/**
* mc_manage() - Execute application management command.
* @device_id: Identifier for the MobiCore device to be used.
* NULL refers to the default device.
* @data: Command data/response data depending on command
* @len: Length of the data block
*
* Shall be used to exchange application management commands with the MobiCore.
* The MobiCore Application Management Protocol is described in [MCAMP].
*
* Return codes:
* MC_DRV_ERR_NOT_IMPLEMENTED.
*/
__MC_CLIENT_LIB_API enum mc_result mc_manage(
uint32_t device_id, uint8_t *data, uint32_t len);
/**
* mc_get_session_error_code() - Get additional error information of the last
* error that occurred on a session.
* @session: Session handle with information of the device_id and
* the session_id
* @last_error: >0 Trustlet has terminated itself with this value,
* <0 Trustlet is dead because of an error within the
* MobiCore (e.g. Kernel exception). See also MCI
* definition.
*
* After the request the stored error code will be deleted.
*
* Return codes:
* MC_DRV_OK: operation completed successfully
 * MC_DRV_ERR_INVALID_PARAMETER:	a parameter is invalid
* MC_DRV_ERR_UNKNOWN_SESSION: session id is invalid
* MC_DRV_ERR_UNKNOWN_DEVICE: device id of session is invalid
*/
__MC_CLIENT_LIB_API enum mc_result mc_get_session_error_code(
struct mc_session_handle *session, int32_t *last_error);
#endif /* _MOBICORE_DRIVER_API_H_ */
| {
"pile_set_name": "Github"
} |
---
title: "unaryManagementConditionExpressionOperatorType enum type"
description: "Supported operators for unary management condition expressions"
author: "dougeby"
localization_priority: Normal
ms.prod: "intune"
doc_type: enumPageType
---
# unaryManagementConditionExpressionOperatorType enum type
Namespace: microsoft.graph
> **Important:** Microsoft Graph APIs under the /beta version are subject to change; production use is not supported.
> **Note:** The Microsoft Graph API for Intune requires an [active Intune license](https://go.microsoft.com/fwlink/?linkid=839381) for the tenant.
Supported operators for unary management condition expressions
## Members
|Member|Value|Description|
|:---|:---|:---|
|not|0|Negates the evaluation of the operand.|
| {
"pile_set_name": "Github"
} |
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
is.autoload=true
javac.source=1.8
javac.compilerargs=-Xlint -Xlint:-serial
| {
"pile_set_name": "Github"
} |
<html>
<body>
<p>this tag should be <spot></spot>split at caret position</p>
</body>
</html> | {
"pile_set_name": "Github"
} |
// Copyright Aleksey Gurtovoy 2000-2004
//
// Distributed under the Boost Software License, Version 1.0.
// (See accompanying file LICENSE_1_0.txt or copy at
// http://www.boost.org/LICENSE_1_0.txt)
//
// Preprocessed version of "boost/mpl/not_equal_to.hpp" header
// -- DO NOT modify by hand!
namespace boost { namespace mpl {
template<
typename Tag1
, typename Tag2
>
struct not_equal_to_impl
: if_c<
( BOOST_MPL_AUX_NESTED_VALUE_WKND(int, Tag1)
> BOOST_MPL_AUX_NESTED_VALUE_WKND(int, Tag2)
)
, aux::cast2nd_impl< not_equal_to_impl< Tag1,Tag1 >,Tag1, Tag2 >
, aux::cast1st_impl< not_equal_to_impl< Tag2,Tag2 >,Tag1, Tag2 >
>::type
{
};
/// for Digital Mars C++/compilers with no CTPS/TTP support
template<> struct not_equal_to_impl< na,na >
{
template< typename U1, typename U2 > struct apply
{
typedef apply type;
BOOST_STATIC_CONSTANT(int, value = 0);
};
};
template<> struct not_equal_to_impl< na,integral_c_tag >
{
template< typename U1, typename U2 > struct apply
{
typedef apply type;
BOOST_STATIC_CONSTANT(int, value = 0);
};
};
template<> struct not_equal_to_impl< integral_c_tag,na >
{
template< typename U1, typename U2 > struct apply
{
typedef apply type;
BOOST_STATIC_CONSTANT(int, value = 0);
};
};
template< typename T > struct not_equal_to_tag
{
typedef typename T::tag type;
};
template<
typename BOOST_MPL_AUX_NA_PARAM(N1)
, typename BOOST_MPL_AUX_NA_PARAM(N2)
>
struct not_equal_to
: not_equal_to_impl<
typename not_equal_to_tag<N1>::type
, typename not_equal_to_tag<N2>::type
>::template apply< N1,N2 >::type
{
BOOST_MPL_AUX_LAMBDA_SUPPORT(2, not_equal_to, (N1, N2))
};
BOOST_MPL_AUX_NA_SPEC2(2, 2, not_equal_to)
}}
namespace boost { namespace mpl {
template<>
struct not_equal_to_impl< integral_c_tag,integral_c_tag >
{
template< typename N1, typename N2 > struct apply
: bool_< ( BOOST_MPL_AUX_VALUE_WKND(N1)::value != BOOST_MPL_AUX_VALUE_WKND(N2)::value ) >
{
};
};
}}
| {
"pile_set_name": "Github"
} |
<?xml version="1.0" encoding="utf-8"?>
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup Condition=" '$(Configuration)' == 'Debug' ">
<StartAction>Program</StartAction>
<StartProgram>C:\Users\admin\Desktop\Christian Moldenhauer\vvvv_40beta21\vvvv.exe</StartProgram>
<StartArguments>/o "D:\SIEMENS-IFA09 (1436)\patches\Kochen\disccontrol\LinBus (Devices).v4p"</StartArguments>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)' == 'vvvv' ">
<StartAction>Program</StartAction>
<StartProgram>C:\Program Files\vvvv\vvvv_40beta21\vvvv.exe</StartProgram>
<StartArguments>/o "D:\SIEMENS-IFA09 (1436)\patches\Kochen\disccontrol\LinBus (Devices).v4p"</StartArguments>
</PropertyGroup>
<PropertyGroup>
<PublishUrlHistory />
<InstallUrlHistory />
<SupportUrlHistory />
<UpdateUrlHistory />
<BootstrapperUrlHistory />
<ErrorReportUrlHistory />
<FallbackCulture>en-US</FallbackCulture>
<VerifyUploadedFiles>false</VerifyUploadedFiles>
</PropertyGroup>
</Project> | {
"pile_set_name": "Github"
} |
StartChar: uni067C.init_BaaYaaIsol
Encoding: 1117496 -1 3668
Width: 258
Flags: HW
AnchorPoint: "TashkilAbove" 16 801 basechar 0
AnchorPoint: "TashkilBelow" 167 -391 basechar 0
LayerCount: 2
Fore
Refer: 215 -1 N 1 0 0 1 171 244 2
Refer: 192 -1 N 1 0 0 1 41 566 2
Refer: 393 -1 N 1 0 0 1 0 0 3
EndChar
| {
"pile_set_name": "Github"
} |
/* x86 fat binary initializers.
THE FUNCTIONS AND VARIABLES IN THIS FILE ARE FOR INTERNAL USE ONLY.
THEY'RE ALMOST CERTAIN TO BE SUBJECT TO INCOMPATIBLE CHANGES OR DISAPPEAR
COMPLETELY IN FUTURE GNU MP RELEASES.
Copyright 2003, 2004 Free Software Foundation, Inc.
This file is part of the GNU MP Library.
The GNU MP Library is free software; you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as published by
the Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version.
The GNU MP Library is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public
License for more details.
You should have received a copy of the GNU Lesser General Public License
along with the GNU MP Library; see the file COPYING.LIB. If not, write to
the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston,
MA 02110-1301, USA. */
#include <stdio.h> /* for printf */
#include <stdlib.h> /* for getenv */
#include <string.h>
#include "mpir.h"
#include "gmp-impl.h"
/* Change this to "#define TRACE(x) x" for some traces. */
#define TRACE(x)
/* fat_entry.asm */
long __gmpn_cpuid(char dst[12], int id);
struct cpuvec_t __gmpn_cpuvec = {
__MPN(add_err1_n_init),
__MPN(add_err2_n_init),
__MPN(add_n_init),
__MPN(addmul_1_init),
__MPN(copyd_init),
__MPN(copyi_init),
__MPN(divexact_1_init),
__MPN(divexact_by3c_init),
__MPN(divexact_byfobm1_init),
__MPN(divrem_1_init),
__MPN(divrem_2_init),
__MPN(divrem_euclidean_qr_1_init),
__MPN(divrem_euclidean_qr_2_init),
__MPN(gcd_1_init),
__MPN(lshift_init),
__MPN(mod_1_init),
__MPN(mod_34lsub1_init),
__MPN(modexact_1c_odd_init),
__MPN(mul_1_init),
__MPN(mul_basecase_init),
__MPN(mulmid_basecase_init),
__MPN(preinv_divrem_1_init),
__MPN(preinv_mod_1_init),
__MPN(redc_1_init),
__MPN(rshift_init),
__MPN(sqr_basecase_init),
__MPN(sub_err1_n_init),
__MPN(sub_err2_n_init),
__MPN(sub_n_init),
__MPN(submul_1_init),
__MPN(sumdiff_n_init),
0
};
/* The following setups start with generic x86, then overwrite with
specifics for a chip, and higher versions of that chip.
The arrangement of the setups here will normally be the same as the $path
selections in configure.in for the respective chips.
This code is reentrant and thread safe. We always calculate the same
decided_cpuvec, so if two copies of the code are running it doesn't
matter which completes first, both write the same to __gmpn_cpuvec.
We need to go via decided_cpuvec because if one thread has completed
__gmpn_cpuvec then it may be making use of the threshold values in that
vector. If another thread is still running __gmpn_cpuvec_init then we
don't want it to write different values to those fields since some of the
asm routines only operate correctly up to their own defined threshold,
not an arbitrary value. */
#define CONFIG_GUESS 0
#define CONFIG_GUESS_32BIT 0
#define CONFIG_GUESS_64BIT 0
#define FAT32 1
#define FAT64 0
#define INFAT 1
#define CPUSETUP_pentium CPUVEC_SETUP_pentium
#define CPUSETUP_pentiummmx CPUVEC_SETUP_pentium;CPUVEC_SETUP_pentium_mmx
#define CPUSETUP_pentiumpro CPUVEC_SETUP_p6
#define CPUSETUP_pentium2 CPUVEC_SETUP_p6;CPUVEC_SETUP_p6_mmx
#define CPUSETUP_pentium3 CPUVEC_SETUP_p6;CPUVEC_SETUP_p6_mmx;CPUVEC_SETUP_p6_p3mmx
#define CPUSETUP_core CPUVEC_SETUP_p6;CPUVEC_SETUP_p6_mmx;CPUVEC_SETUP_p6_p3mmx
#define CPUSETUP_core2 CPUVEC_SETUP_core2
#define CPUSETUP_penryn CPUVEC_SETUP_p6;CPUVEC_SETUP_p6_mmx;CPUVEC_SETUP_p6_p3mmx
#define CPUSETUP_nehalem CPUVEC_SETUP_nehalem
#define CPUSETUP_westmere CPUVEC_SETUP_nehalem
#define CPUSETUP_sandybridge CPUVEC_SETUP_nehalem
#define CPUSETUP_ivybridge CPUVEC_SETUP_nehalem
#define CPUSETUP_haswell CPUVEC_SETUP_nehalem
#define CPUSETUP_haswellavx CPUVEC_SETUP_nehalem
#define CPUSETUP_broadwell CPUVEC_SETUP_nehalem
#define CPUSETUP_skylake CPUVEC_SETUP_nehalem
#define CPUSETUP_skylakeavx CPUVEC_SETUP_nehalem
#define CPUSETUP_atom CPUVEC_SETUP_p6;CPUVEC_SETUP_p6_mmx;CPUVEC_SETUP_p6_p3mmx
#define CPUSETUP_nano CPUVEC_SETUP_p6;CPUVEC_SETUP_p6_mmx;CPUVEC_SETUP_p6_p3mmx
#define CPUSETUP_pentium4 CPUVEC_SETUP_pentium4;CPUVEC_SETUP_pentium4_mmx;CPUVEC_SETUP_pentium4_sse2
#define CPUSETUP_prescott CPUVEC_SETUP_pentium4;CPUVEC_SETUP_pentium4_mmx;CPUVEC_SETUP_pentium4_sse2
#define CPUSETUP_k5 do{}while(0)
#define CPUSETUP_k6 CPUVEC_SETUP_k6;CPUVEC_SETUP_k6_mmx
#define CPUSETUP_k62 CPUVEC_SETUP_k6;CPUVEC_SETUP_k6_mmx;CPUVEC_SETUP_k6_k62mmx
#define CPUSETUP_k63 CPUVEC_SETUP_k6;CPUVEC_SETUP_k6_mmx;CPUVEC_SETUP_k6_k62mmx
#define CPUSETUP_k7 CPUVEC_SETUP_k7;CPUVEC_SETUP_k7_mmx
#define CPUSETUP_k8 CPUVEC_SETUP_k7;CPUVEC_SETUP_k7_mmx;CPUVEC_SETUP_k7_mmx_k8
#define CPUSETUP_k10 CPUVEC_SETUP_k7;CPUVEC_SETUP_k7_mmx;CPUVEC_SETUP_k7_mmx_k8;CPUVEC_SETUP_k7_mmx_k8_k10
#define CPUSETUP_k102 CPUVEC_SETUP_k7;CPUVEC_SETUP_k7_mmx;CPUVEC_SETUP_k7_mmx_k8;CPUVEC_SETUP_k7_mmx_k8_k10;CPUVEC_SETUP_k7_mmx_k8_k10_k102
#define CPUSETUP_k103 CPUVEC_SETUP_k7;CPUVEC_SETUP_k7_mmx;CPUVEC_SETUP_k7_mmx_k8;CPUVEC_SETUP_k7_mmx_k8_k10;CPUVEC_SETUP_k7_mmx_k8_k10_k102
#define CPUSETUP_bulldozer CPUVEC_SETUP_k7;CPUVEC_SETUP_k7_mmx;CPUVEC_SETUP_k7_mmx_k8;CPUVEC_SETUP_k7_mmx_k8_k10;CPUVEC_SETUP_k7_mmx_k8_k10_k102
#define CPUSETUP_piledriver CPUVEC_SETUP_k7;CPUVEC_SETUP_k7_mmx;CPUVEC_SETUP_k7_mmx_k8;CPUVEC_SETUP_k7_mmx_k8_k10;CPUVEC_SETUP_k7_mmx_k8_k10_k102
#define CPUSETUP_bobcat CPUVEC_SETUP_k7;CPUVEC_SETUP_k7_mmx;CPUVEC_SETUP_k7_mmx_k8;CPUVEC_SETUP_k7_mmx_k8_k10;CPUVEC_SETUP_k7_mmx_k8_k10_k102
#define CPUSETUP_viac3 do{}while(0)
#define CPUSETUP_viac32 do{}while(0)
#include "cpuid.c"
void
__gmpn_cpuvec_init (void)
{
struct cpuvec_t decided_cpuvec;
TRACE (printf ("__gmpn_cpuvec_init:\n"));
__gmpn_cpu(&decided_cpuvec);
ASSERT_CPUVEC (decided_cpuvec);
CPUVEC_INSTALL (decided_cpuvec);
/* Set this once the threshold fields are ready.
Use volatile to prevent it getting moved. */
((volatile struct cpuvec_t *) &__gmpn_cpuvec)->initialized = 1;
}
#ifndef gameobjectsrender_h
#define gameobjectsrender_h
#include "tilemap.h"
#include "gameobjects.h"
void CalcObjectXYZ(int game, float *offX, float *offY, float *offZ, tile LevelInfo[64][64], ObjectItem objList[1600], long nextObj, int x, int y);
void CalcObjectXYZ(int game, float *offX, float *offY, float *offZ, tile LevelInfo[64][64], ObjectItem objList[1600], long nextObj, int x, int y, short WallAdjust);
void EntityRotation(int heading);
void EntityRotationSHOCK(int heading);
void RenderEntityA_DOOR_TRAP(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityModel(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityDecal(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityComputerScreen(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityNPC(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityDoor(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityKey(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityContainer(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityCorpse(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityButton(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityA_DO_TRAP(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityA_CHANGE_TERRAIN_TRAP(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityTMAP(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityBOOK(int game, float x, float y, float z, short message, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntitySIGN(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityA_TELEPORT_TRAP(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityA_MOVE_TRIGGER(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityNULL_TRIGGER(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityLEVEL_ENTRY(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityREPULSOR(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityWords(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityGrating(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntitySHOCKDoor(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityPaintingUW(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityBridgeUW(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntityParticle(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64], int bind);
void RenderEntityActivator(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64]);
void RenderEntitySound(int game, float x, float y, float z, ObjectItem &currobj, ObjectItem objList[1600], tile LevelInfo[64][64], int bind);
#endif /*gameobjectsrender_h*/
fileFormatVersion: 2
guid: 40d6e8fdcb9bbf047a171b518b7628af
timeCreated: 1505333492
licenseType: Pro
MonoImporter:
externalObjects: {}
serializedVersion: 2
defaultReferences: []
executionOrder: 0
icon: {instanceID: 0}
userData:
assetBundleName:
assetBundleVariant:
import PSOperations
import UIKit
@UIApplicationMain
class PSOperationsTestAppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
}
package typings.dojo
import org.scalablytyped.runtime.TopLevel
import typings.dojo.dojox.widget.WizardPane
import scala.scalajs.js
import scala.scalajs.js.`|`
import scala.scalajs.js.annotation._
@JSImport("dojox/widget/WizardPane", JSImport.Namespace)
@js.native
object wizardPaneMod extends TopLevel[WizardPane]
//
// refactored_echo_server.cpp
// ~~~~~~~~~~~~~~~~~~~~~~~~~~
//
// Copyright (c) 2003-2020 Christopher M. Kohlhoff (chris at kohlhoff dot com)
//
// Distributed under the Boost Software License, Version 1.0. (See accompanying
// file LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)
//
#include <asio/co_spawn.hpp>
#include <asio/detached.hpp>
#include <asio/io_context.hpp>
#include <asio/ip/tcp.hpp>
#include <asio/signal_set.hpp>
#include <asio/write.hpp>
#include <cstdio>
using asio::ip::tcp;
using asio::awaitable;
using asio::co_spawn;
using asio::detached;
using asio::use_awaitable;
namespace this_coro = asio::this_coro;
awaitable<void> echo_once(tcp::socket& socket)
{
char data[128];
std::size_t n = co_await socket.async_read_some(asio::buffer(data), use_awaitable);
co_await async_write(socket, asio::buffer(data, n), use_awaitable);
}
awaitable<void> echo(tcp::socket socket)
{
try
{
for (;;)
{
// The asynchronous operations to echo a single chunk of data have been
// refactored into a separate function. When this function is called, the
// operations are still performed in the context of the current
// coroutine, and the behaviour is functionally equivalent.
co_await echo_once(socket);
}
}
catch (std::exception& e)
{
std::printf("echo Exception: %s\n", e.what());
}
}
awaitable<void> listener()
{
auto executor = co_await this_coro::executor;
tcp::acceptor acceptor(executor, {tcp::v4(), 55555});
for (;;)
{
tcp::socket socket = co_await acceptor.async_accept(use_awaitable);
co_spawn(executor, echo(std::move(socket)), detached);
}
}
int main()
{
try
{
asio::io_context io_context(1);
asio::signal_set signals(io_context, SIGINT, SIGTERM);
signals.async_wait([&](auto, auto){ io_context.stop(); });
co_spawn(io_context, listener(), detached);
io_context.run();
}
catch (std::exception& e)
{
std::printf("Exception: %s\n", e.what());
}
}
[package]
name = "pg-amqp-bridge"
version = "0.0.7"
authors = ["steve-chavez <[email protected]>"]
[[test]]
name = "main"
harness = false
[dependencies]
amqp = "0.1.0"
env_logger = "0.4.2"
fallible-iterator = "0.1.3"
log = "0.3.7"
postgres = "0.15.2"
r2d2 = "0.8.2"
r2d2_postgres = "0.14.0"
[dev-dependencies]
rustc-test = "0.3.0"
futures = "0.1.12"
lapin-futures = "0.9.0"
tokio-core = "0.1.6"
<?php
// $Id: Raw.php,v 1.1 2005/01/31 15:46:52 pmjones Exp $
/**
*
* This class implements a Text_Wiki rule to find sections of the source
* text that are not to be processed by Text_Wiki. These blocks of "raw"
* text will be rendered as they were found.
*
* @author Paul M. Jones <[email protected]>
*
* @package Text_Wiki
*
*/
class Text_Wiki_Parse_Raw extends Text_Wiki_Parse {
/**
*
* The regular expression used to find source text matching this
* rule.
*
* @access public
*
* @var string
*
*/
var $regex = "/``(.*)``/U";
/**
*
* Generates a token entry for the matched text. Token options are:
*
* 'text' => The full matched text.
*
* @access public
*
* @param array &$matches The array of matches from parse().
*
* @return A delimited token number to be used as a placeholder in
* the source text.
*
*/
function process(&$matches)
{
$options = array('text' => $matches[1]);
return $this->wiki->addToken($this->rule, $options);
}
}
?>
// Source : https://oj.leetcode.com/problems/merge-sorted-array/
// Author : Hao Chen
// Date : 2014-06-20
/**********************************************************************************
*
* Given two sorted integer arrays A and B, merge B into A as one sorted array.
*
* Note:
* You may assume that A has enough space (size that is greater or equal to m + n)
* to hold additional elements from B. The number of elements initialized in A and B
* are m and n respectively.
*
**********************************************************************************/
#include <stdio.h>
void merge(int A[], int m, int B[], int n) {
int ia = m-1 ;
int ib = n-1 ;
for (int i=m+n-1; i>=0; i--){
if (ia>=0 && ib<0){
break;
}
if (ia<0 && ib>=0){
A[i] = B[ib--];
continue;
}
if (ia>=0 && ib>=0){
if (A[ia] > B[ib]){
A[i] = A[ia--];
}else{
A[i] = B[ib--];
}
}
}
}
void printArray(int A[], int n) {
printf("{");
for(int i=0; i<n; i++) {
printf("%d, ", A[i]);
}
printf("}\n");
}
int main()
{
int a[]={2,4,6,8,10,0,0,0};
int b[]={1,3,5};
merge(a, 5, b, 3 );
printArray(a, sizeof(a)/sizeof(int));
int a1[]={2,4,0,0,0};
int b1[]={3,5,7};
merge(a1, 2, b1, 3 );
printArray(a1, sizeof(a1)/sizeof(int));
int a2[]={12,14,16,18,20,0,0,0};
int b2[]={1,3,5};
merge(a2, 5, b2, 3 );
printArray(a2, sizeof(a2)/sizeof(int));
int a3[]={2,0};
int b3[]={3,};
merge(a3, 1, b3, 1 );
printArray(a3, sizeof(a3)/sizeof(int));
int a4[]={0,0,0};
int b4[]={1,3,5};
merge(a4, 0, b4, 3 );
printArray(a4, sizeof(a4)/sizeof(int));
int a5[]={2,4,6,8,10,0,0,0};
int b5[]={11,13,15};
merge(a5, 5, b5, 3 );
printArray(a5, sizeof(a5)/sizeof(int));
int a6[]={2,4,0,0,0,0,0,0};
int b6[]={1,3,5,7,9,11};
merge(a6, 2, b6, 6 );
printArray(a6, sizeof(a6)/sizeof(int));
return 0;
}
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
<Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
<ProductVersion>8.0.30703</ProductVersion>
<SchemaVersion>2.0</SchemaVersion>
<ProjectGuid>{CCE78B48-1352-4746-A14E-DCB18BB5CCA2}</ProjectGuid>
<OutputType>Library</OutputType>
<AppDesignerFolder>Properties</AppDesignerFolder>
<RootNamespace>Chraft.Utilities</RootNamespace>
<AssemblyName>ChraftUtilities</AssemblyName>
<TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
<FileAlignment>512</FileAlignment>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
<DebugSymbols>true</DebugSymbols>
<DebugType>full</DebugType>
<Optimize>false</Optimize>
<OutputPath>bin\Debug\</OutputPath>
<DefineConstants>DEBUG;TRACE</DefineConstants>
<ErrorReport>prompt</ErrorReport>
<WarningLevel>4</WarningLevel>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
<DebugType>pdbonly</DebugType>
<Optimize>true</Optimize>
<OutputPath>bin\Release\</OutputPath>
<DefineConstants>TRACE</DefineConstants>
<ErrorReport>prompt</ErrorReport>
<WarningLevel>4</WarningLevel>
</PropertyGroup>
<ItemGroup>
<Reference Include="Nini">
<HintPath>..\Chraft\Nini.dll</HintPath>
</Reference>
<Reference Include="System" />
<Reference Include="System.Core" />
<Reference Include="System.Xml.Linq" />
<Reference Include="System.Data.DataSetExtensions" />
<Reference Include="Microsoft.CSharp" />
<Reference Include="System.Data" />
<Reference Include="System.Xml" />
</ItemGroup>
<ItemGroup>
<Compile Include="Blocks\BlockData.cs" />
<Compile Include="Blocks\BlockFace.cs" />
<Compile Include="Collision\BoundingBox.cs" />
<Compile Include="Collision\RayTraceHit.cs" />
<Compile Include="Config\ChraftConfig.cs" />
<Compile Include="Config\Configuration.cs" />
<Compile Include="Coords\AbsWorldCoords.cs" />
<Compile Include="Coords\UniversalCoords.cs" />
<Compile Include="Math\MathExtensions.cs" />
<Compile Include="Math\MathHelper.cs" />
<Compile Include="Math\Vector3.cs" />
<Compile Include="Math\WeightedPercentValue.cs" />
<Compile Include="Math\WeightedValue.cs" />
<Compile Include="Misc\Chat.cs" />
<Compile Include="Misc\CommandType.cs" />
<Compile Include="Misc\DamageCause.cs" />
<Compile Include="Misc\Direction.cs" />
<Compile Include="Misc\Experience.cs" />
<Compile Include="Misc\Hash.cs" />
<Compile Include="Misc\Http.cs" />
<Compile Include="Misc\MobType.cs" />
<Compile Include="Misc\NibbleArray.cs" />
<Compile Include="Misc\RecipeObject.cs" />
<Compile Include="Misc\WoolColor.cs" />
<Compile Include="NBT\ChunkTag.cs">
<SubType>Code</SubType>
</Compile>
<Compile Include="NBT\EndiannessConverter.cs">
<SubType>Code</SubType>
</Compile>
<Compile Include="NBT\MCPoint.cs">
<SubType>Code</SubType>
</Compile>
<Compile Include="NBT\MCROffset.cs">
<SubType>Code</SubType>
</Compile>
<Compile Include="NBT\MCRTStamp.cs">
<SubType>Code</SubType>
</Compile>
<Compile Include="NBT\NBTFile.cs">
<SubType>Code</SubType>
</Compile>
<Compile Include="NBT\NBTTag.cs">
<SubType>Code</SubType>
</Compile>
<Compile Include="NBT\RegionFile.cs">
<SubType>Code</SubType>
</Compile>
<Compile Include="NBT\SCHEMATICTag.cs">
<SubType>Code</SubType>
</Compile>
</ItemGroup>
<ItemGroup>
<Folder Include="Properties\" />
</ItemGroup>
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<!-- To modify your build process, add your task inside one of the targets below and uncomment it.
Other similar extension points exist, see Microsoft.Common.targets.
<Target Name="BeforeBuild">
</Target>
<Target Name="AfterBuild">
</Target>
-->
</Project>
#CMax circuit
+10: (61,20)
gnd: (61,19)
pot: (34,15)--(35,13)--(36,15)
motor: (50,7)--(45,7)
wire: (45,9)--(45,12)
wire: (45,16)--(45,19)
wire: (46,9)--(46,12)
wire: (35,9)--(35,12)
wire: (34,16)--(34,20)
wire: (36,16)--(36,19)
opamp: (38,12)--(38,9)
wire: (39,16)--(39,20)
wire: (41,16)--(41,19)
wire: (38,13)--(46,13)
wire: (39,5)--(35,5)
wire: (38,8)--(46,8)
wire: (40,14)--(42,14)
wire: (42,9)--(42,12)
wire: (42,7)--(41,7)
wire: (40,7)--(31,7)
wire: (31,9)--(31,12)
wire: (31,16)--(31,19)
//
// BSELocationTool.m
// BSEM
//
// Created by 逸信Mac on 16/3/25.
// Copyright © 2016 逸信Mac. All rights reserved.
//
#import "BSELocationTool.h"
@implementation BSELocationTool
static BSELocationTool *_instance;
+ (instancetype)shareLocation{
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
_instance = [[self alloc] init];
});
return _instance;
}
+(void)TransCLLocationTo:(CLLocation *)loc City:(getCityBlock)CityBlock Location:(getLocationBlock)LocationBlock Coordinate:(getCoordinateBlock)CoordinateBlock // reverse geocoding
{
CLGeocoder *geocoder = [[CLGeocoder alloc] init];
[geocoder reverseGeocodeLocation:loc completionHandler:^(NSArray *placemarks, NSError *error) {
if (placemarks.count) {
//get the current city
CLPlacemark *mark = placemarks.firstObject;
NSDictionary *dict = [mark addressDictionary];
NSString *city = [dict objectForKey:@"City"];
//city
if (CityBlock) {
CityBlock(city);
}
//coordinate
if(CoordinateBlock){
CoordinateBlock(mark.location.coordinate);
}
//location info
if (LocationBlock) {
LocationBlock(mark.name);
}
}
}];
}
@end
CLASS net/minecraft/class_4725 net/minecraft/client/texture/MipmapHelper
FIELD field_21747 COLOR_FRACTIONS [F
METHOD method_24099 getColorFraction (I)F
ARG 0 value
METHOD method_24100 getColorComponent (IIIII)I
ARG 0 one
ARG 1 two
ARG 2 three
ARG 3 four
ARG 4 bits
METHOD method_24101 blend (IIIIZ)I
ARG 0 one
ARG 1 two
ARG 2 three
ARG 3 four
ARG 4 checkAlpha
METHOD method_24102 getMipmapLevelsImages (Lnet/minecraft/class_1011;I)[Lnet/minecraft/class_1011;
ARG 0 image
ARG 1 mipmap
import 'package:flutter/painting.dart';
import 'package:mp_chart/mp/core/adapter_android_mp.dart';
import 'package:mp_chart/mp/core/data_interfaces/i_line_data_set.dart';
import 'package:mp_chart/mp/core/data_set/base_data_set.dart';
import 'package:mp_chart/mp/core/data_set/data_set.dart';
import 'package:mp_chart/mp/core/data_set/line_radar_data_set.dart';
import 'package:mp_chart/mp/core/entry/entry.dart';
import 'package:mp_chart/mp/core/enums/mode.dart';
import 'package:mp_chart/mp/core/fill_formatter/default_fill_formatter.dart';
import 'package:mp_chart/mp/core/fill_formatter/i_fill_formatter.dart';
import 'package:mp_chart/mp/core/utils/color_utils.dart';
import 'package:mp_chart/mp/core/utils/utils.dart';
class LineDataSet extends LineRadarDataSet<Entry> implements ILineDataSet {
/// Drawing mode for this line dataset
Mode _mode = Mode.LINEAR;
/// List representing all colors that are used for the circles
List<Color> _circleColors;
/// the color of the inner circles
Color _circleHoleColor = ColorUtils.WHITE;
/// the radius of the circle-shaped value indicators
double _circleRadius = 8;
/// the hole radius of the circle-shaped value indicators
double _circleHoleRadius = 4;
/// sets the intensity of the cubic lines
double _cubicIntensity = 0.2;
/// the path effect of this DataSet that makes dashed lines possible
DashPathEffect _dashPathEffect;
/// formatter for customizing the position of the fill-line
IFillFormatter _fillFormatter = DefaultFillFormatter();
/// if true, drawing circles is enabled
bool _draw = true;
bool mDrawCircleHole = true;
LineDataSet(List<Entry> yVals, String label) : super(yVals, label) {
// _circleRadius = Utils.convertDpToPixel(4f);
// mLineWidth = Utils.convertDpToPixel(1f);
if (_circleColors == null) {
_circleColors = List();
}
_circleColors.clear();
// default colors
// mColors.add(Color.rgb(192, 255, 140));
// mColors.add(Color.rgb(255, 247, 140));
_circleColors.add(Color.fromARGB(255, 140, 234, 255));
}
@override
void copy(BaseDataSet baseDataSet) {
super.copy(baseDataSet);
if (baseDataSet is LineDataSet) {
var lineDataSet = baseDataSet;
lineDataSet._circleColors = _circleColors;
lineDataSet._circleHoleColor = _circleHoleColor;
lineDataSet._circleHoleRadius = _circleHoleRadius;
lineDataSet._circleRadius = _circleRadius;
lineDataSet._cubicIntensity = _cubicIntensity;
lineDataSet._dashPathEffect = _dashPathEffect;
lineDataSet.mDrawCircleHole = mDrawCircleHole;
lineDataSet._draw = _draw;
lineDataSet._fillFormatter = _fillFormatter;
lineDataSet._mode = _mode;
}
}
/// Returns the drawing mode for this line dataset
///
/// @return
@override
Mode getMode() {
return _mode;
}
/// Sets the drawing mode for this LineDataSet
///
/// @param mode
void setMode(Mode mode) {
_mode = mode;
}
/// Sets the intensity for cubic lines (if enabled). Max = 1f = very cubic,
/// Min = 0.05f = low cubic effect, Default: 0.2f
///
/// @param intensity
void setCubicIntensity(double intensity) {
if (intensity > 1) intensity = 1;
if (intensity < 0.05) intensity = 0.05;
_cubicIntensity = intensity;
}
@override
double getCubicIntensity() {
return _cubicIntensity;
}
/// Sets the radius of the drawn circles.
/// Default radius = 4f, Min = 1f
///
/// @param radius
void setCircleRadius(double radius) {
if (radius >= 1) {
_circleRadius = Utils.convertDpToPixel(radius);
}
}
@override
double getCircleRadius() {
return _circleRadius;
}
/// Sets the hole radius of the drawn circles.
/// Default radius = 2f, Min = 0.5f
///
/// @param holeRadius
void setCircleHoleRadius(double holeRadius) {
if (holeRadius >= 0.5) {
_circleHoleRadius = Utils.convertDpToPixel(holeRadius);
}
}
@override
double getCircleHoleRadius() {
return _circleHoleRadius;
}
/// sets the size (radius) of the circle shaped value indicators,
/// default size = 4f
/// <p/>
/// This method is deprecated because of unclarity. Use setCircleRadius instead.
///
/// @param size
void setCircleSize(double size) {
setCircleRadius(size);
}
/// This function is deprecated because of unclarity. Use getCircleRadius instead.
double getCircleSize() {
return getCircleRadius();
}
/// Enables the line to be drawn in dashed mode, e.g. like this
/// "- - - - - -". THIS ONLY WORKS IF HARDWARE-ACCELERATION IS TURNED OFF.
/// Keep in mind that hardware acceleration boosts performance.
///
/// @param lineLength the length of the line pieces
/// @param spaceLength the length of space in between the pieces
/// @param phase offset, in degrees (normally, use 0)
void enableDashedLine(double lineLength, double spaceLength, double phase) {
_dashPathEffect = DashPathEffect(lineLength, spaceLength, phase);
}
/// Disables the line to be drawn in dashed mode.
void disableDashedLine() {
_dashPathEffect = null;
}
@override
bool isDashedLineEnabled() {
return _dashPathEffect == null ? false : true;
}
@override
DashPathEffect getDashPathEffect() {
return _dashPathEffect;
}
set dashPathEffect(DashPathEffect value) {
_dashPathEffect = value;
}
/// set this to true to enable the drawing of circle indicators for this
/// DataSet, default true
///
/// @param enabled
void setDrawCircles(bool enabled) {
this._draw = enabled;
}
@override
bool isDrawCirclesEnabled() {
return _draw;
}
@override
bool isDrawCubicEnabled() {
return _mode == Mode.CUBIC_BEZIER;
}
@override
bool isDrawSteppedEnabled() {
return _mode == Mode.STEPPED;
}
/** ALL CODE BELOW RELATED TO CIRCLE-COLORS */
/// returns all colors specified for the circles
///
/// @return
List<Color> getCircleColors() {
return _circleColors;
}
@override
Color getCircleColor(int index) {
return _circleColors[index];
}
@override
int getCircleColorCount() {
return _circleColors.length;
}
/// Sets the colors that should be used for the circles of this DataSet.
/// Colors are reused as soon as the number of Entries the DataSet represents
/// is higher than the size of the colors array. Make sure that the colors
/// are already prepared (by calling getResources().getColor(...)) before
/// adding them to the DataSet.
///
/// @param colors
void setCircleColors(List<Color> colors) {
_circleColors = colors;
}
/// Sets the one and ONLY color that should be used for this DataSet.
/// Internally, this recreates the colors array and adds the specified color.
///
/// @param color
void setCircleColor(Color color) {
resetCircleColors();
_circleColors.add(color);
}
/// resets the circle-colors array and creates a new one
void resetCircleColors() {
if (_circleColors == null) {
_circleColors = List();
}
_circleColors.clear();
}
/// Sets the color of the inner circle of the line-circles.
///
/// @param color
void setCircleHoleColor(Color color) {
_circleHoleColor = color;
}
@override
Color getCircleHoleColor() {
return _circleHoleColor;
}
/// Set this to true to allow drawing a hole in each data circle.
///
/// @param enabled
void setDrawCircleHole(bool enabled) {
mDrawCircleHole = enabled;
}
@override
bool isDrawCircleHoleEnabled() {
return mDrawCircleHole;
}
/// Sets a custom IFillFormatter to the chart that handles the position of the
/// filled-line for each DataSet. Set this to null to use the default logic.
///
/// @param formatter
void setFillFormatter(IFillFormatter formatter) {
if (formatter == null)
_fillFormatter = DefaultFillFormatter();
else
_fillFormatter = formatter;
}
@override
IFillFormatter getFillFormatter() {
return _fillFormatter;
}
@override
DataSet<Entry> copy1() {
List<Entry> entries = List();
for (int i = 0; i < values.length; i++) {
entries.add(Entry(
x: values[i].x,
y: values[i].y,
icon: values[i].mIcon,
data: values[i].mData));
}
LineDataSet copied = LineDataSet(entries, getLabel());
copy(copied);
return copied;
}
@override
String toString() {
return '${super.toString()}\nLineDataSet{_mode: $_mode,\n _circleColors: $_circleColors,\n _circleHoleColor: $_circleHoleColor,\n _circleRadius: $_circleRadius,\n _circleHoleRadius: $_circleHoleRadius,\n _cubicIntensity: $_cubicIntensity,\n _dashPathEffect: $_dashPathEffect,\n _fillFormatter: $_fillFormatter,\n _draw: $_draw,\n mDrawCircleHole: $mDrawCircleHole}';
}
}
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Ansi 0 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.10980392156862745</real>
<key>Green Component</key>
<real>0.09019607843137255</real>
<key>Red Component</key>
<real>0.09803921568627451</real>
</dict>
<key>Ansi 1 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.47058823529411764</real>
<key>Green Component</key>
<real>0.27450980392156865</real>
<key>Red Component</key>
<real>0.7450980392156863</real>
</dict>
<key>Ansi 10 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.5725490196078431</real>
<key>Green Component</key>
<real>0.5725490196078431</real>
<key>Red Component</key>
<real>0.16470588235294117</real>
</dict>
<key>Ansi 11 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.23137254901960785</real>
<key>Green Component</key>
<real>0.43137254901960786</real>
<key>Red Component</key>
<real>0.6274509803921569</real>
</dict>
<key>Ansi 12 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.8588235294117647</real>
<key>Green Component</key>
<real>0.42745098039215684</real>
<key>Red Component</key>
<real>0.3411764705882353</real>
</dict>
<key>Ansi 13 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.9058823529411765</real>
<key>Green Component</key>
<real>0.35294117647058826</real>
<key>Red Component</key>
<real>0.5843137254901961</real>
</dict>
<key>Ansi 14 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.7764705882352941</real>
<key>Green Component</key>
<real>0.5450980392156862</real>
<key>Red Component</key>
<real>0.2235294117647059</real>
</dict>
<key>Ansi 15 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.9568627450980393</real>
<key>Green Component</key>
<real>0.9254901960784314</real>
<key>Red Component</key>
<real>0.9372549019607843</real>
</dict>
<key>Ansi 2 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.5725490196078431</real>
<key>Green Component</key>
<real>0.5725490196078431</real>
<key>Red Component</key>
<real>0.16470588235294117</real>
</dict>
<key>Ansi 3 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.23137254901960785</real>
<key>Green Component</key>
<real>0.43137254901960786</real>
<key>Red Component</key>
<real>0.6274509803921569</real>
</dict>
<key>Ansi 4 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.8588235294117647</real>
<key>Green Component</key>
<real>0.42745098039215684</real>
<key>Red Component</key>
<real>0.3411764705882353</real>
</dict>
<key>Ansi 5 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.9058823529411765</real>
<key>Green Component</key>
<real>0.35294117647058826</real>
<key>Red Component</key>
<real>0.5843137254901961</real>
</dict>
<key>Ansi 6 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.7764705882352941</real>
<key>Green Component</key>
<real>0.5450980392156862</real>
<key>Red Component</key>
<real>0.2235294117647059</real>
</dict>
<key>Ansi 7 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.5725490196078431</real>
<key>Green Component</key>
<real>0.5294117647058824</real>
<key>Red Component</key>
<real>0.5450980392156862</real>
</dict>
<key>Ansi 8 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.42745098039215684</real>
<key>Green Component</key>
<real>0.37254901960784315</real>
<key>Red Component</key>
<real>0.396078431372549</real>
</dict>
<key>Ansi 9 Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.47058823529411764</real>
<key>Green Component</key>
<real>0.27450980392156865</real>
<key>Red Component</key>
<real>0.7450980392156863</real>
</dict>
<key>Background Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.10980392156862745</real>
<key>Green Component</key>
<real>0.09019607843137255</real>
<key>Red Component</key>
<real>0.09803921568627451</real>
</dict>
<key>Bold Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.5725490196078431</real>
<key>Green Component</key>
<real>0.5294117647058824</real>
<key>Red Component</key>
<real>0.5450980392156862</real>
</dict>
<key>Cursor Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.5725490196078431</real>
<key>Green Component</key>
<real>0.5294117647058824</real>
<key>Red Component</key>
<real>0.5450980392156862</real>
</dict>
<key>Cursor Text Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.10980392156862745</real>
<key>Green Component</key>
<real>0.09019607843137255</real>
<key>Red Component</key>
<real>0.09803921568627451</real>
</dict>
<key>Foreground Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.5725490196078431</real>
<key>Green Component</key>
<real>0.5294117647058824</real>
<key>Red Component</key>
<real>0.5450980392156862</real>
</dict>
<key>Selected Text Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.5725490196078431</real>
<key>Green Component</key>
<real>0.5294117647058824</real>
<key>Red Component</key>
<real>0.5450980392156862</real>
</dict>
<key>Selection Color</key>
<dict>
<key>Color Space</key>
<string>sRGB</string>
<key>Blue Component</key>
<real>0.3764705882352941</real>
<key>Green Component</key>
<real>0.3215686274509804</real>
<key>Red Component</key>
<real>0.34509803921568627</real>
</dict>
</dict>
</plist>
count1
Result: 0
Time(sec): 0
Result: 0
Time(sec): 0
Result: 0
Time(sec): 0
count2
Result: 0
Time(sec): 0
Result: 0
Time(sec): 0
Result: 0
Time(sec): 0
import { ObjectDictionary } from "@opticss/util";
import * as debugGenerator from "debug";
import { postcss } from "opticss";
import * as path from "path";
import { Block } from "../BlockTree";
import { Options, ResolvedConfiguration } from "../configuration";
import { CssBlockError } from "../errors";
import { FileIdentifier, ImportedCompiledCssFile, ImportedFile } from "../importing";
import { BlockFactoryBase, sourceMapFromProcessedFile } from "./BlockFactoryBase";
import { BlockParser, ParsedSource } from "./BlockParser";
import { PreprocessorSync, PreprocessorsSync, ProcessedFile, Syntax, annotateCssContentWithSourceMap, syntaxName } from "./preprocessing";
const debug = debugGenerator("css-blocks:BlockFactorySync");
interface ErrorWithErrNum {
code?: string;
message: string;
}
// DEVELOPER NOTE: There's currently a lot of duplication between this file and
// BlockFactory.ts. It's very likely that any change you're making here has to
// also be made over there. Please keep these files in sync.
/**
* This factory ensures that instances of a block are re-used when blocks are
* going to be compiled/optimized together. Multiple instances of the same
* block will result in analysis and optimization bugs.
*
* This also ensures that importers and preprocessors are correctly used when loading a block file.
*/
export class BlockFactorySync extends BlockFactoryBase {
parser: BlockParser;
preprocessors: PreprocessorsSync;
private blocks: ObjectDictionary<Block>;
private paths: ObjectDictionary<string>;
get isSync(): true {
return true;
}
constructor(options: Options, postcssImpl = postcss, faultTolerant = false) {
super(options, postcssImpl, faultTolerant);
this.preprocessors = this.configuration.preprocessorsSync;
this.parser = new BlockParser(options, this);
this.blocks = {};
this.paths = {};
this.faultTolerant = faultTolerant;
}
/**
* Toss out any caches in this BlockFactory. Any future requests for a block
* or block path will be loaded fresh from persistent storage.
*/
reset() {
super.reset();
this.blocks = {};
this.paths = {};
}
/**
* Parse a `postcss.Root` into a Block object. Use parseRoot if we need to
* catch errors.
*
* This function is referenced only in tests.
* @param root The postcss.Root to parse.
* @param identifier A unique identifier for this Block file.
* @param name Default name for the block.
* @param isDfnFile Whether to treat this as a definition file.
* @returns The Block object.
*/
parseRootFaultTolerant(root: postcss.Root, identifier: string, name: string, isDfnFile = false, expectedGuid?: string): Block {
return this.parser.parseSync(root, identifier, name, isDfnFile, expectedGuid);
}
/**
* Parse a `postcss.Root` into a Block object. Also assert that the block is
* valid so that we can catch any errors that the block contains.
*
* This function is only used in tests
* @param root The postcss.Root to parse.
* @param identifier A unique identifier for this Block file.
* @param name Default name for the block.
* @param isDfnFile Whether to treat this as a definition file.
* @returns The Block object.
*/
parseRoot(root: postcss.Root, identifier: string, name: string, isDfnFile = false, expectedGuid?: string): Block {
const block = this.parseRootFaultTolerant(root, identifier, name, isDfnFile, expectedGuid);
return this._surfaceBlockErrors(block);
}
/**
* This method doesn't do anything, but it's provided for parity with
* `BlockFactory`.
*/
prepareForExit(): void {
}
/**
* Given a file path (or other data path reference), load data from storage and parse it
* into a CSS Block.
*
* @param filePath - The path to the file or data in persistent storage. The Importer that you've
* configured to use will resolve this to a location in the storage system.
* @returns The parsed block.
*/
getBlockFromPath(filePath: string): Block {
if (!path.isAbsolute(filePath)) {
throw new Error(`An absolute path is required. Got: ${filePath}.`);
}
filePath = path.resolve(filePath);
let identifier: FileIdentifier = this.paths[filePath] || this.importer.identifier(null, filePath, this.configuration);
return this.getBlock(identifier);
}
/**
* Given a FileIdentifier that points to block data on storage, load the data and parse it
* into a CSS Block. In most cases, you'll likely want to use getBlockFromPath() instead.
*
* If the block for the given identifier has already been loaded and parsed,
* the cached block will be returned instead.
*
* @param identifier - An identifier that points at a data file or blob in persistent storage.
* These identifiers are created by the Importer that you've configured
* to use.
* @returns The parsed block.
*/
getBlock(identifier: FileIdentifier): Block {
if (this.blocks[identifier]) {
return this.blocks[identifier];
}
return this._getBlock(identifier);
}
/**
* Loads and parses a CSS Block data file. We load the data here, using the
* Importer, then defer to another method to actually parse the data file
* into a Block.
*
* @param identifier - An identifier that points at a data file or blob in persistent storage.
* @returns The parsed block.
*/
private _getBlock(identifier: FileIdentifier): Block {
let file = this.importer.importSync(identifier, this.configuration);
let block: Block;
if (file.type === "ImportedCompiledCssFile") {
block = this._reconstituteCompiledCssSource(file);
} else {
block = this._importAndPreprocessBlock(file);
}
debug(`Finalizing Block object for "${block.identifier}"`);
// last check to make sure we don't return a new instance
if (this.blocks[block.identifier]) {
return this.blocks[block.identifier];
}
// Ensure this block name is unique.
const uniqueName = this.getUniqueBlockName(block.name, block.identifier, file.type === "ImportedCompiledCssFile");
if (uniqueName === null) {
// For ImportedCompiledCssFiles, leave the name alone and add an error.
block.addError(
new CssBlockError(`Block uses a name that has already been used by ${this.blockNames[block.name]}`, {
filename: block.identifier,
}),
);
block.setName(block.name);
} else {
block.setName(uniqueName);
}
// We only register guids from blocks that don't have errors because those will get re-parsed.
if (block.isValid()) {
// Ensure the GUID is unique.
const guidRegResult = this.registerGuid(block.guid);
if (!guidRegResult) {
block.addError(
new CssBlockError("Block uses a GUID that has already been used! Check dependencies for conflicting GUIDs and/or increase the number of significant characters used to generate GUIDs.", {
filename: block.identifier,
},
),
);
}
}
// if the block has any errors, surface them here unless we're in fault tolerant mode.
this._surfaceBlockErrors(block);
this.blocks[block.identifier] = block;
return block;
}
/**
* Parse the file into a `Block`. Specifically, this method runs the data through any related
* preprocessor (such as a SASS or LESS compiler), parses the CSS into an AST using PostCSS, then
* hands the AST off to the Block parser to validate the CSS and transform it into the Block
* class used by CSS Blocks.
*
* Notably, this method expects that any related file data has already been loaded from memory
* using the Importer.
*
* @param file - The file information that has been previously imported by the Importer, for
* a single block identifier.
* @returns A parsed block.
**/
private _importAndPreprocessBlock(file: ImportedFile): Block {
// If the file identifier maps back to a real filename, ensure it is actually unique.
let realFilename = this.importer.filesystemPath(file.identifier, this.configuration);
if (realFilename) {
if (this.paths[realFilename] && this.paths[realFilename] !== file.identifier) {
throw new Error(`The same block file was returned with different identifiers: ${this.paths[realFilename]} and ${file.identifier}`);
} else {
this.paths[realFilename] = file.identifier;
}
}
// Skip preprocessing if we can.
if (this.blocks[file.identifier]) {
debug(`Using pre-compiled Block for "${file.identifier}"`);
return this.blocks[file.identifier];
}
// Preprocess the file.
let filename: string = realFilename || this.importer.debugIdentifier(file.identifier, this.configuration);
let preprocessor = this.preprocessor(file);
debug(`Preprocessing "${filename}"`);
let preprocessResult = preprocessor(filename, file.contents, this.configuration);
debug(`Generating PostCSS AST for "${filename}"`);
let sourceMap = sourceMapFromProcessedFile(preprocessResult);
let content = preprocessResult.content;
if (sourceMap) {
content = annotateCssContentWithSourceMap(this.configuration, filename, content, sourceMap);
}
let root = this.postcssImpl.parse(content, { from: filename });
// Skip parsing if we can.
if (this.blocks[file.identifier]) {
return this.blocks[file.identifier];
}
debug(`Parsing Block object for "${filename}"`);
let source: ParsedSource = {
identifier: file.identifier,
defaultName: file.defaultName,
parseResult: root,
originalSource: file.contents,
originalSyntax: file.syntax,
dependencies: preprocessResult.dependencies || [],
};
return this.parser.parseSourceSync(source);
}
private _reconstituteCompiledCssSource(file: ImportedCompiledCssFile): Block {
// Maybe we already have this block in cache?
if (this.blocks[file.identifier]) {
debug(`Using pre-compiled Block for "${file.identifier}"`);
return this.blocks[file.identifier];
}
const { definitionAst, cssContentsAst } = this._prepareDefinitionASTs(file);
// Construct a Block out of the definition file.
const block = this.parser.parseDefinitionSourceSync(definitionAst, file.identifier, file.blockId, file.defaultName);
// Merge the rules from the CSS contents into the Block.
block.precompiledStylesheet = cssContentsAst;
block.precompiledStylesheetUnedited = file.rawCssContents;
this._mergeCssRulesIntoDefinitionBlock(block, cssContentsAst, file);
// And we're done!
return block;
}
/**
* Similar to getBlock(), this imports and parses a block data file. However, this
* method parses a block relative to another block.
*
* @param fromIdentifier - The FileIdentifier that references the base location that the
* import path is relative to.
* @param importPath - The relative import path for the file to import.
* @returns The parsed block.
*/
getBlockRelative(fromIdentifier: FileIdentifier, importPath: string): Block {
let importer = this.importer;
let fromPath = importer.debugIdentifier(fromIdentifier, this.configuration);
let identifier = importer.identifier(fromIdentifier, importPath, this.configuration);
try {
return this.getBlock(identifier);
} catch (err) {
if ((<ErrorWithErrNum>err).code === "ENOENT") {
err.message = `From ${fromPath}: ${err.message}`;
}
throw err;
}
}
preprocessor(file: ImportedFile): PreprocessorSync {
let syntax = file.syntax;
let firstPreprocessor: PreprocessorSync | undefined = this.preprocessors[syntax];
let preprocessor: PreprocessorSync | null = null;
if (firstPreprocessor) {
if (syntax !== Syntax.css && this.preprocessors.css && !this.configuration.disablePreprocessChaining) {
let cssProcessor = this.preprocessors.css;
preprocessor = (fullPath: string, content: string, configuration: ResolvedConfiguration): ProcessedFile => {
let result = firstPreprocessor!(fullPath, content, configuration);
let content2 = result.content.toString();
let result2 = cssProcessor(fullPath, content2, configuration, sourceMapFromProcessedFile(result));
return {
content: result2.content,
sourceMap: sourceMapFromProcessedFile(result2),
dependencies: (result.dependencies || []).concat(result2.dependencies || []),
};
};
} else {
preprocessor = firstPreprocessor;
}
} else if (syntax !== Syntax.css) {
throw new Error(`No preprocessor provided for ${syntaxName(syntax)}.`);
} else {
preprocessor = (_fullPath: string, content: string, _options: ResolvedConfiguration): ProcessedFile => {
return {
content,
};
};
}
return preprocessor;
}
}
fileFormatVersion: 2
guid: 40a8584811a9e42ecb1685d175e51337
NativeFormatImporter:
userData:
assetBundleName:
assetBundleVariant:
#region License
/*
* Copyright © 2002-2007 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#endregion
using System;
using System.Reflection;
using System.Runtime.Serialization;
using NUnit.Framework;
namespace Common.Logging
{
/// <summary>
/// Exception with indexer property should be logged without <see cref="TargetParameterCountException"/> error
/// </summary>
/// <author>artem1</author>
/// <version>$Id:$</version>
public class LoggingExceptionWithIndexerBugTest
{
/// <summary>
/// This bug was found by me in Npgsql driver for PostgreSQL (NpgsqlException)
/// </summary>
[Test]
public void ErrorNotThrownWhenLoggedExceptionHasIndexerProperty()
{
ILog log = LogManager.GetCurrentClassLogger();
ExceptionWithIndexerException exception = new ExceptionWithIndexerException();
            Assert.That(() => log.Error("error caught", exception), Throws.Nothing);
}
[Serializable]
public class ExceptionWithIndexerException : Exception
{
public ExceptionWithIndexerException()
{
}
public ExceptionWithIndexerException(string message) : base(message)
{
}
public ExceptionWithIndexerException(string message, Exception innerException) : base(message, innerException)
{
}
protected ExceptionWithIndexerException(SerializationInfo info, StreamingContext context) : base(info, context)
{
}
public string this[string key]
{
get { return null; }
}
}
}
}
package com.bearever.push.target.xiaomi;
import android.app.ActivityManager;
import android.app.Application;
import android.content.Context;
import android.os.Process;
import android.util.Log;
import com.bearever.push.model.PushTargetEnum;
import com.bearever.push.model.ReceiverInfo;
import com.bearever.push.receiver.PushReceiverHandleManager;
import com.bearever.push.target.BasePushTargetInit;
import com.bearever.push.util.ApplicationUtil;
import com.xiaomi.mipush.sdk.MiPushClient;
import java.util.List;
/**
 * Xiaomi push initialization.
* Created by luoming on 2018/5/28.
*/
public class XiaomiInit extends BasePushTargetInit {
private static final String TAG = "XiaomiInit";
public XiaomiInit(Application context) {
super(context);
if (shouldInit()) {
            // Register with the push SDK
String appId = ApplicationUtil.getMetaData(context, "XMPUSH_APPID");
String appKey = ApplicationUtil.getMetaData(context, "XMPUSH_APPKEY");
MiPushClient.registerPush(context, appId.replaceAll(" ", ""),
appKey.replaceAll(" ", ""));
Log.d(TAG, "初始化小米推送");
}
}
private boolean shouldInit() {
ActivityManager am = (ActivityManager) mApplication.getSystemService(Context.ACTIVITY_SERVICE);
List<ActivityManager.RunningAppProcessInfo> processInfos = am.getRunningAppProcesses();
String mainProcessName = mApplication.getPackageName();
int myPid = Process.myPid();
for (ActivityManager.RunningAppProcessInfo info : processInfos) {
if (info.pid == myPid && mainProcessName.equals(info.processName)) {
return true;
}
}
return false;
}
@Override
public void setAlias(Context context, String alias, ReceiverInfo registerInfo) {
MiPushClient.setAlias(context, alias, null);
ReceiverInfo aliasInfo = new ReceiverInfo();
aliasInfo.setContent(alias);
aliasInfo.setPushTarget(PushTargetEnum.XIAOMI);
PushReceiverHandleManager.getInstance().onAliasSet(context, aliasInfo);
}
}
#!/bin/sh
# script is buggy; until patched just do exit 0
#exit 0
# add zeros to device or bus
add_zeros () {
case "$(echo $1 | wc -L)" in
1) echo "00$1" ;;
2) echo "0$1" ;;
*) echo "$1"
esac
exit 0
}
# bus and device dirs in /sys
USB_PATH=$(echo $MDEV | sed -e 's/usbdev\([0-9]\).[0-9]/usb\1/')
USB_PATH=$(find /sys/devices -type d -name "$USB_PATH")
USB_DEV_DIR=$(echo $MDEV | sed -e 's/usbdev\([0-9]\).\([0-9]\)/\1-\2/')
# dir names in /dev
BUS=$(add_zeros $(echo $MDEV | sed -e 's/^usbdev\([0-9]\).[0-9]/\1/'))
USB_DEV=$(add_zeros $(echo $MDEV | sed -e 's/^usbdev[0-9].\([0-9]\)/\1/'))
# try to load the proper driver for usb devices
case "$ACTION" in
add|"")
# load usb bus driver
for i in $USB_PATH/*/modalias ; do
modprobe `cat $i` 2>/dev/null
done
# load usb device driver if existent
if [ -d $USB_PATH/$USB_DEV_DIR ]; then
for i in $USB_PATH/$USB_DEV_DIR/*/modalias ; do
modprobe `cat $i` 2>/dev/null
done
fi
# move usb device file
mkdir -p bus/usb/$BUS
mv $MDEV bus/usb/$BUS/$USB_DEV
;;
remove)
# unload device driver, if device dir is existent
if [ -d $USB_PATH/$USB_DEV_DIR ]; then
for i in $USB_PATH/$USB_DEV_DIR/*/modalias ; do
modprobe -r `cat $i` 2>/dev/null
done
fi
# unload usb bus driver. Does this make sense?
# what happens, if two usb devices are plugged in
# and one is removed?
for i in $USB_PATH/*/modalias ; do
modprobe -r `cat $i` 2>/dev/null
done
# remove device file and possible empty dirs
rm -f bus/usb/$BUS/$USB_DEV
rmdir bus/usb/$BUS/ 2>/dev/null
rmdir bus/usb/ 2>/dev/null
rmdir bus/ 2>/dev/null
esac
/*
* Copyright (C) 2002 Alexandre Julliard
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 2.1 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public
* License along with this library; if not, write to the Free Software
* Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA
*/
#ifndef __DSHOW_INCLUDED__
#define __DSHOW_INCLUDED__
#define AM_NOVTABLE
#include <windows.h>
#ifndef __WINESRC__
# include <windowsx.h>
#endif
#include <olectl.h>
#include <ddraw.h>
#include <mmsystem.h>
#ifndef NO_DSHOW_STRSAFE
#define NO_SHLWAPI_STRFCNS
#include <strsafe.h>
#endif
#ifndef NUMELMS
#define NUMELMS(array) (sizeof(array)/sizeof((array)[0]))
#endif
#include <strmif.h>
#include <amvideo.h>
#ifdef DSHOW_USE_AMAUDIO
#include <amaudio.h>
#endif
#include <control.h>
#include <evcode.h>
#include <uuids.h>
#include <errors.h>
/* FIXME: #include <edevdefs.h> */
#include <audevcod.h>
/* FIXME: #include <dvdevcod.h> */
#ifndef OATRUE
#define OATRUE (-1)
#endif
#ifndef OAFALSE
#define OAFALSE (0)
#endif
#endif /* __DSHOW_INCLUDED__ */
static int x = []() {std::ios::sync_with_stdio(false); cin.tie(0); return 0; }();
class Solution
{
public:
    // Carry the max/min ancestor values seen so far down each root-to-leaf path;
    // at a null child, the best difference along that path is mx - mn.
    int maxAncestorDiff(TreeNode* root, int mx=INT_MIN, int mn=INT_MAX)
{
return root == nullptr ? mx - mn : max(maxAncestorDiff(root->left, max(mx, root->val), min(mn, root->val)),
maxAncestorDiff(root->right, max(mx,root->val), min(mn, root->val)));
}
};
{ stdenv, makeWrapper, fetchFromBitbucket, fetchFromGitHub, pkgconfig
, alsaLib, curl, glew, glfw, gtk2-x11, jansson, libjack2, libXext, libXi
, libzip, rtaudio, rtmidi, speex, libsamplerate }:
let
# The package repo vendors some of the package dependencies as submodules.
# Others are downloaded with `make deps`. Due to previous issues with the
# `glfw` submodule (see above) and because we can not access the network when
# building in a sandbox, we fetch the dependency source manually.
pfft-source = fetchFromBitbucket {
owner = "jpommier";
repo = "pffft";
rev = "74d7261be17cf659d5930d4830609406bd7553e3";
sha256 = "084csgqa6f1a270bhybjayrh3mpyi2jimc87qkdgsqcp8ycsx1l1";
};
nanovg-source = fetchFromGitHub {
owner = "memononen";
repo = "nanovg";
rev = "1f9c8864fc556a1be4d4bf1d6bfe20cde25734b4";
sha256 = "08r15zrr6p1kxigxzxrg5rgya7wwbdx7d078r362qbkmws83wk27";
};
nanosvg-source = fetchFromGitHub {
owner = "memononen";
repo = "nanosvg";
rev = "25241c5a8f8451d41ab1b02ab2d865b01600d949";
sha256 = "114qgfmazsdl53rm4pgqif3gv8msdmfwi91lyc2jfadgzfd83xkg";
};
osdialog-source = fetchFromGitHub {
owner = "AndrewBelt";
repo = "osdialog";
rev = "e5db5de6444f4b2c4e1390c67b3efd718080c3da";
sha256 = "0iqxn1md053nl19hbjk8rqsdcmjwa5l5z0ci4fara77q43rc323i";
};
oui-blendish-source = fetchFromGitHub {
owner = "AndrewBelt";
repo = "oui-blendish";
rev = "79ec59e6bc7201017fc13a20c6e33380adca1660";
sha256 = "17kd0lh2x3x12bxkyhq6z8sg6vxln8m9qirf0basvcsmylr6rb64";
};
in
with stdenv.lib; stdenv.mkDerivation rec {
pname = "VCV-Rack";
version = "1.1.6";
src = fetchFromGitHub {
owner = "VCVRack";
repo = "Rack";
rev = "v${version}";
sha256 = "0ji64prr74qzxf5bx1sw022kbslx9nzll16lmk5in78hbl137b3i";
};
patches = [
./rack-minimize-vendoring.patch
];
prePatch = ''
# As we can't use `make dep` to set up the dependencies (as explained
# above), we do it here manually
mkdir -p dep/include
cp -r ${pfft-source} dep/jpommier-pffft-source
cp -r ${nanovg-source}/* dep/nanovg
cp -r ${nanosvg-source}/* dep/nanosvg
cp -r ${osdialog-source}/* dep/osdialog
cp -r ${oui-blendish-source}/* dep/oui-blendish
cp dep/jpommier-pffft-source/*.h dep/include
cp dep/nanosvg/**/*.h dep/include
cp dep/nanovg/src/*.h dep/include
cp dep/osdialog/*.h dep/include
cp dep/oui-blendish/*.h dep/include
substituteInPlace include/audio.hpp --replace "<RtAudio.h>" "<rtaudio/RtAudio.h>"
substituteInPlace compile.mk --replace "-march=nocona" ""
'';
enableParallelBuilding = true;
nativeBuildInputs = [ makeWrapper pkgconfig ];
buildInputs = [ alsaLib curl glew glfw gtk2-x11 jansson libjack2 libsamplerate libzip rtaudio rtmidi speex ];
buildFlags = [ "Rack" ];
installPhase = ''
install -D -m755 -t $out/bin Rack
mkdir -p $out/share/vcv-rack
cp -r res Core.json template.vcv LICENSE* cacert.pem $out/share/vcv-rack
# Override the default global resource file directory
wrapProgram $out/bin/Rack --add-flags "-s $out/share/vcv-rack"
'';
meta = with stdenv.lib; {
description = "Open-source virtual modular synthesizer";
homepage = "https://vcvrack.com/";
# The source is BSD-3 licensed, some of the art is CC-BY-NC 4.0 or under a
# no-derivatives clause
license = with licenses; [ bsd3 cc-by-nc-40 unfreeRedistributable ];
maintainers = with maintainers; [ moredread nathyong ];
platforms = platforms.linux;
};
}
/*
* ICE / OPCODE - Optimized Collision Detection
* http://www.codercorner.com/Opcode.htm
*
* Copyright (c) 2001-2008 Pierre Terdiman, [email protected]
This software is provided 'as-is', without any express or implied warranty.
In no event will the authors be held liable for any damages arising from the use of this software.
Permission is granted to anyone to use this software for any purpose,
including commercial applications, and to alter it and redistribute it freely,
subject to the following restrictions:
1. The origin of this software must not be misrepresented; you must not claim that you wrote the original software. If you use this software in a product, an acknowledgment in the product documentation would be appreciated but is not required.
2. Altered source versions must be plainly marked as such, and must not be misrepresented as being the original software.
3. This notice may not be removed or altered from any source distribution.
*/
///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
/**
* Contains code for random generators.
* \file IceRandom.cpp
* \author Pierre Terdiman
* \date August, 9, 2001
*/
///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// Precompiled Header
#include "Stdafx.h"
using namespace Opcode;
void IceCore:: SRand(udword seed)
{
srand(seed);
}
udword IceCore::Rand()
{
return rand();
}
static BasicRandom gRandomGenerator(42);
udword IceCore::GetRandomIndex(udword max_index)
{
// We don't use rand() since it's limited to RAND_MAX
udword Index = gRandomGenerator.Randomize();
return Index % max_index;
}
#!/usr/bin/env python
#
# Copyright (C) 2014 Narf Industries <[email protected]>
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
---
nodes:
- name: start
- name: strncmp
edges:
- start: strncmp
CREATE TABLE pbAaDistS (
x float default NULL,
y float default NULL
) TYPE=MyISAM;
// Copyright 2009 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

// +build 386,freebsd

package unix

import (
	"syscall"
	"unsafe"
)

func setTimespec(sec, nsec int64) Timespec {
	return Timespec{Sec: int32(sec), Nsec: int32(nsec)}
}

func setTimeval(sec, usec int64) Timeval {
	return Timeval{Sec: int32(sec), Usec: int32(usec)}
}

func SetKevent(k *Kevent_t, fd, mode, flags int) {
	k.Ident = uint32(fd)
	k.Filter = int16(mode)
	k.Flags = uint16(flags)
}

func (iov *Iovec) SetLen(length int) {
	iov.Len = uint32(length)
}

func (msghdr *Msghdr) SetControllen(length int) {
	msghdr.Controllen = uint32(length)
}

func (msghdr *Msghdr) SetIovlen(length int) {
	msghdr.Iovlen = int32(length)
}

func (cmsg *Cmsghdr) SetLen(length int) {
	cmsg.Len = uint32(length)
}

func sendfile(outfd int, infd int, offset *int64, count int) (written int, err error) {
	var writtenOut uint64 = 0
	_, _, e1 := Syscall9(SYS_SENDFILE, uintptr(infd), uintptr(outfd), uintptr(*offset), uintptr((*offset)>>32), uintptr(count), 0, uintptr(unsafe.Pointer(&writtenOut)), 0, 0)

	written = int(writtenOut)

	if e1 != 0 {
		err = e1
	}
	return
}

func Syscall9(num, a1, a2, a3, a4, a5, a6, a7, a8, a9 uintptr) (r1, r2 uintptr, err syscall.Errno)

func PtraceGetFsBase(pid int, fsbase *int64) (err error) {
	return ptrace(PTRACE_GETFSBASE, pid, uintptr(unsafe.Pointer(fsbase)), 0)
}

func PtraceIO(req int, pid int, addr uintptr, out []byte, countin int) (count int, err error) {
	ioDesc := PtraceIoDesc{Op: int32(req), Offs: (*byte)(unsafe.Pointer(addr)), Addr: (*byte)(unsafe.Pointer(&out[0])), Len: uint32(countin)}
	err = ptrace(PTRACE_IO, pid, uintptr(unsafe.Pointer(&ioDesc)), 0)
	return int(ioDesc.Len), err
}
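A minimal, self-contained sketch (separate from the package above; the struct here is a local stand-in for the real `Timespec`, which is defined elsewhere in the `unix` package) showing why the `int32` conversions in `setTimespec` matter on this 32-bit target: seconds values that do not fit in 32 bits silently wrap.

```go
package main

import "fmt"

// timespec mirrors the freebsd/386 layout, where both fields are 32-bit.
type timespec struct {
	Sec  int32
	Nsec int32
}

// setTimespec truncates 64-bit inputs exactly as the package code does:
// a plain Go conversion keeps only the low 32 bits.
func setTimespec(sec, nsec int64) timespec {
	return timespec{Sec: int32(sec), Nsec: int32(nsec)}
}

func main() {
	ts := setTimespec(1<<31, 500) // 2^31 does not fit in int32
	fmt.Println(ts.Sec, ts.Nsec)  // the seconds value wraps negative
}
```

This is the same truncation behind the Year 2038 problem: timestamps after 2038-01-19 overflow a signed 32-bit seconds field.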
| {
"pile_set_name": "Github"
} |
// Distributed under the terms of the MIT license
// Test case submitted to project by https://github.com/practicalswift (practicalswift)
// Test case found by fuzzing
protocol P {
deinit {
let f = e.h> {
typealias b {
struct g<I : c() {
protocol A {
}
}
}
class a {
enum A {
struct g<I : A {
class
case c,
class a
| {
"pile_set_name": "Github"
} |
.parent {
display: flex;
}
.spacer {
flex-grow: 1;
}
.spacing {
padding-top: 1.5rem;
}
| {
"pile_set_name": "Github"
} |
// !$*UTF8*$!
{
archiveVersion = 1;
classes = {
};
objectVersion = 46;
objects = {
/* Begin PBXBuildFile section */
22CF115E0EE9A6F40054F513 /* lores~.c in Sources */ = {isa = PBXBuildFile; fileRef = 22CF115D0EE9A6F40054F513 /* lores~.c */; };
/* End PBXBuildFile section */
/* Begin PBXFileReference section */
22CF10220EE984600054F513 /* maxmspsdk.xcconfig */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.xcconfig; name = maxmspsdk.xcconfig; path = ../../maxmspsdk.xcconfig; sourceTree = SOURCE_ROOT; };
22CF115D0EE9A6F40054F513 /* lores~.c */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.c; path = "lores~.c"; sourceTree = "<group>"; };
2FBBEAE508F335360078DB84 /* lores~.mxo */ = {isa = PBXFileReference; explicitFileType = wrapper.cfbundle; includeInIndex = 0; path = "lores~.mxo"; sourceTree = BUILT_PRODUCTS_DIR; };
/* End PBXFileReference section */
/* Begin PBXFrameworksBuildPhase section */
2FBBEADC08F335360078DB84 /* Frameworks */ = {
isa = PBXFrameworksBuildPhase;
buildActionMask = 2147483647;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXFrameworksBuildPhase section */
/* Begin PBXGroup section */
089C166AFE841209C02AAC07 /* iterator */ = {
isa = PBXGroup;
children = (
22CF10220EE984600054F513 /* maxmspsdk.xcconfig */,
22CF115D0EE9A6F40054F513 /* lores~.c */,
19C28FB4FE9D528D11CA2CBB /* Products */,
);
name = iterator;
sourceTree = "<group>";
};
19C28FB4FE9D528D11CA2CBB /* Products */ = {
isa = PBXGroup;
children = (
2FBBEAE508F335360078DB84 /* lores~.mxo */,
);
name = Products;
sourceTree = "<group>";
};
/* End PBXGroup section */
/* Begin PBXHeadersBuildPhase section */
2FBBEAD708F335360078DB84 /* Headers */ = {
isa = PBXHeadersBuildPhase;
buildActionMask = 2147483647;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXHeadersBuildPhase section */
/* Begin PBXNativeTarget section */
2FBBEAD608F335360078DB84 /* max-external */ = {
isa = PBXNativeTarget;
buildConfigurationList = 2FBBEAE008F335360078DB84 /* Build configuration list for PBXNativeTarget "max-external" */;
buildPhases = (
2FBBEAD708F335360078DB84 /* Headers */,
2FBBEAD808F335360078DB84 /* Resources */,
2FBBEADA08F335360078DB84 /* Sources */,
2FBBEADC08F335360078DB84 /* Frameworks */,
2FBBEADF08F335360078DB84 /* Rez */,
);
buildRules = (
);
dependencies = (
);
name = "max-external";
productName = iterator;
productReference = 2FBBEAE508F335360078DB84 /* lores~.mxo */;
productType = "com.apple.product-type.bundle";
};
/* End PBXNativeTarget section */
/* Begin PBXProject section */
089C1669FE841209C02AAC07 /* Project object */ = {
isa = PBXProject;
attributes = {
LastUpgradeCheck = 0610;
};
buildConfigurationList = 2FBBEACF08F335010078DB84 /* Build configuration list for PBXProject "lores~" */;
compatibilityVersion = "Xcode 3.2";
developmentRegion = English;
hasScannedForEncodings = 1;
knownRegions = (
en,
);
mainGroup = 089C166AFE841209C02AAC07 /* iterator */;
projectDirPath = "";
projectRoot = "";
targets = (
2FBBEAD608F335360078DB84 /* max-external */,
);
};
/* End PBXProject section */
/* Begin PBXResourcesBuildPhase section */
2FBBEAD808F335360078DB84 /* Resources */ = {
isa = PBXResourcesBuildPhase;
buildActionMask = 2147483647;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXResourcesBuildPhase section */
/* Begin PBXRezBuildPhase section */
2FBBEADF08F335360078DB84 /* Rez */ = {
isa = PBXRezBuildPhase;
buildActionMask = 2147483647;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXRezBuildPhase section */
/* Begin PBXSourcesBuildPhase section */
2FBBEADA08F335360078DB84 /* Sources */ = {
isa = PBXSourcesBuildPhase;
buildActionMask = 2147483647;
files = (
22CF115E0EE9A6F40054F513 /* lores~.c in Sources */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXSourcesBuildPhase section */
/* Begin XCBuildConfiguration section */
2FBBEAD008F335010078DB84 /* Development */ = {
isa = XCBuildConfiguration;
buildSettings = {
};
name = Development;
};
2FBBEAD108F335010078DB84 /* Deployment */ = {
isa = XCBuildConfiguration;
buildSettings = {
};
name = Deployment;
};
2FBBEAE108F335360078DB84 /* Development */ = {
isa = XCBuildConfiguration;
baseConfigurationReference = 22CF10220EE984600054F513 /* maxmspsdk.xcconfig */;
buildSettings = {
COPY_PHASE_STRIP = NO;
GCC_OPTIMIZATION_LEVEL = 0;
};
name = Development;
};
2FBBEAE208F335360078DB84 /* Deployment */ = {
isa = XCBuildConfiguration;
baseConfigurationReference = 22CF10220EE984600054F513 /* maxmspsdk.xcconfig */;
buildSettings = {
COPY_PHASE_STRIP = YES;
};
name = Deployment;
};
/* End XCBuildConfiguration section */
/* Begin XCConfigurationList section */
2FBBEACF08F335010078DB84 /* Build configuration list for PBXProject "lores~" */ = {
isa = XCConfigurationList;
buildConfigurations = (
2FBBEAD008F335010078DB84 /* Development */,
2FBBEAD108F335010078DB84 /* Deployment */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Development;
};
2FBBEAE008F335360078DB84 /* Build configuration list for PBXNativeTarget "max-external" */ = {
isa = XCConfigurationList;
buildConfigurations = (
2FBBEAE108F335360078DB84 /* Development */,
2FBBEAE208F335360078DB84 /* Deployment */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Development;
};
/* End XCConfigurationList section */
};
rootObject = 089C1669FE841209C02AAC07 /* Project object */;
}
| {
"pile_set_name": "Github"
} |
# Doxyfile 1.5.2
# This file describes the settings to be used by the documentation system
# doxygen (www.doxygen.org) for a project
#
# All text after a hash (#) is considered a comment and will be ignored
# The format is:
# TAG = value [value, ...]
# For lists items can also be appended using:
# TAG += value [value, ...]
# Values that contain spaces should be placed between quotes (" ")
#---------------------------------------------------------------------------
# Project related configuration options
#---------------------------------------------------------------------------
# This tag specifies the encoding used for all characters in the config file that
# follow. The default is UTF-8 which is also the encoding used for all text before
# the first occurrence of this tag. Doxygen uses libiconv (or the iconv built into
# libc) for the transcoding. See http://www.gnu.org/software/libiconv for the list of
# possible encodings.
DOXYFILE_ENCODING = UTF-8
# The PROJECT_NAME tag is a single word (or a sequence of words surrounded
# by quotes) that should identify the project.
PROJECT_NAME = memcached
# The PROJECT_NUMBER tag can be used to enter a project or revision number.
# This could be handy for archiving the generated documentation or
# if some version control system is used.
PROJECT_NUMBER = 0.8
# The OUTPUT_DIRECTORY tag is used to specify the (relative or absolute)
# base path where the generated documentation will be put.
# If a relative path is entered, it will be relative to the location
# where doxygen was started. If left blank the current directory will be used.
OUTPUT_DIRECTORY = doxygen
# If the CREATE_SUBDIRS tag is set to YES, then doxygen will create
# 4096 sub-directories (in 2 levels) under the output directory of each output
# format and will distribute the generated files over these directories.
# Enabling this option can be useful when feeding doxygen a large number of
# source files, where putting all generated files in the same directory would
# otherwise cause performance problems for the file system.
CREATE_SUBDIRS = NO
# The OUTPUT_LANGUAGE tag is used to specify the language in which all
# documentation generated by doxygen is written. Doxygen will use this
# information to generate all constant output in the proper language.
# The default language is English, other supported languages are:
# Afrikaans, Arabic, Brazilian, Catalan, Chinese, Chinese-Traditional,
# Croatian, Czech, Danish, Dutch, Finnish, French, German, Greek, Hungarian,
# Italian, Japanese, Japanese-en (Japanese with English messages), Korean,
# Korean-en, Lithuanian, Norwegian, Polish, Portuguese, Romanian, Russian,
# Serbian, Slovak, Slovene, Spanish, Swedish, and Ukrainian.
OUTPUT_LANGUAGE = English
# If the BRIEF_MEMBER_DESC tag is set to YES (the default) Doxygen will
# include brief member descriptions after the members that are listed in
# the file and class documentation (similar to JavaDoc).
# Set to NO to disable this.
BRIEF_MEMBER_DESC = YES
# If the REPEAT_BRIEF tag is set to YES (the default) Doxygen will prepend
# the brief description of a member or function before the detailed description.
# Note: if both HIDE_UNDOC_MEMBERS and BRIEF_MEMBER_DESC are set to NO, the
# brief descriptions will be completely suppressed.
REPEAT_BRIEF = NO
# This tag implements a quasi-intelligent brief description abbreviator
# that is used to form the text in various listings. Each string
# in this list, if found as the leading text of the brief description, will be
# stripped from the text and the result after processing the whole list, is
# used as the annotated text. Otherwise, the brief description is used as-is.
# If left blank, the following values are used ("$name" is automatically
# replaced with the name of the entity): "The $name class" "The $name widget"
# "The $name file" "is" "provides" "specifies" "contains"
# "represents" "a" "an" "the"
ABBREVIATE_BRIEF =
# If the ALWAYS_DETAILED_SEC and REPEAT_BRIEF tags are both set to YES then
# Doxygen will generate a detailed section even if there is only a brief
# description.
ALWAYS_DETAILED_SEC = YES
# If the INLINE_INHERITED_MEMB tag is set to YES, doxygen will show all
# inherited members of a class in the documentation of that class as if those
# members were ordinary class members. Constructors, destructors and assignment
# operators of the base classes will not be shown.
INLINE_INHERITED_MEMB = NO
# If the FULL_PATH_NAMES tag is set to YES then Doxygen will prepend the full
# path before files name in the file list and in the header files. If set
# to NO the shortest path that makes the file name unique will be used.
FULL_PATH_NAMES = NO
# If the FULL_PATH_NAMES tag is set to YES then the STRIP_FROM_PATH tag
# can be used to strip a user-defined part of the path. Stripping is
# only done if one of the specified strings matches the left-hand part of
# the path. The tag can be used to show relative paths in the file list.
# If left blank the directory from which doxygen is run is used as the
# path to strip.
STRIP_FROM_PATH =
# The STRIP_FROM_INC_PATH tag can be used to strip a user-defined part of
# the path mentioned in the documentation of a class, which tells
# the reader which header file to include in order to use a class.
# If left blank only the name of the header file containing the class
# definition is used. Otherwise one should specify the include paths that
# are normally passed to the compiler using the -I flag.
STRIP_FROM_INC_PATH =
# If the SHORT_NAMES tag is set to YES, doxygen will generate much shorter
# (but less readable) file names. This can be useful if your file system
# doesn't support long names, as on DOS, Mac, or CD-ROM.
SHORT_NAMES = NO
# If the JAVADOC_AUTOBRIEF tag is set to YES then Doxygen
# will interpret the first line (until the first dot) of a JavaDoc-style
# comment as the brief description. If set to NO, the JavaDoc
# comments will behave just like the Qt-style comments (thus requiring an
# explicit @brief command for a brief description.)
JAVADOC_AUTOBRIEF = NO
# The MULTILINE_CPP_IS_BRIEF tag can be set to YES to make Doxygen
# treat a multi-line C++ special comment block (i.e. a block of //! or ///
# comments) as a brief description. This used to be the default behaviour.
# The new default is to treat a multi-line C++ comment block as a detailed
# description. Set this tag to YES if you prefer the old behaviour instead.
MULTILINE_CPP_IS_BRIEF = NO
# If the DETAILS_AT_TOP tag is set to YES then Doxygen
# will output the detailed description near the top, like JavaDoc.
# If set to NO, the detailed description appears after the member
# documentation.
DETAILS_AT_TOP = NO
# If the INHERIT_DOCS tag is set to YES (the default) then an undocumented
# member inherits the documentation from any documented member that it
# re-implements.
INHERIT_DOCS = YES
# If the SEPARATE_MEMBER_PAGES tag is set to YES, then doxygen will produce
# a new page for each member. If set to NO, the documentation of a member will
# be part of the file/class/namespace that contains it.
SEPARATE_MEMBER_PAGES = NO
# The TAB_SIZE tag can be used to set the number of spaces in a tab.
# Doxygen uses this value to replace tabs by spaces in code fragments.
TAB_SIZE = 8
# This tag can be used to specify a number of aliases that acts
# as commands in the documentation. An alias has the form "name=value".
# For example adding "sideeffect=\par Side Effects:\n" will allow you to
# put the command \sideeffect (or @sideeffect) in the documentation, which
# will result in a user-defined paragraph with heading "Side Effects:".
# You can put \n's in the value part of an alias to insert newlines.
ALIASES =
# Set the OPTIMIZE_OUTPUT_FOR_C tag to YES if your project consists of C
# sources only. Doxygen will then generate output that is more tailored for C.
# For instance, some of the names that are used will be different. The list
# of all members will be omitted, etc.
OPTIMIZE_OUTPUT_FOR_C = NO
# Set the OPTIMIZE_OUTPUT_JAVA tag to YES if your project consists of Java
# sources only. Doxygen will then generate output that is more tailored for Java.
# For instance, namespaces will be presented as packages, qualified scopes
# will look different, etc.
OPTIMIZE_OUTPUT_JAVA = NO
# If you use STL classes (i.e. std::string, std::vector, etc.) but do not want to
# include (a tag file for) the STL sources as input, then you should
# set this tag to YES in order to let doxygen match functions declarations and
# definitions whose arguments contain STL classes (e.g. func(std::string); v.s.
# func(std::string) {}). This also makes the inheritance and collaboration
# diagrams that involve STL classes more complete and accurate.
BUILTIN_STL_SUPPORT = NO
# If you use Microsoft's C++/CLI language, you should set this option to YES to
# enable parsing support.
CPP_CLI_SUPPORT = NO
# If member grouping is used in the documentation and the DISTRIBUTE_GROUP_DOC
# tag is set to YES, then doxygen will reuse the documentation of the first
# member in the group (if any) for the other members of the group. By default
# all members of a group must be documented explicitly.
DISTRIBUTE_GROUP_DOC = NO
# Set the SUBGROUPING tag to YES (the default) to allow class member groups of
# the same type (for instance a group of public functions) to be put as a
# subgroup of that type (e.g. under the Public Functions section). Set it to
# NO to prevent subgrouping. Alternatively, this can be done per class using
# the \nosubgrouping command.
SUBGROUPING = YES
#---------------------------------------------------------------------------
# Build related configuration options
#---------------------------------------------------------------------------
# If the EXTRACT_ALL tag is set to YES doxygen will assume all entities in
# documentation are documented, even if no documentation was available.
# Private class members and static file members will be hidden unless
# the EXTRACT_PRIVATE and EXTRACT_STATIC tags are set to YES
EXTRACT_ALL = YES
# If the EXTRACT_PRIVATE tag is set to YES all private members of a class
# will be included in the documentation.
EXTRACT_PRIVATE = NO
# If the EXTRACT_STATIC tag is set to YES all static members of a file
# will be included in the documentation.
EXTRACT_STATIC = NO
# If the EXTRACT_LOCAL_CLASSES tag is set to YES classes (and structs)
# defined locally in source files will be included in the documentation.
# If set to NO only classes defined in header files are included.
EXTRACT_LOCAL_CLASSES = YES
# This flag is only useful for Objective-C code. When set to YES local
# methods, which are defined in the implementation section but not in
# the interface, are included in the documentation.
# If set to NO (the default) only methods in the interface are included.
EXTRACT_LOCAL_METHODS = NO
# If the HIDE_UNDOC_MEMBERS tag is set to YES, Doxygen will hide all
# undocumented members of documented classes, files or namespaces.
# If set to NO (the default) these members will be included in the
# various overviews, but no documentation section is generated.
# This option has no effect if EXTRACT_ALL is enabled.
HIDE_UNDOC_MEMBERS = NO
# If the HIDE_UNDOC_CLASSES tag is set to YES, Doxygen will hide all
# undocumented classes that are normally visible in the class hierarchy.
# If set to NO (the default) these classes will be included in the various
# overviews. This option has no effect if EXTRACT_ALL is enabled.
HIDE_UNDOC_CLASSES = NO
# If the HIDE_FRIEND_COMPOUNDS tag is set to YES, Doxygen will hide all
# friend (class|struct|union) declarations.
# If set to NO (the default) these declarations will be included in the
# documentation.
HIDE_FRIEND_COMPOUNDS = NO
# If the HIDE_IN_BODY_DOCS tag is set to YES, Doxygen will hide any
# documentation blocks found inside the body of a function.
# If set to NO (the default) these blocks will be appended to the
# function's detailed documentation block.
HIDE_IN_BODY_DOCS = NO
# The INTERNAL_DOCS tag determines if documentation
# that is typed after a \internal command is included. If the tag is set
# to NO (the default) then the documentation will be excluded.
# Set it to YES to include the internal documentation.
INTERNAL_DOCS = YES
# If the CASE_SENSE_NAMES tag is set to NO then Doxygen will only generate
# file names in lower-case letters. If set to YES upper-case letters are also
# allowed. This is useful if you have classes or files whose names only differ
# in case and if your file system supports case sensitive file names. Windows
# and Mac users are advised to set this option to NO.
CASE_SENSE_NAMES = YES
# If the HIDE_SCOPE_NAMES tag is set to NO (the default) then Doxygen
# will show members with their full class and namespace scopes in the
# documentation. If set to YES the scope will be hidden.
HIDE_SCOPE_NAMES = NO
# If the SHOW_INCLUDE_FILES tag is set to YES (the default) then Doxygen
# will put a list of the files that are included by a file in the documentation
# of that file.
SHOW_INCLUDE_FILES = YES
# If the INLINE_INFO tag is set to YES (the default) then a tag [inline]
# is inserted in the documentation for inline members.
INLINE_INFO = YES
# If the SORT_MEMBER_DOCS tag is set to YES (the default) then doxygen
# will sort the (detailed) documentation of file and class members
# alphabetically by member name. If set to NO the members will appear in
# declaration order.
SORT_MEMBER_DOCS = NO
# If the SORT_BRIEF_DOCS tag is set to YES then doxygen will sort the
# brief documentation of file, namespace and class members alphabetically
# by member name. If set to NO (the default) the members will appear in
# declaration order.
SORT_BRIEF_DOCS = NO
# If the SORT_BY_SCOPE_NAME tag is set to YES, the class list will be
# sorted by fully-qualified names, including namespaces. If set to
# NO (the default), the class list will be sorted only by class name,
# not including the namespace part.
# Note: This option is not very useful if HIDE_SCOPE_NAMES is set to YES.
# Note: This option applies only to the class list, not to the
# alphabetical list.
SORT_BY_SCOPE_NAME = NO
# The GENERATE_TODOLIST tag can be used to enable (YES) or
# disable (NO) the todo list. This list is created by putting \todo
# commands in the documentation.
GENERATE_TODOLIST = YES
# The GENERATE_TESTLIST tag can be used to enable (YES) or
# disable (NO) the test list. This list is created by putting \test
# commands in the documentation.
GENERATE_TESTLIST = YES
# The GENERATE_BUGLIST tag can be used to enable (YES) or
# disable (NO) the bug list. This list is created by putting \bug
# commands in the documentation.
GENERATE_BUGLIST = YES
# The GENERATE_DEPRECATEDLIST tag can be used to enable (YES) or
# disable (NO) the deprecated list. This list is created by putting
# \deprecated commands in the documentation.
GENERATE_DEPRECATEDLIST= YES
# The ENABLED_SECTIONS tag can be used to enable conditional
# documentation sections, marked by \if sectionname ... \endif.
ENABLED_SECTIONS =
# The MAX_INITIALIZER_LINES tag determines the maximum number of lines
# the initial value of a variable or define consists of for it to appear in
# the documentation. If the initializer consists of more lines than specified
# here it will be hidden. Use a value of 0 to hide initializers completely.
# The appearance of the initializer of individual variables and defines in the
# documentation can be controlled using \showinitializer or \hideinitializer
# command in the documentation regardless of this setting.
MAX_INITIALIZER_LINES = 30
# Set the SHOW_USED_FILES tag to NO to disable the list of files generated
# at the bottom of the documentation of classes and structs. If set to YES the
# list will mention the files that were used to generate the documentation.
SHOW_USED_FILES = YES
# If the sources in your project are distributed over multiple directories
# then setting the SHOW_DIRECTORIES tag to YES will show the directory hierarchy
# in the documentation. The default is NO.
SHOW_DIRECTORIES = NO
# The FILE_VERSION_FILTER tag can be used to specify a program or script that
# doxygen should invoke to get the current version for each file (typically from the
# version control system). Doxygen will invoke the program by executing (via
# popen()) the command <command> <input-file>, where <command> is the value of
# the FILE_VERSION_FILTER tag, and <input-file> is the name of an input file
# provided by doxygen. Whatever the program writes to standard output
# is used as the file version. See the manual for examples.
FILE_VERSION_FILTER =
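# As a hedged illustration (assuming the sources live in a git checkout;
# any other VCS would need its own command): doxygen appends the input
# file name to the command, so a one-liner such as the following would
# emit the abbreviated commit hash of the last change to each file:
#
# FILE_VERSION_FILTER = "git log -n 1 --pretty=format:%h --"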
#---------------------------------------------------------------------------
# configuration options related to warning and progress messages
#---------------------------------------------------------------------------
# The QUIET tag can be used to turn on/off the messages that are generated
# by doxygen. Possible values are YES and NO. If left blank NO is used.
QUIET = NO
# The WARNINGS tag can be used to turn on/off the warning messages that are
# generated by doxygen. Possible values are YES and NO. If left blank
# NO is used.
WARNINGS = YES
# If WARN_IF_UNDOCUMENTED is set to YES, then doxygen will generate warnings
# for undocumented members. If EXTRACT_ALL is set to YES then this flag will
# automatically be disabled.
WARN_IF_UNDOCUMENTED = YES
# If WARN_IF_DOC_ERROR is set to YES, doxygen will generate warnings for
# potential errors in the documentation, such as not documenting some
# parameters in a documented function, or documenting parameters that
# don't exist or using markup commands wrongly.
WARN_IF_DOC_ERROR = YES
# The WARN_NO_PARAMDOC option can be enabled to get warnings for
# functions that are documented, but have no documentation for their parameters
# or return value. If set to NO (the default) doxygen will only warn about
# wrong or incomplete parameter documentation, but not about the absence of
# documentation.
WARN_NO_PARAMDOC = NO
# The WARN_FORMAT tag determines the format of the warning messages that
# doxygen can produce. The string should contain the $file, $line, and $text
# tags, which will be replaced by the file and line number from which the
# warning originated and the warning text. Optionally the format may contain
# $version, which will be replaced by the version of the file (if it could
# be obtained via FILE_VERSION_FILTER)
WARN_FORMAT = "$file:$line: $text"
# The WARN_LOGFILE tag can be used to specify a file to which warning
# and error messages should be written. If left blank the output is written
# to stderr.
WARN_LOGFILE =
#---------------------------------------------------------------------------
# configuration options related to the input files
#---------------------------------------------------------------------------
# The INPUT tag can be used to specify the files and/or directories that contain
# documented source files. You may enter file names like "myfile.cpp" or
# directories like "/usr/src/myproject". Separate the files or directories
# with spaces.
INPUT = ..
# This tag can be used to specify the character encoding of the source files that
# doxygen parses. Internally doxygen uses the UTF-8 encoding, which is also the default
# input encoding. Doxygen uses libiconv (or the iconv built into libc) for the transcoding.
# See http://www.gnu.org/software/libiconv for the list of possible encodings.
INPUT_ENCODING = UTF-8
# If the value of the INPUT tag contains directories, you can use the
# FILE_PATTERNS tag to specify one or more wildcard patterns (like *.cpp
# and *.h) to filter out the source-files in the directories. If left
# blank the following patterns are tested:
# *.c *.cc *.cxx *.cpp *.c++ *.java *.ii *.ixx *.ipp *.i++ *.inl *.h *.hh *.hxx
# *.hpp *.h++ *.idl *.odl *.cs *.php *.php3 *.inc *.m *.mm *.py
FILE_PATTERNS = *.h \
*.c
# The RECURSIVE tag can be used to specify whether or not subdirectories
# should be searched for input files as well. Possible values are YES and NO.
# If left blank NO is used.
RECURSIVE = NO
# The EXCLUDE tag can be used to specify files and/or directories that should be
# excluded from the INPUT source files. This way you can easily exclude a
# subdirectory from a directory tree whose root is specified with the INPUT tag.
EXCLUDE = config.h
# The EXCLUDE_SYMLINKS tag can be used to select whether or not files or
# directories that are symbolic links (a Unix filesystem feature) are excluded
# from the input.
EXCLUDE_SYMLINKS = NO
# If the value of the INPUT tag contains directories, you can use the
# EXCLUDE_PATTERNS tag to specify one or more wildcard patterns to exclude
# certain files from those directories. Note that the wildcards are matched
# against the file with absolute path, so to exclude all test directories
# for example use the pattern */test/*
EXCLUDE_PATTERNS =
# The EXCLUDE_SYMBOLS tag can be used to specify one or more symbol names
# (namespaces, classes, functions, etc.) that should be excluded from the output.
# The symbol name can be a fully qualified name, a word, or if the wildcard * is used,
# a substring. Examples: ANamespace, AClass, AClass::ANamespace, ANamespace::*Test
EXCLUDE_SYMBOLS =
# The EXAMPLE_PATH tag can be used to specify one or more files or
# directories that contain example code fragments that are included (see
# the \include command).
EXAMPLE_PATH =
# If the value of the EXAMPLE_PATH tag contains directories, you can use the
# EXAMPLE_PATTERNS tag to specify one or more wildcard patterns (like *.cpp
# and *.h) to filter out the source-files in the directories. If left
# blank all files are included.
EXAMPLE_PATTERNS =
# If the EXAMPLE_RECURSIVE tag is set to YES then subdirectories will be
# searched for input files to be used with the \include or \dontinclude
# commands irrespective of the value of the RECURSIVE tag.
# Possible values are YES and NO. If left blank NO is used.
EXAMPLE_RECURSIVE = NO
# The IMAGE_PATH tag can be used to specify one or more files or
# directories that contain images that are included in the documentation (see
# the \image command).
IMAGE_PATH =
# The INPUT_FILTER tag can be used to specify a program that doxygen should
# invoke to filter for each input file. Doxygen will invoke the filter program
# by executing (via popen()) the command <filter> <input-file>, where <filter>
# is the value of the INPUT_FILTER tag, and <input-file> is the name of an
# input file. Doxygen will then use the output that the filter program writes
# to standard output. If FILTER_PATTERNS is specified, this tag will be
# ignored.
INPUT_FILTER =
# The FILTER_PATTERNS tag can be used to specify filters on a per file pattern
# basis. Doxygen will compare the file name with each pattern and apply the
# filter if there is a match. The filters are a list of the form:
# pattern=filter (like *.cpp=my_cpp_filter). See INPUT_FILTER for further
# info on how filters are used. If FILTER_PATTERNS is empty, INPUT_FILTER
# is applied to all files.
FILTER_PATTERNS =
# If the FILTER_SOURCE_FILES tag is set to YES, the input filter (if set using
# INPUT_FILTER) will be used to filter the input files when producing source
# files to browse (i.e. when SOURCE_BROWSER is set to YES).
FILTER_SOURCE_FILES = NO
#---------------------------------------------------------------------------
# configuration options related to source browsing
#---------------------------------------------------------------------------
# If the SOURCE_BROWSER tag is set to YES then a list of source files will
# be generated. Documented entities will be cross-referenced with these sources.
# Note: To get rid of all source code in the generated output, make sure also
# VERBATIM_HEADERS is set to NO.
SOURCE_BROWSER = NO
# Setting the INLINE_SOURCES tag to YES will include the body
# of functions and classes directly in the documentation.
INLINE_SOURCES = NO
# Setting the STRIP_CODE_COMMENTS tag to YES (the default) will instruct
# doxygen to hide any special comment blocks from generated source code
# fragments. Normal C and C++ comments will always remain visible.
STRIP_CODE_COMMENTS = YES
# If the REFERENCED_BY_RELATION tag is set to YES (the default)
# then for each documented function all documented
# functions referencing it will be listed.
REFERENCED_BY_RELATION = YES
# If the REFERENCES_RELATION tag is set to YES (the default)
# then for each documented function all documented entities
# called/used by that function will be listed.
REFERENCES_RELATION = YES
# If the REFERENCES_LINK_SOURCE tag is set to YES (the default)
# and SOURCE_BROWSER tag is set to YES, then the hyperlinks from
# functions in REFERENCES_RELATION and REFERENCED_BY_RELATION lists will
# link to the source code. Otherwise they will link to the documentation.
REFERENCES_LINK_SOURCE = YES
# If the USE_HTAGS tag is set to YES then the references to source code
# will point to the HTML generated by the htags(1) tool instead of doxygen
# built-in source browser. The htags tool is part of GNU's global source
# tagging system (see http://www.gnu.org/software/global/global.html). You
# will need version 4.8.6 or higher.
USE_HTAGS = NO
# If the VERBATIM_HEADERS tag is set to YES (the default) then Doxygen
# will generate a verbatim copy of the header file for each class for
# which an include is specified. Set to NO to disable this.
VERBATIM_HEADERS = YES
#---------------------------------------------------------------------------
# configuration options related to the alphabetical class index
#---------------------------------------------------------------------------
# If the ALPHABETICAL_INDEX tag is set to YES, an alphabetical index
# of all compounds will be generated. Enable this if the project
# contains a lot of classes, structs, unions or interfaces.
ALPHABETICAL_INDEX = NO
# If the alphabetical index is enabled (see ALPHABETICAL_INDEX) then
# the COLS_IN_ALPHA_INDEX tag can be used to specify the number of columns
# in which this list will be split (can be a number in the range [1..20])
COLS_IN_ALPHA_INDEX = 5
# In case all classes in a project start with a common prefix, all
# classes will be put under the same header in the alphabetical index.
# The IGNORE_PREFIX tag can be used to specify one or more prefixes that
# should be ignored while generating the index headers.
IGNORE_PREFIX =
#---------------------------------------------------------------------------
# configuration options related to the HTML output
#---------------------------------------------------------------------------
# If the GENERATE_HTML tag is set to YES (the default) Doxygen will
# generate HTML output.
GENERATE_HTML = YES
# The HTML_OUTPUT tag is used to specify where the HTML docs will be put.
# If a relative path is entered the value of OUTPUT_DIRECTORY will be
# put in front of it. If left blank `html' will be used as the default path.
HTML_OUTPUT = html
# The HTML_FILE_EXTENSION tag can be used to specify the file extension for
# each generated HTML page (for example: .htm,.php,.asp). If it is left blank
# doxygen will generate files with .html extension.
HTML_FILE_EXTENSION = .html
# The HTML_HEADER tag can be used to specify a personal HTML header for
# each generated HTML page. If it is left blank doxygen will generate a
# standard header.
HTML_HEADER =
# The HTML_FOOTER tag can be used to specify a personal HTML footer for
# each generated HTML page. If it is left blank doxygen will generate a
# standard footer.
HTML_FOOTER =
# The HTML_STYLESHEET tag can be used to specify a user-defined cascading
# style sheet that is used by each HTML page. It can be used to
# fine-tune the look of the HTML output. If the tag is left blank doxygen
# will generate a default style sheet. Note that doxygen will try to copy
# the style sheet file to the HTML output directory, so don't put your own
# stylesheet in the HTML output directory as well, or it will be erased!
HTML_STYLESHEET =
# If the HTML_ALIGN_MEMBERS tag is set to YES, the members of classes,
# files or namespaces will be aligned in HTML using tables. If set to
# NO a bullet list will be used.
HTML_ALIGN_MEMBERS = YES
# If the GENERATE_HTMLHELP tag is set to YES, additional index files
# will be generated that can be used as input for tools like the
# Microsoft HTML help workshop to generate a compressed HTML help file (.chm)
# of the generated HTML documentation.
GENERATE_HTMLHELP = NO
# If the GENERATE_HTMLHELP tag is set to YES, the CHM_FILE tag can
# be used to specify the file name of the resulting .chm file. You
# can add a path in front of the file if the result should not be
# written to the html output directory.
CHM_FILE =
# If the GENERATE_HTMLHELP tag is set to YES, the HHC_LOCATION tag can
# be used to specify the location (absolute path including file name) of
# the HTML help compiler (hhc.exe). If non-empty doxygen will try to run
# the HTML help compiler on the generated index.hhp.
HHC_LOCATION =
# If the GENERATE_HTMLHELP tag is set to YES, the GENERATE_CHI flag
# controls if a separate .chi index file is generated (YES) or that
# it should be included in the master .chm file (NO).
GENERATE_CHI = NO
# If the GENERATE_HTMLHELP tag is set to YES, the BINARY_TOC flag
# controls whether a binary table of contents is generated (YES) or a
# normal table of contents (NO) in the .chm file.
BINARY_TOC = NO
# The TOC_EXPAND flag can be set to YES to add extra items for group members
# to the contents of the HTML help documentation and to the tree view.
TOC_EXPAND = NO
# The DISABLE_INDEX tag can be used to turn on/off the condensed index at
# top of each HTML page. The value NO (the default) enables the index and
# the value YES disables it.
DISABLE_INDEX = NO
# This tag can be used to set the number of enum values (range [1..20])
# that doxygen will group on one line in the generated HTML documentation.
ENUM_VALUES_PER_LINE = 4
# If the GENERATE_TREEVIEW tag is set to YES, a side panel will be
# generated containing a tree-like index structure (just like the one that
# is generated for HTML Help). For this to work a browser that supports
# JavaScript, DHTML, CSS and frames is required (for instance Mozilla 1.0+,
# Netscape 6.0+, Internet explorer 5.0+, or Konqueror). Windows users are
# probably better off using the HTML help feature.
GENERATE_TREEVIEW = NO
# If the treeview is enabled (see GENERATE_TREEVIEW) then this tag can be
# used to set the initial width (in pixels) of the frame in which the tree
# is shown.
TREEVIEW_WIDTH = 250
#---------------------------------------------------------------------------
# configuration options related to the LaTeX output
#---------------------------------------------------------------------------
# If the GENERATE_LATEX tag is set to YES (the default) Doxygen will
# generate Latex output.
GENERATE_LATEX = NO
# The LATEX_OUTPUT tag is used to specify where the LaTeX docs will be put.
# If a relative path is entered the value of OUTPUT_DIRECTORY will be
# put in front of it. If left blank `latex' will be used as the default path.
LATEX_OUTPUT = latex
# The LATEX_CMD_NAME tag can be used to specify the LaTeX command name to be
# invoked. If left blank `latex' will be used as the default command name.
LATEX_CMD_NAME = latex
# The MAKEINDEX_CMD_NAME tag can be used to specify the command name to
# generate index for LaTeX. If left blank `makeindex' will be used as the
# default command name.
MAKEINDEX_CMD_NAME = makeindex
# If the COMPACT_LATEX tag is set to YES Doxygen generates more compact
# LaTeX documents. This may be useful for small projects and may help to
# save some trees in general.
COMPACT_LATEX = NO
# The PAPER_TYPE tag can be used to set the paper type that is used
# by the printer. Possible values are: a4, a4wide, letter, legal and
# executive. If left blank a4wide will be used.
PAPER_TYPE = a4wide
# The EXTRA_PACKAGES tag can be used to specify one or more names of LaTeX
# packages that should be included in the LaTeX output.
EXTRA_PACKAGES =
# The LATEX_HEADER tag can be used to specify a personal LaTeX header for
# the generated latex document. The header should contain everything until
# the first chapter. If it is left blank doxygen will generate a
# standard header. Notice: only use this tag if you know what you are doing!
LATEX_HEADER =
# If the PDF_HYPERLINKS tag is set to YES, the LaTeX that is generated
# is prepared for conversion to pdf (using ps2pdf). The pdf file will
# contain links (just like the HTML output) instead of page references
# This makes the output suitable for online browsing using a pdf viewer.
PDF_HYPERLINKS = NO
# If the USE_PDFLATEX tag is set to YES, pdflatex will be used instead of
# plain latex in the generated Makefile. Set this option to YES to get a
# higher quality PDF documentation.
USE_PDFLATEX = NO
# If the LATEX_BATCHMODE tag is set to YES, doxygen will add the \\batchmode
# command to the generated LaTeX files. This will instruct LaTeX to keep
# running if errors occur, instead of asking the user for help.
# This option is also used when generating formulas in HTML.
LATEX_BATCHMODE = NO
# If LATEX_HIDE_INDICES is set to YES then doxygen will not
# include the index chapters (such as File Index, Compound Index, etc.)
# in the output.
LATEX_HIDE_INDICES = NO
#---------------------------------------------------------------------------
# configuration options related to the RTF output
#---------------------------------------------------------------------------
# If the GENERATE_RTF tag is set to YES Doxygen will generate RTF output
# The RTF output is optimized for Word 97 and may not look very pretty with
# other RTF readers or editors.
GENERATE_RTF = NO
# The RTF_OUTPUT tag is used to specify where the RTF docs will be put.
# If a relative path is entered the value of OUTPUT_DIRECTORY will be
# put in front of it. If left blank `rtf' will be used as the default path.
RTF_OUTPUT = rtf
# If the COMPACT_RTF tag is set to YES Doxygen generates more compact
# RTF documents. This may be useful for small projects and may help to
# save some trees in general.
COMPACT_RTF = NO
# If the RTF_HYPERLINKS tag is set to YES, the RTF that is generated
# will contain hyperlink fields. The RTF file will
# contain links (just like the HTML output) instead of page references.
# This makes the output suitable for online browsing using WORD or other
# programs which support those fields.
# Note: wordpad (write) and others do not support links.
RTF_HYPERLINKS = NO
# Load stylesheet definitions from file. Syntax is similar to doxygen's
# config file, i.e. a series of assignments. You only have to provide
# replacements, missing definitions are set to their default value.
RTF_STYLESHEET_FILE =
# Set optional variables used in the generation of an rtf document.
# Syntax is similar to doxygen's config file.
RTF_EXTENSIONS_FILE =
#---------------------------------------------------------------------------
# configuration options related to the man page output
#---------------------------------------------------------------------------
# If the GENERATE_MAN tag is set to YES (the default) Doxygen will
# generate man pages
GENERATE_MAN = YES
# The MAN_OUTPUT tag is used to specify where the man pages will be put.
# If a relative path is entered the value of OUTPUT_DIRECTORY will be
# put in front of it. If left blank `man' will be used as the default path.
MAN_OUTPUT = man
# The MAN_EXTENSION tag determines the extension that is added to
# the generated man pages (default is the subroutine's section .3)
MAN_EXTENSION = .3
# If the MAN_LINKS tag is set to YES and Doxygen generates man output,
# then it will generate one additional man file for each entity
# documented in the real man page(s). These additional files
# only source the real man page, but without them the man command
# would be unable to find the correct page. The default is NO.
MAN_LINKS = NO
#---------------------------------------------------------------------------
# configuration options related to the XML output
#---------------------------------------------------------------------------
# If the GENERATE_XML tag is set to YES Doxygen will
# generate an XML file that captures the structure of
# the code including all documentation.
GENERATE_XML = NO
# The XML_OUTPUT tag is used to specify where the XML pages will be put.
# If a relative path is entered the value of OUTPUT_DIRECTORY will be
# put in front of it. If left blank `xml' will be used as the default path.
XML_OUTPUT = xml
# The XML_SCHEMA tag can be used to specify an XML schema,
# which can be used by a validating XML parser to check the
# syntax of the XML files.
XML_SCHEMA =
# The XML_DTD tag can be used to specify an XML DTD,
# which can be used by a validating XML parser to check the
# syntax of the XML files.
XML_DTD =
# If the XML_PROGRAMLISTING tag is set to YES Doxygen will
# dump the program listings (including syntax highlighting
# and cross-referencing information) to the XML output. Note that
# enabling this will significantly increase the size of the XML output.
XML_PROGRAMLISTING = YES
#---------------------------------------------------------------------------
# configuration options for the AutoGen Definitions output
#---------------------------------------------------------------------------
# If the GENERATE_AUTOGEN_DEF tag is set to YES Doxygen will
# generate an AutoGen Definitions (see autogen.sf.net) file
# that captures the structure of the code including all
# documentation. Note that this feature is still experimental
# and incomplete at the moment.
GENERATE_AUTOGEN_DEF = NO
#---------------------------------------------------------------------------
# configuration options related to the Perl module output
#---------------------------------------------------------------------------
# If the GENERATE_PERLMOD tag is set to YES Doxygen will
# generate a Perl module file that captures the structure of
# the code including all documentation. Note that this
# feature is still experimental and incomplete at the
# moment.
GENERATE_PERLMOD = NO
# If the PERLMOD_LATEX tag is set to YES Doxygen will generate
# the necessary Makefile rules, Perl scripts and LaTeX code to be able
# to generate PDF and DVI output from the Perl module output.
PERLMOD_LATEX = NO
# If the PERLMOD_PRETTY tag is set to YES the Perl module output will be
# nicely formatted so it can be parsed by a human reader. This is useful
# if you want to understand what is going on. On the other hand, if this
# tag is set to NO the size of the Perl module output will be much smaller
# and Perl will parse it just the same.
PERLMOD_PRETTY = YES
# The names of the make variables in the generated doxyrules.make file
# are prefixed with the string contained in PERLMOD_MAKEVAR_PREFIX.
# This is useful so different doxyrules.make files included by the same
# Makefile don't overwrite each other's variables.
PERLMOD_MAKEVAR_PREFIX =
#---------------------------------------------------------------------------
# Configuration options related to the preprocessor
#---------------------------------------------------------------------------
# If the ENABLE_PREPROCESSING tag is set to YES (the default) Doxygen will
# evaluate all C-preprocessor directives found in the sources and include
# files.
ENABLE_PREPROCESSING = YES
# If the MACRO_EXPANSION tag is set to YES Doxygen will expand all macro
# names in the source code. If set to NO (the default) only conditional
# compilation will be performed. Macro expansion can be done in a controlled
# way by setting EXPAND_ONLY_PREDEF to YES.
MACRO_EXPANSION = NO
# If the EXPAND_ONLY_PREDEF and MACRO_EXPANSION tags are both set to YES
# then the macro expansion is limited to the macros specified with the
# PREDEFINED and EXPAND_AS_DEFINED tags.
EXPAND_ONLY_PREDEF = NO
# If the SEARCH_INCLUDES tag is set to YES (the default) the includes files
# in the INCLUDE_PATH (see below) will be searched if a #include is found.
SEARCH_INCLUDES = YES
# The INCLUDE_PATH tag can be used to specify one or more directories that
# contain include files that are not input files but should be processed by
# the preprocessor.
INCLUDE_PATH =
# You can use the INCLUDE_FILE_PATTERNS tag to specify one or more wildcard
# patterns (like *.h and *.hpp) to filter out the header-files in the
# directories. If left blank, the patterns specified with FILE_PATTERNS will
# be used.
INCLUDE_FILE_PATTERNS =
# The PREDEFINED tag can be used to specify one or more macro names that
# are defined before the preprocessor is started (similar to the -D option of
# gcc). The argument of the tag is a list of macros of the form: name
# or name=definition (no spaces). If the definition and the = are
# omitted =1 is assumed. To prevent a macro definition from being
# undefined via #undef or recursively expanded use the := operator
# instead of the = operator.
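# For example, the following (illustrative values only, not part of this
# configuration) defines DEBUG as 1 and pins WINVER so it cannot be
# recursively expanded or undefined via #undef:
#   PREDEFINED = DEBUG WINVER:=0x0501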
PREDEFINED =
# If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then
# this tag can be used to specify a list of macro names that should be expanded.
# The macro definition that is found in the sources will be used.
# Use the PREDEFINED tag if you want to use a different macro definition.
EXPAND_AS_DEFINED =
# If the SKIP_FUNCTION_MACROS tag is set to YES (the default) then
# doxygen's preprocessor will remove all function-like macros that are alone
# on a line, have an all uppercase name, and do not end with a semicolon. Such
# function macros are typically used for boiler-plate code, and will confuse
# the parser if not removed.
SKIP_FUNCTION_MACROS = YES
#---------------------------------------------------------------------------
# Configuration::additions related to external references
#---------------------------------------------------------------------------
# The TAGFILES option can be used to specify one or more tagfiles.
# Optionally an initial location of the external documentation
# can be added for each tagfile. The format of a tag file without
# this location is as follows:
# TAGFILES = file1 file2 ...
# Adding location for the tag files is done as follows:
# TAGFILES = file1=loc1 "file2 = loc2" ...
# where "loc1" and "loc2" can be relative or absolute paths or
# URLs. If a location is present for each tag, the installdox tool
# does not have to be run to correct the links.
# Note that each tag file must have a unique name
# (where the name does NOT include the path)
# If a tag file is not located in the directory in which doxygen
# is run, you must also specify the path to the tagfile here.
TAGFILES =
# When a file name is specified after GENERATE_TAGFILE, doxygen will create
# a tag file that is based on the input files it reads.
GENERATE_TAGFILE =
# If the ALLEXTERNALS tag is set to YES all external classes will be listed
# in the class index. If set to NO only the inherited external classes
# will be listed.
ALLEXTERNALS = NO
# If the EXTERNAL_GROUPS tag is set to YES all external groups will be listed
# in the modules index. If set to NO, only the current project's groups will
# be listed.
EXTERNAL_GROUPS = YES
# The PERL_PATH should be the absolute path and name of the perl script
# interpreter (i.e. the result of `which perl').
PERL_PATH = /usr/bin/perl
#---------------------------------------------------------------------------
# Configuration options related to the dot tool
#---------------------------------------------------------------------------
# If the CLASS_DIAGRAMS tag is set to YES (the default) Doxygen will
# generate an inheritance diagram (in HTML, RTF and LaTeX) for classes with base
# or super classes. Setting the tag to NO turns the diagrams off. Note that
# this option is superseded by the HAVE_DOT option below. This is only a
# fallback. It is recommended to install and use dot, since it yields more
# powerful graphs.
CLASS_DIAGRAMS = YES
# You can define message sequence charts within doxygen comments using the \msc
# command. Doxygen will then run the mscgen tool (see http://www.mcternan.me.uk/mscgen/) to
# produce the chart and insert it in the documentation. The MSCGEN_PATH tag allows you to
# specify the directory where the mscgen tool resides. If left empty the tool is assumed to
# be found in the default search path.
MSCGEN_PATH =
# If set to YES, the inheritance and collaboration graphs will hide
# inheritance and usage relations if the target is undocumented
# or is not a class.
HIDE_UNDOC_RELATIONS = YES
# If you set the HAVE_DOT tag to YES then doxygen will assume the dot tool is
# available from the path. This tool is part of Graphviz, a graph visualization
# toolkit from AT&T and Lucent Bell Labs. The other options in this section
# have no effect if this option is set to NO (the default)
HAVE_DOT = NO
# If the CLASS_GRAPH and HAVE_DOT tags are set to YES then doxygen
# will generate a graph for each documented class showing the direct and
# indirect inheritance relations. Setting this tag to YES will force
# the CLASS_DIAGRAMS tag to NO.
CLASS_GRAPH = YES
# If the COLLABORATION_GRAPH and HAVE_DOT tags are set to YES then doxygen
# will generate a graph for each documented class showing the direct and
# indirect implementation dependencies (inheritance, containment, and
# class references variables) of the class with other documented classes.
COLLABORATION_GRAPH = YES
# If the GROUP_GRAPHS and HAVE_DOT tags are set to YES then doxygen
# will generate a graph for groups, showing the direct groups dependencies
GROUP_GRAPHS = YES
# If the UML_LOOK tag is set to YES doxygen will generate inheritance and
# collaboration diagrams in a style similar to the OMG's Unified Modeling
# Language.
UML_LOOK = NO
# If set to YES, the inheritance and collaboration graphs will show the
# relations between templates and their instances.
TEMPLATE_RELATIONS = NO
# If the ENABLE_PREPROCESSING, SEARCH_INCLUDES, INCLUDE_GRAPH, and HAVE_DOT
# tags are set to YES then doxygen will generate a graph for each documented
# file showing the direct and indirect include dependencies of the file with
# other documented files.
INCLUDE_GRAPH = YES
# If the ENABLE_PREPROCESSING, SEARCH_INCLUDES, INCLUDED_BY_GRAPH, and
# HAVE_DOT tags are set to YES then doxygen will generate a graph for each
# documented header file showing the documented files that directly or
# indirectly include this file.
INCLUDED_BY_GRAPH = YES
# If the CALL_GRAPH and HAVE_DOT tags are set to YES then doxygen will
# generate a call dependency graph for every global function or class method.
# Note that enabling this option will significantly increase the time of a run.
# So in most cases it will be better to enable call graphs for selected
# functions only using the \callgraph command.
CALL_GRAPH = NO
# If the CALLER_GRAPH and HAVE_DOT tags are set to YES then doxygen will
# generate a caller dependency graph for every global function or class method.
# Note that enabling this option will significantly increase the time of a run.
# So in most cases it will be better to enable caller graphs for selected
# functions only using the \callergraph command.
CALLER_GRAPH = NO
# If the GRAPHICAL_HIERARCHY and HAVE_DOT tags are set to YES then doxygen
# will generate a graphical hierarchy of all classes instead of a textual one.
GRAPHICAL_HIERARCHY = YES
# If the DIRECTORY_GRAPH, SHOW_DIRECTORIES and HAVE_DOT tags are set to YES
# then doxygen will show the dependencies a directory has on other directories
# in a graphical way. The dependency relations are determined by the #include
# relations between the files in the directories.
DIRECTORY_GRAPH = YES
# The DOT_IMAGE_FORMAT tag can be used to set the image format of the images
# generated by dot. Possible values are png, jpg, or gif
# If left blank png will be used.
DOT_IMAGE_FORMAT = png
# The tag DOT_PATH can be used to specify the path where the dot tool can be
# found. If left blank, it is assumed the dot tool can be found in the path.
DOT_PATH =
# The DOTFILE_DIRS tag can be used to specify one or more directories that
# contain dot files that are included in the documentation (see the
# \dotfile command).
DOTFILE_DIRS =
# The DOT_GRAPH_MAX_NODES tag can be used to set the maximum number of
# nodes that will be shown in the graph. If the number of nodes in a graph
# becomes larger than this value, doxygen will truncate the graph, which is
# visualized by representing a node as a red box. Note that doxygen will always
# show the root nodes and its direct children regardless of this setting.
DOT_GRAPH_MAX_NODES = 50
# Set the DOT_TRANSPARENT tag to YES to generate images with a transparent
# background. This is disabled by default, which results in a white background.
# Warning: Depending on the platform used, enabling this option may lead to
# badly anti-aliased labels on the edges of a graph (i.e. they become hard to
# read).
DOT_TRANSPARENT = NO
# Set the DOT_MULTI_TARGETS tag to YES allow dot to generate multiple output
# files in one run (i.e. multiple -o and -T options on the command line). This
# makes dot run faster, but since only newer versions of dot (>1.8.10)
# support this, this feature is disabled by default.
DOT_MULTI_TARGETS = NO
# If the GENERATE_LEGEND tag is set to YES (the default) Doxygen will
# generate a legend page explaining the meaning of the various boxes and
# arrows in the dot generated graphs.
GENERATE_LEGEND = YES
# If the DOT_CLEANUP tag is set to YES (the default) Doxygen will
# remove the intermediate dot files that are used to generate
# the various graphs.
DOT_CLEANUP = YES
#---------------------------------------------------------------------------
# Configuration::additions related to the search engine
#---------------------------------------------------------------------------
# The SEARCHENGINE tag specifies whether or not a search engine should be
# used. If set to NO the values of all tags below this one will be ignored.
SEARCHENGINE = NO
// Copyright (c) 2011 The LevelDB Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file. See the AUTHORS file for names of contributors.
#include <stdio.h>
#include <stdlib.h>
#include <kcpolydb.h>
#include "util/histogram.h"
#include "util/random.h"
#include "util/testutil.h"
// Comma-separated list of operations to run in the specified order
// Actual benchmarks:
//
// fillseq -- write N values in sequential key order in async mode
// fillrandom -- write N values in random key order in async mode
// overwrite -- overwrite N values in random key order in async mode
// fillseqsync -- write N/100 values in sequential key order in sync mode
// fillrandsync -- write N/100 values in random key order in sync mode
// fillrand100K -- write N/1000 100K values in random order in async mode
// fillseq100K -- write N/1000 100K values in seq order in async mode
// readseq -- read N times sequentially
// readseq100K -- read N/1000 100K values in sequential order in async mode
//   readrand100K -- read N/1000 100K values in random order in async mode
// readrandom -- read N times in random order
static const char* FLAGS_benchmarks =
"fillseq,"
"fillseqsync,"
"fillrandsync,"
"fillrandom,"
"overwrite,"
"readrandom,"
"readseq,"
"fillrand100K,"
"fillseq100K,"
"readseq100K,"
"readrand100K,"
;
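// A subset of these benchmarks can be selected at run time via the
// --benchmarks flag, e.g. (illustrative invocation; the flag parsing
// itself lives in main(), outside this excerpt):
//   ./db_bench_tree_db --benchmarks=fillseq,readrandom --num=100000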
// Number of key/values to place in database
static int FLAGS_num = 1000000;
// Number of read operations to do. If negative, do FLAGS_num reads.
static int FLAGS_reads = -1;
// Size of each value
static int FLAGS_value_size = 100;
// Arrange to generate values that shrink to this fraction of
// their original size after compression
static double FLAGS_compression_ratio = 0.5;
// Print histogram of operation timings
static bool FLAGS_histogram = false;
// Cache size. Default 4 MB
static int FLAGS_cache_size = 4194304;
// Page size. Default 1 KB
static int FLAGS_page_size = 1024;
// If true, do not destroy the existing database. If you set this
// flag and also specify a benchmark that wants a fresh database, that
// benchmark will fail.
static bool FLAGS_use_existing_db = false;
// Compression flag. If true, compression is on. If false, compression
// is off.
static bool FLAGS_compression = true;
// Use the db with the following name.
static const char* FLAGS_db = NULL;
inline
static void DBSynchronize(kyotocabinet::TreeDB* db_)
{
// Synchronize will flush writes to disk
if (!db_->synchronize()) {
fprintf(stderr, "synchronize error: %s\n", db_->error().name());
}
}
namespace leveldb {
// Helper for quickly generating random data.
namespace {
class RandomGenerator {
private:
std::string data_;
int pos_;
public:
RandomGenerator() {
// We use a limited amount of data over and over again and ensure
// that it is larger than the compression window (32KB), and also
// large enough to serve all typical value sizes we want to write.
Random rnd(301);
std::string piece;
while (data_.size() < 1048576) {
// Add a short fragment that is as compressible as specified
// by FLAGS_compression_ratio.
test::CompressibleString(&rnd, FLAGS_compression_ratio, 100, &piece);
data_.append(piece);
}
pos_ = 0;
}
Slice Generate(int len) {
if (pos_ + len > data_.size()) {
pos_ = 0;
assert(len < data_.size());
}
pos_ += len;
return Slice(data_.data() + pos_ - len, len);
}
};
static Slice TrimSpace(Slice s) {
int start = 0;
while (start < s.size() && isspace(s[start])) {
start++;
}
int limit = s.size();
while (limit > start && isspace(s[limit-1])) {
limit--;
}
return Slice(s.data() + start, limit - start);
}
} // namespace
class Benchmark {
private:
kyotocabinet::TreeDB* db_;
int db_num_;
int num_;
int reads_;
double start_;
double last_op_finish_;
int64_t bytes_;
std::string message_;
Histogram hist_;
RandomGenerator gen_;
Random rand_;
kyotocabinet::LZOCompressor<kyotocabinet::LZO::RAW> comp_;
// State kept for progress messages
int done_;
int next_report_; // When to report next
void PrintHeader() {
const int kKeySize = 16;
PrintEnvironment();
fprintf(stdout, "Keys: %d bytes each\n", kKeySize);
fprintf(stdout, "Values: %d bytes each (%d bytes after compression)\n",
FLAGS_value_size,
static_cast<int>(FLAGS_value_size * FLAGS_compression_ratio + 0.5));
fprintf(stdout, "Entries: %d\n", num_);
fprintf(stdout, "RawSize: %.1f MB (estimated)\n",
((static_cast<int64_t>(kKeySize + FLAGS_value_size) * num_)
/ 1048576.0));
fprintf(stdout, "FileSize: %.1f MB (estimated)\n",
(((kKeySize + FLAGS_value_size * FLAGS_compression_ratio) * num_)
/ 1048576.0));
PrintWarnings();
fprintf(stdout, "------------------------------------------------\n");
}
void PrintWarnings() {
#if defined(__GNUC__) && !defined(__OPTIMIZE__)
fprintf(stdout,
"WARNING: Optimization is disabled: benchmarks unnecessarily slow\n"
);
#endif
#ifndef NDEBUG
fprintf(stdout,
"WARNING: Assertions are enabled; benchmarks unnecessarily slow\n");
#endif
}
void PrintEnvironment() {
fprintf(stderr, "Kyoto Cabinet: version %s, lib ver %d, lib rev %d\n",
kyotocabinet::VERSION, kyotocabinet::LIBVER, kyotocabinet::LIBREV);
#if defined(__linux)
time_t now = time(NULL);
fprintf(stderr, "Date: %s", ctime(&now)); // ctime() adds newline
FILE* cpuinfo = fopen("/proc/cpuinfo", "r");
if (cpuinfo != NULL) {
char line[1000];
int num_cpus = 0;
std::string cpu_type;
std::string cache_size;
while (fgets(line, sizeof(line), cpuinfo) != NULL) {
const char* sep = strchr(line, ':');
if (sep == NULL) {
continue;
}
Slice key = TrimSpace(Slice(line, sep - 1 - line));
Slice val = TrimSpace(Slice(sep + 1));
if (key == "model name") {
++num_cpus;
cpu_type = val.ToString();
} else if (key == "cache size") {
cache_size = val.ToString();
}
}
fclose(cpuinfo);
fprintf(stderr, "CPU: %d * %s\n", num_cpus, cpu_type.c_str());
fprintf(stderr, "CPUCache: %s\n", cache_size.c_str());
}
#endif
}
void Start() {
start_ = Env::Default()->NowMicros() * 1e-6;
bytes_ = 0;
message_.clear();
last_op_finish_ = start_;
hist_.Clear();
done_ = 0;
next_report_ = 100;
}
void FinishedSingleOp() {
if (FLAGS_histogram) {
double now = Env::Default()->NowMicros() * 1e-6;
double micros = (now - last_op_finish_) * 1e6;
hist_.Add(micros);
if (micros > 20000) {
fprintf(stderr, "long op: %.1f micros%30s\r", micros, "");
fflush(stderr);
}
last_op_finish_ = now;
}
done_++;
if (done_ >= next_report_) {
if (next_report_ < 1000) next_report_ += 100;
else if (next_report_ < 5000) next_report_ += 500;
else if (next_report_ < 10000) next_report_ += 1000;
else if (next_report_ < 50000) next_report_ += 5000;
else if (next_report_ < 100000) next_report_ += 10000;
else if (next_report_ < 500000) next_report_ += 50000;
else next_report_ += 100000;
fprintf(stderr, "... finished %d ops%30s\r", done_, "");
fflush(stderr);
}
}
void Stop(const Slice& name) {
double finish = Env::Default()->NowMicros() * 1e-6;
// Pretend at least one op was done in case we are running a benchmark
// that does not call FinishedSingleOp().
if (done_ < 1) done_ = 1;
if (bytes_ > 0) {
char rate[100];
snprintf(rate, sizeof(rate), "%6.1f MB/s",
(bytes_ / 1048576.0) / (finish - start_));
if (!message_.empty()) {
message_ = std::string(rate) + " " + message_;
} else {
message_ = rate;
}
}
fprintf(stdout, "%-12s : %11.3f micros/op;%s%s\n",
name.ToString().c_str(),
(finish - start_) * 1e6 / done_,
(message_.empty() ? "" : " "),
message_.c_str());
if (FLAGS_histogram) {
fprintf(stdout, "Microseconds per op:\n%s\n", hist_.ToString().c_str());
}
fflush(stdout);
}
public:
enum Order {
SEQUENTIAL,
RANDOM
};
enum DBState {
FRESH,
EXISTING
};
Benchmark()
  : db_(NULL),
    db_num_(0),  // initialize explicitly; used by Open() to name database files
    num_(FLAGS_num),
reads_(FLAGS_reads < 0 ? FLAGS_num : FLAGS_reads),
bytes_(0),
rand_(301) {
std::vector<std::string> files;
std::string test_dir;
Env::Default()->GetTestDirectory(&test_dir);
Env::Default()->GetChildren(test_dir.c_str(), &files);
if (!FLAGS_use_existing_db) {
for (int i = 0; i < files.size(); i++) {
if (Slice(files[i]).starts_with("dbbench_polyDB")) {
std::string file_name(test_dir);
file_name += "/";
file_name += files[i];
Env::Default()->DeleteFile(file_name.c_str());
}
}
}
}
~Benchmark() {
if (!db_->close()) {
fprintf(stderr, "close error: %s\n", db_->error().name());
}
}
void Run() {
PrintHeader();
Open(false);
const char* benchmarks = FLAGS_benchmarks;
while (benchmarks != NULL) {
const char* sep = strchr(benchmarks, ',');
Slice name;
if (sep == NULL) {
name = benchmarks;
benchmarks = NULL;
} else {
name = Slice(benchmarks, sep - benchmarks);
benchmarks = sep + 1;
}
Start();
bool known = true;
bool write_sync = false;
if (name == Slice("fillseq")) {
Write(write_sync, SEQUENTIAL, FRESH, num_, FLAGS_value_size, 1);
DBSynchronize(db_);
} else if (name == Slice("fillrandom")) {
Write(write_sync, RANDOM, FRESH, num_, FLAGS_value_size, 1);
DBSynchronize(db_);
} else if (name == Slice("overwrite")) {
Write(write_sync, RANDOM, EXISTING, num_, FLAGS_value_size, 1);
DBSynchronize(db_);
} else if (name == Slice("fillrandsync")) {
write_sync = true;
Write(write_sync, RANDOM, FRESH, num_ / 100, FLAGS_value_size, 1);
DBSynchronize(db_);
} else if (name == Slice("fillseqsync")) {
write_sync = true;
Write(write_sync, SEQUENTIAL, FRESH, num_ / 100, FLAGS_value_size, 1);
DBSynchronize(db_);
} else if (name == Slice("fillrand100K")) {
Write(write_sync, RANDOM, FRESH, num_ / 1000, 100 * 1000, 1);
DBSynchronize(db_);
} else if (name == Slice("fillseq100K")) {
Write(write_sync, SEQUENTIAL, FRESH, num_ / 1000, 100 * 1000, 1);
DBSynchronize(db_);
} else if (name == Slice("readseq")) {
ReadSequential();
} else if (name == Slice("readrandom")) {
ReadRandom();
} else if (name == Slice("readrand100K")) {
int n = reads_;
reads_ /= 1000;
ReadRandom();
reads_ = n;
} else if (name == Slice("readseq100K")) {
int n = reads_;
reads_ /= 1000;
ReadSequential();
reads_ = n;
} else {
known = false;
if (name != Slice()) { // No error message for empty name
fprintf(stderr, "unknown benchmark '%s'\n", name.ToString().c_str());
}
}
if (known) {
Stop(name);
}
}
}
private:
void Open(bool sync) {
assert(db_ == NULL);
// Initialize db_
db_ = new kyotocabinet::TreeDB();
char file_name[100];
db_num_++;
std::string test_dir;
Env::Default()->GetTestDirectory(&test_dir);
snprintf(file_name, sizeof(file_name),
"%s/dbbench_polyDB-%d.kct",
test_dir.c_str(),
db_num_);
// Create tuning options and open the database
int open_options = kyotocabinet::PolyDB::OWRITER |
kyotocabinet::PolyDB::OCREATE;
int tune_options = kyotocabinet::TreeDB::TSMALL |
kyotocabinet::TreeDB::TLINEAR;
if (FLAGS_compression) {
tune_options |= kyotocabinet::TreeDB::TCOMPRESS;
db_->tune_compressor(&comp_);
}
db_->tune_options(tune_options);
db_->tune_page_cache(FLAGS_cache_size);
db_->tune_page(FLAGS_page_size);
db_->tune_map(256LL<<20);
if (sync) {
open_options |= kyotocabinet::PolyDB::OAUTOSYNC;
}
if (!db_->open(file_name, open_options)) {
fprintf(stderr, "open error: %s\n", db_->error().name());
}
}
void Write(bool sync, Order order, DBState state,
int num_entries, int value_size, int entries_per_batch) {
// Create new database if state == FRESH
if (state == FRESH) {
if (FLAGS_use_existing_db) {
message_ = "skipping (--use_existing_db is true)";
return;
}
delete db_;
db_ = NULL;
Open(sync);
Start(); // Do not count time taken to destroy/open
}
if (num_entries != num_) {
char msg[100];
snprintf(msg, sizeof(msg), "(%d ops)", num_entries);
message_ = msg;
}
// Write to database
for (int i = 0; i < num_entries; i++)
{
const int k = (order == SEQUENTIAL) ? i : (rand_.Next() % num_entries);
char key[100];
snprintf(key, sizeof(key), "%016d", k);
bytes_ += value_size + strlen(key);
std::string cpp_key = key;
if (!db_->set(cpp_key, gen_.Generate(value_size).ToString())) {
fprintf(stderr, "set error: %s\n", db_->error().name());
}
FinishedSingleOp();
}
}
void ReadSequential() {
kyotocabinet::DB::Cursor* cur = db_->cursor();
cur->jump();
std::string ckey, cvalue;
while (cur->get(&ckey, &cvalue, true)) {
bytes_ += ckey.size() + cvalue.size();
FinishedSingleOp();
}
delete cur;
}
void ReadRandom() {
std::string value;
for (int i = 0; i < reads_; i++) {
char key[100];
const int k = rand_.Next() % reads_;
snprintf(key, sizeof(key), "%016d", k);
db_->get(key, &value);
FinishedSingleOp();
}
}
};
} // namespace leveldb
int main(int argc, char** argv) {
std::string default_db_path;
for (int i = 1; i < argc; i++) {
double d;
int n;
char junk;
if (leveldb::Slice(argv[i]).starts_with("--benchmarks=")) {
FLAGS_benchmarks = argv[i] + strlen("--benchmarks=");
} else if (sscanf(argv[i], "--compression_ratio=%lf%c", &d, &junk) == 1) {
FLAGS_compression_ratio = d;
} else if (sscanf(argv[i], "--histogram=%d%c", &n, &junk) == 1 &&
(n == 0 || n == 1)) {
FLAGS_histogram = n;
} else if (sscanf(argv[i], "--num=%d%c", &n, &junk) == 1) {
FLAGS_num = n;
} else if (sscanf(argv[i], "--reads=%d%c", &n, &junk) == 1) {
FLAGS_reads = n;
} else if (sscanf(argv[i], "--value_size=%d%c", &n, &junk) == 1) {
FLAGS_value_size = n;
} else if (sscanf(argv[i], "--cache_size=%d%c", &n, &junk) == 1) {
FLAGS_cache_size = n;
} else if (sscanf(argv[i], "--page_size=%d%c", &n, &junk) == 1) {
FLAGS_page_size = n;
} else if (sscanf(argv[i], "--compression=%d%c", &n, &junk) == 1 &&
(n == 0 || n == 1)) {
FLAGS_compression = (n == 1) ? true : false;
} else if (strncmp(argv[i], "--db=", 5) == 0) {
FLAGS_db = argv[i] + 5;
} else {
fprintf(stderr, "Invalid flag '%s'\n", argv[i]);
exit(1);
}
}
// Choose a location for the test database if none given with --db=<path>
if (FLAGS_db == NULL) {
leveldb::Env::Default()->GetTestDirectory(&default_db_path);
default_db_path += "/dbbench";
FLAGS_db = default_db_path.c_str();
}
leveldb::Benchmark benchmark;
benchmark.Run();
return 0;
}
| {
"pile_set_name": "Github"
} |
# -*- coding: utf-8 -*-
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('recurring_payments', '0002_recurringpayment_platform'),
]
operations = [
migrations.AlterField(
model_name='recurringpayment',
name='object_content_type',
field=models.ForeignKey(on_delete=django.db.models.deletion.SET_NULL, blank=True, to='contenttypes.ContentType', null=True),
),
]
md5 binutils-linaro-2.25.0-2015.01-2.tar.xz 5c4b97c60f8bf624a34e2acef3138eec
sha1 binutils-linaro-2.25.0-2015.01-2.tar.xz 194f85bf029c10e889db4b8b299979e81a234829
sha256 binutils-linaro-2.25.0-2015.01-2.tar.xz aed2aef13926911923b47a71ee88dc0943d544718d91f8caee5fc48fd20ef3a7
sha512 binutils-linaro-2.25.0-2015.01-2.tar.xz 1fb74b2ba5c47f9d91664f10aa079cafab877a59c992aec1abd1f94754c1067f18f99f10d32be13416316081bb1c576e99fb34e6b1541a089d908896d43909e5
#!/usr/bin/env python3
import os
from shutil import copyfile
def create_gdbinit_file():
"""
Create and insert into a .gdbinit file the python code to set-up cbmc pretty-printers.
"""
print("Attempting to enable cbmc-specific pretty-printers.")
home_folder = os.path.expanduser("~")
if not home_folder:
        print("Could not determine a valid home folder, can't auto-configure .gdbinit.")
return
# This is the code that should be copied if you're applying the changes by hand.
gdb_directory = os.path.dirname(os.path.abspath(__file__))
code_block_start = "cbmc_printers_folder = "
code_block = \
[
"{0}'{1}'".format(code_block_start, gdb_directory),
"if os.path.exists(cbmc_printers_folder):",
" sys.path.insert(1, cbmc_printers_folder)",
" from pretty_printers import load_cbmc_printers",
" load_cbmc_printers()",
]
gdbinit_file = os.path.join(home_folder, ".gdbinit")
lines = []
imports = { "os", "sys" }
if os.path.exists(gdbinit_file):
with open(gdbinit_file, 'r') as file:
lines = [ line.rstrip() for line in file ]
line_no = 0
while line_no < len(lines):
if lines[line_no].startswith('import '):
imports.add(lines[line_no][len("import "):].strip())
lines.pop(line_no)
else:
if lines[line_no].startswith(code_block_start):
print(".gdbinit already contains our pretty printers, not changing it")
return
line_no += 1
while len(lines) != 0 and (lines[0] == "" or lines[0] == "python"):
lines.pop(0)
backup_file = os.path.join(home_folder, "backup.gdbinit")
if os.path.exists(backup_file):
print("backup.gdbinit file already exists. Type 'y' if you would like to overwrite it or any other key to exit.")
choice = input().lower()
if choice != 'y':
return
print("Backing up {0}".format(gdbinit_file))
copyfile(gdbinit_file, backup_file)
lines = [ "python" ] + list(map("import {}".format, sorted(imports))) + [ "", "" ] + code_block + [ "", "" ] + lines + [ "" ]
print("Adding pretty-print commands to {0}.".format(gdbinit_file))
try:
with open(gdbinit_file, 'w+') as file:
file.write('\n'.join(lines))
print("Commands added.")
    except Exception:
        print("Exception occurred writing to file. Please apply changes manually.")
if __name__ == "__main__":
create_gdbinit_file()
/****************************************************************************
Copyright (c) 2012-2013 cocos2d-x.org
http://www.cocos2d-x.org
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
****************************************************************************/
package org.cocos2dx.plugin;
import java.util.Hashtable;
import cn.uc.gamesdk.UCCallbackListener;
import cn.uc.gamesdk.UCGameSDK;
import cn.uc.gamesdk.UCGameSDKStatusCode;
import cn.uc.gamesdk.UCLogLevel;
import cn.uc.gamesdk.UCOrientation;
import cn.uc.gamesdk.info.GameParamInfo;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.content.res.Configuration;
public class UCWrapper {
private static boolean isInited = false;
public static void initSDK(Context ctx, Hashtable<String, String> cpInfo, boolean isDebug, UCCallbackListener<String> listener) {
if (isInited) {
return;
}
try {
final UCCallbackListener<String> curListener = listener;
UCGameSDK.defaultSDK().setLogoutNotifyListener(new UCCallbackListener<String>() {
@Override
public void callback(int statuscode, String data) {
switch (statuscode) {
case UCGameSDKStatusCode.SUCCESS:
mLogined = false;
break;
default:
break;
}
if (null != curListener) {
curListener.callback(statuscode, data);
}
}
});
String strCpId = cpInfo.get("UCCpID");
String strGameId = cpInfo.get("UCGameID");
String strServerId = cpInfo.get("UCServerID");
int cpId = Integer.parseInt(strCpId);
int gameId = Integer.parseInt(strGameId);
int serverId = Integer.parseInt(strServerId);
GameParamInfo gpi = new GameParamInfo();
gpi.setCpId(cpId);
gpi.setGameId(gameId);
gpi.setServerId(serverId);
if (isLandscape(ctx)) {
UCGameSDK.defaultSDK().setOrientation(UCOrientation.LANDSCAPE);
}
UCGameSDK.defaultSDK().initSDK(ctx, UCLogLevel.ERROR, isDebug, gpi, new UCCallbackListener<String>() {
@Override
public void callback(int code, String msg) {
System.out.println("msg:" + msg);
switch (code) {
case UCGameSDKStatusCode.SUCCESS:
isInited = true;
break;
case UCGameSDKStatusCode.INIT_FAIL:
default:
isInited = false;
break;
}
}
});
} catch (Exception e) {
isInited = false;
}
}
public static boolean SDKInited() {
return isInited;
}
public static boolean isLandscape(Context ctx)
{
Configuration config = ctx.getResources().getConfiguration();
int orientation = config.orientation;
if (orientation != Configuration.ORIENTATION_LANDSCAPE &&
orientation != Configuration.ORIENTATION_PORTRAIT)
{
orientation = Configuration.ORIENTATION_PORTRAIT;
}
return (orientation == Configuration.ORIENTATION_LANDSCAPE);
}
public static String getSDKVersion() {
return "2.3.4";
}
private static boolean mLogined = false;
private static boolean waitHandle = false;
public static void userLogin(Context ctx, UCCallbackListener<String> listener) {
try {
waitHandle = true;
final UCCallbackListener<String> curListener = listener;
UCGameSDK.defaultSDK().login(ctx, new UCCallbackListener<String>() {
@Override
public void callback(int code, String msg) {
if (! waitHandle) {
return;
}
switch(code) {
case UCGameSDKStatusCode.SUCCESS:
mLogined = true;
break;
default:
mLogined = false;
break;
}
waitHandle = false;
curListener.callback(code, msg);
}
});
} catch (Exception e) {
mLogined = false;
waitHandle = false;
listener.callback(UCGameSDKStatusCode.FAIL, "Login Failed");
}
}
public static void userLogout() {
try {
UCGameSDK.defaultSDK().logout();
mLogined = false;
} catch (Exception e) {
e.printStackTrace();
}
}
public static boolean isLogined() {
return mLogined;
}
}
# $Id: Makefile.in,v 1.73 2020/02/02 23:34:34 tom Exp $
##############################################################################
# Copyright 2018-2019,2020 Thomas E. Dickey #
# Copyright 1998-2016,2017 Free Software Foundation, Inc. #
# #
# Permission is hereby granted, free of charge, to any person obtaining a #
# copy of this software and associated documentation files (the "Software"), #
# to deal in the Software without restriction, including without limitation #
# the rights to use, copy, modify, merge, publish, distribute, distribute #
# with modifications, sublicense, and/or sell copies of the Software, and to #
# permit persons to whom the Software is furnished to do so, subject to the #
# following conditions: #
# #
# The above copyright notice and this permission notice shall be included in #
# all copies or substantial portions of the Software. #
# #
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR #
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, #
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL #
# THE ABOVE COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER #
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING #
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER #
# DEALINGS IN THE SOFTWARE. #
# #
# Except as contained in this notice, the name(s) of the above copyright #
# holders shall not be used in advertising or otherwise to promote the sale, #
# use or other dealings in this Software without prior written #
# authorization. #
##############################################################################
#
# Author: Thomas E. Dickey 1996-on
#
# Makefile for ncurses miscellany directory
#
# This makes/installs the terminfo database
#
# The variable 'srcdir' refers to the source-distribution, and can be set with
# the configure script by "--srcdir=DIR".
#
# The rules are organized to produce the libraries for the configured models,
# and the programs with the configured default model.
# turn off _all_ suffix rules; we'll generate our own
.SUFFIXES :
SHELL = @SHELL@
VPATH = @srcdir@
THIS = Makefile
CF_MFLAGS = @cf_cv_makeflags@
@SET_MAKE@
DESTDIR = @DESTDIR@
top_srcdir = @top_srcdir@
srcdir = @srcdir@
prefix = @prefix@
exec_prefix = @exec_prefix@
bindir = @bindir@
libdir = @libdir@
datarootdir = @datarootdir@
datadir = @datadir@
includesubdir = @includesubdir@
INCLUDEDIR = $(DESTDIR)$(includedir)$(includesubdir)
tabsetdir = $(datadir)/tabset
ticdir = @TERMINFO@
ticfile = $(ticdir).db
source = @TERMINFO_SRC@
INSTALL = @INSTALL@
INSTALL_PROGRAM = @INSTALL_PROGRAM@
INSTALL_SCRIPT = @INSTALL_SCRIPT@
INSTALL_DATA = @INSTALL_DATA@
PKG_CONFIG_LIBDIR = @PKG_CONFIG_LIBDIR@
################################################################################
all \
sources ::
@MAKE_DATABASE@all \
@MAKE_DATABASE@sources :: terminfo.tmp
depend :
@MAKE_DATABASE@install :: @MISC_INSTALL_DATA@
@[email protected] :: terminfo.tmp \
@MAKE_DATABASE@ $(DESTDIR)$(libdir) \
@MAKE_DATABASE@ $(DESTDIR)$(datadir) \
@MAKE_DATABASE@ $(DESTDIR)$(tabsetdir)
@MAKE_DATABASE@ DESTDIR=${DESTDIR} \
@MAKE_DATABASE@ prefix=${prefix} \
@MAKE_DATABASE@ exec_prefix=${exec_prefix} \
@MAKE_DATABASE@ bindir=${bindir} \
@MAKE_DATABASE@ top_srcdir=${top_srcdir} \
@MAKE_DATABASE@ srcdir=${srcdir} \
@MAKE_DATABASE@ datadir=${datadir} \
@MAKE_DATABASE@ ticdir=${ticdir} \
@MAKE_DATABASE@ source=terminfo.tmp \
@MAKE_DATABASE@ cross_compiling=@cross_compiling@ \
@MAKE_DATABASE@ $(SHELL) ./run_tic.sh
@MAKE_DATABASE@ @cd $(srcdir)/tabset && \
@MAKE_DATABASE@ $(SHELL) -c 'for i in * ; do \
@MAKE_DATABASE@ if test -f $$i ; then \
@MAKE_DATABASE@ echo installing $$i; \
@MAKE_DATABASE@ $(INSTALL_DATA) $$i $(DESTDIR)$(tabsetdir)/$$i; \
@MAKE_DATABASE@ fi; done'
install.data ::
@echo "finished $@"
NCURSES_CONFIG = ncurses@DFT_ARG_SUFFIX@@cf_cv_abi_version@@cf_config_suffix@-config
install \
install.libs :: $(DESTDIR)$(bindir) ncurses-config
$(INSTALL_SCRIPT) ncurses-config $(DESTDIR)$(bindir)/$(NCURSES_CONFIG)
# Make a list of the files that gen-pkgconfig might create:
@MAKE_PC_FILES@PC_FILES = \
@MAKE_PC_FILES@ @LIB_NAME@@USE_ARG_SUFFIX@@[email protected] \
@MAKE_PC_FILES@ @TINFO_ARG_SUFFIX@@[email protected] \
@MAKE_PC_FILES@ @TICS_NAME@@USE_ARG_SUFFIX@@[email protected] \
@MAKE_PC_FILES@ @PANEL_NAME@@USE_ARG_SUFFIX@@[email protected] \
@MAKE_PC_FILES@ @MENU_NAME@@USE_ARG_SUFFIX@@[email protected] \
@MAKE_PC_FILES@ @FORM_NAME@@USE_ARG_SUFFIX@@[email protected] \
@MAKE_PC_FILES@ @CXX_NAME@@USE_ARG_SUFFIX@@[email protected]
# some packagers prefer to be able to construct pc-files on servers where
# pkg-config is not installed. Work around that by creating the library
# directory during this rule:
@MAKE_PC_FILES@install \
@[email protected] :: pc-files
@MAKE_PC_FILES@ @$(SHELL) -c 'case "x$(DESTDIR)$(PKG_CONFIG_LIBDIR)" in \
@MAKE_PC_FILES@ (x/*) \
@MAKE_PC_FILES@ mkdir -p $(DESTDIR)$(PKG_CONFIG_LIBDIR); \
@MAKE_PC_FILES@ for name in $(PC_FILES); do \
@MAKE_PC_FILES@ test -f $$name || continue; \
@MAKE_PC_FILES@ echo installing $$name; \
@MAKE_PC_FILES@ $(INSTALL_DATA) $$name $(DESTDIR)$(PKG_CONFIG_LIBDIR)/$$name; \
@MAKE_PC_FILES@ done \
@MAKE_PC_FILES@ ;; \
@MAKE_PC_FILES@ (*) \
@MAKE_PC_FILES@ echo "...skip actual install: no destination was given" ; \
@MAKE_PC_FILES@ ;; \
@MAKE_PC_FILES@ esac'
@MAKE_PC_FILES@all \
@MAKE_PC_FILES@sources :: pc-files
@MAKE_PC_FILES@pc-files :
@MAKE_PC_FILES@ $(SHELL) ./gen-pkgconfig
@MAKE_PC_FILES@ -touch $@
terminfo.tmp : run_tic.sed $(source)
echo '** adjusting tabset paths'
sed -f run_tic.sed $(source) >terminfo.tmp
run_tic.sed :
WHICH_XTERM=@WHICH_XTERM@ \
XTERM_KBS=@XTERM_KBS@ \
datadir=${datadir} \
$(SHELL) $(srcdir)/gen_edit.sh >$@
$(DESTDIR)$(bindir) \
$(DESTDIR)$(libdir) \
$(DESTDIR)$(datadir) \
$(DESTDIR)$(tabsetdir) :
mkdir -p $@
uninstall : @MISC_UNINSTALL_DATA@ uninstall.libs
uninstall.data :
-test -d $(DESTDIR)$(tabsetdir) && rm -rf $(DESTDIR)$(tabsetdir)
-test -d $(DESTDIR)$(ticdir) && rm -rf $(DESTDIR)$(ticdir)
-test -f $(DESTDIR)$(ticfile) && rm -f $(DESTDIR)$(ticfile)
uninstall.libs :
-rm -f $(DESTDIR)$(bindir)/$(NCURSES_CONFIG)
@MAKE_PC_FILES@ @$(SHELL) -c 'case x$(DESTDIR)$(PKG_CONFIG_LIBDIR) in \
@MAKE_PC_FILES@ (x/*) \
@MAKE_PC_FILES@ for name in $(PC_FILES); do \
@MAKE_PC_FILES@ test -f $$name || continue; \
@MAKE_PC_FILES@ echo uninstalling $$name; \
@MAKE_PC_FILES@ rm -f $(DESTDIR)$(PKG_CONFIG_LIBDIR)/$$name; \
@MAKE_PC_FILES@ done \
@MAKE_PC_FILES@ ;; \
@MAKE_PC_FILES@ esac'
tags :
@MAKE_UPPER_TAGS@TAGS :
mostlyclean :
@MAKE_DATABASE@ -rm -f terminfo.tmp
@MAKE_DATABASE@ -rm -f run_tic.sed
@MAKE_PC_FILES@ -rm -f pc-files $(PC_FILES)
-rm -f core tags TAGS *~ *.bak *.ln *.atac trace
clean :: mostlyclean
distclean : clean
@MAKE_PC_FILES@ -rm -f gen-pkgconfig
-rm -f Makefile run_tic.sh ncurses-config
realclean : distclean
###############################################################################
# The remainder of this file is automatically generated during configuration
###############################################################################
# Input mapping
All keyboard and joystick events can be mapped to any Amiga keyboard and joystick action, overriding default settings (and the setup resulting from using the joystick_port_n options). A few examples of what you can do:
* Do you want the key ‘S’ on your keyboard to press the ‘R’ key on the Amiga keyboard? You can do this with custom input mapping.
* Do you want to create a setup where you can play Pinball Dreams with a game pad? You can map game pad buttons to the Amiga keyboard F1, F2, F3, F4, cursor down, left shift and right shift keys.
## General configuration

To map an action, you add a line like this to your configuration file:

```
devicename_eventname = actionname
```

Here is a specific example, mapping the keyboard key q to the fire button on the primary Amiga joystick (joy_1 is the joystick in joystick port 1):

```
keyboard_key_q = action_joy_1_fire_button
```
## Input Actions

Examples of input actions are:

* action_joy_1_fire_button (fire button on joystick in port 1)
* action_key_z (press Amiga keyboard key Z)
* action_drive_0_insert_floppy_0 (insert disk from floppy image list)

I have put the list of available actions on its own page due to the large number of actions.
## Mapping Joystick Events

Here are the different types of event names you can use. joystick_0 is the first joystick connected, joystick_1 is the second, etc. Similarly, you can use button_0 (first button), button_1 (second button) and so on:

```
joystick_0_button_0
joystick_0_axis_0_neg
joystick_0_axis_0_pos
joystick_0_hat_0_up
joystick_0_hat_0_down
joystick_0_hat_0_left
joystick_0_hat_0_right
```
For most game pads with a proper universal configuration, you can use universal event names instead and make the configuration work identically even with different types of game pads connected. The above configuration is not suited for this, because button numbers have generally no relation to the physical placement of the game pad buttons.
## Universal Event Names

With universal event names, you can map these events instead:

```
joystick_0_dpad_left
joystick_0_dpad_right
joystick_0_dpad_up
joystick_0_dpad_down
joystick_0_lstick_left
joystick_0_lstick_right
joystick_0_lstick_up
joystick_0_lstick_down
joystick_0_lstick_button
joystick_0_rstick_left
joystick_0_rstick_right
joystick_0_rstick_up
joystick_0_rstick_down
joystick_0_rstick_button
joystick_0_south_button
joystick_0_west_button
joystick_0_north_button
joystick_0_east_button
joystick_0_start_button
joystick_0_select_button
joystick_0_left_shoulder
joystick_0_right_shoulder
joystick_0_left_trigger
joystick_0_right_trigger
```
## Matching a Joystick Device

To match a specific joystick device, you can either (as in the above examples) match joysticks by number:

```
joystick_0
joystick_1
joystick_2
etc...
```
Or you can match devices by name:
```
logitech_dual_action_usb
```
The device name is the name of the device as it appears in your operating system, converted to lower case and all characters other than letters and numbers converted to underscores. Adjacent underscores are merged into one, and trailing underscores are stripped.
For instance, a device named:

```
Controller (Xbox 360 Wireless Receiver)
```

will be matched by:

```
controller_xbox_360_wireless_receiver
```
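As a rough illustration of the conversion rule described above, here is a small sketch in Python. This is not FS-UAE's actual implementation, and the function name is made up for illustration; it simply follows the stated rule (lower-case, non-alphanumeric runs become a single underscore, stray underscores stripped):

```python
import re

def normalize_device_name(name):
    # Lower-case the device name first.
    name = name.lower()
    # Replace every run of characters other than letters/digits with a
    # single underscore (this also merges adjacent underscores into one).
    name = re.sub(r"[^a-z0-9]+", "_", name)
    # The rule strips trailing underscores; stripping leading ones too is
    # an assumption made here for tidiness.
    return name.strip("_")

print(normalize_device_name("Controller (Xbox 360 Wireless Receiver)"))
# controller_xbox_360_wireless_receiver
```

Running the sketch on the example above reproduces the expected config name.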
If you have more than one game pad of the same model connected, you can match the other ones with:

```
logitech_dual_action_usb_2 (the second connected of this model)
logitech_dual_action_usb_3 (the third connected of this model)
etc...
```
## More Examples

If you want to play Pinball Dreams with the first connected game pad, instead of using keyboard controls, you might want to map something like this:

```
joystick_0_left_shoulder = action_key_shift_left
joystick_0_right_shoulder = action_key_shift_right
joystick_0_north_button = action_key_f1
joystick_0_dpad_down = action_key_cursor_down
```
You would map a few more actions along with these, such as the Amiga keys F2, F3, F4 and Space.
## Mapping keyboard keys
See [keyboard mapping](keyboard-mapping.md).
/*
* Copyright (c) 2000, 2012, Oracle and/or its affiliates. All rights reserved.
* DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
*
* This code is free software; you can redistribute it and/or modify it
* under the terms of the GNU General Public License version 2 only, as
* published by the Free Software Foundation.
*
* This code is distributed in the hope that it will be useful, but WITHOUT
* ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
* FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
* version 2 for more details (a copy is included in the LICENSE file that
* accompanied this code).
*
* You should have received a copy of the GNU General Public License version
* 2 along with this work; if not, write to the Free Software Foundation,
* Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
*
* Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
* or visit www.oracle.com if you need additional information or have any
* questions.
*
*/
#ifndef SHARE_VM_MEMORY_REFERENCEPOLICY_HPP
#define SHARE_VM_MEMORY_REFERENCEPOLICY_HPP
// referencePolicy is used to determine when soft reference objects
// should be cleared.
class ReferencePolicy : public CHeapObj<mtGC> {
public:
virtual bool should_clear_reference(oop p, jlong timestamp_clock) {
ShouldNotReachHere();
return true;
}
// Capture state (of-the-VM) information needed to evaluate the policy
virtual void setup() { /* do nothing */ }
};
class NeverClearPolicy : public ReferencePolicy {
public:
virtual bool should_clear_reference(oop p, jlong timestamp_clock) {
return false;
}
};
class AlwaysClearPolicy : public ReferencePolicy {
public:
virtual bool should_clear_reference(oop p, jlong timestamp_clock) {
return true;
}
};
class LRUCurrentHeapPolicy : public ReferencePolicy {
private:
jlong _max_interval;
public:
LRUCurrentHeapPolicy();
// Capture state (of-the-VM) information needed to evaluate the policy
void setup();
virtual bool should_clear_reference(oop p, jlong timestamp_clock);
};
class LRUMaxHeapPolicy : public ReferencePolicy {
private:
jlong _max_interval;
public:
LRUMaxHeapPolicy();
// Capture state (of-the-VM) information needed to evaluate the policy
void setup();
virtual bool should_clear_reference(oop p, jlong timestamp_clock);
};
#endif // SHARE_VM_MEMORY_REFERENCEPOLICY_HPP
using Server.Spells.First;
namespace Server.Items
{
public class FeebleWand : BaseWand
{
[Constructable]
public FeebleWand()
: base(WandEffect.Feeblemindedness, 5, 30)
{
}
public FeebleWand(Serial serial)
: base(serial)
{
}
public override void Serialize(GenericWriter writer)
{
base.Serialize(writer);
writer.Write(0); // version
}
public override void Deserialize(GenericReader reader)
{
base.Deserialize(reader);
int version = reader.ReadInt();
}
public override void OnWandUse(Mobile from)
{
Cast(new FeeblemindSpell(from, this));
}
}
}
##
# This file is part of WhatWeb and may be subject to
# redistribution and commercial restrictions. Please see the WhatWeb
# web site for more information on licensing and terms of use.
# http://www.morningstarsecurity.com/research/whatweb
##
Plugin.define "Mason" do
author "Brendan Coles <[email protected]>" # 2012-11-04
version "0.1"
description "Mason is a powerful Perl-based templating system for generating HTML or other dynamic content. - Homepage: http://www.masonhq.com/"
# ShodanHQ results as at 2012-11-04 #
# 27 for X-Powered-By: HTML::Mason
# Matches #
matches [
# Version Detection # HTTP Server Header
{ :search=>"headers[x-powered-by]", :regexp=>/HTML::Mason/ },
]
end
{
"created_at": "2015-02-27T22:28:00.583921",
"description": "An android application that can turns the screen off without putting device into sleep together with many HDMI / Remoting apis",
"fork": false,
"full_name": "nkahoang/screenstandby",
"language": "Java",
"updated_at": "2015-02-27T23:42:02.804457"
}
/*
Copyright (C) 2012 Matthew Fries
MF Gig Calendar is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
as published by the Free Software Foundation; either version 2
of the License, or (at your option) any later version.
MF Gig Calendar is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
=======================================
Calendar Layout
=======================================
CALENDAR OUTPUT FORMAT:
<ul id="cal">
<li class="event">
<div class="date">
<!-- ONLY OUTPUT FOR MULTI-DAY EVENTS -->
<div class="start-date">
<div class="weekday"></div>
<div class="day"></div>
<div class="month"></div>
<div class="year"></div>
</div>
<!-- ONLY OUTPUT FOR MULTI-DAY EVENTS -->
<div class="end-date">
<div class="weekday"></div>
<div class="day"></div>
<div class="month"></div>
<div class="year"></div>
</div>
</div>
<div class="info_block">
<h3>[Event Title]</h3>
<span class="time">[Event Time]</span>
<span class="location">[Event Location]</span>
<span class="details">[Event Details]</span>
</div>
</li>
</ul>
=======================================
*/
a.rss-link {
display:block;
height:16px;
float:right;
background:transparent url(../images/icon-rss.png) right top no-repeat;
padding-right:20px;
line-height:16px;
}
#cal_nav {
font-size:.9em;
margin-bottom: 1.5em;
}
ul#cal {
list-style: none;
margin: 1em 0 1.5em 0;
padding: 0;
}
#cal li.event {
clear: both;
padding-top: 1.5em;
}
#cal .date {
white-space:nowrap;
float:left;
padding-bottom:1.5em;
}
#cal .start-date {
float:left;
text-align:right;
width:3em;
padding-right: .5em;
background:transparent url(../images/hyphen.png) right 2em no-repeat;
overflow:hidden;
}
#cal .end-date {
text-align:left;
width:3em;
margin-left:3.9em;
overflow:hidden;
}
#cal .year {
font-size:.8em;
line-height:1em;
letter-spacing: .1em;
}
#cal .month {
text-transform:uppercase;
font-size:.9em;
line-height:1em;
padding:.4em 0 .5em 0;
}
#cal .day {
font-size:1.6em;
font-weight:bold;
line-height:1em;
margin-bottom:-.2em;
}
#cal .info_block {
padding:0 0 1em 0;
margin:0 0 0 7.5em;
}
#cal .info_block h3 {
clear:none;
margin-top:0;
padding-top:0;
}
#cal .weekday {
font-size:.9em;
text-transform:uppercase;
}
#cal .time {
font-weight:bold;
}