| Column | Type | Values |
|--------------|---------------|-------------|
| commit | stringlengths | 40 – 40 |
| old_file | stringlengths | 4 – 237 |
| new_file | stringlengths | 4 – 237 |
| old_contents | stringlengths | 1 – 4.24k |
| new_contents | stringlengths | 5 – 4.84k |
| subject | stringlengths | 15 – 778 |
| message | stringlengths | 16 – 6.86k |
| lang | stringlengths | 1 – 30 |
| license | stringclasses | 13 values |
| repos | stringlengths | 5 – 116k |
| config | stringlengths | 1 – 30 |
| content | stringlengths | 105 – 8.72k |
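Assuming each row follows the schema above (a commit-pair record as exported by the dataset viewer), a minimal Python sketch of checking one record against the declared length bounds. The `SCHEMA` dict and `validate` helper are illustrative, not part of the dataset; the sample values come from the first row below.

```python
# Minimal sketch: validate one record against the declared schema bounds.
# SCHEMA mirrors a subset of the column table above; the sample record
# uses values from the first row in this dump.

SCHEMA = {
    "commit": (40, 40),
    "old_file": (4, 237),
    "new_file": (4, 237),
    "subject": (15, 778),
    "lang": (1, 30),
}

def validate(record: dict) -> list:
    """Return (field, length) pairs whose length falls outside the bounds."""
    violations = []
    for field, (lo, hi) in SCHEMA.items():
        n = len(record.get(field, ""))
        if not (lo <= n <= hi):
            violations.append((field, n))
    return violations

record = {
    "commit": "385c09568b9eec678e0f94af93d6484541fa2e19",
    "old_file": "README.md",
    "new_file": "README.md",
    "subject": "Update readme with test info",
    "lang": "Markdown",
}

assert validate(record) == []  # the sample row satisfies the schema
```

In practice the full dataset would be loaded with a dataset library rather than inspected row by row; this sketch only exercises the length constraints visible in the header.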
385c09568b9eec678e0f94af93d6484541fa2e19
README.md
README.md
Tells you if the string you pass in is a built-in node module ## Install ``` npm install isbuiltin --save ``` ## Usage ``` var isBuiltIn = require('isbuiltin); isBuiltIn('console'); => true isBuiltIn('woosh'); => false ``` ## License MIT © [Deepak Kapoor](http://kapoor.io)
Tells you if the string you pass in is a built-in node module ## Install ``` npm install isbuiltin --save ``` ## Usage ``` var isBuiltIn = require('isbuiltin); isBuiltIn('console'); => true isBuiltIn('woosh'); => false ``` ## Run Tests ``` npm install && npm test ``` ## License MIT © [Deepak Kapoor](http://kapoor.io)
Update readme with test info
Update readme with test info
Markdown
mit
deepak-kapoor/isbuiltin
markdown
## Code Before: Tells you if the string you pass in is a built-in node module ## Install ``` npm install isbuiltin --save ``` ## Usage ``` var isBuiltIn = require('isbuiltin); isBuiltIn('console'); => true isBuiltIn('woosh'); => false ``` ## License MIT © [Deepak Kapoor](http://kapoor.io) ## Instruction: Update readme with test info ## Code After: Tells you if the string you pass in is a built-in node module ## Install ``` npm install isbuiltin --save ``` ## Usage ``` var isBuiltIn = require('isbuiltin); isBuiltIn('console'); => true isBuiltIn('woosh'); => false ``` ## Run Tests ``` npm install && npm test ``` ## License MIT © [Deepak Kapoor](http://kapoor.io)
ef03ea164f2e4f6af770f7024188ee3f2f0d1829
devtools/platform-descriptor-json/src/main/resources/codestarts/quarkus/core/project/quarkus/codestart.yml
devtools/platform-descriptor-json/src/main/resources/codestarts/quarkus/core/project/quarkus/codestart.yml
name: quarkus type: project fallback: true output-strategy: "pom.xml": smart-pom-merge "README.md": append "readme.md": forbidden "README.adoc": forbidden "readme.adoc": forbidden ".gitignore": append "src/native-test/*": native-test-dir-resolve "src/main/resources/application.yml": smart-config-merge "src/main/resources/application.properties": forbidden "src/test/resources/application.yml": smart-config-merge "src/test/resources/application.properties": forbidden "*": fail-on-duplicate language: base: shared-data: supports: dev-mode: true native: true project: group-id: org.acme artifact-id: quarkus-project
name: quarkus type: project fallback: true output-strategy: "pom.xml": smart-pom-merge "README.md": append "readme.md": forbidden "README.adoc": forbidden "readme.adoc": forbidden ".gitignore": append "src/native-test/*": native-test-dir-resolve "src/main/resources/application.yml": smart-config-merge "src/main/resources/application.properties": forbidden "src/test/resources/application.yml": smart-config-merge "src/test/resources/application.properties": forbidden "*": fail-on-duplicate language: base: shared-data: supports: dev-mode: true native: true project: group-id: org.acme artifact-id: quarkus-project dependencies: - io.quarkus:quarkus-arc
Add quarkus-arc as a project dependency in Quarkus Codestarts
Add quarkus-arc as a project dependency in Quarkus Codestarts
YAML
apache-2.0
quarkusio/quarkus,quarkusio/quarkus,quarkusio/quarkus,quarkusio/quarkus,quarkusio/quarkus
yaml
## Code Before: name: quarkus type: project fallback: true output-strategy: "pom.xml": smart-pom-merge "README.md": append "readme.md": forbidden "README.adoc": forbidden "readme.adoc": forbidden ".gitignore": append "src/native-test/*": native-test-dir-resolve "src/main/resources/application.yml": smart-config-merge "src/main/resources/application.properties": forbidden "src/test/resources/application.yml": smart-config-merge "src/test/resources/application.properties": forbidden "*": fail-on-duplicate language: base: shared-data: supports: dev-mode: true native: true project: group-id: org.acme artifact-id: quarkus-project ## Instruction: Add quarkus-arc as a project dependency in Quarkus Codestarts ## Code After: name: quarkus type: project fallback: true output-strategy: "pom.xml": smart-pom-merge "README.md": append "readme.md": forbidden "README.adoc": forbidden "readme.adoc": forbidden ".gitignore": append "src/native-test/*": native-test-dir-resolve "src/main/resources/application.yml": smart-config-merge "src/main/resources/application.properties": forbidden "src/test/resources/application.yml": smart-config-merge "src/test/resources/application.properties": forbidden "*": fail-on-duplicate language: base: shared-data: supports: dev-mode: true native: true project: group-id: org.acme artifact-id: quarkus-project dependencies: - io.quarkus:quarkus-arc
7627171580ebbff16f876bca86692d10e1055930
.travis.yml
.travis.yml
language: sh notifications: email: false slack: secure: dNE7nOHDxyrwcgPfcDJB6WZeoYCYx/ZGI4rOJsSxbxK//0uUjZ1ECwUW4wN0+sQkYUqm0xKku9ZUHFCVDr8R572srM+7cJ8vUcEy2m3cQ+aHkZF6mvbiyncEB/GbPQto8z9n5FwwHDOneBGvk/+nv0AM0hkShbAUQxNNxFFy5kE= env: - HALCYON_SANDBOX_EXTRA_APPS="alex-3.1.4 happy-1.19.5" before_install: - sudo mkdir /app - sudo chown $USER:$USER /app install: - git clone https://github.com/mietek/halcyon.git /app/halcyon - /app/halcyon/halcyon build before_script: - script/setup - /app/halcyon/halcyon paths > halcyon-env script: - source halcyon-env - script/test addons: postgresql: '9.3'
language: sh notifications: email: false slack: secure: dNE7nOHDxyrwcgPfcDJB6WZeoYCYx/ZGI4rOJsSxbxK//0uUjZ1ECwUW4wN0+sQkYUqm0xKku9ZUHFCVDr8R572srM+7cJ8vUcEy2m3cQ+aHkZF6mvbiyncEB/GbPQto8z9n5FwwHDOneBGvk/+nv0AM0hkShbAUQxNNxFFy5kE= env: - HALCYON_SANDBOX_EXTRA_APPS="alex-3.1.4 happy-1.19.5 yesod-bin" before_install: - sudo mkdir /app - sudo chown $USER:$USER /app install: - git clone https://github.com/mietek/halcyon.git /app/halcyon - /app/halcyon/halcyon build before_script: - script/setup - /app/halcyon/halcyon paths > halcyon-env script: - source halcyon-env - script/test addons: postgresql: '9.3'
Install yesod-bin as an extra app too
Install yesod-bin as an extra app too
YAML
mit
jspahrsummers/ScrumBut
yaml
## Code Before: language: sh notifications: email: false slack: secure: dNE7nOHDxyrwcgPfcDJB6WZeoYCYx/ZGI4rOJsSxbxK//0uUjZ1ECwUW4wN0+sQkYUqm0xKku9ZUHFCVDr8R572srM+7cJ8vUcEy2m3cQ+aHkZF6mvbiyncEB/GbPQto8z9n5FwwHDOneBGvk/+nv0AM0hkShbAUQxNNxFFy5kE= env: - HALCYON_SANDBOX_EXTRA_APPS="alex-3.1.4 happy-1.19.5" before_install: - sudo mkdir /app - sudo chown $USER:$USER /app install: - git clone https://github.com/mietek/halcyon.git /app/halcyon - /app/halcyon/halcyon build before_script: - script/setup - /app/halcyon/halcyon paths > halcyon-env script: - source halcyon-env - script/test addons: postgresql: '9.3' ## Instruction: Install yesod-bin as an extra app too ## Code After: language: sh notifications: email: false slack: secure: dNE7nOHDxyrwcgPfcDJB6WZeoYCYx/ZGI4rOJsSxbxK//0uUjZ1ECwUW4wN0+sQkYUqm0xKku9ZUHFCVDr8R572srM+7cJ8vUcEy2m3cQ+aHkZF6mvbiyncEB/GbPQto8z9n5FwwHDOneBGvk/+nv0AM0hkShbAUQxNNxFFy5kE= env: - HALCYON_SANDBOX_EXTRA_APPS="alex-3.1.4 happy-1.19.5 yesod-bin" before_install: - sudo mkdir /app - sudo chown $USER:$USER /app install: - git clone https://github.com/mietek/halcyon.git /app/halcyon - /app/halcyon/halcyon build before_script: - script/setup - /app/halcyon/halcyon paths > halcyon-env script: - source halcyon-env - script/test addons: postgresql: '9.3'
8033b64ca88720ddbc7394843a3375613cc5b0e1
autogen.sh
autogen.sh
AC_VERSION= AUTOMAKE=${AUTOMAKE:-automake} AM_INSTALLED_VERSION=$($AUTOMAKE --version | sed -e '2,$ d' -e 's/.* \([0-9]*\.[0-9]*\).*/\1/') if [ "$AM_INSTALLED_VERSION" != "1.10" \ -a "$AM_INSTALLED_VERSION" != "1.11" ];then echo echo "You must have automake > 1.10 or 1.11 installed to compile lxmenu-data." echo "Install the appropriate package for your distribution," echo "or get the source tarball at http://ftp.gnu.org/gnu/automake/" exit 1 fi set -x if [ "x${ACLOCAL_DIR}" != "x" ]; then ACLOCAL_ARG=-I ${ACLOCAL_DIR} fi ${ACLOCAL:-aclocal$AM_VERSION} ${ACLOCAL_ARG} AUTOMAKE=$AUTOMAKE libtoolize -c --automake --force AUTOMAKE=$AUTOMAKE intltoolize -c --automake --force $AUTOMAKE --add-missing --copy --include-deps ${AUTOCONF:-autoconf$AC_VERSION} rm -rf autom4te.cache
AC_VERSION= AUTOMAKE=${AUTOMAKE:-automake} AM_INSTALLED_VERSION=$($AUTOMAKE --version | sed -e '2,$ d' -e 's/.* \([0-9]*\.[0-9]*\).*/\1/') if [ $AM_INSTALLED_VERSION < 1.10 ];then echo echo "You must have automake > 1.10 installed to compile lxmenu-data." echo "Install the appropriate package for your distribution," echo "or get the source tarball at http://ftp.gnu.org/gnu/automake/" exit 1 fi set -x if [ "x${ACLOCAL_DIR}" != "x" ]; then ACLOCAL_ARG=-I ${ACLOCAL_DIR} fi ${ACLOCAL:-aclocal$AM_VERSION} ${ACLOCAL_ARG} AUTOMAKE=$AUTOMAKE libtoolize -c --automake --force AUTOMAKE=$AUTOMAKE intltoolize -c --automake --force $AUTOMAKE --add-missing --copy --include-deps ${AUTOCONF:-autoconf$AC_VERSION} rm -rf autom4te.cache
Fix build with automake > 1.11
Fix build with automake > 1.11
Shell
lgpl-2.1
lxde/lxmenu-data,smart2128/lxmenu-data,lxde/lxmenu-data,smart2128/lxmenu-data
shell
## Code Before: AC_VERSION= AUTOMAKE=${AUTOMAKE:-automake} AM_INSTALLED_VERSION=$($AUTOMAKE --version | sed -e '2,$ d' -e 's/.* \([0-9]*\.[0-9]*\).*/\1/') if [ "$AM_INSTALLED_VERSION" != "1.10" \ -a "$AM_INSTALLED_VERSION" != "1.11" ];then echo echo "You must have automake > 1.10 or 1.11 installed to compile lxmenu-data." echo "Install the appropriate package for your distribution," echo "or get the source tarball at http://ftp.gnu.org/gnu/automake/" exit 1 fi set -x if [ "x${ACLOCAL_DIR}" != "x" ]; then ACLOCAL_ARG=-I ${ACLOCAL_DIR} fi ${ACLOCAL:-aclocal$AM_VERSION} ${ACLOCAL_ARG} AUTOMAKE=$AUTOMAKE libtoolize -c --automake --force AUTOMAKE=$AUTOMAKE intltoolize -c --automake --force $AUTOMAKE --add-missing --copy --include-deps ${AUTOCONF:-autoconf$AC_VERSION} rm -rf autom4te.cache ## Instruction: Fix build with automake > 1.11 ## Code After: AC_VERSION= AUTOMAKE=${AUTOMAKE:-automake} AM_INSTALLED_VERSION=$($AUTOMAKE --version | sed -e '2,$ d' -e 's/.* \([0-9]*\.[0-9]*\).*/\1/') if [ $AM_INSTALLED_VERSION < 1.10 ];then echo echo "You must have automake > 1.10 installed to compile lxmenu-data." echo "Install the appropriate package for your distribution," echo "or get the source tarball at http://ftp.gnu.org/gnu/automake/" exit 1 fi set -x if [ "x${ACLOCAL_DIR}" != "x" ]; then ACLOCAL_ARG=-I ${ACLOCAL_DIR} fi ${ACLOCAL:-aclocal$AM_VERSION} ${ACLOCAL_ARG} AUTOMAKE=$AUTOMAKE libtoolize -c --automake --force AUTOMAKE=$AUTOMAKE intltoolize -c --automake --force $AUTOMAKE --add-missing --copy --include-deps ${AUTOCONF:-autoconf$AC_VERSION} rm -rf autom4te.cache
f23eee6f5fa754c5553b17d95ac2ec3515e7ff01
server/test/controllers/user-controller.js
server/test/controllers/user-controller.js
const should = require('should'); const sinon = require('sinon'); require('should-sinon'); const proxyquire = require('proxyquire'); const httpMocks = require('node-mocks-http'); // Mock Service const mockObj = { getAllUsers() { return Promise.resolve(['item 1', 'item 2']); } }; const userController = proxyquire('../../app/controllers/user-controller', { '../services/user-service': mockObj }); describe('User controller tests', function () { it('should not fail and send a JSON parsed data', async function () { // Given const res = httpMocks.createResponse(); const req = httpMocks.createRequest({ method: 'GET', url: '/user/', params: {} }); // When await userController.getAllUsers(req, res); // Then res._isEndCalled().should.be.true(); res._isJSON().should.be.true(); res._getData().should.be.equal('["item 1","item 2"]'); }); it('should fail correctly if there is a problem in service', async function () { // Given const res = httpMocks.createResponse(); const req = httpMocks.createRequest({ method: 'GET', url: '/user/', params: {} }); const serviceStub = sinon.stub(mockObj, 'getAllUsers'); serviceStub.throws(); // When await userController.getAllUsers(req, res); // Then res._isEndCalled().should.be.true(); res.statusCode.should.be.equal(500); serviceStub.restore(); }); });
const should = require('should'); const sinon = require('sinon'); require('should-sinon'); const proxyquire = require('proxyquire'); const httpMocks = require('node-mocks-http'); // Mock Service const mockObj = { getAllUsers() { return Promise.resolve(['item 1', 'item 2']); } }; const userController = proxyquire('../../app/controllers/user-controller', { '../services/user-service': mockObj }); describe('User controller tests', function () { it('should not fail and send a JSON parsed data', async function () { // Given const res = httpMocks.createResponse(); const req = httpMocks.createRequest({ method: 'GET', url: '/user/', params: {} }); // When await userController.getAllUsers(req, res); // Then res._isEndCalled().should.be.true(); res._isJSON().should.be.true(); res._getData().should.be.equal('["item 1","item 2"]'); }); });
Remove test for fail case (handled by express)
[tests] Remove test for fail case (handled by express)
JavaScript
apache-2.0
K-Fet/K-App,K-Fet/K-App,K-Fet/K-App,K-Fet/K-App
javascript
## Code Before: const should = require('should'); const sinon = require('sinon'); require('should-sinon'); const proxyquire = require('proxyquire'); const httpMocks = require('node-mocks-http'); // Mock Service const mockObj = { getAllUsers() { return Promise.resolve(['item 1', 'item 2']); } }; const userController = proxyquire('../../app/controllers/user-controller', { '../services/user-service': mockObj }); describe('User controller tests', function () { it('should not fail and send a JSON parsed data', async function () { // Given const res = httpMocks.createResponse(); const req = httpMocks.createRequest({ method: 'GET', url: '/user/', params: {} }); // When await userController.getAllUsers(req, res); // Then res._isEndCalled().should.be.true(); res._isJSON().should.be.true(); res._getData().should.be.equal('["item 1","item 2"]'); }); it('should fail correctly if there is a problem in service', async function () { // Given const res = httpMocks.createResponse(); const req = httpMocks.createRequest({ method: 'GET', url: '/user/', params: {} }); const serviceStub = sinon.stub(mockObj, 'getAllUsers'); serviceStub.throws(); // When await userController.getAllUsers(req, res); // Then res._isEndCalled().should.be.true(); res.statusCode.should.be.equal(500); serviceStub.restore(); }); }); ## Instruction: [tests] Remove test for fail case (handled by express) ## Code After: const should = require('should'); const sinon = require('sinon'); require('should-sinon'); const proxyquire = require('proxyquire'); const httpMocks = require('node-mocks-http'); // Mock Service const mockObj = { getAllUsers() { return Promise.resolve(['item 1', 'item 2']); } }; const userController = proxyquire('../../app/controllers/user-controller', { '../services/user-service': mockObj }); describe('User controller tests', function () { it('should not fail and send a JSON parsed data', async function () { // Given const res = httpMocks.createResponse(); const req = httpMocks.createRequest({ method: 'GET', url: '/user/', params: {} }); // When await userController.getAllUsers(req, res); // Then res._isEndCalled().should.be.true(); res._isJSON().should.be.true(); res._getData().should.be.equal('["item 1","item 2"]'); }); });
dc9cc4021609ba1f5d4654d3aafc587807941d81
.travis.yml
.travis.yml
language: clojure jdk: oraclejdk10 lein: 2.8.1 script: - lein doo phantom test once - lein cljsbuild once release - lein less once
language: clojure jdk: openjdk10 lein: 2.8.1 before_install: - rm "${JAVA_HOME}/lib/security/cacerts" - ln -s /etc/ssl/certs/java/cacerts "${JAVA_HOME}/lib/security/cacerts" script: - lein doo phantom test once - lein cljsbuild once release - lein less once
Switch to openjdk10 since oraclejdk10 is deprecated
Switch to openjdk10 since oraclejdk10 is deprecated
YAML
apache-2.0
asciinema/asciinema-player,asciinema/asciinema-player
yaml
## Code Before: language: clojure jdk: oraclejdk10 lein: 2.8.1 script: - lein doo phantom test once - lein cljsbuild once release - lein less once ## Instruction: Switch to openjdk10 since oraclejdk10 is deprecated ## Code After: language: clojure jdk: openjdk10 lein: 2.8.1 before_install: - rm "${JAVA_HOME}/lib/security/cacerts" - ln -s /etc/ssl/certs/java/cacerts "${JAVA_HOME}/lib/security/cacerts" script: - lein doo phantom test once - lein cljsbuild once release - lein less once
3e5235b6114772606a59e3c3165b2b0f5fa77542
mudlib/mud/home/Help/initd.c
mudlib/mud/home/Help/initd.c
inherit LIB_INITD; inherit UTILITY_COMPILE; static void create() { load_dir("obj", 1); load_dir("sys", 1); } void full_reset() { object turkeylist; object cursor; object first; object this; ACCESS_CHECK(PRIVILEGED()); turkeylist = new_object(BIGSTRUCT_DEQUE_LWO); this = this_object(); cursor = KERNELD->first_link("Help"); first = cursor; do { LOGD->post_message("help", LOG_DEBUG, "Listing " + object_name(cursor)); turkeylist->push_back(cursor); cursor = KERNELD->next_link(cursor); } while (cursor != first); while (!turkeylist->empty()) { object turkey; turkey = turkeylist->get_front(); turkeylist->pop_front(); if (!turkey || turkey == this) { /* don't self destruct or double destruct */ continue; } destruct_object(turkey); } load_dir("obj", 1); load_dir("sys", 1); }
inherit LIB_INITD; inherit UTILITY_COMPILE; static void create() { load_dir("obj", 1); load_dir("sys", 1); } void full_reset() { object turkeylist; object cursor; object first; object this; ACCESS_CHECK(PRIVILEGED()); turkeylist = new_object(BIGSTRUCT_DEQUE_LWO); this = this_object(); cursor = KERNELD->first_link("Help"); first = cursor; do { turkeylist->push_back(cursor); cursor = KERNELD->next_link(cursor); } while (cursor != first); while (!turkeylist->empty()) { object turkey; turkey = turkeylist->get_front(); turkeylist->pop_front(); if (!turkey || turkey == this) { /* don't self destruct or double destruct */ continue; } destruct_object(turkey); } load_dir("obj", 1); load_dir("sys", 1); }
Remove now needless debug spam
Remove now needless debug spam
C
agpl-3.0
shentino/kotaka,shentino/kotaka,shentino/kotaka
c
## Code Before: inherit LIB_INITD; inherit UTILITY_COMPILE; static void create() { load_dir("obj", 1); load_dir("sys", 1); } void full_reset() { object turkeylist; object cursor; object first; object this; ACCESS_CHECK(PRIVILEGED()); turkeylist = new_object(BIGSTRUCT_DEQUE_LWO); this = this_object(); cursor = KERNELD->first_link("Help"); first = cursor; do { LOGD->post_message("help", LOG_DEBUG, "Listing " + object_name(cursor)); turkeylist->push_back(cursor); cursor = KERNELD->next_link(cursor); } while (cursor != first); while (!turkeylist->empty()) { object turkey; turkey = turkeylist->get_front(); turkeylist->pop_front(); if (!turkey || turkey == this) { /* don't self destruct or double destruct */ continue; } destruct_object(turkey); } load_dir("obj", 1); load_dir("sys", 1); } ## Instruction: Remove now needless debug spam ## Code After: inherit LIB_INITD; inherit UTILITY_COMPILE; static void create() { load_dir("obj", 1); load_dir("sys", 1); } void full_reset() { object turkeylist; object cursor; object first; object this; ACCESS_CHECK(PRIVILEGED()); turkeylist = new_object(BIGSTRUCT_DEQUE_LWO); this = this_object(); cursor = KERNELD->first_link("Help"); first = cursor; do { turkeylist->push_back(cursor); cursor = KERNELD->next_link(cursor); } while (cursor != first); while (!turkeylist->empty()) { object turkey; turkey = turkeylist->get_front(); turkeylist->pop_front(); if (!turkey || turkey == this) { /* don't self destruct or double destruct */ continue; } destruct_object(turkey); } load_dir("obj", 1); load_dir("sys", 1); }
6fa02c0a03b148bf6e6e0d298d1e392ba313e044
src/Oro/Bundle/FormBundle/Resources/public/blank/scss/variables/input-config.scss
src/Oro/Bundle/FormBundle/Resources/public/blank/scss/variables/input-config.scss
/* @theme: blank; */ $use-base-style-for-input: true !default; // Default $input-width-short: 64px !default; $input-padding: 8px 9px !default; $input-date-height: 35px !default; $input-number-box-shadow: none !default; $input-font-size: $base-ui-element-font-size !default; $input-font-family: $base-ui-element-font-family !default; $input-border: $base-ui-element-border !default; $input-border-radius: $base-ui-element-border-radius !default; $input-background-color: $base-ui-element-bg-color !default; $input-color: $base-ui-element-color !default; // Hover $input-border-color-hover-state: get-color('additional', 'dark') !default; // Focus $input-border-color-focus-state: $input-border-color-hover-state !default; // Error $input-border-color-error-state: get-color('ui', 'error-dark') !default;
/* @theme: blank; */ $use-base-style-for-input: true !default; // Default $input-width-short: 64px !default; $input-padding: 8px 9px !default; $input-date-height: 35px !default; $input-number-box-shadow: none !default; $input-font-size: $base-ui-element-font-size !default; $input-font-family: $base-ui-element-font-family !default; $input-line-height: normal !default; $input-border: $base-ui-element-border !default; $input-border-radius: $base-ui-element-border-radius !default; $input-background-color: $base-ui-element-bg-color !default; $input-color: $base-ui-element-color !default; // Hover $input-border-color-hover-state: get-color('additional', 'dark') !default; // Focus $input-border-color-focus-state: $input-border-color-hover-state !default; // Error $input-border-color-error-state: get-color('ui', 'error-dark') !default;
Customize Bootstrap for frontend (reuse current settings) - small fixes and clean up
BB-14806: Customize Bootstrap for frontend (reuse current settings) - small fixes and clean up
SCSS
mit
orocrm/platform,orocrm/platform,orocrm/platform
scss
## Code Before: /* @theme: blank; */ $use-base-style-for-input: true !default; // Default $input-width-short: 64px !default; $input-padding: 8px 9px !default; $input-date-height: 35px !default; $input-number-box-shadow: none !default; $input-font-size: $base-ui-element-font-size !default; $input-font-family: $base-ui-element-font-family !default; $input-border: $base-ui-element-border !default; $input-border-radius: $base-ui-element-border-radius !default; $input-background-color: $base-ui-element-bg-color !default; $input-color: $base-ui-element-color !default; // Hover $input-border-color-hover-state: get-color('additional', 'dark') !default; // Focus $input-border-color-focus-state: $input-border-color-hover-state !default; // Error $input-border-color-error-state: get-color('ui', 'error-dark') !default; ## Instruction: BB-14806: Customize Bootstrap for frontend (reuse current settings) - small fixes and clean up ## Code After: /* @theme: blank; */ $use-base-style-for-input: true !default; // Default $input-width-short: 64px !default; $input-padding: 8px 9px !default; $input-date-height: 35px !default; $input-number-box-shadow: none !default; $input-font-size: $base-ui-element-font-size !default; $input-font-family: $base-ui-element-font-family !default; $input-line-height: normal !default; $input-border: $base-ui-element-border !default; $input-border-radius: $base-ui-element-border-radius !default; $input-background-color: $base-ui-element-bg-color !default; $input-color: $base-ui-element-color !default; // Hover $input-border-color-hover-state: get-color('additional', 'dark') !default; // Focus $input-border-color-focus-state: $input-border-color-hover-state !default; // Error $input-border-color-error-state: get-color('ui', 'error-dark') !default;
3a4d8a11b8b3581f65f304487d7bbd3aea64e3c7
test/test_helper.rb
test/test_helper.rb
require "protest" require "test/unit/assertions" Protest.report_with(:documentation) module TestHelpers class IORecorder attr_reader :messages def initialize @messages = [] end def puts(msg=nil) @messages << msg end def print(msg=nil) @messages << msg end end def silent_report(type=:progress) Protest.report(type, IORecorder.new) end def mock_test_case(&block) test_case = Protest.describe(name, &block) test_case.description = "" nested_contexts = Protest.send(:available_test_cases).select {|t| t < test_case } report = silent_report [test_case, *nested_contexts].each do |test_case| test_case.run(Protest::Runner.new(report)) end report end end class Protest::TestCase include TestHelpers end
require "protest" require "test/unit/assertions" Protest.report_with(:documentation) module TestHelpers class IORecorder attr_reader :messages def initialize @messages = [] end def puts(msg=nil) @messages << msg end def print(msg=nil) @messages << msg end end def silent_report(type=:progress) Protest.report(type, IORecorder.new) end def mock_test_case(report=:progress, &block) test_case = Protest.describe(name, &block) test_case.description = "" nested_contexts = Protest.send(:available_test_cases).select {|t| t < test_case } report = silent_report(report) Protest::Runner.new(report).run(*[test_case, *nested_contexts]) # [test_case, *nested_contexts].each do |test_case| # test_case.run(Protest::Runner.new(report)) # end report end end class Protest::TestCase include TestHelpers end
Allow mock test cases to be run with a report other than :progress
Allow mock test cases to be run with a report other than :progress
Ruby
mit
matflores/protest
ruby
## Code Before: require "protest" require "test/unit/assertions" Protest.report_with(:documentation) module TestHelpers class IORecorder attr_reader :messages def initialize @messages = [] end def puts(msg=nil) @messages << msg end def print(msg=nil) @messages << msg end end def silent_report(type=:progress) Protest.report(type, IORecorder.new) end def mock_test_case(&block) test_case = Protest.describe(name, &block) test_case.description = "" nested_contexts = Protest.send(:available_test_cases).select {|t| t < test_case } report = silent_report [test_case, *nested_contexts].each do |test_case| test_case.run(Protest::Runner.new(report)) end report end end class Protest::TestCase include TestHelpers end ## Instruction: Allow mock test cases to be run with a report other than :progress ## Code After: require "protest" require "test/unit/assertions" Protest.report_with(:documentation) module TestHelpers class IORecorder attr_reader :messages def initialize @messages = [] end def puts(msg=nil) @messages << msg end def print(msg=nil) @messages << msg end end def silent_report(type=:progress) Protest.report(type, IORecorder.new) end def mock_test_case(report=:progress, &block) test_case = Protest.describe(name, &block) test_case.description = "" nested_contexts = Protest.send(:available_test_cases).select {|t| t < test_case } report = silent_report(report) Protest::Runner.new(report).run(*[test_case, *nested_contexts]) # [test_case, *nested_contexts].each do |test_case| # test_case.run(Protest::Runner.new(report)) # end report end end class Protest::TestCase include TestHelpers end
0189b6cf1a07db9781fa6e1c5fc5f67ddef80351
zsh/init/history.zsh
zsh/init/history.zsh
setopt extended_history # record timestamp of command in HISTFILE setopt hist_expire_dups_first # delete duplicates first when HISTFILE size exceeds HISTSIZE setopt hist_ignore_dups # ignore duplicated commands history list setopt hist_ignore_space # ignore commands that start with space setopt hist_verify # show command with history expansion to user before running it setopt share_history # share command history data
[ -z "$HISTFILE" ] && HISTFILE="$HOME/.zsh_history" setopt extended_history # record timestamp of command in HISTFILE setopt hist_expire_dups_first # delete duplicates first when HISTFILE size exceeds HISTSIZE setopt hist_ignore_dups # ignore duplicated commands history list setopt hist_ignore_space # ignore commands that start with space setopt hist_verify # show command with history expansion to user before running it setopt share_history # share command history data
Set HISTFILE if it's not set
Set HISTFILE if it's not set
Shell
mit
marshall-lee/dotfiles,marshall-lee/dotfiles
shell
## Code Before: setopt extended_history # record timestamp of command in HISTFILE setopt hist_expire_dups_first # delete duplicates first when HISTFILE size exceeds HISTSIZE setopt hist_ignore_dups # ignore duplicated commands history list setopt hist_ignore_space # ignore commands that start with space setopt hist_verify # show command with history expansion to user before running it setopt share_history # share command history data ## Instruction: Set HISTFILE if it's not set ## Code After: [ -z "$HISTFILE" ] && HISTFILE="$HOME/.zsh_history" setopt extended_history # record timestamp of command in HISTFILE setopt hist_expire_dups_first # delete duplicates first when HISTFILE size exceeds HISTSIZE setopt hist_ignore_dups # ignore duplicated commands history list setopt hist_ignore_space # ignore commands that start with space setopt hist_verify # show command with history expansion to user before running it setopt share_history # share command history data
6f028af99c98f9e2b16b28571900d82d880f1b3e
SDK/Version.rb
SDK/Version.rb
REPOSITORY_ROOT = File.expand_path File.join(File.dirname(__FILE__), '..') unless defined? REPOSITORY_ROOT require "#{REPOSITORY_ROOT}/Scripts/Shared.rb" require "#{REPOSITORY_ROOT}/Scripts/Git.rb" def sdk_version version = File.read("#{REPOSITORY_ROOT}/VERSION").lines.first.strip # Include the branch name if not on the master branch branch = Git.active_branch(REPOSITORY_ROOT) branch = nil if branch == 'master' || branch == 'HEAD' # Include the short commit hash if not building an official release from master commit_hash = Git.short_head_commit_hash(REPOSITORY_ROOT) if version.include?('-') || branch [version, branch, commit_hash].compact.join '-' end
REPOSITORY_ROOT = File.expand_path File.join(File.dirname(__FILE__), '..') unless defined? REPOSITORY_ROOT require "#{REPOSITORY_ROOT}/Scripts/Shared.rb" require "#{REPOSITORY_ROOT}/Scripts/Git.rb" def sdk_version version = File.read("#{REPOSITORY_ROOT}/VERSION").lines.first.strip # Include the branch name if not on the main branch branch = Git.active_branch(REPOSITORY_ROOT) branch = nil if branch == 'main' || branch == 'HEAD' # Include the short commit hash if not building an official release from main commit_hash = Git.short_head_commit_hash(REPOSITORY_ROOT) if version.include?('-') || branch [version, branch, commit_hash].compact.join '-' end
Rename master branch to main
Rename master branch to main
Ruby
mpl-2.0
savant-nz/carbon,savant-nz/carbon,savant-nz/carbon,savant-nz/carbon,savant-nz/carbon,savant-nz/carbon
ruby
## Code Before: REPOSITORY_ROOT = File.expand_path File.join(File.dirname(__FILE__), '..') unless defined? REPOSITORY_ROOT require "#{REPOSITORY_ROOT}/Scripts/Shared.rb" require "#{REPOSITORY_ROOT}/Scripts/Git.rb" def sdk_version version = File.read("#{REPOSITORY_ROOT}/VERSION").lines.first.strip # Include the branch name if not on the master branch branch = Git.active_branch(REPOSITORY_ROOT) branch = nil if branch == 'master' || branch == 'HEAD' # Include the short commit hash if not building an official release from master commit_hash = Git.short_head_commit_hash(REPOSITORY_ROOT) if version.include?('-') || branch [version, branch, commit_hash].compact.join '-' end ## Instruction: Rename master branch to main ## Code After: REPOSITORY_ROOT = File.expand_path File.join(File.dirname(__FILE__), '..') unless defined? REPOSITORY_ROOT require "#{REPOSITORY_ROOT}/Scripts/Shared.rb" require "#{REPOSITORY_ROOT}/Scripts/Git.rb" def sdk_version version = File.read("#{REPOSITORY_ROOT}/VERSION").lines.first.strip # Include the branch name if not on the main branch branch = Git.active_branch(REPOSITORY_ROOT) branch = nil if branch == 'main' || branch == 'HEAD' # Include the short commit hash if not building an official release from main commit_hash = Git.short_head_commit_hash(REPOSITORY_ROOT) if version.include?('-') || branch [version, branch, commit_hash].compact.join '-' end
c61aed7c6a65a7cf1ae3f9fbdd8719bcb727b37d
README.md
README.md
![](https://travis-ci.org/marnen/middleman-breadcrumbs.svg) # middleman-breadcrumbs Breadcrumbs helper for Middleman ## Installation Install the gem as usual: put `gem 'middleman-breadcrumbs'` in Gemfile, then run `bundle install`. Put `activate :breadcrumbs` in config.rb (*not* in the `configure :build` block). ## Usage In your view files, just call `breadcrumbs(current_page)` to display breadcrumbs. ### Options #### `:separator` The breadcrumb levels are separated with ` > ` by default. If you want a different separator, use the `:separator` option—e.g., `breadcrumbs(current_page, separator: '|||')`. #### `:wrapper` If you want to wrap each breadcrumb level in a tag, pass the tag name to the `:wrapper` option. For example, if you want your breadcrumbs in a list: ```erb <ul> <%= breadcrumbs current_page, wrapper: :li, separator: nil %> </ul> ``` This will wrap each breadcrumb level in a `<li>` element.
![](https://travis-ci.org/marnen/middleman-breadcrumbs.svg) # middleman-breadcrumbs Breadcrumbs helper for [Middleman](https://middlemanapp.com/) ## Installation Install the gem as usual: put `gem 'middleman-breadcrumbs'` in Gemfile, then run `bundle install`. Put `activate :breadcrumbs` in config.rb (*not* in the `configure :build` block). ## Usage In your view files, just call `breadcrumbs(current_page)` to display breadcrumbs. ### Options #### `:separator` The breadcrumb levels are separated with ` > ` by default. If you want a different separator, use the `:separator` option—e.g., `breadcrumbs(current_page, separator: '|||')`. #### `:wrapper` If you want to wrap each breadcrumb level in a tag, pass the tag name to the `:wrapper` option. For example, if you want your breadcrumbs in a list: ```erb <ul> <%= breadcrumbs current_page, wrapper: :li, separator: nil %> </ul> ``` This will wrap each breadcrumb level in a `<li>` element.
Add link to Middleman website.
Add link to Middleman website.
Markdown
mit
marnen/middleman-breadcrumbs,marnen/middleman-breadcrumbs
markdown
## Code Before: ![](https://travis-ci.org/marnen/middleman-breadcrumbs.svg) # middleman-breadcrumbs Breadcrumbs helper for Middleman ## Installation Install the gem as usual: put `gem 'middleman-breadcrumbs'` in Gemfile, then run `bundle install`. Put `activate :breadcrumbs` in config.rb (*not* in the `configure :build` block). ## Usage In your view files, just call `breadcrumbs(current_page)` to display breadcrumbs. ### Options #### `:separator` The breadcrumb levels are separated with ` > ` by default. If you want a different separator, use the `:separator` option—e.g., `breadcrumbs(current_page, separator: '|||')`. #### `:wrapper` If you want to wrap each breadcrumb level in a tag, pass the tag name to the `:wrapper` option. For example, if you want your breadcrumbs in a list: ```erb <ul> <%= breadcrumbs current_page, wrapper: :li, separator: nil %> </ul> ``` This will wrap each breadcrumb level in a `<li>` element. ## Instruction: Add link to Middleman website. ## Code After: ![](https://travis-ci.org/marnen/middleman-breadcrumbs.svg) # middleman-breadcrumbs Breadcrumbs helper for [Middleman](https://middlemanapp.com/) ## Installation Install the gem as usual: put `gem 'middleman-breadcrumbs'` in Gemfile, then run `bundle install`. Put `activate :breadcrumbs` in config.rb (*not* in the `configure :build` block). ## Usage In your view files, just call `breadcrumbs(current_page)` to display breadcrumbs. ### Options #### `:separator` The breadcrumb levels are separated with ` > ` by default. If you want a different separator, use the `:separator` option—e.g., `breadcrumbs(current_page, separator: '|||')`. #### `:wrapper` If you want to wrap each breadcrumb level in a tag, pass the tag name to the `:wrapper` option. For example, if you want your breadcrumbs in a list: ```erb <ul> <%= breadcrumbs current_page, wrapper: :li, separator: nil %> </ul> ``` This will wrap each breadcrumb level in a `<li>` element.
38b314f8e50bd17e1e712974cccbb78d3470404f
metadata/com.github.webierta.call_counter.yml
metadata/com.github.webierta.call_counter.yml
Categories: - Phone & SMS License: GPL-3.0-only AuthorName: Webierta SourceCode: https://github.com/Webierta/call-counter AutoName: Call Counter RepoType: git Repo: https://github.com/Webierta/call-counter.git Builds: - versionName: 1.2.0 versionCode: 2 commit: v1.2.0 output: build/app/outputs/flutter-apk/app-release.apk srclibs: - [email protected] build: - $$flutter$$/bin/flutter config --no-analytics - $$flutter$$/bin/flutter build apk AutoUpdateMode: Version v%v UpdateCheckMode: Tags UpdateCheckData: https://raw.githubusercontent.com/Webierta/call-counter/master/pubspec.yaml|version:\s.+\+(\d+)|.|version:\s(.+)\+ CurrentVersion: v1.2.0 CurrentVersionCode: 2
Categories: - Phone & SMS License: GPL-3.0-only AuthorName: Webierta SourceCode: https://github.com/Webierta/call-counter IssueTracker: https://github.com/Webierta/call-counter/issues Changelog: https://github.com/Webierta/call-counter/releases AutoName: Call Counter RepoType: git Repo: https://github.com/Webierta/call-counter.git Builds: - versionName: 1.2.0 versionCode: 2 commit: v1.2.0 output: build/app/outputs/flutter-apk/app-release.apk srclibs: - [email protected] build: - $$flutter$$/bin/flutter config --no-analytics - $$flutter$$/bin/flutter build apk AutoUpdateMode: Version v%v UpdateCheckMode: Tags UpdateCheckData: https://raw.githubusercontent.com/Webierta/call-counter/master/pubspec.yaml|version:\s.+\+(\d+)|.|version:\s(.+)\+ CurrentVersion: v1.2.0 CurrentVersionCode: 2
Call Counter: add IssueTracker & Changelog urls
Call Counter: add IssueTracker & Changelog urls
YAML
agpl-3.0
f-droid/fdroiddata,f-droid/fdroiddata
yaml
## Code Before: Categories: - Phone & SMS License: GPL-3.0-only AuthorName: Webierta SourceCode: https://github.com/Webierta/call-counter AutoName: Call Counter RepoType: git Repo: https://github.com/Webierta/call-counter.git Builds: - versionName: 1.2.0 versionCode: 2 commit: v1.2.0 output: build/app/outputs/flutter-apk/app-release.apk srclibs: - [email protected] build: - $$flutter$$/bin/flutter config --no-analytics - $$flutter$$/bin/flutter build apk AutoUpdateMode: Version v%v UpdateCheckMode: Tags UpdateCheckData: https://raw.githubusercontent.com/Webierta/call-counter/master/pubspec.yaml|version:\s.+\+(\d+)|.|version:\s(.+)\+ CurrentVersion: v1.2.0 CurrentVersionCode: 2 ## Instruction: Call Counter: add IssueTracker & Changelog urls ## Code After: Categories: - Phone & SMS License: GPL-3.0-only AuthorName: Webierta SourceCode: https://github.com/Webierta/call-counter IssueTracker: https://github.com/Webierta/call-counter/issues Changelog: https://github.com/Webierta/call-counter/releases AutoName: Call Counter RepoType: git Repo: https://github.com/Webierta/call-counter.git Builds: - versionName: 1.2.0 versionCode: 2 commit: v1.2.0 output: build/app/outputs/flutter-apk/app-release.apk srclibs: - [email protected] build: - $$flutter$$/bin/flutter config --no-analytics - $$flutter$$/bin/flutter build apk AutoUpdateMode: Version v%v UpdateCheckMode: Tags UpdateCheckData: https://raw.githubusercontent.com/Webierta/call-counter/master/pubspec.yaml|version:\s.+\+(\d+)|.|version:\s(.+)\+ CurrentVersion: v1.2.0 CurrentVersionCode: 2
f340eaf39e9c340ab778616c8794cc7e9e6b3d28
app/views/users/new.html.erb
app/views/users/new.html.erb
<h1>Register</h1> <%= form_for @user do |f| %> <%= f.label :email %> <%= f.text_field :email %> <%= f.label :username %> <%= f.text_field :username %> <%= f.label :password %> <%= f.password_field :password %> <%= f.submit %> <% end %> <% if @user.errors.any? %> <ul> <% @user.errors.full_messages.each do |msg| %> <li><%= msg %></li> <% end %> </ul> <% end %>
<div class="ques-answer cf"> <h3>Register</h3> <%= form_for @user do |f| %> <%= f.label :email %> <%= f.text_field :email %> <br><br> <%= f.label :username %> <%= f.text_field :username %> <br><br> <%= f.label :password %> <%= f.password_field :password %> <br><br> <div class="submit-btn"><%= f.submit %></div> <% end %> <% if @user.errors.any? %> <ul> <% @user.errors.full_messages.each do |msg| %> <li><%= msg %></li> <% end %> </ul> <% end %> </div>
Add styling tags for register user page
Add styling tags for register user page
HTML+ERB
mit
jeder28/stack_overphil,jeder28/stack_overphil,jeder28/stack_overphil
html+erb
## Code Before: <h1>Register</h1> <%= form_for @user do |f| %> <%= f.label :email %> <%= f.text_field :email %> <%= f.label :username %> <%= f.text_field :username %> <%= f.label :password %> <%= f.password_field :password %> <%= f.submit %> <% end %> <% if @user.errors.any? %> <ul> <% @user.errors.full_messages.each do |msg| %> <li><%= msg %></li> <% end %> </ul> <% end %> ## Instruction: Add styling tags for register user page ## Code After: <div class="ques-answer cf"> <h3>Register</h3> <%= form_for @user do |f| %> <%= f.label :email %> <%= f.text_field :email %> <br><br> <%= f.label :username %> <%= f.text_field :username %> <br><br> <%= f.label :password %> <%= f.password_field :password %> <br><br> <div class="submit-btn"><%= f.submit %></div> <% end %> <% if @user.errors.any? %> <ul> <% @user.errors.full_messages.each do |msg| %> <li><%= msg %></li> <% end %> </ul> <% end %> </div>
438c6d0b8bea994e9f57b4c480ebcdfd197a0e28
setup.py
setup.py
from setuptools import setup, find_packages import plaid url = 'https://github.com/plaid/plaid-python' setup( name='plaid-python', version=plaid.__version__, description='Simple Python API client for Plaid', long_description='', keywords='api, client, plaid', author='Chris Forrette', author_email='[email protected]', url=url, download_url='{}/tarball/v{}'.format(url, plaid.__version__), license='MIT', packages=find_packages(exclude='tests'), package_data={'README': ['README.md']}, install_requires=['requests==2.2.1'], zip_safe=False, include_package_data=True, classifiers=[ "Programming Language :: Python", "Topic :: Software Development :: Libraries :: Python Modules", "Environment :: Web Environment", ] )
from setuptools import setup, find_packages import plaid url = 'https://github.com/plaid/plaid-python' setup( name='plaid-python', version=plaid.__version__, description='Simple Python API client for Plaid', long_description='', keywords='api, client, plaid', author='Chris Forrette', author_email='[email protected]', url=url, download_url='{}/tarball/v{}'.format(url, plaid.__version__), license='MIT', packages=find_packages(exclude='tests'), package_data={'README': ['README.md']}, install_requires=['requests==2.2.1'], zip_safe=False, include_package_data=True, classifiers=[ "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.4", "Topic :: Software Development :: Libraries :: Python Modules", "Environment :: Web Environment", ] )
Update "Programming Language" Tag to Indicate Python3 Support
Update "Programming Language" Tag to Indicate Python3 Support
Python
mit
erikbern/plaid-python,qwil/plaid-python,plaid/plaid-python
python
## Code Before: from setuptools import setup, find_packages import plaid url = 'https://github.com/plaid/plaid-python' setup( name='plaid-python', version=plaid.__version__, description='Simple Python API client for Plaid', long_description='', keywords='api, client, plaid', author='Chris Forrette', author_email='[email protected]', url=url, download_url='{}/tarball/v{}'.format(url, plaid.__version__), license='MIT', packages=find_packages(exclude='tests'), package_data={'README': ['README.md']}, install_requires=['requests==2.2.1'], zip_safe=False, include_package_data=True, classifiers=[ "Programming Language :: Python", "Topic :: Software Development :: Libraries :: Python Modules", "Environment :: Web Environment", ] ) ## Instruction: Update "Programming Language" Tag to Indicate Python3 Support ## Code After: from setuptools import setup, find_packages import plaid url = 'https://github.com/plaid/plaid-python' setup( name='plaid-python', version=plaid.__version__, description='Simple Python API client for Plaid', long_description='', keywords='api, client, plaid', author='Chris Forrette', author_email='[email protected]', url=url, download_url='{}/tarball/v{}'.format(url, plaid.__version__), license='MIT', packages=find_packages(exclude='tests'), package_data={'README': ['README.md']}, install_requires=['requests==2.2.1'], zip_safe=False, include_package_data=True, classifiers=[ "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.4", "Topic :: Software Development :: Libraries :: Python Modules", "Environment :: Web Environment", ] )
3ae5e55d22297040032f12836730843500bfbf2b
.travis.yml
.travis.yml
language: node_js node_js: - "0.10" after_success: - npm run-script deploy env: global: - GH_REF: github.com/tddbin/tddbin-frontend.git - secure: HofeV8v4B9m97cQ7S4otS7r+iY39ME1pIh/oGJc2U46oce+vAI1ONGVccgM0325myUqYTyZmP/dZGewalNm2X2KFoMrJefSNvMVTmyI9DzWKni575ss3tbuOy/RTcHTDzSc4eEZgSdnIC7OyzeTGi7WN583WUthFkc6P+wq0IeA=
language: node_js node_js: - "0.10" after_success: - chmod +x ./deploy-to-ghpages.sh - npm run-script deploy env: global: - GH_REF: github.com/tddbin/tddbin-frontend.git - secure: HofeV8v4B9m97cQ7S4otS7r+iY39ME1pIh/oGJc2U46oce+vAI1ONGVccgM0325myUqYTyZmP/dZGewalNm2X2KFoMrJefSNvMVTmyI9DzWKni575ss3tbuOy/RTcHTDzSc4eEZgSdnIC7OyzeTGi7WN583WUthFkc6P+wq0IeA=
Make sure deploy gets called.
Make sure deploy gets called.
YAML
mit
tddbin/tddbin-frontend,tddbin/tddbin-frontend,tddbin/tddbin-frontend
yaml
## Code Before: language: node_js node_js: - "0.10" after_success: - npm run-script deploy env: global: - GH_REF: github.com/tddbin/tddbin-frontend.git - secure: HofeV8v4B9m97cQ7S4otS7r+iY39ME1pIh/oGJc2U46oce+vAI1ONGVccgM0325myUqYTyZmP/dZGewalNm2X2KFoMrJefSNvMVTmyI9DzWKni575ss3tbuOy/RTcHTDzSc4eEZgSdnIC7OyzeTGi7WN583WUthFkc6P+wq0IeA= ## Instruction: Make sure deploy gets called. ## Code After: language: node_js node_js: - "0.10" after_success: - chmod +x ./deploy-to-ghpages.sh - npm run-script deploy env: global: - GH_REF: github.com/tddbin/tddbin-frontend.git - secure: HofeV8v4B9m97cQ7S4otS7r+iY39ME1pIh/oGJc2U46oce+vAI1ONGVccgM0325myUqYTyZmP/dZGewalNm2X2KFoMrJefSNvMVTmyI9DzWKni575ss3tbuOy/RTcHTDzSc4eEZgSdnIC7OyzeTGi7WN583WUthFkc6P+wq0IeA=
2f3a29dcfc711296db249feaa2de70b6f678c7c7
src/widgets/index.js
src/widgets/index.js
/** * mSupply Mobile * Sustainable Solutions (NZ) Ltd. 2019 */ export { Button, ProgressBar } from 'react-native-ui-components'; export { AutocompleteSelector } from './AutocompleteSelector'; export { ExpiryTextInput } from './ExpiryTextInput'; export { FinaliseButton } from './FinaliseButton'; export { GenericChoiceList } from './GenericChoiceList'; export { IconCell } from './IconCell'; export { NavigationBar } from './NavigationBar'; export { OnePressButton } from './OnePressButton'; export { PageButton } from './PageButton'; export { PageInfo } from './PageInfo'; export { SequentialSteps } from './SequentialSteps'; export { Spinner } from './Spinner'; export { SyncState } from './SyncState'; export { TextEditor } from './TextEditor'; export { TextInput } from './TextInput'; export { ToggleBar } from './ToggleBar'; export { ToggleSelector } from './ToggleSelector'; export { NewExpiryDateInput } from './NewExpiryDateInput'; export { SortAscIcon, SortNeutralIcon, SortDescIcon, CheckedComponent, UncheckedComponent, DisabledCheckedComponent, DisabledUncheckedComponent, } from './icons';
/** * mSupply Mobile * Sustainable Solutions (NZ) Ltd. 2019 */ export { Button, ProgressBar } from 'react-native-ui-components'; export { AutocompleteSelector } from './AutocompleteSelector'; export { ExpiryTextInput } from './ExpiryTextInput'; export { FinaliseButton } from './FinaliseButton'; export { GenericChoiceList } from './GenericChoiceList'; export { IconCell } from './IconCell'; export { NavigationBar } from './NavigationBar'; export { OnePressButton } from './OnePressButton'; export { PageButton } from './PageButton'; export { PageInfo } from './PageInfo'; export { SequentialSteps } from './SequentialSteps'; export { Spinner } from './Spinner'; export { SyncState } from './SyncState'; export { TextEditor } from './TextEditor'; export { TextInput } from './TextInput'; export { ToggleBar } from './ToggleBar'; export { ToggleSelector } from './ToggleSelector'; export { NewExpiryDateInput } from './NewExpiryDateInput'; export { SortAscIcon, SortNeutralIcon, SortDescIcon, CheckedComponent, UncheckedComponent, DisabledCheckedComponent, DisabledUncheckedComponent, Cancel, } from './icons';
Add export of Cancel icon
Add export of Cancel icon
JavaScript
mit
sussol/mobile,sussol/mobile,sussol/mobile,sussol/mobile
javascript
## Code Before: /** * mSupply Mobile * Sustainable Solutions (NZ) Ltd. 2019 */ export { Button, ProgressBar } from 'react-native-ui-components'; export { AutocompleteSelector } from './AutocompleteSelector'; export { ExpiryTextInput } from './ExpiryTextInput'; export { FinaliseButton } from './FinaliseButton'; export { GenericChoiceList } from './GenericChoiceList'; export { IconCell } from './IconCell'; export { NavigationBar } from './NavigationBar'; export { OnePressButton } from './OnePressButton'; export { PageButton } from './PageButton'; export { PageInfo } from './PageInfo'; export { SequentialSteps } from './SequentialSteps'; export { Spinner } from './Spinner'; export { SyncState } from './SyncState'; export { TextEditor } from './TextEditor'; export { TextInput } from './TextInput'; export { ToggleBar } from './ToggleBar'; export { ToggleSelector } from './ToggleSelector'; export { NewExpiryDateInput } from './NewExpiryDateInput'; export { SortAscIcon, SortNeutralIcon, SortDescIcon, CheckedComponent, UncheckedComponent, DisabledCheckedComponent, DisabledUncheckedComponent, } from './icons'; ## Instruction: Add export of Cancel icon ## Code After: /** * mSupply Mobile * Sustainable Solutions (NZ) Ltd. 2019 */ export { Button, ProgressBar } from 'react-native-ui-components'; export { AutocompleteSelector } from './AutocompleteSelector'; export { ExpiryTextInput } from './ExpiryTextInput'; export { FinaliseButton } from './FinaliseButton'; export { GenericChoiceList } from './GenericChoiceList'; export { IconCell } from './IconCell'; export { NavigationBar } from './NavigationBar'; export { OnePressButton } from './OnePressButton'; export { PageButton } from './PageButton'; export { PageInfo } from './PageInfo'; export { SequentialSteps } from './SequentialSteps'; export { Spinner } from './Spinner'; export { SyncState } from './SyncState'; export { TextEditor } from './TextEditor'; export { TextInput } from './TextInput'; export { ToggleBar } from './ToggleBar'; export { ToggleSelector } from './ToggleSelector'; export { NewExpiryDateInput } from './NewExpiryDateInput'; export { SortAscIcon, SortNeutralIcon, SortDescIcon, CheckedComponent, UncheckedComponent, DisabledCheckedComponent, DisabledUncheckedComponent, Cancel, } from './icons';
7da90a95b72ac5a36336bfe630aa904bf1b1892b
ansible/roles/atom/tasks/main.yml
ansible/roles/atom/tasks/main.yml
--- - name: "install atom using deb link" apt: deb="https://github.com/atom/atom/releases/download/v{{atom_version}}/atom-amd64.deb" autoremove=yes
--- - name: "install atom using deb link" apt: deb="https://github.com/atom/atom/releases/download/v{{atom_version}}/atom-amd64.deb" autoremove=yes - name: "install en_gb dictionary" apt: name=myspell-en-gb state=present autoremove=yes
Add en_gb dictionary to Atom installation
Add en_gb dictionary to Atom installation
YAML
apache-2.0
cagiti/developer-environment
yaml
## Code Before: --- - name: "install atom using deb link" apt: deb="https://github.com/atom/atom/releases/download/v{{atom_version}}/atom-amd64.deb" autoremove=yes ## Instruction: Add en_gb dictionary to Atom installation ## Code After: --- - name: "install atom using deb link" apt: deb="https://github.com/atom/atom/releases/download/v{{atom_version}}/atom-amd64.deb" autoremove=yes - name: "install en_gb dictionary" apt: name=myspell-en-gb state=present autoremove=yes
b8b7ef924c45f611b1430931973dfb9b64c8cf1c
routes/index.js
routes/index.js
var express = require('express'); var router = express.Router(); var windspeed = require('../middleware/windspeed') router.use(windspeed) var statusValues = { 1: { human: "Safe(ish)", machine: "safe" }, 2: { human: "Moderate", machine: "moderate" }, 3: { human: "Hazardous", machine: "hazardous" } } /* GET home page. */ router.get('/', function(req, res) { var status = statusValues[req.wind.status] res.render('index', { bodyClass: "home " + status.machine, wind: req.wind, status: status.human }); }); /* GET about page */ router.get('/about', function(req, res){ res.render('about', { bodyClass: "about" }) }) module.exports = router;
var express = require('express'); var router = express.Router(); var windspeed = require('../middleware/windspeed') router.use(windspeed) var statusValues = { 1: { human: "Safe(ish)", machine: "safe" }, 2: { human: "Moderate", machine: "moderate" }, 3: { human: "Hazardous", machine: "hazardous" } } /* GET home page. */ router.get('/', function(req, res) { var status = statusValues[req.wind.status] res.render('index', { bodyClass: "home " + status.machine, wind: req.wind, status: status.human }); }); module.exports = router;
Remove /about route since there's no content for it yet
Remove /about route since there's no content for it yet
JavaScript
mit
renemarcelo/kcbx-petcoke,chihacknight/kcbx-petcoke,benwilhelm/kcbx-petcoke,justinmanley/kcbx-petcoke,chihacknight/kcbx-petcoke,manleyjster/kcbx-petcoke,open-city/kcbx-petcoke,open-city/kcbx-petcoke,manleyjster/kcbx-petcoke,chihacknight/kcbx-petcoke,benwilhelm/kcbx-petcoke,benwilhelm/kcbx-petcoke,open-city/kcbx-petcoke,justinmanley/kcbx-petcoke,renemarcelo/kcbx-petcoke
javascript
## Code Before: var express = require('express'); var router = express.Router(); var windspeed = require('../middleware/windspeed') router.use(windspeed) var statusValues = { 1: { human: "Safe(ish)", machine: "safe" }, 2: { human: "Moderate", machine: "moderate" }, 3: { human: "Hazardous", machine: "hazardous" } } /* GET home page. */ router.get('/', function(req, res) { var status = statusValues[req.wind.status] res.render('index', { bodyClass: "home " + status.machine, wind: req.wind, status: status.human }); }); /* GET about page */ router.get('/about', function(req, res){ res.render('about', { bodyClass: "about" }) }) module.exports = router; ## Instruction: Remove /about route since there's no content for it yet ## Code After: var express = require('express'); var router = express.Router(); var windspeed = require('../middleware/windspeed') router.use(windspeed) var statusValues = { 1: { human: "Safe(ish)", machine: "safe" }, 2: { human: "Moderate", machine: "moderate" }, 3: { human: "Hazardous", machine: "hazardous" } } /* GET home page. */ router.get('/', function(req, res) { var status = statusValues[req.wind.status] res.render('index', { bodyClass: "home " + status.machine, wind: req.wind, status: status.human }); }); module.exports = router;
98672dd768661be91232708abc641a13360d37b8
modules/doc/content/newsletter/2020_12.md
modules/doc/content/newsletter/2020_12.md
The MOOSE development team decided to move all new questions to a [Github discussion forum](https://github.com/idaholab/moose/discussions). ## Reporter Transfer The ability to transfer [Reporter values](Reporters/index.md) to and from sub-applications was added via the [MultiAppReporterTransfer.md]. ## MultiApp UserObject Transfer `MultiAppUserObjectTransfer` can now be restricted to a block or a boundary using `block` or `boundary` parameter. This helps to combine multiple transfers into a single variable which can also simplify the overall simulation setup of kernels, BCs, etc.
The MOOSE development team decided to move all new questions to a [Github discussion forum](https://github.com/idaholab/moose/discussions). ## Reporter Transfer The ability to transfer [Reporter values](Reporters/index.md) to and from sub-applications was added via the [MultiAppReporterTransfer.md]. ## MultiApp UserObject Transfer `MultiAppUserObjectTransfer` can now be restricted to a block or a boundary using `block` or `boundary` parameter. This helps to combine multiple transfers into a single variable which can also simplify the overall simulation setup of kernels, BCs, etc. ## Scaling Support for Global AD Indexing We have added both automatic and manual scaling support for MOOSE's global AD indexing configuration (obtained by running `./configure --with-ad-indexing-type=global`). This works for both finite elements and finite volumes variables.
Add scaling for global AD indexing blurb
Add scaling for global AD indexing blurb
Markdown
lgpl-2.1
harterj/moose,SudiptaBiswas/moose,idaholab/moose,laagesen/moose,harterj/moose,SudiptaBiswas/moose,laagesen/moose,harterj/moose,sapitts/moose,milljm/moose,nuclear-wizard/moose,idaholab/moose,milljm/moose,nuclear-wizard/moose,sapitts/moose,harterj/moose,jessecarterMOOSE/moose,lindsayad/moose,andrsd/moose,laagesen/moose,dschwen/moose,sapitts/moose,andrsd/moose,nuclear-wizard/moose,idaholab/moose,dschwen/moose,laagesen/moose,harterj/moose,bwspenc/moose,andrsd/moose,lindsayad/moose,bwspenc/moose,andrsd/moose,bwspenc/moose,lindsayad/moose,idaholab/moose,bwspenc/moose,jessecarterMOOSE/moose,idaholab/moose,lindsayad/moose,milljm/moose,dschwen/moose,SudiptaBiswas/moose,nuclear-wizard/moose,lindsayad/moose,sapitts/moose,SudiptaBiswas/moose,milljm/moose,SudiptaBiswas/moose,milljm/moose,laagesen/moose,sapitts/moose,jessecarterMOOSE/moose,jessecarterMOOSE/moose,dschwen/moose,dschwen/moose,andrsd/moose,bwspenc/moose,jessecarterMOOSE/moose
markdown
## Code Before: The MOOSE development team decided to move all new questions to a [Github discussion forum](https://github.com/idaholab/moose/discussions). ## Reporter Transfer The ability to transfer [Reporter values](Reporters/index.md) to and from sub-applications was added via the [MultiAppReporterTransfer.md]. ## MultiApp UserObject Transfer `MultiAppUserObjectTransfer` can now be restricted to a block or a boundary using `block` or `boundary` parameter. This helps to combine multiple transfers into a single variable which can also simplify the overall simulation setup of kernels, BCs, etc. ## Instruction: Add scaling for global AD indexing blurb ## Code After: The MOOSE development team decided to move all new questions to a [Github discussion forum](https://github.com/idaholab/moose/discussions). ## Reporter Transfer The ability to transfer [Reporter values](Reporters/index.md) to and from sub-applications was added via the [MultiAppReporterTransfer.md]. ## MultiApp UserObject Transfer `MultiAppUserObjectTransfer` can now be restricted to a block or a boundary using `block` or `boundary` parameter. This helps to combine multiple transfers into a single variable which can also simplify the overall simulation setup of kernels, BCs, etc. ## Scaling Support for Global AD Indexing We have added both automatic and manual scaling support for MOOSE's global AD indexing configuration (obtained by running `./configure --with-ad-indexing-type=global`). This works for both finite elements and finite volumes variables.
8fb67ede30e8aaaa2bcd0d5defb89e6cd63a9101
lib/lobber/cli.rb
lib/lobber/cli.rb
require 'thor' module Lobber class CLI < Thor default_task :lob desc "DIRECTORY", "Upload a directory to Amazon S3" option :bucket def lob(directory = nil) return usage unless directory unless File.directory? directory error "#{directory} is not a valid directory" exit 1 end Lobber.upload(directory, options) say "Successfully uploaded #{directory}", "\033[32m" end desc "usage", "Display usage banner", hide: true def usage say "Lobber #{Lobber::VERSION}" say "https://github.com/mdb/lob" say "\n" help end end end
require 'thor' module Lobber class CLI < Thor default_task :lob desc "DIRECTORY", "Upload a directory to Amazon S3" method_option :bucket, type: :string method_option :dry_run, default: false, type: :boolean method_option :verbose, default: true, type: :boolean def lob(directory = nil) return usage unless directory unless File.directory? directory error "#{directory} is not a valid directory" exit 1 end Lobber.upload(directory, options) say "Successfully uploaded #{directory}", "\033[32m" end desc "usage", "Display usage banner", hide: true def usage say "Lobber #{Lobber::VERSION}" say "https://github.com/mdb/lob" say "\n" help end end end
Add method_options for dry-run and verbose
Add method_options for dry-run and verbose
Ruby
mit
mdb/lobber
ruby
## Code Before: require 'thor' module Lobber class CLI < Thor default_task :lob desc "DIRECTORY", "Upload a directory to Amazon S3" option :bucket def lob(directory = nil) return usage unless directory unless File.directory? directory error "#{directory} is not a valid directory" exit 1 end Lobber.upload(directory, options) say "Successfully uploaded #{directory}", "\033[32m" end desc "usage", "Display usage banner", hide: true def usage say "Lobber #{Lobber::VERSION}" say "https://github.com/mdb/lob" say "\n" help end end end ## Instruction: Add method_options for dry-run and verbose ## Code After: require 'thor' module Lobber class CLI < Thor default_task :lob desc "DIRECTORY", "Upload a directory to Amazon S3" method_option :bucket, type: :string method_option :dry_run, default: false, type: :boolean method_option :verbose, default: true, type: :boolean def lob(directory = nil) return usage unless directory unless File.directory? directory error "#{directory} is not a valid directory" exit 1 end Lobber.upload(directory, options) say "Successfully uploaded #{directory}", "\033[32m" end desc "usage", "Display usage banner", hide: true def usage say "Lobber #{Lobber::VERSION}" say "https://github.com/mdb/lob" say "\n" help end end end
27ee0bc88155928319ab3f178dde44ca8a9a52b9
lib/formtastic/inputs/file_input.rb
lib/formtastic/inputs/file_input.rb
module Formtastic module Inputs # Outputs a simple `<label>` with a `<input type="file">` wrapped in the standard # `<li>` wrapper. This is the default input choice for objects with attributes that appear # to be for file uploads, by detecting some common method names used by popular file upload # libraries such as Paperclip and CarrierWave. You can add to or alter these method names # through the `file_methods` config, but can be applied to any input with `:as => :file`. # # Don't forget to set the multipart attribute in your `<form>` tag! # # @example Full form context and output # # <%= semantic_form_for(@user, :html => { :multipart => true }) do |f| %> # <%= f.inputs do %> # <%= f.input :email_address, :as => :email %> # <% end %> # <% end %> # # <form...> # <fieldset> # <ol> # <li class="email"> # <label for="user_email_address">Email address</label> # <input type="email" id="user_email_address" name="user[email_address]"> # </li> # </ol> # </fieldset> # </form> # # @see Formtastic::Helpers::InputsHelper#input InputsHelper#input for full documetation of all possible options. class FileInput include Base def to_html input_wrapping do label_html << builder.file_field(method, input_html_options) end end end end end
module Formtastic module Inputs # Outputs a simple `<label>` with a `<input type="file">` wrapped in the standard # `<li>` wrapper. This is the default input choice for objects with attributes that appear # to be for file uploads, by detecting some common method names used by popular file upload # libraries such as Paperclip and CarrierWave. You can add to or alter these method names # through the `file_methods` config, but can be applied to any input with `:as => :file`. # # Don't forget to set the multipart attribute in your `<form>` tag! # # @example Full form context and output # # <%= semantic_form_for(@user, :html => { :multipart => true }) do |f| %> # <%= f.inputs do %> # <%= f.input :avatar, :as => :file %> # <% end %> # <% end %> # # <form...> # <fieldset> # <ol> # <li class="email"> # <label for="user_avatar">Avatar</label> # <input type="file" id="user_avatar" name="user[avatar]"> # </li> # </ol> # </fieldset> # </form> # # @see Formtastic::Helpers::InputsHelper#input InputsHelper#input for full documetation of all possible options. class FileInput include Base def to_html input_wrapping do label_html << builder.file_field(method, input_html_options) end end end end end
Fix example code for FileInput
Fix example code for FileInput Closes GH-635
Ruby
mit
mikz/formtastic,Guestfolio/formtastic,ryanfox1985/formtastic,ryanfox1985/formtastic,ConferenceCloud/formtastic,justinfrench/formtastic,sideci-sample/sideci-sample-formtastic,Baltazore/formtastic,Davidzhu001/formtastic,thenthen8515/formtastic,Davidzhu001/formtastic,thenthen8515/formtastic,EndingChaung/formtastic,crashlytics/formtastic,EndingChaung/formtastic,justinfrench/formtastic,paulodiovani/formtastic,ConferenceCloud/formtastic,ryanfox1985/formtastic,justinfrench/formtastic,paulodiovani/formtastic,nishaya/formtastic,Guestfolio/formtastic,mikz/formtastic,nishaya/formtastic
ruby
## Code Before: module Formtastic module Inputs # Outputs a simple `<label>` with a `<input type="file">` wrapped in the standard # `<li>` wrapper. This is the default input choice for objects with attributes that appear # to be for file uploads, by detecting some common method names used by popular file upload # libraries such as Paperclip and CarrierWave. You can add to or alter these method names # through the `file_methods` config, but can be applied to any input with `:as => :file`. # # Don't forget to set the multipart attribute in your `<form>` tag! # # @example Full form context and output # # <%= semantic_form_for(@user, :html => { :multipart => true }) do |f| %> # <%= f.inputs do %> # <%= f.input :email_address, :as => :email %> # <% end %> # <% end %> # # <form...> # <fieldset> # <ol> # <li class="email"> # <label for="user_email_address">Email address</label> # <input type="email" id="user_email_address" name="user[email_address]"> # </li> # </ol> # </fieldset> # </form> # # @see Formtastic::Helpers::InputsHelper#input InputsHelper#input for full documetation of all possible options. class FileInput include Base def to_html input_wrapping do label_html << builder.file_field(method, input_html_options) end end end end end ## Instruction: Fix example code for FileInput Closes GH-635 ## Code After: module Formtastic module Inputs # Outputs a simple `<label>` with a `<input type="file">` wrapped in the standard # `<li>` wrapper. This is the default input choice for objects with attributes that appear # to be for file uploads, by detecting some common method names used by popular file upload # libraries such as Paperclip and CarrierWave. You can add to or alter these method names # through the `file_methods` config, but can be applied to any input with `:as => :file`. # # Don't forget to set the multipart attribute in your `<form>` tag! # # @example Full form context and output # # <%= semantic_form_for(@user, :html => { :multipart => true }) do |f| %> # <%= f.inputs do %> # <%= f.input :avatar, :as => :file %> # <% end %> # <% end %> # # <form...> # <fieldset> # <ol> # <li class="email"> # <label for="user_avatar">Avatar</label> # <input type="file" id="user_avatar" name="user[avatar]"> # </li> # </ol> # </fieldset> # </form> # # @see Formtastic::Helpers::InputsHelper#input InputsHelper#input for full documetation of all possible options. class FileInput include Base def to_html input_wrapping do label_html << builder.file_field(method, input_html_options) end end end end end
412e913a29c2ded1393e9251a705ed0d1ef2eb28
packages/ha/haxl-amazonka.yaml
packages/ha/haxl-amazonka.yaml
homepage: http://github.com/tvh/haxl-amazonka#readme changelog-type: '' hash: 720d0b228bf0a7462677ba901c4d3fefe7618eee2e173e7318ff5a69017b52ad test-bench-deps: {} maintainer: [email protected] synopsis: Haxl data source for accessing AWS services through amazonka. changelog: '' basic-deps: amazonka: ! '>=1.3' base: ! '>=4.7 && <5' haxl: ! '>=0.3 && <0.5' async: -any conduit: -any amazonka-core: ! '>=1.3' hashable: -any transformers: -any all-versions: - '0.1.0.0' author: Timo von Holtz latest: '0.1.0.0' description-type: haddock description: Haxl data source for accessing AWS services through amazonka. license-name: BSD3
homepage: http://github.com/tvh/haxl-amazonka#readme changelog-type: '' hash: b79e2b4846f82b94973eb937f284ab78430513ecb82ec6c4da77f17299c63d12 test-bench-deps: {} maintainer: [email protected] synopsis: Haxl data source for accessing AWS services through amazonka. changelog: '' basic-deps: amazonka: ! '>=1.3' base: ! '>=4.7 && <5' haxl: ! '>=0.5' async: -any conduit: -any amazonka-core: ! '>=1.3' hashable: -any transformers: -any all-versions: - '0.1.0.0' - '0.1.1' author: Timo von Holtz latest: '0.1.1' description-type: haddock description: Haxl data source for accessing AWS services through amazonka. license-name: BSD3
Update from Hackage at 2017-01-17T17:55:51Z
Update from Hackage at 2017-01-17T17:55:51Z
YAML
mit
commercialhaskell/all-cabal-metadata
yaml
## Code Before: homepage: http://github.com/tvh/haxl-amazonka#readme changelog-type: '' hash: 720d0b228bf0a7462677ba901c4d3fefe7618eee2e173e7318ff5a69017b52ad test-bench-deps: {} maintainer: [email protected] synopsis: Haxl data source for accessing AWS services through amazonka. changelog: '' basic-deps: amazonka: ! '>=1.3' base: ! '>=4.7 && <5' haxl: ! '>=0.3 && <0.5' async: -any conduit: -any amazonka-core: ! '>=1.3' hashable: -any transformers: -any all-versions: - '0.1.0.0' author: Timo von Holtz latest: '0.1.0.0' description-type: haddock description: Haxl data source for accessing AWS services through amazonka. license-name: BSD3 ## Instruction: Update from Hackage at 2017-01-17T17:55:51Z ## Code After: homepage: http://github.com/tvh/haxl-amazonka#readme changelog-type: '' hash: b79e2b4846f82b94973eb937f284ab78430513ecb82ec6c4da77f17299c63d12 test-bench-deps: {} maintainer: [email protected] synopsis: Haxl data source for accessing AWS services through amazonka. changelog: '' basic-deps: amazonka: ! '>=1.3' base: ! '>=4.7 && <5' haxl: ! '>=0.5' async: -any conduit: -any amazonka-core: ! '>=1.3' hashable: -any transformers: -any all-versions: - '0.1.0.0' - '0.1.1' author: Timo von Holtz latest: '0.1.1' description-type: haddock description: Haxl data source for accessing AWS services through amazonka. license-name: BSD3
0c17be983a9d3d7ae463defe642531bef6435d9d
src/Backend/Core/Cronjobs/SendQueuedEmails.php
src/Backend/Core/Cronjobs/SendQueuedEmails.php
<?php namespace Backend\Core\Cronjobs; /* * This file is part of Fork CMS. * * For the full copyright and license information, please view the license * file that was distributed with this source code. */ use Backend\Core\Engine\Base\Cronjob; use Backend\Core\Engine\Mailer; /** * This is the cronjob to send the queued emails. * * @author Tijs Verkoyen <[email protected]> * @author Davy Hellemans <[email protected]> */ class BackendCoreCronjobSendQueuedEmails extends Cronjob { /** * Execute the action */ public function execute() { // set busy file $this->setBusyFile(); // send all queued e-mails foreach (Mailer::getQueuedMailIds() as $id) { Mailer::send($id); } // remove busy file $this->clearBusyFile(); } }
<?php namespace Backend\Core\Cronjobs; /* * This file is part of Fork CMS. * * For the full copyright and license information, please view the license * file that was distributed with this source code. */ use Backend\Core\Engine\Base\Cronjob; use Common\Mailer; /** * This is the cronjob to send the queued emails. * * @author Tijs Verkoyen <[email protected]> * @author Davy Hellemans <[email protected]> */ class BackendCoreCronjobSendQueuedEmails extends Cronjob { /** * Execute the action */ public function execute() { // set busy file $this->setBusyFile(); // send all queued e-mails $mailer = $this->get('mailer'); foreach ($mailer->getQueuedMailIds() as $id) { $mailer->send($id); } // remove busy file $this->clearBusyFile(); } }
Use the common mailer in the sendQueuedEmails cronjob
Use the common mailer in the sendQueuedEmails cronjob
PHP
mit
dmoerman/forkcms,dmoerman/forkcms,carakas/forkcms,WouterSioen/forkcms,riadvice/forkcms,riadvice/forkcms,jacob-v-dam/forkcms,nosnickid/forkcms,Katrienvh/forkcms,jonasdekeukelaere/forkcms,Thijzer/forkcms,dmoerman/forkcms,carakas/forkcms,forkcms/forkcms,riadvice/forkcms,jacob-v-dam/forkcms,forkcms/forkcms,Katrienvh/forkcms,sumocoders/forkcms,ikoene/forkcms,carakas/forkcms,tommyvdv/forkcms,vytsci/forkcms,Saxithon/gh-pages,jeroendesloovere/forkcms,Saxithon/gh-pages,Thijzer/forkcms,vytenizs/forkcms,jessedobbelaere/forkcms,ikoene/forkcms,matthiasmullie/forkcms,bartdc/forkcms,WouterSioen/forkcms,ikoene/forkcms,vytsci/forkcms,matthiasmullie/forkcms,forkcms/forkcms,bartdc/forkcms,tommyvdv/forkcms,tommyvdv/forkcms,sumocoders/forkcms,DegradationOfMankind/forkcms,jessedobbelaere/forkcms,wengyulin/forkcms,mathiashelin/forkcms,DegradationOfMankind/forkcms,jeroendesloovere/forkcms,jonasdekeukelaere/forkcms,justcarakas/forkcms,riadvice/forkcms,jonasdekeukelaere/forkcms,jonasdekeukelaere/forkcms,jessedobbelaere/forkcms,vytenizs/forkcms,sumocoders/forkcms,jonasdekeukelaere/forkcms,jeroendesloovere/forkcms,carakas/forkcms,sumocoders/forkcms,tommyvdv/forkcms,justcarakas/forkcms,jacob-v-dam/forkcms,wengyulin/forkcms,matthiasmullie/forkcms,mathiashelin/forkcms,forkcms/forkcms,carakas/forkcms,bartdc/forkcms,Thijzer/forkcms,nosnickid/forkcms,DegradationOfMankind/forkcms,mathiashelin/forkcms,justcarakas/forkcms,vytsci/forkcms,Katrienvh/forkcms,sumocoders/forkcms,Katrienvh/forkcms,DegradationOfMankind/forkcms,mathiashelin/forkcms,jessedobbelaere/forkcms,jeroendesloovere/forkcms,jacob-v-dam/forkcms,nosnickid/forkcms,vytenizs/forkcms,wengyulin/forkcms
php
## Code Before: <?php namespace Backend\Core\Cronjobs; /* * This file is part of Fork CMS. * * For the full copyright and license information, please view the license * file that was distributed with this source code. */ use Backend\Core\Engine\Base\Cronjob; use Backend\Core\Engine\Mailer; /** * This is the cronjob to send the queued emails. * * @author Tijs Verkoyen <[email protected]> * @author Davy Hellemans <[email protected]> */ class BackendCoreCronjobSendQueuedEmails extends Cronjob { /** * Execute the action */ public function execute() { // set busy file $this->setBusyFile(); // send all queued e-mails foreach (Mailer::getQueuedMailIds() as $id) { Mailer::send($id); } // remove busy file $this->clearBusyFile(); } } ## Instruction: Use the common mailer in the sendQueuedEmails cronjob ## Code After: <?php namespace Backend\Core\Cronjobs; /* * This file is part of Fork CMS. * * For the full copyright and license information, please view the license * file that was distributed with this source code. */ use Backend\Core\Engine\Base\Cronjob; use Common\Mailer; /** * This is the cronjob to send the queued emails. * * @author Tijs Verkoyen <[email protected]> * @author Davy Hellemans <[email protected]> */ class BackendCoreCronjobSendQueuedEmails extends Cronjob { /** * Execute the action */ public function execute() { // set busy file $this->setBusyFile(); // send all queued e-mails $mailer = $this->get('mailer'); foreach ($mailer->getQueuedMailIds() as $id) { $mailer->send($id); } // remove busy file $this->clearBusyFile(); } }
18a518ed6f18487693755f614fbc943adc3a2bc4
gulpfile.js
gulpfile.js
var browserify = require('browserify'); var gulp = require('gulp'); var glob = require('glob'); var reactify = require('reactify'); var source = require('vinyl-source-stream'); var shell = require('gulp-shell'); /* gulp browserify: Bundles all .js files in the /js/ directory into one file in /build/bundle.js that will be imported onto all pages */ gulp.task('browserify', function() { var files = glob.sync('./js/**/*.js'); var b = browserify(); for (var i = 0; i < files.length; i++) { var matches = /(\w+)\.js$/.exec(files[i]); b.require(files[i], {expose: matches[1]}); } b.require('xhpjs'); return b .bundle() .pipe(source('bundle.js')) .pipe(gulp.dest('./build/')); }); gulp.task('install-php', shell.task([ 'hhvm composer.phar install' ])); gulp.task('autoload', shell.task([ 'hhvm composer.phar dump-autoload' ])); gulp.task('default', ['install-php','browserify']);
var browserify = require('browserify'); var gulp = require('gulp'); var glob = require('glob'); var reactify = require('reactify'); var source = require('vinyl-source-stream'); var shell = require('gulp-shell'); /* gulp browserify: Bundles all .js files in the /js/ directory into one file in /build/bundle.js that will be imported onto all pages */ gulp.task('browserify', function() { var files = glob.sync('./js/**/*.js'); var b = browserify(); for (var i = 0; i < files.length; i++) { var matches = /(\w+)\.js$/.exec(files[i]); b.require(files[i], {expose: matches[1]}); } b.require('xhpjs'); return b .bundle() .pipe(source('bundle.js')) .pipe(gulp.dest('./build/')); }); gulp.task('install-php', shell.task([ 'hhvm composer.phar install' ])); gulp.task('autoload', shell.task([ 'hhvm composer.phar dump-autoload' ])); gulp.task('serve', shell.task([ 'hhvm -m server -p 8080' ])); gulp.task('default', ['install-php','browserify','serve']);
Add gulp command for running server
Add gulp command for running server
JavaScript
mit
BinghamtonCoRE/core-site
javascript
## Code Before: var browserify = require('browserify'); var gulp = require('gulp'); var glob = require('glob'); var reactify = require('reactify'); var source = require('vinyl-source-stream'); var shell = require('gulp-shell'); /* gulp browserify: Bundles all .js files in the /js/ directory into one file in /build/bundle.js that will be imported onto all pages */ gulp.task('browserify', function() { var files = glob.sync('./js/**/*.js'); var b = browserify(); for (var i = 0; i < files.length; i++) { var matches = /(\w+)\.js$/.exec(files[i]); b.require(files[i], {expose: matches[1]}); } b.require('xhpjs'); return b .bundle() .pipe(source('bundle.js')) .pipe(gulp.dest('./build/')); }); gulp.task('install-php', shell.task([ 'hhvm composer.phar install' ])); gulp.task('autoload', shell.task([ 'hhvm composer.phar dump-autoload' ])); gulp.task('default', ['install-php','browserify']); ## Instruction: Add gulp command for running server ## Code After: var browserify = require('browserify'); var gulp = require('gulp'); var glob = require('glob'); var reactify = require('reactify'); var source = require('vinyl-source-stream'); var shell = require('gulp-shell'); /* gulp browserify: Bundles all .js files in the /js/ directory into one file in /build/bundle.js that will be imported onto all pages */ gulp.task('browserify', function() { var files = glob.sync('./js/**/*.js'); var b = browserify(); for (var i = 0; i < files.length; i++) { var matches = /(\w+)\.js$/.exec(files[i]); b.require(files[i], {expose: matches[1]}); } b.require('xhpjs'); return b .bundle() .pipe(source('bundle.js')) .pipe(gulp.dest('./build/')); }); gulp.task('install-php', shell.task([ 'hhvm composer.phar install' ])); gulp.task('autoload', shell.task([ 'hhvm composer.phar dump-autoload' ])); gulp.task('serve', shell.task([ 'hhvm -m server -p 8080' ])); gulp.task('default', ['install-php','browserify','serve']);
2aafa48da8e6288bac0982dc42ef4d095134e669
test/l2/ts_misc.rb
test/l2/ts_misc.rb
$:.unshift File.join(File.dirname(__FILE__), "../..", "lib") require 'test/unit' require 'racket' class TestL2Misc < Test::Unit::TestCase def test_convert (0..512).each { len = rand(512) mac = Racket::L2::Misc.randommac(len) long = Racket::L2::Misc.mac2long(mac) assert_equal(mac, Racket::L2::Misc.long2mac(long, len)) assert_equal(long, Racket::L2::Misc.mac2long(mac)) } end end # vim: set ts=2 et sw=2:
$:.unshift File.join(File.dirname(__FILE__), "../..", "lib") require 'test/unit' require 'racket' class TestL2Misc < Test::Unit::TestCase def test_convert (0..rand(1024)).each { len = rand(512)+1 mac = Racket::L2::Misc.randommac(len) assert_equal(mac.length, (len*2) + (len-1)) long = Racket::L2::Misc.mac2long(mac) assert_equal(mac, Racket::L2::Misc.long2mac(long, len)) assert_equal(long, Racket::L2::Misc.mac2long(mac)) } end end # vim: set ts=2 et sw=2:
Test that proper number of : exist
Test that proper number of : exist git-svn-id: b3d365f95d627ddffb8722c756fee9dc0a59d335@114 64fbf49a-8b99-41c6-b593-eb2128c2192d
Ruby
bsd-3-clause
axsh/racket
ruby
## Code Before: $:.unshift File.join(File.dirname(__FILE__), "../..", "lib") require 'test/unit' require 'racket' class TestL2Misc < Test::Unit::TestCase def test_convert (0..512).each { len = rand(512) mac = Racket::L2::Misc.randommac(len) long = Racket::L2::Misc.mac2long(mac) assert_equal(mac, Racket::L2::Misc.long2mac(long, len)) assert_equal(long, Racket::L2::Misc.mac2long(mac)) } end end # vim: set ts=2 et sw=2: ## Instruction: Test that proper number of : exist git-svn-id: b3d365f95d627ddffb8722c756fee9dc0a59d335@114 64fbf49a-8b99-41c6-b593-eb2128c2192d ## Code After: $:.unshift File.join(File.dirname(__FILE__), "../..", "lib") require 'test/unit' require 'racket' class TestL2Misc < Test::Unit::TestCase def test_convert (0..rand(1024)).each { len = rand(512)+1 mac = Racket::L2::Misc.randommac(len) assert_equal(mac.length, (len*2) + (len-1)) long = Racket::L2::Misc.mac2long(mac) assert_equal(mac, Racket::L2::Misc.long2mac(long, len)) assert_equal(long, Racket::L2::Misc.mac2long(mac)) } end end # vim: set ts=2 et sw=2:
e7e89343d51bc18212b5de062af69c691c07d79b
td/challenge260/SemanticAnalyserTest.cpp
td/challenge260/SemanticAnalyserTest.cpp
class SemanticAnalyser { public: bool analyse(const Tokens& tokens) { return tokens.empty(); } }; constexpr Token createTokenWithZeroValue(TokenType type) { return {type, 0}; } TEST(SemanticAnalyserTest, ShouldAcceptEmptyTokens) { SemanticAnalyser analyser; Tokens tokens{}; ASSERT_TRUE(analyser.analyse(tokens)); } TEST(SemanticAnalyserTest, ShouldNotAcceptInvalidInstructionSet) { SemanticAnalyser analyser; Tokens tokens{createTokenWithZeroValue(TokenType::Ld)}; ASSERT_FALSE(analyser.analyse(tokens)); }
class SemanticAnalyser { public: bool analyse(const Tokens& tokens) { return tokens.empty(); } }; using namespace ::testing; struct SemanticAnalyserTest : public Test { SemanticAnalyser analyser; static constexpr Token createTokenWithZeroValue(TokenType type) { return { type, 0}; } }; TEST_F(SemanticAnalyserTest, ShouldAcceptEmptyTokens) { Tokens tokens{}; ASSERT_TRUE(analyser.analyse(tokens)); } TEST_F(SemanticAnalyserTest, ShouldNotAcceptInvalidInstructionSet) { Tokens tokens{createTokenWithZeroValue(TokenType::Ld)}; ASSERT_FALSE(analyser.analyse(tokens)); }
Add test suite to Analyser
Add test suite to Analyser
C++
mit
lukasz-m-maciejewski/papuc_exercises,lukasz-m-maciejewski/papuc_exercises,NadzwyczajnaGrupaRobocza/papuc_exercises,NadzwyczajnaGrupaRobocza/papuc_exercises
c++
## Code Before: class SemanticAnalyser { public: bool analyse(const Tokens& tokens) { return tokens.empty(); } }; constexpr Token createTokenWithZeroValue(TokenType type) { return {type, 0}; } TEST(SemanticAnalyserTest, ShouldAcceptEmptyTokens) { SemanticAnalyser analyser; Tokens tokens{}; ASSERT_TRUE(analyser.analyse(tokens)); } TEST(SemanticAnalyserTest, ShouldNotAcceptInvalidInstructionSet) { SemanticAnalyser analyser; Tokens tokens{createTokenWithZeroValue(TokenType::Ld)}; ASSERT_FALSE(analyser.analyse(tokens)); } ## Instruction: Add test suite to Analyser ## Code After: class SemanticAnalyser { public: bool analyse(const Tokens& tokens) { return tokens.empty(); } }; using namespace ::testing; struct SemanticAnalyserTest : public Test { SemanticAnalyser analyser; static constexpr Token createTokenWithZeroValue(TokenType type) { return { type, 0}; } }; TEST_F(SemanticAnalyserTest, ShouldAcceptEmptyTokens) { Tokens tokens{}; ASSERT_TRUE(analyser.analyse(tokens)); } TEST_F(SemanticAnalyserTest, ShouldNotAcceptInvalidInstructionSet) { Tokens tokens{createTokenWithZeroValue(TokenType::Ld)}; ASSERT_FALSE(analyser.analyse(tokens)); }
468eab0977c2fdf34cad5a8a3d3b7ff1b55ee5a4
applications/NXtranslate/test_SNS_nexus.xml
applications/NXtranslate/test_SNS_nexus.xml
<NXroot NXS:source="test_SNS_nexus3_10_5.dat" NXS:mime_type="application/x-SNS-histogram"> <entry_1D NXS:location=" [3,10,5][3,10,5]#!{Tbin|0,2,1}AND((!{pixelID|1,2,3,4,5}OR(!{pixelY|0,2}AND{pixelY|loop(1,2,1)})))"/> </NXroot>
<NXroot NXS:source="test_SNS_nexus3_10_5.dat" NXS:mime_type="application/x-SNS-histogram"> <entry type="NXentry"> <entry_1D NXS:location=" [3,10,5][3,10,5]#((({pixelY|0,1}OR{pixelID|1,2,3,4,5,6,7})))OR{pixelX|1,3,1}OR{Tbin|1,2}"/> </entry> </NXroot>
Test file of SNS_histogram retriever The previous version was not correctly defined.
Test file of SNS_histogram retriever The previous version was not correctly defined. git-svn-id: 1001b6e0dd469eaa302a741812f7f5d223cf35f7@707 ff5d1e40-2be0-497f-93bd-dc18237bd3c7
XML
lgpl-2.1
chrisemblhh/nexus,nexusformat/code,nexusformat/code,nexusformat/code,nexusformat/code,chrisemblhh/nexus,chrisemblhh/nexus,nexusformat/code,chrisemblhh/nexus,chrisemblhh/nexus
xml
## Code Before: <NXroot NXS:source="test_SNS_nexus3_10_5.dat" NXS:mime_type="application/x-SNS-histogram"> <entry_1D NXS:location=" [3,10,5][3,10,5]#!{Tbin|0,2,1}AND((!{pixelID|1,2,3,4,5}OR(!{pixelY|0,2}AND{pixelY|loop(1,2,1)})))"/> </NXroot> ## Instruction: Test file of SNS_histogram retriever The previous version was not correctly defined. git-svn-id: 1001b6e0dd469eaa302a741812f7f5d223cf35f7@707 ff5d1e40-2be0-497f-93bd-dc18237bd3c7 ## Code After: <NXroot NXS:source="test_SNS_nexus3_10_5.dat" NXS:mime_type="application/x-SNS-histogram"> <entry type="NXentry"> <entry_1D NXS:location=" [3,10,5][3,10,5]#((({pixelY|0,1}OR{pixelID|1,2,3,4,5,6,7})))OR{pixelX|1,3,1}OR{Tbin|1,2}"/> </entry> </NXroot>
50d2c7c66ac1769d9629542ab5769922681d1d62
packages/nl/nlopt-haskell.yaml
packages/nl/nlopt-haskell.yaml
homepage: https://github.com/peddie/nlopt-haskell changelog-type: markdown hash: 829a4a9a63f6a3f45af99f7109c16f1d23c2b2e3ce786c8fa31a60c885a68dd1 test-bench-deps: base: ! '>=4.9 && <4.10' nlopt-haskell: -any vector: ! '>=0.10' maintainer: Matthew Peddie <[email protected]> synopsis: Low-level bindings to the NLOPT optimization library changelog: ! '# Revision history for haskell-nlopt ## 0.1.0.0 -- 2017-27-03 * First version, for NLOPT 2.4.2. The entire API is brought out. ' basic-deps: base: ! '>=4.9 && <4.10' vector: ! '>=0.10' all-versions: - '0.1.0.0' author: Matthew Peddie <[email protected]> latest: '0.1.0.0' description-type: haddock description: ! 'This library provides low-level bindings to <http://ab-initio.mit.edu/wiki/index.php/NLopt the NLOPT optimization library>. You will need the NLOPT library and development headers installed.' license-name: BSD3
homepage: https://github.com/peddie/nlopt-haskell changelog-type: markdown hash: 3e436a5ae44a25ceceb03c3b334768b83c21ff93db1d8ac7bdf426d3f2db224e test-bench-deps: base: ! '>=4.9 && <4.11' nlopt-haskell: -any vector: ! '>=0.10' maintainer: Matthew Peddie <[email protected]> synopsis: Low-level bindings to the NLOPT optimization library changelog: ! '# Revision history for haskell-nlopt ## 0.1.0.0 -- 2017-27-03 * First version, for NLOPT 2.4.2. The entire API is brought out. ' basic-deps: base: ! '>=4.9 && <4.11' vector: ! '>=0.10' all-versions: - '0.1.0.0' - '0.1.1.0' author: Matthew Peddie <[email protected]> latest: '0.1.1.0' description-type: haddock description: ! 'This library provides low-level bindings to <http://ab-initio.mit.edu/wiki/index.php/NLopt the NLOPT optimization library>. You will need the NLOPT library and development headers installed.' license-name: BSD3
Update from Hackage at 2018-01-21T06:50:54Z
Update from Hackage at 2018-01-21T06:50:54Z
YAML
mit
commercialhaskell/all-cabal-metadata
yaml
## Code Before: homepage: https://github.com/peddie/nlopt-haskell changelog-type: markdown hash: 829a4a9a63f6a3f45af99f7109c16f1d23c2b2e3ce786c8fa31a60c885a68dd1 test-bench-deps: base: ! '>=4.9 && <4.10' nlopt-haskell: -any vector: ! '>=0.10' maintainer: Matthew Peddie <[email protected]> synopsis: Low-level bindings to the NLOPT optimization library changelog: ! '# Revision history for haskell-nlopt ## 0.1.0.0 -- 2017-27-03 * First version, for NLOPT 2.4.2. The entire API is brought out. ' basic-deps: base: ! '>=4.9 && <4.10' vector: ! '>=0.10' all-versions: - '0.1.0.0' author: Matthew Peddie <[email protected]> latest: '0.1.0.0' description-type: haddock description: ! 'This library provides low-level bindings to <http://ab-initio.mit.edu/wiki/index.php/NLopt the NLOPT optimization library>. You will need the NLOPT library and development headers installed.' license-name: BSD3 ## Instruction: Update from Hackage at 2018-01-21T06:50:54Z ## Code After: homepage: https://github.com/peddie/nlopt-haskell changelog-type: markdown hash: 3e436a5ae44a25ceceb03c3b334768b83c21ff93db1d8ac7bdf426d3f2db224e test-bench-deps: base: ! '>=4.9 && <4.11' nlopt-haskell: -any vector: ! '>=0.10' maintainer: Matthew Peddie <[email protected]> synopsis: Low-level bindings to the NLOPT optimization library changelog: ! '# Revision history for haskell-nlopt ## 0.1.0.0 -- 2017-27-03 * First version, for NLOPT 2.4.2. The entire API is brought out. ' basic-deps: base: ! '>=4.9 && <4.11' vector: ! '>=0.10' all-versions: - '0.1.0.0' - '0.1.1.0' author: Matthew Peddie <[email protected]> latest: '0.1.1.0' description-type: haddock description: ! 'This library provides low-level bindings to <http://ab-initio.mit.edu/wiki/index.php/NLopt the NLOPT optimization library>. You will need the NLOPT library and development headers installed.' license-name: BSD3
64a1aa4f12b3d285b2dd4ba534f814ed11b0c8f2
src/js/geometry/box-geometry.js
src/js/geometry/box-geometry.js
'use strict'; module.exports = function addBoxGeometry( geometry, width, height, depth, dx, dy, dz ) { dx = dx || 0; dy = dy || 0; dz = dz || 0; var halfWidth = width / 2; var halfHeight = height / 2; var halfDepth = depth / 2; var vertices = [ // Counterclockwise from far left. // Bottom. -halfWidth, -halfHeight, -halfDepth, -halfWidth, -halfHeight, halfDepth, halfWidth, -halfHeight, halfDepth, halfWidth, -halfHeight, -halfDepth, // Top. -halfWidth, halfHeight, -halfDepth, -halfWidth, halfHeight, halfDepth, halfWidth, halfHeight, halfDepth, halfWidth, halfHeight, -halfDepth ]; for ( var i = 0; i < vertices.length; i += 3 ) { vertices[ i ] += dx; vertices[ i + 1 ] += dy; vertices[ i + 2 ] += dz; } var faces = [ // Sides. [ 0, 1, 5, 4 ], [ 1, 2, 6, 5 ], [ 2, 3, 7, 6 ], [ 3, 0, 4, 7 ], // Top. [ 4, 5, 6, 7 ], // Bottom. [ 0, 3, 2, 1 ] ]; return geometry.push( vertices, faces ); };
'use strict'; module.exports = function addBoxGeometry( geometry, width, height, depth ) { var halfWidth = width / 2; var halfHeight = height / 2; var halfDepth = depth / 2; var vertices = [ // Counterclockwise from far left. // Bottom. -halfWidth, -halfHeight, -halfDepth, -halfWidth, -halfHeight, halfDepth, halfWidth, -halfHeight, halfDepth, halfWidth, -halfHeight, -halfDepth, // Top. -halfWidth, halfHeight, -halfDepth, -halfWidth, halfHeight, halfDepth, halfWidth, halfHeight, halfDepth, halfWidth, halfHeight, -halfDepth ]; var faces = [ // Sides. [ 0, 1, 5, 4 ], [ 1, 2, 6, 5 ], [ 2, 3, 7, 6 ], [ 3, 0, 4, 7 ], // Top. [ 4, 5, 6, 7 ], // Bottom. [ 0, 3, 2, 1 ] ]; return geometry.push( vertices, faces ); };
Remove box geometry offset arguments.
Remove box geometry offset arguments.
JavaScript
mit
js13kGames/Roulette,js13kGames/Roulette,razh/js13k-2015,razh/js13k-2015,Hitman666/js13k-2015,Hitman666/js13k-2015
javascript
## Code Before: 'use strict'; module.exports = function addBoxGeometry( geometry, width, height, depth, dx, dy, dz ) { dx = dx || 0; dy = dy || 0; dz = dz || 0; var halfWidth = width / 2; var halfHeight = height / 2; var halfDepth = depth / 2; var vertices = [ // Counterclockwise from far left. // Bottom. -halfWidth, -halfHeight, -halfDepth, -halfWidth, -halfHeight, halfDepth, halfWidth, -halfHeight, halfDepth, halfWidth, -halfHeight, -halfDepth, // Top. -halfWidth, halfHeight, -halfDepth, -halfWidth, halfHeight, halfDepth, halfWidth, halfHeight, halfDepth, halfWidth, halfHeight, -halfDepth ]; for ( var i = 0; i < vertices.length; i += 3 ) { vertices[ i ] += dx; vertices[ i + 1 ] += dy; vertices[ i + 2 ] += dz; } var faces = [ // Sides. [ 0, 1, 5, 4 ], [ 1, 2, 6, 5 ], [ 2, 3, 7, 6 ], [ 3, 0, 4, 7 ], // Top. [ 4, 5, 6, 7 ], // Bottom. [ 0, 3, 2, 1 ] ]; return geometry.push( vertices, faces ); }; ## Instruction: Remove box geometry offset arguments. ## Code After: 'use strict'; module.exports = function addBoxGeometry( geometry, width, height, depth ) { var halfWidth = width / 2; var halfHeight = height / 2; var halfDepth = depth / 2; var vertices = [ // Counterclockwise from far left. // Bottom. -halfWidth, -halfHeight, -halfDepth, -halfWidth, -halfHeight, halfDepth, halfWidth, -halfHeight, halfDepth, halfWidth, -halfHeight, -halfDepth, // Top. -halfWidth, halfHeight, -halfDepth, -halfWidth, halfHeight, halfDepth, halfWidth, halfHeight, halfDepth, halfWidth, halfHeight, -halfDepth ]; var faces = [ // Sides. [ 0, 1, 5, 4 ], [ 1, 2, 6, 5 ], [ 2, 3, 7, 6 ], [ 3, 0, 4, 7 ], // Top. [ 4, 5, 6, 7 ], // Bottom. [ 0, 3, 2, 1 ] ]; return geometry.push( vertices, faces ); };
fecb837888f3754a06ad2a33ddcf5ee561ccadaa
src/Node/Blockquote.php
src/Node/Blockquote.php
<?php namespace FluxBB\Markdown\Node; class Blockquote extends Block implements NodeInterface, NodeAcceptorInterface { public function getType() { return 'block_quote'; } public function canContain(Node $other) { return $other->getType() == 'paragraph'; } public function accepts(Node $block) { return $block->getType() == 'paragraph'; } public function proposeTo(NodeAcceptorInterface $block) { return $block->acceptBlockquote($this); } public function acceptParagraph(Paragraph $paragraph) { $this->addChild($paragraph); return $paragraph; } public function acceptBlockquote(Blockquote $blockquote) { $this->merge($blockquote); return $this; } public function acceptBlankLine(BlankLine $blankLine) { $this->close(); return $this->parent; } public function visit(NodeVisitorInterface $visitor) { $visitor->enterBlockquote($this); parent::visit($visitor); $visitor->leaveBlockquote($this); } }
<?php namespace FluxBB\Markdown\Node; class Blockquote extends Block implements NodeInterface, NodeAcceptorInterface { public function getType() { return 'block_quote'; } public function canContain(Node $other) { return $other->getType() == 'paragraph'; } public function accepts(Node $block) { return $block->getType() == 'paragraph'; } public function proposeTo(NodeAcceptorInterface $block) { return $block->acceptBlockquote($this); } public function acceptParagraph(Paragraph $paragraph) { $this->addChild($paragraph); return $paragraph; } public function acceptHeading(Heading $heading) { $this->addChild($heading); return $this; } public function acceptBlockquote(Blockquote $blockquote) { $this->merge($blockquote); return $this; } public function acceptBlankLine(BlankLine $blankLine) { $this->close(); return $this->parent; } public function visit(NodeVisitorInterface $visitor) { $visitor->enterBlockquote($this); parent::visit($visitor); $visitor->leaveBlockquote($this); } }
Make sure headers can be children of blockquotes.
Make sure headers can be children of blockquotes.
PHP
mit
fluxbb/commonmark
php
## Code Before: <?php namespace FluxBB\Markdown\Node; class Blockquote extends Block implements NodeInterface, NodeAcceptorInterface { public function getType() { return 'block_quote'; } public function canContain(Node $other) { return $other->getType() == 'paragraph'; } public function accepts(Node $block) { return $block->getType() == 'paragraph'; } public function proposeTo(NodeAcceptorInterface $block) { return $block->acceptBlockquote($this); } public function acceptParagraph(Paragraph $paragraph) { $this->addChild($paragraph); return $paragraph; } public function acceptBlockquote(Blockquote $blockquote) { $this->merge($blockquote); return $this; } public function acceptBlankLine(BlankLine $blankLine) { $this->close(); return $this->parent; } public function visit(NodeVisitorInterface $visitor) { $visitor->enterBlockquote($this); parent::visit($visitor); $visitor->leaveBlockquote($this); } } ## Instruction: Make sure headers can be children of blockquotes. ## Code After: <?php namespace FluxBB\Markdown\Node; class Blockquote extends Block implements NodeInterface, NodeAcceptorInterface { public function getType() { return 'block_quote'; } public function canContain(Node $other) { return $other->getType() == 'paragraph'; } public function accepts(Node $block) { return $block->getType() == 'paragraph'; } public function proposeTo(NodeAcceptorInterface $block) { return $block->acceptBlockquote($this); } public function acceptParagraph(Paragraph $paragraph) { $this->addChild($paragraph); return $paragraph; } public function acceptHeading(Heading $heading) { $this->addChild($heading); return $this; } public function acceptBlockquote(Blockquote $blockquote) { $this->merge($blockquote); return $this; } public function acceptBlankLine(BlankLine $blankLine) { $this->close(); return $this->parent; } public function visit(NodeVisitorInterface $visitor) { $visitor->enterBlockquote($this); parent::visit($visitor); $visitor->leaveBlockquote($this); } }
3e7c22849f1f5da1115ed3048d09fe80c16a5b62
example/dynamic.html
example/dynamic.html
<!DOCTYPE html> <html ng-app="example"> <head lang="en"> <meta charset="UTF-8"> <title>Example - Dynamic Table</title> <link rel="stylesheet" type="text/css" href="style/jquery.treetable.css" /> <link rel="stylesheet" type="text/css" href="style/jquery.treetable.theme.default.css" /> <script src="lib/jquery.min.js"></script> <script src="lib/jquery.treetable.js"></script> <script src="lib/angular.min.js"></script> <script src="../src/angular-treetable.js"></script> <script src="js/example_dynamic.js"></script> </head> <body ng-controller="ExampleCtrl"> <h3>Dynamic File Browser</h3> <script type="text/ng-template" id="tree_node"> <tr tt-node is-branch="node.type == 'folder'"> <td><span ng-bind="node.name"></span></td> <td ng-bind="node.type"></td> <td ng-bind="node.size"></td> </tr> </script> <table id="table" tt-table nodes="get_nodes" template="'tree_node'"> <thead> <tr> <th>Filename</th> <th>Type</th> <th>Size</th> </tr> </thead> <tbody></tbody> </table> </body> </html>
<!DOCTYPE html> <html ng-app="example"> <head lang="en"> <meta charset="UTF-8"> <title>Example - Dynamic Table</title> <link rel="stylesheet" type="text/css" href="style/jquery.treetable.css" /> <link rel="stylesheet" type="text/css" href="style/jquery.treetable.theme.default.css" /> <script src="lib/jquery.min.js"></script> <script src="lib/jquery.treetable.js"></script> <script src="lib/angular.min.js"></script> <script src="../src/angular-treetable.js"></script> <script src="js/example_dynamic.js"></script> </head> <body ng-controller="ExampleCtrl"> <h3>Dynamic File Browser</h3> <script type="text/ng-template" id="tree_node"> <tr tt-node is-branch="node.type == 'folder'" parent="parent"> <td><span ng-bind="node.name"></span></td> <td ng-bind="node.type"></td> <td ng-bind="node.size"></td> </tr> </script> <table id="table" tt-table nodes="get_nodes" template="'tree_node'"> <thead> <tr> <th>Filename</th> <th>Type</th> <th>Size</th> </tr> </thead> <tbody></tbody> </table> </body> </html>
Fix bug in the example, not passing along parent
Fix bug in the example, not passing along parent
HTML
mit
GarrettHeel/angular-treetable,yesobo/angular-treetable,yesobo/angular-treetable
html
## Code Before: <!DOCTYPE html> <html ng-app="example"> <head lang="en"> <meta charset="UTF-8"> <title>Example - Dynamic Table</title> <link rel="stylesheet" type="text/css" href="style/jquery.treetable.css" /> <link rel="stylesheet" type="text/css" href="style/jquery.treetable.theme.default.css" /> <script src="lib/jquery.min.js"></script> <script src="lib/jquery.treetable.js"></script> <script src="lib/angular.min.js"></script> <script src="../src/angular-treetable.js"></script> <script src="js/example_dynamic.js"></script> </head> <body ng-controller="ExampleCtrl"> <h3>Dynamic File Browser</h3> <script type="text/ng-template" id="tree_node"> <tr tt-node is-branch="node.type == 'folder'"> <td><span ng-bind="node.name"></span></td> <td ng-bind="node.type"></td> <td ng-bind="node.size"></td> </tr> </script> <table id="table" tt-table nodes="get_nodes" template="'tree_node'"> <thead> <tr> <th>Filename</th> <th>Type</th> <th>Size</th> </tr> </thead> <tbody></tbody> </table> </body> </html> ## Instruction: Fix bug in the example, not passing along parent ## Code After: <!DOCTYPE html> <html ng-app="example"> <head lang="en"> <meta charset="UTF-8"> <title>Example - Dynamic Table</title> <link rel="stylesheet" type="text/css" href="style/jquery.treetable.css" /> <link rel="stylesheet" type="text/css" href="style/jquery.treetable.theme.default.css" /> <script src="lib/jquery.min.js"></script> <script src="lib/jquery.treetable.js"></script> <script src="lib/angular.min.js"></script> <script src="../src/angular-treetable.js"></script> <script src="js/example_dynamic.js"></script> </head> <body ng-controller="ExampleCtrl"> <h3>Dynamic File Browser</h3> <script type="text/ng-template" id="tree_node"> <tr tt-node is-branch="node.type == 'folder'" parent="parent"> <td><span ng-bind="node.name"></span></td> <td ng-bind="node.type"></td> <td ng-bind="node.size"></td> </tr> </script> <table id="table" tt-table nodes="get_nodes" template="'tree_node'"> <thead> <tr> <th>Filename</th> <th>Type</th> <th>Size</th> </tr> </thead> <tbody></tbody> </table> </body> </html>
12d9dffdebc9f157e320d059fe2de7416a2baf96
index.html
index.html
<!DOCTYPE html> <html> <body> <p>Hello Github!</p> <p>Testing speed</p> </body> </html>
<!DOCTYPE html> <html> <body> <p>Hello Github!</p> <p>Testing speed</p> <p>Now in jekyll</p> </body> </html>
Add <p> on testing jekyll
Add <p> on testing jekyll added a <p> for testing jekyll
HTML
mit
tomhohenstein/tomhohenstein.github.io,tomhohenstein/tomhohenstein.github.io,tomhohenstein/tomhohenstein.github.io
html
## Code Before: <!DOCTYPE html> <html> <body> <p>Hello Github!</p> <p>Testing speed</p> </body> </html> ## Instruction: Add <p> on testing jekyll added a <p> for testing jekyll ## Code After: <!DOCTYPE html> <html> <body> <p>Hello Github!</p> <p>Testing speed</p> <p>Now in jekyll</p> </body> </html>
11b9a99bf12f16ededdf79af2e46034e96857659
WebContent/META-INF/context.xml
WebContent/META-INF/context.xml
<?xml version="1.0" encoding="UTF-8"?> <Context path="/courses"> <Resource name="jdbc/coursedb" auth="Container" type="javax.sql.DataSource" factory="org.apache.tomcat.jdbc.pool.DataSourceFactory" driverClassName="com.mysql.jdbc.Driver" url="jdbc:mysql://devproserv.com:54406/coursedb" username="javauser" password="zSxcvb" initialSize="10" maxActive="20" maxIdle="10" minIdle="5" maxWait="10000" testOnBorrow="true" testOnConnect="true" testOnReturn="true" testWhileIdle="true" validationQuery="SELECT 1" validationInterval="60000" minEvictableIdleTimeMillis="60000" timeBetweenEvictionRunsMillis="60000" removeAbandoned="true" removeAbandonedTimeout="60" logAbandoned="true" /> </Context>
<?xml version="1.0" encoding="UTF-8"?> <Context path="/courses"> <Resource name="jdbc/coursedb" auth="Container" type="javax.sql.DataSource" factory="org.apache.tomcat.jdbc.pool.DataSourceFactory" driverClassName="com.mysql.jdbc.Driver" url="jdbc:mysql://devproserv.com:54406/coursedb" connectionProperties="useUnicode=yes;characterEncoding=utf8;" username="javauser" password="zSxcvb" initialSize="10" maxActive="20" maxIdle="10" minIdle="5" maxWait="10000" testOnBorrow="true" testOnConnect="true" testOnReturn="true" testWhileIdle="true" validationQuery="SELECT 1" validationInterval="60000" minEvictableIdleTimeMillis="60000" timeBetweenEvictionRunsMillis="60000" removeAbandoned="true" removeAbandonedTimeout="60" logAbandoned="true" /> </Context>
Set encoding UTF8 in the connection pool
Set encoding UTF8 in the connection pool
XML
mit
Vovas11/courses,Vovas11/courses,Vovas11/courses
xml
## Code Before: <?xml version="1.0" encoding="UTF-8"?> <Context path="/courses"> <Resource name="jdbc/coursedb" auth="Container" type="javax.sql.DataSource" factory="org.apache.tomcat.jdbc.pool.DataSourceFactory" driverClassName="com.mysql.jdbc.Driver" url="jdbc:mysql://devproserv.com:54406/coursedb" username="javauser" password="zSxcvb" initialSize="10" maxActive="20" maxIdle="10" minIdle="5" maxWait="10000" testOnBorrow="true" testOnConnect="true" testOnReturn="true" testWhileIdle="true" validationQuery="SELECT 1" validationInterval="60000" minEvictableIdleTimeMillis="60000" timeBetweenEvictionRunsMillis="60000" removeAbandoned="true" removeAbandonedTimeout="60" logAbandoned="true" /> </Context> ## Instruction: Set encoding UTF8 in the connection pool ## Code After: <?xml version="1.0" encoding="UTF-8"?> <Context path="/courses"> <Resource name="jdbc/coursedb" auth="Container" type="javax.sql.DataSource" factory="org.apache.tomcat.jdbc.pool.DataSourceFactory" driverClassName="com.mysql.jdbc.Driver" url="jdbc:mysql://devproserv.com:54406/coursedb" connectionProperties="useUnicode=yes;characterEncoding=utf8;" username="javauser" password="zSxcvb" initialSize="10" maxActive="20" maxIdle="10" minIdle="5" maxWait="10000" testOnBorrow="true" testOnConnect="true" testOnReturn="true" testWhileIdle="true" validationQuery="SELECT 1" validationInterval="60000" minEvictableIdleTimeMillis="60000" timeBetweenEvictionRunsMillis="60000" removeAbandoned="true" removeAbandonedTimeout="60" logAbandoned="true" /> </Context>
e552e1fc22c66c593c4479d85b7a770fda09e5d0
spec/services/delete_branch_service_spec.rb
spec/services/delete_branch_service_spec.rb
require 'spec_helper' describe DeleteBranchService, services: true do let(:project) { create(:project) } let(:user) { create(:user) } let(:service) { described_class.new(project, user) } describe '#execute' do let(:result) { service.execute('feature') } context 'when user has access to push to repository' do before do project.team << [user, :developer] end it 'removes the branch' do expect(result[:status]).to eq :success end end context 'when user does not have access to push to repository' do it 'does not remove branch' do expect(result[:status]).to eq :error expect(result[:message]).to eq 'You dont have push access to repo' end end end end
require 'spec_helper' describe DeleteBranchService, services: true do let(:project) { create(:project) } let(:repository) { project.repository } let(:user) { create(:user) } let(:service) { described_class.new(project, user) } describe '#execute' do context 'when user has access to push to repository' do before do project.team << [user, :developer] end it 'removes the branch' do expect(branch_exists?('feature')).to be true result = service.execute('feature') expect(result[:status]).to eq :success expect(branch_exists?('feature')).to be false end end context 'when user does not have access to push to repository' do it 'does not remove branch' do expect(branch_exists?('feature')).to be true result = service.execute('feature') expect(result[:status]).to eq :error expect(result[:message]).to eq 'You dont have push access to repo' expect(branch_exists?('feature')).to be true end end end def branch_exists?(branch_name) repository.ref_exists?("refs/heads/#{branch_name}") end end
Extend tests for delete branch service
Extend tests for delete branch service
Ruby
mit
LUMC/gitlabhq,iiet/iiet-git,iiet/iiet-git,dreampet/gitlab,dplarson/gitlabhq,openwide-java/gitlabhq,openwide-java/gitlabhq,mmkassem/gitlabhq,stoplightio/gitlabhq,jirutka/gitlabhq,htve/GitlabForChinese,stoplightio/gitlabhq,t-zuehlsdorff/gitlabhq,jirutka/gitlabhq,t-zuehlsdorff/gitlabhq,dplarson/gitlabhq,dplarson/gitlabhq,dreampet/gitlab,jirutka/gitlabhq,iiet/iiet-git,dreampet/gitlab,darkrasid/gitlabhq,dreampet/gitlab,darkrasid/gitlabhq,openwide-java/gitlabhq,iiet/iiet-git,stoplightio/gitlabhq,LUMC/gitlabhq,mmkassem/gitlabhq,htve/GitlabForChinese,axilleas/gitlabhq,LUMC/gitlabhq,LUMC/gitlabhq,axilleas/gitlabhq,axilleas/gitlabhq,t-zuehlsdorff/gitlabhq,mmkassem/gitlabhq,darkrasid/gitlabhq,htve/GitlabForChinese,dplarson/gitlabhq,stoplightio/gitlabhq,jirutka/gitlabhq,mmkassem/gitlabhq,darkrasid/gitlabhq,openwide-java/gitlabhq,axilleas/gitlabhq,htve/GitlabForChinese,t-zuehlsdorff/gitlabhq
ruby
## Code Before: require 'spec_helper' describe DeleteBranchService, services: true do let(:project) { create(:project) } let(:user) { create(:user) } let(:service) { described_class.new(project, user) } describe '#execute' do let(:result) { service.execute('feature') } context 'when user has access to push to repository' do before do project.team << [user, :developer] end it 'removes the branch' do expect(result[:status]).to eq :success end end context 'when user does not have access to push to repository' do it 'does not remove branch' do expect(result[:status]).to eq :error expect(result[:message]).to eq 'You dont have push access to repo' end end end end ## Instruction: Extend tests for delete branch service ## Code After: require 'spec_helper' describe DeleteBranchService, services: true do let(:project) { create(:project) } let(:repository) { project.repository } let(:user) { create(:user) } let(:service) { described_class.new(project, user) } describe '#execute' do context 'when user has access to push to repository' do before do project.team << [user, :developer] end it 'removes the branch' do expect(branch_exists?('feature')).to be true result = service.execute('feature') expect(result[:status]).to eq :success expect(branch_exists?('feature')).to be false end end context 'when user does not have access to push to repository' do it 'does not remove branch' do expect(branch_exists?('feature')).to be true result = service.execute('feature') expect(result[:status]).to eq :error expect(result[:message]).to eq 'You dont have push access to repo' expect(branch_exists?('feature')).to be true end end end def branch_exists?(branch_name) repository.ref_exists?("refs/heads/#{branch_name}") end end
1f3730ac4d531ca0d582a8b8bded871acb409847
backend/api-server/warehaus_api/events/models.py
backend/api-server/warehaus_api/events/models.py
from .. import db class Event(db.Model): timestamp = db.Field() obj_id = db.Field() # The object for which this event was created about user_id = db.Field() # The user who performed the action # A list of IDs which are interested in this event. For example, when creating # a server we obviously want this event to be shows in the server page, but we # also want it to be shown in the lab page. So we put two IDs in the list: the # server ID and the lab ID. # Another example is when we delete the server. Then we would be able to show # that event in the lab page although the server is already deleted. interested_ids = db.Field() title = db.Field() # Event title content = db.Field() # Event content def create_event(obj_id, user_id, interested_ids, title, content=''): event = Event( timestamp = db.times.now(), obj_id = obj_id, interested_ids = interested_ids, title = title, content = content, ) event.save()
from .. import db class Event(db.Model): timestamp = db.Field() obj_id = db.Field() # The object for which this event was created about user_id = db.Field() # The user who performed the action # A list of IDs which are interested in this event. For example, when creating # a server we obviously want this event to be shows in the server page, but we # also want it to be shown in the lab page. So we put two IDs in the list: the # server ID and the lab ID. # Another example is when we delete the server. Then we would be able to show # that event in the lab page although the server is already deleted. interested_ids = db.Field() title = db.Field() # Event title content = db.Field() # Event content def create_event(obj_id, user_id, interested_ids, title, content=''): event = Event( timestamp = db.times.now(), obj_id = obj_id, user_id = user_id, interested_ids = interested_ids, title = title, content = content, ) event.save()
Fix api-server events not saving the user ID
Fix api-server events not saving the user ID
Python
agpl-3.0
labsome/labsome,warehaus/warehaus,warehaus/warehaus,labsome/labsome,warehaus/warehaus,labsome/labsome
python
## Code Before: from .. import db class Event(db.Model): timestamp = db.Field() obj_id = db.Field() # The object for which this event was created about user_id = db.Field() # The user who performed the action # A list of IDs which are interested in this event. For example, when creating # a server we obviously want this event to be shows in the server page, but we # also want it to be shown in the lab page. So we put two IDs in the list: the # server ID and the lab ID. # Another example is when we delete the server. Then we would be able to show # that event in the lab page although the server is already deleted. interested_ids = db.Field() title = db.Field() # Event title content = db.Field() # Event content def create_event(obj_id, user_id, interested_ids, title, content=''): event = Event( timestamp = db.times.now(), obj_id = obj_id, interested_ids = interested_ids, title = title, content = content, ) event.save() ## Instruction: Fix api-server events not saving the user ID ## Code After: from .. import db class Event(db.Model): timestamp = db.Field() obj_id = db.Field() # The object for which this event was created about user_id = db.Field() # The user who performed the action # A list of IDs which are interested in this event. For example, when creating # a server we obviously want this event to be shows in the server page, but we # also want it to be shown in the lab page. So we put two IDs in the list: the # server ID and the lab ID. # Another example is when we delete the server. Then we would be able to show # that event in the lab page although the server is already deleted. interested_ids = db.Field() title = db.Field() # Event title content = db.Field() # Event content def create_event(obj_id, user_id, interested_ids, title, content=''): event = Event( timestamp = db.times.now(), obj_id = obj_id, user_id = user_id, interested_ids = interested_ids, title = title, content = content, ) event.save()
85d7102a958298b299aba66ce43a0d7643bc644d
tests/scripts/vagrant_up.sh
tests/scripts/vagrant_up.sh
vagrant box add --provider libvirt --name centos/8 https://cloud.centos.org/centos/8/x86_64/images/CentOS-8-Vagrant-8.1.1911-20200113.3.x86_64.vagrant-libvirt.box retries=0 until [ $retries -ge 5 ] do echo "Attempting to start VMs. Attempts: $retries" timeout 10m time vagrant up "$@" && break retries=$[$retries+1] sleep 5 done sleep 10
retries=0 until [ $retries -ge 5 ] do echo "Attempting to start VMs. Attempts: $retries" timeout 10m time vagrant up "$@" && break retries=$[$retries+1] sleep 5 done sleep 10
Revert "vagrant: temp workaround for CentOS 8 cloud image"
Revert "vagrant: temp workaround for CentOS 8 cloud image" The CentOS 8 vagrant image download is now fixed. This reverts commit a5385e104884a3692954e4691f3348847a35c7fa. Signed-off-by: Dimitri Savineau <[email protected]>
Shell
apache-2.0
ceph/ceph-ansible,ceph/ceph-ansible
shell
## Code Before: vagrant box add --provider libvirt --name centos/8 https://cloud.centos.org/centos/8/x86_64/images/CentOS-8-Vagrant-8.1.1911-20200113.3.x86_64.vagrant-libvirt.box retries=0 until [ $retries -ge 5 ] do echo "Attempting to start VMs. Attempts: $retries" timeout 10m time vagrant up "$@" && break retries=$[$retries+1] sleep 5 done sleep 10 ## Instruction: Revert "vagrant: temp workaround for CentOS 8 cloud image" The CentOS 8 vagrant image download is now fixed. This reverts commit a5385e104884a3692954e4691f3348847a35c7fa. Signed-off-by: Dimitri Savineau <[email protected]> ## Code After: retries=0 until [ $retries -ge 5 ] do echo "Attempting to start VMs. Attempts: $retries" timeout 10m time vagrant up "$@" && break retries=$[$retries+1] sleep 5 done sleep 10
67bb0c9e3e191f622be9863a48717370266f9413
app/views/tasks/show.html.slim
app/views/tasks/show.html.slim
h1 = @task.title p = @task.state p = @task.milestone.try :name p = @task.assigned_user.try :name
h1 = @task.title p = @task.state p = @task.milestone_name p = @task.assigned_user_name
Remove even more `try` calls in task show.
Remove even more `try` calls in task show.
Slim
mit
vesln/freedom,vesln/freedom
slim
## Code Before: h1 = @task.title p = @task.state p = @task.milestone.try :name p = @task.assigned_user.try :name ## Instruction: Remove even more `try` calls in task show. ## Code After: h1 = @task.title p = @task.state p = @task.milestone_name p = @task.assigned_user_name
d6c0ad1c6f8a30e3e2e0272ea318f67d841a2abc
tools/check-cluster.sh
tools/check-cluster.sh
for i in `echo 3a 3b 3c 3d 3e 3f 3g 3h 3i 3j 3k 3l 4a 4b 4c`; do echo $i ssh volt$i ps -ef | grep java && exit done
for i in `echo 3a 3b 3c 3d 3e 3f 3g 3h 3i 3j 3k 3l 4a 4b 4c`; do echo $i ssh volt$i ps -ef | grep java | grep -v slave.jar && exit done
Allow cluster machines to be running a Hudson slave process.
Allow cluster machines to be running a Hudson slave process. Hudson slaves don't interfere with a server from a ports point of view, and practically speaking anything they're doing that interferes from a CPU point of view will probably still turn up in the check-cluster script.
Shell
agpl-3.0
simonzhangsm/voltdb,flybird119/voltdb,creative-quant/voltdb,paulmartel/voltdb,zuowang/voltdb,zuowang/voltdb,migue/voltdb,creative-quant/voltdb,flybird119/voltdb,deerwalk/voltdb,paulmartel/voltdb,flybird119/voltdb,kumarrus/voltdb,ingted/voltdb,simonzhangsm/voltdb,creative-quant/voltdb,VoltDB/voltdb,paulmartel/voltdb,flybird119/voltdb,kobronson/cs-voltdb,migue/voltdb,migue/voltdb,paulmartel/voltdb,paulmartel/voltdb,creative-quant/voltdb,ingted/voltdb,kumarrus/voltdb,ingted/voltdb,paulmartel/voltdb,VoltDB/voltdb,flybird119/voltdb,creative-quant/voltdb,zuowang/voltdb,wolffcm/voltdb,kobronson/cs-voltdb,flybird119/voltdb,deerwalk/voltdb,creative-quant/voltdb,ingted/voltdb,wolffcm/voltdb,simonzhangsm/voltdb,deerwalk/voltdb,kobronson/cs-voltdb,kumarrus/voltdb,VoltDB/voltdb,flybird119/voltdb,deerwalk/voltdb,kobronson/cs-voltdb,creative-quant/voltdb,simonzhangsm/voltdb,kumarrus/voltdb,ingted/voltdb,paulmartel/voltdb,paulmartel/voltdb,ingted/voltdb,creative-quant/voltdb,wolffcm/voltdb,simonzhangsm/voltdb,kumarrus/voltdb,deerwalk/voltdb,deerwalk/voltdb,kumarrus/voltdb,migue/voltdb,ingted/voltdb,wolffcm/voltdb,kumarrus/voltdb,kobronson/cs-voltdb,migue/voltdb,migue/voltdb,kobronson/cs-voltdb,deerwalk/voltdb,wolffcm/voltdb,kobronson/cs-voltdb,kumarrus/voltdb,simonzhangsm/voltdb,zuowang/voltdb,wolffcm/voltdb,zuowang/voltdb,wolffcm/voltdb,VoltDB/voltdb,VoltDB/voltdb,simonzhangsm/voltdb,simonzhangsm/voltdb,deerwalk/voltdb,zuowang/voltdb,migue/voltdb,ingted/voltdb,wolffcm/voltdb,zuowang/voltdb,kobronson/cs-voltdb,migue/voltdb,zuowang/voltdb,VoltDB/voltdb,flybird119/voltdb,VoltDB/voltdb
shell
## Code Before: for i in `echo 3a 3b 3c 3d 3e 3f 3g 3h 3i 3j 3k 3l 4a 4b 4c`; do echo $i ssh volt$i ps -ef | grep java && exit done ## Instruction: Allow cluster machines to be running a Hudson slave process. Hudson slaves don't interfere with a server from a ports point of view, and practically speaking anything they're doing that interferes from a CPU point of view will probably still turn up in the check-cluster script. ## Code After: for i in `echo 3a 3b 3c 3d 3e 3f 3g 3h 3i 3j 3k 3l 4a 4b 4c`; do echo $i ssh volt$i ps -ef | grep java | grep -v slave.jar && exit done
0e82fba1c9769f71c162c8364fe783d2cc3cda17
setup.py
setup.py
try: from setuptools import setup except ImportError: from distutils.core import setup from sys import platform import subprocess import glob import os ver = os.environ.get("PKGVER") or subprocess.run(['git', 'describe', '--tags'], stdout=subprocess.PIPE).stdout.decode().strip() setup( name = 'knightos', packages = ['knightos', 'knightos.commands'], version = ver, description = 'KnightOS SDK', author = 'Drew DeVault', author_email = '[email protected]', url = 'https://github.com/KnightOS/sdk', install_requires = ['requests', 'pyyaml', 'pystache', 'docopt'], license = 'AGPL-3.0', scripts=['bin/knightos'], package_data={ 'knightos': [ 'templates/\'*\'', 'templates/c/\'*\'', 'templates/assembly/\'*\'', ] }, )
try: from setuptools import setup except ImportError: from distutils.core import setup from sys import platform import subprocess import glob import os ver = os.environ.get("PKGVER") or subprocess.run(['git', 'describe', '--tags'], stdout=subprocess.PIPE).stdout.decode().strip() setup( name = 'knightos', packages = ['knightos', 'knightos.commands'], version = ver, description = 'KnightOS SDK', author = 'Drew DeVault', author_email = '[email protected]', url = 'https://github.com/KnightOS/sdk', install_requires = ['requests', 'pyyaml', 'pystache', 'docopt'], license = 'AGPL-3.0', scripts=['bin/knightos'], include_package_data=True, package_data={ 'knightos': [ 'knightos/templates/\'*\'', 'knightos/templates/c/\'*\'', 'knightos/templates/assembly/\'*\'', ] }, )
Fix templates building on OS X
Fix templates building on OS X
Python
mit
KnightOS/sdk,KnightOS/sdk,KnightOS/sdk
python
## Code Before: try: from setuptools import setup except ImportError: from distutils.core import setup from sys import platform import subprocess import glob import os ver = os.environ.get("PKGVER") or subprocess.run(['git', 'describe', '--tags'], stdout=subprocess.PIPE).stdout.decode().strip() setup( name = 'knightos', packages = ['knightos', 'knightos.commands'], version = ver, description = 'KnightOS SDK', author = 'Drew DeVault', author_email = '[email protected]', url = 'https://github.com/KnightOS/sdk', install_requires = ['requests', 'pyyaml', 'pystache', 'docopt'], license = 'AGPL-3.0', scripts=['bin/knightos'], package_data={ 'knightos': [ 'templates/\'*\'', 'templates/c/\'*\'', 'templates/assembly/\'*\'', ] }, ) ## Instruction: Fix templates building on OS X ## Code After: try: from setuptools import setup except ImportError: from distutils.core import setup from sys import platform import subprocess import glob import os ver = os.environ.get("PKGVER") or subprocess.run(['git', 'describe', '--tags'], stdout=subprocess.PIPE).stdout.decode().strip() setup( name = 'knightos', packages = ['knightos', 'knightos.commands'], version = ver, description = 'KnightOS SDK', author = 'Drew DeVault', author_email = '[email protected]', url = 'https://github.com/KnightOS/sdk', install_requires = ['requests', 'pyyaml', 'pystache', 'docopt'], license = 'AGPL-3.0', scripts=['bin/knightos'], include_package_data=True, package_data={ 'knightos': [ 'knightos/templates/\'*\'', 'knightos/templates/c/\'*\'', 'knightos/templates/assembly/\'*\'', ] }, )
a460713d36e8310a9f975d13d49579e77d83dfe7
examples/with-shapely.py
examples/with-shapely.py
import logging import sys from shapely.geometry import mapping, shape from fiona import collection logging.basicConfig(stream=sys.stderr, level=logging.INFO) with collection("docs/data/test_uk.shp", "r") as input: schema = input.schema.copy() with collection( "with-shapely.shp", "w", "ESRI Shapefile", schema ) as output: for f in input: try: geom = shape(f['geometry']) if not geom.is_valid: clean = geom.buffer(0.0) assert geom.is_valid geom = clean f['geometry'] = mapping(geom) output.write(f) except Exception, e: # Writing uncleanable features to a different shapefile # is another option. logging.exception("Error cleaning feature %s:", f['id'])
import logging import sys from shapely.geometry import mapping, shape from fiona import collection logging.basicConfig(stream=sys.stderr, level=logging.INFO) with collection("docs/data/test_uk.shp", "r") as input: schema = input.schema.copy() with collection( "with-shapely.shp", "w", "ESRI Shapefile", schema ) as output: for f in input: try: geom = shape(f['geometry']) if not geom.is_valid: clean = geom.buffer(0.0) assert clean.is_valid assert clean.geom_type == 'Polygon' geom = clean f['geometry'] = mapping(geom) output.write(f) except Exception, e: # Writing uncleanable features to a different shapefile # is another option. logging.exception("Error cleaning feature %s:", f['id'])
Fix validity assertion and add another.
Fix validity assertion and add another.
Python
bsd-3-clause
johanvdw/Fiona,Toblerity/Fiona,rbuffat/Fiona,Toblerity/Fiona,perrygeo/Fiona,perrygeo/Fiona,sgillies/Fiona,rbuffat/Fiona
python
## Code Before: import logging import sys from shapely.geometry import mapping, shape from fiona import collection logging.basicConfig(stream=sys.stderr, level=logging.INFO) with collection("docs/data/test_uk.shp", "r") as input: schema = input.schema.copy() with collection( "with-shapely.shp", "w", "ESRI Shapefile", schema ) as output: for f in input: try: geom = shape(f['geometry']) if not geom.is_valid: clean = geom.buffer(0.0) assert geom.is_valid geom = clean f['geometry'] = mapping(geom) output.write(f) except Exception, e: # Writing uncleanable features to a different shapefile # is another option. logging.exception("Error cleaning feature %s:", f['id']) ## Instruction: Fix validity assertion and add another. ## Code After: import logging import sys from shapely.geometry import mapping, shape from fiona import collection logging.basicConfig(stream=sys.stderr, level=logging.INFO) with collection("docs/data/test_uk.shp", "r") as input: schema = input.schema.copy() with collection( "with-shapely.shp", "w", "ESRI Shapefile", schema ) as output: for f in input: try: geom = shape(f['geometry']) if not geom.is_valid: clean = geom.buffer(0.0) assert clean.is_valid assert clean.geom_type == 'Polygon' geom = clean f['geometry'] = mapping(geom) output.write(f) except Exception, e: # Writing uncleanable features to a different shapefile # is another option. logging.exception("Error cleaning feature %s:", f['id'])
6c12f97bfed8b8a4749f75e1a508caf0ea310423
docker/update-production.py
docker/update-production.py
import argparse import subprocess import json import sys parser = argparse.ArgumentParser() args = parser.parse_args() def info(msg): sys.stdout.write('* {}\n'.format(msg)) sys.stdout.flush() info('Determining current production details...') output = subprocess.check_output(['tutum', 'service', 'inspect', 'lb.muzhack-staging']).decode( 'utf-8') data = json.loads(output) linked_service = data['linked_to_service'][0]['name'] info('Currently linked service is \'{}\''.format(linked_service)) if linked_service == 'muzhack-green': link_to = 'muzhack-blue' else: assert linked_service == 'muzhack-blue' link_to = 'muzhack-green' info('Redeploying service \'{}\'...'.format(link_to)) subprocess.check_call(['tutum', 'service', 'redeploy', '--sync', link_to], stdout=subprocess.PIPE) info('Linking to service \'{}\'...'.format(link_to)) subprocess.check_call(['tutum', 'service', 'set', '--link-service', '{0}:{0}'.format(link_to), '--sync', 'lb'], stdout=subprocess.PIPE) info('Successfully switched production service to {}'.format(link_to))
import argparse import subprocess import json import sys parser = argparse.ArgumentParser() args = parser.parse_args() def info(msg): sys.stdout.write('* {}\n'.format(msg)) sys.stdout.flush() info('Determining current production details...') output = subprocess.check_output(['tutum', 'service', 'inspect', 'lb.muzhack-staging']).decode( 'utf-8') data = json.loads(output) linked_service = data['linked_to_service'][0]['name'] info('Currently linked service is \'{}\''.format(linked_service)) if linked_service == 'muzhack-green': link_to = 'muzhack-blue' else: assert linked_service == 'muzhack-blue' link_to = 'muzhack-green' info('Redeploying service \'{}\'...'.format(link_to)) subprocess.check_call(['tutum', 'service', 'redeploy', '--sync', link_to], stdout=subprocess.PIPE) info('Linking to service \'{}\'...'.format(link_to)) subprocess.check_call(['tutum', 'service', 'set', '--link-service', '{0}:{0}'.format(link_to), '--sync', 'lb.muzhack-staging'], stdout=subprocess.PIPE) info('Successfully switched production service to {}'.format(link_to))
Make sure to update correct load balancer
Make sure to update correct load balancer
Python
mit
muzhack/musitechhub,muzhack/musitechhub,muzhack/muzhack,muzhack/muzhack,muzhack/musitechhub,muzhack/muzhack,muzhack/musitechhub,muzhack/muzhack
python
## Code Before: import argparse import subprocess import json import sys parser = argparse.ArgumentParser() args = parser.parse_args() def info(msg): sys.stdout.write('* {}\n'.format(msg)) sys.stdout.flush() info('Determining current production details...') output = subprocess.check_output(['tutum', 'service', 'inspect', 'lb.muzhack-staging']).decode( 'utf-8') data = json.loads(output) linked_service = data['linked_to_service'][0]['name'] info('Currently linked service is \'{}\''.format(linked_service)) if linked_service == 'muzhack-green': link_to = 'muzhack-blue' else: assert linked_service == 'muzhack-blue' link_to = 'muzhack-green' info('Redeploying service \'{}\'...'.format(link_to)) subprocess.check_call(['tutum', 'service', 'redeploy', '--sync', link_to], stdout=subprocess.PIPE) info('Linking to service \'{}\'...'.format(link_to)) subprocess.check_call(['tutum', 'service', 'set', '--link-service', '{0}:{0}'.format(link_to), '--sync', 'lb'], stdout=subprocess.PIPE) info('Successfully switched production service to {}'.format(link_to)) ## Instruction: Make sure to update correct load balancer ## Code After: import argparse import subprocess import json import sys parser = argparse.ArgumentParser() args = parser.parse_args() def info(msg): sys.stdout.write('* {}\n'.format(msg)) sys.stdout.flush() info('Determining current production details...') output = subprocess.check_output(['tutum', 'service', 'inspect', 'lb.muzhack-staging']).decode( 'utf-8') data = json.loads(output) linked_service = data['linked_to_service'][0]['name'] info('Currently linked service is \'{}\''.format(linked_service)) if linked_service == 'muzhack-green': link_to = 'muzhack-blue' else: assert linked_service == 'muzhack-blue' link_to = 'muzhack-green' info('Redeploying service \'{}\'...'.format(link_to)) subprocess.check_call(['tutum', 'service', 'redeploy', '--sync', link_to], stdout=subprocess.PIPE) info('Linking to service \'{}\'...'.format(link_to)) subprocess.check_call(['tutum', 'service', 'set', '--link-service', '{0}:{0}'.format(link_to), '--sync', 'lb.muzhack-staging'], stdout=subprocess.PIPE) info('Successfully switched production service to {}'.format(link_to))
268467c5eb12e92a031d22e3e0515e9d6fc264b7
ofgem_certificates.py
ofgem_certificates.py
__author__ = 'david reid'
__version__ = '0.1'

import argparse
from Ofgem.certificate import OfgemCertificateData

def do_search(month, year):
    ocd = OfgemCertificateData(month = month, year = year)
    if ocd.get_data():
        for c in ocd.certificates:
            print c.as_string()
    else:
        print "No certificates returned."

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Get details of a station by accreditation number')
    # parser.add_argument('accreditation', action='store', help='Accreditation number to search for')
    args = parser.parse_args()

    do_search(9, 2012)
__author__ = 'david reid'
__version__ = '0.1'

import argparse
from Ofgem.certificate import OfgemCertificateData

def do_search(month, year):
    ocd = OfgemCertificateData(month = month, year = year)
    if ocd.get_data():
        for c in ocd.certificates:
            print c.as_string()
    else:
        print "No certificates returned."

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Get Ofgem certificates for a given month & year')
    parser.add_argument('month', type=int, action='store', help='Month')
    parser.add_argument('year', type=int, action='store', help='Year')
    args = parser.parse_args()

    do_search(args.month, args.year)
Update the certificate script to accept month & year
Update the certificate script to accept month & year
Python
unlicense
zathras777/pywind,zathras777/pywind
python
## Code Before:
__author__ = 'david reid'
__version__ = '0.1'

import argparse
from Ofgem.certificate import OfgemCertificateData

def do_search(month, year):
    ocd = OfgemCertificateData(month = month, year = year)
    if ocd.get_data():
        for c in ocd.certificates:
            print c.as_string()
    else:
        print "No certificates returned."

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Get details of a station by accreditation number')
    # parser.add_argument('accreditation', action='store', help='Accreditation number to search for')
    args = parser.parse_args()

    do_search(9, 2012)

## Instruction:
Update the certificate script to accept month & year

## Code After:
__author__ = 'david reid'
__version__ = '0.1'

import argparse
from Ofgem.certificate import OfgemCertificateData

def do_search(month, year):
    ocd = OfgemCertificateData(month = month, year = year)
    if ocd.get_data():
        for c in ocd.certificates:
            print c.as_string()
    else:
        print "No certificates returned."

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Get Ofgem certificates for a given month & year')
    parser.add_argument('month', type=int, action='store', help='Month')
    parser.add_argument('year', type=int, action='store', help='Year')
    args = parser.parse_args()

    do_search(args.month, args.year)
89fee0e35e56ca0bc47877b567c4a6d39d060fd5
app/views/layouts/ci/_nav_dashboard.html.haml
app/views/layouts/ci/_nav_dashboard.html.haml
%ul.nav.nav-sidebar
  = nav_link do
    = link_to root_path, title: 'Back to dashboard', data: {placement: 'right'}, class: 'back-link' do
      = icon('caret-square-o-left fw')
      %span
        Back to GitLab
  %li.separate-item
  = nav_link path: 'projects#index' do
    = link_to ci_root_path do
      %i.fa.fa-home
      %span
        Projects
  - if current_user && current_user.is_admin?
    %li
      = link_to ci_admin_projects_path do
        %i.fa.fa-cogs
        %span
          Admin
  %li
    = link_to ci_help_path do
      %i.fa.fa-info
      %span
        Help
%ul.nav.nav-sidebar
  = nav_link do
    = link_to root_path, title: 'Back to dashboard', data: {placement: 'right'}, class: 'back-link' do
      = icon('caret-square-o-left fw')
      %span
        Back to GitLab
  %li.separate-item
  = nav_link path: 'projects#index' do
    = link_to ci_root_path do
      %i.fa.fa-home
      %span
        Projects
  - if current_user && current_user.is_admin?
    %li
      = link_to ci_admin_projects_path do
        %i.fa.fa-cogs
        %span
          Admin
  = nav_link path: "helps#show" do
    = link_to ci_help_path do
      %i.fa.fa-info
      %span
        Help
Fix active state of CI Help nav link.
Fix active state of CI Help nav link.
Haml
mit
sonalkr132/gitlabhq,Soullivaneuh/gitlabhq,chenrui2014/gitlabhq,mr-dxdy/gitlabhq,mrb/gitlabhq,OtkurBiz/gitlabhq,Soullivaneuh/gitlabhq,martijnvermaat/gitlabhq,mr-dxdy/gitlabhq,ferdinandrosario/gitlabhq,NKMR6194/gitlabhq,Tyrael/gitlabhq,joalmeid/gitlabhq,fscherwi/gitlabhq,gopeter/gitlabhq,SVArago/gitlabhq,copystudy/gitlabhq,fscherwi/gitlabhq,screenpages/gitlabhq,salipro4ever/gitlabhq,yatish27/gitlabhq,stoplightio/gitlabhq,dwrensha/gitlabhq,joalmeid/gitlabhq,allysonbarros/gitlabhq,ikappas/gitlabhq,allysonbarros/gitlabhq,Exeia/gitlabhq,Devin001/gitlabhq,axilleas/gitlabhq,Datacom/gitlabhq,mmkassem/gitlabhq,duduribeiro/gitlabhq,vjustov/gitlabhq,MauriceMohlek/gitlabhq,openwide-java/gitlabhq,shinexiao/gitlabhq,shinexiao/gitlabhq,mmkassem/gitlabhq,salipro4ever/gitlabhq,dwrensha/gitlabhq,dwrensha/gitlabhq,jrjang/gitlabhq,screenpages/gitlabhq,SVArago/gitlabhq,icedwater/gitlabhq,gopeter/gitlabhq,ferdinandrosario/gitlabhq,aaronsnyder/gitlabhq,Tyrael/gitlabhq,MauriceMohlek/gitlabhq,fpgentil/gitlabhq,MauriceMohlek/gitlabhq,MauriceMohlek/gitlabhq,axilleas/gitlabhq,mr-dxdy/gitlabhq,dreampet/gitlab,szechyjs/gitlabhq,ksoichiro/gitlabhq,sue445/gitlabhq,szechyjs/gitlabhq,delkyd/gitlabhq,sonalkr132/gitlabhq,daiyu/gitlab-zh,daiyu/gitlab-zh,louahola/gitlabhq,ttasanen/gitlabhq,t-zuehlsdorff/gitlabhq,joalmeid/gitlabhq,aaronsnyder/gitlabhq,louahola/gitlabhq,dreampet/gitlab,SVArago/gitlabhq,jrjang/gitlabhq,dplarson/gitlabhq,vjustov/gitlabhq,copystudy/gitlabhq,dreampet/gitlab,mrb/gitlabhq,Devin001/gitlabhq,jrjang/gitlabhq,htve/GitlabForChinese,htve/GitlabForChinese,darkrasid/gitlabhq,yfaizal/gitlabhq,Telekom-PD/gitlabhq,ksoichiro/gitlabhq,NKMR6194/gitlabhq,iiet/iiet-git,darkrasid/gitlabhq,cui-liqiang/gitlab-ce,duduribeiro/gitlabhq,LUMC/gitlabhq,dplarson/gitlabhq,LUMC/gitlabhq,OtkurBiz/gitlabhq,chenrui2014/gitlabhq,cui-liqiang/gitlab-ce,delkyd/gitlabhq,stoplightio/gitlabhq,allysonbarros/gitlabhq,ikappas/gitlabhq,Devin001/gitlabhq,cui-liqiang/gitlab-ce,salipro4ever/gitlabhq,gopeter/gitlabhq,Tyrael
/gitlabhq,allysonbarros/gitlabhq,mr-dxdy/gitlabhq,fscherwi/gitlabhq,delkyd/gitlabhq,sue445/gitlabhq,ksoichiro/gitlabhq,jrjang/gitlab-ce,vjustov/gitlabhq,Telekom-PD/gitlabhq,koreamic/gitlabhq,louahola/gitlabhq,icedwater/gitlabhq,sonalkr132/gitlabhq,OlegGirko/gitlab-ce,OtkurBiz/gitlabhq,OlegGirko/gitlab-ce,screenpages/gitlabhq,ttasanen/gitlabhq,salipro4ever/gitlabhq,copystudy/gitlabhq,Soullivaneuh/gitlabhq,gopeter/gitlabhq,Exeia/gitlabhq,OlegGirko/gitlab-ce,iiet/iiet-git,icedwater/gitlabhq,darkrasid/gitlabhq,t-zuehlsdorff/gitlabhq,axilleas/gitlabhq,larryli/gitlabhq,jrjang/gitlabhq,OlegGirko/gitlab-ce,t-zuehlsdorff/gitlabhq,jrjang/gitlab-ce,larryli/gitlabhq,ferdinandrosario/gitlabhq,dplarson/gitlabhq,openwide-java/gitlabhq,dplarson/gitlabhq,yatish27/gitlabhq,shinexiao/gitlabhq,iiet/iiet-git,fantasywind/gitlabhq,duduribeiro/gitlabhq,duduribeiro/gitlabhq,t-zuehlsdorff/gitlabhq,daiyu/gitlab-zh,fpgentil/gitlabhq,koreamic/gitlabhq,LUMC/gitlabhq,icedwater/gitlabhq,stoplightio/gitlabhq,copystudy/gitlabhq,sonalkr132/gitlabhq,louahola/gitlabhq,fpgentil/gitlabhq,daiyu/gitlab-zh,SVArago/gitlabhq,Tyrael/gitlabhq,ksoichiro/gitlabhq,cui-liqiang/gitlab-ce,jirutka/gitlabhq,jrjang/gitlab-ce,mrb/gitlabhq,yatish27/gitlabhq,mmkassem/gitlabhq,mmkassem/gitlabhq,jrjang/gitlab-ce,delkyd/gitlabhq,NKMR6194/gitlabhq,fpgentil/gitlabhq,fantasywind/gitlabhq,Telekom-PD/gitlabhq,dreampet/gitlab,koreamic/gitlabhq,ikappas/gitlabhq,iiet/iiet-git,larryli/gitlabhq,joalmeid/gitlabhq,aaronsnyder/gitlabhq,aaronsnyder/gitlabhq,yfaizal/gitlabhq,axilleas/gitlabhq,dwrensha/gitlabhq,martijnvermaat/gitlabhq,LUMC/gitlabhq,chenrui2014/gitlabhq,openwide-java/gitlabhq,martijnvermaat/gitlabhq,darkrasid/gitlabhq,larryli/gitlabhq,screenpages/gitlabhq,Datacom/gitlabhq,koreamic/gitlabhq,szechyjs/gitlabhq,Soullivaneuh/gitlabhq,szechyjs/gitlabhq,openwide-java/gitlabhq,Exeia/gitlabhq,sue445/gitlabhq,ttasanen/gitlabhq,Exeia/gitlabhq,NKMR6194/gitlabhq,sue445/gitlabhq,ttasanen/gitlabhq,shinexiao/gitlabhq,jirutka/gitlabhq,fantasy
wind/gitlabhq,fantasywind/gitlabhq,fscherwi/gitlabhq,yatish27/gitlabhq,vjustov/gitlabhq,OtkurBiz/gitlabhq,Datacom/gitlabhq,htve/GitlabForChinese,chenrui2014/gitlabhq,jirutka/gitlabhq,Devin001/gitlabhq,yfaizal/gitlabhq,htve/GitlabForChinese,Telekom-PD/gitlabhq,ikappas/gitlabhq,mrb/gitlabhq,martijnvermaat/gitlabhq,stoplightio/gitlabhq,ferdinandrosario/gitlabhq,jirutka/gitlabhq,Datacom/gitlabhq,yfaizal/gitlabhq
haml
## Code Before:
%ul.nav.nav-sidebar
  = nav_link do
    = link_to root_path, title: 'Back to dashboard', data: {placement: 'right'}, class: 'back-link' do
      = icon('caret-square-o-left fw')
      %span
        Back to GitLab
  %li.separate-item
  = nav_link path: 'projects#index' do
    = link_to ci_root_path do
      %i.fa.fa-home
      %span
        Projects
  - if current_user && current_user.is_admin?
    %li
      = link_to ci_admin_projects_path do
        %i.fa.fa-cogs
        %span
          Admin
  %li
    = link_to ci_help_path do
      %i.fa.fa-info
      %span
        Help

## Instruction:
Fix active state of CI Help nav link.

## Code After:
%ul.nav.nav-sidebar
  = nav_link do
    = link_to root_path, title: 'Back to dashboard', data: {placement: 'right'}, class: 'back-link' do
      = icon('caret-square-o-left fw')
      %span
        Back to GitLab
  %li.separate-item
  = nav_link path: 'projects#index' do
    = link_to ci_root_path do
      %i.fa.fa-home
      %span
        Projects
  - if current_user && current_user.is_admin?
    %li
      = link_to ci_admin_projects_path do
        %i.fa.fa-cogs
        %span
          Admin
  = nav_link path: "helps#show" do
    = link_to ci_help_path do
      %i.fa.fa-info
      %span
        Help
df5d6f061055377757302432e034990f57267d41
.travis.yml
.travis.yml
dist: trusty
sudo: false
group: beta
language: node_js
node_js:
  - node
addons:
  firefox: latest-esr
cache:
  directories:
    - node_modules
    - "$HOME/.cache/bower"
dist: trusty
sudo: false
group: beta
language: node_js
node_js:
  - node
addons:
  firefox: latest-esr
cache:
  directories:
    - node_modules
    - "$HOME/.cache/bower"
script:
  - xvfb-run npm test
Use xvfb-run for running tests inside Travis
Use xvfb-run for running tests inside Travis

459b9fc068c2a34585455190d75d5254c1be1ed5 removed the 'script' as Travis would use 'npm test' by default anyways. Initially this worked, but it seems that something has changed and we're now seeing timeouts, likely because the browsers do not properly start up.

For now: switch back to using 'xvfb-run' to make the tests work. Note that we do not want this xvfb-run encoded in the package.json 'test' script, because that would a) break windows compatibility, b) require local changes when debugging test.
YAML
apache-2.0
Collaborne/iron-justified-gallery
yaml
## Code Before:
dist: trusty
sudo: false
group: beta
language: node_js
node_js:
  - node
addons:
  firefox: latest-esr
cache:
  directories:
    - node_modules
    - "$HOME/.cache/bower"

## Instruction:
Use xvfb-run for running tests inside Travis

459b9fc068c2a34585455190d75d5254c1be1ed5 removed the 'script' as Travis would use 'npm test' by default anyways. Initially this worked, but it seems that something has changed and we're now seeing timeouts, likely because the browsers do not properly start up.

For now: switch back to using 'xvfb-run' to make the tests work. Note that we do not want this xvfb-run encoded in the package.json 'test' script, because that would a) break windows compatibility, b) require local changes when debugging test.

## Code After:
dist: trusty
sudo: false
group: beta
language: node_js
node_js:
  - node
addons:
  firefox: latest-esr
cache:
  directories:
    - node_modules
    - "$HOME/.cache/bower"
script:
  - xvfb-run npm test
24f531247365185204d4b9417235f7b4f6b35aff
CHANGELOG.md
CHANGELOG.md
* First public release, imported from CountDownLatch
* Added BouncingLatch, which works the same as a CountDownLatch, but allows counting up as well as down. #wait takes a block which is called on each tick.

# v1.0.0

* First public release, imported from CountDownLatch
Add changelog entry for BouncingLatch
Add changelog entry for BouncingLatch
Markdown
mit
benlangfeld/synchronicity
markdown
## Code Before:
* First public release, imported from CountDownLatch

## Instruction:
Add changelog entry for BouncingLatch

## Code After:
* Added BouncingLatch, which works the same as a CountDownLatch, but allows counting up as well as down. #wait takes a block which is called on each tick.

# v1.0.0

* First public release, imported from CountDownLatch
708cae7a0c5a46a15f927bcbdf126831e1dfa100
README.md
README.md
[![Build Status](https://travis-ci.org/codemirror/CodeMirror.svg)](https://travis-ci.org/codemirror/CodeMirror)
[![NPM version](https://img.shields.io/npm/v/codemirror.svg)](https://www.npmjs.org/package/codemirror)

CodeMirror is a JavaScript component that provides a code editor in the browser. When a mode is available for the language you are coding in, it will color your code, and optionally help with indentation.

The project page is http://codemirror.net

The manual is at http://codemirror.net/doc/manual.html

The contributing guidelines are in [CONTRIBUTING.md](https://github.com/codemirror/CodeMirror/blob/master/CONTRIBUTING.md)
[![Build Status](https://travis-ci.org/abdelouahabb/CodeMirror.svg)](https://travis-ci.org/abdelouahabb/CodeMirror)
[![NPM version](https://img.shields.io/npm/v/codemirror.svg)](https://www.npmjs.org/package/codemirror)

CodeMirror is a JavaScript component that provides a code editor in the browser. When a mode is available for the language you are coding in, it will color your code, and optionally help with indentation.

The project page is http://codemirror.net

The manual is at http://codemirror.net/doc/manual.html

The contributing guidelines are in [CONTRIBUTING.md](https://github.com/codemirror/CodeMirror/blob/master/CONTRIBUTING.md)
Make the travis check on this repo.
Make the travis check on this repo.
Markdown
mit
abdelouahabb/CodeMirror,abdelouahabb/CodeMirror,abdelouahabb/CodeMirror
markdown
## Code Before:
[![Build Status](https://travis-ci.org/codemirror/CodeMirror.svg)](https://travis-ci.org/codemirror/CodeMirror)
[![NPM version](https://img.shields.io/npm/v/codemirror.svg)](https://www.npmjs.org/package/codemirror)

CodeMirror is a JavaScript component that provides a code editor in the browser. When a mode is available for the language you are coding in, it will color your code, and optionally help with indentation.

The project page is http://codemirror.net

The manual is at http://codemirror.net/doc/manual.html

The contributing guidelines are in [CONTRIBUTING.md](https://github.com/codemirror/CodeMirror/blob/master/CONTRIBUTING.md)

## Instruction:
Make the travis check on this repo.

## Code After:
[![Build Status](https://travis-ci.org/abdelouahabb/CodeMirror.svg)](https://travis-ci.org/abdelouahabb/CodeMirror)
[![NPM version](https://img.shields.io/npm/v/codemirror.svg)](https://www.npmjs.org/package/codemirror)

CodeMirror is a JavaScript component that provides a code editor in the browser. When a mode is available for the language you are coding in, it will color your code, and optionally help with indentation.

The project page is http://codemirror.net

The manual is at http://codemirror.net/doc/manual.html

The contributing guidelines are in [CONTRIBUTING.md](https://github.com/codemirror/CodeMirror/blob/master/CONTRIBUTING.md)
312a6c392b6459797afc52d2aed20ff5a3e5895b
workstation.sh
workstation.sh
docker run -it --rm \
  -v $(pwd):/workspace \
  -v ~/.ssh/gitlab_rsa:/root/.ssh/id_rsa:ro \
  -v ~/.ssh/known_hosts:/root/.ssh/known_hosts:ro \
  -e AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  genesysarch/cloud-workstation $@
docker run -it --rm \
  -v $(pwd):/workspace \
  -v ~/.ssh/gitlab_rsa:/root/.ssh/id_rsa:ro \
  -v ~/.ssh/known_hosts:/root/.ssh/known_hosts:ro \
  -e AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  jmahowald/cloud-workstation $@
Use personal docker image for now
Use personal docker image for now
Shell
mit
genesyslab/cloud-workstation,genesyslab/cloud-workstation
shell
## Code Before:
docker run -it --rm \
  -v $(pwd):/workspace \
  -v ~/.ssh/gitlab_rsa:/root/.ssh/id_rsa:ro \
  -v ~/.ssh/known_hosts:/root/.ssh/known_hosts:ro \
  -e AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  genesysarch/cloud-workstation $@

## Instruction:
Use personal docker image for now

## Code After:
docker run -it --rm \
  -v $(pwd):/workspace \
  -v ~/.ssh/gitlab_rsa:/root/.ssh/id_rsa:ro \
  -v ~/.ssh/known_hosts:/root/.ssh/known_hosts:ro \
  -e AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  jmahowald/cloud-workstation $@
89a4f3df4e6b3b6927ebac31bdbb33617f8a934f
code/index.js
code/index.js
'use strict'

const fs = require('fs')
const path = require('path')
const mkdirp = require('mkdirp')
const assign = require('lodash.assign')
const makeTemplate = require('./make-template.js')
const defaultDirectives = require('./default-directives.js')

module.exports = function (directory, settings) {
  settings = settings || {}

  directory = path.resolve(process.cwd(), directory) + '/'

  settings.cacheDirectory = settings.cacheDirectory || directory + 'compiled/' + (new Date()).getTime() + '/'

  var directives = assign({}, defaultDirectives, settings.directives || {})

  var promises = {}

  return function load (name) {
    if (!promises[name]) {
      promises[name] = new Promise(function (resolve, reject) {
        fs.readFile(directory + name, { encoding: 'utf-8' }, function (err, result) {
          if (err) throw err

          makeTemplate(result, load, directives, function (err, result) {
            if (err) throw err

            mkdirp(path.dirname(settings.cacheDirectory + name + '.js'), function (err) {
              if (err) throw err

              fs.writeFile(settings.cacheDirectory + name + '.js', result, function (err) {
                if (err) throw err

                resolve(settings.cacheDirectory + name + '.js')
              })
            })
          })
        })
      })
    }

    return promises[name].then(function (path) {
      var Template = require(path)

      return Promise.resolve(function (content) {
        return (new Template()).render(content)
      })
    })
  }
}
'use strict'

const fs = require('fs')
const path = require('path')
const mkdirp = require('mkdirp')
const assign = require('lodash.assign')
const makeTemplate = require('./make-template.js')
const defaultDirectives = require('./default-directives.js')

module.exports = function (directory, settings) {
  settings = settings || {}

  directory = path.resolve(process.cwd(), directory) + '/'

  settings.cacheDirectory = settings.cacheDirectory || directory + 'compiled/'

  var directives = assign({}, defaultDirectives, settings.directives || {})

  var promises = {}

  return function load (name) {
    if (!promises[name]) {
      promises[name] = new Promise(function (resolve, reject) {
        fs.readFile(directory + name, { encoding: 'utf-8' }, function (err, result) {
          if (err) throw err

          makeTemplate(result, load, directives, function (err, result) {
            if (err) throw err

            mkdirp(path.dirname(settings.cacheDirectory + name + '.js'), function (err) {
              if (err) throw err

              fs.writeFile(settings.cacheDirectory + name + '.js', result, function (err) {
                if (err) throw err

                resolve(settings.cacheDirectory + name + '.js')
              })
            })
          })
        })
      })
    }

    return promises[name].then(function (path) {
      delete require.cache[path]

      var Template = require(path)

      return Promise.resolve(function (content) {
        return (new Template()).render(content)
      })
    })
  }
}
Delete require cache rather than create endless directories
Delete require cache rather than create endless directories
JavaScript
mit
erickmerchant/atlatl
javascript
## Code Before:
'use strict'

const fs = require('fs')
const path = require('path')
const mkdirp = require('mkdirp')
const assign = require('lodash.assign')
const makeTemplate = require('./make-template.js')
const defaultDirectives = require('./default-directives.js')

module.exports = function (directory, settings) {
  settings = settings || {}

  directory = path.resolve(process.cwd(), directory) + '/'

  settings.cacheDirectory = settings.cacheDirectory || directory + 'compiled/' + (new Date()).getTime() + '/'

  var directives = assign({}, defaultDirectives, settings.directives || {})

  var promises = {}

  return function load (name) {
    if (!promises[name]) {
      promises[name] = new Promise(function (resolve, reject) {
        fs.readFile(directory + name, { encoding: 'utf-8' }, function (err, result) {
          if (err) throw err

          makeTemplate(result, load, directives, function (err, result) {
            if (err) throw err

            mkdirp(path.dirname(settings.cacheDirectory + name + '.js'), function (err) {
              if (err) throw err

              fs.writeFile(settings.cacheDirectory + name + '.js', result, function (err) {
                if (err) throw err

                resolve(settings.cacheDirectory + name + '.js')
              })
            })
          })
        })
      })
    }

    return promises[name].then(function (path) {
      var Template = require(path)

      return Promise.resolve(function (content) {
        return (new Template()).render(content)
      })
    })
  }
}

## Instruction:
Delete require cache rather than create endless directories

## Code After:
'use strict'

const fs = require('fs')
const path = require('path')
const mkdirp = require('mkdirp')
const assign = require('lodash.assign')
const makeTemplate = require('./make-template.js')
const defaultDirectives = require('./default-directives.js')

module.exports = function (directory, settings) {
  settings = settings || {}

  directory = path.resolve(process.cwd(), directory) + '/'

  settings.cacheDirectory = settings.cacheDirectory || directory + 'compiled/'

  var directives = assign({}, defaultDirectives, settings.directives || {})

  var promises = {}

  return function load (name) {
    if (!promises[name]) {
      promises[name] = new Promise(function (resolve, reject) {
        fs.readFile(directory + name, { encoding: 'utf-8' }, function (err, result) {
          if (err) throw err

          makeTemplate(result, load, directives, function (err, result) {
            if (err) throw err

            mkdirp(path.dirname(settings.cacheDirectory + name + '.js'), function (err) {
              if (err) throw err

              fs.writeFile(settings.cacheDirectory + name + '.js', result, function (err) {
                if (err) throw err

                resolve(settings.cacheDirectory + name + '.js')
              })
            })
          })
        })
      })
    }

    return promises[name].then(function (path) {
      delete require.cache[path]

      var Template = require(path)

      return Promise.resolve(function (content) {
        return (new Template()).render(content)
      })
    })
  }
}
04f7cf71c0595097d596b26700e59366642014fb
lib/asciinema/emails/email.ex
lib/asciinema/emails/email.ex
defmodule Asciinema.Emails.Email do
  use Bamboo.Phoenix, view: AsciinemaWeb.EmailView
  import Bamboo.Email

  def signup_email(email_address, signup_url) do
    hostname = instance_hostname()

    base_email()
    |> to(email_address)
    |> subject("Welcome to #{hostname}")
    |> render("signup.text", signup_url: signup_url, hostname: hostname)
    |> render("signup.html", signup_url: signup_url, hostname: hostname)
  end

  def login_email(email_address, login_url) do
    hostname = instance_hostname()

    base_email()
    |> to(email_address)
    |> subject("Login to #{hostname}")
    |> render("login.text", login_url: login_url, hostname: hostname)
    |> render("login.html", login_url: login_url, hostname: hostname)
  end

  defp base_email do
    new_email()
    |> from({"asciinema", from_address()})
    |> put_header("Reply-To", reply_to_address())
    |> put_html_layout({AsciinemaWeb.LayoutView, "email.html"})
  end

  defp from_address do
    System.get_env("SMTP_FROM_ADDRESS") || "hello@#{instance_hostname()}"
  end

  defp reply_to_address do
    System.get_env("SMTP_REPLY_TO_ADDRESS") || "[email protected]"
  end

  defp instance_hostname do
    System.get_env("URL_HOST") || "asciinema.org"
  end
end
defmodule Asciinema.Emails.Email do
  use Bamboo.Phoenix, view: AsciinemaWeb.EmailView
  import Bamboo.Email

  def signup_email(email_address, signup_url) do
    hostname = instance_hostname()

    base_email()
    |> to(email_address)
    |> subject("Welcome to #{hostname}")
    |> render("signup.text", signup_url: signup_url, hostname: hostname)
    |> render("signup.html", signup_url: signup_url, hostname: hostname)
  end

  def login_email(email_address, login_url) do
    hostname = instance_hostname()

    base_email()
    |> to(email_address)
    |> subject("Login to #{hostname}")
    |> render("login.text", login_url: login_url, hostname: hostname)
    |> render("login.html", login_url: login_url, hostname: hostname)
  end

  defp base_email do
    new_email()
    |> from({"asciinema", from_address()})
    |> put_header("Reply-To", reply_to_address())
    |> put_html_layout({AsciinemaWeb.LayoutView, "email.html"})
  end

  defp from_address do
    System.get_env("SMTP_FROM_ADDRESS") || "hello@#{instance_hostname()}"
  end

  defp reply_to_address do
    System.get_env("SMTP_REPLY_TO_ADDRESS") || "admin@#{instance_hostname()}"
  end

  defp instance_hostname do
    System.get_env("URL_HOST") || "asciinema.org"
  end
end
Use instance's hostname in default reply-to address
Use instance's hostname in default reply-to address
Elixir
apache-2.0
asciinema/asciinema-server,asciinema/asciinema-server,asciinema/asciinema-server,asciinema/asciinema-server
elixir
## Code Before:
defmodule Asciinema.Emails.Email do
  use Bamboo.Phoenix, view: AsciinemaWeb.EmailView
  import Bamboo.Email

  def signup_email(email_address, signup_url) do
    hostname = instance_hostname()

    base_email()
    |> to(email_address)
    |> subject("Welcome to #{hostname}")
    |> render("signup.text", signup_url: signup_url, hostname: hostname)
    |> render("signup.html", signup_url: signup_url, hostname: hostname)
  end

  def login_email(email_address, login_url) do
    hostname = instance_hostname()

    base_email()
    |> to(email_address)
    |> subject("Login to #{hostname}")
    |> render("login.text", login_url: login_url, hostname: hostname)
    |> render("login.html", login_url: login_url, hostname: hostname)
  end

  defp base_email do
    new_email()
    |> from({"asciinema", from_address()})
    |> put_header("Reply-To", reply_to_address())
    |> put_html_layout({AsciinemaWeb.LayoutView, "email.html"})
  end

  defp from_address do
    System.get_env("SMTP_FROM_ADDRESS") || "hello@#{instance_hostname()}"
  end

  defp reply_to_address do
    System.get_env("SMTP_REPLY_TO_ADDRESS") || "[email protected]"
  end

  defp instance_hostname do
    System.get_env("URL_HOST") || "asciinema.org"
  end
end

## Instruction:
Use instance's hostname in default reply-to address

## Code After:
defmodule Asciinema.Emails.Email do
  use Bamboo.Phoenix, view: AsciinemaWeb.EmailView
  import Bamboo.Email

  def signup_email(email_address, signup_url) do
    hostname = instance_hostname()

    base_email()
    |> to(email_address)
    |> subject("Welcome to #{hostname}")
    |> render("signup.text", signup_url: signup_url, hostname: hostname)
    |> render("signup.html", signup_url: signup_url, hostname: hostname)
  end

  def login_email(email_address, login_url) do
    hostname = instance_hostname()

    base_email()
    |> to(email_address)
    |> subject("Login to #{hostname}")
    |> render("login.text", login_url: login_url, hostname: hostname)
    |> render("login.html", login_url: login_url, hostname: hostname)
  end

  defp base_email do
    new_email()
    |> from({"asciinema", from_address()})
    |> put_header("Reply-To", reply_to_address())
    |> put_html_layout({AsciinemaWeb.LayoutView, "email.html"})
  end

  defp from_address do
    System.get_env("SMTP_FROM_ADDRESS") || "hello@#{instance_hostname()}"
  end

  defp reply_to_address do
    System.get_env("SMTP_REPLY_TO_ADDRESS") || "admin@#{instance_hostname()}"
  end

  defp instance_hostname do
    System.get_env("URL_HOST") || "asciinema.org"
  end
end
a5b2c1d0818ffb7a232ffe65799785f5c53e627e
.travis.yml
.travis.yml
language: cpp

before_install:
  - sudo add-apt-repository --yes ppa:ubuntu-sdk-team/ppa
  - sudo add-apt-repository --yes ppa:canonical-qt5-edgers/qt5-beta2
  - sudo apt-get update -qq
  - sudo apt-get install -qq qt5-qmake libpulse-dev qtbase5-dev qtdeclarative5-dev libqt5svg5-dev qtmultimedia5-dev libqt5webkit5-dev libsqlite3-dev
  - sudo apt-get install build-essential mercurial xorg-dev libudev-dev libts-dev libgl1-mesa-dev libglu1-mesa-dev libasound2-dev libpulse-dev libopenal-dev libogg-dev libvorbis-dev libaudiofile-dev libpng12-dev libfreetype6-dev libusb-dev libdbus-1-dev zlib1g-dev libdirectfb-dev

script:
  # Download and install SDL 2 (seems that libsdl2-dev is not available)
  - mkdir sdl2 && cd sdl2
  - hg clone http://hg.libsdl.org/SDL
  - cd SDL
  - ./configure
  - make
  - sudo make install
  - sudo ldconfig
  - cd .. && cd ..

  # Build desktop version
  - qmake -qt=qt5 Desktop.pro
  - make
  - make clean

  # Build mobile version
  - qmake -qt=qt5 Mobile.pro
  - make
  - make clean
language: cpp

before_install:
  - sudo add-apt-repository "deb http://archive.ubuntu.com/ubuntu trusty universe"
  - sudo add-apt-repository "deb http://archive.ubuntu.com/ubuntu trusty main"
  - sudo apt-get update -qq
  - sudo apt-get install -qq build-essential mercurial xorg-dev libudev-dev libts-dev libgl1-mesa-dev libglu1-mesa-dev libasound2-dev libpulse-dev libopenal-dev libogg-dev libvorbis-dev libaudiofile-dev libpng12-dev libfreetype6-dev libusb-dev libdbus-1-dev zlib1g-dev libdirectfb-dev

script:
  # Download and install SDL 2 (seems that libsdl2-dev is not available)
  - mkdir sdl2 && cd sdl2
  - hg clone http://hg.libsdl.org/SDL
  - cd SDL
  - ./configure
  - make
  - sudo make install
  - sudo ldconfig
  - cd .. && cd ..

  # Build desktop version
  - qmake -qt=qt5 Desktop.pro
  - make
  - make clean

  # Build mobile version
  - qmake -qt=qt5 Mobile.pro
  - make
  - make clean
Install Qt 5.2 to avoid compilation errors
Install Qt 5.2 to avoid compilation errors
YAML
mit
FRC-Utilities/QDriverStation,FRC-Utilities/QDriverStation,FRC-Utilities/QDriverStation
yaml
## Code Before:
language: cpp

before_install:
  - sudo add-apt-repository --yes ppa:ubuntu-sdk-team/ppa
  - sudo add-apt-repository --yes ppa:canonical-qt5-edgers/qt5-beta2
  - sudo apt-get update -qq
  - sudo apt-get install -qq qt5-qmake libpulse-dev qtbase5-dev qtdeclarative5-dev libqt5svg5-dev qtmultimedia5-dev libqt5webkit5-dev libsqlite3-dev
  - sudo apt-get install build-essential mercurial xorg-dev libudev-dev libts-dev libgl1-mesa-dev libglu1-mesa-dev libasound2-dev libpulse-dev libopenal-dev libogg-dev libvorbis-dev libaudiofile-dev libpng12-dev libfreetype6-dev libusb-dev libdbus-1-dev zlib1g-dev libdirectfb-dev

script:
  # Download and install SDL 2 (seems that libsdl2-dev is not available)
  - mkdir sdl2 && cd sdl2
  - hg clone http://hg.libsdl.org/SDL
  - cd SDL
  - ./configure
  - make
  - sudo make install
  - sudo ldconfig
  - cd .. && cd ..

  # Build desktop version
  - qmake -qt=qt5 Desktop.pro
  - make
  - make clean

  # Build mobile version
  - qmake -qt=qt5 Mobile.pro
  - make
  - make clean

## Instruction:
Install Qt 5.2 to avoid compilation errors

## Code After:
language: cpp

before_install:
  - sudo add-apt-repository "deb http://archive.ubuntu.com/ubuntu trusty universe"
  - sudo add-apt-repository "deb http://archive.ubuntu.com/ubuntu trusty main"
  - sudo apt-get update -qq
  - sudo apt-get install -qq build-essential mercurial xorg-dev libudev-dev libts-dev libgl1-mesa-dev libglu1-mesa-dev libasound2-dev libpulse-dev libopenal-dev libogg-dev libvorbis-dev libaudiofile-dev libpng12-dev libfreetype6-dev libusb-dev libdbus-1-dev zlib1g-dev libdirectfb-dev

script:
  # Download and install SDL 2 (seems that libsdl2-dev is not available)
  - mkdir sdl2 && cd sdl2
  - hg clone http://hg.libsdl.org/SDL
  - cd SDL
  - ./configure
  - make
  - sudo make install
  - sudo ldconfig
  - cd .. && cd ..

  # Build desktop version
  - qmake -qt=qt5 Desktop.pro
  - make
  - make clean

  # Build mobile version
  - qmake -qt=qt5 Mobile.pro
  - make
  - make clean
42a1aaba8daa253b99f444a512f8231db47dfbb2
helpers.py
helpers.py
import array
import numpy as np


def load_glove_vectors(filename, vocab=None):
    """
    Load glove vectors from a .txt file.
    Optionally limit the vocabulary to save memory. `vocab` should be a set.
    """
    dct = {}
    vectors = array.array('d')
    current_idx = 0
    with open(filename, "r", encoding="utf-8") as f:
        for _, line in enumerate(f):
            tokens = line.split(" ")
            word = tokens[0]
            entries = tokens[1:]
            if not vocab or word in vocab:
                dct[word] = current_idx
                vectors.extend(float(x) for x in entries)
                current_idx += 1
    word_dim = len(entries)
    num_vectors = len(dct)
    return [np.array(vectors).reshape(num_vectors, word_dim), dct]


def evaluate_recall(y, y_labels, n=1):
    num_examples = float(len(y))
    num_correct = 0
    for predictions, label in zip(y, y_labels):
        if label in predictions[:n]:
            num_correct += 1
    return num_correct/num_examples
import array
import numpy as np
import pandas as pd


def load_glove_vectors(filename, vocab=None):
    """
    Load glove vectors from a .txt file.
    Optionally limit the vocabulary to save memory. `vocab` should be a set.
    """
    dct = {}
    vectors = array.array('d')
    current_idx = 0
    with open(filename, "r", encoding="utf-8") as f:
        for _, line in enumerate(f):
            tokens = line.split(" ")
            word = tokens[0]
            entries = tokens[1:]
            if not vocab or word in vocab:
                dct[word] = current_idx
                vectors.extend(float(x) for x in entries)
                current_idx += 1
        word_dim = len(entries)
    num_vectors = len(dct)
    return [np.array(vectors).reshape(num_vectors, word_dim), dct]


def evaluate_recall(y, y_labels, n=1):
    num_examples = float(len(y))
    num_correct = 0
    for predictions, label in zip(y, y_labels):
        if label in predictions[:n]:
            num_correct += 1
    return num_correct/num_examples


def convert_to_labeled_df(df):
    """
    Converts the test/validation data from the Ubuntu Dialog corpus
    into a train-like Data Frame with labels.
    This Data Frame can be used to easily get accuarcy values for cross-validation
    """
    result = []
    for idx, row in df.iterrows():
        context = row.Context
        result.append([context, row.iloc[1], 1])
        for distractor in row.iloc[2:]:
            result.append([context, distractor, 0])
    return pd.DataFrame(result, columns=["Context", "Utterance", "Label"])
Add dataset conversion helper function
Add dataset conversion helper function
Python
mit
AotY/chatbot-retrieval,LepiorzDaniel/test2
python
## Code Before:
import array
import numpy as np


def load_glove_vectors(filename, vocab=None):
    """
    Load glove vectors from a .txt file.
    Optionally limit the vocabulary to save memory. `vocab` should be a set.
    """
    dct = {}
    vectors = array.array('d')
    current_idx = 0
    with open(filename, "r", encoding="utf-8") as f:
        for _, line in enumerate(f):
            tokens = line.split(" ")
            word = tokens[0]
            entries = tokens[1:]
            if not vocab or word in vocab:
                dct[word] = current_idx
                vectors.extend(float(x) for x in entries)
                current_idx += 1
        word_dim = len(entries)
    num_vectors = len(dct)
    return [np.array(vectors).reshape(num_vectors, word_dim), dct]


def evaluate_recall(y, y_labels, n=1):
    num_examples = float(len(y))
    num_correct = 0
    for predictions, label in zip(y, y_labels):
        if label in predictions[:n]:
            num_correct += 1
    return num_correct/num_examples

## Instruction:
Add dataset conversion helper function

## Code After:
import array
import numpy as np
import pandas as pd


def load_glove_vectors(filename, vocab=None):
    """
    Load glove vectors from a .txt file.
    Optionally limit the vocabulary to save memory. `vocab` should be a set.
    """
    dct = {}
    vectors = array.array('d')
    current_idx = 0
    with open(filename, "r", encoding="utf-8") as f:
        for _, line in enumerate(f):
            tokens = line.split(" ")
            word = tokens[0]
            entries = tokens[1:]
            if not vocab or word in vocab:
                dct[word] = current_idx
                vectors.extend(float(x) for x in entries)
                current_idx += 1
        word_dim = len(entries)
    num_vectors = len(dct)
    return [np.array(vectors).reshape(num_vectors, word_dim), dct]


def evaluate_recall(y, y_labels, n=1):
    num_examples = float(len(y))
    num_correct = 0
    for predictions, label in zip(y, y_labels):
        if label in predictions[:n]:
            num_correct += 1
    return num_correct/num_examples


def convert_to_labeled_df(df):
    """
    Converts the test/validation data from the Ubuntu Dialog corpus
    into a train-like Data Frame with labels.
    This Data Frame can be used to easily get accuarcy values for cross-validation
    """
    result = []
    for idx, row in df.iterrows():
        context = row.Context
        result.append([context, row.iloc[1], 1])
        for distractor in row.iloc[2:]:
            result.append([context, distractor, 0])
    return pd.DataFrame(result, columns=["Context", "Utterance", "Label"])
790ebf82d9b854d47a2aad27cdaee8025f549f9e
README.md
README.md
Geometrify written in Rust [![Build Status](https://travis-ci.org/theunknownxy/geometrify-rs.svg?branch=master)](https://travis-ci.org/theunknownxy/geometrify-rs)
Geometrify written in Rust
Move build status behind title.
Move build status behind title.
Markdown
mpl-2.0
theunknownxy/geometrify-rs,theunknownxy/geometrify-rs
markdown
## Code Before:
Geometrify written in Rust [![Build Status](https://travis-ci.org/theunknownxy/geometrify-rs.svg?branch=master)](https://travis-ci.org/theunknownxy/geometrify-rs)

## Instruction:
Move build status behind title.

## Code After:
Geometrify written in Rust
9c593e10c013d0000c3b61e6a0ee97a89418eff9
ObjectiveRocks/RocksDBCuckooTableOptions.h
ObjectiveRocks/RocksDBCuckooTableOptions.h
//
// RocksDBCuckooTableOptions.h
// ObjectiveRocks
//
// Created by Iska on 04/01/15.
// Copyright (c) 2015 BrainCookie. All rights reserved.
//

#import <Foundation/Foundation.h>

@interface RocksDBCuckooTableOptions : NSObject

@property (nonatomic, assign) double hashTableRatio;

@property (nonatomic, assign) uint32_t maxSearchDepth;

@property (nonatomic, assign) uint32_t cuckooBlockSize;

@property (nonatomic, assign) BOOL identityAsFirstHash;

@property (nonatomic, assign) BOOL useModuleHash;

@end
//
// RocksDBCuckooTableOptions.h
// ObjectiveRocks
//
// Created by Iska on 04/01/15.
// Copyright (c) 2015 BrainCookie. All rights reserved.
//

#import <Foundation/Foundation.h>

@interface RocksDBCuckooTableOptions : NSObject

/**
 @brief Determines the utilization of hash tables. Smaller values
 result in larger hash tables with fewer collisions.
 */
@property (nonatomic, assign) double hashTableRatio;

/**
 @brief A property used by builder to determine the depth to go to
 to search for a path to displace elements in case of collision.
 Higher values result in more efficient hash tables with fewer
 lookups but take more time to build.
 */
@property (nonatomic, assign) uint32_t maxSearchDepth;

/**
 @brief In case of collision while inserting, the builder attempts
 to insert in the next `cuckooBlockSize` locations before skipping
 over to the next Cuckoo hash function. This makes lookups more
 cache friendly in case of collisions.
 */
@property (nonatomic, assign) uint32_t cuckooBlockSize;

/**
 @brief If this option is enabled, user key is treated as uint64_t
 and its value is used as hash value directly. This option changes
 builder's behavior. Reader ignore this option and behave according
 to what specified in table property.
 */
@property (nonatomic, assign) BOOL identityAsFirstHash;

/**
 @brief If this option is set to true, module is used during hash calculation.
 This often yields better space efficiency at the cost of performance.
 If this optino is set to false, # of entries in table is constrained to
 be power of two, and bit and is used to calculate hash, which is faster
 in general.
 */
@property (nonatomic, assign) BOOL useModuleHash;

@end
Add source code documentation for the RocksDB Cuckoo Table Options class
Add source code documentation for the RocksDB Cuckoo Table Options class
C
mit
iabudiab/ObjectiveRocks,iabudiab/ObjectiveRocks,iabudiab/ObjectiveRocks,iabudiab/ObjectiveRocks
c
## Code Before:
//
// RocksDBCuckooTableOptions.h
// ObjectiveRocks
//
// Created by Iska on 04/01/15.
// Copyright (c) 2015 BrainCookie. All rights reserved.
//

#import <Foundation/Foundation.h>

@interface RocksDBCuckooTableOptions : NSObject

@property (nonatomic, assign) double hashTableRatio;

@property (nonatomic, assign) uint32_t maxSearchDepth;

@property (nonatomic, assign) uint32_t cuckooBlockSize;

@property (nonatomic, assign) BOOL identityAsFirstHash;

@property (nonatomic, assign) BOOL useModuleHash;

@end

## Instruction:
Add source code documentation for the RocksDB Cuckoo Table Options class

## Code After:
//
// RocksDBCuckooTableOptions.h
// ObjectiveRocks
//
// Created by Iska on 04/01/15.
// Copyright (c) 2015 BrainCookie. All rights reserved.
//

#import <Foundation/Foundation.h>

@interface RocksDBCuckooTableOptions : NSObject

/**
 @brief Determines the utilization of hash tables. Smaller values
 result in larger hash tables with fewer collisions.
 */
@property (nonatomic, assign) double hashTableRatio;

/**
 @brief A property used by builder to determine the depth to go to
 to search for a path to displace elements in case of collision.
 Higher values result in more efficient hash tables with fewer
 lookups but take more time to build.
 */
@property (nonatomic, assign) uint32_t maxSearchDepth;

/**
 @brief In case of collision while inserting, the builder attempts
 to insert in the next `cuckooBlockSize` locations before skipping
 over to the next Cuckoo hash function. This makes lookups more
 cache friendly in case of collisions.
 */
@property (nonatomic, assign) uint32_t cuckooBlockSize;

/**
 @brief If this option is enabled, user key is treated as uint64_t
 and its value is used as hash value directly. This option changes
 builder's behavior. Reader ignore this option and behave according
 to what specified in table property.
 */
@property (nonatomic, assign) BOOL identityAsFirstHash;

/**
 @brief If this option is set to true, module is used during hash calculation.
 This often yields better space efficiency at the cost of performance.
 If this optino is set to false, # of entries in table is constrained to
 be power of two, and bit and is used to calculate hash, which is faster
 in general.
 */
@property (nonatomic, assign) BOOL useModuleHash;

@end
aa9186d27e38bdc48287cf52673452c3ac50e87b
website/templates/user/register/register.html
website/templates/user/register/register.html
{% extends "base.html" %}
{% load i18n bootstrap %}

{% block content %}
<div class="row">
    <div class="panel panel-default col-xs-12 col-sm-8 col-sm-offset-2 col-md-4 col-md-offset-2 col-lg-4 col-lg-offset-2">
        <div class="panel-body">
            <form action="{% url 'user-registration' %}" class="form" method="post">
                {% csrf_token %}
                {{ form|bootstrap_inline }}
                <button class="btn btn-primary">Sign up!</button>
            </form>
        </div>
    </div>
    <div class="col-xs-12 col-sm-8 col-sm-offset-2 col-md-4 col-md-offset-0 col-lg-4 col-lg-offset-0">
        <h4>Strong Passwords</h4>
        <p>Using a mix of upper and lower case letter, numbers and symbols help to make your password stronger. Try using a sentence rather than a single word if that helps you remember it.</p>
        <h4>Lose Your Password, Lose Your Account</h4>
        <p>We have consciously chosen not to implement a password recovery procedure.
        Please don't forget your password, you won't be able to get into your account otherwise.</p>
    </div>
</div>
{% endblock %}
{% extends "base.html" %}
{% load i18n bootstrap %}

{% block content %}
<div class="row">
    <div class="panel panel-default col-xs-12 col-sm-8 col-sm-offset-2 col-md-4 col-md-offset-2 col-lg-4 col-lg-offset-2">
        <div class="panel-body">
            <form action="{% url 'user-registration' %}" class="form" method="post">
                {% csrf_token %}
                {{ form|bootstrap_inline }}
                <button class="btn btn-primary">Sign up!</button>
            </form>
        </div>
    </div>
    <div class="col-xs-12 col-sm-8 col-sm-offset-2 col-md-4 col-md-offset-0 col-lg-4 col-lg-offset-0">
        <h4>Strong Passwords</h4>
        <p>Using a mix of upper and lower case letters, numbers, and symbols help make your password stronger. Try using a sentence rather than a single word if that helps you remember it.</p>
        <h4>Lose Your Password, Lose Your Account</h4>
        <p>We have consciously chosen not to implement a password recovery procedure.
        Please don't forget your password, you won't be able to get into your account otherwise.</p>
    </div>
</div>
{% endblock %}
Fix English on registration page
Fix English on registration page
HTML
agpl-3.0
Inboxen/Inboxen,Inboxen/Inboxen,Inboxen/Inboxen,Inboxen/Inboxen
html
## Code Before:
{% extends "base.html" %}
{% load i18n bootstrap %}

{% block content %}
<div class="row">
    <div class="panel panel-default col-xs-12 col-sm-8 col-sm-offset-2 col-md-4 col-md-offset-2 col-lg-4 col-lg-offset-2">
        <div class="panel-body">
            <form action="{% url 'user-registration' %}" class="form" method="post">
                {% csrf_token %}
                {{ form|bootstrap_inline }}
                <button class="btn btn-primary">Sign up!</button>
            </form>
        </div>
    </div>
    <div class="col-xs-12 col-sm-8 col-sm-offset-2 col-md-4 col-md-offset-0 col-lg-4 col-lg-offset-0">
        <h4>Strong Passwords</h4>
        <p>Using a mix of upper and lower case letter, numbers and symbols help to make your password stronger. Try using a sentence rather than a single word if that helps you remember it.</p>
        <h4>Lose Your Password, Lose Your Account</h4>
        <p>We have consciously chosen not to implement a password recovery procedure.
        Please don't forget your password, you won't be able to get into your account otherwise.</p>
    </div>
</div>
{% endblock %}

## Instruction:
Fix English on registration page

## Code After:
{% extends "base.html" %}
{% load i18n bootstrap %}

{% block content %}
<div class="row">
    <div class="panel panel-default col-xs-12 col-sm-8 col-sm-offset-2 col-md-4 col-md-offset-2 col-lg-4 col-lg-offset-2">
        <div class="panel-body">
            <form action="{% url 'user-registration' %}" class="form" method="post">
                {% csrf_token %}
                {{ form|bootstrap_inline }}
                <button class="btn btn-primary">Sign up!</button>
            </form>
        </div>
    </div>
    <div class="col-xs-12 col-sm-8 col-sm-offset-2 col-md-4 col-md-offset-0 col-lg-4 col-lg-offset-0">
        <h4>Strong Passwords</h4>
        <p>Using a mix of upper and lower case letters, numbers, and symbols help make your password stronger. Try using a sentence rather than a single word if that helps you remember it.</p>
        <h4>Lose Your Password, Lose Your Account</h4>
        <p>We have consciously chosen not to implement a password recovery procedure.
        Please don't forget your password, you won't be able to get into your account otherwise.</p>
    </div>
</div>
{% endblock %}
3bb560dc03809238f586f78385deb41bba512ba9
scripts/asgard-deploy.py
scripts/asgard-deploy.py
import sys
import logging

import click

import tubular.asgard as asgard

logging.basicConfig(stream=sys.stdout, level=logging.INFO)


@click.command()
@click.option('--ami_id', envvar='AMI_ID', help='The ami-id to deploy', required=True)
def deploy(ami_id):
    try:
        asgard.deploy(ami_id)
    except Exception, e:
        click.secho("Error Deploying AMI: {0}.\nMessage: {1}".format(ami_id, e.message), fg='red')
        sys.exit(1)
    sys.exit(0)


if __name__ == "__main__":
    deploy()
import sys
import logging

import click

from os import path

# Add top-level module path to sys.path before importing tubular code.
sys.path.append(
    path.dirname(
        path.dirname(
            path.abspath(__file__)
        )
    )
)

from tubular import asgard

logging.basicConfig(stream=sys.stdout, level=logging.INFO)


@click.command()
@click.option('--ami_id', envvar='AMI_ID', help='The ami-id to deploy', required=True)
def deploy(ami_id):
    try:
        asgard.deploy(ami_id)
    except Exception, e:
        click.secho("Error Deploying AMI: {0}.\nMessage: {1}".format(ami_id, e.message), fg='red')
        sys.exit(1)
    sys.exit(0)


if __name__ == "__main__":
    deploy()
Add top-level module path before tubular import.
Add top-level module path before tubular import.
Python
agpl-3.0
eltoncarr/tubular,eltoncarr/tubular
python
## Code Before:
import sys
import logging

import click

import tubular.asgard as asgard

logging.basicConfig(stream=sys.stdout, level=logging.INFO)


@click.command()
@click.option('--ami_id', envvar='AMI_ID', help='The ami-id to deploy', required=True)
def deploy(ami_id):
    try:
        asgard.deploy(ami_id)
    except Exception, e:
        click.secho("Error Deploying AMI: {0}.\nMessage: {1}".format(ami_id, e.message), fg='red')
        sys.exit(1)
    sys.exit(0)


if __name__ == "__main__":
    deploy()

## Instruction:
Add top-level module path before tubular import.

## Code After:
import sys
import logging

import click

from os import path

# Add top-level module path to sys.path before importing tubular code.
sys.path.append(
    path.dirname(
        path.dirname(
            path.abspath(__file__)
        )
    )
)

from tubular import asgard

logging.basicConfig(stream=sys.stdout, level=logging.INFO)


@click.command()
@click.option('--ami_id', envvar='AMI_ID', help='The ami-id to deploy', required=True)
def deploy(ami_id):
    try:
        asgard.deploy(ami_id)
    except Exception, e:
        click.secho("Error Deploying AMI: {0}.\nMessage: {1}".format(ami_id, e.message), fg='red')
        sys.exit(1)
    sys.exit(0)


if __name__ == "__main__":
    deploy()
2d5c5a1bf693f428b53f8d4a6e788f7be864aa9e
image_site_app/forms.py
image_site_app/forms.py
from django import forms


class SignupForm(forms.Form):
    field_order = ['username', 'first_name', 'last_name', 'email', 'password', 'password2']

    first_name = forms.CharField(max_length=30, label='First name (optional)', required=False)
    last_name = forms.CharField(max_length=30, label='Last name (optional)', required=False)

    def signup(self, request, user):
        user.first_name = self.cleaned_data['first_name']
        user.last_name = self.cleaned_data['last_name']
        user.save()
from django import forms


class SignupForm(forms.Form):
    field_order = ['username', 'first_name', 'last_name', 'email', 'password', 'password2']

    first_name = forms.CharField(max_length=30,
                                 label='First name (optional)',
                                 required=False,
                                 widget=forms.TextInput(attrs={
                                     'placeholder': 'First name'
                                 }))
    last_name = forms.CharField(max_length=30,
                                label='Last name (optional)',
                                required=False,
                                widget=forms.TextInput(attrs={
                                    'placeholder': 'Last name'
                                }))

    def signup(self, request, user):
        user.first_name = self.cleaned_data['first_name']
        user.last_name = self.cleaned_data['last_name']
        user.save()
Add placeholder to first_name and last_name fields in signup form
Add placeholder to first_name and last_name fields in signup form
Python
mit
frostblooded/kanq,frostblooded/kanq,frostblooded/kanq,frostblooded/kanq,frostblooded/kanq
python
## Code Before:
from django import forms


class SignupForm(forms.Form):
    field_order = ['username', 'first_name', 'last_name', 'email', 'password', 'password2']

    first_name = forms.CharField(max_length=30, label='First name (optional)', required=False)
    last_name = forms.CharField(max_length=30, label='Last name (optional)', required=False)

    def signup(self, request, user):
        user.first_name = self.cleaned_data['first_name']
        user.last_name = self.cleaned_data['last_name']
        user.save()

## Instruction:
Add placeholder to first_name and last_name fields in signup form

## Code After:
from django import forms


class SignupForm(forms.Form):
    field_order = ['username', 'first_name', 'last_name', 'email', 'password', 'password2']

    first_name = forms.CharField(max_length=30,
                                 label='First name (optional)',
                                 required=False,
                                 widget=forms.TextInput(attrs={
                                     'placeholder': 'First name'
                                 }))
    last_name = forms.CharField(max_length=30,
                                label='Last name (optional)',
                                required=False,
                                widget=forms.TextInput(attrs={
                                    'placeholder': 'Last name'
                                }))

    def signup(self, request, user):
        user.first_name = self.cleaned_data['first_name']
        user.last_name = self.cleaned_data['last_name']
        user.save()
1f9ecb9452c3185bb5e83da6930a36f8dbef7762
wercker.yml
wercker.yml
box: wercker/ruby
build:
  steps:
    - bundle-install
    - install-packages:
        packages: wget unzip
    - script:
        name: install packer
        cwd: packer/
        code: |
          wget https://dl.bintray.com/mitchellh/packer/0.7.5_linux_amd64.zip
          unzip 0.7.5_linux_amd64.zip
          ./packer version | tee $WERCKER_REPORT_MESSAGE_FILE
    - script:
        cwd: packer/
        name: validate packer template
        code: ./packer validate ubuntu-14.04_amd64.json | tee $WERCKER_REPORT_MESSAGE_FILE
    - script:
        name: build image
        cwd: packer/
        code: |
          ./packer build ubuntu-14.04_amd64.json | tee "${WERCKER_REPORT_ARTIFACTS_DIR}/packer_build_output.log"
          tail -n 1 "${WERCKER_REPORT_ARTIFACTS_DIR}/packer_build_output.log" | tee $WERCKER_REPORT_MESSAGE_FILE
box: wercker/ruby
build:
  steps:
    - bundle-install
    - install-packages:
        packages: wget unzip
    - script:
        name: install packer
        cwd: packer/
        code: |
          wget https://dl.bintray.com/mitchellh/packer/packer_0.7.5_linux_amd64.zip
          unzip packer_0.7.5_linux_amd64.zip
          ./packer version | tee $WERCKER_REPORT_MESSAGE_FILE
    - script:
        cwd: packer/
        name: validate packer template
        code: ./packer validate ubuntu-14.04_amd64.json | tee $WERCKER_REPORT_MESSAGE_FILE
    - script:
        name: build image
        cwd: packer/
        code: |
          ./packer build ubuntu-14.04_amd64.json | tee "${WERCKER_REPORT_ARTIFACTS_DIR}/packer_build_output.log"
          tail -n 1 "${WERCKER_REPORT_ARTIFACTS_DIR}/packer_build_output.log" | tee $WERCKER_REPORT_MESSAGE_FILE
Fix link to packer download
Fix link to packer download
YAML
mit
phungmanhcuong/Appolo,siliconmeadow/Apollo,atsaki/Apollo,mehulsbhatt/Apollo,enxebre/Apollo,siliconmeadow/Apollo,Capgemini/Apollo,mehulsbhatt/Apollo,atsaki/Apollo,Capgemini/Apollo,Capgemini/Apollo,phungmanhcuong/Appolo,ravbaba/Apollo,ravbaba/Apollo,enxebre/Apollo
yaml
## Code Before:
box: wercker/ruby
build:
  steps:
    - bundle-install
    - install-packages:
        packages: wget unzip
    - script:
        name: install packer
        cwd: packer/
        code: |
          wget https://dl.bintray.com/mitchellh/packer/0.7.5_linux_amd64.zip
          unzip 0.7.5_linux_amd64.zip
          ./packer version | tee $WERCKER_REPORT_MESSAGE_FILE
    - script:
        cwd: packer/
        name: validate packer template
        code: ./packer validate ubuntu-14.04_amd64.json | tee $WERCKER_REPORT_MESSAGE_FILE
    - script:
        name: build image
        cwd: packer/
        code: |
          ./packer build ubuntu-14.04_amd64.json | tee "${WERCKER_REPORT_ARTIFACTS_DIR}/packer_build_output.log"
          tail -n 1 "${WERCKER_REPORT_ARTIFACTS_DIR}/packer_build_output.log" | tee $WERCKER_REPORT_MESSAGE_FILE

## Instruction:
Fix link to packer download

## Code After:
box: wercker/ruby
build:
  steps:
    - bundle-install
    - install-packages:
        packages: wget unzip
    - script:
        name: install packer
        cwd: packer/
        code: |
          wget https://dl.bintray.com/mitchellh/packer/packer_0.7.5_linux_amd64.zip
          unzip packer_0.7.5_linux_amd64.zip
          ./packer version | tee $WERCKER_REPORT_MESSAGE_FILE
    - script:
        cwd: packer/
        name: validate packer template
        code: ./packer validate ubuntu-14.04_amd64.json | tee $WERCKER_REPORT_MESSAGE_FILE
    - script:
        name: build image
        cwd: packer/
        code: |
          ./packer build ubuntu-14.04_amd64.json | tee "${WERCKER_REPORT_ARTIFACTS_DIR}/packer_build_output.log"
          tail -n 1 "${WERCKER_REPORT_ARTIFACTS_DIR}/packer_build_output.log" | tee $WERCKER_REPORT_MESSAGE_FILE
cb2b32b169eea3fa34785cc0fd045bd83eb27a73
piecewise/client_secrets.json
piecewise/client_secrets.json
{"installed":{"auth_uri":"https://accounts.google.com/o/oauth2/auth","client_secret":"nhp2lFbD3-2xb_RdfoYR-drM","token_uri":"https://accounts.google.com/o/oauth2/token","client_email":"","redirect_uris":["urn:ietf:wg:oauth:2.0:oob","oob"],"client_x509_cert_url":"","client_id":"422648324111-008mgukna2nflr2646qbgu1o3ohenlpj.apps.googleusercontent.com","auth_provider_x509_cert_url":"https://www.googleapis.com/oauth2/v1/certs"}}
{
  "web": {
    "client_id": "233384409938-0mmsq2d7vqsr4gocn73n75dpbttlcicq.apps.googleusercontent.com",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_email": "233384409938-0mmsq2d7vqsr4gocn73n75dpbttlcicq@developer.gserviceaccount.com",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/233384409938-0mmsq2d7vqsr4gocn73n75dpbttlcicq%40developer.gserviceaccount.com",
    "client_secret": "eNHe8ElVjwwXiSEZSJyCG85n"
  }
}
Use oauth details specific to M-Lab developer account.
Use oauth details specific to M-Lab developer account.
JSON
apache-2.0
critzo/piecewise,opentechinstitute/piecewise,opentechinstitute/piecewise,critzo/piecewise,opentechinstitute/piecewise,critzo/piecewise,opentechinstitute/piecewise,critzo/piecewise
json
## Code Before:
{"installed":{"auth_uri":"https://accounts.google.com/o/oauth2/auth","client_secret":"nhp2lFbD3-2xb_RdfoYR-drM","token_uri":"https://accounts.google.com/o/oauth2/token","client_email":"","redirect_uris":["urn:ietf:wg:oauth:2.0:oob","oob"],"client_x509_cert_url":"","client_id":"422648324111-008mgukna2nflr2646qbgu1o3ohenlpj.apps.googleusercontent.com","auth_provider_x509_cert_url":"https://www.googleapis.com/oauth2/v1/certs"}}

## Instruction:
Use oauth details specific to M-Lab developer account.

## Code After:
{
  "web": {
    "client_id": "233384409938-0mmsq2d7vqsr4gocn73n75dpbttlcicq.apps.googleusercontent.com",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_email": "233384409938-0mmsq2d7vqsr4gocn73n75dpbttlcicq@developer.gserviceaccount.com",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/233384409938-0mmsq2d7vqsr4gocn73n75dpbttlcicq%40developer.gserviceaccount.com",
    "client_secret": "eNHe8ElVjwwXiSEZSJyCG85n"
  }
}
4bfebc202f21cc1b0056ab46d04d217b18e4efac
src/Service/AtlassianRestClientInterface.php
src/Service/AtlassianRestClientInterface.php
<?php

declare(strict_types=1);

namespace AtlassianConnectBundle\Service;

use AtlassianConnectBundle\Entity\TenantInterface;
use Symfony\Component\HttpFoundation\File\UploadedFile;

interface AtlassianRestClientInterface
{
    public function setTenant(TenantInterface $tenant): self;

    public function actAsUser(string $userId): self;

    public function get(string $restUrl): string;

    public function post(string $restUrl, array $json): string;

    public function delete(string $restUrl): string;

    public function sendFile(UploadedFile $file, string $restUrl): string;

    public function doRequest(string $method, string $restUrl, array $options): string;
}
<?php

declare(strict_types=1);

namespace AtlassianConnectBundle\Service;

use AtlassianConnectBundle\Entity\TenantInterface;
use Symfony\Component\HttpFoundation\File\UploadedFile;

interface AtlassianRestClientInterface
{
    public function setTenant(TenantInterface $tenant): self;

    public function actAsUser(string $userId): self;

    public function get(string $restUrl): string;

    public function post(string $restUrl, array $json): string;

    public function put(string $restUrl, array $json): string;

    public function delete(string $restUrl): string;

    public function sendFile(UploadedFile $file, string $restUrl): string;

    public function doRequest(string $method, string $restUrl, array $options): string;
}
Add missing put to client interface
Add missing put to client interface
PHP
mit
thecatontheflat/atlassian-connect-bundle
php
## Code Before:
<?php

declare(strict_types=1);

namespace AtlassianConnectBundle\Service;

use AtlassianConnectBundle\Entity\TenantInterface;
use Symfony\Component\HttpFoundation\File\UploadedFile;

interface AtlassianRestClientInterface
{
    public function setTenant(TenantInterface $tenant): self;

    public function actAsUser(string $userId): self;

    public function get(string $restUrl): string;

    public function post(string $restUrl, array $json): string;

    public function delete(string $restUrl): string;

    public function sendFile(UploadedFile $file, string $restUrl): string;

    public function doRequest(string $method, string $restUrl, array $options): string;
}

## Instruction:
Add missing put to client interface

## Code After:
<?php

declare(strict_types=1);

namespace AtlassianConnectBundle\Service;

use AtlassianConnectBundle\Entity\TenantInterface;
use Symfony\Component\HttpFoundation\File\UploadedFile;

interface AtlassianRestClientInterface
{
    public function setTenant(TenantInterface $tenant): self;

    public function actAsUser(string $userId): self;

    public function get(string $restUrl): string;

    public function post(string $restUrl, array $json): string;

    public function put(string $restUrl, array $json): string;

    public function delete(string $restUrl): string;

    public function sendFile(UploadedFile $file, string $restUrl): string;

    public function doRequest(string $method, string $restUrl, array $options): string;
}
635c2226c501b765b641adf3ca2947b30dbf3a9d
dat.json
dat.json
{
  "url": "dat://4a6d3de9851bc88387fb793158298b4fe715483673cc6a914240105932a5c42e/",
  "title": "Beautiful Web Type",
  "description": "In-depth guide to the best free fonts from Google and across the web.",
  "author": "Chad Mazzola"
}
{
  "url": "dat://4a6d3de9851bc88387fb793158298b4fe715483673cc6a914240105932a5c42e/",
  "title": "Beautiful Web Type",
  "description": "In-depth guide to the best open-source fonts from across the web.",
  "author": "Chad Mazzola",
  "links": {
    "payment": [
      {
        "href": "https://www.patreon.com/ubuwaits",
        "type": "text/html"
      }
    ]
  }
}
Add donation link for Beaker Browser
Add donation link for Beaker Browser
JSON
mit
ubuwaits/beautiful-web-type,ubuwaits/beautiful-web-type,ubuwaits/beautiful-web-type
json
## Code Before:
{
  "url": "dat://4a6d3de9851bc88387fb793158298b4fe715483673cc6a914240105932a5c42e/",
  "title": "Beautiful Web Type",
  "description": "In-depth guide to the best free fonts from Google and across the web.",
  "author": "Chad Mazzola"
}

## Instruction:
Add donation link for Beaker Browser

## Code After:
{
  "url": "dat://4a6d3de9851bc88387fb793158298b4fe715483673cc6a914240105932a5c42e/",
  "title": "Beautiful Web Type",
  "description": "In-depth guide to the best open-source fonts from across the web.",
  "author": "Chad Mazzola",
  "links": {
    "payment": [
      {
        "href": "https://www.patreon.com/ubuwaits",
        "type": "text/html"
      }
    ]
  }
}
0d143cb68f52c09808aa88a7b4d19f40deb7fc0e
README.md
README.md
Shark
===

[![Build Status](https://img.shields.io/travis/shark-js/shark/master.svg)](https://travis-ci.org/shark-js/shark)
[![Build status](https://ci.appveyor.com/api/projects/status/umxg297hoyjd4iq2?svg=true)](https://ci.appveyor.com/project/vadimgoncharov/shark)
[![Coverage Status](https://img.shields.io/coveralls/shark-js/shark/master.svg)](https://coveralls.io/r/shark-js/shark)

# Getting Started

#### 1. Install shark-cli globally:
```sh
$ npm install shark-cli -g
```

#### 2. Add next line to your ~/.bashrc or ~/.zshrc to enable shark tab-completion
```sh
. <(shark completion)
```

#### 3. Install shark in your project devDependencies:
```sh
$ npm install shark-core --save-dev
```

#### 4. Create a `sharkfile.js` at the root of your project:
```js
'use strict';

const path = require('path');
const Shark = require('shark-core');

const shark = Shark({
    tasksPath: path.join(__dirname, './shark/tasks')
});

module.exports = shark;
```

#### 4. Run shark:
```sh
$ shark
```
Shark
===

[![Build Status](https://img.shields.io/travis/shark-js/shark/master.svg)](https://travis-ci.org/shark-js/shark)
[![Coverage Status](https://img.shields.io/coveralls/shark-js/shark/master.svg)](https://coveralls.io/r/shark-js/shark)

# Getting Started

#### 1. Install shark-cli globally:
```sh
$ npm install shark-cli -g
```

#### 2. Add next line to your ~/.bashrc or ~/.zshrc to enable shark tab-completion
```sh
. <(shark completion)
```

#### 3. Install shark in your project devDependencies:
```sh
$ npm install shark-core --save-dev
```

#### 4. Create a `sharkfile.js` at the root of your project:
```js
'use strict';

const path = require('path');
const Shark = require('shark-core');

const shark = Shark({
    tasksPath: path.join(__dirname, './shark/tasks')
});

module.exports = shark;
```

#### 4. Run shark:
```sh
$ shark
```
Remove appVeyor badge till fix
Remove appVeyor badge till fix
Markdown
mit
shark-js/shark
markdown
## Code Before:
Shark
===

[![Build Status](https://img.shields.io/travis/shark-js/shark/master.svg)](https://travis-ci.org/shark-js/shark)
[![Build status](https://ci.appveyor.com/api/projects/status/umxg297hoyjd4iq2?svg=true)](https://ci.appveyor.com/project/vadimgoncharov/shark)
[![Coverage Status](https://img.shields.io/coveralls/shark-js/shark/master.svg)](https://coveralls.io/r/shark-js/shark)

# Getting Started

#### 1. Install shark-cli globally:
```sh
$ npm install shark-cli -g
```

#### 2. Add next line to your ~/.bashrc or ~/.zshrc to enable shark tab-completion
```sh
. <(shark completion)
```

#### 3. Install shark in your project devDependencies:
```sh
$ npm install shark-core --save-dev
```

#### 4. Create a `sharkfile.js` at the root of your project:
```js
'use strict';

const path = require('path');
const Shark = require('shark-core');

const shark = Shark({
    tasksPath: path.join(__dirname, './shark/tasks')
});

module.exports = shark;
```

#### 4. Run shark:
```sh
$ shark
```

## Instruction:
Remove appVeyor badge till fix

## Code After:
Shark
===

[![Build Status](https://img.shields.io/travis/shark-js/shark/master.svg)](https://travis-ci.org/shark-js/shark)
[![Coverage Status](https://img.shields.io/coveralls/shark-js/shark/master.svg)](https://coveralls.io/r/shark-js/shark)

# Getting Started

#### 1. Install shark-cli globally:
```sh
$ npm install shark-cli -g
```

#### 2. Add next line to your ~/.bashrc or ~/.zshrc to enable shark tab-completion
```sh
. <(shark completion)
```

#### 3. Install shark in your project devDependencies:
```sh
$ npm install shark-core --save-dev
```

#### 4. Create a `sharkfile.js` at the root of your project:
```js
'use strict';

const path = require('path');
const Shark = require('shark-core');

const shark = Shark({
    tasksPath: path.join(__dirname, './shark/tasks')
});

module.exports = shark;
```

#### 4. Run shark:
```sh
$ shark
```
0164cb003172e6e3224498eb77bbf14ca5ece039
mfmod.c
mfmod.c
static struct nf_hook_ops nfho; unsigned int hook_func(unsigned int hooknum, struct sk_buff *skb, const struct net_device *in, const struct net_device *out, int (*okfn)(struct sk_buff *)) { return NF_DROP; } int init_module() { nfho.hook = hook_func; nfho.hooknum = NF_INET_PRE_ROUTING; nfho.pf = PF_INET; nfho.priority = NF_IP_PRI_FIRST; nf_register_hook(&nfho); return 0; } void cleanup_module() { nf_unregister_hook(&nfho); }
static struct nf_hook_ops nfho; unsigned int hook_func(unsigned int hooknum, struct sk_buff *skb, const struct net_device *in, const struct net_device *out, int (*okfn)(struct sk_buff *)) { socket_buff = skb; if (!socket_buff) { return NF_ACCEPT; } else { ip_header = (stuct iphdr *)skb_network_header(socket_buff); // Network header // Drop all ICMP packets if (ip_header->protocol == IPPROTO_ICMP) { return NF_DROP; } } } int init_module() { nfho.hook = hook_func; nfho.hooknum = NF_INET_PRE_ROUTING; nfho.pf = PF_INET; nfho.priority = NF_IP_PRI_FIRST; nf_register_hook(&nfho); return 0; } void cleanup_module() { nf_unregister_hook(&nfho); }
Set hook to block all ICMP packets
Set hook to block all ICMP packets
C
mit
Muhlenberg/mfw
c
## Code Before: static struct nf_hook_ops nfho; unsigned int hook_func(unsigned int hooknum, struct sk_buff *skb, const struct net_device *in, const struct net_device *out, int (*okfn)(struct sk_buff *)) { return NF_DROP; } int init_module() { nfho.hook = hook_func; nfho.hooknum = NF_INET_PRE_ROUTING; nfho.pf = PF_INET; nfho.priority = NF_IP_PRI_FIRST; nf_register_hook(&nfho); return 0; } void cleanup_module() { nf_unregister_hook(&nfho); } ## Instruction: Set hook to block all ICMP packets ## Code After: static struct nf_hook_ops nfho; unsigned int hook_func(unsigned int hooknum, struct sk_buff *skb, const struct net_device *in, const struct net_device *out, int (*okfn)(struct sk_buff *)) { socket_buff = skb; if (!socket_buff) { return NF_ACCEPT; } else { ip_header = (stuct iphdr *)skb_network_header(socket_buff); // Network header // Drop all ICMP packets if (ip_header->protocol == IPPROTO_ICMP) { return NF_DROP; } } } int init_module() { nfho.hook = hook_func; nfho.hooknum = NF_INET_PRE_ROUTING; nfho.pf = PF_INET; nfho.priority = NF_IP_PRI_FIRST; nf_register_hook(&nfho); return 0; } void cleanup_module() { nf_unregister_hook(&nfho); }
838ab9a926098aa3faf8a3dbe535d3d7f586ab0b
index.html
index.html
<!doctype html> <html> <head> <title>ascii cam</title> <meta charset="utf-8" /> <link rel="stylesheet" href="css/main.css"/> </head> <body> <h1 id="info">Please allow this page to access your camera.</h1> <div id="notSupported"> <h1>Your browser does not support the Camera API.</h1> <img src="http://idevelop.github.com/ascii-camera/images/screenshot.png" align="center" /> </div> <pre id="ascii"></pre> <a href="https://github.com/idevelop/ascii-camera" id="fork" target="_blank"><img src="images/forkme.png" alt="Fork me on GitHub"></a> <script src="script/camera.js"></script> <script src="script/ascii.js"></script> <script src="script/app.js"></script> </body> </html>
<!doctype html> <html> <head> <title>ascii cam</title> <meta charset="utf-8" /> <link rel="stylesheet" href="css/main.css"/> </head> <body> <h1 id="info">Please allow this page to access your camera.</h1> <div id="notSupported"> <h1>Your browser does not support the Camera API.</h1> <img src="http://idevelop.github.com/ascii-camera/images/screenshot.png" align="center" /> </div> <pre id="ascii"></pre> <a href="https://github.com/idevelop/ascii-camera" id="fork" target="_blank"><img src="images/forkme.png" alt="Fork me on GitHub" border="0" /></a> <script src="script/camera.js"></script> <script src="script/ascii.js"></script> <script src="script/app.js"></script> </body> </html>
Hide border on fork image link
Hide border on fork image link
HTML
mit
boboanan/ascii-camera,rakesh-mohanta/ascii-camera,pedroha/ascii-camera,andygiannini/ascii-camera,DigitalCoder/ascii-camera,boboanan/ascii-camera,DigitalCoder/ascii-camera,idevelop/ascii-camera,idevelop/ascii-camera,rakesh-mohanta/ascii-camera,zhukaixy/ascii-camera,zhukaixy/ascii-camera,andygiannini/ascii-camera,pedroha/ascii-camera
html
## Code Before: <!doctype html> <html> <head> <title>ascii cam</title> <meta charset="utf-8" /> <link rel="stylesheet" href="css/main.css"/> </head> <body> <h1 id="info">Please allow this page to access your camera.</h1> <div id="notSupported"> <h1>Your browser does not support the Camera API.</h1> <img src="http://idevelop.github.com/ascii-camera/images/screenshot.png" align="center" /> </div> <pre id="ascii"></pre> <a href="https://github.com/idevelop/ascii-camera" id="fork" target="_blank"><img src="images/forkme.png" alt="Fork me on GitHub"></a> <script src="script/camera.js"></script> <script src="script/ascii.js"></script> <script src="script/app.js"></script> </body> </html> ## Instruction: Hide border on fork image link ## Code After: <!doctype html> <html> <head> <title>ascii cam</title> <meta charset="utf-8" /> <link rel="stylesheet" href="css/main.css"/> </head> <body> <h1 id="info">Please allow this page to access your camera.</h1> <div id="notSupported"> <h1>Your browser does not support the Camera API.</h1> <img src="http://idevelop.github.com/ascii-camera/images/screenshot.png" align="center" /> </div> <pre id="ascii"></pre> <a href="https://github.com/idevelop/ascii-camera" id="fork" target="_blank"><img src="images/forkme.png" alt="Fork me on GitHub" border="0" /></a> <script src="script/camera.js"></script> <script src="script/ascii.js"></script> <script src="script/app.js"></script> </body> </html>
3a679c2a492b8e127a3e73a685d4ac77b700bbe7
app/views/shared/_sidebar.html.erb
app/views/shared/_sidebar.html.erb
<div class="main-sidebar"> <!-- Inner sidebar --> <div class="sidebar"> <!-- Sidebar Menu --> <ul class="sidebar-menu"> <li class="header">Menu de operações</li> <li> <%= link_to exercises_path do %> <i class="fa fa-pencil-square-o"></i> <span>Meus exercícios</span> <% end %> </li> <li class="treeview"> <a href="#"><span>Multilevel</span> <i class="fa fa-angle-left pull-right"></i></a> <ul class="treeview-menu"> <li><a href="#">Link in level 2</a></li> <li><a href="#">Link in level 2</a></li> </ul> </li> </ul><!-- /.sidebar-menu --> </div><!-- /.sidebar --> </div><!-- /.main-sidebar -->
<div class="main-sidebar"> <!-- Inner sidebar --> <div class="sidebar"> <!-- Sidebar Menu --> <ul class="sidebar-menu"> <li class="header">Menu de operações</li> <li> <%= link_to exercises_path do %> <i class="fa fa-pencil-square-o"></i> <span>Meus exercícios</span> <% end %> </li> <li> <%= link_to teams_path do %> <i class="fa fa-users"></i> <span>Minhas turmas</span> <% end %> </li> <li class="treeview"> <a href="#"><span>Multilevel</span> <i class="fa fa-angle-left pull-right"></i></a> <ul class="treeview-menu"> <li><a href="#">Link in level 2</a></li> <li><a href="#">Link in level 2</a></li> </ul> </li> </ul><!-- /.sidebar-menu --> </div><!-- /.sidebar --> </div><!-- /.main-sidebar -->
Update sidebar with 'my_teams' link
Update sidebar with 'my_teams' link
HTML+ERB
mit
rwehresmann/farma_alg_reborn,rwehresmann/farma_alg_reborn,rwehresmann/farma_alg_reborn,rwehresmann/farma_alg_reborn,rwehresmann/farma_alg_reborn
html+erb
## Code Before: <div class="main-sidebar"> <!-- Inner sidebar --> <div class="sidebar"> <!-- Sidebar Menu --> <ul class="sidebar-menu"> <li class="header">Menu de operações</li> <li> <%= link_to exercises_path do %> <i class="fa fa-pencil-square-o"></i> <span>Meus exercícios</span> <% end %> </li> <li class="treeview"> <a href="#"><span>Multilevel</span> <i class="fa fa-angle-left pull-right"></i></a> <ul class="treeview-menu"> <li><a href="#">Link in level 2</a></li> <li><a href="#">Link in level 2</a></li> </ul> </li> </ul><!-- /.sidebar-menu --> </div><!-- /.sidebar --> </div><!-- /.main-sidebar --> ## Instruction: Update sidebar with 'my_teams' link ## Code After: <div class="main-sidebar"> <!-- Inner sidebar --> <div class="sidebar"> <!-- Sidebar Menu --> <ul class="sidebar-menu"> <li class="header">Menu de operações</li> <li> <%= link_to exercises_path do %> <i class="fa fa-pencil-square-o"></i> <span>Meus exercícios</span> <% end %> </li> <li> <%= link_to teams_path do %> <i class="fa fa-users"></i> <span>Minhas turmas</span> <% end %> </li> <li class="treeview"> <a href="#"><span>Multilevel</span> <i class="fa fa-angle-left pull-right"></i></a> <ul class="treeview-menu"> <li><a href="#">Link in level 2</a></li> <li><a href="#">Link in level 2</a></li> </ul> </li> </ul><!-- /.sidebar-menu --> </div><!-- /.sidebar --> </div><!-- /.main-sidebar -->
35a660ca846a58cea2c2a995f7d30c76f65b6e28
_sass/_variables.scss
_sass/_variables.scss
// // VARIABLES // // Colors $blue: #4183C4; // Grays $black: #000; $darkerGray: #222; $darkGray: #333; $gray: #666; $lightGray: #eee; $white: #fff; // Font stacks $helvetica: Helvetica, Arial, sans-serif; $helveticaNeue: "Helvetica Neue", Helvetica, Arial, sans-serif; $georgia: Georgia, serif; // Mobile breakpoints @mixin mobile { @media screen and (max-width: 640px) { @content; } }
// // VARIABLES // // Colors $blue: #4183C4; // Grays $black: #000; $darkerGray: #222; $darkGray: #333; $gray: #666; $lightGray: #eee; $white: #fff; // Font stacks $monaco: monaco, consolas, Lucida Console, monospace; $helvetica: Helvetica, Arial, sans-serif; $helveticaNeue: "Helvetica Neue", Helvetica, Arial, sans-serif; $georgia: Georgia, serif; // Mobile breakpoints @mixin mobile { @media screen and (max-width: 640px) { @content; } }
Add Monaco and style-like fonts to variables
Add Monaco and style-like fonts to variables
SCSS
mit
sonirico/sonirico.github.io
scss
## Code Before: // // VARIABLES // // Colors $blue: #4183C4; // Grays $black: #000; $darkerGray: #222; $darkGray: #333; $gray: #666; $lightGray: #eee; $white: #fff; // Font stacks $helvetica: Helvetica, Arial, sans-serif; $helveticaNeue: "Helvetica Neue", Helvetica, Arial, sans-serif; $georgia: Georgia, serif; // Mobile breakpoints @mixin mobile { @media screen and (max-width: 640px) { @content; } } ## Instruction: Add Monaco and style-like fonts to variables ## Code After: // // VARIABLES // // Colors $blue: #4183C4; // Grays $black: #000; $darkerGray: #222; $darkGray: #333; $gray: #666; $lightGray: #eee; $white: #fff; // Font stacks $monaco: monaco, consolas, Lucida Console, monospace; $helvetica: Helvetica, Arial, sans-serif; $helveticaNeue: "Helvetica Neue", Helvetica, Arial, sans-serif; $georgia: Georgia, serif; // Mobile breakpoints @mixin mobile { @media screen and (max-width: 640px) { @content; } }
2cfe6e6c9284dfffba2943a8562e38844b6ba089
temba/campaigns/migrations/0015_campaignevent_message_new.py
temba/campaigns/migrations/0015_campaignevent_message_new.py
from __future__ import unicode_literals import json import temba.utils.models from django.contrib.postgres.operations import HStoreExtension from django.db import migrations def populate_message_new(apps, schema_editor): CampaignEvent = apps.get_model('campaigns', 'CampaignEvent') events = list(CampaignEvent.objects.filter(event_type='M').select_related('flow')) for event in events: try: event.message_new = json.loads(event.message) except Exception: event.message_new = {event.flow.base_language: event.message} event.save(update_fields=('message_new',)) if events: print("Converted %d campaign events" % len(events)) class Migration(migrations.Migration): dependencies = [ ('campaigns', '0014_auto_20170228_0837'), ] operations = [ HStoreExtension(), migrations.AddField( model_name='campaignevent', name='message_new', field=temba.utils.models.TranslatableField(max_length=640, null=True), ), migrations.RunPython(populate_message_new) ]
from __future__ import unicode_literals import json import temba.utils.models from django.contrib.postgres.operations import HStoreExtension from django.db import migrations def populate_message_new(apps, schema_editor): CampaignEvent = apps.get_model('campaigns', 'CampaignEvent') events = list(CampaignEvent.objects.filter(event_type='M').select_related('flow')) for event in events: try: event.message_new = json.loads(event.message) except Exception: base_lang = event.flow.base_language or 'base' event.message_new = {base_lang: event.message} event.save(update_fields=('message_new',)) if events: print("Converted %d campaign events" % len(events)) class Migration(migrations.Migration): dependencies = [ ('campaigns', '0014_auto_20170228_0837'), ] operations = [ HStoreExtension(), migrations.AddField( model_name='campaignevent', name='message_new', field=temba.utils.models.TranslatableField(max_length=640, null=True), ), migrations.RunPython(populate_message_new) ]
Fix migration to work with flows with no base_language
Fix migration to work with flows with no base_language
Python
agpl-3.0
pulilab/rapidpro,pulilab/rapidpro,pulilab/rapidpro,pulilab/rapidpro,pulilab/rapidpro
python
## Code Before: from __future__ import unicode_literals import json import temba.utils.models from django.contrib.postgres.operations import HStoreExtension from django.db import migrations def populate_message_new(apps, schema_editor): CampaignEvent = apps.get_model('campaigns', 'CampaignEvent') events = list(CampaignEvent.objects.filter(event_type='M').select_related('flow')) for event in events: try: event.message_new = json.loads(event.message) except Exception: event.message_new = {event.flow.base_language: event.message} event.save(update_fields=('message_new',)) if events: print("Converted %d campaign events" % len(events)) class Migration(migrations.Migration): dependencies = [ ('campaigns', '0014_auto_20170228_0837'), ] operations = [ HStoreExtension(), migrations.AddField( model_name='campaignevent', name='message_new', field=temba.utils.models.TranslatableField(max_length=640, null=True), ), migrations.RunPython(populate_message_new) ] ## Instruction: Fix migration to work with flows with no base_language ## Code After: from __future__ import unicode_literals import json import temba.utils.models from django.contrib.postgres.operations import HStoreExtension from django.db import migrations def populate_message_new(apps, schema_editor): CampaignEvent = apps.get_model('campaigns', 'CampaignEvent') events = list(CampaignEvent.objects.filter(event_type='M').select_related('flow')) for event in events: try: event.message_new = json.loads(event.message) except Exception: base_lang = event.flow.base_language or 'base' event.message_new = {base_lang: event.message} event.save(update_fields=('message_new',)) if events: print("Converted %d campaign events" % len(events)) class Migration(migrations.Migration): dependencies = [ ('campaigns', '0014_auto_20170228_0837'), ] operations = [ HStoreExtension(), migrations.AddField( model_name='campaignevent', name='message_new', field=temba.utils.models.TranslatableField(max_length=640, null=True), ), migrations.RunPython(populate_message_new) ]
7a72e1411197997e042fddaa011f32addf776679
views/tasks.jade
views/tasks.jade
extends layout block content div#content div#toast - var message = "Welcome " + user.displayName + "! " p#welcome: #[= message] #[a(href="prefs") Preferences] #[a(href="logout") Log out] div p Add task: #[input#inputCourseCode(type="text", placeholder="course code", name="coursecode")] #[input#inputTask(type="text", placeholder="task", name="task")] #[input#inputDeadline(type="text", name="deadline", placeholder="deadline")] #[input#inputWeight(type="number", min="0", max="100", placeholder="10.0", name="weight")]% &nbsp;&nbsp; #[button#btnSubmit(type="submit") submit] div#tasksList table#tasksTable tr th th Code th Task th Deadline th Weight th Priority h2 Completed Items table#completedTable tr th th Code th Task th Deadline th Weight
extends layout block content div#content div#toast - var message = "Welcome " + user.displayName + "! " p#welcome: #[= message] #[a(href="prefs") Preferences] #[a(href="logout") Log out] div p Add task: #[input#inputCourseCode(type="text", placeholder="course code", name="coursecode")] #[input#inputTask(type="text", placeholder="task", name="task")] #[input#inputDeadline(type="text", name="deadline", placeholder="deadline")] #[input#inputWeight(type="number", min="0", max="100", step="0.1", placeholder="10.0", name="weight")]% &nbsp;&nbsp; #[button#btnSubmit(type="submit") submit] div#tasksList table#tasksTable tr th th Code th Task th Deadline th Weight th Priority h2 Completed Items table#completedTable tr th th Code th Task th Deadline th Weight
Set step size to one tenth of a percent
Set step size to one tenth of a percent
Jade
mit
davidschlachter/multivariable-todo,davidschlachter/multivariable-todo
jade
## Code Before: extends layout block content div#content div#toast - var message = "Welcome " + user.displayName + "! " p#welcome: #[= message] #[a(href="prefs") Preferences] #[a(href="logout") Log out] div p Add task: #[input#inputCourseCode(type="text", placeholder="course code", name="coursecode")] #[input#inputTask(type="text", placeholder="task", name="task")] #[input#inputDeadline(type="text", name="deadline", placeholder="deadline")] #[input#inputWeight(type="number", min="0", max="100", placeholder="10.0", name="weight")]% &nbsp;&nbsp; #[button#btnSubmit(type="submit") submit] div#tasksList table#tasksTable tr th th Code th Task th Deadline th Weight th Priority h2 Completed Items table#completedTable tr th th Code th Task th Deadline th Weight ## Instruction: Set step size to one tenth of a percent ## Code After: extends layout block content div#content div#toast - var message = "Welcome " + user.displayName + "! " p#welcome: #[= message] #[a(href="prefs") Preferences] #[a(href="logout") Log out] div p Add task: #[input#inputCourseCode(type="text", placeholder="course code", name="coursecode")] #[input#inputTask(type="text", placeholder="task", name="task")] #[input#inputDeadline(type="text", name="deadline", placeholder="deadline")] #[input#inputWeight(type="number", min="0", max="100", step="0.1", placeholder="10.0", name="weight")]% &nbsp;&nbsp; #[button#btnSubmit(type="submit") submit] div#tasksList table#tasksTable tr th th Code th Task th Deadline th Weight th Priority h2 Completed Items table#completedTable tr th th Code th Task th Deadline th Weight
31b70c6b08eedc1773d4993e9d9d420d84197b49
corehq/apps/domain/__init__.py
corehq/apps/domain/__init__.py
from django.conf import settings from corehq.preindex import ExtraPreindexPlugin ExtraPreindexPlugin.register('domain', __file__, ( settings.NEW_DOMAINS_DB, settings.NEW_USERS_GROUPS_DB, settings.NEW_FIXTURES_DB, 'meta', )) SHARED_DOMAIN = "<shared>" UNKNOWN_DOMAIN = "<unknown>"
SHARED_DOMAIN = "<shared>" UNKNOWN_DOMAIN = "<unknown>"
Remove domain design doc from irrelevant couch dbs
Remove domain design doc from irrelevant couch dbs It used to contain views that were relevant to users, fixtures, and meta dbs, but these have since been removed. Currently all views in the domain design doc emit absolutely nothing in those domains
Python
bsd-3-clause
dimagi/commcare-hq,dimagi/commcare-hq,dimagi/commcare-hq,dimagi/commcare-hq,dimagi/commcare-hq
python
## Code Before: from django.conf import settings from corehq.preindex import ExtraPreindexPlugin ExtraPreindexPlugin.register('domain', __file__, ( settings.NEW_DOMAINS_DB, settings.NEW_USERS_GROUPS_DB, settings.NEW_FIXTURES_DB, 'meta', )) SHARED_DOMAIN = "<shared>" UNKNOWN_DOMAIN = "<unknown>" ## Instruction: Remove domain design doc from irrelevant couch dbs It used to contain views that were relevant to users, fixtures, and meta dbs, but these have since been removed. Currently all views in the domain design doc emit absolutely nothing in those domains ## Code After: SHARED_DOMAIN = "<shared>" UNKNOWN_DOMAIN = "<unknown>"
3b17a2b2adc4d5c0569578836460c4d616d57809
tox.ini
tox.ini
[tox] envlist = py27, py36, py37, py38, py39, purepy27, purepy38, py27-no-gpg, py38-no-gpg skipsdist = True [testenv] install_command = pip install --pre {opts} {packages} deps = -r{toxinidir}/requirements-pinned.txt -r{toxinidir}/requirements-test.txt commands = coverage run tests/aggregate_tests.py coverage report -m --fail-under 97 [testenv:purepy27] deps = -r{toxinidir}/requirements-min.txt -r{toxinidir}/requirements-test.txt commands = python -m tests.check_public_interfaces [testenv:purepy38] deps = -r{toxinidir}/requirements-min.txt commands = python -m tests.check_public_interfaces [testenv:py27-no-gpg] setenv = GNUPG = nonexisting-gpg-for-testing commands = python -m tests.check_public_interfaces_gpg [testenv:py38-no-gpg] setenv = GNUPG = nonexisting-gpg-for-testing commands = python -m tests.check_public_interfaces_gpg
[tox] envlist = py36, py37, py38, py39, purepy38, py38-no-gpg skipsdist = True [testenv] install_command = pip install --pre {opts} {packages} deps = -r{toxinidir}/requirements-pinned.txt -r{toxinidir}/requirements-test.txt commands = coverage run tests/aggregate_tests.py coverage report -m --fail-under 97 [testenv:purepy38] deps = -r{toxinidir}/requirements-min.txt commands = python -m tests.check_public_interfaces [testenv:py38-no-gpg] setenv = GNUPG = nonexisting-gpg-for-testing commands = python -m tests.check_public_interfaces_gpg
Drop Python 2.7 from test automation
Drop Python 2.7 from test automation We do not need to include Python 2.7 versions of our tox environments any more. Signed-off-by: Joshua Lock <[email protected]>
INI
mit
secure-systems-lab/securesystemslib,secure-systems-lab/securesystemslib
ini
## Code Before: [tox] envlist = py27, py36, py37, py38, py39, purepy27, purepy38, py27-no-gpg, py38-no-gpg skipsdist = True [testenv] install_command = pip install --pre {opts} {packages} deps = -r{toxinidir}/requirements-pinned.txt -r{toxinidir}/requirements-test.txt commands = coverage run tests/aggregate_tests.py coverage report -m --fail-under 97 [testenv:purepy27] deps = -r{toxinidir}/requirements-min.txt -r{toxinidir}/requirements-test.txt commands = python -m tests.check_public_interfaces [testenv:purepy38] deps = -r{toxinidir}/requirements-min.txt commands = python -m tests.check_public_interfaces [testenv:py27-no-gpg] setenv = GNUPG = nonexisting-gpg-for-testing commands = python -m tests.check_public_interfaces_gpg [testenv:py38-no-gpg] setenv = GNUPG = nonexisting-gpg-for-testing commands = python -m tests.check_public_interfaces_gpg ## Instruction: Drop Python 2.7 from test automation We do not need to include Python 2.7 versions of our tox environments any more. Signed-off-by: Joshua Lock <[email protected]> ## Code After: [tox] envlist = py36, py37, py38, py39, purepy38, py38-no-gpg skipsdist = True [testenv] install_command = pip install --pre {opts} {packages} deps = -r{toxinidir}/requirements-pinned.txt -r{toxinidir}/requirements-test.txt commands = coverage run tests/aggregate_tests.py coverage report -m --fail-under 97 [testenv:purepy38] deps = -r{toxinidir}/requirements-min.txt commands = python -m tests.check_public_interfaces [testenv:py38-no-gpg] setenv = GNUPG = nonexisting-gpg-for-testing commands = python -m tests.check_public_interfaces_gpg
5ffd9b11fddf653755c619163e54817c4bae565e
tox.ini
tox.ini
[tox] envlist = py26, py27, py33, py34, py35, flake8 [testenv:flake8] basepython=python deps=flake8 commands=flake8 thrum [testenv] setenv = PYTHONPATH = {toxinidir}:{toxinidir}/thrum commands = python setup.py test ; If you want to make tox run the tests with the same versions, create a ; requirements.txt with the pinned versions and uncomment the following lines: ; deps = ; -r{toxinidir}/requirements.txt
[tox] envlist = pypy, py27, py33, py34, py35, flake8 skip_missing_interpreters=True [testenv:flake8] basepython=python deps = flake8 twisted commands=flake8 thrum [testenv] passenv = * setenv = PYTHONPATH = {toxinidir}:{toxinidir}/thrum deps = twisted commands = python -m twisted.trial thrum ; If you want to make tox run the tests with the same versions, create a ; requirements.txt with the pinned versions and uncomment the following lines: ; deps = ; -r{toxinidir}/requirements.txt
Remove py26; Add pypy; Use Trial
Tox: Remove py26; Add pypy; Use Trial
INI
mit
donalm/thrum
ini
## Code Before: [tox] envlist = py26, py27, py33, py34, py35, flake8 [testenv:flake8] basepython=python deps=flake8 commands=flake8 thrum [testenv] setenv = PYTHONPATH = {toxinidir}:{toxinidir}/thrum commands = python setup.py test ; If you want to make tox run the tests with the same versions, create a ; requirements.txt with the pinned versions and uncomment the following lines: ; deps = ; -r{toxinidir}/requirements.txt ## Instruction: Tox: Remove py26; Add pypy; Use Trial ## Code After: [tox] envlist = pypy, py27, py33, py34, py35, flake8 skip_missing_interpreters=True [testenv:flake8] basepython=python deps = flake8 twisted commands=flake8 thrum [testenv] passenv = * setenv = PYTHONPATH = {toxinidir}:{toxinidir}/thrum deps = twisted commands = python -m twisted.trial thrum ; If you want to make tox run the tests with the same versions, create a ; requirements.txt with the pinned versions and uncomment the following lines: ; deps = ; -r{toxinidir}/requirements.txt
283242f69d931b91b1ee823937d550e38ae2b35b
spec/spec_helper.rb
spec/spec_helper.rb
ENV["RAILS_ENV"] = "test" require File.expand_path("../dummy/config/environment.rb", __FILE__) require 'rspec/rails' require 'factory_girl' require 'database_cleaner' ENGINE_RAILS_ROOT=File.join(File.dirname(__FILE__), '../') # Requires supporting ruby files with custom matchers and macros, etc, # in spec/support/ and its subdirectories. Dir[File.join(ENGINE_RAILS_ROOT, "spec/support/**/*.rb")].each {|f| require f } RSpec.configure do |config| config.use_transactional_fixtures = false config.include(MailerMacros) config.before(:each) { reset_email } config.include Devise::TestHelpers, :type => :controller config.before(:suite) do DatabaseCleaner.strategy = :transaction DatabaseCleaner.clean_with(:truncation) end config.before(:each) do DatabaseCleaner.start end config.after(:each) do DatabaseCleaner.clean end end
ENV["RAILS_ENV"] = "test" require File.expand_path("../dummy/config/environment.rb", __FILE__) require 'rspec/rails' require 'factory_girl' require 'database_cleaner' ENGINE_RAILS_ROOT=File.join(File.dirname(__FILE__), '../') # Requires supporting ruby files with custom matchers and macros, etc, # in spec/support/ and its subdirectories. Dir[File.join(ENGINE_RAILS_ROOT, "spec/support/**/*.rb")].each {|f| require f } RSpec.configure do |config| config.use_transactional_fixtures = false config.include(MailerMacros) config.before(:each) { reset_email } config.include Devise::TestHelpers, :type => :controller config.before(:each) do if example.metadata[:js] DatabaseCleaner.strategy = :truncation else DatabaseCleaner.strategy = :transaction end end config.before(:each) do DatabaseCleaner.start end config.after(:each) do DatabaseCleaner.clean end end
Use truncation database cleaner strategy for js tests
Use truncation database cleaner strategy for js tests
Ruby
mit
NOX73/forem,trakt/forem,bcavileer/forem,BanzaiMan/forem,ajeje/forem,mmcc/forem,jarray52/forem,jeremyong/forem,bbatsov/forem,ketanjain/forem,hepu/forem,rubysherpas/forem,radar/forem,knewter/forem,xenonn/forem,efrence/forem,mijoharas/forem,eivindhagen/forem,sandboxrem/forem,link-er/forem,tomrc/forem,abrambailey/forem,bagusyuu/forem,nikolaevav/forem,wesleycho/forem,etjossem/Goodsmiths-Hub,nikolaevav/forem,ketanjain/forem,tomkrus/forem,mmcc/forem,mauriciopasquier/forem,RostDetal/forem,amwelles/forem,bodrovis/forem,eivindhagen/forem,elm-city-craftworks/forem,kot-begemot/forem,trakt/forem,link-er/forem,joneslee85/forem,kamarcum/forem,amwelles/forem,moh-alsheikh/forem,zshannon/forem,andypike/forem,ajeje/forem,tomkrus/forem,tomrc/forem,zshannon/forem,rubysherpas/forem,mauriciopasquier/forem,kamarcum/forem,moh-alsheikh/forem,SamLau95/forem,etjossem/Goodsmiths-Hub,rdy/forem,juozasg/forem,abrambailey/forem,ciscou/forem,jeremyong/forem,devleoper/forem,RostDetal/forem,syndicut/forem,efrence/forem,rubysherpas/forem,frankmarineau/forem,netjungle/forem,devleoper/forem,devleoper/forem,frankmarineau/forem,jaybrueder/forem,kunalchaudhari/forem,kunalchaudhari/forem,gabriel-dehan/forem,andypike/forem,Fullscreen/forem,tonic20/forem,tonic20/forem,zhublik/forem,trakt/forem,radar/forem,beansmile/forem,zhublik/forem,wuwx/forem,mhoad/forem,knewter/forem,NOX73/forem,rdy/forem,kot-begemot/forem,juozasg/forem,Fullscreen/forem,0xCCD/forem,codev/forem,beefsack/forem,code-mancers/forem,gabriel-dehan/forem,yan-hoose/forem,prasadrails/forem,daniely/forem,bbatsov/forem,codev/forem,efrence/forem,ch000/forem,hepu/forem,bcavileer/forem,0xCCD/forem,ascendbruce/forem,ciscou/forem,code-mancers/forem,xenonn/forem,brooks/forem,ascendbruce/forem,joneslee85/forem,hanspolo/forem,BoboFraggins/forem,danman01/forem,mijoharas/forem,BanzaiMan/forem,beefsack/forem,ketanjain/forem,daniely/forem,ch000/forem,SamLau95/forem,elm-city-craftworks/forem,netjungle/forem,hanspolo/forem,syndicut/forem,kenmazaika/forem,danman01/forem,BoboFraggins/forem,bodrovis/forem,mhoad/forem,jaybrueder/forem,hanspolo/forem,bagusyuu/forem,radar/forem,bvsatyaram/forem,kl/forem,jarray52/forem,frankmarineau/forem,zhublik/forem,prasadrails/forem,sandboxrem/forem,wuwx/forem,wuwx/forem,brooks/forem,RostDetal/forem,nikolaevav/forem,ajeje/forem,wesleycho/forem,kenmazaika/forem,yan-hoose/forem,beansmile/forem
ruby
## Code Before: ENV["RAILS_ENV"] = "test" require File.expand_path("../dummy/config/environment.rb", __FILE__) require 'rspec/rails' require 'factory_girl' require 'database_cleaner' ENGINE_RAILS_ROOT=File.join(File.dirname(__FILE__), '../') # Requires supporting ruby files with custom matchers and macros, etc, # in spec/support/ and its subdirectories. Dir[File.join(ENGINE_RAILS_ROOT, "spec/support/**/*.rb")].each {|f| require f } RSpec.configure do |config| config.use_transactional_fixtures = false config.include(MailerMacros) config.before(:each) { reset_email } config.include Devise::TestHelpers, :type => :controller config.before(:suite) do DatabaseCleaner.strategy = :transaction DatabaseCleaner.clean_with(:truncation) end config.before(:each) do DatabaseCleaner.start end config.after(:each) do DatabaseCleaner.clean end end ## Instruction: Use truncation database cleaner strategy for js tests ## Code After: ENV["RAILS_ENV"] = "test" require File.expand_path("../dummy/config/environment.rb", __FILE__) require 'rspec/rails' require 'factory_girl' require 'database_cleaner' ENGINE_RAILS_ROOT=File.join(File.dirname(__FILE__), '../') # Requires supporting ruby files with custom matchers and macros, etc, # in spec/support/ and its subdirectories. Dir[File.join(ENGINE_RAILS_ROOT, "spec/support/**/*.rb")].each {|f| require f } RSpec.configure do |config| config.use_transactional_fixtures = false config.include(MailerMacros) config.before(:each) { reset_email } config.include Devise::TestHelpers, :type => :controller config.before(:each) do if example.metadata[:js] DatabaseCleaner.strategy = :truncation else DatabaseCleaner.strategy = :transaction end end config.before(:each) do DatabaseCleaner.start end config.after(:each) do DatabaseCleaner.clean end end
e15a760e13a4ca71808716e78807478234f45a41
requirements/development.txt
requirements/development.txt
-r staging.txt ansible==6.4.0 ansible-lint[yamllint]==6.7.0 coverage==6.4.4 factory_boy==3.2.1 flake8==5.0.4 ipython==8.5.0 rstcheck==5.0.0 sphinx==5.2.2
-r staging.txt ansible==6.4.0 ansible-lint==6.7.0 coverage==6.4.4 factory_boy==3.2.1 flake8==5.0.4 ipython==8.5.0 rstcheck==5.0.0 sphinx==5.2.2
Remove yamllint extra from ansible-lint
Remove yamllint extra from ansible-lint ansible-lint 6.0.0 made yamllint a direct dependency, so it is no longer necessary to include it in the requirements file.
Text
mit
pydata/conf_site,pydata/conf_site,pydata/conf_site
text
## Code Before: -r staging.txt ansible==6.4.0 ansible-lint[yamllint]==6.7.0 coverage==6.4.4 factory_boy==3.2.1 flake8==5.0.4 ipython==8.5.0 rstcheck==5.0.0 sphinx==5.2.2 ## Instruction: Remove yamllint extra from ansible-lint ansible-lint 6.0.0 made yamllint a direct dependency, so it is no longer necessary to include it in the requirements file. ## Code After: -r staging.txt ansible==6.4.0 ansible-lint==6.7.0 coverage==6.4.4 factory_boy==3.2.1 flake8==5.0.4 ipython==8.5.0 rstcheck==5.0.0 sphinx==5.2.2
596adc3b120987457db1c5b5a077156eeb656348
website/partners/solutions/possible-uk/index.hbs
website/partners/solutions/possible-uk/index.hbs
--- layout: partners.hbs title: "POSSIBLE" solutions: true summary: logo: "//d1qmdf3vop2l07.cloudfront.net/optimizely-marketer-assets.cloudvent.net/raw/partner-logos/solutions/possible-uk.png" website_link: "http://www.possible.com/" website_display: "www.possible.com" kb_article: stars: 2 industry: "Strategic Consulting, Custom Implementation, Test Development, Execution, & Management" locations: - location: phone: "+44 020 3349 5800" email: "[email protected]" region: "EMEA - UK" city: "London" state: "UK" address1: "77 Hatton Garden" address2: zip: "EC1N 8JS" country: "UK" contact: languages: - "English" - "Dutch" tags: - "E-Commerce / Retail" - "Insurance" - "Finance" - "Healthcare" - "Mobile" --- POSSIBLE is a creative agency that cares about results. Our mission is simple: create world-class work that works. That’s why we back up every idea with hard-core data for solutions that make a measurable difference.
--- layout: partners title: "POSSIBLE" solutions: true TR_summary: logo: "//d1qmdf3vop2l07.cloudfront.net/optimizely-marketer-assets.cloudvent.net/raw/partner-logos/solutions/possible-uk.png" website_link: "//www.possible.com/" website_display: "www.possible.com" kb_article: stars: 2 TR_industry: "Strategic Consulting, Custom Implementation, Test Development, Execution, & Management" locations: - location: phone: "+44 020 3349 5800" email: "[email protected]" region: "EMEA - UK" city: "London" state: "UK" address1: "77 Hatton Garden" address2: zip: "EC1N 8JS" country: "UK" contact: languages: - "English" - "Dutch" tags: - "E-Commerce / Retail" - "Insurance" - "Finance" - "Healthcare" - "Mobile" MD_page_content: true --- POSSIBLE is a creative agency that cares about results. Our mission is simple: create world-class work that works. That’s why we back up every idea with hard-core data for solutions that make a measurable difference.
Update syntax for assemble v6
Update syntax for assemble v6
Handlebars
mit
CilantroOrg/CilantroVegOrg,CilantroOrg/CilantroVegOrg,CilantroOrg/CilantroVegOrg
handlebars
## Code Before: --- layout: partners.hbs title: "POSSIBLE" solutions: true summary: logo: "//d1qmdf3vop2l07.cloudfront.net/optimizely-marketer-assets.cloudvent.net/raw/partner-logos/solutions/possible-uk.png" website_link: "http://www.possible.com/" website_display: "www.possible.com" kb_article: stars: 2 industry: "Strategic Consulting, Custom Implementation, Test Development, Execution, & Management" locations: - location: phone: "+44 020 3349 5800" email: "[email protected]" region: "EMEA - UK" city: "London" state: "UK" address1: "77 Hatton Garden" address2: zip: "EC1N 8JS" country: "UK" contact: languages: - "English" - "Dutch" tags: - "E-Commerce / Retail" - "Insurance" - "Finance" - "Healthcare" - "Mobile" --- POSSIBLE is a creative agency that cares about results. Our mission is simple: create world-class work that works. That’s why we back up every idea with hard-core data for solutions that make a measurable difference. ## Instruction: Update syntax for assemble v6 ## Code After: --- layout: partners title: "POSSIBLE" solutions: true TR_summary: logo: "//d1qmdf3vop2l07.cloudfront.net/optimizely-marketer-assets.cloudvent.net/raw/partner-logos/solutions/possible-uk.png" website_link: "//www.possible.com/" website_display: "www.possible.com" kb_article: stars: 2 TR_industry: "Strategic Consulting, Custom Implementation, Test Development, Execution, & Management" locations: - location: phone: "+44 020 3349 5800" email: "[email protected]" region: "EMEA - UK" city: "London" state: "UK" address1: "77 Hatton Garden" address2: zip: "EC1N 8JS" country: "UK" contact: languages: - "English" - "Dutch" tags: - "E-Commerce / Retail" - "Insurance" - "Finance" - "Healthcare" - "Mobile" MD_page_content: true --- POSSIBLE is a creative agency that cares about results. Our mission is simple: create world-class work that works. That’s why we back up every idea with hard-core data for solutions that make a measurable difference.
974d38622d1761c97627ed69538fb1986482e003
templates/agenda/includes/event_metainfo_line.html
templates/agenda/includes/event_metainfo_line.html
{% load i18n mezzanine_tags event_tags organization_tags %} <div class="page__meta-date"> <a href="{{ event.get_absolute_url }}"> {% include 'agenda/includes/event_date_line.html' with object=event %} </a> </div> <div class="page__meta-separator page__meta-separator--small"></div> {% if event.location %} <div class="page__meta-title">{{ event.location }}</div> {% endif %} {% if unit_booking %} {% if event.prices.all and not is_archive and not event.is_full %} <p> {% if event.trainings.all|length %} {% with event.links.all as links %} {% if links %} <a class="button mt1" href="{{ links|get_type_link:'link'|first }}" class="event__meta__btn"> {% trans "Reserve" %} </a> {% endif %} {% endwith %} {% else %} <a class="button mt1" href="{% url 'event_booking' event.slug %}" class="event__meta__btn"> {% trans "Reserve" %} </a> {% endif %} </p> {% endif %} {% endif %}
{% load i18n mezzanine_tags event_tags organization_tags %} <div class="page__meta-date"> <a href="{{ event.get_absolute_url }}"> {% include 'agenda/includes/event_date_line.html' with object=event %} </a> </div> <div class="page__meta-separator page__meta-separator--small"></div> {% if event.location %} <div class="page__meta-title">{{ event.location }}</div> {% endif %} {% if unit_booking %} {% if event.prices.all and not is_archive and not event.is_full %} <p> {% if event.trainings.all|length and not event.external_id %} {% with event.links.all as links %} {% if links %} <a class="button mt1" href="{{ links|get_type_link:'link'|first }}" class="event__meta__btn"> {% trans "Reserve" %} </a> {% endif %} {% endwith %} {% else %} <a class="button mt1" href="{% url 'event_booking' event.slug %}" class="event__meta__btn"> {% trans "Reserve" %} </a> {% endif %} </p> {% endif %} {% endif %}
Fix booking button on event list
Fix booking button on event list
HTML
agpl-3.0
Ircam-Web/mezzanine-organization,Ircam-Web/mezzanine-organization
html
## Code Before: {% load i18n mezzanine_tags event_tags organization_tags %} <div class="page__meta-date"> <a href="{{ event.get_absolute_url }}"> {% include 'agenda/includes/event_date_line.html' with object=event %} </a> </div> <div class="page__meta-separator page__meta-separator--small"></div> {% if event.location %} <div class="page__meta-title">{{ event.location }}</div> {% endif %} {% if unit_booking %} {% if event.prices.all and not is_archive and not event.is_full %} <p> {% if event.trainings.all|length %} {% with event.links.all as links %} {% if links %} <a class="button mt1" href="{{ links|get_type_link:'link'|first }}" class="event__meta__btn"> {% trans "Reserve" %} </a> {% endif %} {% endwith %} {% else %} <a class="button mt1" href="{% url 'event_booking' event.slug %}" class="event__meta__btn"> {% trans "Reserve" %} </a> {% endif %} </p> {% endif %} {% endif %} ## Instruction: Fix booking button on event list ## Code After: {% load i18n mezzanine_tags event_tags organization_tags %} <div class="page__meta-date"> <a href="{{ event.get_absolute_url }}"> {% include 'agenda/includes/event_date_line.html' with object=event %} </a> </div> <div class="page__meta-separator page__meta-separator--small"></div> {% if event.location %} <div class="page__meta-title">{{ event.location }}</div> {% endif %} {% if unit_booking %} {% if event.prices.all and not is_archive and not event.is_full %} <p> {% if event.trainings.all|length and not event.external_id %} {% with event.links.all as links %} {% if links %} <a class="button mt1" href="{{ links|get_type_link:'link'|first }}" class="event__meta__btn"> {% trans "Reserve" %} </a> {% endif %} {% endwith %} {% else %} <a class="button mt1" href="{% url 'event_booking' event.slug %}" class="event__meta__btn"> {% trans "Reserve" %} </a> {% endif %} </p> {% endif %} {% endif %}
36941e1de9275e35b57a0e1869521a715df7cad0
README.md
README.md
This is for a React JS presentation.
React Slides ===================== A simple presentation slide system build in ReactJS. This is not intended to be used as a production slide system just an education tool in learning ReactJS. ### Usage With Docker ``` docker build -t react-slides . docker run -d -p 3000:3000 -v $(pwd)/react-slides/src:/app/src react-slides ``` or just locally ``` npm install npm start open http://localhost:3000 ``` ### Dependencies * React * Webpack * [webpack-dev-server](https://github.com/webpack/webpack-dev-server) * [babel-loader](https://github.com/babel/babel-loader) * [react-hot-loader](https://github.com/gaearon/react-hot-loader) ### Resources **ADD READ MORE SOURCES HERE**
Update readme with some instructions.
readme: Update readme with some instructions.
Markdown
mit
leadiv/react-slides,leadiv/react-slides
markdown
## Code Before: This is for a React JS presentation. ## Instruction: readme: Update readme with some instructions. ## Code After: React Slides ===================== A simple presentation slide system build in ReactJS. This is not intended to be used as a production slide system just an education tool in learning ReactJS. ### Usage With Docker ``` docker build -t react-slides . docker run -d -p 3000:3000 -v $(pwd)/react-slides/src:/app/src react-slides ``` or just locally ``` npm install npm start open http://localhost:3000 ``` ### Dependencies * React * Webpack * [webpack-dev-server](https://github.com/webpack/webpack-dev-server) * [babel-loader](https://github.com/babel/babel-loader) * [react-hot-loader](https://github.com/gaearon/react-hot-loader) ### Resources **ADD READ MORE SOURCES HERE**
83e8d43369df0834ca1826704a9066192b4d7d1a
.travis.yml
.travis.yml
language: python dist: trusty python: - "2.7" install: - pip install requests configargparse bcrypt - pip install coverage nose script: - nosetests --with-coverage --cover-xml --cover-package=xclib after_success: - bash <(curl -s https://codecov.io/bash)
language: python dist: trusty sudo: required python: - "3.5" install: - sudo apt install libdb5.3-dev - pip install requests configargparse bcrypt bsddb3 - pip install coverage nose rednose script: - nosetests --with-coverage --cover-xml --cover-package=xclib after_success: - bash <(curl -s https://codecov.io/bash)
Switch Travis to Python 3
Switch Travis to Python 3
YAML
mit
jsxc/xmpp-cloud-auth,jsxc/xmpp-cloud-auth,jsxc/xmpp-cloud-auth,jsxc/xmpp-cloud-auth
yaml
## Code Before: language: python dist: trusty python: - "2.7" install: - pip install requests configargparse bcrypt - pip install coverage nose script: - nosetests --with-coverage --cover-xml --cover-package=xclib after_success: - bash <(curl -s https://codecov.io/bash) ## Instruction: Switch Travis to Python 3 ## Code After: language: python dist: trusty sudo: required python: - "3.5" install: - sudo apt install libdb5.3-dev - pip install requests configargparse bcrypt bsddb3 - pip install coverage nose rednose script: - nosetests --with-coverage --cover-xml --cover-package=xclib after_success: - bash <(curl -s https://codecov.io/bash)
ffc3e16e474fae0f28c8d2b3022e5bb2ce2c7ac1
bash.php
bash.php
<?php $credentials = require __DIR__.'/credentials.php'; foreach (glob(__DIR__.'/includes/*.php') as $file) { require_once $file; } $config = [ 'rootDir' => __DIR__, 'password' => $credentials['password'], 'title' => 'SCReddit QDB', 'enableCaptcha' => false, 'latest' => 10, 'top' => 25, 'browsePP' => 50, 'random' => 25, 'search' => 25, ]; $app = new Application($config); $app->connectMysql($credentials['mysqlUser'], $credentials['mysqlPass'], 'bash'); $app->run();
<?php $credentials = require __DIR__.'/credentials.php'; foreach (glob(__DIR__.'/includes/*.php') as $file) { require_once $file; } $config = [ 'rootDir' => __DIR__, 'password' => $credentials['password'], 'title' => 'SCReddit QDB', 'enableCaptcha' => $credentials['enableCaptcha'], 'latest' => 10, 'top' => 25, 'browsePP' => 50, 'random' => 25, 'search' => 25, ]; $app = new Application($config); $app->connectMysql($credentials['mysqlUser'], $credentials['mysqlPass'], 'bash'); $app->run();
Allow CAPTCHA from config file
Allow CAPTCHA from config file
PHP
bsd-2-clause
x89/redditeu-bash,x89/redditeu-bash
php
## Code Before: <?php $credentials = require __DIR__.'/credentials.php'; foreach (glob(__DIR__.'/includes/*.php') as $file) { require_once $file; } $config = [ 'rootDir' => __DIR__, 'password' => $credentials['password'], 'title' => 'SCReddit QDB', 'enableCaptcha' => false, 'latest' => 10, 'top' => 25, 'browsePP' => 50, 'random' => 25, 'search' => 25, ]; $app = new Application($config); $app->connectMysql($credentials['mysqlUser'], $credentials['mysqlPass'], 'bash'); $app->run(); ## Instruction: Allow CAPTCHA from config file ## Code After: <?php $credentials = require __DIR__.'/credentials.php'; foreach (glob(__DIR__.'/includes/*.php') as $file) { require_once $file; } $config = [ 'rootDir' => __DIR__, 'password' => $credentials['password'], 'title' => 'SCReddit QDB', 'enableCaptcha' => $credentials['enableCaptcha'], 'latest' => 10, 'top' => 25, 'browsePP' => 50, 'random' => 25, 'search' => 25, ]; $app = new Application($config); $app->connectMysql($credentials['mysqlUser'], $credentials['mysqlPass'], 'bash'); $app->run();
5fd25b2a1beb8cf2bc34fe81bf6016bcb90267e8
features/scripts/create_unity_project.sh
features/scripts/create_unity_project.sh
export PATH="$PATH:/Applications/Unity/Unity.app/Contents/MacOS" pushd "${0%/*}" pushd ../.. package_path=`pwd` popd pushd ../fixtures git clean -xdf . project_path="$(pwd)/unity_project" Unity -nographics -quit -batchmode -logFile unity.log -createProject $project_path Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -importPackage "$package_path/Bugsnag.unitypackage" cp Main.cs unity_project/Assets/Main.cs BUGSNAG_APIKEY=a35a2a72bd230ac0aa0f52715bbdc6aa Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -executeMethod "Main.CreateScene" Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -buildOSXUniversalPlayer "$package_path/features/fixtures/Mazerunner.app" popd popd
export PATH="$PATH:/Applications/Unity/Unity.app/Contents/MacOS" pushd "${0%/*}" pushd ../.. package_path=`pwd` popd pushd ../fixtures git clean -xdf . project_path="$(pwd)/unity_project" Unity -nographics -quit -batchmode -logFile unity.log -createProject $project_path Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -importPackage "$package_path/Bugsnag.unitypackage" cp Main.cs unity_project/Assets/Main.cs Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -executeMethod "Main.CreateScene" Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -buildOSXUniversalPlayer "$package_path/features/fixtures/Mazerunner.app" popd popd
Remove api key env variable, set elsewhere
Remove api key env variable, set elsewhere
Shell
mit
bugsnag/bugsnag-unity,bugsnag/bugsnag-unity,bugsnag/bugsnag-unity,bugsnag/bugsnag-unity,bugsnag/bugsnag-unity,bugsnag/bugsnag-unity
shell
## Code Before: export PATH="$PATH:/Applications/Unity/Unity.app/Contents/MacOS" pushd "${0%/*}" pushd ../.. package_path=`pwd` popd pushd ../fixtures git clean -xdf . project_path="$(pwd)/unity_project" Unity -nographics -quit -batchmode -logFile unity.log -createProject $project_path Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -importPackage "$package_path/Bugsnag.unitypackage" cp Main.cs unity_project/Assets/Main.cs BUGSNAG_APIKEY=a35a2a72bd230ac0aa0f52715bbdc6aa Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -executeMethod "Main.CreateScene" Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -buildOSXUniversalPlayer "$package_path/features/fixtures/Mazerunner.app" popd popd ## Instruction: Remove api key env variable, set elsewhere ## Code After: export PATH="$PATH:/Applications/Unity/Unity.app/Contents/MacOS" pushd "${0%/*}" pushd ../.. package_path=`pwd` popd pushd ../fixtures git clean -xdf . project_path="$(pwd)/unity_project" Unity -nographics -quit -batchmode -logFile unity.log -createProject $project_path Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -importPackage "$package_path/Bugsnag.unitypackage" cp Main.cs unity_project/Assets/Main.cs Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -executeMethod "Main.CreateScene" Unity -nographics -quit -batchmode -logFile unity.log -projectpath $project_path -buildOSXUniversalPlayer "$package_path/features/fixtures/Mazerunner.app" popd popd
1c666e12e17c58086b8fe0644c342c5b2818101a
.travis.yml
.travis.yml
language: java addons: hosts: - test-bucket.localhost env: - PATH=$PATH:$HOME/.s3cmd SECOR_LOCAL_S3=true S3CMD=1.0.1 jdk: - openjdk7 - oraclejdk7 - oraclejdk8 before_install: - wget https://github.com/s3tools/s3cmd/archive/v$S3CMD.tar.gz -O /tmp/s3cmd.tar.gz - tar -xzf /tmp/s3cmd.tar.gz -C $HOME - mv $HOME/s3cmd-$S3CMD $HOME/.s3cmd - cd $HOME/.s3cmd && python setup.py install --user && cd - - gem install fakes3 -v 0.1.7 script: - make unit - make integration
language: java addons: hosts: - test-bucket.localhost env: - PATH=$PATH:$HOME/.s3cmd SECOR_LOCAL_S3=true S3CMD=1.0.1 jdk: - oraclejdk7 - oraclejdk8 before_install: - wget https://github.com/s3tools/s3cmd/archive/v$S3CMD.tar.gz -O /tmp/s3cmd.tar.gz - tar -xzf /tmp/s3cmd.tar.gz -C $HOME - mv $HOME/s3cmd-$S3CMD $HOME/.s3cmd - cd $HOME/.s3cmd && python setup.py install --user && cd - - gem install fakes3 -v 0.1.7 script: - make unit - make integration
Remove openjdk7 since it constantsly causing core dumps on starting zookeeper:
Remove openjdk7 since it constantsly causing core dumps on starting zookeeper: running ./scripts/run_kafka_class.sh org.apache.zookeeper.server.quorum.QuorumPeerMain ./scripts/../zookeeper.test.properties > /tmp/secor_dev/logs/zookeeper.log 2>&1 & *** buffer overflow detected ***: /usr/lib/jvm/java-7-openjdk-amd64/bin/java terminated ======= Backtrace: ========= /lib/x86_64-linux-gnu/libc.so.6(__fortify_fail+0x37)[0x2b029fa7ae57]
YAML
apache-2.0
TiVo/secor,Smartling/secor,Smartling/secor,TiVo/secor,nguyenvanthan/secor,liamstewart/secor,strava/secor,shawnsnguyen/secor,pinterest/secor,morrifeldman/secor,FundingCircle/secor,morrifeldman/secor,FundingCircle/secor,liamstewart/secor,shawnsnguyen/secor,HenryCaiHaiying/secor,strava/secor,pinterest/secor,HenryCaiHaiying/secor,nguyenvanthan/secor
yaml
## Code Before: language: java addons: hosts: - test-bucket.localhost env: - PATH=$PATH:$HOME/.s3cmd SECOR_LOCAL_S3=true S3CMD=1.0.1 jdk: - openjdk7 - oraclejdk7 - oraclejdk8 before_install: - wget https://github.com/s3tools/s3cmd/archive/v$S3CMD.tar.gz -O /tmp/s3cmd.tar.gz - tar -xzf /tmp/s3cmd.tar.gz -C $HOME - mv $HOME/s3cmd-$S3CMD $HOME/.s3cmd - cd $HOME/.s3cmd && python setup.py install --user && cd - - gem install fakes3 -v 0.1.7 script: - make unit - make integration ## Instruction: Remove openjdk7 since it constantsly causing core dumps on starting zookeeper: running ./scripts/run_kafka_class.sh org.apache.zookeeper.server.quorum.QuorumPeerMain ./scripts/../zookeeper.test.properties > /tmp/secor_dev/logs/zookeeper.log 2>&1 & *** buffer overflow detected ***: /usr/lib/jvm/java-7-openjdk-amd64/bin/java terminated ======= Backtrace: ========= /lib/x86_64-linux-gnu/libc.so.6(__fortify_fail+0x37)[0x2b029fa7ae57] ## Code After: language: java addons: hosts: - test-bucket.localhost env: - PATH=$PATH:$HOME/.s3cmd SECOR_LOCAL_S3=true S3CMD=1.0.1 jdk: - oraclejdk7 - oraclejdk8 before_install: - wget https://github.com/s3tools/s3cmd/archive/v$S3CMD.tar.gz -O /tmp/s3cmd.tar.gz - tar -xzf /tmp/s3cmd.tar.gz -C $HOME - mv $HOME/s3cmd-$S3CMD $HOME/.s3cmd - cd $HOME/.s3cmd && python setup.py install --user && cd - - gem install fakes3 -v 0.1.7 script: - make unit - make integration
3a204de33589de943ff09525895812530baac0b2
saylua/modules/pets/models/db.py
saylua/modules/pets/models/db.py
from google.appengine.ext import ndb # This is to store alternate linart versions of the same pets class SpeciesVersion(ndb.Model): name = ndb.StringProperty() base_image = ndb.StringProperty() base_psd = ndb.StringProperty() default_image = ndb.StringProperty() # Pets are divided into species and species are divided into variations class Species(ndb.Model): name = ndb.StringProperty(indexed=True) versions = ndb.StructuredProperty(SpeciesVersion, repeated=True) description = ndb.StringProperty() class SpeciesVariation(ndb.Model): species_key = ndb.KeyProperty(indexed=True) name = ndb.StringProperty(indexed=True) description = ndb.StringProperty() class Pet(ndb.Model): user_key = ndb.KeyProperty(indexed=True) variation_key = ndb.KeyProperty(indexed=True) # Only set if the pet is a variation species_name = ndb.StringProperty(indexed=True) # Note the denormalization # Personal profile information for the pet name = ndb.StringProperty() css = ndb.StringProperty() description = ndb.StringProperty() # If either of these is set to a number other than 0, the pet is for sale ss_price = ndb.IntegerProperty(default=0, indexed=True) cc_price = ndb.IntegerProperty(default=0, indexed=True)
from google.appengine.ext import ndb

# This is to store alternate linart versions of the same pets
class SpeciesVersion(ndb.Model):
    name = ndb.StringProperty()
    base_image = ndb.StringProperty()
    base_psd = ndb.StringProperty()
    default_image = ndb.StringProperty()

# Pets are divided into species and species are divided into variations
class Species(ndb.Model):
    name = ndb.StringProperty()
    versions = ndb.StructuredProperty(SpeciesVersion)
    description = ndb.TextProperty()

class SpeciesVariation(ndb.Model):
    species_id = ndb.StringProperty()
    name = ndb.StringProperty()
    description = ndb.TextProperty()

class Pet(ndb.Model):
    pet_id = ndb.StringProperty()
    owner_id = ndb.IntegerProperty()
    variation_key = ndb.KeyProperty() # Only set if the pet is a variation
    species_name = ndb.StringProperty() # Note the denormalization

    # Personal profile information for the pet
    name = ndb.StringProperty()
    css = ndb.TextProperty()
    description = ndb.TextProperty()

    # If either of these is set to a number other than 0, the pet is for sale
    ss_price = ndb.IntegerProperty(default=0)
    cc_price = ndb.IntegerProperty(default=0)
Update to pet model for provisioner
Update to pet model for provisioner
Python
agpl-3.0
saylua/SayluaV2,saylua/SayluaV2,LikeMyBread/Saylua,LikeMyBread/Saylua,saylua/SayluaV2,LikeMyBread/Saylua,LikeMyBread/Saylua
python
## Code Before: from google.appengine.ext import ndb

# This is to store alternate linart versions of the same pets
class SpeciesVersion(ndb.Model):
    name = ndb.StringProperty()
    base_image = ndb.StringProperty()
    base_psd = ndb.StringProperty()
    default_image = ndb.StringProperty()

# Pets are divided into species and species are divided into variations
class Species(ndb.Model):
    name = ndb.StringProperty(indexed=True)
    versions = ndb.StructuredProperty(SpeciesVersion, repeated=True)
    description = ndb.StringProperty()

class SpeciesVariation(ndb.Model):
    species_key = ndb.KeyProperty(indexed=True)
    name = ndb.StringProperty(indexed=True)
    description = ndb.StringProperty()

class Pet(ndb.Model):
    user_key = ndb.KeyProperty(indexed=True)
    variation_key = ndb.KeyProperty(indexed=True) # Only set if the pet is a variation
    species_name = ndb.StringProperty(indexed=True) # Note the denormalization

    # Personal profile information for the pet
    name = ndb.StringProperty()
    css = ndb.StringProperty()
    description = ndb.StringProperty()

    # If either of these is set to a number other than 0, the pet is for sale
    ss_price = ndb.IntegerProperty(default=0, indexed=True)
    cc_price = ndb.IntegerProperty(default=0, indexed=True)

## Instruction: Update to pet model for provisioner

## Code After: from google.appengine.ext import ndb

# This is to store alternate linart versions of the same pets
class SpeciesVersion(ndb.Model):
    name = ndb.StringProperty()
    base_image = ndb.StringProperty()
    base_psd = ndb.StringProperty()
    default_image = ndb.StringProperty()

# Pets are divided into species and species are divided into variations
class Species(ndb.Model):
    name = ndb.StringProperty()
    versions = ndb.StructuredProperty(SpeciesVersion)
    description = ndb.TextProperty()

class SpeciesVariation(ndb.Model):
    species_id = ndb.StringProperty()
    name = ndb.StringProperty()
    description = ndb.TextProperty()

class Pet(ndb.Model):
    pet_id = ndb.StringProperty()
    owner_id = ndb.IntegerProperty()
    variation_key = ndb.KeyProperty() # Only set if the pet is a variation
    species_name = ndb.StringProperty() # Note the denormalization

    # Personal profile information for the pet
    name = ndb.StringProperty()
    css = ndb.TextProperty()
    description = ndb.TextProperty()

    # If either of these is set to a number other than 0, the pet is for sale
    ss_price = ndb.IntegerProperty(default=0)
    cc_price = ndb.IntegerProperty(default=0)
0372b281b3183c6094dc5ea6afaab05fb5b5836f
examples/git-pull.js
examples/git-pull.js
var connect = require( 'connect' ), flick = require( '..' ), handler = flick(), app = connect(); handler.use( 'romac/romac.github.com', gitPull ); function gitPull( req, res, next ) { console.log( req.flick ); next(); } app.use( connect.bodyParser() ); app.use( flick.whitelist( { local: true } ) ); app.use( flick.payload() ); app.use( handler ); app.use( function( req, res ) { res.writeHead( 200 ); res.end( 'Thank you, dear friend.\n' ); } ); app.listen( 4001 ); console.log( 'flick is listening on port 4001' );
var connect = require( 'connect' ), shell = require( 'shelljs' ), flick = require( '..' ), handler = flick(), app = connect(); handler.use( 'romac/romac.github.com', gitPull( '/var/www/romac.me', { rebase: true } ) ); function gitPull( root, options ) { return function( req, res, next ) { var cmd = 'git pull' + ( options.rebase ? ' --rebase' : '' ); shell.cd( root ); shell.exec( cmd, function( code, output ) { console.log( cmd ' exited with code ' + code ); } ); next(); }; } app.use( connect.bodyParser() ); app.use( flick.whitelist( { local: true } ) ); app.use( flick.payload() ); app.use( handler ); app.use( function( req, res ) { res.writeHead( 200 ); res.end( 'Thank you, dear friend.\n' ); } ); app.listen( 4001 ); console.log( 'flick is listening on port 4001' );
Add a proper example as of the new API design.
Add a proper example as of the new API design.
JavaScript
mit
romac/node-flick
javascript
## Code Before: var connect = require( 'connect' ), flick = require( '..' ), handler = flick(), app = connect(); handler.use( 'romac/romac.github.com', gitPull ); function gitPull( req, res, next ) { console.log( req.flick ); next(); } app.use( connect.bodyParser() ); app.use( flick.whitelist( { local: true } ) ); app.use( flick.payload() ); app.use( handler ); app.use( function( req, res ) { res.writeHead( 200 ); res.end( 'Thank you, dear friend.\n' ); } ); app.listen( 4001 ); console.log( 'flick is listening on port 4001' ); ## Instruction: Add a proper example as of the new API design. ## Code After: var connect = require( 'connect' ), shell = require( 'shelljs' ), flick = require( '..' ), handler = flick(), app = connect(); handler.use( 'romac/romac.github.com', gitPull( '/var/www/romac.me', { rebase: true } ) ); function gitPull( root, options ) { return function( req, res, next ) { var cmd = 'git pull' + ( options.rebase ? ' --rebase' : '' ); shell.cd( root ); shell.exec( cmd, function( code, output ) { console.log( cmd ' exited with code ' + code ); } ); next(); }; } app.use( connect.bodyParser() ); app.use( flick.whitelist( { local: true } ) ); app.use( flick.payload() ); app.use( handler ); app.use( function( req, res ) { res.writeHead( 200 ); res.end( 'Thank you, dear friend.\n' ); } ); app.listen( 4001 ); console.log( 'flick is listening on port 4001' );
ef0c97624a4809bad454e4f7924062cad95e7a7a
README.md
README.md
rmd-dashboard ============= ResMed Dashboard # Dev setup ## Grunt If you don't already have it, you will need to install the grunt-cli npm install -g grunt-cli ## Install dependencies To set up the development environment, execute the following commands npm install grunt bower:install ## Run server in development mode grunt run
rmd-dashboard ============= ResMed Dashboard # Dev setup ## Grunt If you don't already have it, you will need to install the grunt-cli npm install -g grunt-cli ## Install dependencies To set up the development environment, execute the following commands npm install grunt bower:install ## Run server in development mode grunt run # Troubleshooting ## When I run `grunt bower:install` I get a Git error Does it look like this? Fatal error: Failed to execute "git ls-remote --tags --heads git://github.com/SteveSanderson/knockout.git", exit code of #128 Your firewall is probably blocking connections to port 22 (SSH). Run the following command to use port 443 (SSL) for GitHub downloads (**Note**: this is a global change) git config --global url."https://".insteadOf git:// Read more about it [here](https://coderwall.com/p/sitezg).
Add troubleshooting section to fix GitHub errors when running bower:install
Add troubleshooting section to fix GitHub errors when running bower:install
Markdown
mit
iknowcss/rmd-dashboard
markdown
## Code Before: rmd-dashboard ============= ResMed Dashboard # Dev setup ## Grunt If you don't already have it, you will need to install the grunt-cli npm install -g grunt-cli ## Install dependencies To set up the development environment, execute the following commands npm install grunt bower:install ## Run server in development mode grunt run ## Instruction: Add troubleshooting section to fix GitHub errors when running bower:install ## Code After: rmd-dashboard ============= ResMed Dashboard # Dev setup ## Grunt If you don't already have it, you will need to install the grunt-cli npm install -g grunt-cli ## Install dependencies To set up the development environment, execute the following commands npm install grunt bower:install ## Run server in development mode grunt run # Troubleshooting ## When I run `grunt bower:install` I get a Git error Does it look like this? Fatal error: Failed to execute "git ls-remote --tags --heads git://github.com/SteveSanderson/knockout.git", exit code of #128 Your firewall is probably blocking connections to port 22 (SSH). Run the following command to use port 443 (SSL) for GitHub downloads (**Note**: this is a global change) git config --global url."https://".insteadOf git:// Read more about it [here](https://coderwall.com/p/sitezg).
1a40804ffc1381d0f3b666f6f20dff6e0c2c21dc
src/modules/LCIOWriter/CMakeLists.txt
src/modules/LCIOWriter/CMakeLists.txt
ALLPIX_UNIQUE_MODULE(MODULE_NAME) # Add source files to library ALLPIX_MODULE_SOURCES(${MODULE_NAME} LCIOWriterModule.cpp # ADD SOURCE FILES HERE... ) # Provide standard install target ALLPIX_MODULE_INSTALL(${MODULE_NAME})
FIND_PACKAGE(LCIO REQUIRED) IF(LCIO_FOUND) # Define module and return the generated name as MODULE_NAME ALLPIX_UNIQUE_MODULE(MODULE_NAME) # Add source files to library ALLPIX_MODULE_SOURCES(${MODULE_NAME} LCIOWriterModule.cpp # ADD SOURCE FILES HERE... ) # Provide standard install target ALLPIX_MODULE_INSTALL(${MODULE_NAME}) ELSE(LCIO_FOUND) MESSAGE(WARNING "LCIO not found, cannot compile LCIOWriter module.") ENDIF(LCIO_FOUND)
Build LCIOWriter module only if LCIO is found
Build LCIOWriter module only if LCIO is found
Text
mit
Koensw/allpix-squared,Koensw/allpix-squared,Koensw/allpix-squared,Koensw/allpix-squared
text
## Code Before: ALLPIX_UNIQUE_MODULE(MODULE_NAME) # Add source files to library ALLPIX_MODULE_SOURCES(${MODULE_NAME} LCIOWriterModule.cpp # ADD SOURCE FILES HERE... ) # Provide standard install target ALLPIX_MODULE_INSTALL(${MODULE_NAME}) ## Instruction: Build LCIOWriter module only if LCIO is found ## Code After: FIND_PACKAGE(LCIO REQUIRED) IF(LCIO_FOUND) # Define module and return the generated name as MODULE_NAME ALLPIX_UNIQUE_MODULE(MODULE_NAME) # Add source files to library ALLPIX_MODULE_SOURCES(${MODULE_NAME} LCIOWriterModule.cpp # ADD SOURCE FILES HERE... ) # Provide standard install target ALLPIX_MODULE_INSTALL(${MODULE_NAME}) ELSE(LCIO_FOUND) MESSAGE(WARNING "LCIO not found, cannot compile LCIOWriter module.") ENDIF(LCIO_FOUND)
f868b126b3bd81ec900f378ff1fa8bd29ab8ea4c
transformations/Transformations.py
transformations/Transformations.py
from transformations.BackTranslation import BackTranslation from transformations.ButterFingersPerturbation import ButterFingersPerturbation from transformations.ChangeNamedEntities import ChangeNamedEntities from transformations.SentenceTransformation import SentenceTransformation from transformations.WithoutPunctuation import WithoutPunctuation class TransformationsList(SentenceTransformation): def __init__(self): transformations = [] transformations.append(ButterFingersPerturbation()) transformations.append(WithoutPunctuation()) transformations.append(ChangeNamedEntities()) transformations.append(BackTranslation()) self.transformations = transformations def generate(self, sentence: str): print(f"Original Input : {sentence}") generations = {"Original": sentence} for transformation in self.transformations: generations[transformation.name()] = transformation.generate(sentence) return generations
from transformations.BackTranslation import BackTranslation from transformations.ButterFingersPerturbation import ButterFingersPerturbation from transformations.ChangeNamedEntities import ChangeNamedEntities from transformations.SentenceTransformation import SentenceTransformation from transformations.WithoutPunctuation import WithoutPunctuation class TransformationsList(SentenceTransformation): def __init__(self): transformations = [ButterFingersPerturbation(), WithoutPunctuation(), ChangeNamedEntities(), BackTranslation()] self.transformations = transformations def generate(self, sentence: str): print(f"Original Input : {sentence}") generations = {"Original": sentence} for transformation in self.transformations: generations[transformation.name()] = transformation.generate(sentence) return generations
Add interface for source+label perturbation
Add interface for source+label perturbation
Python
mit
GEM-benchmark/NL-Augmenter
python
## Code Before: from transformations.BackTranslation import BackTranslation
from transformations.ButterFingersPerturbation import ButterFingersPerturbation
from transformations.ChangeNamedEntities import ChangeNamedEntities
from transformations.SentenceTransformation import SentenceTransformation
from transformations.WithoutPunctuation import WithoutPunctuation


class TransformationsList(SentenceTransformation):

    def __init__(self):
        transformations = []
        transformations.append(ButterFingersPerturbation())
        transformations.append(WithoutPunctuation())
        transformations.append(ChangeNamedEntities())
        transformations.append(BackTranslation())
        self.transformations = transformations

    def generate(self, sentence: str):
        print(f"Original Input : {sentence}")
        generations = {"Original": sentence}
        for transformation in self.transformations:
            generations[transformation.name()] = transformation.generate(sentence)
        return generations

## Instruction:
Add interface for source+label perturbation

## Code After:
from transformations.BackTranslation import BackTranslation
from transformations.ButterFingersPerturbation import ButterFingersPerturbation
from transformations.ChangeNamedEntities import ChangeNamedEntities
from transformations.SentenceTransformation import SentenceTransformation
from transformations.WithoutPunctuation import WithoutPunctuation


class TransformationsList(SentenceTransformation):

    def __init__(self):
        transformations = [ButterFingersPerturbation(), WithoutPunctuation(), ChangeNamedEntities(), BackTranslation()]
        self.transformations = transformations

    def generate(self, sentence: str):
        print(f"Original Input : {sentence}")
        generations = {"Original": sentence}
        for transformation in self.transformations:
            generations[transformation.name()] = transformation.generate(sentence)
        return generations
a4d1b39425678d48c5a0ec591b38017625cef006
src/browser/ui/chips/chip-list.component.scss
src/browser/ui/chips/chip-list.component.scss
@import "./chip-sizes"; .ChipList { display: block; overflow: hidden; &__wrapper { display: flex; flex-direction: row; flex-wrap: wrap; align-items: center; margin: -$chip-chips-margin; .Chip { margin: $chip-chips-margin; } } } input.ChipInput { margin: $chip-chips-margin; flex: 1 0 auto; }
@import "./chip-sizes"; .ChipList { display: inline-flex; overflow: hidden; &__wrapper { display: flex; flex-direction: row; flex-wrap: wrap; align-items: center; margin: -$chip-chips-margin; .Chip { margin: $chip-chips-margin; } } } input.ChipInput { margin: $chip-chips-margin; flex: 1 0 auto; }
Change display type: 'block' -> 'inline-block'
Change display type: 'block' -> 'inline-block'
SCSS
mit
seokju-na/geeks-diary,seokju-na/geeks-diary,seokju-na/geeks-diary
scss
## Code Before: @import "./chip-sizes"; .ChipList { display: block; overflow: hidden; &__wrapper { display: flex; flex-direction: row; flex-wrap: wrap; align-items: center; margin: -$chip-chips-margin; .Chip { margin: $chip-chips-margin; } } } input.ChipInput { margin: $chip-chips-margin; flex: 1 0 auto; } ## Instruction: Change display type: 'block' -> 'inline-block' ## Code After: @import "./chip-sizes"; .ChipList { display: inline-flex; overflow: hidden; &__wrapper { display: flex; flex-direction: row; flex-wrap: wrap; align-items: center; margin: -$chip-chips-margin; .Chip { margin: $chip-chips-margin; } } } input.ChipInput { margin: $chip-chips-margin; flex: 1 0 auto; }
77ac03544f85e95603507e1dc0cec2189e0d5a03
get_ip.py
get_ip.py
import boto import boto.ec2 import getopt import sys # # Get the profile # def connect(region): profile = metadata['iam']['info']['InstanceProfileArn'] profile = profile[profile.find('/') + 1:] conn = boto.ec2.connection.EC2Connection( region=boto.ec2.get_region(region), aws_access_key_id=metadata['iam']['security-credentials'][profile]['AccessKeyId'], aws_secret_access_key=metadata['iam']['security-credentials'][profile]['SecretAccessKey'], security_token=metadata['iam']['security-credentials'][profile]['Token'] ) return conn # # Print out private IPv4 # def print_ips(region, tag_name): conn = connect(region) reservations = conn.get_all_instances(filters={"tag:Name": tag_name}) print("%s" % (reservations[0]["Instances"][0]["PrivateIpAddress"])) # # Main # opts, args = getopt.getopt(sys.argv[1:], "Lt:r:", ["tag-name", "region"]) for opt, arg in opts: if opt in ("-t", "--tag-name"): tag_name = arg elif opt in ("-r", "--region"): region = arg print_ips(region, tag_name)
import boto import boto.ec2 import getopt import sys # # Get the profile # def connect(): metadata = boto.utils.get_instance_metadata() region = metadata['placement']['availability-zone'][:-1] profile = metadata['iam']['info']['InstanceProfileArn'] profile = profile[profile.find('/') + 1:] conn = boto.ec2.connection.EC2Connection( region=boto.ec2.get_region(region), aws_access_key_id=metadata['iam']['security-credentials'][profile]['AccessKeyId'], aws_secret_access_key=metadata['iam']['security-credentials'][profile]['SecretAccessKey'], security_token=metadata['iam']['security-credentials'][profile]['Token'] ) return conn # # Print out private IPv4 # def print_ips(tag_name): conn = connect() reservations = conn.get_all_instances(filters={"tag:Name": tag_name}) print("%s" % (reservations[0]["Instances"][0]["PrivateIpAddress"])) # # Main # opts, args = getopt.getopt(sys.argv[1:], "Lt:r:", ["tag-name", "region"]) tag_name = "" region = "" for opt, arg in opts: if opt in ("-t", "--tag-name"): tag_name = arg print_ips(tag_name)
Add python script for IPs discovery
Add python script for IPs discovery
Python
bsd-3-clause
GetStream/Stream-Framework-Bench,GetStream/Stream-Framework-Bench
python
## Code Before: import boto
import boto.ec2
import getopt
import sys

#
# Get the profile
#
def connect(region):
    profile = metadata['iam']['info']['InstanceProfileArn']
    profile = profile[profile.find('/') + 1:]

    conn = boto.ec2.connection.EC2Connection(
        region=boto.ec2.get_region(region),
        aws_access_key_id=metadata['iam']['security-credentials'][profile]['AccessKeyId'],
        aws_secret_access_key=metadata['iam']['security-credentials'][profile]['SecretAccessKey'],
        security_token=metadata['iam']['security-credentials'][profile]['Token']
    )

    return conn

#
# Print out private IPv4
#
def print_ips(region, tag_name):
    conn = connect(region)
    reservations = conn.get_all_instances(filters={"tag:Name": tag_name})
    print("%s" % (reservations[0]["Instances"][0]["PrivateIpAddress"]))

#
# Main
#
opts, args = getopt.getopt(sys.argv[1:], "Lt:r:", ["tag-name", "region"])

for opt, arg in opts:
    if opt in ("-t", "--tag-name"):
        tag_name = arg
    elif opt in ("-r", "--region"):
        region = arg

print_ips(region, tag_name)

## Instruction:
Add python script for IPs discovery

## Code After:
import boto
import boto.ec2
import getopt
import sys

#
# Get the profile
#
def connect():
    metadata = boto.utils.get_instance_metadata()
    region = metadata['placement']['availability-zone'][:-1]
    profile = metadata['iam']['info']['InstanceProfileArn']
    profile = profile[profile.find('/') + 1:]

    conn = boto.ec2.connection.EC2Connection(
        region=boto.ec2.get_region(region),
        aws_access_key_id=metadata['iam']['security-credentials'][profile]['AccessKeyId'],
        aws_secret_access_key=metadata['iam']['security-credentials'][profile]['SecretAccessKey'],
        security_token=metadata['iam']['security-credentials'][profile]['Token']
    )

    return conn

#
# Print out private IPv4
#
def print_ips(tag_name):
    conn = connect()
    reservations = conn.get_all_instances(filters={"tag:Name": tag_name})
    print("%s" % (reservations[0]["Instances"][0]["PrivateIpAddress"]))

#
# Main
#
opts, args = getopt.getopt(sys.argv[1:], "Lt:r:", ["tag-name", "region"])

tag_name = ""
region = ""

for opt, arg in opts:
    if opt in ("-t", "--tag-name"):
        tag_name = arg

print_ips(tag_name)
db5623472a5ba399d68acf6a1f6de0560714ac54
spec/pagseguro/payment_request/response_spec.rb
spec/pagseguro/payment_request/response_spec.rb
require "spec_helper" describe PagSeguro::PaymentRequest::Response do context "when payment request is created" do def xml_response(path) response = double( body: File.read("./spec/fixtures/#{path}"), code: 200, content_type: "text/xml", "[]" => nil ) Aitch::Response.new(Aitch::Configuration.new, response) end let(:http_response) { xml_response("payment_request/success.xml") } subject(:response) { described_class.new(http_response) } it { expect(response.code).to eql("8CF4BE7DCECEF0F004A6DFA0A8243412") } it { expect(response.created_at).to eql(Time.parse("2010-12-02T10:11:28.000-02:00")) } it { expect(response.url).to eql("https://pagseguro.uol.com.br/v2/checkout/payment.html?code=8CF4BE7DCECEF0F004A6DFA0A8243412") } end end
require "spec_helper" describe PagSeguro::PaymentRequest::Response do context "when payment request is created" do def xml_response(path) response = double( body: File.read("./spec/fixtures/#{path}"), code: 200, content_type: "text/xml", "[]" => nil ) Aitch::Response.new({xml_parser: Aitch::XMLParser}, response) end let(:http_response) { xml_response("payment_request/success.xml") } subject(:response) { described_class.new(http_response) } it { expect(response.code).to eql("8CF4BE7DCECEF0F004A6DFA0A8243412") } it { expect(response.created_at).to eql(Time.parse("2010-12-02T10:11:28.000-02:00")) } it { expect(response.url).to eql("https://pagseguro.uol.com.br/v2/checkout/payment.html?code=8CF4BE7DCECEF0F004A6DFA0A8243412") } end end
Update spec to conform to new Aitch API.
Update spec to conform to new Aitch API.
Ruby
apache-2.0
caiofct/pagseguro-ruby,pagseguro/ruby,88rabbit/ruby
ruby
## Code Before: require "spec_helper"

describe PagSeguro::PaymentRequest::Response do
  context "when payment request is created" do
    def xml_response(path)
      response = double(
        body: File.read("./spec/fixtures/#{path}"),
        code: 200,
        content_type: "text/xml",
        "[]" => nil
      )

      Aitch::Response.new(Aitch::Configuration.new, response)
    end

    let(:http_response) { xml_response("payment_request/success.xml") }
    subject(:response) { described_class.new(http_response) }

    it { expect(response.code).to eql("8CF4BE7DCECEF0F004A6DFA0A8243412") }
    it { expect(response.created_at).to eql(Time.parse("2010-12-02T10:11:28.000-02:00")) }
    it { expect(response.url).to eql("https://pagseguro.uol.com.br/v2/checkout/payment.html?code=8CF4BE7DCECEF0F004A6DFA0A8243412") }
  end
end

## Instruction:
Update spec to conform to new Aitch API.

## Code After:
require "spec_helper"

describe PagSeguro::PaymentRequest::Response do
  context "when payment request is created" do
    def xml_response(path)
      response = double(
        body: File.read("./spec/fixtures/#{path}"),
        code: 200,
        content_type: "text/xml",
        "[]" => nil
      )

      Aitch::Response.new({xml_parser: Aitch::XMLParser}, response)
    end

    let(:http_response) { xml_response("payment_request/success.xml") }
    subject(:response) { described_class.new(http_response) }

    it { expect(response.code).to eql("8CF4BE7DCECEF0F004A6DFA0A8243412") }
    it { expect(response.created_at).to eql(Time.parse("2010-12-02T10:11:28.000-02:00")) }
    it { expect(response.url).to eql("https://pagseguro.uol.com.br/v2/checkout/payment.html?code=8CF4BE7DCECEF0F004A6DFA0A8243412") }
  end
end
7a906da52e48cbef1266b9de0b248079177a60d6
functions/expand:execute.fish
functions/expand:execute.fish
function expand:execute -d "Executes word expansion on the current token" # Iterate available expansions for one that matches the current token. for expansion in $__word_expansions set -l expansion (echo $expansion) # If the expansion matches, execute the expander and use its output as the replacement. if eval "$expansion[1]" > /dev/null if set -l replacement (eval "$expansion[2]") commandline -t -r "$replacement" return 0 end end end # Defer to regular completion. commandline -f complete end
function expand:execute -d "Executes word expansion on the current token" # Choose a filter. if not set -q FILTER if type -q peco set FILTER 'peco --prompt ">" --layout bottom-up' else if type -q percol set FILTER 'percol' else if type -q fzf set FILTER 'fzf' else if type -q selecta set FILTER 'selecta' end end # Iterate available expansions for one that matches the current token. for expansion in $__word_expansions set -l expansion (echo $expansion) # Check if the expansion condition matches the token. if eval "$expansion[1]" > /dev/null # If the expansion matches, execute the expander and use its output as replacements. if set -l new (eval "$expansion[2]") set replacements $replacements $new end end end # If a filter is specified and more than one replacement is available, use it to filter through the available # replacements. if begin; set -q replacements[2]; and set -q FILTER; end for line in $replacements echo $line end | eval "$FILTER" | read replacement # Interactive filters will cause Fish to need a repaint. commandline -f repaint # If a replacement was chosen, use it. if test -n "$replacement" commandline -t -r "$replacement" end # If only one replacement is available, just use it without prompt. else if set -q replacements[1] # No filter is specified, so just use the top replacement. commandline -t -r "$replacements[1]" # No replacements are available for the current token, so defer to regular completion. else commandline -f complete end end
Add support for filtering available expansions
Add support for filtering available expansions
fish
mit
oh-my-fish/plugin-expand
fish
## Code Before: function expand:execute -d "Executes word expansion on the current token" # Iterate available expansions for one that matches the current token. for expansion in $__word_expansions set -l expansion (echo $expansion) # If the expansion matches, execute the expander and use its output as the replacement. if eval "$expansion[1]" > /dev/null if set -l replacement (eval "$expansion[2]") commandline -t -r "$replacement" return 0 end end end # Defer to regular completion. commandline -f complete end ## Instruction: Add support for filtering available expansions ## Code After: function expand:execute -d "Executes word expansion on the current token" # Choose a filter. if not set -q FILTER if type -q peco set FILTER 'peco --prompt ">" --layout bottom-up' else if type -q percol set FILTER 'percol' else if type -q fzf set FILTER 'fzf' else if type -q selecta set FILTER 'selecta' end end # Iterate available expansions for one that matches the current token. for expansion in $__word_expansions set -l expansion (echo $expansion) # Check if the expansion condition matches the token. if eval "$expansion[1]" > /dev/null # If the expansion matches, execute the expander and use its output as replacements. if set -l new (eval "$expansion[2]") set replacements $replacements $new end end end # If a filter is specified and more than one replacement is available, use it to filter through the available # replacements. if begin; set -q replacements[2]; and set -q FILTER; end for line in $replacements echo $line end | eval "$FILTER" | read replacement # Interactive filters will cause Fish to need a repaint. commandline -f repaint # If a replacement was chosen, use it. if test -n "$replacement" commandline -t -r "$replacement" end # If only one replacement is available, just use it without prompt. else if set -q replacements[1] # No filter is specified, so just use the top replacement. 
commandline -t -r "$replacements[1]" # No replacements are available for the current token, so defer to regular completion. else commandline -f complete end end
afac07ce173af3e7db4a6ba6dab4786903e217b7
ocradmin/ocr/tools/plugins/cuneiform_wrapper.py
ocradmin/ocr/tools/plugins/cuneiform_wrapper.py
from generic_wrapper import * def main_class(): return CuneiformWrapper class CuneiformWrapper(GenericWrapper): """ Override certain methods of the OcropusWrapper to use Cuneiform for recognition of individual lines. """ name = "cuneiform" capabilities = ("line", "page") binary = get_binary("cuneiform") def get_command(self, outfile, image): """ Cuneiform command line. Simplified for now. """ return [self.binary, "-o", outfile, image]
import tempfile import subprocess as sp from ocradmin.ocr.tools import check_aborted, set_progress from ocradmin.ocr.utils import HocrParser from generic_wrapper import * def main_class(): return CuneiformWrapper class CuneiformWrapper(GenericWrapper): """ Override certain methods of the OcropusWrapper to use Cuneiform for recognition of individual lines. """ name = "cuneiform" capabilities = ("line", "page") binary = get_binary("cuneiform") def get_command(self, outfile, image): """ Cuneiform command line. Simplified for now. """ return [self.binary, "-o", outfile, image] def convert(self, filepath, *args, **kwargs): """ Convert a full page. """ json = None with tempfile.NamedTemporaryFile(delete=False) as tmp: tmp.close() args = [self.binary, "-f", "hocr", "-o", tmp.name, filepath] self.logger.info(args) proc = sp.Popen(args, stderr=sp.PIPE) err = proc.stderr.read() if proc.wait() != 0: return "!!! %s CONVERSION ERROR %d: %s !!!" % ( os.path.basename(self.binary).upper(), proc.returncode, err) json = HocrParser().parsefile(tmp.name) self.logger.info("%s" % json) os.unlink(tmp.name) set_progress(self.logger, kwargs["progress_func"], 100, 100) return json
Allow Cuneiform to do full page conversions. Downsides: 1) it crashes on quite a lot of pages 2) there's no progress output
Allow Cuneiform to do full page conversions. Downsides: 1) it crashes on quite a lot of pages 2) there's no progress output
Python
apache-2.0
vitorio/ocropodium,vitorio/ocropodium,vitorio/ocropodium,vitorio/ocropodium
python
## Code Before: from generic_wrapper import * def main_class(): return CuneiformWrapper class CuneiformWrapper(GenericWrapper): """ Override certain methods of the OcropusWrapper to use Cuneiform for recognition of individual lines. """ name = "cuneiform" capabilities = ("line", "page") binary = get_binary("cuneiform") def get_command(self, outfile, image): """ Cuneiform command line. Simplified for now. """ return [self.binary, "-o", outfile, image] ## Instruction: Allow Cuneiform to do full page conversions. Downsides: 1) it crashes on quite a lot of pages 2) there's no progress output ## Code After: import tempfile import subprocess as sp from ocradmin.ocr.tools import check_aborted, set_progress from ocradmin.ocr.utils import HocrParser from generic_wrapper import * def main_class(): return CuneiformWrapper class CuneiformWrapper(GenericWrapper): """ Override certain methods of the OcropusWrapper to use Cuneiform for recognition of individual lines. """ name = "cuneiform" capabilities = ("line", "page") binary = get_binary("cuneiform") def get_command(self, outfile, image): """ Cuneiform command line. Simplified for now. """ return [self.binary, "-o", outfile, image] def convert(self, filepath, *args, **kwargs): """ Convert a full page. """ json = None with tempfile.NamedTemporaryFile(delete=False) as tmp: tmp.close() args = [self.binary, "-f", "hocr", "-o", tmp.name, filepath] self.logger.info(args) proc = sp.Popen(args, stderr=sp.PIPE) err = proc.stderr.read() if proc.wait() != 0: return "!!! %s CONVERSION ERROR %d: %s !!!" % ( os.path.basename(self.binary).upper(), proc.returncode, err) json = HocrParser().parsefile(tmp.name) self.logger.info("%s" % json) os.unlink(tmp.name) set_progress(self.logger, kwargs["progress_func"], 100, 100) return json
1e71c32621d37144223221cf4afc668525e33134
README.md
README.md
wind ==== Wind is a lightweight and fast logging server for PHP. Currently it's more a proof of concept. Goal ---- Logging can slow down an application. PHP runs everything in the same process. Wind exposes a light REST server to which the main PHP application can post log requests. This will trigger a *new* PHP process for the actual logging, leaving the main application to concentrate on its main tasks. Wind does not provide any logging functionality as-is. It is meant to be used with a PSR-3 compatible logging library (like Monolog). Usage ----- Coming soon...
Wind - Fast logging for PHP =========================== [![Build Status](https://travis-ci.org/wadmiraal/wind.svg?branch=master)](https://travis-ci.org/wadmiraal/wind) Wind is a lightweight and fast logging server for PHP. Goal ---- Logging can slow down an application. PHP runs everything in the same process. Wind exposes a light REST server to which the main PHP application can post log requests. This will trigger a *new* PHP process for the actual logging, leaving the main application to concentrate on its main tasks. Wind does not provide any logging functionality as-is. It is meant to be used with a PSR-3 compatible logging library (like Monolog). Usage ----- Coming soon...
Add Travis CI build image.
Add Travis CI build image.
Markdown
mit
wadmiraal/wind,wadmiraal/wind
markdown
## Code Before: wind ==== Wind is a lightweight and fast logging server for PHP. Currently it's more a proof of concept. Goal ---- Logging can slow down an application. PHP runs everything in the same process. Wind exposes a light REST server to which the main PHP application can post log requests. This will trigger a *new* PHP process for the actual logging, leaving the main application to concentrate on its main tasks. Wind does not provide any logging functionality as-is. It is meant to be used with a PSR-3 compatible logging library (like Monolog). Usage ----- Coming soon... ## Instruction: Add Travis CI build image. ## Code After: Wind - Fast logging for PHP =========================== [![Build Status](https://travis-ci.org/wadmiraal/wind.svg?branch=master)](https://travis-ci.org/wadmiraal/wind) Wind is a lightweight and fast logging server for PHP. Goal ---- Logging can slow down an application. PHP runs everything in the same process. Wind exposes a light REST server to which the main PHP application can post log requests. This will trigger a *new* PHP process for the actual logging, leaving the main application to concentrate on its main tasks. Wind does not provide any logging functionality as-is. It is meant to be used with a PSR-3 compatible logging library (like Monolog). Usage ----- Coming soon...
ae2d17fc4ae6dfc1e6234a2de6f0b63c7df5649a
modules/gtp/package.json
modules/gtp/package.json
{ "name": "gtp", "main": "index.js" }
{ "name": "gtp", "version": "0.1.0", "main": "index.js" }
Add version to gtp module
Add version to gtp module
JSON
mit
yishn/Goban,yishn/Sabaki,yishn/Sabaki,yishn/Goban
json
## Code Before: { "name": "gtp", "main": "index.js" } ## Instruction: Add version to gtp module ## Code After: { "name": "gtp", "version": "0.1.0", "main": "index.js" }
5b33e1307e8061984128de6873eb79e0fba7a950
app/controllers/users_controller.rb
app/controllers/users_controller.rb
class UsersController < ApplicationController def new @user = User.new render :'users/new' end def create @user = User.new(reg_params) if @user.save redirect_to experiments_path else render new_user_path end end private def reg_params params.require(:user).permit(:role, :name, :email, :password) end end
class UsersController < ApplicationController def new @user = User.new render :'users/new' end def create @user = User.new(reg_params) if @user.save redirect_to new_session_path else render new_user_path end end private def reg_params params.require(:user).permit(:role, :name, :email, :password) end end
Change route for successful registration to login page
Change route for successful registration to login page
Ruby
mit
njarin/earp,njarin/earp,njarin/earp
ruby
## Code Before: class UsersController < ApplicationController

  def new
    @user = User.new
    render :'users/new'
  end

  def create
    @user = User.new(reg_params)
    if @user.save
      redirect_to experiments_path
    else
      render new_user_path
    end
  end

  private

  def reg_params
    params.require(:user).permit(:role, :name, :email, :password)
  end

end

## Instruction:
Change route for successful registration to login page

## Code After:
class UsersController < ApplicationController

  def new
    @user = User.new
    render :'users/new'
  end

  def create
    @user = User.new(reg_params)
    if @user.save
      redirect_to new_session_path
    else
      render new_user_path
    end
  end

  private

  def reg_params
    params.require(:user).permit(:role, :name, :email, :password)
  end

end
e6c66260031b793e25b2c4c50859699b9d0d7e17
index.js
index.js
"use strict" var fs = require("fs") var path = require("path") var dz = require("dezalgo") var npa = require("npm-package-arg") module.exports = function (spec, where, cb) { if (where instanceof Function) cb = where, where = null if (where == null) where = "." cb = dz(cb) try { var dep = npa(spec) } catch (e) { return cb(e) } var specpath = path.resolve(where, dep.type == "local" ? dep.spec : spec) fs.stat(specpath, function (er, s) { if (er) return finalize() if (!s.isDirectory()) return finalize("local") fs.stat(path.join(specpath, "package.json"), function (er) { finalize(er ? null : "directory") }) }) function finalize(type) { if (type != null) dep.type = type if (dep.type == "local" || dep.type == "directory") dep.spec = specpath cb(null, dep) } }
"use strict" var fs = require("fs") var path = require("path") var dz = require("dezalgo") var npa = require("npm-package-arg") module.exports = function (spec, where, cb) { if (where instanceof Function) cb = where, where = null if (where == null) where = "." cb = dz(cb) try { var dep = npa(spec) } catch (e) { return cb(e) } var specpath = dep.type == "local" ? path.resolve(where, dep.spec) : path.resolve(spec) fs.stat(specpath, function (er, s) { if (er) return finalize() if (!s.isDirectory()) return finalize("local") fs.stat(path.join(specpath, "package.json"), function (er) { finalize(er ? null : "directory") }) }) function finalize(type) { if (type != null) dep.type = type if (dep.type == "local" || dep.type == "directory") dep.spec = specpath cb(null, dep) } }
Resolve ONLY file:// deps relative to package root, everything else CWD
Resolve ONLY file:// deps relative to package root, everything else CWD
JavaScript
isc
npm/realize-package-specifier
javascript
## Code Before: "use strict" var fs = require("fs") var path = require("path") var dz = require("dezalgo") var npa = require("npm-package-arg") module.exports = function (spec, where, cb) { if (where instanceof Function) cb = where, where = null if (where == null) where = "." cb = dz(cb) try { var dep = npa(spec) } catch (e) { return cb(e) } var specpath = path.resolve(where, dep.type == "local" ? dep.spec : spec) fs.stat(specpath, function (er, s) { if (er) return finalize() if (!s.isDirectory()) return finalize("local") fs.stat(path.join(specpath, "package.json"), function (er) { finalize(er ? null : "directory") }) }) function finalize(type) { if (type != null) dep.type = type if (dep.type == "local" || dep.type == "directory") dep.spec = specpath cb(null, dep) } } ## Instruction: Resolve ONLY file:// deps relative to package root, everything else CWD ## Code After: "use strict" var fs = require("fs") var path = require("path") var dz = require("dezalgo") var npa = require("npm-package-arg") module.exports = function (spec, where, cb) { if (where instanceof Function) cb = where, where = null if (where == null) where = "." cb = dz(cb) try { var dep = npa(spec) } catch (e) { return cb(e) } var specpath = dep.type == "local" ? path.resolve(where, dep.spec) : path.resolve(spec) fs.stat(specpath, function (er, s) { if (er) return finalize() if (!s.isDirectory()) return finalize("local") fs.stat(path.join(specpath, "package.json"), function (er) { finalize(er ? null : "directory") }) }) function finalize(type) { if (type != null) dep.type = type if (dep.type == "local" || dep.type == "directory") dep.spec = specpath cb(null, dep) } }
940f6bbc3d99b5c46201384d323f18c3578d838f
docs/installation.rst
docs/installation.rst
================== Installation Guide ================== Melexis Warnings plugin is available as a package on `PyPI <https://pypi.org/project/mlx.warnings/>`_. Once installed, it offers direct command line invocation (no need to run python infront). Installation is as simple as: .. code-block:: bash # released version from PyPI pip3 install mlx.warnings # editable package based on master branch git clone https://github.com/melexis/warnings-plugin.git pip3 install -e warnings-plugin So far we are not aware of any problems with installation of the plugin, but in case you have any please open an Issue.
================== Installation Guide ================== Melexis Warnings plugin is available as a package on `PyPI <https://pypi.org/project/mlx.warnings/>`_. Once installed, it offers direct command line invocation (no need to run python infront). Installation is as simple as: .. code-block:: bash # released version from PyPI pip3 install mlx.warnings # editable package based on master branch git clone https://github.com/melexis/warnings-plugin.git pip3 install -e warnings-plugin .. note:: mlx.warnings has a dependency to lxml, which may require your to install additional libraries on Linux. If this is the case, please refer to the `installation guide of lxml <https://lxml.de/installation.html>`_. So far we are not aware of any problems with installation of the plugin, but in case you have any please open an Issue.
Add note about requirements for lxml on Linux
Add note about requirements for lxml on Linux
reStructuredText
apache-2.0
melexis/warnings-plugin
restructuredtext
## Code Before: ================== Installation Guide ================== Melexis Warnings plugin is available as a package on `PyPI <https://pypi.org/project/mlx.warnings/>`_. Once installed, it offers direct command line invocation (no need to run python infront). Installation is as simple as: .. code-block:: bash # released version from PyPI pip3 install mlx.warnings # editable package based on master branch git clone https://github.com/melexis/warnings-plugin.git pip3 install -e warnings-plugin So far we are not aware of any problems with installation of the plugin, but in case you have any please open an Issue. ## Instruction: Add note about requirements for lxml on Linux ## Code After: ================== Installation Guide ================== Melexis Warnings plugin is available as a package on `PyPI <https://pypi.org/project/mlx.warnings/>`_. Once installed, it offers direct command line invocation (no need to run python infront). Installation is as simple as: .. code-block:: bash # released version from PyPI pip3 install mlx.warnings # editable package based on master branch git clone https://github.com/melexis/warnings-plugin.git pip3 install -e warnings-plugin .. note:: mlx.warnings has a dependency to lxml, which may require your to install additional libraries on Linux. If this is the case, please refer to the `installation guide of lxml <https://lxml.de/installation.html>`_. So far we are not aware of any problems with installation of the plugin, but in case you have any please open an Issue.
f8290954b27e655562878d16df7e4793262f50d7
wafer/tickets/management/commands/import_quicket_guest_list.py
wafer/tickets/management/commands/import_quicket_guest_list.py
import csv

from django.core.management.base import BaseCommand, CommandError

from wafer.tickets.views import import_ticket


class Command(BaseCommand):
    args = '<csv file>'
    help = "Import a guest list CSV from Quicket"

    def handle(self, *args, **options):
        if len(args) != 1:
            raise CommandError('1 CSV File required')

        with open(args[0], 'r') as f:
            reader = csv.reader(f)
            header = next(reader)
            if len(header) != 11:
                raise CommandError('CSV format has changed. Update wafer')
            for ticket in reader:
                self.import_ticket(*ticket)

    def import_ticket(self, ticket_number, ticket_barcode, purchase_date,
                      ticket_type, ticket_holder, email, cellphone,
                      checked_in, checked_in_date, checked_in_by,
                      complimentary):
        import_ticket(ticket_barcode, ticket_type, email)
import csv

from django.core.management.base import BaseCommand, CommandError

from wafer.tickets.views import import_ticket


class Command(BaseCommand):
    args = '<csv file>'
    help = "Import a guest list CSV from Quicket"

    def handle(self, *args, **options):
        if len(args) != 1:
            raise CommandError('1 CSV File required')

        columns = ('Ticket Number', 'Ticket Barcode', 'Purchase Date',
                   'Ticket Type', 'Ticket Holder', 'Email', 'Cellphone',
                   'Checked in', 'Checked in date', 'Checked in by',
                   'Complimentary')
        keys = [column.lower().replace(' ', '_') for column in columns]

        with open(args[0], 'r') as f:
            reader = csv.reader(f)
            header = tuple(next(reader))
            if header != columns:
                raise CommandError('CSV format has changed. Update wafer')
            for row in reader:
                ticket = dict(zip(keys, row))
                import_ticket(ticket['ticket_barcode'],
                              ticket['ticket_type'],
                              ticket['email'])
Check CSV header, not column count (and refactor)
Check CSV header, not column count (and refactor)
Python
isc
CarlFK/wafer,CarlFK/wafer,CarlFK/wafer,CarlFK/wafer,CTPUG/wafer,CTPUG/wafer,CTPUG/wafer,CTPUG/wafer
python
## Code Before:
import csv

from django.core.management.base import BaseCommand, CommandError

from wafer.tickets.views import import_ticket


class Command(BaseCommand):
    args = '<csv file>'
    help = "Import a guest list CSV from Quicket"

    def handle(self, *args, **options):
        if len(args) != 1:
            raise CommandError('1 CSV File required')

        with open(args[0], 'r') as f:
            reader = csv.reader(f)
            header = next(reader)
            if len(header) != 11:
                raise CommandError('CSV format has changed. Update wafer')
            for ticket in reader:
                self.import_ticket(*ticket)

    def import_ticket(self, ticket_number, ticket_barcode, purchase_date,
                      ticket_type, ticket_holder, email, cellphone,
                      checked_in, checked_in_date, checked_in_by,
                      complimentary):
        import_ticket(ticket_barcode, ticket_type, email)
## Instruction:
Check CSV header, not column count (and refactor)
## Code After:
import csv

from django.core.management.base import BaseCommand, CommandError

from wafer.tickets.views import import_ticket


class Command(BaseCommand):
    args = '<csv file>'
    help = "Import a guest list CSV from Quicket"

    def handle(self, *args, **options):
        if len(args) != 1:
            raise CommandError('1 CSV File required')

        columns = ('Ticket Number', 'Ticket Barcode', 'Purchase Date',
                   'Ticket Type', 'Ticket Holder', 'Email', 'Cellphone',
                   'Checked in', 'Checked in date', 'Checked in by',
                   'Complimentary')
        keys = [column.lower().replace(' ', '_') for column in columns]

        with open(args[0], 'r') as f:
            reader = csv.reader(f)
            header = tuple(next(reader))
            if header != columns:
                raise CommandError('CSV format has changed. Update wafer')
            for row in reader:
                ticket = dict(zip(keys, row))
                import_ticket(ticket['ticket_barcode'],
                              ticket['ticket_type'],
                              ticket['email'])
d7e4fdab026da1dd665dacf2b29ce40873c4b095
docs/src/siteConfig.tsx
docs/src/siteConfig.tsx
// List of projects/orgs using your project for the users page.
export const siteConfig = {
  editUrl: 'https://github.com/formik/formik/edit/master',
  copyright: `Copyright © ${new Date().getFullYear()} Jared Palmer. All Rights Reserved.`,
  repoUrl: 'https://github.com/formik/formik',
  algolia: {
    appId: 'BH4D9OD16A',
    apiKey: '32fabc38a054677ee9b24e69d699fbd0',
    indexName: 'formik',
    // algoliaOptions: {
    //   facetFilters: ['version:VERSION'],
    // },
  },
};
// List of projects/orgs using your project for the users page.
export const siteConfig = {
  editUrl: 'https://github.com/formik/formik/edit/master/docs/src/pages',
  copyright: `Copyright © ${new Date().getFullYear()} Jared Palmer. All Rights Reserved.`,
  repoUrl: 'https://github.com/formik/formik',
  algolia: {
    appId: 'BH4D9OD16A',
    apiKey: '32fabc38a054677ee9b24e69d699fbd0',
    indexName: 'formik',
    // algoliaOptions: {
    //   facetFilters: ['version:VERSION'],
    // },
  },
};
Fix Docs: correct github edit URL
Fix Docs: correct github edit URL
TypeScript
apache-2.0
jaredpalmer/formik,jaredpalmer/formik,jaredpalmer/formik
typescript
## Code Before:
// List of projects/orgs using your project for the users page.
export const siteConfig = {
  editUrl: 'https://github.com/formik/formik/edit/master',
  copyright: `Copyright © ${new Date().getFullYear()} Jared Palmer. All Rights Reserved.`,
  repoUrl: 'https://github.com/formik/formik',
  algolia: {
    appId: 'BH4D9OD16A',
    apiKey: '32fabc38a054677ee9b24e69d699fbd0',
    indexName: 'formik',
    // algoliaOptions: {
    //   facetFilters: ['version:VERSION'],
    // },
  },
};
## Instruction:
Fix Docs: correct github edit URL
## Code After:
// List of projects/orgs using your project for the users page.
export const siteConfig = {
  editUrl: 'https://github.com/formik/formik/edit/master/docs/src/pages',
  copyright: `Copyright © ${new Date().getFullYear()} Jared Palmer. All Rights Reserved.`,
  repoUrl: 'https://github.com/formik/formik',
  algolia: {
    appId: 'BH4D9OD16A',
    apiKey: '32fabc38a054677ee9b24e69d699fbd0',
    indexName: 'formik',
    // algoliaOptions: {
    //   facetFilters: ['version:VERSION'],
    // },
  },
};
cd720b9cd93a40fbbb3e07b92d2ab390d377b993
bin/run-celery-worker.sh
bin/run-celery-worker.sh
newrelic-admin run-program celery -A kitsune worker -Q fxa -n fxa@%h --maxtasksperchild=${CELERY_WORKER_MAX_TASKS_PER_CHILD:-25} &
exec newrelic-admin run-program celery -A kitsune worker -n default@%h --maxtasksperchild=${CELERY_WORKER_MAX_TASKS_PER_CHILD:-25} &
exec newrelic-admin run-program celery -A kitsune worker -Q default,fxa --maxtasksperchild=${CELERY_WORKER_MAX_TASKS_PER_CHILD:-25} &
Add fxa queue together with the default.
Add fxa queue together with the default.
Shell
bsd-3-clause
mozilla/kitsune,mozilla/kitsune,mozilla/kitsune,mozilla/kitsune
shell
## Code Before:
newrelic-admin run-program celery -A kitsune worker -Q fxa -n fxa@%h --maxtasksperchild=${CELERY_WORKER_MAX_TASKS_PER_CHILD:-25} &
exec newrelic-admin run-program celery -A kitsune worker -n default@%h --maxtasksperchild=${CELERY_WORKER_MAX_TASKS_PER_CHILD:-25} &
## Instruction:
Add fxa queue together with the default.
## Code After:
exec newrelic-admin run-program celery -A kitsune worker -Q default,fxa --maxtasksperchild=${CELERY_WORKER_MAX_TASKS_PER_CHILD:-25} &
ba76c7928a50031fee913b058e276819515c111d
doc/cli.md
doc/cli.md
Update each function so that it takes an object bag of arguments.

Update the cli.js to parse the user's .npmrc first, and then the arguments, merge the two objects together, and pass the data over to the function.

If something relevant is missing, then each command should provide a "help" menu that lists out what's important for that command.

Doing `npm help foo` should also output the "foo" command documentation.

There needs to be a consistent pattern by which flags and other data is defined and named.
Update each function so that it takes an object bag of arguments.

Update the cli.js to parse the user's .npmrc first, and then the arguments, merge the two objects together, and pass the data over to the function.

If something relevant is missing, then each command should provide a "help" menu that lists out what's important for that command.

Doing `npm help foo` should also output the "foo" command documentation.

There needs to be a consistent pattern by which flags and other data is defined and named.

Handle abbrevs. Cuz that's just the nice thing to do.
Add a todo about abbrevs
Add a todo about abbrevs
Markdown
artistic-2.0
chadnickbok/npm,segrey/npm,midniteio/npm,misterbyrne/npm,yyx990803/npm,ekmartin/npm,Volune/npm,segment-boneyard/npm,rsp/npm,cchamberlain/npm,evocateur/npm,kimshinelove/naver-npm,cchamberlain/npm-msys2,midniteio/npm,segrey/npm,cchamberlain/npm,evanlucas/npm,yibn2008/npm,DIREKTSPEED-LTD/npm,haggholm/npm,TimeToogo/npm,segmentio/npm,yibn2008/npm,evocateur/npm,yibn2008/npm,Volune/npm,kimshinelove/naver-npm,cchamberlain/npm,rsp/npm,segment-boneyard/npm,DaveEmmerson/npm,xalopp/npm,kriskowal/npm,chadnickbok/npm,thomblake/npm,princeofdarkness76/npm,xalopp/npm,ekmartin/npm,DaveEmmerson/npm,DaveEmmerson/npm,rsp/npm,midniteio/npm,segmentio/npm,yodeyer/npm,Volune/npm,kemitchell/npm,TimeToogo/npm,segmentio/npm,kriskowal/npm,yodeyer/npm,yyx990803/npm,TimeToogo/npm,evanlucas/npm,thomblake/npm,lxe/npm,segrey/npm,evocateur/npm,evanlucas/npm,yyx990803/npm,thomblake/npm,misterbyrne/npm,cchamberlain/npm-msys2,ekmartin/npm,DIREKTSPEED-LTD/npm,princeofdarkness76/npm,chadnickbok/npm,princeofdarkness76/npm,segment-boneyard/npm,kemitchell/npm,lxe/npm,haggholm/npm,kimshinelove/naver-npm,cchamberlain/npm-msys2,kriskowal/npm,kemitchell/npm,yodeyer/npm,DIREKTSPEED-LTD/npm,xalopp/npm,misterbyrne/npm,lxe/npm,haggholm/npm
markdown
## Code Before:
Update each function so that it takes an object bag of arguments.

Update the cli.js to parse the user's .npmrc first, and then the arguments, merge the two objects together, and pass the data over to the function.

If something relevant is missing, then each command should provide a "help" menu that lists out what's important for that command.

Doing `npm help foo` should also output the "foo" command documentation.

There needs to be a consistent pattern by which flags and other data is defined and named.
## Instruction:
Add a todo about abbrevs
## Code After:
Update each function so that it takes an object bag of arguments.

Update the cli.js to parse the user's .npmrc first, and then the arguments, merge the two objects together, and pass the data over to the function.

If something relevant is missing, then each command should provide a "help" menu that lists out what's important for that command.

Doing `npm help foo` should also output the "foo" command documentation.

There needs to be a consistent pattern by which flags and other data is defined and named.

Handle abbrevs. Cuz that's just the nice thing to do.
00712888b761bce556b73e36c9c7270829d3a1d4
tests/test_entity.py
tests/test_entity.py
from test_provider_gtfs import provider
from busbus.entity import BaseEntityJSONEncoder

import json
import pytest


@pytest.fixture(scope='module')
def agency(provider):
    return next(provider.agencies)


def test_entity_repr(agency):
    assert 'DTA' in repr(agency)


def test_entity_failed_getattr(agency):
    with pytest.raises(AttributeError):
        agency.the_weather_in_london


def test_entity_failed_getitem(agency):
    with pytest.raises(KeyError):
        agency['the_weather_in_london']


def test_entity_to_dict(agency):
    assert dict(agency)['id'] == 'DTA'


def test_entity_to_json(provider):
    json_str = BaseEntityJSONEncoder().encode(next(provider.arrivals))
    json.loads(json_str)
from test_provider_gtfs import provider
from busbus.entity import BaseEntityJSONEncoder

import json
import pytest


@pytest.fixture(scope='module')
def agency(provider):
    return next(provider.agencies)


def test_entity_repr(agency):
    assert 'DTA' in repr(agency)


def test_entity_failed_getattr(agency):
    with pytest.raises(AttributeError):
        agency.the_weather_in_london


def test_entity_failed_getitem(agency):
    with pytest.raises(KeyError):
        agency['the_weather_in_london']


def test_entity_to_dict(agency):
    assert dict(agency)['id'] == 'DTA'


def test_entity_to_json(provider):
    json_str = BaseEntityJSONEncoder().encode(next(provider.arrivals))
    json.loads(json_str)


def test_bad_json():
    with pytest.raises(TypeError):
        BaseEntityJSONEncoder().encode(set())
Test the failure branch in BaseEntityJSONEncoder
Test the failure branch in BaseEntityJSONEncoder
Python
mit
spaceboats/busbus
python
## Code Before:
from test_provider_gtfs import provider
from busbus.entity import BaseEntityJSONEncoder

import json
import pytest


@pytest.fixture(scope='module')
def agency(provider):
    return next(provider.agencies)


def test_entity_repr(agency):
    assert 'DTA' in repr(agency)


def test_entity_failed_getattr(agency):
    with pytest.raises(AttributeError):
        agency.the_weather_in_london


def test_entity_failed_getitem(agency):
    with pytest.raises(KeyError):
        agency['the_weather_in_london']


def test_entity_to_dict(agency):
    assert dict(agency)['id'] == 'DTA'


def test_entity_to_json(provider):
    json_str = BaseEntityJSONEncoder().encode(next(provider.arrivals))
    json.loads(json_str)
## Instruction:
Test the failure branch in BaseEntityJSONEncoder
## Code After:
from test_provider_gtfs import provider
from busbus.entity import BaseEntityJSONEncoder

import json
import pytest


@pytest.fixture(scope='module')
def agency(provider):
    return next(provider.agencies)


def test_entity_repr(agency):
    assert 'DTA' in repr(agency)


def test_entity_failed_getattr(agency):
    with pytest.raises(AttributeError):
        agency.the_weather_in_london


def test_entity_failed_getitem(agency):
    with pytest.raises(KeyError):
        agency['the_weather_in_london']


def test_entity_to_dict(agency):
    assert dict(agency)['id'] == 'DTA'


def test_entity_to_json(provider):
    json_str = BaseEntityJSONEncoder().encode(next(provider.arrivals))
    json.loads(json_str)


def test_bad_json():
    with pytest.raises(TypeError):
        BaseEntityJSONEncoder().encode(set())