commit | old_file | new_file | old_contents | new_contents | subject | message | lang | license | repos | config | content
---|---|---|---|---|---|---|---|---|---|---|---
f72e4e09d5001c972abc62b4b2e351dffb76166b
|
src/_interactive.css
|
src/_interactive.css
|
/* INTERACTIVE */
details {
display: block;
}
|
/* INTERACTIVE */
details {
display: block;
}
details *,
details ::before,
details ::after {
box-sizing: border-box;
}
|
Add box-sizing: border-box for details children
|
Add box-sizing: border-box for details children
Close #23
|
CSS
|
mit
|
barcia/standarize
|
css
|
## Code Before:
/* INTERACTIVE */
details {
display: block;
}
## Instruction:
Add box-sizing: border-box for details children
Close #23
## Code After:
/* INTERACTIVE */
details {
display: block;
}
details *,
details ::before,
details ::after {
box-sizing: border-box;
}
|
477853ccf9d701ceeaa40bc1cba8703efea02019
|
modules/akamai_logs/templates/check_akamai_logs.sh.erb
|
modules/akamai_logs/templates/check_akamai_logs.sh.erb
|
num_files=$(find <%= @local_logs_dir %>/ -name 'gdslog*.gz' -mtime -1 -size +10M | wc -l)
if [ "$num_files" -lt "4" ]; then
echo "ERROR: Fewer downloaded Akamai logs than expected."
exit 2
else
echo "OK"
exit 0
fi
|
num_files=$(find <%= @local_logs_dir %>/ -name 'gdslog*.gz' -mtime -1.2 -size +10M | wc -l)
if [ "$num_files" -lt "4" ]; then
echo "ERROR: Expected 4 downloaded Akamai logs, only found $num_files."
exit 2
else
echo "OK"
exit 0
fi
|
Increase time interval for Akamai logs nagios check
|
Increase time interval for Akamai logs nagios check
Prevent Nagios alert by providing a bit of a buffer around the time
interval.
Also add the actual number of logs found in the error message.
|
HTML+ERB
|
mit
|
alphagov/govuk-puppet,alphagov/govuk-puppet,alphagov/govuk-puppet,alphagov/govuk-puppet,alphagov/govuk-puppet,alphagov/govuk-puppet
|
html+erb
|
## Code Before:
num_files=$(find <%= @local_logs_dir %>/ -name 'gdslog*.gz' -mtime -1 -size +10M | wc -l)
if [ "$num_files" -lt "4" ]; then
echo "ERROR: Fewer downloaded Akamai logs than expected."
exit 2
else
echo "OK"
exit 0
fi
## Instruction:
Increase time interval for Akamai logs nagios check
Prevent Nagios alert by providing a bit of a buffer around the time
interval.
Also add the actual number of logs found in the error message.
## Code After:
num_files=$(find <%= @local_logs_dir %>/ -name 'gdslog*.gz' -mtime -1.2 -size +10M | wc -l)
if [ "$num_files" -lt "4" ]; then
echo "ERROR: Expected 4 downloaded Akamai logs, only found $num_files."
exit 2
else
echo "OK"
exit 0
fi
|
905a08bf59f6a7d51218aaa4559e7f4efa6244a9
|
thunderdome/tests/groovy/test_scanner.py
|
thunderdome/tests/groovy/test_scanner.py
|
import os
from unittest import TestCase
from thunderdome.gremlin import parse
class GroovyScannerTest(TestCase):
"""
Test Groovy language scanner
"""
def test_parsing_complicated_function(self):
groovy_file = os.path.join(os.path.dirname(__file__), 'test.groovy')
result = parse(groovy_file)
import ipdb; ipdb.set_trace()
assert len(result[6].body.split('\n')) == 8
|
import os
from unittest import TestCase
from thunderdome.gremlin import parse
class GroovyScannerTest(TestCase):
"""
Test Groovy language scanner
"""
def test_parsing_complicated_function(self):
groovy_file = os.path.join(os.path.dirname(__file__), 'test.groovy')
result = parse(groovy_file)
assert len(result[6].body.split('\n')) == 8
result_map = {x.name: x for x in result}
assert 'get_self' in result_map
assert 'return_value' in result_map
assert 'long_func' in result_map
|
Add Unit-Test For Scanner Problem
|
Add Unit-Test For Scanner Problem
|
Python
|
mit
|
StartTheShift/thunderdome,StartTheShift/thunderdome
|
python
|
## Code Before:
import os
from unittest import TestCase
from thunderdome.gremlin import parse
class GroovyScannerTest(TestCase):
"""
Test Groovy language scanner
"""
def test_parsing_complicated_function(self):
groovy_file = os.path.join(os.path.dirname(__file__), 'test.groovy')
result = parse(groovy_file)
import ipdb; ipdb.set_trace()
assert len(result[6].body.split('\n')) == 8
## Instruction:
Add Unit-Test For Scanner Problem
## Code After:
import os
from unittest import TestCase
from thunderdome.gremlin import parse
class GroovyScannerTest(TestCase):
"""
Test Groovy language scanner
"""
def test_parsing_complicated_function(self):
groovy_file = os.path.join(os.path.dirname(__file__), 'test.groovy')
result = parse(groovy_file)
assert len(result[6].body.split('\n')) == 8
result_map = {x.name: x for x in result}
assert 'get_self' in result_map
assert 'return_value' in result_map
assert 'long_func' in result_map
|
9b12ee79b89a855208910356b2ca669180b02c83
|
.travis.yml
|
.travis.yml
|
sudo: false
language: go
go:
- 1.3
- 1.4
- 1.5
- 1.6
- 1.7
- 1.8
- 1.9
- "1.10"
- tip
os:
- linux
env:
global:
- DB_URL="http://browscap.org/stream?q=BrowsCapCSV"
- TMP_PATH=test/tmp.csv
- DB_PATH=test/browscap.csv
matrix:
- CPU=1
- CPU=8
script: ./test.bash
matrix:
allow_failures:
- go: tip
|
sudo: false
language: go
go:
- "1.3"
- "1.4"
- "1.5"
- "1.6"
- "1.7"
- "1.8"
- "1.9"
- "1.10"
- "1.11"
- tip
os:
- linux
env:
global:
- DB_URL="http://browscap.org/stream?q=BrowsCapCSV"
- TMP_PATH=test/tmp.csv
- DB_PATH=test/browscap.csv
matrix:
- CPU=1
- CPU=8
script: ./test.bash
matrix:
allow_failures:
- go: tip
|
Add 1.11 to Travis config
|
Add 1.11 to Travis config
|
YAML
|
mit
|
kavu/cappa,kavu/cappa
|
yaml
|
## Code Before:
sudo: false
language: go
go:
- 1.3
- 1.4
- 1.5
- 1.6
- 1.7
- 1.8
- 1.9
- "1.10"
- tip
os:
- linux
env:
global:
- DB_URL="http://browscap.org/stream?q=BrowsCapCSV"
- TMP_PATH=test/tmp.csv
- DB_PATH=test/browscap.csv
matrix:
- CPU=1
- CPU=8
script: ./test.bash
matrix:
allow_failures:
- go: tip
## Instruction:
Add 1.11 to Travis config
## Code After:
sudo: false
language: go
go:
- "1.3"
- "1.4"
- "1.5"
- "1.6"
- "1.7"
- "1.8"
- "1.9"
- "1.10"
- "1.11"
- tip
os:
- linux
env:
global:
- DB_URL="http://browscap.org/stream?q=BrowsCapCSV"
- TMP_PATH=test/tmp.csv
- DB_PATH=test/browscap.csv
matrix:
- CPU=1
- CPU=8
script: ./test.bash
matrix:
allow_failures:
- go: tip
|
4754e781d96f7eda3af007494dead7ac652f3118
|
_includes/social_buttons.html
|
_includes/social_buttons.html
|
<ul class="rrssb-buttons clearfix">
</ul>
|
{% assign post_url = site.url | append: page.url %}
<ul class="rrssb-buttons clearfix">
<li class="rrssb-twitter">
<a href="{{ 'http://twitter.com/home?status=' | append: page.title | append: ' by @ne1ro ' | append: post_url | uri_escape }}" class="popup">
<span class="rrssb-icon"><svg xmlns="http://www.w3.org/2000/svg" width="28" height="28" viewBox="0 0 28 28"><path d="M24.253 8.756C24.69 17.08 18.297 24.182 9.97 24.62c-3.122.162-6.22-.646-8.86-2.32 2.702.18 5.375-.648 7.507-2.32-2.072-.248-3.818-1.662-4.49-3.64.802.13 1.62.077 2.4-.154-2.482-.466-4.312-2.586-4.412-5.11.688.276 1.426.408 2.168.387-2.135-1.65-2.73-4.62-1.394-6.965C5.574 7.816 9.54 9.84 13.802 10.07c-.842-2.738.694-5.64 3.434-6.48 2.018-.624 4.212.043 5.546 1.682 1.186-.213 2.318-.662 3.33-1.317-.386 1.256-1.248 2.312-2.4 2.942 1.048-.106 2.07-.394 3.02-.85-.458 1.182-1.343 2.15-2.48 2.71z"/></svg></span>
<span class="rrssb-text">twitter</span>
</a>
</li>
</ul>
|
Add Bit Twitter Button :joy:
|
Add Bit Twitter Button :joy:
|
HTML
|
mit
|
ne1ro/neiro.io,ne1ro/neiro.io
|
html
|
## Code Before:
<ul class="rrssb-buttons clearfix">
</ul>
## Instruction:
Add Bit Twitter Button :joy:
## Code After:
{% assign post_url = site.url | append: page.url %}
<ul class="rrssb-buttons clearfix">
<li class="rrssb-twitter">
<a href="{{ 'http://twitter.com/home?status=' | append: page.title | append: ' by @ne1ro ' | append: post_url | uri_escape }}" class="popup">
<span class="rrssb-icon"><svg xmlns="http://www.w3.org/2000/svg" width="28" height="28" viewBox="0 0 28 28"><path d="M24.253 8.756C24.69 17.08 18.297 24.182 9.97 24.62c-3.122.162-6.22-.646-8.86-2.32 2.702.18 5.375-.648 7.507-2.32-2.072-.248-3.818-1.662-4.49-3.64.802.13 1.62.077 2.4-.154-2.482-.466-4.312-2.586-4.412-5.11.688.276 1.426.408 2.168.387-2.135-1.65-2.73-4.62-1.394-6.965C5.574 7.816 9.54 9.84 13.802 10.07c-.842-2.738.694-5.64 3.434-6.48 2.018-.624 4.212.043 5.546 1.682 1.186-.213 2.318-.662 3.33-1.317-.386 1.256-1.248 2.312-2.4 2.942 1.048-.106 2.07-.394 3.02-.85-.458 1.182-1.343 2.15-2.48 2.71z"/></svg></span>
<span class="rrssb-text">twitter</span>
</a>
</li>
</ul>
|
f9ad5fc73cb5167c76c66686c83a6795e0a336ae
|
metadata/com.knirirr.beecount.txt
|
metadata/com.knirirr.beecount.txt
|
Categories:Office
License:Apache2
Web Site:https://code.google.com/p/beecount
Source Code:https://code.google.com/p/beecount/source
Issue Tracker:https://code.google.com/p/beecount/issues
Auto Name:BeeCount
Summary:Knitting row counter
Description:
A knitting project helper, with the following features:
* Keep track of several knitting projects.
* For each project, track several items which need counting, e.g. rows, pattern repeats etc
* Allow counting up, down and re-setting to zero of these counts.
* Allow editing of projects to add, remove or change counts after project creation.
* Link counts together so that one increments another.
* Set row alerts and cause counts to reset at particular values.
* Allow backing up and restoring of the entire projects database.
.
Repo Type:hg
Repo:https://code.google.com/p/beecount
Build:1.2.1,47
commit=1.2.1
target=android-15
Build:1.2.2,48
commit=1.2.2
target=android-15
Build:1.2.3,50
commit=1.2.3
target=android-15
Build:1.2.5,53
commit=1.2.5
target=android-15
Auto Update Mode:None
Update Check Mode:Tags
Current Version:1.2.6
Current Version Code:56
|
Categories:Office
License:Apache2
Web Site:https://code.google.com/p/beecount
Source Code:https://code.google.com/p/beecount/source
Issue Tracker:https://code.google.com/p/beecount/issues
Auto Name:BeeCount
Summary:Knitting row counter
Description:
A knitting project helper, with the following features:
* Keep track of several knitting projects.
* For each project, track several items which need counting, e.g. rows, pattern repeats etc
* Allow counting up, down and re-setting to zero of these counts.
* Allow editing of projects to add, remove or change counts after project creation.
* Link counts together so that one increments another.
* Set row alerts and cause counts to reset at particular values.
* Allow backing up and restoring of the entire projects database.
.
Repo Type:hg
Repo:https://code.google.com/p/beecount
Build:1.2.1,47
commit=1.2.1
target=android-15
Build:1.2.2,48
commit=1.2.2
target=android-15
Build:1.2.3,50
commit=1.2.3
target=android-15
Build:1.2.5,53
commit=1.2.5
target=android-15
Build:1.2.6,56
commit=1.2.6
target=android-15
Auto Update Mode:None
Update Check Mode:Tags
Current Version:1.2.6
Current Version Code:56
|
Update BeeCount to 1.2.6 (56)
|
Update BeeCount to 1.2.6 (56)
|
Text
|
agpl-3.0
|
f-droid/fdroiddata,f-droid/fdroiddata,f-droid/fdroid-data
|
text
|
## Code Before:
Categories:Office
License:Apache2
Web Site:https://code.google.com/p/beecount
Source Code:https://code.google.com/p/beecount/source
Issue Tracker:https://code.google.com/p/beecount/issues
Auto Name:BeeCount
Summary:Knitting row counter
Description:
A knitting project helper, with the following features:
* Keep track of several knitting projects.
* For each project, track several items which need counting, e.g. rows, pattern repeats etc
* Allow counting up, down and re-setting to zero of these counts.
* Allow editing of projects to add, remove or change counts after project creation.
* Link counts together so that one increments another.
* Set row alerts and cause counts to reset at particular values.
* Allow backing up and restoring of the entire projects database.
.
Repo Type:hg
Repo:https://code.google.com/p/beecount
Build:1.2.1,47
commit=1.2.1
target=android-15
Build:1.2.2,48
commit=1.2.2
target=android-15
Build:1.2.3,50
commit=1.2.3
target=android-15
Build:1.2.5,53
commit=1.2.5
target=android-15
Auto Update Mode:None
Update Check Mode:Tags
Current Version:1.2.6
Current Version Code:56
## Instruction:
Update BeeCount to 1.2.6 (56)
## Code After:
Categories:Office
License:Apache2
Web Site:https://code.google.com/p/beecount
Source Code:https://code.google.com/p/beecount/source
Issue Tracker:https://code.google.com/p/beecount/issues
Auto Name:BeeCount
Summary:Knitting row counter
Description:
A knitting project helper, with the following features:
* Keep track of several knitting projects.
* For each project, track several items which need counting, e.g. rows, pattern repeats etc
* Allow counting up, down and re-setting to zero of these counts.
* Allow editing of projects to add, remove or change counts after project creation.
* Link counts together so that one increments another.
* Set row alerts and cause counts to reset at particular values.
* Allow backing up and restoring of the entire projects database.
.
Repo Type:hg
Repo:https://code.google.com/p/beecount
Build:1.2.1,47
commit=1.2.1
target=android-15
Build:1.2.2,48
commit=1.2.2
target=android-15
Build:1.2.3,50
commit=1.2.3
target=android-15
Build:1.2.5,53
commit=1.2.5
target=android-15
Build:1.2.6,56
commit=1.2.6
target=android-15
Auto Update Mode:None
Update Check Mode:Tags
Current Version:1.2.6
Current Version Code:56
|
217e4e5450379452769fac74766aeb473d212d61
|
src/rivets.coffee
|
src/rivets.coffee
|
Rivets =
# Binder definitions, publicly accessible on `module.binders`. Can be
# overridden globally or local to a `Rivets.View` instance.
binders: {}
# Component definitions, publicly accessible on `module.components`. Can be
# overridden globally or local to a `Rivets.View` instance.
components: {}
# Formatter definitions, publicly accessible on `module.formatters`. Can be
# overridden globally or local to a `Rivets.View` instance.
formatters: {}
# Adapter definitions, publicly accessible on `module.adapters`. Can be
# overridden globally or local to a `Rivets.View` instance.
adapters: {}
# The default configuration, publicly accessible on `module.config`. Can be
# overridden globally or local to a `Rivets.View` instance.
config:
prefix: 'rv'
rootInterface: '.'
preloadData: true
handler: (context, ev, binding) ->
@call context, ev, binding.view.models
|
Rivets =
# Binder definitions, publicly accessible on `module.binders`. Can be
# overridden globally or local to a `Rivets.View` instance.
binders: {}
# Component definitions, publicly accessible on `module.components`. Can be
# overridden globally or local to a `Rivets.View` instance.
components: {}
# Formatter definitions, publicly accessible on `module.formatters`. Can be
# overridden globally or local to a `Rivets.View` instance.
formatters: {}
# Adapter definitions, publicly accessible on `module.adapters`. Can be
# overridden globally or local to a `Rivets.View` instance.
adapters: {}
# The default configuration, publicly accessible on `module.config`. Can be
# overridden globally or local to a `Rivets.View` instance.
config:
prefix: 'rv'
templateDelimiters: ['{', '}']
rootInterface: '.'
preloadData: true
handler: (context, ev, binding) ->
@call context, ev, binding.view.models
|
Add default { template } delimiters.
|
Add default { template } delimiters.
|
CoffeeScript
|
mit
|
benderTheCrime/tiny-rivets,nopnop/rivets,mikeric/rivets,altmind/rivets,re-clone/rivets,kangax/rivets,nopnop/rivets,MishaMykhalyuk/rivets,jccazeaux/rivets,zongkelong/rivets,MishaMykhalyuk/rivets,QAPInt/rivets,jccazeaux/rivets,altmind/rivets,zongkelong/rivets,QAPInt/rivets,re-clone/rivets,altmind/rivets,MishaMykhalyuk/rivets,zongkelong/rivets,GerHobbelt/rivets,nopnop/rivets,GerHobbelt/rivets,npmcomponent/mikeric-rivets,moneyadviceservice/rivets,jccazeaux/rivets,GerHobbelt/rivets,mikeric/rivets,mikeric/rivets,re-clone/rivets,QAPInt/rivets
|
coffeescript
|
## Code Before:
Rivets =
# Binder definitions, publicly accessible on `module.binders`. Can be
# overridden globally or local to a `Rivets.View` instance.
binders: {}
# Component definitions, publicly accessible on `module.components`. Can be
# overridden globally or local to a `Rivets.View` instance.
components: {}
# Formatter definitions, publicly accessible on `module.formatters`. Can be
# overridden globally or local to a `Rivets.View` instance.
formatters: {}
# Adapter definitions, publicly accessible on `module.adapters`. Can be
# overridden globally or local to a `Rivets.View` instance.
adapters: {}
# The default configuration, publicly accessible on `module.config`. Can be
# overridden globally or local to a `Rivets.View` instance.
config:
prefix: 'rv'
rootInterface: '.'
preloadData: true
handler: (context, ev, binding) ->
@call context, ev, binding.view.models
## Instruction:
Add default { template } delimiters.
## Code After:
Rivets =
# Binder definitions, publicly accessible on `module.binders`. Can be
# overridden globally or local to a `Rivets.View` instance.
binders: {}
# Component definitions, publicly accessible on `module.components`. Can be
# overridden globally or local to a `Rivets.View` instance.
components: {}
# Formatter definitions, publicly accessible on `module.formatters`. Can be
# overridden globally or local to a `Rivets.View` instance.
formatters: {}
# Adapter definitions, publicly accessible on `module.adapters`. Can be
# overridden globally or local to a `Rivets.View` instance.
adapters: {}
# The default configuration, publicly accessible on `module.config`. Can be
# overridden globally or local to a `Rivets.View` instance.
config:
prefix: 'rv'
templateDelimiters: ['{', '}']
rootInterface: '.'
preloadData: true
handler: (context, ev, binding) ->
@call context, ev, binding.view.models
|
ada5e5d04aaf766b9a88f72a53d0d9c3a62dc69d
|
tests/sharedtria/CMakeLists.txt
|
tests/sharedtria/CMakeLists.txt
|
CMAKE_MINIMUM_REQUIRED(VERSION 2.8.9)
INCLUDE(${DEAL_II_SOURCE_DIR}/tests/setup_testsubproject.cmake)
PROJECT(testsuite CXX)
INCLUDE(${DEAL_II_TARGET_CONFIG})
DEAL_II_PICKUP_TESTS()
|
CMAKE_MINIMUM_REQUIRED(VERSION 2.8.9)
INCLUDE(../setup_testsubproject.cmake)
PROJECT(testsuite CXX)
INCLUDE(${DEAL_II_TARGET_CONFIG})
DEAL_II_PICKUP_TESTS()
|
Fix path to cmake file.
|
Fix path to cmake file.
The current path leads to a failure on my system when calling
'ninja setup_tests':
FAILED: cd /node/bangerth/trunk/build/tests/sharedtria && /w/bangerth/share/software/cmake-2.8.12.2/bin/cmake -GNinja -DDEAL_II_DIR=/node/bangerth/trunk/build -UDIFF_DIR -UNUMDIFF_DIR -UTEST_PICKUP_REGEX -UTEST_TIME_LIMIT /node/bangerth/trunk/dealii/tests/sharedtria > /dev/null
CMake Error at CMakeLists.txt:2 (INCLUDE):
include could not find load file:
../tests/setup_testsubproject.cmake
CMake Error at CMakeLists.txt:4 (INCLUDE):
include called with wrong number of arguments. Include only takes one
file.
Fix this by providing the same path as in the tests/distributed
directory.
|
Text
|
lgpl-2.1
|
sriharisundar/dealii,sriharisundar/dealii,pesser/dealii,ibkim11/dealii,kalj/dealii,ibkim11/dealii,angelrca/dealii,sairajat/dealii,maieneuro/dealii,danshapero/dealii,adamkosik/dealii,spco/dealii,nicolacavallini/dealii,Arezou-gh/dealii,Arezou-gh/dealii,sriharisundar/dealii,maieneuro/dealii,ibkim11/dealii,angelrca/dealii,gpitton/dealii,spco/dealii,JaeryunYim/dealii,gpitton/dealii,adamkosik/dealii,adamkosik/dealii,sairajat/dealii,shakirbsm/dealii,ESeNonFossiIo/dealii,ESeNonFossiIo/dealii,pesser/dealii,JaeryunYim/dealii,adamkosik/dealii,andreamola/dealii,maieneuro/dealii,sairajat/dealii,YongYang86/dealii,naliboff/dealii,sairajat/dealii,EGP-CIG-REU/dealii,nicolacavallini/dealii,danshapero/dealii,andreamola/dealii,maieneuro/dealii,andreamola/dealii,JaeryunYim/dealii,spco/dealii,EGP-CIG-REU/dealii,gpitton/dealii,andreamola/dealii,sriharisundar/dealii,adamkosik/dealii,kalj/dealii,ibkim11/dealii,danshapero/dealii,nicolacavallini/dealii,ESeNonFossiIo/dealii,angelrca/dealii,EGP-CIG-REU/dealii,andreamola/dealii,sriharisundar/dealii,YongYang86/dealii,JaeryunYim/dealii,adamkosik/dealii,naliboff/dealii,maieneuro/dealii,kalj/dealii,spco/dealii,sriharisundar/dealii,spco/dealii,maieneuro/dealii,sriharisundar/dealii,danshapero/dealii,angelrca/dealii,ESeNonFossiIo/dealii,Arezou-gh/dealii,JaeryunYim/dealii,ibkim11/dealii,gpitton/dealii,johntfoster/dealii,danshapero/dealii,nicolacavallini/dealii,YongYang86/dealii,angelrca/dealii,shakirbsm/dealii,EGP-CIG-REU/dealii,YongYang86/dealii,nicolacavallini/dealii,johntfoster/dealii,jperryhouts/dealii,johntfoster/dealii,jperryhouts/dealii,johntfoster/dealii,pesser/dealii,danshapero/dealii,EGP-CIG-REU/dealii,YongYang86/dealii,angelrca/dealii,shakirbsm/dealii,sairajat/dealii,gpitton/dealii,pesser/dealii,johntfoster/dealii,kalj/dealii,spco/dealii,ESeNonFossiIo/dealii,johntfoster/dealii,kalj/dealii,pesser/dealii,jperryhouts/dealii,ESeNonFossiIo/dealii,kalj/dealii,JaeryunYim/dealii,YongYang86/dealii,angelrca/dealii,shakirbsm/dealii,naliboff/dealii,Arezou-gh/dealii,sairajat/dealii,kalj/dealii,shakirbsm/dealii,EGP-CIG-REU/dealii,shakirbsm/dealii,jperryhouts/dealii,nicolacavallini/dealii,maieneuro/dealii,danshapero/dealii,ibkim11/dealii,ibkim11/dealii,jperryhouts/dealii,gpitton/dealii,sairajat/dealii,naliboff/dealii,johntfoster/dealii,jperryhouts/dealii,pesser/dealii,gpitton/dealii,ESeNonFossiIo/dealii,shakirbsm/dealii,EGP-CIG-REU/dealii,naliboff/dealii,JaeryunYim/dealii,andreamola/dealii,adamkosik/dealii,naliboff/dealii,YongYang86/dealii,Arezou-gh/dealii,Arezou-gh/dealii,Arezou-gh/dealii,naliboff/dealii,pesser/dealii,nicolacavallini/dealii,spco/dealii
|
text
|
## Code Before:
CMAKE_MINIMUM_REQUIRED(VERSION 2.8.9)
INCLUDE(${DEAL_II_SOURCE_DIR}/tests/setup_testsubproject.cmake)
PROJECT(testsuite CXX)
INCLUDE(${DEAL_II_TARGET_CONFIG})
DEAL_II_PICKUP_TESTS()
## Instruction:
Fix path to cmake file.
The current path leads to a failure on my system when calling
'ninja setup_tests':
FAILED: cd /node/bangerth/trunk/build/tests/sharedtria && /w/bangerth/share/software/cmake-2.8.12.2/bin/cmake -GNinja -DDEAL_II_DIR=/node/bangerth/trunk/build -UDIFF_DIR -UNUMDIFF_DIR -UTEST_PICKUP_REGEX -UTEST_TIME_LIMIT /node/bangerth/trunk/dealii/tests/sharedtria > /dev/null
CMake Error at CMakeLists.txt:2 (INCLUDE):
include could not find load file:
../tests/setup_testsubproject.cmake
CMake Error at CMakeLists.txt:4 (INCLUDE):
include called with wrong number of arguments. Include only takes one
file.
Fix this by providing the same path as in the tests/distributed
directory.
## Code After:
CMAKE_MINIMUM_REQUIRED(VERSION 2.8.9)
INCLUDE(../setup_testsubproject.cmake)
PROJECT(testsuite CXX)
INCLUDE(${DEAL_II_TARGET_CONFIG})
DEAL_II_PICKUP_TESTS()
|
d9ecc149156b70926e843f0207bb8303e3f80377
|
BreweryDBTests/RequestBuilderTests.swift
|
BreweryDBTests/RequestBuilderTests.swift
|
//
// RequestBuilderTests.swift
// BreweryDB
//
// Created by Jake Welton on 1/30/16.
// Copyright © 2016 Jake Welton. All rights reserved.
//
import XCTest
@testable import BreweryDB
class RequestBuilderTests: XCTestCase {
override func setUp() {
super.setUp()
// Put setup code here. This method is called before the invocation of each test method in the class.
}
override func tearDown() {
// Put teardown code here. This method is called after the invocation of each test method in the class.
super.tearDown()
}
func testRequestBuilderInitsWithRequestEndPoint() {
let requestBuilder = RequestBuilder(endPoint: .Beer)
XCTAssertNotNil(requestBuilder)
}
}
|
//
// RequestBuilderTests.swift
// BreweryDB
//
// Created by Jake Welton on 1/30/16.
// Copyright © 2016 Jake Welton. All rights reserved.
//
import XCTest
@testable import BreweryDB
class RequestBuilderTests: XCTestCase {
override func setUp() {
super.setUp()
// Put setup code here. This method is called before the invocation of each test method in the class.
}
override func tearDown() {
// Put teardown code here. This method is called after the invocation of each test method in the class.
super.tearDown()
}
func testRequestBuilderInitsWithRequestEndPoint() {
let requestBuilder = RequestBuilder(endPoint: .Beer)
XCTAssertNotNil(requestBuilder)
}
func testRequestBuilderNSURLExtensionReturnsURLWithRawValue() {
let baseURL = NSURL(string: "app.mywebservice.com")
let url = baseURL?.URLByAppendingPathComponent(.Beer)
XCTAssertEqual(url, NSURL(string: "app.mywebservice.com/beer"))
}
}
|
Add test for NSURL extension
|
Add test for NSURL extension
|
Swift
|
mit
|
jwelton/BreweryDB,jwelton/BreweryDB
|
swift
|
## Code Before:
//
// RequestBuilderTests.swift
// BreweryDB
//
// Created by Jake Welton on 1/30/16.
// Copyright © 2016 Jake Welton. All rights reserved.
//
import XCTest
@testable import BreweryDB
class RequestBuilderTests: XCTestCase {
override func setUp() {
super.setUp()
// Put setup code here. This method is called before the invocation of each test method in the class.
}
override func tearDown() {
// Put teardown code here. This method is called after the invocation of each test method in the class.
super.tearDown()
}
func testRequestBuilderInitsWithRequestEndPoint() {
let requestBuilder = RequestBuilder(endPoint: .Beer)
XCTAssertNotNil(requestBuilder)
}
}
## Instruction:
Add test for NSURL extension
## Code After:
//
// RequestBuilderTests.swift
// BreweryDB
//
// Created by Jake Welton on 1/30/16.
// Copyright © 2016 Jake Welton. All rights reserved.
//
import XCTest
@testable import BreweryDB
class RequestBuilderTests: XCTestCase {
override func setUp() {
super.setUp()
// Put setup code here. This method is called before the invocation of each test method in the class.
}
override func tearDown() {
// Put teardown code here. This method is called after the invocation of each test method in the class.
super.tearDown()
}
func testRequestBuilderInitsWithRequestEndPoint() {
let requestBuilder = RequestBuilder(endPoint: .Beer)
XCTAssertNotNil(requestBuilder)
}
func testRequestBuilderNSURLExtensionReturnsURLWithRawValue() {
let baseURL = NSURL(string: "app.mywebservice.com")
let url = baseURL?.URLByAppendingPathComponent(.Beer)
XCTAssertEqual(url, NSURL(string: "app.mywebservice.com/beer"))
}
}
|
1f395c66eeaabdc36f11cb333529d271ed4566a0
|
lib/resty/auto-ssl/utils/shell_execute.lua
|
lib/resty/auto-ssl/utils/shell_execute.lua
|
local shell = require "resty.auto-ssl.vendor.shell"
local start_sockproc = require "resty.auto-ssl.utils.start_sockproc"
return function(command)
-- Make sure the sockproc has started before trying to execute any commands
-- (since it's started by only a single worker in init_worker, it's possible
-- other workers have already finished their init_worker phases before the
-- process is actually started).
if not ngx.shared.auto_ssl:get("sockproc_started") then
start_sockproc()
local wait_time = 0
local sleep_time = 0.01
local max_time = 5
while not ngx.shared.auto_ssl:get("sockproc_started") do
ngx.sleep(sleep_time)
wait_time = wait_time + sleep_time
if wait_time > max_time then
break
end
end
end
local status, out, err = shell.execute(command)
-- If the script fails due to a missing sockproc socket, try starting up
-- the sockproc process again and then retry.
if status ~= 0 and err == "no such file or directory" then
ngx.log(ngx.ERR, "auto-ssl: sockproc unexpectedly not available, trying to restart")
start_sockproc(true)
status, out, err = shell.execute(command)
end
return status, out, err
end
|
local shell = require "resty.auto-ssl.vendor.shell"
local start_sockproc = require "resty.auto-ssl.utils.start_sockproc"
return function(command)
-- Make sure the sockproc has started before trying to execute any commands
-- (since it's started by only a single worker in init_worker, it's possible
-- other workers have already finished their init_worker phases before the
-- process is actually started).
if not ngx.shared.auto_ssl:get("sockproc_started") then
start_sockproc()
local wait_time = 0
local sleep_time = 0.01
local max_time = 5
while not ngx.shared.auto_ssl:get("sockproc_started") do
ngx.sleep(sleep_time)
wait_time = wait_time + sleep_time
if wait_time > max_time then
break
end
end
end
local status, out, err = shell.execute(command)
-- If the script fails due to a missing sockproc socket, try starting up
-- the sockproc process again and then retry.
if status ~= 0 and err == "no such file or directory" then
ngx.log(ngx.ERR, "auto-ssl: sockproc unexpectedly not available, trying to restart")
start_sockproc(true)
status, out, err = shell.execute(command, { timeout = 60 })
end
return status, out, err
end
|
Increase default timeout for calling letsencrypt.sh to 60 seconds.
|
Increase default timeout for calling letsencrypt.sh to 60 seconds.
This attempts to workaround potential slowdowns on Let's Encrypt's end
without dropping the initial request while registering the cert.
The previous default timeout builtin to resty-shell was 15 seconds.
Hopefully certificate registrations won't actually take that long, but
this should hopefully help with random slowness (or at least provide a
better error message from letsencrypt.sh, rather than timing out while
the letsencrypt.sh script continues to run).
See: https://github.com/GUI/lua-resty-auto-ssl/issues/11
|
Lua
|
mit
|
UseFedora/lua-resty-auto-ssl,UseFedora/lua-resty-auto-ssl,GUI/lua-resty-auto-ssl
|
lua
|
## Code Before:
local shell = require "resty.auto-ssl.vendor.shell"
local start_sockproc = require "resty.auto-ssl.utils.start_sockproc"
return function(command)
-- Make sure the sockproc has started before trying to execute any commands
-- (since it's started by only a single worker in init_worker, it's possible
-- other workers have already finished their init_worker phases before the
-- process is actually started).
if not ngx.shared.auto_ssl:get("sockproc_started") then
start_sockproc()
local wait_time = 0
local sleep_time = 0.01
local max_time = 5
while not ngx.shared.auto_ssl:get("sockproc_started") do
ngx.sleep(sleep_time)
wait_time = wait_time + sleep_time
if wait_time > max_time then
break
end
end
end
local status, out, err = shell.execute(command)
-- If the script fails due to a missing sockproc socket, try starting up
-- the sockproc process again and then retry.
if status ~= 0 and err == "no such file or directory" then
ngx.log(ngx.ERR, "auto-ssl: sockproc unexpectedly not available, trying to restart")
start_sockproc(true)
status, out, err = shell.execute(command)
end
return status, out, err
end
## Instruction:
Increase default timeout for calling letsencrypt.sh to 60 seconds.
This attempts to workaround potential slowdowns on Let's Encrypt's end
without dropping the initial request while registering the cert.
The previous default timeout builtin to resty-shell was 15 seconds.
Hopefully certificate registrations won't actually take that long, but
this should hopefully help with random slowness (or at least provide a
better error message from letsencrypt.sh, rather than timing out while
the letsencrypt.sh script continues to run).
See: https://github.com/GUI/lua-resty-auto-ssl/issues/11
## Code After:
local shell = require "resty.auto-ssl.vendor.shell"
local start_sockproc = require "resty.auto-ssl.utils.start_sockproc"
return function(command)
-- Make sure the sockproc has started before trying to execute any commands
-- (since it's started by only a single worker in init_worker, it's possible
-- other workers have already finished their init_worker phases before the
-- process is actually started).
if not ngx.shared.auto_ssl:get("sockproc_started") then
start_sockproc()
local wait_time = 0
local sleep_time = 0.01
local max_time = 5
while not ngx.shared.auto_ssl:get("sockproc_started") do
ngx.sleep(sleep_time)
wait_time = wait_time + sleep_time
if wait_time > max_time then
break
end
end
end
local status, out, err = shell.execute(command)
-- If the script fails due to a missing sockproc socket, try starting up
-- the sockproc process again and then retry.
if status ~= 0 and err == "no such file or directory" then
ngx.log(ngx.ERR, "auto-ssl: sockproc unexpectedly not available, trying to restart")
start_sockproc(true)
status, out, err = shell.execute(command, { timeout = 60 })
end
return status, out, err
end
|
a5474a6773c59ed8593b8d0f31943f116de6ba3c
|
scripts/aws/mongo-install.sh
|
scripts/aws/mongo-install.sh
|
set -e
cat <<END > mongodb-org-3.0.repo
[mongodb-org-3.0]
name=MongoDB Repository
baseurl=https://repo.mongodb.org/yum/amazon/2013.03/mongodb-org/3.0/x86_64/
gpgcheck=0
enabled=1
END
sudo mv -v mongodb-org-3.0.repo /etc/yum.repos.d/mongodb-org-3.0.repo
sudo yum install -y mongodb-org
sudo service mongod start
sudo chkconfig mongod on
|
cat <<END | sudo tee /etc/yum.repos.d/mongodb-org-3.2.repo
[mongodb-org-3.2]
name=MongoDB Repository
baseurl=https://repo.mongodb.org/yum/amazon/2013.03/mongodb-org/3.2/x86_64/
gpgcheck=1
enabled=1
gpgkey=https://www.mongodb.org/static/pgp/server-3.2.asc
END
sudo yum update -y
set -e
sudo yum install -y mongodb-org
if [[ ! -e /var/run/mongodb/mongod.pid ]] ; then
sudo service mongod start
sudo chkconfig mongod on
fi
|
Improve mongo installer for aws linux
|
Improve mongo installer for aws linux
|
Shell
|
mit
|
c9s/typeloy,c9s/typeloy
|
shell
|
## Code Before:
set -e
cat <<END > mongodb-org-3.0.repo
[mongodb-org-3.0]
name=MongoDB Repository
baseurl=https://repo.mongodb.org/yum/amazon/2013.03/mongodb-org/3.0/x86_64/
gpgcheck=0
enabled=1
END
sudo mv -v mongodb-org-3.0.repo /etc/yum.repos.d/mongodb-org-3.0.repo
sudo yum install -y mongodb-org
sudo service mongod start
sudo chkconfig mongod on
## Instruction:
Improve mongo installer for aws linux
## Code After:
cat <<END | sudo tee /etc/yum.repos.d/mongodb-org-3.2.repo
[mongodb-org-3.2]
name=MongoDB Repository
baseurl=https://repo.mongodb.org/yum/amazon/2013.03/mongodb-org/3.2/x86_64/
gpgcheck=1
enabled=1
gpgkey=https://www.mongodb.org/static/pgp/server-3.2.asc
END
sudo yum update -y
set -e
sudo yum install -y mongodb-org
if [[ ! -e /var/run/mongodb/mongod.pid ]] ; then
sudo service mongod start
sudo chkconfig mongod on
fi
|
298c66efec5397378156ddb3dc1260930d81fe42
|
radian-emacs/radian-help.el
|
radian-emacs/radian-help.el
|
;;; radian-help.el --- Improve the Emacs help system
(require 'radian-package)
;; This package provides several useful features, but the most
;; important is `describe-keymap'. Because the package is rather
;; outdated, it's not autoloaded. (But `use-package' takes care of
;; that for us.)
(use-package help-fns+
:defer-install t
:bind (("C-h M-k" . describe-keymap)))
(provide 'radian-help)
;;; radian-help.el ends here
|
;;; radian-help.el --- Improve the Emacs help system
(require 'radian-package)
;; This package provides several useful features, but the most
;; important is `describe-keymap'. Because the package is rather
;; outdated, it's not autoloaded. (But `use-package' takes care of
;; that for us.)
(use-package help-fns+
:defer-install t
:bind (("C-h M-k" . describe-keymap)
;; Prevent help-fns+ from overriding this built-in
;; keybinding:
("C-h o" . describe-symbol)))
(provide 'radian-help)
;;; radian-help.el ends here
|
Make sure describe-symbol isn't unbound
|
Make sure describe-symbol isn't unbound
|
Emacs Lisp
|
mit
|
raxod502/radian,raxod502/radian
|
emacs-lisp
|
## Code Before:
;;; radian-help.el --- Improve the Emacs help system
(require 'radian-package)
;; This package provides several useful features, but the most
;; important is `describe-keymap'. Because the package is rather
;; outdated, it's not autoloaded. (But `use-package' takes care of
;; that for us.)
(use-package help-fns+
:defer-install t
:bind (("C-h M-k" . describe-keymap)))
(provide 'radian-help)
;;; radian-help.el ends here
## Instruction:
Make sure describe-symbol isn't unbound
## Code After:
;;; radian-help.el --- Improve the Emacs help system
(require 'radian-package)
;; This package provides several useful features, but the most
;; important is `describe-keymap'. Because the package is rather
;; outdated, it's not autoloaded. (But `use-package' takes care of
;; that for us.)
(use-package help-fns+
:defer-install t
:bind (("C-h M-k" . describe-keymap)
;; Prevent help-fns+ from overriding this built-in
;; keybinding:
("C-h o" . describe-symbol)))
(provide 'radian-help)
;;; radian-help.el ends here
|
4569342455a71ddd56f77714eb07c28068d48383
|
examples/logWriter_config.js
|
examples/logWriter_config.js
|
/* jshint -W079 */
var scribe = require('../scribe.js')({
createDefaultConsole : false
});
var consoleOne = scribe.console({
console : {
colors : 'white'
},
logWriter : {
rootPath : 'logsConsoleOne' //all logs in ./logsConsoleOne
}
});
var consoleTwo = scribe.console({
console : {
colors : 'inverse'
},
logWriter : {
rootPath : 'logsConsoleTwo' //all logs in ./logsConsoleTwo
}
});
var consoleThree = scribe.console({
console : {
colors : 'magenta'
},
logWriter : false //don't save logs on disk
});
consoleOne.addLogger('log');
consoleTwo.addLogger('log');
consoleThree.addLogger('log');
consoleOne.time().log('Hello World from consoleOne');
consoleTwo.time().log('Hello World from consoleTwo');
consoleThree.time().log('Hello World from consoleThree');
|
/* jshint -W079 */
var moment = require('moment'),
path = require('path');
var scribe = require('../scribe.js')({
createDefaultConsole : false
});
/**
* Create a custom LogWriter
*
* It'll save logs under logsConsoleTwo/[user]/[logger]/[DD_MMM_YY].[logger].json
*
* @see lib/logWriter.js for details
*/
var myLogWriter = new scribe.LogWriter('logsConsoleTwo');
myLogWriter.getPath = function (opt) {
return path.join(
this.getUser(),
opt.logger.name
);
};
myLogWriter.getFilename = function (opt) {
var now = moment();
return (now.format('DD_MMM_YY')).toLowerCase() +
'.' +
opt.logger.name +
'.json';
};
/**
* Create 3 console2 instances
*/
var consoleOne = scribe.console({
console : {
colors : 'white'
},
logWriter : {
rootPath : 'logsConsoleOne' //all logs in ./logsConsoleOne
}
});
var consoleTwo = scribe.console(
{
console : {
colors : 'inverse'
}
},
myLogWriter //don't pass a logWriter config, but a custom LogWriter instead
);
var consoleThree = scribe.console({
console : {
colors : 'magenta'
},
logWriter : false //don't save logs on disk
});
/**
* Use the consoles
*
* Then check logsConsoleOne and logsConsoleTwo folders
*/
consoleOne.addLogger('log');
consoleTwo.addLogger('log');
consoleThree.addLogger('log');
consoleOne.time().log('Hello World from consoleOne');
consoleTwo.time().log('Hello World from consoleTwo');
consoleThree.time().log('Hello World from consoleThree');
|
Add a custom LogWriter example
|
Add a custom LogWriter example
|
JavaScript
|
mit
|
bluejamesbond/Scribe.js,joshball/Scribe.js,joshball/Scribe.js,bluejamesbond/Scribe.js
|
javascript
|
## Code Before:
/* jshint -W079 */
var scribe = require('../scribe.js')({
createDefaultConsole : false
});
var consoleOne = scribe.console({
console : {
colors : 'white'
},
logWriter : {
rootPath : 'logsConsoleOne' //all logs in ./logsConsoleOne
}
});
var consoleTwo = scribe.console({
console : {
colors : 'inverse'
},
logWriter : {
rootPath : 'logsConsoleTwo' //all logs in ./logsConsoleTwo
}
});
var consoleThree = scribe.console({
console : {
colors : 'magenta'
},
logWriter : false //don't save logs on disk
});
consoleOne.addLogger('log');
consoleTwo.addLogger('log');
consoleThree.addLogger('log');
consoleOne.time().log('Hello World from consoleOne');
consoleTwo.time().log('Hello World from consoleTwo');
consoleThree.time().log('Hello World from consoleThree');
## Instruction:
Add a custom LogWriter example
## Code After:
/* jshint -W079 */
var moment = require('moment'),
path = require('path');
var scribe = require('../scribe.js')({
createDefaultConsole : false
});
/**
* Create a custom LogWriter
*
* It'll save logs under logsConsoleTwo/[user]/[logger]/[DD_MMM_YY].[logger].json
*
* @see lib/logWriter.js for details
*/
var myLogWriter = new scribe.LogWriter('logsConsoleTwo');
myLogWriter.getPath = function (opt) {
return path.join(
this.getUser(),
opt.logger.name
);
};
myLogWriter.getFilename = function (opt) {
var now = moment();
return (now.format('DD_MMM_YY')).toLowerCase() +
'.' +
opt.logger.name +
'.json';
};
/**
* Create 3 console2 instances
*/
var consoleOne = scribe.console({
console : {
colors : 'white'
},
logWriter : {
rootPath : 'logsConsoleOne' //all logs in ./logsConsoleOne
}
});
var consoleTwo = scribe.console(
{
console : {
colors : 'inverse'
}
},
myLogWriter //don't pass a logWriter config, but a custom LogWriter instead
);
var consoleThree = scribe.console({
console : {
colors : 'magenta'
},
logWriter : false //don't save logs on disk
});
/**
* Use the consoles
*
* Then check logsConsoleOne and logsConsoleTwo folders
*/
consoleOne.addLogger('log');
consoleTwo.addLogger('log');
consoleThree.addLogger('log');
consoleOne.time().log('Hello World from consoleOne');
consoleTwo.time().log('Hello World from consoleTwo');
consoleThree.time().log('Hello World from consoleThree');
|
91ed18aedd564d41b3fdfbc2c0fd241a25fe09ea
|
app/uk/gov/hmrc/hmrcemailrenderer/templates/transactionengine/TransactionEngineFromAddress.scala
|
app/uk/gov/hmrc/hmrcemailrenderer/templates/transactionengine/TransactionEngineFromAddress.scala
|
/*
* Copyright 2017 HM Revenue & Customs
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package uk.gov.hmrc.hmrcemailrenderer.templates.transactionengine
import play.api.Play
import scala.util.Try
object TransactionEngineFromAddress {
import play.api.Play.current
private val defaultDomain = "transaction-engine.tax.service.gov.uk"
lazy val replyDomain = Try(Play.configuration.getString("transactionEngine.fromAddress.domain")).toOption.flatten.getOrElse(defaultDomain)
def noReply(name: String): String = s"$name <noreply@$replyDomain>"
lazy val transactionEngineAddress = noReply("Government Gateway Transaction Engine")
}
|
/*
* Copyright 2017 HM Revenue & Customs
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package uk.gov.hmrc.hmrcemailrenderer.templates.transactionengine
import play.api.Play
import scala.util.Try
object TransactionEngineFromAddress {
import play.api.Play.current
private val defaultDomain = "confirmation.tax.service.gov.uk"
lazy val replyDomain = Try(Play.configuration.getString("transactionEngine.fromAddress.domain")).toOption.flatten.getOrElse(defaultDomain)
def noReply(name: String): String = s"$name <noreply@$replyDomain>"
lazy val transactionEngineAddress = noReply("Gateway Confirmation")
}
|
Update default transaction engine from address
|
RATE-4305: Update default transaction engine from address
|
Scala
|
apache-2.0
|
saurabharora80/hmrc-email-renderer
|
scala
|
## Code Before:
/*
* Copyright 2017 HM Revenue & Customs
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package uk.gov.hmrc.hmrcemailrenderer.templates.transactionengine
import play.api.Play
import scala.util.Try
object TransactionEngineFromAddress {
import play.api.Play.current
private val defaultDomain = "transaction-engine.tax.service.gov.uk"
lazy val replyDomain = Try(Play.configuration.getString("transactionEngine.fromAddress.domain")).toOption.flatten.getOrElse(defaultDomain)
def noReply(name: String): String = s"$name <noreply@$replyDomain>"
lazy val transactionEngineAddress = noReply("Government Gateway Transaction Engine")
}
## Instruction:
RATE-4305: Update default transaction engine from address
## Code After:
/*
* Copyright 2017 HM Revenue & Customs
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package uk.gov.hmrc.hmrcemailrenderer.templates.transactionengine
import play.api.Play
import scala.util.Try
object TransactionEngineFromAddress {
import play.api.Play.current
private val defaultDomain = "confirmation.tax.service.gov.uk"
lazy val replyDomain = Try(Play.configuration.getString("transactionEngine.fromAddress.domain")).toOption.flatten.getOrElse(defaultDomain)
def noReply(name: String): String = s"$name <noreply@$replyDomain>"
lazy val transactionEngineAddress = noReply("Gateway Confirmation")
}
|
51baf5b1caf02a09f4c9558b4d5e085cbea5a068
|
README.md
|
README.md
|
Additional validator implementations for javax.validation.
Validators will treat null and blank input strings as valid. If you want to make sure fields are not null/blank
you have to use `@NotNull` or `@NotBlank` in addition.
## ISBNs
```java
@ISBN
private String isbn;
@ISBN(type = ISBNType.ISBN_10)
private String isbn10;
@ISBN(type = ISBNType.ISBN_13)
private String isbn13;
```
## Numbers
```java
@Numeric
private String numeric;
```
|
Additional validator implementations for [javax.validation](http://beanvalidation.org/).
Validators will treat null and blank input strings as valid. If you want to make sure fields are not null/blank
you have to use `@NotNull` or `@NotBlank` in addition.
## ISBNs
```java
@ISBN
private String isbn;
@ISBN(type = ISBNType.ISBN_10)
private String isbn10;
@ISBN(type = ISBNType.ISBN_13)
private String isbn13;
```
## Numbers
```java
@Numeric
private String numeric;
```
|
Add link to bean validation documentation
|
Add link to bean validation documentation
|
Markdown
|
apache-2.0
|
britter/bean-validators,britter/bean-validators
|
markdown
|
## Code Before:
Additional validator implementations for javax.validation.
Validators will treat null and blank input strings as valid. If you want to make sure fields are not null/blank
you have to use `@NotNull` or `@NotBlank` in addition.
## ISBNs
```java
@ISBN
private String isbn;
@ISBN(type = ISBNType.ISBN_10)
private String isbn10;
@ISBN(type = ISBNType.ISBN_13)
private String isbn13;
```
## Numbers
```java
@Numeric
private String numeric;
```
## Instruction:
Add link to bean validation documentation
## Code After:
Additional validator implementations for [javax.validation](http://beanvalidation.org/).
Validators will treat null and blank input strings as valid. If you want to make sure fields are not null/blank
you have to use `@NotNull` or `@NotBlank` in addition.
## ISBNs
```java
@ISBN
private String isbn;
@ISBN(type = ISBNType.ISBN_10)
private String isbn10;
@ISBN(type = ISBNType.ISBN_13)
private String isbn13;
```
## Numbers
```java
@Numeric
private String numeric;
```
|
ce0c966f6d094412086cc91cdbb392f98507daee
|
src/spec/models/deployable_spec.rb
|
src/spec/models/deployable_spec.rb
|
require 'spec_helper'
describe Deployable do
it "should have automatically generated uuid after validation" do
d = Factory.build(:deployable)
d.uuid = nil
d.save
d.uuid.should_not be_nil
end
it "should not be valid if deployable name is too long" do
d = Factory.build(:deployable)
d.name = ('a' * 256)
d.valid?.should be_false
d.errors[:name].should_not be_nil
d.errors[:name].should =~ /^is too long.*/
end
it "should have associated assembly" do
d = Factory.build(:deployable)
d.assemblies.size.should eql(1)
end
end
|
require 'spec_helper'
describe Deployable do
it "should have automatically generated uuid after validation" do
d = Factory.build(:deployable)
d.uuid = nil
d.save
d.uuid.should_not be_nil
end
it "should not be valid if deployable name is too long" do
d = Factory.build(:deployable)
d.name = ('a' * 256)
d.valid?.should be_false
d.errors[:name].should_not be_nil
d.errors[:name].should =~ /^is too long.*/
end
it "should have associated assembly" do
d = Factory.build(:deployable)
d.assemblies.size.should eql(1)
end
it "should not be destroyable when it has running instances" do
deployable = Factory.create(:deployable)
deployment = Factory.create(:deployment, :deployable_id => deployable.id)
assembly = Factory.create(:assembly)
instance = Factory.create(:instance, :deployment_id => deployment.id, :assembly_id => assembly.id, :template_id => nil)
Deployable.find(deployable).should_not be_destroyable
instance.state = Instance::STATE_STOPPED
instance.save!
Deployable.find(deployable).should be_destroyable
end
it "should not be destroyable when it has stopped stateful instances" do
deployable = Factory.build(:deployable)
deployment = Factory.build(:deployment, :deployable_id => deployable.id)
deployable.deployments << deployment
assembly = Factory.build(:assembly)
instance = Factory.build(:instance, :deployment_id => deployment.id, :assembly_id => assembly.id, :template_id => nil)
instance.stub!(:restartable?).and_return(true)
deployment.instances << instance
deployable.should_not be_destroyable
instance.state = Instance::STATE_STOPPED
deployable.should_not be_destroyable
instance.stub!(:restartable?).and_return(false)
deployable.should be_destroyable
end
end
|
Add tests for deployable validation
|
Add tests for deployable validation
|
Ruby
|
apache-2.0
|
aeolusproject/conductor,aeolusproject/conductor,aeolusproject/conductor,aeolusproject/conductor
|
ruby
|
## Code Before:
require 'spec_helper'
describe Deployable do
it "should have automatically generated uuid after validation" do
d = Factory.build(:deployable)
d.uuid = nil
d.save
d.uuid.should_not be_nil
end
it "should not be valid if deployable name is too long" do
d = Factory.build(:deployable)
d.name = ('a' * 256)
d.valid?.should be_false
d.errors[:name].should_not be_nil
d.errors[:name].should =~ /^is too long.*/
end
it "should have associated assembly" do
d = Factory.build(:deployable)
d.assemblies.size.should eql(1)
end
end
## Instruction:
Add tests for deployable validation
## Code After:
require 'spec_helper'
describe Deployable do
it "should have automatically generated uuid after validation" do
d = Factory.build(:deployable)
d.uuid = nil
d.save
d.uuid.should_not be_nil
end
it "should not be valid if deployable name is too long" do
d = Factory.build(:deployable)
d.name = ('a' * 256)
d.valid?.should be_false
d.errors[:name].should_not be_nil
d.errors[:name].should =~ /^is too long.*/
end
it "should have associated assembly" do
d = Factory.build(:deployable)
d.assemblies.size.should eql(1)
end
it "should not be destroyable when it has running instances" do
deployable = Factory.create(:deployable)
deployment = Factory.create(:deployment, :deployable_id => deployable.id)
assembly = Factory.create(:assembly)
instance = Factory.create(:instance, :deployment_id => deployment.id, :assembly_id => assembly.id, :template_id => nil)
Deployable.find(deployable).should_not be_destroyable
instance.state = Instance::STATE_STOPPED
instance.save!
Deployable.find(deployable).should be_destroyable
end
it "should not be destroyable when it has stopped stateful instances" do
deployable = Factory.build(:deployable)
deployment = Factory.build(:deployment, :deployable_id => deployable.id)
deployable.deployments << deployment
assembly = Factory.build(:assembly)
instance = Factory.build(:instance, :deployment_id => deployment.id, :assembly_id => assembly.id, :template_id => nil)
instance.stub!(:restartable?).and_return(true)
deployment.instances << instance
deployable.should_not be_destroyable
instance.state = Instance::STATE_STOPPED
deployable.should_not be_destroyable
instance.stub!(:restartable?).and_return(false)
deployable.should be_destroyable
end
end
|
535c3b80506c5e96abe1487836a96a2b576fb02a
|
TVGuide/app/models/channel.rb
|
TVGuide/app/models/channel.rb
|
class Channel < ApplicationRecord
belongs_to :category
has_many :shows
has_many :schedules, through: :shows
validates :name, presence: true, uniqueness: true
end
|
class Channel < ApplicationRecord
belongs_to :category
has_many :schedules, through: :shows
has_many :shows
validates :name, presence: true, uniqueness: true
end
|
Change order of attributes in Channel model.
|
Change order of attributes in Channel model.
|
Ruby
|
unlicense
|
hemi93/tv-guide,hemi93/tv-guide,hemi93/tv-guide,hemi93/tv-guide
|
ruby
|
## Code Before:
class Channel < ApplicationRecord
belongs_to :category
has_many :shows
has_many :schedules, through: :shows
validates :name, presence: true, uniqueness: true
end
## Instruction:
Change order of attributes in Channel model.
## Code After:
class Channel < ApplicationRecord
belongs_to :category
has_many :schedules, through: :shows
has_many :shows
validates :name, presence: true, uniqueness: true
end
|
d7d97b46bdcf96f1fc4d0ab7574c41266b52b90d
|
companies.json
|
companies.json
|
{
"bloggers": [
{
"jsonId": 10003,
"name": "Goyello",
"rss": "http://blog.goyello.com/pl/feed/",
"twitter": "@goyello"
},
{
"jsonId": 10004,
"name": "Pragmatists",
"rss": "http://pragmatists.pl/blog/feed/",
"twitter": "@pragmatists"
},
{
"jsonId": 10005,
"name": "Allegro.Tech",
"rss": "http://allegro.tech/feed.xml",
"twitter": "@AllegroTechBlog"
},
{
"jsonId": 10006,
"name": "ScalaC",
"rss": "http://blog.scalac.io/feeds/index.xml",
"twitter": "@scalac_io"
}
]
}
|
{
"bloggers": [
{
"jsonId": 10002,
"name": "TouK",
"rss": "http://touk.pl/blog/feed/",
"twitter": "@touk_pl"
},
{
"jsonId": 10003,
"name": "Goyello",
"rss": "http://blog.goyello.com/pl/feed/",
"twitter": "@goyello"
},
{
"jsonId": 10004,
"name": "Pragmatists",
"rss": "http://pragmatists.pl/blog/feed/",
"twitter": "@pragmatists"
},
{
"jsonId": 10005,
"name": "Allegro.Tech",
"rss": "http://allegro.tech/feed.xml",
"twitter": "@AllegroTechBlog"
},
{
"jsonId": 10006,
"name": "ScalaC",
"rss": "http://blog.scalac.io/feeds/index.xml",
"twitter": "@scalac_io"
}
]
}
|
Add TouK rss feed available via http
|
Add TouK rss feed available via http
|
JSON
|
mit
|
szpak/jvm-bloggers,sobkowiak/jvm-bloggers,szpak/jvm-bloggers,tdziurko/jvm-bloggers,tdziurko/jvm-bloggers,jvm-bloggers/jvm-bloggers,alien11689/jvm-bloggers,szpak/jvm-bloggers,jvm-bloggers/jvm-bloggers,tdziurko/jvm-bloggers,jvm-bloggers/jvm-bloggers,tdziurko/jvm-bloggers,alien11689/jvm-bloggers,jvm-bloggers/jvm-bloggers,sobkowiak/jvm-bloggers,kraluk/jvm-bloggers,kraluk/jvm-bloggers,kraluk/jvm-bloggers,szpak/jvm-bloggers,kraluk/jvm-bloggers
|
json
|
## Code Before:
{
"bloggers": [
{
"jsonId": 10003,
"name": "Goyello",
"rss": "http://blog.goyello.com/pl/feed/",
"twitter": "@goyello"
},
{
"jsonId": 10004,
"name": "Pragmatists",
"rss": "http://pragmatists.pl/blog/feed/",
"twitter": "@pragmatists"
},
{
"jsonId": 10005,
"name": "Allegro.Tech",
"rss": "http://allegro.tech/feed.xml",
"twitter": "@AllegroTechBlog"
},
{
"jsonId": 10006,
"name": "ScalaC",
"rss": "http://blog.scalac.io/feeds/index.xml",
"twitter": "@scalac_io"
}
]
}
## Instruction:
Add TouK rss feed available via http
## Code After:
{
"bloggers": [
{
"jsonId": 10002,
"name": "TouK",
"rss": "http://touk.pl/blog/feed/",
"twitter": "@touk_pl"
},
{
"jsonId": 10003,
"name": "Goyello",
"rss": "http://blog.goyello.com/pl/feed/",
"twitter": "@goyello"
},
{
"jsonId": 10004,
"name": "Pragmatists",
"rss": "http://pragmatists.pl/blog/feed/",
"twitter": "@pragmatists"
},
{
"jsonId": 10005,
"name": "Allegro.Tech",
"rss": "http://allegro.tech/feed.xml",
"twitter": "@AllegroTechBlog"
},
{
"jsonId": 10006,
"name": "ScalaC",
"rss": "http://blog.scalac.io/feeds/index.xml",
"twitter": "@scalac_io"
}
]
}
|
638dfd12436925ef992e3fbe3ce76bae53073bce
|
.travis.yml
|
.travis.yml
|
language: ruby
rvm:
- 2.3
- jruby
env:
global:
- JRUBY_OPTS=--dev
sudo: false
before_script:
- RAILS_ENV=test bundle exec rake --rakefile=test/dummy/Rakefile db:setup
matrix:
allow_failures:
- rvm: jruby
|
language: ruby
rvm:
- 2.3.1
- ruby-head
- jruby
env:
global:
- JRUBY_OPTS=--dev
sudo: false
before_script:
- RAILS_ENV=test bundle exec rake --rakefile=test/dummy/Rakefile db:setup
matrix:
allow_failures:
- rvm: jruby
|
Set specific Ruby 2.3.1 and test on ruby HEAD
|
Set specific Ruby 2.3.1 and test on ruby HEAD
|
YAML
|
mit
|
gsamokovarov/rvt,gsamokovarov/rvt,gsamokovarov/rvt
|
yaml
|
## Code Before:
language: ruby
rvm:
- 2.3
- jruby
env:
global:
- JRUBY_OPTS=--dev
sudo: false
before_script:
- RAILS_ENV=test bundle exec rake --rakefile=test/dummy/Rakefile db:setup
matrix:
allow_failures:
- rvm: jruby
## Instruction:
Set specific Ruby 2.3.1 and test on ruby HEAD
## Code After:
language: ruby
rvm:
- 2.3.1
- ruby-head
- jruby
env:
global:
- JRUBY_OPTS=--dev
sudo: false
before_script:
- RAILS_ENV=test bundle exec rake --rakefile=test/dummy/Rakefile db:setup
matrix:
allow_failures:
- rvm: jruby
|
b973c6abe4d325b08278822f85f72ebc1761a825
|
changes/constants.py
|
changes/constants.py
|
from enum import Enum
class Status(Enum):
unknown = 0
queued = 1
in_progress = 2
finished = 3
collecting_results = 4
def __str__(self):
return STATUS_LABELS[self]
class Result(Enum):
unknown = 0
passed = 1
failed = 2
skipped = 3
errored = 4
aborted = 5
timedout = 6
def __str__(self):
return RESULT_LABELS[self]
class Provider(Enum):
unknown = 0
koality = 'koality'
class Cause(Enum):
unknown = 0
manual = 1
push = 2
retry = 3
def __str__(self):
return CAUSE_LABELS[self]
STATUS_LABELS = {
Status.unknown: 'Unknown',
Status.queued: 'Queued',
Status.in_progress: 'In progress',
Status.finished: 'Finished'
}
RESULT_LABELS = {
Result.unknown: 'Unknown',
Result.passed: 'Passed',
Result.failed: 'Failed',
Result.skipped: 'Skipped',
Result.errored: 'Errored',
Result.aborted: 'Aborted',
Result.timedout: 'Timed out'
}
CAUSE_LABELS = {
Cause.unknown: 'Unknown',
Cause.manual: 'Manual',
Cause.push: 'Code Push',
Cause.retry: 'Retry',
}
|
from enum import Enum
class Status(Enum):
unknown = 0
queued = 1
in_progress = 2
finished = 3
collecting_results = 4
def __str__(self):
return STATUS_LABELS[self]
class Result(Enum):
unknown = 0
passed = 1
failed = 2
skipped = 3
aborted = 5
timedout = 6
def __str__(self):
return RESULT_LABELS[self]
class Provider(Enum):
unknown = 0
koality = 'koality'
class Cause(Enum):
unknown = 0
manual = 1
push = 2
retry = 3
def __str__(self):
return CAUSE_LABELS[self]
STATUS_LABELS = {
Status.unknown: 'Unknown',
Status.queued: 'Queued',
Status.in_progress: 'In progress',
Status.finished: 'Finished'
}
RESULT_LABELS = {
Result.unknown: 'Unknown',
Result.passed: 'Passed',
Result.failed: 'Failed',
Result.skipped: 'Skipped',
Result.aborted: 'Aborted',
Result.timedout: 'Timed out'
}
CAUSE_LABELS = {
Cause.unknown: 'Unknown',
Cause.manual: 'Manual',
Cause.push: 'Code Push',
Cause.retry: 'Retry',
}
|
Remove errored state (lets rely on a single failure state)
|
Remove errored state (lets rely on a single failure state)
|
Python
|
apache-2.0
|
bowlofstew/changes,dropbox/changes,wfxiang08/changes,dropbox/changes,bowlofstew/changes,bowlofstew/changes,wfxiang08/changes,dropbox/changes,dropbox/changes,wfxiang08/changes,wfxiang08/changes,bowlofstew/changes
|
python
|
## Code Before:
from enum import Enum
class Status(Enum):
unknown = 0
queued = 1
in_progress = 2
finished = 3
collecting_results = 4
def __str__(self):
return STATUS_LABELS[self]
class Result(Enum):
unknown = 0
passed = 1
failed = 2
skipped = 3
errored = 4
aborted = 5
timedout = 6
def __str__(self):
return RESULT_LABELS[self]
class Provider(Enum):
unknown = 0
koality = 'koality'
class Cause(Enum):
unknown = 0
manual = 1
push = 2
retry = 3
def __str__(self):
return CAUSE_LABELS[self]
STATUS_LABELS = {
Status.unknown: 'Unknown',
Status.queued: 'Queued',
Status.in_progress: 'In progress',
Status.finished: 'Finished'
}
RESULT_LABELS = {
Result.unknown: 'Unknown',
Result.passed: 'Passed',
Result.failed: 'Failed',
Result.skipped: 'Skipped',
Result.errored: 'Errored',
Result.aborted: 'Aborted',
Result.timedout: 'Timed out'
}
CAUSE_LABELS = {
Cause.unknown: 'Unknown',
Cause.manual: 'Manual',
Cause.push: 'Code Push',
Cause.retry: 'Retry',
}
## Instruction:
Remove errored state (lets rely on a single failure state)
## Code After:
from enum import Enum
class Status(Enum):
unknown = 0
queued = 1
in_progress = 2
finished = 3
collecting_results = 4
def __str__(self):
return STATUS_LABELS[self]
class Result(Enum):
unknown = 0
passed = 1
failed = 2
skipped = 3
aborted = 5
timedout = 6
def __str__(self):
return RESULT_LABELS[self]
class Provider(Enum):
unknown = 0
koality = 'koality'
class Cause(Enum):
unknown = 0
manual = 1
push = 2
retry = 3
def __str__(self):
return CAUSE_LABELS[self]
STATUS_LABELS = {
Status.unknown: 'Unknown',
Status.queued: 'Queued',
Status.in_progress: 'In progress',
Status.finished: 'Finished'
}
RESULT_LABELS = {
Result.unknown: 'Unknown',
Result.passed: 'Passed',
Result.failed: 'Failed',
Result.skipped: 'Skipped',
Result.aborted: 'Aborted',
Result.timedout: 'Timed out'
}
CAUSE_LABELS = {
Cause.unknown: 'Unknown',
Cause.manual: 'Manual',
Cause.push: 'Code Push',
Cause.retry: 'Retry',
}
|
307b568d320ab964e6f7dabe9dca5370aa9b0f6f
|
README.rst
|
README.rst
|
SMS sender
==========
A library for sending SMS messages through various service providers (gateways).
The currently supported gateways are:
- `Nexmo <http://nexmo.com/>`_
- `BulkSms <http://bulksms.com/>`_
- `ProSms.eu <http://pro-sms.eu/>`_
Usage
-----
Example::
<?php
use Devture\Component\SmsSender\Gateway\NexmoGateway;
use Devture\Component\SmsSender\Message;
$message = new Message('sender-name', 'receiver-phone-number', 'message text');
$gateway = new NexmoGateway('username', 'password');
$gateway->send($message);
echo 'Account Balance is: ', $gateway->getBalance();
|
SMS sender
==========
A library for sending SMS messages through various service providers (gateways).
The currently supported gateways are:
- `Nexmo <http://nexmo.com/>`_
- `BulkSms <http://bulksms.com/>`_
- `ProSms.eu <http://pro-sms.eu/>`_
Usage
-----
Example::
<?php
use Devture\Component\SmsSender\Message;
use Devture\Component\SmsSender\Gateway\NexmoGateway;
$message = new Message('sender-name', 'receiver-phone-number', 'message text');
$gateway = new NexmoGateway('username', 'password');
$gateway->send($message);
echo 'Account Balance is: ', $gateway->getBalance();
|
Fix use statements order in example
|
Fix use statements order in example
|
reStructuredText
|
bsd-3-clause
|
spantaleev/sms-sender
|
restructuredtext
|
## Code Before:
SMS sender
==========
A library for sending SMS messages through various service providers (gateways).
The currently supported gateways are:
- `Nexmo <http://nexmo.com/>`_
- `BulkSms <http://bulksms.com/>`_
- `ProSms.eu <http://pro-sms.eu/>`_
Usage
-----
Example::
<?php
use Devture\Component\SmsSender\Gateway\NexmoGateway;
use Devture\Component\SmsSender\Message;
$message = new Message('sender-name', 'receiver-phone-number', 'message text');
$gateway = new NexmoGateway('username', 'password');
$gateway->send($message);
echo 'Account Balance is: ', $gateway->getBalance();
## Instruction:
Fix use statements order in example
## Code After:
SMS sender
==========
A library for sending SMS messages through various service providers (gateways).
The currently supported gateways are:
- `Nexmo <http://nexmo.com/>`_
- `BulkSms <http://bulksms.com/>`_
- `ProSms.eu <http://pro-sms.eu/>`_
Usage
-----
Example::
<?php
use Devture\Component\SmsSender\Message;
use Devture\Component\SmsSender\Gateway\NexmoGateway;
$message = new Message('sender-name', 'receiver-phone-number', 'message text');
$gateway = new NexmoGateway('username', 'password');
$gateway->send($message);
echo 'Account Balance is: ', $gateway->getBalance();
|
44e062dd5f302c5eed66e2d54858e1b8f78b745b
|
src/data.py
|
src/data.py
|
import csv
import datetime
class Row(dict):
def __init__(self, *args, **kwargs):
super(Row, self).__init__(*args, **kwargs)
self._start_date = None
self._end_date = None
def _cast_date(self, s):
if not s:
return None
return datetime.datetime.strptime(s, '%m/%d/%Y').date()
def _get_date_or_cast(self, s, attr):
if getattr(self, attr) is None:
setattr(self, attr, self._cast_date(s))
return getattr(self, attr)
@property
def start_date(self):
return self._get_date_or_cast(
self['DATE ISSUED'],
'_start_date',
)
@property
def end_date(self):
return self._get_date_or_cast(
self['LICENSE TERM EXPIRATION DATE'],
'_end_date',
)
@property
def account_number(self):
return self['ACCOUNT NUMBER']
@property
def neighborhood(self):
return self['NEIGHBORHOOD']
class RawReader(csv.DictReader):
def __iter__(self, *args, **kwargs):
row = self.next()
while row:
yield Row(row)
row = self.next()
class RawWriter(csv.DictWriter):
pass
|
import csv
import datetime
class Row(dict):
def __init__(self, *args, **kwargs):
super(Row, self).__init__(*args, **kwargs)
self._start_date = None
self._end_date = None
def _cast_date(self, s):
if not s:
return None
return datetime.datetime.strptime(s, '%m/%d/%Y').date()
def _get_date_or_cast(self, s, attr):
if getattr(self, attr) is None:
setattr(self, attr, self._cast_date(s))
return getattr(self, attr)
@property
def start_date(self):
return self._get_date_or_cast(
self['DATE ISSUED'],
'_start_date',
)
@property
def end_date(self):
return self._get_date_or_cast(
self['LICENSE TERM EXPIRATION DATE'],
'_end_date',
)
@property
def application_type(self):
return self['APPLICATION TYPE']
@property
def account_number(self):
return self['ACCOUNT NUMBER']
@property
def site_number(self):
return self['SITE NUMBER']
@property
def neighborhood(self):
return self['NEIGHBORHOOD']
class RawReader(csv.DictReader):
def __iter__(self, *args, **kwargs):
row = self.next()
while row:
yield Row(row)
row = self.next()
class RawWriter(csv.DictWriter):
pass
|
Add site number and application type to properties. For better filtering of new and old biz.
|
Add site number and application type to properties. For better filtering of new and old biz.
|
Python
|
unlicense
|
datascopeanalytics/chicago-new-business,datascopeanalytics/chicago-new-business
|
python
|
## Code Before:
import csv
import datetime
class Row(dict):
def __init__(self, *args, **kwargs):
super(Row, self).__init__(*args, **kwargs)
self._start_date = None
self._end_date = None
def _cast_date(self, s):
if not s:
return None
return datetime.datetime.strptime(s, '%m/%d/%Y').date()
def _get_date_or_cast(self, s, attr):
if getattr(self, attr) is None:
setattr(self, attr, self._cast_date(s))
return getattr(self, attr)
@property
def start_date(self):
return self._get_date_or_cast(
self['DATE ISSUED'],
'_start_date',
)
@property
def end_date(self):
return self._get_date_or_cast(
self['LICENSE TERM EXPIRATION DATE'],
'_end_date',
)
@property
def account_number(self):
return self['ACCOUNT NUMBER']
@property
def neighborhood(self):
return self['NEIGHBORHOOD']
class RawReader(csv.DictReader):
def __iter__(self, *args, **kwargs):
row = self.next()
while row:
yield Row(row)
row = self.next()
class RawWriter(csv.DictWriter):
pass
## Instruction:
Add site number and application type to properties. For better filtering of new and old biz.
## Code After:
import csv
import datetime
class Row(dict):
def __init__(self, *args, **kwargs):
super(Row, self).__init__(*args, **kwargs)
self._start_date = None
self._end_date = None
def _cast_date(self, s):
if not s:
return None
return datetime.datetime.strptime(s, '%m/%d/%Y').date()
def _get_date_or_cast(self, s, attr):
if getattr(self, attr) is None:
setattr(self, attr, self._cast_date(s))
return getattr(self, attr)
@property
def start_date(self):
return self._get_date_or_cast(
self['DATE ISSUED'],
'_start_date',
)
@property
def end_date(self):
return self._get_date_or_cast(
self['LICENSE TERM EXPIRATION DATE'],
'_end_date',
)
@property
def application_type(self):
return self['APPLICATION TYPE']
@property
def account_number(self):
return self['ACCOUNT NUMBER']
@property
def site_number(self):
return self['SITE NUMBER']
@property
def neighborhood(self):
return self['NEIGHBORHOOD']
class RawReader(csv.DictReader):
def __iter__(self, *args, **kwargs):
row = self.next()
while row:
yield Row(row)
row = self.next()
class RawWriter(csv.DictWriter):
pass
|
0de1054fa5565661179195ddc23b9bed0e401687
|
.travis.yml
|
.travis.yml
|
language: minimal
sudo: required
env:
matrix:
- OS_VERSION=6 OSG_VERSION=3.4
- OS_VERSION=7 OSG_VERSION=3.4
services:
- docker
before_install:
- sudo docker pull centos:centos${OS_VERSION}
script:
# Run tests in Container
- sudo docker run --rm=true -v `pwd`:/osg-configure:rw centos:centos${OS_VERSION} /bin/bash /osg-configure/tests/test_inside_docker.sh ${OS_VERSION} ${OSG_VERSION}
|
language: minimal
sudo: required
env:
matrix:
- OS_VERSION=7 OSG_VERSION=3.5
services:
- docker
before_install:
- sudo docker pull centos:centos${OS_VERSION}
script:
# Run tests in Container
- sudo docker run --rm=true -v `pwd`:/osg-configure:rw centos:centos${OS_VERSION} /bin/bash /osg-configure/tests/test_inside_docker.sh ${OS_VERSION} ${OSG_VERSION}
|
Drop el6 from CI and use 3.5
|
Drop el6 from CI and use 3.5
|
YAML
|
apache-2.0
|
matyasselmeci/osg-configure,opensciencegrid/osg-configure,opensciencegrid/osg-configure,matyasselmeci/osg-configure
|
yaml
|
## Code Before:
language: minimal
sudo: required
env:
matrix:
- OS_VERSION=6 OSG_VERSION=3.4
- OS_VERSION=7 OSG_VERSION=3.4
services:
- docker
before_install:
- sudo docker pull centos:centos${OS_VERSION}
script:
# Run tests in Container
- sudo docker run --rm=true -v `pwd`:/osg-configure:rw centos:centos${OS_VERSION} /bin/bash /osg-configure/tests/test_inside_docker.sh ${OS_VERSION} ${OSG_VERSION}
## Instruction:
Drop el6 from CI and use 3.5
## Code After:
language: minimal
sudo: required
env:
matrix:
- OS_VERSION=7 OSG_VERSION=3.5
services:
- docker
before_install:
- sudo docker pull centos:centos${OS_VERSION}
script:
# Run tests in Container
- sudo docker run --rm=true -v `pwd`:/osg-configure:rw centos:centos${OS_VERSION} /bin/bash /osg-configure/tests/test_inside_docker.sh ${OS_VERSION} ${OSG_VERSION}
|
36356df84615ef4d29638a437c74765e5aeb7219
|
docs/Powered-by-Mesos.md
|
docs/Powered-by-Mesos.md
|
Organizations using Mesos:
* [Airbnb](http://www.airbnb.com)
* [Categorize](http://categorize.co)
* [CloudPhysics](http://cloudphysics.com)
* [Conviva](http://www.conviva.com)
* [MediaCrossing](http://www.mediacrossing.com)
* [Sharethrough](http://www.sharethrough.com)
* [Twitter](http://www.twitter.com)
* [UCSF](http://www.ucsf.edu)
* [UC Berkeley](http://www.berkeley.edu)
* [Xogito](http://www.xogito.com)
Software projects built on Mesos:
* [Spark](http://spark.incubator.apache.org/) cluster computing framework
Spark
http://spark.incubator.apache.org/index.html
* [Chronos](https://github.com/airbnb/chronos)
* [Marathon](https://github.com/mesosphere/marathon)
If you're using Mesos, please add yourself to the list above, or email [email protected] and we'll add you!
|
Organizations using Mesos:
* [Airbnb](http://www.airbnb.com)
* [Categorize](http://categorize.co)
* [CloudPhysics](http://cloudphysics.com)
* [Conviva](http://www.conviva.com)
* [iQIYI](http://www.iqiyi.com/)
* [MediaCrossing](http://www.mediacrossing.com)
* [Sharethrough](http://www.sharethrough.com)
* [Twitter](http://www.twitter.com)
* [UCSF](http://www.ucsf.edu)
* [UC Berkeley](http://www.berkeley.edu)
* [Xogito](http://www.xogito.com)
Software projects built on Mesos:
* [Spark](http://spark.incubator.apache.org/) cluster computing framework
Spark
http://spark.incubator.apache.org/index.html
* [Chronos](https://github.com/airbnb/chronos)
* [Marathon](https://github.com/mesosphere/marathon)
If you're using Mesos, please add yourself to the list above, or email [email protected] and we'll add you!
|
Add iQIYI to the Powered by Mesos list.
|
Add iQIYI to the Powered by Mesos list.
|
Markdown
|
apache-2.0
|
Gilbert88/mesos,Aman-Jain-14/customizedMesos,andschwa/mesos,chhsia0/mesos,neilconway/mesos,verizonlabs/mesos,zmalik/mesos,Aman-Jain-14/customizedMesos,chhsia0/mesos,asamerh4/mesos,abudnik/mesos,dforsyth/mesos,Gilbert88/mesos,kaysoky/mesos,dforsyth/mesos,andschwa/mesos,dforsyth/mesos,zmalik/mesos,Aman-Jain-14/customizedMesos,jpeach/mesos,craimbert/mesos,asamerh4/mesos,jpeach/mesos,gsantovena/mesos,dforsyth/mesos,reneploetz/mesos,neilconway/mesos,kaysoky/mesos,andschwa/mesos,yuquanshan/customizedMesos,yuquanshan/customizedMesos,Gilbert88/mesos,neilconway/mesos,craimbert/mesos,zmalik/mesos,yuquanshan/customizedMesos,kaysoky/mesos,chhsia0/mesos,jpeach/mesos,gsantovena/mesos,neilconway/mesos,verizonlabs/mesos,chhsia0/mesos,verizonlabs/mesos,craimbert/mesos,kaysoky/mesos,abudnik/mesos,jpeach/mesos,asamerh4/mesos,kaysoky/mesos,verizonlabs/mesos,zmalik/mesos,dforsyth/mesos,gsantovena/mesos,neilconway/mesos,Gilbert88/mesos,andschwa/mesos,kaysoky/mesos,verizonlabs/mesos,abudnik/mesos,zmalik/mesos,zmalik/mesos,abudnik/mesos,jpeach/mesos,gsantovena/mesos,asamerh4/mesos,craimbert/mesos,gsantovena/mesos,Gilbert88/mesos,neilconway/mesos,craimbert/mesos,asamerh4/mesos,asamerh4/mesos,andschwa/mesos,jpeach/mesos,asamerh4/mesos,craimbert/mesos,gsantovena/mesos,verizonlabs/mesos,abudnik/mesos,kaysoky/mesos,Gilbert88/mesos,reneploetz/mesos,chhsia0/mesos,gsantovena/mesos,dforsyth/mesos,chhsia0/mesos,Aman-Jain-14/customizedMesos,yuquanshan/customizedMesos,reneploetz/mesos,Aman-Jain-14/customizedMesos,craimbert/mesos,Gilbert88/mesos,reneploetz/mesos,yuquanshan/customizedMesos,Aman-Jain-14/customizedMesos,yuquanshan/customizedMesos,verizonlabs/mesos,andschwa/mesos,neilconway/mesos,zmalik/mesos,reneploetz/mesos,yuquanshan/customizedMesos,reneploetz/mesos,jpeach/mesos,dforsyth/mesos,abudnik/mesos,abudnik/mesos,reneploetz/mesos,Aman-Jain-14/customizedMesos,andschwa/mesos,chhsia0/mesos
|
markdown
|
## Code Before:
Organizations using Mesos:
* [Airbnb](http://www.airbnb.com)
* [Categorize](http://categorize.co)
* [CloudPhysics](http://cloudphysics.com)
* [Conviva](http://www.conviva.com)
* [MediaCrossing](http://www.mediacrossing.com)
* [Sharethrough](http://www.sharethrough.com)
* [Twitter](http://www.twitter.com)
* [UCSF](http://www.ucsf.edu)
* [UC Berkeley](http://www.berkeley.edu)
* [Xogito](http://www.xogito.com)
Software projects built on Mesos:
* [Spark](http://spark.incubator.apache.org/) cluster computing framework
Spark
http://spark.incubator.apache.org/index.html
* [Chronos](https://github.com/airbnb/chronos)
* [Marathon](https://github.com/mesosphere/marathon)
If you're using Mesos, please add yourself to the list above, or email [email protected] and we'll add you!
## Instruction:
Add iQIYI to the Powered by Mesos list.
## Code After:
Organizations using Mesos:
* [Airbnb](http://www.airbnb.com)
* [Categorize](http://categorize.co)
* [CloudPhysics](http://cloudphysics.com)
* [Conviva](http://www.conviva.com)
* [iQIYI](http://www.iqiyi.com/)
* [MediaCrossing](http://www.mediacrossing.com)
* [Sharethrough](http://www.sharethrough.com)
* [Twitter](http://www.twitter.com)
* [UCSF](http://www.ucsf.edu)
* [UC Berkeley](http://www.berkeley.edu)
* [Xogito](http://www.xogito.com)
Software projects built on Mesos:
* [Spark](http://spark.incubator.apache.org/) cluster computing framework
Spark
http://spark.incubator.apache.org/index.html
* [Chronos](https://github.com/airbnb/chronos)
* [Marathon](https://github.com/mesosphere/marathon)
If you're using Mesos, please add yourself to the list above, or email [email protected] and we'll add you!
|
743ac57c0d62e01b9a8ad3481792271d91a0dcce
|
.travis.yml
|
.travis.yml
|
language: elixir
env:
- NODE_VERSION="6.10.2"
before_install:
- mix local.rebar --force
- mix local.hex --force
- nvm install $NODE_VERSION
install:
- mix deps.get
- npm install
elixir:
- 1.6.3
matrix:
include:
- elixir: 1.6
- elixir: 1.5
- elixir: 1.4
- elixir: 1.3
- elixir: 1.2
|
language: elixir
env:
- NODE_VERSION="6.10.2"
before_install:
- mix local.rebar --force
- mix local.hex --force
- nvm install $NODE_VERSION
install:
- mix deps.get
- npm install
elixir:
- 1.6.3
matrix:
include:
- elixir: 1.6
- elixir: 1.5
- elixir: 1.4
- elixir: 1.3
|
Remove Elixir v1.2 from tests
|
Remove Elixir v1.2 from tests
|
YAML
|
mit
|
geolessel/react-phoenix
|
yaml
|
## Code Before:
language: elixir
env:
- NODE_VERSION="6.10.2"
before_install:
- mix local.rebar --force
- mix local.hex --force
- nvm install $NODE_VERSION
install:
- mix deps.get
- npm install
elixir:
- 1.6.3
matrix:
include:
- elixir: 1.6
- elixir: 1.5
- elixir: 1.4
- elixir: 1.3
- elixir: 1.2
## Instruction:
Remove Elixir v1.2 from tests
## Code After:
language: elixir
env:
- NODE_VERSION="6.10.2"
before_install:
- mix local.rebar --force
- mix local.hex --force
- nvm install $NODE_VERSION
install:
- mix deps.get
- npm install
elixir:
- 1.6.3
matrix:
include:
- elixir: 1.6
- elixir: 1.5
- elixir: 1.4
- elixir: 1.3
|
b27471e3ae289e4b3e97302ff7a5e9cc4ace59e8
|
src/Result.js
|
src/Result.js
|
'use strict'
var chalk = require('chalk')
var deepEqual = require('deep-equal')
var indent = require('./indent')
var os = require('os')
const CHECK = '\u2713'
const CROSS = '\u2717'
const PASS_COLOR = 'green'
const FAIL_COLOR = 'red'
module.exports = class Result {
constructor (runnable, options) {
options = options || {}
this.runnable = runnable
this.error = options.error
this.results = options.results || []
this.actual = options.actual
this.expected = options.expected
if (this.error === undefined && (this.actual !== undefined || this.expected !== undefined)) {
this.error = !deepEqual(this.actual, this.expected, { strict: true })
}
}
isErroring () {
return Boolean(this.error || this.results.some(result => result.isErroring()))
}
toString () {
var isErroring = this.isErroring()
var status = isErroring ? CROSS : CHECK
var color = isErroring ? FAIL_COLOR : PASS_COLOR
return chalk[color](`${status} ${this.runnable}`)
}
toTree () {
var indented = this.results.map(result => indent(result.toTree()))
return [this.toString()].concat(indented).join(os.EOL)
}
}
|
'use strict'
var chalk = require('chalk')
var deepEqual = require('deep-equal')
var indent = require('./indent')
var os = require('os')
const CHECK = '✓'
const CROSS = '✗'
const PASS_COLOR = 'green'
const FAIL_COLOR = 'red'
module.exports = class Result {
constructor (runnable, options) {
options = options || {}
this.runnable = runnable
this.error = options.error
this.results = options.results || []
this.actual = options.actual
this.expected = options.expected
if (this.error === undefined && (this.actual !== undefined || this.expected !== undefined)) {
this.error = !deepEqual(this.actual, this.expected, { strict: true })
}
}
isErroring () {
return Boolean(this.error || this.results.some(result => result.isErroring()))
}
toString () {
var isErroring = this.isErroring()
var status = isErroring ? CROSS : CHECK
var color = isErroring ? FAIL_COLOR : PASS_COLOR
return chalk[color](`${status} ${this.runnable}`)
}
toTree () {
var indented = this.results.map(result => indent(result.toTree()))
return [this.toString()].concat(indented).join(os.EOL)
}
}
|
Use unicode special characters directly in source
|
Use unicode special characters directly in source
|
JavaScript
|
isc
|
nickmccurdy/purespec
|
javascript
|
## Code Before:
'use strict'
var chalk = require('chalk')
var deepEqual = require('deep-equal')
var indent = require('./indent')
var os = require('os')
const CHECK = '\u2713'
const CROSS = '\u2717'
const PASS_COLOR = 'green'
const FAIL_COLOR = 'red'
module.exports = class Result {
constructor (runnable, options) {
options = options || {}
this.runnable = runnable
this.error = options.error
this.results = options.results || []
this.actual = options.actual
this.expected = options.expected
if (this.error === undefined && (this.actual !== undefined || this.expected !== undefined)) {
this.error = !deepEqual(this.actual, this.expected, { strict: true })
}
}
isErroring () {
return Boolean(this.error || this.results.some(result => result.isErroring()))
}
toString () {
var isErroring = this.isErroring()
var status = isErroring ? CROSS : CHECK
var color = isErroring ? FAIL_COLOR : PASS_COLOR
return chalk[color](`${status} ${this.runnable}`)
}
toTree () {
var indented = this.results.map(result => indent(result.toTree()))
return [this.toString()].concat(indented).join(os.EOL)
}
}
## Instruction:
Use unicode special characters directly in source
## Code After:
'use strict'
var chalk = require('chalk')
var deepEqual = require('deep-equal')
var indent = require('./indent')
var os = require('os')
const CHECK = '✓'
const CROSS = '✗'
const PASS_COLOR = 'green'
const FAIL_COLOR = 'red'
module.exports = class Result {
constructor (runnable, options) {
options = options || {}
this.runnable = runnable
this.error = options.error
this.results = options.results || []
this.actual = options.actual
this.expected = options.expected
if (this.error === undefined && (this.actual !== undefined || this.expected !== undefined)) {
this.error = !deepEqual(this.actual, this.expected, { strict: true })
}
}
isErroring () {
return Boolean(this.error || this.results.some(result => result.isErroring()))
}
toString () {
var isErroring = this.isErroring()
var status = isErroring ? CROSS : CHECK
var color = isErroring ? FAIL_COLOR : PASS_COLOR
return chalk[color](`${status} ${this.runnable}`)
}
toTree () {
var indented = this.results.map(result => indent(result.toTree()))
return [this.toString()].concat(indented).join(os.EOL)
}
}
|
6607c4271f7de0dfaafb2ff6eec2fe4360a1e051
|
app/concerns/action_controller/cors_protection.rb
|
app/concerns/action_controller/cors_protection.rb
|
module ActionController::CORSProtection
extend ActiveSupport::Concern
included do
before_filter :allow_cors
after_filter :set_cors
end
def set_cors
if $settings[:api][:v2][:cors].include?(request.env['HTTP_ORIGIN'])
headers["Access-Control-Allow-Origin"] = request.env['HTTP_ORIGIN']
headers["Access-Control-Allow-Credentials"] = "true"
headers["Access-Control-Allow-Headers"] = "*"
headers["Access-Control-Allow-Methods"] = [
"GET", "POST", "PUT", "DELETE", "PATCH", "OPTIONS"
].join(",")
headers["Access-Control-Allow-Headers"] = [
"accept", "content-type", "referer",
"origin", "connection", "host",
"user-agent"
].join(",") if request.env['HTTP_ACCESS_CONTROL_REQUEST_METHOD'] == "POST"
end
end
def allow_cors
if request.request_method == "OPTIONS"
set_cors
head(:ok)
end
end
end
|
module ActionController::CORSProtection
extend ActiveSupport::Concern
included do
before_filter :allow_cors
after_filter :set_cors
end
def set_cors
if $settings[:api][:v2][:cors].include?(request.env['HTTP_ORIGIN'])
headers["Access-Control-Allow-Origin"] = request.env['HTTP_ORIGIN']
headers["Access-Control-Allow-Credentials"] = "true"
headers["Access-Control-Allow-Headers"] = "*"
headers["Access-Control-Allow-Methods"] = [
"GET", "POST", "PUT", "DELETE", "PATCH", "OPTIONS"
].join(",")
headers["Access-Control-Allow-Headers"] = [
"accept", "content-type", "referer",
"origin", "connection", "host",
"user-agent"
].join(",") if ["POST", "PUT", "PATCH"].include?(
request.env['HTTP_ACCESS_CONTROL_REQUEST_METHOD']
)
end
end
def allow_cors
if request.request_method == "OPTIONS"
set_cors
head(:ok)
end
end
end
|
Fix core for PUT & PATCH
|
Fix core for PUT & PATCH
|
Ruby
|
mit
|
cloudsdaleapp/cloudsdale-web,cloudsdaleapp/cloudsdale-web,cloudsdaleapp/cloudsdale-web,cloudsdaleapp/cloudsdale-web
|
ruby
|
## Code Before:
module ActionController::CORSProtection
extend ActiveSupport::Concern
included do
before_filter :allow_cors
after_filter :set_cors
end
def set_cors
if $settings[:api][:v2][:cors].include?(request.env['HTTP_ORIGIN'])
headers["Access-Control-Allow-Origin"] = request.env['HTTP_ORIGIN']
headers["Access-Control-Allow-Credentials"] = "true"
headers["Access-Control-Allow-Headers"] = "*"
headers["Access-Control-Allow-Methods"] = [
"GET", "POST", "PUT", "DELETE", "PATCH", "OPTIONS"
].join(",")
headers["Access-Control-Allow-Headers"] = [
"accept", "content-type", "referer",
"origin", "connection", "host",
"user-agent"
].join(",") if request.env['HTTP_ACCESS_CONTROL_REQUEST_METHOD'] == "POST"
end
end
def allow_cors
if request.request_method == "OPTIONS"
set_cors
head(:ok)
end
end
end
## Instruction:
Fix core for PUT & PATCH
## Code After:
module ActionController::CORSProtection
extend ActiveSupport::Concern
included do
before_filter :allow_cors
after_filter :set_cors
end
def set_cors
if $settings[:api][:v2][:cors].include?(request.env['HTTP_ORIGIN'])
headers["Access-Control-Allow-Origin"] = request.env['HTTP_ORIGIN']
headers["Access-Control-Allow-Credentials"] = "true"
headers["Access-Control-Allow-Headers"] = "*"
headers["Access-Control-Allow-Methods"] = [
"GET", "POST", "PUT", "DELETE", "PATCH", "OPTIONS"
].join(",")
headers["Access-Control-Allow-Headers"] = [
"accept", "content-type", "referer",
"origin", "connection", "host",
"user-agent"
].join(",") if ["POST", "PUT", "PATCH"].include?(
request.env['HTTP_ACCESS_CONTROL_REQUEST_METHOD']
)
end
end
def allow_cors
if request.request_method == "OPTIONS"
set_cors
head(:ok)
end
end
end
|
b63c4008cffdc5b919c0b7c43c8f1487541ccd51
|
haval-algorithm/src/test/java/com/m4gik/util/UtilTest.java
|
haval-algorithm/src/test/java/com/m4gik/util/UtilTest.java
|
package com.m4gik.util;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.notNullValue;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
/**
*
* This class contains JUnit tests for class {@link Util}.
*
* @author Michał Szczygieł <[email protected]>
*
*/
@RunWith(Parameterized.class)
public class UtilTest {
@Parameterized.Parameters
public static Collection<Object[]> data() {
return Arrays.asList(new Object[][] {
{ new String("test").getBytes() },
{ new Integer(8).byteValue() } });
}
private final Object input;
public UtilTest(Object input) {
this.input = input;
}
@Test
public void testInputIsNotNull() {
assertThat(input, is(notNullValue()));
}
}
|
package com.m4gik.util;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.notNullValue;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
/**
*
* This class contains JUnit tests for class {@link Util}.
*
* @author Michał Szczygieł <[email protected]>
*
*/
@RunWith(Parameterized.class)
public class UtilTest {
@Parameterized.Parameters
public static Collection<Object[]> data() {
return Arrays
.asList(new Object[][] {
{ new String("713502673d67e5fa557629a71d331945")
.getBytes() },
{ new String("6eece560a2e8d6b919e81fe91b0e7156")
.getBytes() }, });
}
private final Object input;
public UtilTest(Object input) {
this.input = input;
}
@Test
public void testInputIsNotNull() {
assertThat(input, is(notNullValue()));
}
@Test(
expected = NumberFormatException.class)
public void testThrowIfGivenStingIsNotHexValue() {
Util.toString((byte[]) input);
}
}
|
Create TDD test for Util class.
|
Create TDD test for Util class.
|
Java
|
mit
|
M4GiK/tosi-projects,M4GiK/tosi-projects
|
java
|
## Code Before:
package com.m4gik.util;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.notNullValue;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
/**
*
* This class contains JUnit tests for class {@link Util}.
*
* @author Michał Szczygieł <[email protected]>
*
*/
@RunWith(Parameterized.class)
public class UtilTest {
@Parameterized.Parameters
public static Collection<Object[]> data() {
return Arrays.asList(new Object[][] {
{ new String("test").getBytes() },
{ new Integer(8).byteValue() } });
}
private final Object input;
public UtilTest(Object input) {
this.input = input;
}
@Test
public void testInputIsNotNull() {
assertThat(input, is(notNullValue()));
}
}
## Instruction:
Create TDD test for Util class.
## Code After:
package com.m4gik.util;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.notNullValue;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
/**
*
* This class contains JUnit tests for class {@link Util}.
*
* @author Michał Szczygieł <[email protected]>
*
*/
@RunWith(Parameterized.class)
public class UtilTest {
@Parameterized.Parameters
public static Collection<Object[]> data() {
return Arrays
.asList(new Object[][] {
{ new String("713502673d67e5fa557629a71d331945")
.getBytes() },
{ new String("6eece560a2e8d6b919e81fe91b0e7156")
.getBytes() }, });
}
private final Object input;
public UtilTest(Object input) {
this.input = input;
}
@Test
public void testInputIsNotNull() {
assertThat(input, is(notNullValue()));
}
@Test(
expected = NumberFormatException.class)
public void testThrowIfGivenStingIsNotHexValue() {
Util.toString((byte[]) input);
}
}
|
f092f68b5df40e7915d838650db7c9fa147cce18
|
lib/seedData/index.js
|
lib/seedData/index.js
|
var path = require( 'path' )
, fs = require( 'fs' )
, fileNames = [ 'seedData' ]
, packageJson = require( path.resolve( __dirname + '/../../' ) + '/package.json' )
, seedData = require( path.resolve( path.join( __dirname, '..', '..', 'schema', 'seedData.json' ) ) );
packageJson.bundledDependencies.forEach(function( moduleName ) {
var moduleConfigPath = [ path.resolve( __dirname + '/../../modules' ), moduleName, 'schema', '' ].join( path.sep );
fileNames.forEach(function( fileName ) {
var filePath = moduleConfigPath + fileName + '.json';
if ( fs.existsSync( filePath ) ) {
var data = require( filePath );
Object.keys( data ).forEach( function( key ) {
if ( !!seedData[ key ] ) {
seedData[ key ] = seedData[ key ].concat( data[ key ] );
} else {
seedData[ key ] = data[ key ];
}
})
}
});
});
module.exports = seedData;
|
var path = require('path')
, fs = require('fs')
, packageJson = require(path.resolve(__dirname + '/../../') + '/package.json')
, seedData = {};
function loadSeedDataFile(filePath) {
var data = require(filePath);
Object.keys(data).forEach(function(key) {
if (!!seedData[key]) {
seedData[key] = seedData[key].concat(data[key]);
} else {
seedData[key] = data[key];
}
});
}
packageJson.bundledDependencies.forEach(function(moduleName) {
var filePath = [path.resolve(__dirname + '/../../modules'), moduleName, 'schema', 'seedData.json'].join(path.sep)
if (fs.existsSync(filePath)) {
loadSeedDataFile(filePath);
}
});
loadSeedDataFile(path.resolve(path.join(__dirname, '..', '..', 'schema', 'seedData.json')));
module.exports = seedData;
|
Order of seedData now places the main schema json after all other modules plus refactoring
|
fix(seedData): Order of seedData now places the main schema json after all other modules plus refactoring
|
JavaScript
|
mit
|
CleverStack/node-seed,modulexcite/node-seed
|
javascript
|
## Code Before:
var path = require( 'path' )
, fs = require( 'fs' )
, fileNames = [ 'seedData' ]
, packageJson = require( path.resolve( __dirname + '/../../' ) + '/package.json' )
, seedData = require( path.resolve( path.join( __dirname, '..', '..', 'schema', 'seedData.json' ) ) );
packageJson.bundledDependencies.forEach(function( moduleName ) {
var moduleConfigPath = [ path.resolve( __dirname + '/../../modules' ), moduleName, 'schema', '' ].join( path.sep );
fileNames.forEach(function( fileName ) {
var filePath = moduleConfigPath + fileName + '.json';
if ( fs.existsSync( filePath ) ) {
var data = require( filePath );
Object.keys( data ).forEach( function( key ) {
if ( !!seedData[ key ] ) {
seedData[ key ] = seedData[ key ].concat( data[ key ] );
} else {
seedData[ key ] = data[ key ];
}
})
}
});
});
module.exports = seedData;
## Instruction:
fix(seedData): Order of seedData now places the main schema json after all other modules plus refactoring
## Code After:
var path = require('path')
, fs = require('fs')
, packageJson = require(path.resolve(__dirname + '/../../') + '/package.json')
, seedData = {};
function loadSeedDataFile(filePath) {
var data = require(filePath);
Object.keys(data).forEach(function(key) {
if (!!seedData[key]) {
seedData[key] = seedData[key].concat(data[key]);
} else {
seedData[key] = data[key];
}
});
}
packageJson.bundledDependencies.forEach(function(moduleName) {
var filePath = [path.resolve(__dirname + '/../../modules'), moduleName, 'schema', 'seedData.json'].join(path.sep)
if (fs.existsSync(filePath)) {
loadSeedDataFile(filePath);
}
});
loadSeedDataFile(path.resolve(path.join(__dirname, '..', '..', 'schema', 'seedData.json')));
module.exports = seedData;
|
8ea720226093944aa22e14a5fe58d52487e05a47
|
view/map.css
|
view/map.css
|
.counties {
fill: none;
pointer-events: none;
}
.states {
fill: none;
stroke: none;
pointer-events: all;
}
.states.active {
fill: #c37500;
fill-opacity: 0.4;
}
.state-borders {
fill: none;
stroke: #000;
stroke-width: 0.5px;
stroke-linejoin: round;
}
.background {
pointer-events: all;
fill: none;
}
|
.counties {
fill: none;
pointer-events: none;
}
.states {
fill: none;
stroke: none;
pointer-events: all;
}
.states.active {
fill: #c37500;
fill-opacity: 0.4;
}
.state-borders {
fill: none;
stroke: #000;
stroke-width: 0.5px;
stroke-linejoin: round;
}
.background {
pointer-events: all;
fill: none;
}
.colorlegend-title {
font-size: 1.5em;
}
|
Increase font size for legend title.
|
Increase font size for legend title.
|
CSS
|
mit
|
nubs/chicken-dinner,nubs/chicken-dinner
|
css
|
## Code Before:
.counties {
fill: none;
pointer-events: none;
}
.states {
fill: none;
stroke: none;
pointer-events: all;
}
.states.active {
fill: #c37500;
fill-opacity: 0.4;
}
.state-borders {
fill: none;
stroke: #000;
stroke-width: 0.5px;
stroke-linejoin: round;
}
.background {
pointer-events: all;
fill: none;
}
## Instruction:
Increase font size for legend title.
## Code After:
.counties {
fill: none;
pointer-events: none;
}
.states {
fill: none;
stroke: none;
pointer-events: all;
}
.states.active {
fill: #c37500;
fill-opacity: 0.4;
}
.state-borders {
fill: none;
stroke: #000;
stroke-width: 0.5px;
stroke-linejoin: round;
}
.background {
pointer-events: all;
fill: none;
}
.colorlegend-title {
font-size: 1.5em;
}
|
9438b2bf5681b8edba3777749fd47dc90ed24a20
|
ansible/roles/prechecks/tasks/service_checks.yml
|
ansible/roles/prechecks/tasks/service_checks.yml
|
---
- name: Checking that libvirt is not running
stat: path=/var/run/libvirt/libvirt-sock
register: result
failed_when: result.stat.exists
when: inventory_hostname in groups['compute']
- name: Checking Docker version
command: docker version
register: result
failed_when: result | failed
or (result.stdout | from_yaml).Server.Version | regex_replace('(\\d+\\.\\d+\\.\\d+).*', '\\1') | version_compare(docker_version_min, '<')
|
---
- name: Checking that libvirt is not running
stat: path=/var/run/libvirt/libvirt-sock
register: result
failed_when: result.stat.exists
when: inventory_hostname in groups['compute']
- name: Checking Docker version
command: docker version
register: result
failed_when: result | failed
or (result.stdout | from_yaml).Server.Version | regex_replace('(\\d+\\.\\d+\\.\\d+).*', '\\1') | version_compare(docker_version_min, '<')
- name: Checking if 'MountFlags' in /lib/systemd/system/docker.service is set to 'shared'
command: cat /lib/systemd/system/docker.service
register: result
failed_when: result.stdout.find('MountFlags=shared') == -1
when:
- (inventory_hostname in groups['neutron-dhcp-agent']
or inventory_hostname in groups['neutron-l3-agent']
or inventory_hostname in groups['neutron-metadata-agent'])
- ansible_os_family == 'RedHat'
or (ansible_distribution == 'Ubuntu' and ansible_distribution_version > '14.04')
- name: Checking if '/run' mount flag is set to 'shared'
command: awk '$5 == "/run" {print $7}' /proc/self/mountinfo
register: result
failed_when: result.stdout.find('shared') == -1
when:
- (inventory_hostname in groups['neutron-dhcp-agent']
or inventory_hostname in groups['neutron-l3-agent']
or inventory_hostname in groups['neutron-metadata-agent'])
- ansible_distribution == 'Ubuntu' and ansible_distribution_version == '14.04'
|
Add a precheck for MountFlags=shared
|
Add a precheck for MountFlags=shared
To prevent the neutron-dhcp-agent container from
failing, you need to change 'MountFlags' to 'shared' in
/var/lib/systemd/system/docker.serivce. Add a precheck
so that this issue will not happen as often.
Closes-bug: #1546681
Change-Id: I339b5e93e870534fe16c6610f299ca789e5ada62
|
YAML
|
apache-2.0
|
negronjl/kolla,coolsvap/kolla,toby82/kolla,coolsvap/kolla,coolsvap/kolla,intel-onp/kolla,dardelean/kolla-ansible,tonyli71/kolla,nihilifer/kolla,negronjl/kolla,openstack/kolla,mandre/kolla,stackforge/kolla,rahulunair/kolla,mandre/kolla,stackforge/kolla,dardelean/kolla-ansible,nihilifer/kolla,mrangana/kolla,openstack/kolla,rahulunair/kolla,mandre/kolla,GalenMa/kolla,tonyli71/kolla,intel-onp/kolla,negronjl/kolla,GalenMa/kolla,toby82/kolla,stackforge/kolla,mrangana/kolla,toby82/kolla,dardelean/kolla-ansible
|
yaml
|
## Code Before:
---
- name: Checking that libvirt is not running
stat: path=/var/run/libvirt/libvirt-sock
register: result
failed_when: result.stat.exists
when: inventory_hostname in groups['compute']
- name: Checking Docker version
command: docker version
register: result
failed_when: result | failed
or (result.stdout | from_yaml).Server.Version | regex_replace('(\\d+\\.\\d+\\.\\d+).*', '\\1') | version_compare(docker_version_min, '<')
## Instruction:
Add a precheck for MountFlags=shared
To prevent the neutron-dhcp-agent container from
failing, you need to change 'MountFlags' to 'shared' in
/var/lib/systemd/system/docker.service. Add a precheck
so that this issue will not happen as often.
Closes-bug: #1546681
Change-Id: I339b5e93e870534fe16c6610f299ca789e5ada62
## Code After:
---
- name: Checking that libvirt is not running
stat: path=/var/run/libvirt/libvirt-sock
register: result
failed_when: result.stat.exists
when: inventory_hostname in groups['compute']
- name: Checking Docker version
command: docker version
register: result
failed_when: result | failed
or (result.stdout | from_yaml).Server.Version | regex_replace('(\\d+\\.\\d+\\.\\d+).*', '\\1') | version_compare(docker_version_min, '<')
- name: Checking if 'MountFlags' in /lib/systemd/system/docker.service is set to 'shared'
command: cat /lib/systemd/system/docker.service
register: result
failed_when: result.stdout.find('MountFlags=shared') == -1
when:
- (inventory_hostname in groups['neutron-dhcp-agent']
or inventory_hostname in groups['neutron-l3-agent']
or inventory_hostname in groups['neutron-metadata-agent'])
- ansible_os_family == 'RedHat'
or (ansible_distribution == 'Ubuntu' and ansible_distribution_version > '14.04')
- name: Checking if '/run' mount flag is set to 'shared'
command: awk '$5 == "/run" {print $7}' /proc/self/mountinfo
register: result
failed_when: result.stdout.find('shared') == -1
when:
- (inventory_hostname in groups['neutron-dhcp-agent']
or inventory_hostname in groups['neutron-l3-agent']
or inventory_hostname in groups['neutron-metadata-agent'])
- ansible_distribution == 'Ubuntu' and ansible_distribution_version == '14.04'
|
cba39ca9fe3dc10ad9adcdd000f002271db86adc
|
test/Transforms/CorrelatedExprs/2002-09-23-PHIUpdateBug.ll
|
test/Transforms/CorrelatedExprs/2002-09-23-PHIUpdateBug.ll
|
; RUN: as < %s | opt -cee
implementation
declare void %foo(int)
void %test(int %A, bool %C) {
br bool %C, label %bb3, label %bb1
bb1: ;[#uses=0]
%cond212 = setgt int %A, 9 ; <bool> [#uses=1]
br bool %cond212, label %bb2, label %bb3
bb2: ;[#uses=1]
%cond = setgt int %A, 7
br bool %cond, label %bb3, label %bb7
bb3: ;[#uses=1]
%X = phi int [ 0, %0], [ 12, %bb1]
call void %foo( int %X )
br label %bb7
bb7: ;[#uses=2]
ret void
}
|
; RUN: as < %s | opt -cee
implementation
declare void %foo(int)
void %test(int %A, bool %C) {
br bool %C, label %bb0, label %bb1
bb0:
br label %bb3
Unreachable:
br label %bb2
bb1: ;[#uses=0]
%cond212 = setgt int %A, 9 ; <bool> [#uses=1]
br bool %cond212, label %bb2, label %bb7
bb2: ;[#uses=1]
%cond = setgt int %A, 7
br bool %cond, label %bb3, label %bb7
bb3: ;[#uses=1]
%X = phi int [ 0, %bb0], [ 12, %bb2]
call void %foo( int %X )
br label %bb7
bb7: ;[#uses=2]
ret void
}
|
Fix testcase to accurately expose bug
|
Fix testcase to accurately expose bug
git-svn-id: 0ff597fd157e6f4fc38580e8d64ab130330d2411@3890 91177308-0d34-0410-b5e6-96231b3b80d8
|
LLVM
|
apache-2.0
|
apple/swift-llvm,GPUOpen-Drivers/llvm,llvm-mirror/llvm,dslab-epfl/asap,llvm-mirror/llvm,dslab-epfl/asap,dslab-epfl/asap,apple/swift-llvm,llvm-mirror/llvm,GPUOpen-Drivers/llvm,dslab-epfl/asap,apple/swift-llvm,llvm-mirror/llvm,chubbymaggie/asap,chubbymaggie/asap,chubbymaggie/asap,chubbymaggie/asap,GPUOpen-Drivers/llvm,llvm-mirror/llvm,GPUOpen-Drivers/llvm,apple/swift-llvm,llvm-mirror/llvm,apple/swift-llvm,GPUOpen-Drivers/llvm,GPUOpen-Drivers/llvm,llvm-mirror/llvm,llvm-mirror/llvm,dslab-epfl/asap,apple/swift-llvm,llvm-mirror/llvm,apple/swift-llvm,dslab-epfl/asap,chubbymaggie/asap,GPUOpen-Drivers/llvm,apple/swift-llvm,dslab-epfl/asap,GPUOpen-Drivers/llvm,chubbymaggie/asap
|
llvm
|
## Code Before:
; RUN: as < %s | opt -cee
implementation
declare void %foo(int)
void %test(int %A, bool %C) {
br bool %C, label %bb3, label %bb1
bb1: ;[#uses=0]
%cond212 = setgt int %A, 9 ; <bool> [#uses=1]
br bool %cond212, label %bb2, label %bb3
bb2: ;[#uses=1]
%cond = setgt int %A, 7
br bool %cond, label %bb3, label %bb7
bb3: ;[#uses=1]
%X = phi int [ 0, %0], [ 12, %bb1]
call void %foo( int %X )
br label %bb7
bb7: ;[#uses=2]
ret void
}
## Instruction:
Fix testcase to accurately expose bug
git-svn-id: 0ff597fd157e6f4fc38580e8d64ab130330d2411@3890 91177308-0d34-0410-b5e6-96231b3b80d8
## Code After:
; RUN: as < %s | opt -cee
implementation
declare void %foo(int)
void %test(int %A, bool %C) {
br bool %C, label %bb0, label %bb1
bb0:
br label %bb3
Unreachable:
br label %bb2
bb1: ;[#uses=0]
%cond212 = setgt int %A, 9 ; <bool> [#uses=1]
br bool %cond212, label %bb2, label %bb7
bb2: ;[#uses=1]
%cond = setgt int %A, 7
br bool %cond, label %bb3, label %bb7
bb3: ;[#uses=1]
%X = phi int [ 0, %bb0], [ 12, %bb2]
call void %foo( int %X )
br label %bb7
bb7: ;[#uses=2]
ret void
}
|
c48839d551073b6159b2b4e3b47a08b67ee2b4bb
|
bootstrap/vagrant/util.sh
|
bootstrap/vagrant/util.sh
|
APOLLO_ROOT=$(dirname "${BASH_SOURCE}")/../..
verify_prereqs() {
if [[ "$(which vagrant)" == "" ]]; then
echo -e "${color_red}Can't find vagrant in PATH, please fix and retry.${color_norm}"
exit 1
fi
}
apollo_launch() {
vagrant up --provision
open_urls
}
apollo_down() {
vagrant destroy -f
}
|
APOLLO_ROOT=$(dirname "${BASH_SOURCE}")/../..
verify_prereqs() {
if [[ "$(which vagrant)" == "" ]]; then
echo -e "${color_red}Can't find vagrant in PATH, please fix and retry.${color_norm}"
exit 1
fi
}
apollo_launch() {
install_contributed_roles
vagrant up --provision
open_urls
}
apollo_down() {
vagrant destroy -f
}
|
Install galaxy roles for vagrant.
|
Install galaxy roles for vagrant.
|
Shell
|
mit
|
enxebre/Apollo,Capgemini/Apollo,Capgemini/Apollo,enxebre/Apollo,Capgemini/Apollo
|
shell
|
## Code Before:
APOLLO_ROOT=$(dirname "${BASH_SOURCE}")/../..
verify_prereqs() {
if [[ "$(which vagrant)" == "" ]]; then
echo -e "${color_red}Can't find vagrant in PATH, please fix and retry.${color_norm}"
exit 1
fi
}
apollo_launch() {
vagrant up --provision
open_urls
}
apollo_down() {
vagrant destroy -f
}
## Instruction:
Install galaxy roles for vagrant.
## Code After:
APOLLO_ROOT=$(dirname "${BASH_SOURCE}")/../..
verify_prereqs() {
if [[ "$(which vagrant)" == "" ]]; then
echo -e "${color_red}Can't find vagrant in PATH, please fix and retry.${color_norm}"
exit 1
fi
}
apollo_launch() {
install_contributed_roles
vagrant up --provision
open_urls
}
apollo_down() {
vagrant destroy -f
}
|
f5cdb43607f9894f915e266b634a88e3577cbe05
|
bin/gh.js
|
bin/gh.js
|
/*
* Copyright 2013-2015, All Rights Reserved.
*
* Code licensed under the BSD License:
* https://github.com/node-gh/gh/blob/master/LICENSE.md
*
* @author Eduardo Lundgren <[email protected]>
*/
'use strict';
var path = require('path'),
fs = require('fs'),
logger = require('../lib/logger'),
pkg = require('../package.json'),
semver = require('semver'),
tracker = require('../lib/tracker'),
configs = require('../lib/configs');
// Check node version
if (!isCompatibleNodeVersion()) {
logger.error('Please update your NodeJS version: http://nodejs.org/download');
}
if (!fs.existsSync(configs.getUserHomePath())) {
configs.createGlobalConfig();
}
// If configs.PLUGINS_PATH_KEY is undefined, try to cache it before proceeding.
if (configs.getConfig()[configs.PLUGINS_PATH_KEY] === undefined) {
configs.getNodeModulesGlobalPath();
}
// -- Env ------------------------------------------------------------------------------------------
try {
process.env.GH_PATH = path.join(__dirname, '../');
require('../lib/cmd.js').run();
} catch (e) {
tracker.track('error');
throw e;
}
function isCompatibleNodeVersion() {
return semver.satisfies(process.version, pkg.engines.node);
}
|
/*
* Copyright 2013-2015, All Rights Reserved.
*
* Code licensed under the BSD License:
* https://github.com/node-gh/gh/blob/master/LICENSE.md
*
* @author Eduardo Lundgren <[email protected]>
*/
'use strict';
var path = require('path'),
fs = require('fs'),
logger = require('../lib/logger'),
pkg = require('../package.json'),
semver = require('semver'),
tracker = require('../lib/tracker'),
configs = require('../lib/configs');
// Check node version
function isCompatibleNodeVersion() {
return semver.satisfies(process.version, pkg.engines.node);
}
if (!isCompatibleNodeVersion()) {
logger.error('Please update your NodeJS version: http://nodejs.org/download');
}
if (!fs.existsSync(configs.getUserHomePath())) {
configs.createGlobalConfig();
}
// If configs.PLUGINS_PATH_KEY is undefined, try to cache it before proceeding.
if (configs.getConfig()[configs.PLUGINS_PATH_KEY] === undefined) {
configs.getNodeModulesGlobalPath();
}
// -- Env ------------------------------------------------------------------------------------------
try {
process.env.GH_PATH = path.join(__dirname, '../');
require('../lib/cmd.js').run();
} catch (e) {
tracker.track('error');
throw e;
}
|
Fix linting issue. Source formatting.
|
Fix linting issue. Source formatting.
|
JavaScript
|
bsd-3-clause
|
oouyang/gh,tomzx/gh,henvic/gh,modulexcite/gh,TomzxForks/gh,tomzx/gh,dustinryerson/gh,oouyang/gh,henvic/gh,modulexcite/gh,TomzxForks/gh,dustinryerson/gh
|
javascript
|
## Code Before:
/*
* Copyright 2013-2015, All Rights Reserved.
*
* Code licensed under the BSD License:
* https://github.com/node-gh/gh/blob/master/LICENSE.md
*
* @author Eduardo Lundgren <[email protected]>
*/
'use strict';
var path = require('path'),
fs = require('fs'),
logger = require('../lib/logger'),
pkg = require('../package.json'),
semver = require('semver'),
tracker = require('../lib/tracker'),
configs = require('../lib/configs');
// Check node version
if (!isCompatibleNodeVersion()) {
logger.error('Please update your NodeJS version: http://nodejs.org/download');
}
if (!fs.existsSync(configs.getUserHomePath())) {
configs.createGlobalConfig();
}
// If configs.PLUGINS_PATH_KEY is undefined, try to cache it before proceeding.
if (configs.getConfig()[configs.PLUGINS_PATH_KEY] === undefined) {
configs.getNodeModulesGlobalPath();
}
// -- Env ------------------------------------------------------------------------------------------
try {
process.env.GH_PATH = path.join(__dirname, '../');
require('../lib/cmd.js').run();
} catch (e) {
tracker.track('error');
throw e;
}
function isCompatibleNodeVersion() {
return semver.satisfies(process.version, pkg.engines.node);
}
## Instruction:
Fix linting issue. Source formatting.
## Code After:
/*
* Copyright 2013-2015, All Rights Reserved.
*
* Code licensed under the BSD License:
* https://github.com/node-gh/gh/blob/master/LICENSE.md
*
* @author Eduardo Lundgren <[email protected]>
*/
'use strict';
var path = require('path'),
fs = require('fs'),
logger = require('../lib/logger'),
pkg = require('../package.json'),
semver = require('semver'),
tracker = require('../lib/tracker'),
configs = require('../lib/configs');
// Check node version
function isCompatibleNodeVersion() {
return semver.satisfies(process.version, pkg.engines.node);
}
if (!isCompatibleNodeVersion()) {
logger.error('Please update your NodeJS version: http://nodejs.org/download');
}
if (!fs.existsSync(configs.getUserHomePath())) {
configs.createGlobalConfig();
}
// If configs.PLUGINS_PATH_KEY is undefined, try to cache it before proceeding.
if (configs.getConfig()[configs.PLUGINS_PATH_KEY] === undefined) {
configs.getNodeModulesGlobalPath();
}
// -- Env ------------------------------------------------------------------------------------------
try {
process.env.GH_PATH = path.join(__dirname, '../');
require('../lib/cmd.js').run();
} catch (e) {
tracker.track('error');
throw e;
}
|
a4dfa18e930d26ad09a0137839be04eef349d3fb
|
crawler/src/util.spec.ts
|
crawler/src/util.spec.ts
|
import 'mocha'
import { expect } from 'chai'
import { parsePrice } from '../src/util'
describe('util functions', () => {
it('should parse price', () => {
const testInput = ['5', '25', '1.2', '1,2', '-12,25€', '-12,25 €', '$5.5', '1.256.000', '1,213.12', '.25', ',0025']
const result = testInput.map(val => parsePrice(val))
const spected = [5, 25, 1.2, 1.2, -12.25, -12.25, 5.5, 1256000, 1213.12, 0.25, 0.0025]
expect(result).to.eql(spected)
})
})
|
import 'mocha'
import { expect } from 'chai'
import { parsePrice } from '../src/util'
describe('util functions', () => {
it('should parse price', () => {
const testInput = ['5', '25', '1.2', '1,2', '-12,25€', '-12,25 €', '144,50 € ', '$5.5', '1.256.000', '1,213.12', '.25', ',0025']
const result = testInput.map(val => parsePrice(val))
const spected = [5, 25, 1.2, 1.2, -12.25, -12.25, 144.50, 5.5, 1256000, 1213.12, 0.25, 0.0025]
expect(result).to.eql(spected)
})
})
|
Add price case to util.ts test
|
Add price case to util.ts test
|
TypeScript
|
mit
|
davidglezz/tfg,davidglezz/tfg,davidglezz/tfg,davidglezz/tfg
|
typescript
|
## Code Before:
import 'mocha'
import { expect } from 'chai'
import { parsePrice } from '../src/util'
describe('util functions', () => {
it('should parse price', () => {
const testInput = ['5', '25', '1.2', '1,2', '-12,25€', '-12,25 €', '$5.5', '1.256.000', '1,213.12', '.25', ',0025']
const result = testInput.map(val => parsePrice(val))
const spected = [5, 25, 1.2, 1.2, -12.25, -12.25, 5.5, 1256000, 1213.12, 0.25, 0.0025]
expect(result).to.eql(spected)
})
})
## Instruction:
Add price case to util.ts test
## Code After:
import 'mocha'
import { expect } from 'chai'
import { parsePrice } from '../src/util'
describe('util functions', () => {
it('should parse price', () => {
const testInput = ['5', '25', '1.2', '1,2', '-12,25€', '-12,25 €', '144,50 € ', '$5.5', '1.256.000', '1,213.12', '.25', ',0025']
const result = testInput.map(val => parsePrice(val))
const spected = [5, 25, 1.2, 1.2, -12.25, -12.25, 144.50, 5.5, 1256000, 1213.12, 0.25, 0.0025]
expect(result).to.eql(spected)
})
})
|
b37e29af4a737587988192b29bc996e95666eff7
|
client/includes/header.js
|
client/includes/header.js
|
Template.header.helpers({
displayRegister: function () {
if (Meteor.user().profile) {return "none";}
}
})
Template.heading.helpers({
heading: function () {
return pageTitle;
}
})
|
Template.header.helpers({
displayRegister: function () {
if (Meteor.user().profile) {return "none";}
}
});
|
Fix error causing degraded performance
|
Fix error causing degraded performance
|
JavaScript
|
agpl-3.0
|
gazhayes/Popvote-HK,gazhayes/Popvote-HK
|
javascript
|
## Code Before:
Template.header.helpers({
displayRegister: function () {
if (Meteor.user().profile) {return "none";}
}
})
Template.heading.helpers({
heading: function () {
return pageTitle;
}
})
## Instruction:
Fix error causing degraded performance
## Code After:
Template.header.helpers({
displayRegister: function () {
if (Meteor.user().profile) {return "none";}
}
});
|
2cceb69cc81d9746aefedcb8a99b640818faaa38
|
modules/load-dir-to-db-basex.xquery
|
modules/load-dir-to-db-basex.xquery
|
(: =====================================================
Load DITA map and dependencies to BaseX
Author: W. Eliot Kimber
===================================================== :)
(: import module namespace db="http://basex.org/modules/db"; :)
declare variable $db external;
(: The directory load from: :)
declare variable $dir external;
(: The collection to load into: :)
declare variable $collection external;
(: URI of entity resolution catalog to use for DTD resolution: :)
declare variable $catalogURI external;
declare variable $xmlLoadOptions as element() :=
<options>
<catfile value='{$catalogURI}'/>
<dtd value='true'/>
<chop value='false'/>
<skipcorrupt value='true'/>
<createfilter value="*.xml,*.dita,*.ditamap,*.ditaval"/>
</options>;
for $file in (file:list($dir, true()))
let $fullDir := concat($dir, '/', $file)
(:return if (file:is-dir($fullDir)) then <cmd>{(concat('db:add("',$db,'","', $fullDir, '","',$file, '",'), $xmlLoadOptions,')')}</cmd> else () :)
return if (file:is-dir($fullDir)) then db:add($db, $fullDir, $file, $xmlLoadOptions) else ()
(: ==================== End of Module ================================= :)
|
(: =====================================================
Load DITA map and dependencies to BaseX
Author: W. Eliot Kimber
===================================================== :)
(: import module namespace db="http://basex.org/modules/db"; :)
declare variable $db external;
(: The directory load from: :)
declare variable $dir external;
(: The collection to load into: :)
declare variable $collection external;
(: URI of entity resolution catalog to use for DTD resolution: :)
declare variable $catalogURI external;
declare variable $xmlLoadOptions as element() :=
<options>
<catfile value='{$catalogURI}'/>
<dtd value='true'/>
<chop value='false'/>
<skipcorrupt value='true'/>
<createfilter value="*.xml,*.dita,*.ditamap,*.ditaval"/>
</options>;
for $file in (file:list($dir, false(), '*.xml,*.dita,*.ditamap,*.ditaval'))
let $fullDir := concat($dir, '/', $file)
return if (file:is-file($file)) then db:replace($db, $file, $fullDir, $xmlLoadOptions) else (),
for $file in (file:list($dir, true()))
let $fullDir := concat($dir, '/', $file)
(:return if (file:is-dir($fullDir)) then <cmd>{(concat('db:replace("',$db,'","', $fullDir, '","',$file, '",'), $xmlLoadOptions,')')}</cmd> else () :)
return if (file:is-dir($fullDir)) then db:replace($db, $file, $fullDir, $xmlLoadOptions) else ()
(: ==================== End of Module ================================= :)
|
Use replace rather than add to avoid duplicate files
|
Use replace rather than add to avoid duplicate files
|
XQuery
|
apache-2.0
|
dita-for-small-teams/dita-for-small-teams,dita-for-small-teams/dita-for-small-teams
|
xquery
|
## Code Before:
(: =====================================================
Load DITA map and dependencies to BaseX
Author: W. Eliot Kimber
===================================================== :)
(: import module namespace db="http://basex.org/modules/db"; :)
declare variable $db external;
(: The directory load from: :)
declare variable $dir external;
(: The collection to load into: :)
declare variable $collection external;
(: URI of entity resolution catalog to use for DTD resolution: :)
declare variable $catalogURI external;
declare variable $xmlLoadOptions as element() :=
<options>
<catfile value='{$catalogURI}'/>
<dtd value='true'/>
<chop value='false'/>
<skipcorrupt value='true'/>
<createfilter value="*.xml,*.dita,*.ditamap,*.ditaval"/>
</options>;
for $file in (file:list($dir, true()))
let $fullDir := concat($dir, '/', $file)
(:return if (file:is-dir($fullDir)) then <cmd>{(concat('db:add("',$db,'","', $fullDir, '","',$file, '",'), $xmlLoadOptions,')')}</cmd> else () :)
return if (file:is-dir($fullDir)) then db:add($db, $fullDir, $file, $xmlLoadOptions) else ()
(: ==================== End of Module ================================= :)
## Instruction:
Use replace rather than add to avoid duplicate files
## Code After:
(: =====================================================
Load DITA map and dependencies to BaseX
Author: W. Eliot Kimber
===================================================== :)
(: import module namespace db="http://basex.org/modules/db"; :)
declare variable $db external;
(: The directory load from: :)
declare variable $dir external;
(: The collection to load into: :)
declare variable $collection external;
(: URI of entity resolution catalog to use for DTD resolution: :)
declare variable $catalogURI external;
declare variable $xmlLoadOptions as element() :=
<options>
<catfile value='{$catalogURI}'/>
<dtd value='true'/>
<chop value='false'/>
<skipcorrupt value='true'/>
<createfilter value="*.xml,*.dita,*.ditamap,*.ditaval"/>
</options>;
for $file in (file:list($dir, false(), '*.xml,*.dita,*.ditamap,*.ditaval'))
let $fullDir := concat($dir, '/', $file)
return if (file:is-file($file)) then db:replace($db, $file, $fullDir, $xmlLoadOptions) else (),
for $file in (file:list($dir, true()))
let $fullDir := concat($dir, '/', $file)
(:return if (file:is-dir($fullDir)) then <cmd>{(concat('db:replace("',$db,'","', $fullDir, '","',$file, '",'), $xmlLoadOptions,')')}</cmd> else () :)
return if (file:is-dir($fullDir)) then db:replace($db, $file, $fullDir, $xmlLoadOptions) else ()
(: ==================== End of Module ================================= :)
|
3cd54dd17dbff8843f9fc98e750ad907986d6666
|
org.metaborg.util/src/main/java/org/metaborg/util/iterators/CompoundIterator.java
|
org.metaborg.util/src/main/java/org/metaborg/util/iterators/CompoundIterator.java
|
package org.metaborg.util.iterators;
import java.util.Iterator;
import java.util.NoSuchElementException;
import java.util.Queue;
import com.google.common.collect.Queues;
public class CompoundIterator<T> implements Iterator<T> {
private final Queue<? extends Iterator<T>> iterators;
public CompoundIterator(Iterable<? extends Iterator<T>> iterators) {
this.iterators = Queues.newArrayDeque(iterators);
}
@Override public boolean hasNext() {
final Iterator<T> iterator = iterators.peek();
if(iterator == null) {
return false;
}
return iterator.hasNext();
}
@Override public T next() {
final Iterator<T> iterator = iterators.peek();
if(iterator == null) {
throw new NoSuchElementException();
}
if(iterator.hasNext()) {
return iterator.next();
} else {
iterators.poll();
return next();
}
}
@Override public void remove() {
throw new UnsupportedOperationException();
}
}
|
package org.metaborg.util.iterators;
import java.util.Iterator;
import java.util.NoSuchElementException;
import com.google.common.collect.Iterators;
public class CompoundIterator<T> implements Iterator<T> {
private final Iterator<? extends Iterator<T>> iterators;
private Iterator<T> iterator = Iterators.emptyIterator();
public CompoundIterator(Iterable<? extends Iterator<T>> iterators) {
this.iterators = iterators.iterator();
}
@Override public boolean hasNext() {
if(iterator.hasNext()) {
return true;
}
try {
iterator = iterators.next();
} catch(NoSuchElementException ex) {
return false;
}
return hasNext();
}
@Override public T next() {
try {
return iterator.next();
} catch(NoSuchElementException ex) {
iterator = iterators.next();
}
return next();
}
@Override public void remove() {
throw new UnsupportedOperationException();
}
}
|
Fix compound iterator for case where empty iterators appear in the list.
|
Fix compound iterator for case where empty iterators appear in the list.
|
Java
|
apache-2.0
|
metaborg/mb-exec,metaborg/mb-exec,metaborg/mb-exec
|
java
|
## Code Before:
package org.metaborg.util.iterators;
import java.util.Iterator;
import java.util.NoSuchElementException;
import java.util.Queue;
import com.google.common.collect.Queues;
public class CompoundIterator<T> implements Iterator<T> {
private final Queue<? extends Iterator<T>> iterators;
public CompoundIterator(Iterable<? extends Iterator<T>> iterators) {
this.iterators = Queues.newArrayDeque(iterators);
}
@Override public boolean hasNext() {
final Iterator<T> iterator = iterators.peek();
if(iterator == null) {
return false;
}
return iterator.hasNext();
}
@Override public T next() {
final Iterator<T> iterator = iterators.peek();
if(iterator == null) {
throw new NoSuchElementException();
}
if(iterator.hasNext()) {
return iterator.next();
} else {
iterators.poll();
return next();
}
}
@Override public void remove() {
throw new UnsupportedOperationException();
}
}
## Instruction:
Fix compound iterator for case where empty iterators appear in the list.
## Code After:
package org.metaborg.util.iterators;
import java.util.Iterator;
import java.util.NoSuchElementException;
import com.google.common.collect.Iterators;
public class CompoundIterator<T> implements Iterator<T> {
private final Iterator<? extends Iterator<T>> iterators;
private Iterator<T> iterator = Iterators.emptyIterator();
public CompoundIterator(Iterable<? extends Iterator<T>> iterators) {
this.iterators = iterators.iterator();
}
@Override public boolean hasNext() {
if(iterator.hasNext()) {
return true;
}
try {
iterator = iterators.next();
} catch(NoSuchElementException ex) {
return false;
}
return hasNext();
}
@Override public T next() {
try {
return iterator.next();
} catch(NoSuchElementException ex) {
iterator = iterators.next();
}
return next();
}
@Override public void remove() {
throw new UnsupportedOperationException();
}
}
|
4813b9aa3651eb8eb4206b7622cefb38ee935528
|
.travis.yml
|
.travis.yml
|
language: python
python:
- "3.5"
- "3.6"
branches:
- master
- readme_bages
install:
- pip install -r requirements.txt
- pip install codecov
script:
- python3 -m pytest tests --cov app --cov-report html
- coverage run -m pytest tests --cov app --cov-report html
after_success:
- codecov
|
language: python
python:
- "3.5"
- "3.6"
install:
- pip install -r requirements.txt
- pip install codecov
script:
- python3 -m pytest tests --cov app --cov-report html
- coverage run -m pytest tests --cov app --cov-report html
after_success:
- codecov
|
Delete branch info from Travis config
|
Delete branch info from Travis config
|
YAML
|
mit
|
Stark-Mountain/meetup-facebook-bot,Stark-Mountain/meetup-facebook-bot
|
yaml
|
## Code Before:
language: python
python:
- "3.5"
- "3.6"
branches:
- master
- readme_bages
install:
- pip install -r requirements.txt
- pip install codecov
script:
- python3 -m pytest tests --cov app --cov-report html
- coverage run -m pytest tests --cov app --cov-report html
after_success:
- codecov
## Instruction:
Delete branch info from Travis config
## Code After:
language: python
python:
- "3.5"
- "3.6"
install:
- pip install -r requirements.txt
- pip install codecov
script:
- python3 -m pytest tests --cov app --cov-report html
- coverage run -m pytest tests --cov app --cov-report html
after_success:
- codecov
|
d312d45ea1e672d310c553574b694b6addd49570
|
tests/unit/shared/file/SymlinkPharActivatorTest.php
|
tests/unit/shared/file/SymlinkPharActivatorTest.php
|
<?php
namespace PharIo\Phive;
use PharIo\FileSystem\Filename;
use PHPUnit\Framework\TestCase;
/**
* @covers \PharIo\Phive\SymlinkPharActivator
*/
class SymlinkPharActivatorTest extends TestCase {
public function setUp() {
$this->deleteTestSymlink();
}
public function tearDown() {
$this->deleteTestSymlink();
}
public function testActivateCreatesExpectedSymlink() {
$targetPhar = new Filename(__DIR__ .'/fixtures/some.phar');
$symlinkFilename = new Filename(__DIR__ . '/symlink');
$activator = new SymlinkPharActivator();
$activator->activate($targetPhar, $symlinkFilename);
$this->assertTrue(is_link($symlinkFilename->asString()));
}
public function testActiveReturnsExpectedFilename() {
$targetPhar = new Filename(__DIR__ .'/fixtures/some.phar');
$symlinkFilename = new Filename(__DIR__ . '/symlink');
$activator = new SymlinkPharActivator();
$actual = $activator->activate($targetPhar, $symlinkFilename);
$this->assertEquals($symlinkFilename ,$actual);
}
private function deleteTestSymlink() {
$filename = __DIR__ . '/symlink';
if (!file_exists($filename)) {
return;
}
unlink($filename);
}
}
|
<?php
namespace PharIo\Phive;
use PharIo\FileSystem\Filename;
use PHPUnit\Framework\TestCase;
/**
* @covers \PharIo\Phive\SymlinkPharActivator
*/
class SymlinkPharActivatorTest extends TestCase {
public function setUp() {
if (0 === stripos(PHP_OS, 'win')) {
$this->markTestSkipped('PHP does not support symlinks on Windows.');
}
$this->deleteTestSymlink();
}
public function tearDown() {
$this->deleteTestSymlink();
}
public function testActivateCreatesExpectedSymlink() {
$targetPhar = new Filename(__DIR__ .'/fixtures/some.phar');
$symlinkFilename = new Filename(__DIR__ . '/symlink');
$activator = new SymlinkPharActivator();
$activator->activate($targetPhar, $symlinkFilename);
$this->assertTrue(is_link($symlinkFilename->asString()));
}
public function testActiveReturnsExpectedFilename() {
$targetPhar = new Filename(__DIR__ .'/fixtures/some.phar');
$symlinkFilename = new Filename(__DIR__ . '/symlink');
$activator = new SymlinkPharActivator();
$actual = $activator->activate($targetPhar, $symlinkFilename);
$this->assertEquals($symlinkFilename ,$actual);
}
private function deleteTestSymlink() {
$filename = __DIR__ . '/symlink';
if (!file_exists($filename)) {
return;
}
unlink($filename);
}
}
|
Fix unit tests: skip symlink tests on Windows
|
Fix unit tests: skip symlink tests on Windows
|
PHP
|
bsd-3-clause
|
phar-io/phive
|
php
|
## Code Before:
<?php
namespace PharIo\Phive;
use PharIo\FileSystem\Filename;
use PHPUnit\Framework\TestCase;
/**
* @covers \PharIo\Phive\SymlinkPharActivator
*/
class SymlinkPharActivatorTest extends TestCase {
public function setUp() {
$this->deleteTestSymlink();
}
public function tearDown() {
$this->deleteTestSymlink();
}
public function testActivateCreatesExpectedSymlink() {
$targetPhar = new Filename(__DIR__ .'/fixtures/some.phar');
$symlinkFilename = new Filename(__DIR__ . '/symlink');
$activator = new SymlinkPharActivator();
$activator->activate($targetPhar, $symlinkFilename);
$this->assertTrue(is_link($symlinkFilename->asString()));
}
public function testActiveReturnsExpectedFilename() {
$targetPhar = new Filename(__DIR__ .'/fixtures/some.phar');
$symlinkFilename = new Filename(__DIR__ . '/symlink');
$activator = new SymlinkPharActivator();
$actual = $activator->activate($targetPhar, $symlinkFilename);
$this->assertEquals($symlinkFilename ,$actual);
}
private function deleteTestSymlink() {
$filename = __DIR__ . '/symlink';
if (!file_exists($filename)) {
return;
}
unlink($filename);
}
}
## Instruction:
Fix unit tests: skip symlink tests on Windows
## Code After:
<?php
namespace PharIo\Phive;
use PharIo\FileSystem\Filename;
use PHPUnit\Framework\TestCase;
/**
* @covers \PharIo\Phive\SymlinkPharActivator
*/
class SymlinkPharActivatorTest extends TestCase {
public function setUp() {
if (0 === stripos(PHP_OS, 'win')) {
$this->markTestSkipped('PHP does not support symlinks on Windows.');
}
$this->deleteTestSymlink();
}
public function tearDown() {
$this->deleteTestSymlink();
}
public function testActivateCreatesExpectedSymlink() {
$targetPhar = new Filename(__DIR__ .'/fixtures/some.phar');
$symlinkFilename = new Filename(__DIR__ . '/symlink');
$activator = new SymlinkPharActivator();
$activator->activate($targetPhar, $symlinkFilename);
$this->assertTrue(is_link($symlinkFilename->asString()));
}
public function testActiveReturnsExpectedFilename() {
$targetPhar = new Filename(__DIR__ .'/fixtures/some.phar');
$symlinkFilename = new Filename(__DIR__ . '/symlink');
$activator = new SymlinkPharActivator();
$actual = $activator->activate($targetPhar, $symlinkFilename);
$this->assertEquals($symlinkFilename ,$actual);
}
private function deleteTestSymlink() {
$filename = __DIR__ . '/symlink';
if (!file_exists($filename)) {
return;
}
unlink($filename);
}
}
|
34895630fd0f363dc52194676ed4cf93ec0be864
|
contrib/dokku_client.sh
|
contrib/dokku_client.sh
|
if [[ ! -z $DOKKU_HOST ]]; then
function dokku {
appname=$(git remote -v 2>/dev/null | grep dokku | head -n 1 | cut -f1 -d' ' | cut -f2 -d':' 2>/dev/null)
if [[ "$?" != "0" ]]; then
donotshift="YES"
fi
if [[ "$1" = "create" ]]; then
appname=$(echo "print(elfs.GenName())" | lua -l elfs)
if git remote add dokku dokku@$DOKKU_HOST:$appname
then
echo "-----> Dokku remote added at $DOKKU_HOST"
echo "-----> Application name is $appname"
else
echo "! Dokku remote not added! Do you already have a dokku remote?"
return
fi
git push dokku master
return $?
fi
if [[ -z "$donotshift" ]]; then
ssh dokku@$DOKKU_HOST $*
exit
fi
verb=$1
shift
ssh dokku@$DOKKU_HOST "$verb" "$appname" $@
}
fi
|
set -eo pipefail; [[ $DOKKU_TRACE ]] && set -x
if [[ ! -z $DOKKU_HOST ]]; then
function dokku {
appname=$(git remote -v 2>/dev/null | grep dokku | head -n 1 | cut -f1 -d' ' | cut -f2 -d':' 2>/dev/null)
if [[ "$?" != "0" ]]; then
donotshift="YES"
fi
if [[ "$1" = "create" ]]; then
appname=$(echo "print(elfs.GenName())" | lua -l elfs)
if git remote add dokku dokku@$DOKKU_HOST:$appname
then
echo "-----> Dokku remote added at $DOKKU_HOST"
echo "-----> Application name is $appname"
else
echo "! Dokku remote not added! Do you already have a dokku remote?"
return
fi
git push dokku master
return $?
fi
if [[ -z "$donotshift" ]]; then
ssh dokku@$DOKKU_HOST $*
exit
fi
verb=$1
shift
ssh dokku@$DOKKU_HOST "$verb" "$appname" $@
}
fi
|
Add DOKKU_TRACE support to turn on command tracing
|
Add DOKKU_TRACE support to turn on command tracing
|
Shell
|
mit
|
econya/dokku,ertrzyiks/dokku,klodoo/dokku,basicer/dokku,pmvieira/dokku,experimental-platform/dokku,leonardodino/dokku,mmerickel/dokku,fmoliveira/dokku,Cactusbone/dokku,sebastianscatularo/dokku,davidsiefert/dokku,stepri/dokku,vlipco/dokku,cjblomqvist/dokku,AnselZhangGit/dokku,omeid/dokku,Evlos/dokku,alexquick/dokku,econya/dokku,lmars/dokku,mrluc/dokku,AnselZhangGit/dokku,arthurschreiber/dokku,ertrzyiks/dokku,hanwoody/dokku,emdantrim/dokku,progrium/dokku,biddyweb/dokku,suxor42/dokku,econya/dokku,jimeh/dokku,stepri/dokku,vijayviji/dokku,henrik/dokku,u2mejc/dokku,jimeh/dokku,arthurschreiber/dokku,hanwoody/dokku,kennetham/dokku,netbrick/dokku,vlipco/dokku,elia/dokku,TribeMedia/dokku,lmars/dokku,beedesk/dokku,fruitl00p/dokku,johnfraney/dokku,mouserage/dokku,u2mejc/dokku,jimeh/dokku,arthurschreiber/dokku,suxor42/dokku,bobvanderlinden/dokku,jamesmarva/dokku,ebeigarts/dokku,henrik/dokku,mrluc/dokku,Evlos/dokku,rocLv/dokku,experimental-platform/dokku,ddreaml/dokku,experimental-platform/dokku,henrik/dokku,alessio/dokku,crisward/dokku,experimental-platform/dokku,sseemayer/dokku,progrium/dokku,basicer/dokku,mmerickel/dokku,basicer/dokku,kennetham/dokku,beedesk/dokku,bobvanderlinden/dokku,jamesmarva/dokku,progrium/dokku,akatrevorjay/dokku,expa/dokku,netbrick/dokku,vijayviji/dokku,ddreaml/dokku,vlipco/dokku,econya/dokku,henrik/dokku,alessio/dokku,codeconfab/dokku,expa/dokku,basicer/dokku,leonardodino/dokku,progrium/dokku,cjblomqvist/dokku,akatrevorjay/dokku,leonardodino/dokku,dokku/dokku,akatrevorjay/dokku,vkurup/dokku,johnfraney/dokku,hanwoody/dokku,vlipco/dokku,bobvanderlinden/dokku,lmars/dokku,elia/dokku,emdantrim/dokku,mouserage/dokku,johnfraney/dokku,arthurschreiber/dokku,leonardodino/dokku,lmars/dokku,vkurup/dokku,amaltson/dokku,kennetham/dokku,biddyweb/dokku,Git-Host/dokku,tilgovi/dokku,3onyc/dokku,eljojo/dokku,suxor42/dokku,vlipco/dokku,johnfraney/dokku,jimjag/dokku,sseemayer/dokku,bobvanderlinden/dokku,davidsiefert/dokku,mrluc/dokku,kennetham/dokku,callahad/dokku,pmvieira/dokku,mmerickel/dokku,emdantrim/dokku,claytonbrown/dokku,Git-Host/dokku,alessio/dokku,Fab-IT-ApS/dokku,mouserage/dokku,ebeigarts/dokku,hanwoody/dokku,hanwoody/dokku,vijayviji/dokku,rocLv/dokku,cjblomqvist/dokku,barseghyanartur/dokku,tilgovi/dokku,matto1990/dokku,omeid/dokku,crisward/dokku,ddreaml/dokku,rocLv/dokku,codeconfab/dokku,emdantrim/dokku,pmvieira/dokku,TribeMedia/dokku,claytonbrown/dokku,matto1990/dokku,pmvieira/dokku,matto1990/dokku,AnselZhangGit/dokku,fmoliveira/dokku,Git-Host/dokku,emdantrim/dokku,callahad/dokku,ebeigarts/dokku,jimeh/dokku,amaltson/dokku,mouserage/dokku,eljojo/dokku,beedesk/dokku,suxor42/dokku,mmerickel/dokku,mmerickel/dokku,fruitl00p/dokku,u2mejc/dokku,callahad/dokku,elia/dokku,eljojo/dokku,rocLv/dokku,callahad/dokku,beedesk/dokku,cjblomqvist/dokku,arsthanea/dokku,jazzzz/dokku,bobvanderlinden/dokku,claytonbrown/dokku,sebastianscatularo/dokku,Fab-IT-ApS/dokku,eljojo/dokku,dokku/dokku,dokku/dokku,matto1990/dokku,vkurup/dokku,vijayviji/dokku,barseghyanartur/dokku,arsthanea/dokku,Git-Host/dokku,ertrzyiks/dokku,beedesk/dokku,AnselZhangGit/dokku,expa/dokku,leonardodino/dokku,callahad/dokku,econya/dokku,jimjag/dokku,Fab-IT-ApS/dokku,expa/dokku,3onyc/dokku,beedesk/dokku,elia/dokku,experimental-platform/dokku,suxor42/dokku,crisward/dokku,expa/dokku,jimjag/dokku,amaltson/dokku,jimeh/dokku,Git-Host/dokku,tilgovi/dokku,stepri/dokku,gianpaj/dokku,Git-Host/dokku,matto1990/dokku,crisward/dokku,3onyc/dokku,cjblomqvist/dokku,rocLv/dokku,jimeh/dokku,sebastianscatularo/dokku,akatrevorjay/dokku,3onyc/dokku,klodoo/dokku,mrluc/dokku,ddreaml/dokku,biddyweb/dokku,ddreaml/dokku,experimental-platform/dokku,omeid/dokku,AnselZhangGit/dokku,TribeMedia/dokku,codeconfab/dokku,stepri/dokku,suxor42/dokku,vijayviji/dokku,jimeh/dokku,gianpaj/dokku,tilgovi/dokku,klodoo/dokku,hanwoody/dokku,sebastianscatularo/dokku,sebastianscatularo/dokku,vkurup/dokku,klodoo/dokku,stepri/dokku,callahad/dokku,sseemayer/dokku,mmerickel/dokku,vijayviji/dokku,matto1990/dokku,arthurschreiber/dokku,biddyweb/dokku,ddreaml/dokku,bkniffler/dokku,cjblomqvist/dokku,ebeigarts/dokku,alexquick/dokku,ebeigarts/dokku,johnfraney/dokku,bkniffler/dokku,alessio/dokku,jazzzz/dokku,elia/dokku,basicer/dokku,codeconfab/dokku,fmoliveira/dokku,tilgovi/dokku,suxor42/dokku,fruitl00p/dokku,callahad/dokku,lmars/dokku,fruitl00p/dokku,u2mejc/dokku,ddreaml/dokku,cjblomqvist/dokku,omeid/dokku,jamesmarva/dokku,alexquick/dokku,hanwoody/dokku,vkurup/dokku,jimjag/dokku,TribeMedia/dokku,gianpaj/dokku,klodoo/dokku,davidsiefert/dokku,arsthanea/dokku,3onyc/dokku,leonardodino/dokku,vlipco/dokku,bobvanderlinden/dokku,fruitl00p/dokku,codeconfab/dokku,amaltson/dokku,netbrick/dokku,paralin/crew,codeconfab/dokku,crisward/dokku,bkniffler/dokku,amaltson/dokku,gianpaj/dokku,Fab-IT-ApS/dokku,Git-Host/dokku,codeconfab/dokku,econya/dokku,jamesmarva/dokku,Fab-IT-ApS/dokku,arsthanea/dokku,klodoo/dokku,jamesmarva/dokku,alessio/dokku,claytonbrown/dokku,basicer/dokku,vijayviji/dokku,basicer/dokku,dokku/dokku,fruitl00p/dokku,Fab-IT-ApS/dokku,gianpaj/dokku,alexquick/dokku,barseghyanartur/dokku,u2mejc/dokku,pmvieira/dokku,progrium/dokku,netbrick/dokku,Cactusbone/dokku,amaltson/dokku,crisward/dokku,expa/dokku,klodoo/dokku,jazzzz/dokku,vkurup/dokku,barseghyanartur/dokku,vijayviji/dokku,mouserage/dokku,davidsiefert/dokku,Evlos/dokku,barseghyanartur/dokku,sebastianscatularo/dokku,progrium/dokku,Git-Host/dokku,omeid/dokku,beedesk/dokku,codeconfab/dokku,jazzzz/dokku,amaltson/dokku,jamesmarva/dokku,TribeMedia/dokku,jimjag/dokku,akatrevorjay/dokku,fmoliveira/dokku,ertrzyiks/dokku,progrium/dokku,arthurschreiber/dokku,ertrzyiks/dokku,hanwoody/dokku,mouserage/dokku,mrluc/dokku,henrik/dokku,alessio/dokku,elia/dokku,tilgovi/dokku,omeid/dokku,sebastianscatularo/dokku,3onyc/dokku,Evlos/dokku,ertrzyiks/dokku,omeid/dokku,Fab-IT-ApS/dokku,sebastianscatularo/dokku,emdantrim/dokku,jamesmarva/dokku,kennetham/dokku,Evlos/dokku,jazzzz/dokku,dokku/dokku,jimeh/dokku,Evlos/dokku,fmoliveira/dokku,u2mejc/dokku,beedesk/dokku,johnfraney/dokku,ertrzyiks/dokku,paralin/crew,mouserage/dokku,Cactusbone/dokku,henrik/dokku,bobvanderlinden/dokku,Cactusbone/dokku,alessio/dokku,arsthanea/dokku,3onyc/dokku,tilgovi/dokku,barseghyanartur/dokku,u2mejc/dokku,alexquick/dokku,leonardodino/dokku,mmerickel/dokku,dokku/dokku,Evlos/dokku,AnselZhangGit/dokku,leonardodino/dokku,alexquick/dokku,vkurup/dokku,jazzzz/dokku,jamesmarva/dokku,sseemayer/dokku,netbrick/dokku,sseemayer/dokku,fmoliveira/dokku,stepri/dokku,akatrevorjay/dokku,biddyweb/dokku,bkniffler/dokku,bobvanderlinden/dokku,claytonbrown/dokku,lmars/dokku,johnfraney/dokku,jimjag/dokku,pmvieira/dokku,arthurschreiber/dokku,arsthanea/dokku,alexquick/dokku,crisward/dokku,experimental-platform/dokku,tilgovi/dokku,davidsiefert/dokku,alexquick/dokku,cjblomqvist/dokku,netbrick/dokku,biddyweb/dokku,u2mejc/dokku,elia/dokku,AnselZhangGit/dokku,TribeMedia/dokku,stepri/dokku,eljojo/dokku,dokku/dokku,akatrevorjay/dokku,netbrick/dokku,bkniffler/dokku,gianpaj/dokku,expa/dokku,basicer/dokku,TribeMedia/dokku,elia/dokku,biddyweb/dokku,jimjag/dokku,amaltson/dokku,davidsiefert/dokku,klodoo/dokku,AnselZhangGit/dokku,barseghyanartur/dokku,omeid/dokku,callahad/dokku,sseemayer/dokku,davidsiefert/dokku,lmars/dokku,bkniffler/dokku,kennetham/dokku,pmvieira/dokku,henrik/dokku,claytonbrown/dokku,vkurup/dokku,jimeh/dokku,davidsiefert/dokku,emdantrim/dokku,dokku/dokku,mrluc/dokku,ertrzyiks/dokku,arsthanea/dokku,matto1990/dokku,crisward/dokku,bkniffler/dokku,kennetham/dokku,fmoliveira/dokku,ddreaml/dokku,biddyweb/dokku,Evlos/dokku,pmvieira/dokku,matto1990/dokku,henrik/dokku,eljojo/dokku,Cactusbone/dokku,fmoliveira/dokku,rocLv/dokku,claytonbrown/dokku,ebeigarts/dokku,barseghyanartur/dokku,vlipco/dokku,rocLv/dokku,sseemayer/dokku,Cactusbone/dokku,jazzzz/dokku,claytonbrown/dokku,johnfraney/dokku,eljojo/dokku,Cactusbone/dokku,econya/dokku,vlipco/dokku,Cactusbone/dokku,mrluc/dokku,sseemayer/dokku,mrluc/dokku,ebeigarts/dokku,progrium/dokku,experimental-platform/dokku
|
shell
|
## Code Before:
if [[ ! -z $DOKKU_HOST ]]; then
function dokku {
appname=$(git remote -v 2>/dev/null | grep dokku | head -n 1 | cut -f1 -d' ' | cut -f2 -d':' 2>/dev/null)
if [[ "$?" != "0" ]]; then
donotshift="YES"
fi
if [[ "$1" = "create" ]]; then
appname=$(echo "print(elfs.GenName())" | lua -l elfs)
if git remote add dokku dokku@$DOKKU_HOST:$appname
then
echo "-----> Dokku remote added at $DOKKU_HOST"
echo "-----> Application name is $appname"
else
echo "! Dokku remote not added! Do you already have a dokku remote?"
return
fi
git push dokku master
return $?
fi
if [[ -z "$donotshift" ]]; then
ssh dokku@$DOKKU_HOST $*
exit
fi
verb=$1
shift
ssh dokku@$DOKKU_HOST "$verb" "$appname" $@
}
fi
## Instruction:
Add DOKKU_TRACE support to turn on command tracing
## Code After:
set -eo pipefail; [[ $DOKKU_TRACE ]] && set -x
if [[ ! -z $DOKKU_HOST ]]; then
function dokku {
appname=$(git remote -v 2>/dev/null | grep dokku | head -n 1 | cut -f1 -d' ' | cut -f2 -d':' 2>/dev/null)
if [[ "$?" != "0" ]]; then
donotshift="YES"
fi
if [[ "$1" = "create" ]]; then
appname=$(echo "print(elfs.GenName())" | lua -l elfs)
if git remote add dokku dokku@$DOKKU_HOST:$appname
then
echo "-----> Dokku remote added at $DOKKU_HOST"
echo "-----> Application name is $appname"
else
echo "! Dokku remote not added! Do you already have a dokku remote?"
return
fi
git push dokku master
return $?
fi
if [[ -z "$donotshift" ]]; then
ssh dokku@$DOKKU_HOST $*
exit
fi
verb=$1
shift
ssh dokku@$DOKKU_HOST "$verb" "$appname" $@
}
fi
|
f52c8cc3938567a24ac6ea0a807654aa73caa871
|
pages/views.py
|
pages/views.py
|
from pages.models import Page, Language, Content
from pages.utils import auto_render
from django.contrib.admin.views.decorators import staff_member_required
from django import forms
from django.http import Http404
import settings
@auto_render
def details(request, page_id=None):
template = None
lang = Language.get_from_request(request)
pages = Page.objects.filter(parent__isnull=True).order_by("tree_id")
if len(pages) > 0:
if page_id:
try:
current_page = Page.objects.get(id=int(page_id), status=1)
except Page.DoesNotExist:
raise Http404
else:
# get the first root page
current_page = pages[0]
template = current_page.get_template()
else:
template = settings.DEFAULT_PAGE_TEMPLATE
return template, locals()
|
from pages.models import Page, Language, Content
from pages.utils import auto_render
from django.contrib.admin.views.decorators import staff_member_required
from django import forms
from django.http import Http404
import settings
@auto_render
def details(request, page_id=None):
template = None
lang = Language.get_from_request(request)
pages = Page.objects.filter(parent__isnull=True).order_by("tree_id")
if len(pages) > 0:
if page_id:
try:
current_page = Page.objects.get(id=int(page_id), status=1)
except Page.DoesNotExist:
raise Http404
else:
# get the first root page
current_page = pages[0]
template = current_page.get_template()
else:
current_page = None
template = settings.DEFAULT_PAGE_TEMPLATE
return template, locals()
|
Fix a bug with an empty database
|
Fix a bug with an empty database
git-svn-id: 54fea250f97f2a4e12c6f7a610b8f07cb4c107b4@138 439a9e5f-3f3e-0410-bc46-71226ad0111b
|
Python
|
bsd-3-clause
|
pombredanne/django-page-cms-1,oliciv/django-page-cms,akaihola/django-page-cms,pombredanne/django-page-cms-1,akaihola/django-page-cms,remik/django-page-cms,pombredanne/django-page-cms-1,batiste/django-page-cms,oliciv/django-page-cms,remik/django-page-cms,batiste/django-page-cms,oliciv/django-page-cms,batiste/django-page-cms,remik/django-page-cms,remik/django-page-cms,akaihola/django-page-cms
|
python
|
## Code Before:
from pages.models import Page, Language, Content
from pages.utils import auto_render
from django.contrib.admin.views.decorators import staff_member_required
from django import forms
from django.http import Http404
import settings
@auto_render
def details(request, page_id=None):
template = None
lang = Language.get_from_request(request)
pages = Page.objects.filter(parent__isnull=True).order_by("tree_id")
if len(pages) > 0:
if page_id:
try:
current_page = Page.objects.get(id=int(page_id), status=1)
except Page.DoesNotExist:
raise Http404
else:
# get the first root page
current_page = pages[0]
template = current_page.get_template()
else:
template = settings.DEFAULT_PAGE_TEMPLATE
return template, locals()
## Instruction:
Fix a bug with an empty database
git-svn-id: 54fea250f97f2a4e12c6f7a610b8f07cb4c107b4@138 439a9e5f-3f3e-0410-bc46-71226ad0111b
## Code After:
from pages.models import Page, Language, Content
from pages.utils import auto_render
from django.contrib.admin.views.decorators import staff_member_required
from django import forms
from django.http import Http404
import settings
@auto_render
def details(request, page_id=None):
template = None
lang = Language.get_from_request(request)
pages = Page.objects.filter(parent__isnull=True).order_by("tree_id")
if len(pages) > 0:
if page_id:
try:
current_page = Page.objects.get(id=int(page_id), status=1)
except Page.DoesNotExist:
raise Http404
else:
# get the first root page
current_page = pages[0]
template = current_page.get_template()
else:
current_page = None
template = settings.DEFAULT_PAGE_TEMPLATE
return template, locals()
|
c4c84d0d2589866bebe6208717e5d019f32703a7
|
src/components/comments/Comment.sass
|
src/components/comments/Comment.sass
|
.Comments
position: relative
.Comments > .SVGIcon
position: absolute
top: rem(22)
left: 0
.CommentsLink
+sanserif
margin-left: rem(60)
color: #aaa
border-bottom: 1px solid #aaa
line-height: rem(10)
.CommentUsername
transition: color $speed
color: #aaa
.no-touch .CommentHeader .PostHeaderLink:hover .CommentUsername
color: black
.Comments .TimeAgoTool
color: #aaa
.PostHeader.CommentHeader
height: rem(45)
.CommentBody
padding-left: rem(60)
.PostTools.CommentTools
padding-left: rem(60)
|
.Comments
position: relative
.Comments > .SVGIcon
position: absolute
top: rem(22)
left: 0
.CommentsLink
+sanserif
margin-left: rem(60)
color: #aaa
border-bottom: 1px solid #aaa
line-height: rem(10)
.CommentUsername
transition: color $speed
color: #aaa
.no-touch .CommentHeader .PostHeaderLink:hover .CommentUsername
color: black
.Comments .TimeAgoTool
color: #aaa
.PostHeader.CommentHeader
height: rem(45)
.CommentBody
padding-left: rem(60)
.PostTools.CommentTools
padding-left: rem(60)
padding-top: inherit
|
Fix padding on comment tools
|
Fix padding on comment tools
|
Sass
|
mit
|
ello/webapp,ello/webapp,ello/webapp
|
sass
|
## Code Before:
.Comments
position: relative
.Comments > .SVGIcon
position: absolute
top: rem(22)
left: 0
.CommentsLink
+sanserif
margin-left: rem(60)
color: #aaa
border-bottom: 1px solid #aaa
line-height: rem(10)
.CommentUsername
transition: color $speed
color: #aaa
.no-touch .CommentHeader .PostHeaderLink:hover .CommentUsername
color: black
.Comments .TimeAgoTool
color: #aaa
.PostHeader.CommentHeader
height: rem(45)
.CommentBody
padding-left: rem(60)
.PostTools.CommentTools
padding-left: rem(60)
## Instruction:
Fix padding on comment tools
## Code After:
.Comments
position: relative
.Comments > .SVGIcon
position: absolute
top: rem(22)
left: 0
.CommentsLink
+sanserif
margin-left: rem(60)
color: #aaa
border-bottom: 1px solid #aaa
line-height: rem(10)
.CommentUsername
transition: color $speed
color: #aaa
.no-touch .CommentHeader .PostHeaderLink:hover .CommentUsername
color: black
.Comments .TimeAgoTool
color: #aaa
.PostHeader.CommentHeader
height: rem(45)
.CommentBody
padding-left: rem(60)
.PostTools.CommentTools
padding-left: rem(60)
padding-top: inherit
|
03153bc0671b5b0c8292f902c6a5f0a792a383b3
|
code/jobs/CheckComposerUpdatesJob.php
|
code/jobs/CheckComposerUpdatesJob.php
|
<?php
/**
* Composer Update checker job. Runs the check as a queuedjob.
*
* @author Peter Thaleikis
* @license MIT
*/
class CheckComposerUpdatesJob extends AbstractQueuedJob implements QueuedJob {
/**
* The task to run
*
* @var BuildTask
*/
protected $task;
/**
* define the title
*
* @return string
*/
public function getTitle() {
return _t(
'ComposerUpdateChecker.Title',
'Check if composer updates are available'
);
}
/**
* define the type.
*/
public function getJobType() {
$this->totalSteps = 1;
return QueuedJob::QUEUED;
}
/**
* init
*/
public function setup() {
// create the instance of the task
$this->task = new CheckComposerUpdatesTask();
}
/**
* process the
*/
public function process() {
$this->task->run(new SS_HTTPRequest());
}
}
|
<?php
/**
* Composer Update checker job. Runs the check as a queuedjob.
*
* @author Peter Thaleikis
* @license MIT
*/
class CheckComposerUpdatesJob extends AbstractQueuedJob implements QueuedJob {
/**
* The task to run
*
* @var BuildTask
*/
protected $task;
/**
* define the title
*
* @return string
*/
public function getTitle() {
return _t(
'ComposerUpdateChecker.Title',
'Check if composer updates are available'
);
}
/**
* define the type.
*/
public function getJobType() {
$this->totalSteps = 1;
return QueuedJob::QUEUED;
}
/**
* init
*/
public function setup() {
// create the instance of the task
$this->task = new CheckComposerUpdatesTask();
}
/**
* processes the task as a job
*/
public function process() {
// run the task
$this->task->run(new SS_HTTPRequest());
// mark job as completed
$this->isComplete = true;
}
}
|
Implement the job better - minor bugfix
|
Implement the job better - minor bugfix
|
PHP
|
mit
|
spekulatius/silverstripe-composer-update-checker
|
php
|
## Code Before:
<?php
/**
* Composer Update checker job. Runs the check as a queuedjob.
*
* @author Peter Thaleikis
* @license MIT
*/
class CheckComposerUpdatesJob extends AbstractQueuedJob implements QueuedJob {
/**
* The task to run
*
* @var BuildTask
*/
protected $task;
/**
* define the title
*
* @return string
*/
public function getTitle() {
return _t(
'ComposerUpdateChecker.Title',
'Check if composer updates are available'
);
}
/**
* define the type.
*/
public function getJobType() {
$this->totalSteps = 1;
return QueuedJob::QUEUED;
}
/**
* init
*/
public function setup() {
// create the instance of the task
$this->task = new CheckComposerUpdatesTask();
}
/**
* process the
*/
public function process() {
$this->task->run(new SS_HTTPRequest());
}
}
## Instruction:
Implement the job better - minor bugfix
## Code After:
<?php
/**
* Composer Update checker job. Runs the check as a queuedjob.
*
* @author Peter Thaleikis
* @license MIT
*/
class CheckComposerUpdatesJob extends AbstractQueuedJob implements QueuedJob {
/**
* The task to run
*
* @var BuildTask
*/
protected $task;
/**
* define the title
*
* @return string
*/
public function getTitle() {
return _t(
'ComposerUpdateChecker.Title',
'Check if composer updates are available'
);
}
/**
* define the type.
*/
public function getJobType() {
$this->totalSteps = 1;
return QueuedJob::QUEUED;
}
/**
* init
*/
public function setup() {
// create the instance of the task
$this->task = new CheckComposerUpdatesTask();
}
/**
* processes the task as a job
*/
public function process() {
// run the task
$this->task->run(new SS_HTTPRequest());
// mark job as completed
$this->isComplete = true;
}
}
|
e08d27b016895b0cd200729169c918b152b038cc
|
src/kaptan.ts
|
src/kaptan.ts
|
import { EventEmitter } from 'events';
import { Logger } from './util';
import { ServiceConstructor, ServiceContainer } from './service';
export class Kaptan extends EventEmitter {
public readonly logger: Logger;
public readonly services: ServiceContainer;
constructor(label: string = 'kaptan') {
super();
this.logger = new Logger(label);
this.services = new ServiceContainer(this);
}
public use(service: ServiceConstructor, options: {[key:string]: any} = {}) {
if (!service.Options) {
service.Options = {};
}
service.Options = { ...service.Options, ...options };
this.services.add(service);
}
public start() {
this.services.spawn();
}
}
|
import { EventEmitter } from 'events';
import { Logger } from './util';
import { Service, ServiceConstructor, ServiceContainer } from './service';
export class Kaptan extends EventEmitter {
public readonly logger: Logger;
public readonly services: ServiceContainer;
constructor(label: string = 'kaptan') {
super();
this.logger = new Logger(label);
this.services = new ServiceContainer(this);
}
public use(service: ServiceConstructor, options: {[key:string]: any} = {}) {
service = Service.copy(Service.getServiceName(service), service);
if (!service.Options) {
service.Options = {};
}
service.Options = { ...service.Options, ...options };
this.services.add(service);
}
public start() {
this.services.spawn();
}
}
|
Add duplicate service to the services instead of the original one
|
Add duplicate service to the services instead of the original one
|
TypeScript
|
mit
|
ibrahimduran/node-kaptan
|
typescript
|
## Code Before:
import { EventEmitter } from 'events';
import { Logger } from './util';
import { ServiceConstructor, ServiceContainer } from './service';
export class Kaptan extends EventEmitter {
public readonly logger: Logger;
public readonly services: ServiceContainer;
constructor(label: string = 'kaptan') {
super();
this.logger = new Logger(label);
this.services = new ServiceContainer(this);
}
public use(service: ServiceConstructor, options: {[key:string]: any} = {}) {
if (!service.Options) {
service.Options = {};
}
service.Options = { ...service.Options, ...options };
this.services.add(service);
}
public start() {
this.services.spawn();
}
}
## Instruction:
Add duplicate service to the services instead of the original one
## Code After:
import { EventEmitter } from 'events';
import { Logger } from './util';
import { Service, ServiceConstructor, ServiceContainer } from './service';
export class Kaptan extends EventEmitter {
public readonly logger: Logger;
public readonly services: ServiceContainer;
constructor(label: string = 'kaptan') {
super();
this.logger = new Logger(label);
this.services = new ServiceContainer(this);
}
public use(service: ServiceConstructor, options: {[key:string]: any} = {}) {
service = Service.copy(Service.getServiceName(service), service);
if (!service.Options) {
service.Options = {};
}
service.Options = { ...service.Options, ...options };
this.services.add(service);
}
public start() {
this.services.spawn();
}
}
|
10f73f0cc4cebd1e7f67f6942185433203378634
|
README.md
|
README.md
|
Jerky
================
[](https://travis-ci.org/sethyanow/jerky)
[](https://codeclimate.com/repos/5410b8bd6956803886002739/feed)
[](https://gemnasium.com/sethyanow/jerky)
This is a work in progress. It'll sell jerky for me, hooray!
|
Jerky
================
[](https://travis-ci.org/sethyanow/jerky)
[](https://codeclimate.com/repos/5410b8bd6956803886002739/feed)
[](https://gemnasium.com/sethyanow/jerky)
[](https://coveralls.io/r/sethyanow/jerky?branch=master)
This is a work in progress. It'll sell jerky for me, hooray!
|
Add coveralls badge to readme
|
Add coveralls badge to readme
|
Markdown
|
mit
|
sethyanow/jerky,sethyanow/jerky
|
markdown
|
## Code Before:
Jerky
================
[](https://travis-ci.org/sethyanow/jerky)
[](https://codeclimate.com/repos/5410b8bd6956803886002739/feed)
[](https://gemnasium.com/sethyanow/jerky)
This is a work in progress. It'll sell jerky for me, hooray!
## Instruction:
Add coveralls badge to readme
## Code After:
Jerky
================
[](https://travis-ci.org/sethyanow/jerky)
[](https://codeclimate.com/repos/5410b8bd6956803886002739/feed)
[](https://gemnasium.com/sethyanow/jerky)
[](https://coveralls.io/r/sethyanow/jerky?branch=master)
This is a work in progress. It'll sell jerky for me, hooray!
|
d5e9da483bc5875165f6960017844efa7ed4d404
|
spec/system/haproxy_spec.rb
|
spec/system/haproxy_spec.rb
|
require 'spec_helper'
require 'httparty'
RSpec.describe "haproxy" do
let(:management_uri) { 'http://10.244.16.3:15672' }
[0, 1].each do |job_index|
context "when the job rmq/#{job_index} is down" do
before(:all) do
bosh_director.stop('rmq', job_index)
end
after(:all) do
bosh_director.start('rmq', job_index)
end
it 'I can still access the managment UI' do
res = HTTParty.get(management_uri)
expect(res.code).to eql(200)
expect(res.body).to include('RabbitMQ Management')
end
end
end
end
|
require 'spec_helper'
require 'httparty'
RSpec.describe "haproxy" do
let(:management_uri) { "http://#{environment.bosh_manifest.job('haproxy').static_ips.first}:15672" }
[0, 1].each do |job_index|
context "when the job rmq/#{job_index} is down" do
before(:all) do
bosh_director.stop('rmq', job_index)
end
after(:all) do
bosh_director.start('rmq', job_index)
end
it 'I can still access the managment UI' do
res = HTTParty.get(management_uri)
expect(res.code).to eql(200)
expect(res.body).to include('RabbitMQ Management')
end
end
end
end
|
Read haproxy static_ips from manifest in test
|
Read haproxy static_ips from manifest in test
[#148579027]
Signed-off-by: Diego Lemos <[email protected]>
|
Ruby
|
apache-2.0
|
pivotal-cf/cf-rabbitmq-release,pivotal-cf/cf-rabbitmq-release,pivotal-cf/cf-rabbitmq-release,pivotal-cf/cf-rabbitmq-release
|
ruby
|
## Code Before:
require 'spec_helper'
require 'httparty'
RSpec.describe "haproxy" do
let(:management_uri) { 'http://10.244.16.3:15672' }
[0, 1].each do |job_index|
context "when the job rmq/#{job_index} is down" do
before(:all) do
bosh_director.stop('rmq', job_index)
end
after(:all) do
bosh_director.start('rmq', job_index)
end
it 'I can still access the managment UI' do
res = HTTParty.get(management_uri)
expect(res.code).to eql(200)
expect(res.body).to include('RabbitMQ Management')
end
end
end
end
## Instruction:
Read haproxy static_ips from manifest in test
[#148579027]
Signed-off-by: Diego Lemos <[email protected]>
## Code After:
require 'spec_helper'
require 'httparty'
RSpec.describe "haproxy" do
let(:management_uri) { "http://#{environment.bosh_manifest.job('haproxy').static_ips.first}:15672" }
[0, 1].each do |job_index|
context "when the job rmq/#{job_index} is down" do
before(:all) do
bosh_director.stop('rmq', job_index)
end
after(:all) do
bosh_director.start('rmq', job_index)
end
it 'I can still access the managment UI' do
res = HTTParty.get(management_uri)
expect(res.code).to eql(200)
expect(res.body).to include('RabbitMQ Management')
end
end
end
end
|
90274dafca573d6c672625a53c46cf6ab2a5fec3
|
test/CMakeLists.txt
|
test/CMakeLists.txt
|
add_library(catch2 OBJECT catch2/catch2-main.cpp)
include_directories (${CMAKE_SOURCE_DIR}/include/pomerol)
# Pomerol tests
set (tests
HamiltonianTest
GF1siteTest
GF2siteTest
GF4siteTest
AndersonTest
AndersonComplexTest
Anderson2PGFTest
Vertex4Test
SusceptibilityTest
)
foreach (test ${tests})
set(test_src ${test}.cpp)
add_executable(${test} ${test_src})
set (test_parameters -np 1 "./${test}")
add_test(NAME ${test} COMMAND "${MPIEXEC}" ${test_parameters})
target_link_libraries(${test}
${Boost_LIBRARIES}
${MPI_CXX_LIBRARIES}
pomerol
catch2
)
endforeach(test)
set(mpi_tests BroadcastTest MPIDispatcherTest)
foreach (test ${mpi_tests})
set(test_src ${test}.cpp)
add_executable(${test} ${test_src})
target_link_libraries(${test}
${Boost_LIBRARIES}
${MPI_CXX_LIBRARIES}
pomerol
catch2
)
foreach (np 2 4 8 16)
set(test_parameters
${MPIEXEC_NUMPROC_FLAG} ${np}
${MPIEXEC_PREFLAGS} ${test} ${MPIEXEC_POSTFLAGS})
add_test(NAME ${test}${np}cpu COMMAND "${MPIEXEC}" ${test_parameters})
endforeach(np)
endforeach(test)
|
add_library(catch2 OBJECT catch2/catch2-main.cpp)
target_include_directories(catch2 PRIVATE ${MPI_CXX_INCLUDE_PATH})
include_directories (${CMAKE_SOURCE_DIR}/include/pomerol)
# Pomerol tests
set (tests
HamiltonianTest
GF1siteTest
GF2siteTest
GF4siteTest
AndersonTest
AndersonComplexTest
Anderson2PGFTest
Vertex4Test
SusceptibilityTest
)
foreach (test ${tests})
set(test_src ${test}.cpp)
add_executable(${test} ${test_src})
set (test_parameters -np 1 "./${test}")
add_test(NAME ${test} COMMAND "${MPIEXEC}" ${test_parameters})
target_link_libraries(${test}
${Boost_LIBRARIES}
${MPI_CXX_LIBRARIES}
pomerol
catch2
)
endforeach(test)
set(mpi_tests BroadcastTest MPIDispatcherTest)
foreach (test ${mpi_tests})
set(test_src ${test}.cpp)
add_executable(${test} ${test_src})
target_link_libraries(${test}
${Boost_LIBRARIES}
${MPI_CXX_LIBRARIES}
pomerol
catch2
)
foreach (np 2 4 8 16)
set(test_parameters
${MPIEXEC_NUMPROC_FLAG} ${np}
${MPIEXEC_PREFLAGS} ${test} ${MPIEXEC_POSTFLAGS})
add_test(NAME ${test}${np}cpu COMMAND "${MPIEXEC}" ${test_parameters})
endforeach(np)
endforeach(test)
|
Add missing `target_include_directories()` to `catch2` target
|
[test] Add missing `target_include_directories()` to `catch2` target
|
Text
|
mpl-2.0
|
aeantipov/pomerol,aeantipov/pomerol,aeantipov/pomerol
|
text
|
## Code Before:
add_library(catch2 OBJECT catch2/catch2-main.cpp)
include_directories (${CMAKE_SOURCE_DIR}/include/pomerol)
# Pomerol tests
set (tests
HamiltonianTest
GF1siteTest
GF2siteTest
GF4siteTest
AndersonTest
AndersonComplexTest
Anderson2PGFTest
Vertex4Test
SusceptibilityTest
)
foreach (test ${tests})
set(test_src ${test}.cpp)
add_executable(${test} ${test_src})
set (test_parameters -np 1 "./${test}")
add_test(NAME ${test} COMMAND "${MPIEXEC}" ${test_parameters})
target_link_libraries(${test}
${Boost_LIBRARIES}
${MPI_CXX_LIBRARIES}
pomerol
catch2
)
endforeach(test)
set(mpi_tests BroadcastTest MPIDispatcherTest)
foreach (test ${mpi_tests})
set(test_src ${test}.cpp)
add_executable(${test} ${test_src})
target_link_libraries(${test}
${Boost_LIBRARIES}
${MPI_CXX_LIBRARIES}
pomerol
catch2
)
foreach (np 2 4 8 16)
set(test_parameters
${MPIEXEC_NUMPROC_FLAG} ${np}
${MPIEXEC_PREFLAGS} ${test} ${MPIEXEC_POSTFLAGS})
add_test(NAME ${test}${np}cpu COMMAND "${MPIEXEC}" ${test_parameters})
endforeach(np)
endforeach(test)
## Instruction:
[test] Add missing `target_include_directories()` to `catch2` target
## Code After:
add_library(catch2 OBJECT catch2/catch2-main.cpp)
target_include_directories(catch2 PRIVATE ${MPI_CXX_INCLUDE_PATH})
include_directories (${CMAKE_SOURCE_DIR}/include/pomerol)
# Pomerol tests
set (tests
HamiltonianTest
GF1siteTest
GF2siteTest
GF4siteTest
AndersonTest
AndersonComplexTest
Anderson2PGFTest
Vertex4Test
SusceptibilityTest
)
foreach (test ${tests})
set(test_src ${test}.cpp)
add_executable(${test} ${test_src})
set (test_parameters -np 1 "./${test}")
add_test(NAME ${test} COMMAND "${MPIEXEC}" ${test_parameters})
target_link_libraries(${test}
${Boost_LIBRARIES}
${MPI_CXX_LIBRARIES}
pomerol
catch2
)
endforeach(test)
set(mpi_tests BroadcastTest MPIDispatcherTest)
foreach (test ${mpi_tests})
set(test_src ${test}.cpp)
add_executable(${test} ${test_src})
target_link_libraries(${test}
${Boost_LIBRARIES}
${MPI_CXX_LIBRARIES}
pomerol
catch2
)
foreach (np 2 4 8 16)
set(test_parameters
${MPIEXEC_NUMPROC_FLAG} ${np}
${MPIEXEC_PREFLAGS} ${test} ${MPIEXEC_POSTFLAGS})
add_test(NAME ${test}${np}cpu COMMAND "${MPIEXEC}" ${test_parameters})
endforeach(np)
endforeach(test)
|
67ae89345e5479bb76b8cc7a0c309c03ca088721
|
.golangci.yml
|
.golangci.yml
|
run:
timeout: 10m
linters-settings:
goimports:
local-prefixes: vitess.io/vitess
linters:
disable-all: true
enable:
# Defaults
- deadcode
# - errcheck
- gosimple
- govet
- ineffassign
- staticcheck
- structcheck
- typecheck
- varcheck
# Extras
- gofmt
- goimports
# https://github.com/golangci/golangci/wiki/Configuration
service:
golangci-lint-version: 1.31.0 # use the fixed version to not introduce new linters unexpectedly
|
run:
timeout: 10m
linters-settings:
goimports:
local-prefixes: vitess.io/vitess
linters:
disable-all: true
enable:
# Defaults
- deadcode
# - errcheck
- gosimple
- govet
- ineffassign
- staticcheck
- structcheck
- typecheck
- varcheck
# Extras
- gofmt
- goimports
issues:
exclude-rules:
- path: '^go/vt/proto/'
linters:
- goimports
# https://github.com/golangci/golangci/wiki/Configuration
service:
golangci-lint-version: 1.31.0 # use the fixed version to not introduce new linters unexpectedly
|
Exclude protoc-gen-go generated files from the linter goimports check
|
Exclude protoc-gen-go generated files from the linter goimports check
Signed-off-by: Andrew Mason <[email protected]>
|
YAML
|
apache-2.0
|
tinyspeck/vitess,vitessio/vitess,vitessio/vitess,mahak/vitess,mahak/vitess,vitessio/vitess,mahak/vitess,mahak/vitess,mahak/vitess,vitessio/vitess,tinyspeck/vitess,tinyspeck/vitess,mahak/vitess,tinyspeck/vitess,vitessio/vitess,mahak/vitess,vitessio/vitess,tinyspeck/vitess,vitessio/vitess,tinyspeck/vitess,tinyspeck/vitess,mahak/vitess,vitessio/vitess
|
yaml
|
## Code Before:
run:
timeout: 10m
linters-settings:
goimports:
local-prefixes: vitess.io/vitess
linters:
disable-all: true
enable:
# Defaults
- deadcode
# - errcheck
- gosimple
- govet
- ineffassign
- staticcheck
- structcheck
- typecheck
- varcheck
# Extras
- gofmt
- goimports
# https://github.com/golangci/golangci/wiki/Configuration
service:
golangci-lint-version: 1.31.0 # use the fixed version to not introduce new linters unexpectedly
## Instruction:
Exclude protoc-gen-go generated files from the linter goimports check
Signed-off-by: Andrew Mason <[email protected]>
## Code After:
run:
timeout: 10m
linters-settings:
goimports:
local-prefixes: vitess.io/vitess
linters:
disable-all: true
enable:
# Defaults
- deadcode
# - errcheck
- gosimple
- govet
- ineffassign
- staticcheck
- structcheck
- typecheck
- varcheck
# Extras
- gofmt
- goimports
issues:
exclude-rules:
- path: '^go/vt/proto/'
linters:
- goimports
# https://github.com/golangci/golangci/wiki/Configuration
service:
golangci-lint-version: 1.31.0 # use the fixed version to not introduce new linters unexpectedly
|
a1d0a75ba9739af79711bb9438dc7e5d53cf0d92
|
README.md
|
README.md
|
iOS.MF
==============
An iOS optimised theme for iPhone, iPad and iPod Touch.
This is a very lightweight theme with only a limited set of pages available and theming applied, but which provides a good user experience for interacting with a the core functionality of SMF.
This has been used with various versions of SMF from 2.0 RC3 through to SMF 2.0.7, but has not been tested with all of them.
Gestures
==============
Swipe left to go back.
Swipe right to go forward.
Two finger tap inside topic to activate quick reply, alternatively tap on the topic title.
Installation
==============
Install the theme by uploading the iOS.MF_theme_v1.zip file through the SMF Administration Center > Configuration > Themes and Layout > Manage and Install > From a file.
Credits
==============
This theme is based on the excellent SMF4iPhone by Fabius, butchs and FarFromPerfection: http://custom.simplemachines.org/themes/index.php?lemma=2089.
Thanks to xKroniK13x for contributions and suggestions towards this theme.
The theme uses the Adk Skype Emoticons to give crisp retina smileys (licence: http://creativecommons.org/licenses/by/3.0/).
Internally we use the jQuery, Hammer.js, Touchy and jQuery Autosize libraries.
|
iOS.MF
==============
An iOS optimised theme for iPhone, iPad and iPod Touch.
This is a very lightweight theme with only a limited set of pages available and theming applied, but which provides a good user experience for interacting with the core functionality of SMF.
This has been used with various versions of SMF from 2.0 RC3 through to SMF 2.0.7, but has not been tested with all of them.
To-Do
==============
- [ ] Re-add subscription and registration templates
- [ ] Style personal messages template and add link
- [ ] Look for an alternative to the taskbar (this doesn't work amazingly with mobile Safari). This should have links to home, unread, inbox, profile (maybe), and search (maybe, depending on how it works).
- [ ] Add retina version of a new default user image
Gestures
==============
Swipe left to go back.
Swipe right to go forward.
Two finger tap inside topic to activate quick reply, alternatively tap on the topic title.
Installation
==============
Install the theme by uploading the iOS.MF_theme_v1.zip file through the SMF Administration Center > Configuration > Themes and Layout > Manage and Install > From a file.
Credits
==============
This theme is based on the excellent SMF4iPhone by Fabius, butchs and FarFromPerfection: http://custom.simplemachines.org/themes/index.php?lemma=2089.
Thanks to xKroniK13x for contributions and suggestions towards this theme.
The theme uses the Adk Skype Emoticons to give crisp retina smileys (licence: http://creativecommons.org/licenses/by/3.0/).
Internally we use the jQuery, Hammer.js, Touchy and jQuery Autosize libraries.
|
Add to-do list to readme
|
Add to-do list to readme
|
Markdown
|
mit
|
filmstarr/iOS.MF,filmstarr/iOS.MF
|
markdown
|
## Code Before:
iOS.MF
==============
An iOS optimised theme for iPhone, iPad and iPod Touch.
This is a very lightweight theme with only a limited set of pages available and theming applied, but which provides a good user experience for interacting with the core functionality of SMF.
This has been used with various versions of SMF from 2.0 RC3 through to SMF 2.0.7, but has not been tested with all of them.
Gestures
==============
Swipe left to go back.
Swipe right to go forward.
Two finger tap inside topic to activate quick reply, alternatively tap on the topic title.
Installation
==============
Install the theme by uploading the iOS.MF_theme_v1.zip file through the SMF Administration Center > Configuration > Themes and Layout > Manage and Install > From a file.
Credits
==============
This theme is based on the excellent SMF4iPhone by Fabius, butchs and FarFromPerfection: http://custom.simplemachines.org/themes/index.php?lemma=2089.
Thanks to xKroniK13x for contributions and suggestions towards this theme.
The theme uses the Adk Skype Emoticons to give crisp retina smileys (licence: http://creativecommons.org/licenses/by/3.0/).
Internally we use the jQuery, Hammer.js, Touchy and jQuery Autosize libraries.
## Instruction:
Add to-do list to readme
## Code After:
iOS.MF
==============
An iOS optimised theme for iPhone, iPad and iPod Touch.
This is a very lightweight theme with only a limited set of pages available and theming applied, but which provides a good user experience for interacting with the core functionality of SMF.
This has been used with various versions of SMF from 2.0 RC3 through to SMF 2.0.7, but has not been tested with all of them.
To-Do
==============
- [ ] Re-add subscription and registration templates
- [ ] Style personal messages template and add link
- [ ] Look for an alternative to the taskbar (this doesn't work amazingly with mobile Safari). This should have links to home, unread, inbox, profile (maybe), and search (maybe, depending on how it works).
- [ ] Add retina version of a new default user image
Gestures
==============
Swipe left to go back.
Swipe right to go forward.
Two finger tap inside topic to activate quick reply, alternatively tap on the topic title.
Installation
==============
Install the theme by uploading the iOS.MF_theme_v1.zip file through the SMF Administration Center > Configuration > Themes and Layout > Manage and Install > From a file.
Credits
==============
This theme is based on the excellent SMF4iPhone by Fabius, butchs and FarFromPerfection: http://custom.simplemachines.org/themes/index.php?lemma=2089.
Thanks to xKroniK13x for contributions and suggestions towards this theme.
The theme uses the Adk Skype Emoticons to give crisp retina smileys (licence: http://creativecommons.org/licenses/by/3.0/).
Internally we use the jQuery, Hammer.js, Touchy and jQuery Autosize libraries.
|
84898c684f9abcbf4d88efe64b0de1a06ae4db56
|
.travis.yml
|
.travis.yml
|
dist: xenial
language: python
python:
- "2.7"
- "3.6"
- "3.7"
install:
- "pip install pytest-cov coveralls pyparsing==2.3.1"
- "pip install -e .[test]"
script:
- python -m pytest --cov snuggs --cov-report term-missing
after_success:
- coveralls
deploy:
on:
tags: true
condition: "$TRAVIS_PYTHON_VERSION = 3.6"
provider: pypi
user: mapboxci
distributions: "sdist bdist_wheel"
|
dist: xenial
language: python
python:
- "2.7"
- "3.6"
- "3.7"
install:
- "pip install pytest-cov coveralls pyparsing~=2.0"
- "pip install -e .[test]"
script:
- python -m pytest --cov snuggs --cov-report term-missing
after_success:
- coveralls
deploy:
on:
tags: true
condition: "$TRAVIS_PYTHON_VERSION = 3.6"
provider: pypi
user: __token__
distributions: "sdist bdist_wheel"
|
Use token to upload, update pyparsing
|
Use token to upload, update pyparsing
|
YAML
|
mit
|
mapbox/snuggs
|
yaml
|
## Code Before:
dist: xenial
language: python
python:
- "2.7"
- "3.6"
- "3.7"
install:
- "pip install pytest-cov coveralls pyparsing==2.3.1"
- "pip install -e .[test]"
script:
- python -m pytest --cov snuggs --cov-report term-missing
after_success:
- coveralls
deploy:
on:
tags: true
condition: "$TRAVIS_PYTHON_VERSION = 3.6"
provider: pypi
user: mapboxci
distributions: "sdist bdist_wheel"
## Instruction:
Use token to upload, update pyparsing
## Code After:
dist: xenial
language: python
python:
- "2.7"
- "3.6"
- "3.7"
install:
- "pip install pytest-cov coveralls pyparsing~=2.0"
- "pip install -e .[test]"
script:
- python -m pytest --cov snuggs --cov-report term-missing
after_success:
- coveralls
deploy:
on:
tags: true
condition: "$TRAVIS_PYTHON_VERSION = 3.6"
provider: pypi
user: __token__
distributions: "sdist bdist_wheel"
|
6449899e67646ca113f8dcb20bdf5060ceef86e3
|
ptrhost.go
|
ptrhost.go
|
package main
import (
"fmt"
"net"
"os"
flags "github.com/jessevdk/go-flags"
)
const version = "1.0.0"
var opts struct {
Version bool `short:"v" long:"version" description:"Show version"`
}
func main() {
parser := flags.NewParser(&opts, flags.Default)
parser.Usage = "HOSTNAME [OPTIONS]"
args, _ := parser.Parse()
if len(args) == 0 {
if opts.Version {
fmt.Println("ptrhost version", version)
}
os.Exit(1)
}
hostname := args[0]
addr, _ := net.LookupHost(hostname)
for _, v := range addr {
ptrAddr, _ := net.LookupAddr(v)
fmt.Println(v, "->", ptrAddr[0])
}
}
|
package main
import (
"fmt"
"net"
"os"
flags "github.com/jessevdk/go-flags"
)
const version = "1.0.0"
var opts struct {
Version bool `short:"v" long:"version" description:"Show version"`
}
func main() {
parser := flags.NewParser(&opts, flags.Default)
parser.Usage = "HOSTNAME [OPTIONS]"
args, _ := parser.Parse()
if len(args) == 0 {
if opts.Version {
fmt.Println("ptrhost version", version)
}
os.Exit(1)
}
hostname := args[0]
addr, _ := net.LookupHost(hostname)
for _, v := range addr {
resolvedHost, err := net.LookupAddr(v)
if err == nil {
fmt.Println(v, "->", resolvedHost[0])
} else {
fmt.Println(v, "->", err.(*net.DNSError).Err)
}
}
}
|
Fix crash when none resolved hostname
|
Fix crash when none resolved hostname
|
Go
|
mit
|
itochan/ptrhost
|
go
|
## Code Before:
package main
import (
"fmt"
"net"
"os"
flags "github.com/jessevdk/go-flags"
)
const version = "1.0.0"
var opts struct {
Version bool `short:"v" long:"version" description:"Show version"`
}
func main() {
parser := flags.NewParser(&opts, flags.Default)
parser.Usage = "HOSTNAME [OPTIONS]"
args, _ := parser.Parse()
if len(args) == 0 {
if opts.Version {
fmt.Println("ptrhost version", version)
}
os.Exit(1)
}
hostname := args[0]
addr, _ := net.LookupHost(hostname)
for _, v := range addr {
ptrAddr, _ := net.LookupAddr(v)
fmt.Println(v, "->", ptrAddr[0])
}
}
## Instruction:
Fix crash when none resolved hostname
## Code After:
package main
import (
"fmt"
"net"
"os"
flags "github.com/jessevdk/go-flags"
)
const version = "1.0.0"
var opts struct {
Version bool `short:"v" long:"version" description:"Show version"`
}
func main() {
parser := flags.NewParser(&opts, flags.Default)
parser.Usage = "HOSTNAME [OPTIONS]"
args, _ := parser.Parse()
if len(args) == 0 {
if opts.Version {
fmt.Println("ptrhost version", version)
}
os.Exit(1)
}
hostname := args[0]
addr, _ := net.LookupHost(hostname)
for _, v := range addr {
resolvedHost, err := net.LookupAddr(v)
if err == nil {
fmt.Println(v, "->", resolvedHost[0])
} else {
fmt.Println(v, "->", err.(*net.DNSError).Err)
}
}
}
|
2aef104da6bf6ce98619fe5bac5718533e2e7530
|
yunity/utils/tests/misc.py
|
yunity/utils/tests/misc.py
|
from importlib import import_module
from json import dumps as dump_json
from json import loads as load_json
def json_stringify(data):
return dump_json(data, sort_keys=True, separators=(',', ':')).encode("utf-8") if data else None
def content_json(response):
return load_json(response.content.decode("utf-8"))
def is_test_resource(resource):
return resource.startswith('test_')
def maybe_import(resource):
try:
return import_module(resource)
except ImportError:
return None
|
from importlib import import_module
from json import dumps as dump_json
from json import loads as load_json
def json_stringify(data):
return dump_json(data, sort_keys=True, separators=(',', ':')).encode("utf-8") if data else None
def content_json(response):
try:
return load_json(response.content.decode("utf-8"))
except ValueError:
raise ValueError('invalid json content in response')
def is_test_resource(resource):
return resource.startswith('test_')
def maybe_import(resource):
try:
return import_module(resource)
except ImportError:
return None
|
Improve error on invalid JSON response content
|
Improve error on invalid JSON response content
with @NerdyProjects
|
Python
|
agpl-3.0
|
yunity/yunity-core,yunity/foodsaving-backend,yunity/foodsaving-backend,yunity/yunity-core,yunity/foodsaving-backend
|
python
|
## Code Before:
from importlib import import_module
from json import dumps as dump_json
from json import loads as load_json
def json_stringify(data):
return dump_json(data, sort_keys=True, separators=(',', ':')).encode("utf-8") if data else None
def content_json(response):
return load_json(response.content.decode("utf-8"))
def is_test_resource(resource):
return resource.startswith('test_')
def maybe_import(resource):
try:
return import_module(resource)
except ImportError:
return None
## Instruction:
Improve error on invalid JSON response content
with @NerdyProjects
## Code After:
from importlib import import_module
from json import dumps as dump_json
from json import loads as load_json
def json_stringify(data):
return dump_json(data, sort_keys=True, separators=(',', ':')).encode("utf-8") if data else None
def content_json(response):
try:
return load_json(response.content.decode("utf-8"))
except ValueError:
raise ValueError('invalid json content in response')
def is_test_resource(resource):
return resource.startswith('test_')
def maybe_import(resource):
try:
return import_module(resource)
except ImportError:
return None
|
f7c35786ae89bf59193e4dbcc3ad9ac351fa880b
|
.github/workflows/build.yml
|
.github/workflows/build.yml
|
name: Lint, Build and Run
on: [push]
jobs:
lint:
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v1
- name: ESLint
uses: gimenete/eslint-action@master
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
build:
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v1
- name: Start MongoDB As Docker
uses: wbari/[email protected]
with:
mongoDBVersion: 4.2
- name: Read .nvmrc
run: echo "##[set-output name=NVMRC;]$(cat .nvmrc)"
id: nvm
- name: Cache
uses: actions/cache@v1
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Setup Node
uses: actions/setup-node@v1
with:
node-version: "${{ steps.nvm.outputs.NVMRC }}"
- name: Install Dependencies
if: steps.cache.outputs.cache-hit != 'true'
run: npm install
- name: Run Test Cases
run: npm run test
|
name: Lint, Build and Run
on: [push]
jobs:
lint:
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v1
- name: ESLint
uses: gimenete/eslint-action@master
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
build:
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v1
- name: Start MongoDB As Docker
uses: wbari/[email protected]
with:
mongoDBVersion: 4.2
- name: Read .nvmrc
run: echo "##[set-output name=NVMRC;]$(cat .nvmrc)"
id: nvm
- name: Cache
uses: actions/cache@v1
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Setup Node
uses: actions/setup-node@v1
with:
node-version: "${{ steps.nvm.outputs.NVMRC }}"
- name: Install Dependencies
if: steps.cache.outputs.cache-hit != 'true'
run: npm install
- name: Run Test Cases
run: npm run test
- name: Run Accessibility Tests
run: npm run test:accessibility
|
Add accessibility test to GitHub Action
|
Add accessibility test to GitHub Action
|
YAML
|
agpl-3.0
|
techx/quill,techx/quill
|
yaml
|
## Code Before:
name: Lint, Build and Run
on: [push]
jobs:
lint:
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v1
- name: ESLint
uses: gimenete/eslint-action@master
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
build:
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v1
- name: Start MongoDB As Docker
uses: wbari/[email protected]
with:
mongoDBVersion: 4.2
- name: Read .nvmrc
run: echo "##[set-output name=NVMRC;]$(cat .nvmrc)"
id: nvm
- name: Cache
uses: actions/cache@v1
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Setup Node
uses: actions/setup-node@v1
with:
node-version: "${{ steps.nvm.outputs.NVMRC }}"
- name: Install Dependencies
if: steps.cache.outputs.cache-hit != 'true'
run: npm install
- name: Run Test Cases
run: npm run test
## Instruction:
Add accessibility test to GitHub Action
## Code After:
name: Lint, Build and Run
on: [push]
jobs:
lint:
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v1
- name: ESLint
uses: gimenete/eslint-action@master
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
build:
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v1
- name: Start MongoDB As Docker
uses: wbari/[email protected]
with:
mongoDBVersion: 4.2
- name: Read .nvmrc
run: echo "##[set-output name=NVMRC;]$(cat .nvmrc)"
id: nvm
- name: Cache
uses: actions/cache@v1
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Setup Node
uses: actions/setup-node@v1
with:
node-version: "${{ steps.nvm.outputs.NVMRC }}"
- name: Install Dependencies
if: steps.cache.outputs.cache-hit != 'true'
run: npm install
- name: Run Test Cases
run: npm run test
- name: Run Accessibility Tests
run: npm run test:accessibility
|
56fe9db9cdf303b6ea87336917d3250aeb1fb699
|
requirements-dev.txt
|
requirements-dev.txt
|
-r requirements.txt
alabaster==0.7.7
Babel==2.2.0
boto3==1.2.3
botocore==1.3.21
bumpversion==0.5.3
coverage==4.0.3
docutils==0.12
flake8==2.5.1
futures==3.0.4
Jinja2==2.8
jmespath==0.9.0
MarkupSafe==0.23
mccabe==0.3.1
nose==1.3.7
pep8==1.7.0
pyflakes==1.0.0
Pygments==2.1
python-dateutil==2.4.2
python-lambda-local==0.1.2
snowballstemmer==1.2.1
Sphinx==1.3.4
sphinx-rtd-theme==0.1.9
|
-r requirements.txt
alabaster==0.7.7
Babel==2.2.0
boto3==1.2.3
botocore==1.3.21
bumpversion==0.5.3
coverage==4.0.3
docutils==0.12
flake8==2.5.1
funcsigs==0.4
futures==3.0.4
Jinja2==2.8
jmespath==0.9.0
MarkupSafe==0.23
mccabe==0.3.1
mock==1.3.0
nose==1.3.7
pbr==1.8.1
pep8==1.7.0
pyflakes==1.0.0
Pygments==2.1
python-dateutil==2.4.2
python-lambda-local==0.1.2
snowballstemmer==1.2.1
Sphinx==1.3.4
sphinx-rtd-theme==0.1.9
|
Add mock + friends as dev requirements
|
Add mock + friends as dev requirements
|
Text
|
mit
|
marvinpinto/irc-hooky,marvinpinto/irc-hooky
|
text
|
## Code Before:
-r requirements.txt
alabaster==0.7.7
Babel==2.2.0
boto3==1.2.3
botocore==1.3.21
bumpversion==0.5.3
coverage==4.0.3
docutils==0.12
flake8==2.5.1
futures==3.0.4
Jinja2==2.8
jmespath==0.9.0
MarkupSafe==0.23
mccabe==0.3.1
nose==1.3.7
pep8==1.7.0
pyflakes==1.0.0
Pygments==2.1
python-dateutil==2.4.2
python-lambda-local==0.1.2
snowballstemmer==1.2.1
Sphinx==1.3.4
sphinx-rtd-theme==0.1.9
## Instruction:
Add mock + friends as dev requirements
## Code After:
-r requirements.txt
alabaster==0.7.7
Babel==2.2.0
boto3==1.2.3
botocore==1.3.21
bumpversion==0.5.3
coverage==4.0.3
docutils==0.12
flake8==2.5.1
funcsigs==0.4
futures==3.0.4
Jinja2==2.8
jmespath==0.9.0
MarkupSafe==0.23
mccabe==0.3.1
mock==1.3.0
nose==1.3.7
pbr==1.8.1
pep8==1.7.0
pyflakes==1.0.0
Pygments==2.1
python-dateutil==2.4.2
python-lambda-local==0.1.2
snowballstemmer==1.2.1
Sphinx==1.3.4
sphinx-rtd-theme==0.1.9
|
211bab7b6206c1eae81dabbc2823f249a19ec0a7
|
applications/calendar/src/app/components/layout/AuthSidebar.js
|
applications/calendar/src/app/components/layout/AuthSidebar.js
|
import React from 'react';
import { useModals, NavMenu } from 'react-components';
import { c } from 'ttag';
import EventModal from '../modals/EventModal';
import MiniCalendar from '../MiniCalendar';
const AuthSidebar = () => {
const { createModal } = useModals();
const handleSelect = () => {};
const list = [
{
icon: 'plus',
text: c('Action').t`Add event`,
type: 'button',
onClick() {
createModal(<EventModal />);
}
},
{
icon: 'calendar',
text: 'My calendar',
link: '/calendar'
}
];
return (
<div className="sidebar flex noprint">
<MiniCalendar onSelect={handleSelect} />
<nav className="navigation flex-item-fluid scroll-if-needed mb1">
<NavMenu list={list} />
</nav>
</div>
);
};
export default AuthSidebar;
|
import React from 'react';
import { useModals, NavMenu } from 'react-components';
import { c } from 'ttag';
import moment from 'moment';
import EventModal from '../modals/EventModal';
import MiniCalendar from '../MiniCalendar';
const AuthSidebar = () => {
const { createModal } = useModals();
const handleSelect = () => {};
const list = [
{
icon: 'plus',
text: c('Action').t`Add event`,
type: 'button',
onClick() {
createModal(<EventModal />);
}
},
{
icon: 'calendar',
text: 'My calendar',
link: '/calendar'
}
];
return (
<div className="sidebar flex noprint">
<MiniCalendar
defaultDate={new Date()}
setDefaultDate
onSelect={handleSelect}
format={moment.localeData().longDateFormat('L')}
/>
<nav className="navigation flex-item-fluid scroll-if-needed mb1">
<NavMenu list={list} />
</nav>
</div>
);
};
export default AuthSidebar;
|
Add prop for mini calendar
|
Add prop for mini calendar
|
JavaScript
|
mit
|
ProtonMail/WebClient,ProtonMail/WebClient,ProtonMail/WebClient
|
javascript
|
## Code Before:
import React from 'react';
import { useModals, NavMenu } from 'react-components';
import { c } from 'ttag';
import EventModal from '../modals/EventModal';
import MiniCalendar from '../MiniCalendar';
const AuthSidebar = () => {
const { createModal } = useModals();
const handleSelect = () => {};
const list = [
{
icon: 'plus',
text: c('Action').t`Add event`,
type: 'button',
onClick() {
createModal(<EventModal />);
}
},
{
icon: 'calendar',
text: 'My calendar',
link: '/calendar'
}
];
return (
<div className="sidebar flex noprint">
<MiniCalendar onSelect={handleSelect} />
<nav className="navigation flex-item-fluid scroll-if-needed mb1">
<NavMenu list={list} />
</nav>
</div>
);
};
export default AuthSidebar;
## Instruction:
Add prop for mini calendar
## Code After:
import React from 'react';
import { useModals, NavMenu } from 'react-components';
import { c } from 'ttag';
import moment from 'moment';
import EventModal from '../modals/EventModal';
import MiniCalendar from '../MiniCalendar';
const AuthSidebar = () => {
const { createModal } = useModals();
const handleSelect = () => {};
const list = [
{
icon: 'plus',
text: c('Action').t`Add event`,
type: 'button',
onClick() {
createModal(<EventModal />);
}
},
{
icon: 'calendar',
text: 'My calendar',
link: '/calendar'
}
];
return (
<div className="sidebar flex noprint">
<MiniCalendar
defaultDate={new Date()}
setDefaultDate
onSelect={handleSelect}
format={moment.localeData().longDateFormat('L')}
/>
<nav className="navigation flex-item-fluid scroll-if-needed mb1">
<NavMenu list={list} />
</nav>
</div>
);
};
export default AuthSidebar;
|
d5588c14fc517a0a5d85d09ded16e19addd42537
|
lib/userbin/configuration.rb
|
lib/userbin/configuration.rb
|
module Userbin
class Configuration
attr_accessor :app_id
attr_accessor :api_secret
attr_accessor :user_model
attr_accessor :restricted_path
end
end
|
module Userbin
class Configuration
attr_accessor :app_id
attr_accessor :api_secret
attr_accessor :user_model
attr_accessor :restricted_path
def initialize
self.app_id = ENV["USERBIN_APP_ID"]
self.api_secret = ENV["USERBIN_API_SECRET"]
end
end
end
|
Set default credentials based on ENV
|
Set default credentials based on ENV
|
Ruby
|
mit
|
givey/castle-ruby,castle/castle-ruby,castle/castle-ruby,cloud66/castle-ruby
|
ruby
|
## Code Before:
module Userbin
class Configuration
attr_accessor :app_id
attr_accessor :api_secret
attr_accessor :user_model
attr_accessor :restricted_path
end
end
## Instruction:
Set default credentials based on ENV
## Code After:
module Userbin
class Configuration
attr_accessor :app_id
attr_accessor :api_secret
attr_accessor :user_model
attr_accessor :restricted_path
def initialize
self.app_id = ENV["USERBIN_APP_ID"]
self.api_secret = ENV["USERBIN_API_SECRET"]
end
end
end
|
cb371b4b931ac425428676b645a8813ba2be485c
|
lib/resque/tasks.rb
|
lib/resque/tasks.rb
|
namespace :resque do
task :setup
desc "Start a Resque worker"
task :work => :setup do
require 'resque'
queues = (ENV['QUEUES'] || ENV['QUEUE']).to_s.split(',')
begin
worker = Resque::Worker.new(*queues)
worker.verbose = ENV['LOGGING'] || ENV['VERBOSE']
worker.very_verbose = ENV['VVERBOSE']
rescue Resque::NoQueueError
abort "set QUEUE env var, e.g. $ QUEUE=critical,high rake resque:work"
end
if ENV['PIDFILE']
File.open(ENV['PIDFILE'], 'w') { |f| f << worker.pid }
end
worker.log "Starting worker #{worker}"
worker.work(ENV['INTERVAL'] || 5) # interval, will block
end
desc "Start multiple Resque workers. Should only be used in dev mode."
task :workers do
threads = []
ENV['COUNT'].to_i.times do
threads << Thread.new do
system "rake resque:work"
end
end
threads.each { |thread| thread.join }
end
end
# Preload app files
task :environment do
Dir['app/**/*.rb'].each do |file|
require file
end
end
|
namespace :resque do
task :setup
desc "Start a Resque worker"
task :work => [ :preload, :setup ] do
require 'resque'
queues = (ENV['QUEUES'] || ENV['QUEUE']).to_s.split(',')
begin
worker = Resque::Worker.new(*queues)
worker.verbose = ENV['LOGGING'] || ENV['VERBOSE']
worker.very_verbose = ENV['VVERBOSE']
rescue Resque::NoQueueError
abort "set QUEUE env var, e.g. $ QUEUE=critical,high rake resque:work"
end
if ENV['PIDFILE']
File.open(ENV['PIDFILE'], 'w') { |f| f << worker.pid }
end
worker.log "Starting worker #{worker}"
worker.work(ENV['INTERVAL'] || 5) # interval, will block
end
desc "Start multiple Resque workers. Should only be used in dev mode."
task :workers do
threads = []
ENV['COUNT'].to_i.times do
threads << Thread.new do
system "rake resque:work"
end
end
threads.each { |thread| thread.join }
end
# Preload app files if this is Rails
task :preload do
if defined? RAILS_ROOT
Dir["#{RAILS_ROOT}/app/**/*.rb"].each do |file|
require file
end
end
end
end
|
Use RAILS_ROOT in preload task
|
Use RAILS_ROOT in preload task
|
Ruby
|
mit
|
tylerdooling/resque,resque/resque,m0dd3r/resque,fmitech/resque,lukeasrodgers/resque,kangkot/resque,paran0ids0ul/resque,begriffs/resque,delectable/resque,Shopify/resque,github/resque,sufery/resque,igorjancic/resque,redsquirrel/resque,ktheory/resque,stvp/resque,rutikasb/resque,CloudVLab/resque,bhighley/resque,coupa/resque,paran0ids0ul/resque,hone/resque,jonatas/resque,lileeyao/resque,ryan-drake-stoker/resque,xiangzhuyuan/resque,CloudVLab/resque,coupa/resque,vietnt/resque,PerfectMemory/resque,l4u/resque,redsquirrel/resque,dylanahsmith/resque,xiangzhuyuan/resque,resque/resque,jacknguyen/resque,redhotpenguin/resque,ejlangev/resque,stvp/resque,lukeasrodgers/resque,esdras/organizze-resque,PerfectMemory/resque,l4u/resque,igorjancic/resque,stitchfix/resque,vladimirich/resque,jacknguyen/resque,thecity/resque,hoffmanc/resque,BIAINC/resque,Shopify/resque,ejlangev/resque,lileeyao/resque,eileencodes/resque,sideci-sample/sideci-sample-resque,ngpestelos/resque,stitchfix/resque,AlexVPopov/resque,github/resque,amiel/resque,xiangzhuyuan/resque,ktheory/resque,thecity/resque,delectable/resque,Shopify/resque,sideci-sample/sideci-sample-resque,vladimirich/resque,dylanahsmith/resque,ducthanh/resque,CloudVLab/resque,esdras/organizze-resque,m0dd3r/resque,begriffs/resque,bonekost/resque,partap/resque,timo-p/resque,hoffmanc/resque,BIAINC/resque,timo-p/resque,stvp/resque,tylerdooling/resque,ychaim/resque,hone/resque,coupa/resque,sufery/resque,fmitech/resque,resque/resque,github/resque,amiel/resque
|
ruby
|
## Code Before:
namespace :resque do
task :setup
desc "Start a Resque worker"
task :work => :setup do
require 'resque'
queues = (ENV['QUEUES'] || ENV['QUEUE']).to_s.split(',')
begin
worker = Resque::Worker.new(*queues)
worker.verbose = ENV['LOGGING'] || ENV['VERBOSE']
worker.very_verbose = ENV['VVERBOSE']
rescue Resque::NoQueueError
abort "set QUEUE env var, e.g. $ QUEUE=critical,high rake resque:work"
end
if ENV['PIDFILE']
File.open(ENV['PIDFILE'], 'w') { |f| f << worker.pid }
end
worker.log "Starting worker #{worker}"
worker.work(ENV['INTERVAL'] || 5) # interval, will block
end
desc "Start multiple Resque workers. Should only be used in dev mode."
task :workers do
threads = []
ENV['COUNT'].to_i.times do
threads << Thread.new do
system "rake resque:work"
end
end
threads.each { |thread| thread.join }
end
end
# Preload app files
task :environment do
Dir['app/**/*.rb'].each do |file|
require file
end
end
## Instruction:
Use RAILS_ROOT in preload task
## Code After:
namespace :resque do
  task :setup
  desc "Start a Resque worker"
  task :work => [ :preload, :setup ] do
    require 'resque'
    queues = (ENV['QUEUES'] || ENV['QUEUE']).to_s.split(',')
    begin
      worker = Resque::Worker.new(*queues)
      worker.verbose = ENV['LOGGING'] || ENV['VERBOSE']
      worker.very_verbose = ENV['VVERBOSE']
    rescue Resque::NoQueueError
      abort "set QUEUE env var, e.g. $ QUEUE=critical,high rake resque:work"
    end
    if ENV['PIDFILE']
      File.open(ENV['PIDFILE'], 'w') { |f| f << worker.pid }
    end
    worker.log "Starting worker #{worker}"
    worker.work(ENV['INTERVAL'] || 5) # interval, will block
  end
  desc "Start multiple Resque workers. Should only be used in dev mode."
  task :workers do
    threads = []
    ENV['COUNT'].to_i.times do
      threads << Thread.new do
        system "rake resque:work"
      end
    end
    threads.each { |thread| thread.join }
  end
  # Preload app files if this is Rails
  task :preload do
    if defined? RAILS_ROOT
      Dir["#{RAILS_ROOT}/app/**/*.rb"].each do |file|
        require file
      end
    end
  end
end
|
d5b22dfcf0b85f25181c31e520dc878a9fed65e8
|
.travis.yml
|
.travis.yml
|
language: ruby
sudo: false
compiler:
- clang
- gcc
before_install:
- sudo apt-get -y install python-software-properties
- sudo add-apt-repository -y ppa:mnunberg/cmake
- sudo apt-get update
- sudo apt-get -y install libgtest-dev libssl-dev libev-dev libevent-dev cmake
# LibCouchbase
- sudo wget -O/etc/apt/sources.list.d/couchbase.list http://packages.couchbase.com/ubuntu/couchbase-ubuntu1204.list
- "sudo wget http://packages.couchbase.com/ubuntu/couchbase.key && sudo cat couchbase.key | sudo apt-key add -"
- sudo apt-get update
- sudo apt-get install libcouchbase2 libcouchbase-dev
rvm:
- 1.9.3
- jruby-18mode # JRuby in 1.8 mode
- jruby-19mode # JRuby in 1.9 mode
- rbx-2.1.1
- 1.8.7
- ree
|
language: ruby
sudo: false
compiler:
- clang
- gcc
before_install:
- sudo apt-get -y install python-software-properties
- sudo add-apt-repository -y ppa:mnunberg/cmake
- sudo apt-get update
- sudo apt-get -y install libgtest-dev libssl-dev libev-dev libevent-dev cmake
# LibCouchbase
- sudo wget -O/etc/apt/sources.list.d/couchbase.list http://packages.couchbase.com/ubuntu/couchbase-ubuntu1204.list
- "sudo wget http://packages.couchbase.com/ubuntu/couchbase.key && sudo cat couchbase.key | sudo apt-key add -"
- sudo apt-get update
- sudo apt-get install libcouchbase2 libcouchbase-dev
rvm:
- 2.2
- 2.1
- 2.0.0
- 1.9.3
|
Support 2.2.x, 2.1.x, 2.0.0, 1.9.3
|
Support 2.2.x, 2.1.x, 2.0.0, 1.9.3
|
YAML
|
mit
|
korczis/jetel,korczis/jetel,korczis/jetel
|
yaml
|
## Code Before:
language: ruby
sudo: false
compiler:
- clang
- gcc
before_install:
- sudo apt-get -y install python-software-properties
- sudo add-apt-repository -y ppa:mnunberg/cmake
- sudo apt-get update
- sudo apt-get -y install libgtest-dev libssl-dev libev-dev libevent-dev cmake
# LibCouchbase
- sudo wget -O/etc/apt/sources.list.d/couchbase.list http://packages.couchbase.com/ubuntu/couchbase-ubuntu1204.list
- "sudo wget http://packages.couchbase.com/ubuntu/couchbase.key && sudo cat couchbase.key | sudo apt-key add -"
- sudo apt-get update
- sudo apt-get install libcouchbase2 libcouchbase-dev
rvm:
- 1.9.3
- jruby-18mode # JRuby in 1.8 mode
- jruby-19mode # JRuby in 1.9 mode
- rbx-2.1.1
- 1.8.7
- ree
## Instruction:
Support 2.2.x, 2.1.x, 2.0.0, 1.9.3
## Code After:
language: ruby
sudo: false
compiler:
- clang
- gcc
before_install:
- sudo apt-get -y install python-software-properties
- sudo add-apt-repository -y ppa:mnunberg/cmake
- sudo apt-get update
- sudo apt-get -y install libgtest-dev libssl-dev libev-dev libevent-dev cmake
# LibCouchbase
- sudo wget -O/etc/apt/sources.list.d/couchbase.list http://packages.couchbase.com/ubuntu/couchbase-ubuntu1204.list
- "sudo wget http://packages.couchbase.com/ubuntu/couchbase.key && sudo cat couchbase.key | sudo apt-key add -"
- sudo apt-get update
- sudo apt-get install libcouchbase2 libcouchbase-dev
rvm:
- 2.2
- 2.1
- 2.0.0
- 1.9.3
|
7d0b51e497b130385527d3de80f2c2dd15335c50
|
src/client/app/soundcloud/soundcloud.service.ts
|
src/client/app/soundcloud/soundcloud.service.ts
|
import { Injectable } from '@angular/core';
import { Http, URLSearchParams } from '@angular/http';
@Injectable()
export class SoundcloudService {
  public basePath: string = 'https://api.soundcloud.com/tracks';
  private _clientId: string = '8159b4b99151c48d6aaf6770853bfd7a';
  constructor(private _http: Http) {
  }
  public getTrack(trackId: number) {
    let params = new URLSearchParams();
    params.set('client_id', this._clientId);
    let url = this.basePath + `/${trackId}`;
    return this._http.get(url, {search: params});
  }
  public search(searchTerm: string) {
    let params = new URLSearchParams();
    params.set('client_id', this._clientId);
    params.set('q', searchTerm);
    return this._http.get(this.basePath, {search: params});
  }
}
|
import { Injectable } from '@angular/core';
import { Http, URLSearchParams } from '@angular/http';
@Injectable()
export class SoundcloudService {
  public basePath: string = 'https://api.soundcloud.com/tracks';
  private _clientId: string = '8159b4b99151c48d6aaf6770853bfd7a';
  constructor(private _http: Http) {
  }
  public getTrack(trackId: number) {
    let params = new URLSearchParams();
    params.set('client_id', this._clientId);
    let url = this.basePath + `/${trackId}`;
    return this._http.get(url, {search: params});
  }
  public search(searchTerm: string) {
    const LIMIT: number = 200;
    let params = new URLSearchParams();
    params.set('client_id', this._clientId);
    params.set('q', searchTerm);
    params.set('limit', LIMIT);
    return this._http.get(this.basePath, {search: params});
  }
}
|
Set search limit to 200
|
Set search limit to 200
|
TypeScript
|
mit
|
JavierPDev/SounderProject,JavierPDev/SounderRadio,JavierPDev/SounderRadio,JavierPDev/SounderRadio,JavierPDev/SounderProject,JavierPDev/SounderProject
|
typescript
|
## Code Before:
import { Injectable } from '@angular/core';
import { Http, URLSearchParams } from '@angular/http';
@Injectable()
export class SoundcloudService {
  public basePath: string = 'https://api.soundcloud.com/tracks';
  private _clientId: string = '8159b4b99151c48d6aaf6770853bfd7a';
  constructor(private _http: Http) {
  }
  public getTrack(trackId: number) {
    let params = new URLSearchParams();
    params.set('client_id', this._clientId);
    let url = this.basePath + `/${trackId}`;
    return this._http.get(url, {search: params});
  }
  public search(searchTerm: string) {
    let params = new URLSearchParams();
    params.set('client_id', this._clientId);
    params.set('q', searchTerm);
    return this._http.get(this.basePath, {search: params});
  }
}
## Instruction:
Set search limit to 200
## Code After:
import { Injectable } from '@angular/core';
import { Http, URLSearchParams } from '@angular/http';
@Injectable()
export class SoundcloudService {
  public basePath: string = 'https://api.soundcloud.com/tracks';
  private _clientId: string = '8159b4b99151c48d6aaf6770853bfd7a';
  constructor(private _http: Http) {
  }
  public getTrack(trackId: number) {
    let params = new URLSearchParams();
    params.set('client_id', this._clientId);
    let url = this.basePath + `/${trackId}`;
    return this._http.get(url, {search: params});
  }
  public search(searchTerm: string) {
    const LIMIT: number = 200;
    let params = new URLSearchParams();
    params.set('client_id', this._clientId);
    params.set('q', searchTerm);
    params.set('limit', LIMIT);
    return this._http.get(this.basePath, {search: params});
  }
}
|
cda041fc33b1aa7ecfbdaac55226d9bec5def135
|
prow/oss/cluster/monitoring/secrets/alertmanager-prow_secret.yaml
|
prow/oss/cluster/monitoring/secrets/alertmanager-prow_secret.yaml
|
apiVersion: v1
kind: Secret
metadata:
  name: alertmanager-prow
  namespace: prow-monitoring
stringData:
  alertmanager.yaml: |
    global:
      resolve_timeout: 5m
      smtp_smarthost: 'smtp.gmail.com:587'
      smtp_auth_username: ''
      smtp_auth_identity: ''
      smtp_auth_password: '{{ smtp_app_password }}'
      smtp_from: ''
      smtp_require_tls: true
    route:
      group_by: ['alertname', 'job']
      group_wait: 30s
      group_interval: 10m
      repeat_interval: 4h
      routes:
      - receiver: 'email-alerts'
        group_interval: 5m
        repeat_interval: 2h
        match_re:
          severity: 'critical|high'
    receivers:
    - name: 'email-alerts'
      email_configs:
      - to: '[email protected],[email protected]'
        text: '{{ template "custom_email_text" . }}'
    templates:
    - '*.tmpl'
  msg.tmpl: |
    {{ define "custom_email_text" }}{{ .CommonAnnotations.message }}{{ end }}
type: Opaque
|
apiVersion: v1
kind: Secret
metadata:
  name: alertmanager-prow
  namespace: prow-monitoring
stringData:
  alertmanager.yaml: |
    global:
      resolve_timeout: 5m
      smtp_smarthost: 'smtp.gmail.com:587'
      smtp_auth_username: '{{ smtp_app_username }}'
      smtp_auth_identity: '{{ smtp_app_username }}'
      smtp_auth_password: '{{ smtp_app_password }}'
      smtp_from: '{{ smtp_app_username }}'
      smtp_require_tls: true
    route:
      group_by: ['alertname', 'job']
      group_wait: 30s
      group_interval: 10m
      repeat_interval: 4h
      receiver: 'no-op-alerts'
      routes:
      - receiver: 'email-alerts'
        group_interval: 5m
        repeat_interval: 2h
        match_re:
          severity: 'critical|high'
    receivers:
    - name: 'email-alerts'
      email_configs:
      - to: '[email protected],[email protected]'
        text: '{{ template "custom_email_text" . }}'
    - name: 'no-op-alerts'
    templates:
    - '*.tmpl'
  msg.tmpl: |
    {{ define "custom_email_text" }}{{ .CommonAnnotations.message }}{{ end }}
type: Opaque
|
Add a no op receiver as default alert receiver
|
Add a no op receiver as default alert receiver
|
YAML
|
apache-2.0
|
GoogleCloudPlatform/oss-test-infra,GoogleCloudPlatform/oss-test-infra
|
yaml
|
## Code Before:
apiVersion: v1
kind: Secret
metadata:
  name: alertmanager-prow
  namespace: prow-monitoring
stringData:
  alertmanager.yaml: |
    global:
      resolve_timeout: 5m
      smtp_smarthost: 'smtp.gmail.com:587'
      smtp_auth_username: ''
      smtp_auth_identity: ''
      smtp_auth_password: '{{ smtp_app_password }}'
      smtp_from: ''
      smtp_require_tls: true
    route:
      group_by: ['alertname', 'job']
      group_wait: 30s
      group_interval: 10m
      repeat_interval: 4h
      routes:
      - receiver: 'email-alerts'
        group_interval: 5m
        repeat_interval: 2h
        match_re:
          severity: 'critical|high'
    receivers:
    - name: 'email-alerts'
      email_configs:
      - to: '[email protected],[email protected]'
        text: '{{ template "custom_email_text" . }}'
    templates:
    - '*.tmpl'
  msg.tmpl: |
    {{ define "custom_email_text" }}{{ .CommonAnnotations.message }}{{ end }}
type: Opaque
## Instruction:
Add a no op receiver as default alert receiver
## Code After:
apiVersion: v1
kind: Secret
metadata:
  name: alertmanager-prow
  namespace: prow-monitoring
stringData:
  alertmanager.yaml: |
    global:
      resolve_timeout: 5m
      smtp_smarthost: 'smtp.gmail.com:587'
      smtp_auth_username: '{{ smtp_app_username }}'
      smtp_auth_identity: '{{ smtp_app_username }}'
      smtp_auth_password: '{{ smtp_app_password }}'
      smtp_from: '{{ smtp_app_username }}'
      smtp_require_tls: true
    route:
      group_by: ['alertname', 'job']
      group_wait: 30s
      group_interval: 10m
      repeat_interval: 4h
      receiver: 'no-op-alerts'
      routes:
      - receiver: 'email-alerts'
        group_interval: 5m
        repeat_interval: 2h
        match_re:
          severity: 'critical|high'
    receivers:
    - name: 'email-alerts'
      email_configs:
      - to: '[email protected],[email protected]'
        text: '{{ template "custom_email_text" . }}'
    - name: 'no-op-alerts'
    templates:
    - '*.tmpl'
  msg.tmpl: |
    {{ define "custom_email_text" }}{{ .CommonAnnotations.message }}{{ end }}
type: Opaque
|
979bd68ff6e7fd07a636acbeb6e6b856bacbb92e
|
source/_data/categories.yml
|
source/_data/categories.yml
|
news:
  en:
    name: News
    slug: news
    category: news
  es:
    name: Noticias
    slug: noticias
    category: noticias
|
book:
  en:
    name: Books
    slug: book
    category: book
  es:
    name: Libros
    slug: libro
    category: libro
|
Replace news category with book category in data
|
Replace news category with book category in data
Now the book category is known to the category generator.
|
YAML
|
mit
|
ahaasler/hexo-theme-colos-multilingual-demo,ahaasler/hexo-theme-colos-demo,ahaasler/hexo-theme-colos-multilingual-demo
|
yaml
|
## Code Before:
news:
  en:
    name: News
    slug: news
    category: news
  es:
    name: Noticias
    slug: noticias
    category: noticias
## Instruction:
Replace news category with book category in data
Now the book category is known to the category generator.
## Code After:
book:
  en:
    name: Books
    slug: book
    category: book
  es:
    name: Libros
    slug: libro
    category: libro
|
f340a959555928e17c040d3e189f5e6abe31dd87
|
.travis.yml
|
.travis.yml
|
language: php
sudo: false
matrix:
  include:
    - php: 5.3
    - php: 5.4
    - php: 5.5
    - php: 5.6
    - php: 7.0
    - php: 7.1
    - php: hhvm
      dist: trusty
  allow_failures:
    - php: hhvm
install: travis_retry composer update --no-interaction
script: vendor/bin/phpunit
before_deploy: bin/package -v $TRAVIS_TAG
deploy:
  provider: releases
  api_key:
    secure: LL8koDM1xDqzF9t0URHvmMPyWjojyd4PeZ7IW7XYgyvD6n1H6GYrVAeKCh5wfUKFbwHoa9s5AAn6pLzra00bODVkPTmUH+FSMWz9JKLw9ODAn8HvN7C+IooxmeClGHFZc0TfHfya8/D1E9C1iXtGGEoE/GqtaYq/z0C1DLpO0OU=
  file_glob: true
  file: dist/psysh-*.tar.gz
  skip_cleanup: true
  on:
    tags: true
    repo: bobthecow/psysh
    condition: ($TRAVIS_PHP_VERSION = 5.3* || $TRAVIS_PHP_VERSION = 7.1*)
|
language: php
sudo: false
matrix:
  include:
    - php: 5.3
    - php: 5.4
    - php: 5.5
    - php: 5.6
    - php: 7.0
    - php: 7.1
    - php: hhvm
      dist: trusty
  allow_failures:
    - php: hhvm
install: travis_retry composer update --no-interaction
script: vendor/bin/phpunit --verbose
before_deploy: bin/package -v $TRAVIS_TAG
deploy:
  provider: releases
  api_key:
    secure: LL8koDM1xDqzF9t0URHvmMPyWjojyd4PeZ7IW7XYgyvD6n1H6GYrVAeKCh5wfUKFbwHoa9s5AAn6pLzra00bODVkPTmUH+FSMWz9JKLw9ODAn8HvN7C+IooxmeClGHFZc0TfHfya8/D1E9C1iXtGGEoE/GqtaYq/z0C1DLpO0OU=
  file_glob: true
  file: dist/psysh-*.tar.gz
  skip_cleanup: true
  on:
    tags: true
    repo: bobthecow/psysh
    condition: ($TRAVIS_PHP_VERSION = 5.3* || $TRAVIS_PHP_VERSION = 7.1*)
|
Enable verbose mode on phpunit on Travis
|
Enable verbose mode on phpunit on Travis
|
YAML
|
mit
|
bobthecow/psysh,bobthecow/psysh
|
yaml
|
## Code Before:
language: php
sudo: false
matrix:
  include:
    - php: 5.3
    - php: 5.4
    - php: 5.5
    - php: 5.6
    - php: 7.0
    - php: 7.1
    - php: hhvm
      dist: trusty
  allow_failures:
    - php: hhvm
install: travis_retry composer update --no-interaction
script: vendor/bin/phpunit
before_deploy: bin/package -v $TRAVIS_TAG
deploy:
  provider: releases
  api_key:
    secure: LL8koDM1xDqzF9t0URHvmMPyWjojyd4PeZ7IW7XYgyvD6n1H6GYrVAeKCh5wfUKFbwHoa9s5AAn6pLzra00bODVkPTmUH+FSMWz9JKLw9ODAn8HvN7C+IooxmeClGHFZc0TfHfya8/D1E9C1iXtGGEoE/GqtaYq/z0C1DLpO0OU=
  file_glob: true
  file: dist/psysh-*.tar.gz
  skip_cleanup: true
  on:
    tags: true
    repo: bobthecow/psysh
    condition: ($TRAVIS_PHP_VERSION = 5.3* || $TRAVIS_PHP_VERSION = 7.1*)
## Instruction:
Enable verbose mode on phpunit on Travis
## Code After:
language: php
sudo: false
matrix:
  include:
    - php: 5.3
    - php: 5.4
    - php: 5.5
    - php: 5.6
    - php: 7.0
    - php: 7.1
    - php: hhvm
      dist: trusty
  allow_failures:
    - php: hhvm
install: travis_retry composer update --no-interaction
script: vendor/bin/phpunit --verbose
before_deploy: bin/package -v $TRAVIS_TAG
deploy:
  provider: releases
  api_key:
    secure: LL8koDM1xDqzF9t0URHvmMPyWjojyd4PeZ7IW7XYgyvD6n1H6GYrVAeKCh5wfUKFbwHoa9s5AAn6pLzra00bODVkPTmUH+FSMWz9JKLw9ODAn8HvN7C+IooxmeClGHFZc0TfHfya8/D1E9C1iXtGGEoE/GqtaYq/z0C1DLpO0OU=
  file_glob: true
  file: dist/psysh-*.tar.gz
  skip_cleanup: true
  on:
    tags: true
    repo: bobthecow/psysh
    condition: ($TRAVIS_PHP_VERSION = 5.3* || $TRAVIS_PHP_VERSION = 7.1*)
|
2a55ddc68ea2eae2fd5fb9e04e4d28762a47b44d
|
lib/knife-solo/berkshelf.rb
|
lib/knife-solo/berkshelf.rb
|
require 'digest/sha1'
require 'fileutils'
require 'knife-solo/cookbook_manager'
require 'knife-solo/tools'
module KnifeSolo
  class Berkshelf
    include CookbookManager
    def self.gem_libraries
      %w[berkshelf]
    end
    def self.conf_file_name
      'Berksfile'
    end
    def install!
      path = berkshelf_path
      ui.msg "Installing Berkshelf cookbooks to '#{path}'..."
      berksfile = ::Berkshelf::Berksfile.from_file('Berksfile')
      if berksfile.respond_to?(:vendor)
        FileUtils.rm_rf(path)
        berksfile.vendor(path)
      else
        berksfile.install(:path => path)
      end
      path
    end
    def berkshelf_path
      KnifeSolo::Tools.config_value(config, :berkshelf_path) || default_path
    end
    def default_path
      File.join(::Berkshelf.berkshelf_path, 'knife-solo',
                Digest::SHA1.hexdigest(File.expand_path('.')))
    end
    def initial_config
      'site :opscode'
    end
  end
end
|
require 'digest/sha1'
require 'fileutils'
require 'knife-solo/cookbook_manager'
require 'knife-solo/tools'
module KnifeSolo
  class Berkshelf
    include CookbookManager
    def self.gem_libraries
      %w[berkshelf]
    end
    def self.conf_file_name
      'Berksfile'
    end
    def install!
      path = berkshelf_path
      ui.msg "Installing Berkshelf cookbooks to '#{path}'..."
      berksfile = ::Berkshelf::Berksfile.from_file('Berksfile')
      if berksfile.respond_to?(:vendor)
        FileUtils.rm_rf(path)
        berksfile.vendor(path)
      else
        berksfile.install(:path => path)
      end
      path
    end
    def berkshelf_path
      KnifeSolo::Tools.config_value(config, :berkshelf_path) || default_path
    end
    def default_path
      File.join(::Berkshelf.berkshelf_path, 'knife-solo',
                Digest::SHA1.hexdigest(File.expand_path('.')))
    end
    def initial_config
      if Gem::Version.new(::Berkshelf::VERSION) >= Gem::Version.new("3.0.0")
        'source "https://api.berkshelf.com"'
      else
        'site :opscode'
      end
    end
  end
end
|
Update source location for Berkshelf 3.0
|
Update source location for Berkshelf 3.0
|
Ruby
|
mit
|
a2ikm/knife-solo,analog-analytics/knife-solo,matschaffer/knife-solo,analog-analytics/knife-solo,leonid-shevtsov/knife-solo,analog-analytics/knife-solo,leonid-shevtsov/knife-solo,coletivoEITA/knife-solo,matschaffer/knife-solo,chadzilla2080/knife-solo,leonid-shevtsov/knife-solo,a2ikm/knife-solo,coletivoEITA/knife-solo,matschaffer/knife-solo,chadzilla2080/knife-solo,coletivoEITA/knife-solo,a2ikm/knife-solo,chadzilla2080/knife-solo
|
ruby
|
## Code Before:
require 'digest/sha1'
require 'fileutils'
require 'knife-solo/cookbook_manager'
require 'knife-solo/tools'
module KnifeSolo
  class Berkshelf
    include CookbookManager
    def self.gem_libraries
      %w[berkshelf]
    end
    def self.conf_file_name
      'Berksfile'
    end
    def install!
      path = berkshelf_path
      ui.msg "Installing Berkshelf cookbooks to '#{path}'..."
      berksfile = ::Berkshelf::Berksfile.from_file('Berksfile')
      if berksfile.respond_to?(:vendor)
        FileUtils.rm_rf(path)
        berksfile.vendor(path)
      else
        berksfile.install(:path => path)
      end
      path
    end
    def berkshelf_path
      KnifeSolo::Tools.config_value(config, :berkshelf_path) || default_path
    end
    def default_path
      File.join(::Berkshelf.berkshelf_path, 'knife-solo',
                Digest::SHA1.hexdigest(File.expand_path('.')))
    end
    def initial_config
      'site :opscode'
    end
  end
end
## Instruction:
Update source location for Berkshelf 3.0
## Code After:
require 'digest/sha1'
require 'fileutils'
require 'knife-solo/cookbook_manager'
require 'knife-solo/tools'
module KnifeSolo
  class Berkshelf
    include CookbookManager
    def self.gem_libraries
      %w[berkshelf]
    end
    def self.conf_file_name
      'Berksfile'
    end
    def install!
      path = berkshelf_path
      ui.msg "Installing Berkshelf cookbooks to '#{path}'..."
      berksfile = ::Berkshelf::Berksfile.from_file('Berksfile')
      if berksfile.respond_to?(:vendor)
        FileUtils.rm_rf(path)
        berksfile.vendor(path)
      else
        berksfile.install(:path => path)
      end
      path
    end
    def berkshelf_path
      KnifeSolo::Tools.config_value(config, :berkshelf_path) || default_path
    end
    def default_path
      File.join(::Berkshelf.berkshelf_path, 'knife-solo',
                Digest::SHA1.hexdigest(File.expand_path('.')))
    end
    def initial_config
      if Gem::Version.new(::Berkshelf::VERSION) >= Gem::Version.new("3.0.0")
        'source "https://api.berkshelf.com"'
      else
        'site :opscode'
      end
    end
  end
end
|
6c931091299fb5fef12f02f1e5647cb05ed10427
|
.travis.yml
|
.travis.yml
|
os:
  - linux
addons:
  apt:
    sources:
      - ubuntu-toolchain-r-test
    packages:
      - gcc-8
      - g++-8
      - gfortran-8
  homebrew:
    packages:
      - gnu-tar
global:
  - MAKEFLAGS="-j 2"
matrix:
  include:
    - os: osx
      osx_image: xcode9.4
      env:
        - MODE=build
dist: xenial
sudo: required
latex: true
language: r
git:
  submodules: false
install:
  - mkdir -p ~/.R
  - ./util/travis/install-$TRAVIS_OS_NAME
script: "./util/travis/script"
env:
  matrix:
    - MODE=test IMX_OPT_ENGINE=NPSOL
    - MODE=test IMX_OPT_ENGINE=CSOLNP
    - MODE=test IMX_OPT_ENGINE=SLSQP
    - MODE=cran-check
branches:
  except:
    - stable # already tested
before_deploy:
  - openssl aes-256-cbc -K $encrypted_45bb258eabb3_key -iv $encrypted_45bb258eabb3_iv -in util/travis/deploy_rsa.enc -out /tmp/deploy_rsa -d
  - eval "$(ssh-agent -s)"
  - chmod 600 /tmp/deploy_rsa
  - ssh-add /tmp/deploy_rsa
deploy:
  provider: script
  skip_cleanup: true
  script: ./util/travis/deploy
  on: master
|
os:
  - linux
addons:
  apt:
    sources:
      - ubuntu-toolchain-r-test
    packages:
      - gcc-8
      - g++-8
      - gfortran-8
  homebrew:
    packages:
      - gnu-tar
global:
  - MAKEFLAGS="-j 2"
matrix:
  include:
    - os: osx
      osx_image: xcode9.4
      env:
        - MODE=build
      latex: false
dist: xenial
sudo: required
latex: true
language: r
git:
  submodules: false
install:
  - mkdir -p ~/.R
  - ./util/travis/install-$TRAVIS_OS_NAME
script: "./util/travis/script"
env:
  matrix:
    - MODE=test IMX_OPT_ENGINE=NPSOL
    - MODE=test IMX_OPT_ENGINE=CSOLNP
    - MODE=test IMX_OPT_ENGINE=SLSQP
    - MODE=cran-check
branches:
  except:
    - stable # already tested
before_deploy:
  - openssl aes-256-cbc -K $encrypted_45bb258eabb3_key -iv $encrypted_45bb258eabb3_iv -in util/travis/deploy_rsa.enc -out /tmp/deploy_rsa -d
  - eval "$(ssh-agent -s)"
  - chmod 600 /tmp/deploy_rsa
  - ssh-add /tmp/deploy_rsa
deploy:
  provider: script
  skip_cleanup: true
  script: ./util/travis/deploy
  on: master
|
Revert "Maybe need latex for os/x now?"
|
Revert "Maybe need latex for os/x now?"
This reverts commit 661d41ff60857a32fba196efe0b334e1a9a2e968.
|
YAML
|
apache-2.0
|
jpritikin/OpenMx,jpritikin/OpenMx,jpritikin/OpenMx,jpritikin/OpenMx,jpritikin/OpenMx,jpritikin/OpenMx
|
yaml
|
## Code Before:
os:
  - linux
addons:
  apt:
    sources:
      - ubuntu-toolchain-r-test
    packages:
      - gcc-8
      - g++-8
      - gfortran-8
  homebrew:
    packages:
      - gnu-tar
global:
  - MAKEFLAGS="-j 2"
matrix:
  include:
    - os: osx
      osx_image: xcode9.4
      env:
        - MODE=build
dist: xenial
sudo: required
latex: true
language: r
git:
  submodules: false
install:
  - mkdir -p ~/.R
  - ./util/travis/install-$TRAVIS_OS_NAME
script: "./util/travis/script"
env:
  matrix:
    - MODE=test IMX_OPT_ENGINE=NPSOL
    - MODE=test IMX_OPT_ENGINE=CSOLNP
    - MODE=test IMX_OPT_ENGINE=SLSQP
    - MODE=cran-check
branches:
  except:
    - stable # already tested
before_deploy:
  - openssl aes-256-cbc -K $encrypted_45bb258eabb3_key -iv $encrypted_45bb258eabb3_iv -in util/travis/deploy_rsa.enc -out /tmp/deploy_rsa -d
  - eval "$(ssh-agent -s)"
  - chmod 600 /tmp/deploy_rsa
  - ssh-add /tmp/deploy_rsa
deploy:
  provider: script
  skip_cleanup: true
  script: ./util/travis/deploy
  on: master
## Instruction:
Revert "Maybe need latex for os/x now?"
This reverts commit 661d41ff60857a32fba196efe0b334e1a9a2e968.
## Code After:
os:
  - linux
addons:
  apt:
    sources:
      - ubuntu-toolchain-r-test
    packages:
      - gcc-8
      - g++-8
      - gfortran-8
  homebrew:
    packages:
      - gnu-tar
global:
  - MAKEFLAGS="-j 2"
matrix:
  include:
    - os: osx
      osx_image: xcode9.4
      env:
        - MODE=build
      latex: false
dist: xenial
sudo: required
latex: true
language: r
git:
  submodules: false
install:
  - mkdir -p ~/.R
  - ./util/travis/install-$TRAVIS_OS_NAME
script: "./util/travis/script"
env:
  matrix:
    - MODE=test IMX_OPT_ENGINE=NPSOL
    - MODE=test IMX_OPT_ENGINE=CSOLNP
    - MODE=test IMX_OPT_ENGINE=SLSQP
    - MODE=cran-check
branches:
  except:
    - stable # already tested
before_deploy:
  - openssl aes-256-cbc -K $encrypted_45bb258eabb3_key -iv $encrypted_45bb258eabb3_iv -in util/travis/deploy_rsa.enc -out /tmp/deploy_rsa -d
  - eval "$(ssh-agent -s)"
  - chmod 600 /tmp/deploy_rsa
  - ssh-add /tmp/deploy_rsa
deploy:
  provider: script
  skip_cleanup: true
  script: ./util/travis/deploy
  on: master
|
8f52ba8336a4c49732d47eb6f2a586e4639f19bc
|
Extension/YamlExtension.php
|
Extension/YamlExtension.php
|
<?php
/*
 * This file is part of the Symfony package.
 *
 * (c) Fabien Potencier <[email protected]>
 *
 * For the full copyright and license information, please view the LICENSE
 * file that was distributed with this source code.
 */
namespace Symfony\Bridge\Twig\Extension;
use Symfony\Component\Yaml\Dumper as YamlDumper;
/**
 * Provides integration of the Yaml component with Twig.
 *
 * @author Fabien Potencier <[email protected]>
 */
class YamlExtension extends \Twig_Extension
{
    /**
     * {@inheritdoc}
     */
    public function getFilters()
    {
        return array(
            'yaml_encode' => new \Twig_Filter_Method($this, 'encode'),
            'yaml_dump' => new \Twig_Filter_Method($this, 'dump'),
        );
    }
    public function encode($input, $inline = 0)
    {
        static $dumper;
        if (null === $dumper) {
            $dumper = new YamlDumper();
        }
        return $dumper->dump($input, $inline);
    }
    public function dump($value)
    {
        if (is_resource($value)) {
            return '%Resource%';
        }
        if (is_array($value) || is_object($value)) {
            return '%'.gettype($value).'% '.$this->encode($value);
        }
        return $value;
    }
    /**
     * Returns the name of the extension.
     *
     * @return string The extension name
     */
    public function getName()
    {
        return 'yaml';
    }
}
|
<?php
/*
 * This file is part of the Symfony package.
 *
 * (c) Fabien Potencier <[email protected]>
 *
 * For the full copyright and license information, please view the LICENSE
 * file that was distributed with this source code.
 */
namespace Symfony\Bridge\Twig\Extension;
use Symfony\Component\Yaml\Dumper as YamlDumper;
/**
 * Provides integration of the Yaml component with Twig.
 *
 * @author Fabien Potencier <[email protected]>
 */
class YamlExtension extends \Twig_Extension
{
    /**
     * {@inheritdoc}
     */
    public function getFilters()
    {
        return array(
            'yaml_encode' => new \Twig_Filter_Method($this, 'encode'),
            'yaml_dump' => new \Twig_Filter_Method($this, 'dump'),
        );
    }
    public function encode($input, $inline = 0)
    {
        static $dumper;
        if (null === $dumper) {
            $dumper = new YamlDumper();
        }
        return $dumper->dump($input, $inline);
    }
    public function dump($value)
    {
        if (is_resource($value)) {
            return '%Resource%';
        }
        if (is_array($value) || is_object($value)) {
            return '%'.gettype($value).'% '.$this->encode($value);
        }
        return $this->encode($value);
    }
    /**
     * Returns the name of the extension.
     *
     * @return string The extension name
     */
    public function getName()
    {
        return 'yaml';
    }
}
|
Add Support for boolean as to string into yaml extension
|
Add Support for boolean as to string into yaml extension
|
PHP
|
mit
|
symfony/twig-bridge,symfony/TwigBridge
|
php
|
## Code Before:
<?php
/*
 * This file is part of the Symfony package.
 *
 * (c) Fabien Potencier <[email protected]>
 *
 * For the full copyright and license information, please view the LICENSE
 * file that was distributed with this source code.
 */
namespace Symfony\Bridge\Twig\Extension;
use Symfony\Component\Yaml\Dumper as YamlDumper;
/**
 * Provides integration of the Yaml component with Twig.
 *
 * @author Fabien Potencier <[email protected]>
 */
class YamlExtension extends \Twig_Extension
{
    /**
     * {@inheritdoc}
     */
    public function getFilters()
    {
        return array(
            'yaml_encode' => new \Twig_Filter_Method($this, 'encode'),
            'yaml_dump' => new \Twig_Filter_Method($this, 'dump'),
        );
    }
    public function encode($input, $inline = 0)
    {
        static $dumper;
        if (null === $dumper) {
            $dumper = new YamlDumper();
        }
        return $dumper->dump($input, $inline);
    }
    public function dump($value)
    {
        if (is_resource($value)) {
            return '%Resource%';
        }
        if (is_array($value) || is_object($value)) {
            return '%'.gettype($value).'% '.$this->encode($value);
        }
        return $value;
    }
    /**
     * Returns the name of the extension.
     *
     * @return string The extension name
     */
    public function getName()
    {
        return 'yaml';
    }
}
## Instruction:
Add Support for boolean as to string into yaml extension
## Code After:
<?php
/*
 * This file is part of the Symfony package.
 *
 * (c) Fabien Potencier <[email protected]>
 *
 * For the full copyright and license information, please view the LICENSE
 * file that was distributed with this source code.
 */
namespace Symfony\Bridge\Twig\Extension;
use Symfony\Component\Yaml\Dumper as YamlDumper;
/**
 * Provides integration of the Yaml component with Twig.
 *
 * @author Fabien Potencier <[email protected]>
 */
class YamlExtension extends \Twig_Extension
{
    /**
     * {@inheritdoc}
     */
    public function getFilters()
    {
        return array(
            'yaml_encode' => new \Twig_Filter_Method($this, 'encode'),
            'yaml_dump' => new \Twig_Filter_Method($this, 'dump'),
        );
    }
    public function encode($input, $inline = 0)
    {
        static $dumper;
        if (null === $dumper) {
            $dumper = new YamlDumper();
        }
        return $dumper->dump($input, $inline);
    }
    public function dump($value)
    {
        if (is_resource($value)) {
            return '%Resource%';
        }
        if (is_array($value) || is_object($value)) {
            return '%'.gettype($value).'% '.$this->encode($value);
        }
        return $this->encode($value);
    }
    /**
     * Returns the name of the extension.
     *
     * @return string The extension name
     */
    public function getName()
    {
        return 'yaml';
    }
}
|
426e1767fc1dabfef58a52d476cac6315c2d78cb
|
src/util/analytics.js
|
src/util/analytics.js
|
import Piwik from 'piwik-react-router'
/**
 * @typedef {Object} EventTrackInfo
 * @property {string} category The event category ('Search', 'Blacklist', etc.).
 * @property {string} action The event action ('Add Entry', etc.).
 * @property {string} [name] The optional event name (user input - other custom info).
 * @property {number} [value] The optional event value (should be numeric).
 */
class Analytics {
    instance
    /**
     * @param {Object} args
     * @param {string} args.url Address of the analytics server host.
     * @param {string} args.siteId Piwik site ID for the site to track.
     */
    constructor(args) {
        this.instance = Piwik(args)
    }
    /**
     * Track any user-invoked events.
     *
     * @param {EventTrackInfo} eventArgs
     */
    trackEvent(eventArgs) {
        const eventData = [
            'trackEvent',
            eventArgs.category,
            eventArgs.action,
            eventArgs.name,
            eventArgs.value,
        ]
        return this.instance.push(eventData)
    }
    // Default method wrappers
    connectToHistory(history) {
        return this.instance.connectToHistory(history)
    }
}
const analytics = new Analytics({
    url: process.env.PIWIK_HOST,
    siteId: process.env.PIWIK_SITE_ID,
})
export default analytics
|
import Piwik from 'piwik-react-router'
/**
 * @typedef {Object} EventTrackInfo
 * @property {string} category The event category ('Search', 'Blacklist', etc.).
 * @property {string} action The event action ('Add Entry', etc.).
 * @property {string} [name] The optional event name (user input - other custom info).
 * @property {number} [value] The optional event value (should be numeric).
 */
class Analytics {
    instance
    /**
     * @param {Object} args
     * @param {string} args.url Address of the analytics server host.
     * @param {string} args.siteId Piwik site ID for the site to track.
     */
    constructor(args) {
        this.instance = Piwik(args)
    }
    /**
     * Track any user-invoked events.
     *
     * @param {EventTrackInfo} eventArgs
     */
    trackEvent(eventArgs) {
        const eventData = [
            'trackEvent',
            eventArgs.category,
            eventArgs.action,
            eventArgs.name,
            eventArgs.value,
        ]
        return this.instance.push(eventData)
    }
    // Default method wrappers
    connectToHistory(history) {
        return this.instance.connectToHistory(history)
    }
}
const analytics = new Analytics({
    url: process.env.PIWIK_HOST,
    siteId: process.env.PIWIK_SITE_ID,
    trackErrors: true,
})
export default analytics
|
Set up runtime JS error tracking
|
Set up runtime JS error tracking
- this client lib supports it with `trackErrors` option. Seems to work alright giving the line the error occurred at + message
|
JavaScript
|
mit
|
WorldBrain/WebMemex,WorldBrain/WebMemex
|
javascript
|
## Code Before:
import Piwik from 'piwik-react-router'
/**
 * @typedef {Object} EventTrackInfo
 * @property {string} category The event category ('Search', 'Blacklist', etc.).
 * @property {string} action The event action ('Add Entry', etc.).
 * @property {string} [name] The optional event name (user input - other custom info).
 * @property {number} [value] The optional event value (should be numeric).
 */
class Analytics {
    instance
    /**
     * @param {Object} args
     * @param {string} args.url Address of the analytics server host.
     * @param {string} args.siteId Piwik site ID for the site to track.
     */
    constructor(args) {
        this.instance = Piwik(args)
    }
    /**
     * Track any user-invoked events.
     *
     * @param {EventTrackInfo} eventArgs
     */
    trackEvent(eventArgs) {
        const eventData = [
            'trackEvent',
            eventArgs.category,
            eventArgs.action,
            eventArgs.name,
            eventArgs.value,
        ]
        return this.instance.push(eventData)
    }
    // Default method wrappers
    connectToHistory(history) {
        return this.instance.connectToHistory(history)
    }
}
const analytics = new Analytics({
    url: process.env.PIWIK_HOST,
    siteId: process.env.PIWIK_SITE_ID,
})
export default analytics
## Instruction:
Set up runtime JS error tracking
- this client lib supports it with `trackErrors` option. Seems to work alright giving the line the error occurred at + message
## Code After:
import Piwik from 'piwik-react-router'
/**
* @typedef {Object} EventTrackInfo
* @property {string} category The event category ('Search', 'Blacklist', etc.).
* @property {string} action The event action ('Add Entry', etc.).
* @property {string} [name] The optional event name (user input - other custom info).
* @property {number} [value] The optional event value (should be numeric).
*/
class Analytics {
instance
/**
* @param {Object} args
* @param {string} args.url Address of the analytics server host.
* @param {string} args.siteId Piwik site ID for the site to track.
*/
constructor(args) {
this.instance = Piwik(args)
}
/**
* Track any user-invoked events.
*
* @param {EventTrackInfo} eventArgs
*/
trackEvent(eventArgs) {
const eventData = [
'trackEvent',
eventArgs.category,
eventArgs.action,
eventArgs.name,
eventArgs.value,
]
return this.instance.push(eventData)
}
// Default method wrappers
connectToHistory(history) {
return this.instance.connectToHistory(history)
}
}
const analytics = new Analytics({
url: process.env.PIWIK_HOST,
siteId: process.env.PIWIK_SITE_ID,
trackErrors: true,
})
export default analytics
|
32d8ff6e69cceff29b66ce93afaa2f6e6372204c
|
src/main/java/io/mkremins/whydah/interpreter/Scope.java
|
src/main/java/io/mkremins/whydah/interpreter/Scope.java
|
package io.mkremins.whydah.interpreter;
import io.mkremins.whydah.ast.Expression;
import io.mkremins.whydah.ast.ExpressionUtils;
import java.util.HashMap;
import java.util.Map;
public class Scope {
private final Map<String, Expression> vars;
private final Scope parent;
public Scope(final Scope parent) {
vars = new HashMap<>();
this.parent = parent;
}
public Scope() {
this(null);
}
public void delete(final String varName) {
getDeclaringScope(varName).vars.remove(varName);
}
public Expression get(final String varName) {
return getDeclaringScope(varName).vars.get(varName);
}
public void set(final String varName, final Expression value) {
Scope scope = getDeclaringScope(varName);
if (scope == null) {
scope = this;
}
scope.vars.put(varName, ExpressionUtils.fullyEvaluate(value, scope));
}
private Scope getDeclaringScope(final String varName) {
Scope scope = this;
while (scope != null && scope.vars.get(varName) == null) {
scope = scope.parent;
}
return scope;
}
}
|
package io.mkremins.whydah.interpreter;
import io.mkremins.whydah.ast.Expression;
import io.mkremins.whydah.ast.ExpressionUtils;
import java.util.HashMap;
import java.util.Map;
public class Scope {
private final Map<String, Expression> vars;
private final Scope parent;
public Scope(final Scope parent) {
vars = new HashMap<String, Expression>();
this.parent = parent;
}
public Scope() {
this(null);
}
public void delete(final String varName) {
getDeclaringScope(varName).vars.remove(varName);
}
public Expression get(final String varName) {
return getDeclaringScope(varName).vars.get(varName);
}
public void set(final String varName, final Expression value) {
Scope scope = getDeclaringScope(varName);
if (scope == null) {
scope = this;
}
scope.vars.put(varName, ExpressionUtils.fullyEvaluate(value, scope));
}
private Scope getDeclaringScope(final String varName) {
Scope scope = this;
while (scope != null && scope.vars.get(varName) == null) {
scope = scope.parent;
}
return scope;
}
}
|
Make project Java 6 compliant
|
Make project Java 6 compliant
|
Java
|
mit
|
mkremins/whydah
|
java
|
## Code Before:
package io.mkremins.whydah.interpreter;
import io.mkremins.whydah.ast.Expression;
import io.mkremins.whydah.ast.ExpressionUtils;
import java.util.HashMap;
import java.util.Map;
public class Scope {
private final Map<String, Expression> vars;
private final Scope parent;
public Scope(final Scope parent) {
vars = new HashMap<>();
this.parent = parent;
}
public Scope() {
this(null);
}
public void delete(final String varName) {
getDeclaringScope(varName).vars.remove(varName);
}
public Expression get(final String varName) {
return getDeclaringScope(varName).vars.get(varName);
}
public void set(final String varName, final Expression value) {
Scope scope = getDeclaringScope(varName);
if (scope == null) {
scope = this;
}
scope.vars.put(varName, ExpressionUtils.fullyEvaluate(value, scope));
}
private Scope getDeclaringScope(final String varName) {
Scope scope = this;
while (scope != null && scope.vars.get(varName) == null) {
scope = scope.parent;
}
return scope;
}
}
## Instruction:
Make project Java 6 compliant
## Code After:
package io.mkremins.whydah.interpreter;
import io.mkremins.whydah.ast.Expression;
import io.mkremins.whydah.ast.ExpressionUtils;
import java.util.HashMap;
import java.util.Map;
public class Scope {
private final Map<String, Expression> vars;
private final Scope parent;
public Scope(final Scope parent) {
vars = new HashMap<String, Expression>();
this.parent = parent;
}
public Scope() {
this(null);
}
public void delete(final String varName) {
getDeclaringScope(varName).vars.remove(varName);
}
public Expression get(final String varName) {
return getDeclaringScope(varName).vars.get(varName);
}
public void set(final String varName, final Expression value) {
Scope scope = getDeclaringScope(varName);
if (scope == null) {
scope = this;
}
scope.vars.put(varName, ExpressionUtils.fullyEvaluate(value, scope));
}
private Scope getDeclaringScope(final String varName) {
Scope scope = this;
while (scope != null && scope.vars.get(varName) == null) {
scope = scope.parent;
}
return scope;
}
}
|
108992af62c393f931e2ebf1fe7443cbfdac7229
|
.travis.yml
|
.travis.yml
|
---
sudo: false
language: ruby
bundler_args: --without acceptance_testing metatools
script:
- bundle exec rake release_checks
- bundle exec rake mdl
matrix:
fast_finish: true
include:
- rvm: 1.9.3
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 3.0" FUTURE_PARSER="yes"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 4.0" STRICT_VARIABLES="yes"
- rvm: 2.2.5
env: PUPPET_GEM_VERSION="~> 4.0"
- rvm: 2.3.1
env: PUPPET_GEM_VERSION="~> 4.0"
notifications:
email: false
webhooks:
urls:
- https://webhooks.gitter.im/e/3402fe31d5b814c57316
on_success: always
on_failure: always
on_start: never
|
---
sudo: false
language: ruby
bundler_args: --without acceptance_testing metatools
script:
- bundle exec rake release_checks
- bundle exec rake mdl
matrix:
fast_finish: true
include:
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 3.0" FUTURE_PARSER="yes"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 4.0" STRICT_VARIABLES="yes"
- rvm: 2.2.5
env: PUPPET_GEM_VERSION="~> 4.0"
- rvm: 2.3.1
env: PUPPET_GEM_VERSION="~> 4.0"
notifications:
email: false
webhooks:
urls:
- https://webhooks.gitter.im/e/3402fe31d5b814c57316
on_success: always
on_failure: always
on_start: never
|
Drop CI testing on Ruby 1.9
|
Drop CI testing on Ruby 1.9
|
YAML
|
mit
|
leoarnold/puppet-cups,leoarnold/puppet-cups
|
yaml
|
## Code Before:
---
sudo: false
language: ruby
bundler_args: --without acceptance_testing metatools
script:
- bundle exec rake release_checks
- bundle exec rake mdl
matrix:
fast_finish: true
include:
- rvm: 1.9.3
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 3.0" FUTURE_PARSER="yes"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 4.0" STRICT_VARIABLES="yes"
- rvm: 2.2.5
env: PUPPET_GEM_VERSION="~> 4.0"
- rvm: 2.3.1
env: PUPPET_GEM_VERSION="~> 4.0"
notifications:
email: false
webhooks:
urls:
- https://webhooks.gitter.im/e/3402fe31d5b814c57316
on_success: always
on_failure: always
on_start: never
## Instruction:
Drop CI testing on Ruby 1.9
## Code After:
---
sudo: false
language: ruby
bundler_args: --without acceptance_testing metatools
script:
- bundle exec rake release_checks
- bundle exec rake mdl
matrix:
fast_finish: true
include:
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 3.0" FUTURE_PARSER="yes"
- rvm: 2.1.9
env: PUPPET_GEM_VERSION="~> 4.0" STRICT_VARIABLES="yes"
- rvm: 2.2.5
env: PUPPET_GEM_VERSION="~> 4.0"
- rvm: 2.3.1
env: PUPPET_GEM_VERSION="~> 4.0"
notifications:
email: false
webhooks:
urls:
- https://webhooks.gitter.im/e/3402fe31d5b814c57316
on_success: always
on_failure: always
on_start: never
|
b61042c8b905f6c689fabadc15c130cf58dbb6e0
|
src/test/scala/org/hashids/CheckHashids.scala
|
src/test/scala/org/hashids/CheckHashids.scala
|
package org.hashids
import org.hashids.syntax._
import org.scalacheck._
import org.specs2._
case class ZeroOrPosLong(value: Long)
class CheckHashids extends org.specs2.Specification with org.specs2.ScalaCheck {
import CheckHashids.arbitraryZeroOrPosLong
def is = {
"List of random zero or positive longs should encode then decode" ! {
check { (a: List[ZeroOrPosLong], salt: String) =>
implicit val hashid = Hashids(salt)
a.raw.toHashid.fromHashid must_== a.raw
}
} ^ {
end
}
}
implicit class RichListZeroOrPosLong(self: List[ZeroOrPosLong]) {
def raw = self.map(_.value)
}
}
object CheckHashids {
implicit val arbitraryZeroOrPosLong: Arbitrary[ZeroOrPosLong] = Arbitrary {
Gen.chooseNum(0L, Long.MaxValue, 2L, 75527867232L).map(ZeroOrPosLong(_))
}
}
|
package org.hashids
import org.hashids.syntax._
import org.scalacheck._
import org.specs2._
case class ZeroOrPosLong(value: Long)
class CheckHashids extends org.specs2.Specification with org.specs2.ScalaCheck {
import CheckHashids._
def is = {
"List of random zero or positive longs should encode then decode" ! {
check { (a: List[ZeroOrPosLong], salt: String) =>
implicit val hashid = Hashids(salt)
a.raw.toHashid.fromHashid must_== a.raw
}
} ^ {
end
}
}
}
object CheckHashids {
implicit val arbitraryZeroOrPosLong: Arbitrary[ZeroOrPosLong] = Arbitrary {
Gen.chooseNum(0L, Long.MaxValue, 2L, 75527867232L).map(ZeroOrPosLong(_))
}
implicit class RichListZeroOrPosLong(self: List[ZeroOrPosLong]) {
def raw = self.map(_.value)
}
}
|
Move non-specification code into companion object.
|
Move non-specification code into companion object.
|
Scala
|
mit
|
newhoggy/hashids-scala,pico-works/pico-hashids
|
scala
|
## Code Before:
package org.hashids
import org.hashids.syntax._
import org.scalacheck._
import org.specs2._
case class ZeroOrPosLong(value: Long)
class CheckHashids extends org.specs2.Specification with org.specs2.ScalaCheck {
import CheckHashids.arbitraryZeroOrPosLong
def is = {
"List of random zero or positive longs should encode then decode" ! {
check { (a: List[ZeroOrPosLong], salt: String) =>
implicit val hashid = Hashids(salt)
a.raw.toHashid.fromHashid must_== a.raw
}
} ^ {
end
}
}
implicit class RichListZeroOrPosLong(self: List[ZeroOrPosLong]) {
def raw = self.map(_.value)
}
}
object CheckHashids {
implicit val arbitraryZeroOrPosLong: Arbitrary[ZeroOrPosLong] = Arbitrary {
Gen.chooseNum(0L, Long.MaxValue, 2L, 75527867232L).map(ZeroOrPosLong(_))
}
}
## Instruction:
Move non-specification code into companion object.
## Code After:
package org.hashids
import org.hashids.syntax._
import org.scalacheck._
import org.specs2._
case class ZeroOrPosLong(value: Long)
class CheckHashids extends org.specs2.Specification with org.specs2.ScalaCheck {
import CheckHashids._
def is = {
"List of random zero or positive longs should encode then decode" ! {
check { (a: List[ZeroOrPosLong], salt: String) =>
implicit val hashid = Hashids(salt)
a.raw.toHashid.fromHashid must_== a.raw
}
} ^ {
end
}
}
}
object CheckHashids {
implicit val arbitraryZeroOrPosLong: Arbitrary[ZeroOrPosLong] = Arbitrary {
Gen.chooseNum(0L, Long.MaxValue, 2L, 75527867232L).map(ZeroOrPosLong(_))
}
implicit class RichListZeroOrPosLong(self: List[ZeroOrPosLong]) {
def raw = self.map(_.value)
}
}
|
9cd54ca34eee54d3cbe357147f2f81c6021bc606
|
controllers/widget.js
|
controllers/widget.js
|
var args = _.extend({
duration: 2000,
animationDuration: 250,
message: '',
title: Ti.App.name,
elasticity: 0.5,
pushForce: 30,
usePhysicsEngine: true
}, arguments[0] || {});
var That = null;
exports.show = function(opt) {
if (_.isObject(opt)) _.extend(args, opt);
if (_.isString(opt)) _.extend(args, { message: opt });
if (OS_ANDROID && args.view == null) {
Ti.API.error("In Android you have to set a view that contain the sliding view. Fallbacking to Ti.UI.Notification.");
That = Ti.UI.createNotification({
message: args.message,
duration: args.duration
});
That.show();
} else {
That = Widget.createController('window', args);
}
};
exports.hide = function() {
if (That != null) {
That.hide();
}
};
|
var args = _.extend({
duration: 2000,
animationDuration: 250,
message: '',
title: Ti.App.name,
elasticity: 0.5,
pushForce: 30,
usePhysicsEngine: true
}, arguments[0] || {});
var That = null;
exports.show = function(opt) {
if (_.isObject(opt)) _.extend(args, opt);
if (_.isString(opt)) _.extend(args, { message: opt });
if (OS_ANDROID && args.view == null) {
Ti.API.error("In Android you have to set a view that contain the sliding view. Fallbacking to Ti.UI.Notification.");
That = Ti.UI.createNotification({
message: args.message,
duration: args.duration
});
That.show();
} else {
That = Widget.createController('window', args);
}
};
exports.update = function(message)
{
That.update(message);
};
exports.hide = function() {
if (That != null) {
That.hide();
}
};
|
Allow update of notification text.
|
Allow update of notification text.
Allows you to update the message in the notification.
I wanted this feature to allow me to show loading percentage in the notification when downloading or uploading data.
|
JavaScript
|
mit
|
titanium-forks/CaffeinaLab.com.caffeinalab.titanium.notifications,CaffeinaLab/Ti.Notifications,DouglasHennrich/Ti.Notifications
|
javascript
|
## Code Before:
var args = _.extend({
duration: 2000,
animationDuration: 250,
message: '',
title: Ti.App.name,
elasticity: 0.5,
pushForce: 30,
usePhysicsEngine: true
}, arguments[0] || {});
var That = null;
exports.show = function(opt) {
if (_.isObject(opt)) _.extend(args, opt);
if (_.isString(opt)) _.extend(args, { message: opt });
if (OS_ANDROID && args.view == null) {
Ti.API.error("In Android you have to set a view that contain the sliding view. Fallbacking to Ti.UI.Notification.");
That = Ti.UI.createNotification({
message: args.message,
duration: args.duration
});
That.show();
} else {
That = Widget.createController('window', args);
}
};
exports.hide = function() {
if (That != null) {
That.hide();
}
};
## Instruction:
Allow update of notification text.
Allows you to update the message in the notification.
I wanted this feature to allow me to show loading percentage in the notification when downloading or uploading data.
## Code After:
var args = _.extend({
duration: 2000,
animationDuration: 250,
message: '',
title: Ti.App.name,
elasticity: 0.5,
pushForce: 30,
usePhysicsEngine: true
}, arguments[0] || {});
var That = null;
exports.show = function(opt) {
if (_.isObject(opt)) _.extend(args, opt);
if (_.isString(opt)) _.extend(args, { message: opt });
if (OS_ANDROID && args.view == null) {
Ti.API.error("In Android you have to set a view that contain the sliding view. Fallbacking to Ti.UI.Notification.");
That = Ti.UI.createNotification({
message: args.message,
duration: args.duration
});
That.show();
} else {
That = Widget.createController('window', args);
}
};
exports.update = function(message)
{
That.update(message);
};
exports.hide = function() {
if (That != null) {
That.hide();
}
};
|
a8b9a8e22f3b5fb047ae4b9537322f761e531e0a
|
generate-combined.sh
|
generate-combined.sh
|
cat \
dygraph-layout.js \
dygraph-canvas.js \
dygraph.js \
dygraph-utils.js \
dygraph-gviz.js \
dygraph-interaction-model.js \
dygraph-range-selector.js \
dygraph-tickers.js \
rgbcolor/rgbcolor.js \
strftime/strftime-min.js \
plugins/base.js \
plugins/legend.js \
plugins/install.js \
| perl -ne 'print unless m,REMOVE_FOR_COMBINED,..m,/REMOVE_FOR_COMBINED,' \
> /tmp/dygraph.js
java -jar yuicompressor-2.4.2.jar /tmp/dygraph.js \
> /tmp/dygraph-packed.js
(
echo '/*! dygraphs v1.2 dygraphs.com | dygraphs.com/license */'
cat /tmp/dygraph-packed.js
) > dygraph-combined.js
chmod a+r dygraph-combined.js
|
cat \
dygraph-layout.js \
dygraph-canvas.js \
dygraph.js \
dygraph-utils.js \
dygraph-gviz.js \
dygraph-interaction-model.js \
dygraph-range-selector.js \
dygraph-tickers.js \
rgbcolor/rgbcolor.js \
strftime/strftime-min.js \
plugins/base.js \
plugins/legend.js \
plugins/install.js \
| perl -ne 'print unless m,REMOVE_FOR_COMBINED,..m,/REMOVE_FOR_COMBINED,' \
> /tmp/dygraph.js
java -jar yuicompressor-2.4.2.jar /tmp/dygraph.js \
> /tmp/dygraph-packed.js
(
echo '/*! @license Copyright 2011 Dan Vanderkam ([email protected]) MIT-licensed (http://opensource.org/licenses/MIT) */'
cat /tmp/dygraph-packed.js
) > dygraph-combined.js
chmod a+r dygraph-combined.js
|
Add @license tag to dygraph-combined.js
|
Add @license tag to dygraph-combined.js
|
Shell
|
mit
|
timeu/dygraphs,timeu/dygraphs,davidmsibley/dygraphs,mantyr/dygraphs,klausw/dygraphs,Akiyah/dygraphs,Yong-Lee/dygraphs,petechap/dygraphs,grantadesign/dygraphs,klausw/dygraphs,kbaggott/dygraphs,pshevtsov/dygraphs,panuhorsmalahti/dygraphs,mcanthony/dygraphs,danvk/dygraphs,mcanthony/dygraphs,Akiyah/dygraphs,petechap/dygraphs,timeu/dygraphs,klausw/dygraphs,cavorite/dygraphs,reinert/dygraphs,mantyr/dygraphs,panuhorsmalahti/dygraphs,petechap/dygraphs,vhotspur/dygraphs,witsa/dygraphs,panuhorsmalahti/dygraphs,mariolll/dygraphs,pshevtsov/dygraphs,kbaggott/dygraphs,grantadesign/dygraphs,mariolll/dygraphs,grantadesign/dygraphs,danvk/dygraphs,mantyr/dygraphs,davidmsibley/dygraphs,jmptrader/dygraphs,mariolll/dygraphs,socib/dygraphs,danvk/dygraphs,mariolll/dygraphs,klausw/dygraphs,vhotspur/dygraphs,Akiyah/dygraphs,reinert/dygraphs,pshevtsov/dygraphs,petechap/dygraphs,cavorite/dygraphs,klausw/dygraphs,grantadesign/dygraphs,reinert/dygraphs,pshevtsov/dygraphs,socib/dygraphs,Yong-Lee/dygraphs,panuhorsmalahti/dygraphs,witsa/dygraphs,mcanthony/dygraphs,pshevtsov/dygraphs,mcanthony/dygraphs,kbaggott/dygraphs,socib/dygraphs,mariolll/dygraphs,vhotspur/dygraphs,witsa/dygraphs,Akiyah/dygraphs,Yong-Lee/dygraphs,kbaggott/dygraphs,timeu/dygraphs,Yong-Lee/dygraphs,witsa/dygraphs,reinert/dygraphs,socib/dygraphs,jmptrader/dygraphs,mantyr/dygraphs,timeu/dygraphs,davidmsibley/dygraphs,cavorite/dygraphs,danvk/dygraphs,davidmsibley/dygraphs,danvk/dygraphs,petechap/dygraphs,jmptrader/dygraphs,kbaggott/dygraphs,davidmsibley/dygraphs,jmptrader/dygraphs,vhotspur/dygraphs,Yong-Lee/dygraphs,grantadesign/dygraphs,Akiyah/dygraphs,jmptrader/dygraphs,cavorite/dygraphs,reinert/dygraphs,panuhorsmalahti/dygraphs,vhotspur/dygraphs,mantyr/dygraphs,mcanthony/dygraphs,witsa/dygraphs
|
shell
|
## Code Before:
cat \
dygraph-layout.js \
dygraph-canvas.js \
dygraph.js \
dygraph-utils.js \
dygraph-gviz.js \
dygraph-interaction-model.js \
dygraph-range-selector.js \
dygraph-tickers.js \
rgbcolor/rgbcolor.js \
strftime/strftime-min.js \
plugins/base.js \
plugins/legend.js \
plugins/install.js \
| perl -ne 'print unless m,REMOVE_FOR_COMBINED,..m,/REMOVE_FOR_COMBINED,' \
> /tmp/dygraph.js
java -jar yuicompressor-2.4.2.jar /tmp/dygraph.js \
> /tmp/dygraph-packed.js
(
echo '/*! dygraphs v1.2 dygraphs.com | dygraphs.com/license */'
cat /tmp/dygraph-packed.js
) > dygraph-combined.js
chmod a+r dygraph-combined.js
## Instruction:
Add @license tag to dygraph-combined.js
## Code After:
cat \
dygraph-layout.js \
dygraph-canvas.js \
dygraph.js \
dygraph-utils.js \
dygraph-gviz.js \
dygraph-interaction-model.js \
dygraph-range-selector.js \
dygraph-tickers.js \
rgbcolor/rgbcolor.js \
strftime/strftime-min.js \
plugins/base.js \
plugins/legend.js \
plugins/install.js \
| perl -ne 'print unless m,REMOVE_FOR_COMBINED,..m,/REMOVE_FOR_COMBINED,' \
> /tmp/dygraph.js
java -jar yuicompressor-2.4.2.jar /tmp/dygraph.js \
> /tmp/dygraph-packed.js
(
echo '/*! @license Copyright 2011 Dan Vanderkam ([email protected]) MIT-licensed (http://opensource.org/licenses/MIT) */'
cat /tmp/dygraph-packed.js
) > dygraph-combined.js
chmod a+r dygraph-combined.js
|
1b44f860e812ef7e26f5ce0658467b2d89beffc4
|
css/main.css
|
css/main.css
|
.message_list {
overflow-y: scroll;
max-height: 50%;
}
.chat-input {
width: 90%;
}
/* I have no idea if this is the best way to handle the issues with float:left in navbars, or if 30px is a good value */
.tabset {
min-height: 30px;
}
ul.tab-list {
list-style: none;
}
ul.tab-list li {
display: block;
float: left;
padding: 5px;
margin: 0px 10px;
}
|
.message_list {
overflow-y: scroll;
max-height: 50%;
white-space: pre-wrap;
line-height: 0.9em;
}
.chat-input {
width: 90%;
}
/* I have no idea if this is the best way to handle the issues with float:left in navbars, or if 30px is a good value */
.tabset {
min-height: 30px;
}
ul.tab-list {
list-style: none;
}
ul.tab-list li {
display: block;
float: left;
padding: 5px;
margin: 0px 10px;
}
|
Use `pre-wrap` rendering to make ASCII art work
|
Use `pre-wrap` rendering to make ASCII art work
|
CSS
|
mit
|
n00bish/hackmud_chat_web,ethankaminski/hackmud_chat_web,n00bish/hackmud_chat_web,ethankaminski/hackmud_chat_web,DrizzlyBear/hackmud_chat_web,DrizzlyBear/hackmud_chat_web
|
css
|
## Code Before:
.message_list {
overflow-y: scroll;
max-height: 50%;
}
.chat-input {
width: 90%;
}
/* I have no idea if this is the best way to handle the issues with float:left in navbars, or if 30px is a good value */
.tabset {
min-height: 30px;
}
ul.tab-list {
list-style: none;
}
ul.tab-list li {
display: block;
float: left;
padding: 5px;
margin: 0px 10px;
}
## Instruction:
Use `pre-wrap` rendering to make ASCII art work
## Code After:
.message_list {
overflow-y: scroll;
max-height: 50%;
white-space: pre-wrap;
line-height: 0.9em;
}
.chat-input {
width: 90%;
}
/* I have no idea if this is the best way to handle the issues with float:left in navbars, or if 30px is a good value */
.tabset {
min-height: 30px;
}
ul.tab-list {
list-style: none;
}
ul.tab-list li {
display: block;
float: left;
padding: 5px;
margin: 0px 10px;
}
|
c80f1fbd432a4bd7d10084a62f9965523dd9c613
|
.travis.yml
|
.travis.yml
|
before_script: apt-get install -y libusb-dev
script: cd usbpush && make
language: c
compiler:
- clang
- gcc
os:
- linux
|
before_install:
- apt-get install -y libusb-dev
script: cd usbpush && make
language: c
compiler:
- clang
- gcc
os:
- linux
|
Bring installation with apt-get to before_install stage.
|
Bring installation with apt-get to before_install stage.
|
YAML
|
bsd-2-clause
|
wkoszek/usbpush
|
yaml
|
## Code Before:
before_script: apt-get install -y libusb-dev
script: cd usbpush && make
language: c
compiler:
- clang
- gcc
os:
- linux
## Instruction:
Bring installation with apt-get to before_install stage.
## Code After:
before_install:
- apt-get install -y libusb-dev
script: cd usbpush && make
language: c
compiler:
- clang
- gcc
os:
- linux
|
58f2ab3b617ac8bdeadbe766b75745cbd449882c
|
.circleci/config.yml
|
.circleci/config.yml
|
version: 2
jobs:
build:
docker:
- image: circleci/node:10
working_directory: ~/nteract
steps:
- checkout
# Download and cache dependencies
- restore_cache:
keys:
- yarn-packages-{{ checksum "yarn.lock" }}
# fallback to using the latest cache if no exact match is found
- yarn-packages-
- run: python3 -m pip install setuptools
- run: python3 -m pip install jupyter
- run: yarn
- save_cache:
paths:
- ~/.cache/yarn
key: yarn-packages-{{ checksum "yarn.lock" }}
# run tests!
- run: yarn run test:lint
- run: yarn run build:packages:ci
- run: yarn run test --maxWorkers 1
- run: yarn run test:flow
- run:
name: Build Docs
command: yarn run docs:build
- store_artifacts:
path: styleguide
destination: styleguide
- run:
name: Build Commuter
command: npx lerna run build --scope @nteract/commuter
no_output_timeout: 20m
- run:
name: Build Play
command: npx lerna run build --scope @nteract/play
no_output_timeout: 20m
- run:
name: Build Jupyter Extension
command: npx lerna run build:asap --scope nteract-on-jupyter
no_output_timeout: 30m
|
version: 2
jobs:
build:
docker:
- image: circleci/node:10
working_directory: ~/nteract
steps:
- checkout
# Download and cache dependencies
- restore_cache:
keys:
- yarn-packages-{{ checksum "yarn.lock" }}
# fallback to using the latest cache if no exact match is found
- yarn-packages-
- run: pip install setuptools
- run: pip install jupyter
- run: yarn
- save_cache:
paths:
- ~/.cache/yarn
key: yarn-packages-{{ checksum "yarn.lock" }}
# run tests!
- run: yarn run test:lint
- run: yarn run build:packages:ci
- run: yarn run test --maxWorkers 1
- run: yarn run test:flow
- run:
name: Build Docs
command: yarn run docs:build
- store_artifacts:
path: styleguide
destination: styleguide
- run:
name: Build Commuter
command: npx lerna run build --scope @nteract/commuter
no_output_timeout: 20m
- run:
name: Build Play
command: npx lerna run build --scope @nteract/play
no_output_timeout: 20m
- run:
name: Build Jupyter Extension
command: npx lerna run build:asap --scope nteract-on-jupyter
no_output_timeout: 30m
|
Use pip install directly in CircleCI build
|
Use pip install directly in CircleCI build
|
YAML
|
bsd-3-clause
|
nteract/composition,nteract/nteract,nteract/nteract,nteract/nteract,nteract/nteract,nteract/nteract,nteract/composition,nteract/composition
|
yaml
|
## Code Before:
version: 2
jobs:
build:
docker:
- image: circleci/node:10
working_directory: ~/nteract
steps:
- checkout
# Download and cache dependencies
- restore_cache:
keys:
- yarn-packages-{{ checksum "yarn.lock" }}
# fallback to using the latest cache if no exact match is found
- yarn-packages-
- run: python3 -m pip install setuptools
- run: python3 -m pip install jupyter
- run: yarn
- save_cache:
paths:
- ~/.cache/yarn
key: yarn-packages-{{ checksum "yarn.lock" }}
# run tests!
- run: yarn run test:lint
- run: yarn run build:packages:ci
- run: yarn run test --maxWorkers 1
- run: yarn run test:flow
- run:
name: Build Docs
command: yarn run docs:build
- store_artifacts:
path: styleguide
destination: styleguide
- run:
name: Build Commuter
command: npx lerna run build --scope @nteract/commuter
no_output_timeout: 20m
- run:
name: Build Play
command: npx lerna run build --scope @nteract/play
no_output_timeout: 20m
- run:
name: Build Jupyter Extension
command: npx lerna run build:asap --scope nteract-on-jupyter
no_output_timeout: 30m
## Instruction:
Use pip install directly in CircleCI build
## Code After:
version: 2
jobs:
build:
docker:
- image: circleci/node:10
working_directory: ~/nteract
steps:
- checkout
# Download and cache dependencies
- restore_cache:
keys:
- yarn-packages-{{ checksum "yarn.lock" }}
# fallback to using the latest cache if no exact match is found
- yarn-packages-
- run: pip install setuptools
- run: pip install jupyter
- run: yarn
- save_cache:
paths:
- ~/.cache/yarn
key: yarn-packages-{{ checksum "yarn.lock" }}
# run tests!
- run: yarn run test:lint
- run: yarn run build:packages:ci
- run: yarn run test --maxWorkers 1
- run: yarn run test:flow
- run:
name: Build Docs
command: yarn run docs:build
- store_artifacts:
path: styleguide
destination: styleguide
- run:
name: Build Commuter
command: npx lerna run build --scope @nteract/commuter
no_output_timeout: 20m
- run:
name: Build Play
command: npx lerna run build --scope @nteract/play
no_output_timeout: 20m
- run:
name: Build Jupyter Extension
command: npx lerna run build:asap --scope nteract-on-jupyter
no_output_timeout: 30m
|
903f2504a00501b592e0c99dbdfa5e177638aea1
|
tasks/git_shell.yml
|
tasks/git_shell.yml
|
- set_fact: git_home=/srv/git
- user: name={{ git_username }} shell=/usr/bin/git-shell home={{ git_home }} group={{ ansible_admin_group | default(omit) }}
- file: name={{ git_home }}/git-shell-commands state=directory
- copy: dest={{ git_home }}/git-shell-commands/{{ item }} src={{ item }} mode=0755
with_items:
- list
- help
- file: state=link dest={{ git_home }}/{{ item }}.git src={{ git_repositories_dir }}/{{ item }}
with_items:
- public
- private
|
- set_fact: git_home=/srv/git
- name: Create user {{ git_username }}
user: name={{ git_username }} shell=/usr/bin/git-shell home={{ git_home }} group={{ ansible_admin_group | default(omit) }}
- name: Create git-shell-commands directory
file: name={{ git_home }}/git-shell-commands state=directory
- name: Copy commands
copy: dest={{ git_home }}/git-shell-commands/{{ item }} src={{ item }} mode=0755
with_items:
- list
- help
- name: Link repositories
file: state=link dest={{ git_home }}/{{ item }}.git src={{ git_repositories_dir }}/{{ item }}
with_items:
- public
- private
|
Add name to the various command, for ansible 2.0 compatbility
|
Add name to the various command, for ansible 2.0 compatbility
|
YAML
|
mit
|
mscherer/ansible-role-ansible_bastion,OSAS/ansible-role-ansible_bastion,OSAS/ansible-role-ansible_bastion,mscherer/ansible-role-ansible_bastion
|
yaml
|
## Code Before:
- set_fact: git_home=/srv/git
- user: name={{ git_username }} shell=/usr/bin/git-shell home={{ git_home }} group={{ ansible_admin_group | default(omit) }}
- file: name={{ git_home }}/git-shell-commands state=directory
- copy: dest={{ git_home }}/git-shell-commands/{{ item }} src={{ item }} mode=0755
with_items:
- list
- help
- file: state=link dest={{ git_home }}/{{ item }}.git src={{ git_repositories_dir }}/{{ item }}
with_items:
- public
- private
## Instruction:
Add name to the various command, for ansible 2.0 compatbility
## Code After:
- set_fact: git_home=/srv/git
- name: Create user {{ git_username }}
user: name={{ git_username }} shell=/usr/bin/git-shell home={{ git_home }} group={{ ansible_admin_group | default(omit) }}
- name: Create git-shell-commands directory
file: name={{ git_home }}/git-shell-commands state=directory
- name: Copy commands
copy: dest={{ git_home }}/git-shell-commands/{{ item }} src={{ item }} mode=0755
with_items:
- list
- help
- name: Link repositories
file: state=link dest={{ git_home }}/{{ item }}.git src={{ git_repositories_dir }}/{{ item }}
with_items:
- public
- private
|
59219da0cfa9463b41553d1dd096c662f001ab98
|
README.textile
|
README.textile
|
h1. InvisionBridge
A Rails plugin to allow Authlogic to use an Invision Power Board database's user credentials.
h2. Credits
* Chris Herring - "connection_ninja":http://github.com/cherring/connection_ninja
Copyright (c) 2009 Robert Speicher, released under the MIT license
|
h1. InvisionBridge
A Rails plugin to allow Authlogic to use an Invision Power Board database's user credentials.
h2. Credits
Copyright (c) 2009 Robert Speicher, released under the MIT license
|
Remove a credit, since we went another way.
|
Remove a credit, since we went another way.
|
Textile
|
mit
|
tsigo/invision_bridge
|
textile
|
## Code Before:
h1. InvisionBridge
A Rails plugin to allow Authlogic to use an Invision Power Board database's user credentials.
h2. Credits
* Chris Herring - "connection_ninja":http://github.com/cherring/connection_ninja
Copyright (c) 2009 Robert Speicher, released under the MIT license
## Instruction:
Remove a credit, since we went another way.
## Code After:
h1. InvisionBridge
A Rails plugin to allow Authlogic to use an Invision Power Board database's user credentials.
h2. Credits
Copyright (c) 2009 Robert Speicher, released under the MIT license
|
62b7bad2a5fb5f1de53a54ffeaad104aa5d8c4b6
|
package.json
|
package.json
|
{
"name": "monaca-components",
"version": "1.0.0",
"description": "Monaca Shared Components",
"main": "Gruntfile.js",
"scripts": {
"start": "grunt server",
"build": "./node_modules/.bin/grunt",
"test": "echo \"Add Tests\""
},
"repository": {
"type": "git",
"url": "git+https://github.com/monaca/cdn.monaca.io.git"
},
"keywords": [
"Monaca"
],
"author": "Erisu",
"license": "Apache-2.0",
"devDependencies": {
"compass-sass-mixins": "^0.12.7",
"grunt": "^1.0.1",
"grunt-aws-s3": "^0.14.5",
"grunt-cli": "^1.2.0",
"grunt-contrib-connect": "^1.0.2",
"grunt-contrib-copy": "^1.0.0",
"grunt-contrib-cssmin": "^1.0.2",
"grunt-contrib-watch": "^1.0.0",
"grunt-sass": "^1.2.1",
"grunt-task-loader": "^0.6.0"
}
}
|
{
"name": "monaca-components",
"version": "1.0.0",
"description": "Monaca Shared Components",
"main": "Gruntfile.js",
"scripts": {
"start": "./node_modules/.bin/grunt server",
"build": "./node_modules/.bin/grunt",
"test": "echo \"Add Tests\""
},
"repository": {
"type": "git",
"url": "git+https://github.com/monaca/cdn.monaca.io.git"
},
"keywords": [
"Monaca"
],
"author": "Erisu",
"license": "Apache-2.0",
"devDependencies": {
"compass-sass-mixins": "^0.12.7",
"grunt": "^1.0.1",
"grunt-aws-s3": "^0.14.5",
"grunt-cli": "^1.2.0",
"grunt-contrib-connect": "^1.0.2",
"grunt-contrib-copy": "^1.0.0",
"grunt-contrib-cssmin": "^1.0.2",
"grunt-contrib-watch": "^1.0.0",
"grunt-sass": "^1.2.1",
"grunt-task-loader": "^0.6.0"
}
}
|
Make npm start command work for non-global grunt-cli
|
Make npm start command work for non-global grunt-cli
|
JSON
|
apache-2.0
|
erisu/cdn.monaca.io
|
json
|
## Code Before:
{
"name": "monaca-components",
"version": "1.0.0",
"description": "Monaca Shared Components",
"main": "Gruntfile.js",
"scripts": {
"start": "grunt server",
"build": "./node_modules/.bin/grunt",
"test": "echo \"Add Tests\""
},
"repository": {
"type": "git",
"url": "git+https://github.com/monaca/cdn.monaca.io.git"
},
"keywords": [
"Monaca"
],
"author": "Erisu",
"license": "Apache-2.0",
"devDependencies": {
"compass-sass-mixins": "^0.12.7",
"grunt": "^1.0.1",
"grunt-aws-s3": "^0.14.5",
"grunt-cli": "^1.2.0",
"grunt-contrib-connect": "^1.0.2",
"grunt-contrib-copy": "^1.0.0",
"grunt-contrib-cssmin": "^1.0.2",
"grunt-contrib-watch": "^1.0.0",
"grunt-sass": "^1.2.1",
"grunt-task-loader": "^0.6.0"
}
}
## Instruction:
Make npm start command work for non-global grunt-cli
## Code After:
{
"name": "monaca-components",
"version": "1.0.0",
"description": "Monaca Shared Components",
"main": "Gruntfile.js",
"scripts": {
"start": "./node_modules/.bin/grunt server",
"build": "./node_modules/.bin/grunt",
"test": "echo \"Add Tests\""
},
"repository": {
"type": "git",
"url": "git+https://github.com/monaca/cdn.monaca.io.git"
},
"keywords": [
"Monaca"
],
"author": "Erisu",
"license": "Apache-2.0",
"devDependencies": {
"compass-sass-mixins": "^0.12.7",
"grunt": "^1.0.1",
"grunt-aws-s3": "^0.14.5",
"grunt-cli": "^1.2.0",
"grunt-contrib-connect": "^1.0.2",
"grunt-contrib-copy": "^1.0.0",
"grunt-contrib-cssmin": "^1.0.2",
"grunt-contrib-watch": "^1.0.0",
"grunt-sass": "^1.2.1",
"grunt-task-loader": "^0.6.0"
}
}
|
aa1788fe13192790591e68d27326cbe0e983fdc0
|
src/icon.js
|
src/icon.js
|
/**
* brightwheel
*
* Copyright © 2016 Allen Smith <[email protected]>. All rights reserved.
*
* This source code is licensed under the MIT license found in the
* LICENSE.txt file in the root directory of this source tree.
*/
/** @jsx etch.dom */
import etch from 'etch';
import classNames from 'classnames';
import BrightwheelComponent from './brightwheel-component';
class Icon extends BrightwheelComponent {
constructor(properties, children) {
// Construct a basic BrightwheelComponent from the parameters given
super(properties, children)
// Set a default icon if unspecified
if(this.properties.icon === undefined) {
this.properties.icon = 'help-circled';
}
// Reinitialize component
etch.initialize(this);
}
render() {
let classes = classNames(
'icon',
`icon-${this.properties.icon}`,
this.properties.classNames
);
return (<span {...this.properties.attributes} className={classes}></span>);
}
}
export default Icon;
|
/**
* brightwheel
*
* Copyright © 2016 Allen Smith <[email protected]>. All rights reserved.
*
* This source code is licensed under the MIT license found in the
* LICENSE.txt file in the root directory of this source tree.
*/
/** @jsx etch.dom */
import etch from 'etch';
import classNames from 'classnames';
import BrightwheelComponent from './brightwheel-component';
class Icon extends BrightwheelComponent {
constructor(properties, children) {
// Construct a basic BrightwheelComponent from the parameters given
super(properties, children)
// Set a default icon if unspecified
if(this.properties.icon === undefined) {
this.properties.icon = 'help-circled';
}
// Reinitialize component
etch.initialize(this);
}
render() {
let classes = classNames(
'icon',
`icon-${this.properties.icon}`,
this.properties.classNames
);
return (<span {...this.properties.attributes} className={classes}></span>);
}
update(properties, children) {
this.properties = properties;
this.children = children;
return etch.update(this);
}
}
export default Icon;
|
Implement Icon.update() for ComponentWidget fix
|
Implement Icon.update() for ComponentWidget fix
|
JavaScript
|
mit
|
loranallensmith/brightwheel
|
javascript
|
## Code Before:
/**
* brightwheel
*
* Copyright © 2016 Allen Smith <[email protected]>. All rights reserved.
*
* This source code is licensed under the MIT license found in the
* LICENSE.txt file in the root directory of this source tree.
*/
/** @jsx etch.dom */
import etch from 'etch';
import classNames from 'classnames';
import BrightwheelComponent from './brightwheel-component';
class Icon extends BrightwheelComponent {
constructor(properties, children) {
// Construct a basic BrightwheelComponent from the parameters given
super(properties, children)
// Set a default icon if unspecified
if(this.properties.icon === undefined) {
this.properties.icon = 'help-circled';
}
// Reinitialize component
etch.initialize(this);
}
render() {
let classes = classNames(
'icon',
`icon-${this.properties.icon}`,
this.properties.classNames
);
return (<span {...this.properties.attributes} className={classes}></span>);
}
}
export default Icon;
## Instruction:
Implement Icon.update() for ComponentWidget fix
## Code After:
/**
* brightwheel
*
* Copyright © 2016 Allen Smith <[email protected]>. All rights reserved.
*
* This source code is licensed under the MIT license found in the
* LICENSE.txt file in the root directory of this source tree.
*/
/** @jsx etch.dom */
import etch from 'etch';
import classNames from 'classnames';
import BrightwheelComponent from './brightwheel-component';
class Icon extends BrightwheelComponent {
constructor(properties, children) {
// Construct a basic BrightwheelComponent from the parameters given
super(properties, children)
// Set a default icon if unspecified
if(this.properties.icon === undefined) {
this.properties.icon = 'help-circled';
}
// Reinitialize component
etch.initialize(this);
}
render() {
let classes = classNames(
'icon',
`icon-${this.properties.icon}`,
this.properties.classNames
);
return (<span {...this.properties.attributes} className={classes}></span>);
}
update(properties, children) {
this.properties = properties;
this.children = children;
return etch.update(this);
}
}
export default Icon;
|
39e9562a4ee3cf702ac467dd07704b92ea9ae808
|
_posts/2014-09-22-using-the-new-daysSinceModified-parameter.html.markdown
|
_posts/2014-09-22-using-the-new-daysSinceModified-parameter.html.markdown
|
---
title: Using the new daysSinceModified search parameter
date: 2014-09-22 00:00 UTC
tags:
layout: post
---
Due to popular demand, we have introduced a new parameter which can be used to filter records modified within X number of days. The new daysSinceModified parameter works for both a day value as well as a date range.
If a user appends daysSinceModified=X to the url where x is a number, the API will search for records modified in the last X days including today.
To get everything modified today, use the following:
> /documents.json?daysSinceModified=0
Documents modified since yesterday (including today) can be found using:
> /documents.json?daysSinceModified=1
Similarly, if a user appends daysSinceModified=x-y to the url, our API will run a search for records modified between X and Y.
If you want to get everything that was modified last week (September 14-20, 2014), not inclusive of any modifcations after:
> /documents.json?daysSinceModified=2014-09-14-2014-09-20
This parameter should help API users who query our API on a daily basis to keep their data updated. Prior to this enhancement, users had to pull all records to get data modified recently. Now using this parameter, they will be able to filter out records they already store.
|
---
title: Using the new daysSinceModified search parameter
date: 2014-09-22 00:00 UTC
tags:
layout: post
---
Due to popular demand, we have introduced a new parameter which can be used to filter records modified within X number of days. If a user appends daysSinceModified=X to the url where X is a number, the API will search for records modified in the last X days including today.
To get everything modified today, use the following:
> /documents.json?daysSinceModified=0
Documents modified since yesterday (including today) can be found using:
> /documents.json?daysSinceModified=1
This parameter should help API users who query our API on a daily basis to keep their data updated. Prior to this enhancement, users had to pull all records to get data modified recently. Now using this parameter, they will be able to filter out records they already store.
|
Remove information about the date range
|
Remove information about the date range
|
Markdown
|
unlicense
|
YOTOV-LIMITED/developers,YOTOV-LIMITED/developers,YOTOV-LIMITED/developers
|
markdown
|
## Code Before:
---
title: Using the new daysSinceModified search parameter
date: 2014-09-22 00:00 UTC
tags:
layout: post
---
Due to popular demand, we have introduced a new parameter which can be used to filter records modified within X number of days. The new daysSinceModified parameter works for both a day value as well as a date range.
If a user appends daysSinceModified=X to the url where x is a number, the API will search for records modified in the last X days including today.
To get everything modified today, use the following:
> /documents.json?daysSinceModified=0
Documents modified since yesterday (including today) can be found using:
> /documents.json?daysSinceModified=1
Similarly, if a user appends daysSinceModified=x-y to the url, our API will run a search for records modified between X and Y.
If you want to get everything that was modified last week (September 14-20, 2014), not inclusive of any modifcations after:
> /documents.json?daysSinceModified=2014-09-14-2014-09-20
This parameter should help API users who query our API on a daily basis to keep their data updated. Prior to this enhancement, users had to pull all records to get data modified recently. Now using this parameter, they will be able to filter out records they already store.
## Instruction:
Remove information about the date range
## Code After:
---
title: Using the new daysSinceModified search parameter
date: 2014-09-22 00:00 UTC
tags:
layout: post
---
Due to popular demand, we have introduced a new parameter which can be used to filter records modified within X number of days. If a user appends daysSinceModified=X to the url where X is a number, the API will search for records modified in the last X days including today.
To get everything modified today, use the following:
> /documents.json?daysSinceModified=0
Documents modified since yesterday (including today) can be found using:
> /documents.json?daysSinceModified=1
This parameter should help API users who query our API on a daily basis to keep their data updated. Prior to this enhancement, users had to pull all records to get data modified recently. Now using this parameter, they will be able to filter out records they already store.
|
5f3ce497c57680cd00409b03817d16e58c2e2e18
|
README.md
|
README.md
|
go-oledb
========
OLEDB for Go language
|
go-oledb
========
OLEDB for Go language
I'm not going to continue with this: it makes more sense to use ADO, and mattn has already
done this: see https://github.com/mattn/go-adodb.
|
Update readme, not going to continue with this for now
|
Update readme, not going to continue with this for now
|
Markdown
|
mit
|
jjeffery/go-oledb
|
markdown
|
## Code Before:
go-oledb
========
OLEDB for Go language
## Instruction:
Update readme, not going to continue with this for now
## Code After:
go-oledb
========
OLEDB for Go language
I'm not going to continue with this: it makes more sense to use ADO, and mattn has already
done this: see https://github.com/mattn/go-adodb.
|
2d0b5c965999cddb97618faf6e1093d162bc5fdc
|
_posts/2016-07-04-gsc-projects.md
|
_posts/2016-07-04-gsc-projects.md
|
---
layout: post
title: Google summer of codes projects
tags:
- Coding
---
Here is a list of all the google summer of code prjects I was a mentor for.
* Implement a Map/Reduce Framework (2016)
[Map/Reduce](http://en.wikipedia.org/wiki/MapReduce) frameworks are getting more and more popular for big data processing (for example [Hadoop](http://hadoop.apache.org/)). By utilizing the unified and standards conforming API of the HPX runtime system, we believe to be able to perfectly repesent the Map/Reduce programming model. Many applications would benefit from direct support in HPX. This might include adding [Hypertable](http://hypertable.org/) or similar libraries to the mix to handle the large data sets Map/Reduce is usually used with.
|
---
layout: post
title: Google summer of codes projects
tags:
- Coding
---
Here is a list of all the google summer of code prjects I was a mentor for.
* Implement a Map/Reduce Framework (2016)
[Map/Reduce](http://en.wikipedia.org/wiki/MapReduce) frameworks are getting more and more popular for big data processing (for example [Hadoop](http://hadoop.apache.org/)). By utilizing the unified and standards conforming API of the HPX runtime system, we believe to be able to perfectly repesent the Map/Reduce programming model. Many applications would benefit from direct support in HPX. This might include adding [Hypertable](http://hypertable.org/) or similar libraries to the mix to handle the large data sets Map/Reduce is usually used with.
The project was done by [Aalekh Nigam](https://twitter.com/_aalekh) and is hosete on [github](https://github.com/AALEKH/hpxflow).
|
Add updates to the gsc projects
|
Add updates to the gsc projects
|
Markdown
|
mit
|
diehlpk/diehlpk.github.io,diehlpk/diehlpk.github.io
|
markdown
|
## Code Before:
---
layout: post
title: Google summer of codes projects
tags:
- Coding
---
Here is a list of all the google summer of code prjects I was a mentor for.
* Implement a Map/Reduce Framework (2016)
[Map/Reduce](http://en.wikipedia.org/wiki/MapReduce) frameworks are getting more and more popular for big data processing (for example [Hadoop](http://hadoop.apache.org/)). By utilizing the unified and standards conforming API of the HPX runtime system, we believe to be able to perfectly repesent the Map/Reduce programming model. Many applications would benefit from direct support in HPX. This might include adding [Hypertable](http://hypertable.org/) or similar libraries to the mix to handle the large data sets Map/Reduce is usually used with.
## Instruction:
Add updates to the gsc projects
## Code After:
---
layout: post
title: Google summer of codes projects
tags:
- Coding
---
Here is a list of all the google summer of code prjects I was a mentor for.
* Implement a Map/Reduce Framework (2016)
[Map/Reduce](http://en.wikipedia.org/wiki/MapReduce) frameworks are getting more and more popular for big data processing (for example [Hadoop](http://hadoop.apache.org/)). By utilizing the unified and standards conforming API of the HPX runtime system, we believe to be able to perfectly repesent the Map/Reduce programming model. Many applications would benefit from direct support in HPX. This might include adding [Hypertable](http://hypertable.org/) or similar libraries to the mix to handle the large data sets Map/Reduce is usually used with.
The project was done by [Aalekh Nigam](https://twitter.com/_aalekh) and is hosete on [github](https://github.com/AALEKH/hpxflow).
|
fb19192f80d283f23da32403e73a129238031bc9
|
circle.yml
|
circle.yml
|
deployment:
production:
branch: master
commands:
- ssh -l openfisca legislation.openfisca.fr "cd legislation-explorer; ./deploy_prod.sh"
|
dependencies:
cache_directories:
- "~/cache"
pre:
# Only create directory if not already present (--parents option).
- mkdir --parents ~/cache
post:
# Only downloads if not already present (--no-clobber option)
- cd ~/cache && wget --no-clobber http://selenium-release.storage.googleapis.com/3.3/selenium-server-standalone-3.3.0.jar
- java -jar ~/cache/selenium-server-standalone-3.3.0.jar:
background: true
deployment:
production:
branch: master
commands:
- ssh -l openfisca legislation.openfisca.fr "cd legislation-explorer; ./deploy_prod.sh"
|
Install selenium standalone server in CI
|
Install selenium standalone server in CI
|
YAML
|
agpl-3.0
|
openfisca/legislation-explorer
|
yaml
|
## Code Before:
deployment:
production:
branch: master
commands:
- ssh -l openfisca legislation.openfisca.fr "cd legislation-explorer; ./deploy_prod.sh"
## Instruction:
Install selenium standalone server in CI
## Code After:
dependencies:
cache_directories:
- "~/cache"
pre:
# Only create directory if not already present (--parents option).
- mkdir --parents ~/cache
post:
# Only downloads if not already present (--no-clobber option)
- cd ~/cache && wget --no-clobber http://selenium-release.storage.googleapis.com/3.3/selenium-server-standalone-3.3.0.jar
- java -jar ~/cache/selenium-server-standalone-3.3.0.jar:
background: true
deployment:
production:
branch: master
commands:
- ssh -l openfisca legislation.openfisca.fr "cd legislation-explorer; ./deploy_prod.sh"
|
cbd2778d71cce9870b40c4c090f765ef6ab69230
|
.travis.yml
|
.travis.yml
|
sudo: false
language: python
cache:
directories:
- $HOME/gcloud/
env:
- PATH=$PATH:$HOME/gcloud/google-cloud-sdk/bin GOOGLE_APPLICATION_CREDENTIALS=$TRAVIS_BUILD_DIR/python-docs-samples.json
PYTHONPATH=${HOME}/gcloud/google-cloud-sdk/platform/google_appengine
before_install:
- openssl aes-256-cbc -K $encrypted_2be61c502406_key -iv $encrypted_2be61c502406_iv
-in silver-python.json.enc -out silver-python.json -d
- if [ ! -d $HOME/gcloud/google-cloud-sdk ]; then mkdir -p $HOME/gcloud && wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
--directory-prefix=$HOME/gcloud && cd $HOME/gcloud && tar xzf google-cloud-sdk.tar.gz
&& printf '\ny\n\ny\ny\n' | ./google-cloud-sdk/install.sh && cd $TRAVIS_BUILD_DIR;
fi
- gcloud -q components update gae-python
- if [ -a python-docs-samples.json ]; then gcloud auth activate-service-account --key-file
silver-python.json; fi
install:
- pip install -r requirements.txt
script:
- gcloud -q preview app deploy app.yaml
- python e2e_test.py
|
sudo: false
language: python
cache:
directories:
- $HOME/gcloud/
env:
- PATH=$PATH:$HOME/gcloud/google-cloud-sdk/bin GOOGLE_APPLICATION_CREDENTIALS=$TRAVIS_BUILD_DIR/python-docs-samples.json
PYTHONPATH=${HOME}/gcloud/google-cloud-sdk/platform/google_appengine
before_install:
- openssl aes-256-cbc -K $encrypted_2be61c502406_key -iv $encrypted_2be61c502406_iv
-in silver-python.json.enc -out silver-python.json -d
- if [ ! -d $HOME/gcloud/google-cloud-sdk ]; then mkdir -p $HOME/gcloud && wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
--directory-prefix=$HOME/gcloud && cd $HOME/gcloud && tar xzf google-cloud-sdk.tar.gz
&& printf '\ny\n\ny\ny\n' | ./google-cloud-sdk/install.sh && cd $TRAVIS_BUILD_DIR;
fi
- gcloud -q components update gae-python
- if [ -a python-docs-samples.json ]; then gcloud auth activate-service-account --key-file
silver-python.json; fi
- gcloud config set project silver-python2
install:
- pip install -r requirements.txt
script:
- gcloud -q preview app deploy app.yaml
- python e2e_test.py
|
Add Project To Travis File
|
Add Project To Travis File
|
YAML
|
apache-2.0
|
waprin/appengine-vm-fortunespeak-python
|
yaml
|
## Code Before:
sudo: false
language: python
cache:
directories:
- $HOME/gcloud/
env:
- PATH=$PATH:$HOME/gcloud/google-cloud-sdk/bin GOOGLE_APPLICATION_CREDENTIALS=$TRAVIS_BUILD_DIR/python-docs-samples.json
PYTHONPATH=${HOME}/gcloud/google-cloud-sdk/platform/google_appengine
before_install:
- openssl aes-256-cbc -K $encrypted_2be61c502406_key -iv $encrypted_2be61c502406_iv
-in silver-python.json.enc -out silver-python.json -d
- if [ ! -d $HOME/gcloud/google-cloud-sdk ]; then mkdir -p $HOME/gcloud && wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
--directory-prefix=$HOME/gcloud && cd $HOME/gcloud && tar xzf google-cloud-sdk.tar.gz
&& printf '\ny\n\ny\ny\n' | ./google-cloud-sdk/install.sh && cd $TRAVIS_BUILD_DIR;
fi
- gcloud -q components update gae-python
- if [ -a python-docs-samples.json ]; then gcloud auth activate-service-account --key-file
silver-python.json; fi
install:
- pip install -r requirements.txt
script:
- gcloud -q preview app deploy app.yaml
- python e2e_test.py
## Instruction:
Add Project To Travis File
## Code After:
sudo: false
language: python
cache:
directories:
- $HOME/gcloud/
env:
- PATH=$PATH:$HOME/gcloud/google-cloud-sdk/bin GOOGLE_APPLICATION_CREDENTIALS=$TRAVIS_BUILD_DIR/python-docs-samples.json
PYTHONPATH=${HOME}/gcloud/google-cloud-sdk/platform/google_appengine
before_install:
- openssl aes-256-cbc -K $encrypted_2be61c502406_key -iv $encrypted_2be61c502406_iv
-in silver-python.json.enc -out silver-python.json -d
- if [ ! -d $HOME/gcloud/google-cloud-sdk ]; then mkdir -p $HOME/gcloud && wget https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
--directory-prefix=$HOME/gcloud && cd $HOME/gcloud && tar xzf google-cloud-sdk.tar.gz
&& printf '\ny\n\ny\ny\n' | ./google-cloud-sdk/install.sh && cd $TRAVIS_BUILD_DIR;
fi
- gcloud -q components update gae-python
- if [ -a python-docs-samples.json ]; then gcloud auth activate-service-account --key-file
silver-python.json; fi
- gcloud config set project silver-python2
install:
- pip install -r requirements.txt
script:
- gcloud -q preview app deploy app.yaml
- python e2e_test.py
|
58a341e8c0018219fd2884d4f6cdc0805f25fa6c
|
vim/vim.symlink/config/05-terminal.vim
|
vim/vim.symlink/config/05-terminal.vim
|
set background=dark
colorscheme solarized
" textwidth < 81 please
set colorcolumn=81
" Medium Menlo
set guifont=Consolas:h16
|
set background=dark
colorscheme solarized
" textwidth < 81 please
set colorcolumn=81
" Medium Consolas for Powerline
set guifont=Consolas\ for\ Powerline:h16
|
Use Consolas for Powerline in Vim
|
Use Consolas for Powerline in Vim
|
VimL
|
mit
|
jcf/ansible-dotfiles,jcf/ansible-dotfiles,jcf/ansible-dotfiles,jcf/ansible-dotfiles,jcf/ansible-dotfiles
|
viml
|
## Code Before:
set background=dark
colorscheme solarized
" textwidth < 81 please
set colorcolumn=81
" Medium Menlo
set guifont=Consolas:h16
## Instruction:
Use Consolas for Powerline in Vim
## Code After:
set background=dark
colorscheme solarized
" textwidth < 81 please
set colorcolumn=81
" Medium Consolas for Powerline
set guifont=Consolas\ for\ Powerline:h16
|
cd70eec47c3696c4713e5f9cf14fc25ebe62f42c
|
package.json
|
package.json
|
{
"name": "serverless-image-resizer",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"deploy": "serverless deploy function -f imageResizer",
"deploy:service": "serverless deploy",
"invoke": "serverless invoke -f imageResizer",
"logs": "serverless logs -f imageResizer -t",
"webpack:build": "serverless webpack --out .webpack",
"webpack:invoke": "serverless webpack invoke -f imageResizer"
},
"dependencies": {
"babel-polyfill": "^6.16.0",
"gm": "^1.23.0",
"serverless": "^1.1.0"
},
"devDependencies": {
"babel-core": "^6.18.2",
"babel-loader": "^6.2.7",
"babel-preset-es2015": "^6.18.0",
"babel-preset-stage-3": "^6.17.0",
"serverless": "^1.1.0",
"serverless-webpack": "^1.0.0-rc.2",
"webpack-node-externals": "^1.5.4",
"yarn": "^0.16.1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
|
{
"name": "serverless-image-resizer",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"deploy": "serverless deploy",
"deploy:function": "serverless deploy function -f imageResizer",
"invoke": "serverless invoke -f imageResizer",
"logs": "serverless logs -f imageResizer -t",
"webpack:build": "serverless webpack --out .webpack",
"webpack:invoke": "serverless webpack invoke -f imageResizer"
},
"dependencies": {
"babel-polyfill": "^6.16.0",
"gm": "^1.23.0",
"serverless": "^1.1.0"
},
"devDependencies": {
"babel-core": "^6.18.2",
"babel-loader": "^6.2.7",
"babel-preset-es2015": "^6.18.0",
"babel-preset-stage-3": "^6.17.0",
"serverless": "^1.1.0",
"serverless-webpack": "^1.0.0-rc.2",
"webpack-node-externals": "^1.5.4",
"yarn": "^0.16.1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
|
Change npm script. `deploy:service` -> `deploy:function`
|
Change npm script. `deploy:service` -> `deploy:function`
|
JSON
|
mit
|
mgi166/serverless-image-resizer
|
json
|
## Code Before:
{
"name": "serverless-image-resizer",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"deploy": "serverless deploy function -f imageResizer",
"deploy:service": "serverless deploy",
"invoke": "serverless invoke -f imageResizer",
"logs": "serverless logs -f imageResizer -t",
"webpack:build": "serverless webpack --out .webpack",
"webpack:invoke": "serverless webpack invoke -f imageResizer"
},
"dependencies": {
"babel-polyfill": "^6.16.0",
"gm": "^1.23.0",
"serverless": "^1.1.0"
},
"devDependencies": {
"babel-core": "^6.18.2",
"babel-loader": "^6.2.7",
"babel-preset-es2015": "^6.18.0",
"babel-preset-stage-3": "^6.17.0",
"serverless": "^1.1.0",
"serverless-webpack": "^1.0.0-rc.2",
"webpack-node-externals": "^1.5.4",
"yarn": "^0.16.1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
## Instruction:
Change npm script. `deploy:service` -> `deploy:function`
## Code After:
{
"name": "serverless-image-resizer",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"deploy": "serverless deploy",
"deploy:function": "serverless deploy function -f imageResizer",
"invoke": "serverless invoke -f imageResizer",
"logs": "serverless logs -f imageResizer -t",
"webpack:build": "serverless webpack --out .webpack",
"webpack:invoke": "serverless webpack invoke -f imageResizer"
},
"dependencies": {
"babel-polyfill": "^6.16.0",
"gm": "^1.23.0",
"serverless": "^1.1.0"
},
"devDependencies": {
"babel-core": "^6.18.2",
"babel-loader": "^6.2.7",
"babel-preset-es2015": "^6.18.0",
"babel-preset-stage-3": "^6.17.0",
"serverless": "^1.1.0",
"serverless-webpack": "^1.0.0-rc.2",
"webpack-node-externals": "^1.5.4",
"yarn": "^0.16.1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
|
c13d35dfff211ac3cfdf6874d1a4fb7631fedeb6
|
.travis.yml
|
.travis.yml
|
language: python
python:
- "2.7"
addons:
apt:
packages:
- oracle-java7-installer
env:
- DEPLOY=true PULL_REQUEST=$TRAVIS_PULL_REQUEST SCM_BRANCH=$TRAVIS_BRANCH
install:
- pip install html5validator
- npm install -g jshint
- npm install -g recess
- npm install -g http-server
- npm install -g node
- npm install -g phantomjs
- npm install -g pa11y
before_script:
- chmod +x ./.scripts/deploy.sh
script:
# HTML5 Validation
- html5validator
# Javascript validation
- jshint ./_static/js/*
# CSS Validation
- recess ./_static/css/*
# Accessibility Validation
- http-server -p 8080 >/dev/null 2>&1 &
- pa11y localhost:8080/index.html -s WCAG2AAA
after_success:
- ./.scripts/deploy.sh
|
language: python
python:
- "2.7"
env:
- DEPLOY=true PULL_REQUEST=$TRAVIS_PULL_REQUEST SCM_BRANCH=$TRAVIS_BRANCH
install:
- pip install html5validator==0.1.14
- npm install -g jshint
- npm install -g recess
- npm install -g http-server
- npm install -g node
- npm install -g phantomjs
- npm install -g pa11y
before_script:
- chmod +x ./.scripts/deploy.sh
script:
# HTML5 Validation
- html5validator
# Javascript validation
- jshint ./_static/js/*
# CSS Validation
- recess ./_static/css/*
# Accessibility Validation
- http-server -p 8080 >/dev/null 2>&1 &
- pa11y localhost:8080/index.html -s WCAG2AAA
after_success:
- ./.scripts/deploy.sh
|
Set an old HTML5 validator version
|
Set an old HTML5 validator version
|
YAML
|
mit
|
Bernardo-MG/docs-bootstrap-template,Bernardo-MG/docs-bootstrap-template
|
yaml
|
## Code Before:
language: python
python:
- "2.7"
addons:
apt:
packages:
- oracle-java7-installer
env:
- DEPLOY=true PULL_REQUEST=$TRAVIS_PULL_REQUEST SCM_BRANCH=$TRAVIS_BRANCH
install:
- pip install html5validator
- npm install -g jshint
- npm install -g recess
- npm install -g http-server
- npm install -g node
- npm install -g phantomjs
- npm install -g pa11y
before_script:
- chmod +x ./.scripts/deploy.sh
script:
# HTML5 Validation
- html5validator
# Javascript validation
- jshint ./_static/js/*
# CSS Validation
- recess ./_static/css/*
# Accessibility Validation
- http-server -p 8080 >/dev/null 2>&1 &
- pa11y localhost:8080/index.html -s WCAG2AAA
after_success:
- ./.scripts/deploy.sh
## Instruction:
Set an old HTML5 validator version
## Code After:
language: python
python:
- "2.7"
env:
- DEPLOY=true PULL_REQUEST=$TRAVIS_PULL_REQUEST SCM_BRANCH=$TRAVIS_BRANCH
install:
- pip install html5validator==0.1.14
- npm install -g jshint
- npm install -g recess
- npm install -g http-server
- npm install -g node
- npm install -g phantomjs
- npm install -g pa11y
before_script:
- chmod +x ./.scripts/deploy.sh
script:
# HTML5 Validation
- html5validator
# Javascript validation
- jshint ./_static/js/*
# CSS Validation
- recess ./_static/css/*
# Accessibility Validation
- http-server -p 8080 >/dev/null 2>&1 &
- pa11y localhost:8080/index.html -s WCAG2AAA
after_success:
- ./.scripts/deploy.sh
|
9b8af4eb531f5e2dd41430b4787d44e6d1ac80b2
|
Instanssi/main2013/templates/main2013/index.html
|
Instanssi/main2013/templates/main2013/index.html
|
{% extends "main2013/base.html" %}
{% load blog_tags %}
{% block head %}
{{ block.super }}
{% endblock %}
{% block jquery %}
{{ block.super }}
{% endblock %}
{% block content %}
{{ block.super }}
<h1>Blogi</h1>
{% render_blog 1 %}
{% render_blog_rss_tag 1 %}
{% endblock %}
|
{% extends "main2013/base.html" %}
{% load blog_tags %}
{% block head %}
{{ block.super }}
{% endblock %}
{% block jquery %}
{{ block.super }}
{% endblock %}
{% block content %}
{{ block.super }}
<h1>Blogi</h1>
{% render_blog event_id %}
{% render_blog_rss_tag event_id %}
{% endblock %}
|
Switch frontpage to use event_id instead of raw event number.
|
main2013: Switch frontpage to use event_id instead of raw event number.
|
HTML
|
mit
|
Instanssi/Instanssi.org,Instanssi/Instanssi.org,Instanssi/Instanssi.org,Instanssi/Instanssi.org
|
html
|
## Code Before:
{% extends "main2013/base.html" %}
{% load blog_tags %}
{% block head %}
{{ block.super }}
{% endblock %}
{% block jquery %}
{{ block.super }}
{% endblock %}
{% block content %}
{{ block.super }}
<h1>Blogi</h1>
{% render_blog 1 %}
{% render_blog_rss_tag 1 %}
{% endblock %}
## Instruction:
main2013: Switch frontpage to use event_id instead of raw event number.
## Code After:
{% extends "main2013/base.html" %}
{% load blog_tags %}
{% block head %}
{{ block.super }}
{% endblock %}
{% block jquery %}
{{ block.super }}
{% endblock %}
{% block content %}
{{ block.super }}
<h1>Blogi</h1>
{% render_blog event_id %}
{% render_blog_rss_tag event_id %}
{% endblock %}
|
e9dddcd43a0543832c6a7a32e845367827876d9c
|
app/controllers/uploads_controller.rb
|
app/controllers/uploads_controller.rb
|
class UploadsController < ApplicationController
before_action :authenticate_user!
def new
@upload = Upload.new(user_id: current_user.id, user_type: "User").tap(&:save)
end
def create
@upload = Upload.new(upload_params)
if @upload.save
redirect_to @upload, notice: 'Document was successfully uploaded.'
else
render action: 'new'
end
end
def download
@upload = Upload.find(params[:id])
redirect_to @upload.file.expiring_url(url_expire_in_seconds)
end
def destroy
@upload = Upload.find(params[:id])
@upload.destroy
flash[:notice] = "File has been deleted."
redirect_to(:back)
end
private
def url_expire_in_seconds
10
end
def upload_params
params.require(:upload).permit(:file, :file_file_name, :user_id, :user_type)
end
end
|
class UploadsController < ApplicationController
  before_action :authenticate_user!
  def new
    @upload = Upload.new(user_id: current_user.id, user_type: "User").tap(&:save)
  end
  def create
    @upload = Upload.new(upload_params)
    if @upload.save
      redirect_to :back, notice: 'File was successfully uploaded.'
    else
      redirect_to :back, alert: 'Failed to upload file.'
    end
  end
  def download
    @upload = Upload.find(params[:id])
    redirect_to @upload.file.expiring_url(url_expire_in_seconds)
  end
  def destroy
    @upload = Upload.find(params[:id])
    @upload.destroy
    flash[:notice] = "File has been deleted."
    redirect_to(:back)
  end
  private
  def url_expire_in_seconds
    10
  end
  def upload_params
    params.require(:upload).permit(:file, :file_file_name, :user_id, :user_type)
  end
end
|
Fix redirect error when uploading document
|
issue-206: Fix redirect error when uploading document
* app/controllers/uploads_controller.rb: Fix redirect and add
appropriate flash message based on the save result.
|
Ruby
|
mit
|
on-site/Grantzilla,on-site/Grantzilla,on-site/Grantzilla
|
ruby
|
## Code Before:
class UploadsController < ApplicationController
  before_action :authenticate_user!
  def new
    @upload = Upload.new(user_id: current_user.id, user_type: "User").tap(&:save)
  end
  def create
    @upload = Upload.new(upload_params)
    if @upload.save
      redirect_to @upload, notice: 'Document was successfully uploaded.'
    else
      render action: 'new'
    end
  end
  def download
    @upload = Upload.find(params[:id])
    redirect_to @upload.file.expiring_url(url_expire_in_seconds)
  end
  def destroy
    @upload = Upload.find(params[:id])
    @upload.destroy
    flash[:notice] = "File has been deleted."
    redirect_to(:back)
  end
  private
  def url_expire_in_seconds
    10
  end
  def upload_params
    params.require(:upload).permit(:file, :file_file_name, :user_id, :user_type)
  end
end
## Instruction:
issue-206: Fix redirect error when uploading document
* app/controllers/uploads_controller.rb: Fix redirect and add
appropriate flash message based on the save result.
## Code After:
class UploadsController < ApplicationController
  before_action :authenticate_user!
  def new
    @upload = Upload.new(user_id: current_user.id, user_type: "User").tap(&:save)
  end
  def create
    @upload = Upload.new(upload_params)
    if @upload.save
      redirect_to :back, notice: 'File was successfully uploaded.'
    else
      redirect_to :back, alert: 'Failed to upload file.'
    end
  end
  def download
    @upload = Upload.find(params[:id])
    redirect_to @upload.file.expiring_url(url_expire_in_seconds)
  end
  def destroy
    @upload = Upload.find(params[:id])
    @upload.destroy
    flash[:notice] = "File has been deleted."
    redirect_to(:back)
  end
  private
  def url_expire_in_seconds
    10
  end
  def upload_params
    params.require(:upload).permit(:file, :file_file_name, :user_id, :user_type)
  end
end
|
37879f7d312deb8fa3c09f7353420a1735597ed4
|
lib/pdf/invoice.rb
|
lib/pdf/invoice.rb
|
require 'render_anywhere'
module PDF
  class Invoice
    include RenderAnywhere
    def initialize(order)
      @order = order
    end
    def generate
      write_html
      convert_html_to_pdf
    end
    def write_html
      File.open(html_filename, 'w') { |f| f.write(html) }
    end
    def convert_html_to_pdf
      `#{WebKitHTMLToPDF.binary} #{html_filename} #(unknown)`
    end
    def html
      set_instance_variable(:order, @order)
      render template, layout: layout
    end
    class RenderingController < ApplicationController
      attr_accessor :order
    end
    def template
      'orders/invoice'
    end
    def layout
      'layouts/invoice'
    end
    def html_filename
      'tmp/invoice.html'
    end
    def filename
      'tmp/invoice.pdf'
    end
  end
end
|
require 'render_anywhere'
module PDF
  class Invoice
    include RenderAnywhere
    def initialize(order)
      @order = order
    end
    def generate
      write_html
      convert_html_to_pdf
    end
    def write_html
      File.open(html_filename, 'w') { |f| f.write(html) }
    end
    def convert_html_to_pdf
      `#{WebKitHTMLToPDF.binary} #{html_filename} #(unknown)`
    end
    def html
      set_instance_variable(:order, @order)
      render template, layout: layout
    end
    class RenderingController < ApplicationController
      def initialize
        super
        set_resolver
      end
      attr_accessor :order
    end
    def template
      'orders/invoice'
    end
    def layout
      'layouts/invoice'
    end
    def html_filename
      'tmp/invoice.html'
    end
    def filename
      'tmp/invoice.pdf'
    end
  end
end
|
Use custom view resolvers in PDF
|
Use custom view resolvers in PDF
|
Ruby
|
mit
|
ianfleeton/zmey,ianfleeton/zmey,ianfleeton/zmey,ianfleeton/zmey
|
ruby
|
## Code Before:
require 'render_anywhere'
module PDF
  class Invoice
    include RenderAnywhere
    def initialize(order)
      @order = order
    end
    def generate
      write_html
      convert_html_to_pdf
    end
    def write_html
      File.open(html_filename, 'w') { |f| f.write(html) }
    end
    def convert_html_to_pdf
      `#{WebKitHTMLToPDF.binary} #{html_filename} #(unknown)`
    end
    def html
      set_instance_variable(:order, @order)
      render template, layout: layout
    end
    class RenderingController < ApplicationController
      attr_accessor :order
    end
    def template
      'orders/invoice'
    end
    def layout
      'layouts/invoice'
    end
    def html_filename
      'tmp/invoice.html'
    end
    def filename
      'tmp/invoice.pdf'
    end
  end
end
## Instruction:
Use custom view resolvers in PDF
## Code After:
require 'render_anywhere'
module PDF
  class Invoice
    include RenderAnywhere
    def initialize(order)
      @order = order
    end
    def generate
      write_html
      convert_html_to_pdf
    end
    def write_html
      File.open(html_filename, 'w') { |f| f.write(html) }
    end
    def convert_html_to_pdf
      `#{WebKitHTMLToPDF.binary} #{html_filename} #(unknown)`
    end
    def html
      set_instance_variable(:order, @order)
      render template, layout: layout
    end
    class RenderingController < ApplicationController
      def initialize
        super
        set_resolver
      end
      attr_accessor :order
    end
    def template
      'orders/invoice'
    end
    def layout
      'layouts/invoice'
    end
    def html_filename
      'tmp/invoice.html'
    end
    def filename
      'tmp/invoice.pdf'
    end
  end
end
|
faf77acc7ddb6a5e2bc198fcfec129f83d2a7678
|
plotly/tests/test_core/test_file/test_file.py
|
plotly/tests/test_core/test_file/test_file.py
|
from nose.tools import raises
from nose import with_setup
import random
import string
import requests
import plotly.plotly as py
import plotly.tools as tls
from plotly.exceptions import PlotlyRequestError
def _random_filename():
    random_chars = [random.choice(string.ascii_uppercase) for _ in range(5)]
    unique_filename = 'Valid Folder'+''.join(random_chars)
    return unique_filename
def init():
    py.sign_in('PythonTest', '9v9f20pext')
@with_setup(init)
def test_create_folder():
    py.file_ops.mkdirs(_random_filename())
@with_setup(init)
def test_create_nested_folders():
    first_folder = _random_filename()
    nested_folder = '{0}/{1}'.format(first_folder, _random_filename())
    py.file_ops.mkdirs(nested_folder)
@with_setup(init)
def test_duplicate_folders():
    first_folder = _random_filename()
    py.file_ops.mkdirs(first_folder)
    try:
        py.file_ops.mkdirs(first_folder)
    except PlotlyRequestError as e:
        if e.status_code != 409:
            raise e
|
import random
import string
from unittest import TestCase
import plotly.plotly as py
from plotly.exceptions import PlotlyRequestError
class FolderAPITestCase(TestCase):
    def setUp(self):
        py.sign_in('PythonTest', '9v9f20pext')
    def _random_filename(self):
        random_chars = [random.choice(string.ascii_uppercase)
                        for _ in range(5)]
        unique_filename = 'Valid Folder'+''.join(random_chars)
        return unique_filename
    def test_create_folder(self):
        try:
            py.file_ops.mkdirs(self._random_filename())
        except PlotlyRequestError as e:
            self.fail('Expected this *not* to fail! Status: {}'
                      .format(e.status_code))
    def test_create_nested_folders(self):
        first_folder = self._random_filename()
        nested_folder = '{0}/{1}'.format(first_folder, self._random_filename())
        try:
            py.file_ops.mkdirs(nested_folder)
        except PlotlyRequestError as e:
            self.fail('Expected this *not* to fail! Status: {}'
                      .format(e.status_code))
    def test_duplicate_folders(self):
        first_folder = self._random_filename()
        py.file_ops.mkdirs(first_folder)
        try:
            py.file_ops.mkdirs(first_folder)
        except PlotlyRequestError as e:
            self.assertTrue(400 <= e.status_code < 500)
        else:
            self.fail('Expected this to fail!')
|
Fix failing test and refact to TestCase.
|
Fix failing test and refact to TestCase.
|
Python
|
mit
|
ee-in/python-api,plotly/plotly.py,plotly/python-api,ee-in/python-api,plotly/python-api,plotly/python-api,plotly/plotly.py,plotly/plotly.py,ee-in/python-api
|
python
|
## Code Before:
from nose.tools import raises
from nose import with_setup
import random
import string
import requests
import plotly.plotly as py
import plotly.tools as tls
from plotly.exceptions import PlotlyRequestError
def _random_filename():
    random_chars = [random.choice(string.ascii_uppercase) for _ in range(5)]
    unique_filename = 'Valid Folder'+''.join(random_chars)
    return unique_filename
def init():
    py.sign_in('PythonTest', '9v9f20pext')
@with_setup(init)
def test_create_folder():
    py.file_ops.mkdirs(_random_filename())
@with_setup(init)
def test_create_nested_folders():
    first_folder = _random_filename()
    nested_folder = '{0}/{1}'.format(first_folder, _random_filename())
    py.file_ops.mkdirs(nested_folder)
@with_setup(init)
def test_duplicate_folders():
    first_folder = _random_filename()
    py.file_ops.mkdirs(first_folder)
    try:
        py.file_ops.mkdirs(first_folder)
    except PlotlyRequestError as e:
        if e.status_code != 409:
            raise e
## Instruction:
Fix failing test and refact to TestCase.
## Code After:
import random
import string
from unittest import TestCase
import plotly.plotly as py
from plotly.exceptions import PlotlyRequestError
class FolderAPITestCase(TestCase):
    def setUp(self):
        py.sign_in('PythonTest', '9v9f20pext')
    def _random_filename(self):
        random_chars = [random.choice(string.ascii_uppercase)
                        for _ in range(5)]
        unique_filename = 'Valid Folder'+''.join(random_chars)
        return unique_filename
    def test_create_folder(self):
        try:
            py.file_ops.mkdirs(self._random_filename())
        except PlotlyRequestError as e:
            self.fail('Expected this *not* to fail! Status: {}'
                      .format(e.status_code))
    def test_create_nested_folders(self):
        first_folder = self._random_filename()
        nested_folder = '{0}/{1}'.format(first_folder, self._random_filename())
        try:
            py.file_ops.mkdirs(nested_folder)
        except PlotlyRequestError as e:
            self.fail('Expected this *not* to fail! Status: {}'
                      .format(e.status_code))
    def test_duplicate_folders(self):
        first_folder = self._random_filename()
        py.file_ops.mkdirs(first_folder)
        try:
            py.file_ops.mkdirs(first_folder)
        except PlotlyRequestError as e:
            self.assertTrue(400 <= e.status_code < 500)
        else:
            self.fail('Expected this to fail!')
|
bc20e8d01dc154d45f9dfc8f2b610d415a40f253
|
broadbean/__init__.py
|
broadbean/__init__.py
|
from . import ripasso
from .element import Element
from .segment import Segment
from .sequence import Sequence
from .blueprint import BluePrint
from .tools import makeVaryingSequence, repeatAndVarySequence
from .broadbean import PulseAtoms
|
from . import ripasso
from .element import Element
from .sequence import Sequence
from .blueprint import BluePrint
from .tools import makeVaryingSequence, repeatAndVarySequence
from .broadbean import PulseAtoms
|
Remove import of version 1.0 feature
|
Remove import of version 1.0 feature
|
Python
|
mit
|
WilliamHPNielsen/broadbean
|
python
|
## Code Before:
from . import ripasso
from .element import Element
from .segment import Segment
from .sequence import Sequence
from .blueprint import BluePrint
from .tools import makeVaryingSequence, repeatAndVarySequence
from .broadbean import PulseAtoms
## Instruction:
Remove import of version 1.0 feature
## Code After:
from . import ripasso
from .element import Element
from .sequence import Sequence
from .blueprint import BluePrint
from .tools import makeVaryingSequence, repeatAndVarySequence
from .broadbean import PulseAtoms
|
1ce9d232e290c32f2c3e851617a89966f0e3eb87
|
lib/templatetags/baseurl.py
|
lib/templatetags/baseurl.py
|
import re
from bs4 import BeautifulSoup
from django import template
register = template.Library()
@register.filter(is_safe=True)
def baseurl(html, base):
    if not base.endswith('/'):
        base += '/'
    absurl = re.compile(r'\s*[a-zA-Z][a-zA-Z0-9\+\.\-]*:') # Starts with scheme:.
    def isabs(url):
        return url.startswith('/') or absurl.match(url)
    soup = BeautifulSoup(html)
    for link in soup.findAll('a', href=True):
        if not isabs(link['href']):
            link['href'] = base + link['href']
    for img in soup.findAll('img', src=True):
        if not isabs(img['src']):
            img['src'] = base + img['src']
    elements = soup.findAll(style=True) # All styled elements.
    for e in elements:
        def func(m):
            url = m.group(2)
            if not isabs(url):
                url = base + url
            return m.group(1) + url + m.group(3)
        e['style'] = re.sub(r'''(url\(\s*)([^\s\)\"\']*)(\s*\))''', func, e['style'])
        e['style'] = re.sub(r'''(url\(\s*")([^\s\"]*)("\s*\))''', func, e['style'])
        e['style'] = re.sub(r'''(url\(\s*')([^\s\']*)('\s*\))''', func, e['style'])
    return str(soup)
|
import re
from bs4 import BeautifulSoup
from django import template
register = template.Library()
@register.filter(is_safe=True)
def baseurl(html, base):
    if not base.endswith('/'):
        base += '/'
    absurl = re.compile(r'\s*[a-zA-Z][a-zA-Z0-9\+\.\-]*:') # Starts with scheme:.
    def isabs(url):
        return url.startswith('/') or absurl.match(url)
    soup = BeautifulSoup(html, 'html.parser')
    for link in soup.findAll('a', href=True):
        if not isabs(link['href']):
            link['href'] = base + link['href']
    for img in soup.findAll('img', src=True):
        if not isabs(img['src']):
            img['src'] = base + img['src']
    elements = soup.findAll(style=True) # All styled elements.
    for e in elements:
        def func(m):
            url = m.group(2)
            if not isabs(url):
                url = base + url
            return m.group(1) + url + m.group(3)
        e['style'] = re.sub(r'''(url\(\s*)([^\s\)\"\']*)(\s*\))''', func, e['style'])
        e['style'] = re.sub(r'''(url\(\s*")([^\s\"]*)("\s*\))''', func, e['style'])
        e['style'] = re.sub(r'''(url\(\s*')([^\s\']*)('\s*\))''', func, e['style'])
    return str(soup)
|
Change bs4 html parser to html.parser
|
Change bs4 html parser to html.parser
This is to fix wrapping with <html> tags.
|
Python
|
mit
|
peterkuma/tjrapid,peterkuma/tjrapid,peterkuma/tjrapid
|
python
|
## Code Before:
import re
from bs4 import BeautifulSoup
from django import template
register = template.Library()
@register.filter(is_safe=True)
def baseurl(html, base):
    if not base.endswith('/'):
        base += '/'
    absurl = re.compile(r'\s*[a-zA-Z][a-zA-Z0-9\+\.\-]*:') # Starts with scheme:.
    def isabs(url):
        return url.startswith('/') or absurl.match(url)
    soup = BeautifulSoup(html)
    for link in soup.findAll('a', href=True):
        if not isabs(link['href']):
            link['href'] = base + link['href']
    for img in soup.findAll('img', src=True):
        if not isabs(img['src']):
            img['src'] = base + img['src']
    elements = soup.findAll(style=True) # All styled elements.
    for e in elements:
        def func(m):
            url = m.group(2)
            if not isabs(url):
                url = base + url
            return m.group(1) + url + m.group(3)
        e['style'] = re.sub(r'''(url\(\s*)([^\s\)\"\']*)(\s*\))''', func, e['style'])
        e['style'] = re.sub(r'''(url\(\s*")([^\s\"]*)("\s*\))''', func, e['style'])
        e['style'] = re.sub(r'''(url\(\s*')([^\s\']*)('\s*\))''', func, e['style'])
    return str(soup)
## Instruction:
Change bs4 html parser to html.parser
This is to fix wrapping with <html> tags.
## Code After:
import re
from bs4 import BeautifulSoup
from django import template
register = template.Library()
@register.filter(is_safe=True)
def baseurl(html, base):
    if not base.endswith('/'):
        base += '/'
    absurl = re.compile(r'\s*[a-zA-Z][a-zA-Z0-9\+\.\-]*:') # Starts with scheme:.
    def isabs(url):
        return url.startswith('/') or absurl.match(url)
    soup = BeautifulSoup(html, 'html.parser')
    for link in soup.findAll('a', href=True):
        if not isabs(link['href']):
            link['href'] = base + link['href']
    for img in soup.findAll('img', src=True):
        if not isabs(img['src']):
            img['src'] = base + img['src']
    elements = soup.findAll(style=True) # All styled elements.
    for e in elements:
        def func(m):
            url = m.group(2)
            if not isabs(url):
                url = base + url
            return m.group(1) + url + m.group(3)
        e['style'] = re.sub(r'''(url\(\s*)([^\s\)\"\']*)(\s*\))''', func, e['style'])
        e['style'] = re.sub(r'''(url\(\s*")([^\s\"]*)("\s*\))''', func, e['style'])
        e['style'] = re.sub(r'''(url\(\s*')([^\s\']*)('\s*\))''', func, e['style'])
    return str(soup)
|
e890bbc882c11d41aec396a3820ae84ad94df568
|
createproject.sh
|
createproject.sh
|
src=$(git rev-parse --show-toplevel) || {
echo "try running the script from within html5-boilerplate directories." >&2
exit 1
}
[[ -d $src ]] || {
echo "fatal: could not determine html5-boilerplate's root directory." >&2
echo "try updating git." >&2
exit 1
}
# get a name for new project
while [[ -z $name ]]
do
echo "To create a new html5-boilerplate project, enter a new directory name:"
read name || exit
done
dst=$src/../$name
if [[ -d $dst ]]
then
echo "$dst exists"
else
#create new project
mkdir -p -- "$dst" || exit 1
#success message
echo "Created Directory: $dst"
cd -- "$src"
cp -vr -- css js img build test *.html *.xml *.txt *.png *.ico .htaccess "$dst"
#success message
echo "Created Project: $dst"
fi
|
src=$(git rev-parse --show-toplevel) || {
echo "try running the script from within html5-boilerplate directories." >&2
exit 1
}
[[ -d $src ]] || {
echo "fatal: could not determine html5-boilerplate's root directory." >&2
echo "try updating git." >&2
exit 1
}
if [ $# -eq 1 ]
then
# get a name for new project from command line arguments
name="$1"
fi
# get a name for new project from input
while [[ -z $name ]]
do
echo "To create a new html5-boilerplate project, enter a new directory name:"
read name || exit
done
if [[ "$name" = /* ]]
then
dst=$name
else
dst=$src/../$name
fi
if [[ -d $dst ]]
then
echo "$dst exists"
else
#create new project
mkdir -p -- "$dst" || exit 1
#success message
echo "Created Directory: $dst"
cd -- "$src"
cp -vr -- css js img build test *.html *.xml *.txt *.png *.ico .htaccess "$dst"
#success message
echo "Created Project: $dst"
fi
|
Handle absolute directories for "name" and optionally take name as command line argument.
|
Handle absolute directories for "name" and optionally take name as command line argument.
|
Shell
|
mit
|
mklabs/node-build-script,mklabs/node-build-script
|
shell
|
## Code Before:
src=$(git rev-parse --show-toplevel) || {
echo "try running the script from within html5-boilerplate directories." >&2
exit 1
}
[[ -d $src ]] || {
echo "fatal: could not determine html5-boilerplate's root directory." >&2
echo "try updating git." >&2
exit 1
}
# get a name for new project
while [[ -z $name ]]
do
echo "To create a new html5-boilerplate project, enter a new directory name:"
read name || exit
done
dst=$src/../$name
if [[ -d $dst ]]
then
echo "$dst exists"
else
#create new project
mkdir -p -- "$dst" || exit 1
#success message
echo "Created Directory: $dst"
cd -- "$src"
cp -vr -- css js img build test *.html *.xml *.txt *.png *.ico .htaccess "$dst"
#success message
echo "Created Project: $dst"
fi
## Instruction:
Handle absolute directories for "name" and optionally take name as command line argument.
## Code After:
src=$(git rev-parse --show-toplevel) || {
echo "try running the script from within html5-boilerplate directories." >&2
exit 1
}
[[ -d $src ]] || {
echo "fatal: could not determine html5-boilerplate's root directory." >&2
echo "try updating git." >&2
exit 1
}
if [ $# -eq 1 ]
then
# get a name for new project from command line arguments
name="$1"
fi
# get a name for new project from input
while [[ -z $name ]]
do
echo "To create a new html5-boilerplate project, enter a new directory name:"
read name || exit
done
if [[ "$name" = /* ]]
then
dst=$name
else
dst=$src/../$name
fi
if [[ -d $dst ]]
then
echo "$dst exists"
else
#create new project
mkdir -p -- "$dst" || exit 1
#success message
echo "Created Directory: $dst"
cd -- "$src"
cp -vr -- css js img build test *.html *.xml *.txt *.png *.ico .htaccess "$dst"
#success message
echo "Created Project: $dst"
fi
|
a342d04b51de6c132337ddc479b57e65cca27ffa
|
app/views/users/_form.html.haml
|
app/views/users/_form.html.haml
|
= simple_form_for @user, html: { class: 'form-horizontal', id: 'user_form' } do |f|
  = f.input :name
  = f.input :email
  = f.input :profile, input_html: { rows: 5, class: 'span6' }
  %hr
  = f.simple_fields_for :preference do |preference_form|
    = preference_form.input :languages, as: :select, collection: %w(english japanese chinese), prompt: 'Select language'
  .form-actions
    = f.button :submit, disable_with: 'Updating...', id: 'user_submit', class: 'btn btn-primary'
|
= simple_form_for @user, html: { class: 'form-horizontal', id: 'user_form' } do |f|
  = f.input :name
  = f.input :email
  = f.input :profile, input_html: { rows: 5, class: 'span6' }
  %hr
  = f.simple_fields_for :preference do |preference_form|
    = preference_form.input :languages, as: :select, collection: %w(english japanese chinese), prompt: 'Select language'
  .form-actions
    = f.button :submit, data: { disable_with: 'Updating...' }, id: 'user_submit', class: 'btn btn-primary'
|
Update to data: { disable_with: 'Text' } style
|
Update to data: { disable_with: 'Text' } style
|
Haml
|
mit
|
dddaisuke/quoty,dddaisuke/quoty,kinopyo/quoty,kinopyo/quoty,dddaisuke/quoty
|
haml
|
## Code Before:
= simple_form_for @user, html: { class: 'form-horizontal', id: 'user_form' } do |f|
  = f.input :name
  = f.input :email
  = f.input :profile, input_html: { rows: 5, class: 'span6' }
  %hr
  = f.simple_fields_for :preference do |preference_form|
    = preference_form.input :languages, as: :select, collection: %w(english japanese chinese), prompt: 'Select language'
  .form-actions
    = f.button :submit, disable_with: 'Updating...', id: 'user_submit', class: 'btn btn-primary'
## Instruction:
Update to data: { disable_with: 'Text' } style
## Code After:
= simple_form_for @user, html: { class: 'form-horizontal', id: 'user_form' } do |f|
  = f.input :name
  = f.input :email
  = f.input :profile, input_html: { rows: 5, class: 'span6' }
  %hr
  = f.simple_fields_for :preference do |preference_form|
    = preference_form.input :languages, as: :select, collection: %w(english japanese chinese), prompt: 'Select language'
  .form-actions
    = f.button :submit, data: { disable_with: 'Updating...' }, id: 'user_submit', class: 'btn btn-primary'
|
970e5334cf4932ee7fa705c7b2f9c5d43749353c
|
src/CrawlLogger.java
|
src/CrawlLogger.java
|
package src;
import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;
public class CrawlLogger {
    static private FileHandler fileTxt;
    static private SimpleFormatter formatterTxt;
    static public void setup() throws IOException {
        // Get the global logger to configure it
        Logger logger = Logger.getLogger("");
        logger.setLevel(Level.INFO);
        fileTxt = new FileHandler("logs/crawler.txt");
        // create txt Formatter
        formatterTxt = new SimpleFormatter();
        fileTxt.setFormatter(formatterTxt);
        logger.addHandler(fileTxt);
    }
}
|
package src;
import java.io.IOException;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;
public class CrawlLogger {
    static private FileHandler fileTxt;
    static private SimpleFormatter formatterTxt;
    static public void setup() throws IOException {
        Calendar cal = Calendar.getInstance();
        Date today = cal.getTime();
        DateFormat timeFormat = new SimpleDateFormat("dd-MM-yyyy");
        // Get the global logger to configure it
        Logger logger = Logger.getLogger("");
        logger.setLevel(Level.INFO);
        fileTxt = new FileHandler("logs/crawler_"+timeFormat.format(today)+".txt");
        // create txt Formatter
        formatterTxt = new SimpleFormatter();
        fileTxt.setFormatter(formatterTxt);
        logger.addHandler(fileTxt);
    }
}
|
Change in the name of the log file
|
Change in the name of the log file
|
Java
|
mit
|
joausaga/ideascalecrawler
|
java
|
## Code Before:
package src;
import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;
public class CrawlLogger {
    static private FileHandler fileTxt;
    static private SimpleFormatter formatterTxt;
    static public void setup() throws IOException {
        // Get the global logger to configure it
        Logger logger = Logger.getLogger("");
        logger.setLevel(Level.INFO);
        fileTxt = new FileHandler("logs/crawler.txt");
        // create txt Formatter
        formatterTxt = new SimpleFormatter();
        fileTxt.setFormatter(formatterTxt);
        logger.addHandler(fileTxt);
    }
}
## Instruction:
Change in the name of the log file
## Code After:
package src;
import java.io.IOException;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;
public class CrawlLogger {
    static private FileHandler fileTxt;
    static private SimpleFormatter formatterTxt;
    static public void setup() throws IOException {
        Calendar cal = Calendar.getInstance();
        Date today = cal.getTime();
        DateFormat timeFormat = new SimpleDateFormat("dd-MM-yyyy");
        // Get the global logger to configure it
        Logger logger = Logger.getLogger("");
        logger.setLevel(Level.INFO);
        fileTxt = new FileHandler("logs/crawler_"+timeFormat.format(today)+".txt");
        // create txt Formatter
        formatterTxt = new SimpleFormatter();
        fileTxt.setFormatter(formatterTxt);
        logger.addHandler(fileTxt);
    }
}
|
635f492f575027b9eeea352c9e85eb3f28abfc67
|
startup.sh
|
startup.sh
|
ipython trust /import/ipython_galaxy_notebook.ipynb
/monitor_traffic.sh &
ipython notebook --no-browser
|
uid=`stat --printf %u /import`
gid=`stat --printf %g /import`
if [[ $uid != '1450' ]] && [[ $gid != '1450' ]]; then
groupadd -r galaxy -g $gid && \
useradd -u $uid -r -g galaxy -d /home/ipython -c "IPython user" galaxy && \
chown galaxy:galaxy /home/ipython -R
su galaxy -c 'ipython trust /import/ipython_galaxy_notebook.ipynb'
su galaxy -c '/monitor_traffic.sh' &
su galaxy -c 'ipython notebook --no-browser'
else
ipython trust /import/ipython_galaxy_notebook.ipynb
/monitor_traffic.sh &
ipython notebook --no-browser
fi
|
Change permission and user if this container is not started from within the Galaxy Docker container.
|
Change permission and user if this container is not started from within the Galaxy Docker container.
|
Shell
|
mit
|
bgruening/docker-jupyter-notebook,bgruening/docker-jupyter-notebook,bgruening/docker-jupyter-notebook,bgruening/docker-ipython-notebook,bgruening/docker-ipython-notebook,bgruening/docker-ipython-notebook
|
shell
|
## Code Before:
ipython trust /import/ipython_galaxy_notebook.ipynb
/monitor_traffic.sh &
ipython notebook --no-browser
## Instruction:
Change permission and user if this container is not started from within the Galaxy Docker container.
## Code After:
uid=`stat --printf %u /import`
gid=`stat --printf %g /import`
if [[ $uid != '1450' ]] && [[ $gid != '1450' ]]; then
groupadd -r galaxy -g $gid && \
useradd -u $uid -r -g galaxy -d /home/ipython -c "IPython user" galaxy && \
chown galaxy:galaxy /home/ipython -R
su galaxy -c 'ipython trust /import/ipython_galaxy_notebook.ipynb'
su galaxy -c '/monitor_traffic.sh' &
su galaxy -c 'ipython notebook --no-browser'
else
ipython trust /import/ipython_galaxy_notebook.ipynb
/monitor_traffic.sh &
ipython notebook --no-browser
fi
|
0168687fb0b9e599e5bf1f25627006531624d114
|
tests/docker.js
|
tests/docker.js
|
var assert = require("assert");
var spawn = require('child_process').spawn;
describe('Start docker container', function () {
it('should show M2 preamble', function (done) {
// ssh localhost
var docker = "docker";
var process = spawn(docker, ["run", "fhinkel/macaulay2", "M2"]);
var encoding = "utf8";
process.stderr.setEncoding(encoding);
process.stderr.on('data', function (data) {
console.log("Preamble: " + data);
assert(data.match(/Macaulay2, version 1\.\d/),
'M2 preamble does not match');
process.kill();
done();
});
process.on('error', function (error) {
assert(false, error);
next();
})
});
});
|
var assert = require("assert");
var spawn = require('child_process').spawn;
describe('Start docker container', function () {
it('should show M2 preamble', function (done) {
// ssh localhost
var docker = "docker";
var process = spawn(docker, ["run", "fhinkel/macaulay2", "M2"]);
var encoding = "utf8";
process.stderr.setEncoding(encoding);
var result = "";
process.stderr.on('data', function (data) {
console.log("Preamble: " + data);
result += data;
});
process.on('error', function (error) {
assert(false, error);
next();
});
process.on('close', function() {
assert(result.match(/Macaulay2, version 1\.\d/),
'M2 preamble does not match');
done();
});
});
});
|
Fix test to wait for output
|
Fix test to wait for output
|
JavaScript
|
mit
|
fhinkel/InteractiveShell,antonleykin/InteractiveShell,antonleykin/InteractiveShell,antonleykin/InteractiveShell,fhinkel/InteractiveShell,fhinkel/InteractiveShell,fhinkel/InteractiveShell
|
javascript
|
## Code Before:
var assert = require("assert");
var spawn = require('child_process').spawn;
describe('Start docker container', function () {
it('should show M2 preamble', function (done) {
// ssh localhost
var docker = "docker";
var process = spawn(docker, ["run", "fhinkel/macaulay2", "M2"]);
var encoding = "utf8";
process.stderr.setEncoding(encoding);
process.stderr.on('data', function (data) {
console.log("Preamble: " + data);
assert(data.match(/Macaulay2, version 1\.\d/),
'M2 preamble does not match');
process.kill();
done();
});
process.on('error', function (error) {
assert(false, error);
next();
})
});
});
## Instruction:
Fix test to wait for output
## Code After:
var assert = require("assert");
var spawn = require('child_process').spawn;
describe('Start docker container', function () {
it('should show M2 preamble', function (done) {
// ssh localhost
var docker = "docker";
var process = spawn(docker, ["run", "fhinkel/macaulay2", "M2"]);
var encoding = "utf8";
process.stderr.setEncoding(encoding);
var result = "";
process.stderr.on('data', function (data) {
console.log("Preamble: " + data);
result += data;
});
process.on('error', function (error) {
assert(false, error);
next();
});
process.on('close', function() {
assert(result.match(/Macaulay2, version 1\.\d/),
'M2 preamble does not match');
done();
});
});
});
|
49fdc12d5e70ebe881497ab28a6422588991693e
|
.travis.yml
|
.travis.yml
|
language: "ruby"
rvm:
- "1.9.3"
- "2.0"
- "2.1"
- "2.2"
- "2.3"
- "jruby"
|
language: "ruby"
rvm:
- "1.9.3"
- "2.0"
- "2.1"
- "2.2"
- "jruby"
before_install:
- gem update --system
- bundle clean --force || exit 0
|
Delete older versions of gems
|
Delete older versions of gems
|
YAML
|
mit
|
chiku/simple_validation
|
yaml
|
## Code Before:
language: "ruby"
rvm:
- "1.9.3"
- "2.0"
- "2.1"
- "2.2"
- "2.3"
- "jruby"
## Instruction:
Delete older versions of gems
## Code After:
language: "ruby"
rvm:
- "1.9.3"
- "2.0"
- "2.1"
- "2.2"
- "jruby"
before_install:
- gem update --system
- bundle clean --force || exit 0
|
6a24763e68182d68f5bbb30ce93193582d4ba302
|
test/bundle_false/package.json
|
test/bundle_false/package.json
|
{
  "name": "bundle-false",
  "main": "main",
  "dependencies": {
    "jqueryt": "2.2.0"
  },
  "system": {
    "directories": {
      "lib": "src"
    },
    "meta": {
      "bar": "foo",
      "foobar": {
        "foo": "bar"
      },
      "jqueryt": {
        "foo": "bar"
      }
    },
    "envs": {
      "window-production": {
        "map": {
          "jqueryt": "@empty",
          "src/dep": "@empty"
        }
      }
    }
  }
}
|
{
  "name": "bundle-false",
  "main": "main",
  "version": "1.0.0",
  "dependencies": {
    "jqueryt": "2.2.0"
  },
  "system": {
    "directories": {
      "lib": "src"
    },
    "meta": {
      "bar": "foo",
      "foobar": {
        "foo": "bar"
      },
      "jqueryt": {
        "foo": "bar"
      }
    },
    "envs": {
      "window-production": {
        "map": {
          "jqueryt": "@empty",
          "src/dep": "@empty"
        }
      }
    }
  }
}
|
Include a version for bundle-false
|
Include a version for bundle-false
|
JSON
|
mit
|
stealjs/steal-tools,stealjs/steal-tools
|
json
|
## Code Before:
{
  "name": "bundle-false",
  "main": "main",
  "dependencies": {
    "jqueryt": "2.2.0"
  },
  "system": {
    "directories": {
      "lib": "src"
    },
    "meta": {
      "bar": "foo",
      "foobar": {
        "foo": "bar"
      },
      "jqueryt": {
        "foo": "bar"
      }
    },
    "envs": {
      "window-production": {
        "map": {
          "jqueryt": "@empty",
          "src/dep": "@empty"
        }
      }
    }
  }
}
## Instruction:
Include a version for bundle-false
## Code After:
{
  "name": "bundle-false",
  "main": "main",
  "version": "1.0.0",
  "dependencies": {
    "jqueryt": "2.2.0"
  },
  "system": {
    "directories": {
      "lib": "src"
    },
    "meta": {
      "bar": "foo",
      "foobar": {
        "foo": "bar"
      },
      "jqueryt": {
        "foo": "bar"
      }
    },
    "envs": {
      "window-production": {
        "map": {
          "jqueryt": "@empty",
          "src/dep": "@empty"
        }
      }
    }
  }
}
|