commit | old_file | new_file | old_contents | new_contents | subject | message | lang | license | repos | config | content
---|---|---|---|---|---|---|---|---|---|---|---|
355f76947bad0b32ac96547c7a5397c150db19e1
|
README.md
|
README.md
|
(very early stage of development)
## Compile instructions
Be sure these libraries are installed:
* libsoup-2.4
* gio-2.0
* gio-unix-2.0
* gee-0.8
* json-glib-1.0
Just run:
```bash
valac -g --save-temps \
--pkg gtk+-3.0 \
--pkg libsoup-2.4 \
--pkg gio-2.0 \
--pkg gio-unix-2.0 \
--pkg gee-0.8 \
--pkg json-glib-1.0 \
main.vala docker/*.vala store/*.vala view/*.vala \
-o gdocker
```
|
(very early stage of development)
## Compile instructions
Be sure these libraries are installed:
* valac (`apt-get install valac`, `apt-get install libgtk-3-dev`)
* libsoup-2.4
* gio-2.0
* gio-unix-2.0
* gee-0.8 (`apt-get install libgee-0.8`)
* json-glib-1.0 (`apt-get install libjson-glib-1.0.0 libjson-glib-dev`)
Just run:
```bash
valac -g --save-temps \
--pkg gtk+-3.0 \
--pkg libsoup-2.4 \
--pkg gio-2.0 \
--pkg gio-unix-2.0 \
--pkg gee-0.8 \
--pkg json-glib-1.0 \
main.vala docker/*.vala store/*.vala view/*.vala \
-o gdocker
```
|
Add apt-get commands to install dependencies
|
Add apt-get commands to install dependencies
|
Markdown
|
mit
|
lcallarec/dockery,lcallarec/gnome-docker-manager
|
markdown
|
3246df2610f4d550a680624e77062e2056ff6356
|
src/scripts/components/control-panel.js
|
src/scripts/components/control-panel.js
|
/** @jsx REACT.DOM */
import React from 'react';
export default React.createClass({
render: function () {
return (
<div className="control-panel">
<p>{this.props.track ? this.props.track.title : 'CONTROL PANEL'}</p>
</div>
);
}
});
|
/** @jsx REACT.DOM */
import React from 'react';
export default React.createClass({
render: function () {
const trackTitle = this.props.track ? this.props.track.title : 'CONTROL PANEL';
return (
<div className="control-panel">
<p>{trackTitle}</p>
</div>
);
}
});
|
Move control panel title into a variable
|
Move control panel title into a variable
This sets me up for more
|
JavaScript
|
mit
|
brandly/ss15-queso,brandly/ss15-queso
|
javascript
|
cd4a0fe699a3dc7cc5624afb2368b4a7d6e7f5b9
|
setup.rb
|
setup.rb
|
require 'rbconfig'
require 'fileutils'
class Setup
attr_accessor :site_dir, :site_lib_dir, :site_arch_dir
def initialize
@site_dir = RbConfig::CONFIG["sitedir"]
@site_lib_dir = RbConfig::CONFIG["sitelibdir"]
@site_arch_dir = RbConfig::CONFIG["sitearchdir"]
end
def compile
Dir.chdir 'ext' do
if File.exist?("Makefile")
system("make clean")
end
system("#{Gem.ruby} extconf.rb")
system("make")
end
end
def copy_files
ext_path =
File.absolute_path "ext/rbkit_tracer.#{RbConfig::MAKEFILE_CONFIG['DLEXT']}"
FileUtils.rm_r site_lib_dir + "/rbkit", force: true
FileUtils.rm_r site_lib_dir + "/rbkit.rb", force: true
FileUtils.rm_r site_arch_dir + "/rbkit_tracer.#{RbConfig::MAKEFILE_CONFIG['DLEXT']}", force: true
FileUtils.cp_r "lib/.", site_lib_dir, verbose: true
FileUtils.cp_r ext_path, site_arch_dir, verbose: true
end
end
if __FILE__ == $0
Setup.new.tap do |setup|
setup.compile
setup.copy_files
end
end
|
require 'rbconfig'
require 'fileutils'
class Setup
attr_accessor :site_dir, :site_lib_dir, :site_arch_dir
def initialize
@site_dir = RbConfig::CONFIG["sitedir"]
@site_lib_dir = RbConfig::CONFIG["sitelibdir"]
@site_arch_dir = RbConfig::CONFIG["sitearchdir"]
end
def compile
Dir.chdir 'ext' do
if File.exist?("Makefile")
system("make clean")
end
system("#{Gem.ruby} extconf.rb")
system("make")
end
end
def copy_files
ext_path =
File.absolute_path "ext/rbkit_tracer.#{RbConfig::MAKEFILE_CONFIG['DLEXT']}"
cleanup
FileUtils.cp_r "lib/.", site_lib_dir, verbose: true
FileUtils.cp_r ext_path, site_arch_dir, verbose: true
end
def cleanup
puts "Removing existing installation"
FileUtils.rm_r site_lib_dir + "/rbkit", force: true
FileUtils.rm_r site_lib_dir + "/rbkit.rb", force: true
FileUtils.rm_r site_arch_dir + "/rbkit_tracer.#{RbConfig::MAKEFILE_CONFIG['DLEXT']}", force: true
end
end
if __FILE__ == $0
cmd_arg = ARGV[0].strip
setup_script = Setup.new
case cmd_arg
when "remove"
setup_script.cleanup
else
setup_script.compile
setup_script.copy_files
end
end
|
Implement new command for removing existing installation
|
Implement new command for removing existing installation
|
Ruby
|
mit
|
code-mancers/rbkit,zBMNForks/rbkit,zBMNForks/rbkit,code-mancers/rbkit,code-mancers/rbkit
|
ruby
|
41ba385efd7745efff2dc03bad513846964b2b2c
|
src/Collection/LocalityCollection.php
|
src/Collection/LocalityCollection.php
|
<?php
namespace Galahad\LaravelAddressing\Collection;
use Galahad\LaravelAddressing\Entity\Locality;
/**
* Class LocalityCollection
*
* @package Galahad\LaravelAddressing
* @author Junior Grossi <[email protected]>
*/
class LocalityCollection extends Collection implements CollectionInterface
{
/**
* Return all the items ready for a <select> HTML element
*
* @return mixed
*/
public function toList()
{
$values = [];
/** @var Locality $locality */
foreach ($this as $locality) {
$values[$locality->getName()] = $locality->getName();
}
return $values;
}
/**
* Override the getByKey method to return the correct instance
*
* @param int $key
* @return Locality
*/
public function getByKey($key)
{
return parent::getByKey($key);
}
/**
* Get a locality by its name
*
* @param $name
* @return Locality
*/
public function getByName($name)
{
if ($this->count()) {
$locality = $this->getByKey(0);
return $locality->getByName($name);
}
}
}
|
<?php
namespace Galahad\LaravelAddressing\Collection;
use Galahad\LaravelAddressing\Entity\Locality;
/**
* Class LocalityCollection
*
* @package Galahad\LaravelAddressing
* @author Junior Grossi <[email protected]>
*/
class LocalityCollection extends Collection implements CollectionInterface
{
/**
* Return all the items ready for a <select> HTML element
*
* @return mixed
*/
public function toList()
{
$values = [];
/** @var Locality $locality */
foreach ($this as $locality) {
$values[] = $locality->getName();
}
return $values;
}
/**
* Override the getByKey method to return the correct instance
*
* @param int $key
* @return Locality
*/
public function getByKey($key)
{
return parent::getByKey($key);
}
/**
* Get a locality by its name
*
* @param $name
* @return Locality
*/
public function getByName($name)
{
if ($this->count()) {
$locality = $this->getByKey(0);
return $locality->getByName($name);
}
}
}
|
Remove key from toList() method in LocalitiesCollection class
|
Remove key from toList() method in LocalitiesCollection class
|
PHP
|
mit
|
glhd/laravel-addressing
|
php
|
f259b1b3d7f63176494ce919450f35863ac1d3ea
|
coreext.sh
|
coreext.sh
|
parse() {
pop c
push `inbufgetaddr`
n=0;
while inbufget xc; do
if test $xc = $c; then break; fi
n="`\"$expr\" $n + 1`"
done
push $n
}
builtin 'parse' parse
dot_s() { # .s
n=0
while expr $n '<' $sp >/dev/null; do
eval "echo $n : \$stk_$n"
n="`\"$expr\" $n + 1`"
done
}
builtin '.s' dot_s
bye() {
exit
}
builtin 'bye' bye
nip() {
pop x2
pop x1
push "$x2"
}
builtin 'nip' nip
tuck() {
pop x2
pop x1
push "$x2"
push "$x1"
push "$x2"
}
builtin 'tuck' tuck
not_equals() { # <>
pop n2
pop n1
if "$expr" "$n1" '!=' "$n2" >/dev/null; then
push -1
else
push 0
fi
}
builtin '<>' not_equals
|
parse() {
pop c
push `inbufgetaddr`
n=0;
while inbufget xc; do
if test $xc = $c; then break; fi
n="`\"$expr\" $n + 1`"
done
push $n
}
builtin 'parse' parse
dot_s() { # .s
n=0
while expr $n '<' $sp >/dev/null; do
eval "echo $n : \$stk_$n"
n="`\"$expr\" $n + 1`"
done
}
builtin '.s' dot_s
bye() {
exit
}
builtin 'bye' bye
nip() {
pop x2
pop x1
push "$x2"
}
builtin 'nip' nip
tuck() {
pop x2
pop x1
push "$x2"
push "$x1"
push "$x2"
}
builtin 'tuck' tuck
not_equals() { # <>
pop n2
pop n1
if "$expr" "$n1" '!=' "$n2" >/dev/null; then
push -1
else
push 0
fi
}
builtin '<>' not_equals
zero_not_equals() { # 0<>
pop n
if "$expr" 0 '!=' "$n" >/dev/null; then
push -1
else
push 0
fi
}
builtin '0<>' zero_not_equals
|
Add 0<> core extension word
|
Add 0<> core extension word
|
Shell
|
mit
|
zeldin/shForth
|
shell
|
0006849e1a01450657b7de7e3475a5f2c8ac1927
|
wordpress/wordpress-performance-notes.md
|
wordpress/wordpress-performance-notes.md
|
Based on the TTFB in Chrome Dev Tools, I have determined that the following plugins significantly slow down a request.
- Easy Update Manager (aka stops-core-theme-and-plugin-updates)
+ Adds 100+ ms
|
Based on the TTFB in Chrome Dev Tools, I have determined that the following plugins significantly slow down a request.
- Easy Update Manager (aka stops-core-theme-and-plugin-updates)
+ Adds 100+ ms to front-end requests
- Yoast (aka wordpress-seo)
+ Adds ~25 ms to front-end requests
|
Update Plugins Known to be Slow Section
|
Update Plugins Known to be Slow Section
|
Markdown
|
mit
|
dhurlburtusa/shortcuts,dhurlburtusa/shortcuts
|
markdown
|
c4d097e6b6193b7b2c02ab525df5bd5e99e16d50
|
.travis.yml
|
.travis.yml
|
language: ruby
sudo: required
rvm:
- 2.2.7
- 2.3.5
- 2.4.2
install:
- sudo apt-get update -qq
- sudo apt-get install libpcap-dev -qq
- bundle install --path vendor/bundle --jobs=3 --retry=3
before_script:
- openssl version
- ruby -ropenssl -e 'puts OpenSSL::VERSION'
script:
- bundle exec rake
- rvmsudo bundle exec rake spec:sudo
|
language: ruby
sudo: required
rvm:
- 2.2.7
- 2.3.5
- 2.4.2
- 2.5.0
install:
- sudo apt-get update -qq
- sudo apt-get install libpcap-dev -qq
- bundle install --path vendor/bundle --jobs=3 --retry=3
before_script:
- openssl version
- ruby -ropenssl -e 'puts OpenSSL::VERSION'
script:
- bundle exec rake
- rvmsudo bundle exec rake spec:sudo
|
Add build with ruby 2.5.0
|
Add build with ruby 2.5.0
|
YAML
|
mit
|
sdaubert/packetgen
|
yaml
|
e4a6fa83cbb57ef2a27b93316dd5f8f2964d1b0b
|
package.json
|
package.json
|
{
"name": "karma-sauce-example",
"version": "0.0.0",
"description": "An example of using the Karma test runner with Sauce Labs' browser cloud to run JavaScript unit tests",
"main": "karma.conf-ci.js",
"scripts": {
"test": "karma start karma.conf-ci.js",
"postinstall": "node install.js"
},
"author": "Sauce Labs",
"license": "Apache V2",
"devDependencies": {
"karma": "^0.12.0",
"karma-sauce-launcher": "^0.2.2",
"karma-jasmine": "^0.2.2",
"karma-chrome-launcher": "^0.1.2",
"karma-firefox-launcher": "^0.1.3",
"karma-safari-launcher": "^0.1.1"
}
}
|
{
"name": "karma-sauce-example",
"version": "0.0.0",
"description": "An example of using the Karma test runner with Sauce Labs' browser cloud to run JavaScript unit tests",
"main": "karma.conf-ci.js",
"scripts": {
"test": "karma start karma.conf-ci.js",
"postinstall": "node install.js"
},
"author": "Sauce Labs",
"license": "Apache V2",
"devDependencies": {
"karma": "^0.12.0",
"karma-sauce-launcher": "ChrisWren/karma-sauce-launcher",
"karma-jasmine": "^0.2.2",
"karma-chrome-launcher": "^0.1.2",
"karma-firefox-launcher": "^0.1.3",
"karma-safari-launcher": "^0.1.1"
}
}
|
Use newer version of sauce connect
|
Use newer version of sauce connect
|
JSON
|
apache-2.0
|
sarbbottam/karma-sauce-example,riyamodi/practiceCI,NickTomlin/karma-sauce-example,intrepido/CI-Test2
|
json
|
f4ade201e4160b4bc9752c4a8db11ee5f64d343d
|
install-protobuf.sh
|
install-protobuf.sh
|
set -ex
basename=protobuf-2.6.0
tarball=$basename.tar.bz2
wget https://protobuf.googlecode.com/svn/rc/$tarball
tar -xvf $tarball
cd $basename && ./configure --prefix=/usr && make && make install
|
set -ex
version=2.6.1
basename=protobuf-$version
curl -sL https://github.com/google/protobuf/releases/download/v$version/$basename.tar.bz2 | tar jx
cd $basename && ./configure --prefix=/usr && make && make install
|
Update `protobuf` to 2.6.1 and simplify the install
|
Update `protobuf` to 2.6.1 and simplify the install
|
Shell
|
mit
|
quixoten/protobuf,zanker/protobuf,localshred/protobuf,lookout/protobuffy,film42/protobuf,ruby-protobuf/protobuf,quixoten/protobuf,localshred/protobuf,lookout/protobuffy,brianstien/protobuf,film42/protobuf,zanker/protobuf,ruby-protobuf/protobuf,brianstien/protobuf
|
shell
|
209b771431ad4472a94f3180dcfe34b50f93f053
|
test/mjsunit/compiler/opt-next-call-turbo.js
|
test/mjsunit/compiler/opt-next-call-turbo.js
|
// Copyright 2015 the V8 project authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
// Flags: --allow-natives-syntax --turbo-filter=*
function foo() {
with ({ value:"fooed" }) { return value; }
}
%OptimizeFunctionOnNextCall(foo);
assertEquals("fooed", foo());
// TODO(mstarzinger): Still not optimized, make sure it is.
// assertOptimized(foo);
function bar() {
with ({ value:"bared" }) { return value; }
}
assertEquals("bared", bar());
%OptimizeFunctionOnNextCall(bar);
assertEquals("bared", bar());
// TODO(mstarzinger): Still not optimized, make sure it is.
// assertOptimized(bar);
|
// Copyright 2015 the V8 project authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
// Flags: --allow-natives-syntax --turbo-filter=*
function foo() {
with ({ value:"fooed" }) { return value; }
}
%OptimizeFunctionOnNextCall(foo);
assertEquals("fooed", foo());
assertOptimized(foo);
function bar() {
with ({ value:"bared" }) { return value; }
}
assertEquals("bared", bar());
%OptimizeFunctionOnNextCall(bar);
assertEquals("bared", bar());
assertOptimized(bar);
|
Enable test coverage for test coverage.
|
Enable test coverage for test coverage.
[email protected]
TEST=mjsunit/compiler/opt-next-call-turbo
Review URL: https://codereview.chromium.org/822673003
Cr-Commit-Position: 972c6d2dc6dd5efdad1377c0d224e03eb8f276f7@{#26192}
|
JavaScript
|
mit
|
UniversalFuture/moosh,UniversalFuture/moosh,UniversalFuture/moosh,UniversalFuture/moosh
|
javascript
|
956bf3e178e84b2733448ff6e98d74feac19b947
|
README.md
|
README.md
|
`sql_tracker` tracks SQL queries by subscribing to Rails' `sql.active_record` event notifications.
It then aggregates the data and generates a report to give you insights into all the SQL queries that happened in your Rails application.
## Installation
Add this line to your application's Gemfile:
```ruby
group :development, :test do
... ...
gem 'sql_tracker'
end
```
And then execute:
$ bundle
## Tracking
To start tracking, simply start your rails application server. When your server is shutting down, `sql_tracker` will dump all the tracking data into one or more json file(s) under the `tmp` folder of your application.
`sql_tracker` can also track sql queries when running rails tests (e.g. your controller or integration tests), it will dump the data after all the tests are finished.
## Reporting
To generate report, run
```bash
sql_tracker tmp/sql_tracker-*.json
```
## License
The gem is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
|
`sql_tracker` tracks SQL queries by subscribing to Rails' `sql.active_record` event notifications.
It then aggregates the data and generates a report to give you insights into all the SQL queries that happened in your Rails application.
## Installation
Add this line to your application's Gemfile:
```ruby
group :development, :test do
... ...
gem 'sql_tracker'
end
```
And then execute:
$ bundle
## Tracking
To start tracking, simply start your rails application server. When your server is shutting down, `sql_tracker` will dump all the tracking data into one or more json file(s) under the `tmp` folder of your application.
`sql_tracker` can also track sql queries when running rails tests (e.g. your controller or integration tests), it will dump the data after all the tests are finished.
## Reporting
To generate report, run
```bash
sql_tracker tmp/sql_tracker-*.json
```
## Configurations
All the configurable variables and their defaults are listed below:
```ruby
SqlTracker::Config.enabled = true
SqlTracker::Config.tracked_paths = %w(app lib)
SqlTracker::Config.tracked_sql_command = %w(SELECT INSERT UPDATE DELETE)
SqlTracker::Config.output_path = File.join(Rails.root.to_s, 'tmp')
```
## License
The gem is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
|
Update readme to include configurations
|
Update readme to include configurations
|
Markdown
|
mit
|
steventen/sql_tracker,steventen/sql_tracker
|
markdown
|
b70e29af95577d048f8bb6aa04488431abd9ec90
|
app/helpers/blacklight_helper.rb
|
app/helpers/blacklight_helper.rb
|
module BlacklightHelper
# Make our application helper functions available to core blacklight views
include ApplicationHelper
include Blacklight::BlacklightHelperBehavior
include Findingaids::Solr::CatalogHelpers::ClassMethods
# delegate :blacklight_config, to: CatalogController
# Change link to document to link out to external guide
def link_to_document(doc, field, opts={:counter => nil})
if(doc.unittitle.blank?)
label=t('search.brief_results.link_text.no_title')
else
field ||= document_show_link_field(doc)
label=presenter(doc).render_document_index_label field, opts
end
link_to_findingaid(doc, label)
end
def sanitize_search_params(params)
params.permit(:q, :f => whitelisted_facets)
rescue NoMethodError => e
params
end
def whitelisted_facets
@whitelisted_facets ||= Hash[facet_fields.map(&:first).map(&:last).map {|f| ["#{f}_sim".to_sym, []]}]
end
def render_bookmarks_control?
has_user_authentication_provider? and current_or_guest_user.present?
end
end
|
module BlacklightHelper
# Make our application helper functions available to core blacklight views
include ApplicationHelper
include Blacklight::BlacklightHelperBehavior
include Findingaids::Solr::CatalogHelpers::ClassMethods
# delegate :blacklight_config, to: CatalogController
# Change link to document to link out to external guide
def link_to_document(doc, field, opts={:counter => nil})
if(doc.unittitle.blank?)
label=t('search.brief_results.link_text.no_title')
else
field ||= document_show_link_field(doc)
label=presenter(doc).render_document_index_label field, opts
end
link_to_findingaid(doc, label)
end
def sanitize_search_params(params)
params.permit(:q, :f => whitelisted_facets)
rescue NoMethodError => e
params
end
def whitelisted_facets
@whitelisted_facets ||= Hash[facet_fields.map(&:first).map(&:last).map {|f| ["#{f}_sim".to_sym, []]}]
end
def render_bookmarks_control?
has_user_authentication_provider? and current_or_guest_user.present?
end
protected
##
# Context in which to evaluate blacklight configuration conditionals
def blacklight_configuration_context
@blacklight_configuration_context ||= Blacklight::Configuration::Context.new(self)
end
end
|
Fix blacklight helper error (cucumber not passing)
|
Fix blacklight helper error (cucumber not passing)
|
Ruby
|
mit
|
NYULibraries/findingaids,NYULibraries/findingaids,NYULibraries/findingaids
|
ruby
|
## Code Before:
module BlacklightHelper
# Make our application helper functions available to core blacklight views
include ApplicationHelper
include Blacklight::BlacklightHelperBehavior
include Findingaids::Solr::CatalogHelpers::ClassMethods
# delegate :blacklight_config, to: CatalogController
# Change link to document to link out to external guide
def link_to_document(doc, field, opts={:counter => nil})
if(doc.unittitle.blank?)
label=t('search.brief_results.link_text.no_title')
else
field ||= document_show_link_field(doc)
label=presenter(doc).render_document_index_label field, opts
end
link_to_findingaid(doc, label)
end
def sanitize_search_params(params)
params.permit(:q, :f => whitelisted_facets)
rescue NoMethodError => e
params
end
def whitelisted_facets
@whitelisted_facets ||= Hash[facet_fields.map(&:first).map(&:last).map {|f| ["#{f}_sim".to_sym, []]}]
end
def render_bookmarks_control?
has_user_authentication_provider? and current_or_guest_user.present?
end
end
## Instruction:
Fix blacklight helper error (cucumber not passing)
## Code After:
module BlacklightHelper
# Make our application helper functions available to core blacklight views
include ApplicationHelper
include Blacklight::BlacklightHelperBehavior
include Findingaids::Solr::CatalogHelpers::ClassMethods
# delegate :blacklight_config, to: CatalogController
# Change link to document to link out to external guide
def link_to_document(doc, field, opts={:counter => nil})
if(doc.unittitle.blank?)
label=t('search.brief_results.link_text.no_title')
else
field ||= document_show_link_field(doc)
label=presenter(doc).render_document_index_label field, opts
end
link_to_findingaid(doc, label)
end
def sanitize_search_params(params)
params.permit(:q, :f => whitelisted_facets)
rescue NoMethodError => e
params
end
def whitelisted_facets
@whitelisted_facets ||= Hash[facet_fields.map(&:first).map(&:last).map {|f| ["#{f}_sim".to_sym, []]}]
end
def render_bookmarks_control?
has_user_authentication_provider? and current_or_guest_user.present?
end
protected
##
# Context in which to evaluate blacklight configuration conditionals
def blacklight_configuration_context
@blacklight_configuration_context ||= Blacklight::Configuration::Context.new(self)
end
end
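As a side note, the whitelisting idea in `whitelisted_facets` above — deriving a permitted-parameters structure from the configured facet fields — translates directly to plain Python. This is only an illustrative sketch, and the facet names below are made up:

```python
# Sketch of the facet-whitelisting pattern from the helper above:
# map each configured facet field to "<field>_sim" with an empty list,
# so a params sanitizer can permit array-valued f[<field>_sim][] keys.
def whitelisted_facets(facet_fields):
    return {f"{field}_sim": [] for field in facet_fields}

print(whitelisted_facets(["collection", "creator"]))
# -> {'collection_sim': [], 'creator_sim': []}
```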
|
f7b4d1cda4e1ffb5e50be903aafd8d0c1e1be38c
|
.circleci/config.yml
|
.circleci/config.yml
|
version: 2
jobs:
build:
working_directory: /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub
docker:
- image: circleci/golang:1.8.1
steps:
- checkout
- run:
command: |
echo "---"
curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
tar xzf google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
cd google-cloud-sdk
sh ./install.sh -q --additional-components beta pubsub-emulator --usage-reporting false
cd ..
bash -c 'source /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub/google-cloud-sdk/path.bash.inc; echo "gcloud? $(which gcloud) PATH $PATH"'
echo "Fetching glide; note GOPATH is $GOPATH"
curl http://glide.sh/get | /bin/bash
echo "Building"
git config --global [email protected]:.insteadOf https://github.com/
make
echo "Testing"
bash -c 'source /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub/google-cloud-sdk/path.bash.inc; make test'
|
version: 2
jobs:
build:
working_directory: /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub
docker:
- image: circleci/golang:1.8.1
steps:
- checkout
- run:
command: |
echo "---"
curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
tar xzf google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
cd google-cloud-sdk
sudo apt-get install openjdk-7-jre
sh ./install.sh -q --additional-components beta pubsub-emulator --usage-reporting false
cd ..
bash -c 'source /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub/google-cloud-sdk/path.bash.inc; echo "gcloud? $(which gcloud) PATH $PATH"'
echo "Fetching glide; note GOPATH is $GOPATH"
curl http://glide.sh/get | /bin/bash
echo "Building"
git config --global [email protected]:.insteadOf https://github.com/
make
echo "Testing"
bash -c 'source /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub/google-cloud-sdk/path.bash.inc; make test'
|
Install OpenJDK 7 JRE for pubsub emulator
|
Install OpenJDK 7 JRE for pubsub emulator
|
YAML
|
apache-2.0
|
SignifAi/snap-plugin-publisher-pubsub,SignifAi/snap-plugin-publisher-pubsub
|
yaml
|
## Code Before:
version: 2
jobs:
build:
working_directory: /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub
docker:
- image: circleci/golang:1.8.1
steps:
- checkout
- run:
command: |
echo "---"
curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
tar xzf google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
cd google-cloud-sdk
sh ./install.sh -q --additional-components beta pubsub-emulator --usage-reporting false
cd ..
bash -c 'source /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub/google-cloud-sdk/path.bash.inc; echo "gcloud? $(which gcloud) PATH $PATH"'
echo "Fetching glide; note GOPATH is $GOPATH"
curl http://glide.sh/get | /bin/bash
echo "Building"
git config --global [email protected]:.insteadOf https://github.com/
make
echo "Testing"
bash -c 'source /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub/google-cloud-sdk/path.bash.inc; make test'
## Instruction:
Install OpenJDK 7 JRE for pubsub emulator
## Code After:
version: 2
jobs:
build:
working_directory: /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub
docker:
- image: circleci/golang:1.8.1
steps:
- checkout
- run:
command: |
echo "---"
curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
tar xzf google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
cd google-cloud-sdk
sudo apt-get install openjdk-7-jre
sh ./install.sh -q --additional-components beta pubsub-emulator --usage-reporting false
cd ..
bash -c 'source /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub/google-cloud-sdk/path.bash.inc; echo "gcloud? $(which gcloud) PATH $PATH"'
echo "Fetching glide; note GOPATH is $GOPATH"
curl http://glide.sh/get | /bin/bash
echo "Building"
git config --global [email protected]:.insteadOf https://github.com/
make
echo "Testing"
bash -c 'source /go/src/github.com/SignifAi/snap-plugin-publisher-pubsub/google-cloud-sdk/path.bash.inc; make test'
|
c36f6f6399761fa8a29aea9ad4f16ce57d700c9d
|
src/scripts/modules/search/results/list/item-partial.html
|
src/scripts/modules/search/results/list/item-partial.html
|
<tr>
<td class="type">
{{#is mediaType 'application/vnd.org.cnx.collection'}}
<span class="fa fa-book"></span>
{{else}}
{{#is mediaType 'application/vnd.org.cnx.module'}}
<span class="fa fa-file"></span>
{{else}}
<span class="fa fa-question-circle"></span>
{{/is}}
{{/is}}
</td>
<td class="title">
<h4><a href="contents/{{id}}">{{title}}</a></h4>
<span>{{include summarySnippet}}</span>
</td>
<td class="authors">
{{#each authors}}
<span class="author">{{author ../authorList index 'fullname'}}</span>
{{/each}}
</td>
<td class="edited">{{date pubDate}}</td>
</tr>
|
<tr>
<td class="type">
{{#is mediaType 'application/vnd.org.cnx.collection'}}
<span class="fa fa-book" aria-label="book"></span>
{{else}}
{{#is mediaType 'application/vnd.org.cnx.module'}}
            <span class="fa fa-file" aria-label="file"></span>
{{else}}
<span class="fa fa-question-circle" aria-label="unknown"></span>
{{/is}}
{{/is}}
</td>
<td class="title">
<h4><a href="contents/{{id}}">{{title}}</a></h4>
<span>{{include summarySnippet}}</span>
</td>
<td class="authors">
{{#each authors}}
<span class="author">{{author ../authorList index 'fullname'}}</span>
{{/each}}
</td>
<td class="edited">{{date pubDate}}</td>
</tr>
|
Add labels to type icons
|
Add labels to type icons
|
HTML
|
agpl-3.0
|
katalysteducation/webview,Connexions/webview,katalysteducation/webview,Connexions/webview,dak/webview,dak/webview,Connexions/webview,dak/webview,katalysteducation/webview,katalysteducation/webview,Connexions/webview
|
html
|
## Code Before:
<tr>
<td class="type">
{{#is mediaType 'application/vnd.org.cnx.collection'}}
<span class="fa fa-book"></span>
{{else}}
{{#is mediaType 'application/vnd.org.cnx.module'}}
<span class="fa fa-file"></span>
{{else}}
<span class="fa fa-question-circle"></span>
{{/is}}
{{/is}}
</td>
<td class="title">
<h4><a href="contents/{{id}}">{{title}}</a></h4>
<span>{{include summarySnippet}}</span>
</td>
<td class="authors">
{{#each authors}}
<span class="author">{{author ../authorList index 'fullname'}}</span>
{{/each}}
</td>
<td class="edited">{{date pubDate}}</td>
</tr>
## Instruction:
Add labels to type icons
## Code After:
<tr>
<td class="type">
{{#is mediaType 'application/vnd.org.cnx.collection'}}
<span class="fa fa-book" aria-label="book"></span>
{{else}}
{{#is mediaType 'application/vnd.org.cnx.module'}}
            <span class="fa fa-file" aria-label="file"></span>
{{else}}
<span class="fa fa-question-circle" aria-label="unknown"></span>
{{/is}}
{{/is}}
</td>
<td class="title">
<h4><a href="contents/{{id}}">{{title}}</a></h4>
<span>{{include summarySnippet}}</span>
</td>
<td class="authors">
{{#each authors}}
<span class="author">{{author ../authorList index 'fullname'}}</span>
{{/each}}
</td>
<td class="edited">{{date pubDate}}</td>
</tr>
|
a8c44e06e6cdebdcc9c6274b7547c064a8e935d9
|
README.md
|
README.md
|
TextureMerger
=============
A tool to take multiple textures and combine them into a single one.<br>
Supports N by N textures with preview, up to 8x8 images, and has a preference system to keep the settings over sessions.<br>
<br>
It also now supports previews in several sizes, to ease previewing big input pictures, or to suit small screens that wouldn't be able to display the full-size preview.<br>
<br>
Known issue / Todo list:<br>
-Preference should be a default value while overridable in the main window.<br>
-Fix the +/- buttons on each row to keep their size.<br>
-Make sure we have the right size for the buttons when changing the preview size.<br>
|
TextureMerger
=============
A tool to take multiple textures and combine them into a single one.<br>
Supports N by N textures with preview, up to 8x8 images, and has a preference system to keep the settings over sessions.<br>
<br>
It also now supports previews in several sizes, to ease previewing big input pictures, or to suit small screens that wouldn't be able to display the full-size preview.<br>
<br>
Known issue / Todo list:<br>
-Fix the +/- buttons on each row to keep their size.<br>
-Make sure we have the right size for the buttons when changing the preview size.<br>
|
Update the ReadMe file to reflect the fix: preferences now live in the main window, while the ones in the prefs are the defaults used when clearing the whole thing.
|
Update the ReadMe file to reflect the fix: preferences now live in the main window, while the ones in the prefs are the defaults used when clearing the whole thing.
|
Markdown
|
mit
|
rl132/TextureMerger
|
markdown
|
## Code Before:
TextureMerger
=============
A tool to take multiple textures and combine them into a single one.<br>
Supports N by N textures with preview, up to 8x8 images, and has a preference system to keep the settings over sessions.<br>
<br>
It also now supports previews in several sizes, to ease previewing big input pictures, or to suit small screens that wouldn't be able to display the full-size preview.<br>
<br>
Known issue / Todo list:<br>
-Preference should be a default value while overridable in the main window.<br>
-Fix the +/- buttons on each row to keep their size.<br>
-Make sure we have the right size for the buttons when changing the preview size.<br>
## Instruction:
Update the ReadMe file to reflect the fix to the preferences being in the main windows while the one in the prefs are the defaults used when clearing the whole thing.
## Code After:
TextureMerger
=============
A tool to take multiple textures and combine them into a single one.<br>
Supports N by N textures with preview, up to 8x8 images, and has a preference system to keep the settings over sessions.<br>
<br>
It also now supports previews in several sizes, to ease previewing big input pictures, or to suit small screens that wouldn't be able to display the full-size preview.<br>
<br>
Known issue / Todo list:<br>
-Fix the +/- buttons on each row to keep their size.<br>
-Make sure we have the right size for the buttons when changing the preview size.<br>
|
458afca669d65ab2ce40c64a1e9013f1f93156fc
|
tsconfig.json
|
tsconfig.json
|
{
"compilerOptions": {
"target": "esnext",
"module": "esnext",
"strict": true,
"declaration": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true
},
"include": ["src/**/*"]
}
|
{
"compilerOptions": {
"target": "ES6",
"module": "ES2020",
"strict": true,
"declaration": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true
},
"include": ["src/**/*"]
}
|
Increase accuracy of TS target + module
|
Increase accuracy of TS target + module
|
JSON
|
mit
|
mkay581/scroll-js,mkay581/scroll-js
|
json
|
## Code Before:
{
"compilerOptions": {
"target": "esnext",
"module": "esnext",
"strict": true,
"declaration": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true
},
"include": ["src/**/*"]
}
## Instruction:
Increase accuracy of TS target + module
## Code After:
{
"compilerOptions": {
"target": "ES6",
"module": "ES2020",
"strict": true,
"declaration": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true
},
"include": ["src/**/*"]
}
|
872c4fc55b3e0d71eea61d9c5bfc60d7e0697d1a
|
roles/dotfiles/files/.vim/after/plugin/loupe.vim
|
roles/dotfiles/files/.vim/after/plugin/loupe.vim
|
function! s:SetUpLoupeHighlight()
highlight! link Search Underlined
endfunction
if has('autocmd')
augroup WincentLoupe
autocmd!
autocmd ColorScheme * call s:SetUpLoupeHighlight()
augroup END
endif
|
function! s:SetUpLoupeHighlight()
highlight! clear Search
execute 'highlight! Search ' . pinnacle#embolden('Underlined')
endfunction
if has('autocmd')
augroup WincentLoupe
autocmd!
autocmd ColorScheme * call s:SetUpLoupeHighlight()
augroup END
endif
|
Make Search highlighting bold like it was before
|
vim: Make Search highlighting bold like it was before
With the move to "Underlined" we lost the bold that we had with
"VisualNOS".
|
VimL
|
unlicense
|
wincent/wincent,wincent/wincent,wincent/wincent,wincent/wincent,wincent/wincent,wincent/wincent,wincent/wincent,wincent/wincent
|
viml
|
## Code Before:
function! s:SetUpLoupeHighlight()
highlight! link Search Underlined
endfunction
if has('autocmd')
augroup WincentLoupe
autocmd!
autocmd ColorScheme * call s:SetUpLoupeHighlight()
augroup END
endif
## Instruction:
vim: Make Search highlighting bold like it was before
With the move to "Underlined" we lost the bold that we had with
"VisualNOS".
## Code After:
function! s:SetUpLoupeHighlight()
highlight! clear Search
execute 'highlight! Search ' . pinnacle#embolden('Underlined')
endfunction
if has('autocmd')
augroup WincentLoupe
autocmd!
autocmd ColorScheme * call s:SetUpLoupeHighlight()
augroup END
endif
|
21e0f8fb856c0be3ee297657ccae7fbfc7abf8fc
|
Demo/uMainForm.pas
|
Demo/uMainForm.pas
|
unit uMainForm;
{$IFDEF FPC}{$MODE Delphi}{$ENDIF}
interface
uses
Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
Dialogs, Menus, StdCtrls;
type
TForm1 = class(TForm)
Memo1: TMemo;
MainMenu1: TMainMenu;
OpenDelphiUnit1: TMenuItem;
OpenDialog1: TOpenDialog;
procedure OpenDelphiUnit1Click(Sender: TObject);
private
{ Private declarations }
public
{ Public declarations }
end;
var
Form1: TForm1;
implementation
uses
DelphiAST, DelphiAST.Writer, DelphiAST.Classes;
{$R *.lfm}
function Parse(const FileName: string): string;
var
SyntaxTree: TSyntaxNode;
begin
Result := '';
SyntaxTree := TPasSyntaxTreeBuilder.Run(FileName);
try
Result := TSyntaxTreeWriter.ToXML(SyntaxTree, True);
finally
SyntaxTree.Free;
end;
end;
procedure TForm1.OpenDelphiUnit1Click(Sender: TObject);
begin
if OpenDialog1.Execute then
begin
try
Memo1.Lines.Text := Parse(OpenDialog1.FileName);
except
on E: EParserException do
Memo1.Lines.Add(Format('[%d, %d] %s', [E.Line, E.Col, E.Message]));
end;
end;
end;
end.
|
unit uMainForm;
{$IFDEF FPC}{$MODE Delphi}{$ENDIF}
interface
uses
Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
Dialogs, Menus, StdCtrls;
type
TForm1 = class(TForm)
Memo1: TMemo;
MainMenu1: TMainMenu;
OpenDelphiUnit1: TMenuItem;
OpenDialog1: TOpenDialog;
procedure OpenDelphiUnit1Click(Sender: TObject);
private
{ Private declarations }
public
{ Public declarations }
end;
var
Form1: TForm1;
implementation
uses
DelphiAST, DelphiAST.Writer, DelphiAST.Classes;
{$IFNDEF FPC}
{$R *.dfm}
{$ELSE}
{$R *.lfm}
{$ENDIF}
function Parse(const FileName: string): string;
var
SyntaxTree: TSyntaxNode;
begin
Result := '';
SyntaxTree := TPasSyntaxTreeBuilder.Run(FileName);
try
Result := TSyntaxTreeWriter.ToXML(SyntaxTree, True);
finally
SyntaxTree.Free;
end;
end;
procedure TForm1.OpenDelphiUnit1Click(Sender: TObject);
begin
if OpenDialog1.Execute then
begin
try
Memo1.Lines.Text := Parse(OpenDialog1.FileName);
except
on E: EParserException do
Memo1.Lines.Add(Format('[%d, %d] %s', [E.Line, E.Col, E.Message]));
end;
end;
end;
end.
|
Correct resource include in demo project's main form
|
Correct resource include in demo project's main form
|
Pascal
|
mpl-2.0
|
Wosi/DelphiASTTemp,AmesianX/DelphiAST,Wosi/DelphiAST,RomanYankovsky/DelphiAST,vintagedave/DelphiAST,LaKraven/DelphiAST,NickRing/DelphiAST,uschuster/DelphiAST,monkeyxu/DelphiAST
|
pascal
|
## Code Before:
unit uMainForm;
{$IFDEF FPC}{$MODE Delphi}{$ENDIF}
interface
uses
Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
Dialogs, Menus, StdCtrls;
type
TForm1 = class(TForm)
Memo1: TMemo;
MainMenu1: TMainMenu;
OpenDelphiUnit1: TMenuItem;
OpenDialog1: TOpenDialog;
procedure OpenDelphiUnit1Click(Sender: TObject);
private
{ Private declarations }
public
{ Public declarations }
end;
var
Form1: TForm1;
implementation
uses
DelphiAST, DelphiAST.Writer, DelphiAST.Classes;
{$R *.lfm}
function Parse(const FileName: string): string;
var
SyntaxTree: TSyntaxNode;
begin
Result := '';
SyntaxTree := TPasSyntaxTreeBuilder.Run(FileName);
try
Result := TSyntaxTreeWriter.ToXML(SyntaxTree, True);
finally
SyntaxTree.Free;
end;
end;
procedure TForm1.OpenDelphiUnit1Click(Sender: TObject);
begin
if OpenDialog1.Execute then
begin
try
Memo1.Lines.Text := Parse(OpenDialog1.FileName);
except
on E: EParserException do
Memo1.Lines.Add(Format('[%d, %d] %s', [E.Line, E.Col, E.Message]));
end;
end;
end;
end.
## Instruction:
Correct resource include in demo project's main form
## Code After:
unit uMainForm;
{$IFDEF FPC}{$MODE Delphi}{$ENDIF}
interface
uses
Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
Dialogs, Menus, StdCtrls;
type
TForm1 = class(TForm)
Memo1: TMemo;
MainMenu1: TMainMenu;
OpenDelphiUnit1: TMenuItem;
OpenDialog1: TOpenDialog;
procedure OpenDelphiUnit1Click(Sender: TObject);
private
{ Private declarations }
public
{ Public declarations }
end;
var
Form1: TForm1;
implementation
uses
DelphiAST, DelphiAST.Writer, DelphiAST.Classes;
{$IFNDEF FPC}
{$R *.dfm}
{$ELSE}
{$R *.lfm}
{$ENDIF}
function Parse(const FileName: string): string;
var
SyntaxTree: TSyntaxNode;
begin
Result := '';
SyntaxTree := TPasSyntaxTreeBuilder.Run(FileName);
try
Result := TSyntaxTreeWriter.ToXML(SyntaxTree, True);
finally
SyntaxTree.Free;
end;
end;
procedure TForm1.OpenDelphiUnit1Click(Sender: TObject);
begin
if OpenDialog1.Execute then
begin
try
Memo1.Lines.Text := Parse(OpenDialog1.FileName);
except
on E: EParserException do
Memo1.Lines.Add(Format('[%d, %d] %s', [E.Line, E.Col, E.Message]));
end;
end;
end;
end.
|
06100741af8fd2c7f16aa6c2dd8ff8668353a8c4
|
scripts/mysql/dbpatch/change-log/schema-portal/mike-25.sql
|
scripts/mysql/dbpatch/change-log/schema-portal/mike-25.sql
|
INSERT INTO `host`(hostname, ip, agent_version, plugin_version)
SELECT nqm_agent.ag_hostname, INET_NTOA(CONV(HEX(ag_ip_address), 16, 10)), '', ''
FROM nqm_agent
LEFT OUTER JOIN
`host`
ON nqm_agent.ag_hostname = `host`.hostname
WHERE `host`.id IS NULL;
ALTER TABLE nqm_agent
ADD COLUMN ag_hs_id INT AFTER ag_id;
UPDATE nqm_agent, `host`
SET ag_hs_id = `host`.`id`
WHERE `host`.hostname = nqm_agent.ag_hostname;
ALTER TABLE nqm_agent
MODIFY COLUMN ag_hs_id INT NOT NULL,
ADD CONSTRAINT fk_nqm_agent__host
FOREIGN KEY (ag_hs_id)
REFERENCES `host`(`id`)
ON UPDATE RESTRICT
ON DELETE RESTRICT;
|
DELETE FROM nqm_agent
WHERE ag_hostname LIKE '%.cdn.fastweb.com.cn';
INSERT INTO `host`(hostname, ip, agent_version, plugin_version)
SELECT DISTINCT nqm_agent.ag_hostname, INET_NTOA(CONV(HEX(ag_ip_address), 16, 10)), '', ''
FROM nqm_agent
LEFT OUTER JOIN
`host`
ON nqm_agent.ag_hostname = `host`.hostname
WHERE `host`.id IS NULL;
ALTER TABLE nqm_agent
ADD COLUMN ag_hs_id INT AFTER ag_id;
UPDATE nqm_agent, `host`
SET ag_hs_id = `host`.`id`
WHERE `host`.hostname = nqm_agent.ag_hostname;
ALTER TABLE nqm_agent
MODIFY COLUMN ag_hs_id INT NOT NULL,
ADD CONSTRAINT fk_nqm_agent__host
FOREIGN KEY (ag_hs_id)
REFERENCES `host`(`id`)
ON UPDATE RESTRICT
ON DELETE RESTRICT;
|
Fix patch of building host from existing NQM agent
|
[OWL-750] Fix patch of building host from existing NQM agent
|
SQL
|
apache-2.0
|
masato25/open-falcon-backend,Cepave/open-falcon-backend,Cepave/open-falcon-backend,masato25/open-falcon-backend,Cepave/open-falcon-backend,masato25/open-falcon-backend,Cepave/open-falcon-backend,masato25/open-falcon-backend,Cepave/open-falcon-backend,masato25/open-falcon-backend,masato25/open-falcon-backend,Cepave/open-falcon-backend
|
sql
|
## Code Before:
INSERT INTO `host`(hostname, ip, agent_version, plugin_version)
SELECT nqm_agent.ag_hostname, INET_NTOA(CONV(HEX(ag_ip_address), 16, 10)), '', ''
FROM nqm_agent
LEFT OUTER JOIN
`host`
ON nqm_agent.ag_hostname = `host`.hostname
WHERE `host`.id IS NULL;
ALTER TABLE nqm_agent
ADD COLUMN ag_hs_id INT AFTER ag_id;
UPDATE nqm_agent, `host`
SET ag_hs_id = `host`.`id`
WHERE `host`.hostname = nqm_agent.ag_hostname;
ALTER TABLE nqm_agent
MODIFY COLUMN ag_hs_id INT NOT NULL,
ADD CONSTRAINT fk_nqm_agent__host
FOREIGN KEY (ag_hs_id)
REFERENCES `host`(`id`)
ON UPDATE RESTRICT
ON DELETE RESTRICT;
## Instruction:
[OWL-750] Fix patch of building host from existing NQM agent
## Code After:
DELETE FROM nqm_agent
WHERE ag_hostname LIKE '%.cdn.fastweb.com.cn';
INSERT INTO `host`(hostname, ip, agent_version, plugin_version)
SELECT DISTINCT nqm_agent.ag_hostname, INET_NTOA(CONV(HEX(ag_ip_address), 16, 10)), '', ''
FROM nqm_agent
LEFT OUTER JOIN
`host`
ON nqm_agent.ag_hostname = `host`.hostname
WHERE `host`.id IS NULL;
ALTER TABLE nqm_agent
ADD COLUMN ag_hs_id INT AFTER ag_id;
UPDATE nqm_agent, `host`
SET ag_hs_id = `host`.`id`
WHERE `host`.hostname = nqm_agent.ag_hostname;
ALTER TABLE nqm_agent
MODIFY COLUMN ag_hs_id INT NOT NULL,
ADD CONSTRAINT fk_nqm_agent__host
FOREIGN KEY (ag_hs_id)
REFERENCES `host`(`id`)
ON UPDATE RESTRICT
ON DELETE RESTRICT;
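The `INET_NTOA(CONV(HEX(ag_ip_address), 16, 10))` expression in the patch above converts a raw 4-byte IP column into dotted-quad text. For a quick sanity check of the same conversion outside MySQL, Python's standard `socket` module does the equivalent (a sketch; the sample address is arbitrary):

```python
import socket

# Equivalent of MySQL's INET_NTOA(CONV(HEX(ip), 16, 10)) for a raw
# 4-byte IPv4 value: interpret the bytes big-endian, format as a
# dotted quad. The sample column value below is an arbitrary example.
raw_ip = b"\xc0\xa8\x00\x01"
print(socket.inet_ntoa(raw_ip))  # -> 192.168.0.1
```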
|
2bde683bcfbdc7149a114abd609a3c91c19cac0f
|
tagcache/lock.py
|
tagcache/lock.py
|
import os
import fcntl
class FileLock(object):
def __init__(self, fd):
# the fd is borrowed, so do not close it
self.fd = fd
def acquire(self, write=False, block=False):
try:
lock_flags = fcntl.LOCK_EX if write else fcntl.LOCK_SH
if not block:
lock_flags |= fcntl.LOCK_NB
fcntl.flock(self.fd, lock_flags)
return True
except IOError:
return False
def release(self):
fcntl.flock(self.fd, fcntl.LOCK_UN)
|
import os
import fcntl
class FileLock(object):
def __init__(self, fd):
# the fd is borrowed, so do not close it
self.fd = fd
def acquire(self, ex=False, nb=True):
"""
Acquire a lock on the fd.
        :param ex (optional): default False, acquire an exclusive lock if True
        :param nb (optional): default True, non-blocking if True
:return: True if acquired
"""
try:
lock_flags = fcntl.LOCK_EX if ex else fcntl.LOCK_SH
if nb:
lock_flags |= fcntl.LOCK_NB
fcntl.flock(self.fd, lock_flags)
return True
except IOError:
return False
def release(self):
"""
Release lock on the fd.
"""
fcntl.flock(self.fd, fcntl.LOCK_UN)
|
Change parameter names in acquire. Add some doc.
|
Change parameter names in acquire.
Add some doc.
|
Python
|
mit
|
huangjunwen/tagcache
|
python
|
## Code Before:
import os
import fcntl
class FileLock(object):
def __init__(self, fd):
# the fd is borrowed, so do not close it
self.fd = fd
def acquire(self, write=False, block=False):
try:
lock_flags = fcntl.LOCK_EX if write else fcntl.LOCK_SH
if not block:
lock_flags |= fcntl.LOCK_NB
fcntl.flock(self.fd, lock_flags)
return True
except IOError:
return False
def release(self):
fcntl.flock(self.fd, fcntl.LOCK_UN)
## Instruction:
Change parameter names in acquire.
Add some doc.
## Code After:
import os
import fcntl
class FileLock(object):
def __init__(self, fd):
# the fd is borrowed, so do not close it
self.fd = fd
def acquire(self, ex=False, nb=True):
"""
Acquire a lock on the fd.
        :param ex (optional): default False, acquire an exclusive lock if True
        :param nb (optional): default True, non-blocking if True
:return: True if acquired
"""
try:
lock_flags = fcntl.LOCK_EX if ex else fcntl.LOCK_SH
if nb:
lock_flags |= fcntl.LOCK_NB
fcntl.flock(self.fd, lock_flags)
return True
except IOError:
return False
def release(self):
"""
Release lock on the fd.
"""
fcntl.flock(self.fd, fcntl.LOCK_UN)
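For completeness, here is a minimal, self-contained usage sketch of the lock wrapper above (POSIX-only, since it relies on `fcntl`; the temporary file is just scaffolding for the example):

```python
import fcntl
import tempfile

class FileLock:
    """Thin flock() wrapper; the fd is borrowed, so it is not closed here."""
    def __init__(self, fd):
        self.fd = fd

    def acquire(self, ex=False, nb=True):
        # Shared lock by default; exclusive when ex=True; non-blocking
        # unless nb=False. Returns True only if the lock was taken.
        flags = fcntl.LOCK_EX if ex else fcntl.LOCK_SH
        if nb:
            flags |= fcntl.LOCK_NB
        try:
            fcntl.flock(self.fd, flags)
            return True
        except IOError:
            return False

    def release(self):
        fcntl.flock(self.fd, fcntl.LOCK_UN)

# Illustration: take a non-blocking exclusive lock on a scratch file.
with tempfile.TemporaryFile() as f:
    lock = FileLock(f.fileno())
    got = lock.acquire(ex=True)
    print(got)  # -> True (no other process holds the lock)
    if got:
        lock.release()
```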
|
4f1aaea09b80e6a12e0b34df547eaf8222f59e81
|
app/views/credit_invoices/_form.html.haml
|
app/views/credit_invoices/_form.html.haml
|
= semantic_form_for resource do |f|
= f.semantic_errors
= f.inputs do
= f.input :customer_id, :as => :hidden
= f.input :state, :collection => invoice_states_as_collection
.combobox-action= list_link_for(:new, Company, :target => 'blank')
= f.input :company, :input_html => {:class => 'combobox', 'data-autofocus' => true}
= f.input :title
= f.input :code
= f.input :value_date, :as => :date_field
= f.input :due_date, :as => :date_field
= f.input :amount, :input_html => {:size => 12}
= f.input :remarks, :input_html => {:rows => 4}
= f.inputs t('title.period') do
= f.input :duration_from, :as => :date_field
= f.input :duration_to, :as => :date_field
= f.inputs do
= f.semantic_fields_for :attachments do |a|
= a.input :title
= a.input :file, :as => :file
= f.buttons do
= f.commit_button
|
= semantic_form_for resource, :html => {:multipart => true} do |f|
= f.semantic_errors
= f.inputs do
= f.input :customer_id, :as => :hidden
= f.input :state, :collection => invoice_states_as_collection
.combobox-action= list_link_for(:new, Company, :target => 'blank')
= f.input :company, :input_html => {:class => 'combobox', 'data-autofocus' => true}
= f.input :title
= f.input :code
= f.input :value_date, :as => :date_field
= f.input :due_date, :as => :date_field
= f.input :amount, :input_html => {:size => 12}
= f.input :remarks, :input_html => {:rows => 4}
= f.inputs t('title.period') do
= f.input :duration_from, :as => :date_field
= f.input :duration_to, :as => :date_field
= f.inputs do
= f.semantic_fields_for :attachments do |a|
= a.input :title
= a.input :file, :as => :file
= f.buttons do
= f.commit_button
|
Use multipart form for credit invoices because of attachments.
|
Use multipart form for credit invoices because of attachments.
|
Haml
|
agpl-3.0
|
xuewenfei/bookyt,hauledev/bookyt,hauledev/bookyt,xuewenfei/bookyt,silvermind/bookyt,wtag/bookyt,silvermind/bookyt,silvermind/bookyt,gaapt/bookyt,xuewenfei/bookyt,silvermind/bookyt,hauledev/bookyt,gaapt/bookyt,gaapt/bookyt,huerlisi/bookyt,wtag/bookyt,gaapt/bookyt,hauledev/bookyt,huerlisi/bookyt,huerlisi/bookyt,wtag/bookyt
|
haml
|
## Code Before:
= semantic_form_for resource do |f|
= f.semantic_errors
= f.inputs do
= f.input :customer_id, :as => :hidden
= f.input :state, :collection => invoice_states_as_collection
.combobox-action= list_link_for(:new, Company, :target => 'blank')
= f.input :company, :input_html => {:class => 'combobox', 'data-autofocus' => true}
= f.input :title
= f.input :code
= f.input :value_date, :as => :date_field
= f.input :due_date, :as => :date_field
= f.input :amount, :input_html => {:size => 12}
= f.input :remarks, :input_html => {:rows => 4}
= f.inputs t('title.period') do
= f.input :duration_from, :as => :date_field
= f.input :duration_to, :as => :date_field
= f.inputs do
= f.semantic_fields_for :attachments do |a|
= a.input :title
= a.input :file, :as => :file
= f.buttons do
= f.commit_button
## Instruction:
Use multipart form for credit invoices because of attachments.
## Code After:
= semantic_form_for resource, :html => {:multipart => true} do |f|
= f.semantic_errors
= f.inputs do
= f.input :customer_id, :as => :hidden
= f.input :state, :collection => invoice_states_as_collection
.combobox-action= list_link_for(:new, Company, :target => 'blank')
= f.input :company, :input_html => {:class => 'combobox', 'data-autofocus' => true}
= f.input :title
= f.input :code
= f.input :value_date, :as => :date_field
= f.input :due_date, :as => :date_field
= f.input :amount, :input_html => {:size => 12}
= f.input :remarks, :input_html => {:rows => 4}
= f.inputs t('title.period') do
= f.input :duration_from, :as => :date_field
= f.input :duration_to, :as => :date_field
= f.inputs do
= f.semantic_fields_for :attachments do |a|
= a.input :title
= a.input :file, :as => :file
= f.buttons do
= f.commit_button
|
a0f10069388ecc9cde03a2731b6669141f053511
|
lib/generators/link_to_action/install_generator.rb
|
lib/generators/link_to_action/install_generator.rb
|
require 'rails/generators'
require 'rails/generators/named_base'
module LinkToAction
module Generators
class InstallGenerator < Rails::Generators::Base
desc "Copy LinkToAction configuration file"
source_root File.expand_path('../templates', __FILE__)
class_option :bootstrap, type: :boolean,
desc: 'Add the Twitter Bootstrap option and templates.'
def info_framework
puts "link_to_action supports Twitter Bootstrap. If you want a " \
"configuration that is compatible with this framework, then please " \
"re-run this generator with --bootstrap option." unless options.bootstrap?
end
def copy_initializers
copy_file 'link_to_action.rb', 'config/initializers/link_to_action.rb'
if options[:bootstrap]
gsub_file 'config/initializers/link_to_action.rb',
%r|# config.use_icons = false|, 'config.use_icons = true'
gsub_file 'config/initializers/link_to_action.rb',
%r|# config.use_classes = false|, 'config.use_classes = true'
end
end
def copy_locale_file
copy_file '../../../link_to_action/locale/en.yml',
'config/locales/link_to_action.en.yml'
end
def copy_templates
['edit', 'index', 'new', 'show'].each do |action|
copy_file "#{action}.html.erb",
"lib/templates/erb/scaffold/#{action}.html.erb"
end
end
end
end
end
|
require 'rails/generators'
require 'rails/generators/named_base'
module LinkToAction
module Generators
class InstallGenerator < Rails::Generators::Base
desc "Copy LinkToAction configuration file"
source_root File.expand_path('../templates', __FILE__)
class_option :bootstrap, type: :boolean,
desc: 'Add the Twitter Bootstrap option and templates.'
def info_framework
puts "link_to_action supports Twitter Bootstrap. If you want a " \
"configuration that is compatible with this framework, then please " \
"re-run this generator with --bootstrap option." unless options.bootstrap?
end
def copy_initializers
copy_file 'link_to_action.rb', 'config/initializers/link_to_action.rb'
if options[:bootstrap]
gsub_file 'config/initializers/link_to_action.rb',
%r|# config.use_icons = false|, 'config.use_icons = true'
gsub_file 'config/initializers/link_to_action.rb',
%r|# config.use_classes = false|, 'config.use_classes = true'
end
end
def copy_locale_file
copy_file '../../../link_to_action/locale/en.yml',
'config/locales/link_to_action.en.yml'
end
def copy_templates
['edit', 'index', 'new', 'show'].each do |action|
source_dir = 'bootstrap' if options[:bootstrap]
source_file = [ source_dir, "#{action}.html.erb"].compact.join('/')
copy_file source_file, "lib/templates/erb/scaffold/#{action}.html.erb"
end
end
end
end
end
|
Copy bootstrap templates if option specified.
|
Copy bootstrap templates if option specified.
|
Ruby
|
mit
|
denispeplin/link_to_action,denispeplin/link_to_action,denispeplin/link_to_action
|
ruby
|
## Code Before:
require 'rails/generators'
require 'rails/generators/named_base'
module LinkToAction
module Generators
class InstallGenerator < Rails::Generators::Base
desc "Copy LinkToAction configuration file"
source_root File.expand_path('../templates', __FILE__)
class_option :bootstrap, type: :boolean,
desc: 'Add the Twitter Bootstrap option and templates.'
def info_framework
puts "link_to_action supports Twitter Bootstrap. If you want a " \
"configuration that is compatible with this framework, then please " \
"re-run this generator with --bootstrap option." unless options.bootstrap?
end
def copy_initializers
copy_file 'link_to_action.rb', 'config/initializers/link_to_action.rb'
if options[:bootstrap]
gsub_file 'config/initializers/link_to_action.rb',
%r|# config.use_icons = false|, 'config.use_icons = true'
gsub_file 'config/initializers/link_to_action.rb',
%r|# config.use_classes = false|, 'config.use_classes = true'
end
end
def copy_locale_file
copy_file '../../../link_to_action/locale/en.yml',
'config/locales/link_to_action.en.yml'
end
def copy_templates
['edit', 'index', 'new', 'show'].each do |action|
copy_file "#{action}.html.erb",
"lib/templates/erb/scaffold/#{action}.html.erb"
end
end
end
end
end
## Instruction:
Copy bootstrap templates if option specified.
## Code After:
require 'rails/generators'
require 'rails/generators/named_base'
module LinkToAction
module Generators
class InstallGenerator < Rails::Generators::Base
desc "Copy LinkToAction configuration file"
source_root File.expand_path('../templates', __FILE__)
class_option :bootstrap, type: :boolean,
desc: 'Add the Twitter Bootstrap option and templates.'
def info_framework
puts "link_to_action supports Twitter Bootstrap. If you want a " \
"configuration that is compatible with this framework, then please " \
"re-run this generator with --bootstrap option." unless options.bootstrap?
end
def copy_initializers
copy_file 'link_to_action.rb', 'config/initializers/link_to_action.rb'
if options[:bootstrap]
gsub_file 'config/initializers/link_to_action.rb',
%r|# config.use_icons = false|, 'config.use_icons = true'
gsub_file 'config/initializers/link_to_action.rb',
%r|# config.use_classes = false|, 'config.use_classes = true'
end
end
def copy_locale_file
copy_file '../../../link_to_action/locale/en.yml',
'config/locales/link_to_action.en.yml'
end
def copy_templates
['edit', 'index', 'new', 'show'].each do |action|
source_dir = 'bootstrap' if options[:bootstrap]
source_file = [ source_dir, "#{action}.html.erb"].compact.join('/')
copy_file source_file, "lib/templates/erb/scaffold/#{action}.html.erb"
end
end
end
end
end
|
1fb17e45fce8927b8ab82416dc915233eaffc300
|
SETUP.md
|
SETUP.md
|
How to add DNG Math to DNG
========
1) Navigate to RDNG
2) Open the 'Mini Dashboard' on the left hand side
3) Click the 'Add Widget' button

4) Click 'Add OpenSocial Gadget' under 'Add External Widgets'

5) Specify `https://connyay.github.io/dng_math/ext.xml` as the gadget URL and click 'Add Widget'

6) Done!

|
How to add DNG Math to DNG
========
1) Navigate to RDNG
2) Open the 'Mini Dashboard' on the left hand side
3) Click the 'Add Widget' button

4) Click 'Add OpenSocial Gadget' under 'Add External Widgets'

5) Specify `https://connyay.github.io/dng_math/ext.xml` as the gadget URL and click 'Add Widget'

6) Done!

### Note
The Load / Save formula functionality requires an attribute named `Formula` to be on the currently open artifact.
|
Add note about Load / Save functionality
|
Add note about Load / Save functionality
|
Markdown
|
mit
|
connyay/dng_math
|
markdown
|
## Code Before:
How to add DNG Math to DNG
========
1) Navigate to RDNG
2) Open the 'Mini Dashboard' on the left hand side
3) Click the 'Add Widget' button

4) Click 'Add OpenSocial Gadget' under 'Add External Widgets'

5) Specify `https://connyay.github.io/dng_math/ext.xml` as the gadget URL and click 'Add Widget'

6) Done!

## Instruction:
Add note about Load / Save functionality
## Code After:
How to add DNG Math to DNG
========
1) Navigate to RDNG
2) Open the 'Mini Dashboard' on the left hand side
3) Click the 'Add Widget' button

4) Click 'Add OpenSocial Gadget' under 'Add External Widgets'

5) Specify `https://connyay.github.io/dng_math/ext.xml` as the gadget URL and click 'Add Widget'

6) Done!

### Note
The Load / Save formula functionality requires an attribute named `Formula` to be on the currently open artifact.
|
41a05c80e673dddd3037157a6190bfc5a5769f65
|
device-base/Dockerfile.artik.fedora.tpl
|
device-base/Dockerfile.artik.fedora.tpl
|
FROM #{FROM}
LABEL io.resin.device-type="#{DEV_TYPE}"
COPY artik.repo /etc/yum.repos.d/
COPY RPM-GPG-KEY-artik-#{SUITE}-armhfp /etc/pki/rpm-gpg/
RUN dnf install -y \
less \
nano \
net-tools \
usbutils \
gnupg \
i2c-tools \
libartik-sdk-tests \
&& dnf clean all
|
FROM #{FROM}
LABEL io.resin.device-type="#{DEV_TYPE}"
COPY artik.repo /etc/yum.repos.d/
COPY RPM-GPG-KEY-artik-#{SUITE}-armhfp /etc/pki/rpm-gpg/
RUN dnf install -y \
less \
nano \
net-tools \
usbutils \
gnupg \
i2c-tools \
libartik-sdk-tests \
libartik-sdk-zigbee-devel \
&& dnf clean all
|
Integrate zigbee in Artik family
|
Integrate zigbee in Artik family
|
Smarty
|
apache-2.0
|
nghiant2710/base-images,resin-io-library/base-images,nghiant2710/base-images,resin-io-library/base-images
|
smarty
|
## Code Before:
FROM #{FROM}
LABEL io.resin.device-type="#{DEV_TYPE}"
COPY artik.repo /etc/yum.repos.d/
COPY RPM-GPG-KEY-artik-#{SUITE}-armhfp /etc/pki/rpm-gpg/
RUN dnf install -y \
less \
nano \
net-tools \
usbutils \
gnupg \
i2c-tools \
libartik-sdk-tests \
&& dnf clean all
## Instruction:
Integrate zigbee in Artik family
## Code After:
FROM #{FROM}
LABEL io.resin.device-type="#{DEV_TYPE}"
COPY artik.repo /etc/yum.repos.d/
COPY RPM-GPG-KEY-artik-#{SUITE}-armhfp /etc/pki/rpm-gpg/
RUN dnf install -y \
less \
nano \
net-tools \
usbutils \
gnupg \
i2c-tools \
libartik-sdk-tests \
libartik-sdk-zigbee-devel \
&& dnf clean all
|
c990c35c73a353395a8a81fbea0f043757400388
|
Modules/Multilabel/autoload/IO/CMakeLists.txt
|
Modules/Multilabel/autoload/IO/CMakeLists.txt
|
MITK_CREATE_MODULE( MultilabelIO
INCLUDE_DIRS
PRIVATE src/IO
DEPENDS PUBLIC MitkMultilabel MitkSceneSerialization MitkDICOMReader
PACKAGE_DEPENDS
PRIVATE ITK|ITKQuadEdgeMesh+ITKAntiAlias+ITKIONRRD+ITKImageProcessing DCMTK DCMQI
AUTOLOAD_WITH MitkCore
WARNINGS_AS_ERRORS
)
|
MITK_CREATE_MODULE( MultilabelIO
INCLUDE_DIRS
PRIVATE src/IO
DEPENDS PUBLIC MitkMultilabel MitkSceneSerialization MitkDICOMReader
PACKAGE_DEPENDS
PRIVATE ITK|ITKQuadEdgeMesh+ITKAntiAlias+ITKIONRRD DCMTK DCMQI
AUTOLOAD_WITH MitkCore
WARNINGS_AS_ERRORS
)
|
Remove unused ITK module dependency from Multilabel IO module
|
Remove unused ITK module dependency from Multilabel IO module
|
Text
|
bsd-3-clause
|
RabadanLab/MITKats,MITK/MITK,fmilano/mitk,fmilano/mitk,MITK/MITK,fmilano/mitk,RabadanLab/MITKats,MITK/MITK,MITK/MITK,RabadanLab/MITKats,MITK/MITK,RabadanLab/MITKats,fmilano/mitk,MITK/MITK,RabadanLab/MITKats,RabadanLab/MITKats,fmilano/mitk,fmilano/mitk,fmilano/mitk
|
text
|
## Code Before:
MITK_CREATE_MODULE( MultilabelIO
INCLUDE_DIRS
PRIVATE src/IO
DEPENDS PUBLIC MitkMultilabel MitkSceneSerialization MitkDICOMReader
PACKAGE_DEPENDS
PRIVATE ITK|ITKQuadEdgeMesh+ITKAntiAlias+ITKIONRRD+ITKImageProcessing DCMTK DCMQI
AUTOLOAD_WITH MitkCore
WARNINGS_AS_ERRORS
)
## Instruction:
Remove unused ITK module dependency from Multilabel IO module
## Code After:
MITK_CREATE_MODULE( MultilabelIO
INCLUDE_DIRS
PRIVATE src/IO
DEPENDS PUBLIC MitkMultilabel MitkSceneSerialization MitkDICOMReader
PACKAGE_DEPENDS
PRIVATE ITK|ITKQuadEdgeMesh+ITKAntiAlias+ITKIONRRD DCMTK DCMQI
AUTOLOAD_WITH MitkCore
WARNINGS_AS_ERRORS
)
|
260f9fee341d9b6316c253de1c154c2330915e74
|
Tests/features/support/env.rb
|
Tests/features/support/env.rb
|
require 'aruba/cucumber'
|
require 'aruba/cucumber'
%x'rm -rf ../Build/log'
%x'mkdir ../Build/log'
After do |scenario|
%x'cp -R ../Build/tmp ../Build/log/"#{scenario.name}"'
end
|
Add output logging to Aruba tests.
|
Add output logging to Aruba tests.
|
Ruby
|
mit
|
SwiftKit/CuckooGenerator,SwiftKit/CuckooGenerator,SwiftKit/CuckooGenerator
|
ruby
|
## Code Before:
require 'aruba/cucumber'
## Instruction:
Add output logging to Aruba tests.
## Code After:
require 'aruba/cucumber'
%x'rm -rf ../Build/log'
%x'mkdir ../Build/log'
After do |scenario|
%x'cp -R ../Build/tmp ../Build/log/"#{scenario.name}"'
end
|
3faeef8c6578b6164c2153c24406b0fe87a5d0e9
|
lib/threepl_central_api/soap_client.rb
|
lib/threepl_central_api/soap_client.rb
|
require 'savon'
module ThreePLCentralAPI
class SOAPClient
attr_reader :wsdl, :enable_logging, :raise_errors
def initialize(**opts)
options = default_options.merge(opts)
@wsdl = options[:wsdl]
@enable_logging = options[:enable_logging]
@raise_errors = options[:raise_errors]
end
def call(action, **msg)
handle_response proxy.call(action, message: msg)
end
private
def default_options
{
wsdl: 'https://secure-wms.com/webserviceexternal/contracts.asmx?wsdl',
enable_logging: false,
raise_errors: false
}
end
def proxy
@proxy ||= Savon.client(
wsdl: wsdl,
log: enable_logging,
raise_errors: raise_errors,
no_message_tag: true
)
end
def handle_response(response)
if response.success?
ThreePLCentralAPI::Response.new
else
handle_error(response)
end
end
def handle_error(response)
soap_fault = response.soap_fault
http_error = response.http_error
error = soap_fault || http_error
fail ThreePLCentralAPI::Error, error.message
end
end
end
|
require 'savon'
module ThreePLCentralAPI
class SOAPClient
attr_reader :wsdl, :enable_logging, :raise_errors
def initialize(**opts)
options = default_options.merge(opts)
@wsdl = options[:wsdl]
@enable_logging = options[:enable_logging]
@raise_errors = options[:raise_errors]
end
def call(action, **msg)
@response = proxy.call(action, message: msg)
handle_response
end
private
def default_options
{
wsdl: 'https://secure-wms.com/webserviceexternal/contracts.asmx?wsdl',
enable_logging: false,
raise_errors: false
}
end
def proxy
@proxy ||= Savon.client(
wsdl: wsdl,
log: enable_logging,
raise_errors: raise_errors,
no_message_tag: true
)
end
def handle_response
if @response.success?
response_body
else
handle_error
end
end
def handle_error
error = response_soap_fault || response_http_error
fail ThreePLCentralAPI::Error, error.message
end
def response_body
@response.body
end
def response_http_error
@response.http_error
end
def response_soap_fault
@response.soap_fault
end
end
end
|
Change SOAP Client Response Value
|
Change SOAP Client Response Value
SOAP client now responds with the body hash from the Savon client if
the response is successful. Otherwise, it raises an exception.
|
Ruby
|
apache-2.0
|
dnunez24/3pl-central-endpoint,dnunez24/3pl-central-endpoint
|
ruby
|
## Code Before:
require 'savon'
module ThreePLCentralAPI
class SOAPClient
attr_reader :wsdl, :enable_logging, :raise_errors
def initialize(**opts)
options = default_options.merge(opts)
@wsdl = options[:wsdl]
@enable_logging = options[:enable_logging]
@raise_errors = options[:raise_errors]
end
def call(action, **msg)
handle_response proxy.call(action, message: msg)
end
private
def default_options
{
wsdl: 'https://secure-wms.com/webserviceexternal/contracts.asmx?wsdl',
enable_logging: false,
raise_errors: false
}
end
def proxy
@proxy ||= Savon.client(
wsdl: wsdl,
log: enable_logging,
raise_errors: raise_errors,
no_message_tag: true
)
end
def handle_response(response)
if response.success?
ThreePLCentralAPI::Response.new
else
handle_error(response)
end
end
def handle_error(response)
soap_fault = response.soap_fault
http_error = response.http_error
error = soap_fault || http_error
fail ThreePLCentralAPI::Error, error.message
end
end
end
## Instruction:
Change SOAP Client Response Value
SOAP client now responds with the body hash from the Savon client if
the response is successful. Otherwise, it raises an exception.
## Code After:
require 'savon'
module ThreePLCentralAPI
class SOAPClient
attr_reader :wsdl, :enable_logging, :raise_errors
def initialize(**opts)
options = default_options.merge(opts)
@wsdl = options[:wsdl]
@enable_logging = options[:enable_logging]
@raise_errors = options[:raise_errors]
end
def call(action, **msg)
@response = proxy.call(action, message: msg)
handle_response
end
private
def default_options
{
wsdl: 'https://secure-wms.com/webserviceexternal/contracts.asmx?wsdl',
enable_logging: false,
raise_errors: false
}
end
def proxy
@proxy ||= Savon.client(
wsdl: wsdl,
log: enable_logging,
raise_errors: raise_errors,
no_message_tag: true
)
end
def handle_response
if @response.success?
response_body
else
handle_error
end
end
def handle_error
error = response_soap_fault || response_http_error
fail ThreePLCentralAPI::Error, error.message
end
def response_body
@response.body
end
def response_http_error
@response.http_error
end
def response_soap_fault
@response.soap_fault
end
end
end
|
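The SOAPClient record above refactors response handling so that a successful call returns the parsed body while a failure raises an error built from the SOAP fault or HTTP error. The same dispatch pattern can be sketched in Python; `SOAPError` and `FakeResponse` are illustrative stand-ins invented here, not part of Savon or the 3PL Central gem:

```python
class SOAPError(Exception):
    """Stand-in for ThreePLCentralAPI::Error in the record above."""


class FakeResponse:
    """Minimal stub mimicking the success/body/fault surface of a SOAP response."""

    def __init__(self, success, body=None, soap_fault=None, http_error=None):
        self._success = success
        self.body = body
        self.soap_fault = soap_fault
        self.http_error = http_error

    def success(self):
        return self._success


def handle_response(response):
    # Mirror the refactored Ruby logic: return the body hash on success,
    # otherwise raise with whichever error (SOAP fault first) is present.
    if response.success():
        return response.body
    error = response.soap_fault or response.http_error
    raise SOAPError(error)


print(handle_response(FakeResponse(True, body={"status": "ok"})))  # {'status': 'ok'}
```

This is only a sketch of the control flow under stated assumptions; the real client delegates to Savon's response object rather than a stub.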
aba747a42620a6bfdd3bd115ebe77cc7548e0539
|
.travis.yml
|
.travis.yml
|
language: python
python:
- 2.7
install:
- pip install .
before_install:
- wget http://aphyr.com/riemann/riemann_0.2.6_all.deb
- sudo apt-get install openjdk-7-jre
- sudo dpkg -i riemann_0.2.6_all.deb
- sudo /etc/init.d/riemann start
script:
- trial tensor
after_script:
- sudo /etc/init.d/riemann stop
|
language: python
python:
- 2.7
install:
- pip install .
before_install:
- wget http://aphyr.com/riemann/riemann_0.2.6_all.deb
- sudo apt-get update
- sudo apt-get install openjdk-7-jre
- sudo dpkg -i riemann_0.2.6_all.deb
- sudo /etc/init.d/riemann start
script:
- trial tensor
after_script:
- sudo /etc/init.d/riemann stop
|
Add an apt-get update to Travis
|
Add an apt-get update to Travis
|
YAML
|
mit
|
calston/tensor,calston/tensor,hobbeswalsh/tensor,calston/tensor,hobbeswalsh/tensor,calston/tensor
|
yaml
|
## Code Before:
language: python
python:
- 2.7
install:
- pip install .
before_install:
- wget http://aphyr.com/riemann/riemann_0.2.6_all.deb
- sudo apt-get install openjdk-7-jre
- sudo dpkg -i riemann_0.2.6_all.deb
- sudo /etc/init.d/riemann start
script:
- trial tensor
after_script:
- sudo /etc/init.d/riemann stop
## Instruction:
Add an apt-get update to Travis
## Code After:
language: python
python:
- 2.7
install:
- pip install .
before_install:
- wget http://aphyr.com/riemann/riemann_0.2.6_all.deb
- sudo apt-get update
- sudo apt-get install openjdk-7-jre
- sudo dpkg -i riemann_0.2.6_all.deb
- sudo /etc/init.d/riemann start
script:
- trial tensor
after_script:
- sudo /etc/init.d/riemann stop
|
b79d602ebf46812a3b62ddf748c9e94cc3952926
|
services/QuillLMS/lib/clever_library/api/client.rb
|
services/QuillLMS/lib/clever_library/api/client.rb
|
class CleverLibrary::Api::Client
include HTTParty
base_uri 'https://api.clever.com/v2.0'
def initialize(bearer_token)
@options = {
headers: {
"Authorization": "Bearer " + bearer_token
}
}
end
def get_teacher(teacher_id:)
self.class.get('/teachers/' + teacher_id, @options).parsed_response["data"]
end
def get_teacher_sections(teacher_id:)
self.class.get('/teachers/' + teacher_id + '/sections', @options)
.parsed_response["data"]
.map {|section| section["data"]}
end
def get_section_students(section_id:)
self.class.get('/sections/' + section_id + '/students', @options)
.parsed_response["data"]
.map {|section| section["data"]}
end
end
|
class CleverLibrary::Api::Client
include HTTParty
base_uri 'https://api.clever.com/v2.0'
def initialize(bearer_token)
@options = {
headers: {
"Authorization": "Bearer " + bearer_token
}
}
end
def get_user()
self.class.get('/me', @options).parsed_response["data"]
end
def get_teacher(teacher_id:)
self.class.get('/teachers/' + teacher_id, @options).parsed_response["data"]
end
def get_teacher_sections(teacher_id:)
self.class.get('/teachers/' + teacher_id + '/sections', @options)
.parsed_response["data"]
.map {|section| section["data"]}
end
def get_section_students(section_id:)
self.class.get('/sections/' + section_id + '/students', @options)
.parsed_response["data"]
.map {|section| section["data"]}
end
end
|
Add Me function to Clever 2.0 API
|
Add Me function to Clever 2.0 API
|
Ruby
|
agpl-3.0
|
empirical-org/Empirical-Core,empirical-org/Empirical-Core,empirical-org/Empirical-Core,empirical-org/Empirical-Core,empirical-org/Empirical-Core,empirical-org/Empirical-Core,empirical-org/Empirical-Core
|
ruby
|
## Code Before:
class CleverLibrary::Api::Client
include HTTParty
base_uri 'https://api.clever.com/v2.0'
def initialize(bearer_token)
@options = {
headers: {
"Authorization": "Bearer " + bearer_token
}
}
end
def get_teacher(teacher_id:)
self.class.get('/teachers/' + teacher_id, @options).parsed_response["data"]
end
def get_teacher_sections(teacher_id:)
self.class.get('/teachers/' + teacher_id + '/sections', @options)
.parsed_response["data"]
.map {|section| section["data"]}
end
def get_section_students(section_id:)
self.class.get('/sections/' + section_id + '/students', @options)
.parsed_response["data"]
.map {|section| section["data"]}
end
end
## Instruction:
Add Me function to Clever 2.0 API
## Code After:
class CleverLibrary::Api::Client
include HTTParty
base_uri 'https://api.clever.com/v2.0'
def initialize(bearer_token)
@options = {
headers: {
"Authorization": "Bearer " + bearer_token
}
}
end
def get_user()
self.class.get('/me', @options).parsed_response["data"]
end
def get_teacher(teacher_id:)
self.class.get('/teachers/' + teacher_id, @options).parsed_response["data"]
end
def get_teacher_sections(teacher_id:)
self.class.get('/teachers/' + teacher_id + '/sections', @options)
.parsed_response["data"]
.map {|section| section["data"]}
end
def get_section_students(section_id:)
self.class.get('/sections/' + section_id + '/students', @options)
.parsed_response["data"]
.map {|section| section["data"]}
end
end
|
0e1a68362c6086caa3e17af49589d82385888a24
|
nls/fr/strings.js
|
nls/fr/strings.js
|
/* global define */
define({
HEADER_TITLE: "Outline",
TOOLBAR_ICON_TOOLTIP: "Onglet Outline",
BUTTON_SETTINGS: "Configurer Outline",
BUTTON_MOVE: "Changer la position de Outline",
BUTTON_CLOSE: "Fermer Outline",
COMMAND_SORT: "Liste courte",
COMMAND_UNNAMED: "Monter les fonctions anonymess",
COMMAND_ARGS: "Montrer les paramètres"
});
|
/* global define */
define({
HEADER_TITLE: "Outline",
TOOLBAR_ICON_TOOLTIP: "Onglet Outline",
BUTTON_SETTINGS: "Configurer Outline",
BUTTON_MOVE: "Changer la position de Outline",
BUTTON_CLOSE: "Fermer Outline",
COMMAND_SORT: "Liste courte",
COMMAND_UNNAMED: "Monter les fonctions anonymess",
COMMAND_ARGS: "Montrer les paramètres"
});
/* Last translated for 41a691e45e34f3ebde20bc5e3fdca8014871490d */
|
Add commithash to french translation
|
Add commithash to french translation
|
JavaScript
|
mit
|
Hirse/brackets-outline-list,pelatx/brackets-outline-list,Hirse/brackets-outline-list,onesfreedom/brackets-outline-list,BernsteinA/brackets-outline-list,pelatx/brackets-outline-list,BernsteinA/brackets-outline-list,onesfreedom/brackets-outline-list,pelatx/brackets-outline-list,Hirse/brackets-outline-list,pelatx/brackets-outline-list,rcuvgd/brackets-outline-list,Hirse/brackets-outline-list,rcuvgd/brackets-outline-list,BernsteinA/brackets-outline-list,rcuvgd/brackets-outline-list
|
javascript
|
## Code Before:
/* global define */
define({
HEADER_TITLE: "Outline",
TOOLBAR_ICON_TOOLTIP: "Onglet Outline",
BUTTON_SETTINGS: "Configurer Outline",
BUTTON_MOVE: "Changer la position de Outline",
BUTTON_CLOSE: "Fermer Outline",
COMMAND_SORT: "Liste courte",
COMMAND_UNNAMED: "Monter les fonctions anonymess",
COMMAND_ARGS: "Montrer les paramètres"
});
## Instruction:
Add commithash to french translation
## Code After:
/* global define */
define({
HEADER_TITLE: "Outline",
TOOLBAR_ICON_TOOLTIP: "Onglet Outline",
BUTTON_SETTINGS: "Configurer Outline",
BUTTON_MOVE: "Changer la position de Outline",
BUTTON_CLOSE: "Fermer Outline",
COMMAND_SORT: "Liste courte",
COMMAND_UNNAMED: "Monter les fonctions anonymess",
COMMAND_ARGS: "Montrer les paramètres"
});
/* Last translated for 41a691e45e34f3ebde20bc5e3fdca8014871490d */
|
9c9b96c1ebda907e1d91ea9960433a28bbefe8d9
|
README.md
|
README.md
|
[](http://travis-ci.org/angelozerr/minimatch.java)
Port of Node.js' [minimatch](https://github.com/isaacs/minimatch) to Java.
# Usage
```java
import minimatch.Minimatch;
...
boolean result = Minimatch.minimatch("bar.foo", "*.foo"); // return true
boolean result = Minimatch.minimatch("js/test.js", "**/**.js"); // return true
boolean result = Minimatch.minimatch("js/test.html", "**/**.js"); // return false
```
# Build
See cloudbees job: https://opensagres.ci.cloudbees.com/job/minimatch.java/
# Structure
The basic structure of the project is given in the following way:
* `src/main/java/` Java sources of minimatch.java.
* `src/test/java/` JUnit tests of minimatch.java.
* `html/` html samples which use JavaScript minimatch.
|
[](https://github.com/angelozerr/minimatch.java/blob/master/LICENSE)
[](http://search.maven.org/#search%7Cga%7C1%7Ca%3A%22minimatch.java%22)
[](http://travis-ci.org/angelozerr/minimatch.java)
Port of Node.js' [minimatch](https://github.com/isaacs/minimatch) to Java.
# Usage
```java
import minimatch.Minimatch;
...
boolean result = Minimatch.minimatch("bar.foo", "*.foo"); // return true
boolean result = Minimatch.minimatch("js/test.js", "**/**.js"); // return true
boolean result = Minimatch.minimatch("js/test.html", "**/**.js"); // return false
```
# Build
See cloudbees job: https://opensagres.ci.cloudbees.com/job/minimatch.java/
# Structure
The basic structure of the project is given in the following way:
* `src/main/java/` Java sources of minimatch.java.
* `src/test/java/` JUnit tests of minimatch.java.
* `html/` html samples which use JavaScript minimatch.
|
Add license + maven icon.
|
Add license + maven icon.
|
Markdown
|
mit
|
piotrtomiak/minimatch.java,piotrtomiak/minimatch.java,piotrtomiak/minimatch.java,angelozerr/minimatch.java,angelozerr/minimatch.java,angelozerr/minimatch.java
|
markdown
|
## Code Before:
[](http://travis-ci.org/angelozerr/minimatch.java)
Port of Node.js' [minimatch](https://github.com/isaacs/minimatch) to Java.
# Usage
```java
import minimatch.Minimatch;
...
boolean result = Minimatch.minimatch("bar.foo", "*.foo"); // return true
boolean result = Minimatch.minimatch("js/test.js", "**/**.js"); // return true
boolean result = Minimatch.minimatch("js/test.html", "**/**.js"); // return false
```
# Build
See cloudbees job: https://opensagres.ci.cloudbees.com/job/minimatch.java/
# Structure
The basic structure of the project is given in the following way:
* `src/main/java/` Java sources of minimatch.java.
* `src/test/java/` JUnit tests of minimatch.java.
* `html/` html samples which use JavaScript minimatch.
## Instruction:
Add license + maven icon.
## Code After:
[](https://github.com/angelozerr/minimatch.java/blob/master/LICENSE)
[](http://search.maven.org/#search%7Cga%7C1%7Ca%3A%22minimatch.java%22)
[](http://travis-ci.org/angelozerr/minimatch.java)
Port of Node.js' [minimatch](https://github.com/isaacs/minimatch) to Java.
# Usage
```java
import minimatch.Minimatch;
...
boolean result = Minimatch.minimatch("bar.foo", "*.foo"); // return true
boolean result = Minimatch.minimatch("js/test.js", "**/**.js"); // return true
boolean result = Minimatch.minimatch("js/test.html", "**/**.js"); // return false
```
# Build
See cloudbees job: https://opensagres.ci.cloudbees.com/job/minimatch.java/
# Structure
The basic structure of the project is given in the following way:
* `src/main/java/` Java sources of minimatch.java.
* `src/test/java/` JUnit tests of minimatch.java.
* `html/` html samples which use JavaScript minimatch.
|
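The minimatch.java record above documents glob matching such as `*.foo` and `**/**.js`. Python's standard-library `fnmatch` is not a minimatch port (its `*` also crosses `/`, and `**` has no special meaning), but the simple cases from that README behave the same way, which makes it a convenient way to see the idea:

```python
from fnmatch import fnmatch

# Basename-style pattern: matches, just as Minimatch.minimatch("bar.foo", "*.foo") does.
print(fnmatch("bar.foo", "*.foo"))  # True

# Wrong extension: rejected, matching the README's false case.
print(fnmatch("js/test.html", "**/**.js"))  # False
```

For faithful minimatch semantics (notably `**` as "any depth of directories"), a dedicated port is still needed; this is only an approximation.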
f555b08f07c5dfe4690f1359106e12756856da8f
|
templates/base.html
|
templates/base.html
|
<!DOCTYPE html>
<html>
<head lang="en">
<link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.4/css/bootstrap.min.css" rel="stylesheet"/>
<link href="/static/blogware.css" rel="stylesheet"/>
<title>
{% block title %}
{{ config.SITENAME }}
{% endblock title %}
</title>
</head>
<body>
<div id="content">
{% block content %}
{% endblock content %}
</div>
</body>
</html>
|
<!DOCTYPE html>
<html>
<head lang="en">
<link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.4/css/bootstrap.min.css" rel="stylesheet"/>
<link href="/static/blogware.css" rel="stylesheet"/>
<title>
{% block title %}
{{ config.SITENAME }}
{% endblock title %}
</title>
</head>
<body>
<div id="header">
<h1><a href="/">{{ config.SITENAME }}</a></h1>
</div>
<div id="content">
{% block content %}
{% endblock content %}
</div>
</body>
</html>
|
Add a link to the index page.
|
Add a link to the index page.
|
HTML
|
agpl-3.0
|
izrik/blogware,izrik/wikiware,izrik/blogware,izrik/wikiware,izrik/wikiware,izrik/blogware
|
html
|
## Code Before:
<!DOCTYPE html>
<html>
<head lang="en">
<link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.4/css/bootstrap.min.css" rel="stylesheet"/>
<link href="/static/blogware.css" rel="stylesheet"/>
<title>
{% block title %}
{{ config.SITENAME }}
{% endblock title %}
</title>
</head>
<body>
<div id="content">
{% block content %}
{% endblock content %}
</div>
</body>
</html>
## Instruction:
Add a link to the index page.
## Code After:
<!DOCTYPE html>
<html>
<head lang="en">
<link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.4/css/bootstrap.min.css" rel="stylesheet"/>
<link href="/static/blogware.css" rel="stylesheet"/>
<title>
{% block title %}
{{ config.SITENAME }}
{% endblock title %}
</title>
</head>
<body>
<div id="header">
<h1><a href="/">{{ config.SITENAME }}</a></h1>
</div>
<div id="content">
{% block content %}
{% endblock content %}
</div>
</body>
</html>
|
87b10ebb6399c8e9ee1984b32337a358001a4baf
|
src/org/bouncycastle/crypto/tls/TlsOuputStream.java
|
src/org/bouncycastle/crypto/tls/TlsOuputStream.java
|
package org.bouncycastle.crypto.tls;
import java.io.IOException;
import java.io.OutputStream;
/**
* An OutputStream for an TLS connection.
*/
public class TlsOuputStream extends OutputStream
{
private TlsProtocolHandler handler;
protected TlsOuputStream(TlsProtocolHandler handler)
{
this.handler = handler;
}
public void write(byte buf[], int offset, int len) throws IOException
{
this.handler.writeData(buf, offset, len);
}
public void write(int arg0) throws IOException
{
byte[] buf = new byte[1];
buf[0] = (byte)arg0;
this.write(buf, 0, 1);
}
public void close() throws IOException
{
handler.close();
}
public void flush() throws IOException
{
handler.flush();
}
}
|
package org.bouncycastle.crypto.tls;
import java.io.IOException;
import java.io.OutputStream;
/**
* An OutputStream for an TLS connection.
*/
public class TlsOuputStream extends OutputStream
{
private TlsProtocolHandler handler;
protected TlsOuputStream(TlsProtocolHandler handler)
{
this.handler = handler;
}
public void write(byte buf[], int offset, int len) throws IOException
{
this.handler.writeData(buf, offset, len);
}
public void write(int arg0) throws IOException
{
byte[] buf = new byte[1];
buf[0] = (byte)arg0;
this.write(buf, 0, 1);
}
/** @deprecated Use 'close' instead */
public void cose() throws IOException
{
handler.close();
}
public void close() throws IOException
{
handler.close();
}
public void flush() throws IOException
{
handler.flush();
}
}
|
Add cose() back as a deprecated method for backward compatibility
|
Add cose() back as a deprecated method for backward compatibility
|
Java
|
mit
|
sake/bouncycastle-java
|
java
|
## Code Before:
package org.bouncycastle.crypto.tls;
import java.io.IOException;
import java.io.OutputStream;
/**
* An OutputStream for an TLS connection.
*/
public class TlsOuputStream extends OutputStream
{
private TlsProtocolHandler handler;
protected TlsOuputStream(TlsProtocolHandler handler)
{
this.handler = handler;
}
public void write(byte buf[], int offset, int len) throws IOException
{
this.handler.writeData(buf, offset, len);
}
public void write(int arg0) throws IOException
{
byte[] buf = new byte[1];
buf[0] = (byte)arg0;
this.write(buf, 0, 1);
}
public void close() throws IOException
{
handler.close();
}
public void flush() throws IOException
{
handler.flush();
}
}
## Instruction:
Add cose() back as a deprecated method for backward compatibility
## Code After:
package org.bouncycastle.crypto.tls;
import java.io.IOException;
import java.io.OutputStream;
/**
* An OutputStream for an TLS connection.
*/
public class TlsOuputStream extends OutputStream
{
private TlsProtocolHandler handler;
protected TlsOuputStream(TlsProtocolHandler handler)
{
this.handler = handler;
}
public void write(byte buf[], int offset, int len) throws IOException
{
this.handler.writeData(buf, offset, len);
}
public void write(int arg0) throws IOException
{
byte[] buf = new byte[1];
buf[0] = (byte)arg0;
this.write(buf, 0, 1);
}
/** @deprecated Use 'close' instead */
public void cose() throws IOException
{
handler.close();
}
public void close() throws IOException
{
handler.close();
}
public void flush() throws IOException
{
handler.flush();
}
}
|
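The TlsOuputStream record above keeps a misspelled `cose()` method as a deprecated alias of `close()` so existing callers do not break. That backward-compatibility shim is a general pattern; here is an illustrative Python sketch (the class name and fields are hypothetical, not the BouncyCastle class):

```python
import warnings


class TlsOutputStreamLike:
    """Illustrative only: demonstrates the deprecated-alias pattern."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

    def cose(self):
        # Deprecated misspelling retained so old call sites keep working.
        warnings.warn("cose() is deprecated; use close()", DeprecationWarning)
        self.close()


stream = TlsOutputStreamLike()
with warnings.catch_warnings():
    warnings.simplefilter("ignore", DeprecationWarning)
    stream.cose()
print(stream.closed)  # True
```

Emitting a `DeprecationWarning` (Java would use `@Deprecated` plus Javadoc, as the record does) gives callers a migration signal without an immediate breaking change.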
f43d90f2d6cff7c56e11577212caa2ab2eea4494
|
.travis.yml
|
.travis.yml
|
language: ruby
sudo: false
branches:
only: master
rvm:
- 1.9.3
- 2.0.0
- 2.1.0
- 2.2.0
os:
- linux
- osx
cache:
- apt
- bundler
before_install:
- eval "$(curl -Ss https://raw.githubusercontent.com/neovim/bot-ci/master/scripts/travis-setup.sh) nightly-x64"
env: REPORT_COVERAGE=1
script: bundle exec rake --trace spec
|
language: ruby
sudo: false
branches:
only: master
rvm:
- 1.9.3
- 2.0.0
- 2.1.0
- 2.2.0
cache:
- apt
- bundler
before_install:
- eval "$(curl -Ss https://raw.githubusercontent.com/neovim/bot-ci/master/scripts/travis-setup.sh) nightly-x64"
env: REPORT_COVERAGE=1
script: bundle exec rake --trace spec
|
Remove osx builds for now
|
Remove osx builds for now
|
YAML
|
mit
|
alexgenco/neovim-ruby
|
yaml
|
## Code Before:
language: ruby
sudo: false
branches:
only: master
rvm:
- 1.9.3
- 2.0.0
- 2.1.0
- 2.2.0
os:
- linux
- osx
cache:
- apt
- bundler
before_install:
- eval "$(curl -Ss https://raw.githubusercontent.com/neovim/bot-ci/master/scripts/travis-setup.sh) nightly-x64"
env: REPORT_COVERAGE=1
script: bundle exec rake --trace spec
## Instruction:
Remove osx builds for now
## Code After:
language: ruby
sudo: false
branches:
only: master
rvm:
- 1.9.3
- 2.0.0
- 2.1.0
- 2.2.0
cache:
- apt
- bundler
before_install:
- eval "$(curl -Ss https://raw.githubusercontent.com/neovim/bot-ci/master/scripts/travis-setup.sh) nightly-x64"
env: REPORT_COVERAGE=1
script: bundle exec rake --trace spec
|
e8081ee5d7ecafeb9ca34560fc0129bbf55efa6a
|
app/src/index.js
|
app/src/index.js
|
require('babelify/polyfill');
const config = require('./config');
const SculptureApp = require('./app');
const DEFAULT_CLIENT_CONNECTION_OPTIONS = {
protocol: "ws",
username: "anyware",
password: "anyware",
host: "broker.shiftr.io"
};
const app = new SculptureApp(config);
const connectionOptions = Object.assign({}, DEFAULT_CLIENT_CONNECTION_OPTIONS);
if (process.argv.length === 4) {
console.log("Using authentication information provided by command arguments");
connectionOptions.username = process.argv[2];
connectionOptions.password = process.argv[3];
}
app.connectAndSetup(connectionOptions);
|
require('babelify/polyfill');
const config = require('./config');
const SculptureApp = require('./app');
const DEFAULT_CLIENT_CONNECTION_OPTIONS = {
protocol: "ws",
username: "sculpture0",
password: "7f24a3e73b91dc9f51f15861d75c888b",
host: "broker.shiftr.io"
};
const app = new SculptureApp(config);
const connectionOptions = Object.assign({}, DEFAULT_CLIENT_CONNECTION_OPTIONS);
if (process.argv.length === 4) {
console.log("Using authentication information provided by command arguments");
connectionOptions.username = process.argv[2];
connectionOptions.password = process.argv[3];
}
app.connectAndSetup(connectionOptions);
|
Connect as sculpture0 for now, until we have config in local storage
|
Connect as sculpture0 for now, until we have config in local storage
|
JavaScript
|
mit
|
anyWareSculpture/sculpture-client,anyWareSculpture/sculpture-client,anyWareSculpture/sculpture-client
|
javascript
|
## Code Before:
require('babelify/polyfill');
const config = require('./config');
const SculptureApp = require('./app');
const DEFAULT_CLIENT_CONNECTION_OPTIONS = {
protocol: "ws",
username: "anyware",
password: "anyware",
host: "broker.shiftr.io"
};
const app = new SculptureApp(config);
const connectionOptions = Object.assign({}, DEFAULT_CLIENT_CONNECTION_OPTIONS);
if (process.argv.length === 4) {
console.log("Using authentication information provided by command arguments");
connectionOptions.username = process.argv[2];
connectionOptions.password = process.argv[3];
}
app.connectAndSetup(connectionOptions);
## Instruction:
Connect as sculpture0 for now, until we have config in local storage
## Code After:
require('babelify/polyfill');
const config = require('./config');
const SculptureApp = require('./app');
const DEFAULT_CLIENT_CONNECTION_OPTIONS = {
protocol: "ws",
username: "sculpture0",
password: "7f24a3e73b91dc9f51f15861d75c888b",
host: "broker.shiftr.io"
};
const app = new SculptureApp(config);
const connectionOptions = Object.assign({}, DEFAULT_CLIENT_CONNECTION_OPTIONS);
if (process.argv.length === 4) {
console.log("Using authentication information provided by command arguments");
connectionOptions.username = process.argv[2];
connectionOptions.password = process.argv[3];
}
app.connectAndSetup(connectionOptions);
|
de80f307f52b41c16f5a328452b8b53856d040e2
|
media/css/readthedocs-doc-embed.css
|
media/css/readthedocs-doc-embed.css
|
/* Left for CSS overrides we need to make in the future */
/* Fix badge on RTD Theme */
.rst-other-versions {
text-align: left;
}
.rst-other-versions a {
border: 0;
}
.rst-other-versions dl {
margin: 0;
}
/* Fix Mkdocs sidebar */
nav.stickynav {
position: absolute;
}
|
/* Left for CSS overrides we need to make in the future */
/* Fix badge on RTD Theme */
.rst-versions.rst-badge {
display: block;
}
.rst-other-versions {
text-align: left;
}
.rst-other-versions a {
border: 0;
}
.rst-other-versions dl {
margin: 0;
}
/* Fix Mkdocs sidebar */
nav.stickynav {
position: absolute;
}
|
Set badge display to block
|
Set badge display to block
|
CSS
|
mit
|
KamranMackey/readthedocs.org,agjohnson/readthedocs.org,sils1297/readthedocs.org,emawind84/readthedocs.org,soulshake/readthedocs.org,sunnyzwh/readthedocs.org,michaelmcandrew/readthedocs.org,nikolas/readthedocs.org,raven47git/readthedocs.org,tddv/readthedocs.org,Tazer/readthedocs.org,techtonik/readthedocs.org,attakei/readthedocs-oauth,SteveViss/readthedocs.org,techtonik/readthedocs.org,kenwang76/readthedocs.org,wijerasa/readthedocs.org,mhils/readthedocs.org,dirn/readthedocs.org,clarkperkins/readthedocs.org,kdkeyser/readthedocs.org,espdev/readthedocs.org,michaelmcandrew/readthedocs.org,gjtorikian/readthedocs.org,singingwolfboy/readthedocs.org,laplaceliu/readthedocs.org,sunnyzwh/readthedocs.org,laplaceliu/readthedocs.org,SteveViss/readthedocs.org,agjohnson/readthedocs.org,agjohnson/readthedocs.org,laplaceliu/readthedocs.org,kenshinthebattosai/readthedocs.org,wijerasa/readthedocs.org,nikolas/readthedocs.org,attakei/readthedocs-oauth,kdkeyser/readthedocs.org,sid-kap/readthedocs.org,raven47git/readthedocs.org,kenshinthebattosai/readthedocs.org,VishvajitP/readthedocs.org,KamranMackey/readthedocs.org,kenwang76/readthedocs.org,royalwang/readthedocs.org,gjtorikian/readthedocs.org,nikolas/readthedocs.org,attakei/readthedocs-oauth,jerel/readthedocs.org,CedarLogic/readthedocs.org,emawind84/readthedocs.org,davidfischer/readthedocs.org,istresearch/readthedocs.org,mhils/readthedocs.org,Carreau/readthedocs.org,sid-kap/readthedocs.org,Carreau/readthedocs.org,pombredanne/readthedocs.org,dirn/readthedocs.org,atsuyim/readthedocs.org,mhils/readthedocs.org,GovReady/readthedocs.org,espdev/readthedocs.org,Tazer/readthedocs.org,mhils/readthedocs.org,rtfd/readthedocs.org,hach-que/readthedocs.org,techtonik/readthedocs.org,michaelmcandrew/readthedocs.org,clarkperkins/readthedocs.org,espdev/readthedocs.org,emawind84/readthedocs.org,espdev/readthedocs.org,laplaceliu/readthedocs.org,takluyver/readthedocs.org,cgourlay/readthedocs.org,wanghaven/readthedocs.org,singingwolfboy/readthedocs.org,stevepier
cy/readthedocs.org,stevepiercy/readthedocs.org,sid-kap/readthedocs.org,CedarLogic/readthedocs.org,hach-que/readthedocs.org,SteveViss/readthedocs.org,raven47git/readthedocs.org,takluyver/readthedocs.org,LukasBoersma/readthedocs.org,titiushko/readthedocs.org,istresearch/readthedocs.org,takluyver/readthedocs.org,asampat3090/readthedocs.org,SteveViss/readthedocs.org,royalwang/readthedocs.org,asampat3090/readthedocs.org,CedarLogic/readthedocs.org,hach-que/readthedocs.org,rtfd/readthedocs.org,stevepiercy/readthedocs.org,cgourlay/readthedocs.org,asampat3090/readthedocs.org,sunnyzwh/readthedocs.org,wijerasa/readthedocs.org,Tazer/readthedocs.org,kenwang76/readthedocs.org,pombredanne/readthedocs.org,GovReady/readthedocs.org,agjohnson/readthedocs.org,tddv/readthedocs.org,KamranMackey/readthedocs.org,d0ugal/readthedocs.org,pombredanne/readthedocs.org,titiushko/readthedocs.org,titiushko/readthedocs.org,VishvajitP/readthedocs.org,kenshinthebattosai/readthedocs.org,mrshoki/readthedocs.org,LukasBoersma/readthedocs.org,mrshoki/readthedocs.org,atsuyim/readthedocs.org,gjtorikian/readthedocs.org,d0ugal/readthedocs.org,sid-kap/readthedocs.org,Carreau/readthedocs.org,VishvajitP/readthedocs.org,titiushko/readthedocs.org,singingwolfboy/readthedocs.org,kdkeyser/readthedocs.org,soulshake/readthedocs.org,Tazer/readthedocs.org,fujita-shintaro/readthedocs.org,clarkperkins/readthedocs.org,fujita-shintaro/readthedocs.org,GovReady/readthedocs.org,sils1297/readthedocs.org,fujita-shintaro/readthedocs.org,atsuyim/readthedocs.org,sunnyzwh/readthedocs.org,asampat3090/readthedocs.org,soulshake/readthedocs.org,safwanrahman/readthedocs.org,KamranMackey/readthedocs.org,takluyver/readthedocs.org,davidfischer/readthedocs.org,clarkperkins/readthedocs.org,techtonik/readthedocs.org,d0ugal/readthedocs.org,istresearch/readthedocs.org,fujita-shintaro/readthedocs.org,cgourlay/readthedocs.org,royalwang/readthedocs.org,LukasBoersma/readthedocs.org,VishvajitP/readthedocs.org,rtfd/readthedocs.org,mrshoki/readthedocs.or
g,stevepiercy/readthedocs.org,sils1297/readthedocs.org,nikolas/readthedocs.org,espdev/readthedocs.org,safwanrahman/readthedocs.org,safwanrahman/readthedocs.org,safwanrahman/readthedocs.org,wanghaven/readthedocs.org,jerel/readthedocs.org,rtfd/readthedocs.org,michaelmcandrew/readthedocs.org,sils1297/readthedocs.org,emawind84/readthedocs.org,singingwolfboy/readthedocs.org,gjtorikian/readthedocs.org,LukasBoersma/readthedocs.org,dirn/readthedocs.org,wanghaven/readthedocs.org,davidfischer/readthedocs.org,raven47git/readthedocs.org,wijerasa/readthedocs.org,hach-que/readthedocs.org,d0ugal/readthedocs.org,kenshinthebattosai/readthedocs.org,CedarLogic/readthedocs.org,jerel/readthedocs.org,attakei/readthedocs-oauth,wanghaven/readthedocs.org,davidfischer/readthedocs.org,jerel/readthedocs.org,tddv/readthedocs.org,cgourlay/readthedocs.org,kenwang76/readthedocs.org,dirn/readthedocs.org,kdkeyser/readthedocs.org,GovReady/readthedocs.org,soulshake/readthedocs.org,mrshoki/readthedocs.org,istresearch/readthedocs.org,royalwang/readthedocs.org,atsuyim/readthedocs.org,Carreau/readthedocs.org
|
css
|
## Code Before:
/* Left for CSS overrides we need to make in the future */
/* Fix badge on RTD Theme */
.rst-other-versions {
text-align: left;
}
.rst-other-versions a {
border: 0;
}
.rst-other-versions dl {
margin: 0;
}
/* Fix Mkdocs sidebar */
nav.stickynav {
position: absolute;
}
## Instruction:
Set badge display to block
## Code After:
/* Left for CSS overrides we need to make in the future */
/* Fix badge on RTD Theme */
.rst-versions.rst-badge {
display: block;
}
.rst-other-versions {
text-align: left;
}
.rst-other-versions a {
border: 0;
}
.rst-other-versions dl {
margin: 0;
}
/* Fix Mkdocs sidebar */
nav.stickynav {
position: absolute;
}
|
3ba0ea726e62c04d83e92b823a6553ce5ac70e2b
|
circleci.gemspec
|
circleci.gemspec
|
Gem::Specification.new do |s|
s.name = 'circleci'
s.version = '0.0.2'
s.date = '2013-04-16'
s.summary = 'Circle CI API Wrapper'
s.description = 'Wraps Circle CI API calls in a gem.'
s.authors = ['Chavez']
s.email = ''
s.files = Dir.glob("{bin,lib}/**/*") + %w[README.md]
s.require_paths = ['lib']
s.homepage = 'http://github.com/mtchavez/circleci'
s.rdoc_options = ['--charset=UTF-8 --main=README.md']
s.extra_rdoc_files = ['README.md']
s.add_dependency(%q<rest-client>, ['>= 1.6.7'])
s.add_dependency(%q<hashie>, ['>= 1.2.0'])
s.add_development_dependency(%q<rspec>, ['>= 2.0'])
s.add_development_dependency(%q<simplecov>, ['>= 0.7.1'])
s.add_development_dependency(%q<vcr>, ['>= 2.2.5'])
s.add_development_dependency(%q<webmock>, ['>= 1.8.11'])
end
|
Gem::Specification.new do |s|
s.name = 'circleci'
s.version = '0.0.2'
s.date = '2013-04-16'
s.summary = 'Circle CI API Wrapper'
s.description = 'Wraps Circle CI API calls in a gem.'
s.licenses = ['MIT']
s.authors = ['Chavez']
s.email = '[email protected]'
s.files = Dir.glob("{bin,lib}/**/*") + %w[README.md]
s.require_paths = ['lib']
s.homepage = 'http://github.com/mtchavez/circleci'
s.rdoc_options = ['--charset=UTF-8 --main=README.md']
s.extra_rdoc_files = ['README.md']
# Gem Dependencies
s.add_dependency 'rest-client', '~> 1.6.7', '>= 1.6.7'
s.add_dependency 'hashie', '~> 1.2.0', '>= 1.2.0'
# Dev Dependencies
s.add_development_dependency 'rspec', '~> 2.0', '>= 2.0'
s.add_development_dependency 'simplecov', '~> 0.7.1', '>= 0.7.1'
s.add_development_dependency 'vcr', '~> 2.2.5', '>= 2.2.5'
s.add_development_dependency 'webmock', '~> 1.8.11', '>= 1.8.11'
end
|
Remove warnings when building gem
|
Remove warnings when building gem
|
Ruby
|
mit
|
mtchavez/circleci,mtchavez/circleci,katsuhiko/circleci,leecade/circleci
|
ruby
|
## Code Before:
Gem::Specification.new do |s|
s.name = 'circleci'
s.version = '0.0.2'
s.date = '2013-04-16'
s.summary = 'Circle CI API Wrapper'
s.description = 'Wraps Circle CI API calls in a gem.'
s.authors = ['Chavez']
s.email = ''
s.files = Dir.glob("{bin,lib}/**/*") + %w[README.md]
s.require_paths = ['lib']
s.homepage = 'http://github.com/mtchavez/circleci'
s.rdoc_options = ['--charset=UTF-8 --main=README.md']
s.extra_rdoc_files = ['README.md']
s.add_dependency(%q<rest-client>, ['>= 1.6.7'])
s.add_dependency(%q<hashie>, ['>= 1.2.0'])
s.add_development_dependency(%q<rspec>, ['>= 2.0'])
s.add_development_dependency(%q<simplecov>, ['>= 0.7.1'])
s.add_development_dependency(%q<vcr>, ['>= 2.2.5'])
s.add_development_dependency(%q<webmock>, ['>= 1.8.11'])
end
## Instruction:
Remove warnings when building gem
## Code After:
Gem::Specification.new do |s|
s.name = 'circleci'
s.version = '0.0.2'
s.date = '2013-04-16'
s.summary = 'Circle CI API Wrapper'
s.description = 'Wraps Circle CI API calls in a gem.'
s.licenses = ['MIT']
s.authors = ['Chavez']
s.email = '[email protected]'
s.files = Dir.glob("{bin,lib}/**/*") + %w[README.md]
s.require_paths = ['lib']
s.homepage = 'http://github.com/mtchavez/circleci'
s.rdoc_options = ['--charset=UTF-8 --main=README.md']
s.extra_rdoc_files = ['README.md']
# Gem Dependencies
s.add_dependency 'rest-client', '~> 1.6.7', '>= 1.6.7'
s.add_dependency 'hashie', '~> 1.2.0', '>= 1.2.0'
# Dev Dependencies
s.add_development_dependency 'rspec', '~> 2.0', '>= 2.0'
s.add_development_dependency 'simplecov', '~> 0.7.1', '>= 0.7.1'
s.add_development_dependency 'vcr', '~> 2.2.5', '>= 2.2.5'
s.add_development_dependency 'webmock', '~> 1.8.11', '>= 1.8.11'
end
|
f819f61d81467fa87ac458c1ea66aaf99cef731b
|
.travis.yml
|
.travis.yml
|
matrix:
include:
- env: ubuntu_version="16.04"
- env: ubuntu_version="18.04"
sudo: required
services:
- docker
language: python
before_install:
- docker build --build-arg ubuntu_version=${ubuntu_version} -t kismon-travis-${ubuntu_version} .travis/
- docker run -v $PWD:/code/ -d --name kismon-container kismon-travis-${ubuntu_version}
script:
- docker exec kismon-container /usr/bin/python3 -m unittest discover -v
|
matrix:
include:
- env: ubuntu_version="16.04"
- env: ubuntu_version="18.04"
- env: ubuntu_version="19.10"
sudo: required
services:
- docker
language: python
before_install:
- docker build --build-arg ubuntu_version=${ubuntu_version} -t kismon-travis-${ubuntu_version} .travis/
- docker run -v $PWD:/code/ -d --name kismon-container kismon-travis-${ubuntu_version}
script:
- docker exec kismon-container /usr/bin/python3 -m unittest discover -v
|
Add Ubuntu 19.10 to tests
|
Add Ubuntu 19.10 to tests
|
YAML
|
bsd-3-clause
|
Kismon/kismon
|
yaml
|
## Code Before:
matrix:
include:
- env: ubuntu_version="16.04"
- env: ubuntu_version="18.04"
sudo: required
services:
- docker
language: python
before_install:
- docker build --build-arg ubuntu_version=${ubuntu_version} -t kismon-travis-${ubuntu_version} .travis/
- docker run -v $PWD:/code/ -d --name kismon-container kismon-travis-${ubuntu_version}
script:
- docker exec kismon-container /usr/bin/python3 -m unittest discover -v
## Instruction:
Add Ubuntu 19.10 to tests
## Code After:
matrix:
include:
- env: ubuntu_version="16.04"
- env: ubuntu_version="18.04"
- env: ubuntu_version="19.10"
sudo: required
services:
- docker
language: python
before_install:
- docker build --build-arg ubuntu_version=${ubuntu_version} -t kismon-travis-${ubuntu_version} .travis/
- docker run -v $PWD:/code/ -d --name kismon-container kismon-travis-${ubuntu_version}
script:
- docker exec kismon-container /usr/bin/python3 -m unittest discover -v
|
d581b7d2198ab3b58a57d3b4125a7c9082927954
|
install_clouderadirector.sh
|
install_clouderadirector.sh
|
wget http://archive.cloudera.com/director/redhat/6/x86_64/director/cloudera-director.repo -O /etc/yum.repos.d/cloudera-director.repo
yum -y -e1 -d1 install oracle-j2sdk1.7 cloudera-director-server cloudera-director-client
service cloudera-director-server start
chkconfig cloudera-director-server on
|
wget http://archive.cloudera.com/director/redhat/6/x86_64/director/cloudera-director.repo -O /etc/yum.repos.d/cloudera-director.repo
yum -y -e1 -d1 install oracle-j2sdk1.7 cloudera-director-server cloudera-director-client
cp -pc /etc/cloudera-director-server/application.properties /etc/cloudera-director-server/application.properties-orig
chgrp cloudera-director /etc/cloudera-director-server/application.properties
chmod 0640 /etc/cloudera-director-server/application.properties
sed -i -e '/lp.encryption.twoWayCipher:/a\
lp.encryption.twoWayCipher: desede' -e "/lp.encryption.twoWayCipherConfig:/a\
lp.encryption.twoWayCipherConfig: `python -c 'import base64, os; print base64.b64encode(os.urandom(24))'`" /etc/cloudera-director-server/application.properties
service cloudera-director-server start
chkconfig cloudera-director-server on
|
Set a unique encryption key for Cloudera Director.
|
Set a unique encryption key for Cloudera Director.
|
Shell
|
apache-2.0
|
teamclairvoyant/hadoop-deployment-bash,teamclairvoyant/hadoop-deployment-bash
|
shell
|
## Code Before:
wget http://archive.cloudera.com/director/redhat/6/x86_64/director/cloudera-director.repo -O /etc/yum.repos.d/cloudera-director.repo
yum -y -e1 -d1 install oracle-j2sdk1.7 cloudera-director-server cloudera-director-client
service cloudera-director-server start
chkconfig cloudera-director-server on
## Instruction:
Set a unique encryption key for Cloudera Director.
## Code After:
wget http://archive.cloudera.com/director/redhat/6/x86_64/director/cloudera-director.repo -O /etc/yum.repos.d/cloudera-director.repo
yum -y -e1 -d1 install oracle-j2sdk1.7 cloudera-director-server cloudera-director-client
cp -pc /etc/cloudera-director-server/application.properties /etc/cloudera-director-server/application.properties-orig
chgrp cloudera-director /etc/cloudera-director-server/application.properties
chmod 0640 /etc/cloudera-director-server/application.properties
sed -i -e '/lp.encryption.twoWayCipher:/a\
lp.encryption.twoWayCipher: desede' -e "/lp.encryption.twoWayCipherConfig:/a\
lp.encryption.twoWayCipherConfig: `python -c 'import base64, os; print base64.b64encode(os.urandom(24))'`" /etc/cloudera-director-server/application.properties
service cloudera-director-server start
chkconfig cloudera-director-server on
|
5f8d83c4a6ee8db25ff1c023b8b2f70f2b1f7a6c
|
examples/fizzbuzz.rb
|
examples/fizzbuzz.rb
|
subject "FizzBuzz (1 ~ 15)"
expected <<EOS
1
2
Fizz
4
Buzz
Fizz
7
8
Fizz
Buzz
11
Fizz
13
14
FizzBuzz
EOS
|
subject "FizzBuzz (1 ~ 15)"
__END__
1
2
Fizz
4
Buzz
Fizz
7
8
Fizz
Buzz
11
Fizz
13
14
FizzBuzz
|
Fix style of FizzBuzz problem example
|
Fix style of FizzBuzz problem example
|
Ruby
|
mit
|
yuya-takeyama/shellfish
|
ruby
|
## Code Before:
subject "FizzBuzz (1 ~ 15)"
expected <<EOS
1
2
Fizz
4
Buzz
Fizz
7
8
Fizz
Buzz
11
Fizz
13
14
FizzBuzz
EOS
## Instruction:
Fix style of FizzBuzz problem example
## Code After:
subject "FizzBuzz (1 ~ 15)"
__END__
1
2
Fizz
4
Buzz
Fizz
7
8
Fizz
Buzz
11
Fizz
13
14
FizzBuzz
|
54b5ddd48d8a57bacb31faa7d6c7bad84e957dbb
|
gamernews/templates/news/single_blob.html
|
gamernews/templates/news/single_blob.html
|
{% extends 'base.html' %}
{% load blobs_tag %}
{% load comments %}
{% load threadedcomments_tags %}
{% load voting_tags %}
{% block title %}{{ blob.title }}{% endblock %}
{% block page %}gamernews{% endblock %}
{% block header %}
{% include "_includes/header.html" %}
{% endblock %}
{% block content %}
{% vote_by_user user.id on blob as vote %}
{% vote_counts_for_object blob as score %}
<div id="blob-{{ blob.id }}" class="single">
{% include 'news/_link.html' %}
<div class="note">{{ blob.note|striptags|safe }}</div>
{% if user.is_authenticated %}
{% get_comment_form for blob as form %}
<form class="comment-form" action="{% comment_form_target %}" method="post">
{% csrf_token %}
{{ form.as_p }}
<input type="submit" class="btn btn-default" value="Add Comment" />
</form>
{% endif %}
<div id="comments">
{% get_comment_list for blob as comments %}
{% include "news/comments.html" %}
</div>
</div>
{% endblock %}
{% block footer %}
{% include "_includes/footer.html" %}
{% endblock %}
|
{% extends 'base.html' %}
{% load blobs_tag %}
{% load comments %}
{% load threadedcomments_tags %}
{% load voting_tags %}
{% block title %}{{ blob.title }}{% endblock %}
{% block page %}gamernews{% endblock %}
{% block header %}
{% include "_includes/header.html" %}
{% endblock %}
{% block content %}
{% vote_by_user user.id on blob as vote %}
{% vote_counts_for_object blob as score %}
<div id="blob-{{ blob.id }}" class="single">
{% include 'news/_link.html' %}
<div class="note">{{ blob.note|striptags|safe }}</div>
{% if user.is_authenticated %}
{% get_comment_form for blob as form %}
<form class="comment-form" action="{% comment_form_target %}" method="post">
{% csrf_token %}
{{ form.as_p }}
<input type="submit" class="btn btn-default" value="Add Comment" />
<input type="hidden" name="next" value="/id/{{ blob.id }}/#c{{ comment.id }}" />
</form>
{% endif %}
<div id="comments">
{% get_comment_list for blob as comments %}
{% include "news/comments.html" %}
</div>
</div>
{% endblock %}
{% block footer %}
{% include "_includes/footer.html" %}
{% endblock %}
|
Remove name, url and email from comment form
|
Remove name, url and email from comment form
|
HTML
|
mit
|
underlost/GamerNews,underlost/GamerNews
|
html
|
## Code Before:
{% extends 'base.html' %}
{% load blobs_tag %}
{% load comments %}
{% load threadedcomments_tags %}
{% load voting_tags %}
{% block title %}{{ blob.title }}{% endblock %}
{% block page %}gamernews{% endblock %}
{% block header %}
{% include "_includes/header.html" %}
{% endblock %}
{% block content %}
{% vote_by_user user.id on blob as vote %}
{% vote_counts_for_object blob as score %}
<div id="blob-{{ blob.id }}" class="single">
{% include 'news/_link.html' %}
<div class="note">{{ blob.note|striptags|safe }}</div>
{% if user.is_authenticated %}
{% get_comment_form for blob as form %}
<form class="comment-form" action="{% comment_form_target %}" method="post">
{% csrf_token %}
{{ form.as_p }}
<input type="submit" class="btn btn-default" value="Add Comment" />
</form>
{% endif %}
<div id="comments">
{% get_comment_list for blob as comments %}
{% include "news/comments.html" %}
</div>
</div>
{% endblock %}
{% block footer %}
{% include "_includes/footer.html" %}
{% endblock %}
## Instruction:
Remove name, url and email from comment form
## Code After:
{% extends 'base.html' %}
{% load blobs_tag %}
{% load comments %}
{% load threadedcomments_tags %}
{% load voting_tags %}
{% block title %}{{ blob.title }}{% endblock %}
{% block page %}gamernews{% endblock %}
{% block header %}
{% include "_includes/header.html" %}
{% endblock %}
{% block content %}
{% vote_by_user user.id on blob as vote %}
{% vote_counts_for_object blob as score %}
<div id="blob-{{ blob.id }}" class="single">
{% include 'news/_link.html' %}
<div class="note">{{ blob.note|striptags|safe }}</div>
{% if user.is_authenticated %}
{% get_comment_form for blob as form %}
<form class="comment-form" action="{% comment_form_target %}" method="post">
{% csrf_token %}
{{ form.as_p }}
<input type="submit" class="btn btn-default" value="Add Comment" />
<input type="hidden" name="next" value="/id/{{ blob.id }}/#c{{ comment.id }}" />
</form>
{% endif %}
<div id="comments">
{% get_comment_list for blob as comments %}
{% include "news/comments.html" %}
</div>
</div>
{% endblock %}
{% block footer %}
{% include "_includes/footer.html" %}
{% endblock %}
|
0bcdde64aeee1ddc7ae40d6aca8729a4070c604a
|
fabfile.py
|
fabfile.py
|
import os
from fabric.api import *
from fab_shared import (test, webpy_deploy as deploy,
setup, development, production, localhost, staging, restart_webserver,
rollback, lint, enable, disable, maintenancemode, rechef)
env.unit = "trinity"
env.path = "/var/tornado/%(unit)s" % env
env.scm = "[email protected]:bueda/%(unit)s.git" % env
env.scm_http_url = "http://github.com/bueda/%(unit)s" % env
env.root_dir = os.path.abspath(os.path.dirname(__file__))
env.pip_requirements = ["requirements/common.txt",]
env.pip_requirements_dev = ["requirements/dev.txt",]
env.pip_requirements_production = ["requirements/production.txt",]
env.campfire_subdomain = 'bueda'
env.campfire_room = 'Development'
env.campfire_token = '63768eee94d96b7b18e2091f3919b2a2a3dcd12a'
@runs_once
def tornado_test_runner(deployment_type=None):
return local('test/run_tests.py', capture=False).return_code
env.test_runner = tornado_test_runner
|
import os
from fabric.api import *
from fab_shared import (test, webpy_deploy as deploy,
setup, development, production, localhost, staging, restart_webserver,
rollback, lint, enable, disable, maintenancemode, rechef)
env.unit = "trinity"
env.path = "/var/tornado/%(unit)s" % env
env.scm = "[email protected]:bueda/%(unit)s.git" % env
env.scm_http_url = "http://github.com/bueda/%(unit)s" % env
env.root_dir = os.path.abspath(os.path.dirname(__file__))
env.pip_requirements = ["requirements/common.txt",]
env.pip_requirements_dev = ["requirements/dev.txt",]
env.pip_requirements_production = ["requirements/production.txt",]
env.campfire_subdomain = 'bueda'
env.campfire_room = 'Development'
env.campfire_token = '63768eee94d96b7b18e2091f3919b2a2a3dcd12a'
@runs_once
def tornado_test_runner(deployment_type=None):
return local('test/run_tests.py', capture=False).return_code
env.test_runner = tornado_test_runner
def reset():
import trinity
app = trinity.Trinity()
app.db.reset()
|
Add reset method for erasing graph database.
|
Add reset method for erasing graph database.
|
Python
|
mit
|
peplin/trinity
|
python
|
## Code Before:
import os
from fabric.api import *
from fab_shared import (test, webpy_deploy as deploy,
setup, development, production, localhost, staging, restart_webserver,
rollback, lint, enable, disable, maintenancemode, rechef)
env.unit = "trinity"
env.path = "/var/tornado/%(unit)s" % env
env.scm = "[email protected]:bueda/%(unit)s.git" % env
env.scm_http_url = "http://github.com/bueda/%(unit)s" % env
env.root_dir = os.path.abspath(os.path.dirname(__file__))
env.pip_requirements = ["requirements/common.txt",]
env.pip_requirements_dev = ["requirements/dev.txt",]
env.pip_requirements_production = ["requirements/production.txt",]
env.campfire_subdomain = 'bueda'
env.campfire_room = 'Development'
env.campfire_token = '63768eee94d96b7b18e2091f3919b2a2a3dcd12a'
@runs_once
def tornado_test_runner(deployment_type=None):
return local('test/run_tests.py', capture=False).return_code
env.test_runner = tornado_test_runner
## Instruction:
Add reset method for erasing graph database.
## Code After:
import os
from fabric.api import *
from fab_shared import (test, webpy_deploy as deploy,
setup, development, production, localhost, staging, restart_webserver,
rollback, lint, enable, disable, maintenancemode, rechef)
env.unit = "trinity"
env.path = "/var/tornado/%(unit)s" % env
env.scm = "[email protected]:bueda/%(unit)s.git" % env
env.scm_http_url = "http://github.com/bueda/%(unit)s" % env
env.root_dir = os.path.abspath(os.path.dirname(__file__))
env.pip_requirements = ["requirements/common.txt",]
env.pip_requirements_dev = ["requirements/dev.txt",]
env.pip_requirements_production = ["requirements/production.txt",]
env.campfire_subdomain = 'bueda'
env.campfire_room = 'Development'
env.campfire_token = '63768eee94d96b7b18e2091f3919b2a2a3dcd12a'
@runs_once
def tornado_test_runner(deployment_type=None):
return local('test/run_tests.py', capture=False).return_code
env.test_runner = tornado_test_runner
def reset():
import trinity
app = trinity.Trinity()
app.db.reset()
|
58cc2eaa9cfbdb19c3bedcc9cb5d553b7c7b69be
|
lib/ex_unit/escaper.ex
|
lib/ex_unit/escaper.ex
|
defmodule ExUnit::Escaper do
defrecord Flag, as: nil
defimpl String::Inspect, for: Flag do
def inspect(record), do: record.as
def to_binary(record), do: record.as
end
# Replace _ by a record that when inspected prints "_"
def escape({ :_, _, false }) do
quote { unquote(Flag).new(as: "_") }
end
def escape(expr) when is_tuple(expr) do
list_to_tuple(escape(tuple_to_list(expr)))
end
def escape(expr) when is_list(expr) do
Enum.map expr, escape(_)
end
def escape(expr) do
expr
end
end
|
defmodule ExUnit::Escaper do
defrecord Flag, as: nil
defimpl String::Inspect, for: Flag do
def inspect(record), do: record.as
def to_binary(record), do: record.as
end
# Replace _ by a record that when inspected prints "_"
def escape({ :_, _, false }) do
Flag.new(as: "_")
end
def escape(expr) when is_tuple(expr) do
list_to_tuple(escape(tuple_to_list(expr)))
end
def escape(expr) when is_list(expr) do
Enum.map expr, escape(_)
end
def escape(expr) do
expr
end
end
|
Add some explanation for ExUnit::Escaper.
|
Add some explanation for ExUnit::Escaper.
|
Elixir
|
apache-2.0
|
lexmag/elixir,kelvinst/elixir,pedrosnk/elixir,beedub/elixir,joshprice/elixir,beedub/elixir,kimshrier/elixir,lexmag/elixir,ggcampinho/elixir,ggcampinho/elixir,antipax/elixir,kelvinst/elixir,gfvcastro/elixir,michalmuskala/elixir,kimshrier/elixir,gfvcastro/elixir,elixir-lang/elixir,pedrosnk/elixir,antipax/elixir
|
elixir
|
## Code Before:
defmodule ExUnit::Escaper do
defrecord Flag, as: nil
defimpl String::Inspect, for: Flag do
def inspect(record), do: record.as
def to_binary(record), do: record.as
end
# Replace _ by a record that when inspected prints "_"
def escape({ :_, _, false }) do
quote { unquote(Flag).new(as: "_") }
end
def escape(expr) when is_tuple(expr) do
list_to_tuple(escape(tuple_to_list(expr)))
end
def escape(expr) when is_list(expr) do
Enum.map expr, escape(_)
end
def escape(expr) do
expr
end
end
## Instruction:
Add some explanation for ExUnit::Escaper.
## Code After:
defmodule ExUnit::Escaper do
defrecord Flag, as: nil
defimpl String::Inspect, for: Flag do
def inspect(record), do: record.as
def to_binary(record), do: record.as
end
# Replace _ by a record that when inspected prints "_"
def escape({ :_, _, false }) do
Flag.new(as: "_")
end
def escape(expr) when is_tuple(expr) do
list_to_tuple(escape(tuple_to_list(expr)))
end
def escape(expr) when is_list(expr) do
Enum.map expr, escape(_)
end
def escape(expr) do
expr
end
end
|
ad24d28fef2c71f75ea9d3f8405864907eecb3c2
|
templates/default/modules/http_geoip.conf.erb
|
templates/default/modules/http_geoip.conf.erb
|
geoip_country <%= @country_dat %>;
<% if @city_dat -%>
geoip_city <%= @city_dat %>;
<% end -%>
|
geoip_country <%= @country_dat %> utf8;
<% if @city_dat -%>
geoip_city <%= @city_dat %> utf8;
<% end -%>
|
Add utf8 for geoIp data
|
Add utf8 for geoIp data
|
HTML+ERB
|
apache-2.0
|
luishdez/nginx-cookbook,luishdez/nginx-cookbook,luishdez/nginx-cookbook
|
html+erb
|
## Code Before:
geoip_country <%= @country_dat %>;
<% if @city_dat -%>
geoip_city <%= @city_dat %>;
<% end -%>
## Instruction:
Add utf8 for geoIp data
## Code After:
geoip_country <%= @country_dat %> utf8;
<% if @city_dat -%>
geoip_city <%= @city_dat %> utf8;
<% end -%>
|
beab909b96c2e922aaec50c1e13af99137d8a190
|
events/orfik/templates/orfik/question.html
|
events/orfik/templates/orfik/question.html
|
{% extends 'orfik/home.html' %}
{% block body %}
<section id="events-container" style="margin: 60px auto 300px;">
<h2 class="light-space">Question {{ question.number }}</h2>
{% if question.text %}
<p class="lead">{{ question.text }}</p>
{% endif %}
{% if question.image %}
<img class='img-responsive img-hint' src='{{ question.image.url }}' style="margin: 0 auto;"/>
{% endif %}
<!-- Form for the answer -->
<form role='form' class="answer-form" action="{% url 'events:orfik:question' question.number %}" method='POST' style="margi-bottom: 100px;">
{% csrf_token %}
{{ form }}
<button type='submit' class='btn-contact'>Submit</button>
</form>
</section>
{% endblock %}
|
{% extends 'orfik/home.html' %}
{% block body %}
<!-- What do 'Scarface', 'Blow' and 'Miami Vice' have in common? -->
<section id="events-container" style="margin: 60px auto 300px;">
<h2 class="light-space">Question {{ question.number }}</h2>
{% if question.text %}
<p class="lead">{{ question.text }}</p>
{% endif %}
{% if question.image %}
<img class='img-responsive img-hint' src='{{ question.image.url }}' style="margin: 0 auto;"/>
{% endif %}
<!-- Form for the answer -->
<form role='form' class="answer-form" action="{% url 'events:orfik:question' question.number %}" method='POST' style="margi-bottom: 100px;">
{% csrf_token %}
{{ form }}
<button type='submit' class='btn-contact'>Submit</button>
</form>
</section>
{% endblock %}
|
Add some source code magic for orfik
|
Add some source code magic for orfik
|
HTML
|
mit
|
compsoc-ssc/compsocssc,compsoc-ssc/compsocssc,compsoc-ssc/compsocssc
|
html
|
## Code Before:
{% extends 'orfik/home.html' %}
{% block body %}
<section id="events-container" style="margin: 60px auto 300px;">
<h2 class="light-space">Question {{ question.number }}</h2>
{% if question.text %}
<p class="lead">{{ question.text }}</p>
{% endif %}
{% if question.image %}
<img class='img-responsive img-hint' src='{{ question.image.url }}' style="margin: 0 auto;"/>
{% endif %}
<!-- Form for the answer -->
<form role='form' class="answer-form" action="{% url 'events:orfik:question' question.number %}" method='POST' style="margi-bottom: 100px;">
{% csrf_token %}
{{ form }}
<button type='submit' class='btn-contact'>Submit</button>
</form>
</section>
{% endblock %}
## Instruction:
Add some source code magic for orfik
## Code After:
{% extends 'orfik/home.html' %}
{% block body %}
<!-- What do 'Scarface', 'Blow' and 'Miami Vice' have in common? -->
<section id="events-container" style="margin: 60px auto 300px;">
<h2 class="light-space">Question {{ question.number }}</h2>
{% if question.text %}
<p class="lead">{{ question.text }}</p>
{% endif %}
{% if question.image %}
<img class='img-responsive img-hint' src='{{ question.image.url }}' style="margin: 0 auto;"/>
{% endif %}
<!-- Form for the answer -->
<form role='form' class="answer-form" action="{% url 'events:orfik:question' question.number %}" method='POST' style="margi-bottom: 100px;">
{% csrf_token %}
{{ form }}
<button type='submit' class='btn-contact'>Submit</button>
</form>
</section>
{% endblock %}
|
b32da432b3fcfea05728ec42d254f3e5ca109579
|
test/sanitizer_common/TestCases/sanitizer_coverage_inline8bit_counter.cc
|
test/sanitizer_common/TestCases/sanitizer_coverage_inline8bit_counter.cc
|
// Tests -fsanitize-coverage=inline-8bit-counters
//
// REQUIRES: has_sancovcc,stable-runtime
// UNSUPPORTED: i386-darwin
//
// RUN: %clangxx -O0 %s -fsanitize-coverage=inline-8bit-counters 2>&1
#include <stdio.h>
#include <assert.h>
const char *first_counter;
extern "C"
void __sanitizer_cov_8bit_counters_init(const char *start, const char *end) {
printf("INIT: %p %p\n", start, end);
assert(end - start > 1);
first_counter = start;
}
int main() {
assert(first_counter);
assert(*first_counter == 1);
}
|
// Tests -fsanitize-coverage=inline-8bit-counters
//
// REQUIRES: has_sancovcc,stable-runtime
// UNSUPPORTED: i386-darwin, x86_64-darwin
//
// RUN: %clangxx -O0 %s -fsanitize-coverage=inline-8bit-counters 2>&1
#include <stdio.h>
#include <assert.h>
const char *first_counter;
extern "C"
void __sanitizer_cov_8bit_counters_init(const char *start, const char *end) {
printf("INIT: %p %p\n", start, end);
assert(end - start > 1);
first_counter = start;
}
int main() {
assert(first_counter);
assert(*first_counter == 1);
}
|
Mark sancov test as unsupported on Darwin
|
Mark sancov test as unsupported on Darwin
This test has been failing on all Darwin bots since it was introduced:
http://lab.llvm.org:8080/green/job/clang-stage1-configure-RA_check/32111
fatal error: error in backend: Global variable '__sancov_gen_' has an invalid section specifier '__DATA,__sancov_counters': mach-o section specifier requires a section whose length is between 1 and 16 characters.
Target: x86_64-apple-darwin15.6.0
git-svn-id: c199f293c43da69278bea8e88f92242bf3aa95f7@304673 91177308-0d34-0410-b5e6-96231b3b80d8
|
C++
|
apache-2.0
|
llvm-mirror/compiler-rt,llvm-mirror/compiler-rt,llvm-mirror/compiler-rt,llvm-mirror/compiler-rt,llvm-mirror/compiler-rt
|
c++
|
## Code Before:
// Tests -fsanitize-coverage=inline-8bit-counters
//
// REQUIRES: has_sancovcc,stable-runtime
// UNSUPPORTED: i386-darwin
//
// RUN: %clangxx -O0 %s -fsanitize-coverage=inline-8bit-counters 2>&1
#include <stdio.h>
#include <assert.h>
const char *first_counter;
extern "C"
void __sanitizer_cov_8bit_counters_init(const char *start, const char *end) {
printf("INIT: %p %p\n", start, end);
assert(end - start > 1);
first_counter = start;
}
int main() {
assert(first_counter);
assert(*first_counter == 1);
}
## Instruction:
Mark sancov test as unsupported on Darwin
This test has been failing on all Darwin bots since it was introduced:
http://lab.llvm.org:8080/green/job/clang-stage1-configure-RA_check/32111
fatal error: error in backend: Global variable '__sancov_gen_' has an invalid section specifier '__DATA,__sancov_counters': mach-o section specifier requires a section whose length is between 1 and 16 characters.
Target: x86_64-apple-darwin15.6.0
git-svn-id: c199f293c43da69278bea8e88f92242bf3aa95f7@304673 91177308-0d34-0410-b5e6-96231b3b80d8
## Code After:
// Tests -fsanitize-coverage=inline-8bit-counters
//
// REQUIRES: has_sancovcc,stable-runtime
// UNSUPPORTED: i386-darwin, x86_64-darwin
//
// RUN: %clangxx -O0 %s -fsanitize-coverage=inline-8bit-counters 2>&1
#include <stdio.h>
#include <assert.h>
const char *first_counter;
extern "C"
void __sanitizer_cov_8bit_counters_init(const char *start, const char *end) {
printf("INIT: %p %p\n", start, end);
assert(end - start > 1);
first_counter = start;
}
int main() {
assert(first_counter);
assert(*first_counter == 1);
}
|
bae1161b3393b1b4aa87527bc2adb88f933410f2
|
src/bb/buildinit.go
|
src/bb/buildinit.go
|
package main
import (
"log"
"os"
"os/exec"
"path"
)
func buildinit() {
e := os.Environ()
for i := range e {
if e[i][0:6] == "GOPATH" {
e[i] = e[i] + ":" + path.Join(config.Uroot, "src/bb/bbsh")
}
}
cmd := exec.Command("go", "build", "-o", "init", ".")
cmd.Stderr = os.Stderr
cmd.Stdout = os.Stdout
cmd.Dir = path.Join(config.Uroot, "src/bb/bbsh")
cmd.Env = e
err := cmd.Run()
if err != nil {
log.Fatalf("%v\n", err)
os.Exit(1)
}
}
|
package main
import (
"log"
"os"
"os/exec"
"path"
)
func buildinit() {
e := os.Environ()
for i := range e {
if e[i][0:6] == "GOPATH" {
e[i] = e[i] + ":" + path.Join(config.Uroot, "src/bb/bbsh")
}
}
e = append(e, "CGO_ENABLED=0")
cmd := exec.Command("go", "build", "-o", "init", ".")
cmd.Stderr = os.Stderr
cmd.Stdout = os.Stdout
cmd.Dir = path.Join(config.Uroot, "src/bb/bbsh")
cmd.Env = e
err := cmd.Run()
if err != nil {
log.Fatalf("%v\n", err)
os.Exit(1)
}
}
|
Make sure CGO_ENABLED is 0
|
Make sure CGO_ENABLED is 0
Signed-off-by: Ronald G. Minnich <[email protected]>
|
Go
|
bsd-3-clause
|
rminnich/u-root,GanShun/u-root,hugelgupf/u-root,rminnich/u-root,u-root/u-root,hugelgupf/u-root,hugelgupf/u-root,hugelgupf/u-root,u-root/u-root,GanShun/u-root,GanShun/u-root,u-root/u-root,rminnich/u-root,rminnich/u-root
|
go
|
## Code Before:
package main
import (
"log"
"os"
"os/exec"
"path"
)
func buildinit() {
e := os.Environ()
for i := range e {
if e[i][0:6] == "GOPATH" {
e[i] = e[i] + ":" + path.Join(config.Uroot, "src/bb/bbsh")
}
}
cmd := exec.Command("go", "build", "-o", "init", ".")
cmd.Stderr = os.Stderr
cmd.Stdout = os.Stdout
cmd.Dir = path.Join(config.Uroot, "src/bb/bbsh")
cmd.Env = e
err := cmd.Run()
if err != nil {
log.Fatalf("%v\n", err)
os.Exit(1)
}
}
## Instruction:
Make sure CGO_ENABLED is 0
Signed-off-by: Ronald G. Minnich <[email protected]>
## Code After:
package main
import (
"log"
"os"
"os/exec"
"path"
)
func buildinit() {
e := os.Environ()
for i := range e {
if e[i][0:6] == "GOPATH" {
e[i] = e[i] + ":" + path.Join(config.Uroot, "src/bb/bbsh")
}
}
e = append(e, "CGO_ENABLED=0")
cmd := exec.Command("go", "build", "-o", "init", ".")
cmd.Stderr = os.Stderr
cmd.Stdout = os.Stdout
cmd.Dir = path.Join(config.Uroot, "src/bb/bbsh")
cmd.Env = e
err := cmd.Run()
if err != nil {
log.Fatalf("%v\n", err)
os.Exit(1)
}
}
|
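The environment-handling trick in this record — copying the parent environment and forcing `CGO_ENABLED=0` before spawning `go build` — can be sketched in Python. This is a hypothetical analogue, not part of the u-root code; the commented `go build` invocation is illustrative only:

```python
import os


def build_env():
    """Copy the current environment and force CGO_ENABLED=0, analogous to
    the Go snippet's `e = append(e, "CGO_ENABLED=0")` before cmd.Run()."""
    env = dict(os.environ)
    env["CGO_ENABLED"] = "0"
    return env


# Hypothetical usage:
# subprocess.run(["go", "build", "-o", "init", "."], env=build_env())
```

One difference worth noting: the Go version appends a possibly duplicate entry to the environment slice (os/exec uses the last occurrence), while a dict assignment simply overwrites any existing value.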
c3bcedb66af80f5843f20488d238c8dfd2a5e2bf
|
src/util/syserror.cpp
|
src/util/syserror.cpp
|
// Copyright (c) 2020-2022 The Bitcoin Core developers
// Distributed under the MIT software license, see the accompanying
// file COPYING or http://www.opensource.org/licenses/mit-license.php.
#if defined(HAVE_CONFIG_H)
#include <config/bitcoin-config.h>
#endif
#include <tinyformat.h>
#include <util/syserror.h>
#include <cstring>
std::string SysErrorString(int err)
{
char buf[256];
buf[0] = 0;
/* Too bad there are two incompatible implementations of the
* thread-safe strerror. */
const char *s;
#ifdef STRERROR_R_CHAR_P /* GNU variant can return a pointer outside the passed buffer */
s = strerror_r(err, buf, sizeof(buf));
#else /* POSIX variant always returns message in buffer */
s = buf;
if (strerror_r(err, buf, sizeof(buf)))
buf[0] = 0;
#endif
return strprintf("%s (%d)", s, err);
}
|
// Copyright (c) 2020-2022 The Bitcoin Core developers
// Distributed under the MIT software license, see the accompanying
// file COPYING or http://www.opensource.org/licenses/mit-license.php.
#if defined(HAVE_CONFIG_H)
#include <config/bitcoin-config.h>
#endif
#include <tinyformat.h>
#include <util/syserror.h>
#include <cstring>
std::string SysErrorString(int err)
{
char buf[256];
buf[0] = 0;
/* Too bad there are three incompatible implementations of the
* thread-safe strerror. */
const char *s;
#ifdef WIN32
s = buf;
if (strerror_s(buf, sizeof(buf), err) != 0)
buf[0] = 0;
#else
#ifdef STRERROR_R_CHAR_P /* GNU variant can return a pointer outside the passed buffer */
s = strerror_r(err, buf, sizeof(buf));
#else /* POSIX variant always returns message in buffer */
s = buf;
if (strerror_r(err, buf, sizeof(buf)))
buf[0] = 0;
#endif
#endif
return strprintf("%s (%d)", s, err);
}
|
Use strerror_s for SysErrorString on Windows
|
util: Use strerror_s for SysErrorString on Windows
Github-Pull: #24933
Rebased-From: e7f2f77756d33c6be9c8998a575b263ff2d39270
|
C++
|
mit
|
bitcoinknots/bitcoin,bitcoinknots/bitcoin,bitcoinknots/bitcoin,bitcoinknots/bitcoin,bitcoinknots/bitcoin
|
c++
|
## Code Before:
// Copyright (c) 2020-2022 The Bitcoin Core developers
// Distributed under the MIT software license, see the accompanying
// file COPYING or http://www.opensource.org/licenses/mit-license.php.
#if defined(HAVE_CONFIG_H)
#include <config/bitcoin-config.h>
#endif
#include <tinyformat.h>
#include <util/syserror.h>
#include <cstring>
std::string SysErrorString(int err)
{
char buf[256];
buf[0] = 0;
/* Too bad there are two incompatible implementations of the
* thread-safe strerror. */
const char *s;
#ifdef STRERROR_R_CHAR_P /* GNU variant can return a pointer outside the passed buffer */
s = strerror_r(err, buf, sizeof(buf));
#else /* POSIX variant always returns message in buffer */
s = buf;
if (strerror_r(err, buf, sizeof(buf)))
buf[0] = 0;
#endif
return strprintf("%s (%d)", s, err);
}
## Instruction:
util: Use strerror_s for SysErrorString on Windows
Github-Pull: #24933
Rebased-From: e7f2f77756d33c6be9c8998a575b263ff2d39270
## Code After:
// Copyright (c) 2020-2022 The Bitcoin Core developers
// Distributed under the MIT software license, see the accompanying
// file COPYING or http://www.opensource.org/licenses/mit-license.php.
#if defined(HAVE_CONFIG_H)
#include <config/bitcoin-config.h>
#endif
#include <tinyformat.h>
#include <util/syserror.h>
#include <cstring>
std::string SysErrorString(int err)
{
char buf[256];
buf[0] = 0;
/* Too bad there are three incompatible implementations of the
* thread-safe strerror. */
const char *s;
#ifdef WIN32
s = buf;
if (strerror_s(buf, sizeof(buf), err) != 0)
buf[0] = 0;
#else
#ifdef STRERROR_R_CHAR_P /* GNU variant can return a pointer outside the passed buffer */
s = strerror_r(err, buf, sizeof(buf));
#else /* POSIX variant always returns message in buffer */
s = buf;
if (strerror_r(err, buf, sizeof(buf)))
buf[0] = 0;
#endif
#endif
return strprintf("%s (%d)", s, err);
}
|
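For comparison, a rough Python analogue of `SysErrorString` — the errno message text followed by the numeric code — can lean on `os.strerror`, which hides the `strerror_r`/`strerror_s` portability mess the C++ code has to handle explicitly. A sketch, not part of Bitcoin Core:

```python
import os


def sys_error_string(err):
    """Format an errno as 'message (code)', mirroring the C++
    strprintf("%s (%d)", s, err) in SysErrorString."""
    return "%s (%d)" % (os.strerror(err), err)
```

The exact message text is platform-dependent, which is precisely why the C++ version needs three incompatible thread-safe variants.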
f4b287e84f3866ca34409442a73a505d20d1b433
|
correctingInterval.js
|
correctingInterval.js
|
;(function(global, factory) {
// Use UMD pattern to expose exported functions
if (typeof exports === 'object') {
// Expose to Node.js
module.exports = factory();
} else if (typeof define === 'function' && define.amd) {
// Expose to RequireJS
define([], factory);
}
// Expose to global object (likely browser window)
var exports = factory();
for (var prop in exports) {
global[prop] = exports[prop];
}
}(this, function() {
var numIntervals = 0,
intervals = {};
var setCorrectingInterval = function(func, delay) {
var id = numIntervals++,
planned = Date.now() + delay;
function tick() {
func();
if (intervals[id]) {
planned += delay;
intervals[id] = setTimeout(tick, planned - Date.now());
}
}
intervals[id] = setTimeout(tick, delay);
return id;
};
var clearCorrectingInterval = function(id) {
clearTimeout(intervals[id]);
delete intervals[id];
};
return {
setCorrectingInterval: setCorrectingInterval,
clearCorrectingInterval: clearCorrectingInterval
};
}));
|
;(function(global, factory) {
// Use UMD pattern to expose exported functions
if (typeof exports === 'object') {
// Expose to Node.js
module.exports = factory();
} else if (typeof define === 'function' && define.amd) {
// Expose to RequireJS
define([], factory);
}
// Expose to global object (likely browser window)
var exports = factory();
for (var prop in exports) {
global[prop] = exports[prop];
}
}(this, function() {
var numIntervals = 0,
intervals = {};
var setCorrectingInterval = function(func, delay) {
var id = numIntervals++,
planned = Date.now() + delay;
if (typeof func === 'string') {
// Convert string to function
func = function() { eval(this.func); }.bind({ func: func });
}
function tick() {
func();
if (intervals[id]) {
planned += delay;
intervals[id] = setTimeout(tick, planned - Date.now());
}
}
intervals[id] = setTimeout(tick, delay);
return id;
};
var clearCorrectingInterval = function(id) {
clearTimeout(intervals[id]);
delete intervals[id];
};
return {
setCorrectingInterval: setCorrectingInterval,
clearCorrectingInterval: clearCorrectingInterval
};
}));
|
Resolve failing test "should allow string input for execution"
|
Resolve failing test "should allow string input for execution"
|
JavaScript
|
mit
|
aduth/correctingInterval
|
javascript
|
## Code Before:
;(function(global, factory) {
// Use UMD pattern to expose exported functions
if (typeof exports === 'object') {
// Expose to Node.js
module.exports = factory();
} else if (typeof define === 'function' && define.amd) {
// Expose to RequireJS
define([], factory);
}
// Expose to global object (likely browser window)
var exports = factory();
for (var prop in exports) {
global[prop] = exports[prop];
}
}(this, function() {
var numIntervals = 0,
intervals = {};
var setCorrectingInterval = function(func, delay) {
var id = numIntervals++,
planned = Date.now() + delay;
function tick() {
func();
if (intervals[id]) {
planned += delay;
intervals[id] = setTimeout(tick, planned - Date.now());
}
}
intervals[id] = setTimeout(tick, delay);
return id;
};
var clearCorrectingInterval = function(id) {
clearTimeout(intervals[id]);
delete intervals[id];
};
return {
setCorrectingInterval: setCorrectingInterval,
clearCorrectingInterval: clearCorrectingInterval
};
}));
## Instruction:
Resolve failing test "should allow string input for execution"
## Code After:
;(function(global, factory) {
// Use UMD pattern to expose exported functions
if (typeof exports === 'object') {
// Expose to Node.js
module.exports = factory();
} else if (typeof define === 'function' && define.amd) {
// Expose to RequireJS
define([], factory);
}
// Expose to global object (likely browser window)
var exports = factory();
for (var prop in exports) {
global[prop] = exports[prop];
}
}(this, function() {
var numIntervals = 0,
intervals = {};
var setCorrectingInterval = function(func, delay) {
var id = numIntervals++,
planned = Date.now() + delay;
if (typeof func === 'string') {
// Convert string to function
func = function() { eval(this.func); }.bind({ func: func });
}
function tick() {
func();
if (intervals[id]) {
planned += delay;
intervals[id] = setTimeout(tick, planned - Date.now());
}
}
intervals[id] = setTimeout(tick, delay);
return id;
};
var clearCorrectingInterval = function(id) {
clearTimeout(intervals[id]);
delete intervals[id];
};
return {
setCorrectingInterval: setCorrectingInterval,
clearCorrectingInterval: clearCorrectingInterval
};
}));
|
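The drift-correction idea behind `setCorrectingInterval` — rescheduling each tick at `planned - now` instead of a fixed `delay`, so per-tick lateness does not accumulate — can be checked with a small deterministic simulation (a sketch under assumed drift values, not part of the library):

```python
def simulate_ticks(delay, drifts):
    """Simulate a drift-correcting interval. Each tick waits
    (planned - now) rather than a fixed delay; `drifts` gives how late
    each tick actually fires relative to its scheduled time."""
    now = 0.0
    planned = delay
    fire_times = []
    for drift in drifts:
        wait = planned - now      # corrected wait, as in the JS tick()
        now += wait + drift       # timer fires `drift` late
        fire_times.append(now)
        planned += delay          # planned += delay, as in the JS
    return fire_times
```

With a naive fixed-delay interval the same 5-unit lateness would compound to 5, 10, 15...; here each tick stays only 5 units behind its planned time.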
752f89d866e899395d45c87a9bfc019839dfe7f6
|
.travis.yml
|
.travis.yml
|
sudo: required
env:
- GHCVER=7.10.3
- GHCVER=8.0.2
- GHCVER=8.2.1
- GHCVER=8.4.1
- GHCVER=head
script:
- export HLINT_ARGUMENTS=src
- curl -sL https://raw.github.com/ndmitchell/neil/master/travis.sh | sh
|
sudo: required
matrix:
include:
- env: GHCVER=7.10.3
- env: GHCVER=8.0.2
- env: GHCVER=8.2.1
- env: GHCVER=8.4.1
- env: GHCVER=head
- os: osx
script:
- export HLINT_ARGUMENTS=src
- curl -sL https://raw.github.com/ndmitchell/neil/master/travis.sh | sh
deploy:
provider: releases
api_key:
secure: ""
file_glob: true
file: travis-release/*
skip_cleanup: true
on:
tags: true
|
Test on osx as well
|
Test on osx as well
|
YAML
|
bsd-3-clause
|
ndmitchell/weeder
|
yaml
|
## Code Before:
sudo: required
env:
- GHCVER=7.10.3
- GHCVER=8.0.2
- GHCVER=8.2.1
- GHCVER=8.4.1
- GHCVER=head
script:
- export HLINT_ARGUMENTS=src
- curl -sL https://raw.github.com/ndmitchell/neil/master/travis.sh | sh
## Instruction:
Test on osx as well
## Code After:
sudo: required
matrix:
include:
- env: GHCVER=7.10.3
- env: GHCVER=8.0.2
- env: GHCVER=8.2.1
- env: GHCVER=8.4.1
- env: GHCVER=head
- os: osx
script:
- export HLINT_ARGUMENTS=src
- curl -sL https://raw.github.com/ndmitchell/neil/master/travis.sh | sh
deploy:
provider: releases
api_key:
secure: ""
file_glob: true
file: travis-release/*
skip_cleanup: true
on:
tags: true
|
85190b166740863e96b65cca09e44e0dda801c46
|
app/controllers/metadata_values_controller.rb
|
app/controllers/metadata_values_controller.rb
|
class MetadataValuesController < ApplicationController
def index
raise "Corpus ID is missing!" unless received_id_for?(:corpus)
render json: MetadataValue
.joins(:metadata_category)
.where('metadata_categories.corpus_id = ?', params[:corpus_id])
end
end
|
class MetadataValuesController < ApplicationController
def index
raise "Corpus ID is missing!" unless received_id_for?(:corpus)
render json: MetadataValue
.joins(:metadata_category)
.where(metadata_categories: {corpus_id: params[:corpus_id]})
end
end
|
Use hash syntax for query conditions
|
Use hash syntax for query conditions
|
Ruby
|
mit
|
textlab/rglossa,textlab/glossa,textlab/glossa,textlab/glossa,textlab/rglossa,textlab/rglossa,textlab/rglossa,textlab/glossa,textlab/glossa,textlab/rglossa
|
ruby
|
## Code Before:
class MetadataValuesController < ApplicationController
def index
raise "Corpus ID is missing!" unless received_id_for?(:corpus)
render json: MetadataValue
.joins(:metadata_category)
.where('metadata_categories.corpus_id = ?', params[:corpus_id])
end
end
## Instruction:
Use hash syntax for query conditions
## Code After:
class MetadataValuesController < ApplicationController
def index
raise "Corpus ID is missing!" unless received_id_for?(:corpus)
render json: MetadataValue
.joins(:metadata_category)
.where(metadata_categories: {corpus_id: params[:corpus_id]})
end
end
|
91ebc3e71a1d833c22713c06cae4bc1bf4494893
|
apps/metric_vnode/src/metric_io_worker.erl
|
apps/metric_vnode/src/metric_io_worker.erl
|
-module(metric_io_worker).
-behaviour(riak_core_vnode_worker).
-include("metric_io.hrl").
-export([init_worker/3,
handle_work/3]).
-record(state, {index, node}).
%% ===================================================================
%% Public API
%% ===================================================================
%% @doc Initialize the worker. Currently only the VNode index
%% parameter is used.
init_worker(VNodeIndex, _Args, _Props) ->
{ok, #state{index=VNodeIndex, node = node()}}.
%% @doc Perform the asynchronous fold operation.
handle_work(#read_req{
mstore = MSetc,
metric = Metric,
time = Time,
count = Count,
compression = Compression,
map_fn = Map,
req_id = ReqID
}, _Sender, State = #state{index = P, node = N}) ->
{ok, Data} = ddb_histogram:timed_update(
{mstore, read},
mstore, get, [MSetc, Metric, Time, Count]),
mstore:close(MSetc),
Data1 = case Map of
undefined -> Data;
_ -> Map(Data)
end,
Dc = compress(Data1, Compression),
{reply, {ok, ReqID, {P, N}, Dc}, State}.
compress(Data, snappy) ->
{ok, Dc} = snappyer:compress(Data),
Dc;
compress(Data, none) ->
Data.
|
-module(metric_io_worker).
-behaviour(riak_core_vnode_worker).
-include("metric_io.hrl").
-export([init_worker/3,
handle_work/3]).
-record(state, {index, node}).
%% ===================================================================
%% Public API
%% ===================================================================
%% @doc Initialize the worker. Currently only the VNode index
%% parameter is used.
init_worker(VNodeIndex, _Args, _Props) ->
{ok, #state{index=VNodeIndex, node = node()}}.
%% @doc Perform the asynchronous fold operation.
handle_work(#read_req{
mstore = MSetc,
metric = Metric,
time = Time,
count = Count,
compression = Compression,
map_fn = Map,
req_id = ReqID
}, _Sender, State = #state{index = P, node = N}) ->
{ok, Data} = ddb_histogram:timed_update(
{mstore, read},
mstore, get, [MSetc, Metric, Time, Count, [one_off]]),
mstore:close(MSetc),
Data1 = case Map of
undefined -> Data;
_ -> Map(Data)
end,
Dc = compress(Data1, Compression),
{reply, {ok, ReqID, {P, N}, Dc}, State}.
compress(Data, snappy) ->
{ok, Dc} = snappyer:compress(Data),
Dc;
compress(Data, none) ->
Data.
|
Use one_off option for parallel reads
|
Use one_off option for parallel reads
|
Erlang
|
mit
|
dalmatinerdb/dalmatinerdb
|
erlang
|
## Code Before:
-module(metric_io_worker).
-behaviour(riak_core_vnode_worker).
-include("metric_io.hrl").
-export([init_worker/3,
handle_work/3]).
-record(state, {index, node}).
%% ===================================================================
%% Public API
%% ===================================================================
%% @doc Initialize the worker. Currently only the VNode index
%% parameter is used.
init_worker(VNodeIndex, _Args, _Props) ->
{ok, #state{index=VNodeIndex, node = node()}}.
%% @doc Perform the asynchronous fold operation.
handle_work(#read_req{
mstore = MSetc,
metric = Metric,
time = Time,
count = Count,
compression = Compression,
map_fn = Map,
req_id = ReqID
}, _Sender, State = #state{index = P, node = N}) ->
{ok, Data} = ddb_histogram:timed_update(
{mstore, read},
mstore, get, [MSetc, Metric, Time, Count]),
mstore:close(MSetc),
Data1 = case Map of
undefined -> Data;
_ -> Map(Data)
end,
Dc = compress(Data1, Compression),
{reply, {ok, ReqID, {P, N}, Dc}, State}.
compress(Data, snappy) ->
{ok, Dc} = snappyer:compress(Data),
Dc;
compress(Data, none) ->
Data.
## Instruction:
Use one_off option for parallel reads
## Code After:
-module(metric_io_worker).
-behaviour(riak_core_vnode_worker).
-include("metric_io.hrl").
-export([init_worker/3,
handle_work/3]).
-record(state, {index, node}).
%% ===================================================================
%% Public API
%% ===================================================================
%% @doc Initialize the worker. Currently only the VNode index
%% parameter is used.
init_worker(VNodeIndex, _Args, _Props) ->
{ok, #state{index=VNodeIndex, node = node()}}.
%% @doc Perform the asynchronous fold operation.
handle_work(#read_req{
mstore = MSetc,
metric = Metric,
time = Time,
count = Count,
compression = Compression,
map_fn = Map,
req_id = ReqID
}, _Sender, State = #state{index = P, node = N}) ->
{ok, Data} = ddb_histogram:timed_update(
{mstore, read},
mstore, get, [MSetc, Metric, Time, Count, [one_off]]),
mstore:close(MSetc),
Data1 = case Map of
undefined -> Data;
_ -> Map(Data)
end,
Dc = compress(Data1, Compression),
{reply, {ok, ReqID, {P, N}, Dc}, State}.
compress(Data, snappy) ->
{ok, Dc} = snappyer:compress(Data),
Dc;
compress(Data, none) ->
Data.
|
b0eafadae4b602b57ce0badcb8389bd239b9c9e5
|
build/UpdateTest.sh
|
build/UpdateTest.sh
|
export HAKA_TEST_FIX=$(pwd)/haka-test-fix
for i in $(seq $1 $2)
do
CONTINUE=1
while [ $CONTINUE = 1 ]; do
rm -f $HAKA_TEST_FIX
ctest -V -I $i,$i
if [ $? = 0 ]; then
CONTINUE=0
else
if [ -f "$HAKA_TEST_FIX" ]; then
cat $HAKA_TEST_FIX
read -p "Update test? (y/n) " -n 1 -r
echo
case $REPLY in
"y") bash -c "$(cat $HAKA_TEST_FIX)";;
*) echo ERROR: Test not updated; CONTINUE=0;;
esac
else
echo ERROR: Test failed
CONTINUE=0
fi
fi
done
echo $i
done
|
export HAKA_TEST_FIX=$(pwd)/haka-test-fix
export QUICK=yes
for i in $(seq $1 $2)
do
CONTINUE=1
while [ $CONTINUE = 1 ]; do
rm -f $HAKA_TEST_FIX
ctest -V -I $i,$i
if [ $? = 0 ]; then
CONTINUE=0
else
if [ -f "$HAKA_TEST_FIX" ]; then
cat $HAKA_TEST_FIX
read -p "Update test? (y/n) " -n 1 -r
echo
case $REPLY in
"y") bash -c "$(cat $HAKA_TEST_FIX)";;
*) echo ERROR: Test not updated; CONTINUE=0;;
esac
else
echo ERROR: Test failed
CONTINUE=0
fi
fi
done
echo $i
done
|
Use quick mode for updateTest script
|
Use quick mode for updateTest script
|
Shell
|
mpl-2.0
|
nabilbendafi/haka,lcheylus/haka,LubyRuffy/haka,haka-security/haka,Wingless-Archangel/haka,lcheylus/haka,haka-security/haka,nabilbendafi/haka,Wingless-Archangel/haka,nabilbendafi/haka,lcheylus/haka,haka-security/haka,LubyRuffy/haka
|
shell
|
## Code Before:
export HAKA_TEST_FIX=$(pwd)/haka-test-fix
for i in $(seq $1 $2)
do
CONTINUE=1
while [ $CONTINUE = 1 ]; do
rm -f $HAKA_TEST_FIX
ctest -V -I $i,$i
if [ $? = 0 ]; then
CONTINUE=0
else
if [ -f "$HAKA_TEST_FIX" ]; then
cat $HAKA_TEST_FIX
read -p "Update test? (y/n) " -n 1 -r
echo
case $REPLY in
"y") bash -c "$(cat $HAKA_TEST_FIX)";;
*) echo ERROR: Test not updated; CONTINUE=0;;
esac
else
echo ERROR: Test failed
CONTINUE=0
fi
fi
done
echo $i
done
## Instruction:
Use quick mode for updateTest script
## Code After:
export HAKA_TEST_FIX=$(pwd)/haka-test-fix
export QUICK=yes
for i in $(seq $1 $2)
do
CONTINUE=1
while [ $CONTINUE = 1 ]; do
rm -f $HAKA_TEST_FIX
ctest -V -I $i,$i
if [ $? = 0 ]; then
CONTINUE=0
else
if [ -f "$HAKA_TEST_FIX" ]; then
cat $HAKA_TEST_FIX
read -p "Update test? (y/n) " -n 1 -r
echo
case $REPLY in
"y") bash -c "$(cat $HAKA_TEST_FIX)";;
*) echo ERROR: Test not updated; CONTINUE=0;;
esac
else
echo ERROR: Test failed
CONTINUE=0
fi
fi
done
echo $i
done
|
379c159a860d94ad96050c4cd743539994f7a246
|
.github/issue_template.md
|
.github/issue_template.md
|
1) ...
2) ...
3) ...
### Expected behavior
Tell us what should happen ...
### Actual behavior
Tell us what happens instead ...
### System configuration
**Ruby version**:
**Rails version**:
|
1) ...
2) ...
3) ...
### Expected behavior
Tell us what should happen ...
### Actual behavior
Tell us what happens instead ...
### System configuration
**Ruby version**:
**Rails version**:
**Puma version**:
|
Add puma version to issue template
|
Add puma version to issue template
|
Markdown
|
bsd-3-clause
|
eileencodes/puma,looker/puma,eileencodes/puma,eileencodes/puma,puma/puma,puma/puma,puma/puma,looker/puma,looker/puma,eileencodes/puma,puma/puma,looker/puma
|
markdown
|
## Code Before:
1) ...
2) ...
3) ...
### Expected behavior
Tell us what should happen ...
### Actual behavior
Tell us what happens instead ...
### System configuration
**Ruby version**:
**Rails version**:
## Instruction:
Add puma version to issue template
## Code After:
1) ...
2) ...
3) ...
### Expected behavior
Tell us what should happen ...
### Actual behavior
Tell us what happens instead ...
### System configuration
**Ruby version**:
**Rails version**:
**Puma version**:
|
0dae7fb16b17c165a55b06370129793370007fd4
|
exercises/practice/isogram/.meta/config.json
|
exercises/practice/isogram/.meta/config.json
|
{
"blurb": "Determine if a word or phrase is an isogram.",
"authors": [],
"contributors": [
"dector",
"eparovyshnaya",
"jtigger",
"lihofm",
"mdowds",
"nithia",
"sdavids13",
"sjwarner-bp",
"SleeplessByte",
"stkent",
"uzilan"
],
"files": {
"solution": [
"src/main/kotlin/Isogram.kt"
],
"test": [
"src/test/kotlin/IsogramTest.kt"
],
"example": [
".meta/src/main/kotlin/Isogram.kt"
]
},
"source": "Wikipedia",
"source_url": "https://en.wikipedia.org/wiki/Isogram"
}
|
{
"blurb": "Determine if a word or phrase is an isogram.",
"authors": [
"gandrianakis"
],
"contributors": [
"dector",
"eparovyshnaya",
"jtigger",
"lihofm",
"mdowds",
"nithia",
"sdavids13",
"sjwarner-bp",
"SleeplessByte",
"stkent",
"uzilan"
],
"files": {
"solution": [
"src/main/kotlin/Isogram.kt"
],
"test": [
"src/test/kotlin/IsogramTest.kt"
],
"example": [
".meta/src/main/kotlin/Isogram.kt"
]
},
"source": "Wikipedia",
"source_url": "https://en.wikipedia.org/wiki/Isogram"
}
|
Add gandrianakis as an exercise author
|
Add gandrianakis as an exercise author
|
JSON
|
mit
|
exercism/xkotlin,exercism/xkotlin
|
json
|
## Code Before:
{
"blurb": "Determine if a word or phrase is an isogram.",
"authors": [],
"contributors": [
"dector",
"eparovyshnaya",
"jtigger",
"lihofm",
"mdowds",
"nithia",
"sdavids13",
"sjwarner-bp",
"SleeplessByte",
"stkent",
"uzilan"
],
"files": {
"solution": [
"src/main/kotlin/Isogram.kt"
],
"test": [
"src/test/kotlin/IsogramTest.kt"
],
"example": [
".meta/src/main/kotlin/Isogram.kt"
]
},
"source": "Wikipedia",
"source_url": "https://en.wikipedia.org/wiki/Isogram"
}
## Instruction:
Add gandrianakis as an exercise author
## Code After:
{
"blurb": "Determine if a word or phrase is an isogram.",
"authors": [
"gandrianakis"
],
"contributors": [
"dector",
"eparovyshnaya",
"jtigger",
"lihofm",
"mdowds",
"nithia",
"sdavids13",
"sjwarner-bp",
"SleeplessByte",
"stkent",
"uzilan"
],
"files": {
"solution": [
"src/main/kotlin/Isogram.kt"
],
"test": [
"src/test/kotlin/IsogramTest.kt"
],
"example": [
".meta/src/main/kotlin/Isogram.kt"
]
},
"source": "Wikipedia",
"source_url": "https://en.wikipedia.org/wiki/Isogram"
}
|
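The exercise this config describes — deciding whether a word or phrase is an isogram — reduces to comparing the letter count with and without duplicates. A minimal sketch following the usual Exercism rules (case-insensitive, spaces and hyphens ignored); this is an illustration, not the Kotlin reference solution:

```python
def is_isogram(phrase):
    """True if no letter repeats in `phrase`; non-letter characters
    (spaces, hyphens) are ignored and comparison is case-insensitive."""
    letters = [c.lower() for c in phrase if c.isalpha()]
    return len(letters) == len(set(letters))
```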
2b48bf09a13de342e9aad08b6345dee4ddc410db
|
query-side/src/main/java/br/holandajunior/workaday/Application.java
|
query-side/src/main/java/br/holandajunior/workaday/Application.java
|
package br.holandajunior.workaday;
import org.apache.activemq.broker.BrokerService;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.jms.annotation.EnableJms;
/**
* Created by holandajunior on 29/04/17.
*/
@SpringBootApplication // same as @Configuration @EnableAutoConfiguration @ComponentScan
@EnableJms
public class Application {
private static final String URL_MSG_BROKER = "tcp://localhost:61616";
@Bean(initMethod = "start", destroyMethod = "stop")
public BrokerService broker() throws Exception {
final BrokerService broker = new BrokerService();
broker.addConnector( URL_MSG_BROKER );
broker.setPersistent(false);
return broker;
}
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}
|
package br.holandajunior.workaday;
import org.apache.activemq.broker.BrokerService;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.jms.annotation.EnableJms;
import org.springframework.web.servlet.config.annotation.CorsRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;
/**
* Created by holandajunior on 29/04/17.
*/
@SpringBootApplication // same as @Configuration @EnableAutoConfiguration @ComponentScan
@EnableJms
public class Application {
private static final String URL_MSG_BROKER = "tcp://localhost:61616";
@Bean(initMethod = "start", destroyMethod = "stop")
public BrokerService broker() throws Exception {
final BrokerService broker = new BrokerService();
broker.addConnector( URL_MSG_BROKER );
broker.setPersistent(false);
return broker;
}
@Bean
public WebMvcConfigurer corsConfigurer() {
return new WebMvcConfigurerAdapter() {
@Override
public void addCorsMappings(CorsRegistry registry) {
registry.addMapping("/**").allowedOrigins("*");
}
};
}
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}
|
Add cors filter into query-side
|
Add cors filter into query-side
|
Java
|
mit
|
holandajunior/workaday,holandajunior/workaday
|
java
|
## Code Before:
package br.holandajunior.workaday;
import org.apache.activemq.broker.BrokerService;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.jms.annotation.EnableJms;
/**
* Created by holandajunior on 29/04/17.
*/
@SpringBootApplication // same as @Configuration @EnableAutoConfiguration @ComponentScan
@EnableJms
public class Application {
private static final String URL_MSG_BROKER = "tcp://localhost:61616";
@Bean(initMethod = "start", destroyMethod = "stop")
public BrokerService broker() throws Exception {
final BrokerService broker = new BrokerService();
broker.addConnector( URL_MSG_BROKER );
broker.setPersistent(false);
return broker;
}
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}
## Instruction:
Add cors filter into query-side
## Code After:
package br.holandajunior.workaday;
import org.apache.activemq.broker.BrokerService;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.jms.annotation.EnableJms;
import org.springframework.web.servlet.config.annotation.CorsRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;
/**
 * Created by holandajunior on 29/04/17.
 */
@SpringBootApplication // same as @Configuration @EnableAutoConfiguration @ComponentScan
@EnableJms
public class Application {
    private static final String URL_MSG_BROKER = "tcp://localhost:61616";
    @Bean(initMethod = "start", destroyMethod = "stop")
    public BrokerService broker() throws Exception {
        final BrokerService broker = new BrokerService();
        broker.addConnector( URL_MSG_BROKER );
        broker.setPersistent(false);
        return broker;
    }
    @Bean
    public WebMvcConfigurer corsConfigurer() {
        return new WebMvcConfigurerAdapter() {
            @Override
            public void addCorsMappings(CorsRegistry registry) {
                registry.addMapping("/**").allowedOrigins("*");
            }
        };
    }
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
|
08177dd4bd443253415518147e03fdefceb4cc06
|
Manifold/Inference.swift
|
Manifold/Inference.swift
|
// Copyright (c) 2015 Rob Rix. All rights reserved.
public typealias ConstraintSet = Multiset<Constraint>
public func typeOf(expression: Expression) -> Either<Error, (Type, AssumptionSet, ConstraintSet)> {
    return expression.analysis(
        ifVariable: { v in
            let type = Type(Variable())
            return .right(type, [ v: [ Scheme([], type) ] ], [])
        },
        ifAbstraction: const(.left("unimplemented")),
        ifApplication: { e1, e2 in (typeOf(e1) && typeOf(e2)) >>- { e1, e2 in
            let type = Type(Variable())
            let c = Constraint(equality: e1.0, Type(function: e2.0, type))
            return .right(type, e1.1 + e2.1, e1.2 + e2.2 + [ c ]) }
        })
}
// MARK: - Imports
import Either
import Prelude
import Set
|
// Copyright (c) 2015 Rob Rix. All rights reserved.
public typealias ConstraintSet = Multiset<Constraint>
public func typeOf(expression: Expression) -> Either<Error, (Type, AssumptionSet, ConstraintSet)> {
    return expression.analysis(
        ifVariable: { v in
            let type = Type(Variable())
            return .right(type, [ v: [ Scheme([], type) ] ], [])
        },
        ifAbstraction: const(.left("unimplemented")),
        ifApplication: { e1, e2 in (typeOf(e1) && typeOf(e2)) >>- { e1, e2 in
            let type = Type(Variable())
            let c = Constraint(equality: e1.0, e2.0 --> type)
            return .right(type, e1.1 + e2.1, e1.2 + e2.2 + [ c ]) }
        })
}
// MARK: - Imports
import Either
import Prelude
import Set
|
Use the function type constructor.
|
Use the function type constructor.
|
Swift
|
mit
|
antitypical/Manifold,antitypical/Manifold
|
swift
|
## Code Before:
// Copyright (c) 2015 Rob Rix. All rights reserved.
public typealias ConstraintSet = Multiset<Constraint>
public func typeOf(expression: Expression) -> Either<Error, (Type, AssumptionSet, ConstraintSet)> {
    return expression.analysis(
        ifVariable: { v in
            let type = Type(Variable())
            return .right(type, [ v: [ Scheme([], type) ] ], [])
        },
        ifAbstraction: const(.left("unimplemented")),
        ifApplication: { e1, e2 in (typeOf(e1) && typeOf(e2)) >>- { e1, e2 in
            let type = Type(Variable())
            let c = Constraint(equality: e1.0, Type(function: e2.0, type))
            return .right(type, e1.1 + e2.1, e1.2 + e2.2 + [ c ]) }
        })
}
// MARK: - Imports
import Either
import Prelude
import Set
## Instruction:
Use the function type constructor.
## Code After:
// Copyright (c) 2015 Rob Rix. All rights reserved.
public typealias ConstraintSet = Multiset<Constraint>
public func typeOf(expression: Expression) -> Either<Error, (Type, AssumptionSet, ConstraintSet)> {
    return expression.analysis(
        ifVariable: { v in
            let type = Type(Variable())
            return .right(type, [ v: [ Scheme([], type) ] ], [])
        },
        ifAbstraction: const(.left("unimplemented")),
        ifApplication: { e1, e2 in (typeOf(e1) && typeOf(e2)) >>- { e1, e2 in
            let type = Type(Variable())
            let c = Constraint(equality: e1.0, e2.0 --> type)
            return .right(type, e1.1 + e2.1, e1.2 + e2.2 + [ c ]) }
        })
}
// MARK: - Imports
import Either
import Prelude
import Set
|
f999031a7a0164acd7c0b688a400a56b2fe77e67
|
README.rdoc
|
README.rdoc
|
= define
define is a simple CLI to Google's define:word utility. Syntax is `define: word [or phrase]` to return Google's definitions.
== Copyright
Copyright (c) 2010 Trevor Hartman. See {LICENSE}[http://github.com/devth/define/blob/develop/LICENSE] for details.
|
= define
define is a simple CLI to Google's define:word utility. Syntax is [\+define: word [or phrase]+] to return Google's definitions. Quit wasting your time looking up definitions in a clunky old browser and let command line define build your vocabulary minimalistically!
== Copyright
Copyright (c) 2010 Trevor Hartman. See {LICENSE}[http://github.com/devth/define/blob/develop/LICENSE] for details.
|
Update docs with code formatting and more description.
|
Update docs with code formatting and more description.
|
RDoc
|
mit
|
devth/define
|
rdoc
|
## Code Before:
= define
define is a simple CLI to Google's define:word utility. Syntax is `define: word [or phrase]` to return Google's definitions.
== Copyright
Copyright (c) 2010 Trevor Hartman. See {LICENSE}[http://github.com/devth/define/blob/develop/LICENSE] for details.
## Instruction:
Update docs with code formatting and more description.
## Code After:
= define
define is a simple CLI to Google's define:word utility. Syntax is [\+define: word [or phrase]+] to return Google's definitions. Quit wasting your time looking up definitions in a clunky old browser and let command line define build your vocabulary minimalistically!
== Copyright
Copyright (c) 2010 Trevor Hartman. See {LICENSE}[http://github.com/devth/define/blob/develop/LICENSE] for details.
|
8bd236ee24b80d70906b14e6a64fbcaff526cd5f
|
app/mailers/permit_sender.rb
|
app/mailers/permit_sender.rb
|
class PermitSender < ActionMailer::Base
  default from: "[email protected]"
  # send the permit email to San Antonio Gov, pass in the user object that contains the user's email address
  def send_permit_application(user)
    @permit = permit
    mail( :to => @permit.email,
          :subject => 'Thanks for signing up for our amazing app' )
  end
end
|
class PermitSender < ActionMailer::Base
  default from: "[email protected]"
  # send the permit email to San Antonio Gov, pass in the user object that contains the user's email address
  def send_permit_application(user)
    @permit = permit
    mail( :to => @permit.email,
          :subject => 'Thanks for signing up for our amazing app' )
  end
end
|
Configure email from permit sender to be donotreply@homebasefix
|
Configure email from permit sender to be donotreply@homebasefix
|
Ruby
|
isc
|
codeforamerica/homebase,randy-r-masters/homebase,randy-r-masters/homebase,schlos/homebase,randy-r-masters/homebase,schlos/homebase,codeforamerica/homebase,schlos/homebase,codeforamerica/homebase
|
ruby
|
## Code Before:
class PermitSender < ActionMailer::Base
  default from: "[email protected]"
  # send the permit email to San Antonio Gov, pass in the user object that contains the user's email address
  def send_permit_application(user)
    @permit = permit
    mail( :to => @permit.email,
          :subject => 'Thanks for signing up for our amazing app' )
  end
end
## Instruction:
Configure email from permit sender to be donotreply@homebasefix
## Code After:
class PermitSender < ActionMailer::Base
  default from: "[email protected]"
  # send the permit email to San Antonio Gov, pass in the user object that contains the user's email address
  def send_permit_application(user)
    @permit = permit
    mail( :to => @permit.email,
          :subject => 'Thanks for signing up for our amazing app' )
  end
end
|
376f5a856fd970cbb6ddfe7f19f83d4a88a691bc
|
src/field/email/email_pane.jsx
|
src/field/email/email_pane.jsx
|
import React from 'react';
import EmailInput from '../../ui/input/email_input';
import * as c from '../index';
import { swap, updateEntity } from '../../store/index';
import * as l from '../../core/index';
import { setEmail } from '../email';
import { debouncedRequestAvatar, requestAvatar } from '../../avatar';
export default class EmailPane extends React.Component {
  componentDidMount() {
    const { lock } = this.props;
    if (l.ui.avatar(lock) && c.email(lock)) {
      requestAvatar(l.id(lock), c.email(lock));
    }
  }
  handleChange(e) {
    const { lock } = this.props;
    if (l.ui.avatar(lock)) {
      debouncedRequestAvatar(l.id(lock), e.target.value);
    }
    swap(updateEntity, "lock", l.id(lock), setEmail, e.target.value);
  }
  render() {
    const { lock, placeholder } = this.props;
    return (
      <EmailInput value={c.email(lock)}
        isValid={!c.isFieldVisiblyInvalid(lock, "email")}
        onChange={::this.handleChange}
        avatar={l.ui.avatar(lock)}
        placeholder={placeholder}
        disabled={l.submitting(lock)}
      />
    );
  }
}
EmailPane.propTypes = {
  lock: React.PropTypes.object.isRequired,
  placeholder: React.PropTypes.string.isRequired
};
|
import React from 'react';
import EmailInput from '../../ui/input/email_input';
import * as c from '../index';
import { swap, updateEntity } from '../../store/index';
import * as l from '../../core/index';
import { setEmail } from '../email';
import { debouncedRequestAvatar, requestAvatar } from '../../avatar';
export default class EmailPane extends React.Component {
  componentDidMount() {
    const { lock } = this.props;
    if (l.ui.avatar(lock) && c.email(lock)) {
      requestAvatar(l.id(lock), c.email(lock));
    }
  }
  handleChange(e) {
    const { lock } = this.props;
    if (l.ui.avatar(lock)) {
      debouncedRequestAvatar(l.id(lock), e.target.value);
    }
    swap(updateEntity, "lock", l.id(lock), setEmail, e.target.value);
  }
  render() {
    const { instructions, lock, placeholder } = this.props;
    const headerText = instructions || null;
    const header = headerText && <p>{headerText}</p>;
    return (
      <div>
        {header}
        <EmailInput value={c.email(lock)}
          isValid={!c.isFieldVisiblyInvalid(lock, "email")}
          onChange={::this.handleChange}
          avatar={l.ui.avatar(lock)}
          placeholder={placeholder}
          disabled={l.submitting(lock)}
        />
      </div>
    );
  }
}
EmailPane.propTypes = {
  instructions: React.PropTypes.element,
  lock: React.PropTypes.object.isRequired,
  placeholder: React.PropTypes.string.isRequired
};
|
Add instructions prop to EmailPane
|
Add instructions prop to EmailPane
|
JSX
|
mit
|
mike-casas/lock,mike-casas/lock,mike-casas/lock
|
jsx
|
## Code Before:
import React from 'react';
import EmailInput from '../../ui/input/email_input';
import * as c from '../index';
import { swap, updateEntity } from '../../store/index';
import * as l from '../../core/index';
import { setEmail } from '../email';
import { debouncedRequestAvatar, requestAvatar } from '../../avatar';
export default class EmailPane extends React.Component {
  componentDidMount() {
    const { lock } = this.props;
    if (l.ui.avatar(lock) && c.email(lock)) {
      requestAvatar(l.id(lock), c.email(lock));
    }
  }
  handleChange(e) {
    const { lock } = this.props;
    if (l.ui.avatar(lock)) {
      debouncedRequestAvatar(l.id(lock), e.target.value);
    }
    swap(updateEntity, "lock", l.id(lock), setEmail, e.target.value);
  }
  render() {
    const { lock, placeholder } = this.props;
    return (
      <EmailInput value={c.email(lock)}
        isValid={!c.isFieldVisiblyInvalid(lock, "email")}
        onChange={::this.handleChange}
        avatar={l.ui.avatar(lock)}
        placeholder={placeholder}
        disabled={l.submitting(lock)}
      />
    );
  }
}
EmailPane.propTypes = {
  lock: React.PropTypes.object.isRequired,
  placeholder: React.PropTypes.string.isRequired
};
## Instruction:
Add instructions prop to EmailPane
## Code After:
import React from 'react';
import EmailInput from '../../ui/input/email_input';
import * as c from '../index';
import { swap, updateEntity } from '../../store/index';
import * as l from '../../core/index';
import { setEmail } from '../email';
import { debouncedRequestAvatar, requestAvatar } from '../../avatar';
export default class EmailPane extends React.Component {
  componentDidMount() {
    const { lock } = this.props;
    if (l.ui.avatar(lock) && c.email(lock)) {
      requestAvatar(l.id(lock), c.email(lock));
    }
  }
  handleChange(e) {
    const { lock } = this.props;
    if (l.ui.avatar(lock)) {
      debouncedRequestAvatar(l.id(lock), e.target.value);
    }
    swap(updateEntity, "lock", l.id(lock), setEmail, e.target.value);
  }
  render() {
    const { instructions, lock, placeholder } = this.props;
    const headerText = instructions || null;
    const header = headerText && <p>{headerText}</p>;
    return (
      <div>
        {header}
        <EmailInput value={c.email(lock)}
          isValid={!c.isFieldVisiblyInvalid(lock, "email")}
          onChange={::this.handleChange}
          avatar={l.ui.avatar(lock)}
          placeholder={placeholder}
          disabled={l.submitting(lock)}
        />
      </div>
    );
  }
}
EmailPane.propTypes = {
  instructions: React.PropTypes.element,
  lock: React.PropTypes.object.isRequired,
  placeholder: React.PropTypes.string.isRequired
};
|
c2651fac1891a4aa120c00f99525f254363b0070
|
Gruntfile.js
|
Gruntfile.js
|
/*
 * Copyright 2013, All Rights Reserved.
 *
 * Code licensed under the BSD License:
 * https://github.com/node-gh/gh/blob/master/LICENSE.md
 *
 * @author Zeno Rocha <[email protected]>
 */
module.exports = function(grunt) {
    grunt.initConfig({
        jsbeautifier: {
            files: ['bin/*.js', 'lib/**/*.js', 'test/**/*.js', '*.js'],
            options: {
                config: '.jsbeautifyrc'
            }
        }
    });
    grunt.loadNpmTasks('grunt-jsbeautifier');
    grunt.registerTask('format', ['jsbeautifier']);
};
|
/*
 * Copyright 2013, All Rights Reserved.
 *
 * Code licensed under the BSD License:
 * https://github.com/node-gh/gh/blob/master/LICENSE.md
 *
 * @author Zeno Rocha <[email protected]>
 * @author Eduardo Lundgren <[email protected]>
 */
module.exports = function(grunt) {
    grunt.loadNpmTasks('grunt-jsbeautifier');
    grunt.initConfig({
        jsbeautifier: {
            files: ['bin/*.js', 'lib/**/*.js', 'test/**/*.js', '*.js'],
            options: {
                config: '.jsbeautifyrc'
            }
        }
    });
    grunt.registerTask('format', ['jsbeautifier']);
};
|
Move task dependencies to the top
|
Move task dependencies to the top
|
JavaScript
|
bsd-3-clause
|
TomzxForks/gh,henvic/gh,dustinryerson/gh,tomzx/gh,TomzxForks/gh,tomzx/gh,modulexcite/gh,oouyang/gh,oouyang/gh,dustinryerson/gh,henvic/gh,modulexcite/gh
|
javascript
|
## Code Before:
/*
 * Copyright 2013, All Rights Reserved.
 *
 * Code licensed under the BSD License:
 * https://github.com/node-gh/gh/blob/master/LICENSE.md
 *
 * @author Zeno Rocha <[email protected]>
 */
module.exports = function(grunt) {
    grunt.initConfig({
        jsbeautifier: {
            files: ['bin/*.js', 'lib/**/*.js', 'test/**/*.js', '*.js'],
            options: {
                config: '.jsbeautifyrc'
            }
        }
    });
    grunt.loadNpmTasks('grunt-jsbeautifier');
    grunt.registerTask('format', ['jsbeautifier']);
};
## Instruction:
Move task dependencies to the top
## Code After:
/*
 * Copyright 2013, All Rights Reserved.
 *
 * Code licensed under the BSD License:
 * https://github.com/node-gh/gh/blob/master/LICENSE.md
 *
 * @author Zeno Rocha <[email protected]>
 * @author Eduardo Lundgren <[email protected]>
 */
module.exports = function(grunt) {
    grunt.loadNpmTasks('grunt-jsbeautifier');
    grunt.initConfig({
        jsbeautifier: {
            files: ['bin/*.js', 'lib/**/*.js', 'test/**/*.js', '*.js'],
            options: {
                config: '.jsbeautifyrc'
            }
        }
    });
    grunt.registerTask('format', ['jsbeautifier']);
};
|
44f29fca2d261d52f32a0a8d29ab819e6f0be569
|
lib/http/controllers/api/set_domain.js
|
lib/http/controllers/api/set_domain.js
|
var gateway = require(__dirname+'/../../../../');
module.exports = function(req, res){
  gateway.config.set('DOMAIN', req.body.domain);
  gateway.config.save(function(){
    res.send({ 'DOMAIN': gateway.config.get('DOMAIN') });
  });
};
|
var gateway = require(__dirname+'/../../../../');
/*
 * @requires Config
 * @function setDomain
 * @description Set the domain via http for use in the
 * ripple.txt file and also as the email address for
 * admin basic authentication.
 * @param {String} domain
 */
module.exports = function(req, res){
  gateway.config.set('DOMAIN', req.body.domain);
  gateway.config.save(function(){
    res.send({ 'DOMAIN': gateway.config.get('DOMAIN') });
  });
};
|
Add jsdoc to getDomain http call.
|
[DOC] Add jsdoc to getDomain http call.
|
JavaScript
|
isc
|
xdv/gatewayd,xdv/gatewayd,zealord/gatewayd,crazyquark/gatewayd,zealord/gatewayd,whotooktwarden/gatewayd,whotooktwarden/gatewayd,Parkjeahwan/awegeeks,crazyquark/gatewayd,Parkjeahwan/awegeeks
|
javascript
|
## Code Before:
var gateway = require(__dirname+'/../../../../');
module.exports = function(req, res){
  gateway.config.set('DOMAIN', req.body.domain);
  gateway.config.save(function(){
    res.send({ 'DOMAIN': gateway.config.get('DOMAIN') });
  });
};
## Instruction:
[DOC] Add jsdoc to getDomain http call.
## Code After:
var gateway = require(__dirname+'/../../../../');
/*
 * @requires Config
 * @function setDomain
 * @description Set the domain via http for use in the
 * ripple.txt file and also as the email address for
 * admin basic authentication.
 * @param {String} domain
 */
module.exports = function(req, res){
  gateway.config.set('DOMAIN', req.body.domain);
  gateway.config.save(function(){
    res.send({ 'DOMAIN': gateway.config.get('DOMAIN') });
  });
};
|
b6afb76d6f99c9b3d62ef0059ea26c4ae4c330d7
|
wix/pluginlist.txt
|
wix/pluginlist.txt
|
github.com/mackerelio/go-check-plugins/check-log
github.com/mackerelio/go-check-plugins/check-procs
github.com/mackerelio/go-check-plugins/check-ntservice
github.com/mackerelio/go-check-plugins/wrrrrrong
|
github.com/mackerelio/go-check-plugins/check-log
github.com/mackerelio/go-check-plugins/check-procs
github.com/mackerelio/go-check-plugins/check-ntservice
|
Revert "[Revert later] Demonstrate test failure"
|
Revert "[Revert later] Demonstrate test failure"
This reverts commit 4904d6904cd4ec0030414559d71a169b57244acf.
|
Text
|
apache-2.0
|
mackerelio/mackerel-agent,mackerelio/mackerel-agent,mackerelio/mackerel-agent
|
text
|
## Code Before:
github.com/mackerelio/go-check-plugins/check-log
github.com/mackerelio/go-check-plugins/check-procs
github.com/mackerelio/go-check-plugins/check-ntservice
github.com/mackerelio/go-check-plugins/wrrrrrong
## Instruction:
Revert "[Revert later] Demonstrate test failure"
This reverts commit 4904d6904cd4ec0030414559d71a169b57244acf.
## Code After:
github.com/mackerelio/go-check-plugins/check-log
github.com/mackerelio/go-check-plugins/check-procs
github.com/mackerelio/go-check-plugins/check-ntservice
|
4508c9e566699b80aab9fb46da574a31edc3b36f
|
src/components/utils/utils.js
|
src/components/utils/utils.js
|
(function () {
  'use strict';
  var mod = angular.module('components.utils', []);
  mod.factory('Utils', ['$window', function Utils($window) {
    function stringify(obj, prefix) {
      var str = [];
      for(var p in obj) {
        var k = prefix ? prefix + '[' + p + ']' : p, v = obj[p];
        str.push(typeof v == 'object' ?
          stringify(v, k) :
          encodeURIComponent(k) + '=' + encodeURIComponent(v));
      }
      return str.join('&');
    }
    function objectToFormData(data) {
      var formData = new $window.FormData();
      for(var i in data) {
        if(_.isObject(data[i])) {
          for(var j in data[i]) {
            formData.append(i + '[' + j + ']', data[i][j]);
          }
        } else if(_.isArray(data[i])) {
          for(var k in data[i]) {
            formData.append(i + '[]', data[i][k]);
          }
        } else {
          formData.append(i, data[i]);
        }
      }
      return formData;
    }
    return {
      stringify: stringify,
      objectToFormData: objectToFormData
    };
  }]);
}());
|
(function () {
  'use strict';
  var mod = angular.module('components.utils', []);
  mod.factory('Utils', ['$window', function Utils($window) {
    function stringify(obj, prefix) {
      var str = [];
      for(var p in obj) {
        var k = prefix ? prefix + '[' + p + ']' : p, v = obj[p];
        str.push(typeof v == 'object' ?
          stringify(v, k) :
          encodeURIComponent(k) + '=' + encodeURIComponent(v));
      }
      return str.join('&');
    }
    function objectToFormData(data) {
      var formData = new $window.FormData();
      for(var i in data) {
        if(_.isArray(data[i])) {
          for(var k in data[i]) {
            formData.append(i + '[]', data[i][k]);
          }
        } else if(_.isObject(data[i])) {
          for(var j in data[i]) {
            formData.append(i + '[' + j + ']', data[i][j]);
          }
        } else {
          formData.append(i, data[i]);
        }
      }
      return formData;
    }
    return {
      stringify: stringify,
      objectToFormData: objectToFormData
    };
  }]);
}());
|
Fix checking type of data (array is recognized as an object)
|
Fix checking type of data (array is recognized as an object)
|
JavaScript
|
apache-2.0
|
yrezgui/buffer-firefoxos
|
javascript
|
## Code Before:
(function () {
  'use strict';
  var mod = angular.module('components.utils', []);
  mod.factory('Utils', ['$window', function Utils($window) {
    function stringify(obj, prefix) {
      var str = [];
      for(var p in obj) {
        var k = prefix ? prefix + '[' + p + ']' : p, v = obj[p];
        str.push(typeof v == 'object' ?
          stringify(v, k) :
          encodeURIComponent(k) + '=' + encodeURIComponent(v));
      }
      return str.join('&');
    }
    function objectToFormData(data) {
      var formData = new $window.FormData();
      for(var i in data) {
        if(_.isObject(data[i])) {
          for(var j in data[i]) {
            formData.append(i + '[' + j + ']', data[i][j]);
          }
        } else if(_.isArray(data[i])) {
          for(var k in data[i]) {
            formData.append(i + '[]', data[i][k]);
          }
        } else {
          formData.append(i, data[i]);
        }
      }
      return formData;
    }
    return {
      stringify: stringify,
      objectToFormData: objectToFormData
    };
  }]);
}());
## Instruction:
Fix checking type of data (array is recognized as an object)
## Code After:
(function () {
  'use strict';
  var mod = angular.module('components.utils', []);
  mod.factory('Utils', ['$window', function Utils($window) {
    function stringify(obj, prefix) {
      var str = [];
      for(var p in obj) {
        var k = prefix ? prefix + '[' + p + ']' : p, v = obj[p];
        str.push(typeof v == 'object' ?
          stringify(v, k) :
          encodeURIComponent(k) + '=' + encodeURIComponent(v));
      }
      return str.join('&');
    }
    function objectToFormData(data) {
      var formData = new $window.FormData();
      for(var i in data) {
        if(_.isArray(data[i])) {
          for(var k in data[i]) {
            formData.append(i + '[]', data[i][k]);
          }
        } else if(_.isObject(data[i])) {
          for(var j in data[i]) {
            formData.append(i + '[' + j + ']', data[i][j]);
          }
        } else {
          formData.append(i, data[i]);
        }
      }
      return formData;
    }
    return {
      stringify: stringify,
      objectToFormData: objectToFormData
    };
  }]);
}());
|
ebedde5912bb7ae51a9e250cc9419bdb57531e49
|
apps/home/stylesheets/index.styl
|
apps/home/stylesheets/index.styl
|
.Home
  img
    max-width 100%
    vertical-align bottom
@require './example'
@require './explore'
@require './hero'
@require './more'
@require './quotes'
@require './next'
@require './section'
@require './split'
@require './to_fold'
@require './responsive'
|
.Home
  img
    max-width 100%
    vertical-align bottom
@require './explore'
@require './hero'
@require './more'
@require './quotes'
@require './section'
@require './split'
@require './to_fold'
@require './responsive'
|
Remove references to old styles
|
Remove references to old styles
|
Stylus
|
mit
|
aredotna/ervell,aredotna/ervell,aredotna/ervell,aredotna/ervell,aredotna/ervell
|
stylus
|
## Code Before:
.Home
  img
    max-width 100%
    vertical-align bottom
@require './example'
@require './explore'
@require './hero'
@require './more'
@require './quotes'
@require './next'
@require './section'
@require './split'
@require './to_fold'
@require './responsive'
## Instruction:
Remove references to old styles
## Code After:
.Home
  img
    max-width 100%
    vertical-align bottom
@require './explore'
@require './hero'
@require './more'
@require './quotes'
@require './section'
@require './split'
@require './to_fold'
@require './responsive'
|
4eb248c2c301e5b0f4f7a2dc8eb14f828230332c
|
Build/Grunt-Options/shell.js
|
Build/Grunt-Options/shell.js
|
/**
 * Grunt-Shell
 * @description Run shell commands.
 * @docs https://github.com/sindresorhus/grunt-shell
 */
var config = require("../Config");
module.exports = {
    "deleteGitHooks" : {
        "command" : "rm -rf .git/hooks/",
        "options" : {
            "stdout" : true,
            "stderr" : true,
            "failOnError" : true
        },
    },
    "hookUpGit" : {
        "command" : "mkdir .git/hooks/ && cp Build/Git-Hooks/pre-commit .git/hooks/",
        "options" : {
            "stdout" : true,
            "stderr" : true,
            "failOnError" : true
        },
    }
};
|
/**
 * Grunt-Shell
 * @description Run shell commands.
 * @docs https://github.com/sindresorhus/grunt-shell
 */
var os = require('os'),
    config = require("../Config"),
    isWin = /^win/.test(os.platform());
module.exports = {
    "deleteGitHooks" : {
        "command" : isWin ? "rmdir .git\\hooks\\ /s /q" : "rm -rf .git/hooks/",
        "options" : {
            "stdout" : true,
            "stderr" : true,
            "failOnError" : true
        }
    },
    "hookUpGit" : {
        "command" : "mkdir .git/hooks/ && cp Build/Git-Hooks/pre-commit .git/hooks/",
        "options" : {
            "stdout" : true,
            "stderr" : true,
            "failOnError" : true
        }
    }
};
|
Implement a hostname check to prevent errors while running 'grunt init' on windows based machines
|
[TASK] Implement a hostname check to prevent errors while running 'grunt init' on windows based machines
|
JavaScript
|
mit
|
t3b/t3b_template,t3b/t3b_template,t3b/t3b_template,t3b/t3b_template,t3b/t3b_template
|
javascript
|
## Code Before:
/**
 * Grunt-Shell
 * @description Run shell commands.
 * @docs https://github.com/sindresorhus/grunt-shell
 */
var config = require("../Config");
module.exports = {
    "deleteGitHooks" : {
        "command" : "rm -rf .git/hooks/",
        "options" : {
            "stdout" : true,
            "stderr" : true,
            "failOnError" : true
        },
    },
    "hookUpGit" : {
        "command" : "mkdir .git/hooks/ && cp Build/Git-Hooks/pre-commit .git/hooks/",
        "options" : {
            "stdout" : true,
            "stderr" : true,
            "failOnError" : true
        },
    }
};
## Instruction:
[TASK] Implement a hostname check to prevent errors while running 'grunt init' on windows based machines
## Code After:
/**
 * Grunt-Shell
 * @description Run shell commands.
 * @docs https://github.com/sindresorhus/grunt-shell
 */
var os = require('os'),
    config = require("../Config"),
    isWin = /^win/.test(os.platform());
module.exports = {
    "deleteGitHooks" : {
        "command" : isWin ? "rmdir .git\\hooks\\ /s /q" : "rm -rf .git/hooks/",
        "options" : {
            "stdout" : true,
            "stderr" : true,
            "failOnError" : true
        }
    },
    "hookUpGit" : {
        "command" : "mkdir .git/hooks/ && cp Build/Git-Hooks/pre-commit .git/hooks/",
        "options" : {
            "stdout" : true,
            "stderr" : true,
            "failOnError" : true
        }
    }
};
|
d022270b3a486cd1943129e8a10a30a25f6bb5d6
|
examples/sample_doc.tex
|
examples/sample_doc.tex
|
%!TEX TS-program = xelatex
%!TEX encoding = UTF-8 Unicode
\documentclass{article}
\usepackage{fontspec}
\usepackage{gettext}
\newcommand{\sandwitches}[1]{
  \ngettext{There is one sandwich on the table}{There are #1 sandwiches on the table}{#1}
}
\newcommand{\mercuries}[1]{
  \npgettext{first mercury is a person, second is substance, third is planet}{One mercury brings mercury onto Mercury}{#1 mercuries bring mercury onto Mercury}{#1}
}
\begin{document}
{\large get\TeX t}
\gettext{Hello world!}
\sandwitches{1}
\sandwitches{2}
\sandwitches{3}
\sandwitches{4}
\sandwitches{5}
\mercuries{1}
\mercuries{2}
\mercuries{3}
\mercuries{4}
\mercuries{5}
\end{document}
|
%!TEX TS-program = xelatex
%!TEX encoding = UTF-8 Unicode
\documentclass{article}
\usepackage{fontspec}
\usepackage{gettext}
\newcommand{\sandwitches}[1]{
  \ngettext{There is one sandwich on the table}{There are #1 sandwiches on the table}{#1}
}
\newcommand{\mercuries}[1]{
  \npgettext{first mercury is a person, second is substance, third is planet}{One mercury brings mercury onto Mercury}{#1 mercuries bring mercury onto Mercury}{#1}
}
\begin{document}
{\large get\TeX t}
\gettext{Hello world!}
\today
\formatdate{21}{12}{2012}
\sandwitches{1}
\sandwitches{2}
\sandwitches{3}
\sandwitches{4}
\sandwitches{5}
\mercuries{1}
\mercuries{2}
\mercuries{3}
\mercuries{4}
\mercuries{5}
\end{document}
|
Add \today and \formatdate to example document
|
Add \today and \formatdate to example document
|
TeX
|
bsd-2-clause
|
mplucinski/tex-gettext
|
tex
|
## Code Before:
%!TEX TS-program = xelatex
%!TEX encoding = UTF-8 Unicode
\documentclass{article}
\usepackage{fontspec}
\usepackage{gettext}
\newcommand{\sandwitches}[1]{
  \ngettext{There is one sandwich on the table}{There are #1 sandwiches on the table}{#1}
}
\newcommand{\mercuries}[1]{
  \npgettext{first mercury is a person, second is substance, third is planet}{One mercury brings mercury onto Mercury}{#1 mercuries bring mercury onto Mercury}{#1}
}
\begin{document}
{\large get\TeX t}
\gettext{Hello world!}
\sandwitches{1}
\sandwitches{2}
\sandwitches{3}
\sandwitches{4}
\sandwitches{5}
\mercuries{1}
\mercuries{2}
\mercuries{3}
\mercuries{4}
\mercuries{5}
\end{document}
## Instruction:
Add \today and \formatdate to example document
## Code After:
%!TEX TS-program = xelatex
%!TEX encoding = UTF-8 Unicode
\documentclass{article}
\usepackage{fontspec}
\usepackage{gettext}
\newcommand{\sandwitches}[1]{
  \ngettext{There is one sandwich on the table}{There are #1 sandwiches on the table}{#1}
}
\newcommand{\mercuries}[1]{
  \npgettext{first mercury is a person, second is substance, third is planet}{One mercury brings mercury onto Mercury}{#1 mercuries bring mercury onto Mercury}{#1}
}
\begin{document}
{\large get\TeX t}
\gettext{Hello world!}
\today
\formatdate{21}{12}{2012}
\sandwitches{1}
\sandwitches{2}
\sandwitches{3}
\sandwitches{4}
\sandwitches{5}
\mercuries{1}
\mercuries{2}
\mercuries{3}
\mercuries{4}
\mercuries{5}
\end{document}
|
683e4d747a34df88c12bfea61c5a56d9794e8ccb
|
pkgs/development/libraries/libdwarf/default.nix
|
pkgs/development/libraries/libdwarf/default.nix
|
{ stdenv, fetchurl, libelf }:
stdenv.mkDerivation rec {
  name = "libdwarf-20140805";
  src = fetchurl {
    url = "http://www.prevanders.net/${name}.tar.gz";
    sha256 = "1z5xz0w1yvk8swcqzx4dvnig94j51pns39jmipv5rl20qahik0nl";
  };
  configureFlags = "--enable-shared";
  preBuild = ''
    export LD_LIBRARY_PATH=`pwd`/libdwarf
  '';
  buildInputs = [ libelf ];
  installPhase = ''
    mkdir -p $out/lib $out/include $out/bin
    cp ./dwarfdump2/dwarfdump $out/bin/dwarfdump2
    cp ./dwarfdump/dwarfdump $out/bin/dwarfdump
    cp libdwarf/libdwarf.so $out/lib
    cp libdwarf/libdwarf.h libdwarf/dwarf.h $out/include
  '';
  meta = {
    homepage = http://www.prevanders.net/dwarf.html;
    license = stdenv.lib.licenses.gpl2;
  };
}
|
{ stdenv, fetchurl, libelf }:
stdenv.mkDerivation rec {
  name = "libdwarf-20121130";
  src = fetchurl {
    url = http://reality.sgiweb.org/davea/libdwarf-20121130.tar.gz;
    sha256 = "1nfdfn5xf3n485pvpb853awyxxnvrg207i0wmrr7bhk8fcxdxbn0";
  };
  configureFlags = " --enable-shared --disable-nonshared";
  preConfigure = ''
    cd libdwarf
  '';
  buildInputs = [ libelf ];
  installPhase = ''
    mkdir -p $out/lib $out/include
    cp libdwarf.so $out/lib
    cp libdwarf.h dwarf.h $out/include
  '';
  meta = {
    homepage = http://reality.sgiweb.org/davea/dwarf.html;
  };
}
|
Revert "libdwarf: upgrade to the latest version"
|
Revert "libdwarf: upgrade to the latest version"
|
Nix
|
mit
|
NixOS/nixpkgs,triton/triton,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,triton/triton,SymbiFlow/nixpkgs,triton/triton,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,triton/triton,NixOS/nixpkgs,triton/triton,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,triton/triton,SymbiFlow/nixpkgs,triton/triton,NixOS/nixpkgs,SymbiFlow/nixpkgs,triton/triton,NixOS/nixpkgs
|
nix
|
## Code Before:
{ stdenv, fetchurl, libelf }:
stdenv.mkDerivation rec {
  name = "libdwarf-20140805";
  src = fetchurl {
    url = "http://www.prevanders.net/${name}.tar.gz";
    sha256 = "1z5xz0w1yvk8swcqzx4dvnig94j51pns39jmipv5rl20qahik0nl";
  };
  configureFlags = "--enable-shared";
  preBuild = ''
    export LD_LIBRARY_PATH=`pwd`/libdwarf
  '';
  buildInputs = [ libelf ];
  installPhase = ''
    mkdir -p $out/lib $out/include $out/bin
    cp ./dwarfdump2/dwarfdump $out/bin/dwarfdump2
    cp ./dwarfdump/dwarfdump $out/bin/dwarfdump
    cp libdwarf/libdwarf.so $out/lib
    cp libdwarf/libdwarf.h libdwarf/dwarf.h $out/include
  '';
  meta = {
    homepage = http://www.prevanders.net/dwarf.html;
    license = stdenv.lib.licenses.gpl2;
  };
}
## Instruction:
Revert "libdwarf: upgrade to the latest version"
## Code After:
{ stdenv, fetchurl, libelf }:
stdenv.mkDerivation rec {
  name = "libdwarf-20121130";
  src = fetchurl {
    url = http://reality.sgiweb.org/davea/libdwarf-20121130.tar.gz;
    sha256 = "1nfdfn5xf3n485pvpb853awyxxnvrg207i0wmrr7bhk8fcxdxbn0";
  };
  configureFlags = " --enable-shared --disable-nonshared";
  preConfigure = ''
    cd libdwarf
  '';
  buildInputs = [ libelf ];
  installPhase = ''
    mkdir -p $out/lib $out/include
    cp libdwarf.so $out/lib
    cp libdwarf.h dwarf.h $out/include
  '';
  meta = {
    homepage = http://reality.sgiweb.org/davea/dwarf.html;
  };
}
|
2d188cc6efa50c3662295d532abc16b0e61839a3
|
playbook.yml
|
playbook.yml
|
---
- hosts: all
roles:
- bootstrap
- wiki
- self-upgrade
- knime
- modifiedtanimoto
- gpcrdb
- kripo
- sstea
- chemdb4vs
- klifs
tasks:
- name: Vagrant owns Knime
file: path=/opt/knime state=directory recurse=yes owner=vagrant group=vagrant
- name: Vagrant owns Knime workspace examples
file: path='{{ knime_examples_root }}' state=directory recurse=yes owner=vagrant group=vagrant
|
---
- hosts: all
roles:
- bootstrap
- wiki
- self-upgrade
- knime
- modifiedtanimoto
- gpcrdb
- kripo
- sstea
- chemdb4vs
- klifs
tasks:
- name: Vagrant owns Knime
file: path=/opt/knime state=directory recurse=yes owner=vagrant group=vagrant
- name: Vagrant owns Knime workspace examples
file: path='{{ knime_examples_root }}' state=directory recurse=yes owner=vagrant group=vagrant
- name: Vagrant owns Knime workspace workflows
file: path='{{ knime_workflows_root }}' state=directory recurse=yes owner=vagrant group=vagrant
|
Allow writes to Knime 3D-e-Chem workflows.
|
Allow writes to Knime 3D-e-Chem workflows.
|
YAML
|
apache-2.0
|
3D-e-Chem/3D-e-Chem-VM
|
yaml
|
## Code Before:
---
- hosts: all
roles:
- bootstrap
- wiki
- self-upgrade
- knime
- modifiedtanimoto
- gpcrdb
- kripo
- sstea
- chemdb4vs
- klifs
tasks:
- name: Vagrant owns Knime
file: path=/opt/knime state=directory recurse=yes owner=vagrant group=vagrant
- name: Vagrant owns Knime workspace examples
file: path='{{ knime_examples_root }}' state=directory recurse=yes owner=vagrant group=vagrant
## Instruction:
Allow writes to Knime 3D-e-Chem workflows.
## Code After:
---
- hosts: all
roles:
- bootstrap
- wiki
- self-upgrade
- knime
- modifiedtanimoto
- gpcrdb
- kripo
- sstea
- chemdb4vs
- klifs
tasks:
- name: Vagrant owns Knime
file: path=/opt/knime state=directory recurse=yes owner=vagrant group=vagrant
- name: Vagrant owns Knime workspace examples
file: path='{{ knime_examples_root }}' state=directory recurse=yes owner=vagrant group=vagrant
- name: Vagrant owns Knime workspace workflows
file: path='{{ knime_workflows_root }}' state=directory recurse=yes owner=vagrant group=vagrant
|
0bff3a0056d1ac56c3de47b743bbd9c394921fe4
|
app/api/v0.rb
|
app/api/v0.rb
|
require_relative 'v0/api'
require 'grape-swagger'
class ApiV0 < Grape::API
swagger_path = 'swagger'
get do
# Redirect base url to Swagger docs
redirect "http://petstore.swagger.io/?url=#{request.scheme}://#{request.host_with_port}/#{version}/#{swagger_path}"
end
content_type :json, 'application/json'
default_format :json
formatter :json, PrettyJSON
helpers V0Helpers
include Grape::Kaminari
paginate per_page: 20, max_per_page: 30, offset: false
# No need to implement while we're just working on Toronto.
# namespace :divisions
# namespace :jurisdictions
mount App::API::LegislativeSessions
mount App::API::Organizations
mount App::API::Posts
mount App::API::Events
mount App::API::AgendaItems
mount App::API::Memberships
mount App::API::People
mount App::API::Bills
mount App::API::Votes
mount App::API::Locations
add_swagger_documentation \
mount_path: swagger_path,
hide_documentation_path: true,
hide_format: true,
base_path: '/v0'
end
|
require_relative 'v0/api'
require 'grape-swagger'
class ApiV0 < Grape::API
swagger_path = 'swagger'
get do
# Redirect base url to Swagger docs
redirect "http://petstore.swagger.io/?url=#{request.scheme}://#{request.host_with_port}/#{version}/#{swagger_path}"
end
content_type :json, 'application/json'
default_format :json
formatter :json, PrettyJSON
helpers V0Helpers
include Grape::Kaminari
paginate per_page: 20, max_per_page: 30, offset: false
# No need to implement while we're just working on Toronto.
# namespace :divisions
# namespace :jurisdictions
mount App::API::LegislativeSessions
mount App::API::Organizations
mount App::API::Posts
mount App::API::Events
mount App::API::AgendaItems
mount App::API::Memberships
mount App::API::People
mount App::API::Bills
mount App::API::Votes
mount App::API::Locations
add_swagger_documentation \
mount_path: swagger_path,
hide_documentation_path: true,
hide_format: true,
api_version: 'v0',
base_path: 'v0'
end
|
Fix base url and api version in swagger.
|
Fix base url and api version in swagger.
|
Ruby
|
mit
|
patcon/tocityhall-api,CivicTechTO/tocityhall-api
|
ruby
|
## Code Before:
require_relative 'v0/api'
require 'grape-swagger'
class ApiV0 < Grape::API
swagger_path = 'swagger'
get do
# Redirect base url to Swagger docs
redirect "http://petstore.swagger.io/?url=#{request.scheme}://#{request.host_with_port}/#{version}/#{swagger_path}"
end
content_type :json, 'application/json'
default_format :json
formatter :json, PrettyJSON
helpers V0Helpers
include Grape::Kaminari
paginate per_page: 20, max_per_page: 30, offset: false
# No need to implement while we're just working on Toronto.
# namespace :divisions
# namespace :jurisdictions
mount App::API::LegislativeSessions
mount App::API::Organizations
mount App::API::Posts
mount App::API::Events
mount App::API::AgendaItems
mount App::API::Memberships
mount App::API::People
mount App::API::Bills
mount App::API::Votes
mount App::API::Locations
add_swagger_documentation \
mount_path: swagger_path,
hide_documentation_path: true,
hide_format: true,
base_path: '/v0'
end
## Instruction:
Fix base url and api version in swagger.
## Code After:
require_relative 'v0/api'
require 'grape-swagger'
class ApiV0 < Grape::API
swagger_path = 'swagger'
get do
# Redirect base url to Swagger docs
redirect "http://petstore.swagger.io/?url=#{request.scheme}://#{request.host_with_port}/#{version}/#{swagger_path}"
end
content_type :json, 'application/json'
default_format :json
formatter :json, PrettyJSON
helpers V0Helpers
include Grape::Kaminari
paginate per_page: 20, max_per_page: 30, offset: false
# No need to implement while we're just working on Toronto.
# namespace :divisions
# namespace :jurisdictions
mount App::API::LegislativeSessions
mount App::API::Organizations
mount App::API::Posts
mount App::API::Events
mount App::API::AgendaItems
mount App::API::Memberships
mount App::API::People
mount App::API::Bills
mount App::API::Votes
mount App::API::Locations
add_swagger_documentation \
mount_path: swagger_path,
hide_documentation_path: true,
hide_format: true,
api_version: 'v0',
base_path: 'v0'
end
|
0947643977b989ca924bcf932a5153472e362108
|
plata/utils.py
|
plata/utils.py
|
from django.utils import simplejson
class JSONFieldDescriptor(object):
def __init__(self, field):
self.field = field
def __get__(self, obj, objtype):
cache_field = '_cached_jsonfield_%s' % self.field
if not hasattr(obj, cache_field):
try:
setattr(obj, cache_field, simplejson.loads(getattr(obj, self.field)))
except (TypeError, ValueError):
setattr(obj, cache_field, {})
return getattr(obj, cache_field)
def __set__(self, obj, value):
setattr(obj, '_cached_jsonfield_%s' % self.field, value)
setattr(obj, self.field, simplejson.dumps(value))
|
from django.core.serializers.json import DjangoJSONEncoder
from django.utils import simplejson
class JSONFieldDescriptor(object):
def __init__(self, field):
self.field = field
def __get__(self, obj, objtype):
cache_field = '_cached_jsonfield_%s' % self.field
if not hasattr(obj, cache_field):
try:
setattr(obj, cache_field, simplejson.loads(getattr(obj, self.field)))
except (TypeError, ValueError):
setattr(obj, cache_field, {})
return getattr(obj, cache_field)
def __set__(self, obj, value):
setattr(obj, '_cached_jsonfield_%s' % self.field, value)
setattr(obj, self.field, simplejson.dumps(value, cls=DjangoJSONEncoder))
|
Use DjangoJSONEncoder, it knows how to handle dates and decimals
|
JSONFieldDescriptor: Use DjangoJSONEncoder, it knows how to handle dates and decimals
|
Python
|
bsd-3-clause
|
armicron/plata,stefanklug/plata,allink/plata,armicron/plata,armicron/plata
|
python
|
## Code Before:
from django.utils import simplejson
class JSONFieldDescriptor(object):
def __init__(self, field):
self.field = field
def __get__(self, obj, objtype):
cache_field = '_cached_jsonfield_%s' % self.field
if not hasattr(obj, cache_field):
try:
setattr(obj, cache_field, simplejson.loads(getattr(obj, self.field)))
except (TypeError, ValueError):
setattr(obj, cache_field, {})
return getattr(obj, cache_field)
def __set__(self, obj, value):
setattr(obj, '_cached_jsonfield_%s' % self.field, value)
setattr(obj, self.field, simplejson.dumps(value))
## Instruction:
JSONFieldDescriptor: Use DjangoJSONEncoder, it knows how to handle dates and decimals
## Code After:
from django.core.serializers.json import DjangoJSONEncoder
from django.utils import simplejson
class JSONFieldDescriptor(object):
def __init__(self, field):
self.field = field
def __get__(self, obj, objtype):
cache_field = '_cached_jsonfield_%s' % self.field
if not hasattr(obj, cache_field):
try:
setattr(obj, cache_field, simplejson.loads(getattr(obj, self.field)))
except (TypeError, ValueError):
setattr(obj, cache_field, {})
return getattr(obj, cache_field)
def __set__(self, obj, value):
setattr(obj, '_cached_jsonfield_%s' % self.field, value)
setattr(obj, self.field, simplejson.dumps(value, cls=DjangoJSONEncoder))
|
3c631da4e25a3e83fe9e9f1d81f1584782b4ea1e
|
platform/editor-ui-api/resources/META-INF/Editor.xml
|
platform/editor-ui-api/resources/META-INF/Editor.xml
|
<idea-plugin>
<extensionPoints>
<extensionPoint name="editorFactoryListener" interface="com.intellij.openapi.editor.event.EditorFactoryListener"/>
<extensionPoint name="syntaxHighlighter"
beanClass="com.intellij.openapi.extensions.KeyedFactoryEPBean"
dynamic="true">
<with attribute="implementationClass" implements="com.intellij.openapi.fileTypes.SyntaxHighlighter"/>
</extensionPoint>
</extensionPoints>
<extensions defaultExtensionNs="com.intellij">
<applicationService serviceImplementation="com.intellij.ide.ui.UISettings" preload="true"/>
<applicationService serviceImplementation="com.intellij.ide.ui.NotRoamableUiSettings"/>
</extensions>
</idea-plugin>
|
<idea-plugin>
<extensionPoints>
<extensionPoint name="editorFactoryListener" interface="com.intellij.openapi.editor.event.EditorFactoryListener" dynamic="true"/>
<extensionPoint name="syntaxHighlighter"
beanClass="com.intellij.openapi.extensions.KeyedFactoryEPBean"
dynamic="true">
<with attribute="implementationClass" implements="com.intellij.openapi.fileTypes.SyntaxHighlighter"/>
</extensionPoint>
</extensionPoints>
<extensions defaultExtensionNs="com.intellij">
<applicationService serviceImplementation="com.intellij.ide.ui.UISettings" preload="true"/>
<applicationService serviceImplementation="com.intellij.ide.ui.NotRoamableUiSettings"/>
</extensions>
</idea-plugin>
|
Mark <editorFactoryListener> EP as dynamic
|
Mark <editorFactoryListener> EP as dynamic
GitOrigin-RevId: fd82f21696a19b9fa08729f6a83fa210d8db75cd
|
XML
|
apache-2.0
|
allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community,allotria/intellij-community
|
xml
|
## Code Before:
<idea-plugin>
<extensionPoints>
<extensionPoint name="editorFactoryListener" interface="com.intellij.openapi.editor.event.EditorFactoryListener"/>
<extensionPoint name="syntaxHighlighter"
beanClass="com.intellij.openapi.extensions.KeyedFactoryEPBean"
dynamic="true">
<with attribute="implementationClass" implements="com.intellij.openapi.fileTypes.SyntaxHighlighter"/>
</extensionPoint>
</extensionPoints>
<extensions defaultExtensionNs="com.intellij">
<applicationService serviceImplementation="com.intellij.ide.ui.UISettings" preload="true"/>
<applicationService serviceImplementation="com.intellij.ide.ui.NotRoamableUiSettings"/>
</extensions>
</idea-plugin>
## Instruction:
Mark <editorFactoryListener> EP as dynamic
GitOrigin-RevId: fd82f21696a19b9fa08729f6a83fa210d8db75cd
## Code After:
<idea-plugin>
<extensionPoints>
<extensionPoint name="editorFactoryListener" interface="com.intellij.openapi.editor.event.EditorFactoryListener" dynamic="true"/>
<extensionPoint name="syntaxHighlighter"
beanClass="com.intellij.openapi.extensions.KeyedFactoryEPBean"
dynamic="true">
<with attribute="implementationClass" implements="com.intellij.openapi.fileTypes.SyntaxHighlighter"/>
</extensionPoint>
</extensionPoints>
<extensions defaultExtensionNs="com.intellij">
<applicationService serviceImplementation="com.intellij.ide.ui.UISettings" preload="true"/>
<applicationService serviceImplementation="com.intellij.ide.ui.NotRoamableUiSettings"/>
</extensions>
</idea-plugin>
|
3948043fe5f8d9ad8a3a9f2d0eb7c67bd8e0186e
|
app/controllers/email_alert_subscriptions_controller.rb
|
app/controllers/email_alert_subscriptions_controller.rb
|
require 'email_alert_signup_api'
require 'gds_api/helpers'
class EmailAlertSubscriptionsController < ApplicationController
include GdsApi::Helpers
protect_from_forgery except: :create
def new
# So using request.env["PATH_INFO"] has a leading slash which would need
# removing before asking the content api for the artefact. I don't like this
# either but I prefer it to string manip.
artefact = content_api.artefact("#{finder_slug}/email-signup")
@signup = SignupPresenter.new(artefact)
end
def create
signup_url = email_alert_signup_api.signup_url
redirect_to signup_url
end
private
def finder_slug
params[:slug]
end
def finder
Finder.get(finder_slug)
end
def email_alert_signup_api
EmailAlertSignupAPI.new(
delivery_api: delivery_api,
alert_identifier: finder_url_for_alert_type,
alert_name: finder.name
)
end
def delivery_api
@delivery_api ||= GdsApi::GovUkDelivery.new(Plek.current.find('govuk-delivery'))
end
def finder_url_for_alert_type
@finder_url_for_alert_type ||= "#{Plek.current.find('finder-frontend')}/#{finder_slug}.atom"
end
end
|
require 'email_alert_signup_api'
require 'gds_api/helpers'
class EmailAlertSubscriptionsController < ApplicationController
include GdsApi::Helpers
protect_from_forgery except: :create
def new
content = content_store.content_item(request.path)
@signup = SignupPresenter.new(content)
end
def create
signup_url = email_alert_signup_api.signup_url
redirect_to signup_url
end
private
def finder_slug
params[:slug]
end
def finder
Finder.get(finder_slug)
end
def email_alert_signup_api
EmailAlertSignupAPI.new(
delivery_api: delivery_api,
alert_identifier: finder_url_for_alert_type,
alert_name: finder.name
)
end
def delivery_api
@delivery_api ||= GdsApi::GovUkDelivery.new(Plek.current.find('govuk-delivery'))
end
def finder_url_for_alert_type
@finder_url_for_alert_type ||= "#{Plek.current.find('finder-frontend')}/#{finder_slug}.atom"
end
end
|
Refactor Finders to pull from content_store
|
Refactor Finders to pull from content_store
This commit contains the work needed to pull Finders from the
content_store and removes the ability for Finders to be retrieved from
the content-api.
Finders and email-alert-subs now pull their content_item and the
relevant tests have been updated to stub the content_store.
|
Ruby
|
mit
|
alphagov/finder-frontend,alphagov/finder-frontend,alphagov/finder-frontend,alphagov/finder-frontend
|
ruby
|
## Code Before:
require 'email_alert_signup_api'
require 'gds_api/helpers'
class EmailAlertSubscriptionsController < ApplicationController
include GdsApi::Helpers
protect_from_forgery except: :create
def new
# So using request.env["PATH_INFO"] has a leading slash which would need
# removing before asking the content api for the artefact. I don't like this
# either but I prefer it to string manip.
artefact = content_api.artefact("#{finder_slug}/email-signup")
@signup = SignupPresenter.new(artefact)
end
def create
signup_url = email_alert_signup_api.signup_url
redirect_to signup_url
end
private
def finder_slug
params[:slug]
end
def finder
Finder.get(finder_slug)
end
def email_alert_signup_api
EmailAlertSignupAPI.new(
delivery_api: delivery_api,
alert_identifier: finder_url_for_alert_type,
alert_name: finder.name
)
end
def delivery_api
@delivery_api ||= GdsApi::GovUkDelivery.new(Plek.current.find('govuk-delivery'))
end
def finder_url_for_alert_type
@finder_url_for_alert_type ||= "#{Plek.current.find('finder-frontend')}/#{finder_slug}.atom"
end
end
## Instruction:
Refactor Finders to pull from content_store
This commit contains the work needed to pull Finders from the
content_store and removes the ability for Finders to be retrieved from
the content-api.
Finders and email-alert-subs now pull their content_item and the
relevant tests have been updated to stub the content_store.
## Code After:
require 'email_alert_signup_api'
require 'gds_api/helpers'
class EmailAlertSubscriptionsController < ApplicationController
include GdsApi::Helpers
protect_from_forgery except: :create
def new
content = content_store.content_item(request.path)
@signup = SignupPresenter.new(content)
end
def create
signup_url = email_alert_signup_api.signup_url
redirect_to signup_url
end
private
def finder_slug
params[:slug]
end
def finder
Finder.get(finder_slug)
end
def email_alert_signup_api
EmailAlertSignupAPI.new(
delivery_api: delivery_api,
alert_identifier: finder_url_for_alert_type,
alert_name: finder.name
)
end
def delivery_api
@delivery_api ||= GdsApi::GovUkDelivery.new(Plek.current.find('govuk-delivery'))
end
def finder_url_for_alert_type
@finder_url_for_alert_type ||= "#{Plek.current.find('finder-frontend')}/#{finder_slug}.atom"
end
end
|
93e2ff0dd32a72efa90222988d4289c70bb55b98
|
c2corg_api/models/common/fields_book.py
|
c2corg_api/models/common/fields_book.py
|
DEFAULT_FIELDS = [
'locales.title',
'locales.summary',
'locales.description',
'locales.lang',
'author',
'editor',
'activities',
'url',
'isbn',
'book_types',
'publication_date',
'langs',
'nb_pages'
]
DEFAULT_REQUIRED = [
'locales',
'locales.title',
'book_types'
]
LISTING_FIELDS = [
'locales',
'locales.title',
'activities',
'author',
'quality',
'book_types'
]
fields_book = {
'fields': DEFAULT_FIELDS,
'required': DEFAULT_REQUIRED,
'listing': LISTING_FIELDS
}
|
DEFAULT_FIELDS = [
'locales.title',
'locales.summary',
'locales.description',
'locales.lang',
'author',
'editor',
'activities',
'url',
'isbn',
'book_types',
'publication_date',
'langs',
'nb_pages'
]
DEFAULT_REQUIRED = [
'locales',
'locales.title',
'book_types'
]
LISTING_FIELDS = [
'locales',
'locales.title',
'locales.summary',
'activities',
'author',
'quality',
'book_types'
]
fields_book = {
'fields': DEFAULT_FIELDS,
'required': DEFAULT_REQUIRED,
'listing': LISTING_FIELDS
}
|
Add summary to book listing
|
Add summary to book listing
|
Python
|
agpl-3.0
|
c2corg/v6_api,c2corg/v6_api,c2corg/v6_api
|
python
|
## Code Before:
DEFAULT_FIELDS = [
'locales.title',
'locales.summary',
'locales.description',
'locales.lang',
'author',
'editor',
'activities',
'url',
'isbn',
'book_types',
'publication_date',
'langs',
'nb_pages'
]
DEFAULT_REQUIRED = [
'locales',
'locales.title',
'book_types'
]
LISTING_FIELDS = [
'locales',
'locales.title',
'activities',
'author',
'quality',
'book_types'
]
fields_book = {
'fields': DEFAULT_FIELDS,
'required': DEFAULT_REQUIRED,
'listing': LISTING_FIELDS
}
## Instruction:
Add summary to book listing
## Code After:
DEFAULT_FIELDS = [
'locales.title',
'locales.summary',
'locales.description',
'locales.lang',
'author',
'editor',
'activities',
'url',
'isbn',
'book_types',
'publication_date',
'langs',
'nb_pages'
]
DEFAULT_REQUIRED = [
'locales',
'locales.title',
'book_types'
]
LISTING_FIELDS = [
'locales',
'locales.title',
'locales.summary',
'activities',
'author',
'quality',
'book_types'
]
fields_book = {
'fields': DEFAULT_FIELDS,
'required': DEFAULT_REQUIRED,
'listing': LISTING_FIELDS
}
|
fc11092a0af459c3207084e070012f2fd2cf5435
|
Request/PostCurlRequest.php
|
Request/PostCurlRequest.php
|
<?php
/**
* @file PostCurlRequest.php
*
* PHP version 5.4+
*
* @author Yancharuk Alexander <alex at itvault dot info>
* @date 2015-01-20 20:16
* @copyright The BSD 3-Clause License
*/
namespace Veles\Request;
/**
* Class PostCurlRequest
* @author Yancharuk Alexander <alex at itvault dot info>
*/
class PostCurlRequest extends CurlRequest
{
/** @var array */
protected $default_options = [
CURLOPT_RETURNTRANSFER => true, // output to string instead stdout
CURLOPT_CONNECTTIMEOUT => 10, // timeout on connect
CURLOPT_TIMEOUT => 10, // timeout on request
CURLOPT_POST => true
];
/**
* Set request data
*
* @param array $data
*
* @return bool
*/
public function setData(array $data)
{
return $this->setOption(CURLOPT_POSTFIELDS, $data);
}
}
|
<?php
/**
* @file PostCurlRequest.php
*
* PHP version 5.4+
*
* @author Yancharuk Alexander <alex at itvault dot info>
* @date 2015-01-20 20:16
* @copyright The BSD 3-Clause License
*/
namespace Veles\Request;
/**
* Class PostCurlRequest
* @author Yancharuk Alexander <alex at itvault dot info>
*/
class PostCurlRequest extends CurlRequest
{
/** @var array */
protected $default_options = [
CURLOPT_RETURNTRANSFER => true, // output to string instead stdout
CURLOPT_CONNECTTIMEOUT => 10, // timeout on connect
CURLOPT_TIMEOUT => 10, // timeout on request
CURLOPT_POST => true
];
/**
* Set request data
*
* @param string|array $data
*
* @return bool
*/
public function setData($data)
{
return $this->setOption(CURLOPT_POSTFIELDS, $data);
}
}
|
Allow string as POSTFIELDS param
|
Allow string as POSTFIELDS param
|
PHP
|
bsd-3-clause
|
nafigator/Veles
|
php
|
## Code Before:
<?php
/**
* @file PostCurlRequest.php
*
* PHP version 5.4+
*
* @author Yancharuk Alexander <alex at itvault dot info>
* @date 2015-01-20 20:16
* @copyright The BSD 3-Clause License
*/
namespace Veles\Request;
/**
* Class PostCurlRequest
* @author Yancharuk Alexander <alex at itvault dot info>
*/
class PostCurlRequest extends CurlRequest
{
/** @var array */
protected $default_options = [
CURLOPT_RETURNTRANSFER => true, // output to string instead stdout
CURLOPT_CONNECTTIMEOUT => 10, // timeout on connect
CURLOPT_TIMEOUT => 10, // timeout on request
CURLOPT_POST => true
];
/**
* Set request data
*
* @param array $data
*
* @return bool
*/
public function setData(array $data)
{
return $this->setOption(CURLOPT_POSTFIELDS, $data);
}
}
## Instruction:
Allow string as POSTFIELDS param
## Code After:
<?php
/**
* @file PostCurlRequest.php
*
* PHP version 5.4+
*
* @author Yancharuk Alexander <alex at itvault dot info>
* @date 2015-01-20 20:16
* @copyright The BSD 3-Clause License
*/
namespace Veles\Request;
/**
* Class PostCurlRequest
* @author Yancharuk Alexander <alex at itvault dot info>
*/
class PostCurlRequest extends CurlRequest
{
/** @var array */
protected $default_options = [
CURLOPT_RETURNTRANSFER => true, // output to string instead stdout
CURLOPT_CONNECTTIMEOUT => 10, // timeout on connect
CURLOPT_TIMEOUT => 10, // timeout on request
CURLOPT_POST => true
];
/**
* Set request data
*
* @param string|array $data
*
* @return bool
*/
public function setData($data)
{
return $this->setOption(CURLOPT_POSTFIELDS, $data);
}
}
|
80e5ef61f3a43a41d18df7540754545fa040d350
|
docs/uploading-images.md
|
docs/uploading-images.md
|
Uploading Images
================
Images should be uploaded as a Zip file of JPG images. You'll need to upload any new images that you want to add before you add any metadata. Depending on how many images you add they can take a while to process (hours initially, to be fully indexed and searchable it could take many more hours or days).
The images shouldn't be smaller than about 300 pixels on any side. Ideally they would be much larger. Having them be at least 1000 pixels on each side would be good to start with. It's important to note that the names of the images should match the file names provided in the metadata file.
If you ever upload an image that was previously uploaded then the old image will be replaced with the new one.
If you ever upload any new images you'll also need to upload a new metadata file in order for the new images to be displayed on the site. (Metadata records that have no images associated with them are never added to the database.)
|
Uploading Images
================
Images should be uploaded as a Zip file of JPG images. You'll need to upload any new images that you want to add before you add any metadata. Depending on how many images you add they can take a while to process (hours initially, to be fully indexed and searchable it could take many more hours or days).
The images shouldn't be smaller than about 300 pixels on any side. Ideally they would be much larger. Having them be at least 1000 pixels on each side would be good to start with. It's important to note that the names of the images should match the file names provided in the metadata.
If you upload an image that was previously uploaded then the old image will be replaced with the new one.
If you ever upload any new images you'll also need to upload a new metadata file in order for the new images to be displayed on the site. (Metadata records that have no images associated with them are never added to the database.)
Once you've uploaded some images, and the images have been fully processed, you will be presented with the number of images successfully uploaded, or that failed to upload. If you click the filename of the Zip file that you uploaded you'll be able to see additional information about what went wrong with the image uploads (if anything).
Common errors and warnings include:
* Having an improperly-formatted JPG image that is unable to be read or displayed. This can be fixed by providing a correct JPG file (verify that it's able to open on your computer).
* Having multiple images that have the same file name. Only one image will be used, the others will be ignored. You'll need to ensure that no duplicate files are provided.
* The image is smaller than 150 pixels on a side (this prevents it from being able to be indexed by the similarity search). Providing a larger image will resolve this warning.
|
Expand the uploading images docs a bit.
|
Expand the uploading images docs a bit.
|
Markdown
|
mit
|
jeresig/pharos-images
|
markdown
|
## Code Before:
Uploading Images
================
Images should be uploaded as a Zip file of JPG images. You'll need to upload any new images that you want to add before you add any metadata. Depending on how many images you add they can take a while to process (hours initially, to be fully indexed and searchable it could take many more hours or days).
The images shouldn't be smaller than about 300 pixels on any side. Ideally they would be much larger. Having them be at least 1000 pixels on each side would be good to start with. It's important to note that the names of the images should match the file names provided in the metadata file.
If you ever upload an image that was previously uploaded then the old image will be replaced with the new one.
If you ever upload any new images you'll also need to upload a new metadata file in order for the new images to be displayed on the site. (Metadata records that have no images associated with them are never added to the database.)
## Instruction:
Expand the uploading images docs a bit.
## Code After:
Uploading Images
================
Images should be uploaded as a Zip file of JPG images. You'll need to upload any new images that you want to add before you add any metadata. Depending on how many images you add they can take a while to process (hours initially, to be fully indexed and searchable it could take many more hours or days).
The images shouldn't be smaller than about 300 pixels on any side. Ideally they would be much larger. Having them be at least 1000 pixels on each side would be good to start with. It's important to note that the names of the images should match the file names provided in the metadata.
If you upload an image that was previously uploaded then the old image will be replaced with the new one.
If you ever upload any new images you'll also need to upload a new metadata file in order for the new images to be displayed on the site. (Metadata records that have no images associated with them are never added to the database.)
Once you've uploaded some images, and the images have been fully processed, you will be presented with the number of images successfully uploaded, or that failed to upload. If you click the filename of the Zip file that you uploaded you'll be able to see additional information about what went wrong with the image uploads (if anything).
Common errors and warnings include:
* Having an improperly-formatted JPG image that is unable to be read or displayed. This can be fixed by providing a correct JPG file (verify that it's able to open on your computer).
* Having multiple images that have the same file name. Only one image will be used, the others will be ignored. You'll need to ensure that no duplicate files are provided.
* The image is smaller than 150 pixels on a side (this prevents it from being able to be indexed by the similarity search). Providing a larger image will resolve this warning.
|
99a29e5aebc2c283ee1d2d68881323021187c7eb
|
js/Search.jsx
|
js/Search.jsx
|
const React = require('react')
const data = require('../public/data')
const ShowCard = require('./ShowCard')
// key is needed for console/webpack complaint to go away. \
// key is a unique identifier
const Search = React.createClass({
getInitialState () {
return {
searchTerm: 'this is my seach term'
}
},
handleSearchTermEvent (event) {
this.setState({ searchTerm: event.target.value })
},
render () {
return (
<div className='container'>
<header className='header'>
<h1 className='brand'>{this.state.searchTerm}</h1>
<input value={this.state.searchTerm} className='search-input' type='text' placeholder='Search' onChange={this.handleSearchTermEvent} />
</header>
<div className='shows'>
{data.shows.map((show) => (
<ShowCard {...show} key={show.imbdID} />
))}
</div>
</div>
)
}
})
module.exports = Search
|
const React = require('react')
const data = require('../public/data')
const ShowCard = require('./ShowCard')
// key is needed for console/webpack complaint to go away. \
// key is a unique identifier
//have to bind this when this syntaxis is uses
// class Search extends React.Component {
// constructor(props) {
// super(props)
// this.handleSearchTermEvent = this.handleSearchTermEvent.bind(this)
// }
// }
const Search = React.createClass({
getInitialState () {
return {
searchTerm: 'this is my seach term'
}
},
handleSearchTermEvent (event) {
this.setState({ searchTerm: event.target.value })
},
render () {
return (
<div className='container'>
<header className='header'>
<h1 className='brand'>{this.state.searchTerm}</h1>
<input value={this.state.searchTerm} className='search-input' type='text' placeholder='Search' onChange={this.handleSearchTermEvent} />
</header>
<div className='shows'>
{data.shows.map((show) => (
<ShowCard {...show} key={show.imbdID} />
))}
</div>
</div>
)
}
})
module.exports = Search
|
Add note about extends method
|
Add note about extends method
|
JSX
|
mit
|
teatreelee/complete-react,teatreelee/complete-react
|
jsx
|
## Code Before:
const React = require('react')
const data = require('../public/data')
const ShowCard = require('./ShowCard')
// key is needed for console/webpack complaint to go away. \
// key is a unique identifier
const Search = React.createClass({
getInitialState () {
return {
searchTerm: 'this is my seach term'
}
},
handleSearchTermEvent (event) {
this.setState({ searchTerm: event.target.value })
},
render () {
return (
<div className='container'>
<header className='header'>
<h1 className='brand'>{this.state.searchTerm}</h1>
<input value={this.state.searchTerm} className='search-input' type='text' placeholder='Search' onChange={this.handleSearchTermEvent} />
</header>
<div className='shows'>
{data.shows.map((show) => (
<ShowCard {...show} key={show.imbdID} />
))}
</div>
</div>
)
}
})
module.exports = Search
## Instruction:
Add note about extends method
## Code After:
const React = require('react')
const data = require('../public/data')
const ShowCard = require('./ShowCard')
// key is needed for console/webpack complaint to go away. \
// key is a unique identifier
//have to bind this when this syntaxis is uses
// class Search extends React.Component {
// constructor(props) {
// super(props)
// this.handleSearchTermEvent = this.handleSearchTermEvent.bind(this)
// }
// }
const Search = React.createClass({
getInitialState () {
return {
searchTerm: 'this is my seach term'
}
},
handleSearchTermEvent (event) {
this.setState({ searchTerm: event.target.value })
},
render () {
return (
<div className='container'>
<header className='header'>
<h1 className='brand'>{this.state.searchTerm}</h1>
<input value={this.state.searchTerm} className='search-input' type='text' placeholder='Search' onChange={this.handleSearchTermEvent} />
</header>
<div className='shows'>
{data.shows.map((show) => (
<ShowCard {...show} key={show.imbdID} />
))}
</div>
</div>
)
}
})
module.exports = Search
|
0e4cb12d055c902c91b9d62864e35d100e2135f8
|
README.md
|
README.md
|
- pluggable lint based on [metalsmith](https://github.com/metalsmith/metalsmith)
|
[](https://travis-ci.org/jmayer13/eloquent-lint)
- pluggable lint based on [metalsmith](https://github.com/metalsmith/metalsmith)
|
Include trevis ci image into readme.md file
|
Include trevis ci image into readme.md file
|
Markdown
|
mit
|
jmayer13/eloquent-lint
|
markdown
|
## Code Before:
- pluggable lint based on [metalsmith](https://github.com/metalsmith/metalsmith)
## Instruction:
Include trevis ci image into readme.md file
## Code After:
[](https://travis-ci.org/jmayer13/eloquent-lint)
- pluggable lint based on [metalsmith](https://github.com/metalsmith/metalsmith)
|
247283aef8c62894a531ed4c48eb53fc37d39694
|
Casks/iterm2.rb
|
Casks/iterm2.rb
|
cask 'iterm2' do
# note: "2" is not a version number, but indicates a different vendor
version '2.1.4'
sha256 '1062b83e7808dc1e13362f4a83ef770e1c24ea4ae090d1346b49f6196e9064cd'
url "https://iterm2.com/downloads/stable/iTerm2-#{version.gsub('.', '_')}.zip"
appcast 'https://iterm2.com/appcasts/final.xml',
checkpoint: 'e9de319b2fa344a35dd297ee07cd9ea6c9d4ff93e96fece38c36409319767f55'
name 'iTerm2'
homepage 'https://www.iterm2.com/'
license :gpl
auto_updates true
app 'iTerm.app'
zap delete: '~/Library/Preferences/com.googlecode.iterm2.plist'
end
|
cask 'iterm2' do
# note: "2" is not a version number, but indicates a different vendor
version '2.1.4'
sha256 '1062b83e7808dc1e13362f4a83ef770e1c24ea4ae090d1346b49f6196e9064cd'
url "https://iterm2.com/downloads/stable/iTerm2-#{version.gsub('.', '_')}.zip"
appcast 'https://iterm2.com/appcasts/final.xml',
checkpoint: 'e9de319b2fa344a35dd297ee07cd9ea6c9d4ff93e96fece38c36409319767f55'
name 'iTerm2'
homepage 'https://www.iterm2.com/'
license :gpl
auto_updates true
depends_on macos: '>= :lion'
depends_on arch: :intel
app 'iTerm.app'
zap delete: '~/Library/Preferences/com.googlecode.iterm2.plist'
end
|
Update iTerm2: depends_on macos and arch
|
Update iTerm2: depends_on macos and arch
|
Ruby
|
bsd-2-clause
|
julionc/homebrew-cask,elyscape/homebrew-cask,Ephemera/homebrew-cask,mahori/homebrew-cask,JacopKane/homebrew-cask,xakraz/homebrew-cask,bric3/homebrew-cask,okket/homebrew-cask,skatsuta/homebrew-cask,diogodamiani/homebrew-cask,sjackman/homebrew-cask,schneidmaster/homebrew-cask,josa42/homebrew-cask,hristozov/homebrew-cask,arronmabrey/homebrew-cask,andrewdisley/homebrew-cask,greg5green/homebrew-cask,mlocher/homebrew-cask,n0ts/homebrew-cask,sgnh/homebrew-cask,jalaziz/homebrew-cask,vigosan/homebrew-cask,ninjahoahong/homebrew-cask,doits/homebrew-cask,fharbe/homebrew-cask,jpmat296/homebrew-cask,elyscape/homebrew-cask,tedski/homebrew-cask,jmeridth/homebrew-cask,markthetech/homebrew-cask,dictcp/homebrew-cask,vitorgalvao/homebrew-cask,optikfluffel/homebrew-cask,giannitm/homebrew-cask,chadcatlett/caskroom-homebrew-cask,mauricerkelly/homebrew-cask,kongslund/homebrew-cask,sosedoff/homebrew-cask,maxnordlund/homebrew-cask,inta/homebrew-cask,caskroom/homebrew-cask,xyb/homebrew-cask,larseggert/homebrew-cask,claui/homebrew-cask,samdoran/homebrew-cask,cliffcotino/homebrew-cask,mazehall/homebrew-cask,mikem/homebrew-cask,m3nu/homebrew-cask,exherb/homebrew-cask,sebcode/homebrew-cask,jalaziz/homebrew-cask,Amorymeltzer/homebrew-cask,winkelsdorf/homebrew-cask,kronicd/homebrew-cask,pkq/homebrew-cask,malford/homebrew-cask,miccal/homebrew-cask,johnjelinek/homebrew-cask,paour/homebrew-cask,aguynamedryan/homebrew-cask,moimikey/homebrew-cask,renard/homebrew-cask,jconley/homebrew-cask,hakamadare/homebrew-cask,jgarber623/homebrew-cask,tsparber/homebrew-cask,ericbn/homebrew-cask,mishari/homebrew-cask,lumaxis/homebrew-cask,neverfox/homebrew-cask,malob/homebrew-cask,MerelyAPseudonym/homebrew-cask,asbachb/homebrew-cask,kingthorin/homebrew-cask,alebcay/homebrew-cask,squid314/homebrew-cask,vin047/homebrew-cask,onlynone/homebrew-cask,josa42/homebrew-cask,rogeriopradoj/homebrew-cask,ericbn/homebrew-cask,asins/homebrew-cask,JacopKane/homebrew-cask,sebcode/homebrew-cask,phpwutz/homebrew-cask,mahori/homebrew-ca
sk,mauricerkelly/homebrew-cask,kronicd/homebrew-cask,yurikoles/homebrew-cask,goxberry/homebrew-cask,nathancahill/homebrew-cask,wickedsp1d3r/homebrew-cask,xight/homebrew-cask,kongslund/homebrew-cask,Gasol/homebrew-cask,Ketouem/homebrew-cask,leipert/homebrew-cask,singingwolfboy/homebrew-cask,usami-k/homebrew-cask,puffdad/homebrew-cask,janlugt/homebrew-cask,arronmabrey/homebrew-cask,kamilboratynski/homebrew-cask,alexg0/homebrew-cask,FredLackeyOfficial/homebrew-cask,claui/homebrew-cask,deanmorin/homebrew-cask,decrement/homebrew-cask,michelegera/homebrew-cask,Keloran/homebrew-cask,stephenwade/homebrew-cask,claui/homebrew-cask,My2ndAngelic/homebrew-cask,zerrot/homebrew-cask,dictcp/homebrew-cask,retrography/homebrew-cask,samshadwell/homebrew-cask,ianyh/homebrew-cask,adrianchia/homebrew-cask,shonjir/homebrew-cask,howie/homebrew-cask,shonjir/homebrew-cask,zmwangx/homebrew-cask,mahori/homebrew-cask,flaviocamilo/homebrew-cask,tjnycum/homebrew-cask,shoichiaizawa/homebrew-cask,Amorymeltzer/homebrew-cask,kesara/homebrew-cask,reitermarkus/homebrew-cask,moimikey/homebrew-cask,cblecker/homebrew-cask,ptb/homebrew-cask,My2ndAngelic/homebrew-cask,MoOx/homebrew-cask,julionc/homebrew-cask,scribblemaniac/homebrew-cask,diogodamiani/homebrew-cask,michelegera/homebrew-cask,xcezx/homebrew-cask,chrisfinazzo/homebrew-cask,tjt263/homebrew-cask,blainesch/homebrew-cask,MichaelPei/homebrew-cask,nathancahill/homebrew-cask,jawshooah/homebrew-cask,MircoT/homebrew-cask,tjnycum/homebrew-cask,JikkuJose/homebrew-cask,gmkey/homebrew-cask,a1russell/homebrew-cask,uetchy/homebrew-cask,maxnordlund/homebrew-cask,n8henrie/homebrew-cask,Bombenleger/homebrew-cask,danielbayley/homebrew-cask,sanchezm/homebrew-cask,thii/homebrew-cask,samdoran/homebrew-cask,FranklinChen/homebrew-cask,cfillion/homebrew-cask,hanxue/caskroom,ianyh/homebrew-cask,ebraminio/homebrew-cask,coeligena/homebrew-customized,neverfox/homebrew-cask,joshka/homebrew-cask,johndbritton/homebrew-cask,mathbunnyru/homebrew-cask,sohtsuka/homebrew-cask,vin04
7/homebrew-cask,mjdescy/homebrew-cask,Ephemera/homebrew-cask,lifepillar/homebrew-cask,guerrero/homebrew-cask,stonehippo/homebrew-cask,imgarylai/homebrew-cask,bdhess/homebrew-cask,seanorama/homebrew-cask,gilesdring/homebrew-cask,diguage/homebrew-cask,psibre/homebrew-cask,hanxue/caskroom,blogabe/homebrew-cask,deanmorin/homebrew-cask,jiashuw/homebrew-cask,caskroom/homebrew-cask,shoichiaizawa/homebrew-cask,victorpopkov/homebrew-cask,haha1903/homebrew-cask,timsutton/homebrew-cask,singingwolfboy/homebrew-cask,KosherBacon/homebrew-cask,aguynamedryan/homebrew-cask,andrewdisley/homebrew-cask,MircoT/homebrew-cask,kkdd/homebrew-cask,Ngrd/homebrew-cask,nrlquaker/homebrew-cask,jbeagley52/homebrew-cask,blainesch/homebrew-cask,shorshe/homebrew-cask,wKovacs64/homebrew-cask,nathanielvarona/homebrew-cask,pkq/homebrew-cask,fanquake/homebrew-cask,dvdoliveira/homebrew-cask,stigkj/homebrew-caskroom-cask,retrography/homebrew-cask,tan9/homebrew-cask,deiga/homebrew-cask,uetchy/homebrew-cask,inta/homebrew-cask,syscrusher/homebrew-cask,esebastian/homebrew-cask,puffdad/homebrew-cask,usami-k/homebrew-cask,jaredsampson/homebrew-cask,mattrobenolt/homebrew-cask,dcondrey/homebrew-cask,neverfox/homebrew-cask,lucasmezencio/homebrew-cask,nshemonsky/homebrew-cask,moimikey/homebrew-cask,tolbkni/homebrew-cask,tjnycum/homebrew-cask,guerrero/homebrew-cask,malford/homebrew-cask,scottsuch/homebrew-cask,wKovacs64/homebrew-cask,howie/homebrew-cask,mwean/homebrew-cask,zmwangx/homebrew-cask,moogar0880/homebrew-cask,jpmat296/homebrew-cask,thomanq/homebrew-cask,bosr/homebrew-cask,mlocher/homebrew-cask,mishari/homebrew-cask,gerrypower/homebrew-cask,y00rb/homebrew-cask,m3nu/homebrew-cask,stonehippo/homebrew-cask,artdevjs/homebrew-cask,slack4u/homebrew-cask,athrunsun/homebrew-cask,opsdev-ws/homebrew-cask,rajiv/homebrew-cask,malob/homebrew-cask,zerrot/homebrew-cask,troyxmccall/homebrew-cask,cblecker/homebrew-cask,dvdoliveira/homebrew-cask,mattrobenolt/homebrew-cask,wickles/homebrew-cask,samnung/homebrew-cask,boecko/ho
mebrew-cask,scribblemaniac/homebrew-cask,afh/homebrew-cask,kpearson/homebrew-cask,reelsense/homebrew-cask,hristozov/homebrew-cask,sanyer/homebrew-cask,esebastian/homebrew-cask,mrmachine/homebrew-cask,klane/homebrew-cask,flaviocamilo/homebrew-cask,malob/homebrew-cask,riyad/homebrew-cask,vitorgalvao/homebrew-cask,rogeriopradoj/homebrew-cask,dcondrey/homebrew-cask,elnappo/homebrew-cask,bric3/homebrew-cask,jedahan/homebrew-cask,xtian/homebrew-cask,xtian/homebrew-cask,tyage/homebrew-cask,joshka/homebrew-cask,lumaxis/homebrew-cask,antogg/homebrew-cask,thehunmonkgroup/homebrew-cask,miccal/homebrew-cask,colindunn/homebrew-cask,danielbayley/homebrew-cask,timsutton/homebrew-cask,sscotth/homebrew-cask,gilesdring/homebrew-cask,renaudguerin/homebrew-cask,deiga/homebrew-cask,sanyer/homebrew-cask,adrianchia/homebrew-cask,miccal/homebrew-cask,rajiv/homebrew-cask,opsdev-ws/homebrew-cask,artdevjs/homebrew-cask,bric3/homebrew-cask,yutarody/homebrew-cask,renaudguerin/homebrew-cask,thomanq/homebrew-cask,syscrusher/homebrew-cask,ksylvan/homebrew-cask,gabrielizaias/homebrew-cask,kesara/homebrew-cask,toonetown/homebrew-cask,winkelsdorf/homebrew-cask,hyuna917/homebrew-cask,AnastasiaSulyagina/homebrew-cask,toonetown/homebrew-cask,doits/homebrew-cask,larseggert/homebrew-cask,chuanxd/homebrew-cask,FinalDes/homebrew-cask,chadcatlett/caskroom-homebrew-cask,mazehall/homebrew-cask,tolbkni/homebrew-cask,stephenwade/homebrew-cask,wmorin/homebrew-cask,optikfluffel/homebrew-cask,elnappo/homebrew-cask,sjackman/homebrew-cask,scribblemaniac/homebrew-cask,josa42/homebrew-cask,asins/homebrew-cask,BenjaminHCCarr/homebrew-cask,asbachb/homebrew-cask,seanzxx/homebrew-cask,kassi/homebrew-cask,Bombenleger/homebrew-cask,xakraz/homebrew-cask,gmkey/homebrew-cask,kkdd/homebrew-cask,wmorin/homebrew-cask,muan/homebrew-cask,ptb/homebrew-cask,inz/homebrew-cask,jiashuw/homebrew-cask,xight/homebrew-cask,jmeridth/homebrew-cask,13k/homebrew-cask,vigosan/homebrew-cask,ksato9700/homebrew-cask,imgarylai/homebrew-cask,yumitsu/h
omebrew-cask,nshemonsky/homebrew-cask,jangalinski/homebrew-cask,scottsuch/homebrew-cask,samnung/homebrew-cask,alebcay/homebrew-cask,sanchezm/homebrew-cask,diguage/homebrew-cask,jalaziz/homebrew-cask,a1russell/homebrew-cask,wickedsp1d3r/homebrew-cask,forevergenin/homebrew-cask,seanzxx/homebrew-cask,kingthorin/homebrew-cask,jedahan/homebrew-cask,kamilboratynski/homebrew-cask,phpwutz/homebrew-cask,esebastian/homebrew-cask,devmynd/homebrew-cask,mathbunnyru/homebrew-cask,0rax/homebrew-cask,theoriginalgri/homebrew-cask,singingwolfboy/homebrew-cask,BenjaminHCCarr/homebrew-cask,troyxmccall/homebrew-cask,farmerchris/homebrew-cask,jasmas/homebrew-cask,tedski/homebrew-cask,antogg/homebrew-cask,ericbn/homebrew-cask,MerelyAPseudonym/homebrew-cask,markhuber/homebrew-cask,jasmas/homebrew-cask,tangestani/homebrew-cask,cblecker/homebrew-cask,alebcay/homebrew-cask,tangestani/homebrew-cask,pkq/homebrew-cask,anbotero/homebrew-cask,forevergenin/homebrew-cask,amatos/homebrew-cask,MoOx/homebrew-cask,0rax/homebrew-cask,psibre/homebrew-cask,boecko/homebrew-cask,kingthorin/homebrew-cask,goxberry/homebrew-cask,victorpopkov/homebrew-cask,joschi/homebrew-cask,Gasol/homebrew-cask,feigaochn/homebrew-cask,adrianchia/homebrew-cask,leipert/homebrew-cask,kTitan/homebrew-cask,colindean/homebrew-cask,gyndav/homebrew-cask,wastrachan/homebrew-cask,shorshe/homebrew-cask,hellosky806/homebrew-cask,andrewdisley/homebrew-cask,ksato9700/homebrew-cask,rogeriopradoj/homebrew-cask,Keloran/homebrew-cask,nathanielvarona/homebrew-cask,optikfluffel/homebrew-cask,alexg0/homebrew-cask,n0ts/homebrew-cask,jconley/homebrew-cask,yuhki50/homebrew-cask,seanorama/homebrew-cask,xyb/homebrew-cask,cobyism/homebrew-cask,m3nu/homebrew-cask,schneidmaster/homebrew-cask,thii/homebrew-cask,Cottser/homebrew-cask,mjdescy/homebrew-cask,shonjir/homebrew-cask,yutarody/homebrew-cask,cfillion/homebrew-cask,farmerchris/homebrew-cask,Labutin/homebrew-cask,pacav69/homebrew-cask,scottsuch/homebrew-cask,tjt263/homebrew-cask,fharbe/homebrew-cask,c
obyism/homebrew-cask,nrlquaker/homebrew-cask,patresi/homebrew-cask,okket/homebrew-cask,cprecioso/homebrew-cask,hanxue/caskroom,paour/homebrew-cask,nrlquaker/homebrew-cask,a1russell/homebrew-cask,Saklad5/homebrew-cask,mchlrmrz/homebrew-cask,winkelsdorf/homebrew-cask,mhubig/homebrew-cask,jellyfishcoder/homebrew-cask,perfide/homebrew-cask,tangestani/homebrew-cask,klane/homebrew-cask,jeroenj/homebrew-cask,yurikoles/homebrew-cask,mattrobenolt/homebrew-cask,inz/homebrew-cask,franklouwers/homebrew-cask,joshka/homebrew-cask,robertgzr/homebrew-cask,sscotth/homebrew-cask,fanquake/homebrew-cask,greg5green/homebrew-cask,rajiv/homebrew-cask,sgnh/homebrew-cask,xyb/homebrew-cask,morganestes/homebrew-cask,lantrix/homebrew-cask,reelsense/homebrew-cask,y00rb/homebrew-cask,julionc/homebrew-cask,koenrh/homebrew-cask,kTitan/homebrew-cask,squid314/homebrew-cask,alexg0/homebrew-cask,onlynone/homebrew-cask,danielbayley/homebrew-cask,JosephViolago/homebrew-cask,hovancik/homebrew-cask,gyndav/homebrew-cask,timsutton/homebrew-cask,yuhki50/homebrew-cask,FredLackeyOfficial/homebrew-cask,lantrix/homebrew-cask,Ngrd/homebrew-cask,gerrypower/homebrew-cask,ywfwj2008/homebrew-cask,afh/homebrew-cask,FinalDes/homebrew-cask,jgarber623/homebrew-cask,chrisfinazzo/homebrew-cask,0xadada/homebrew-cask,jacobbednarz/homebrew-cask,koenrh/homebrew-cask,muan/homebrew-cask,dictcp/homebrew-cask,uetchy/homebrew-cask,Labutin/homebrew-cask,devmynd/homebrew-cask,tan9/homebrew-cask,mathbunnyru/homebrew-cask,sscotth/homebrew-cask,colindean/homebrew-cask,hyuna917/homebrew-cask,lifepillar/homebrew-cask,tyage/homebrew-cask,coeligena/homebrew-customized,xcezx/homebrew-cask,colindunn/homebrew-cask,ksylvan/homebrew-cask,theoriginalgri/homebrew-cask,moogar0880/homebrew-cask,bosr/homebrew-cask,nathanielvarona/homebrew-cask,jeanregisser/homebrew-cask,mjgardner/homebrew-cask,amatos/homebrew-cask,slack4u/homebrew-cask,gabrielizaias/homebrew-cask,daften/homebrew-cask,Ephemera/homebrew-cask,daften/homebrew-cask,samshadwell/homebrew-ca
sk,Amorymeltzer/homebrew-cask,coeligena/homebrew-customized,joschi/homebrew-cask,athrunsun/homebrew-cask,sohtsuka/homebrew-cask,reitermarkus/homebrew-cask,riyad/homebrew-cask,cprecioso/homebrew-cask,yurikoles/homebrew-cask,johnjelinek/homebrew-cask,blogabe/homebrew-cask,wmorin/homebrew-cask,BenjaminHCCarr/homebrew-cask,mchlrmrz/homebrew-cask,pacav69/homebrew-cask,andyli/homebrew-cask,jeanregisser/homebrew-cask,KosherBacon/homebrew-cask,mjgardner/homebrew-cask,bdhess/homebrew-cask,jgarber623/homebrew-cask,stephenwade/homebrew-cask,deiga/homebrew-cask,cobyism/homebrew-cask,reitermarkus/homebrew-cask,AnastasiaSulyagina/homebrew-cask,mhubig/homebrew-cask,imgarylai/homebrew-cask,joschi/homebrew-cask,hakamadare/homebrew-cask,skatsuta/homebrew-cask,jellyfishcoder/homebrew-cask,stigkj/homebrew-caskroom-cask,ywfwj2008/homebrew-cask,jeroenj/homebrew-cask,lucasmezencio/homebrew-cask,shoichiaizawa/homebrew-cask,janlugt/homebrew-cask,jacobbednarz/homebrew-cask,0xadada/homebrew-cask,Ketouem/homebrew-cask,sosedoff/homebrew-cask,thehunmonkgroup/homebrew-cask,MichaelPei/homebrew-cask,JacopKane/homebrew-cask,ninjahoahong/homebrew-cask,feigaochn/homebrew-cask,haha1903/homebrew-cask,franklouwers/homebrew-cask,SentinelWarren/homebrew-cask,mikem/homebrew-cask,cliffcotino/homebrew-cask,johndbritton/homebrew-cask,andyli/homebrew-cask,decrement/homebrew-cask,markthetech/homebrew-cask,mjgardner/homebrew-cask,lukasbestle/homebrew-cask,patresi/homebrew-cask,jawshooah/homebrew-cask,stonehippo/homebrew-cask,hovancik/homebrew-cask,RJHsiao/homebrew-cask,mchlrmrz/homebrew-cask,giannitm/homebrew-cask,kesara/homebrew-cask,gyndav/homebrew-cask,paour/homebrew-cask,Cottser/homebrew-cask,chuanxd/homebrew-cask,mrmachine/homebrew-cask,blogabe/homebrew-cask,tsparber/homebrew-cask,robertgzr/homebrew-cask,RJHsiao/homebrew-cask,SentinelWarren/homebrew-cask,markhuber/homebrew-cask,wickles/homebrew-cask,kassi/homebrew-cask,ebraminio/homebrew-cask,13k/homebrew-cask,sanyer/homebrew-cask,mwean/homebrew-cask,kpearso
n/homebrew-cask,antogg/homebrew-cask,exherb/homebrew-cask,hellosky806/homebrew-cask,lukasbestle/homebrew-cask,renard/homebrew-cask,jangalinski/homebrew-cask,yutarody/homebrew-cask,Saklad5/homebrew-cask,FranklinChen/homebrew-cask,jaredsampson/homebrew-cask,wastrachan/homebrew-cask,JikkuJose/homebrew-cask,JosephViolago/homebrew-cask,yumitsu/homebrew-cask,anbotero/homebrew-cask,perfide/homebrew-cask,xight/homebrew-cask,chrisfinazzo/homebrew-cask,morganestes/homebrew-cask,jbeagley52/homebrew-cask,n8henrie/homebrew-cask,JosephViolago/homebrew-cask
|
ruby
|
## Code Before:
cask 'iterm2' do
# note: "2" is not a version number, but indicates a different vendor
version '2.1.4'
sha256 '1062b83e7808dc1e13362f4a83ef770e1c24ea4ae090d1346b49f6196e9064cd'
url "https://iterm2.com/downloads/stable/iTerm2-#{version.gsub('.', '_')}.zip"
appcast 'https://iterm2.com/appcasts/final.xml',
checkpoint: 'e9de319b2fa344a35dd297ee07cd9ea6c9d4ff93e96fece38c36409319767f55'
name 'iTerm2'
homepage 'https://www.iterm2.com/'
license :gpl
auto_updates true
app 'iTerm.app'
zap delete: '~/Library/Preferences/com.googlecode.iterm2.plist'
end
## Instruction:
Update iTerm2: depends_on macos and arch
## Code After:
cask 'iterm2' do
# note: "2" is not a version number, but indicates a different vendor
version '2.1.4'
sha256 '1062b83e7808dc1e13362f4a83ef770e1c24ea4ae090d1346b49f6196e9064cd'
url "https://iterm2.com/downloads/stable/iTerm2-#{version.gsub('.', '_')}.zip"
appcast 'https://iterm2.com/appcasts/final.xml',
checkpoint: 'e9de319b2fa344a35dd297ee07cd9ea6c9d4ff93e96fece38c36409319767f55'
name 'iTerm2'
homepage 'https://www.iterm2.com/'
license :gpl
auto_updates true
depends_on macos: '>= :lion'
depends_on arch: :intel
app 'iTerm.app'
zap delete: '~/Library/Preferences/com.googlecode.iterm2.plist'
end
|
2028854ec4d6dca904d4ecdd6f85f572847e632f
|
lib/import/import_samples.rb
|
lib/import/import_samples.rb
|
require 'roo'
class Import::ImportSamples
def self.import_samples_from_file(file_path, collection_id)
ActiveRecord::Base.transaction do
xlsx = Roo::Spreadsheet.open(file_path)
rows = xlsx.parse(name: "Name", description: "Beschreibung", smiles: "Smiles")
rows.shift
rows.map do |row|
molfile = Chemotion::PubchemService.molfile_from_smiles URI::encode(row[:smiles], '[]/()+-.@#=\\')
molecule = Molecule.find_or_create_by_molfile(molfile)
sample = Sample.create(name: row[:name], description: row[:description], molecule: molecule)
CollectionsSample.create(collection_id: collection_id, sample_id: sample.id)
end
end
end
end
|
require 'roo'
class Import::ImportSamples
def self.import_samples_from_file(file_path, collection_id)
ActiveRecord::Base.transaction do
xlsx = Roo::Spreadsheet.open(file_path)
rows = xlsx.parse(name: "Name", description: "Beschreibung", smiles: "Smiles")
rows.shift
rows.map do |row|
molfile = Chemotion::PubchemService.molfile_from_smiles URI::encode(row[:smiles], '[]/()+-.@#=\\')
molecule = Molecule.find_or_create_by_molfile(molfile)
if molecule.nil?
raise "Import of Sample #{row[:name]}: Molecule is nil."
end
sample = Sample.create(name: row[:name], description: row[:description], molecule: molecule)
CollectionsSample.create(collection_id: collection_id, sample_id: sample.id)
end
end
end
end
|
Add raise to avoid import of samples without molecules
|
Add raise to avoid import of samples without molecules
|
Ruby
|
agpl-3.0
|
ComPlat/chemotion_ELN,ComPlat/chemotion_ELN,ComPlat/chemotion_ELN,ComPlat/chemotion_ELN
|
ruby
|
## Code Before:
require 'roo'
class Import::ImportSamples
def self.import_samples_from_file(file_path, collection_id)
ActiveRecord::Base.transaction do
xlsx = Roo::Spreadsheet.open(file_path)
rows = xlsx.parse(name: "Name", description: "Beschreibung", smiles: "Smiles")
rows.shift
rows.map do |row|
molfile = Chemotion::PubchemService.molfile_from_smiles URI::encode(row[:smiles], '[]/()+-.@#=\\')
molecule = Molecule.find_or_create_by_molfile(molfile)
sample = Sample.create(name: row[:name], description: row[:description], molecule: molecule)
CollectionsSample.create(collection_id: collection_id, sample_id: sample.id)
end
end
end
end
## Instruction:
Add raise to avoid import of samples without molecules
## Code After:
require 'roo'
class Import::ImportSamples
def self.import_samples_from_file(file_path, collection_id)
ActiveRecord::Base.transaction do
xlsx = Roo::Spreadsheet.open(file_path)
rows = xlsx.parse(name: "Name", description: "Beschreibung", smiles: "Smiles")
rows.shift
rows.map do |row|
molfile = Chemotion::PubchemService.molfile_from_smiles URI::encode(row[:smiles], '[]/()+-.@#=\\')
molecule = Molecule.find_or_create_by_molfile(molfile)
if molecule.nil?
raise "Import of Sample #{row[:name]}: Molecule is nil."
end
sample = Sample.create(name: row[:name], description: row[:description], molecule: molecule)
CollectionsSample.create(collection_id: collection_id, sample_id: sample.id)
end
end
end
end
|
c14748e066cba9978cc1a73b81dbb76646a5a2ab
|
test.sh
|
test.sh
|
rm -rf ./build
cp -R ./test ./build
for file in ./build/*.jpg
do
./cartoonist.sh "$file"
done
|
rm -rf ./build
cp -R ./test ./build
for file in ./build/*.jpg
do
echo Converting "$file"
./cartoonist.sh "$file"
done
|
Test script now prints progress messages
|
Test script now prints progress messages
|
Shell
|
mit
|
honzajavorek/cartoonist
|
shell
|
## Code Before:
rm -rf ./build
cp -R ./test ./build
for file in ./build/*.jpg
do
./cartoonist.sh "$file"
done
## Instruction:
Test script now prints progress messages
## Code After:
rm -rf ./build
cp -R ./test ./build
for file in ./build/*.jpg
do
echo Converting "$file"
./cartoonist.sh "$file"
done
|
dc927a6001672bb9aeec92968df745bd65f0ab72
|
src/scss/refugee-data-updated.scss
|
src/scss/refugee-data-updated.scss
|
.refugee-updated-at {
position: absolute;
text-align: right;
right: 0.5rem;
bottom: 0.5rem;
font-size: 0.8rem;
@media screen and (max-width: 600px) {
font-size: 0.5rem;
}
}
|
.refugee-updated-at {
position: absolute;
text-align: right;
right: 0.5rem;
bottom: 0.5rem;
font-size: 0.8rem;
color: #777777;
text-shadow: -2px 0 white, 0 2px white, 2px 0 white, 0 -2px white;
@media screen and (max-width: 600px) {
font-size: 0.7rem;
}
}
|
Tweak updated at text color and size
|
Tweak updated at text color and size
|
SCSS
|
mit
|
lucified/lucify-asylum-countries,lucified/lucify-asylum-countries,lucified/lucify-asylum-countries,lucified/lucify-asylum-countries
|
scss
|
## Code Before:
.refugee-updated-at {
position: absolute;
text-align: right;
right: 0.5rem;
bottom: 0.5rem;
font-size: 0.8rem;
@media screen and (max-width: 600px) {
font-size: 0.5rem;
}
}
## Instruction:
Tweak updated at text color and size
## Code After:
.refugee-updated-at {
position: absolute;
text-align: right;
right: 0.5rem;
bottom: 0.5rem;
font-size: 0.8rem;
color: #777777;
text-shadow: -2px 0 white, 0 2px white, 2px 0 white, 0 -2px white;
@media screen and (max-width: 600px) {
font-size: 0.7rem;
}
}
|
837d1d3127756f38413b5eda18dfd636e768e55a
|
renovate.json
|
renovate.json
|
{
"extends": [
"config:base",
":rebaseStalePrs",
":automergePatch"
],
"rangeStrategy": "bump",
"packageRules": [
{
"packageNames": ["webpack"],
"paths": ["packages/test-v3"],
"allowedVersions": "<4.0.0"
},
{
"packageNames": ["uglifyjs-webpack-plugin"],
"paths": ["packages/test-v3"],
"allowedVersions": "<2.0.0"
},
{
"packageNames": ["webpack"],
"paths": ["packages/test-v4"],
"allowedVersions": "<5.0.0"
}
]
}
|
{
"extends": [
"config:base",
":rebaseStalePrs",
":automergePatch"
],
"rangeStrategy": "bump",
"packageRules": [
{
"packageNames": ["jest", "@types/jest", "ts-jest"],
"groupName": "jest packages"
},
{
"packageNames": ["webpack"],
"paths": ["packages/test-v3"],
"allowedVersions": "<4.0.0"
},
{
"packageNames": ["uglifyjs-webpack-plugin"],
"paths": ["packages/test-v3"],
"allowedVersions": "<2.0.0"
},
{
"packageNames": ["webpack"],
"paths": ["packages/test-v4"],
"allowedVersions": "<5.0.0"
}
]
}
|
Update Renovate config for grouping jest
|
Update Renovate config for grouping jest
|
JSON
|
mit
|
yami-beta/license-pack
|
json
|
## Code Before:
{
"extends": [
"config:base",
":rebaseStalePrs",
":automergePatch"
],
"rangeStrategy": "bump",
"packageRules": [
{
"packageNames": ["webpack"],
"paths": ["packages/test-v3"],
"allowedVersions": "<4.0.0"
},
{
"packageNames": ["uglifyjs-webpack-plugin"],
"paths": ["packages/test-v3"],
"allowedVersions": "<2.0.0"
},
{
"packageNames": ["webpack"],
"paths": ["packages/test-v4"],
"allowedVersions": "<5.0.0"
}
]
}
## Instruction:
Update Renovate config for grouping jest
## Code After:
{
"extends": [
"config:base",
":rebaseStalePrs",
":automergePatch"
],
"rangeStrategy": "bump",
"packageRules": [
{
"packageNames": ["jest", "@types/jest", "ts-jest"],
"groupName": "jest packages"
},
{
"packageNames": ["webpack"],
"paths": ["packages/test-v3"],
"allowedVersions": "<4.0.0"
},
{
"packageNames": ["uglifyjs-webpack-plugin"],
"paths": ["packages/test-v3"],
"allowedVersions": "<2.0.0"
},
{
"packageNames": ["webpack"],
"paths": ["packages/test-v4"],
"allowedVersions": "<5.0.0"
}
]
}
|
53a3020b513e3ff790bbbc9c75c12fea4fe47e58
|
source/pre-build/CMakeLists.txt
|
source/pre-build/CMakeLists.txt
|
set(GL_SPEC "${CMAKE_CURRENT_SOURCE_DIR}/gl.xml")
set(GENERATED_CONSTANTS "${CMAKE_CURRENT_BINARY_DIR}/gl_constants.h")
set(GENERATED_EXTENSIONS "${CMAKE_CURRENT_BINARY_DIR}/gl_extensions.h")
set(GENERATED_EXTENSION_INFO "${CMAKE_CURRENT_BINARY_DIR}/gl_extension_info.h")
find_program(PYTHON "python")
if (PYTHON)
message(STATUS "Generate constants")
execute_process(
COMMAND ${PYTHON} scripts/generate_constants.py -i ${GL_SPEC} -o ${GENERATED_CONSTANTS}
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
)
message(STATUS "Generate extensions")
execute_process(
COMMAND ${PYTHON} scripts/generate_extensions.py -i ${GL_SPEC} -e ${GENERATED_EXTENSIONS} -o ${GENERATED_EXTENSION_INFO}
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
)
set(PREBUILD_PATH "${CMAKE_CURRENT_BINARY_DIR}")
else()
message("Could not find python")
endif()
|
set(GL_SPEC "${CMAKE_CURRENT_SOURCE_DIR}/gl.xml")
set(GENERATED_CONSTANTS "${CMAKE_CURRENT_BINARY_DIR}/gl_constants.h")
set(GENERATED_EXTENSIONS "${CMAKE_CURRENT_BINARY_DIR}/gl_extensions.h")
set(GENERATED_EXTENSION_INFO "${CMAKE_CURRENT_BINARY_DIR}/gl_extension_info.h")
find_program(PYTHON "python")
if (PYTHON)
message(STATUS "Generate constants")
execute_process(
COMMAND ${PYTHON} scripts/generate_constants.py -i ${GL_SPEC} -o ${GENERATED_CONSTANTS}
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
)
message(STATUS "Generate extensions")
execute_process(
COMMAND ${PYTHON} scripts/generate_extensions.py -i ${GL_SPEC} -e ${GENERATED_EXTENSIONS} -o ${GENERATED_EXTENSION_INFO}
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
)
set(PREBUILD_PATH "${CMAKE_CURRENT_BINARY_DIR}")
else()
message(SEND_ERROR "Could not find python")
endif()
|
Throw an error if python is not found since its required to copy the gl_* header files to the correct directory
|
Throw an error if python is not found since its required to copy the gl_* header files to the correct directory
|
Text
|
mit
|
j-o/globjects,cginternals/globjects,j-o/globjects,j-o/globjects,j-o/globjects,hpi-r2d2/globjects,cginternals/globjects,hpi-r2d2/globjects
|
text
|
## Code Before:
set(GL_SPEC "${CMAKE_CURRENT_SOURCE_DIR}/gl.xml")
set(GENERATED_CONSTANTS "${CMAKE_CURRENT_BINARY_DIR}/gl_constants.h")
set(GENERATED_EXTENSIONS "${CMAKE_CURRENT_BINARY_DIR}/gl_extensions.h")
set(GENERATED_EXTENSION_INFO "${CMAKE_CURRENT_BINARY_DIR}/gl_extension_info.h")
find_program(PYTHON "python")
if (PYTHON)
message(STATUS "Generate constants")
execute_process(
COMMAND ${PYTHON} scripts/generate_constants.py -i ${GL_SPEC} -o ${GENERATED_CONSTANTS}
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
)
message(STATUS "Generate extensions")
execute_process(
COMMAND ${PYTHON} scripts/generate_extensions.py -i ${GL_SPEC} -e ${GENERATED_EXTENSIONS} -o ${GENERATED_EXTENSION_INFO}
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
)
set(PREBUILD_PATH "${CMAKE_CURRENT_BINARY_DIR}")
else()
message("Could not find python")
endif()
## Instruction:
Throw an error if python is not found since its required to copy the gl_* header files to the correct directory
## Code After:
set(GL_SPEC "${CMAKE_CURRENT_SOURCE_DIR}/gl.xml")
set(GENERATED_CONSTANTS "${CMAKE_CURRENT_BINARY_DIR}/gl_constants.h")
set(GENERATED_EXTENSIONS "${CMAKE_CURRENT_BINARY_DIR}/gl_extensions.h")
set(GENERATED_EXTENSION_INFO "${CMAKE_CURRENT_BINARY_DIR}/gl_extension_info.h")
find_program(PYTHON "python")
if (PYTHON)
message(STATUS "Generate constants")
execute_process(
COMMAND ${PYTHON} scripts/generate_constants.py -i ${GL_SPEC} -o ${GENERATED_CONSTANTS}
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
)
message(STATUS "Generate extensions")
execute_process(
COMMAND ${PYTHON} scripts/generate_extensions.py -i ${GL_SPEC} -e ${GENERATED_EXTENSIONS} -o ${GENERATED_EXTENSION_INFO}
WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}"
)
set(PREBUILD_PATH "${CMAKE_CURRENT_BINARY_DIR}")
else()
message(SEND_ERROR "Could not find python")
endif()
|
dccb4284b0190f0ade93f21346577d4b1406cf0a
|
README.md
|
README.md
|
Some images pulled from CustomizedGirl.com
|
A Social Game for Healthy Eating
Created for [H4BC][1] (Feb. 20-22 2015) in Construct 2 by Team LEHP - Sean, Napoleon, Lisa, Paul, Maria. Some images pulled from CustomizedGirl.com.
[1]:http://hackforbigchoices.org/hackathons/ghana/
|
Include team member names in readme
|
Include team member names in readme
|
Markdown
|
mit
|
skaulana/chopbuddy,skaulana/chopbuddy
|
markdown
|
## Code Before:
Some images pulled from CustomizedGirl.com
## Instruction:
Include team member names in readme
## Code After:
A Social Game for Healthy Eating
Created for [H4BC][1] (Feb. 20-22 2015) in Construct 2 by Team LEHP - Sean, Napoleon, Lisa, Paul, Maria. Some images pulled from CustomizedGirl.com.
[1]:http://hackforbigchoices.org/hackathons/ghana/
|
af4cbb8642afffdc32a060aad5c46131ef312d46
|
src/run_tests_exclude.txt
|
src/run_tests_exclude.txt
|
ctests/burn
# these tests haven't been implemented
|
ctests/burn
# these tests haven't been implemented
# Helper scripts for iozone
components/appio/tests/iozone/Gnuplot.txt
components/appio/tests/iozone/Generate_Graphs
components/appio/tests/iozone/report.pl
components/appio/tests/iozone/iozone_visualizer.pl
components/appio/tests/iozone/gengnuplot.sh
components/appio/tests/iozone/gnu3d.dem
|
Exclude iozone helper scripts from run_tests.
|
Exclude iozone helper scripts from run_tests.
run_tests.sh looks for executible files under components/*/tests
Some of the plotting scripts in appio/iozone were getting picked up.
|
Text
|
bsd-3-clause
|
arm-hpc/papi,pyrovski/papi,pyrovski/papi,arm-hpc/papi,arm-hpc/papi,arm-hpc/papi,pyrovski/papi,arm-hpc/papi,arm-hpc/papi,arm-hpc/papi,pyrovski/papi,pyrovski/papi,pyrovski/papi,pyrovski/papi,pyrovski/papi
|
text
|
## Code Before:
ctests/burn
# these tests haven't been implemented
## Instruction:
Exclude iozone helper scripts from run_tests.
run_tests.sh looks for executible files under components/*/tests
Some of the plotting scripts in appio/iozone were getting picked up.
## Code After:
ctests/burn
# these tests haven't been implemented
# Helper scripts for iozone
components/appio/tests/iozone/Gnuplot.txt
components/appio/tests/iozone/Generate_Graphs
components/appio/tests/iozone/report.pl
components/appio/tests/iozone/iozone_visualizer.pl
components/appio/tests/iozone/gengnuplot.sh
components/appio/tests/iozone/gnu3d.dem
|
3ca83d8a321443b3b4e385c456964d248bd2f978
|
deploy-hooks.sh
|
deploy-hooks.sh
|
stage=$1
case $stage in
pre )
pm2 stop greenbot
;;
deploy )
git pull
rm -r node_modules
npm install
;;
post )
pm2 start node_modules/greenbot/bin/greenbot.js --node-args "modacity.cson"
;;
esac
|
stage=$1
case $stage in
pre )
forever stop node_modules/greenbot/bin/greenbot.js
;;
deploy )
git pull
rm -r node_modules
npm install
;;
post )
forever start node_modules/greenbot/bin/greenbot.js modacity.cson
;;
esac
|
Switch from pm2 to forever
|
Switch from pm2 to forever
|
Shell
|
mit
|
csauve/gb-plugins-modacity
|
shell
|
## Code Before:
stage=$1
case $stage in
pre )
pm2 stop greenbot
;;
deploy )
git pull
rm -r node_modules
npm install
;;
post )
pm2 start node_modules/greenbot/bin/greenbot.js --node-args "modacity.cson"
;;
esac
## Instruction:
Switch from pm2 to forever
## Code After:
stage=$1
case $stage in
pre )
forever stop node_modules/greenbot/bin/greenbot.js
;;
deploy )
git pull
rm -r node_modules
npm install
;;
post )
forever start node_modules/greenbot/bin/greenbot.js modacity.cson
;;
esac
|
b844e3c50241598757e4322214dab8f13af0ce98
|
index.html
|
index.html
|
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<title></title>
<link rel="stylesheet" href="">
</head>
<body>
</body>
</html>
|
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<title>newInc</title>
<!--Import Google Icon Font-->
<link href="http://fonts.googleapis.com/icon?family=Material+Icons" rel="stylesheet">
<!-- Compiled and minified CSS -->
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/materialize/0.97.5/css/materialize.min.css">
<!--Let browser know website is optimized for mobile-->
<meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>
<body>
<!--Import jQuery before materialize.js-->
<script type="text/javascript" src="https://code.jquery.com/jquery-2.1.1.min.js"></script>
<!--TEMP-->
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/numeral.js/1.4.5/numeral.min.js"></script>
<!-- Compiled and minified JavaScript -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/materialize/0.97.5/js/materialize.min.js"></script>
<!-- Custom JavaScript -->
<script type="text/javascript" src="js/game.js"></script>
</body>
</html>
|
Add css, js and jquery
|
Add css, js and jquery
|
HTML
|
apache-2.0
|
kuroikyu/employr,kuroikyu/employr
|
html
|
## Code Before:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<title></title>
<link rel="stylesheet" href="">
</head>
<body>
</body>
</html>
## Instruction:
Add css, js and jquery
## Code After:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<title>newInc</title>
<!--Import Google Icon Font-->
<link href="http://fonts.googleapis.com/icon?family=Material+Icons" rel="stylesheet">
<!-- Compiled and minified CSS -->
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/materialize/0.97.5/css/materialize.min.css">
<!--Let browser know website is optimized for mobile-->
<meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>
<body>
<!--Import jQuery before materialize.js-->
<script type="text/javascript" src="https://code.jquery.com/jquery-2.1.1.min.js"></script>
<!--TEMP-->
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/numeral.js/1.4.5/numeral.min.js"></script>
<!-- Compiled and minified JavaScript -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/materialize/0.97.5/js/materialize.min.js"></script>
<!-- Custom JavaScript -->
<script type="text/javascript" src="js/game.js"></script>
</body>
</html>
|
28f5eafe8b2e34e7a9dc9e2c00cf6881013e2314
|
lib/algorhythmic.coffee
|
lib/algorhythmic.coffee
|
class Algorhythmic
max: 10
compute: (array) ->
array
.map(@getCharInt)
.reduce((previousInt, currentInt) =>
(previousInt + currentInt) % @max
, 0) + 1
getCharInt: (char) -> parseInt char.charCodeAt(0) or 0
convert: (string) ->
str = string.replace(/\.(png|jpg|gif|)$/g, "")
stringArray = str.split('')
return @compute(stringArray)
module.exports = new Algorhythmic()
|
class Algorhythmic
max: 10
convert: (string) ->
str = string.replace(/\.(png|jpg|gif|)$/g, "")
stringArray = str.split('')
return @_compute(stringArray)
_compute: (array) ->
array
.map(@_getCharInt)
.reduce((previousInt, currentInt) =>
(previousInt + currentInt) % @max
, 0) + 1
_getCharInt: (char) -> parseInt char.charCodeAt(0) or 0
module.exports = new Algorhythmic()
|
Use convention for 'private' functions
|
Use convention for 'private' functions
Move them below public functions, also.
|
CoffeeScript
|
mit
|
gitblit/avatars-api,mojects/avatars-api,adorableio/avatars-api,mojects/avatars-api,gitblit/avatars-api
|
coffeescript
|
## Code Before:
class Algorhythmic
max: 10
compute: (array) ->
array
.map(@getCharInt)
.reduce((previousInt, currentInt) =>
(previousInt + currentInt) % @max
, 0) + 1
getCharInt: (char) -> parseInt char.charCodeAt(0) or 0
convert: (string) ->
str = string.replace(/\.(png|jpg|gif|)$/g, "")
stringArray = str.split('')
return @compute(stringArray)
module.exports = new Algorhythmic()
## Instruction:
Use convention for 'private' functions
Move them below public functions, also.
## Code After:
class Algorhythmic
max: 10
convert: (string) ->
str = string.replace(/\.(png|jpg|gif|)$/g, "")
stringArray = str.split('')
return @_compute(stringArray)
_compute: (array) ->
array
.map(@_getCharInt)
.reduce((previousInt, currentInt) =>
(previousInt + currentInt) % @max
, 0) + 1
_getCharInt: (char) -> parseInt char.charCodeAt(0) or 0
module.exports = new Algorhythmic()
|
2f0419d10f555c97209ba8d2e857ea90ce2ae86f
|
src/Web/Twig/TwigResponseServiceProvider.php
|
src/Web/Twig/TwigResponseServiceProvider.php
|
<?php
namespace GMO\Common\Web\Twig;
use Silex\Application;
use Silex\ServiceProviderInterface;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\FilterResponseEvent;
use Symfony\Component\HttpKernel\KernelEvents;
class TwigResponseServiceProvider implements ServiceProviderInterface, EventSubscriberInterface {
public function onResponse(FilterResponseEvent $event) {
$response = $event->getResponse();
if (!$response instanceof TwigResponse) {
return;
}
/** @var \Twig_Environment $twig */
$twig = $this->app['twig'];
$content = $twig->render($response->getTemplate(), $response->getVariables()->toArray());
$newResponse = new Response($content, $response->getStatusCode(), $response->headers->all());
$event->setResponse($newResponse);
}
public function register(Application $app) {
$this->app = $app;
}
public function boot(Application $app) {
$app['dispatcher']->addSubscriber($this);
}
/** @inheritdoc */
public static function getSubscribedEvents() {
return array(
KernelEvents::RESPONSE => array('onResponse', -100),
);
}
/** @var Application */
protected $app;
}
|
<?php
namespace GMO\Common\Web\Twig;
use Silex\Application;
use Silex\ServiceProviderInterface;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\FilterResponseEvent;
use Symfony\Component\HttpKernel\KernelEvents;
class TwigResponseServiceProvider implements ServiceProviderInterface, EventSubscriberInterface {
public function onResponse(FilterResponseEvent $event) {
$response = $event->getResponse();
if (!$response instanceof TwigResponse) {
return;
}
/** @var \Twig_Environment $twig */
$twig = $this->app['twig'];
$content = $twig->render($response->getTemplate(), $response->getVariables()->toArray());
$newResponse = new Response($content, $response->getStatusCode());
$newResponse->headers = $response->headers;
$event->setResponse($newResponse);
}
public function register(Application $app) {
$this->app = $app;
}
public function boot(Application $app) {
$app['dispatcher']->addSubscriber($this);
}
/** @inheritdoc */
public static function getSubscribedEvents() {
return array(
KernelEvents::RESPONSE => array('onResponse', -100),
);
}
/** @var Application */
protected $app;
}
|
Fix headers when creating new response from twig response
|
Fix headers when creating new response from twig response
|
PHP
|
mit
|
gmo/common
|
php
|
## Code Before:
<?php
namespace GMO\Common\Web\Twig;
use Silex\Application;
use Silex\ServiceProviderInterface;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\FilterResponseEvent;
use Symfony\Component\HttpKernel\KernelEvents;
class TwigResponseServiceProvider implements ServiceProviderInterface, EventSubscriberInterface {
public function onResponse(FilterResponseEvent $event) {
$response = $event->getResponse();
if (!$response instanceof TwigResponse) {
return;
}
/** @var \Twig_Environment $twig */
$twig = $this->app['twig'];
$content = $twig->render($response->getTemplate(), $response->getVariables()->toArray());
$newResponse = new Response($content, $response->getStatusCode(), $response->headers->all());
$event->setResponse($newResponse);
}
public function register(Application $app) {
$this->app = $app;
}
public function boot(Application $app) {
$app['dispatcher']->addSubscriber($this);
}
/** @inheritdoc */
public static function getSubscribedEvents() {
return array(
KernelEvents::RESPONSE => array('onResponse', -100),
);
}
/** @var Application */
protected $app;
}
## Instruction:
Fix headers when creating new response from twig response
## Code After:
<?php
namespace GMO\Common\Web\Twig;
use Silex\Application;
use Silex\ServiceProviderInterface;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\FilterResponseEvent;
use Symfony\Component\HttpKernel\KernelEvents;
class TwigResponseServiceProvider implements ServiceProviderInterface, EventSubscriberInterface {
public function onResponse(FilterResponseEvent $event) {
$response = $event->getResponse();
if (!$response instanceof TwigResponse) {
return;
}
/** @var \Twig_Environment $twig */
$twig = $this->app['twig'];
$content = $twig->render($response->getTemplate(), $response->getVariables()->toArray());
$newResponse = new Response($content, $response->getStatusCode());
$newResponse->headers = $response->headers;
$event->setResponse($newResponse);
}
public function register(Application $app) {
$this->app = $app;
}
public function boot(Application $app) {
$app['dispatcher']->addSubscriber($this);
}
/** @inheritdoc */
public static function getSubscribedEvents() {
return array(
KernelEvents::RESPONSE => array('onResponse', -100),
);
}
/** @var Application */
protected $app;
}
|
65fdd17339f8bb3477ed27d3008905685fb61588
|
_plugins/search-index-generator.rb
|
_plugins/search-index-generator.rb
|
module Jekyll
class SearchIndexGenerator < Generator
priority :low
def generate(site)
index = site.posts.docs.reverse
.filter { |post| !post.data['retired'] }
.map do |post|
{
url: post.url,
title: post.data['title'],
category: post.data['category'],
content: post.data['excerpt']
}
end
File.open('_functions/search/index.json', 'w') do |f|
f.puts index.to_json
end
end
end
end
|
module Jekyll
class SearchIndexGenerator < Generator
priority :low
def generate(site)
index = site.posts.docs.reverse
.filter { |post| !post.data['retired'] }
.map do |post|
{
url: site.config['url'] + post.url,
title: post.data['title'],
category: post.data['category'],
content: post.data['excerpt']
}
end
File.open('_functions/search/index.json', 'w') do |f|
f.puts index.to_json
end
end
end
end
|
Prepend site url to post url in search index
|
Prepend site url to post url in search index
|
Ruby
|
mit
|
NSHipster/nshipster.com,NSHipster/nshipster.com,NSHipster/nshipster.com
|
ruby
|
## Code Before:
module Jekyll
class SearchIndexGenerator < Generator
priority :low
def generate(site)
index = site.posts.docs.reverse
.filter { |post| !post.data['retired'] }
.map do |post|
{
url: post.url,
title: post.data['title'],
category: post.data['category'],
content: post.data['excerpt']
}
end
File.open('_functions/search/index.json', 'w') do |f|
f.puts index.to_json
end
end
end
end
## Instruction:
Prepend site url to post url in search index
## Code After:
module Jekyll
class SearchIndexGenerator < Generator
priority :low
def generate(site)
index = site.posts.docs.reverse
.filter { |post| !post.data['retired'] }
.map do |post|
{
url: site.config['url'] + post.url,
title: post.data['title'],
category: post.data['category'],
content: post.data['excerpt']
}
end
File.open('_functions/search/index.json', 'w') do |f|
f.puts index.to_json
end
end
end
end
|
fe86c733cbe322e621e0e833b9e04db2a0c4feee
|
lib/monolith/locations/default.rb
|
lib/monolith/locations/default.rb
|
module Monolith
class DefaultLocation < BaseLocation
def install
# For now we just copy any community cookbooks. It would be nice to be
# able to grab them from the source URL, but that isn't readily
# accessible, and then you have to guess how to check it out.
if File.directory?(@destination)
rel_dest = Monolith.formatter.rel_dir(@destination)
Monolith.formatter.skip("#{rel_dest} already exists")
else
Monolith.formatter.install(@cookbook, @destination)
FileUtils.cp_r(@cookbook.path, @destination)
end
end
def update
# There isn't anything to do for updating a community cookbook except
# blowing it away and recreating it. For the moment I'm opting not to do
# that (it may be able ot be an option later)
Monolith.formatter.skip("Not updating community cookbook")
end
end
end
|
module Monolith
class DefaultLocation < BaseLocation
def install
# For now we just copy any community cookbooks. It would be nice to be
# able to grab them from the source URL, but that isn't readily
# accessible, and then you have to guess how to check it out.
if File.directory?(@destination)
rel_dest = Monolith.formatter.rel_dir(@destination)
Monolith.formatter.skip(@cookbook, "#{rel_dest} already exists")
else
Monolith.formatter.install(@cookbook, @destination)
FileUtils.cp_r(@cookbook.path, @destination)
end
end
def update
# There isn't anything to do for updating a community cookbook except
# blowing it away and recreating it. For the moment I'm opting not to do
# that (it may be able ot be an option later)
Monolith.formatter.skip(@cookbook, "Not updating community cookbook")
end
end
end
|
Fix formatting exception when skipping a community cookbook
|
Fix formatting exception when skipping a community cookbook
|
Ruby
|
mit
|
mivok/berks-monolith
|
ruby
|
## Code Before:
module Monolith
class DefaultLocation < BaseLocation
def install
# For now we just copy any community cookbooks. It would be nice to be
# able to grab them from the source URL, but that isn't readily
# accessible, and then you have to guess how to check it out.
if File.directory?(@destination)
rel_dest = Monolith.formatter.rel_dir(@destination)
Monolith.formatter.skip("#{rel_dest} already exists")
else
Monolith.formatter.install(@cookbook, @destination)
FileUtils.cp_r(@cookbook.path, @destination)
end
end
def update
# There isn't anything to do for updating a community cookbook except
# blowing it away and recreating it. For the moment I'm opting not to do
# that (it may be able ot be an option later)
Monolith.formatter.skip("Not updating community cookbook")
end
end
end
## Instruction:
Fix formatting exception when skipping a community cookbook
## Code After:
module Monolith
class DefaultLocation < BaseLocation
def install
# For now we just copy any community cookbooks. It would be nice to be
# able to grab them from the source URL, but that isn't readily
# accessible, and then you have to guess how to check it out.
if File.directory?(@destination)
rel_dest = Monolith.formatter.rel_dir(@destination)
Monolith.formatter.skip(@cookbook, "#{rel_dest} already exists")
else
Monolith.formatter.install(@cookbook, @destination)
FileUtils.cp_r(@cookbook.path, @destination)
end
end
def update
# There isn't anything to do for updating a community cookbook except
# blowing it away and recreating it. For the moment I'm opting not to do
# that (it may be able ot be an option later)
Monolith.formatter.skip(@cookbook, "Not updating community cookbook")
end
end
end
|
53c014a5ccff103a89c2b9ff9b5f1f30b5d4f7b0
|
tests/test_states.py
|
tests/test_states.py
|
import pytest
from mmvdApp.utils import objective_function
from mmvdApp.utils import valid_solution
@pytest.mark.utils
@pytest.mark.linprog
def test_objective_function(states1):
"""
Test if ``utils.linprog.objective_function`` correctly calculates expected
objective function value.
"""
assert objective_function(states1) == 17
@pytest.mark.utils
@pytest.mark.linprog
def test_solution_validity(states1, order1, drop_zone1):
"""
Test if ``utils.linprog.valid_solution`` correctly checks for valid
solution.
"""
# cut order in half because states1 is intended only for first 3 products
order = order1[0:3]
assert valid_solution(states1, order, drop_zone1)
# check for valid solution when one product isn't returned
assert not valid_solution(states1[0:-1], order, drop_zone1)
|
import pytest
from mmvdApp.utils import objective_function
from mmvdApp.utils import valid_solution
from mmvdApp.utils import InvalidOrderException
@pytest.mark.utils
@pytest.mark.linprog
def test_objective_function(states1):
"""
Test if ``utils.linprog.objective_function`` correctly calculates expected
objective function value.
"""
assert objective_function(states1) == 17
@pytest.mark.utils
@pytest.mark.linprog
def test_solution_validity(states1, order1, drop_zone1):
"""
Test if ``utils.linprog.valid_solution`` correctly checks for valid
solution.
"""
# cut order in half because states1 is intended only for first 3 products
order = order1[0:3]
assert valid_solution(states1, order, drop_zone1)
# check for valid solution when one product isn't returned
with pytest.raises(InvalidOrderException):
valid_solution(states1[0:-1], order, drop_zone1)
|
Test missing checking for InvalidOrderException raised
|
Fix: Test missing checking for InvalidOrderException raised
|
Python
|
mit
|
WojciechFocus/MMVD,WojciechFocus/MMVD
|
python
|
## Code Before:
import pytest
from mmvdApp.utils import objective_function
from mmvdApp.utils import valid_solution
@pytest.mark.utils
@pytest.mark.linprog
def test_objective_function(states1):
"""
Test if ``utils.linprog.objective_function`` correctly calculates expected
objective function value.
"""
assert objective_function(states1) == 17
@pytest.mark.utils
@pytest.mark.linprog
def test_solution_validity(states1, order1, drop_zone1):
"""
Test if ``utils.linprog.valid_solution`` correctly checks for valid
solution.
"""
# cut order in half because states1 is intended only for first 3 products
order = order1[0:3]
assert valid_solution(states1, order, drop_zone1)
# check for valid solution when one product isn't returned
assert not valid_solution(states1[0:-1], order, drop_zone1)
## Instruction:
Fix: Test missing checking for InvalidOrderException raised
## Code After:
import pytest
from mmvdApp.utils import objective_function
from mmvdApp.utils import valid_solution
from mmvdApp.utils import InvalidOrderException
@pytest.mark.utils
@pytest.mark.linprog
def test_objective_function(states1):
"""
Test if ``utils.linprog.objective_function`` correctly calculates expected
objective function value.
"""
assert objective_function(states1) == 17
@pytest.mark.utils
@pytest.mark.linprog
def test_solution_validity(states1, order1, drop_zone1):
"""
Test if ``utils.linprog.valid_solution`` correctly checks for valid
solution.
"""
# cut order in half because states1 is intended only for first 3 products
order = order1[0:3]
assert valid_solution(states1, order, drop_zone1)
# check for valid solution when one product isn't returned
with pytest.raises(InvalidOrderException):
valid_solution(states1[0:-1], order, drop_zone1)
|
78bd5327d990fc3d8777b0f9b0c156799baf5298
|
CHANGELOG.md
|
CHANGELOG.md
|
- Fixed wrong parse error type returned by `GenericParser.unexpected()`
- Added missing guard statement to prevent crash in `UnicodeScalar.fromUInt32()`
- Added `ClosedInterval` variant of `ParsecType.oneOf()`
- Migration to Swift 2.2
- Internal code improvement
- Increased tests coverage
- Documentation improvement
|
- Migrated source code to Swift 3.0
- Now Support Swift Package Manager
- Improved files and source code layout
- More documentation
## Performance
A benchmark was added to test the performance of the library.
An internal design modification greatly improved the parsing speed and memory
usage. Before the modification the benchmark measured 648.32s (≈10.8m) to
execute the parsing of a huge JSON file. Now it only takes 6.7s, a bit more than
96 times faster!
## API
- Added the `userState: GenericParser<StreamType, UserState, UserState>`
parser.
- Now the `run(userState: UserState, sourceName: String, input: StreamType)
throws -> Result` only returns the result of the parsing. As an example, if one
wants to get the user state and the result at the same time:
```
let countLine = GenericParser<String, Int, Character>.endOfLine >>- { newLine in
GenericParser<String, Int, Int>.userState >>- { userState in
GenericParser(result: (newLine, userState + 1))
}
}
```
- Added the `Parsec.runSafe(userState: UserState, sourceName: String,
input: StreamType) -> Either<ParseError, Result>` method. This new method does
not throw exceptions but returns the result wrap in an `Either` type.
- Added a parser returning the current source position:
`GenericParser.sourcePosition: GenericParser<StreamType, UserState, SourcePosition>`
- Various minor changes to conform to the Swift API design guide lines
# Release 1.1
- Fixed wrong parse error type returned by `GenericParser.unexpected()`
- Added missing guard statement to prevent crash in `UnicodeScalar.fromUInt32()`
- Added `ClosedInterval` variant of `ParsecType.oneOf()`
- Migration to Swift 2.2
- Internal code improvement
- Increased tests coverage
- Documentation improvement
|
Update change log for SwiftParsec 2.0
|
Update change log for SwiftParsec 2.0
|
Markdown
|
bsd-2-clause
|
davedufresne/SwiftParsec,davedufresne/SwiftParsec,davedufresne/SwiftParsec
|
markdown
|
## Code Before:
- Fixed wrong parse error type returned by `GenericParser.unexpected()`
- Added missing guard statement to prevent crash in `UnicodeScalar.fromUInt32()`
- Added `ClosedInterval` variant of `ParsecType.oneOf()`
- Migration to Swift 2.2
- Internal code improvement
- Increased tests coverage
- Documentation improvement
## Instruction:
Update change log for SwiftParsec 2.0
## Code After:
- Migrated source code to Swift 3.0
- Now Support Swift Package Manager
- Improved files and source code layout
- More documentation
## Performance
A benchmark was added to test the performance of the library.
An internal design modification greatly improved the parsing speed and memory
usage. Before the modification the benchmark measured 648.32s (≈10.8m) to
execute the parsing of a huge JSON file. Now it only takes 6.7s, a bit more than
96 times faster!
## API
- Added the `userState: GenericParser<StreamType, UserState, UserState>`
parser.
- Now the `run(userState: UserState, sourceName: String, input: StreamType)
throws -> Result` only returns the result of the parsing. As an example, if one
wants to get the user state and the result at the same time:
```
let countLine = GenericParser<String, Int, Character>.endOfLine >>- { newLine in
GenericParser<String, Int, Int>.userState >>- { userState in
GenericParser(result: (newLine, userState + 1))
}
}
```
- Added the `Parsec.runSafe(userState: UserState, sourceName: String,
input: StreamType) -> Either<ParseError, Result>` method. This new method does
not throw exceptions but returns the result wrap in an `Either` type.
- Added a parser returning the current source position:
`GenericParser.sourcePosition: GenericParser<StreamType, UserState, SourcePosition>`
- Various minor changes to conform to the Swift API design guide lines
# Release 1.1
- Fixed wrong parse error type returned by `GenericParser.unexpected()`
- Added missing guard statement to prevent crash in `UnicodeScalar.fromUInt32()`
- Added `ClosedInterval` variant of `ParsecType.oneOf()`
- Migration to Swift 2.2
- Internal code improvement
- Increased tests coverage
- Documentation improvement
|
07d0cafdf2018e8789e4e0671eb0ccc8bbc5c186
|
contrib/sendmail/mail.local/pathnames.h
|
contrib/sendmail/mail.local/pathnames.h
|
/*-
* Copyright (c) 1998 Sendmail, Inc. All rights reserved.
* Copyright (c) 1990, 1993
* The Regents of the University of California. All rights reserved.
*
* By using this file, you agree to the terms and conditions set
* forth in the LICENSE file which can be found at the top level of
* the sendmail distribution.
*
*
* @(#)pathnames.h 8.5 (Berkeley) 5/19/1998
*/
#include <paths.h>
#define _PATH_LOCTMP "/tmp/local.XXXXXX"
|
/*-
* Copyright (c) 1998 Sendmail, Inc. All rights reserved.
* Copyright (c) 1990, 1993
* The Regents of the University of California. All rights reserved.
*
* By using this file, you agree to the terms and conditions set
* forth in the LICENSE file which can be found at the top level of
* the sendmail distribution.
*
*
* @(#)pathnames.h 8.5 (Berkeley) 5/19/1998
* $FreeBSD$
*/
#include <paths.h>
#define _PATH_LOCTMP "/var/tmp/local.XXXXXX"
|
Change location of temporary file from /tmp to /var/tmp. This is a repeat of an earlier commit which apparently got lost with the last import. It helps solve the frequently reported problem
|
Change location of temporary file from /tmp to /var/tmp. This is a
repeat of an earlier commit which apparently got lost with the last
import. It helps solve the frequently reported problem
pid 4032 (mail.local), uid 0 on /: file system full
(though there appears to be a lot of space) caused by idiots sending
30 MB mail messages.
Most-recently-reported-by: jahanur <[email protected]>
Add $FreeBSD$ so that I can check the file back in.
Rejected-by: CVS
|
C
|
bsd-3-clause
|
jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase
|
c
|
## Code Before:
/*-
* Copyright (c) 1998 Sendmail, Inc. All rights reserved.
* Copyright (c) 1990, 1993
* The Regents of the University of California. All rights reserved.
*
* By using this file, you agree to the terms and conditions set
* forth in the LICENSE file which can be found at the top level of
* the sendmail distribution.
*
*
* @(#)pathnames.h 8.5 (Berkeley) 5/19/1998
*/
#include <paths.h>
#define _PATH_LOCTMP "/tmp/local.XXXXXX"
## Instruction:
Change location of temporary file from /tmp to /var/tmp. This is a
repeat of an earlier commit which apparently got lost with the last
import. It helps solve the frequently reported problem
pid 4032 (mail.local), uid 0 on /: file system full
(though there appears to be a lot of space) caused by idiots sending
30 MB mail messages.
Most-recently-reported-by: jahanur <[email protected]>
Add $FreeBSD$ so that I can check the file back in.
Rejected-by: CVS
## Code After:
/*-
* Copyright (c) 1998 Sendmail, Inc. All rights reserved.
* Copyright (c) 1990, 1993
* The Regents of the University of California. All rights reserved.
*
* By using this file, you agree to the terms and conditions set
* forth in the LICENSE file which can be found at the top level of
* the sendmail distribution.
*
*
* @(#)pathnames.h 8.5 (Berkeley) 5/19/1998
* $FreeBSD$
*/
#include <paths.h>
#define _PATH_LOCTMP "/var/tmp/local.XXXXXX"
|
dd2646122915e148f23a26bc7c8fcee076311bbc
|
app/views/repos/index.html.erb
|
app/views/repos/index.html.erb
|
<h1>Listing repos</h1>
<table>
<thead>
<tr>
<th>Name</th>
</tr>
</thead>
<tbody>
<% @repos.each do |repo| %>
<tr>
<td><%= link_to repo.full_name, issues_path(org: repo_full_name_to_org(repo.full_name), repo: repo_full_name_to_name(repo.full_name)) %></td>
</tr>
<% end %>
</tbody>
</table>
|
<h1>Listing repos</h1>
<table class="table">
<thead>
<tr>
<th>Name</th>
</tr>
</thead>
<tbody>
<% @repos.each do |repo| %>
<tr>
<td><%= link_to repo.full_name, issues_path(org: repo_full_name_to_org(repo.full_name), repo: repo_full_name_to_name(repo.full_name)) %></td>
</tr>
<% end %>
</tbody>
</table>
|
Add style to repos index
|
Add style to repos index
|
HTML+ERB
|
mit
|
ajsharma/monologue,ajsharma/monologue,ajsharma/monologue,ajsharma/monologue
|
html+erb
|
## Code Before:
<h1>Listing repos</h1>
<table>
<thead>
<tr>
<th>Name</th>
</tr>
</thead>
<tbody>
<% @repos.each do |repo| %>
<tr>
<td><%= link_to repo.full_name, issues_path(org: repo_full_name_to_org(repo.full_name), repo: repo_full_name_to_name(repo.full_name)) %></td>
</tr>
<% end %>
</tbody>
</table>
## Instruction:
Add style to repos index
## Code After:
<h1>Listing repos</h1>
<table class="table">
<thead>
<tr>
<th>Name</th>
</tr>
</thead>
<tbody>
<% @repos.each do |repo| %>
<tr>
<td><%= link_to repo.full_name, issues_path(org: repo_full_name_to_org(repo.full_name), repo: repo_full_name_to_name(repo.full_name)) %></td>
</tr>
<% end %>
</tbody>
</table>
|
457d6747dbdb4a45a32e8c804f2219c7eebb048c
|
.travis.yml
|
.travis.yml
|
sudo: false
language: ruby
# Needed for rainbow 2.2.1 / rubygems issues.
before_install:
- |
if [[ "$(ruby -e 'puts RUBY_VERSION')" != 1.* ]]; then gem update --system; fi
rvm:
- 1.9.3
- 2.3.8
- 2.4.5
- 2.5.3
- 2.6.0
- jruby-9.1.5.0
|
sudo: false
language: ruby
# Needed for rainbow 2.2.1 / rubygems issues.
before_install:
- |
if [[ "$(ruby -e 'puts RUBY_VERSION')" != 1.* ]]; then gem update --system; fi
rvm:
- 2.3.8
- 2.4.5
- 2.5.3
- 2.6.0
- jruby-9.1.5.0
|
Remove support for Ruby 1.9.3 and update current versions
|
Remove support for Ruby 1.9.3 and update current versions
Ruby < 2.0 has been EOL'd over 3 years ago, and we want to use kwargs,
so we're dropping support for it.
Signed-off-by: Daniel Magliola <[email protected]>
|
YAML
|
apache-2.0
|
prometheus/client_ruby
|
yaml
|
## Code Before:
sudo: false
language: ruby
# Needed for rainbow 2.2.1 / rubygems issues.
before_install:
- |
if [[ "$(ruby -e 'puts RUBY_VERSION')" != 1.* ]]; then gem update --system; fi
rvm:
- 1.9.3
- 2.3.8
- 2.4.5
- 2.5.3
- 2.6.0
- jruby-9.1.5.0
## Instruction:
Remove support for Ruby 1.9.3 and update current versions
Ruby < 2.0 has been EOL'd over 3 years ago, and we want to use kwargs,
so we're dropping support for it.
Signed-off-by: Daniel Magliola <[email protected]>
## Code After:
sudo: false
language: ruby
# Needed for rainbow 2.2.1 / rubygems issues.
before_install:
- |
if [[ "$(ruby -e 'puts RUBY_VERSION')" != 1.* ]]; then gem update --system; fi
rvm:
- 2.3.8
- 2.4.5
- 2.5.3
- 2.6.0
- jruby-9.1.5.0
|
7730cc9a11fdf5df33a7012e87c699a363a344af
|
public/modules/campaigns/filter/campaign.expirydate.filter.js
|
public/modules/campaigns/filter/campaign.expirydate.filter.js
|
'use strict';
angular.module('campaign').filter('daysflt', function() {
return function days(value) {
var filteredDay;
console.log(value);
if(value.hoursLeft) {
if(value.hoursLeft <= 1){
filteredDay = 1 + ' Hour';
}
else {
filteredDay = value.hoursLeft + ' Hours';
}
}
else if (value === 1) {
filteredDay = '1 day';
}
else if (value > 1 || value === 0) {
filteredDay = value + ' days';
}
else {
filteredDay = 'This campaign is likely expired, no days ';
}
return filteredDay;
};
});
|
'use strict';
angular.module('campaign').filter('daysflt', function() {
return function days(value) {
var filteredDay;
console.log(value);
if(value.hoursLeft) {
if(value.hoursLeft <= 1){
filteredDay = 1 + ' Hour';
}
else {
filteredDay = value.hoursLeft + ' Hours';
}
}
else if (value === 1) {
filteredDay = '1 day';
}
else if (value > 1 || value === 0) {
filteredDay = value + ' days';
}
else {
filteredDay = '';
}
return filteredDay;
};
});
|
Remove unneeded value for campaign 'Days Left'
|
Remove unneeded value for campaign 'Days Left'
|
JavaScript
|
mit
|
andela/mando,hisabimbola/mando,andela/mando,hisabimbola/mando,andela/mando,hisabimbola/mando
|
javascript
|
## Code Before:
'use strict';
angular.module('campaign').filter('daysflt', function() {
return function days(value) {
var filteredDay;
console.log(value);
if(value.hoursLeft) {
if(value.hoursLeft <= 1){
filteredDay = 1 + ' Hour';
}
else {
filteredDay = value.hoursLeft + ' Hours';
}
}
else if (value === 1) {
filteredDay = '1 day';
}
else if (value > 1 || value === 0) {
filteredDay = value + ' days';
}
else {
filteredDay = 'This campaign is likely expired, no days ';
}
return filteredDay;
};
});
## Instruction:
Remove unneeded value for campaign 'Days Left'
## Code After:
'use strict';
angular.module('campaign').filter('daysflt', function() {
return function days(value) {
var filteredDay;
console.log(value);
if(value.hoursLeft) {
if(value.hoursLeft <= 1){
filteredDay = 1 + ' Hour';
}
else {
filteredDay = value.hoursLeft + ' Hours';
}
}
else if (value === 1) {
filteredDay = '1 day';
}
else if (value > 1 || value === 0) {
filteredDay = value + ' days';
}
else {
filteredDay = '';
}
return filteredDay;
};
});
|
8c6630881746abf6cd6cf8324003d8099b1b4d97
|
config/initializers/attachment_fu_file_mime_type.rb
|
config/initializers/attachment_fu_file_mime_type.rb
|
module Technoweenie::AttachmentFu::InstanceMethods
def uploaded_data_with_unix_file_mime_type=(file_data)
tmp_file = self.uploaded_data_without_unix_file_mime_type=(file_data)
if tmp_file.present? && (unix_file = `which file`.chomp).present? && File.exists?(unix_file)
`#{ unix_file } -v` =~ /^file-(.*)$/
version = $1.to_i
self.content_type = case version
when 5
`#{ unix_file } -b --mime-type #{ tmp_file.path }`.chomp
else
`#{ unix_file } -bi #{ tmp_file.path }`.chomp =~ /(\w*\/[\w+-\.]*)/
$1
end
end
tmp_file
end
alias_method_chain :uploaded_data=, :unix_file_mime_type
end
|
module Technoweenie::AttachmentFu::InstanceMethods
def uploaded_data_with_unix_file_mime_type=(file_data)
tmp_file = self.uploaded_data_without_unix_file_mime_type=(file_data)
if tmp_file.present? && (unix_file = `which file`.chomp).present? && File.exists?(unix_file)
`#{ unix_file } -v 2>&1` =~ /^file-(.*)$/
version = $1
self.content_type = if version >= "4.26"
`#{ unix_file } -b --mime-type #{ tmp_file.path }`.chomp
else
`#{ unix_file } -bi #{ tmp_file.path }`.chomp =~ /(\w*\/[\w+-\.]*)/
$1
end
end
tmp_file
end
alias_method_chain :uploaded_data=, :unix_file_mime_type
end
|
Make unix_file_mime_type compatible with file versions < 5
|
Make unix_file_mime_type compatible with file versions < 5
|
Ruby
|
agpl-3.0
|
ging/vcc,ging/vcc,fbottin/mconf-web,amreis/mconf-web,amreis/mconf-web,ging/vcc,lfzawacki/mconf-web,mconf-ufrgs/mconf-web,mconf-rnp/mconf-web,mconf-ufrgs/mconf-web,mconf-rnp/mconf-web,amreis/mconf-web,fbottin/mconf-web,akratech/Akraconference,mconf-rnp/mconf-web,mconftec/mconf-web-santacasa,arrivu/web-conf,arrivu/web-conf,mconf/mconf-web,becueb/MconfWeb,m-narayan/agora-conf,mconftec/mconf-web-santacasa,m-narayan/agora-conf,lfzawacki/mconf-web,arrivu/web-conf,ging/vcc,arrivu/web-conf,akratech/Akraconference,mconftec/mconf-web-santacasa,lfzawacki/mconf-web,mconf-ufrgs/mconf-web,mconf/mconf-web,becueb/MconfWeb,lfzawacki/mconf-web,mconftec/mconf-web-mytruecloud,becueb/MconfWeb,fbottin/mconf-web,mconftec/mconf-web-uergs,mconftec/mconf-web-cedia,m-narayan/agora-conf,mconftec/mconf-web-ufpe,mconftec/mconf-web-mytruecloud,m-narayan/agora-conf,mconf-rnp/mconf-web,mconftec/mconf-web-uergs,mconftec/mconf-web-santacasa,mconftec/mconf-web-cedia,mconftec/mconf-web-uergs,ging/vcc,mconftec/mconf-web-cedia,arrivu/web-conf,mconf-ufrgs/mconf-web,m-narayan/agora-conf,ging/vcc,fbottin/mconf-web,mconftec/mconf-web-mytruecloud,akratech/Akraconference,m-narayan/agora-conf,mconf/mconf-web,arrivu/web-conf,mconf/mconf-web,ging/vcc,becueb/MconfWeb,amreis/mconf-web,mconftec/mconf-web-mytruecloud,m-narayan/agora-conf,mconftec/mconf-web-ufpe,akratech/Akraconference,mconftec/mconf-web-ufpe,mconftec/mconf-web-uergs,mconftec/mconf-web-ufpe,arrivu/web-conf
|
ruby
|
## Code Before:
module Technoweenie::AttachmentFu::InstanceMethods
def uploaded_data_with_unix_file_mime_type=(file_data)
tmp_file = self.uploaded_data_without_unix_file_mime_type=(file_data)
if tmp_file.present? && (unix_file = `which file`.chomp).present? && File.exists?(unix_file)
`#{ unix_file } -v` =~ /^file-(.*)$/
version = $1.to_i
self.content_type = case version
when 5
`#{ unix_file } -b --mime-type #{ tmp_file.path }`.chomp
else
`#{ unix_file } -bi #{ tmp_file.path }`.chomp =~ /(\w*\/[\w+-\.]*)/
$1
end
end
tmp_file
end
alias_method_chain :uploaded_data=, :unix_file_mime_type
end
## Instruction:
Make unix_file_mime_type compatible with file versions < 5
## Code After:
module Technoweenie::AttachmentFu::InstanceMethods
def uploaded_data_with_unix_file_mime_type=(file_data)
tmp_file = self.uploaded_data_without_unix_file_mime_type=(file_data)
if tmp_file.present? && (unix_file = `which file`.chomp).present? && File.exists?(unix_file)
`#{ unix_file } -v 2>&1` =~ /^file-(.*)$/
version = $1
self.content_type = if version >= "4.26"
`#{ unix_file } -b --mime-type #{ tmp_file.path }`.chomp
else
`#{ unix_file } -bi #{ tmp_file.path }`.chomp =~ /(\w*\/[\w+-\.]*)/
$1
end
end
tmp_file
end
alias_method_chain :uploaded_data=, :unix_file_mime_type
end
|
621a058c04302262982d5bedd0a0cd89800946d7
|
styles/ui/editor.styl
|
styles/ui/editor.styl
|
editor-tag(color=color-primary, color-text=#fff)
background color
padding 2px
margin 0 -2px
border-radius()
color color-text
editor
min-height 460px
display block
border-radius()
&.ace_editor
line-height 26px
font-size 15px
.ace_content .ace_line
.ace_comment
color #aaa
.ace_keyword
editor-tag(color-language-keyword, color-black)
.ace_string
editor-tag(color-language-string)
.ace_numeric
editor-tag(color-language-number)
.ace_identifier
editor-tag(color-language-identifier)
.ace_gutter-cell
color #aaa
|
editor-tag(color=color-primary, color-text=#fff)
background color
padding 2px
margin 0 -2px
border-radius()
color color-text
editor
min-height 460px
display block
border-radius()
&.ace_editor
line-height 26px
font-size 15px
.ace_content .ace_line
.ace_comment
color #aaa
.ace_keyword
editor-tag(color-language-keyword, color-black)
.ace_string
editor-tag(color-language-string)
.ace_numeric
editor-tag(color-language-number)
.ace_identifier
editor-tag(color-language-identifier)
.ace_gutter-cell
color #ccc
// .ace_gutter
// display none
|
Make line number less relevant
|
Make line number less relevant
|
Stylus
|
mit
|
tancredi/draw,tancredi/draw,tancredi/draw
|
stylus
|
## Code Before:
editor-tag(color=color-primary, color-text=#fff)
background color
padding 2px
margin 0 -2px
border-radius()
color color-text
editor
min-height 460px
display block
border-radius()
&.ace_editor
line-height 26px
font-size 15px
.ace_content .ace_line
.ace_comment
color #aaa
.ace_keyword
editor-tag(color-language-keyword, color-black)
.ace_string
editor-tag(color-language-string)
.ace_numeric
editor-tag(color-language-number)
.ace_identifier
editor-tag(color-language-identifier)
.ace_gutter-cell
color #aaa
## Instruction:
Make line number less relevant
## Code After:
editor-tag(color=color-primary, color-text=#fff)
background color
padding 2px
margin 0 -2px
border-radius()
color color-text
editor
min-height 460px
display block
border-radius()
&.ace_editor
line-height 26px
font-size 15px
.ace_content .ace_line
.ace_comment
color #aaa
.ace_keyword
editor-tag(color-language-keyword, color-black)
.ace_string
editor-tag(color-language-string)
.ace_numeric
editor-tag(color-language-number)
.ace_identifier
editor-tag(color-language-identifier)
.ace_gutter-cell
color #ccc
// .ace_gutter
// display none
|