commit | old_file | new_file | old_contents | new_contents | subject | message | lang | license | repos | config | content
---|---|---|---|---|---|---|---|---|---|---|---
stringlengths 40-40 | stringlengths 4-237 | stringlengths 4-237 | stringlengths 1-4.24k | stringlengths 5-4.84k | stringlengths 15-778 | stringlengths 16-6.86k | stringlengths 1-30 | stringclasses 13 values | stringlengths 5-116k | stringlengths 1-30 | stringlengths 105-8.72k
f81487d6c0de7b961503a73144b30dcb3faab0a1 | src/Main.elm | src/Main.elm | module Main exposing (main)
import Html exposing (..)
import Html.Attributes exposing (..)
import ChartCard
import Samples
main : Program Never Model Msg
main =
Html.beginnerProgram
{ model = model
, view = view
, update = update
}
-- MODEL
type alias Model =
List ChartCard.Model
model : Model
model =
[ ChartCard.init Samples.allOfMe
, ChartCard.init Samples.allOfMe
]
-- MSG
type Msg
= ChartCard Int ChartCard.Msg
-- UPDATE
update : Msg -> Model -> Model
update msg model =
case msg of
ChartCard msgIndex nestedMsg ->
List.indexedMap
(\index item ->
if index == msgIndex then
ChartCard.update nestedMsg item
else
item
)
model
-- VIEW
view : Model -> Html Msg
view model =
div []
[ h1 []
[ text "Music chart viewer/editor" ]
, p []
[ text "Source code on "
, a [ href "https://github.com/openchordcharts/chart-editor" ]
[ text "GitHub" ]
]
, section []
(List.indexedMap
(\index item -> Html.map (ChartCard index) (ChartCard.view item))
model
)
]
| module Main exposing (main)
import Html exposing (..)
import Html.Attributes exposing (..)
import ChartCard
import Samples
main : Program Never Model Msg
main =
Html.beginnerProgram
{ model = model
, view = view
, update = update
}
-- MODEL
type alias Model =
List ChartCard.Model
model : Model
model =
[ ChartCard.init Samples.allOfMe
, ChartCard.init Samples.allOfMe
]
-- MSG
type Msg
= ChartCardMsg Int ChartCard.Msg
-- UPDATE
update : Msg -> Model -> Model
update msg model =
case msg of
ChartCardMsg msgIndex nestedMsg ->
List.indexedMap
(\index item ->
if index == msgIndex then
ChartCard.update nestedMsg item
else
item
)
model
-- VIEW
view : Model -> Html Msg
view model =
div []
[ h1 []
[ text "Music chart viewer/editor" ]
, p []
[ text "Source code on "
, a [ href "https://github.com/openchordcharts/chart-editor" ]
[ text "GitHub" ]
]
, section []
(List.indexedMap
(\index chartModel -> Html.map (ChartCardMsg index) (ChartCard.view chartModel))
model
)
]
| Rename ChartCard to ChartCardMsg to explicit it's a Msg | Rename ChartCard to ChartCardMsg to explicit it's a Msg
| Elm | agpl-3.0 | open-chords-charts/chart-editor,open-chords-charts/chart-editor | elm | ## Code Before:
module Main exposing (main)
import Html exposing (..)
import Html.Attributes exposing (..)
import ChartCard
import Samples
main : Program Never Model Msg
main =
Html.beginnerProgram
{ model = model
, view = view
, update = update
}
-- MODEL
type alias Model =
List ChartCard.Model
model : Model
model =
[ ChartCard.init Samples.allOfMe
, ChartCard.init Samples.allOfMe
]
-- MSG
type Msg
= ChartCard Int ChartCard.Msg
-- UPDATE
update : Msg -> Model -> Model
update msg model =
case msg of
ChartCard msgIndex nestedMsg ->
List.indexedMap
(\index item ->
if index == msgIndex then
ChartCard.update nestedMsg item
else
item
)
model
-- VIEW
view : Model -> Html Msg
view model =
div []
[ h1 []
[ text "Music chart viewer/editor" ]
, p []
[ text "Source code on "
, a [ href "https://github.com/openchordcharts/chart-editor" ]
[ text "GitHub" ]
]
, section []
(List.indexedMap
(\index item -> Html.map (ChartCard index) (ChartCard.view item))
model
)
]
## Instruction:
Rename ChartCard to ChartCardMsg to explicit it's a Msg
## Code After:
module Main exposing (main)
import Html exposing (..)
import Html.Attributes exposing (..)
import ChartCard
import Samples
main : Program Never Model Msg
main =
Html.beginnerProgram
{ model = model
, view = view
, update = update
}
-- MODEL
type alias Model =
List ChartCard.Model
model : Model
model =
[ ChartCard.init Samples.allOfMe
, ChartCard.init Samples.allOfMe
]
-- MSG
type Msg
= ChartCardMsg Int ChartCard.Msg
-- UPDATE
update : Msg -> Model -> Model
update msg model =
case msg of
ChartCardMsg msgIndex nestedMsg ->
List.indexedMap
(\index item ->
if index == msgIndex then
ChartCard.update nestedMsg item
else
item
)
model
-- VIEW
view : Model -> Html Msg
view model =
div []
[ h1 []
[ text "Music chart viewer/editor" ]
, p []
[ text "Source code on "
, a [ href "https://github.com/openchordcharts/chart-editor" ]
[ text "GitHub" ]
]
, section []
(List.indexedMap
(\index chartModel -> Html.map (ChartCardMsg index) (ChartCard.view chartModel))
model
)
]
|
6525fd5cd449a1345f20c8b423a31f2cfb4f829c | data/examples/asciidoc/block_admonition.adoc | data/examples/asciidoc/block_admonition.adoc | // .basic
WARNING: Watch out for dragons!
// .basic_multiline
NOTE: An admonition paragraph draws the reader's attention to
auxiliary information.
Its purpose is determined by the label
at the beginning of the paragraph.
// .block
[IMPORTANT]
====
While werewolves are hardy community members, keep in mind some dietary concerns.
====
// .block_with_title
[IMPORTANT]
.Feeding the Werewolves
====
While werewolves are hardy community members, keep in mind some dietary concerns.
====
// .block_with_id_and_role
[IMPORTANT, id=werewolve, role=member]
====
While werewolves are hardy community members, keep in mind some dietary concerns.
====
| // .note
NOTE: This is a note.
// .note_with_title
.Title of note
NOTE: This is a note with title.
// .note_with_id_and_role
[#note-1.yellow]
NOTE: This is a note with id and role.
// .note_block
[NOTE]
====
This is a note with complex content.
* It contains a list.
====
// .tip
TIP: This is a tip.
// .tip_with_title
.Title of tip
TIP: This is a tip with title.
// .tip_with_id_and_role
[#tip-1.blue]
TIP: This is a tip with id and role.
// .tip_block
[TIP]
====
This is a tip with complex content.
* It contains a list.
====
// .important
IMPORTANT: This is an important notice.
// .important_with_title
.Title of important notice
IMPORTANT: This is an important notice with title.
// .important_with_id_and_role
[#important-1.red]
IMPORTANT: This is an important notice with id and role.
// .important_block
[IMPORTANT]
====
This is an important notice with complex content.
* It contains a list.
====
// .caution
CAUTION: This is a caution.
// .caution_with_title
.Title of caution
CAUTION: This is a caution with title.
// .caution_with_id_and_role
[#caution-1.red]
CAUTION: This is a caution with id and role.
// .caution_block
[CAUTION]
====
This is a caution with complex content.
* It contains a list.
====
// .warning
WARNING: This is a warning.
// .warning_with_title
.Title of warning
WARNING: This is a warning with title.
// .warning_with_id_and_role
[#warning-1.red]
WARNING: This is a warning with id and role.
// .warning_block
[WARNING]
====
This is a warning with complex content.
* It contains a list.
====
| Add full set of examples for each admonition type | Examples: Add full set of examples for each admonition type
They may be rendered differently, for example note and tip as aside,
others as section.
| AsciiDoc | mit | asciidoctor/asciidoctor-doctest,asciidoctor/asciidoctor-doctest | asciidoc | ## Code Before:
// .basic
WARNING: Watch out for dragons!
// .basic_multiline
NOTE: An admonition paragraph draws the reader's attention to
auxiliary information.
Its purpose is determined by the label
at the beginning of the paragraph.
// .block
[IMPORTANT]
====
While werewolves are hardy community members, keep in mind some dietary concerns.
====
// .block_with_title
[IMPORTANT]
.Feeding the Werewolves
====
While werewolves are hardy community members, keep in mind some dietary concerns.
====
// .block_with_id_and_role
[IMPORTANT, id=werewolve, role=member]
====
While werewolves are hardy community members, keep in mind some dietary concerns.
====
## Instruction:
Examples: Add full set of examples for each admonition type
They may be rendered differently, for example note and tip as aside,
others as section.
## Code After:
// .note
NOTE: This is a note.
// .note_with_title
.Title of note
NOTE: This is a note with title.
// .note_with_id_and_role
[#note-1.yellow]
NOTE: This is a note with id and role.
// .note_block
[NOTE]
====
This is a note with complex content.
* It contains a list.
====
// .tip
TIP: This is a tip.
// .tip_with_title
.Title of tip
TIP: This is a tip with title.
// .tip_with_id_and_role
[#tip-1.blue]
TIP: This is a tip with id and role.
// .tip_block
[TIP]
====
This is a tip with complex content.
* It contains a list.
====
// .important
IMPORTANT: This is an important notice.
// .important_with_title
.Title of important notice
IMPORTANT: This is an important notice with title.
// .important_with_id_and_role
[#important-1.red]
IMPORTANT: This is an important notice with id and role.
// .important_block
[IMPORTANT]
====
This is an important notice with complex content.
* It contains a list.
====
// .caution
CAUTION: This is a caution.
// .caution_with_title
.Title of caution
CAUTION: This is a caution with title.
// .caution_with_id_and_role
[#caution-1.red]
CAUTION: This is a caution with id and role.
// .caution_block
[CAUTION]
====
This is a caution with complex content.
* It contains a list.
====
// .warning
WARNING: This is a warning.
// .warning_with_title
.Title of warning
WARNING: This is a warning with title.
// .warning_with_id_and_role
[#warning-1.red]
WARNING: This is a warning with id and role.
// .warning_block
[WARNING]
====
This is a warning with complex content.
* It contains a list.
====
|
9e85c73a21a2f907ef6df4ef1fe56fb2ceb9bf07 | library/src/main/java/com/novoda/downloadmanager/LiteDownloadsNetworkRecoveryEnabled.java | library/src/main/java/com/novoda/downloadmanager/LiteDownloadsNetworkRecoveryEnabled.java | package com.novoda.downloadmanager;
import android.content.Context;
import android.util.Log;
import com.evernote.android.job.JobManager;
import com.evernote.android.job.JobRequest;
import java.util.concurrent.TimeUnit;
class LiteDownloadsNetworkRecoveryEnabled implements DownloadsNetworkRecovery {
private final ConnectionType connectionType;
LiteDownloadsNetworkRecoveryEnabled(Context context, DownloadManager downloadManager, ConnectionType connectionType) {
this.connectionType = connectionType;
JobManager jobManager = JobManager.create(context);
jobManager.addJobCreator(new LiteJobCreator(downloadManager));
}
@Override
public void scheduleRecovery() {
JobRequest.Builder builder = new JobRequest.Builder(LiteJobCreator.TAG)
.setExecutionWindow(TimeUnit.SECONDS.toMillis(1), TimeUnit.DAYS.toMillis(1));
switch (connectionType) {
case ALL:
builder.setRequiredNetworkType(JobRequest.NetworkType.CONNECTED);
break;
case UNMETERED:
builder.setRequiredNetworkType(JobRequest.NetworkType.UNMETERED);
break;
case METERED:
builder.setRequiredNetworkType(JobRequest.NetworkType.METERED);
break;
default:
Log.w(getClass().getSimpleName(), "Unknown ConnectionType: " + connectionType);
break;
}
JobRequest jobRequest = builder.build();
JobManager jobManager = JobManager.instance();
jobManager.schedule(jobRequest);
}
}
| package com.novoda.downloadmanager;
import android.content.Context;
import com.evernote.android.job.JobManager;
import com.evernote.android.job.JobRequest;
import com.novoda.notils.logger.simple.Log;
import java.util.concurrent.TimeUnit;
class LiteDownloadsNetworkRecoveryEnabled implements DownloadsNetworkRecovery {
private final ConnectionType connectionType;
LiteDownloadsNetworkRecoveryEnabled(Context context, DownloadManager downloadManager, ConnectionType connectionType) {
this.connectionType = connectionType;
JobManager jobManager = JobManager.create(context);
jobManager.addJobCreator(new LiteJobCreator(downloadManager));
}
@Override
public void scheduleRecovery() {
JobRequest.Builder builder = new JobRequest.Builder(LiteJobCreator.TAG)
.setExecutionWindow(TimeUnit.SECONDS.toMillis(1), TimeUnit.DAYS.toMillis(1));
switch (connectionType) {
case ALL:
builder.setRequiredNetworkType(JobRequest.NetworkType.CONNECTED);
break;
case UNMETERED:
builder.setRequiredNetworkType(JobRequest.NetworkType.UNMETERED);
break;
case METERED:
builder.setRequiredNetworkType(JobRequest.NetworkType.METERED);
break;
default:
Log.w("Unknown ConnectionType: " + connectionType);
break;
}
JobRequest jobRequest = builder.build();
JobManager jobManager = JobManager.instance();
jobManager.schedule(jobRequest);
Log.v("Scheduling Network Recovery.");
}
}
| Add some logs to scheduling. | Add some logs to scheduling.
| Java | apache-2.0 | novoda/download-manager | java | ## Code Before:
package com.novoda.downloadmanager;
import android.content.Context;
import android.util.Log;
import com.evernote.android.job.JobManager;
import com.evernote.android.job.JobRequest;
import java.util.concurrent.TimeUnit;
class LiteDownloadsNetworkRecoveryEnabled implements DownloadsNetworkRecovery {
private final ConnectionType connectionType;
LiteDownloadsNetworkRecoveryEnabled(Context context, DownloadManager downloadManager, ConnectionType connectionType) {
this.connectionType = connectionType;
JobManager jobManager = JobManager.create(context);
jobManager.addJobCreator(new LiteJobCreator(downloadManager));
}
@Override
public void scheduleRecovery() {
JobRequest.Builder builder = new JobRequest.Builder(LiteJobCreator.TAG)
.setExecutionWindow(TimeUnit.SECONDS.toMillis(1), TimeUnit.DAYS.toMillis(1));
switch (connectionType) {
case ALL:
builder.setRequiredNetworkType(JobRequest.NetworkType.CONNECTED);
break;
case UNMETERED:
builder.setRequiredNetworkType(JobRequest.NetworkType.UNMETERED);
break;
case METERED:
builder.setRequiredNetworkType(JobRequest.NetworkType.METERED);
break;
default:
Log.w(getClass().getSimpleName(), "Unknown ConnectionType: " + connectionType);
break;
}
JobRequest jobRequest = builder.build();
JobManager jobManager = JobManager.instance();
jobManager.schedule(jobRequest);
}
}
## Instruction:
Add some logs to scheduling.
## Code After:
package com.novoda.downloadmanager;
import android.content.Context;
import com.evernote.android.job.JobManager;
import com.evernote.android.job.JobRequest;
import com.novoda.notils.logger.simple.Log;
import java.util.concurrent.TimeUnit;
class LiteDownloadsNetworkRecoveryEnabled implements DownloadsNetworkRecovery {
private final ConnectionType connectionType;
LiteDownloadsNetworkRecoveryEnabled(Context context, DownloadManager downloadManager, ConnectionType connectionType) {
this.connectionType = connectionType;
JobManager jobManager = JobManager.create(context);
jobManager.addJobCreator(new LiteJobCreator(downloadManager));
}
@Override
public void scheduleRecovery() {
JobRequest.Builder builder = new JobRequest.Builder(LiteJobCreator.TAG)
.setExecutionWindow(TimeUnit.SECONDS.toMillis(1), TimeUnit.DAYS.toMillis(1));
switch (connectionType) {
case ALL:
builder.setRequiredNetworkType(JobRequest.NetworkType.CONNECTED);
break;
case UNMETERED:
builder.setRequiredNetworkType(JobRequest.NetworkType.UNMETERED);
break;
case METERED:
builder.setRequiredNetworkType(JobRequest.NetworkType.METERED);
break;
default:
Log.w("Unknown ConnectionType: " + connectionType);
break;
}
JobRequest jobRequest = builder.build();
JobManager jobManager = JobManager.instance();
jobManager.schedule(jobRequest);
Log.v("Scheduling Network Recovery.");
}
}
|
51b78ee626d766de5f95dca2f512e6cb3d0b15d3 | clappr/src/main/kotlin/io/clappr/player/base/Options.kt | clappr/src/main/kotlin/io/clappr/player/base/Options.kt | package io.clappr.player.base
import java.util.*
class Options(
var source: String? = null,
var mimeType: String? = null,
val options: HashMap<String, Any> = hashMapOf<String, Any>()) : MutableMap<String, Any> by options
enum class ClapprOption(val value: String) {
/**
* Media start position
*/
START_AT("startAt"),
/**
* Poster URL
*/
POSTER("poster"),
/**
* Inform the URL license if DRM is necessary
*/
DRM_LICENSE_URL("drmLicenseUrl"),
/**
* Map from subtitles URL`s with name and URL to each one
*/
SUBTITLES("subtitles"),
/**
* String List to selected MediaOptions
*/
SELECTED_MEDIA_OPTIONS("selectedMediaOptions")
} | package io.clappr.player.base
import java.util.*
class Options(
var source: String? = null,
var mimeType: String? = null,
val options: HashMap<String, Any> = hashMapOf<String, Any>()) : MutableMap<String, Any> by options
enum class ClapprOption(val value: String) {
/**
* Media start position. This value can be a number, but the value may be trunked to an Integer
*/
START_AT("startAt"),
/**
* Poster URL
*/
POSTER("poster"),
/**
* Inform the URL license if DRM is necessary
*/
DRM_LICENSE_URL("drmLicenseUrl"),
/**
* Map from subtitles URL`s with name and URL to each one
*/
SUBTITLES("subtitles"),
/**
* String List to selected MediaOptions
*/
SELECTED_MEDIA_OPTIONS("selectedMediaOptions")
} | Update ClapprOption to better describe the START_AT option | fix(update_start_at_description): Update ClapprOption to better describe the START_AT option
| Kotlin | bsd-3-clause | clappr/clappr-android,clappr/clappr-android | kotlin | ## Code Before:
package io.clappr.player.base
import java.util.*
class Options(
var source: String? = null,
var mimeType: String? = null,
val options: HashMap<String, Any> = hashMapOf<String, Any>()) : MutableMap<String, Any> by options
enum class ClapprOption(val value: String) {
/**
* Media start position
*/
START_AT("startAt"),
/**
* Poster URL
*/
POSTER("poster"),
/**
* Inform the URL license if DRM is necessary
*/
DRM_LICENSE_URL("drmLicenseUrl"),
/**
* Map from subtitles URL`s with name and URL to each one
*/
SUBTITLES("subtitles"),
/**
* String List to selected MediaOptions
*/
SELECTED_MEDIA_OPTIONS("selectedMediaOptions")
}
## Instruction:
fix(update_start_at_description): Update ClapprOption to better describe the START_AT option
## Code After:
package io.clappr.player.base
import java.util.*
class Options(
var source: String? = null,
var mimeType: String? = null,
val options: HashMap<String, Any> = hashMapOf<String, Any>()) : MutableMap<String, Any> by options
enum class ClapprOption(val value: String) {
/**
* Media start position. This value can be a number, but the value may be trunked to an Integer
*/
START_AT("startAt"),
/**
* Poster URL
*/
POSTER("poster"),
/**
* Inform the URL license if DRM is necessary
*/
DRM_LICENSE_URL("drmLicenseUrl"),
/**
* Map from subtitles URL`s with name and URL to each one
*/
SUBTITLES("subtitles"),
/**
* String List to selected MediaOptions
*/
SELECTED_MEDIA_OPTIONS("selectedMediaOptions")
} |
a83b7c7f53004dfd42226e50e44cb6f61466416e | README.md | README.md |
> Preprocessor to use http://github.com/nodeca/mincer for compilation.
## Installation
The easiest way is to keep `karma-mincer-preprocessor` as a devDependency in your `package.json`.
```json
{
"devDependencies": {
"karma": "~0.10",
"karma-coffee-preprocessor": "~0.0.1"
}
}
```
You can simple do it by:
```bash
npm install karma-mincer-preprocessor --save-dev
```
## Configuration
Following code shows the default configuration...
```js
// karma.conf.js
module.exports = function(config) {
config.set({
preprocessors: {
'**/*.js': ['mincer']
},
mincerPreprocessor: {
options: {
// paths mincer will use to search for assets in
paths: [],
// optionally define a function which will receive
// the mincer environment to allow you to set further
// configuration options or register helpers
init: function (environment, options) {
}
}
}
});
};
```
|
> Preprocessor to use http://github.com/nodeca/mincer for compilation.
## Installation
The easiest way is to keep `karma-mincer-preprocessor` as a devDependency in your `package.json`.
```json
{
"devDependencies": {
"karma": "~0.10",
"karma-coffee-preprocessor": "~0.0.1"
}
}
```
You can simple do it by:
```bash
npm install karma-mincer-preprocessor --save-dev
```
## Configuration
Following code shows the default configuration...
```js
// karma.conf.js
module.exports = function(config) {
config.set({
preprocessors: {
'**/*.js': ['mincer']
},
mincerPreprocessor: {
options: {
// paths mincer will use to search for assets in
paths: [],
// optionally define a function which will receive
// the mincer environment to allow you to set further
// configuration options or register helpers
init: function (environment, options) {
environment.registerHelper('value-to-be-used', function () {
// see https://github.com/nodeca/mincer#using-helpers
});
}
}
}
});
};
```
| Add example of helper registration. | Add example of helper registration.
| Markdown | mit | cgc/karma-mincer-preprocessor | markdown | ## Code Before:
> Preprocessor to use http://github.com/nodeca/mincer for compilation.
## Installation
The easiest way is to keep `karma-mincer-preprocessor` as a devDependency in your `package.json`.
```json
{
"devDependencies": {
"karma": "~0.10",
"karma-coffee-preprocessor": "~0.0.1"
}
}
```
You can simple do it by:
```bash
npm install karma-mincer-preprocessor --save-dev
```
## Configuration
Following code shows the default configuration...
```js
// karma.conf.js
module.exports = function(config) {
config.set({
preprocessors: {
'**/*.js': ['mincer']
},
mincerPreprocessor: {
options: {
// paths mincer will use to search for assets in
paths: [],
// optionally define a function which will receive
// the mincer environment to allow you to set further
// configuration options or register helpers
init: function (environment, options) {
}
}
}
});
};
```
## Instruction:
Add example of helper registration.
## Code After:
> Preprocessor to use http://github.com/nodeca/mincer for compilation.
## Installation
The easiest way is to keep `karma-mincer-preprocessor` as a devDependency in your `package.json`.
```json
{
"devDependencies": {
"karma": "~0.10",
"karma-coffee-preprocessor": "~0.0.1"
}
}
```
You can simple do it by:
```bash
npm install karma-mincer-preprocessor --save-dev
```
## Configuration
Following code shows the default configuration...
```js
// karma.conf.js
module.exports = function(config) {
config.set({
preprocessors: {
'**/*.js': ['mincer']
},
mincerPreprocessor: {
options: {
// paths mincer will use to search for assets in
paths: [],
// optionally define a function which will receive
// the mincer environment to allow you to set further
// configuration options or register helpers
init: function (environment, options) {
environment.registerHelper('value-to-be-used', function () {
// see https://github.com/nodeca/mincer#using-helpers
});
}
}
}
});
};
```
|
ed248d9d759949bf415701f72b479817516ceac3 | config/routes.rb | config/routes.rb | Rails.application.routes.draw do
devise_for :users, controllers: { registrations: 'registrations' }
scope "/admin" do
resources :users, only: [:index, :edit, :update, :destroy]
end
resources :agencies
resources :uploads do
member do
get 'download'
end
end
resources :grants do
member do
patch 'update_controls'
post 'add_comment'
end
resources :budgets, only: [:index] do
collection do
patch :bulk_update
end
end
resources :forms
end
match "/401", to: "errors#unauthorized", via: :all
match "/404", to: "errors#not_found", via: :all
match "/500", to: "errors#internal_server_error", via: :all
get 'errors/internal_server_error'
get 'errors/not_found'
get 'errors/unauthorized'
get "/.well-known/acme-challenge/:id",
to: proc {
[200,
{},
["PVltCpMKHMFVvw2JnYNnsKovB8XNwbEm6iB_sjmTBvA.EOz8wpTt91fuDD0G0ziU0_tJHJNh11aZyXsVX3y7mro"]]
}
root 'grants#index'
end
| Rails.application.routes.draw do
devise_for :users, controllers: { registrations: 'registrations' }
scope "/admin" do
resources :users, only: [:index, :edit, :update, :destroy]
end
resources :agencies
resources :uploads do
member do
get 'download'
end
end
resources :grants do
member do
patch 'update_controls'
post 'add_comment'
end
resources :budgets, only: [:index] do
collection do
patch :bulk_update
end
end
resources :forms
end
match "/401", to: "errors#unauthorized", via: :all
match "/404", to: "errors#not_found", via: :all
match "/500", to: "errors#internal_server_error", via: :all
get 'errors/internal_server_error'
get 'errors/not_found'
get 'errors/unauthorized'
get "/.well-known/acme-challenge/:id",
to: proc {
[200,
{},
[ENV['GRANTZILLA_LETS_ENCRYPT_CHALLENGE_ID'] || 'The GRANTZILLA_LETS_ENCRYPT_CHALLENGE_ID environment variable is not set.']]
}
root 'grants#index'
end
| Move lets encrypt challenge id to an environment variable. | issue-289: Move lets encrypt challenge id to an environment variable.
| Ruby | mit | on-site/Grantzilla,on-site/Grantzilla,on-site/Grantzilla | ruby | ## Code Before:
Rails.application.routes.draw do
devise_for :users, controllers: { registrations: 'registrations' }
scope "/admin" do
resources :users, only: [:index, :edit, :update, :destroy]
end
resources :agencies
resources :uploads do
member do
get 'download'
end
end
resources :grants do
member do
patch 'update_controls'
post 'add_comment'
end
resources :budgets, only: [:index] do
collection do
patch :bulk_update
end
end
resources :forms
end
match "/401", to: "errors#unauthorized", via: :all
match "/404", to: "errors#not_found", via: :all
match "/500", to: "errors#internal_server_error", via: :all
get 'errors/internal_server_error'
get 'errors/not_found'
get 'errors/unauthorized'
get "/.well-known/acme-challenge/:id",
to: proc {
[200,
{},
["PVltCpMKHMFVvw2JnYNnsKovB8XNwbEm6iB_sjmTBvA.EOz8wpTt91fuDD0G0ziU0_tJHJNh11aZyXsVX3y7mro"]]
}
root 'grants#index'
end
## Instruction:
issue-289: Move lets encrypt challenge id to an environment variable.
## Code After:
Rails.application.routes.draw do
devise_for :users, controllers: { registrations: 'registrations' }
scope "/admin" do
resources :users, only: [:index, :edit, :update, :destroy]
end
resources :agencies
resources :uploads do
member do
get 'download'
end
end
resources :grants do
member do
patch 'update_controls'
post 'add_comment'
end
resources :budgets, only: [:index] do
collection do
patch :bulk_update
end
end
resources :forms
end
match "/401", to: "errors#unauthorized", via: :all
match "/404", to: "errors#not_found", via: :all
match "/500", to: "errors#internal_server_error", via: :all
get 'errors/internal_server_error'
get 'errors/not_found'
get 'errors/unauthorized'
get "/.well-known/acme-challenge/:id",
to: proc {
[200,
{},
[ENV['GRANTZILLA_LETS_ENCRYPT_CHALLENGE_ID'] || 'The GRANTZILLA_LETS_ENCRYPT_CHALLENGE_ID environment variable is not set.']]
}
root 'grants#index'
end
|
f550fea9c29bca3c9536bc646a9f89d5fe621b36 | models/photo.js | models/photo.js | // grab the things we need
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
// create a schema
var photoSchema = new Schema({
title: String,
slug: String,
status: String,
description: String,
author: {
name: String,
email: String,
username: String
},
meta: {
year: Number,
location: String,
caption: String
},
gallery: [{ name: String, slug: String }],
tags: [{ name: String, slug: String }],
created_at: Date,
updated_at: Date,
});
// the schema is useless so far
// we need to create a model using it
var Photo = mongoose.model('Photo', photoSchema);
// make this available to our users in our Node applications
module.exports = Photo; | /*
|--------------------------------------------------------------------------
| Imports
|--------------------------------------------------------------------------
*/
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
/*
|--------------------------------------------------------------------------
| Define the actual schema
|--------------------------------------------------------------------------
*/
var photoSchema = new Schema({
title: String,
slug: String,
status: String,
description: String,
author: {
name: String,
email: String,
username: String
},
meta: {
year: Number,
location: String,
caption: String
},
gallery: [{ name: String, slug: String }],
tags: [{ name: String, slug: String }],
created_at: Date,
updated_at: Date,
});
/*
|--------------------------------------------------------------------------
| Assign the schema and export
|--------------------------------------------------------------------------
*/
var Photo = mongoose.model('Photo', photoSchema);
module.exports = Photo; | Update comments in Photo model | Update comments in Photo model
| JavaScript | mit | mhovey/photos-api,mhovey/photos | javascript | ## Code Before:
// grab the things we need
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
// create a schema
var photoSchema = new Schema({
title: String,
slug: String,
status: String,
description: String,
author: {
name: String,
email: String,
username: String
},
meta: {
year: Number,
location: String,
caption: String
},
gallery: [{ name: String, slug: String }],
tags: [{ name: String, slug: String }],
created_at: Date,
updated_at: Date,
});
// the schema is useless so far
// we need to create a model using it
var Photo = mongoose.model('Photo', photoSchema);
// make this available to our users in our Node applications
module.exports = Photo;
## Instruction:
Update comments in Photo model
## Code After:
/*
|--------------------------------------------------------------------------
| Imports
|--------------------------------------------------------------------------
*/
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
/*
|--------------------------------------------------------------------------
| Define the actual schema
|--------------------------------------------------------------------------
*/
var photoSchema = new Schema({
title: String,
slug: String,
status: String,
description: String,
author: {
name: String,
email: String,
username: String
},
meta: {
year: Number,
location: String,
caption: String
},
gallery: [{ name: String, slug: String }],
tags: [{ name: String, slug: String }],
created_at: Date,
updated_at: Date,
});
/*
|--------------------------------------------------------------------------
| Assign the schema and export
|--------------------------------------------------------------------------
*/
var Photo = mongoose.model('Photo', photoSchema);
module.exports = Photo; |
b05009ad140911ba3e3dcb15627c6408e75bc3d0 | vm/builtin/atomic.cpp | vm/builtin/atomic.cpp |
namespace rubinius {
void AtomicReference::init(STATE) {
GO(atomic_ref).set(ontology::new_class(state,
"AtomicReference", G(object), G(rubinius)));
G(atomic_ref)->set_object_type(state, AtomicReferenceType);
}
AtomicReference* AtomicReference::allocate(STATE) {
return state->new_object<AtomicReference>(G(atomic_ref));
}
Object* AtomicReference::compare_and_set(STATE, Object* old, Object* new_) {
Object** pp = &value_;
return atomic::compare_and_swap((void**)pp, old, new_) ? cTrue : cFalse;
}
}
|
namespace rubinius {
void AtomicReference::init(STATE) {
GO(atomic_ref).set(ontology::new_class(state,
"AtomicReference", G(object), G(rubinius)));
G(atomic_ref)->set_object_type(state, AtomicReferenceType);
}
AtomicReference* AtomicReference::allocate(STATE) {
return state->new_object<AtomicReference>(G(atomic_ref));
}
Object* AtomicReference::compare_and_set(STATE, Object* old, Object* new_) {
Object** pp = &value_;
if (atomic::compare_and_swap((void**)pp, old, new_)) {
if(mature_object_p()) this->write_barrier(state, new_);
return cTrue;
} else {
return cFalse;
}
}
}
| Fix GC-related SEGV when heavily using AtomicReference | Fix GC-related SEGV when heavily using AtomicReference
| C++ | mpl-2.0 | benlovell/rubinius,lgierth/rubinius,kachick/rubinius,jsyeo/rubinius,benlovell/rubinius,jemc/rubinius,dblock/rubinius,mlarraz/rubinius,ngpestelos/rubinius,dblock/rubinius,Wirachmat/rubinius,lgierth/rubinius,heftig/rubinius,digitalextremist/rubinius,ngpestelos/rubinius,pH14/rubinius,digitalextremist/rubinius,lgierth/rubinius,Wirachmat/rubinius,jsyeo/rubinius,Wirachmat/rubinius,jsyeo/rubinius,jemc/rubinius,ruipserra/rubinius,sferik/rubinius,digitalextremist/rubinius,digitalextremist/rubinius,ngpestelos/rubinius,dblock/rubinius,heftig/rubinius,kachick/rubinius,ruipserra/rubinius,sferik/rubinius,kachick/rubinius,jemc/rubinius,jsyeo/rubinius,mlarraz/rubinius,ngpestelos/rubinius,digitalextremist/rubinius,jsyeo/rubinius,benlovell/rubinius,jemc/rubinius,heftig/rubinius,sferik/rubinius,ngpestelos/rubinius,jsyeo/rubinius,Wirachmat/rubinius,Azizou/rubinius,sferik/rubinius,pH14/rubinius,Azizou/rubinius,kachick/rubinius,benlovell/rubinius,sferik/rubinius,jsyeo/rubinius,ngpestelos/rubinius,benlovell/rubinius,Azizou/rubinius,Wirachmat/rubinius,benlovell/rubinius,pH14/rubinius,ruipserra/rubinius,jemc/rubinius,dblock/rubinius,heftig/rubinius,lgierth/rubinius,dblock/rubinius,kachick/rubinius,mlarraz/rubinius,digitalextremist/rubinius,jemc/rubinius,Azizou/rubinius,digitalextremist/rubinius,dblock/rubinius,mlarraz/rubinius,ruipserra/rubinius,mlarraz/rubinius,mlarraz/rubinius,heftig/rubinius,pH14/rubinius,mlarraz/rubinius,kachick/rubinius,pH14/rubinius,Wirachmat/rubinius,heftig/rubinius,sferik/rubinius,pH14/rubinius,lgierth/rubinius,ruipserra/rubinius,sferik/rubinius,jemc/rubinius,benlovell/rubinius,lgierth/rubinius,ruipserra/rubinius,ngpestelos/rubinius,kachick/rubinius,heftig/rubinius,Wirachmat/rubinius,ruipserra/rubinius,kachick/rubinius,lgierth/rubinius,dblock/rubinius,Azizou/rubinius,pH14/rubinius,Azizou/rubinius,Azizou/rubinius | c++ | ## Code Before:
namespace rubinius {
  void AtomicReference::init(STATE) {
    GO(atomic_ref).set(ontology::new_class(state,
        "AtomicReference", G(object), G(rubinius)));
    G(atomic_ref)->set_object_type(state, AtomicReferenceType);
  }
  AtomicReference* AtomicReference::allocate(STATE) {
    return state->new_object<AtomicReference>(G(atomic_ref));
  }
  Object* AtomicReference::compare_and_set(STATE, Object* old, Object* new_) {
    Object** pp = &value_;
    return atomic::compare_and_swap((void**)pp, old, new_) ? cTrue : cFalse;
  }
}
## Instruction:
Fix GC-related SEGV when heavily using AtomicReference
## Code After:
namespace rubinius {
  void AtomicReference::init(STATE) {
    GO(atomic_ref).set(ontology::new_class(state,
        "AtomicReference", G(object), G(rubinius)));
    G(atomic_ref)->set_object_type(state, AtomicReferenceType);
  }
  AtomicReference* AtomicReference::allocate(STATE) {
    return state->new_object<AtomicReference>(G(atomic_ref));
  }
  Object* AtomicReference::compare_and_set(STATE, Object* old, Object* new_) {
    Object** pp = &value_;
    if (atomic::compare_and_swap((void**)pp, old, new_)) {
      if(mature_object_p()) this->write_barrier(state, new_);
      return cTrue;
    } else {
      return cFalse;
    }
  }
}
|
3e4287fdb58fbf2cd997c5ddb8d0c163c466139a | .travis.yml | .travis.yml | language: python
python:
  - "2.7"
# command to install dependencies
script: python tests/test_simplenote.py
branches:
  only:
    - master
| language: python
python:
  - '2.7'
script: python tests/test_simplenote.py
branches:
  only:
    - master
deploy:
  provider: pypi
  user: atomicules
  password:
    secure: QD0ICwZ/JQ7KJZe+BryMMBjQnJdCXBE8CTf85e0UrJd63kf+ogtGM6w+YVGdvrR75uJDq4IFkce9DDoJIIQ5jThCR9QTkFGt6Xx4I7WKqkbaEAkt6Sf1efEEiouJUCETkO1ILajzbi9805ucuKON1Wb4XZ2R9krFP7+h68ZcXkU=
  on:
    tags: true
    repo: mrtazz/simplenote.py
    branch: master
| Deploy to PyPi from Travis CI | Deploy to PyPi from Travis CI
Fixes #10
(well, hopefully, not sure how switching to my PyPi credentials will
affect things since previously deployed by @mrtazz)
[ci skip]
| YAML | mit | mrtazz/simplenote.py | yaml | ## Code Before:
language: python
python:
  - "2.7"
# command to install dependencies
script: python tests/test_simplenote.py
branches:
  only:
    - master
## Instruction:
Deploy to PyPi from Travis CI
Fixes #10
(well, hopefully, not sure how switching to my PyPi credentials will
affect things since previously deployed by @mrtazz)
[ci skip]
## Code After:
language: python
python:
  - '2.7'
script: python tests/test_simplenote.py
branches:
  only:
    - master
deploy:
  provider: pypi
  user: atomicules
  password:
    secure: QD0ICwZ/JQ7KJZe+BryMMBjQnJdCXBE8CTf85e0UrJd63kf+ogtGM6w+YVGdvrR75uJDq4IFkce9DDoJIIQ5jThCR9QTkFGt6Xx4I7WKqkbaEAkt6Sf1efEEiouJUCETkO1ILajzbi9805ucuKON1Wb4XZ2R9krFP7+h68ZcXkU=
  on:
    tags: true
    repo: mrtazz/simplenote.py
    branch: master
|
d3d724bb8809906ebae3abbd2c4a11c8a4aec268 | actionpack/lib/action_dispatch/testing/performance_test.rb | actionpack/lib/action_dispatch/testing/performance_test.rb | require 'active_support/testing/performance'
require 'active_support/testing/default'
begin
  module ActionDispatch
    # An integration test that runs a code profiler on your test methods.
    # Profiling output for combinations of each test method, measurement, and
    # output format are written to your tmp/performance directory.
    #
    # By default, process_time is measured and both flat and graph_html output
    # formats are written, so you'll have two output files per test method.
    class PerformanceTest < ActionDispatch::IntegrationTest
      include ActiveSupport::Testing::Performance
    end
  end
rescue NameError
  $stderr.puts "Specify ruby-prof as application's dependency in Gemfile to run benchmarks."
end
| require 'active_support/testing/performance'
begin
  module ActionDispatch
    # An integration test that runs a code profiler on your test methods.
    # Profiling output for combinations of each test method, measurement, and
    # output format are written to your tmp/performance directory.
    #
    # By default, process_time is measured and both flat and graph_html output
    # formats are written, so you'll have two output files per test method.
    class PerformanceTest < ActionDispatch::IntegrationTest
      include ActiveSupport::Testing::Performance
    end
  end
rescue NameError
  $stderr.puts "Specify ruby-prof as application's dependency in Gemfile to run benchmarks."
end
| Remove this require since active_support/testing/default doesn't exist anymore | Remove this require since active_support/testing/default doesn't exist anymore
| Ruby | mit | mathieujobin/reduced-rails-for-travis,mtsmfm/railstest,starknx/rails,fabianoleittes/rails,bogdanvlviv/rails,flanger001/rails,tgxworld/rails,kirs/rails-1,deraru/rails,illacceptanything/illacceptanything,sergey-alekseev/rails,riseshia/railsguides.kr,travisofthenorth/rails,untidy-hair/rails,illacceptanything/illacceptanything,pvalena/rails,kmayer/rails,repinel/rails,amoody2108/TechForJustice,Edouard-chin/rails,aditya-kapoor/rails,Erol/rails,palkan/rails,rbhitchcock/rails,illacceptanything/illacceptanything,palkan/rails,yasslab/railsguides.jp,untidy-hair/rails,Sen-Zhang/rails,elfassy/rails,illacceptanything/illacceptanything,bogdanvlviv/rails,88rabbit/newRails,utilum/rails,mohitnatoo/rails,rails/rails,mtsmfm/railstest,odedniv/rails,gcourtemanche/rails,gcourtemanche/rails,Envek/rails,jeremy/rails,ledestin/rails,vipulnsward/rails,hanystudy/rails,yahonda/rails,vipulnsward/rails,travisofthenorth/rails,amoody2108/TechForJustice,gavingmiller/rails,baerjam/rails,rossta/rails,eileencodes/rails,gavingmiller/rails,vassilevsky/rails,lcreid/rails,starknx/rails,rafaelfranca/omg-rails,kachick/rails,shioyama/rails,kachick/rails,yhirano55/rails,felipecvo/rails,pschambacher/rails,MSP-Greg/rails,arjes/rails,lucasmazza/rails,kmcphillips/rails,yalab/rails,alecspopa/rails,elfassy/rails,untidy-hair/rails,matrinox/rails,illacceptanything/illacceptanything,joonyou/rails,printercu/rails,yalab/rails,richseviora/rails,kddeisz/rails,rossta/rails,Vasfed/rails,maicher/rails,Vasfed/rails,georgeclaghorn/rails,utilum/rails,deraru/rails,88rabbit/newRails,arjes/rails,joonyou/rails,gfvcastro/rails,yasslab/railsguides.jp,coreyward/rails,matrinox/rails,BlakeWilliams/rails,tgxworld/rails,yasslab/railsguides.jp,betesh/rails,ttanimichi/rails,flanger001/rails,sealocal/rails,gcourtemanche/rails,BlakeWilliams/rails,lucasmazza/rails,printercu/rails,illacceptanything/illacceptanything,matrinox/rails,vipulnsward/rails,kmayer/rails,yahonda/rails,tjschuck/rails,yawboakye/rails,bradleypriest/rails,kddeis
z/rails,tgxworld/rails,yhirano55/rails,esparta/rails,fabianoleittes/rails,mijoharas/rails,kddeisz/rails,lcreid/rails,tjschuck/rails,mohitnatoo/rails,deraru/rails,shioyama/rails,stefanmb/rails,mtsmfm/railstest,MSP-Greg/rails,yahonda/rails,bolek/rails,joonyou/rails,ngpestelos/rails,vassilevsky/rails,kirs/rails-1,gauravtiwari/rails,tijwelch/rails,untidy-hair/rails,mtsmfm/rails,bolek/rails,illacceptanything/illacceptanything,rbhitchcock/rails,xlymian/rails,mechanicles/rails,stefanmb/rails,brchristian/rails,Erol/rails,arunagw/rails,kamipo/rails,kaspth/rails,printercu/rails,iainbeeston/rails,Stellenticket/rails,tgxworld/rails,betesh/rails,georgeclaghorn/rails,tjschuck/rails,esparta/rails,yawboakye/rails,ttanimichi/rails,yalab/rails,shioyama/rails,illacceptanything/illacceptanything,lucasmazza/rails,kmcphillips/rails,aditya-kapoor/rails,sergey-alekseev/rails,rossta/rails,repinel/rails,brchristian/rails,mtsmfm/rails,repinel/rails,betesh/rails,alecspopa/rails,gavingmiller/rails,shioyama/rails,flanger001/rails,ngpestelos/rails,rails/rails,esparta/rails,elfassy/rails,kenta-s/rails,mathieujobin/reduced-rails-for-travis,alecspopa/rails,kenta-s/rails,Edouard-chin/rails,hanystudy/rails,schuetzm/rails,sealocal/rails,pschambacher/rails,prathamesh-sonpatki/rails,samphilipd/rails,betesh/rails,koic/rails,felipecvo/rails,MichaelSp/rails,utilum/rails,kaspth/rails,Vasfed/rails,voray/rails,Envek/rails,voray/rails,iainbeeston/rails,schuetzm/rails,palkan/rails,EmmaB/rails-1,aditya-kapoor/rails,Spin42/rails,rails/rails,vipulnsward/rails,tijwelch/rails,deraru/rails,mathieujobin/reduced-rails-for-travis,riseshia/railsguides.kr,Edouard-chin/rails,koic/rails,Envek/rails,vassilevsky/rails,coreyward/rails,assain/rails,kddeisz/rails,baerjam/rails,tjschuck/rails,mijoharas/rails,jeremy/rails,flanger001/rails,prathamesh-sonpatki/rails,baerjam/rails,stefanmb/rails,marklocklear/rails,88rabbit/newRails,fabianoleittes/rails,Envek/rails,MSP-Greg/rails,travisofthenorth/rails,kmayer/rails,yasslab/railsguides.
jp,kmcphillips/rails,odedniv/rails,fabianoleittes/rails,joonyou/rails,xlymian/rails,Erol/rails,jeremy/rails,riseshia/railsguides.kr,maicher/rails,maicher/rails,bolek/rails,repinel/rails,eileencodes/rails,Spin42/rails,pvalena/rails,richseviora/rails,palkan/rails,illacceptanything/illacceptanything,koic/rails,yawboakye/rails,mijoharas/rails,MSP-Greg/rails,assain/rails,iainbeeston/rails,arunagw/rails,marklocklear/rails,kamipo/rails,sergey-alekseev/rails,gfvcastro/rails,gfvcastro/rails,bogdanvlviv/rails,arunagw/rails,ttanimichi/rails,eileencodes/rails,esparta/rails,kmcphillips/rails,yahonda/rails,mechanicles/rails,jeremy/rails,felipecvo/rails,ngpestelos/rails,notapatch/rails,kirs/rails-1,gauravtiwari/rails,notapatch/rails,eileencodes/rails,kachick/rails,kamipo/rails,yhirano55/rails,Sen-Zhang/rails,schuetzm/rails,samphilipd/rails,illacceptanything/illacceptanything,BlakeWilliams/rails,odedniv/rails,tijwelch/rails,printercu/rails,yhirano55/rails,notapatch/rails,illacceptanything/illacceptanything,pvalena/rails,baerjam/rails,Spin42/rails,arunagw/rails,bogdanvlviv/rails,rbhitchcock/rails,prathamesh-sonpatki/rails,georgeclaghorn/rails,pvalena/rails,Stellenticket/rails,arjes/rails,gauravtiwari/rails,kenta-s/rails,illacceptanything/illacceptanything,MichaelSp/rails,aditya-kapoor/rails,mechanicles/rails,ledestin/rails,pschambacher/rails,mohitnatoo/rails,Stellenticket/rails,mtsmfm/rails,samphilipd/rails,mohitnatoo/rails,hanystudy/rails,Stellenticket/rails,assain/rails,xlymian/rails,assain/rails,ledestin/rails,notapatch/rails,travisofthenorth/rails,amoody2108/TechForJustice,rafaelfranca/omg-rails,rafaelfranca/omg-rails,yawboakye/rails,gfvcastro/rails,yalab/rails,riseshia/railsguides.kr,lcreid/rails,lcreid/rails,EmmaB/rails-1,Vasfed/rails,Sen-Zhang/rails,Erol/rails,kaspth/rails,illacceptanything/illacceptanything,illacceptanything/illacceptanything,mechanicles/rails,MichaelSp/rails,utilum/rails,illacceptanything/illacceptanything,prathamesh-sonpatki/rails,BlakeWilliams/rails,corey
ward/rails,schuetzm/rails,richseviora/rails,sealocal/rails,bradleypriest/rails,brchristian/rails,starknx/rails,Edouard-chin/rails,rails/rails,voray/rails,iainbeeston/rails,marklocklear/rails,EmmaB/rails-1,bradleypriest/rails,georgeclaghorn/rails | ruby | ## Code Before:
require 'active_support/testing/performance'
require 'active_support/testing/default'
begin
  module ActionDispatch
    # An integration test that runs a code profiler on your test methods.
    # Profiling output for combinations of each test method, measurement, and
    # output format are written to your tmp/performance directory.
    #
    # By default, process_time is measured and both flat and graph_html output
    # formats are written, so you'll have two output files per test method.
    class PerformanceTest < ActionDispatch::IntegrationTest
      include ActiveSupport::Testing::Performance
    end
  end
rescue NameError
  $stderr.puts "Specify ruby-prof as application's dependency in Gemfile to run benchmarks."
end
## Instruction:
Remove this require since active_support/testing/default doesn't exist anymore
## Code After:
require 'active_support/testing/performance'
begin
  module ActionDispatch
    # An integration test that runs a code profiler on your test methods.
    # Profiling output for combinations of each test method, measurement, and
    # output format are written to your tmp/performance directory.
    #
    # By default, process_time is measured and both flat and graph_html output
    # formats are written, so you'll have two output files per test method.
    class PerformanceTest < ActionDispatch::IntegrationTest
      include ActiveSupport::Testing::Performance
    end
  end
rescue NameError
  $stderr.puts "Specify ruby-prof as application's dependency in Gemfile to run benchmarks."
end
|
546089b6192f1e1805c335d53d45a9b55c3af2bf | js/src/test/kotlin/trees.kt | js/src/test/kotlin/trees.kt | package html4k.tests
import html4k.dom.append
import html4k.js.div
import html4k.p
import org.w3c.dom.asList
import kotlin.browser.document
import kotlin.test.assertEquals
import kotlin.test.assertTrue
import org.junit.Test as test
class DomTreeImplTest {
    test fun simpleTree() {
        val node = document.body!!.append.div {
            p {
                +"test"
            }
        }
        assertEquals("DIV", node.tagName)
        assertEquals(1, node.childNodes.length)
        assertEquals("P", node.children[0]?.tagName)
        assertTrue(document.body!!.children.length > 0)
        assertEquals(node, document.body!!.children.asList().last())
    }
}
| package html4k.tests
import html4k.dom.append
import html4k.js.div
import html4k.js.onClickFunction
import html4k.p
import org.w3c.dom.events.Event
import org.w3c.dom.HTMLDivElement
import org.w3c.dom.HTMLElement
import org.w3c.dom.asList
import kotlin.dom.asList
import kotlin.browser.document
import kotlin.test.assertEquals
import kotlin.test.assertNotNull
import kotlin.test.assertTrue
import org.junit.Test as test
class DomTreeImplTest {
    test fun simpleTree() {
        val node = document.body!!.append.div {
            p {
                +"test"
            }
        }
        assertEquals("DIV", node.tagName)
        assertEquals(1, node.childNodes.length)
        assertEquals("P", node.children[0]?.tagName)
        assertTrue(document.body!!.children.length > 0)
        assertEquals(node, document.body!!.children.asList().last())
    }
    test fun appendSingleNode() {
        val myDiv: HTMLDivElement = document.body!!.append.div {
            p {
                +"test"
            }
        }
        assertEquals("DIV", myDiv.tagName)
        assertEquals(document.body, myDiv.parentNode)
        assertEquals("<div><p>test</p></div>", myDiv.outerHTML.replace("\\s+".toRegex(), ""))
    }
    test fun appendNodeWithEventHandler() {
        var clicked = false
        document.body!!.append.div {
            onClickFunction = {
                clicked = true
            }
        }
        document.getElementsByTagName("div").asList().forEach {
            if (it is HTMLElement) {
                val clickHandler = it.onclick
                if (clickHandler != null) {
                    clickHandler(uninitialized())
                }
            }
        }
        assertTrue(clicked)
    }
    private fun <T> uninitialized(): T = null as T
}
| Add test for event handling | Add test for event handling
| Kotlin | apache-2.0 | kotlinx/kotlinx.html,kotlinx/kotlinx.html,kotlinx/kotlinx.html | kotlin | ## Code Before:
package html4k.tests
import html4k.dom.append
import html4k.js.div
import html4k.p
import org.w3c.dom.asList
import kotlin.browser.document
import kotlin.test.assertEquals
import kotlin.test.assertTrue
import org.junit.Test as test
class DomTreeImplTest {
    test fun simpleTree() {
        val node = document.body!!.append.div {
            p {
                +"test"
            }
        }
        assertEquals("DIV", node.tagName)
        assertEquals(1, node.childNodes.length)
        assertEquals("P", node.children[0]?.tagName)
        assertTrue(document.body!!.children.length > 0)
        assertEquals(node, document.body!!.children.asList().last())
    }
}
## Instruction:
Add test for event handling
## Code After:
package html4k.tests
import html4k.dom.append
import html4k.js.div
import html4k.js.onClickFunction
import html4k.p
import org.w3c.dom.events.Event
import org.w3c.dom.HTMLDivElement
import org.w3c.dom.HTMLElement
import org.w3c.dom.asList
import kotlin.dom.asList
import kotlin.browser.document
import kotlin.test.assertEquals
import kotlin.test.assertNotNull
import kotlin.test.assertTrue
import org.junit.Test as test
class DomTreeImplTest {
    test fun simpleTree() {
        val node = document.body!!.append.div {
            p {
                +"test"
            }
        }
        assertEquals("DIV", node.tagName)
        assertEquals(1, node.childNodes.length)
        assertEquals("P", node.children[0]?.tagName)
        assertTrue(document.body!!.children.length > 0)
        assertEquals(node, document.body!!.children.asList().last())
    }
    test fun appendSingleNode() {
        val myDiv: HTMLDivElement = document.body!!.append.div {
            p {
                +"test"
            }
        }
        assertEquals("DIV", myDiv.tagName)
        assertEquals(document.body, myDiv.parentNode)
        assertEquals("<div><p>test</p></div>", myDiv.outerHTML.replace("\\s+".toRegex(), ""))
    }
    test fun appendNodeWithEventHandler() {
        var clicked = false
        document.body!!.append.div {
            onClickFunction = {
                clicked = true
            }
        }
        document.getElementsByTagName("div").asList().forEach {
            if (it is HTMLElement) {
                val clickHandler = it.onclick
                if (clickHandler != null) {
                    clickHandler(uninitialized())
                }
            }
        }
        assertTrue(clicked)
    }
    private fun <T> uninitialized(): T = null as T
}
|
b15a429fe694daa2bd4da5ec9d64cfc98c63b716 | source/index.html.erb | source/index.html.erb | ---
pageable: true
per_page: 10
---
<% if paginate && num_pages > 1 %>
  <p>Page <%= page_number %> of <%= num_pages %></p>
  <% if prev_page %>
    <p><%= link_to 'Previous page', prev_page %></p>
  <% end %>
<% end %>
<ul class="cards">
  <% page_articles.each_with_index do |article, i| %>
    <li class="card">
      <%= link_to '', article, :class => "card-link no-hvr" %>
      <ul class="card-tags">
        <% article.tags.each do |tag| %>
          <li>
            <%= link_to "#{tag}", tag_path(tag),
                :class => "badge no-hvr",
                :id => "#{tag}" %>
          </li>
        <% end %>
      </ul>
      <div class="card-image">
        <!-- if article has tags, add filler element -->
        <% if !article.tags.empty? %>
          <div id="tag-filler-space"></div>
        <% end %>
        <!-- if article has header image, add it -->
        <% if !article.data['header-image'].nil? && article.data['header-image'] != "" %>
          <%= image_tag article.data['header-image'] %>
        <% end %>
      </div>
      <div class="card-header">
        <h3 class="card-title">
          <%= article.title %>
        </h3>
        <h5 class="card-date">
          <%= article.date.strftime('%d %B %Y') %>
        </h5>
      </div>
      <hr class="fractal">
      <div class="card-copy">
        <%= markdownify article.summary %>
      </div>
    </li>
  <% end %>
</ul>
<% if paginate %>
  <% if next_page %>
    <p><%= link_to 'Next page', next_page %></p>
  <% end %>
<% end %>
| ---
pageable: true
per_page: 10
---
<ul class="cards">
  <% page_articles.each do |article| %>
    <%= partial "partials/card_li",
        locals: { item: article } %>
  <% end %>
</ul>
<% if paginate && num_pages > 1 %>
  <div class="pagination">
    <%= pagination_links(articles.count, per_page) %>
  </div>
<% end %>
| Use new card_li partial in index | Use new card_li partial in index
| HTML+ERB | mit | smargh/fractaled_mind,smargh/fractaled_mind,fractaledmind/fractaled_mind,fractaledmind/fractaled_mind,smargh/fractaled_mind,smargh/fractaled_mind,fractaledmind/fractaled_mind,fractaledmind/fractaled_mind | html+erb | ## Code Before:
---
pageable: true
per_page: 10
---
<% if paginate && num_pages > 1 %>
  <p>Page <%= page_number %> of <%= num_pages %></p>
  <% if prev_page %>
    <p><%= link_to 'Previous page', prev_page %></p>
  <% end %>
<% end %>
<ul class="cards">
  <% page_articles.each_with_index do |article, i| %>
    <li class="card">
      <%= link_to '', article, :class => "card-link no-hvr" %>
      <ul class="card-tags">
        <% article.tags.each do |tag| %>
          <li>
            <%= link_to "#{tag}", tag_path(tag),
                :class => "badge no-hvr",
                :id => "#{tag}" %>
          </li>
        <% end %>
      </ul>
      <div class="card-image">
        <!-- if article has tags, add filler element -->
        <% if !article.tags.empty? %>
          <div id="tag-filler-space"></div>
        <% end %>
        <!-- if article has header image, add it -->
        <% if !article.data['header-image'].nil? && article.data['header-image'] != "" %>
          <%= image_tag article.data['header-image'] %>
        <% end %>
      </div>
      <div class="card-header">
        <h3 class="card-title">
          <%= article.title %>
        </h3>
        <h5 class="card-date">
          <%= article.date.strftime('%d %B %Y') %>
        </h5>
      </div>
      <hr class="fractal">
      <div class="card-copy">
        <%= markdownify article.summary %>
      </div>
    </li>
  <% end %>
</ul>
<% if paginate %>
  <% if next_page %>
    <p><%= link_to 'Next page', next_page %></p>
  <% end %>
<% end %>
## Instruction:
Use new card_li partial in index
## Code After:
---
pageable: true
per_page: 10
---
<ul class="cards">
  <% page_articles.each do |article| %>
    <%= partial "partials/card_li",
        locals: { item: article } %>
  <% end %>
</ul>
<% if paginate && num_pages > 1 %>
  <div class="pagination">
    <%= pagination_links(articles.count, per_page) %>
  </div>
<% end %>
|
1925cf4037da3126816071657709095be6f81193 | templates/zerver/help/change-your-name.md | templates/zerver/help/change-your-name.md |
It's easy to change how your name is shown in Zulip. With full Unicode support,
you can spell your name exactly how you'd like it to be displayed.
{!go-to-the.md!} [Your account](/#settings/your-account)
{!settings.md!}
2. Change your name in the **Full name** field, and click **Save changes**.
Congratulations! You have updated your name.
!!! tip ""
    Note that your old messages will still have the previous version of your
    name.
|
It's easy to change how your name is shown in Zulip. With full Unicode support,
you can spell your name exactly how you'd like it to be displayed.
{!go-to-the.md!} [Your account](/#settings/your-account)
{!settings.md!}
2. Change your name in the **Full name** field.
{!save-changes.md!} account settings.
Congratulations! You have updated your name.
!!! tip ""
    Note that your old messages will still have the previous version of your
    name.
| Add *Save changes* macro to *Change your name* | docs: Add *Save changes* macro to *Change your name*
| Markdown | apache-2.0 | vabs22/zulip,punchagan/zulip,brockwhittaker/zulip,andersk/zulip,vaidap/zulip,sonali0901/zulip,amyliu345/zulip,JPJPJPOPOP/zulip,rishig/zulip,jackrzhang/zulip,punchagan/zulip,j831/zulip,vabs22/zulip,synicalsyntax/zulip,susansls/zulip,amanharitsh123/zulip,jphilipsen05/zulip,souravbadami/zulip,punchagan/zulip,showell/zulip,rishig/zulip,amanharitsh123/zulip,brockwhittaker/zulip,jrowan/zulip,amanharitsh123/zulip,brainwane/zulip,verma-varsha/zulip,jrowan/zulip,synicalsyntax/zulip,vaidap/zulip,timabbott/zulip,showell/zulip,punchagan/zulip,zulip/zulip,brainwane/zulip,punchagan/zulip,Galexrt/zulip,timabbott/zulip,SmartPeople/zulip,timabbott/zulip,shubhamdhama/zulip,rishig/zulip,dawran6/zulip,j831/zulip,shubhamdhama/zulip,zulip/zulip,amyliu345/zulip,vaidap/zulip,synicalsyntax/zulip,eeshangarg/zulip,tommyip/zulip,zulip/zulip,christi3k/zulip,dawran6/zulip,jainayush975/zulip,rht/zulip,synicalsyntax/zulip,vabs22/zulip,isht3/zulip,susansls/zulip,andersk/zulip,isht3/zulip,tommyip/zulip,sonali0901/zulip,punchagan/zulip,samatdav/zulip,jainayush975/zulip,eeshangarg/zulip,zulip/zulip,isht3/zulip,vabs22/zulip,eeshangarg/zulip,PhilSk/zulip,sharmaeklavya2/zulip,vabs22/zulip,timabbott/zulip,brockwhittaker/zulip,tommyip/zulip,timabbott/zulip,blaze225/zulip,andersk/zulip,christi3k/zulip,hackerkid/zulip,sharmaeklavya2/zulip,samatdav/zulip,j831/zulip,jackrzhang/zulip,dattatreya303/zulip,kou/zulip,hackerkid/zulip,hackerkid/zulip,rht/zulip,ryanbackman/zulip,samatdav/zulip,dhcrzf/zulip,vabs22/zulip,brainwane/zulip,amanharitsh123/zulip,dhcrzf/zulip,sharmaeklavya2/zulip,andersk/zulip,ryanbackman/zulip,tommyip/zulip,vaidap/zulip,jphilipsen05/zulip,souravbadami/zulip,mahim97/zulip,aakash-cr7/zulip,susansls/zulip,shubhamdhama/zulip,rishig/zulip,brainwane/zulip,andersk/zulip,isht3/zulip,mahim97/zulip,kou/zulip,kou/zulip,rht/zulip,isht3/zulip,samatdav/zulip,andersk/zulip,ryanbackman/zulip,SmartPeople/zulip,PhilSk/zulip,j831/zulip,brockwhittaker/zulip,synicalsyntax/zulip,amyliu34
5/zulip,eeshangarg/zulip,shubhamdhama/zulip,verma-varsha/zulip,zulip/zulip,ryanbackman/zulip,dattatreya303/zulip,dattatreya303/zulip,dhcrzf/zulip,mahim97/zulip,verma-varsha/zulip,hackerkid/zulip,ryanbackman/zulip,jackrzhang/zulip,SmartPeople/zulip,brockwhittaker/zulip,eeshangarg/zulip,susansls/zulip,amyliu345/zulip,dhcrzf/zulip,vaidap/zulip,synicalsyntax/zulip,rht/zulip,Galexrt/zulip,dawran6/zulip,amyliu345/zulip,andersk/zulip,aakash-cr7/zulip,aakash-cr7/zulip,timabbott/zulip,synicalsyntax/zulip,samatdav/zulip,dattatreya303/zulip,PhilSk/zulip,jackrzhang/zulip,dawran6/zulip,hackerkid/zulip,blaze225/zulip,souravbadami/zulip,j831/zulip,verma-varsha/zulip,rishig/zulip,aakash-cr7/zulip,timabbott/zulip,mahim97/zulip,dawran6/zulip,sharmaeklavya2/zulip,susansls/zulip,JPJPJPOPOP/zulip,amanharitsh123/zulip,SmartPeople/zulip,kou/zulip,isht3/zulip,eeshangarg/zulip,rht/zulip,brainwane/zulip,jphilipsen05/zulip,showell/zulip,jackrzhang/zulip,kou/zulip,jrowan/zulip,tommyip/zulip,sharmaeklavya2/zulip,SmartPeople/zulip,jphilipsen05/zulip,blaze225/zulip,shubhamdhama/zulip,zulip/zulip,zulip/zulip,souravbadami/zulip,aakash-cr7/zulip,mahim97/zulip,christi3k/zulip,punchagan/zulip,jainayush975/zulip,Galexrt/zulip,JPJPJPOPOP/zulip,Galexrt/zulip,christi3k/zulip,tommyip/zulip,brainwane/zulip,aakash-cr7/zulip,jainayush975/zulip,Galexrt/zulip,verma-varsha/zulip,jrowan/zulip,sonali0901/zulip,amanharitsh123/zulip,jphilipsen05/zulip,jainayush975/zulip,JPJPJPOPOP/zulip,verma-varsha/zulip,samatdav/zulip,showell/zulip,dattatreya303/zulip,tommyip/zulip,jackrzhang/zulip,PhilSk/zulip,shubhamdhama/zulip,hackerkid/zulip,mahim97/zulip,souravbadami/zulip,Galexrt/zulip,showell/zulip,blaze225/zulip,dhcrzf/zulip,susansls/zulip,rht/zulip,hackerkid/zulip,showell/zulip,christi3k/zulip,PhilSk/zulip,dhcrzf/zulip,shubhamdhama/zulip,jainayush975/zulip,amyliu345/zulip,jphilipsen05/zulip,sonali0901/zulip,SmartPeople/zulip,souravbadami/zulip,sonali0901/zulip,jackrzhang/zulip,jrowan/zulip,PhilSk/zulip,christi3k/zulip,JPJ
PJPOPOP/zulip,kou/zulip,JPJPJPOPOP/zulip,dhcrzf/zulip,eeshangarg/zulip,sharmaeklavya2/zulip,sonali0901/zulip,kou/zulip,showell/zulip,rishig/zulip,brainwane/zulip,blaze225/zulip,vaidap/zulip,rishig/zulip,Galexrt/zulip,ryanbackman/zulip,brockwhittaker/zulip,jrowan/zulip,blaze225/zulip,dawran6/zulip,rht/zulip,j831/zulip,dattatreya303/zulip | markdown | ## Code Before:
It's easy to change how your name is shown in Zulip. With full Unicode support,
you can spell your name exactly how you'd like it to be displayed.
{!go-to-the.md!} [Your account](/#settings/your-account)
{!settings.md!}
2. Change your name in the **Full name** field, and click **Save changes**.
Congratulations! You have updated your name.
!!! tip ""
    Note that your old messages will still have the previous version of your
    name.
## Instruction:
docs: Add *Save changes* macro to *Change your name*
## Code After:
It's easy to change how your name is shown in Zulip. With full Unicode support,
you can spell your name exactly how you'd like it to be displayed.
{!go-to-the.md!} [Your account](/#settings/your-account)
{!settings.md!}
2. Change your name in the **Full name** field.
{!save-changes.md!} account settings.
Congratulations! You have updated your name.
!!! tip ""
    Note that your old messages will still have the previous version of your
    name.
|
9a4155d7aab9bb4d3a0391658a2bafc5a3c00e59 | .travis.yml | .travis.yml | language: go
go:
  - 1.3.3
  - 1.4.2
  - release
os:
  - linux
  - osx
notifications:
  irc: "chat.freenode.net#restic"
install:
  - go get github.com/mitchellh/gox
  - gox -build-toolchain -os "linux darwin"
  - go get -v -t ./...
script:
  - go build -ldflags "-s" ./...
  - go build -ldflags "-s" -o restic ./cmd/restic
  - sh -c "cd cmd/restic && gox -verbose -os 'linux darwin'"
  - "stat --printf='binary size: %s' restic"
  - go test -v ./...
  - ./testsuite.sh
  - sh -c "cd backend && go test -v -test.sftppath /usr/lib/openssh/sftp-server ./..."
  - gofmt -l *.go */*.go */*/*.go
  - test -z "$(gofmt -l *.go */*.go */*/*.go)"
| language: go
go:
  - 1.3.3
  - 1.4.2
  - release
os:
  - linux
  - osx
notifications:
  irc: "chat.freenode.net#restic"
install:
  - go get github.com/mitchellh/gox
  - gox -build-toolchain -os "linux darwin"
  - go get -v -t ./...
script:
  - go build -ldflags "-s" ./...
  - go build -ldflags "-s" -o restic ./cmd/restic
  - sh -c "cd cmd/restic && gox -verbose -os 'linux darwin' && ls -al"
  - go test -v ./...
  - ./testsuite.sh
  - sh -c "cd backend && go test -v -test.sftppath /usr/lib/openssh/sftp-server ./..."
  - gofmt -l *.go */*.go */*/*.go
  - test -z "$(gofmt -l *.go */*.go */*/*.go)"
| Add 'ls' on the binary to see the file sizes | Add 'ls' on the binary to see the file sizes
| YAML | bsd-2-clause | mappu/restic,ar-jan/restic,stuertz/restic,ar-jan/restic,byo/restic,kurin/restic,kurin/restic,byo/restic,stuertz/restic,middelink/restic,middelink/restic,episource/restic,klauspost/restic,arithmetric/restic,jayme-github/restic,intfrr/restic,bchapuis/restic,pombredanne/restic,fawick/restic,restic/restic,ckemper67/restic,howeyc/restic,fawick/restic,jayme-github/restic,restic/restic | yaml | ## Code Before:
language: go
go:
  - 1.3.3
  - 1.4.2
  - release
os:
  - linux
  - osx
notifications:
  irc: "chat.freenode.net#restic"
install:
  - go get github.com/mitchellh/gox
  - gox -build-toolchain -os "linux darwin"
  - go get -v -t ./...
script:
  - go build -ldflags "-s" ./...
  - go build -ldflags "-s" -o restic ./cmd/restic
  - sh -c "cd cmd/restic && gox -verbose -os 'linux darwin'"
  - "stat --printf='binary size: %s' restic"
  - go test -v ./...
  - ./testsuite.sh
  - sh -c "cd backend && go test -v -test.sftppath /usr/lib/openssh/sftp-server ./..."
  - gofmt -l *.go */*.go */*/*.go
  - test -z "$(gofmt -l *.go */*.go */*/*.go)"
## Instruction:
Add 'ls' on the binary to see the file sizes
## Code After:
language: go
go:
  - 1.3.3
  - 1.4.2
  - release
os:
  - linux
  - osx
notifications:
  irc: "chat.freenode.net#restic"
install:
  - go get github.com/mitchellh/gox
  - gox -build-toolchain -os "linux darwin"
  - go get -v -t ./...
script:
  - go build -ldflags "-s" ./...
  - go build -ldflags "-s" -o restic ./cmd/restic
  - sh -c "cd cmd/restic && gox -verbose -os 'linux darwin' && ls -al"
  - go test -v ./...
  - ./testsuite.sh
  - sh -c "cd backend && go test -v -test.sftppath /usr/lib/openssh/sftp-server ./..."
  - gofmt -l *.go */*.go */*/*.go
  - test -z "$(gofmt -l *.go */*.go */*/*.go)"
|
a93edd89935cab95ec443734f09830537786f806 | s3.sh | s3.sh |
echo "Checking if minio-data volume is created:"
docker volume inspect minio-data
if [ $? -ne 0 ]; then
  echo "Creating minio-data volume."
  docker volume create minio-data
fi
if [ $? -eq 0 ]; then
  echo "minio-data volume setup complete"
else
  echo "Error while checking for minio-data volume."
  exit 1
fi
echo "Checking if minio-config volume is created:"
docker volume inspect minio-config
if [ $? -ne 0 ]; then
  echo "Creating minio-config volume."
  docker volume create minio-config
fi
if [ $? -eq 0 ]; then
  echo "minio-config volume setup complete"
else
  echo "Error while checking for minio-config volume."
  exit 1
fi
echo "Starting minio"
docker run -p 9001:9001 -d --rm --name minio --mount source=minio-data,target=/data --mount source=minio-config,target=/root/.minio minio/minio server /data
|
echo "Checking if minio-data volume is created:"
docker volume inspect minio-data
if [ $? -ne 0 ]; then
  echo "Creating minio-data volume."
  docker volume create minio-data
fi
if [ $? -eq 0 ]; then
  echo "minio-data volume setup complete"
else
  echo "Error while checking for minio-data volume."
  exit 1
fi
echo "Checking if minio-config volume is created:"
docker volume inspect minio-config
if [ $? -ne 0 ]; then
  echo "Creating minio-config volume."
  docker volume create minio-config
fi
if [ $? -eq 0 ]; then
  echo "minio-config volume setup complete"
else
  echo "Error while checking for minio-config volume."
  exit 1
fi
echo "Starting minio"
docker run -p 9001:9000 -d --rm --name minio -e "MINIO_ACCESS_KEY=ABCDEFGHJHIJKLMNOPQR" -e "MINIO_SECRET_KEY=abcdefghijklmnopqrstuvxwyz0123456789ABCD" --mount source=minio-data,target=/data --mount source=minio-config,target=/root/.minio minio/minio server /data
| Fix port forwarding and credentials for minio | Fix port forwarding and credentials for minio
+ minio will run on port 9000 inside container but port 9001 on host
+ Create dummy env credentials for minio. Client will need those keys to communicate with local S3 API.
| Shell | mit | filewalkwithme/light-aws | shell | ## Code Before:
echo "Checking if minio-data volume is created:"
docker volume inspect minio-data
if [ $? -ne 0 ]; then
  echo "Creating minio-data volume."
  docker volume create minio-data
fi
if [ $? -eq 0 ]; then
  echo "minio-data volume setup complete"
else
  echo "Error while checking for minio-data volume."
  exit 1
fi
echo "Checking if minio-config volume is created:"
docker volume inspect minio-config
if [ $? -ne 0 ]; then
  echo "Creating minio-config volume."
  docker volume create minio-config
fi
if [ $? -eq 0 ]; then
  echo "minio-config volume setup complete"
else
  echo "Error while checking for minio-config volume."
  exit 1
fi
echo "Starting minio"
docker run -p 9001:9001 -d --rm --name minio --mount source=minio-data,target=/data --mount source=minio-config,target=/root/.minio minio/minio server /data
## Instruction:
Fix port forwarding and credentials for minio
+ minio will run on port 9000 inside the container but on port 9001 on the host
+ Create dummy env credentials for minio. The client will need those keys to communicate with the local S3 API.
## Code After:
echo "Checking if minio-data volume is created:"
docker volume inspect minio-data
if [ $? -ne 0 ]; then
echo "Creating minio-data volume."
docker volume create minio-data
fi
if [ $? -eq 0 ]; then
echo "minio-data volume setup complete"
else
echo "Error while checking for minio-data volume."
exit 1
fi
echo "Checking if minio-config volume is created:"
docker volume inspect minio-config
if [ $? -ne 0 ]; then
echo "Creating minio-config volume."
docker volume create minio-config
fi
if [ $? -eq 0 ]; then
echo "minio-config volume setup complete"
else
echo "Error while checking for minio-config volume."
exit 1
fi
echo "Starting minio"
docker run -p 9001:9000 -d --rm --name minio -e "MINIO_ACCESS_KEY=ABCDEFGHJHIJKLMNOPQR" -e "MINIO_SECRET_KEY=abcdefghijklmnopqrstuvxwyz0123456789ABCD" --mount source=minio-data,target=/data --mount source=minio-config,target=/root/.minio minio/minio server /data
|
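A note on the fix in the record above: docker's `-p` flag is ordered `HOST:CONTAINER`, so the corrected `-p 9001:9000` exposes minio's default container port 9000 on host port 9001. A tiny illustrative parser (not part of the repo) makes the convention explicit:

```python
def parse_port_flag(flag):
    """Split a docker -p value into (host_port, container_port).

    Docker publishes ports as HOST:CONTAINER, so "9001:9000" maps
    container port 9000 (minio's default) to host port 9001.
    """
    host, container = flag.split(":")
    return int(host), int(container)

print(parse_port_flag("9001:9000"))  # -> (9001, 9000)
```

Clients on the host therefore talk to port 9001 and must present the same dummy access/secret keys the container was started with.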
a74b66edc4145880a4c6dfc73d5645cb63fc3cc4 | lib/ShippingEasy/Object.php | lib/ShippingEasy/Object.php | <?php
class ShippingEasy_Object
{
public function request($meth, $path, $params=null, $payload = null)
{
$requestor = new ShippingEasy_ApiRequestor();
return $requestor->request($meth, $path, $params, $payload);
}
}
| <?php
class ShippingEasy_Object
{
public function request($meth, $path, $params=null, $payload = null, $apiKey = null, $apiSecret = null)
{
$requestor = new ShippingEasy_ApiRequestor();
return $requestor->request($meth, $path, $params, $payload, $apiKey, $apiSecret);
}
}
| Update to accept partner credentials. | Update to accept partner credentials.
| PHP | mit | ShippingEasy/shipping_easy-php | php | ## Code Before:
<?php
class ShippingEasy_Object
{
public function request($meth, $path, $params=null, $payload = null)
{
$requestor = new ShippingEasy_ApiRequestor();
return $requestor->request($meth, $path, $params, $payload);
}
}
## Instruction:
Update to accept partner credentials.
## Code After:
<?php
class ShippingEasy_Object
{
public function request($meth, $path, $params=null, $payload = null, $apiKey = null, $apiSecret = null)
{
$requestor = new ShippingEasy_ApiRequestor();
return $requestor->request($meth, $path, $params, $payload, $apiKey, $apiSecret);
}
}
|
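The PHP change above stays backward compatible by appending optional, defaulted parameters and passing them straight through to the requestor. The same pattern sketched in Python (illustrative stand-ins, not ShippingEasy's actual classes):

```python
class ApiRequestor:
    """Stand-in for ShippingEasy_ApiRequestor; it just echoes its inputs."""

    def request(self, meth, path, params=None, payload=None,
                api_key=None, api_secret=None):
        return {"meth": meth, "path": path, "params": params,
                "payload": payload, "api_key": api_key,
                "api_secret": api_secret}


def request(meth, path, params=None, payload=None,
            api_key=None, api_secret=None):
    # The new trailing arguments default to None, so pre-existing call
    # sites that pass only meth/path keep working unchanged.
    return ApiRequestor().request(meth, path, params, payload,
                                  api_key, api_secret)


print(request("GET", "/orders")["api_key"])                     # -> None
print(request("GET", "/orders", api_key="partner")["api_key"])  # -> partner
```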
198ccbb51237bdea2dbc41726f16a1aa2f2f9c1a | plugins/com.juliacomputing.jldt.eclipse.ui.console/script/repl-wrapper.jl | plugins/com.juliacomputing.jldt.eclipse.ui.console/script/repl-wrapper.jl | module EclipseREPL
function execute(statement)
status="complete"
mimeType="text/plain"
try
expression = parse(statement,1; greedy=true, raise=false)
if isa(expression[1],Expr)
state = expression[1].head
else
state = nothing
end
status = state==:incomplete ? "incomplete" : state==:error ? "error" : "complete"
result=nothing
if(status=="complete")
result=include_string(statement)
if result!=nothing
if string(typeof(result))=="Gadfly.Plot"
mimeType = "text/html"
println(stringmime(mimeType,result))
else
println(result)
end
end
end
catch e
showerror(STDOUT, e); println()
status = "error"
println()
finally
flush_all()
println("<<<<$status>>>>")
println("<<<<$mimeType>>>>")
println("<<<<eox>>>>")
end
end
function flush_all()
# Libc.flush_cstdio()
flush(STDOUT)
# flush(STDERR)
end
#import Base.flush
#const StdioPipe = Base.PipeEndpoint
#function flush(io::StdioPipe)
# invoke(flush, (supertype(StdioPipe),), io)
# if io == STDOUT
# oslibuv_flush()
# send_stream("stdout")
# elseif io == STDERR
# oslibuv_flush()
# send_stream("stderr")
# end
#end
end;
| module EclipseREPL
function execute(statement)
status="complete"
mimeType="text/plain"
try
expression = parse(statement,1; greedy=true, raise=false)
if isa(expression[1],Expr)
state = expression[1].head
else
state = nothing
end
status = state==:incomplete ? "incomplete" : state==:error ? "error" : "complete"
result=nothing
if(status=="complete")
result=include_string(statement)
if result!=nothing
if string(typeof(result))=="Gadfly.Plot"
mimeType = "text/html"
println(stringmime(mimeType,result))
else
println(result)
end
end
end
catch e
showerror(STDOUT, e); println()
status = "error"
println()
finally
flush_all()
println("<<<<$status>>>>")
println("<<<<$mimeType>>>>")
println("<<<<eox>>>>")
end
end
function flush_all()
Libc.flush_cstdio()
flush(STDOUT)
flush(STDERR)
end
end;
| Update repl io stream flushing | Update repl io stream flushing
| Julia | mit | JuliaComputing/JuliaDT,JuliaComputing/JuliaDT | julia | ## Code Before:
module EclipseREPL
function execute(statement)
status="complete"
mimeType="text/plain"
try
expression = parse(statement,1; greedy=true, raise=false)
if isa(expression[1],Expr)
state = expression[1].head
else
state = nothing
end
status = state==:incomplete ? "incomplete" : state==:error ? "error" : "complete"
result=nothing
if(status=="complete")
result=include_string(statement)
if result!=nothing
if string(typeof(result))=="Gadfly.Plot"
mimeType = "text/html"
println(stringmime(mimeType,result))
else
println(result)
end
end
end
catch e
showerror(STDOUT, e); println()
status = "error"
println()
finally
flush_all()
println("<<<<$status>>>>")
println("<<<<$mimeType>>>>")
println("<<<<eox>>>>")
end
end
function flush_all()
# Libc.flush_cstdio()
flush(STDOUT)
# flush(STDERR)
end
#import Base.flush
#const StdioPipe = Base.PipeEndpoint
#function flush(io::StdioPipe)
# invoke(flush, (supertype(StdioPipe),), io)
# if io == STDOUT
# oslibuv_flush()
# send_stream("stdout")
# elseif io == STDERR
# oslibuv_flush()
# send_stream("stderr")
# end
#end
end;
## Instruction:
Update repl io stream flushing
## Code After:
module EclipseREPL
function execute(statement)
status="complete"
mimeType="text/plain"
try
expression = parse(statement,1; greedy=true, raise=false)
if isa(expression[1],Expr)
state = expression[1].head
else
state = nothing
end
status = state==:incomplete ? "incomplete" : state==:error ? "error" : "complete"
result=nothing
if(status=="complete")
result=include_string(statement)
if result!=nothing
if string(typeof(result))=="Gadfly.Plot"
mimeType = "text/html"
println(stringmime(mimeType,result))
else
println(result)
end
end
end
catch e
showerror(STDOUT, e); println()
status = "error"
println()
finally
flush_all()
println("<<<<$status>>>>")
println("<<<<$mimeType>>>>")
println("<<<<eox>>>>")
end
end
function flush_all()
Libc.flush_cstdio()
flush(STDOUT)
flush(STDERR)
end
end;
|
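The wrapper's three-way triage above — complete, incomplete, error — is the standard REPL front-end problem. Python's standard library exposes the same classification through `codeop`, shown here as a rough analogue of the Julia logic (for illustration only; it classifies Python source, not Julia):

```python
import codeop

def classify(statement):
    """Return "complete", "incomplete", or "error" for a chunk of source,
    mirroring the status string the Julia wrapper prints between sentinels."""
    try:
        code = codeop.compile_command(statement)
    except SyntaxError:
        return "error"
    return "complete" if code is not None else "incomplete"

print(classify("1 + 1"))     # -> complete
print(classify("def f():"))  # -> incomplete
print(classify("x = )"))     # -> error
```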
3e4fed80be1197f0176f7b07611ef25b7972f0cf | common/src/main/java/io/github/aquerr/eaglefactions/common/commands/ReloadCommand.java | common/src/main/java/io/github/aquerr/eaglefactions/common/commands/ReloadCommand.java | package io.github.aquerr.eaglefactions.common.commands;
import io.github.aquerr.eaglefactions.api.EagleFactions;
import io.github.aquerr.eaglefactions.common.PluginInfo;
import io.github.aquerr.eaglefactions.common.messaging.Messages;
import org.spongepowered.api.command.CommandException;
import org.spongepowered.api.command.CommandResult;
import org.spongepowered.api.command.CommandSource;
import org.spongepowered.api.command.args.CommandContext;
import org.spongepowered.api.text.Text;
public class ReloadCommand extends AbstractCommand
{
public ReloadCommand(EagleFactions plugin)
{
super(plugin);
}
@Override
public CommandResult execute(CommandSource source, CommandContext context) throws CommandException
{
try
{
super.getPlugin().getConfiguration().reloadConfiguration();
super.getPlugin().getStorageManager().reloadStorage();
source.sendMessage(Text.of(PluginInfo.PLUGIN_PREFIX, Messages.CONFIG_HAS_BEEN_RELOADED));
}
catch (Exception exception)
{
exception.printStackTrace();
}
return CommandResult.success();
}
}
| package io.github.aquerr.eaglefactions.common.commands;
import io.github.aquerr.eaglefactions.api.EagleFactions;
import io.github.aquerr.eaglefactions.common.PluginInfo;
import io.github.aquerr.eaglefactions.common.messaging.Messages;
import org.spongepowered.api.command.CommandException;
import org.spongepowered.api.command.CommandResult;
import org.spongepowered.api.command.CommandSource;
import org.spongepowered.api.command.args.CommandContext;
import org.spongepowered.api.text.Text;
import org.spongepowered.api.text.format.TextColors;
public class ReloadCommand extends AbstractCommand
{
public ReloadCommand(EagleFactions plugin)
{
super(plugin);
}
@Override
public CommandResult execute(CommandSource source, CommandContext context) throws CommandException
{
try
{
super.getPlugin().getConfiguration().reloadConfiguration();
super.getPlugin().getStorageManager().reloadStorage();
source.sendMessage(Text.of(PluginInfo.PLUGIN_PREFIX, TextColors.GREEN, Messages.CONFIG_HAS_BEEN_RELOADED));
}
catch (Exception exception)
{
exception.printStackTrace();
}
return CommandResult.success();
}
}
| Change color of reload command's message to green | Change color of reload command's message to green
| Java | mit | Aquerr/EagleFactions,Aquerr/EagleFactions | java | ## Code Before:
package io.github.aquerr.eaglefactions.common.commands;
import io.github.aquerr.eaglefactions.api.EagleFactions;
import io.github.aquerr.eaglefactions.common.PluginInfo;
import io.github.aquerr.eaglefactions.common.messaging.Messages;
import org.spongepowered.api.command.CommandException;
import org.spongepowered.api.command.CommandResult;
import org.spongepowered.api.command.CommandSource;
import org.spongepowered.api.command.args.CommandContext;
import org.spongepowered.api.text.Text;
public class ReloadCommand extends AbstractCommand
{
public ReloadCommand(EagleFactions plugin)
{
super(plugin);
}
@Override
public CommandResult execute(CommandSource source, CommandContext context) throws CommandException
{
try
{
super.getPlugin().getConfiguration().reloadConfiguration();
super.getPlugin().getStorageManager().reloadStorage();
source.sendMessage(Text.of(PluginInfo.PLUGIN_PREFIX, Messages.CONFIG_HAS_BEEN_RELOADED));
}
catch (Exception exception)
{
exception.printStackTrace();
}
return CommandResult.success();
}
}
## Instruction:
Change color of reload command's message to green
## Code After:
package io.github.aquerr.eaglefactions.common.commands;
import io.github.aquerr.eaglefactions.api.EagleFactions;
import io.github.aquerr.eaglefactions.common.PluginInfo;
import io.github.aquerr.eaglefactions.common.messaging.Messages;
import org.spongepowered.api.command.CommandException;
import org.spongepowered.api.command.CommandResult;
import org.spongepowered.api.command.CommandSource;
import org.spongepowered.api.command.args.CommandContext;
import org.spongepowered.api.text.Text;
import org.spongepowered.api.text.format.TextColors;
public class ReloadCommand extends AbstractCommand
{
public ReloadCommand(EagleFactions plugin)
{
super(plugin);
}
@Override
public CommandResult execute(CommandSource source, CommandContext context) throws CommandException
{
try
{
super.getPlugin().getConfiguration().reloadConfiguration();
super.getPlugin().getStorageManager().reloadStorage();
source.sendMessage(Text.of(PluginInfo.PLUGIN_PREFIX, TextColors.GREEN, Messages.CONFIG_HAS_BEEN_RELOADED));
}
catch (Exception exception)
{
exception.printStackTrace();
}
return CommandResult.success();
}
}
|
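In Sponge's `Text.of(...)`, a color argument applies to the components that follow it, which is why `TextColors.GREEN` is inserted between the prefix and the message. The analogous move for a plain terminal is wrapping only the payload in ANSI codes — a loose illustration, unrelated to the Sponge API (the `[EF]` prefix is made up):

```python
GREEN = "\033[32m"
RESET = "\033[0m"

def green(message):
    # Color only the payload; the prefix keeps its own formatting.
    return f"{GREEN}{message}{RESET}"

line = "[EF] " + green("Config has been reloaded!")
print(line)
```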
4503e6671828497189736c86d408f6c0a8b47058 | lambda_tweet.py | lambda_tweet.py | import boto3
import tweepy
import json
import base64
from tweet_s3_images import TweetS3Images
with open('./config.json', 'r') as file:
config = json.loads(file.read())
# Decrypt API keys
client = boto3.client('kms')
response = client.decrypt(CiphertextBlob=base64.b64decode(config['secrets']))
secrets = json.loads(response['Plaintext'])
CONSUMER_KEY = secrets['consumer-key']
CONSUMER_SECRET = secrets['consumer-secret']
ACCESS_TOKEN = secrets['access-token']
ACCESS_TOKEN_SECRET = secrets['access-token-secret']
def lambda_handler(event, context):
print('Received event: ' + json.dumps(event, indent=2))
s3_info = event['Records'][0]['S3']
auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
api = tweepy.API(auth)
client = boto3.client('s3')
tweet_images = TweetS3Images(api, client)
tweet_images.send_image(s3_info['bucket']['name'], s3_info['object']['key'], cleanup=True)
| import boto3
import tweepy
import json
import base64
from tweet_s3_images import TweetS3Images
with open('./config.json', 'r') as file:
config = json.loads(file.read())
# Decrypt API keys
client = boto3.client('kms')
response = client.decrypt(CiphertextBlob=base64.b64decode(config['secrets']))
secrets = json.loads(response['Plaintext'])
CONSUMER_KEY = secrets['consumer-key']
CONSUMER_SECRET = secrets['consumer-secret']
ACCESS_TOKEN = secrets['access-token']
ACCESS_TOKEN_SECRET = secrets['access-token-secret']
def lambda_handler(event, context):
print('Received event: ' + json.dumps(event, indent=2))
print()
s3_info = event['Records'][0]['s3']
auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
api = tweepy.API(auth)
client = boto3.client('s3')
tweet_images = TweetS3Images(api, client)
tweet_images.send_image(s3_info['bucket']['name'], s3_info['object']['key'], cleanup=True)
| Update key name for S3 | Update key name for S3
| Python | mit | onema/lambda-tweet | python | ## Code Before:
import boto3
import tweepy
import json
import base64
from tweet_s3_images import TweetS3Images
with open('./config.json', 'r') as file:
config = json.loads(file.read())
# Decrypt API keys
client = boto3.client('kms')
response = client.decrypt(CiphertextBlob=base64.b64decode(config['secrets']))
secrets = json.loads(response['Plaintext'])
CONSUMER_KEY = secrets['consumer-key']
CONSUMER_SECRET = secrets['consumer-secret']
ACCESS_TOKEN = secrets['access-token']
ACCESS_TOKEN_SECRET = secrets['access-token-secret']
def lambda_handler(event, context):
print('Received event: ' + json.dumps(event, indent=2))
s3_info = event['Records'][0]['S3']
auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
api = tweepy.API(auth)
client = boto3.client('s3')
tweet_images = TweetS3Images(api, client)
tweet_images.send_image(s3_info['bucket']['name'], s3_info['object']['key'], cleanup=True)
## Instruction:
Update key name for S3
## Code After:
import boto3
import tweepy
import json
import base64
from tweet_s3_images import TweetS3Images
with open('./config.json', 'r') as file:
config = json.loads(file.read())
# Decrypt API keys
client = boto3.client('kms')
response = client.decrypt(CiphertextBlob=base64.b64decode(config['secrets']))
secrets = json.loads(response['Plaintext'])
CONSUMER_KEY = secrets['consumer-key']
CONSUMER_SECRET = secrets['consumer-secret']
ACCESS_TOKEN = secrets['access-token']
ACCESS_TOKEN_SECRET = secrets['access-token-secret']
def lambda_handler(event, context):
print('Received event: ' + json.dumps(event, indent=2))
print()
s3_info = event['Records'][0]['s3']
auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
api = tweepy.API(auth)
client = boto3.client('s3')
tweet_images = TweetS3Images(api, client)
tweet_images.send_image(s3_info['bucket']['name'], s3_info['object']['key'], cleanup=True)
|
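The one-character fix above matters because S3 event notifications are case-sensitive JSON: the per-record key is lowercase `s3`, while the top-level list is `Records`. A trimmed-down sample event (bucket and object names invented) shows the shape the handler walks:

```python
sample_event = {
    "Records": [
        {
            "eventSource": "aws:s3",
            "s3": {  # lowercase: event["Records"][0]["S3"] raises KeyError
                "bucket": {"name": "lambda-tweet-images"},
                "object": {"key": "cat.png"},
            },
        }
    ]
}

s3_info = sample_event["Records"][0]["s3"]
print(s3_info["bucket"]["name"], s3_info["object"]["key"])
# -> lambda-tweet-images cat.png
```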
7763fe33a9de7df02fe1a22e213a9ea0a3c2c79e | package.json | package.json | {
"name": "vim-mode",
"main": "./lib/vim-mode",
"version": "0.6.0",
"description": "Provides vim modal control of Atom, currently INCOMPLETE.",
"private": true,
"repository": {
"type": "git",
"url": "https://github.com/atom/vim-mode.git"
},
"bugs": {
"url": "https://github.com/atom/vim-mode/issues"
},
"engines": {
"atom": ">26.0"
},
"dependencies": {
"underscore-plus": "1.x"
}
}
| {
"name": "vim-mode",
"main": "./lib/vim-mode",
"version": "0.6.0",
"description": "Provides vim modal control of Atom, currently INCOMPLETE.",
"private": true,
"repository": "https://github.com/atom/vim-mode",
"engines": {
"atom": ">26.0"
},
"dependencies": {
"underscore-plus": "1.x"
}
}
| Use url as repository value | Use url as repository value
| JSON | mit | bronson/vim-mode-next,naorunaoru/vim-mode-next | json | ## Code Before:
{
"name": "vim-mode",
"main": "./lib/vim-mode",
"version": "0.6.0",
"description": "Provides vim modal control of Atom, currently INCOMPLETE.",
"private": true,
"repository": {
"type": "git",
"url": "https://github.com/atom/vim-mode.git"
},
"bugs": {
"url": "https://github.com/atom/vim-mode/issues"
},
"engines": {
"atom": ">26.0"
},
"dependencies": {
"underscore-plus": "1.x"
}
}
## Instruction:
Use url as repository value
## Code After:
{
"name": "vim-mode",
"main": "./lib/vim-mode",
"version": "0.6.0",
"description": "Provides vim modal control of Atom, currently INCOMPLETE.",
"private": true,
"repository": "https://github.com/atom/vim-mode",
"engines": {
"atom": ">26.0"
},
"dependencies": {
"underscore-plus": "1.x"
}
}
|
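`package.json` accepts either an object or a plain URL string for `repository`, which is what makes the shorthand above valid. Any tooling that reads the field has to accept both shapes — a small normalizer sketches the idea (illustrative, not npm's code):

```python
def repository_url(repo):
    """Return the repository URL from either package.json form."""
    if isinstance(repo, str):
        return repo
    return repo["url"]

old_form = {"type": "git", "url": "https://github.com/atom/vim-mode.git"}
new_form = "https://github.com/atom/vim-mode"

print(repository_url(old_form))  # -> https://github.com/atom/vim-mode.git
print(repository_url(new_form))  # -> https://github.com/atom/vim-mode
```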
99c8176760a600e2c13321b160612e1d761d2e56 | bookstore_app/static/styles/_forms.scss | bookstore_app/static/styles/_forms.scss | .form-group {
@include clearfix;
margin: 20px 0;
label {
float: left;
display: inline-block;
padding-top: 6px;
}
input[type='text'], input[type='password'] {
padding: 5px;
border-radius: 5px;
border: 1px solid #aaa;
float: left;
display: block;
// @include vertical-align();
}
}
.errors {
border: 1px solid $red;
margin: 20px 0;
padding: 9px 12px;
color: $red;
border-radius: 5px;
.custom-message {
margin: 0;
}
}
.errorlist {
padding: 0;
margin: 0;
display: inline-block;
list-style-type: none;
} | .form-group {
@include clearfix;
margin: 20px 0;
label {
float: left;
display: inline-block;
padding-top: 6px;
}
input[type='text'], input[type='password'] {
padding: 5px;
border-radius: 5px;
border: 1px solid #aaa;
float: left;
display: block;
// @include vertical-align();
}
}
.errors {
border: 1px solid $red;
margin: 20px 0;
padding: 9px 12px;
color: $red;
border-radius: 5px;
.custom-message {
margin: 0;
}
}
.errorlist {
padding: 0;
margin: 0;
display: inline-block;
list-style-type: none;
}
.rating-form {
input[type="submit"] {
padding: 4px 12px;
}
} | Add style for rating form | Add style for rating form
| SCSS | mit | siawyoung/bookstore,siawyoung/bookstore,siawyoung/bookstore | scss | ## Code Before:
.form-group {
@include clearfix;
margin: 20px 0;
label {
float: left;
display: inline-block;
padding-top: 6px;
}
input[type='text'], input[type='password'] {
padding: 5px;
border-radius: 5px;
border: 1px solid #aaa;
float: left;
display: block;
// @include vertical-align();
}
}
.errors {
border: 1px solid $red;
margin: 20px 0;
padding: 9px 12px;
color: $red;
border-radius: 5px;
.custom-message {
margin: 0;
}
}
.errorlist {
padding: 0;
margin: 0;
display: inline-block;
list-style-type: none;
}
## Instruction:
Add style for rating form
## Code After:
.form-group {
@include clearfix;
margin: 20px 0;
label {
float: left;
display: inline-block;
padding-top: 6px;
}
input[type='text'], input[type='password'] {
padding: 5px;
border-radius: 5px;
border: 1px solid #aaa;
float: left;
display: block;
// @include vertical-align();
}
}
.errors {
border: 1px solid $red;
margin: 20px 0;
padding: 9px 12px;
color: $red;
border-radius: 5px;
.custom-message {
margin: 0;
}
}
.errorlist {
padding: 0;
margin: 0;
display: inline-block;
list-style-type: none;
}
.rating-form {
input[type="submit"] {
padding: 4px 12px;
}
} |
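For readers skimming the diff above: the nested block added for the rating form compiles to the single descendant selector `.rating-form input[type="submit"]`, because Sass joins nested selectors with a space by default. A toy flattener states that rule:

```python
def flatten(parent, child):
    # Default Sass nesting: parent and child joined by a descendant combinator.
    return f"{parent} {child}"

print(flatten(".rating-form", 'input[type="submit"]'))
# -> .rating-form input[type="submit"]
```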
66b6188049f74b29dc64072c2633903be752a770 | README.md | README.md |
This package provides bindings to
[libtidy](http://www.html-tidy.org/developer/) a.k.a.
[TidyLib](http://api.html-tidy.org/tidy/tidylib_api_5.1.25/tidylib.html)
which can be used to parse and tidy up HTML 5.
The library is built as a native node extension,
so you don't have to have the HTML Tidy package installed on your system.
## Alternatives
* [tidy-html5](https://www.npmjs.com/package/tidy-html5)
  has libtidy compiled to JavaScript using emscripten.
It is likely more portable, but at the cost of performance.
* [tidy](https://www.npmjs.com/package/tidy)
and [tidy2](https://www.npmjs.com/package/tidy2)
also provide bindings for libtidy,
but they expect the library and its header files
to be installed on the system.
* [htmltidy](https://www.npmjs.com/package/htmltidy)
and [htmltidy2](https://www.npmjs.com/package/htmltidy2)
use the command line tool to tidy up html,
so the tool has to be installed and
they incur some process creation overhead.
|
This package provides bindings to
[libtidy](http://www.html-tidy.org/developer/) a.k.a.
[TidyLib](http://api.html-tidy.org/tidy/tidylib_api_5.1.25/tidylib.html)
which can be used to parse and tidy up HTML 5.
The library is built as a native node extension,
so you don't have to have the HTML Tidy package installed on your system.
## Alternatives
* [tidy-html5](https://www.npmjs.com/package/tidy-html5)
  has libtidy compiled to JavaScript using emscripten.
It is likely more portable, but at the cost of performance.
  Only supports synchronous operation.
* [tidy](https://www.npmjs.com/package/tidy)
and [tidy2](https://www.npmjs.com/package/tidy2)
also provide bindings for libtidy,
but they expect the library and its header files
to be installed on the system.
  Only supports synchronous operation.
* [htmltidy](https://www.npmjs.com/package/htmltidy)
and [htmltidy2](https://www.npmjs.com/package/htmltidy2)
use the command line tool to tidy up html,
so they incur some process creation overhead.
The binaries for the most common platforms are shipped with the package,
but other platforms are not supported.
This approach requires no build tools, though.
  Only supports asynchronous operation.
| Include new findings about alternative implementations | Include new findings about alternative implementations
| Markdown | mit | gagern/node-libtidy,gagern/node-libtidy,gagern/node-libtidy,gagern/node-libtidy | markdown | ## Code Before:
This package provides bindings to
[libtidy](http://www.html-tidy.org/developer/) a.k.a.
[TidyLib](http://api.html-tidy.org/tidy/tidylib_api_5.1.25/tidylib.html)
which can be used to parse and tidy up HTML 5.
The library is built as a native node extension,
so you don't have to have the HTML Tidy package installed on your system.
## Alternatives
* [tidy-html5](https://www.npmjs.com/package/tidy-html5)
  has libtidy compiled to JavaScript using emscripten.
It is likely more portable, but at the cost of performance.
* [tidy](https://www.npmjs.com/package/tidy)
and [tidy2](https://www.npmjs.com/package/tidy2)
also provide bindings for libtidy,
but they expect the library and its header files
to be installed on the system.
* [htmltidy](https://www.npmjs.com/package/htmltidy)
and [htmltidy2](https://www.npmjs.com/package/htmltidy2)
use the command line tool to tidy up html,
so the tool has to be installed and
they incur some process creation overhead.
## Instruction:
Include new findings about alternative implementations
## Code After:
This package provides bindings to
[libtidy](http://www.html-tidy.org/developer/) a.k.a.
[TidyLib](http://api.html-tidy.org/tidy/tidylib_api_5.1.25/tidylib.html)
which can be used to parse and tidy up HTML 5.
The library is built as a native node extension,
so you don't have to have the HTML Tidy package installed on your system.
## Alternatives
* [tidy-html5](https://www.npmjs.com/package/tidy-html5)
  has libtidy compiled to JavaScript using emscripten.
It is likely more portable, but at the cost of performance.
  Only supports synchronous operation.
* [tidy](https://www.npmjs.com/package/tidy)
and [tidy2](https://www.npmjs.com/package/tidy2)
also provide bindings for libtidy,
but they expect the library and its header files
to be installed on the system.
  Only supports synchronous operation.
* [htmltidy](https://www.npmjs.com/package/htmltidy)
and [htmltidy2](https://www.npmjs.com/package/htmltidy2)
use the command line tool to tidy up html,
so they incur some process creation overhead.
The binaries for the most common platforms are shipped with the package,
but other platforms are not supported.
This approach requires no build tools, though.
  Only supports asynchronous operation.
|
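The sync/async distinction the README above now tracks is a difference in API shape: an in-process binding can return the tidied result directly, while a subprocess-backed wrapper typically hands it to a callback. A minimal Python illustration of the two shapes (the `tidy` behavior here is a trivial stand-in, not real HTML Tidy):

```python
import queue
import threading

def tidy_sync(html):
    # Stand-in for a blocking, in-process binding call.
    return html.replace("<br>", "<br/>")

def tidy_async(html, callback):
    # Stand-in for the callback style of the subprocess-backed wrappers.
    worker = threading.Thread(target=lambda: callback(tidy_sync(html)))
    worker.start()
    return worker

results = queue.Queue()
tidy_async("a<br>b", results.put).join()
print(results.get())  # -> a<br/>b
```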
80c52ff11eb9bf5f069571cc7ac44c2c7503b026 | gulp/development/vendor.coffee | gulp/development/vendor.coffee | mainBowerFiles = require 'main-bower-files'
concat = require 'gulp-concat'
filter = require 'gulp-filter'
commonjsWrap = require 'gulp-wrap-commonjs'
pickFilesByExtension = (extension) -> filter (file) -> file.path.match new RegExp("." + extension + "$")
module.exports = (gulp) ->
gulp.task 'build:bower', ->
gulp.src mainBowerFiles()
.pipe pickFilesByExtension 'js'
.pipe commonjsWrap
pathModifier: (filePath) ->
matches = filePath.match /(bower_components|node_modules)\/(.*?)\//
moduleName = matches[2]
moduleName
.pipe concat 'vendor.js'
.pipe gulp.dest 'build/ui/scripts'
gulp.src mainBowerFiles()
.pipe pickFilesByExtension 'css'
.pipe concat 'vendor.css'
.pipe gulp.dest 'build/ui/styles'
gulp.task 'build:copy:scripts', ->
gulp.src 'node_modules/commonjs-require/commonjs-require.js'
.pipe gulp.dest('build/ui/scripts')
gulp.src 'bower_components/socket.io-client/socket.io.js'
.pipe gulp.dest('build/ui/scripts')
| mainBowerFiles = require 'main-bower-files'
concat = require 'gulp-concat'
filter = require 'gulp-filter'
commonjsWrap = require 'gulp-wrap-commonjs'
pickFilesByExtension = (extension) -> filter (file) -> file.path.match new RegExp("." + extension + "$")
module.exports = (gulp) ->
gulp.task 'build:bower', ->
gulp.src mainBowerFiles()
.pipe pickFilesByExtension 'js'
.pipe commonjsWrap
pathModifier: (filePath) ->
matches = filePath.match /(bower_components|node_modules)\/(.*?)\//
moduleName = matches[2]
moduleName
.pipe concat 'vendor.js'
.pipe gulp.dest 'build/ui/scripts'
gulp.src mainBowerFiles()
.pipe pickFilesByExtension 'css'
.pipe concat 'vendor.css'
.pipe gulp.dest 'build/ui/styles'
gulp.task 'build:copy:scripts', ->
gulp.src 'node_modules/eventric/build/dist/eventric.js'
.pipe gulp.dest('build/ui/scripts')
gulp.src 'node_modules/commonjs-require/commonjs-require.js'
.pipe gulp.dest('build/ui/scripts')
gulp.src 'bower_components/socket.io-client/socket.io.js'
.pipe gulp.dest('build/ui/scripts')
| Copy eventric js into scripts folder | Copy eventric js into scripts folder
| CoffeeScript | mit | sm0k1nggnu/entertain.io | coffeescript | ## Code Before:
mainBowerFiles = require 'main-bower-files'
concat = require 'gulp-concat'
filter = require 'gulp-filter'
commonjsWrap = require 'gulp-wrap-commonjs'
pickFilesByExtension = (extension) -> filter (file) -> file.path.match new RegExp("." + extension + "$")
module.exports = (gulp) ->
gulp.task 'build:bower', ->
gulp.src mainBowerFiles()
.pipe pickFilesByExtension 'js'
.pipe commonjsWrap
pathModifier: (filePath) ->
matches = filePath.match /(bower_components|node_modules)\/(.*?)\//
moduleName = matches[2]
moduleName
.pipe concat 'vendor.js'
.pipe gulp.dest 'build/ui/scripts'
gulp.src mainBowerFiles()
.pipe pickFilesByExtension 'css'
.pipe concat 'vendor.css'
.pipe gulp.dest 'build/ui/styles'
gulp.task 'build:copy:scripts', ->
gulp.src 'node_modules/commonjs-require/commonjs-require.js'
.pipe gulp.dest('build/ui/scripts')
gulp.src 'bower_components/socket.io-client/socket.io.js'
.pipe gulp.dest('build/ui/scripts')
## Instruction:
Copy eventric js into scripts folder
## Code After:
mainBowerFiles = require 'main-bower-files'
concat = require 'gulp-concat'
filter = require 'gulp-filter'
commonjsWrap = require 'gulp-wrap-commonjs'
pickFilesByExtension = (extension) -> filter (file) -> file.path.match new RegExp("." + extension + "$")
module.exports = (gulp) ->
gulp.task 'build:bower', ->
gulp.src mainBowerFiles()
.pipe pickFilesByExtension 'js'
.pipe commonjsWrap
pathModifier: (filePath) ->
matches = filePath.match /(bower_components|node_modules)\/(.*?)\//
moduleName = matches[2]
moduleName
.pipe concat 'vendor.js'
.pipe gulp.dest 'build/ui/scripts'
gulp.src mainBowerFiles()
.pipe pickFilesByExtension 'css'
.pipe concat 'vendor.css'
.pipe gulp.dest 'build/ui/styles'
gulp.task 'build:copy:scripts', ->
gulp.src 'node_modules/eventric/build/dist/eventric.js'
.pipe gulp.dest('build/ui/scripts')
gulp.src 'node_modules/commonjs-require/commonjs-require.js'
.pipe gulp.dest('build/ui/scripts')
gulp.src 'bower_components/socket.io-client/socket.io.js'
.pipe gulp.dest('build/ui/scripts')
|
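Stripped of the stream plumbing, each `gulp.src(...).pipe(gulp.dest(...))` pair in the task above is a file copy into the build tree. The same step in plain Python, using a throwaway directory in place of the real project layout:

```python
import shutil
import tempfile
from pathlib import Path

def copy_vendor(src, dest_dir):
    """Rough equivalent of gulp.src(src).pipe(gulp.dest(dest_dir)) for one file."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy(src, dest))

root = Path(tempfile.mkdtemp())
src = root / "eventric.js"
src.write_text("// bundled library")
copied = copy_vendor(src, root / "build" / "ui" / "scripts")
print(copied.name)  # -> eventric.js
```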
2b2aa166d5675c26e8c0866eada4ff5b7157636f | cfg/after/ftplugin/html.vim | cfg/after/ftplugin/html.vim | " make Vim recognize stylesheet imports
let &l:include = '<link\>.*\<rel="stylesheet".*\<href="\zs\S\{-}\ze"'
" set up emmet maps
call maps#MapEmmet()
| " make Vim recognize stylesheet imports
let &l:include = '<link\>.*\<rel="stylesheet".*\<href="\zs\S\{-}\ze"'
" set up emmet maps
call maps#MapEmmet()
" set up nicer soft breaks
setlocal wrap
setlocal linebreak
setlocal breakindent
| Set up HTML wrapping config for writing | Set up HTML wrapping config for writing
The linebreak and breakindent options are pretty nice when writing in
HTML, since I don't rely on hard wrapping to clip my HTML text. This
way, my Vim buffer still reads naturally, even while relying on soft
wrapping.
| VimL | mit | igemnace/vim-config,igemnace/vim-config,igemnace/vim-config | viml | ## Code Before:
" make Vim recognize stylesheet imports
let &l:include = '<link\>.*\<rel="stylesheet".*\<href="\zs\S\{-}\ze"'
" set up emmet maps
call maps#MapEmmet()
## Instruction:
Set up HTML wrapping config for writing
The linebreak and breakindent options are pretty nice when writing in
HTML, since I don't rely on hard wrapping to clip my HTML text. This
way, my Vim buffer still reads naturally, even while relying on soft
wrapping.
## Code After:
" make Vim recognize stylesheet imports
let &l:include = '<link\>.*\<rel="stylesheet".*\<href="\zs\S\{-}\ze"'
" set up emmet maps
call maps#MapEmmet()
" set up nicer soft breaks
setlocal wrap
setlocal linebreak
setlocal breakindent
|
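The distinction drawn in the commit message above is worth pinning down: hard wrapping (Vim's `textwidth` plus `gq`) inserts real newlines into the buffer, while the `wrap`/`linebreak`/`breakindent` trio only changes how long lines are displayed. A quick stdlib demonstration of what hard wrapping does to the underlying text:

```python
import textwrap

paragraph = "one two three four five six seven eight"
hard = textwrap.fill(paragraph, width=20)

print(repr(paragraph))  # source text: no newlines
print(repr(hard))       # hard wrapping mutated the text itself
```

Soft wrapping leaves `paragraph` untouched, which is exactly why it suits prose in HTML: the file's line structure never changes.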
a1573c73f024039eec70d9944a884c4a9acc1e2e | src/foo.txt | src/foo.txt | 12345
asdfg
jjjjjjjjjj
dies ist foo.txt
| 123456789ABCDEF
aaaaaaaaaaaaaaa
this is 3rdline
123456789abcdef
bbbbbbbbbbbbbbb
ccccccccccccccc
ddddddddddddddd
eeeeeeeeeeeeeee
fffffffffffffff
ggggggggggggggg
hhhhhhhhhhhhhhh
iiiiiiiiiiiiiii
jjjjjjjjjjjjjjj
kkkkkkkkkkkkkkk
lllllllllllllll
mmmmmmmmmmmmmmm
nnnnnnnnnnnnnnn
123456789abcdef
543210aaaaaaaaa
aaaaaaaaa012345
111111111111111
| Increase length of test file | Increase length of test file
| Text | mit | Luz/hexdino | text | ## Code Before:
12345
asdfg
jjjjjjjjjj
dies ist foo.txt
## Instruction:
Increase length of test file
## Code After:
123456789ABCDEF
aaaaaaaaaaaaaaa
this is 3rdline
123456789abcdef
bbbbbbbbbbbbbbb
ccccccccccccccc
ddddddddddddddd
eeeeeeeeeeeeeee
fffffffffffffff
ggggggggggggggg
hhhhhhhhhhhhhhh
iiiiiiiiiiiiiii
jjjjjjjjjjjjjjj
kkkkkkkkkkkkkkk
lllllllllllllll
mmmmmmmmmmmmmmm
nnnnnnnnnnnnnnn
123456789abcdef
543210aaaaaaaaa
aaaaaaaaa012345
111111111111111
|
228c9f12147ece8507cd098eee10e290671ede60 | js/forum/src/initializers/routes.js | js/forum/src/initializers/routes.js | import IndexPage from 'flarum/components/index-page';
import DiscussionPage from 'flarum/components/discussion-page';
import ActivityPage from 'flarum/components/activity-page';
import SettingsPage from 'flarum/components/settings-page';
export default function(app) {
app.routes = {
'index': ['/', IndexPage.component()],
'index.filter': ['/:filter', IndexPage.component()],
'discussion': ['/d/:id/:slug', DiscussionPage.component()],
'discussion.near': ['/d/:id/:slug/:near', DiscussionPage.component()],
'user': ['/u/:username', ActivityPage.component()],
'user.activity': ['/u/:username', ActivityPage.component()],
'user.discussions': ['/u/:username/discussions', ActivityPage.component({filter: 'discussion'})],
'user.posts': ['/u/:username/posts', ActivityPage.component({filter: 'post'})],
'settings': ['/settings', SettingsPage.component()]
};
}
| import IndexPage from 'flarum/components/index-page';
import DiscussionPage from 'flarum/components/discussion-page';
import ActivityPage from 'flarum/components/activity-page';
import SettingsPage from 'flarum/components/settings-page';
export default function(app) {
app.routes = {
'index': ['/', IndexPage.component()],
'index.filter': ['/:filter', IndexPage.component()],
'discussion': ['/d/:id/:slug', DiscussionPage.component()],
'discussion.near': ['/d/:id/:slug/:near', DiscussionPage.component()],
'user': ['/u/:username', ActivityPage.component()],
'user.activity': ['/u/:username', ActivityPage.component()],
'user.discussions': ['/u/:username/discussions', ActivityPage.component({filter: 'discussion'})],
'user.posts': ['/u/:username/posts', ActivityPage.component({filter: 'post'})],
'settings': ['/settings', SettingsPage.component()]
};
app.route.discussion = function(discussion, near) {
return app.route(near ? 'discussion.near' : 'discussion', {
id: discussion.id(),
slug: discussion.slug(),
near: near
});
};
app.route.post = function(post) {
return app.route('discussion.near', {
id: post.discussion().id(),
slug: post.discussion().slug(),
near: post.number()
});
};
}
| Add convenience route generation functions | Add convenience route generation functions
Many instances throughout the app need to be updated to use these :)
| JavaScript | mit | falconchen/core,Luceos/core,utkarshx/core,Onyx47/core,malayladu/core,malayladu/core,falconchen/core,kidaa/core,billmn/core,Luceos/core,Onyx47/core,dungphanxuan/core,huytd/core,kidaa/core,Albert221/core,kidaa/core,liukaijv/core,kirkbushell/core,liukaijv/core,malayladu/core,flarum/core,renyuneyun/core,Albert221/core,Albert221/core,huytd/core,vuthaihoc/core,renyuneyun/core,datitisev/core,zaksoup/core,billmn/core,datitisev/core,jubianchi/core,datitisev/core,Luceos/core,Albert221/core,dungphanxuan/core,falconchen/core,renyuneyun/core,flarum/core,zaksoup/core,zaksoup/core,kirkbushell/core,utkarshx/core,Onyx47/core,jubianchi/core,billmn/core,renyuneyun/core,datitisev/core,kirkbushell/core,vuthaihoc/core,flarum/core,Luceos/core,kirkbushell/core,malayladu/core,dungphanxuan/core,vuthaihoc/core | javascript | ## Code Before:
import IndexPage from 'flarum/components/index-page';
import DiscussionPage from 'flarum/components/discussion-page';
import ActivityPage from 'flarum/components/activity-page';
import SettingsPage from 'flarum/components/settings-page';
export default function(app) {
app.routes = {
'index': ['/', IndexPage.component()],
'index.filter': ['/:filter', IndexPage.component()],
'discussion': ['/d/:id/:slug', DiscussionPage.component()],
'discussion.near': ['/d/:id/:slug/:near', DiscussionPage.component()],
'user': ['/u/:username', ActivityPage.component()],
'user.activity': ['/u/:username', ActivityPage.component()],
'user.discussions': ['/u/:username/discussions', ActivityPage.component({filter: 'discussion'})],
'user.posts': ['/u/:username/posts', ActivityPage.component({filter: 'post'})],
'settings': ['/settings', SettingsPage.component()]
};
}
## Instruction:
Add convenience route generation functions
Many instances throughout the app need to be updated to use these :)
## Code After:
import IndexPage from 'flarum/components/index-page';
import DiscussionPage from 'flarum/components/discussion-page';
import ActivityPage from 'flarum/components/activity-page';
import SettingsPage from 'flarum/components/settings-page';
export default function(app) {
app.routes = {
'index': ['/', IndexPage.component()],
'index.filter': ['/:filter', IndexPage.component()],
'discussion': ['/d/:id/:slug', DiscussionPage.component()],
'discussion.near': ['/d/:id/:slug/:near', DiscussionPage.component()],
'user': ['/u/:username', ActivityPage.component()],
'user.activity': ['/u/:username', ActivityPage.component()],
'user.discussions': ['/u/:username/discussions', ActivityPage.component({filter: 'discussion'})],
'user.posts': ['/u/:username/posts', ActivityPage.component({filter: 'post'})],
'settings': ['/settings', SettingsPage.component()]
};
app.route.discussion = function(discussion, near) {
return app.route(near ? 'discussion.near' : 'discussion', {
id: discussion.id(),
slug: discussion.slug(),
near: near
});
};
app.route.post = function(post) {
return app.route('discussion.near', {
id: post.discussion().id(),
slug: post.discussion().slug(),
near: post.number()
});
};
}
|
8499614af60bc6cdf13a248c7abab3faeaa2e59f | install/dependency.sh | install/dependency.sh |
echo 'Installing Required packages for Compass...'
sudo yum install -y rsyslog logrotate ntp iproute openssh-clients python git wget python-setuptools python-netaddr python-flask python-flask-sqlalchemy python-amqplib amqp python-paramiko python-mock mod_wsgi httpd squid dhcp bind rsync yum-utils xinetd tftp-server gcc net-snmp-utils net-snmp net-snmp-python python-daemon unzip openssl openssl098e ca-certificates redis python-redis
if [[ "$?" != "0" ]]; then
echo "failed to install yum dependency"
exit 1
fi
sudo easy_install pip==1.2.1
if [[ "$?" != "0" ]]; then
echo "failed to install easy install"
exit 1
fi
sudo pip install flask-script flask-restful Celery six discover unittest2 pychef requests
if [[ "$?" != "0" ]]; then
echo "failed to install pip packages"
exit 1
fi
sudo chkconfig httpd on
sudo chkconfig squid on
sudo chkconfig xinetd on
sudo chkconfig dhcpd on
sudo chkconfig named on
sudo chkconfig sshd on
sudo chkconfig rsyslog on
sudo chkconfig ntpd on
sudo chkconfig redis on
sudo chkconfig iptables off
sudo chkconfig ip6tables off
|
echo 'Installing Required packages for Compass...'
sudo yum install -y rsyslog logrotate ntp iproute openssh-clients python git wget python-setuptools python-netaddr python-flask python-flask-sqlalchemy python-amqplib amqp python-paramiko python-mock mod_wsgi httpd squid dhcp bind rsync yum-utils xinetd tftp-server gcc net-snmp-utils net-snmp net-snmp-python python-daemon unzip openssl openssl098e ca-certificates redis python-redis
if [[ "$?" != "0" ]]; then
echo "failed to install yum dependency"
exit 1
fi
sudo easy_install pip==1.2.1
if [[ "$?" != "0" ]]; then
echo "failed to install easy install"
exit 1
fi
sudo pip install -r $COMPASSDIR/requirements.txt
sudo pip install -r $COMPASSDIR/test-requirements.txt
if [[ "$?" != "0" ]]; then
echo "failed to install pip packages"
exit 1
fi
sudo chkconfig httpd on
sudo chkconfig squid on
sudo chkconfig xinetd on
sudo chkconfig dhcpd on
sudo chkconfig named on
sudo chkconfig sshd on
sudo chkconfig rsyslog on
sudo chkconfig ntpd on
sudo chkconfig redis on
sudo chkconfig iptables off
sudo chkconfig ip6tables off
| Prepare the installation of dependencies according to requirement files | Prepare the installation of dependencies according to requirement files
Change-Id: I9a940217f5be6fbf4c303daca707836bc97d47e2
| Shell | apache-2.0 | openstack/compass-core,baigk/compass-core,stackforge/compass-core,stackforge/compass-core,openstack/compass-core,stackforge/compass-core,openstack/compass-core,baigk/compass-core,openstack/compass-core,baigk/compass-core,stackforge/compass-core | shell | ## Code Before:
echo 'Installing Required packages for Compass...'
sudo yum install -y rsyslog logrotate ntp iproute openssh-clients python git wget python-setuptools python-netaddr python-flask python-flask-sqlalchemy python-amqplib amqp python-paramiko python-mock mod_wsgi httpd squid dhcp bind rsync yum-utils xinetd tftp-server gcc net-snmp-utils net-snmp net-snmp-python python-daemon unzip openssl openssl098e ca-certificates redis python-redis
if [[ "$?" != "0" ]]; then
echo "failed to install yum dependency"
exit 1
fi
sudo easy_install pip==1.2.1
if [[ "$?" != "0" ]]; then
echo "failed to install easy install"
exit 1
fi
sudo pip install flask-script flask-restful Celery six discover unittest2 pychef requests
if [[ "$?" != "0" ]]; then
echo "failed to install pip packages"
exit 1
fi
sudo chkconfig httpd on
sudo chkconfig squid on
sudo chkconfig xinetd on
sudo chkconfig dhcpd on
sudo chkconfig named on
sudo chkconfig sshd on
sudo chkconfig rsyslog on
sudo chkconfig ntpd on
sudo chkconfig redis on
sudo chkconfig iptables off
sudo chkconfig ip6tables off
## Instruction:
Prepare the installation of dependencies according to requirement files
Change-Id: I9a940217f5be6fbf4c303daca707836bc97d47e2
## Code After:
echo 'Installing Required packages for Compass...'
sudo yum install -y rsyslog logrotate ntp iproute openssh-clients python git wget python-setuptools python-netaddr python-flask python-flask-sqlalchemy python-amqplib amqp python-paramiko python-mock mod_wsgi httpd squid dhcp bind rsync yum-utils xinetd tftp-server gcc net-snmp-utils net-snmp net-snmp-python python-daemon unzip openssl openssl098e ca-certificates redis python-redis
if [[ "$?" != "0" ]]; then
echo "failed to install yum dependency"
exit 1
fi
sudo easy_install pip==1.2.1
if [[ "$?" != "0" ]]; then
echo "failed to install easy install"
exit 1
fi
sudo pip install -r $COMPASSDIR/requirements.txt
sudo pip install -r $COMPASSDIR/test-requirements.txt
if [[ "$?" != "0" ]]; then
echo "failed to install pip packages"
exit 1
fi
sudo chkconfig httpd on
sudo chkconfig squid on
sudo chkconfig xinetd on
sudo chkconfig dhcpd on
sudo chkconfig named on
sudo chkconfig sshd on
sudo chkconfig rsyslog on
sudo chkconfig ntpd on
sudo chkconfig redis on
sudo chkconfig iptables off
sudo chkconfig ip6tables off
|
f977f3d7ebaa156446ff99f35b47c037823f207c | team/_posts/2016-09-13-john-huddleston.md | team/_posts/2016-09-13-john-huddleston.md | ---
layout: member
title: John Huddleston
position: Rotation student
handle: jlhudd
email: [email protected]
twitter: huddlej
github: huddlej
scholar: KCqa_hUAAAAJ
image: /images/team/john-huddleston.jpg
calendar: https://calendar.google.com/calendar/embed?src=jlhudd%40uw.edu&ctz=America/Los_Angeles
---
I am a first-year MCB student rotating in the Bedford Lab. I am generally
interested in evolutionary biology, genomics, and bioinformatics. My interests
in these fields developed while studying genetic algorithms during a master's
program for computer science at Western Washington University (WWU). This
research motivated me to return to school to study evolution where I explored
prezygotic reproductive isolation between species in the apple maggot species
group (*Rhagoletis*). During my master's in biology at WWU, I had the
opportunity to join Evan Eichler's lab in University of Washington's Genome
Sciences as a bioinformatics specialist where I studied structural variation and
genome assembly in primates.
| ---
layout: member
title: John Huddleston
position: Rotation student
handle: jlhudd
email: [email protected]
twitter: huddlej
github: huddlej
scholar: KCqa_hUAAAAJ
image: /images/team/john-huddleston.jpg
calendar: https://calendar.google.com/calendar/embed?src=jlhudd%40uw.edu&ctz=America/Los_Angeles
alumni: true
---
I am a first-year MCB student rotating in the Bedford Lab. I am generally
interested in evolutionary biology, genomics, and bioinformatics. My interests
in these fields developed while studying genetic algorithms during a master's
program for computer science at Western Washington University (WWU). This
research motivated me to return to school to study evolution where I explored
prezygotic reproductive isolation between species in the apple maggot species
group (*Rhagoletis*). During my master's in biology at WWU, I had the
opportunity to join Evan Eichler's lab in University of Washington's Genome
Sciences as a bioinformatics specialist where I studied structural variation and
genome assembly in primates.
| Move John to "alumni" (hopefully not for long). | Move John to "alumni" (hopefully not for long).
| Markdown | mit | AustenLamacraft/austenlamacraft.github.io,AustenLamacraft/austenlamacraft.github.io,AustenLamacraft/austenlamacraft.github.io | markdown | ## Code Before:
---
layout: member
title: John Huddleston
position: Rotation student
handle: jlhudd
email: [email protected]
twitter: huddlej
github: huddlej
scholar: KCqa_hUAAAAJ
image: /images/team/john-huddleston.jpg
calendar: https://calendar.google.com/calendar/embed?src=jlhudd%40uw.edu&ctz=America/Los_Angeles
---
I am a first-year MCB student rotating in the Bedford Lab. I am generally
interested in evolutionary biology, genomics, and bioinformatics. My interests
in these fields developed while studying genetic algorithms during a master's
program for computer science at Western Washington University (WWU). This
research motivated me to return to school to study evolution where I explored
prezygotic reproductive isolation between species in the apple maggot species
group (*Rhagoletis*). During my master's in biology at WWU, I had the
opportunity to join Evan Eichler's lab in University of Washington's Genome
Sciences as a bioinformatics specialist where I studied structural variation and
genome assembly in primates.
## Instruction:
Move John to "alumni" (hopefully not for long).
## Code After:
---
layout: member
title: John Huddleston
position: Rotation student
handle: jlhudd
email: [email protected]
twitter: huddlej
github: huddlej
scholar: KCqa_hUAAAAJ
image: /images/team/john-huddleston.jpg
calendar: https://calendar.google.com/calendar/embed?src=jlhudd%40uw.edu&ctz=America/Los_Angeles
alumni: true
---
I am a first-year MCB student rotating in the Bedford Lab. I am generally
interested in evolutionary biology, genomics, and bioinformatics. My interests
in these fields developed while studying genetic algorithms during a master's
program for computer science at Western Washington University (WWU). This
research motivated me to return to school to study evolution where I explored
prezygotic reproductive isolation between species in the apple maggot species
group (*Rhagoletis*). During my master's in biology at WWU, I had the
opportunity to join Evan Eichler's lab in University of Washington's Genome
Sciences as a bioinformatics specialist where I studied structural variation and
genome assembly in primates.
|
66bd4ad40101008e286dd2ad9252240de03930fd | README.md | README.md | [](https://travis-ci.org/Acosix/alfresco-maven)
# About
This project defines the Maven build framework used by Acosix GmbH for building Alfresco-related libraries and modules.
Acosix GmbH deliberately does not use the [Alfresco SDK](https://github.com/Alfresco/alfresco-sdk) which contains too many assumptions / opinions. The latest version (3.0 at the time of writing) even hard-wires some of the assumptions into plugin behaviour with little configurability. Earlier versions regularly caused side effects with standard Maven plugins related to source code or JavaDoc attachments. This makes it hard to customize / adapt SDK-based projects to specific requirements or simply different patterns of use - even regarding something simple as custom resource directories. | [](https://travis-ci.org/Acosix/alfresco-maven)
# About
This project defines the Maven build framework used by Acosix GmbH for building Alfresco-related libraries and modules.
Acosix GmbH deliberately does not use the [Alfresco SDK](https://github.com/Alfresco/alfresco-sdk) which contains too many assumptions / opinions. The latest version (3.0 at the time of writing) even hard-wires some of the assumptions into plugin behaviour with little configurability. Earlier versions regularly caused side effects with standard Maven plugins related to source code or JavaDoc attachments. This makes it hard to customize / adapt SDK-based projects to specific requirements or simply different patterns of use - even regarding something simple as custom resource directories.
# Use in projects
TBD
## Using SNAPSHOT builds
In order to use a pre-built SNAPSHOT artifact published to the Open Source Sonatype Repository Hosting site, the artifact repository may need to be added to the POM, global settings.xml or an artifact repository proxy server. The following is the XML snippet for inclusion in a POM file.
```xml
<repositories>
<repository>
<id>ossrh</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
``` | Add note about SNAPSHOT repository and correct minor typo | Add note about SNAPSHOT repository and correct minor typo
| Markdown | apache-2.0 | Acosix/alfresco-maven | markdown | ## Code Before:
[](https://travis-ci.org/Acosix/alfresco-maven)
# About
This project defines the Maven build framework used by Acosix GmbH for building Alfresco-related libraries and modules.
Acosix GmbH deliberately does not use the [Alfresco SDK](https://github.com/Alfresco/alfresco-sdk) which contains too many assumptions / opinions. The latest version (3.0 at the time of writing) even hard-wires some of the assumptions into plugin behaviour with little configurability. Earlier versions regularly caused side effects with standard Maven plugins related to source code or JavaDoc attachments. This makes it hard to customize / adapt SDK-based projects to specific requirements or simply different patterns of use - even regarding something simple as custom resource directories.
## Instruction:
Add note about SNAPSHOT repository and correct minor typo
## Code After:
[](https://travis-ci.org/Acosix/alfresco-maven)
# About
This project defines the Maven build framework used by Acosix GmbH for building Alfresco-related libraries and modules.
Acosix GmbH deliberately does not use the [Alfresco SDK](https://github.com/Alfresco/alfresco-sdk) which contains too many assumptions / opinions. The latest version (3.0 at the time of writing) even hard-wires some of the assumptions into plugin behaviour with little configurability. Earlier versions regularly caused side effects with standard Maven plugins related to source code or JavaDoc attachments. This makes it hard to customize / adapt SDK-based projects to specific requirements or simply different patterns of use - even regarding something simple as custom resource directories.
# Use in projects
TBD
## Using SNAPSHOT builds
In order to use a pre-built SNAPSHOT artifact published to the Open Source Sonatype Repository Hosting site, the artifact repository may need to be added to the POM, global settings.xml or an artifact repository proxy server. The following is the XML snippet for inclusion in a POM file.
```xml
<repositories>
<repository>
<id>ossrh</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
``` |
92739af42901e2479135d88abb44b7ff87b986dd | test/display/CMakeLists.txt | test/display/CMakeLists.txt | set(CMAKE_INCLUDE_CURRENT_DIR ON)
add_executable(osvr_print_displays osvr_print_displays.cpp)
target_link_libraries(osvr_print_displays PRIVATE osvrDisplay)
target_include_directories(osvr_print_displays SYSTEM PRIVATE "${CMAKE_SOURCE_DIR}/src")
set_property(TARGET osvr_print_displays PROPERTY CXX_STANDARD 11)
target_compile_features(osvr_print_displays PRIVATE cxx_override)
| set(CMAKE_INCLUDE_CURRENT_DIR ON)
add_executable(osvr_print_displays osvr_print_displays.cpp)
target_link_libraries(osvr_print_displays PRIVATE osvrDisplay)
target_include_directories(osvr_print_displays SYSTEM PRIVATE "${CMAKE_SOURCE_DIR}/src")
set_property(TARGET osvr_print_displays PROPERTY CXX_STANDARD 11)
target_compile_features(osvr_print_displays PRIVATE cxx_override)
install(TARGETS osvr_print_displays
RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR} COMPONENT Runtime)
| Install osvr_print_displays program if built. | Install osvr_print_displays program if built.
| Text | apache-2.0 | OSVR/SteamVR-OSVR,OSVR/SteamVR-OSVR,OSVR/SteamVR-OSVR | text | ## Code Before:
set(CMAKE_INCLUDE_CURRENT_DIR ON)
add_executable(osvr_print_displays osvr_print_displays.cpp)
target_link_libraries(osvr_print_displays PRIVATE osvrDisplay)
target_include_directories(osvr_print_displays SYSTEM PRIVATE "${CMAKE_SOURCE_DIR}/src")
set_property(TARGET osvr_print_displays PROPERTY CXX_STANDARD 11)
target_compile_features(osvr_print_displays PRIVATE cxx_override)
## Instruction:
Install osvr_print_displays program if built.
## Code After:
set(CMAKE_INCLUDE_CURRENT_DIR ON)
add_executable(osvr_print_displays osvr_print_displays.cpp)
target_link_libraries(osvr_print_displays PRIVATE osvrDisplay)
target_include_directories(osvr_print_displays SYSTEM PRIVATE "${CMAKE_SOURCE_DIR}/src")
set_property(TARGET osvr_print_displays PROPERTY CXX_STANDARD 11)
target_compile_features(osvr_print_displays PRIVATE cxx_override)
install(TARGETS osvr_print_displays
RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR} COMPONENT Runtime)
|
bded38f349dba35241bf69d112160d3af555d622 | src/FormModel/Traits/Formable.php | src/FormModel/Traits/Formable.php | <?php
namespace Kregel\FormModel\Traits;
trait Formable
{
protected $form_name;
/**
* @return mixed
*/
public function getFormName()
{
return $this->form_name;
}
/**
* @param mixed $form_name
*/
public function setFormName($form_name)
{
$this->form_name = $form_name;
}
}
| <?php
namespace Kregel\FormModel\Traits;
trait Formable
{
/**
* @return mixed
*/
public function getFormName()
{
return $this->form_name;
}
/**
* @param mixed $form_name
*/
public function setFormName($form_name)
{
$this->form_name = $form_name;
}
}
| Remove form_name property because it broke FormModel | Remove form_name property because it broke FormModel
| PHP | mit | austinkregel/formmodel,austinkregel/formmodel | php | ## Code Before:
<?php
namespace Kregel\FormModel\Traits;
trait Formable
{
protected $form_name;
/**
* @return mixed
*/
public function getFormName()
{
return $this->form_name;
}
/**
* @param mixed $form_name
*/
public function setFormName($form_name)
{
$this->form_name = $form_name;
}
}
## Instruction:
Remove form_name property because it broke FormModel
## Code After:
<?php
namespace Kregel\FormModel\Traits;
trait Formable
{
/**
* @return mixed
*/
public function getFormName()
{
return $this->form_name;
}
/**
* @param mixed $form_name
*/
public function setFormName($form_name)
{
$this->form_name = $form_name;
}
}
|
e166096663af9d4b13b2cf20ae4e412143599813 | javascripts/custom/ajax.js | javascripts/custom/ajax.js | // Define DrupalAjaxRequest
var DrupalAjaxRequest = (function () {
var fetch = function(feed_id, callback) {
Zepto.ajax(
{
url: Kiosk.contentUrl(feed_id),
dataType: 'jsonp',
type: 'GET',
cache: false,
success: function (result) {
callback(result);
}
}
);
}
return {
fetch : fetch
}
})(); | // Define DrupalAjaxRequest
var DrupalAjaxRequest = (function () {
var fetchNode = function(node_id, callback) {
Zepto.ajax(
{
url: Kiosk.contentUrl('node'),
dataType: 'jsonp',
data: {nid: node_id},
type: 'GET',
cache: false,
success: function (result) {
callback(result);
}
}
);
}
var fetchCollections = function(callback) {
Zepto.ajax(
{
url: Kiosk.contentUrl('collections'),
dataType: 'jsonp',
type: 'GET',
cache: false,
success: function (result) {
callback(result);
}
}
);
}
return {
fetchNode : fetchNode,
fetchCollections: fetchCollections
}
})(); | Add a method for fetching a node, and collections list (artifacts). | Add a method for fetching a node, and collections list (artifacts).
| JavaScript | mit | historiclewes/pilots-kiosk,historiclewes/pilots-kiosk,historiclewes/pilots-kiosk | javascript | ## Code Before:
// Define DrupalAjaxRequest
var DrupalAjaxRequest = (function () {
var fetch = function(feed_id, callback) {
Zepto.ajax(
{
url: Kiosk.contentUrl(feed_id),
dataType: 'jsonp',
type: 'GET',
cache: false,
success: function (result) {
callback(result);
}
}
);
}
return {
fetch : fetch
}
})();
## Instruction:
Add a method for fetching a node, and collections list (artifacts).
## Code After:
// Define DrupalAjaxRequest
var DrupalAjaxRequest = (function () {
var fetchNode = function(node_id, callback) {
Zepto.ajax(
{
url: Kiosk.contentUrl('node'),
dataType: 'jsonp',
data: {nid: node_id},
type: 'GET',
cache: false,
success: function (result) {
callback(result);
}
}
);
}
var fetchCollections = function(callback) {
Zepto.ajax(
{
url: Kiosk.contentUrl('collections'),
dataType: 'jsonp',
type: 'GET',
cache: false,
success: function (result) {
callback(result);
}
}
);
}
return {
fetchNode : fetchNode,
fetchCollections: fetchCollections
}
})(); |
9c7b1def1f8a6c13d359f613b58290b396f464fb | appveyor.yml | appveyor.yml | environment:
nodejs_version: "0.12"
virustotal_apikey:
secure: hvKv/Iat286vN6iH88eh1yesAgbq7XTXR+3/P37Jdn8EzUzmI3QIOWgsDi/80neS9vAmDf8UIWjSWUqBGqHn/I39it4p3Vaj+dhGN/mhaAU=
install:
- ps: Install-Product node $env:nodejs_version
- npm install
- npm install -g bower
- npm install -g gulp
- bower install
build_script: gulp build
after_build:
- gulp build-installer
- 7z a -r LoginWars2.zip %APPVEYOR_BUILD_FOLDER%\build\LoginWars2-win32-x64
- ps: curl -fsS -X POST "https://www.virustotal.com/vtapi/v2/file/scan" --form apikey=$env:virustotal_apikey --form file=LoginWars2Setup.exe
artifacts:
- path: installer/*.nupkg
name: nupkgs
- path: installer/RELEASES
name: release
- path: installer/LoginWars2Setup.exe
name: setup
- path: LoginWars2.zip
name: zip
deploy:
provider: GitHub
auth_token:
secure: 3BXDg8DUFbAB7x2kUA7VSnx1dK8Pej/S2SOREIn4YieSaPFYQcT1z/po8DFqhyYm
artifact: nupkgs, release, setup, zip
description: "changelog coming soon..."
draft: true
prerelease: false
on:
appveyor_repo_tag: true
| environment:
nodejs_version: "0.12"
install:
- ps: Install-Product node $env:nodejs_version
- npm install
- npm install -g bower
- npm install -g gulp
- bower install
build_script: gulp build
after_build:
- gulp build-installer
- 7z a -r LoginWars2.zip %APPVEYOR_BUILD_FOLDER%\build\LoginWars2-win32-x64
artifacts:
- path: installer/*.nupkg
name: nupkgs
- path: installer/RELEASES
name: release
- path: installer/LoginWars2Setup.exe
name: setup
- path: LoginWars2.zip
name: zip
deploy:
provider: GitHub
auth_token:
secure: 3BXDg8DUFbAB7x2kUA7VSnx1dK8Pej/S2SOREIn4YieSaPFYQcT1z/po8DFqhyYm
artifact: nupkgs, release, setup, zip
description: "changelog coming soon..."
draft: true
prerelease: false
on:
appveyor_repo_tag: true
| Revert "test virustotal integration to continious delivery process" because they do not allow files as big as mine :( | Revert "test virustotal integration to continious delivery process"
because they do not allow files as big as mine :(
This reverts commit 483266b8c874c9fcb4418b338d9d4f45385a26d6.
| YAML | mit | kasoki/LoginWars2,kasoki/LoginWars2,atomicptr/LoginWars2,atomicptr/LoginWars2 | yaml | ## Code Before:
environment:
nodejs_version: "0.12"
virustotal_apikey:
secure: hvKv/Iat286vN6iH88eh1yesAgbq7XTXR+3/P37Jdn8EzUzmI3QIOWgsDi/80neS9vAmDf8UIWjSWUqBGqHn/I39it4p3Vaj+dhGN/mhaAU=
install:
- ps: Install-Product node $env:nodejs_version
- npm install
- npm install -g bower
- npm install -g gulp
- bower install
build_script: gulp build
after_build:
- gulp build-installer
- 7z a -r LoginWars2.zip %APPVEYOR_BUILD_FOLDER%\build\LoginWars2-win32-x64
- ps: curl -fsS -X POST "https://www.virustotal.com/vtapi/v2/file/scan" --form apikey=$env:virustotal_apikey --form file=LoginWars2Setup.exe
artifacts:
- path: installer/*.nupkg
name: nupkgs
- path: installer/RELEASES
name: release
- path: installer/LoginWars2Setup.exe
name: setup
- path: LoginWars2.zip
name: zip
deploy:
provider: GitHub
auth_token:
secure: 3BXDg8DUFbAB7x2kUA7VSnx1dK8Pej/S2SOREIn4YieSaPFYQcT1z/po8DFqhyYm
artifact: nupkgs, release, setup, zip
description: "changelog coming soon..."
draft: true
prerelease: false
on:
appveyor_repo_tag: true
## Instruction:
Revert "test virustotal integration to continious delivery process"
because they do not allow files as big as mine :(
This reverts commit 483266b8c874c9fcb4418b338d9d4f45385a26d6.
## Code After:
environment:
nodejs_version: "0.12"
install:
- ps: Install-Product node $env:nodejs_version
- npm install
- npm install -g bower
- npm install -g gulp
- bower install
build_script: gulp build
after_build:
- gulp build-installer
- 7z a -r LoginWars2.zip %APPVEYOR_BUILD_FOLDER%\build\LoginWars2-win32-x64
artifacts:
- path: installer/*.nupkg
name: nupkgs
- path: installer/RELEASES
name: release
- path: installer/LoginWars2Setup.exe
name: setup
- path: LoginWars2.zip
name: zip
deploy:
provider: GitHub
auth_token:
secure: 3BXDg8DUFbAB7x2kUA7VSnx1dK8Pej/S2SOREIn4YieSaPFYQcT1z/po8DFqhyYm
artifact: nupkgs, release, setup, zip
description: "changelog coming soon..."
draft: true
prerelease: false
on:
appveyor_repo_tag: true
|
8d96f8947e8e1c6344664309d3e371ea4b67b976 | app/soc/templates/modules/gsoc/templatetags/_as_proposal_duplicates.html | app/soc/templates/modules/gsoc/templatetags/_as_proposal_duplicates.html | {% comment %}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
{% endcomment %}
<li>
Student:<strong>
<a href="/student/show/{{ duplicate.student.key.id_or_name }}">{{ duplicate.student.name }}</a>
</strong>
<a href="mailto: {{ duplicate.student.email }}">{{ duplicate.student.email }}</a>
<ul>
{% for key, org in orgs.items %}
<li>Organization:
<a href="/org/show/{{ key }}">{{ org.name }}</a>
Admins: {% for oa in org.admins %}
<a href="mailto: {{ oa.email }}">'"{{ oa.name }}" <{{ oa.email }}>'</a>
{% endfor %}
<ul>
{% for proposal in org.proposals %}
<li>Proposal:
<a href="/student_proposal/show/{{ proposal.key }}">
{{ proposal.title }}
</a>
</li>
{% endfor %}
</ul>
</li>
{% endfor %}
</ul>
</li>
| {% comment %}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
{% endcomment %}
<li>
Student:<strong>
<a href="/student/show/{{ duplicate.student.key.id_or_name }}">{{ duplicate.student.name }}</a>
</strong>
<a href="mailto: {{ duplicate.student.email }}">{{ duplicate.student.email }}</a>
<ul>
{% for key, org in orgs.items %}
<li>Organization:
<a href="/org/show/{{ key }}">{{ org.name }}</a>
Admins: {% for oa in org.admins %}
<a href="mailto: {{ oa.email }}">'"{{ oa.name }}" <{{ oa.email }}>'</a>
{% endfor %}
<ul>
{% for proposal in org.proposals %}
<li>Proposal:
<a href="/gsoc/student_proposal/private/{{ proposal.key }}">
{{ proposal.title }}
</a>
</li>
{% endfor %}
</ul>
</li>
{% endfor %}
</ul>
</li>
| Fix the url for student proposals where all the duplicates are listed. | Fix the url for student proposals where all the duplicates are listed.
| HTML | apache-2.0 | MatthewWilkes/mw4068-packaging,MatthewWilkes/mw4068-packaging,MatthewWilkes/mw4068-packaging,MatthewWilkes/mw4068-packaging | html | ## Code Before:
{% comment %}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
{% endcomment %}
<li>
Student:<strong>
<a href="/student/show/{{ duplicate.student.key.id_or_name }}">{{ duplicate.student.name }}</a>
</strong>
<a href="mailto: {{ duplicate.student.email }}">{{ duplicate.student.email }}</a>
<ul>
{% for key, org in orgs.items %}
<li>Organization:
<a href="/org/show/{{ key }}">{{ org.name }}</a>
Admins: {% for oa in org.admins %}
<a href="mailto: {{ oa.email }}">'"{{ oa.name }}" <{{ oa.email }}>'</a>
{% endfor %}
<ul>
{% for proposal in org.proposals %}
<li>Proposal:
<a href="/student_proposal/show/{{ proposal.key }}">
{{ proposal.title }}
</a>
</li>
{% endfor %}
</ul>
</li>
{% endfor %}
</ul>
</li>
## Instruction:
Fix the url for student proposals where all the duplicates are listed.
## Code After:
{% comment %}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
{% endcomment %}
<li>
Student:<strong>
<a href="/student/show/{{ duplicate.student.key.id_or_name }}">{{ duplicate.student.name }}</a>
</strong>
<a href="mailto: {{ duplicate.student.email }}">{{ duplicate.student.email }}</a>
<ul>
{% for key, org in orgs.items %}
<li>Organization:
<a href="/org/show/{{ key }}">{{ org.name }}</a>
Admins: {% for oa in org.admins %}
<a href="mailto: {{ oa.email }}">'"{{ oa.name }}" <{{ oa.email }}>'</a>
{% endfor %}
<ul>
{% for proposal in org.proposals %}
<li>Proposal:
<a href="/gsoc/student_proposal/private/{{ proposal.key }}">
{{ proposal.title }}
</a>
</li>
{% endfor %}
</ul>
</li>
{% endfor %}
</ul>
</li>
|
24f0174c9fa1b5160cc8768895301aff5bd8e9ec | lib/analytics/scripts/usernames_to_csv.rb | lib/analytics/scripts/usernames_to_csv.rb |
require 'csv'
# all the usernames
usernames = User.joins(:courses_users).uniq.pluck(:wiki_id)
CSV.open('/root/all_course_participants.csv', 'wb') do |csv|
usernames.each do |username|
csv << [username]
end
end
# student usernames by cohort
Cohort.all.each do |cohort|
CSV.open("/root/#{cohort.slug}_students.csv", 'wb') do |csv|
cohort.students.each do |student|
csv << [student.wiki_id]
end
end
end
# student usernames and courses, by cohort
Cohort.all.each do |cohort|
CSV.open("/root/#{cohort.slug}_students.csv", 'wb') do |csv|
cohort.courses.each do |course|
course.students.each do |student|
csv << [student.wiki_id, course.slug]
end
end
end
end
|
require 'csv'
# all the usernames
usernames = User.joins(:courses_users).uniq.pluck(:wiki_id)
CSV.open('/root/all_course_participants.csv', 'wb') do |csv|
usernames.each do |username|
csv << [username]
end
end
# student usernames by cohort
Cohort.all.each do |cohort|
CSV.open("/root/#{cohort.slug}_students.csv", 'wb') do |csv|
cohort.students.each do |student|
csv << [student.wiki_id]
end
end
end
# student usernames and courses, by cohort
Cohort.all.each do |cohort|
CSV.open("/root/#{cohort.slug}_students.csv", 'wb') do |csv|
cohort.courses.each do |course|
course.students.each do |student|
csv << [student.wiki_id, course.slug]
end
end
end
end
# courses, instructor usernames and ids
CSV.open("/root/course_instructors.csv", 'wb') do |csv|
csv << ['course', 'instructor_user_id', 'instructor username']
Course.all.each do |course|
course.instructors.each do |instructor|
csv << [course.slug, instructor.id, instructor.wiki_id]
end
end
end
| Add another query to the scripts archive | Add another query to the scripts archive
| Ruby | mit | MusikAnimal/WikiEduDashboard,sejalkhatri/WikiEduDashboard,WikiEducationFoundation/WikiEduDashboard,alpha721/WikiEduDashboard,Wowu/WikiEduDashboard,KarmaHater/WikiEduDashboard,majakomel/WikiEduDashboard,WikiEducationFoundation/WikiEduDashboard,feelfreelinux/WikiEduDashboard,Wowu/WikiEduDashboard,MusikAnimal/WikiEduDashboard,majakomel/WikiEduDashboard,KarmaHater/WikiEduDashboard,adamwight/WikiEduDashboard,feelfreelinux/WikiEduDashboard,sejalkhatri/WikiEduDashboard,sejalkhatri/WikiEduDashboard,sejalkhatri/WikiEduDashboard,Wowu/WikiEduDashboard,KarmaHater/WikiEduDashboard,WikiEducationFoundation/WikiEduDashboard,KarmaHater/WikiEduDashboard,WikiEducationFoundation/WikiEduDashboard,alpha721/WikiEduDashboard,majakomel/WikiEduDashboard,Wowu/WikiEduDashboard,adamwight/WikiEduDashboard,alpha721/WikiEduDashboard,MusikAnimal/WikiEduDashboard,sejalkhatri/WikiEduDashboard,adamwight/WikiEduDashboard,majakomel/WikiEduDashboard,feelfreelinux/WikiEduDashboard,MusikAnimal/WikiEduDashboard,feelfreelinux/WikiEduDashboard,WikiEducationFoundation/WikiEduDashboard,alpha721/WikiEduDashboard | ruby | ## Code Before:
require 'csv'
# all the usernames
usernames = User.joins(:courses_users).uniq.pluck(:wiki_id)
CSV.open('/root/all_course_participants.csv', 'wb') do |csv|
usernames.each do |username|
csv << [username]
end
end
# student usernames by cohort
Cohort.all.each do |cohort|
CSV.open("/root/#{cohort.slug}_students.csv", 'wb') do |csv|
cohort.students.each do |student|
csv << [student.wiki_id]
end
end
end
# student usernames and courses, by cohort
Cohort.all.each do |cohort|
CSV.open("/root/#{cohort.slug}_students.csv", 'wb') do |csv|
cohort.courses.each do |course|
course.students.each do |student|
csv << [student.wiki_id, course.slug]
end
end
end
end
## Instruction:
Add another query to the scripts archive
## Code After:
require 'csv'
# all the usernames
usernames = User.joins(:courses_users).uniq.pluck(:wiki_id)
CSV.open('/root/all_course_participants.csv', 'wb') do |csv|
usernames.each do |username|
csv << [username]
end
end
# student usernames by cohort
Cohort.all.each do |cohort|
CSV.open("/root/#{cohort.slug}_students.csv", 'wb') do |csv|
cohort.students.each do |student|
csv << [student.wiki_id]
end
end
end
# student usernames and courses, by cohort
Cohort.all.each do |cohort|
CSV.open("/root/#{cohort.slug}_students.csv", 'wb') do |csv|
cohort.courses.each do |course|
course.students.each do |student|
csv << [student.wiki_id, course.slug]
end
end
end
end
# courses, instructor usernames and ids
CSV.open("/root/course_instructors.csv", 'wb') do |csv|
csv << ['course', 'instructor_user_id', 'instructor username']
Course.all.each do |course|
course.instructors.each do |instructor|
csv << [course.slug, instructor.id, instructor.wiki_id]
end
end
end
|
333f8850cf9a0395fdf71a8fe292ff2d33c93e6f | Reseter.php | Reseter.php | <?php
namespace FOQ\ElasticaBundle;
/**
* Deletes and recreates indexes
**/
class Reseter
{
protected $indexManager;
public function __construct(IndexManager $indexManager)
{
$this->indexManager = $indexManager;
}
/**
* Resets all indexes
*
* @return null
**/
public function reset()
{
foreach ($this->indexManager->getAllIndexes() as $index) {
$index->delete();
$index->create();
}
}
}
| <?php
namespace FOQ\ElasticaBundle;
use Elastica_Exception_Response;
/**
* Deletes and recreates indexes
**/
class Reseter
{
protected $indexManager;
public function __construct(IndexManager $indexManager)
{
$this->indexManager = $indexManager;
}
/**
* Resets all indexes
*
* @return null
**/
public function reset()
{
foreach ($this->indexManager->getAllIndexes() as $index) {
try {
$index->delete();
} catch (Elastica_Exception_Response $e) {
// The index does not exist
}
$index->create();
}
}
}
| Fix issue in reseter when an index does not exist yet | Fix issue in reseter when an index does not exist yet
| PHP | mit | ramuss/FazlandElasticaBundle,alekitto/FazlandElasticaBundle,Exercise/FOQElasticaBundle,fazland/FazlandElasticaBundle,alekitto/FazlandElasticaBundle,fazland/FazlandElasticaBundle,peterkokot/FOSElasticaBundle,ramuss/FazlandElasticaBundle | php | ## Code Before:
<?php
namespace FOQ\ElasticaBundle;
/**
* Deletes and recreates indexes
**/
class Reseter
{
protected $indexManager;
public function __construct(IndexManager $indexManager)
{
$this->indexManager = $indexManager;
}
/**
* Resets all indexes
*
* @return null
**/
public function reset()
{
foreach ($this->indexManager->getAllIndexes() as $index) {
$index->delete();
$index->create();
}
}
}
## Instruction:
Fix issue in reseter when an index does not exist yet
## Code After:
<?php
namespace FOQ\ElasticaBundle;
use Elastica_Exception_Response;
/**
* Deletes and recreates indexes
**/
class Reseter
{
protected $indexManager;
public function __construct(IndexManager $indexManager)
{
$this->indexManager = $indexManager;
}
/**
* Resets all indexes
*
* @return null
**/
public function reset()
{
foreach ($this->indexManager->getAllIndexes() as $index) {
try {
$index->delete();
} catch (Elastica_Exception_Response $e) {
// The index does not exist
}
$index->create();
}
}
}
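The commit above illustrates a common idempotent-reset pattern: ignore only the "index does not exist" failure from the delete step, then always recreate. A minimal sketch of that pattern in plain JavaScript (the index object and the `IndexMissingError` name are hypothetical stand-ins for illustration, not Elastica's API):

```javascript
// Sketch of the reset pattern from the commit above. The error name is an
// assumption; the real PHP code catches Elastica_Exception_Response.
function resetIndex(index) {
  try {
    index.delete(); // throws if the index was never created
  } catch (err) {
    if (err.name !== "IndexMissingError") throw err; // swallow only "missing"
  }
  index.create(); // always ends with a fresh, empty index
}

// Example: an index whose delete() fails because nothing exists yet.
const calls = [];
const missingIndex = {
  delete() {
    const e = new Error("no such index");
    e.name = "IndexMissingError";
    throw e;
  },
  create() {
    calls.push("create");
  },
};
resetIndex(missingIndex); // does not throw
console.log(calls); // → [ 'create' ]
```

Any error other than the hypothetical "missing index" one still propagates, which matches the intent of catching only the response exception around `$index->delete()`.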
|
12750a0bfb64a46c8bd5fce4d6dce3d406969dc9 | .travis.yml | .travis.yml | sudo: false
language: "python"
branches:
only:
- master
matrix:
include:
- python: 2.7.13
env: TOXENV=py27
- python: pypy2.7-5.8.0
env: TOXENV=pypy
install:
- pip install -U pip setuptools wheel
- pip install tox codecov
script:
- tox
after_success:
- codecov
notifications:
email: false
irc:
channels: "chat.freenode.net#divmod"
template:
- "%{repository}@%{branch} - %{author}: %{message} (%{build_url})"
use_notice: true
| sudo: false
language: python
dist: xenial
branches:
only:
- master
matrix:
include:
- python: 2.7.13
env: TOXENV=py27
- python: pypy2.7-6.0
env: TOXENV=pypy
install:
- pip install -U pip setuptools wheel
- pip install tox codecov
script:
- tox
after_success:
- codecov
notifications:
email: false
irc:
channels: "chat.freenode.net#divmod"
template:
- "%{repository}@%{branch} - %{author}: %{message} (%{build_url})"
use_notice: true
| Upgrade to pypy 6.0 and xenial | Upgrade to pypy 6.0 and xenial
| YAML | mit | twisted/epsilon | yaml | ## Code Before:
sudo: false
language: "python"
branches:
only:
- master
matrix:
include:
- python: 2.7.13
env: TOXENV=py27
- python: pypy2.7-5.8.0
env: TOXENV=pypy
install:
- pip install -U pip setuptools wheel
- pip install tox codecov
script:
- tox
after_success:
- codecov
notifications:
email: false
irc:
channels: "chat.freenode.net#divmod"
template:
- "%{repository}@%{branch} - %{author}: %{message} (%{build_url})"
use_notice: true
## Instruction:
Upgrade to pypy 6.0 and xenial
## Code After:
sudo: false
language: python
dist: xenial
branches:
only:
- master
matrix:
include:
- python: 2.7.13
env: TOXENV=py27
- python: pypy2.7-6.0
env: TOXENV=pypy
install:
- pip install -U pip setuptools wheel
- pip install tox codecov
script:
- tox
after_success:
- codecov
notifications:
email: false
irc:
channels: "chat.freenode.net#divmod"
template:
- "%{repository}@%{branch} - %{author}: %{message} (%{build_url})"
use_notice: true
|
3364c0173d9d5fc3164598076949f7db19901012 | client/app/components/login/login.component.js | client/app/components/login/login.component.js | import template from "./login.html";
import controller from "./login.controller";
import "./login.styl";
let loginComponent = {
restrict: "E",
bindings: {},
template,
controller,
controllerAs: "ctrl"
};
export default loginComponent;
| import template from "./login.html";
import controller from "./login.controller";
import "./login.styl";
let loginComponent = {
template,
controller,
controllerAs: "ctrl"
};
export default loginComponent;
| Use implicit restrict and bindings | Use implicit restrict and bindings
An Angular component is restricted to elements ("E") by default.
| JavaScript | mit | yunity/karrot-frontend,yunity/karrot-frontend,yunity/karrot-frontend,yunity/karrot-frontend,yunity/foodsaving-frontend,yunity/foodsaving-frontend,yunity/foodsaving-frontend,yunity/foodsaving-frontend | javascript | ## Code Before:
import template from "./login.html";
import controller from "./login.controller";
import "./login.styl";
let loginComponent = {
restrict: "E",
bindings: {},
template,
controller,
controllerAs: "ctrl"
};
export default loginComponent;
## Instruction:
Use implicit restrict and bindings
An Angular component is restricted to elements ("E") by default.
## Code After:
import template from "./login.html";
import controller from "./login.controller";
import "./login.styl";
let loginComponent = {
template,
controller,
controllerAs: "ctrl"
};
export default loginComponent;
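The commit above can drop `restrict: "E"` and `bindings: {}` because, as its message notes, those are the defaults for an Angular 1.5+ component definition. A plain-JavaScript sketch of that defaulting (an illustration only, not Angular's actual implementation):

```javascript
// Sketch only: approximates how a component registry could fill in the
// defaults the commit relies on (element restrict, empty bindings).
function withComponentDefaults(options) {
  return Object.assign({ restrict: "E", bindings: {} }, options);
}

const explicit = { restrict: "E", bindings: {}, controllerAs: "ctrl" };
const implicit = withComponentDefaults({ controllerAs: "ctrl" });

// Both definitions describe the same component once defaults are applied.
console.log(JSON.stringify(implicit) === JSON.stringify(explicit)); // → true
```

With the defaults filled in, the short definition is equivalent to the explicit one, so the removed lines carried no information.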
|
ddf57302aa50f380865d95d26baf4551b519ea98 | README.md | README.md | Warning
-------
You might find this code useful, but this doesn't actually work yet. If you make progress on it, go ahead and submit a PR!
freezerwatch
============
Returns health status from La Crosse Alerts sensors.
Useful for adding to your monitoring system--i.e., answers the
question "are my alarms actually live and working?"
The --live option will exit with a zero status if all of the devices you passed in are found and registered, we have readings within the last day, and all devices have adequate battery.
Config
------
Create a file called freezerwatch.json:
```json
{
"username": "[email protected]",
"password": "my_password"
}
```
Usage
-----
npm install freezerwatch
```sh
if freezerwatch --live --device="123" --device="456" --device="789"
then
echo "All monitoring is live and working!"
else
echo "Check your sensors!"
fi
```
Device IDs
----------
You can find device IDs by logging into lacrossealerts.com/login and
looking at the link that your 'Download' button points to.
Note
----
La Crosse Technology and La Crosse Alerts are registered trademarks of La Crosse Technology Ltd. I use their products but am not employed by or connected to them in any other way.
| Warning
-------
You might find this code useful, but this doesn't actually work yet. If you make progress on it, go ahead and submit a PR!
freezerwatch
============
Want to monitor a fridge/freezer and make sure the door isn't left open or it doesn't die or lose power? La Crosse Technology sells alarms that do that.
Want a quick command-line way to make sure those alarms are up and working? This utility can help.
The --live option will exit with a zero status if all of the devices you passed in are found and registered, we have readings within the last day, and all devices have adequate battery.
Config
------
Create a file called freezerwatch.json:
```json
{
"username": "[email protected]",
"password": "my_password"
}
```
Usage
-----
npm install freezerwatch
```sh
if freezerwatch --live --device="123" --device="456" --device="789"
then
echo "All monitoring is live and working!"
else
echo "Check your sensors!"
fi
```
Device IDs
----------
You can find device IDs by logging into lacrossealerts.com/login and
looking at the link that your 'Download' button points to.
Note
----
La Crosse Technology and La Crosse Alerts are registered trademarks of La Crosse Technology Ltd. I use their products but am not employed by or connected to them in any other way.
| Add a slightly better explanation of the purpose | Add a slightly better explanation of the purpose
| Markdown | bsd-2-clause | apiology/freezerwatch,apiology/freezerwatch | markdown | ## Code Before:
Warning
-------
You might find this code useful, but this doesn't actually work yet. If you make progress on it, go ahead and submit a PR!
freezerwatch
============
Returns health status from La Crosse Alerts sensors.
Useful for adding to your monitoring system--i.e., answers the
question "are my alarms actually live and working?"
The --live option will exit with a zero status if all of the devices you passed in are found and registered, we have readings within the last day, and all devices have adequate battery.
Config
------
Create a file called freezerwatch.json:
```json
{
"username": "[email protected]",
"password": "my_password"
}
```
Usage
-----
npm install freezerwatch
```sh
if freezerwatch --live --device="123" --device="456" --device="789"
then
echo "All monitoring is live and working!"
else
echo "Check your sensors!"
fi
```
Device IDs
----------
You can find device IDs by logging into lacrossealerts.com/login and
looking at the link that your 'Download' button points to.
Note
----
La Crosse Technology and La Crosse Alerts are registered trademarks of La Crosse Technology Ltd. I use their products but am not employed by or connected to them in any other way.
## Instruction:
Add a slightly better explanation of the purpose
## Code After:
Warning
-------
You might find this code useful, but this doesn't actually work yet. If you make progress on it, go ahead and submit a PR!
freezerwatch
============
Want to monitor a fridge/freezer and make sure the door isn't left open or it doesn't die or lose power? La Crosse Technology sells alarms that do that.
Want a quick command-line way to make sure those alarms are up and working? This utility can help.
The --live option will exit with a zero status if all of the devices you passed in are found and registered, we have readings within the last day, and all devices have adequate battery.
Config
------
Create a file called freezerwatch.json:
```json
{
"username": "[email protected]",
"password": "my_password"
}
```
Usage
-----
npm install freezerwatch
```sh
if freezerwatch --live --device="123" --device="456" --device="789"
then
echo "All monitoring is live and working!"
else
echo "Check your sensors!"
fi
```
Device IDs
----------
You can find device IDs by logging into lacrossealerts.com/login and
looking at the link that your 'Download' button points to.
Note
----
La Crosse Technology and La Crosse Alerts are registered trademarks of La Crosse Technology Ltd. I use their products but am not employed by or connected to them in any other way.
|
3086970f38b2ebc297764c24a333f2009d3f882f | proj/templates/site/about.html | proj/templates/site/about.html | {% extends "layout.html" %}
{% block page_id %}page-index{% endblock %}
{% block page_title %}关于Flask-Bootstrap{% endblock %}
{% block page_content %}
<div class="container">
<h1>关于proj</h1>
</div>
{% endblock %} | {% extends "layout.html" %}
{% block page_id %}page-index{% endblock %}
{% block page_title %}关于proj{% endblock %}
{% block page_content %}
<div class="container">
<h1>关于proj</h1>
</div>
{% endblock %} | Update title for index page | Update title for index page
| HTML | mit | 1045347128/Flask-Boost,hustlzp/Flask-Boost,hustlzp/Flask-Boost,hustlzp/Flask-Boost,hustlzp/Flask-Boost,1045347128/Flask-Boost,1045347128/Flask-Boost,1045347128/Flask-Boost | html | ## Code Before:
{% extends "layout.html" %}
{% block page_id %}page-index{% endblock %}
{% block page_title %}关于Flask-Bootstrap{% endblock %}
{% block page_content %}
<div class="container">
<h1>关于proj</h1>
</div>
{% endblock %}
## Instruction:
Update title for index page
## Code After:
{% extends "layout.html" %}
{% block page_id %}page-index{% endblock %}
{% block page_title %}关于proj{% endblock %}
{% block page_content %}
<div class="container">
<h1>关于proj</h1>
</div>
{% endblock %} |
9eb23f3dfe8167a25e99dd90c9ea7c229064ef24 | app/views/templates/post.blade.php | app/views/templates/post.blade.php | <div class="post">
@if (isset($post))
<h1>{{{ $post->title }}}</h1>
{{-- Escape html entities, format with paragraph tags --}}
{{-- Leading and trailing paragraph tags are required for leading and trailing paragraphs respectively --}}
<p>{{ str_replace(array("\n","\r\n"), "</p><p>", e($post->content)) }}</p>
<div class="well well-sm clearfix">
<span class="pull-left">{{{ $post->tags }}}</span>
<span class="pull-right">{{ $post->created_at }}</span>
</div>
@else
<h1>Sorry! That post doesn't exist!</h1>
<p>Unfortunately the post you were looking for isn't in the database. Maybe try looking in the <a href="{{ action('BlogController@getList') }}">archive</a> for the post you were seeking.</p>
@endif
</div> | <div class="post col-md-10 col-lg-8">
@if (isset($post))
<h1>{{{ $post->title }}}</h1>
{{-- Escape html entities, format with paragraph tags --}}
{{-- Leading and trailing paragraph tags are required for leading and trailing paragraphs respectively --}}
<p>{{ str_replace(array("\n","\r\n"), "</p><p>", e($post->content)) }}</p>
<div class="well well-sm clearfix">
<span class="pull-left">{{{ $post->tags }}}</span>
<span class="pull-right">{{ $post->created_at }}</span>
</div>
@else
<h1>Sorry! That post doesn't exist!</h1>
<p>Unfortunately the post you were looking for isn't in the database. Maybe try looking in the <a href="{{ action('BlogController@getList') }}">archive</a> for the post you were seeking.</p>
@endif
</div> | Add column width to posts for larger displays | Add column width to posts for larger displays | PHP | mit | ashfordl/blog,ashfordl/blog | php | ## Code Before:
<div class="post">
@if (isset($post))
<h1>{{{ $post->title }}}</h1>
{{-- Escape html entities, format with paragraph tags --}}
{{-- Leading and trailing paragraph tags are required for leading and trailing paragraphs respectively --}}
<p>{{ str_replace(array("\n","\r\n"), "</p><p>", e($post->content)) }}</p>
<div class="well well-sm clearfix">
<span class="pull-left">{{{ $post->tags }}}</span>
<span class="pull-right">{{ $post->created_at }}</span>
</div>
@else
<h1>Sorry! That post doesn't exist!</h1>
<p>Unfortunately the post you were looking for isn't in the database. Maybe try looking in the <a href="{{ action('BlogController@getList') }}">archive</a> for the post you were seeking.</p>
@endif
</div>
## Instruction:
Add column width to posts for larger displays
## Code After:
<div class="post col-md-10 col-lg-8">
@if (isset($post))
<h1>{{{ $post->title }}}</h1>
{{-- Escape html entities, format with paragraph tags --}}
{{-- Leading and trailing paragraph tags are required for leading and trailing paragraphs respectively --}}
<p>{{ str_replace(array("\n","\r\n"), "</p><p>", e($post->content)) }}</p>
<div class="well well-sm clearfix">
<span class="pull-left">{{{ $post->tags }}}</span>
<span class="pull-right">{{ $post->created_at }}</span>
</div>
@else
<h1>Sorry! That post doesn't exist!</h1>
<p>Unfortunately the post you were looking for isn't in the database. Maybe try looking in the <a href="{{ action('BlogController@getList') }}">archive</a> for the post you were seeking.</p>
@endif
</div> |
2e01c0dc0f5ad0fbec24b3d610e9815c6fcae7c2 | .travis.yml | .travis.yml | language: ruby
rvm: 2.5
sudo: required
cache: bundler
services: docker
dist: bionic
env:
matrix:
- TESTS="style unit" CHEF_VERSION="~> 14.0" CHEF_LICENSE=accept
- TESTS="style unit" CHEF_VERSION="~> 15.0" CHEF_LICENSE=accept
before_install:
- chef --version &> /dev/null || curl -L https://www.getchef.com/chef/install.sh | sudo bash -s -- -P chefdk -v 4.3.13
- eval "$(/opt/chefdk/bin/chef shell-init bash)"
install:
- chef exec bundle install --jobs=3 --retry=3 --without='doc integration_vagrant integration_cloud guard'
before_script:
# https://github.com/zuazo/kitchen-in-travis-native/issues/1#issuecomment-142455888
- sudo iptables -L DOCKER || ( echo "DOCKER iptables chain missing" ; sudo iptables -N DOCKER )
- chef --version
- cookstyle --version
- foodcritic --version
script: travis_retry chef exec bundle exec rake $TESTS
| language: ruby
rvm: 2.5
sudo: required
cache: bundler
services: docker
dist: bionic
env:
matrix:
- TESTS="style unit" CHEF_VERSION="~> 14.0" CHEF_LICENSE=accept
- TESTS="style unit" CHEF_VERSION="~> 15.0" CHEF_LICENSE=accept
- TESTS="style unit" CHEF_VERSION="~> 16.0" CHEF_LICENSE=accept
before_install:
- chef --version &> /dev/null || curl -L https://www.getchef.com/chef/install.sh | sudo bash -s -- -P chefdk -v 4.10.0
- eval "$(/opt/chefdk/bin/chef shell-init bash)"
install:
- chef exec bundle install --jobs=3 --retry=3 --without='doc integration_vagrant integration_cloud guard'
before_script:
# https://github.com/zuazo/kitchen-in-travis-native/issues/1#issuecomment-142455888
- sudo iptables -L DOCKER || ( echo "DOCKER iptables chain missing" ; sudo iptables -N DOCKER )
- chef --version
- cookstyle --version
- foodcritic --version
script: travis_retry chef exec bundle exec rake $TESTS
| Test Chef 16, chefdk version updated | Travis: Test Chef 16, chefdk version updated
| YAML | apache-2.0 | zuazo/ssh_authorized_keys-cookbook,zuazo/ssh_authorized_keys-cookbook,onddo/ssh_authorized_keys-cookbook,zuazo/ssh_authorized_keys-cookbook,onddo/ssh_authorized_keys-cookbook,onddo/ssh_authorized_keys-cookbook | yaml | ## Code Before:
language: ruby
rvm: 2.5
sudo: required
cache: bundler
services: docker
dist: bionic
env:
matrix:
- TESTS="style unit" CHEF_VERSION="~> 14.0" CHEF_LICENSE=accept
- TESTS="style unit" CHEF_VERSION="~> 15.0" CHEF_LICENSE=accept
before_install:
- chef --version &> /dev/null || curl -L https://www.getchef.com/chef/install.sh | sudo bash -s -- -P chefdk -v 4.3.13
- eval "$(/opt/chefdk/bin/chef shell-init bash)"
install:
- chef exec bundle install --jobs=3 --retry=3 --without='doc integration_vagrant integration_cloud guard'
before_script:
# https://github.com/zuazo/kitchen-in-travis-native/issues/1#issuecomment-142455888
- sudo iptables -L DOCKER || ( echo "DOCKER iptables chain missing" ; sudo iptables -N DOCKER )
- chef --version
- cookstyle --version
- foodcritic --version
script: travis_retry chef exec bundle exec rake $TESTS
## Instruction:
Travis: Test Chef 16, chefdk version updated
## Code After:
language: ruby
rvm: 2.5
sudo: required
cache: bundler
services: docker
dist: bionic
env:
matrix:
- TESTS="style unit" CHEF_VERSION="~> 14.0" CHEF_LICENSE=accept
- TESTS="style unit" CHEF_VERSION="~> 15.0" CHEF_LICENSE=accept
- TESTS="style unit" CHEF_VERSION="~> 16.0" CHEF_LICENSE=accept
before_install:
- chef --version &> /dev/null || curl -L https://www.getchef.com/chef/install.sh | sudo bash -s -- -P chefdk -v 4.10.0
- eval "$(/opt/chefdk/bin/chef shell-init bash)"
install:
- chef exec bundle install --jobs=3 --retry=3 --without='doc integration_vagrant integration_cloud guard'
before_script:
# https://github.com/zuazo/kitchen-in-travis-native/issues/1#issuecomment-142455888
- sudo iptables -L DOCKER || ( echo "DOCKER iptables chain missing" ; sudo iptables -N DOCKER )
- chef --version
- cookstyle --version
- foodcritic --version
script: travis_retry chef exec bundle exec rake $TESTS
|
7a472072a18eb464220e9341005488fa79c3b1b3 | test/support/partials.html | test/support/partials.html | <div>
<ul class="items-">
<li class="item-" src="./list_item.html"></li>
</ul>
<embed src="./embedded_partial.html" type="text/x-end-dash"></embed>
</div>
| <div>
<ul class="items-">
<li class="item-" src="./list_item.html"></li>
</ul>
<div src="./embedded_partial.html" data-replace></div>
</div>
| Update test for new partial syntax | Update test for new partial syntax
| HTML | mit | Amicus/end-dash | html | ## Code Before:
<div>
<ul class="items-">
<li class="item-" src="./list_item.html"></li>
</ul>
<embed src="./embedded_partial.html" type="text/x-end-dash"></embed>
</div>
## Instruction:
Update test for new partial syntax
## Code After:
<div>
<ul class="items-">
<li class="item-" src="./list_item.html"></li>
</ul>
<div src="./embedded_partial.html" data-replace></div>
</div>
|
3b2162277bf083ba20a042fa8ec73f3a2794a0f6 | packages/hc/hcg-minus-cairo.yaml | packages/hc/hcg-minus-cairo.yaml | homepage: http://rd.slavepianos.org/t/hcg-minus-cairo
changelog-type: ''
hash: 705f5cd6704fc04ef51d3b7abad936842e224c3327185166d7cc3f1311a8fb42
test-bench-deps: {}
maintainer: [email protected]
synopsis: haskell cg (minus) (cairo rendering)
changelog: ''
basic-deps:
base: ==4.*
hcg-minus: ==0.15.*
filepath: -any
cairo: -any
utf8-string: -any
colour: -any
all-versions:
- '0.12'
- '0.14'
- '0.15'
author: Rohan Drape
latest: '0.15'
description-type: text
description: ! 'hcg-minus-cairo
===============
[cairo][cairo] rendering for [hcg-minus][hcg-minus]
[cairo]: http://cairographics.org/
[hcg-minus]: http://rd.slavepianos.org/t/hcg-minus
© [rd][rd], 2009-2014, [gpl][gpl]
[rd]: http://rd.slavepianos.org/
[gpl]: http://gnu.org/copyleft/
'
license-name: BSD3
| homepage: http://rd.slavepianos.org/t/hcg-minus-cairo
changelog-type: ''
hash: 6c72b0fa328c4cda2dd04729c060761234607baa04693b0843e2819a20cdaa06
test-bench-deps: {}
maintainer: [email protected]
synopsis: haskell cg (minus) (cairo rendering)
changelog: ''
basic-deps:
base: ==4.* && <5
hcg-minus: ==0.16.*
filepath: -any
process: -any
cairo: -any
colour: -any
all-versions:
- '0.12'
- '0.14'
- '0.15'
- '0.16'
author: Rohan Drape
latest: '0.16'
description-type: text
description: ! 'hcg-minus-cairo
===============
[cairo][cairo] rendering for [hcg-minus][hcg-minus]
[cairo]: http://cairographics.org/
[hcg-minus]: http://rd.slavepianos.org/t/hcg-minus
© [rd][rd], 2009-2017, [gpl][gpl]
[rd]: http://rd.slavepianos.org/
[gpl]: http://gnu.org/copyleft/
'
license-name: BSD3
| Update from Hackage at 2017-11-22T02:31:23Z | Update from Hackage at 2017-11-22T02:31:23Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: http://rd.slavepianos.org/t/hcg-minus-cairo
changelog-type: ''
hash: 705f5cd6704fc04ef51d3b7abad936842e224c3327185166d7cc3f1311a8fb42
test-bench-deps: {}
maintainer: [email protected]
synopsis: haskell cg (minus) (cairo rendering)
changelog: ''
basic-deps:
base: ==4.*
hcg-minus: ==0.15.*
filepath: -any
cairo: -any
utf8-string: -any
colour: -any
all-versions:
- '0.12'
- '0.14'
- '0.15'
author: Rohan Drape
latest: '0.15'
description-type: text
description: ! 'hcg-minus-cairo
===============
[cairo][cairo] rendering for [hcg-minus][hcg-minus]
[cairo]: http://cairographics.org/
[hcg-minus]: http://rd.slavepianos.org/t/hcg-minus
© [rd][rd], 2009-2014, [gpl][gpl]
[rd]: http://rd.slavepianos.org/
[gpl]: http://gnu.org/copyleft/
'
license-name: BSD3
## Instruction:
Update from Hackage at 2017-11-22T02:31:23Z
## Code After:
homepage: http://rd.slavepianos.org/t/hcg-minus-cairo
changelog-type: ''
hash: 6c72b0fa328c4cda2dd04729c060761234607baa04693b0843e2819a20cdaa06
test-bench-deps: {}
maintainer: [email protected]
synopsis: haskell cg (minus) (cairo rendering)
changelog: ''
basic-deps:
base: ==4.* && <5
hcg-minus: ==0.16.*
filepath: -any
process: -any
cairo: -any
colour: -any
all-versions:
- '0.12'
- '0.14'
- '0.15'
- '0.16'
author: Rohan Drape
latest: '0.16'
description-type: text
description: ! 'hcg-minus-cairo
===============
[cairo][cairo] rendering for [hcg-minus][hcg-minus]
[cairo]: http://cairographics.org/
[hcg-minus]: http://rd.slavepianos.org/t/hcg-minus
© [rd][rd], 2009-2017, [gpl][gpl]
[rd]: http://rd.slavepianos.org/
[gpl]: http://gnu.org/copyleft/
'
license-name: BSD3
|
8da31f15f104633ce3df78b7730adb57434ab0ce | recipes/microed-tools/conda_build_config.yaml | recipes/microed-tools/conda_build_config.yaml | channel_targets:
- conda-forge microed-tools_dev
c_compiler: # [win]
- m2w64-toolchain # [win]
cxx_compiler: # [win]
- m2w64-toolchain # [win]
target_platform: # [win]
- win-64 # [win]
| channel_sources:
- conda-forge/label/microed-data_dev,conda-forge
channel_targets:
- conda-forge microed-tools_dev
c_compiler: # [win]
- m2w64-toolchain # [win]
cxx_compiler: # [win]
- m2w64-toolchain # [win]
target_platform: # [win]
- win-64 # [win]
| Correct dependency on microed-data prerelease | Correct dependency on microed-data prerelease
| YAML | bsd-3-clause | jakirkham/staged-recipes,goanpeca/staged-recipes,jakirkham/staged-recipes,ocefpaf/staged-recipes,johanneskoester/staged-recipes,conda-forge/staged-recipes,johanneskoester/staged-recipes,ocefpaf/staged-recipes,goanpeca/staged-recipes,conda-forge/staged-recipes | yaml | ## Code Before:
channel_targets:
- conda-forge microed-tools_dev
c_compiler: # [win]
- m2w64-toolchain # [win]
cxx_compiler: # [win]
- m2w64-toolchain # [win]
target_platform: # [win]
- win-64 # [win]
## Instruction:
Correct dependency on microed-data prerelease
## Code After:
channel_sources:
- conda-forge/label/microed-data_dev,conda-forge
channel_targets:
- conda-forge microed-tools_dev
c_compiler: # [win]
- m2w64-toolchain # [win]
cxx_compiler: # [win]
- m2w64-toolchain # [win]
target_platform: # [win]
- win-64 # [win]
|
9977bbffb998230b2ded806eb02f9e81daba9f8e | Arduino/ADXL335/ADXL335.ino | Arduino/ADXL335/ADXL335.ino | int vinpin = A0;
int voutpin = A1;
int gndpin = A2;
int zpin = A3;
int ypin = A4;
int xpin = A5;
void setup() {
Serial.begin(9600);
pinMode(vinpin, OUTPUT); digitalWrite(vinpin, HIGH);
pinMode(gndpin, OUTPUT); digitalWrite(gndpin, LOW);
pinMode(voutpin, INPUT);
pinMode(xpin, INPUT);
pinMode(ypin, INPUT);
pinMode(zpin, INPUT);
}
void loop() {
Serial.print(analogRead(xpin)); Serial.print("\t");
Serial.print(analogRead(ypin)); Serial.print("\t");
Serial.print(analogRead(zpin)); Serial.println();
}
| int zpin = A3;
int ypin = A4;
int xpin = A5;
// Uncomment the following lines if you're using an ADXL335 on an
// Adafruit breakout board (https://www.adafruit.com/products/163)
// and want to plug it directly into (and power it from) the analog
// input pins of your Arduino board.
//int vinpin = A0;
//int voutpin = A1;
//int gndpin = A2;
void setup() {
Serial.begin(9600);
pinMode(vinpin, OUTPUT); digitalWrite(vinpin, HIGH);
pinMode(gndpin, OUTPUT); digitalWrite(gndpin, LOW);
pinMode(voutpin, INPUT);
pinMode(xpin, INPUT);
pinMode(ypin, INPUT);
pinMode(zpin, INPUT);
}
void loop() {
Serial.print(analogRead(xpin)); Serial.print("\t");
Serial.print(analogRead(ypin)); Serial.print("\t");
Serial.print(analogRead(zpin)); Serial.println();
}
| Comment out power pins in ADXL example code. | Comment out power pins in ADXL example code.
With a note that you should uncomment them if you want to power the
ADXL335 (on Adafruit breakout) by plugging it directly into the Arduino.
| Arduino | bsd-3-clause | damellis/ESP,damellis/ESP | arduino | ## Code Before:
int vinpin = A0;
int voutpin = A1;
int gndpin = A2;
int zpin = A3;
int ypin = A4;
int xpin = A5;
void setup() {
Serial.begin(9600);
pinMode(vinpin, OUTPUT); digitalWrite(vinpin, HIGH);
pinMode(gndpin, OUTPUT); digitalWrite(gndpin, LOW);
pinMode(voutpin, INPUT);
pinMode(xpin, INPUT);
pinMode(ypin, INPUT);
pinMode(zpin, INPUT);
}
void loop() {
Serial.print(analogRead(xpin)); Serial.print("\t");
Serial.print(analogRead(ypin)); Serial.print("\t");
Serial.print(analogRead(zpin)); Serial.println();
}
## Instruction:
Comment out power pins in ADXL example code.
With a note that you should uncomment them if you want to power the
ADXL335 (on Adafruit breakout) by plugging it directly into the Arduino.
## Code After:
int zpin = A3;
int ypin = A4;
int xpin = A5;
// Uncomment the following lines if you're using an ADXL335 on an
// Adafruit breakout board (https://www.adafruit.com/products/163)
// and want to plug it directly into (and power it from) the analog
// input pins of your Arduino board.
//int vinpin = A0;
//int voutpin = A1;
//int gndpin = A2;
void setup() {
Serial.begin(9600);
pinMode(vinpin, OUTPUT); digitalWrite(vinpin, HIGH);
pinMode(gndpin, OUTPUT); digitalWrite(gndpin, LOW);
pinMode(voutpin, INPUT);
pinMode(xpin, INPUT);
pinMode(ypin, INPUT);
pinMode(zpin, INPUT);
}
void loop() {
Serial.print(analogRead(xpin)); Serial.print("\t");
Serial.print(analogRead(ypin)); Serial.print("\t");
Serial.print(analogRead(zpin)); Serial.println();
}
|
d3e537ed1b7de21849521be112d15908641af222 | tests/Unit/Http/Controllers/ProfileControllerTest.php | tests/Unit/Http/Controllers/ProfileControllerTest.php | <?php
namespace Tests\App\Http\Controllers;
use Tests\TestCase;
use Mockery as Mockery;
use Illuminate\Http\Request;
use Symfony\Component\HttpKernel\Exception\NotFoundHttpException;
class ProfileControllerTest extends TestCase
{
/**
* @covers App\Http\Controllers\ProfileController::__construct
* @covers App\Http\Controllers\ProfileController::show
* @test
*/
public function invalid_profile_should_404()
{
$this->expectException(NotFoundHttpException::class);
// Fake return
$return = [
'profiles' => [],
];
// Mock the connector
$wsuApi = Mockery::mock('Waynestate\Api\Connector');
$wsuApi->shouldReceive('nextRequestProduction')->once()->andReturn(true);
$wsuApi->shouldReceive('sendRequest')->with('profile.users.view', Mockery::type('array'))->once()->andReturn($return);
// Construct the profile repository
$profileRepository = app('App\Repositories\ProfileRepository', ['wsuApi' => $wsuApi]);
// Construct the news controller
$this->profileController = app('App\Http\Controllers\ProfileController', ['profile' => $profileRepository]);
// Call the profile listing
$view = $this->profileController->show(new Request());
}
}
| <?php
namespace Tests\App\Http\Controllers;
use Tests\TestCase;
use Mockery as Mockery;
use Illuminate\Http\Request;
use Symfony\Component\HttpKernel\Exception\NotFoundHttpException;
class ProfileControllerTest extends TestCase
{
/**
* @covers App\Http\Controllers\ProfileController::__construct
* @covers App\Http\Controllers\ProfileController::show
* @test
*/
public function no_profile_accessid_should_404()
{
$this->expectException(NotFoundHttpException::class);
// Construct the news controller
$this->profileController = app('App\Http\Controllers\ProfileController', []);
// Call the profile listing
$view = $this->profileController->show(new Request());
}
/**
* @covers App\Http\Controllers\ProfileController::__construct
* @covers App\Http\Controllers\ProfileController::show
* @test
*/
public function invalid_profile_should_404()
{
$this->expectException(NotFoundHttpException::class);
// Fake return
$return = [
'profiles' => [],
];
// Mock the connector
$wsuApi = Mockery::mock('Waynestate\Api\Connector');
$wsuApi->shouldReceive('nextRequestProduction')->once()->andReturn(true);
$wsuApi->shouldReceive('sendRequest')->with('profile.users.view', Mockery::type('array'))->once()->andReturn($return);
// Construct the profile repository
$profileRepository = app('App\Repositories\ProfileRepository', ['wsuApi' => $wsuApi]);
// Construct the news controller
$this->profileController = app('App\Http\Controllers\ProfileController', ['profile' => $profileRepository]);
$request = new Request();
$request->accessid = 'aa1234';
// Call the profile listing
$view = $this->profileController->show($request);
}
}
| Add new test to test that the Profile show will 404 if there is no access id passed in Update the invalid Access ID test to actually pass in an actual `accessid` instead of relying on no `accessid` is a invalid accessid | Add new test to test that the Profile show will 404 if there is no access id passed in
Update the invalid Access ID test to actually pass in an actual `accessid` instead of relying on no `accessid` is a invalid accessid
| PHP | mit | waynestate/base-site,waynestate/base-site | php | ## Code Before:
<?php
namespace Tests\App\Http\Controllers;
use Tests\TestCase;
use Mockery as Mockery;
use Illuminate\Http\Request;
use Symfony\Component\HttpKernel\Exception\NotFoundHttpException;
class ProfileControllerTest extends TestCase
{
/**
* @covers App\Http\Controllers\ProfileController::__construct
* @covers App\Http\Controllers\ProfileController::show
* @test
*/
public function invalid_profile_should_404()
{
$this->expectException(NotFoundHttpException::class);
// Fake return
$return = [
'profiles' => [],
];
// Mock the connector
$wsuApi = Mockery::mock('Waynestate\Api\Connector');
$wsuApi->shouldReceive('nextRequestProduction')->once()->andReturn(true);
$wsuApi->shouldReceive('sendRequest')->with('profile.users.view', Mockery::type('array'))->once()->andReturn($return);
// Construct the profile repository
$profileRepository = app('App\Repositories\ProfileRepository', ['wsuApi' => $wsuApi]);
// Construct the news controller
$this->profileController = app('App\Http\Controllers\ProfileController', ['profile' => $profileRepository]);
// Call the profile listing
$view = $this->profileController->show(new Request());
}
}
## Instruction:
Add new test to test that the Profile show will 404 if there is no access id passed in
Update the invalid Access ID test to actually pass in an actual `accessid` instead of relying on no `accessid` is a invalid accessid
## Code After:
<?php
namespace Tests\App\Http\Controllers;
use Tests\TestCase;
use Mockery as Mockery;
use Illuminate\Http\Request;
use Symfony\Component\HttpKernel\Exception\NotFoundHttpException;
class ProfileControllerTest extends TestCase
{
/**
* @covers App\Http\Controllers\ProfileController::__construct
* @covers App\Http\Controllers\ProfileController::show
* @test
*/
public function no_profile_accessid_should_404()
{
$this->expectException(NotFoundHttpException::class);
// Construct the news controller
$this->profileController = app('App\Http\Controllers\ProfileController', []);
// Call the profile listing
$view = $this->profileController->show(new Request());
}
/**
* @covers App\Http\Controllers\ProfileController::__construct
* @covers App\Http\Controllers\ProfileController::show
* @test
*/
public function invalid_profile_should_404()
{
$this->expectException(NotFoundHttpException::class);
// Fake return
$return = [
'profiles' => [],
];
// Mock the connector
$wsuApi = Mockery::mock('Waynestate\Api\Connector');
$wsuApi->shouldReceive('nextRequestProduction')->once()->andReturn(true);
$wsuApi->shouldReceive('sendRequest')->with('profile.users.view', Mockery::type('array'))->once()->andReturn($return);
// Construct the profile repository
$profileRepository = app('App\Repositories\ProfileRepository', ['wsuApi' => $wsuApi]);
// Construct the news controller
$this->profileController = app('App\Http\Controllers\ProfileController', ['profile' => $profileRepository]);
$request = new Request();
$request->accessid = 'aa1234';
// Call the profile listing
$view = $this->profileController->show($request);
}
}
|
7b066fe33488ddc389c4545e93647a8e59e69a2b | src/Mod/Import/CMakeLists.txt | src/Mod/Import/CMakeLists.txt |
add_subdirectory(App)
if(BUILD_GUI)
add_subdirectory(Gui)
endif(BUILD_GUI)
IF (BUILD_GUI)
PYSIDE_WRAP_RC(Import_QRC_SRCS Resources/Import.qrc)
ENDIF (BUILD_GUI)
ADD_CUSTOM_TARGET(ImportRC ALL
SOURCES ${Import_QRC_SRCS}
)
IF (BUILD_GUI)
fc_target_copy_resource(ImportRC
${CMAKE_CURRENT_BINARY_DIR}
${CMAKE_BINARY_DIR}/Mod/Import
Import_rc.py)
ENDIF (BUILD_GUI)
INSTALL(
FILES
Init.py
InitGui.py
DESTINATION
Mod/Import
)
|
add_subdirectory(App)
if(BUILD_GUI)
add_subdirectory(Gui)
endif(BUILD_GUI)
IF (BUILD_GUI)
PYSIDE_WRAP_RC(Import_QRC_SRCS Resources/Import.qrc)
ENDIF (BUILD_GUI)
ADD_CUSTOM_TARGET(ImportRC ALL
SOURCES ${Import_QRC_SRCS}
)
IF (BUILD_GUI)
fc_target_copy_resource(ImportRC
${CMAKE_CURRENT_BINARY_DIR}
${CMAKE_BINARY_DIR}/Mod/Import
Import_rc.py)
ENDIF (BUILD_GUI)
INSTALL(
FILES
Init.py
InitGui.py
${Import_QRC_SRCS}
DESTINATION
Mod/Import
)
| Fix missing file during installation (Import_rc.py) | Fix missing file during installation (Import_rc.py)
| Text | lgpl-2.1 | Fat-Zer/FreeCAD_sf_master,Fat-Zer/FreeCAD_sf_master,Fat-Zer/FreeCAD_sf_master,Fat-Zer/FreeCAD_sf_master,Fat-Zer/FreeCAD_sf_master | text | ## Code Before:
add_subdirectory(App)
if(BUILD_GUI)
add_subdirectory(Gui)
endif(BUILD_GUI)
IF (BUILD_GUI)
PYSIDE_WRAP_RC(Import_QRC_SRCS Resources/Import.qrc)
ENDIF (BUILD_GUI)
ADD_CUSTOM_TARGET(ImportRC ALL
SOURCES ${Import_QRC_SRCS}
)
IF (BUILD_GUI)
fc_target_copy_resource(ImportRC
${CMAKE_CURRENT_BINARY_DIR}
${CMAKE_BINARY_DIR}/Mod/Import
Import_rc.py)
ENDIF (BUILD_GUI)
INSTALL(
FILES
Init.py
InitGui.py
DESTINATION
Mod/Import
)
## Instruction:
Fix missing file during installation (Import_rc.py)
## Code After:
add_subdirectory(App)
if(BUILD_GUI)
add_subdirectory(Gui)
endif(BUILD_GUI)
IF (BUILD_GUI)
PYSIDE_WRAP_RC(Import_QRC_SRCS Resources/Import.qrc)
ENDIF (BUILD_GUI)
ADD_CUSTOM_TARGET(ImportRC ALL
SOURCES ${Import_QRC_SRCS}
)
IF (BUILD_GUI)
fc_target_copy_resource(ImportRC
${CMAKE_CURRENT_BINARY_DIR}
${CMAKE_BINARY_DIR}/Mod/Import
Import_rc.py)
ENDIF (BUILD_GUI)
INSTALL(
FILES
Init.py
InitGui.py
${Import_QRC_SRCS}
DESTINATION
Mod/Import
)
|
8ecac6b8fef8df09b5f8c80a2a88da2149923a56 | app/assets/javascripts/pageflow/editor/initializers/setup_hotkeys.js | app/assets/javascripts/pageflow/editor/initializers/setup_hotkeys.js | pageflow.app.addInitializer(function(options) {
var KEY_A = 65;
$(document).on('keydown', function(event) {
if (pageflow.features.isEnabled('atmo') && event.altKey && event.which === KEY_A) {
if (pageflow.atmo.disabled) {
pageflow.atmo.enable();
}
else {
pageflow.atmo.disable();
}
}
});
}); | pageflow.app.addInitializer(function(options) {
var KEY_A = 65;
var KEY_X = 88;
$(document).on('keydown', function(event) {
if (event.altKey && event.which === KEY_A) {
if (pageflow.features.isEnabled('atmo')) {
if (pageflow.atmo.disabled) {
pageflow.atmo.enable();
}
else {
pageflow.atmo.disable();
}
}
}
else if (event.altKey && event.which === KEY_X) {
pageflow.editor.navigate('pages/' + pageflow.slides.currentPage().data('id'), {trigger: true});
}
});
}); | Add a Hotkey to Edit the Current Page | Add a Hotkey to Edit the Current Page
| JavaScript | mit | luatdolphin/pageflow,YoussefChafai/pageflow,grgr/pageflow,tf/pageflow,YoussefChafai/pageflow,codevise/pageflow,Modularfield/pageflow,BenHeubl/pageflow,BenHeubl/pageflow,tilsammans/pageflow,grgr/pageflow,Modularfield/pageflow,schoetty/pageflow,tilsammans/pageflow,tf/pageflow-dependabot-test,tf/pageflow-dependabot-test,Modularfield/pageflow,codevise/pageflow,YoussefChafai/pageflow,BenHeubl/pageflow,codevise/pageflow,tf/pageflow-dependabot-test,schoetty/pageflow,tf/pageflow,tf/pageflow,luatdolphin/pageflow,tf/pageflow-dependabot-test,tilsammans/pageflow,tf/pageflow,schoetty/pageflow,Modularfield/pageflow,grgr/pageflow,codevise/pageflow,tilsammans/pageflow,luatdolphin/pageflow | javascript | ## Code Before:
pageflow.app.addInitializer(function(options) {
var KEY_A = 65;
$(document).on('keydown', function(event) {
if (pageflow.features.isEnabled('atmo') && event.altKey && event.which === KEY_A) {
if (pageflow.atmo.disabled) {
pageflow.atmo.enable();
}
else {
pageflow.atmo.disable();
}
}
});
});
## Instruction:
Add a Hotkey to Edit the Current Page
## Code After:
pageflow.app.addInitializer(function(options) {
var KEY_A = 65;
var KEY_X = 88;
$(document).on('keydown', function(event) {
if (event.altKey && event.which === KEY_A) {
if (pageflow.features.isEnabled('atmo')) {
if (pageflow.atmo.disabled) {
pageflow.atmo.enable();
}
else {
pageflow.atmo.disable();
}
}
}
else if (event.altKey && event.which === KEY_X) {
pageflow.editor.navigate('pages/' + pageflow.slides.currentPage().data('id'), {trigger: true});
}
});
}); |
a04990863403818ba41e1d3a3f66c3db868fefbb | .travis.yml | .travis.yml | language: go
go:
- 1.3
- release
- tip
install: go get -d -t -v ./... | language: go
go:
- 1.3.3
- 1.4.1
install: go get -d -t -v ./...
| Fix Go versions for Travis. | Fix Go versions for Travis.
| YAML | isc | Stash-Crypto/btcwallet,btcsuite/btcwallet,synrg-labs/btcwallet,Stash-Crypto/btcwallet,btcsuite/btcwallet,jrick/dcrwallet,jrick/btcwallet,decred/dcrwallet,jrick/btcwallet,Roasbeef/btcwallet,synrg-labs/btcwallet,jrick/dcrwallet,decred/dcrwallet,Roasbeef/btcwallet | yaml | ## Code Before:
language: go
go:
- 1.3
- release
- tip
install: go get -d -t -v ./...
## Instruction:
Fix Go versions for Travis.
## Code After:
language: go
go:
- 1.3.3
- 1.4.1
install: go get -d -t -v ./...
|
e4ebbf83f77f6ce2b980d0976a9d3e7e029f90a1 | TraversalTests/ReducibleOfTests.swift | TraversalTests/ReducibleOfTests.swift | // Copyright (c) 2014 Rob Rix. All rights reserved.
import Traversal
import XCTest
class ReducibleOfTests: XCTestCase {
func testConstructionFromSequenceType() {
let sequence = [1, 2, 3, 4, 5, 6, 7, 8, 9]
XCTAssertEqual(Traversal.reduce(ReducibleOf(sequence: sequence), 0, +), 45)
}
}
| // Copyright (c) 2014 Rob Rix. All rights reserved.
import Traversal
import XCTest
class ReducibleOfTests: XCTestCase {
func testConstructionFromSequenceType() {
let sequence = [1, 2, 3, 4, 5, 6, 7, 8, 9]
XCTAssertEqual(Traversal.reduce(ReducibleOf(sequence: sequence), 0, +), 45)
}
func testConstructionFromReducibleType() {
let sequence = [1, 2, 3, 4, 5, 6, 7, 8, 9]
let reducible = ReducibleOf(sequence: sequence)
let outer = ReducibleOf(reducible)
XCTAssertEqual(Traversal.reduce(outer, 0, +), 45)
}
}
| Test ReducibleOf construction from ReducibleType. | Test ReducibleOf construction from ReducibleType.
| Swift | mit | robrix/Traversal,robrix/Traversal | swift | ## Code Before:
// Copyright (c) 2014 Rob Rix. All rights reserved.
import Traversal
import XCTest
class ReducibleOfTests: XCTestCase {
func testConstructionFromSequenceType() {
let sequence = [1, 2, 3, 4, 5, 6, 7, 8, 9]
XCTAssertEqual(Traversal.reduce(ReducibleOf(sequence: sequence), 0, +), 45)
}
}
## Instruction:
Test ReducibleOf construction from ReducibleType.
## Code After:
// Copyright (c) 2014 Rob Rix. All rights reserved.
import Traversal
import XCTest
class ReducibleOfTests: XCTestCase {
func testConstructionFromSequenceType() {
let sequence = [1, 2, 3, 4, 5, 6, 7, 8, 9]
XCTAssertEqual(Traversal.reduce(ReducibleOf(sequence: sequence), 0, +), 45)
}
func testConstructionFromReducibleType() {
let sequence = [1, 2, 3, 4, 5, 6, 7, 8, 9]
let reducible = ReducibleOf(sequence: sequence)
let outer = ReducibleOf(reducible)
XCTAssertEqual(Traversal.reduce(outer, 0, +), 45)
}
}
|
318e87f828a31e19cc4e6880cbbbc08ed16cd6fb | src/resources/views/article/blog.blade.php | src/resources/views/article/blog.blade.php | @if($article->param('show_title'))
<h3 class="article-title">
<a href="{{ url($article->alias) }}">{{ $article->present()->title }}</a>
</h3>
@endif
<div class="article-details">
@if($article->param('show_author'))
<small class="inline-help">Written by: {{ $article->present()->author }}</small>
<br>
@endif
@if($article->param('show_create_date'))
<small class="inline-help">
<i class="fa fa-calendar"></i> Published: {{ $article->present()->datePublished }}
</small>
<br>
@endif
@if($article->param('show_modify_date'))
<small class="inline-help">
<i class="fa fa-calendar"></i> Modified: {{ $article->present()->dateModified }}
</small>
<br>
@endif
@if($article->param('show_hits'))
<small class="inline-help">
<i class="fa fa-eye"></i> Hits: {{ $article->present()->hits }}
</small>
@endif
</div>
<div class="article-content">
{!! $article->present()->content !!}
</div>
| @if($article->param('show_title'))
<h3 class="article-title">
<a href="{{ url($article->alias) }}">{{ $article->present()->title }}</a>
</h3>
@endif
<div class="article-details">
@if($article->param('show_author'))
<p class="article-author">Written by: {{ $article->present()->author }}</p>
@endif
@if($article->param('show_create_date'))
<p class="article-created">
<i class="fa fa-calendar"></i> Published: {{ $article->present()->datePublished }}
</p>
@endif
@if($article->param('show_modify_date'))
<p class="article-modified">
<i class="fa fa-calendar"></i> Modified: {{ $article->present()->dateModified }}
</p>
@endif
@if($article->param('show_hits'))
<p class="article-hits">
<i class="fa fa-eye"></i> Hits: {{ $article->present()->hits }}
</p>
@endif
</div>
<div class="article-content">
{!! $article->present()->content !!}
</div>
| Add own class for each article attribute. | Add own class for each article attribute.
| PHP | mit | yajra/cms-core,yajra/cms-core | php | ## Code Before:
@if($article->param('show_title'))
<h3 class="article-title">
<a href="{{ url($article->alias) }}">{{ $article->present()->title }}</a>
</h3>
@endif
<div class="article-details">
@if($article->param('show_author'))
<small class="inline-help">Written by: {{ $article->present()->author }}</small>
<br>
@endif
@if($article->param('show_create_date'))
<small class="inline-help">
<i class="fa fa-calendar"></i> Published: {{ $article->present()->datePublished }}
</small>
<br>
@endif
@if($article->param('show_modify_date'))
<small class="inline-help">
<i class="fa fa-calendar"></i> Modified: {{ $article->present()->dateModified }}
</small>
<br>
@endif
@if($article->param('show_hits'))
<small class="inline-help">
<i class="fa fa-eye"></i> Hits: {{ $article->present()->hits }}
</small>
@endif
</div>
<div class="article-content">
{!! $article->present()->content !!}
</div>
## Instruction:
Add own class for each article attribute.
## Code After:
@if($article->param('show_title'))
<h3 class="article-title">
<a href="{{ url($article->alias) }}">{{ $article->present()->title }}</a>
</h3>
@endif
<div class="article-details">
@if($article->param('show_author'))
<p class="article-author">Written by: {{ $article->present()->author }}</p>
@endif
@if($article->param('show_create_date'))
<p class="article-created">
<i class="fa fa-calendar"></i> Published: {{ $article->present()->datePublished }}
</p>
@endif
@if($article->param('show_modify_date'))
<p class="article-modified">
<i class="fa fa-calendar"></i> Modified: {{ $article->present()->dateModified }}
</p>
@endif
@if($article->param('show_hits'))
<p class="article-hits">
<i class="fa fa-eye"></i> Hits: {{ $article->present()->hits }}
</p>
@endif
</div>
<div class="article-content">
{!! $article->present()->content !!}
</div>
|
250a86dc4041f75488d78653d9566b348e1b70c9 | src/searchclient/dlgfilters.cpp | src/searchclient/dlgfilters.cpp |
DlgFilters::DlgFilters(const QList<QPair<bool,QString> >& filters,
QWidget *parent)
: QDialog(parent, Qt::Dialog) {
setWindowTitle(tr("strigiclient - Edit filters"));
filterwidget = new FilterWidget();
filterwidget->setFilters(filters);
QPushButton* ok = new QPushButton(tr("&Ok"));
ok->setDefault(true);
QPushButton* cancel = new QPushButton(tr("&Cancel"));
QVBoxLayout* layout = new QVBoxLayout();
setLayout(layout);
layout->addWidget(filterwidget);
QHBoxLayout* hl = new QHBoxLayout();
layout->addLayout(hl);
hl->addStretch();
hl->addWidget(ok);
hl->addWidget(cancel);
connect(ok, SIGNAL(clicked()), this, SLOT(accept()));
connect(cancel, SIGNAL(clicked()), this, SLOT(reject()));
}
const QList<QPair<bool,QString> >&
DlgFilters::getFilters() const {
return filterwidget->getFilters();
}
|
DlgFilters::DlgFilters(const QList<QPair<bool,QString> >& filters,
QWidget *parent)
: QDialog(parent, Qt::Dialog) {
setWindowTitle(tr("strigiclient - Edit filters"));
QLabel* explanation = new QLabel(tr("Define filters that determine which files will be included and which will be excluded from the index, e.g. '*.html' or '.svn/' (do not index directories called '.svn')."));
explanation->setWordWrap(true);
filterwidget = new FilterWidget();
filterwidget->setFilters(filters);
QPushButton* ok = new QPushButton(tr("&Ok"));
ok->setDefault(true);
QPushButton* cancel = new QPushButton(tr("&Cancel"));
QVBoxLayout* layout = new QVBoxLayout();
setLayout(layout);
layout->addWidget(explanation);
layout->addWidget(filterwidget);
QHBoxLayout* hl = new QHBoxLayout();
layout->addLayout(hl);
hl->addStretch();
hl->addWidget(ok);
hl->addWidget(cancel);
connect(ok, SIGNAL(clicked()), this, SLOT(accept()));
connect(cancel, SIGNAL(clicked()), this, SLOT(reject()));
}
const QList<QPair<bool,QString> >&
DlgFilters::getFilters() const {
return filterwidget->getFilters();
}
| Add small explanation to the filter dialog. | Add small explanation to the filter dialog.
svn path=/trunk/playground/base/strigi/; revision=609264
| C++ | lgpl-2.1 | KDE/strigi | c++ | ## Code Before:
DlgFilters::DlgFilters(const QList<QPair<bool,QString> >& filters,
QWidget *parent)
: QDialog(parent, Qt::Dialog) {
setWindowTitle(tr("strigiclient - Edit filters"));
filterwidget = new FilterWidget();
filterwidget->setFilters(filters);
QPushButton* ok = new QPushButton(tr("&Ok"));
ok->setDefault(true);
QPushButton* cancel = new QPushButton(tr("&Cancel"));
QVBoxLayout* layout = new QVBoxLayout();
setLayout(layout);
layout->addWidget(filterwidget);
QHBoxLayout* hl = new QHBoxLayout();
layout->addLayout(hl);
hl->addStretch();
hl->addWidget(ok);
hl->addWidget(cancel);
connect(ok, SIGNAL(clicked()), this, SLOT(accept()));
connect(cancel, SIGNAL(clicked()), this, SLOT(reject()));
}
const QList<QPair<bool,QString> >&
DlgFilters::getFilters() const {
return filterwidget->getFilters();
}
## Instruction:
Add small explanation to the filter dialog.
svn path=/trunk/playground/base/strigi/; revision=609264
## Code After:
DlgFilters::DlgFilters(const QList<QPair<bool,QString> >& filters,
QWidget *parent)
: QDialog(parent, Qt::Dialog) {
setWindowTitle(tr("strigiclient - Edit filters"));
QLabel* explanation = new QLabel(tr("Define filters that determine which files will be included and which will be excluded from the index, e.g. '*.html' or '.svn/' (do not index directories called '.svn')."));
explanation->setWordWrap(true);
filterwidget = new FilterWidget();
filterwidget->setFilters(filters);
QPushButton* ok = new QPushButton(tr("&Ok"));
ok->setDefault(true);
QPushButton* cancel = new QPushButton(tr("&Cancel"));
QVBoxLayout* layout = new QVBoxLayout();
setLayout(layout);
layout->addWidget(explanation);
layout->addWidget(filterwidget);
QHBoxLayout* hl = new QHBoxLayout();
layout->addLayout(hl);
hl->addStretch();
hl->addWidget(ok);
hl->addWidget(cancel);
connect(ok, SIGNAL(clicked()), this, SLOT(accept()));
connect(cancel, SIGNAL(clicked()), this, SLOT(reject()));
}
const QList<QPair<bool,QString> >&
DlgFilters::getFilters() const {
return filterwidget->getFilters();
}
|
d72bf615df736b1cddd6bb3901e63488acbb5ea4 | test/test_sync.ml | test/test_sync.ml | open Webtest.Suite
open Webtest.Utils
let test_bracket_succeed () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
assert_equal !state `test_end;
state := `torn_down
in
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
state := `test_end)
teardown ();
assert_equal !state `torn_down
let test_bracket_fail () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
assert_equal 5 6)
teardown ();
with TestFailure "not equal" ->
assert_equal !state `torn_down
let test_bracket_error () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
failwith "error")
teardown ();
with Failure "error" ->
assert_equal !state `torn_down
let suite =
"sync" >::: [
"test_bracket_succeed" >:: test_bracket_succeed;
"test_bracket_fail" >:: test_bracket_fail;
"test_bracket_error" >:: test_bracket_error;
]
| open Webtest.Suite
let test_bracket_succeed () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
assert_equal !state `test_end;
state := `torn_down
in
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
state := `test_end)
teardown ();
assert_equal !state `torn_down
let test_bracket_fail () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
assert_equal 5 6)
teardown ();
with TestFailure "not equal" ->
assert_equal !state `torn_down
let test_bracket_error () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
failwith "error")
teardown ();
with Failure "error" ->
assert_equal !state `torn_down
let suite =
"sync" >::: [
"test_bracket_succeed" >:: test_bracket_succeed;
"test_bracket_fail" >:: test_bracket_fail;
"test_bracket_error" >:: test_bracket_error;
]
| Remove an unused module open | Remove an unused module open
Signed-off-by: John Else <[email protected]>
| OCaml | mit | johnelse/ocaml-webtest,johnelse/ocaml-webtest,johnelse/ocaml-webtest | ocaml | ## Code Before:
open Webtest.Suite
open Webtest.Utils
let test_bracket_succeed () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
assert_equal !state `test_end;
state := `torn_down
in
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
state := `test_end)
teardown ();
assert_equal !state `torn_down
let test_bracket_fail () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
assert_equal 5 6)
teardown ();
with TestFailure "not equal" ->
assert_equal !state `torn_down
let test_bracket_error () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
failwith "error")
teardown ();
with Failure "error" ->
assert_equal !state `torn_down
let suite =
"sync" >::: [
"test_bracket_succeed" >:: test_bracket_succeed;
"test_bracket_fail" >:: test_bracket_fail;
"test_bracket_error" >:: test_bracket_error;
]
## Instruction:
Remove an unused module open
Signed-off-by: John Else <[email protected]>
## Code After:
open Webtest.Suite
let test_bracket_succeed () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
assert_equal !state `test_end;
state := `torn_down
in
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
state := `test_end)
teardown ();
assert_equal !state `torn_down
let test_bracket_fail () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
assert_equal 5 6)
teardown ();
with TestFailure "not equal" ->
assert_equal !state `torn_down
let test_bracket_error () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
failwith "error")
teardown ();
with Failure "error" ->
assert_equal !state `torn_down
let suite =
"sync" >::: [
"test_bracket_succeed" >:: test_bracket_succeed;
"test_bracket_fail" >:: test_bracket_fail;
"test_bracket_error" >:: test_bracket_error;
]
|
ba84740e7ba0edd709c9cd076a7dce83a6c91a30 | research/mlt_quality_research.py | research/mlt_quality_research.py |
import pprint
from elasticsearch import Elasticsearch
'''Inside ipython:
[(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'])for r in es.mlt(index='bhp10', doc_type='places', id=71433, mlt_fields=related_fields, search_types=['places','personalities','photoUnits','familyNames'], search_size=40)['hits']['hits']]
'''
def get_related(es, doc_id, index, doc_type, mlt_fields, target_collections, limit):
return [(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'])for r in es.mlt(index=index, doc_type=doc_type, id=doc_id, mlt_fields=mlt_fields, search_types=target_collections, search_size=limit)['hits']['hits']]
if __name__ == '__main__':
es = Elasticsearch()
mlt_fields = ['Header.En', 'UnitText1.En', 'Header.He', 'UnitText1.He']
target_collections = ['places','personalities','photoUnits','familyNames']
# For Paris:
pprint.pprint (get_related(es, 72312, 'bhp10', 'places', mlt_fields, target_collections, 40))
|
import pprint
from elasticsearch import Elasticsearch
'''Inside ipython:
[(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'])for r in es.mlt(index='bhp10', doc_type='places', id=71433, mlt_fields=related_fields, search_types=['places','personalities','photoUnits','familyNames'], search_size=40)['hits']['hits']]
'''
'''Try with search:
es.search(doc_type='', size=1, q='UnitText1.En:Einstein')
Or even better:
[(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'], r['_id'])for r in es.search(doc_type=['places'], size=40, q='UnitText1.En:Albert Einstein')['hits']['hits']]
'''
def get_related(es, doc_id, index, doc_type, mlt_fields, target_collections, limit):
return [(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'])for r in es.mlt(index=index, doc_type=doc_type, id=doc_id, mlt_fields=mlt_fields, search_types=target_collections, search_size=limit)['hits']['hits']]
if __name__ == '__main__':
es = Elasticsearch()
mlt_fields = ['Header.En', 'UnitText1.En', 'Header.He', 'UnitText1.He']
target_collections = ['places','personalities','photoUnits','familyNames']
# For Paris:
pprint.pprint (get_related(es, 72312, 'bhp10', 'places', mlt_fields, target_collections, 40))
| Add ES search example, similar to Mongo related FTS | Add ES search example, similar to Mongo related FTS
| Python | agpl-3.0 | Beit-Hatfutsot/dbs-back,Beit-Hatfutsot/dbs-back,Beit-Hatfutsot/dbs-back,Beit-Hatfutsot/dbs-back | python | ## Code Before:
import pprint
from elasticsearch import Elasticsearch
'''Inside ipython:
[(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'])for r in es.mlt(index='bhp10', doc_type='places', id=71433, mlt_fields=related_fields, search_types=['places','personalities','photoUnits','familyNames'], search_size=40)['hits']['hits']]
'''
def get_related(es, doc_id, index, doc_type, mlt_fields, target_collections, limit):
return [(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'])for r in es.mlt(index=index, doc_type=doc_type, id=doc_id, mlt_fields=mlt_fields, search_types=target_collections, search_size=limit)['hits']['hits']]
if __name__ == '__main__':
es = Elasticsearch()
mlt_fields = ['Header.En', 'UnitText1.En', 'Header.He', 'UnitText1.He']
target_collections = ['places','personalities','photoUnits','familyNames']
# For Paris:
pprint.pprint (get_related(es, 72312, 'bhp10', 'places', mlt_fields, target_collections, 40))
## Instruction:
Add ES search example, similar to Mongo related FTS
## Code After:
import pprint
from elasticsearch import Elasticsearch
'''Inside ipython:
[(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'])for r in es.mlt(index='bhp10', doc_type='places', id=71433, mlt_fields=related_fields, search_types=['places','personalities','photoUnits','familyNames'], search_size=40)['hits']['hits']]
'''
'''Try with search:
es.search(doc_type='', size=1, q='UnitText1.En:Einstein')
Or even better:
[(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'], r['_id'])for r in es.search(doc_type=['places'], size=40, q='UnitText1.En:Albert Einstein')['hits']['hits']]
'''
def get_related(es, doc_id, index, doc_type, mlt_fields, target_collections, limit):
return [(r['_score'], r['_source']['Header']['En'], r['_source']['UnitTypeDesc'])for r in es.mlt(index=index, doc_type=doc_type, id=doc_id, mlt_fields=mlt_fields, search_types=target_collections, search_size=limit)['hits']['hits']]
if __name__ == '__main__':
es = Elasticsearch()
mlt_fields = ['Header.En', 'UnitText1.En', 'Header.He', 'UnitText1.He']
target_collections = ['places','personalities','photoUnits','familyNames']
# For Paris:
pprint.pprint (get_related(es, 72312, 'bhp10', 'places', mlt_fields, target_collections, 40))
|
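The record above flattens Elasticsearch `mlt`/`search` responses into `(score, English header, unit type)` tuples. A small, self-contained Python sketch of just that extraction step; the response dict here is hand-made to stand in for a live `elasticsearch` client, so only the field names (`_score`, `Header.En`, `UnitTypeDesc`) are carried over from the record:

```python
def extract_hits(response):
    """Flatten an ES-style response into (score, English header, unit type) tuples."""
    return [
        (hit['_score'],
         hit['_source']['Header']['En'],
         hit['_source']['UnitTypeDesc'])
        for hit in response['hits']['hits']
    ]

# A stand-in for what es.mlt() or es.search() would return.
sample_response = {
    'hits': {
        'hits': [
            {'_score': 2.5,
             '_source': {'Header': {'En': 'Paris'}, 'UnitTypeDesc': 'Place'}},
            {'_score': 1.1,
             '_source': {'Header': {'En': 'Albert Einstein'},
                         'UnitTypeDesc': 'Personality'}},
        ]
    }
}

print(extract_hits(sample_response))
```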
15d4994308fa34e10f5cac3f66bfcc1a8de00edf | home-server-backend/src/main/java/nl/homeserver/energie/VerbruikenEnKosten.java | home-server-backend/src/main/java/nl/homeserver/energie/VerbruikenEnKosten.java | package nl.homeserver.energie;
import java.math.BigDecimal;
import java.util.Collection;
import java.util.Objects;
public class VerbruikenEnKosten {
private final Collection<VerbruikKosten> all;
public VerbruikenEnKosten(final Collection<VerbruikKosten> all) {
this.all = all;
}
private BigDecimal getTotaalVerbruik() {
return all.stream()
.map(VerbruikKosten::getVerbruik)
.filter(Objects::nonNull)
.reduce(BigDecimal::add)
.orElse(null);
}
private BigDecimal getTotaalKosten() {
return all.stream()
.map(VerbruikKosten::getKosten)
.filter(Objects::nonNull)
.reduce(BigDecimal::add)
.orElse(null);
}
public VerbruikKosten sumToSingle() {
return new VerbruikKosten(getTotaalVerbruik(), getTotaalKosten());
}
}
| package nl.homeserver.energie;
import java.math.BigDecimal;
import java.util.Collection;
import java.util.Objects;
import javax.annotation.Nullable;
class VerbruikenEnKosten {
private final Collection<VerbruikKosten> all;
VerbruikenEnKosten(final Collection<VerbruikKosten> all) {
this.all = all;
}
@Nullable
private BigDecimal getTotaalVerbruik() {
return all.stream()
.map(VerbruikKosten::getVerbruik)
.filter(Objects::nonNull)
.reduce(BigDecimal::add)
.orElse(null);
}
@Nullable
private BigDecimal getTotaalKosten() {
return all.stream()
.map(VerbruikKosten::getKosten)
.filter(Objects::nonNull)
.reduce(BigDecimal::add)
.orElse(null);
}
VerbruikKosten sumToSingle() {
return new VerbruikKosten(getTotaalVerbruik(), getTotaalKosten());
}
}
| Fix issues found by SonarCloud | Fix issues found by SonarCloud
| Java | apache-2.0 | bassages/home-server,bassages/homecontrol,bassages/homecontrol,bassages/home-server | java | ## Code Before:
package nl.homeserver.energie;
import java.math.BigDecimal;
import java.util.Collection;
import java.util.Objects;
public class VerbruikenEnKosten {
private final Collection<VerbruikKosten> all;
public VerbruikenEnKosten(final Collection<VerbruikKosten> all) {
this.all = all;
}
private BigDecimal getTotaalVerbruik() {
return all.stream()
.map(VerbruikKosten::getVerbruik)
.filter(Objects::nonNull)
.reduce(BigDecimal::add)
.orElse(null);
}
private BigDecimal getTotaalKosten() {
return all.stream()
.map(VerbruikKosten::getKosten)
.filter(Objects::nonNull)
.reduce(BigDecimal::add)
.orElse(null);
}
public VerbruikKosten sumToSingle() {
return new VerbruikKosten(getTotaalVerbruik(), getTotaalKosten());
}
}
## Instruction:
Fix issues found by SonarCloud
## Code After:
package nl.homeserver.energie;
import java.math.BigDecimal;
import java.util.Collection;
import java.util.Objects;
import javax.annotation.Nullable;
class VerbruikenEnKosten {
private final Collection<VerbruikKosten> all;
VerbruikenEnKosten(final Collection<VerbruikKosten> all) {
this.all = all;
}
@Nullable
private BigDecimal getTotaalVerbruik() {
return all.stream()
.map(VerbruikKosten::getVerbruik)
.filter(Objects::nonNull)
.reduce(BigDecimal::add)
.orElse(null);
}
@Nullable
private BigDecimal getTotaalKosten() {
return all.stream()
.map(VerbruikKosten::getKosten)
.filter(Objects::nonNull)
.reduce(BigDecimal::add)
.orElse(null);
}
VerbruikKosten sumToSingle() {
return new VerbruikKosten(getTotaalVerbruik(), getTotaalKosten());
}
}
|
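The Java change above sums nullable values with `filter(Objects::nonNull).reduce(BigDecimal::add).orElse(null)`, so an all-null input yields null rather than zero. A minimal Python rendering of that same filter-and-reduce pattern (the function name `total` is hypothetical):

```python
from functools import reduce

def total(values):
    """Sum the non-None entries; return None (not 0) when nothing has a value,
    mirroring the reduce(...).orElse(null) pattern in the Java record above."""
    present = [v for v in values if v is not None]
    if not present:
        return None
    return reduce(lambda a, b: a + b, present)

print(total([None, 2, 3]))  # 5
print(total([None, None]))  # None
```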
d7b0426f3e800379e04b66ce3e3b91394adddcfa | src/vector/rotationMatrix.js | src/vector/rotationMatrix.js | import "./";
orb.vector.rotationMatrix = function(α, e) {
var cosα = Math.cos(α),
sinα = Math.sin(α);
switch(e) {
case 1:
return [
1, 0, 0,
0, cosα, sinα,
0, -sinα, cosα
];
case 2:
return [
cosα, 0, -sinα,
0, 1, 0,
sinα, 0, cosα
];
case 3:
return [
cosα, sinα, 0,
-sinα, cosα, 0,
0, 0, 1
];
default:
throw new Error('rotation axis has to be 1, 2 or 3');
}
};
orb.vector.r = orb.vector.rotationMatrix;
| import "./";
orb.vector.rotationMatrix = function(α, e) {
α = α % ( 2*Math.PI );
var cosα, sinα;
if ( α === 0 ) {
cosα = 1;
sinα = 0;
} else if ( α === Math.PI/2 || α === -3/2*Math.PI ) {
cosα = 0;
sinα = 1;
} else if ( α === Math.PI || α === -Math.PI ) {
cosα = -1;
sinα = 0;
} else if ( α === 3/2*Math.PI || α === -Math.PI/2 ) {
cosα = 0;
sinα = -1;
} else {
cosα = Math.cos(α);
sinα = Math.sin(α);
}
switch(e) {
case 1:
return [
1, 0, 0,
0, cosα, sinα,
0, -sinα, cosα
];
case 2:
return [
cosα, 0, -sinα,
0, 1, 0,
sinα, 0, cosα
];
case 3:
return [
cosα, sinα, 0,
-sinα, cosα, 0,
0, 0, 1
];
default:
throw new Error('rotation axis has to be 1, 2 or 3');
}
};
orb.vector.r = orb.vector.rotationMatrix;
| Return exact values for α = z * 1/2 pi | [CHANGE] Return exact values for α = z * 1/2 pi
| JavaScript | mit | benelsen/orb | javascript | ## Code Before:
import "./";
orb.vector.rotationMatrix = function(α, e) {
var cosα = Math.cos(α),
sinα = Math.sin(α);
switch(e) {
case 1:
return [
1, 0, 0,
0, cosα, sinα,
0, -sinα, cosα
];
case 2:
return [
cosα, 0, -sinα,
0, 1, 0,
sinα, 0, cosα
];
case 3:
return [
cosα, sinα, 0,
-sinα, cosα, 0,
0, 0, 1
];
default:
throw new Error('rotation axis has to be 1, 2 or 3');
}
};
orb.vector.r = orb.vector.rotationMatrix;
## Instruction:
[CHANGE] Return exact values for α = z * 1/2 pi
## Code After:
import "./";
orb.vector.rotationMatrix = function(α, e) {
α = α % ( 2*Math.PI );
var cosα, sinα;
if ( α === 0 ) {
cosα = 1;
sinα = 0;
} else if ( α === Math.PI/2 || α === -3/2*Math.PI ) {
cosα = 0;
sinα = 1;
} else if ( α === Math.PI || α === -Math.PI ) {
cosα = -1;
sinα = 0;
} else if ( α === 3/2*Math.PI || α === -Math.PI/2 ) {
cosα = 0;
sinα = -1;
} else {
cosα = Math.cos(α);
sinα = Math.sin(α);
}
switch(e) {
case 1:
return [
1, 0, 0,
0, cosα, sinα,
0, -sinα, cosα
];
case 2:
return [
cosα, 0, -sinα,
0, 1, 0,
sinα, 0, cosα
];
case 3:
return [
cosα, sinα, 0,
-sinα, cosα, 0,
0, 0, 1
];
default:
throw new Error('rotation axis has to be 1, 2 or 3');
}
};
orb.vector.r = orb.vector.rotationMatrix;
|
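The motivation for the change above is that floating-point trigonometry does not return exact values at quarter-turn angles: `cos(pi/2)` comes back as roughly 6.1e-17 instead of 0. A Python sketch of the snapping idea (implemented here via an angle ratio rather than the record's explicit `===` comparisons, so it is an illustration of the intent, not a port):

```python
import math

def snapped_cos_sin(alpha):
    """Return (cos, sin), with exact values at non-negative multiples of a
    quarter turn, like the patched rotationMatrix above."""
    alpha = alpha % (2 * math.pi)
    quarter = math.pi / 2
    exact = {0: (1, 0), 1: (0, 1), 2: (-1, 0), 3: (0, -1)}
    ratio = alpha / quarter
    if ratio == int(ratio):
        return exact[int(ratio) % 4]
    return (math.cos(alpha), math.sin(alpha))

# Plain math.cos drifts at pi/2; the snapped version does not.
print(math.cos(math.pi / 2))         # ~6.12e-17, not 0
print(snapped_cos_sin(math.pi / 2))  # (0, 1)
```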
32e1c59ea816aaf921905e4717260988763581d4 | .travis/deploy.sh | .travis/deploy.sh | set -e
# update current version number to a TAG version if this is a tag build
if [ ! -z "$TRAVIS_TAG" ]
then
echo "on a tag -> set pom.xml <version> to $TRAVIS_TAG for release"
mvn --settings .travis/mvnsettings.xml org.codehaus.mojo:versions-maven-plugin:2.3:set -DnewVersion=$TRAVIS_TAG
else
echo "not on a tag -> keep snapshot version in pom.xml"
fi
# cleanup and generate gpg keys
if [ ! -z "$TRAVIS" -a -f "$HOME/.gnupg" ]; then
shred -v ~/.gnupg/*
rm -rf ~/.gnupg
fi
source .travis/gpg.sh
# DEPLOY \o/
mvn clean deploy --settings .travis/mvnsettings.xml -DskipTests=true --batch-mode --update-snapshots
# cleanup gpg keys, just to be safe
if [ ! -z "$TRAVIS" ]; then
shred -v ~/.gnupg/*
rm -rf ~/.gnupg
fi
| set -e
# update current version number to a TAG version if this is a tag build
if [ ! -z "$TRAVIS_TAG" ]
then
echo "on a tag -> set pom.xml <version> to $TRAVIS_TAG for release"
mvn --settings .travis/mvnsettings.xml org.codehaus.mojo:versions-maven-plugin:2.3:set -DnewVersion=$TRAVIS_TAG
else
echo "not on a tag -> keep snapshot version in pom.xml"
fi
# cleanup and generate gpg keys
if [ ! -z "$TRAVIS" -a -f "$HOME/.gnupg" ]; then
shred -v ~/.gnupg/*
rm -rf ~/.gnupg
fi
source .travis/gpg.sh
# DEPLOY \o/
mvn clean deploy --settings .travis/mvnsettings.xml -DskipTests=true --batch-mode --update-snapshots
# cleanup gpg keys, just to be safe
if [ ! -z "$TRAVIS" ]; then
find ~/.gnupg/ -type f -exec shred -v {} \;
rm -rf ~/.gnupg
fi
| Fix shredding taking care of dirs | Fix shredding taking care of dirs
| Shell | apache-2.0 | CurrencyFair/OneSignal-Java-SDK,CurrencyFair/OneSignal-Java-SDK | shell | ## Code Before:
set -e
# update current version number to a TAG version if this is a tag build
if [ ! -z "$TRAVIS_TAG" ]
then
echo "on a tag -> set pom.xml <version> to $TRAVIS_TAG for release"
mvn --settings .travis/mvnsettings.xml org.codehaus.mojo:versions-maven-plugin:2.3:set -DnewVersion=$TRAVIS_TAG
else
echo "not on a tag -> keep snapshot version in pom.xml"
fi
# cleanup and generate gpg keys
if [ ! -z "$TRAVIS" -a -f "$HOME/.gnupg" ]; then
shred -v ~/.gnupg/*
rm -rf ~/.gnupg
fi
source .travis/gpg.sh
# DEPLOY \o/
mvn clean deploy --settings .travis/mvnsettings.xml -DskipTests=true --batch-mode --update-snapshots
# cleanup gpg keys, just to be safe
if [ ! -z "$TRAVIS" ]; then
shred -v ~/.gnupg/*
rm -rf ~/.gnupg
fi
## Instruction:
Fix shredding taking care of dirs
## Code After:
set -e
# update current version number to a TAG version if this is a tag build
if [ ! -z "$TRAVIS_TAG" ]
then
echo "on a tag -> set pom.xml <version> to $TRAVIS_TAG for release"
mvn --settings .travis/mvnsettings.xml org.codehaus.mojo:versions-maven-plugin:2.3:set -DnewVersion=$TRAVIS_TAG
else
echo "not on a tag -> keep snapshot version in pom.xml"
fi
# cleanup and generate gpg keys
if [ ! -z "$TRAVIS" -a -f "$HOME/.gnupg" ]; then
shred -v ~/.gnupg/*
rm -rf ~/.gnupg
fi
source .travis/gpg.sh
# DEPLOY \o/
mvn clean deploy --settings .travis/mvnsettings.xml -DskipTests=true --batch-mode --update-snapshots
# cleanup gpg keys, just to be safe
if [ ! -z "$TRAVIS" ]; then
find ~/.gnupg/ -type f -exec shred -v {} \;
rm -rf ~/.gnupg
fi
|
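The shell fix above replaces `shred ~/.gnupg/*` with `find ~/.gnupg/ -type f -exec shred`, because shred works only on regular files and the glob can match subdirectories. A Python sketch of that file-only selection using `os.walk` on a throwaway tree (file and directory names here are invented for illustration):

```python
import os
import tempfile

def regular_files(root):
    """Collect regular files under root, skipping directories --
    the same selection that `find root -type f` performs."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            found.append(os.path.join(dirpath, name))
    return sorted(found)

# Build a throwaway tree with a nested directory, loosely modeled on ~/.gnupg.
root = tempfile.mkdtemp()
open(os.path.join(root, 'pubring.kbx'), 'w').close()
sub = os.path.join(root, 'private-keys-v1.d')
os.mkdir(sub)
open(os.path.join(sub, 'key.gpg'), 'w').close()

print(regular_files(root))  # only the two files, never the subdirectory itself
```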
1bf70e6068b53ff8178547e1117359a6cc93ac94 | server.js | server.js | // vim: set ts=2 et:
var express = require('express'),
app = express(),
http = require('http').createServer(app),
io = require('socket.io').listen(http),
PeerServer = require('peer').PeerServer,
server = new PeerServer({port: 9000}),
peers = {};
server.on('connection', function(id) {
console.log("New peer: " + id);
console.log(peers);
});
io.sockets.on('connection', function (socket) {
//socket.emit('news', { hello: 'world' });
socket.on('newpeer', function (id) {
peers[id] = 1;
io.sockets.emit('updatelist', Object.keys(peers));
});
socket.on('disconnect', function() {
delete peers[id];
io.sockets.emit('updatelist', Object.keys(peers));
console.log("Peer gone: " + id);
console.log(peers);
});
});
app.use('/', express.static(__dirname + '/myapp'));
http.listen(8111);
| // vim: set ts=2 et:
var express = require('express'),
app = express(),
http = require('http').createServer(app),
io = require('socket.io').listen(http),
PeerServer = require('peer').PeerServer,
server = new PeerServer({port: 9000}),
peers = {};
server.on('connection', function(id) {
console.log("New peer: " + id);
console.log(peers);
});
io.sockets.on('connection', function (socket) {
//socket.emit('news', { hello: 'world' });
socket.on('newpeer', function (id) {
socket.peer_id = id;
peers[id] = 1;
io.sockets.emit('updatelist', Object.keys(peers));
});
socket.on('disconnect', function() {
delete peers[socket.peer_id];
io.sockets.emit('updatelist', Object.keys(peers));
console.log("Peer gone: " + socket.peer_id);
console.log(peers);
});
});
app.use('/', express.static(__dirname + '/myapp'));
http.listen(8111);
| Save peer id in socket ds. | Save peer id in socket ds.
| JavaScript | mit | pbanaszkiewicz/pitt | javascript | ## Code Before:
// vim: set ts=2 et:
var express = require('express'),
app = express(),
http = require('http').createServer(app),
io = require('socket.io').listen(http),
PeerServer = require('peer').PeerServer,
server = new PeerServer({port: 9000}),
peers = {};
server.on('connection', function(id) {
console.log("New peer: " + id);
console.log(peers);
});
io.sockets.on('connection', function (socket) {
//socket.emit('news', { hello: 'world' });
socket.on('newpeer', function (id) {
peers[id] = 1;
io.sockets.emit('updatelist', Object.keys(peers));
});
socket.on('disconnect', function() {
delete peers[id];
io.sockets.emit('updatelist', Object.keys(peers));
console.log("Peer gone: " + id);
console.log(peers);
});
});
app.use('/', express.static(__dirname + '/myapp'));
http.listen(8111);
## Instruction:
Save peer id in socket ds.
## Code After:
// vim: set ts=2 et:
var express = require('express'),
app = express(),
http = require('http').createServer(app),
io = require('socket.io').listen(http),
PeerServer = require('peer').PeerServer,
server = new PeerServer({port: 9000}),
peers = {};
server.on('connection', function(id) {
console.log("New peer: " + id);
console.log(peers);
});
io.sockets.on('connection', function (socket) {
//socket.emit('news', { hello: 'world' });
socket.on('newpeer', function (id) {
socket.peer_id = id;
peers[id] = 1;
io.sockets.emit('updatelist', Object.keys(peers));
});
socket.on('disconnect', function() {
delete peers[socket.peer_id];
io.sockets.emit('updatelist', Object.keys(peers));
console.log("Peer gone: " + socket.peer_id);
console.log(peers);
});
});
app.use('/', express.static(__dirname + '/myapp'));
http.listen(8111);
|
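The bug fixed above is a scope error: the `disconnect` handler referenced `id`, which only existed inside the `newpeer` callback, so the fix stores the peer id on the socket object itself. A minimal Python sketch of that per-connection-state idea; `FakeSocket` and the handler names are hypothetical stand-ins for the socket.io objects:

```python
class FakeSocket:
    """Stand-in for a socket.io connection: handlers share state via attributes."""
    pass

peers = {}

def on_newpeer(socket, peer_id):
    socket.peer_id = peer_id   # remember the id on the connection itself
    peers[peer_id] = 1

def on_disconnect(socket):
    # Without socket.peer_id, this handler has no way to know which peer left.
    del peers[socket.peer_id]

s = FakeSocket()
on_newpeer(s, 'abc123')
print(sorted(peers))  # ['abc123']
on_disconnect(s)
print(sorted(peers))  # []
```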
bb0e86778b6c6d6d1f2819683ffc3147b74b53b1 | libs/readme.md | libs/readme.md | This directory should contain the library dependencies required to build this RimWorld mod.
Besides the .NET Core libraries this solution has dependencies on the following libraries:
- From RimWorld core Game:
- Assembly-CSharp (RimWorld game logic)
- UnityEngine (Unity Game Engine)
- Third Party:
- (Harmony)[https://github.com/pardeike/Harmony] by (@pardeike)[https://github.com/pardeike] (distributed with HugsLib)
- (HugsLib)[https://github.com/UnlimitedHugs/RimworldHugsLib] by (@UnlimitedHugs)[https://github.com/UnlimitedHugs]
| This directory should contain the library dependencies required to build this RimWorld mod.
Besides the .NET Core libraries this solution has dependencies on the following libraries:
- From RimWorld core Game:
- Assembly-CSharp (RimWorld game logic)
- UnityEngine (Unity Game Engine)
- Third Party:
- [Harmony](https://github.com/pardeike/Harmony) by [@pardeike](https://github.com/pardeike) (distributed with HugsLib)
- [HugsLib](https://github.com/UnlimitedHugs/RimworldHugsLib) by [@UnlimitedHugs](https://github.com/UnlimitedHugs)
| Fix mix up between parenthesis and brackets in markdown... | Fix mix up between parenthesis and brackets in markdown...
| Markdown | mit | neitsa/PrepareLanding,neitsa/PrepareLanding | markdown | ## Code Before:
This directory should contain the library dependencies required to build this RimWorld mod.
Besides the .NET Core libraries this solution has dependencies on the following libraries:
- From RimWorld core Game:
- Assembly-CSharp (RimWorld game logic)
- UnityEngine (Unity Game Engine)
- Third Party:
- (Harmony)[https://github.com/pardeike/Harmony] by (@pardeike)[https://github.com/pardeike] (distributed with HugsLib)
- (HugsLib)[https://github.com/UnlimitedHugs/RimworldHugsLib] by (@UnlimitedHugs)[https://github.com/UnlimitedHugs]
## Instruction:
Fix mix up between parenthesis and brackets in markdown...
## Code After:
This directory should contain the library dependencies required to build this RimWorld mod.
Besides the .NET Core libraries this solution has dependencies on the following libraries:
- From RimWorld core Game:
- Assembly-CSharp (RimWorld game logic)
- UnityEngine (Unity Game Engine)
- Third Party:
- [Harmony](https://github.com/pardeike/Harmony) by [@pardeike](https://github.com/pardeike) (distributed with HugsLib)
- [HugsLib](https://github.com/UnlimitedHugs/RimworldHugsLib) by [@UnlimitedHugs](https://github.com/UnlimitedHugs)
|
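The change above swaps the mistaken `(label)[url]` form for Markdown's correct `[label](url)` link syntax. A small regex sketch that performs the same repair mechanically; it is illustrative only and assumes labels contain no parentheses and URLs no brackets:

```python
import re

def fix_markdown_links(text):
    """Rewrite the common (label)[url] mistake into the correct [label](url)."""
    return re.sub(r'\(([^()]+)\)\[([^\[\]]+)\]', r'[\1](\2)', text)

broken = "- (Harmony)[https://github.com/pardeike/Harmony] by (@pardeike)[https://github.com/pardeike]"
print(fix_markdown_links(broken))
```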
28c20bc68085aa2250fef5664175d890ddc35cb9 | app/views/backend/analyses/show.html.haml | app/views/backend/analyses/show.html.haml | - main_toolbar do |t|
= t.edit resource
= main_informations attachment: true do
= attributes_list do |l|
- l.attribute :number
- l.attribute :reference_number
- l.attribute :sensor, url: true
- l.attribute :product, url: true
- l.attribute :sampler, url: true
- l.attribute :sampled_at
- l.attribute :analyser, url: true
- l.attribute :analysed_at
- if resource.geolocation
:ruby
data_product = []
product = resource.product
if product.shape
data_product << {name: product.name, shape: product.shape}
end
= visualization do |v|
- v.serie :data, [{name: resource.number, shape: resource.geolocation}]
- if data_product.any?
- v.serie :data_product, data_product
- v.categories :activity, :data_product
- v.path "#{resource.number}", :data
- v.control :zoom
- v.control :scale
- v.control :fullscreen
- v.control :layer_selector
= cobbles do |c|
- c.cobble :items do
= cobble_list :items
| - main_toolbar do |t|
= t.edit resource
= main_informations attachment: true do
= attributes_list do |l|
- l.attribute :number
- l.attribute :reference_number
- l.attribute :sensor, url: true
- l.attribute :product, url: true
- l.attribute :sampler, url: true
- l.attribute :sampled_at
- l.attribute :analyser, url: true
- l.attribute :analysed_at
- if resource.geolocation
:ruby
data_product = []
product = resource.product
if product && product.shape
data_product << {name: product.name, shape: product.shape}
end
= visualization do |v|
- v.serie :data, [{name: resource.number, shape: resource.geolocation}]
- if data_product.any?
- v.serie :data_product, data_product
- v.categories :activity, :data_product
- v.path "#{resource.number}", :data
- v.control :zoom
- v.control :scale
- v.control :fullscreen
- v.control :layer_selector
= cobbles do |c|
- c.cobble :items do
= cobble_list :items
| Fix missing test on analysis view. | Fix missing test on analysis view.
| Haml | agpl-3.0 | ekylibre/ekylibre,ekylibre/ekylibre,ekylibre/ekylibre,ekylibre/ekylibre,ekylibre/ekylibre | haml | ## Code Before:
- main_toolbar do |t|
= t.edit resource
= main_informations attachment: true do
= attributes_list do |l|
- l.attribute :number
- l.attribute :reference_number
- l.attribute :sensor, url: true
- l.attribute :product, url: true
- l.attribute :sampler, url: true
- l.attribute :sampled_at
- l.attribute :analyser, url: true
- l.attribute :analysed_at
- if resource.geolocation
:ruby
data_product = []
product = resource.product
if product.shape
data_product << {name: product.name, shape: product.shape}
end
= visualization do |v|
- v.serie :data, [{name: resource.number, shape: resource.geolocation}]
- if data_product.any?
- v.serie :data_product, data_product
- v.categories :activity, :data_product
- v.path "#{resource.number}", :data
- v.control :zoom
- v.control :scale
- v.control :fullscreen
- v.control :layer_selector
= cobbles do |c|
- c.cobble :items do
= cobble_list :items
## Instruction:
Fix missing test on analysis view.
## Code After:
- main_toolbar do |t|
= t.edit resource
= main_informations attachment: true do
= attributes_list do |l|
- l.attribute :number
- l.attribute :reference_number
- l.attribute :sensor, url: true
- l.attribute :product, url: true
- l.attribute :sampler, url: true
- l.attribute :sampled_at
- l.attribute :analyser, url: true
- l.attribute :analysed_at
- if resource.geolocation
:ruby
data_product = []
product = resource.product
if product && product.shape
data_product << {name: product.name, shape: product.shape}
end
= visualization do |v|
- v.serie :data, [{name: resource.number, shape: resource.geolocation}]
- if data_product.any?
- v.serie :data_product, data_product
- v.categories :activity, :data_product
- v.path "#{resource.number}", :data
- v.control :zoom
- v.control :scale
- v.control :fullscreen
- v.control :layer_selector
= cobbles do |c|
- c.cobble :items do
= cobble_list :items
|
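The Haml fix above guards against a missing product before reading `product.shape`, using Ruby's short-circuiting `&&`. The same idiom in Python uses `and`; this sketch's `Product` class and `shape_series` helper are hypothetical illustrations of the guard, not the view code:

```python
class Product:
    def __init__(self, name, shape=None):
        self.name = name
        self.shape = shape

def shape_series(product):
    """Guard both 'no product' and 'product without a shape', like the
    `if product && product.shape` check in the view above."""
    if product and product.shape:
        return [{'name': product.name, 'shape': product.shape}]
    return []

print(shape_series(None))                        # []
print(shape_series(Product('plot')))             # [] -- product exists, shape is None
print(shape_series(Product('plot', 'POLYGON')))  # one entry
```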
e896e3deafb0844e6b790fa81e5240790d1c5434 | src/CacheJsonToArticleConverter.cpp | src/CacheJsonToArticleConverter.cpp |
ArticleCollection& CacheJsonToArticleConverter::convertToArticle(std::string json, ArticleCollection& articleCache)
{
Json::Reader reader;
Json::Value document;
bool success = reader.parse(json, document, false);
if(!success) {
throw WalkerException("Error parsing JSON");
}
// get all "main" articles first
for(auto &titleElement : document.getMemberNames()) {
std::string title = titleElement;
Article *a = articleCache.get(title);
if(a == nullptr) {
a = new Article(title);
articleCache.add(a);
}
for(auto linkedArticle :
document.get(title, Json::Value::nullSingleton())
.get("forward_links", Json::Value::nullSingleton())) {
std::string linkedTitle = linkedArticle.asString();
Article *la = articleCache.get(linkedTitle);
if(la == nullptr) {
la = new Article(linkedTitle);
articleCache.add(la);
}
a->addLink(la);
}
}
return articleCache;
/*
a->setAnalyzed(true); ?
*/
}
|
ArticleCollection& CacheJsonToArticleConverter::convertToArticle(std::string json, ArticleCollection& articleCache)
{
Json::Reader reader;
Json::Value document;
bool success = reader.parse(json, document, false);
if(!success) {
throw WalkerException("Error parsing JSON");
}
// get all "main" articles first
for(auto &titleElement : document.getMemberNames()) {
std::string title = titleElement;
Article *a = articleCache.get(title);
if(a == nullptr) {
a = new Article(title);
articleCache.add(a);
}
auto links = document
.get(title, Json::Value::nullSingleton())
.get("forward_links", Json::Value::nullSingleton());
if(links.isNull()) {
/* don't need to set article analyzed to false,
* since that's the default */
continue;
}
a->setAnalyzed(true);
for(auto linkedArticle : links) {
std::string linkedTitle = linkedArticle.asString();
Article *la = articleCache.get(linkedTitle);
if(la == nullptr) {
la = new Article(linkedTitle);
articleCache.add(la);
}
a->addLink(la);
}
}
return articleCache;
/*
a->setAnalyzed(true); ?
*/
}
| Modify json cache implementation to match null/empty array behavior | Modify json cache implementation to match null/empty array behavior
| C++ | mit | dueringa/WikiWalker | c++ | ## Code Before:
ArticleCollection& CacheJsonToArticleConverter::convertToArticle(std::string json, ArticleCollection& articleCache)
{
Json::Reader reader;
Json::Value document;
bool success = reader.parse(json, document, false);
if(!success) {
throw WalkerException("Error parsing JSON");
}
// get all "main" articles first
for(auto &titleElement : document.getMemberNames()) {
std::string title = titleElement;
Article *a = articleCache.get(title);
if(a == nullptr) {
a = new Article(title);
articleCache.add(a);
}
for(auto linkedArticle :
document.get(title, Json::Value::nullSingleton())
.get("forward_links", Json::Value::nullSingleton())) {
std::string linkedTitle = linkedArticle.asString();
Article *la = articleCache.get(linkedTitle);
if(la == nullptr) {
la = new Article(linkedTitle);
articleCache.add(la);
}
a->addLink(la);
}
}
return articleCache;
/*
a->setAnalyzed(true); ?
*/
}
## Instruction:
Modify json cache implementation to match null/empty array behavior
## Code After:
ArticleCollection& CacheJsonToArticleConverter::convertToArticle(std::string json, ArticleCollection& articleCache)
{
Json::Reader reader;
Json::Value document;
bool success = reader.parse(json, document, false);
if(!success) {
throw WalkerException("Error parsing JSON");
}
// get all "main" articles first
for(auto &titleElement : document.getMemberNames()) {
std::string title = titleElement;
Article *a = articleCache.get(title);
if(a == nullptr) {
a = new Article(title);
articleCache.add(a);
}
auto links = document
.get(title, Json::Value::nullSingleton())
.get("forward_links", Json::Value::nullSingleton());
if(links.isNull()) {
/* don't need to set article analyzed to false,
* since that's the default */
continue;
}
a->setAnalyzed(true);
for(auto linkedArticle : links) {
std::string linkedTitle = linkedArticle.asString();
Article *la = articleCache.get(linkedTitle);
if(la == nullptr) {
la = new Article(linkedTitle);
articleCache.add(la);
}
a->addLink(la);
}
}
return articleCache;
/*
a->setAnalyzed(true); ?
*/
}
|
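The C++ change above distinguishes a null `forward_links` field (article not yet analyzed) from an empty array (analyzed, zero links). A Python sketch of that null-versus-empty distinction on a plain dict; the `load_article` helper and its return shape are invented for illustration:

```python
def load_article(doc):
    """Distinguish a missing 'forward_links' key (not analyzed) from an
    empty list (analyzed with zero links), as the converter above now does."""
    links = doc.get('forward_links')   # None when the key is absent
    if links is None:
        return {'analyzed': False, 'links': []}
    return {'analyzed': True, 'links': list(links)}

print(load_article({'title': 'A'}))                          # not analyzed
print(load_article({'title': 'B', 'forward_links': []}))     # analyzed, no links
print(load_article({'title': 'C', 'forward_links': ['A']}))  # analyzed, one link
```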
c7e65db27da59ddf221d1720362434581ef30311 | test/unit/locale/test_locale.py | test/unit/locale/test_locale.py |
import os
import unittest
try:
from subprocess import check_output
except ImportError:
from subprocess import Popen, PIPE, CalledProcessError
def check_output(*popenargs, **kwargs):
"""Lifted from python 2.7 stdlib."""
if 'stdout' in kwargs:
raise ValueError('stdout argument not allowed, it will be '
'overridden.')
process = Popen(stdout=PIPE, *popenargs, **kwargs)
output, unused_err = process.communicate()
retcode = process.poll()
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
raise CalledProcessError(retcode, cmd, output=output)
return output
os.environ['LC_ALL'] = 'eo'
os.environ['SWIFT_LOCALEDIR'] = os.path.dirname(__file__)
from swift import gettext_ as _
class TestTranslations(unittest.TestCase):
def test_translations(self):
translated_message = check_output(['python', __file__])
self.assertEquals(translated_message, 'testo mesaĝon\n')
if __name__ == "__main__":
print _('test message')
|
import os
import unittest
import string
import sys
try:
from subprocess import check_output
except ImportError:
from subprocess import Popen, PIPE, CalledProcessError
def check_output(*popenargs, **kwargs):
"""Lifted from python 2.7 stdlib."""
if 'stdout' in kwargs:
raise ValueError('stdout argument not allowed, it will be '
'overridden.')
process = Popen(stdout=PIPE, *popenargs, **kwargs)
output, unused_err = process.communicate()
retcode = process.poll()
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
raise CalledProcessError(retcode, cmd, output=output)
return output
os.environ['LC_ALL'] = 'eo'
os.environ['SWIFT_LOCALEDIR'] = os.path.dirname(__file__)
class TestTranslations(unittest.TestCase):
def test_translations(self):
path = ':'.join(sys.path)
translated_message = check_output(['python', __file__, path])
self.assertEquals(translated_message, 'testo mesaĝon\n')
if __name__ == "__main__":
sys.path = string.split(sys.argv[1], ':')
from swift import gettext_ as _
print _('test message')
| Make test_translations test our tree | Make test_translations test our tree
In order to run the correct classes, Python test framework adjusts
sys.path. However, these changes are not propagated to subprocesses.
Therefore, the test actually tries to test installed Swift, not
the one in which it is running.
The usual suggestion is to run "python setup.py develop" before
testing, but it's annoying and error-prone. If you forget it,
you may test the code in /usr very easily, and never know.
Let's just pass the correct path to subprocess. Much safer.
Change-Id: Ic71314e8462cf6e0579d704ffe9fbbfac7e6ba24
| Python | apache-2.0 | swiftstack/swift,rackerlabs/swift,zackmdavis/swift,williamthegrey/swift,eatbyte/Swift,matthewoliver/swift,Seagate/swift,anishnarang/gswift,clayg/swift,shibaniahegde/OpenStak_swift,openstack/swift,prashanthpai/swift,matthewoliver/swift,psachin/swift,nadeemsyed/swift,AfonsoFGarcia/swift,prashanthpai/swift,smerritt/swift,levythu/swift,hbhdytf/mac,mjzmjz/swift,Khushbu27/Tutorial,redhat-openstack/swift,Seagate/swift,takeshineshiro/swift,notmyname/swift,dpgoetz/swift,nadeemsyed/swift,matthewoliver/swift,tipabu/swift,tipabu/swift,bkolli/swift,gold3bear/swift,mjwtom/swift,swiftstack/swift,IPVL/swift-kilo,Em-Pan/swift,hbhdytf/mac,scality/ScalitySproxydSwift,thiagodasilva/swift,Khushbu27/Tutorial,xiaoguoai/ec-dev-swift,nadeemsyed/swift,williamthegrey/swift,openstack/swift,sarvesh-ranjan/swift,anishnarang/gswift,smerritt/swift,hurricanerix/swift,mjwtom/swift,bradleypj823/swift,dpgoetz/swift,eatbyte/Swift,Akanoa/swift,clayg/swift,gold3bear/swift,psachin/swift,swiftstack/swift,hbhdytf/mac2,openstack/swift,psachin/swift,wenhuizhang/swift,hurricanerix/swift,hbhdytf/mac2,wenhuizhang/swift,aerwin3/swift,maginatics/swift,NeCTAR-RC/swift,bkolli/swift,sarvesh-ranjan/swift,bradleypj823/swift,daasbank/swift,revoer/keystone-8.0.0,smerritt/swift,notmyname/swift,shibaniahegde/OpenStak_swift,NeCTAR-RC/swift,Em-Pan/swift,psachin/swift,openstack/swift,thiagodasilva/swift,notmyname/swift,iostackproject/IO-Bandwidth-Differentiation,maginatics/swift,tipabu/swift,larsbutler/swift,hurricanerix/swift,bouncestorage/swift,redbo/swift,AfonsoFGarcia/swift,notmyname/swift,clayg/swift,revoer/keystone-8.0.0,iostackproject/IO-Bandwidth-Differentiation,bouncestorage/swift,xiaoguoai/ec-dev-swift,hbhdytf/mac2,redhat-openstack/swift,hbhdytf/mac2,dencaval/swift,levythu/swift,nadeemsyed/swift,takeshineshiro/swift,mjzmjz/swift,scality/ScalitySproxydSwift,larsbutler/swift,zackmdavis/swift,dencaval/swift,daasbank/swift,matthewoliver/swift,aerwin3/swift,rackerlabs/swift,IPVL/swift-kilo,smerrit
t/swift,Akanoa/swift,clayg/swift,redbo/swift,tipabu/swift,hurricanerix/swift | python | ## Code Before:
import os
import unittest
try:
from subprocess import check_output
except ImportError:
from subprocess import Popen, PIPE, CalledProcessError
def check_output(*popenargs, **kwargs):
"""Lifted from python 2.7 stdlib."""
if 'stdout' in kwargs:
raise ValueError('stdout argument not allowed, it will be '
'overridden.')
process = Popen(stdout=PIPE, *popenargs, **kwargs)
output, unused_err = process.communicate()
retcode = process.poll()
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
raise CalledProcessError(retcode, cmd, output=output)
return output
os.environ['LC_ALL'] = 'eo'
os.environ['SWIFT_LOCALEDIR'] = os.path.dirname(__file__)
from swift import gettext_ as _
class TestTranslations(unittest.TestCase):
def test_translations(self):
translated_message = check_output(['python', __file__])
self.assertEquals(translated_message, 'testo mesaĝon\n')
if __name__ == "__main__":
print _('test message')
## Instruction:
Make test_translations test our tree
In order to run the correct classes, Python test framework adjusts
sys.path. However, these changes are not propagated to subprocesses.
Therefore, the test actually tries to test installed Swift, not
the one in which it is running.
The usual suggestion is to run "python setup.py develop" before
testing, but it's annoying and error-prone. If you forget it,
you may test the code in /usr very easily, and never know.
Let's just pass the correct path to subprocess. Much safer.
Change-Id: Ic71314e8462cf6e0579d704ffe9fbbfac7e6ba24
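The fix described above, forwarding the parent's sys.path to the child process through argv, can be sketched in isolation. This is an illustrative sketch, not Swift's code: the helper names are made up, and it uses Python 3 string methods where the original patch targets Python 2's string module.

```python
import sys

def path_argv(script):
    # Parent side: serialize sys.path into one argv entry, exactly as the
    # patch does with ':'.join(sys.path), and run the script with the same
    # interpreter. (A path that itself contains ':' would break this scheme.)
    return [sys.executable, script, ':'.join(sys.path)]

def restore_path(serialized):
    # Child side: rebuild sys.path from argv[1] *before* importing the
    # package under test, so the in-tree code shadows any installed copy.
    return serialized.split(':')
```

The child would then run `sys.path = restore_path(sys.argv[1])` as its first step, mirroring the `sys.path = string.split(sys.argv[1], ':')` line in the patch.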
## Code After:
import os
import unittest
import string
import sys
try:
from subprocess import check_output
except ImportError:
from subprocess import Popen, PIPE, CalledProcessError
def check_output(*popenargs, **kwargs):
"""Lifted from python 2.7 stdlib."""
if 'stdout' in kwargs:
raise ValueError('stdout argument not allowed, it will be '
'overridden.')
process = Popen(stdout=PIPE, *popenargs, **kwargs)
output, unused_err = process.communicate()
retcode = process.poll()
if retcode:
cmd = kwargs.get("args")
if cmd is None:
cmd = popenargs[0]
raise CalledProcessError(retcode, cmd, output=output)
return output
os.environ['LC_ALL'] = 'eo'
os.environ['SWIFT_LOCALEDIR'] = os.path.dirname(__file__)
class TestTranslations(unittest.TestCase):
def test_translations(self):
path = ':'.join(sys.path)
translated_message = check_output(['python', __file__, path])
self.assertEquals(translated_message, 'testo mesaĝon\n')
if __name__ == "__main__":
sys.path = string.split(sys.argv[1], ':')
from swift import gettext_ as _
print _('test message')
|
a038a94c802839ce25a1767eb7ea96149b27ae54 | lib/active_merchant/billing/integrations/paydollar/helper.rb | lib/active_merchant/billing/integrations/paydollar/helper.rb | module ActiveMerchant #:nodoc:
module Billing #:nodoc:
module Integrations #:nodoc:
module Paydollar
class Helper < ActiveMerchant::Billing::Integrations::Helper
def initialize(order, account, options = {})
super
add_field('payType', 'N') # normal sale and not just auth
@secret = options[:credential2]
end
def form_fields
@fields.merge('secureHash' => generate_secure_hash)
end
def generate_secure_hash
fields = [@fields[mappings[:account]],
@fields[mappings[:order]],
@fields[mappings[:currency]],
@fields[mappings[:amount]],
@fields['payType']]
Paydollar.sign(fields, @secret)
end
def currency=(currency_code)
code = CURRENCY_MAP[currency_code]
raise StandardError, "Invalid currency code #{currency_code} specified" if code.nil?
add_field(mappings[:currency], code)
end
mapping :account, 'merchantId'
mapping :amount, 'amount'
mapping :order, 'orderRef'
mapping :currency, 'currCode'
mapping :return_url, 'successUrl'
mapping :cancel_return_url, ['cancelUrl','failUrl']
end
end
end
end
end
| module ActiveMerchant #:nodoc:
module Billing #:nodoc:
module Integrations #:nodoc:
module Paydollar
class Helper < ActiveMerchant::Billing::Integrations::Helper
def initialize(order, account, options = {})
super
add_field('payType', 'N') # normal sale and not just auth
@secret = options[:credential2]
end
def form_fields
@fields.merge('secureHash' => generate_secure_hash)
end
def generate_secure_hash
fields = [@fields[mappings[:account]],
@fields[mappings[:order]],
@fields[mappings[:currency]],
@fields[mappings[:amount]],
@fields['payType']]
Paydollar.sign(fields, @secret)
end
def currency=(currency_code)
add_field(mappings[:currency], CURRENCY_MAP[currency_code])
end
mapping :account, 'merchantId'
mapping :amount, 'amount'
mapping :order, 'orderRef'
mapping :currency, 'currCode'
mapping :return_url, 'successUrl'
mapping :cancel_return_url, ['cancelUrl','failUrl']
end
end
end
end
end
| Remove extraneous checking of currency code | Remove extraneous checking of currency code
| Ruby | mit | cremalab/active_merchant,icaroseara/active_merchant,planetargon/active_merchant,agseco/active_merchant,npverni/active_merchant,reinteractive/active_merchant,tangosource/active_merchant,CraigAPayne/active_merchant,paulsponagl/active_merchant,Kagetsuki/active_merchant,getaroom/active_merchant,Trexle/active_merchant,vivekschetu/active_merchant,sealink/active_merchant,codecraft63/active_merchant,InfraRuby/active_merchant,varyonic/active_merchant,alexandremcosta/active_merchant,fabiokr/active_merchant,nagash/active_merchant,coteyr/active_merchant,ShubhamGupta/active_merchant,wendy0402/active_merchant,taf2/active_merchant,dlehren/active_merchant,jdwsep/active_merchant,arkhitech/active_merchant,rocketlobster/active_merchant,bandzoogle/active_merchant,ammoready/active_merchant,ao/active_merchant,llopez/active_merchant,donaldpiret/active_merchant,michaelherold/active_merchant,X0Refraction/active_merchant2,trevorgrayson/active_merchant,cyanna/active_merchant,jfagan/active_merchant,rafiaqutab/active_merchant,miyazawadegica/active_merchant,sheasadev/active_merchant,sideci-sample/sideci-sample-active_merchant,tychobrailleur/active_merchant,pacso/active_merchant,veracross/active_merchant,bruno/active_merchant,waysact/active_merchant,cef/active_merchant,pledgemusic/active_merchant,3dna/active_merchant,ashishCetuG/active_merchant,duff/active_merchant,celsodantas/active_merchant,atxwebs/active_merchant,komoju/active_merchant,exact/active_merchant,braintreeps/active_merchant,eduvo/hostedpci-active-merchant,Shopify/active_merchant,jsoma/active_merchant,HealthWave/active_merchant,ippoippo/active_merchant,zambot/active_merchant,domain7/active_merchant,elevation/active_merchant,gabetax/active_merchant,jwarchol/active_merchant,davidsantoso/active_merchant,bjacobson26/active_merchant,chargify/active_merchant,NareshVeluri/active_merchant,conekta/active_merchant,innku/active_merchant,gabealmer/active_merchant,algorich/active_merchant,Nguyenanh/active_merchant,waysact/active_me
rchant,joshnuss/active_merchant,anthonye2007/spreedly1,PaulRaye/active_merchant,hps/active_merchant,Mixbook/active_merchant,camelmasa/active_merchant,QuickPay/active_merchant,semaperepelitsa/active_merchant,anthonye2007/spreedly1,mrezentes/active_merchant,jeffleeismyhero/active_merchant,perryazevedo/active_merchant,shulmang/active_merchant,fgbreel/active_merchant,curiousepic/active_merchant,rwdaigle/active_merchant,whitby3001/active_merchant,itransact/active_merchant,buttercloud/active_merchant,jordan-brough/active_merchant,vanboom/active_merchant,jyr/active_merchant,tomriley/active_merchant,lactose/active_merchant,anderson-mondido/active-merchant-mondido,anellis/active_merchant,Edools/active_merchant,spreedly/active_merchant,autobutler/active_merchant,rubemz/active_merchant,tjstankus/active_merchant,gwmoura/active_merchant,mitjok/active_merchant_fork2,djsmentya/active_merchant,webdev1001/active_merchant,darcybrown/active_merchant,lcn-com-dev/active_merchant,pdamer/active_merchant,markhagan/active_merchant,dotandbo/active_merchant,samuelgiles/active_merchant,bazarka/active_merchant,omise/active_merchant,pinpayments/active_merchant,boone/active_merchant,vampirechicken/active_merchant,GiveCorps/active_merchant,matsubo/active_merchant,pierre/active_merchant,cmaion/active_merchant,BaseCampOps/active_merchant,inspire/active_merchant,weblinc/active_merchant,katgironpe/active_merchant,activemerchant/active_merchant,mitjok/active_merchant_payfort,samvincent/active_merchant,akiomik/active_merchant,jameswritescode/active_merchant,appfolio/active_merchant,jayfredlund/active_merchant,msdundar/active_merchant,craigchristenson/active_merchant,customerlobby/active_merchant,Simplero/active_merchant,musha68k/active_merchant,stAndrei/active_merchant,eitoball/active_merchant,acumenbrands/active_merchant,stAndrei/active_merchant-1 | ruby | ## Code Before:
module ActiveMerchant #:nodoc:
module Billing #:nodoc:
module Integrations #:nodoc:
module Paydollar
class Helper < ActiveMerchant::Billing::Integrations::Helper
def initialize(order, account, options = {})
super
add_field('payType', 'N') # normal sale and not just auth
@secret = options[:credential2]
end
def form_fields
@fields.merge('secureHash' => generate_secure_hash)
end
def generate_secure_hash
fields = [@fields[mappings[:account]],
@fields[mappings[:order]],
@fields[mappings[:currency]],
@fields[mappings[:amount]],
@fields['payType']]
Paydollar.sign(fields, @secret)
end
def currency=(currency_code)
code = CURRENCY_MAP[currency_code]
raise StandardError, "Invalid currency code #{currency_code} specified" if code.nil?
add_field(mappings[:currency], code)
end
mapping :account, 'merchantId'
mapping :amount, 'amount'
mapping :order, 'orderRef'
mapping :currency, 'currCode'
mapping :return_url, 'successUrl'
mapping :cancel_return_url, ['cancelUrl','failUrl']
end
end
end
end
end
## Instruction:
Remove extraneous checking of currency code
## Code After:
module ActiveMerchant #:nodoc:
module Billing #:nodoc:
module Integrations #:nodoc:
module Paydollar
class Helper < ActiveMerchant::Billing::Integrations::Helper
def initialize(order, account, options = {})
super
add_field('payType', 'N') # normal sale and not just auth
@secret = options[:credential2]
end
def form_fields
@fields.merge('secureHash' => generate_secure_hash)
end
def generate_secure_hash
fields = [@fields[mappings[:account]],
@fields[mappings[:order]],
@fields[mappings[:currency]],
@fields[mappings[:amount]],
@fields['payType']]
Paydollar.sign(fields, @secret)
end
def currency=(currency_code)
add_field(mappings[:currency], CURRENCY_MAP[currency_code])
end
mapping :account, 'merchantId'
mapping :amount, 'amount'
mapping :order, 'orderRef'
mapping :currency, 'currCode'
mapping :return_url, 'successUrl'
mapping :cancel_return_url, ['cancelUrl','failUrl']
end
end
end
end
end
|
6ece0f38b01fa70a2ee13477045c6d887cc26fa5 | meta-gnome/recipes-gnome/gnome-shell/gnome-shell-extensions_42.0.bb | meta-gnome/recipes-gnome/gnome-shell/gnome-shell-extensions_42.0.bb | SUMMARY = "GNOME Shell Extensions"
LICENSE = "GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4cb3a392cbf81a9e685ec13b88c4c101"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase gettext gsettings features_check
REQUIRED_DISTRO_FEATURES = "x11 polkit systemd pam gobject-introspection-data"
SRC_URI[archive.sha256sum] = "3ee65b75b1afd8bcca0a2a03da9b2884787ed40e257a881d9aa6ef7c8727602f"
DEPENDS += " \
sassc-native \
"
EXTRA_OEMESON += " \
-Dextension_set=all \
-Dclassic_mode=true \
"
do_install:append() {
# enable gnome-classic session for wayland
install -d ${D}${datadir}/wayland-sessions
install -m644 ${D}${datadir}/xsessions/gnome-classic.desktop ${D}${datadir}/wayland-sessions/
}
RDEPENDS:${PN} += "gnome-shell"
FILES:${PN} += " \
${datadir}/gnome-shell \
${datadir}/gnome-session \
${datadir}/wayland-sessions \
${datadir}/xsessions \
"
| SUMMARY = "GNOME Shell Extensions"
LICENSE = "GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4cb3a392cbf81a9e685ec13b88c4c101"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase gettext gsettings features_check
REQUIRED_DISTRO_FEATURES = "x11 polkit systemd pam gobject-introspection-data"
SRC_URI[archive.sha256sum] = "3ee65b75b1afd8bcca0a2a03da9b2884787ed40e257a881d9aa6ef7c8727602f"
DEPENDS += " \
sassc-native \
"
EXTRA_OEMESON += " \
-Dextension_set=all \
-Dclassic_mode=true \
"
RDEPENDS:${PN} += "gnome-shell"
FILES:${PN} += " \
${datadir}/gnome-shell \
${datadir}/gnome-session \
${datadir}/wayland-sessions \
${datadir}/xsessions \
"
| Stop copying gnome-classic session to wayland | gnome-shell-extensions: Stop copying gnome-classic session to wayland
It was me who introduced gnome-classic for wayland in
342de9bf51f27362e7d1d4f1d55d8548cfa7504c.
In the commit message I wrote:
"Enable gnome-classic session for wayland
Wonder why upstream does not ship this: Session runs perfectly fine"
It seems upstream was listening :) With 42.0 they ship two wayland sessions,
but now I do not understand why there are two: the sessions' (Try)Exec lines
are identical - only the displayed texts are different. Maybe it's for gdm's
sake, which I have never used so far.
Signed-off-by: Andreas Müller <[email protected]>
Signed-off-by: Khem Raj <[email protected]>
| BitBake | mit | openembedded/meta-openembedded,openembedded/meta-openembedded,openembedded/meta-openembedded,openembedded/meta-openembedded,openembedded/meta-openembedded,openembedded/meta-openembedded,openembedded/meta-openembedded,openembedded/meta-openembedded | bitbake | ## Code Before:
SUMMARY = "GNOME Shell Extensions"
LICENSE = "GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4cb3a392cbf81a9e685ec13b88c4c101"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase gettext gsettings features_check
REQUIRED_DISTRO_FEATURES = "x11 polkit systemd pam gobject-introspection-data"
SRC_URI[archive.sha256sum] = "3ee65b75b1afd8bcca0a2a03da9b2884787ed40e257a881d9aa6ef7c8727602f"
DEPENDS += " \
sassc-native \
"
EXTRA_OEMESON += " \
-Dextension_set=all \
-Dclassic_mode=true \
"
do_install:append() {
# enable gnome-classic session for wayland
install -d ${D}${datadir}/wayland-sessions
install -m644 ${D}${datadir}/xsessions/gnome-classic.desktop ${D}${datadir}/wayland-sessions/
}
RDEPENDS:${PN} += "gnome-shell"
FILES:${PN} += " \
${datadir}/gnome-shell \
${datadir}/gnome-session \
${datadir}/wayland-sessions \
${datadir}/xsessions \
"
## Instruction:
gnome-shell-extensions: Stop copying gnome-classic session to wayland
It was me who introduced gnome-classic for wayland in
342de9bf51f27362e7d1d4f1d55d8548cfa7504c.
In the commit message I wrote:
"Enable gnome-classic session for wayland
Wonder why upstream does not ship this: Session runs perfectly fine"
Seems upstream was listening :) With 42.0 they ship two wayland sessions but
now I do not understand why there are two: Sessions (Try)Exec are identical -
only displayed texts are different. Maybe it's for gdm's sake which I never
used so far.
Signed-off-by: Andreas Müller <[email protected]>
Signed-off-by: Khem Raj <[email protected]>
## Code After:
SUMMARY = "GNOME Shell Extensions"
LICENSE = "GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4cb3a392cbf81a9e685ec13b88c4c101"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase gettext gsettings features_check
REQUIRED_DISTRO_FEATURES = "x11 polkit systemd pam gobject-introspection-data"
SRC_URI[archive.sha256sum] = "3ee65b75b1afd8bcca0a2a03da9b2884787ed40e257a881d9aa6ef7c8727602f"
DEPENDS += " \
sassc-native \
"
EXTRA_OEMESON += " \
-Dextension_set=all \
-Dclassic_mode=true \
"
RDEPENDS:${PN} += "gnome-shell"
FILES:${PN} += " \
${datadir}/gnome-shell \
${datadir}/gnome-session \
${datadir}/wayland-sessions \
${datadir}/xsessions \
"
|
bfa446d5fc399b685419ad00c376bcd9a13a8605 | mediacrush/decorators.py | mediacrush/decorators.py | from flask import jsonify, request
from functools import wraps
import json
def json_output(f):
@wraps(f)
def wrapper(*args, **kwargs):
def jsonify_wrap(obj):
callback = request.args.get('callback', False)
jsonification = jsonify(obj)
if callback:
jsonification.data = "%s(%s);" % (callback, jsonification.data) # Alter the response
return jsonification
result = f(*args, **kwargs)
if isinstance(result, tuple):
return jsonify_wrap(result[0]), result[1]
return jsonify_wrap(result)
return wrapper
def cors(f):
@wraps(f)
def wrapper(*args, **kwargs):
res = f(*args, **kwargs)
if request.headers.get('x-cors-status', False):
if isinstance(res, tuple):
json_text = res[0].data
code = res[1]
else:
json_text = res.data
code = 200
o = json.loads(json_text)
o['x-status'] = code
return jsonify(o)
return res
return wrapper
| from flask import jsonify, request
from functools import wraps
import json
jsonp_notice = """
// MediaCrush supports Cross Origin Resource Sharing requests.
// There is no reason to use JSONP; please use CORS instead.
// For more information, see https://mediacru.sh/docs/api"""
def json_output(f):
@wraps(f)
def wrapper(*args, **kwargs):
def jsonify_wrap(obj):
callback = request.args.get('callback', False)
jsonification = jsonify(obj)
if callback:
jsonification.data = "%s(%s);\n%s" % (callback, jsonification.data, jsonp_notice) # Alter the response
return jsonification
result = f(*args, **kwargs)
if isinstance(result, tuple):
return jsonify_wrap(result[0]), result[1]
return jsonify_wrap(result)
return wrapper
def cors(f):
@wraps(f)
def wrapper(*args, **kwargs):
res = f(*args, **kwargs)
if request.headers.get('x-cors-status', False):
if isinstance(res, tuple):
json_text = res[0].data
code = res[1]
else:
json_text = res.data
code = 200
o = json.loads(json_text)
o['x-status'] = code
return jsonify(o)
return res
return wrapper
| Add JSONP notice to API | Add JSONP notice to API | Python | mit | nerdzeu/NERDZCrush,MediaCrush/MediaCrush,roderickm/MediaCrush,roderickm/MediaCrush,nerdzeu/NERDZCrush,MediaCrush/MediaCrush,nerdzeu/NERDZCrush,roderickm/MediaCrush | python | ## Code Before:
from flask import jsonify, request
from functools import wraps
import json
def json_output(f):
@wraps(f)
def wrapper(*args, **kwargs):
def jsonify_wrap(obj):
callback = request.args.get('callback', False)
jsonification = jsonify(obj)
if callback:
jsonification.data = "%s(%s);" % (callback, jsonification.data) # Alter the response
return jsonification
result = f(*args, **kwargs)
if isinstance(result, tuple):
return jsonify_wrap(result[0]), result[1]
return jsonify_wrap(result)
return wrapper
def cors(f):
@wraps(f)
def wrapper(*args, **kwargs):
res = f(*args, **kwargs)
if request.headers.get('x-cors-status', False):
if isinstance(res, tuple):
json_text = res[0].data
code = res[1]
else:
json_text = res.data
code = 200
o = json.loads(json_text)
o['x-status'] = code
return jsonify(o)
return res
return wrapper
## Instruction:
Add JSONP notice to API
## Code After:
from flask import jsonify, request
from functools import wraps
import json
jsonp_notice = """
// MediaCrush supports Cross Origin Resource Sharing requests.
// There is no reason to use JSONP; please use CORS instead.
// For more information, see https://mediacru.sh/docs/api"""
def json_output(f):
@wraps(f)
def wrapper(*args, **kwargs):
def jsonify_wrap(obj):
callback = request.args.get('callback', False)
jsonification = jsonify(obj)
if callback:
jsonification.data = "%s(%s);\n%s" % (callback, jsonification.data, jsonp_notice) # Alter the response
return jsonification
result = f(*args, **kwargs)
if isinstance(result, tuple):
return jsonify_wrap(result[0]), result[1]
return jsonify_wrap(result)
return wrapper
def cors(f):
@wraps(f)
def wrapper(*args, **kwargs):
res = f(*args, **kwargs)
if request.headers.get('x-cors-status', False):
if isinstance(res, tuple):
json_text = res[0].data
code = res[1]
else:
json_text = res.data
code = 200
o = json.loads(json_text)
o['x-status'] = code
return jsonify(o)
return res
return wrapper
|
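The callback-wrapping path of `json_output` above is small enough to demonstrate on its own. Below is a standalone sketch (a hypothetical helper, not MediaCrush code) of what a JSONP response body looks like after the change.

```python
def wrap_jsonp(json_text, callback, notice=""):
    # Mirror the decorator's JSONP branch: wrap the serialized JSON in the
    # caller-supplied callback and append the (optional) deprecation notice,
    # just as jsonification.data is rewritten in the decorator above.
    return "%s(%s);\n%s" % (callback, json_text, notice)
```

A request with `?callback=cb` for the payload `{"a": 1}` would thus receive `cb({"a": 1});` followed by the notice comment, which a `<script src=...>` include executes, while CORS clients keep receiving plain JSON.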
906610fec25376973026e655d209a906666c383a | php/conf/conf.ini | php/conf/conf.ini | [GLOBAL_CONFIGURATION]
www.user = apache
www.group = apache
sql.host = localhost
sql.user = nobody
sql.passwd =
sql.database = doc_editor
vcs.type = svn
data.path = "data/"
php.bin = "/local/php/bin/php"
phd.bin = "/local/php/bin/phd"
xmllint.bin = "/usr/bin/xmllint" | [GLOBAL_CONFIGURATION]
www.user = apache
www.group = apache
sql.host = localhost
sql.user = nobody
sql.passwd =
sql.database = doc_editor
vcs.type = svn
data.path = "data/"
php.bin = "/usr/bin/php"
phd.bin = "/local/oldphp/bin/phd"
xmllint.bin = "/usr/bin/xmllint"
| Fix PHP & PHD binary | Fix PHP & PHD binary
| INI | lgpl-2.1 | vrkansagara/web-doc-editor,vrkansagara/web-doc-editor,vrkansagara/web-doc-editor,vrkansagara/web-doc-editor | ini | ## Code Before:
[GLOBAL_CONFIGURATION]
www.user = apache
www.group = apache
sql.host = localhost
sql.user = nobody
sql.passwd =
sql.database = doc_editor
vcs.type = svn
data.path = "data/"
php.bin = "/local/php/bin/php"
phd.bin = "/local/php/bin/phd"
xmllint.bin = "/usr/bin/xmllint"
## Instruction:
Fix PHP & PHD binary
## Code After:
[GLOBAL_CONFIGURATION]
www.user = apache
www.group = apache
sql.host = localhost
sql.user = nobody
sql.passwd =
sql.database = doc_editor
vcs.type = svn
data.path = "data/"
php.bin = "/usr/bin/php"
phd.bin = "/local/oldphp/bin/phd"
xmllint.bin = "/usr/bin/xmllint"
|
8a88563d3409afd8dbf14563b62dc6a357657269 | .travis.yml | .travis.yml | os:
- linux
addons:
apt:
sources:
- kubuntu-backports
packages:
- cmake
language: c
compiler:
- gcc
- clang
install:
# powercap dependency
- git clone [email protected]:powercap/powercap.git libpowercap
- cd libpowercap
- mkdir _build
- cd _build
- cmake -DCMAKE_INSTALL_PREFIX=_install ..
- make
- make install
- cd ../..
# libmsr dependency (force version 0.3.0) (force Ivy Bridge architecture)
- git clone [email protected]:LLNL/libmsr.git libmsr_src
- cd libmsr_src
- git checkout v0.3.0
# libmsr only claims to support gcc, so don't let it build with clang
- CC=gcc ./install.sh _install -f3E
- cd ..
script:
- mkdir _build
- cd _build
- PKG_CONFIG_PATH="../libpowercap/_build/_install/lib/pkgconfig" cmake -DCMAKE_PREFIX_PATH="`pwd`/../libmsr_src/_install/" ..
- make
| os:
- linux
addons:
apt:
sources:
- kubuntu-backports
packages:
- cmake
language: c
compiler:
- gcc
- clang
install:
# powercap dependency
- git clone https://github.com/powercap/powercap.git libpowercap
- cd libpowercap
- mkdir _build
- cd _build
- cmake -DCMAKE_INSTALL_PREFIX=_install ..
- make
- make install
- cd ../..
# libmsr dependency (force version 0.3.0) (force Ivy Bridge architecture)
- git clone https://github.com/LLNL/libmsr.git libmsr_src
- cd libmsr_src
- git checkout v0.3.0
# libmsr only claims to support gcc, so don't let it build with clang
- CC=gcc ./install.sh _install -f3E
- cd ..
script:
- mkdir _build
- cd _build
- PKG_CONFIG_PATH="../libpowercap/_build/_install/lib/pkgconfig" cmake -DCMAKE_PREFIX_PATH="`pwd`/../libmsr_src/_install/" ..
- make
| Use https to clone dependencies for Travis CI | Use https to clone dependencies for Travis CI
| YAML | bsd-3-clause | powercap/raplcap,powercap/raplcap | yaml | ## Code Before:
os:
- linux
addons:
apt:
sources:
- kubuntu-backports
packages:
- cmake
language: c
compiler:
- gcc
- clang
install:
# powercap dependency
- git clone [email protected]:powercap/powercap.git libpowercap
- cd libpowercap
- mkdir _build
- cd _build
- cmake -DCMAKE_INSTALL_PREFIX=_install ..
- make
- make install
- cd ../..
# libmsr dependency (force version 0.3.0) (force Ivy Bridge architecture)
- git clone [email protected]:LLNL/libmsr.git libmsr_src
- cd libmsr_src
- git checkout v0.3.0
# libmsr only claims to support gcc, so don't let it build with clang
- CC=gcc ./install.sh _install -f3E
- cd ..
script:
- mkdir _build
- cd _build
- PKG_CONFIG_PATH="../libpowercap/_build/_install/lib/pkgconfig" cmake -DCMAKE_PREFIX_PATH="`pwd`/../libmsr_src/_install/" ..
- make
## Instruction:
Use https to clone dependencies for Travis CI
## Code After:
os:
- linux
addons:
apt:
sources:
- kubuntu-backports
packages:
- cmake
language: c
compiler:
- gcc
- clang
install:
# powercap dependency
- git clone https://github.com/powercap/powercap.git libpowercap
- cd libpowercap
- mkdir _build
- cd _build
- cmake -DCMAKE_INSTALL_PREFIX=_install ..
- make
- make install
- cd ../..
# libmsr dependency (force version 0.3.0) (force Ivy Bridge architecture)
- git clone https://github.com/LLNL/libmsr.git libmsr_src
- cd libmsr_src
- git checkout v0.3.0
# libmsr only claims to support gcc, so don't let it build with clang
- CC=gcc ./install.sh _install -f3E
- cd ..
script:
- mkdir _build
- cd _build
- PKG_CONFIG_PATH="../libpowercap/_build/_install/lib/pkgconfig" cmake -DCMAKE_PREFIX_PATH="`pwd`/../libmsr_src/_install/" ..
- make
|
2ea25e940ebd77ce1910e47cf61e7b834411de05 | app/controllers/HomeController.php | app/controllers/HomeController.php | <?php
class HomeController extends BaseController {
protected $page;
protected $issue;
function __construct(Page $page, Issue $issue)
{
$this->page = $page;
$this->issue = $issue;
}
public function showPage($slug)
{
$page = $this->page->where('slug', $slug)->first();
if (empty($page) || ! $page->is_visible) {
return Response::make('Page not found', 404);
}
return View::make('page')->with($page->getContent());
}
public function index()
{
$issue = $this->issue->getCurrent();
if ( ! $issue) {
return $this->showPage('index');
}
return View::make('cover')->with('issue', $issue);
}
}
| <?php
class HomeController extends BaseController {
protected $page;
protected $issue;
function __construct(Page $page, Issue $issue)
{
$this->page = $page;
$this->issue = $issue;
}
public function showPage($slug)
{
$page = $this->page->where('slug', $slug)->first();
if (empty($page) || ! $page->is_visible) {
return Response::make('Page not found', 404);
}
return View::make('page')->with($page->getContent());
}
public function index()
{
$issue = $this->issue->getCurrent();
if ( ! $issue) {
return $this->showPage('index');
}
$issue->load('stories');
return View::make('cover')->with('issue', $issue);
}
}
| Add eager loading for TOC | Add eager loading for TOC
| PHP | bsd-2-clause | chipotle/quill,chipotle/quill,chipotle/quill,chipotle/quill | php | ## Code Before:
<?php
class HomeController extends BaseController {
protected $page;
protected $issue;
function __construct(Page $page, Issue $issue)
{
$this->page = $page;
$this->issue = $issue;
}
public function showPage($slug)
{
$page = $this->page->where('slug', $slug)->first();
if (empty($page) || ! $page->is_visible) {
return Response::make('Page not found', 404);
}
return View::make('page')->with($page->getContent());
}
public function index()
{
$issue = $this->issue->getCurrent();
if ( ! $issue) {
return $this->showPage('index');
}
return View::make('cover')->with('issue', $issue);
}
}
## Instruction:
Add eager loading for TOC
## Code After:
<?php
class HomeController extends BaseController {
protected $page;
protected $issue;
function __construct(Page $page, Issue $issue)
{
$this->page = $page;
$this->issue = $issue;
}
public function showPage($slug)
{
$page = $this->page->where('slug', $slug)->first();
if (empty($page) || ! $page->is_visible) {
return Response::make('Page not found', 404);
}
return View::make('page')->with($page->getContent());
}
public function index()
{
$issue = $this->issue->getCurrent();
if ( ! $issue) {
return $this->showPage('index');
}
$issue->load('stories');
return View::make('cover')->with('issue', $issue);
}
}
|
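`$issue->load('stories')` above asks the ORM to fetch the related stories with one upfront query instead of on first access inside the view; for collections this avoids the classic N+1 pattern. The idea, reduced to a language-neutral Python sketch with a made-up in-memory database that counts queries:

```python
class FakeDB:
    """Toy stand-in for an ORM connection; it only counts issued queries."""

    def __init__(self, stories_by_issue):
        self.stories_by_issue = stories_by_issue
        self.queries = 0

    def stories_for(self, issue_id):
        # Lazy loading: every call is one more round-trip (N+1 pattern).
        self.queries += 1
        return self.stories_by_issue[issue_id]

    def stories_for_all(self, issue_ids):
        # Eager loading: one query fetches the relation for all rows at once.
        self.queries += 1
        return {i: self.stories_by_issue[i] for i in issue_ids}
```

Rendering N issues lazily costs N queries here, while the eager variant always costs one, which is why the controller preloads the relation before handing the issue to the view.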
908121c3b52ca843aa01504f211c56bd32859360 | scripts/golang.sh | scripts/golang.sh | set -o errexit
set -o nounset
GO_VERSION="1.7.3"
GOROOT=$HOME/go
GOPATH=$HOME/workspace/go
echo "Install Go "$GO_VERSION" at "$GOROOT
cd /tmp
wget https://storage.googleapis.com/golang/go$GO_VERSION.linux-amd64.tar.gz
tar -xvf go$GO_VERSION.linux-amd64.tar.gz
rm -rf $GOROOT
mkdir -p $GOROOT
mv ./go/* $GOROOT
go get -u github.com/NeowayLabs/nash/cmd/nash
| set -o errexit
set -o nounset
GO_VERSION="1.7.3"
GOROOT=$HOME/go
GOPATH=$HOME/workspace/go
echo "Install Go "$GO_VERSION" at "$GOROOT
cd /tmp
wget https://storage.googleapis.com/golang/go$GO_VERSION.linux-amd64.tar.gz
tar -xvf go$GO_VERSION.linux-amd64.tar.gz
rm -rf $GOROOT
mkdir -p $GOROOT
mv ./go/* $GOROOT
go get -u github.com/NeowayLabs/nash/cmd/nash
ln -s $GOPATH/bin/nash /bin/nash
| Add nash install with link to make stuff easier | Add nash install with link to make stuff easier
| Shell | mit | katcipis/workbench,katcipis/mise.en.place | shell | ## Code Before:
set -o errexit
set -o nounset
GO_VERSION="1.7.3"
GOROOT=$HOME/go
GOPATH=$HOME/workspace/go
echo "Install Go "$GO_VERSION" at "$GOROOT
cd /tmp
wget https://storage.googleapis.com/golang/go$GO_VERSION.linux-amd64.tar.gz
tar -xvf go$GO_VERSION.linux-amd64.tar.gz
rm -rf $GOROOT
mkdir -p $GOROOT
mv ./go/* $GOROOT
go get -u github.com/NeowayLabs/nash/cmd/nash
## Instruction:
Add nash install with link to make stuff easier
## Code After:
set -o errexit
set -o nounset
GO_VERSION="1.7.3"
GOROOT=$HOME/go
GOPATH=$HOME/workspace/go
echo "Install Go "$GO_VERSION" at "$GOROOT
cd /tmp
wget https://storage.googleapis.com/golang/go$GO_VERSION.linux-amd64.tar.gz
tar -xvf go$GO_VERSION.linux-amd64.tar.gz
rm -rf $GOROOT
mkdir -p $GOROOT
mv ./go/* $GOROOT
go get -u github.com/NeowayLabs/nash/cmd/nash
ln -s $GOPATH/bin/nash /bin/nash
|
8dc0e36a9bc6a3c182af2ace93b9b766dfbfa80e | bosh-monitor/spec/support/buffered_logger.rb | bosh-monitor/spec/support/buffered_logger.rb | require 'rspec'
require 'logger'
require 'mono_logger'
require 'logging'
module BufferedLogger
# returns the log as a string
def log_string
@test_log_buffer.string
end
def logger
@test_logger
end
end
RSpec.configure do |c|
c.include(BufferedLogger)
c.before do
@test_log_buffer = StringIO.new
@test_logger = Logging.logger(@test_log_buffer)
allow(MonoLogger).to receive(:new).and_return(@test_logger)
allow(Logging).to receive(:logger).and_return(@test_logger)
allow(Logger).to receive(:new).and_return(@test_logger)
end
c.after do |example|
# Print logs if the test failed
unless example.exception.nil?
STDERR.write("\nTest Failed: '#{example.full_description}'\nTest Logs:\n#{@test_log_buffer.string}\n")
end
end
end
| require 'rspec'
require 'logger'
require 'logging'
module BufferedLogger
# returns the log as a string
def log_string
@test_log_buffer.string
end
def logger
@test_logger
end
end
RSpec.configure do |c|
c.include(BufferedLogger)
c.before do
@test_log_buffer = StringIO.new
@test_logger = Logging.logger(@test_log_buffer)
allow(Logging).to receive(:logger).and_return(@test_logger)
allow(Logger).to receive(:new).and_return(@test_logger)
end
c.after do |example|
# Print logs if the test failed
unless example.exception.nil?
STDERR.write("\nTest Failed: '#{example.full_description}'\nTest Logs:\n#{@test_log_buffer.string}\n")
end
end
end
| Switch bosh-monitor to use Logging instead of MonoLogger | Switch bosh-monitor to use Logging instead of MonoLogger
[#81611698]
Signed-off-by: Karl Isenberg <[email protected]>
| Ruby | apache-2.0 | barthy1/bosh,barthy1/bosh,barthy1/bosh,barthy1/bosh | ruby | ## Code Before:
require 'rspec'
require 'logger'
require 'mono_logger'
require 'logging'
module BufferedLogger
# returns the log as a string
def log_string
@test_log_buffer.string
end
def logger
@test_logger
end
end
RSpec.configure do |c|
c.include(BufferedLogger)
c.before do
@test_log_buffer = StringIO.new
@test_logger = Logging.logger(@test_log_buffer)
allow(MonoLogger).to receive(:new).and_return(@test_logger)
allow(Logging).to receive(:logger).and_return(@test_logger)
allow(Logger).to receive(:new).and_return(@test_logger)
end
c.after do |example|
# Print logs if the test failed
unless example.exception.nil?
STDERR.write("\nTest Failed: '#{example.full_description}'\nTest Logs:\n#{@test_log_buffer.string}\n")
end
end
end
## Instruction:
Switch bosh-monitor to use Logging instead of MonoLogger
[#81611698]
Signed-off-by: Karl Isenberg <[email protected]>
## Code After:
require 'rspec'
require 'logger'
require 'logging'
module BufferedLogger
# returns the log as a string
def log_string
@test_log_buffer.string
end
def logger
@test_logger
end
end
RSpec.configure do |c|
c.include(BufferedLogger)
c.before do
@test_log_buffer = StringIO.new
@test_logger = Logging.logger(@test_log_buffer)
allow(Logging).to receive(:logger).and_return(@test_logger)
allow(Logger).to receive(:new).and_return(@test_logger)
end
c.after do |example|
# Print logs if the test failed
unless example.exception.nil?
STDERR.write("\nTest Failed: '#{example.full_description}'\nTest Logs:\n#{@test_log_buffer.string}\n")
end
end
end
|
fc15beb93471837ed86039ae88ce9e264763c4e1 | app/assets/javascripts/templates/metadata_accordion.handlebars | app/assets/javascripts/templates/metadata_accordion.handlebars | <div class="accordion">
{{#each category in content}}
<div class="accordion-group">
<div class="accordion-heading">
<a {{bindAttr href="category.collapsibleHref"}} class="accordion-toggle" data-toggle="collapse" data-parent="metadata-accordion">
{{category.name}}
</a>
</div>
<div {{bindAttr id="category.collapsibleId"}} class="accordion-body collapse">
<div class="accordion-inner">
<select multiple="multiple" style="width:180px">
{{#each value in category.metadataValues}}
<option {{bindAttr value="value.id"}}>{{value.text}}</option>
{{/each}}
</select>
</div>
</div>
</div>
{{/each}}
</div>
| <div class="accordion">
{{#each category in content}}
<div class="accordion-group">
<div class="accordion-heading">
<a {{bindAttr href="category.collapsibleHref"}} class="accordion-toggle" data-toggle="collapse" data-parent="metadata-accordion">
{{category.name}}
</a>
</div>
<div {{bindAttr id="category.collapsibleId"}} class="accordion-body collapse">
<div class="accordion-inner">
{{view Em.Select
contentBinding="category.metadataValues"
optionValuePath="content.id"
optionLabelPath="content.text"
multiple="true"
}}
</div>
</div>
</div>
{{/each}}
</div>
| Use Ember.Select for listing of metadata | Use Ember.Select for listing of metadata
| Handlebars | mit | textlab/rglossa,textlab/glossa,textlab/glossa,textlab/rglossa,textlab/rglossa,textlab/glossa,textlab/rglossa,textlab/glossa,textlab/rglossa,textlab/glossa | handlebars | ## Code Before:
<div class="accordion">
{{#each category in content}}
<div class="accordion-group">
<div class="accordion-heading">
<a {{bindAttr href="category.collapsibleHref"}} class="accordion-toggle" data-toggle="collapse" data-parent="metadata-accordion">
{{category.name}}
</a>
</div>
<div {{bindAttr id="category.collapsibleId"}} class="accordion-body collapse">
<div class="accordion-inner">
<select multiple="multiple" style="width:180px">
{{#each value in category.metadataValues}}
<option {{bindAttr value="value.id"}}>{{value.text}}</option>
{{/each}}
</select>
</div>
</div>
</div>
{{/each}}
</div>
## Instruction:
Use Ember.Select for listing of metadata
## Code After:
<div class="accordion">
{{#each category in content}}
<div class="accordion-group">
<div class="accordion-heading">
<a {{bindAttr href="category.collapsibleHref"}} class="accordion-toggle" data-toggle="collapse" data-parent="metadata-accordion">
{{category.name}}
</a>
</div>
<div {{bindAttr id="category.collapsibleId"}} class="accordion-body collapse">
<div class="accordion-inner">
{{view Em.Select
contentBinding="category.metadataValues"
optionValuePath="content.id"
optionLabelPath="content.text"
multiple="true"
}}
</div>
</div>
</div>
{{/each}}
</div>
|
a1a796984d1f48b1a3f0eba46c28b36d866fb178 | master/releases/index.md | master/releases/index.md | ---
title: Releases
---
The following table shows component versioning for Calico **{{ page.version }}**.
Use the version selector at the top-right of this page to view a different release.
## master
| Component | Version |
|------------------------|---------|
| felix | latest |
| calicoctl | latest |
| calico/node | latest |
| calico/cni | latest |
| libcalico | latest |
| calico-bird | latest |
| libnetwork-plugin | latest | | ---
title: Releases
---
The following table shows component versioning for Calico **{{ page.version }}**.
Use the version selector at the top-right of this page to view a different release.
## master
| Component | Version |
|------------------------|---------|
| felix | latest |
| calicoctl | latest |
| calico/node | latest |
| calico/cni | latest |
| libcalico | latest |
| calico-bird | latest |
| libnetwork-plugin | latest |
| calico/kube-policy-controller | latest |
| Add kube-policy-controller's version to the master versions file. | Add kube-policy-controller's version to the master versions file.
| Markdown | apache-2.0 | bcreane/calico,gunjan5/calico,heschlie/calico,gunjan5/calico,gunjan5/calico,caseydavenport/calico,gunjan5/calico,tomdee/calico,heschlie/calico,bcreane/calico,tomdee/calico,caseydavenport/calico,heschlie/calico,heschlie/calico,tomdee/calico,bcreane/calico,bcreane/calico,caseydavenport/calico,gunjan5/calico,tomdee/calico,heschlie/calico,bcreane/calico,tomdee/calico,gunjan5/calico,caseydavenport/calico,caseydavenport/calico,bcreane/calico,caseydavenport/calico,heschlie/calico | markdown | ## Code Before:
---
title: Releases
---
The following table shows component versioning for Calico **{{ page.version }}**.
Use the version selector at the top-right of this page to view a different release.
## master
| Component | Version |
|------------------------|---------|
| felix | latest |
| calicoctl | latest |
| calico/node | latest |
| calico/cni | latest |
| libcalico | latest |
| calico-bird | latest |
| libnetwork-plugin | latest |
## Instruction:
Add kube-policy-controller's version to the master versions file.
## Code After:
---
title: Releases
---
The following table shows component versioning for Calico **{{ page.version }}**.
Use the version selector at the top-right of this page to view a different release.
## master
| Component | Version |
|------------------------|---------|
| felix | latest |
| calicoctl | latest |
| calico/node | latest |
| calico/cni | latest |
| libcalico | latest |
| calico-bird | latest |
| libnetwork-plugin | latest |
| calico/kube-policy-controller | latest |
|
e07f095944a0a6edd125d75f4980a45fc10c6dfd | wiblog/util/comments.py | wiblog/util/comments.py |
from fragdev.contact import validate_ham
from django.forms import ModelForm
from django import forms
from wiblog.models import Comment
class CommentForm(ModelForm):
verify = forms.CharField(label='Anti-spam: Type in the word "power"',validators=[validate_ham],max_length=5)
class Meta:
model = Comment
fields = ('name', 'url', 'comment')
|
from django.forms import ModelForm
from django import forms
from fragdev.util.validate_ham import ANTI_SPAM, validate_ham
from wiblog.models import Comment
class CommentForm(ModelForm):
verify = forms.CharField(label='Anti-spam: Type in the word "{}"'\
.format(ANTI_SPAM),
validators=[validate_ham],
max_length=len(ANTI_SPAM))
class Meta:
model = Comment
fields = ('name', 'url', 'comment')
| Fix wiblog's use of the anti-spam validator | Fix wiblog's use of the anti-spam validator
| Python | agpl-3.0 | lo-windigo/fragdev,lo-windigo/fragdev | python | ## Code Before:
from fragdev.contact import validate_ham
from django.forms import ModelForm
from django import forms
from wiblog.models import Comment
class CommentForm(ModelForm):
verify = forms.CharField(label='Anti-spam: Type in the word "power"',validators=[validate_ham],max_length=5)
class Meta:
model = Comment
fields = ('name', 'url', 'comment')
## Instruction:
Fix wiblog's use of the anti-spam validator
## Code After:
from django.forms import ModelForm
from django import forms
from fragdev.util.validate_ham import ANTI_SPAM, validate_ham
from wiblog.models import Comment
class CommentForm(ModelForm):
verify = forms.CharField(label='Anti-spam: Type in the word "{}"'\
.format(ANTI_SPAM),
validators=[validate_ham],
max_length=len(ANTI_SPAM))
class Meta:
model = Comment
fields = ('name', 'url', 'comment')
|
67a6a6ec06462dcd5b6f88f7d6e172e332d31936 | .github/PULL_REQUEST_TEMPLATE.md | .github/PULL_REQUEST_TEMPLATE.md | <!--
# Title Line Template: [Brief statement describing what this pull request fixes.]
Use the title line as the title of your pull request, then delete these lines.
-->
## Description
Include a high-level description for what the pull request fixes and links to Github issues it resolves.
## Additional information
* Relevant research and support documents
* Screen shot images
* Notes
* etc.
## Definition of Done
- [ ] Content/documentation reviewed by Julian or someone in the Content Team
- [ ] UX reviewed by Gary or someone the Design team
- [ ] Code reviewed by one of the core developers
- [ ] Acceptance Testing
- [ ] Cross-browser tested against standard browser matrix (TBD)
- [ ] Tested on multiple devices (TBD)
- [ ] HTML5 validation (CircleCI)
- [ ] Accessibility testing & WCAG2 compliance (manual/auto TBD)
- [ ] Stakeholder/PO review
- [ ] CHANGELOG updated
| <!--
# Title Line Template: [Brief statement describing what this pull request fixes.]
Use the title line as the title of your pull request, then delete these lines.
-->
## Description
Include a high-level description for what the pull request fixes and links to Github issues it resolves.
## Additional information
* Relevant research and support documents
* Screen shot images
* Notes
* etc.
## Definition of Done
- [ ] Content/documentation reviewed by Julian or someone in the Content Team
- [ ] UX reviewed by Gary or someone the Design team
- [ ] Code reviewed by one of the core developers
- [ ] Acceptance Testing
- [ ] Cross-browser tested against standard browser matrix (TBD)
- [ ] Tested on multiple devices (TBD)
- [ ] HTML5 validation (CircleCI)
- [ ] Accessibility testing & WCAG2 compliance (`gulp test`)
- [ ] Stakeholder/PO review
- [ ] CHANGELOG updated
| Update DoD checklist on PR template to include Pa11y tests | Update DoD checklist on PR template to include Pa11y tests
| Markdown | mit | AusDTO/gov-au-ui-kit,AusDTO/gov-au-ui-kit,AusDTO/gov-au-ui-kit | markdown | ## Code Before:
<!--
# Title Line Template: [Brief statement describing what this pull request fixes.]
Use the title line as the title of your pull request, then delete these lines.
-->
## Description
Include a high-level description for what the pull request fixes and links to Github issues it resolves.
## Additional information
* Relevant research and support documents
* Screen shot images
* Notes
* etc.
## Definition of Done
- [ ] Content/documentation reviewed by Julian or someone in the Content Team
- [ ] UX reviewed by Gary or someone the Design team
- [ ] Code reviewed by one of the core developers
- [ ] Acceptance Testing
- [ ] Cross-browser tested against standard browser matrix (TBD)
- [ ] Tested on multiple devices (TBD)
- [ ] HTML5 validation (CircleCI)
- [ ] Accessibility testing & WCAG2 compliance (manual/auto TBD)
- [ ] Stakeholder/PO review
- [ ] CHANGELOG updated
## Instruction:
Update DoD checklist on PR template to include Pa11y tests
## Code After:
<!--
# Title Line Template: [Brief statement describing what this pull request fixes.]
Use the title line as the title of your pull request, then delete these lines.
-->
## Description
Include a high-level description for what the pull request fixes and links to Github issues it resolves.
## Additional information
* Relevant research and support documents
* Screen shot images
* Notes
* etc.
## Definition of Done
- [ ] Content/documentation reviewed by Julian or someone in the Content Team
- [ ] UX reviewed by Gary or someone the Design team
- [ ] Code reviewed by one of the core developers
- [ ] Acceptance Testing
- [ ] Cross-browser tested against standard browser matrix (TBD)
- [ ] Tested on multiple devices (TBD)
- [ ] HTML5 validation (CircleCI)
- [ ] Accessibility testing & WCAG2 compliance (`gulp test`)
- [ ] Stakeholder/PO review
- [ ] CHANGELOG updated
|
c5fd49edb029cf0fb290bb957d9bdc138e156402 | datasets/templates/datasets/taxonomy_node_mini_info.html | datasets/templates/datasets/taxonomy_node_mini_info.html | {% load dataset_templatetags %}
<table class="ui unstackable table" width="100%">
<tbody>
<td>Hierarchy</td>
<td>
<div class="ui list">
{% for hierarchy_path in node.hierarchy_paths %}
<div class="item">
<div class="ui horizontal list">
<div class="item">
<i class="tree icon"></i>
</div>
{% for node_id in hierarchy_path %}
{% taxonomy_node_minimal_data dataset node_id as sub_node_data %}
<div class="item" style="margin-left:0px;margin-right:5px;">
>
<a href="" target="_blank">{{ sub_node_data.name }}</a>
</div>
{% endfor %}
</div>
</div>
{% empty %}
<div class="item">-</div>
{% endfor %}
</div>
</td>
</tr>
<tr>
<td class="three wide">Description</td>
<td>{{ node.description }}</td>
</tr>
{% if node.freesound_examples %}
<tr><td>Examples</td>
<td>
{% for fsid in node.freesound_examples %}
{{ fsid| fs_embed | safe }}
{% endfor %}
</td>
</tr>
{% endif %}
</div>
</div>
</tbody>
</table> | {% load dataset_templatetags %}
<table class="ui unstackable table" width="100%">
<tbody>
<td>Hierarchy</td>
<td>
<div class="ui list">
{% for hierarchy_path in node.hierarchy_paths %}
<div class="item">
<div class="ui horizontal list">
<div class="item">
<i class="tree icon"></i>
</div>
{% for node_id in hierarchy_path %}
{% taxonomy_node_minimal_data dataset node_id as sub_node_data %}
<div class="item" style="margin-left:0px;margin-right:5px;">
>
<a href="{% url 'dataset-explore-taxonomy-node' dataset.short_name sub_node_data.url_id %}" target="_blank">{{ sub_node_data.name }}</a>
</div>
{% endfor %}
</div>
</div>
{% empty %}
<div class="item">-</div>
{% endfor %}
</div>
</td>
</tr>
<tr>
<td class="three wide">Description</td>
<td>{{ node.description }}</td>
</tr>
{% if node.freesound_examples %}
<tr><td>Examples</td>
<td>
{% for fsid in node.freesound_examples %}
{{ fsid| fs_embed | safe }}
{% endfor %}
</td>
</tr>
{% endif %}
</div>
</div>
</tbody>
</table> | Add link to categories in popup | Add link to categories in popup
| HTML | agpl-3.0 | MTG/freesound-datasets,MTG/freesound-datasets,MTG/freesound-datasets,MTG/freesound-datasets | html | ## Code Before:
{% load dataset_templatetags %}
<table class="ui unstackable table" width="100%">
<tbody>
<td>Hierarchy</td>
<td>
<div class="ui list">
{% for hierarchy_path in node.hierarchy_paths %}
<div class="item">
<div class="ui horizontal list">
<div class="item">
<i class="tree icon"></i>
</div>
{% for node_id in hierarchy_path %}
{% taxonomy_node_minimal_data dataset node_id as sub_node_data %}
<div class="item" style="margin-left:0px;margin-right:5px;">
>
<a href="" target="_blank">{{ sub_node_data.name }}</a>
</div>
{% endfor %}
</div>
</div>
{% empty %}
<div class="item">-</div>
{% endfor %}
</div>
</td>
</tr>
<tr>
<td class="three wide">Description</td>
<td>{{ node.description }}</td>
</tr>
{% if node.freesound_examples %}
<tr><td>Examples</td>
<td>
{% for fsid in node.freesound_examples %}
{{ fsid| fs_embed | safe }}
{% endfor %}
</td>
</tr>
{% endif %}
</div>
</div>
</tbody>
</table>
## Instruction:
Add link to categories in popup
## Code After:
{% load dataset_templatetags %}
<table class="ui unstackable table" width="100%">
<tbody>
<td>Hierarchy</td>
<td>
<div class="ui list">
{% for hierarchy_path in node.hierarchy_paths %}
<div class="item">
<div class="ui horizontal list">
<div class="item">
<i class="tree icon"></i>
</div>
{% for node_id in hierarchy_path %}
{% taxonomy_node_minimal_data dataset node_id as sub_node_data %}
<div class="item" style="margin-left:0px;margin-right:5px;">
>
<a href="{% url 'dataset-explore-taxonomy-node' dataset.short_name sub_node_data.url_id %}" target="_blank">{{ sub_node_data.name }}</a>
</div>
{% endfor %}
</div>
</div>
{% empty %}
<div class="item">-</div>
{% endfor %}
</div>
</td>
</tr>
<tr>
<td class="three wide">Description</td>
<td>{{ node.description }}</td>
</tr>
{% if node.freesound_examples %}
<tr><td>Examples</td>
<td>
{% for fsid in node.freesound_examples %}
{{ fsid| fs_embed | safe }}
{% endfor %}
</td>
</tr>
{% endif %}
</div>
</div>
</tbody>
</table> |
86bbe814ae12aedb58d12a5c869896bfeab8eb4f | README.md | README.md | A little command-line stalker using the Clearbit API
# Installation
```
go get -u github.com/pzurek/stalk
```
# Config
Sign up with Clearbit and put your key into `~/.stalk/config`:
```
clearbit_key: a78c0257fcfdd142bc1bdda8deadbeef
```
# Use
```
$stalk -e [email protected]
```
This will hopefully result in something like:
```
Success!
This email seems to belong to: Alex MacCaw
Looks like they are working at Clearbit as a Founder
You can follow them at:
Facebook: https://facebook.com/amaccaw
Twitter: https://twitter.com/maccaw
GitHub: https://github.com/maccman
LinkedIn: https://linkedin.com/pub/alex-maccaw/78/929/ab5
```
| A little command-line stalker using the Clearbit API. It's just a little toy app I use to test the [Clearbit client library](https://github.com/pzurek/clearbit).
# Installation
```
go get -u github.com/pzurek/stalk
```
# Config
Sign up with Clearbit and put your key into `~/.stalk/config`:
```
clearbit_key: a78c0257fcfdd142bc1bdda8deadbeef
```
# Use
```
$stalk -e [email protected]
```
This will hopefully result in something like:
```
Success!
This email seems to belong to: Alex MacCaw
Looks like they are working at Clearbit as a Founder
You can follow them at:
Facebook: https://facebook.com/amaccaw
Twitter: https://twitter.com/maccaw
GitHub: https://github.com/maccman
LinkedIn: https://linkedin.com/pub/alex-maccaw/78/929/ab5
```
| Add reference to the Clearbit library | Add reference to the Clearbit library | Markdown | apache-2.0 | pzurek/stalk | markdown | ## Code Before:
A little command-line stalker using the Clearbit API
# Installation
```
go get -u github.com/pzurek/stalk
```
# Config
Sign up with Clearbit and put your key into `~/.stalk/config`:
```
clearbit_key: a78c0257fcfdd142bc1bdda8deadbeef
```
# Use
```
$stalk -e [email protected]
```
This will hopefully result in something like:
```
Success!
This email seems to belong to: Alex MacCaw
Looks like they are working at Clearbit as a Founder
You can follow them at:
Facebook: https://facebook.com/amaccaw
Twitter: https://twitter.com/maccaw
GitHub: https://github.com/maccman
LinkedIn: https://linkedin.com/pub/alex-maccaw/78/929/ab5
```
## Instruction:
Add reference to the Clearbit library
## Code After:
A little command-line stalker using the Clearbit API. It's just a little toy app I use to test the [Clearbit client library](https://github.com/pzurek/clearbit).
# Installation
```
go get -u github.com/pzurek/stalk
```
# Config
Sign up with Clearbit and put your key into `~/.stalk/config`:
```
clearbit_key: a78c0257fcfdd142bc1bdda8deadbeef
```
# Use
```
$stalk -e [email protected]
```
This will hopefully result in something like:
```
Success!
This email seems to belong to: Alex MacCaw
Looks like they are working at Clearbit as a Founder
You can follow them at:
Facebook: https://facebook.com/amaccaw
Twitter: https://twitter.com/maccaw
GitHub: https://github.com/maccman
LinkedIn: https://linkedin.com/pub/alex-maccaw/78/929/ab5
```
|
67c2819aedf269fedd0d1960c0a037cf7ebbe2d2 | .travis.yml | .travis.yml | language: node_js
node_js:
- "9"
- "8"
- "7"
- "6"
- "5"
sudo: false
script:
- npm run test-with-cov
after_success:
- CODECLIMATE_REPO_TOKEN=$CODECLIMATE_TOKEN ./node_modules/codeclimate-test-reporter/bin/codeclimate.js < ./coverage/lcov.info
git:
depth: 10
| language: node_js
node_js:
- "9"
- "8"
- "7"
- "6"
- "5"
sudo: false
test:
- npm run test-with-cov
deploy:
provider: npm
email: [email protected]
api_key:
secure: HtlC7Ylj1Pv7iQHAvmRsClO26jOHfe/s+JUWZwNSbMM7x49XQxjSEuP+XD1kWNBlZ4SM432qRoSgojtBusXXWPcJ7OzH9kAnvHafmR9t70lNXW0yazycIhFooxAzYZvhRTRmP4Ysi+S+A8Zs+gk/GMkb/JkySOKP9pEjZE5WyP6hRNvUFEAkmzWKD74IIkYKWwjB4PFY5a5wdhGb0GeC+yYFm5+t9KYfbXtHkluYNwqVQDwoBWRoXXQz7yTdMY+251wwqQFYPjSVIMvmUzNDB7Qae8b6GsjRu4HKLnUtMVcN5Q7g0LoArlNlm9rHa7m0djhcR6ss8OCCdFzAKj45rw5Irllvbd/rKZgN/ykqOGdvi0eSEXJYFnQJpr2M28XUVSE1r7+FSxkojnFTVUXSTxutOZ7bKET0VfW/x9q9CrN93a2G7K7QfdUw3/9AweGuNvJ2MecJokrifXt79kS+6Dpvgfpz2alAkzgQhZBrnL0grHoe+l9VKuvF7Wr36yZ0gxWON4kIoNqfWNWoJTNQgCCecN69ulha4xlpbuunhEZEP8hLKdxDv/vt+s/Nr3wUnvuadXwCeINRAdwZeM1sxDjJ2Q3o42zWNU1fzk8sR1dPaEG8pKMd1O84KNjFFenKJOlOaHpzeEux8/F4RRQmfHV+FtYq+4ITTvlO+uIGoNI=
on:
tags: true
repo: ReidWeb/GitInspector-CSV
branch: master
after_success:
- CODECLIMATE_REPO_TOKEN=$CODECLIMATE_TOKEN ./node_modules/codeclimate-test-reporter/bin/codeclimate.js < ./coverage/lcov.info
git:
depth: 10
| Add auto-deployment stage to Travis config | chore(ci): Add auto-deployment stage to Travis config
| YAML | mit | ReidWeb/GitInspector-CSV | yaml | ## Code Before:
language: node_js
node_js:
- "9"
- "8"
- "7"
- "6"
- "5"
sudo: false
script:
- npm run test-with-cov
after_success:
- CODECLIMATE_REPO_TOKEN=$CODECLIMATE_TOKEN ./node_modules/codeclimate-test-reporter/bin/codeclimate.js < ./coverage/lcov.info
git:
depth: 10
## Instruction:
chore(ci): Add auto-deployment stage to Travis config
## Code After:
language: node_js
node_js:
- "9"
- "8"
- "7"
- "6"
- "5"
sudo: false
test:
- npm run test-with-cov
deploy:
provider: npm
email: [email protected]
api_key:
secure: HtlC7Ylj1Pv7iQHAvmRsClO26jOHfe/s+JUWZwNSbMM7x49XQxjSEuP+XD1kWNBlZ4SM432qRoSgojtBusXXWPcJ7OzH9kAnvHafmR9t70lNXW0yazycIhFooxAzYZvhRTRmP4Ysi+S+A8Zs+gk/GMkb/JkySOKP9pEjZE5WyP6hRNvUFEAkmzWKD74IIkYKWwjB4PFY5a5wdhGb0GeC+yYFm5+t9KYfbXtHkluYNwqVQDwoBWRoXXQz7yTdMY+251wwqQFYPjSVIMvmUzNDB7Qae8b6GsjRu4HKLnUtMVcN5Q7g0LoArlNlm9rHa7m0djhcR6ss8OCCdFzAKj45rw5Irllvbd/rKZgN/ykqOGdvi0eSEXJYFnQJpr2M28XUVSE1r7+FSxkojnFTVUXSTxutOZ7bKET0VfW/x9q9CrN93a2G7K7QfdUw3/9AweGuNvJ2MecJokrifXt79kS+6Dpvgfpz2alAkzgQhZBrnL0grHoe+l9VKuvF7Wr36yZ0gxWON4kIoNqfWNWoJTNQgCCecN69ulha4xlpbuunhEZEP8hLKdxDv/vt+s/Nr3wUnvuadXwCeINRAdwZeM1sxDjJ2Q3o42zWNU1fzk8sR1dPaEG8pKMd1O84KNjFFenKJOlOaHpzeEux8/F4RRQmfHV+FtYq+4ITTvlO+uIGoNI=
on:
tags: true
repo: ReidWeb/GitInspector-CSV
branch: master
after_success:
- CODECLIMATE_REPO_TOKEN=$CODECLIMATE_TOKEN ./node_modules/codeclimate-test-reporter/bin/codeclimate.js < ./coverage/lcov.info
git:
depth: 10
|
417c20e9006baa0506bd779f142ec172efdb5c9d | scala/scala-impl/test/org/jetbrains/plugins/scala/codeInspection/unused/Scala3UnusedSymbolInspectionTest.scala | scala/scala-impl/test/org/jetbrains/plugins/scala/codeInspection/unused/Scala3UnusedSymbolInspectionTest.scala | package org.jetbrains.plugins.scala.codeInspection.unused
import org.jetbrains.plugins.scala.ScalaVersion
/**
* Created by Svyatoslav Ilinskiy on 11.07.16.
*/
class Scala3UnusedSymbolInspectionTest extends ScalaUnusedSymbolInspectionTestBase {
override protected def supportedIn(version: ScalaVersion): Boolean = version >= ScalaVersion.Latest.Scala_3_0
// Should be enabled when implementing Scala 3 unused symbol inspection and enabling
// it in ScalaUnusedSymbolInspect#shouldProcessElement
// def testPrivateField(): Unit = {
// val code =
// s"""
// |extension (s: Int)
// | private def ${START}blub$END: Int = 3
// """.stripMargin
// checkTextHasError(code)
// val before =
// """
// |object Test {
// | extension (s: Int)
// | private def blub: int = 3
// |}
// """.stripMargin
// val after =
// """
// |object Test {
// | extension (s: Int)
// |}
// """.stripMargin
// testQuickFix(before, after, hint)
// }
def testExtension(): Unit = checkTextHasNoErrors(
s"""extension (s: Int)
| def blub: Int = s + 3
|6.blub
|""".stripMargin
)
}
| package org.jetbrains.plugins.scala.codeInspection.unused
import org.jetbrains.plugins.scala.ScalaVersion
/**
* Created by Svyatoslav Ilinskiy on 11.07.16.
*/
class Scala3UnusedSymbolInspectionTest extends ScalaUnusedSymbolInspectionTestBase {
override protected def supportedIn(version: ScalaVersion): Boolean = version >= ScalaVersion.Latest.Scala_3_0
// Should be enabled when implementing Scala 3 unused symbol inspection and enabling
// it in ScalaUnusedSymbolInspect#shouldProcessElement
// def testPrivateField(): Unit = {
// val code =
// s"""
// |extension (s: Int)
// | private def ${START}blub$END: Int = 3
// """.stripMargin
// checkTextHasError(code)
// val before =
// """
// |object Test {
// | extension (s: Int)
// | private def blub: int = 3
// |}
// """.stripMargin
// val after =
// """
// |object Test {
// | extension (s: Int)
// |}
// """.stripMargin
// testQuickFix(before, after, hint)
// }
def testExtension(): Unit = checkTextHasNoErrors(
s"""extension (s: Int)
| def blub: Int = s + 3
|6.blub
|""".stripMargin
)
def testThatShouldFailToPreventAutoMerge(): Unit = assert(false)
}
| Add failing test to prevent auto-merge | Add failing test to prevent auto-merge
| Scala | apache-2.0 | JetBrains/intellij-scala,JetBrains/intellij-scala | scala | ## Code Before:
package org.jetbrains.plugins.scala.codeInspection.unused
import org.jetbrains.plugins.scala.ScalaVersion
/**
* Created by Svyatoslav Ilinskiy on 11.07.16.
*/
class Scala3UnusedSymbolInspectionTest extends ScalaUnusedSymbolInspectionTestBase {
override protected def supportedIn(version: ScalaVersion): Boolean = version >= ScalaVersion.Latest.Scala_3_0
// Should be enabled when implementing Scala 3 unused symbol inspection and enabling
// it in ScalaUnusedSymbolInspect#shouldProcessElement
// def testPrivateField(): Unit = {
// val code =
// s"""
// |extension (s: Int)
// | private def ${START}blub$END: Int = 3
// """.stripMargin
// checkTextHasError(code)
// val before =
// """
// |object Test {
// | extension (s: Int)
// | private def blub: int = 3
// |}
// """.stripMargin
// val after =
// """
// |object Test {
// | extension (s: Int)
// |}
// """.stripMargin
// testQuickFix(before, after, hint)
// }
def testExtension(): Unit = checkTextHasNoErrors(
s"""extension (s: Int)
| def blub: Int = s + 3
|6.blub
|""".stripMargin
)
}
## Instruction:
Add failing test to prevent auto-merge
## Code After:
package org.jetbrains.plugins.scala.codeInspection.unused
import org.jetbrains.plugins.scala.ScalaVersion
/**
* Created by Svyatoslav Ilinskiy on 11.07.16.
*/
class Scala3UnusedSymbolInspectionTest extends ScalaUnusedSymbolInspectionTestBase {
override protected def supportedIn(version: ScalaVersion): Boolean = version >= ScalaVersion.Latest.Scala_3_0
// Should be enabled when implementing Scala 3 unused symbol inspection and enabling
// it in ScalaUnusedSymbolInspect#shouldProcessElement
// def testPrivateField(): Unit = {
// val code =
// s"""
// |extension (s: Int)
// | private def ${START}blub$END: Int = 3
// """.stripMargin
// checkTextHasError(code)
// val before =
// """
// |object Test {
// | extension (s: Int)
// | private def blub: int = 3
// |}
// """.stripMargin
// val after =
// """
// |object Test {
// | extension (s: Int)
// |}
// """.stripMargin
// testQuickFix(before, after, hint)
// }
def testExtension(): Unit = checkTextHasNoErrors(
s"""extension (s: Int)
| def blub: Int = s + 3
|6.blub
|""".stripMargin
)
def testThatShouldFailToPreventAutoMerge(): Unit = assert(false)
}
|
ab26c19f9ff2c86b987c7c6583d4cd94b9bff288 | gemfiles/Gemfile-rails.4.2.x | gemfiles/Gemfile-rails.4.2.x | source 'https://rubygems.org'
gemspec path: '..'
gem 'rake'
gem 'rdoc'
gem 'actionmailer', '~> 4.2.0'
gem 'activemodel', '~> 4.2.0'
gem "mime-types", (RUBY_VERSION >= "2.0" ? "~> 3.0" : "~> 2.99")
| source 'https://rubygems.org'
gemspec path: '..'
gem 'rake'
gem 'rdoc'
gem 'actionmailer', '~> 4.2.0'
gem 'activemodel', '~> 4.2.0'
gem "mime-types", (RUBY_VERSION >= "2.0" ? "~> 3.0" : "~> 2.99")
# https://github.com/sparklemotion/nokogiri/blob/ad010b28c6edbc3b40950a72f3af692737b578b6/CHANGELOG.md#backwards-incompatibilities
gem "nokogiri", (RUBY_VERSION >= "2.1" ? "~> 1.7" : "< 1.7")
| Use correct nokogiri version for different Rubies | Use correct nokogiri version for different Rubies
| Logos | mit | plataformatec/mail_form,plataformatec/mail_form | logos | ## Code Before:
source 'https://rubygems.org'
gemspec path: '..'
gem 'rake'
gem 'rdoc'
gem 'actionmailer', '~> 4.2.0'
gem 'activemodel', '~> 4.2.0'
gem "mime-types", (RUBY_VERSION >= "2.0" ? "~> 3.0" : "~> 2.99")
## Instruction:
Use correct nokogiri version for different Rubies
## Code After:
source 'https://rubygems.org'
gemspec path: '..'
gem 'rake'
gem 'rdoc'
gem 'actionmailer', '~> 4.2.0'
gem 'activemodel', '~> 4.2.0'
gem "mime-types", (RUBY_VERSION >= "2.0" ? "~> 3.0" : "~> 2.99")
# https://github.com/sparklemotion/nokogiri/blob/ad010b28c6edbc3b40950a72f3af692737b578b6/CHANGELOG.md#backwards-incompatibilities
gem "nokogiri", (RUBY_VERSION >= "2.1" ? "~> 1.7" : "< 1.7")
|
cefc3d6a30159564b2287f8b803585282fadf1e5 | app/src/main/res/layout/list_item.xml | app/src/main/res/layout/list_item.xml | <?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent" >
<ImageView
android:id="@+id/icon"
android:layout_width="80dp"
android:layout_height="80dp"
android:contentDescription="@string/vorschau"
android:paddingLeft="10dp"
android:paddingRight="10dp" />
<TextView
android:id="@+id/title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_toRightOf="@+id/icon"
android:paddingBottom="10dp"
android:textColor="#CC0033"
android:textSize="16dp" />
<TextView
android:id="@+id/desc"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/title"
android:layout_toRightOf="@+id/icon"
android:paddingLeft="10dp"
android:textColor="#3399FF"
android:textSize="14dp" />
</RelativeLayout> | <?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent" >
<ImageView
android:id="@+id/icon"
android:layout_width="80dp"
android:layout_height="80dp"
android:contentDescription="@string/vorschau"
android:paddingLeft="10dp"
android:paddingRight="10dp" />
<TextView
android:id="@+id/title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_toEndOf="@+id/icon"
android:paddingBottom="10dp"
android:textColor="#CC0033"
android:textSize="16sp" />
<TextView
android:id="@+id/desc"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/title"
android:layout_toEndOf="@+id/icon"
android:paddingStart="10dp"
android:paddingEnd="10dp"
android:textColor="#3399FF"
android:textSize="14sp" />
</RelativeLayout> | Improve layout as suggested by AndroidStudio | Improve layout as suggested by AndroidStudio
| XML | apache-2.0 | eMTeeWare/eMTeeSeason | xml | ## Code Before:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent" >
<ImageView
android:id="@+id/icon"
android:layout_width="80dp"
android:layout_height="80dp"
android:contentDescription="@string/vorschau"
android:paddingLeft="10dp"
android:paddingRight="10dp" />
<TextView
android:id="@+id/title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_toRightOf="@+id/icon"
android:paddingBottom="10dp"
android:textColor="#CC0033"
android:textSize="16dp" />
<TextView
android:id="@+id/desc"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/title"
android:layout_toRightOf="@+id/icon"
android:paddingLeft="10dp"
android:textColor="#3399FF"
android:textSize="14dp" />
</RelativeLayout>
## Instruction:
Improve layout as suggested by AndroidStudio
## Code After:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent" >
<ImageView
android:id="@+id/icon"
android:layout_width="80dp"
android:layout_height="80dp"
android:contentDescription="@string/vorschau"
android:paddingLeft="10dp"
android:paddingRight="10dp" />
<TextView
android:id="@+id/title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_toEndOf="@+id/icon"
android:paddingBottom="10dp"
android:textColor="#CC0033"
android:textSize="16sp" />
<TextView
android:id="@+id/desc"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/title"
android:layout_toEndOf="@+id/icon"
android:paddingStart="10dp"
android:paddingEnd="10dp"
android:textColor="#3399FF"
android:textSize="14sp" />
</RelativeLayout> |
701f1d020325e7e726fe57ca29a87ea81d3b373d | README.md | README.md |
[](https://app.wercker.com/project/byKey/4dc4a1f2dfecaf9e590806eeb6e5798f)
[](https://codeclimate.com/github/hirakiuc/site-meta-go)
[](http://godoc.org/github.com/hirakiuc/site-meta-go)
NOTE: This is under development yet :smile:
|

[](https://codeclimate.com/github/hirakiuc/site-meta-go)
[](http://godoc.org/github.com/hirakiuc/site-meta-go)
NOTE: This is under development yet :smile:
| Replace CI badge (from wercker to GitHub Action) | Replace CI badge (from wercker to GitHub Action) | Markdown | mit | hirakiuc/site-meta-go,hirakiuc/site-meta-go | markdown | ## Code Before:
[](https://app.wercker.com/project/byKey/4dc4a1f2dfecaf9e590806eeb6e5798f)
[](https://codeclimate.com/github/hirakiuc/site-meta-go)
[](http://godoc.org/github.com/hirakiuc/site-meta-go)
NOTE: This is under development yet :smile:
## Instruction:
Replace CI badge (from wercker to GitHub Action)
## Code After:

[](https://codeclimate.com/github/hirakiuc/site-meta-go)
[](http://godoc.org/github.com/hirakiuc/site-meta-go)
NOTE: This is under development yet :smile:
|
a5e651068a470a0f9ab8baa2a7a716c3789c5d56 | server/middleware/authorizedRequest.js | server/middleware/authorizedRequest.js | let session = require('../../database/models/session');
let isAuthorizedForRoute = (req, res, next) => {
session.getSession(req.cookie.id)
.then((session) => {
console.log(session);
});
};
module.exports = isAuthorizedForRoute;
| let session = require('../../database/models/session');
let isAuthorizedForRoute = (req, res, next) => {
session.getSession(req.cookies.session.id)
.then((session) => {
if (session.hash === req.cookies.session.hash &&
session.houseId === req.cookies.session.houseId &&
session.userId === req.cookies.session.userId) {
next();
} else {
res.clearCookie('session');
res.redirect(401, '/login');
}
});
};
module.exports = isAuthorizedForRoute;
| Add authorized route checking middleware | Add authorized route checking middleware
NOTE: Not integrated yet, will be integrated once session adding middleware is in place
| JavaScript | mit | SentinelsOfMagic/SentinelsOfMagic | javascript | ## Code Before:
let session = require('../../database/models/session');
let isAuthorizedForRoute = (req, res, next) => {
session.getSession(req.cookie.id)
.then((session) => {
console.log(session);
});
};
module.exports = isAuthorizedForRoute;
## Instruction:
Add authorized route checking middleware
NOTE: Not integrated yet, will be integrated once session adding middleware is in place
## Code After:
let session = require('../../database/models/session');
let isAuthorizedForRoute = (req, res, next) => {
session.getSession(req.cookies.session.id)
.then((session) => {
if (session.hash === req.cookies.session.hash &&
session.houseId === req.cookies.session.houseId &&
session.userId === req.cookies.session.userId) {
next();
} else {
res.clearCookie('session');
res.redirect(401, '/login');
}
});
};
module.exports = isAuthorizedForRoute;
|
16874192b9014f7d238eb1544c6d7cca06b44d44 | scripts/spm_integration_step.sh | scripts/spm_integration_step.sh |
SWIFT_VERSION=$1
PROJECT_NAME="TestSPM"
FRAMEWORK_NAME="SwiftyUserDefaults"
FRAMEWORK_DIR=$(pwd)
BRANCH_NAME=$(git symbolic-ref -q HEAD)
BRANCH_NAME=${BRANCH_NAME##refs/heads/}
BRANCH_NAME=${BRANCH_NAME:-HEAD}
mkdir $PROJECT_NAME
cd $PROJECT_NAME
swift package init
echo "// swift-tools-version:$SWIFT_VERSION
import PackageDescription
let package = Package(
name: \"$PROJECT_NAME\",
dependencies: [
.package(url: \"$FRAMEWORK_DIR\", .branch(\"$BRANCH_NAME\")),
],
targets: [
.target(name: \"$PROJECT_NAME\", dependencies: [\"$FRAMEWORK_NAME\"]),
]
)
" > Package.swift
swift build
cd ../
rm -rf $PROJECT_NAME |
SWIFT_VERSION=$1
PROJECT_NAME="TestSPM"
FRAMEWORK_NAME="SwiftyUserDefaults"
FRAMEWORK_DIR=$(pwd)
BRANCH_NAME=$(git symbolic-ref -q HEAD)
BRANCH_NAME=${BRANCH_NAME##refs/heads/}
BRANCH_NAME=${BRANCH_NAME:-HEAD}
mkdir $PROJECT_NAME
cd $PROJECT_NAME
swift package init
echo "// swift-tools-version:$SWIFT_VERSION
import PackageDescription
let package = Package(
name: \"$PROJECT_NAME\",
dependencies: [
.package(url: \"$FRAMEWORK_DIR\", .branch(\"$BRANCH_NAME\")),
],
targets: [
.target(name: \"$PROJECT_NAME\", dependencies: [\"$FRAMEWORK_NAME\"]),
]
)
" > Package.swift
swift build -Xswiftc "-target" -Xswiftc "x86_64-apple-macosx10.11"
cd ../
rm -rf $PROJECT_NAME | Use macOS 10.11 for SPM compat check | Use macOS 10.11 for SPM compat check
| Shell | mit | radex/SwiftyUserDefaults,radex/SwiftyUserDefaults,radex/SwiftyUserDefaults | shell | ## Code Before:
SWIFT_VERSION=$1
PROJECT_NAME="TestSPM"
FRAMEWORK_NAME="SwiftyUserDefaults"
FRAMEWORK_DIR=$(pwd)
BRANCH_NAME=$(git symbolic-ref -q HEAD)
BRANCH_NAME=${BRANCH_NAME##refs/heads/}
BRANCH_NAME=${BRANCH_NAME:-HEAD}
mkdir $PROJECT_NAME
cd $PROJECT_NAME
swift package init
echo "// swift-tools-version:$SWIFT_VERSION
import PackageDescription
let package = Package(
name: \"$PROJECT_NAME\",
dependencies: [
.package(url: \"$FRAMEWORK_DIR\", .branch(\"$BRANCH_NAME\")),
],
targets: [
.target(name: \"$PROJECT_NAME\", dependencies: [\"$FRAMEWORK_NAME\"]),
]
)
" > Package.swift
swift build
cd ../
rm -rf $PROJECT_NAME
## Instruction:
Use macOS 10.11 for SPM compat check
## Code After:
SWIFT_VERSION=$1
PROJECT_NAME="TestSPM"
FRAMEWORK_NAME="SwiftyUserDefaults"
FRAMEWORK_DIR=$(pwd)
BRANCH_NAME=$(git symbolic-ref -q HEAD)
BRANCH_NAME=${BRANCH_NAME##refs/heads/}
BRANCH_NAME=${BRANCH_NAME:-HEAD}
mkdir $PROJECT_NAME
cd $PROJECT_NAME
swift package init
echo "// swift-tools-version:$SWIFT_VERSION
import PackageDescription
let package = Package(
name: \"$PROJECT_NAME\",
dependencies: [
.package(url: \"$FRAMEWORK_DIR\", .branch(\"$BRANCH_NAME\")),
],
targets: [
.target(name: \"$PROJECT_NAME\", dependencies: [\"$FRAMEWORK_NAME\"]),
]
)
" > Package.swift
swift build -Xswiftc "-target" -Xswiftc "x86_64-apple-macosx10.11"
cd ../
rm -rf $PROJECT_NAME |
4d94c865a8eafed79db8d6dd74facbc6754a247a | youtube-dl-gui.rb | youtube-dl-gui.rb | class YoutubeDlGui < Formula
homepage "https://github.com/pr0d1r2/youtube-dl-gui"
url "https://github.com/pr0d1r2/youtube-dl-gui.git",
:branch => "osx"
version "0.3.7"
depends_on "python"
depends_on "wxpython"
depends_on "ffmpeg"
def install
ENV.prepend_create_path "PYTHONPATH", "#{lib}/python2.7/site-packages"
system "python", "setup.py", "install", "--prefix=#{prefix}"
system "mkdir", "-p", "#{prefix}/YoutubeDlGui.app/Contents/MacOS"
system "cp", "#{prefix}/bin/youtube-dl-gui", "#{prefix}/YoutubeDlGui.app/Contents/MacOS/YoutubeDlGui"
system "mkdir", "-p", "#{prefix}/YoutubeDlGui.app/Contents/Resources/"
system "cp", "osx/YoutubeDlGui.icns", "#{prefix}/YoutubeDlGui.app/Contents/Resources/YoutubeDlGui.icns"
system "cp", "osx/empty.lproj", "#{prefix}/YoutubeDlGui.app/Contents/Resources/empty.lproj"
system "cp", "osx/Info.plist", "#{prefix}/YoutubeDlGui.app/Contents/Info.plist"
system "cp", "osx/PkgInfo", "#{prefix}/YoutubeDlGui.app/Contents/PkgInfo"
end
end
| class YoutubeDlGui < Formula
homepage "https://github.com/MrS0m30n3/youtube-dl-gui"
url "https://github.com/MrS0m30n3/youtube-dl-gui.git",
:revision => "36c8147a0ba4da01215d92957246527285cca710"
version "0.3.7"
depends_on "python"
depends_on "wxpython"
depends_on "ffmpeg"
def install
ENV.prepend_create_path "PYTHONPATH", "#{lib}/python2.7/site-packages"
system "python", "setup.py", "install", "--prefix=#{prefix}"
system "mkdir", "-p", "#{prefix}/YoutubeDlGui.app/Contents/MacOS"
system "cp", "#{prefix}/bin/youtube-dl-gui", "#{prefix}/YoutubeDlGui.app/Contents/MacOS/YoutubeDlGui"
system "mkdir", "-p", "#{prefix}/YoutubeDlGui.app/Contents/Resources/"
system "cp", "osx/YoutubeDlGui.icns", "#{prefix}/YoutubeDlGui.app/Contents/Resources/YoutubeDlGui.icns"
system "cp", "osx/empty.lproj", "#{prefix}/YoutubeDlGui.app/Contents/Resources/empty.lproj"
system "cp", "osx/Info.plist", "#{prefix}/YoutubeDlGui.app/Contents/Info.plist"
system "cp", "osx/PkgInfo", "#{prefix}/YoutubeDlGui.app/Contents/PkgInfo"
end
end
| Change git source to original gem creator. | Change git source to original gem creator.
As my pull request got merged.
| Ruby | mit | pr0d1r2/homebrew-contrib | ruby | ## Code Before:
class YoutubeDlGui < Formula
homepage "https://github.com/pr0d1r2/youtube-dl-gui"
url "https://github.com/pr0d1r2/youtube-dl-gui.git",
:branch => "osx"
version "0.3.7"
depends_on "python"
depends_on "wxpython"
depends_on "ffmpeg"
def install
ENV.prepend_create_path "PYTHONPATH", "#{lib}/python2.7/site-packages"
system "python", "setup.py", "install", "--prefix=#{prefix}"
system "mkdir", "-p", "#{prefix}/YoutubeDlGui.app/Contents/MacOS"
system "cp", "#{prefix}/bin/youtube-dl-gui", "#{prefix}/YoutubeDlGui.app/Contents/MacOS/YoutubeDlGui"
system "mkdir", "-p", "#{prefix}/YoutubeDlGui.app/Contents/Resources/"
system "cp", "osx/YoutubeDlGui.icns", "#{prefix}/YoutubeDlGui.app/Contents/Resources/YoutubeDlGui.icns"
system "cp", "osx/empty.lproj", "#{prefix}/YoutubeDlGui.app/Contents/Resources/empty.lproj"
system "cp", "osx/Info.plist", "#{prefix}/YoutubeDlGui.app/Contents/Info.plist"
system "cp", "osx/PkgInfo", "#{prefix}/YoutubeDlGui.app/Contents/PkgInfo"
end
end
## Instruction:
Change git source to original gem creator.
As my pull request got merged.
## Code After:
class YoutubeDlGui < Formula
homepage "https://github.com/MrS0m30n3/youtube-dl-gui"
url "https://github.com/MrS0m30n3/youtube-dl-gui.git",
:revision => "36c8147a0ba4da01215d92957246527285cca710"
version "0.3.7"
depends_on "python"
depends_on "wxpython"
depends_on "ffmpeg"
def install
ENV.prepend_create_path "PYTHONPATH", "#{lib}/python2.7/site-packages"
system "python", "setup.py", "install", "--prefix=#{prefix}"
system "mkdir", "-p", "#{prefix}/YoutubeDlGui.app/Contents/MacOS"
system "cp", "#{prefix}/bin/youtube-dl-gui", "#{prefix}/YoutubeDlGui.app/Contents/MacOS/YoutubeDlGui"
system "mkdir", "-p", "#{prefix}/YoutubeDlGui.app/Contents/Resources/"
system "cp", "osx/YoutubeDlGui.icns", "#{prefix}/YoutubeDlGui.app/Contents/Resources/YoutubeDlGui.icns"
system "cp", "osx/empty.lproj", "#{prefix}/YoutubeDlGui.app/Contents/Resources/empty.lproj"
system "cp", "osx/Info.plist", "#{prefix}/YoutubeDlGui.app/Contents/Info.plist"
system "cp", "osx/PkgInfo", "#{prefix}/YoutubeDlGui.app/Contents/PkgInfo"
end
end
|
1a0e2e1056eb8bccd50f7c7e07b58cb3e4e0ef53 | src/frame/decoder.rs | src/frame/decoder.rs | use frame::ChannelAssignment;
pub fn decode(channel_assignment: ChannelAssignment, buffer: &mut [i32]) {
match channel_assignment {
ChannelAssignment::Independent => return,
ChannelAssignment::LeftSide => unimplemented!(),
ChannelAssignment::RightSide => unimplemented!(),
ChannelAssignment::MiddleSide => unimplemented!(),
}
}
| use frame::ChannelAssignment;
fn decode_left_side(buffer: &mut [i32]) {
let block_size = buffer.len() / 2;
for i in 0..block_size {
let left = buffer[i];
let side = buffer[i + block_size];
// right channel
buffer[i + block_size] = left - side;
}
}
pub fn decode(channel_assignment: ChannelAssignment, buffer: &mut [i32]) {
match channel_assignment {
ChannelAssignment::Independent => return,
ChannelAssignment::LeftSide => unimplemented!(),
ChannelAssignment::RightSide => unimplemented!(),
ChannelAssignment::MiddleSide => unimplemented!(),
}
}
| Add implementation for `LeftSide` decoding | Add implementation for `LeftSide` decoding
| Rust | bsd-3-clause | sourrust/flac | rust | ## Code Before:
use frame::ChannelAssignment;
pub fn decode(channel_assignment: ChannelAssignment, buffer: &mut [i32]) {
match channel_assignment {
ChannelAssignment::Independent => return,
ChannelAssignment::LeftSide => unimplemented!(),
ChannelAssignment::RightSide => unimplemented!(),
ChannelAssignment::MiddleSide => unimplemented!(),
}
}
## Instruction:
Add implementation for `LeftSide` decoding
## Code After:
use frame::ChannelAssignment;
fn decode_left_side(buffer: &mut [i32]) {
let block_size = buffer.len() / 2;
for i in 0..block_size {
let left = buffer[i];
let side = buffer[i + block_size];
// right channel
buffer[i + block_size] = left - side;
}
}
pub fn decode(channel_assignment: ChannelAssignment, buffer: &mut [i32]) {
match channel_assignment {
ChannelAssignment::Independent => return,
ChannelAssignment::LeftSide => unimplemented!(),
ChannelAssignment::RightSide => unimplemented!(),
ChannelAssignment::MiddleSide => unimplemented!(),
}
}
|
b36b0e4d5708ab7d5849afc9f1050a37562e7326 | metadata/com.adguard.android.contentblocker.txt | metadata/com.adguard.android.contentblocker.txt | AntiFeatures:NonFreeAdd
Categories:Internet
License:GPL-3.0-or-later
Web Site:https://github.com/AdguardTeam/ContentBlocker
Source Code:https://github.com/AdguardTeam/ContentBlocker
Issue Tracker:https://github.com/AdguardTeam/ContentBlocker/issues
Auto Name:AdGuard Content Blocker
Summary:Block advertisements
Description:
AdGuard Content Blocker is an app that blocks ads on mobile devices operated by
Android in browsers that support content blocking technology. As of today, there
are only two such browsers: Yandex Browser and Samsung Internet browser.
The app does not require root access.
.
Repo Type:git
Repo:https://github.com/AdguardTeam/ContentBlocker.git
Build:2.1.3,21002103
commit=v2.1.3
subdir=adguard_cb
gradle=samsung_api21
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:2.1.3
Current Version Code:21002103
| AntiFeatures:NonFreeAdd
Categories:Internet
License:GPL-3.0-or-later
Web Site:https://github.com/AdguardTeam/ContentBlocker
Source Code:https://github.com/AdguardTeam/ContentBlocker
Issue Tracker:https://github.com/AdguardTeam/ContentBlocker/issues
Auto Name:AdGuard Content Blocker
Summary:Block advertisements
Description:
AdGuard Content Blocker is an app that blocks ads on mobile devices operated by
Android in browsers that support content blocking technology. As of today, there
are only two such browsers: Yandex Browser and Samsung Internet browser.
The app does not require root access.
.
Repo Type:git
Repo:https://github.com/AdguardTeam/ContentBlocker.git
Build:2.1.3,21002103
commit=v2.1.3
subdir=adguard_cb
gradle=samsung_api21
Build:2.2.1,21002201
commit=v2.2.1
subdir=adguard_cb
gradle=samsung_api21
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:2.2.1
Current Version Code:21002201
| Update AdGuard Content Blocker to 2.2.1 (21002201) | Update AdGuard Content Blocker to 2.2.1 (21002201)
| Text | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroiddata,f-droid/fdroid-data | text | ## Code Before:
AntiFeatures:NonFreeAdd
Categories:Internet
License:GPL-3.0-or-later
Web Site:https://github.com/AdguardTeam/ContentBlocker
Source Code:https://github.com/AdguardTeam/ContentBlocker
Issue Tracker:https://github.com/AdguardTeam/ContentBlocker/issues
Auto Name:AdGuard Content Blocker
Summary:Block advertisements
Description:
AdGuard Content Blocker is an app that blocks ads on mobile devices operated by
Android in browsers that support content blocking technology. As of today, there
are only two such browsers: Yandex Browser and Samsung Internet browser.
The app does not require root access.
.
Repo Type:git
Repo:https://github.com/AdguardTeam/ContentBlocker.git
Build:2.1.3,21002103
commit=v2.1.3
subdir=adguard_cb
gradle=samsung_api21
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:2.1.3
Current Version Code:21002103
## Instruction:
Update AdGuard Content Blocker to 2.2.1 (21002201)
## Code After:
AntiFeatures:NonFreeAdd
Categories:Internet
License:GPL-3.0-or-later
Web Site:https://github.com/AdguardTeam/ContentBlocker
Source Code:https://github.com/AdguardTeam/ContentBlocker
Issue Tracker:https://github.com/AdguardTeam/ContentBlocker/issues
Auto Name:AdGuard Content Blocker
Summary:Block advertisements
Description:
AdGuard Content Blocker is an app that blocks ads on mobile devices operated by
Android in browsers that support content blocking technology. As of today, there
are only two such browsers: Yandex Browser and Samsung Internet browser.
The app does not require root access.
.
Repo Type:git
Repo:https://github.com/AdguardTeam/ContentBlocker.git
Build:2.1.3,21002103
commit=v2.1.3
subdir=adguard_cb
gradle=samsung_api21
Build:2.2.1,21002201
commit=v2.2.1
subdir=adguard_cb
gradle=samsung_api21
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:2.2.1
Current Version Code:21002201
|
7c6f5a1446783236ed07d215bdd897f5d2d5abbf | cli.js | cli.js | var spawn = require('child_process').spawn;
var args = process.argv.slice(2);
var bin = require('./');
if (bin === null) {
throw new Error('Platform not supported.');
}
var nagome = spawn(bin, args, { stdin: 'inherit' });
nagome.stderr.on('data', (data) => {
console.log(data.toString('utf8'));
});
nagome.stdout.on('data', (data) => {
console.log(data.toString('utf8'));
});
nagome.on('exit', process.exit);
| var spawn = require('child_process').spawn;
var args = process.argv.slice(2);
var bin = require('./');
if (bin === null) {
throw new Error('Platform not supported.');
}
var nagome = spawn(bin, args, { stdin: 'inherit' });
nagome.stderr.on('data', function (data) {
console.log(data.toString('utf8'));
});
nagome.stdout.on('data', function (data) {
console.log(data.toString('utf8'));
});
nagome.on('exit', process.exit);
| Remove arrow function for compatibility | Remove arrow function for compatibility
| JavaScript | mit | y0za/nagome-bin | javascript | ## Code Before:
var spawn = require('child_process').spawn;
var args = process.argv.slice(2);
var bin = require('./');
if (bin === null) {
throw new Error('Platform not supported.');
}
var nagome = spawn(bin, args, { stdin: 'inherit' });
nagome.stderr.on('data', (data) => {
console.log(data.toString('utf8'));
});
nagome.stdout.on('data', (data) => {
console.log(data.toString('utf8'));
});
nagome.on('exit', process.exit);
## Instruction:
Remove arrow function for compatibility
## Code After:
var spawn = require('child_process').spawn;
var args = process.argv.slice(2);
var bin = require('./');
if (bin === null) {
throw new Error('Platform not supported.');
}
var nagome = spawn(bin, args, { stdin: 'inherit' });
nagome.stderr.on('data', function (data) {
console.log(data.toString('utf8'));
});
nagome.stdout.on('data', function (data) {
console.log(data.toString('utf8'));
});
nagome.on('exit', process.exit);
|
d08c1787c718d78fd5d806df139fe7d2d8657663 | src/utils/d3/render_axis_label.js | src/utils/d3/render_axis_label.js | export default function({
renderedXAxis,
renderedYAxis,
xLabel,
yLabel,
width
}) {
let _y = this.axis == 'y'
let axis = _y ? renderedYAxis : renderedXAxis
if (axis) {
let label = _y ? yLabel : xLabel
let _label = axis.append('text')
.attr('class', 'label')
.attr('y', 15)
.style('text-anchor', 'end')
.text(label)
if (_y) {
_label
.attr('transform', 'rotate(-90)')
} else {
_label
.attr('x', width)
}
}
}
| export default function({
renderedXAxis,
renderedYAxis,
xLabel,
yLabel,
width
}) {
let _y = this.axis == 'y'
let axis = _y ? renderedYAxis : renderedXAxis
if (axis) {
let label = _y ? yLabel : xLabel
let _label = axis.append('text')
.attr('class', 'label')
.style('text-anchor', 'end')
.text(label)
if (_y) {
_label
.attr('y', 15)
.attr('transform', 'rotate(-90)')
} else {
_label
.attr('y', -6)
.attr('x', width)
}
}
}
| Fix rendering of x-Axis labels | Fix rendering of x-Axis labels
| JavaScript | mit | simonwoerpel/d3-playbooks,correctiv/d3-riot-charts,correctiv/d3-riot-charts,simonwoerpel/d3-playbooks,simonwoerpel/d3-playbooks | javascript | ## Code Before:
export default function({
renderedXAxis,
renderedYAxis,
xLabel,
yLabel,
width
}) {
let _y = this.axis == 'y'
let axis = _y ? renderedYAxis : renderedXAxis
if (axis) {
let label = _y ? yLabel : xLabel
let _label = axis.append('text')
.attr('class', 'label')
.attr('y', 15)
.style('text-anchor', 'end')
.text(label)
if (_y) {
_label
.attr('transform', 'rotate(-90)')
} else {
_label
.attr('x', width)
}
}
}
## Instruction:
Fix rendering of x-Axis labels
## Code After:
export default function({
renderedXAxis,
renderedYAxis,
xLabel,
yLabel,
width
}) {
let _y = this.axis == 'y'
let axis = _y ? renderedYAxis : renderedXAxis
if (axis) {
let label = _y ? yLabel : xLabel
let _label = axis.append('text')
.attr('class', 'label')
.style('text-anchor', 'end')
.text(label)
if (_y) {
_label
.attr('y', 15)
.attr('transform', 'rotate(-90)')
} else {
_label
.attr('y', -6)
.attr('x', width)
}
}
}
|
c25e32e61a88aba967129cb53c18ee2b5eae3314 | app/views/items/_form.html.erb | app/views/items/_form.html.erb | <%= form_for(item) do |f| %>
<div>
<%= f.label :name %>:
<%= f.text_field :name %>
</div>
<div>
<%= f.label :description %>:
<%= f.text_field :description %>
</div>
<div>
<%= f.label :image %>:
<%= f.text_field :image %>
</div>
<div>
<%= f.label :price %>:
<%= f.text_field :price %>
</div>
<div>
<%= f.label :quantity %>:
<%= f.text_field :quantity %>
</div>
<div>
<%= f.submit %>
</div>
<% end %> | <%= form_for(item) do |f| %>
<div>
<%= f.label :name %>:
<%= f.text_field :name %>
</div>
<div>
<%= f.label :description %>:
<%= f.text_field :description %>
</div>
<div>
<%= f.label :image %>:
<%= f.text_field :image %>
</div>
<div>
<%= f.label :price %>:
<%= f.text_field :price %>
</div>
<div>
<%= f.label :quantity %>:
<%= f.text_field :quantity %>
</div>
<div>
<%= f.submit %>
</div>
<% end %>
<% @item.errors.each do |error_array, error| %>
<p><%= flash[:error] = error %><p>
<% end %> | Add logic that loops through Item error messages and displays appropriate messages | Add logic that loops through Item error messages and displays appropriate messages
| HTML+ERB | mit | benjaminhyw/rails-online-shop,benjaminhyw/rails-online-shop,benjaminhyw/rails-online-shop | html+erb | ## Code Before:
<%= form_for(item) do |f| %>
<div>
<%= f.label :name %>:
<%= f.text_field :name %>
</div>
<div>
<%= f.label :description %>:
<%= f.text_field :description %>
</div>
<div>
<%= f.label :image %>:
<%= f.text_field :image %>
</div>
<div>
<%= f.label :price %>:
<%= f.text_field :price %>
</div>
<div>
<%= f.label :quantity %>:
<%= f.text_field :quantity %>
</div>
<div>
<%= f.submit %>
</div>
<% end %>
## Instruction:
Add logic that loops through Item error messages and displays appropriate messages
## Code After:
<%= form_for(item) do |f| %>
<div>
<%= f.label :name %>:
<%= f.text_field :name %>
</div>
<div>
<%= f.label :description %>:
<%= f.text_field :description %>
</div>
<div>
<%= f.label :image %>:
<%= f.text_field :image %>
</div>
<div>
<%= f.label :price %>:
<%= f.text_field :price %>
</div>
<div>
<%= f.label :quantity %>:
<%= f.text_field :quantity %>
</div>
<div>
<%= f.submit %>
</div>
<% end %>
<% @item.errors.each do |error_array, error| %>
<p><%= flash[:error] = error %><p>
<% end %> |
5ef8cd64e6e6eda0df9fd2a030e431cc69b92ee7 | source/manual/alerts/data-sync.html.md | source/manual/alerts/data-sync.html.md | ---
owner_slack: "#2ndline"
title: Data sync
section: Icinga alerts
layout: manual_layout
parent: "/manual.html"
old_path_in_opsmanual: "../opsmanual/2nd-line/alerts/data-sync.md"
last_reviewed_on: 2017-02-17
review_in: 6 months
---
> **This page was imported from [the opsmanual on GitHub Enterprise](https://github.com/alphagov/govuk-legacy-opsmanual)**.
It hasn't been reviewed for accuracy yet.
[View history in old opsmanual](https://github.com/alphagov/govuk-legacy-opsmanual/tree/master/2nd-line/alerts/data-sync.md)
Data is synced from production to staging and integration every night.
Check the output of the production Jenkins job to see which part of
the data sync failed. It may be safe to re-run part of the sync.
| ---
owner_slack: "#2ndline"
title: Data sync
section: Icinga alerts
layout: manual_layout
parent: "/manual.html"
old_path_in_opsmanual: "../opsmanual/2nd-line/alerts/data-sync.md"
last_reviewed_on: 2017-07-25
review_in: 6 months
---
> **This page was imported from [the opsmanual on GitHub Enterprise](https://github.com/alphagov/govuk-legacy-opsmanual)**.
It hasn't been reviewed for accuracy yet.
[View history in old opsmanual](https://github.com/alphagov/govuk-legacy-opsmanual/tree/master/2nd-line/alerts/data-sync.md)
Data is synced from production to staging and integration every night.
Check the output of the production Jenkins job to see which part of
the data sync failed. It may be safe to re-run part of the sync.
The Jenkins jobs included in the sync are:
* Copy Data to Staging
* Copy Data to integration
* Copy Licensify Data to staging
See the [source code](https://github.com/alphagov/env-sync-and-backup/tree/master/jobs) of the jobs for more information about how they work.
| Update the docs to use the word "Copy" | Update the docs to use the word "Copy"
The jobs that perform the data sync are called "Copy ...." if you
search the manual with the name of the job you'd never find this page
because the word "copy" doesn't appear anywhere.
| Markdown | mit | alphagov/govuk-developer-docs,alphagov/govuk-developer-docs,alphagov/govuk-developer-docs,alphagov/govuk-developer-docs | markdown | ## Code Before:
---
owner_slack: "#2ndline"
title: Data sync
section: Icinga alerts
layout: manual_layout
parent: "/manual.html"
old_path_in_opsmanual: "../opsmanual/2nd-line/alerts/data-sync.md"
last_reviewed_on: 2017-02-17
review_in: 6 months
---
> **This page was imported from [the opsmanual on GitHub Enterprise](https://github.com/alphagov/govuk-legacy-opsmanual)**.
It hasn't been reviewed for accuracy yet.
[View history in old opsmanual](https://github.com/alphagov/govuk-legacy-opsmanual/tree/master/2nd-line/alerts/data-sync.md)
Data is synced from production to staging and integration every night.
Check the output of the production Jenkins job to see which part of
the data sync failed. It may be safe to re-run part of the sync.
## Instruction:
Update the docs to use the word "Copy"
The jobs that perform the data sync are called "Copy ...." if you
search the manual with the name of the job you'd never find this page
because the word "copy" doesn't appear anywhere.
## Code After:
---
owner_slack: "#2ndline"
title: Data sync
section: Icinga alerts
layout: manual_layout
parent: "/manual.html"
old_path_in_opsmanual: "../opsmanual/2nd-line/alerts/data-sync.md"
last_reviewed_on: 2017-07-25
review_in: 6 months
---
> **This page was imported from [the opsmanual on GitHub Enterprise](https://github.com/alphagov/govuk-legacy-opsmanual)**.
It hasn't been reviewed for accuracy yet.
[View history in old opsmanual](https://github.com/alphagov/govuk-legacy-opsmanual/tree/master/2nd-line/alerts/data-sync.md)
Data is synced from production to staging and integration every night.
Check the output of the production Jenkins job to see which part of
the data sync failed. It may be safe to re-run part of the sync.
The Jenkins jobs included in the sync are:
* Copy Data to Staging
* Copy Data to integration
* Copy Licensify Data to staging
See the [source code](https://github.com/alphagov/env-sync-and-backup/tree/master/jobs) of the jobs for more information about how they work.
|
3d3c335f384074cb0716f6d41acdad872a030567 | docker-compose.yml | docker-compose.yml | version: '3'
services:
pybossa:
build:
context: ./docker
dockerfile: ./Dockerfile
args:
- SOURCE_BRANCH="${SOURCE_BRANCH:-master}"
environment:
- PYBOSSA_SECRET=test-only-y8y2i4ISpfwEMhLjUfXH9e5qA8qJkKA31tuLakvy
- PYBOSSA_SESSION_SECRET=test-only-y8y2i4ISpfwEMhLjUfXH9e5qA8qJkKA31tuLakvy
- PYBOSSA_DATABASE_URL=postgresql://dfuser:test-only-RDkWWAG9jXMAFO8pXu6K@pybossa-db/dforce
- PYBOSSA_ITSDANGEROUS_SECRET=test-only-2R1ucOk8KxEeh5SCaXcWPt4oDf2sHQvpiyHAPoyX
- PYBOSSA_BRAND=Deciding Force
- PYBOSSA_TITLE=Deciding Force
- PYBOSSA_LOGO=textthresher_logo.png
depends_on:
- pybossa-db
ports:
- "8000:80"
pybossa-db:
image: postgres:9.4
environment:
- POSTGRES_USER=dfuser
- POSTGRES_PASSWORD=test-only-RDkWWAG9jXMAFO8pXu6K
- POSTGRES_DB=dforce
| version: '3'
services:
pybossa:
build:
context: ./docker
dockerfile: ./Dockerfile
args:
- SOURCE_BRANCH="${SOURCE_BRANCH:-master}"
environment:
- PYBOSSA_SECRET=test-only-y8y2i4ISpfwEMhLjUfXH9e5qA8qJkKA31tuLakvy
- PYBOSSA_SESSION_SECRET=test-only-y8y2i4ISpfwEMhLjUfXH9e5qA8qJkKA31tuLakvy
- PYBOSSA_DATABASE_URL=postgresql://dfuser:test-only-RDkWWAG9jXMAFO8pXu6K@pybossa-db/dforce
- PYBOSSA_ITSDANGEROUS_SECRET=test-only-2R1ucOk8KxEeh5SCaXcWPt4oDf2sHQvpiyHAPoyX
- PYBOSSA_BRAND=Deciding Force
- PYBOSSA_TITLE=Deciding Force
- PYBOSSA_LOGO=default_logo.svg
depends_on:
- pybossa-db
ports:
- "8000:80"
pybossa-db:
image: postgres:9.4
environment:
- POSTGRES_USER=dfuser
- POSTGRES_PASSWORD=test-only-RDkWWAG9jXMAFO8pXu6K
- POSTGRES_DB=dforce
| Revert to default Pybossa logo. | Revert to default Pybossa logo.
| YAML | apache-2.0 | Goodly/pybossa-build,Goodly/pybossa-build | yaml | ## Code Before:
version: '3'
services:
pybossa:
build:
context: ./docker
dockerfile: ./Dockerfile
args:
- SOURCE_BRANCH="${SOURCE_BRANCH:-master}"
environment:
- PYBOSSA_SECRET=test-only-y8y2i4ISpfwEMhLjUfXH9e5qA8qJkKA31tuLakvy
- PYBOSSA_SESSION_SECRET=test-only-y8y2i4ISpfwEMhLjUfXH9e5qA8qJkKA31tuLakvy
- PYBOSSA_DATABASE_URL=postgresql://dfuser:test-only-RDkWWAG9jXMAFO8pXu6K@pybossa-db/dforce
- PYBOSSA_ITSDANGEROUS_SECRET=test-only-2R1ucOk8KxEeh5SCaXcWPt4oDf2sHQvpiyHAPoyX
- PYBOSSA_BRAND=Deciding Force
- PYBOSSA_TITLE=Deciding Force
- PYBOSSA_LOGO=textthresher_logo.png
depends_on:
- pybossa-db
ports:
- "8000:80"
pybossa-db:
image: postgres:9.4
environment:
- POSTGRES_USER=dfuser
- POSTGRES_PASSWORD=test-only-RDkWWAG9jXMAFO8pXu6K
- POSTGRES_DB=dforce
## Instruction:
Revert to default Pybossa logo.
## Code After:
version: '3'
services:
pybossa:
build:
context: ./docker
dockerfile: ./Dockerfile
args:
- SOURCE_BRANCH="${SOURCE_BRANCH:-master}"
environment:
- PYBOSSA_SECRET=test-only-y8y2i4ISpfwEMhLjUfXH9e5qA8qJkKA31tuLakvy
- PYBOSSA_SESSION_SECRET=test-only-y8y2i4ISpfwEMhLjUfXH9e5qA8qJkKA31tuLakvy
- PYBOSSA_DATABASE_URL=postgresql://dfuser:test-only-RDkWWAG9jXMAFO8pXu6K@pybossa-db/dforce
- PYBOSSA_ITSDANGEROUS_SECRET=test-only-2R1ucOk8KxEeh5SCaXcWPt4oDf2sHQvpiyHAPoyX
- PYBOSSA_BRAND=Deciding Force
- PYBOSSA_TITLE=Deciding Force
- PYBOSSA_LOGO=default_logo.svg
depends_on:
- pybossa-db
ports:
- "8000:80"
pybossa-db:
image: postgres:9.4
environment:
- POSTGRES_USER=dfuser
- POSTGRES_PASSWORD=test-only-RDkWWAG9jXMAFO8pXu6K
- POSTGRES_DB=dforce
|
1a3ffe00bfdf8c61b4ff190beb2ee6a4e9db1412 | behave_django/environment.py | behave_django/environment.py | from django.core.management import call_command
from django.shortcuts import resolve_url
from behave_django.testcase import BehaveDjangoTestCase
def before_scenario(context, scenario):
# This is probably a hacky method of setting up the test case
# outside of a test runner. Suggestions are welcome. :)
context.test = BehaveDjangoTestCase()
context.test.setUpClass()
context.test()
# Load fixtures
if getattr(context, 'fixtures', None):
call_command('loaddata', *context.fixtures, verbosity=0)
context.base_url = context.test.live_server_url
def get_url(to=None, *args, **kwargs):
"""
URL helper attached to context with built-in reverse resolution as a
handy shortcut. Takes an absolute path, a view name, or a model
instance as an argument (as django.shortcuts.resolve_url). Examples::
context.get_url()
context.get_url('/absolute/url/here')
context.get_url('view-name')
context.get_url('view-name', 'with args', and='kwargs')
context.get_url(model_instance)
"""
return context.base_url + (
resolve_url(to, *args, **kwargs) if to else '')
context.get_url = get_url
def after_scenario(context, scenario):
context.test.tearDownClass()
del context.test
| from django.core.management import call_command
try:
from django.shortcuts import resolve_url
except ImportError:
import warnings
warnings.warn("URL path supported only in get_url() with Django < 1.5")
resolve_url = lambda to, *args, **kwargs: to
from behave_django.testcase import BehaveDjangoTestCase
def before_scenario(context, scenario):
# This is probably a hacky method of setting up the test case
# outside of a test runner. Suggestions are welcome. :)
context.test = BehaveDjangoTestCase()
context.test.setUpClass()
context.test()
# Load fixtures
if getattr(context, 'fixtures', None):
call_command('loaddata', *context.fixtures, verbosity=0)
context.base_url = context.test.live_server_url
def get_url(to=None, *args, **kwargs):
"""
URL helper attached to context with built-in reverse resolution as a
handy shortcut. Takes an absolute path, a view name, or a model
instance as an argument (as django.shortcuts.resolve_url). Examples::
context.get_url()
context.get_url('/absolute/url/here')
context.get_url('view-name')
context.get_url('view-name', 'with args', and='kwargs')
context.get_url(model_instance)
"""
return context.base_url + (
resolve_url(to, *args, **kwargs) if to else '')
context.get_url = get_url
def after_scenario(context, scenario):
context.test.tearDownClass()
del context.test
| Support Django < 1.5 with a simplified version of `get_url()` | Support Django < 1.5 with a simplified version of `get_url()`
| Python | mit | nikolas/behave-django,nikolas/behave-django,behave/behave-django,bittner/behave-django,bittner/behave-django,behave/behave-django | python | ## Code Before:
from django.core.management import call_command
from django.shortcuts import resolve_url
from behave_django.testcase import BehaveDjangoTestCase
def before_scenario(context, scenario):
# This is probably a hacky method of setting up the test case
# outside of a test runner. Suggestions are welcome. :)
context.test = BehaveDjangoTestCase()
context.test.setUpClass()
context.test()
# Load fixtures
if getattr(context, 'fixtures', None):
call_command('loaddata', *context.fixtures, verbosity=0)
context.base_url = context.test.live_server_url
def get_url(to=None, *args, **kwargs):
"""
URL helper attached to context with built-in reverse resolution as a
handy shortcut. Takes an absolute path, a view name, or a model
instance as an argument (as django.shortcuts.resolve_url). Examples::
context.get_url()
context.get_url('/absolute/url/here')
context.get_url('view-name')
context.get_url('view-name', 'with args', and='kwargs')
context.get_url(model_instance)
"""
return context.base_url + (
resolve_url(to, *args, **kwargs) if to else '')
context.get_url = get_url
def after_scenario(context, scenario):
context.test.tearDownClass()
del context.test
## Instruction:
Support Django < 1.5 with a simplified version of `get_url()`
## Code After:
from django.core.management import call_command
try:
from django.shortcuts import resolve_url
except ImportError:
import warnings
warnings.warn("URL path supported only in get_url() with Django < 1.5")
resolve_url = lambda to, *args, **kwargs: to
from behave_django.testcase import BehaveDjangoTestCase
def before_scenario(context, scenario):
# This is probably a hacky method of setting up the test case
# outside of a test runner. Suggestions are welcome. :)
context.test = BehaveDjangoTestCase()
context.test.setUpClass()
context.test()
# Load fixtures
if getattr(context, 'fixtures', None):
call_command('loaddata', *context.fixtures, verbosity=0)
context.base_url = context.test.live_server_url
def get_url(to=None, *args, **kwargs):
"""
URL helper attached to context with built-in reverse resolution as a
handy shortcut. Takes an absolute path, a view name, or a model
instance as an argument (as django.shortcuts.resolve_url). Examples::
context.get_url()
context.get_url('/absolute/url/here')
context.get_url('view-name')
context.get_url('view-name', 'with args', and='kwargs')
context.get_url(model_instance)
"""
return context.base_url + (
resolve_url(to, *args, **kwargs) if to else '')
context.get_url = get_url
def after_scenario(context, scenario):
context.test.tearDownClass()
del context.test
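The `try/except ImportError` fallback in the new code above is a general version-compatibility pattern, not specific to Django. A minimal, framework-free sketch of the same idea (the missing module name and the base URL here are made up for illustration; the real code imports `django.shortcuts.resolve_url` and reads the live server URL from the test case):

```python
import warnings

try:
    # Stands in for `from django.shortcuts import resolve_url`; this module
    # name is deliberately nonexistent so the fallback branch runs.
    from hypothetical_missing_module import resolve_url  # type: ignore
except ImportError:
    warnings.warn("resolve_url unavailable; only plain URL paths are supported")
    resolve_url = lambda to, *args, **kwargs: to

base_url = "http://localhost:8081"  # would come from the live test server

def get_url(to=None, *args, **kwargs):
    # Same shape as the helper above: base URL plus an optionally resolved path.
    return base_url + (resolve_url(to, *args, **kwargs) if to else "")

print(get_url("/admin/"))  # -> http://localhost:8081/admin/
print(get_url())           # -> http://localhost:8081
```

Warning once at import time and degrading gracefully, as the commit does, is one design choice; raising immediately on first use of an unsupported argument type would be the stricter alternative.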
|
b955ccf31504d29bb5a6379caf27d16bb21a3d04 | packages/common/src/server/utils/cleanGlobPatterns.ts | packages/common/src/server/utils/cleanGlobPatterns.ts | import * as Path from "path";
export function cleanGlobPatterns(files: string | string[], excludes: string[]): string[] {
excludes = excludes.map((s: string) => "!" + s.replace(/!/gi, ""));
return []
.concat(files as any)
.map((file: string) => {
if (!require.extensions[".ts"] && !process.env["TS_TEST"]) {
file = file.replace(/\.ts$/i, ".js");
}
return Path.resolve(file);
})
.concat(excludes as any);
}
| import * as Path from "path";
export function cleanGlobPatterns(files: string | string[], excludes: string[]): string[] {
excludes = excludes.map((s: string) => "!" + s.replace(/!/gi, "").replace(/\\/g, "/"));
return []
.concat(files as any)
.map((file: string) => {
if (!require.extensions[".ts"] && !process.env["TS_TEST"]) {
file = file.replace(/\.ts$/i, ".js");
}
return Path.resolve(file).replace(/\\/g, "/");
})
.concat(excludes as any);
}
| Fix windows paths in scanner | fix(common): Fix windows paths in scanner
Closes: #709
| TypeScript | mit | Romakita/ts-express-decorators,Romakita/ts-express-decorators,Romakita/ts-express-decorators,Romakita/ts-express-decorators | typescript | ## Code Before:
import * as Path from "path";
export function cleanGlobPatterns(files: string | string[], excludes: string[]): string[] {
excludes = excludes.map((s: string) => "!" + s.replace(/!/gi, ""));
return []
.concat(files as any)
.map((file: string) => {
if (!require.extensions[".ts"] && !process.env["TS_TEST"]) {
file = file.replace(/\.ts$/i, ".js");
}
return Path.resolve(file);
})
.concat(excludes as any);
}
## Instruction:
fix(common): Fix windows paths in scanner
Closes: #709
## Code After:
import * as Path from "path";
export function cleanGlobPatterns(files: string | string[], excludes: string[]): string[] {
excludes = excludes.map((s: string) => "!" + s.replace(/!/gi, "").replace(/\\/g, "/"));
return []
.concat(files as any)
.map((file: string) => {
if (!require.extensions[".ts"] && !process.env["TS_TEST"]) {
file = file.replace(/\.ts$/i, ".js");
}
return Path.resolve(file).replace(/\\/g, "/");
})
.concat(excludes as any);
}
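The fix above works because glob patterns match on forward slashes, while `Path.resolve` on Windows yields backslash-separated paths. A rough Python equivalent of the same cleaning step (function name and sample paths are illustrative, not part of the project):

```python
def clean_glob_patterns(files, excludes):
    # Excludes get a "!" prefix and any embedded "!" stripped, then every
    # path has backslashes normalized to forward slashes for glob matching.
    cleaned_excludes = ["!" + e.replace("!", "").replace("\\", "/") for e in excludes]
    cleaned_files = [f.replace("\\", "/") for f in files]
    return cleaned_files + cleaned_excludes

print(clean_glob_patterns(["src\\**\\*.ts"], ["!node_modules\\**"]))
# -> ['src/**/*.ts', '!node_modules/**']
```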
|
6207567ab043d35a9072138ec6273f2f1ceb6303 | app/views/courses/export_grades.text.erb | app/views/courses/export_grades.text.erb | "Assignment","Student","Submitted?","Points Available","Automatic Score","Teacher Score","Days Late","Late Penalty","Effective Score"
<% @subs.each do |sub| %>
<% if sub.new_record? %>
"<%= sub.assignment.name %>","<%= sub.user.name %>","no",<%= sub.assignment.points_available %>,0,0,0,0,0
<% else %>
"<%= sub.assignment.name %>","<%= sub.user.name %>","yes",<%= sub.assignment.points_available %>,<%= sub.raw_score %>,<%= sub.teacher_score %>,<%= sub.days_late %>,<%= sub.late_penalty %>,<%= sub.score %>
<% end %>
<% end %> | "Assignment","Student","Last Name","Submitted?","Points Available","Automatic Score","Teacher Score","Days Late","Late Penalty","Effective Score"
<% @subs.each do |sub| %>
<% if sub.new_record? %>
"<%= sub.assignment.name %>","<%= sub.user.name %>","<%= sub.user.surname %>","no",<%= sub.assignment.points_available %>,0,0,0,0,0
<% else %>
"<%= sub.assignment.name %>","<%= sub.user.name %>","<%= sub.user.surname %>","yes",<%= sub.assignment.points_available %>,<%= sub.raw_score %>,<%= sub.teacher_score %>,<%= sub.days_late %>,<%= sub.late_penalty %>,<%= sub.score %>
<% end %>
<% end %>
| Add a "last name" field to the course grade CSV. | Add a "last name" field to the course grade CSV.
| HTML+ERB | agpl-3.0 | marksherman/bottlenose,marksherman/bottlenose,marksherman/bottlenose,marksherman/bottlenose,marksherman/bottlenose,marksherman/bottlenose | html+erb | ## Code Before:
"Assignment","Student","Submitted?","Points Available","Automatic Score","Teacher Score","Days Late","Late Penalty","Effective Score"
<% @subs.each do |sub| %>
<% if sub.new_record? %>
"<%= sub.assignment.name %>","<%= sub.user.name %>","no",<%= sub.assignment.points_available %>,0,0,0,0,0
<% else %>
"<%= sub.assignment.name %>","<%= sub.user.name %>","yes",<%= sub.assignment.points_available %>,<%= sub.raw_score %>,<%= sub.teacher_score %>,<%= sub.days_late %>,<%= sub.late_penalty %>,<%= sub.score %>
<% end %>
<% end %>
## Instruction:
Add a "last name" field to the course grade CSV.
## Code After:
"Assignment","Student","Last Name","Submitted?","Points Available","Automatic Score","Teacher Score","Days Late","Late Penalty","Effective Score"
<% @subs.each do |sub| %>
<% if sub.new_record? %>
"<%= sub.assignment.name %>","<%= sub.user.name %>","<%= sub.user.surname %>","no",<%= sub.assignment.points_available %>,0,0,0,0,0
<% else %>
"<%= sub.assignment.name %>","<%= sub.user.name %>","<%= sub.user.surname %>","yes",<%= sub.assignment.points_available %>,<%= sub.raw_score %>,<%= sub.teacher_score %>,<%= sub.days_late %>,<%= sub.late_penalty %>,<%= sub.score %>
<% end %>
<% end %>
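Hand-quoting every field in a template, as the ERB above does, becomes fragile once a name contains a comma or a quote. For comparison, Python's standard `csv` module applies quoting automatically only where needed (the rows below are invented sample data, not from the application):

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Assignment", "Student", "Last Name", "Submitted?", "Effective Score"])
writer.writerow(["Homework 1", "Ada", "Lovelace, Countess", "yes", 95])  # comma gets quoted
print(buf.getvalue())
```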
|
6c70c61cc3373c0fea65b3632d46eb32ee267ef2 | buildThemeSite.sh | buildThemeSite.sh | git config user.name "hugo-themebuilder"
git config user.email "[email protected]"
git submodule update --remote --merge
if ! git diff-index --quiet HEAD --; then
git commit -am "Update themes"
git remote add origin [email protected]:spf13/hugoThemes.git
git push origin master
if [ $? -ne 0 ]; then
exit 1
fi
fi
echo "Hugo env:"
hugo env
cd _script
BASEURL=${BASEURL:-http://localhost:1313}
./generateThemeSite.sh ${BASEURL}
code=$?
if [ $code -ne 0 ]; then
echo "generate theme site failed: Exit status $code"
exit 1
fi
echo "Building site to public with baseURL ${BASEURL}..."
hugo --quiet -s hugoThemeSite/themeSite -b ${BASEURL}
code=$?
if [ $code -ne 0 ]; then
echo "build theme site failed: Exit status $code"
exit 1
fi
cd ..
echo "Done!" |
echo "Hugo env:"
hugo env
cd _script
BASEURL=${BASEURL:-http://localhost:1313}
./generateThemeSite.sh ${BASEURL}
code=$?
if [ $code -ne 0 ]; then
echo "generate theme site failed: Exit status $code"
exit 1
fi
echo "Building site to public with baseURL ${BASEURL}..."
hugo --quiet -s hugoThemeSite/themeSite -b ${BASEURL}
code=$?
if [ $code -ne 0 ]; then
echo "build theme site failed: Exit status $code"
exit 1
fi
cd ..
echo "Done!" | Revert "Reintroduce: Update submodules on build" | Revert "Reintroduce: Update submodules on build"
This reverts commit 7a8965d8ef43a36995602d63f77f5e8ea0e21c69.
| Shell | mit | Nytoca/hugoThemes,spf13/hugoThemes | shell | ## Code Before:
git config user.name "hugo-themebuilder"
git config user.email "[email protected]"
git submodule update --remote --merge
if ! git diff-index --quiet HEAD --; then
git commit -am "Update themes"
git remote add origin [email protected]:spf13/hugoThemes.git
git push origin master
if [ $? -ne 0 ]; then
exit 1
fi
fi
echo "Hugo env:"
hugo env
cd _script
BASEURL=${BASEURL:-http://localhost:1313}
./generateThemeSite.sh ${BASEURL}
code=$?
if [ $code -ne 0 ]; then
echo "generate theme site failed: Exit status $code"
exit 1
fi
echo "Building site to public with baseURL ${BASEURL}..."
hugo --quiet -s hugoThemeSite/themeSite -b ${BASEURL}
code=$?
if [ $code -ne 0 ]; then
echo "build theme site failed: Exit status $code"
exit 1
fi
cd ..
echo "Done!"
## Instruction:
Revert "Reintroduce: Update submodules on build"
This reverts commit 7a8965d8ef43a36995602d63f77f5e8ea0e21c69.
## Code After:
echo "Hugo env:"
hugo env
cd _script
BASEURL=${BASEURL:-http://localhost:1313}
./generateThemeSite.sh ${BASEURL}
code=$?
if [ $code -ne 0 ]; then
echo "generate theme site failed: Exit status $code"
exit 1
fi
echo "Building site to public with baseURL ${BASEURL}..."
hugo --quiet -s hugoThemeSite/themeSite -b ${BASEURL}
code=$?
if [ $code -ne 0 ]; then
echo "build theme site failed: Exit status $code"
exit 1
fi
cd ..
echo "Done!" |
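Both versions of the script repeat the `code=$?; if [ $code -ne 0 ]` guard after each build step. The same fail-fast pattern translated to Python, for comparison (the command string and step name are placeholders):

```python
import subprocess

def run_step(cmd, description):
    # Fail fast with a message naming the step, like the shell guard above.
    result = subprocess.run(cmd, shell=True)
    if result.returncode != 0:
        raise SystemExit(f"{description} failed: Exit status {result.returncode}")

run_step("true", "no-op step")  # exits the process only on failure
```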
07f7d52d3583d7d12113357f7fcf554933f8805b | .github/issue_template.md | .github/issue_template.md |
* MessageKit Version:
* iOS Version(s):
* Swift Version:
* Devices/Simulators:
* Reproducible in ChatExample? (Yes/No):
## What happened?
## What did you expect to happen?
| <!--
Hi, thanks so much for opening an issue! 🤗
To better pinpoint (and solve) the issue you're experiencing, we could use some information on your behalf.
Please let us know the following things:
- What version of MessageKit are you using?
- What version of iOS are you running on?
- What version of Swift are you running on?
- What device(s) are you testing on? Are these simulators?
- Is the issue you're experiencing reproducable in the example app?
In some cases it can be really helpful to provide a short example of your code.
If so, please wrap these code blocks in backticks, like this:
```swift
*your code goes here*
```
The code will automatically get its syntax highlighted, and doesn't need to be indented 4 spaces to be shown as code.
When referencing a dependency manager-related issue (think CocoaPods, Carthage, SwiftPM), please add its configuration file and version to the issue.
It would be helpful to put the contents in a code block too, using ```ruby for CocoaPods and ```swift for SwiftPM.
Also please make sure your title describes your problem well. Questions end with a question mark.
Thanks again, and let us know if there are any other questions you might have.
-->
| Rewrite issue template to a HTML comment | Rewrite issue template to a HTML comment | Markdown | mit | MessageKit/MessageKit,MessageKit/MessageKit,MessageKit/MessageKit | markdown | ## Code Before:
* MessageKit Version:
* iOS Version(s):
* Swift Version:
* Devices/Simulators:
* Reproducible in ChatExample? (Yes/No):
## What happened?
## What did you expect to happen?
## Instruction:
Rewrite issue template to a HTML comment
## Code After:
<!--
Hi, thanks so much for opening an issue! 🤗
To better pinpoint (and solve) the issue you're experiencing, we could use some information on your behalf.
Please let us know the following things:
- What version of MessageKit are you using?
- What version of iOS are you running on?
- What version of Swift are you running on?
- What device(s) are you testing on? Are these simulators?
- Is the issue you're experiencing reproducable in the example app?
In some cases it can be really helpful to provide a short example of your code.
If so, please wrap these code blocks in backticks, like this:
```swift
*your code goes here*
```
The code will automatically get its syntax highlighted, and doesn't need to be indented 4 spaces to be shown as code.
When referencing a dependency manager-related issue (think CocoaPods, Carthage, SwiftPM), please add its configuration file and version to the issue.
It would be helpful to put the contents in a code block too, using ```ruby for CocoaPods and ```swift for SwiftPM.
Also please make sure your title describes your problem well. Questions end with a question mark.
Thanks again, and let us know if there are any other questions you might have.
-->
|
9993d1ecb1de8c694b503df67c85db4af7be6c43 | requirements-dev.txt | requirements-dev.txt | coverage[toml]
flake8
flake8-tabs==2.2.2
fuzzywuzzy-stubs
isort==5.7.0
mkdocs==1.1.2
mkdocs-material==6.2.7
mkdocs-material-extensions==1.0.1
mkdocstrings==0.14.0
mypy
pre-commit
PyInstaller==4.1
git+https://github.com/nstockton/pytkdocs.git@tab-indentation-parser#egg=pytkdocs
git+https://github.com/nstockton/tan.git@personal#egg=tan
types-certifi
types-jsonschema
| coverage[toml]
flake8
flake8-tabs==2.2.2
fuzzywuzzy-stubs
isort==5.7.0
mkdocs==1.2.2
mkdocs-material==7.2.5
mkdocstrings==0.14.0
mypy
pre-commit
PyInstaller==4.1
git+https://github.com/nstockton/pytkdocs.git@tab-indentation-parser#egg=pytkdocs
git+https://github.com/nstockton/tan.git@personal#egg=tan
types-certifi
types-jsonschema
| Update mkdocs and mkdocs-material requirements. | Update mkdocs and mkdocs-material requirements.
| Text | mpl-2.0 | nstockton/mapperproxy-mume,nstockton/mapperproxy-mume | text | ## Code Before:
coverage[toml]
flake8
flake8-tabs==2.2.2
fuzzywuzzy-stubs
isort==5.7.0
mkdocs==1.1.2
mkdocs-material==6.2.7
mkdocs-material-extensions==1.0.1
mkdocstrings==0.14.0
mypy
pre-commit
PyInstaller==4.1
git+https://github.com/nstockton/pytkdocs.git@tab-indentation-parser#egg=pytkdocs
git+https://github.com/nstockton/tan.git@personal#egg=tan
types-certifi
types-jsonschema
## Instruction:
Update mkdocs and mkdocs-material requirements.
## Code After:
coverage[toml]
flake8
flake8-tabs==2.2.2
fuzzywuzzy-stubs
isort==5.7.0
mkdocs==1.2.2
mkdocs-material==7.2.5
mkdocstrings==0.14.0
mypy
pre-commit
PyInstaller==4.1
git+https://github.com/nstockton/pytkdocs.git@tab-indentation-parser#egg=pytkdocs
git+https://github.com/nstockton/tan.git@personal#egg=tan
types-certifi
types-jsonschema
|
8d6b71d8fc65734faeadc679e59f028e51d6fecf | Sprite.cpp | Sprite.cpp |
Sprite::Sprite() : Node()
{
// Image
m_sprite_path = "";
}
Sprite::Sprite(std::string p_sprite_path) : Sprite()
{
m_sprite_path = p_sprite_path;
}
Sprite::~Sprite() {}
std::string Sprite::getSpritePath()
{
return m_sprite_path;
} |
Sprite::Sprite() : Node()
{
// Image
m_sprite_path = "";
}
Sprite::Sprite(std::string p_sprite_path) : Node()
{
// Image
m_sprite_path = p_sprite_path;
}
Sprite::~Sprite() {}
std::string Sprite::getSpritePath()
{
return m_sprite_path;
} | Use more useful ancestor constructor | Use more useful ancestor constructor
| C++ | mit | maksym-gryb/SDL_Wrapper,maksym-gryb/SDL_Wrapper | c++ | ## Code Before:
Sprite::Sprite() : Node()
{
// Image
m_sprite_path = "";
}
Sprite::Sprite(std::string p_sprite_path) : Sprite()
{
m_sprite_path = p_sprite_path;
}
Sprite::~Sprite() {}
std::string Sprite::getSpritePath()
{
return m_sprite_path;
}
## Instruction:
Use more useful ancestor constructor
## Code After:
Sprite::Sprite() : Node()
{
// Image
m_sprite_path = "";
}
Sprite::Sprite(std::string p_sprite_path) : Node()
{
// Image
m_sprite_path = p_sprite_path;
}
Sprite::~Sprite() {}
std::string Sprite::getSpritePath()
{
return m_sprite_path;
} |
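The change above replaces delegation through `Sprite()` with direct initialization of the `Node` ancestor, accepting a duplicated `// Image` setup block in exchange. In Python the same trade-off largely disappears, since a default argument lets one initializer cover both cases while still calling the ancestor directly (class bodies here are hypothetical):

```python
class Node:
    def __init__(self):
        self.children = []

class Sprite(Node):
    def __init__(self, sprite_path=""):
        # Initialize the ancestor directly rather than chaining through
        # another Sprite constructor.
        super().__init__()
        self.sprite_path = sprite_path

    def get_sprite_path(self):
        return self.sprite_path

s = Sprite("hero.png")
print(s.get_sprite_path(), s.children)  # -> hero.png []
```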
67b61dc88f6395da82d86e4c14674ac574e9c6fa | content/20160503_Announcing_IPCC_ROOT.md | content/20160503_Announcing_IPCC_ROOT.md | Title: Announcing the Princeton IPCC ROOT project
date: 2016-05-03 10:37
Slug: Announcing_IPCC_ROOT
Category: Blog
Tags: news
Authors: Peter Elmer
status: published
Intel has funded an Intel Parallel Computing Center (IPCC) at Princeton University who main target is the modernization of critical libraries in the [ROOT](https://root.cern.ch/) data analysis framework. ROOT is a data analysis framework which enables big data processing, statistical analysis, visualization and storage in highly efficient manner. ROOT is open source software. It was begun nearly 20 years ago as part of the preparation for the scientific program of the [Large Hadron Collider](http://home.web.cern.ch/topics/large-hadron-collider) at the [European Laboratory for Particle Physics (CERN)](http://home.web.cern.ch/), but has been widely adopted in particle and nuclear physics. It is used not only at CERN and by external scientists working at CERN, but also for experiments at many other physics laboratories around the world. It is used daily by an estimated community of 20000 scientists around the world, including non-physics applications ranging from finance to genomics.
The compute needs of the LHC program are very large and will grow in a way that there is a significant pressure to optimize software codes like ROOT. The focus of the IPCC ROOT project is the modernization of the ROOT Math and I/O libraries.
| Title: Announcing the Princeton IPCC ROOT project
date: 2016-05-03 10:37
Slug: Announcing_IPCC_ROOT
Category: Blog
Tags: news
Authors: Peter Elmer
status: published
Intel has funded an [Intel Parallel Computing Center (IPCC) at Princeton University](https://software.intel.com/en-us/articles/intel-parallel-computing-center-at-princeton-university-high-energy-experiment) who main target is the modernization of critical libraries in the [ROOT](https://root.cern.ch/) data analysis framework. ROOT is a data analysis framework which enables big data processing, statistical analysis, visualization and storage in highly efficient manner. ROOT is open source software. It was begun nearly 20 years ago as part of the preparation for the scientific program of the [Large Hadron Collider](http://home.web.cern.ch/topics/large-hadron-collider) at the [European Laboratory for Particle Physics (CERN)](http://home.web.cern.ch/), but has been widely adopted in particle and nuclear physics. It is used not only at CERN and by external scientists working at CERN, but also for experiments at many other physics laboratories around the world. It is used daily by an estimated community of 20000 scientists around the world, including non-physics applications ranging from finance to genomics.
The compute needs of the LHC program are very large and will grow in a way that there is a significant pressure to optimize software codes like ROOT. The focus of the IPCC ROOT project is the modernization of the ROOT Math and I/O libraries.
| Add link to Intel page about IPCC ROOT | Add link to Intel page about IPCC ROOT
| Markdown | mit | ipcc-root/ipcc-root.github.io-source,ipcc-root/ipcc-root.github.io-source,ipcc-root/ipcc-root.github.io-source | markdown | ## Code Before:
Title: Announcing the Princeton IPCC ROOT project
date: 2016-05-03 10:37
Slug: Announcing_IPCC_ROOT
Category: Blog
Tags: news
Authors: Peter Elmer
status: published
Intel has funded an Intel Parallel Computing Center (IPCC) at Princeton University who main target is the modernization of critical libraries in the [ROOT](https://root.cern.ch/) data analysis framework. ROOT is a data analysis framework which enables big data processing, statistical analysis, visualization and storage in highly efficient manner. ROOT is open source software. It was begun nearly 20 years ago as part of the preparation for the scientific program of the [Large Hadron Collider](http://home.web.cern.ch/topics/large-hadron-collider) at the [European Laboratory for Particle Physics (CERN)](http://home.web.cern.ch/), but has been widely adopted in particle and nuclear physics. It is used not only at CERN and by external scientists working at CERN, but also for experiments at many other physics laboratories around the world. It is used daily by an estimated community of 20000 scientists around the world, including non-physics applications ranging from finance to genomics.
The compute needs of the LHC program are very large and will grow in a way that there is a significant pressure to optimize software codes like ROOT. The focus of the IPCC ROOT project is the modernization of the ROOT Math and I/O libraries.
## Instruction:
Add link to Intel page about IPCC ROOT
## Code After:
Title: Announcing the Princeton IPCC ROOT project
date: 2016-05-03 10:37
Slug: Announcing_IPCC_ROOT
Category: Blog
Tags: news
Authors: Peter Elmer
status: published
Intel has funded an [Intel Parallel Computing Center (IPCC) at Princeton University](https://software.intel.com/en-us/articles/intel-parallel-computing-center-at-princeton-university-high-energy-experiment) who main target is the modernization of critical libraries in the [ROOT](https://root.cern.ch/) data analysis framework. ROOT is a data analysis framework which enables big data processing, statistical analysis, visualization and storage in highly efficient manner. ROOT is open source software. It was begun nearly 20 years ago as part of the preparation for the scientific program of the [Large Hadron Collider](http://home.web.cern.ch/topics/large-hadron-collider) at the [European Laboratory for Particle Physics (CERN)](http://home.web.cern.ch/), but has been widely adopted in particle and nuclear physics. It is used not only at CERN and by external scientists working at CERN, but also for experiments at many other physics laboratories around the world. It is used daily by an estimated community of 20000 scientists around the world, including non-physics applications ranging from finance to genomics.
The compute needs of the LHC program are very large and will grow in a way that there is a significant pressure to optimize software codes like ROOT. The focus of the IPCC ROOT project is the modernization of the ROOT Math and I/O libraries.
|