content_type
stringclasses 8
values | main_lang
stringclasses 7
values | message
stringlengths 1
50
| sha
stringlengths 40
40
| patch
stringlengths 52
962k
| file_count
int64 1
300
|
---|---|---|---|---|---|
Java | Java | initialize fabricxx c++ code and register rootview | 596f17efdac234193afba25bfed6035765fb1a74 | <ide><path>ReactAndroid/src/main/java/com/facebook/react/fabric/FabricBinder.java
<ide>
<ide> package com.facebook.react.fabric;
<ide>
<del>public interface FabricBinder {
<add>public interface FabricBinder<T extends FabricBinding> {
<ide>
<del> void setBinding(FabricBinding binding);
<add> void setBinding(T binding);
<ide>
<ide> } | 1 |
Javascript | Javascript | fix a typo | fe3fea7003890a9c23de1508a8ef93ce9dfda45b | <ide><path>lib/webpack.js
<ide> const webpack = /** @type {WebpackFunctionSingle & WebpackFunctionMulti} */ (
<ide> if (watch) {
<ide> util.deprecate(
<ide> () => {},
<del> "A 'callback' argument need to be provided to the 'webpack(options, callback)' function when the 'watch' option is set. There is no way to handle the 'watch' option without a callback.",
<add> "A 'callback' argument needs to be provided to the 'webpack(options, callback)' function when the 'watch' option is set. There is no way to handle the 'watch' option without a callback.",
<ide> "DEP_WEBPACK_WATCH_WITHOUT_CALLBACK"
<ide> )();
<ide> } | 1 |
Python | Python | remove forgotten print | 5bc397cc85fa1cbc16967b7ca0fede41f90896bc | <ide><path>libcloud/compute/drivers/packet.py
<ide> def list_images(self):
<ide>
<ide> def list_sizes(self):
<ide> data = self.connection.request('/plans').object['plans']
<del> print data
<ide> return [self._to_size(size) for size in data if
<ide> size.get('line') == 'baremetal']
<ide> | 1 |
Python | Python | fix missing block when there is no failure | 5f06a09b9f3f05b4860f11bbbe22861923b49d81 | <ide><path>utils/notification_service_doc_tests.py
<ide> def payload(self) -> str:
<ide> if self.n_failures > 0:
<ide> blocks.extend([self.category_failures])
<ide>
<del> if self.no_failures == 0:
<add> if self.n_failures == 0:
<ide> blocks.append(self.no_failures)
<ide>
<ide> return json.dumps(blocks) | 1 |
Ruby | Ruby | add test-cases for primary-key-less-views. closes | ec6eee5db0994f92ac2c3c6d2a922c6edd35b2b1 | <ide><path>activerecord/test/cases/view_test.rb
<ide> def test_column_definitions
<ide> ["name", :string],
<ide> ["status", :integer]], Ebook.columns.map { |c| [c.name, c.type] })
<ide> end
<add>
<add> def test_attributes
<add> assert_equal({"id" => 2, "name" => "Ruby for Rails", "status" => 0},
<add> Ebook.first.attributes)
<add> end
<add>end
<add>
<add>class ViewWithoutPrimaryKeyTest < ActiveRecord::TestCase
<add> fixtures :books
<add>
<add> class Paperback < ActiveRecord::Base; end
<add>
<add> setup do
<add> @connection = ActiveRecord::Base.connection
<add> @connection.execute <<-SQL
<add> CREATE VIEW paperbacks
<add> AS SELECT name, status FROM books WHERE format = 'paperback'
<add> SQL
<add> end
<add>
<add> teardown do
<add> @connection.execute "DROP VIEW IF EXISTS paperbacks"
<add> end
<add>
<add> def test_reading
<add> books = Paperback.all
<add> assert_equal ["Agile Web Development with Rails"], books.map(&:name)
<add> end
<add>
<add> def test_table_exists
<add> view_name = Paperback.table_name
<add> assert @connection.table_exists?(view_name), "'#{view_name}' table should exist"
<add> end
<add>
<add> def test_column_definitions
<add> assert_equal([["name", :string],
<add> ["status", :integer]], Paperback.columns.map { |c| [c.name, c.type] })
<add> end
<add>
<add> def test_attributes
<add> assert_equal({"name" => "Agile Web Development with Rails", "status" => 0},
<add> Paperback.first.attributes)
<add> end
<ide> end
<ide> end | 1 |
Javascript | Javascript | update instagram infolink | 61ae070444d2bcda182bc2dbabb266f38a37c78d | <ide><path>website/src/react-native/showcase.js
<ide> var pinned = [
<ide> icon: 'http://a4.mzstatic.com/us/r30/Purple62/v4/1f/8d/f9/1f8df910-8ec7-3b8e-0104-d44e869f4d65/icon175x175.jpeg',
<ide> linkAppStore: 'https://itunes.apple.com/app/instagram/id389801252?pt=428156&ct=igweb.unifiedHome.badge&mt=8',
<ide> linkPlayStore: 'https://play.google.com/store/apps/details?id=com.instagram.android&referrer=utm_source%3Dinstagramweb%26utm_campaign%3DunifiedHome%26utm_medium%3Dbadge',
<del> infoLink: '',
<del> infoTitle: '',
<add> infoLink: 'https://engineering.instagram.com/react-native-at-instagram-dd828a9a90c7#.3h4wir4zr',
<add> infoTitle: 'React Native at Instagram',
<ide> defaultLink: 'https://www.instagram.com/',
<ide> },
<ide> { | 1 |
Python | Python | add license header and fix flake8 | d944a5a069c62c16aa527b29df24b49463df3969 | <ide><path>contrib/scrape-azure-prices.py
<add>#!/usr/bin/env python
<add>#
<add># Licensed to the Apache Software Foundation (ASF) under one
<add># or more contributor license agreements. See the NOTICE file
<add># distributed with this work for additional information
<add># regarding copyright ownership. The ASF licenses this file
<add># to you under the Apache License, Version 2.0 (the
<add># "License"); you may not use this file except in compliance
<add># with the License. You may obtain a copy of the License at
<add>#
<add># http://www.apache.org/licenses/LICENSE-2.0
<add>#
<add># Unless required by applicable law or agreed to in writing,
<add># software distributed under the License is distributed on an
<add># "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
<add># KIND, either express or implied. See the License for the
<add># specific language governing permissions and limitations
<add># under the License.
<add>
<ide> import json
<ide> import time
<ide> import os
<ide> def get_azure_prices():
<ide> continue
<ide> for reg, price in value['prices']['perhour'].items():
<ide> region = region_map[reg].lower().replace(" ", "")
<del> region = region.replace("(public)", "") # for germany
<del> region = region.replace("(sovereign)", "") # for germany
<add> region = region.replace("(public)", "") # for germany
<add> region = region.replace("(sovereign)", "") # for germany
<ide> prices[region] = price['value']
<ide> result[size_raw[0]][size] = prices
<ide> | 1 |
Java | Java | create cxx binding | fd29878a8b64f645c3b6d92eab361f0b7b4d07d1 | <ide><path>ReactAndroid/src/main/java/com/facebook/react/fabric/FabricBinder.java
<add>/**
<add> * Copyright (c) 2014-present, Facebook, Inc.
<add> *
<add> * This source code is licensed under the MIT license found in the
<add> * LICENSE file in the root directory of this source tree.
<add> */
<add>
<add>package com.facebook.react.fabric;
<add>
<add>public interface FabricBinder {
<add>
<add> void setBinding(FabricBinding binding);
<add>
<add>}
<ide><path>ReactAndroid/src/main/java/com/facebook/react/fabric/FabricBinding.java
<ide>
<ide> import com.facebook.react.bridge.JavaScriptContextHolder;
<ide> import com.facebook.react.bridge.NativeMap;
<add>import com.facebook.react.bridge.UIManager;
<ide>
<ide> public interface FabricBinding {
<ide>
<del> void installFabric(JavaScriptContextHolder jsContext, FabricUIManager fabricModule);
<add> void installFabric(JavaScriptContextHolder jsContext, FabricBinder fabricBinder);
<ide>
<ide> void releaseEventTarget(long jsContextNativePointer, long eventTargetPointer);
<ide>
<ide><path>ReactAndroid/src/main/java/com/facebook/react/fabric/FabricUIManager.java
<ide> */
<ide> @SuppressWarnings("unused") // used from JNI
<ide> @DoNotStrip
<del>public class FabricUIManager implements UIManager, JSHandler {
<add>public class FabricUIManager implements UIManager, JSHandler, FabricBinder {
<ide>
<ide> private static final String TAG = FabricUIManager.class.getSimpleName();
<ide> private static final boolean DEBUG = ReactBuildConfig.DEBUG || PrinterHolder.getPrinter().shouldDisplayLogMessage(ReactDebugOverlayTags.FABRIC_UI_MANAGER);
<ide> public FabricUIManager(
<ide> mJSContext = jsContext;
<ide> }
<ide>
<add> @Override
<ide> public void setBinding(FabricBinding binding) {
<ide> mBinding = binding;
<ide> }
<ide><path>ReactAndroid/src/main/java/com/facebook/react/fabric/jsc/FabricJSCBinding.java
<ide> import com.facebook.jni.HybridData;
<ide> import com.facebook.proguard.annotations.DoNotStrip;
<ide> import com.facebook.react.bridge.JavaScriptContextHolder;
<add>import com.facebook.react.bridge.UIManager;
<add>import com.facebook.react.fabric.FabricBinder;
<ide> import com.facebook.react.fabric.FabricBinding;
<ide> import com.facebook.react.fabric.FabricUIManager;
<ide> import com.facebook.react.bridge.NativeMap;
<ide> public FabricJSCBinding() {
<ide> }
<ide>
<ide> @Override
<del> public void installFabric(JavaScriptContextHolder jsContext, FabricUIManager fabricModule) {
<add> public void installFabric(JavaScriptContextHolder jsContext, FabricBinder fabricModule) {
<ide> fabricModule.setBinding(this);
<ide> installFabric(jsContext.get(), fabricModule);
<ide> } | 4 |
Go | Go | move maintainer build test to integration-cli | 3dd4c5f49977bb9538ae1c39605895fde69c86ee | <ide><path>integration-cli/docker_cli_build_test.go
<ide> func TestBuildWithVolume(t *testing.T) {
<ide> logDone("build - with volume")
<ide> }
<ide>
<add>func TestBuildMaintainer(t *testing.T) {
<add> checkSimpleBuild(t,
<add> `
<add> FROM scratch
<add> MAINTAINER dockerio
<add> `,
<add> "testbuildimg",
<add> "{{json .author}}",
<add> `"dockerio"`)
<add>
<add> deleteImages("testbuildimg")
<add> logDone("build - maintainer")
<add>}
<add>
<ide> // TODO: TestCaching
<ide>
<ide> // TODO: TestADDCacheInvalidation
<ide><path>integration/buildfile_test.go
<ide> func buildImage(context testContextTemplate, t *testing.T, eng *engine.Engine, u
<ide> return image, err
<ide> }
<ide>
<del>func TestBuildMaintainer(t *testing.T) {
<del> img, err := buildImage(testContextTemplate{`
<del> from {IMAGE}
<del> maintainer dockerio
<del> `, nil, nil}, t, nil, true)
<del> if err != nil {
<del> t.Fatal(err)
<del> }
<del>
<del> if img.Author != "dockerio" {
<del> t.Fail()
<del> }
<del>}
<del>
<ide> func TestBuildUser(t *testing.T) {
<ide> img, err := buildImage(testContextTemplate{`
<ide> from {IMAGE} | 2 |
Javascript | Javascript | divide install script in blocks | 685255e688d0b38ee5b7784e79e194e50b5512dc | <ide><path>script/lib/install-application.js
<ide> module.exports = function (packagedAppPath, installDir) {
<ide> const shareDirPath = path.join(prefixDirPath, 'share')
<ide> const installationDirPath = path.join(shareDirPath, atomExecutableName)
<ide> const applicationsDirPath = path.join(shareDirPath, 'applications')
<del> const desktopEntryPath = path.join(applicationsDirPath, `${atomExecutableName}.desktop`)
<add>
<ide> const binDirPath = path.join(prefixDirPath, 'bin')
<del> const atomBinDestinationPath = path.join(binDirPath, atomExecutableName)
<del> const apmBinDestinationPath = path.join(binDirPath, apmExecutableName)
<ide>
<ide> fs.mkdirpSync(applicationsDirPath)
<ide> fs.mkdirpSync(binDirPath)
<ide>
<ide> install(installationDirPath, packagedAppFileName, packagedAppPath)
<ide>
<del> if (fs.existsSync(desktopEntryPath)) {
<del> console.log(`Removing existing desktop entry file at "${desktopEntryPath}"`)
<del> fs.removeSync(desktopEntryPath)
<add> { // Install xdg desktop file
<add> const desktopEntryPath = path.join(applicationsDirPath, `${atomExecutableName}.desktop`)
<add> if (fs.existsSync(desktopEntryPath)) {
<add> console.log(`Removing existing desktop entry file at "${desktopEntryPath}"`)
<add> fs.removeSync(desktopEntryPath)
<add> }
<add> console.log(`Writing desktop entry file at "${desktopEntryPath}"`)
<add> const iconPath = path.join(installationDirPath, 'atom.png')
<add> const desktopEntryTemplate = fs.readFileSync(path.join(CONFIG.repositoryRootPath, 'resources', 'linux', 'atom.desktop.in'))
<add> const desktopEntryContents = template(desktopEntryTemplate)({
<add> appName,
<add> appFileName: atomExecutableName,
<add> description: appDescription,
<add> installDir: prefixDirPath,
<add> iconPath
<add> })
<add> fs.writeFileSync(desktopEntryPath, desktopEntryContents)
<ide> }
<del> console.log(`Writing desktop entry file at "${desktopEntryPath}"`)
<del> const iconPath = path.join(installationDirPath, 'atom.png')
<del> const desktopEntryTemplate = fs.readFileSync(path.join(CONFIG.repositoryRootPath, 'resources', 'linux', 'atom.desktop.in'))
<del> const desktopEntryContents = template(desktopEntryTemplate)({
<del> appName,
<del> appFileName: atomExecutableName,
<del> description: appDescription,
<del> installDir: prefixDirPath,
<del> iconPath
<del> })
<del> fs.writeFileSync(desktopEntryPath, desktopEntryContents)
<ide>
<del> if (fs.existsSync(atomBinDestinationPath)) {
<del> console.log(`Removing existing executable at "${atomBinDestinationPath}"`)
<del> fs.removeSync(atomBinDestinationPath)
<add> { // Add atom executable to the PATH
<add> const atomBinDestinationPath = path.join(binDirPath, atomExecutableName)
<add> if (fs.existsSync(atomBinDestinationPath)) {
<add> console.log(`Removing existing executable at "${atomBinDestinationPath}"`)
<add> fs.removeSync(atomBinDestinationPath)
<add> }
<add> console.log(`Copying atom.sh to "${atomBinDestinationPath}"`)
<add> fs.copySync(path.join(CONFIG.repositoryRootPath, 'atom.sh'), atomBinDestinationPath)
<ide> }
<del> console.log(`Copying atom.sh to "${atomBinDestinationPath}"`)
<del> fs.copySync(path.join(CONFIG.repositoryRootPath, 'atom.sh'), atomBinDestinationPath)
<ide>
<del> try {
<del> fs.lstatSync(apmBinDestinationPath)
<del> console.log(`Removing existing executable at "${apmBinDestinationPath}"`)
<del> fs.removeSync(apmBinDestinationPath)
<del> } catch (e) { }
<del> console.log(`Symlinking apm to "${apmBinDestinationPath}"`)
<del> fs.symlinkSync(path.join('..', 'share', atomExecutableName, 'resources', 'app', 'apm', 'node_modules', '.bin', 'apm'), apmBinDestinationPath)
<add> { // Link apm executable to the PATH
<add> const apmBinDestinationPath = path.join(binDirPath, apmExecutableName)
<add> try {
<add> fs.lstatSync(apmBinDestinationPath)
<add> console.log(`Removing existing executable at "${apmBinDestinationPath}"`)
<add> fs.removeSync(apmBinDestinationPath)
<add> } catch (e) { }
<add> console.log(`Symlinking apm to "${apmBinDestinationPath}"`)
<add> fs.symlinkSync(path.join('..', 'share', atomExecutableName, 'resources', 'app', 'apm', 'node_modules', '.bin', 'apm'), apmBinDestinationPath)
<add> }
<ide>
<ide> console.log(`Changing permissions to 755 for "${installationDirPath}"`)
<ide> fs.chmodSync(installationDirPath, '755') | 1 |
Javascript | Javascript | detect 0 length fs writes with tests | 6744e59e461cc200a1c135c99450b2800c21b5e9 | <ide><path>lib/fs.js
<ide> fs.write = function (fd, buffer, offset, length, position, callback) {
<ide> offset = 0;
<ide> length = buffer.length;
<ide> }
<add> if(!length) return;
<ide>
<ide> binding.write(fd, buffer, offset, length, position, callback || noop);
<ide> };
<ide> fs.writeSync = function (fd, buffer, offset, length, position) {
<ide> offset = 0;
<ide> length = buffer.length;
<ide> }
<add> if(!length) return 0;
<ide>
<ide> return binding.write(fd, buffer, offset, length, position);
<ide> };
<ide><path>test/simple/test-fs-write-sync.js
<ide> fn = path.join(common.fixturesDir, 'write.txt');
<ide>
<ide> foo = 'foo'
<ide> var fd = fs.openSync(fn, 'w');
<add>
<add>written = fs.writeSync(fd, '');
<add>assert.strictEqual(0, written);
<add>
<ide> fs.writeSync(fd, foo);
<ide>
<ide> bar = 'bár'
<ide><path>test/simple/test-fs-write.js
<ide> var found;
<ide> fs.open(fn, 'w', 0644, function (err, fd) {
<ide> if (err) throw err;
<ide> console.log('open done');
<add> fs.write(fd, '', 0, 'utf8', function(err, written) {
<add> assert.fail('zero length write should not go through to callback');
<add> });
<ide> fs.write(fd, expected, 0, "utf8", function (err, written) {
<ide> console.log('write done');
<ide> if (err) throw err; | 3 |
Python | Python | fix celery tests | 95cef76eae50a79716e42519c6e44359feb302c4 | <ide><path>tests/executors/test_celery_executor.py
<ide> from airflow.utils import timezone
<ide> from airflow.utils.state import State
<ide> from tests.test_utils import db
<del>from tests.test_utils.config import conf_vars
<ide>
<ide>
<ide> def _prepare_test_bodies():
<ide> def fake_execute_command(command):
<ide> # "Enqueue" them. We don't have a real SimpleTaskInstance, so directly edit the dict
<ide> for (key, simple_ti, command, queue, task) in task_tuples_to_send: # pylint: disable=W0612
<ide> executor.queued_tasks[key] = (command, 1, queue, simple_ti)
<add> executor.task_publish_retries[key] = 1
<ide>
<ide> executor._process_tasks(task_tuples_to_send)
<ide>
<ide> def fake_execute_command():
<ide> )
<ide> key = ('fail', 'fake_simple_ti', when, 0)
<ide> executor.queued_tasks[key] = value_tuple
<add> executor.task_publish_retries[key] = 1
<ide> executor.heartbeat()
<ide> self.assertEqual(0, len(executor.queued_tasks), "Task should no longer be queued")
<ide> self.assertEqual(executor.event_buffer[('fail', 'fake_simple_ti', when, 0)][0], State.FAILED)
<ide>
<ide> @pytest.mark.integration("redis")
<ide> @pytest.mark.integration("rabbitmq")
<ide> @pytest.mark.backend("mysql", "postgres")
<del> @conf_vars({("celery", "operation_timeout"): "0.01"})
<ide> def test_retry_on_error_sending_task(self):
<ide> """Test that Airflow retries publishing tasks to Celery Broker atleast 3 times"""
<ide>
<ide> def fake_execute_command(command):
<ide> print(command)
<ide>
<del> with _prepare_app(execute=fake_execute_command), self.assertLogs(celery_executor.log) as cm:
<add> with _prepare_app(execute=fake_execute_command), self.assertLogs(
<add> celery_executor.log
<add> ) as cm, mock.patch.object(celery_executor, "OPERATION_TIMEOUT", 0.001):
<ide> # fake_execute_command takes no arguments while execute_command takes 1,
<ide> # which will cause TypeError when calling task.apply_async()
<ide> executor = celery_executor.CeleryExecutor() | 1 |
Python | Python | add weighted_metrics arg to compile | 1bbd52c7081a842fb4af7f582c44de0d2ba644e3 | <ide><path>keras/engine/training.py
<ide> def weighted(y_true, y_pred, weights, mask=None):
<ide> return weighted
<ide>
<ide>
<del>def _masked_objective(fn):
<del> """Adds support for masking to an objective function.
<del>
<del> It transforms an objective function `fn(y_true, y_pred)`
<del> into a cost-masked objective function
<del> `fn(y_true, y_pred, mask)`.
<del>
<del> # Arguments
<del> fn: The objective function to wrap,
<del> with signature `fn(y_true, y_pred)`.
<del>
<del> # Returns
<del> A function with signature `fn(y_true, y_pred, mask)`.
<del> """
<del> def masked(y_true, y_pred, mask=None):
<del> """Wrapper function.
<del>
<del> # Arguments
<del> y_true: `y_true` argument of `fn`.
<del> y_pred: `y_pred` argument of `fn`.
<del> mask: Mask tensor.
<del>
<del> # Returns
<del> Scalar tensor.
<del> """
<del> # score_array has ndim >= 2
<del> score_array = fn(y_true, y_pred)
<del> if mask is not None:
<del> # Cast the mask to floatX to avoid float64 upcasting in theano
<del> mask = K.cast(mask, K.floatx())
<del> # mask should have the same shape as score_array
<del> score_array *= mask
<del> # the loss per batch should be proportional
<del> # to the number of unmasked samples.
<del> score_array /= K.mean(mask)
<del>
<del> return K.mean(score_array)
<del> return masked
<del>
<del>
<ide> def _standardize_weights(y, sample_weight=None, class_weight=None,
<ide> sample_weight_mode=None):
<ide> """Performs sample weight validation and standardization.
<ide> class Model(Container):
<ide> """
<ide>
<ide> def compile(self, optimizer, loss, metrics=None, loss_weights=None,
<del> sample_weight_mode=None, **kwargs):
<add> sample_weight_mode=None, weighted_metrics=None, **kwargs):
<ide> """Configures the model for training.
<ide>
<ide> # Arguments
<ide> def compile(self, optimizer, loss, metrics=None, loss_weights=None,
<ide> If the model has multiple outputs, you can use a different
<ide> `sample_weight_mode` on each output by passing a
<ide> dictionary or a list of modes.
<add> weighted_metrics: list of metrics to be evaluated and weighted
<add> by sample_weight or class_weight during training and testing
<ide> **kwargs: when using the Theano/CNTK backends, these arguments
<ide> are passed into K.function. When using the TensorFlow backend,
<ide> these arguments are passed into `tf.Session.run`.
<ide> def compile(self, optimizer, loss, metrics=None, loss_weights=None,
<ide>
<ide> # Prepare metrics.
<ide> self.metrics = metrics
<add> self.weighted_metrics = weighted_metrics
<ide> self.metrics_names = ['loss']
<ide> self.metrics_tensors = []
<ide>
<ide> def compile(self, optimizer, loss, metrics=None, loss_weights=None,
<ide> # List of same size as output_names.
<ide> # contains tuples (metrics for output, names of metrics).
<ide> nested_metrics = _collect_metrics(metrics, self.output_names)
<add> nested_weighted_metrics = _collect_metrics(weighted_metrics, self.output_names)
<ide>
<ide> def append_metric(layer_num, metric_name, metric_tensor):
<ide> """Helper function used in loop below."""
<ide> def append_metric(layer_num, metric_name, metric_tensor):
<ide> for i in range(len(self.outputs)):
<ide> if i in skip_indices:
<ide> continue
<add>
<ide> y_true = self.targets[i]
<ide> y_pred = self.outputs[i]
<add> weights = sample_weights[i]
<ide> output_metrics = nested_metrics[i]
<del> for metric in output_metrics:
<del> if metric == 'accuracy' or metric == 'acc':
<del> # custom handling of accuracy
<del> # (because of class mode duality)
<del> output_shape = self.internal_output_shapes[i]
<del> acc_fn = None
<del> if (output_shape[-1] == 1 or
<del> self.loss_functions[i] == losses.binary_crossentropy):
<del> # case: binary accuracy
<del> acc_fn = metrics_module.binary_accuracy
<del> elif self.loss_functions[i] == losses.sparse_categorical_crossentropy:
<del> # case: categorical accuracy with sparse targets
<del> acc_fn = metrics_module.sparse_categorical_accuracy
<del> else:
<del> acc_fn = metrics_module.categorical_accuracy
<add> output_weighted_metrics = nested_weighted_metrics[i]
<add>
<add> def handle_metrics(metrics, weights=None):
<add> metric_name_prefix = 'weighted_' if weights is not None else ''
<add>
<add> for metric in metrics:
<add> if metric == 'accuracy' or metric == 'acc':
<add> # custom handling of accuracy
<add> # (because of class mode duality)
<add> output_shape = self.internal_output_shapes[i]
<add> if (output_shape[-1] == 1 or
<add> self.loss_functions[i] == losses.binary_crossentropy):
<add> # case: binary accuracy
<add> acc_fn = metrics_module.binary_accuracy
<add> elif self.loss_functions[i] == losses.sparse_categorical_crossentropy:
<add> # case: categorical accuracy with sparse targets
<add> acc_fn = metrics_module.sparse_categorical_accuracy
<add> else:
<add> acc_fn = metrics_module.categorical_accuracy
<ide>
<del> masked_fn = _masked_objective(acc_fn)
<del> append_metric(i, 'acc', masked_fn(y_true, y_pred, mask=masks[i]))
<del> else:
<del> metric_fn = metrics_module.get(metric)
<del> masked_metric_fn = _masked_objective(metric_fn)
<del> metric_result = masked_metric_fn(y_true, y_pred, mask=masks[i])
<del> metric_result = {
<del> metric_fn.__name__: metric_result
<del> }
<del> for name, tensor in six.iteritems(metric_result):
<del> append_metric(i, name, tensor)
<add> acc_fn = _weighted_masked_objective(acc_fn)
<add> metric_name = metric_name_prefix + 'acc'
<add> append_metric(i, metric_name, acc_fn(y_true, y_pred, weights=weights, mask=masks[i]))
<add> else:
<add> metric_fn = metrics_module.get(metric)
<add> weighted_metric_fn = _weighted_masked_objective(metric_fn)
<add> metric_result = weighted_metric_fn(y_true, y_pred, weights=weights, mask=masks[i])
<add> metric_result = {
<add> metric_fn.__name__: metric_result
<add> }
<add> for name, tensor in six.iteritems(metric_result):
<add> append_metric(i, metric_name_prefix + name, tensor)
<add>
<add> handle_metrics(output_metrics)
<add> handle_metrics(output_weighted_metrics, weights=weights)
<ide>
<ide> # Prepare gradient updates and state updates.
<ide> self.total_loss = total_loss
<ide><path>keras/models.py
<ide> def save_weights(self, filepath, overwrite=True):
<ide> def compile(self, optimizer, loss,
<ide> metrics=None,
<ide> sample_weight_mode=None,
<add> weighted_metrics=None,
<ide> **kwargs):
<ide> """Configures the learning process.
<ide>
<ide> def compile(self, optimizer, loss,
<ide> sample_weight_mode: if you need to do timestep-wise
<ide> sample weighting (2D weights), set this to "temporal".
<ide> "None" defaults to sample-wise weights (1D).
<add> weighted_metrics: list of metrics to be evaluated and weighted
<add> by sample_weight or class_weight during training and testing
<ide> **kwargs: for Theano/CNTK backends, these are passed into
<ide> K.function. When using the TensorFlow backend, these are
<ide> passed into `tf.Session.run`.
<ide> def compile(self, optimizer, loss,
<ide> self.model.compile(optimizer, loss,
<ide> metrics=metrics,
<ide> sample_weight_mode=sample_weight_mode,
<add> weighted_metrics=weighted_metrics,
<ide> **kwargs)
<ide> self.optimizer = self.model.optimizer
<ide> self.loss = self.model.loss
<ide> self.total_loss = self.model.total_loss
<ide> self.loss_weights = self.model.loss_weights
<ide> self.metrics = self.model.metrics
<add> self.weighted_metrics = self.model.weighted_metrics
<ide> self.metrics_tensors = self.model.metrics_tensors
<ide> self.metrics_names = self.model.metrics_names
<ide> self.sample_weight_mode = self.model.sample_weight_mode
<ide><path>tests/test_loss_weighting.py
<ide> import pytest
<ide> import numpy as np
<ide>
<add>from keras import backend as K
<ide> from keras.utils.test_utils import get_test_data
<del>from keras.models import Sequential
<del>from keras.layers import Dense, Activation, GRU, TimeDistributed
<add>from keras.models import Sequential, Model
<add>from keras.layers import Dense, Activation, GRU, TimeDistributed, Input
<ide> from keras.utils import np_utils
<ide> from keras.utils.test_utils import keras_test
<add>from numpy.testing import assert_almost_equal, assert_array_almost_equal
<ide>
<ide> num_classes = 10
<ide> batch_size = 128
<ide> timesteps = 3
<ide> input_dim = 10
<ide> loss = 'mse'
<add>loss_full_name = 'mean_squared_error'
<ide> standard_weight = 1
<ide> standard_score_sequential = 0.5
<ide>
<add>decimal_precision = {
<add> 'cntk': 2,
<add> 'theano': 6,
<add> 'tensorflow': 6
<add>}
<add>
<ide>
<ide> def _get_test_data():
<ide> np.random.seed(1337)
<ide> def test_sequential_temporal_sample_weights():
<ide> assert(score < standard_score_sequential)
<ide>
<ide>
<add>@keras_test
<add>def test_weighted_metrics_with_sample_weight():
<add> decimal = decimal_precision[K.backend()]
<add>
<add> model = create_sequential_model()
<add> model.compile(loss=loss, optimizer='rmsprop', metrics=[loss], weighted_metrics=[loss])
<add>
<add> (x_train, y_train), (x_test, y_test), (sample_weight, class_weight, test_ids) = _get_test_data()
<add>
<add> history = model.fit(x_train, y_train, batch_size=batch_size,
<add> epochs=epochs // 3, verbose=0,
<add> sample_weight=sample_weight)
<add>
<add> h = history.history
<add> assert_array_almost_equal(h['loss'], h['weighted_' + loss_full_name], decimal=decimal)
<add>
<add> history = model.fit(x_train, y_train, batch_size=batch_size,
<add> epochs=epochs // 3, verbose=0,
<add> sample_weight=sample_weight,
<add> validation_split=0.1)
<add>
<add> h = history.history
<add> assert_almost_equal(h['val_loss'], h['val_weighted_' + loss_full_name], decimal=decimal)
<add>
<add> model.train_on_batch(x_train[:32], y_train[:32],
<add> sample_weight=sample_weight[:32])
<add> model.test_on_batch(x_train[:32], y_train[:32],
<add> sample_weight=sample_weight[:32])
<add>
<add> test_sample_weight = np.ones((y_test.shape[0])) * standard_weight
<add> test_sample_weight[test_ids] = high_weight
<add>
<add> scores = model.evaluate(x_test, y_test, verbose=0, sample_weight=test_sample_weight)
<add> loss_score, metric_score, weighted_metric_score = scores
<add>
<add> assert loss_score < standard_score_sequential
<add> assert loss_score != metric_score
<add> assert_almost_equal(loss_score, weighted_metric_score, decimal=decimal)
<add>
<add>
<add>@keras_test
<add>def test_weighted_metrics_with_no_sample_weight():
<add> decimal = decimal_precision[K.backend()]
<add>
<add> model = create_sequential_model()
<add> model.compile(loss=loss, optimizer='rmsprop', metrics=[loss], weighted_metrics=[loss])
<add>
<add> (x_train, y_train), (x_test, y_test), _ = _get_test_data()
<add>
<add> history = model.fit(x_train, y_train, batch_size=batch_size,
<add> epochs=epochs // 3, verbose=0)
<add>
<add> h = history.history
<add> assert_array_almost_equal(h['loss'], h[loss_full_name], decimal=decimal)
<add> assert_array_almost_equal(h['loss'], h['weighted_' + loss_full_name], decimal=decimal)
<add>
<add> history = model.fit(x_train, y_train, batch_size=batch_size,
<add> epochs=epochs // 3, verbose=0, validation_split=0.1)
<add>
<add> h = history.history
<add> assert_array_almost_equal(h['val_loss'], h['val_' + loss_full_name], decimal=decimal)
<add> assert_array_almost_equal(h['val_loss'], h['val_weighted_' + loss_full_name], decimal=decimal)
<add>
<add> model.train_on_batch(x_train[:32], y_train[:32])
<add> model.test_on_batch(x_train[:32], y_train[:32])
<add>
<add> scores = model.evaluate(x_test, y_test, verbose=0)
<add> loss_score, metric_score, weighted_metric_score = scores
<add>
<add> assert_almost_equal(loss_score, metric_score, decimal=decimal)
<add> assert_almost_equal(loss_score, weighted_metric_score, decimal=decimal)
<add>
<add>
<add>@keras_test
<add>def test_weighted_metrics_with_weighted_accuracy_metric():
<add> model = create_sequential_model()
<add> model.compile(loss=loss, optimizer='rmsprop', metrics=['acc'], weighted_metrics=['acc'])
<add>
<add> (x_train, y_train), _, (sample_weight, _, _) = _get_test_data()
<add>
<add> history = model.fit(x_train, y_train, batch_size=batch_size,
<add> epochs=epochs // 3, verbose=0,
<add> sample_weight=sample_weight)
<add>
<add> assert history.history['acc'] != history.history['weighted_acc']
<add>
<add>
<add>@keras_test
<add>def test_weighted_metrics_with_multiple_outputs():
<add> decimal = decimal_precision[K.backend()]
<add>
<add> inputs = Input(shape=(5,))
<add> x = Dense(5)(inputs)
<add> output1 = Dense(1, name='output1')(x)
<add> output2 = Dense(1, name='output2')(x)
<add>
<add> model = Model(inputs=inputs, outputs=[output1, output2])
<add>
<add> metrics = {'output1': [loss], 'output2': [loss]}
<add> weighted_metrics = {'output2': [loss]}
<add> loss_map = {'output1': loss, 'output2': loss}
<add>
<add> model.compile(loss=loss_map, optimizer='sgd', metrics=metrics, weighted_metrics=weighted_metrics)
<add>
<add> x = np.array([[1, 1, 1, 1, 1]])
<add> y = {'output1': np.array([0]), 'output2': np.array([1])}
<add> weight = 5
<add>
<add> history = model.fit(x, y, sample_weight={'output2': np.array([weight])})
<add>
<add> unweighted_metric = history.history['output2_' + loss_full_name][0]
<add> weighted_metric = history.history['output2_weighted_' + loss_full_name][0]
<add>
<add> assert_almost_equal(unweighted_metric * weight, weighted_metric, decimal=decimal)
<add>
<add>
<ide> @keras_test
<ide> def test_class_weight_wrong_classes():
<ide> model = create_sequential_model() | 3 |
Javascript | Javascript | add angularevent param to $routechangestart event | 6972596ce99f2a11ae07e65ebcb554cef1dd3240 | <ide><path>src/ngRoute/route.js
<ide> function $RouteProvider(){
<ide> * defined in `resolve` route property. Once all of the dependencies are resolved
<ide> * `$routeChangeSuccess` is fired.
<ide> *
<add> * @param {Object} angularEvent Synthetic event object.
<ide> * @param {Route} next Future route information.
<ide> * @param {Route} current Current route information.
<ide> */ | 1 |
Mixed | Ruby | allow string hash values on ar order method | acbd7ab22e5d9179487bd98110234c54535036c4 | <ide><path>activerecord/CHANGELOG.md
<add>* Allow strings to specify the `#order` value.
<add>
<add> Example:
<add>
<add> Model.order(id: 'asc').to_sql == Model.order(id: :asc).to_sql
<add>
<add> Fixes #10732.
<add>
<add> *Marcelo Casiraghi*
<add>
<ide> * Dynamically register PostgreSQL enum OIDs. This prevents "unknown OID"
<ide> warnings on enum columns.
<ide>
<ide><path>activerecord/lib/active_record/relation/query_methods.rb
<ide> def build_order(arel)
<ide> arel.order(*orders) unless orders.empty?
<ide> end
<ide>
<add> VALID_DIRECTIONS = [:asc, :desc, 'asc', 'desc'] # :nodoc:
<add>
<ide> def validate_order_args(args)
<ide> args.grep(Hash) do |h|
<del> unless (h.values - [:asc, :desc]).empty?
<del> raise ArgumentError, 'Direction should be :asc or :desc'
<add> h.values.map(&:downcase).each do |value|
<add> raise ArgumentError, "Direction '#{value}' is invalid. Valid " \
<add> "directions are asc and desc." unless VALID_DIRECTIONS.include?(value)
<ide> end
<ide> end
<ide> end
<ide> def preprocess_order_args(order_args)
<ide> when Hash
<ide> arg.map { |field, dir|
<ide> field = klass.attribute_alias(field) if klass.attribute_alias?(field)
<del> table[field].send(dir)
<add> table[field].send(dir.downcase)
<ide> }
<ide> else
<ide> arg
<ide><path>activerecord/test/cases/relations_test.rb
<ide> def test_finding_with_order
<ide> assert_equal topics(:first).title, topics.first.title
<ide> end
<ide>
<del>
<ide> def test_finding_with_arel_order
<ide> topics = Topic.order(Topic.arel_table[:id].asc)
<ide> assert_equal 5, topics.to_a.size
<ide> def test_order_with_hash_and_symbol_generates_the_same_sql
<ide> assert_equal Topic.order(:id).to_sql, Topic.order(:id => :asc).to_sql
<ide> end
<ide>
<add> def test_finding_with_desc_order_with_string
<add> topics = Topic.order(id: "desc")
<add> assert_equal 5, topics.to_a.size
<add> assert_equal [topics(:fifth), topics(:fourth), topics(:third), topics(:second), topics(:first)], topics.to_a
<add> end
<add>
<add> def test_finding_with_asc_order_with_string
<add> topics = Topic.order(id: 'asc')
<add> assert_equal 5, topics.to_a.size
<add> assert_equal [topics(:first), topics(:second), topics(:third), topics(:fourth), topics(:fifth)], topics.to_a
<add> end
<add>
<add> def test_nothing_raises_on_upcase_desc_arg
<add> Topic.order(id: "DESC")
<add> end
<add>
<add> def test_nothing_raises_on_downcase_desc_arg
<add> Topic.order(id: "desc")
<add> end
<add>
<add> def test_nothing_raises_on_upcase_asc_arg
<add> Topic.order(id: "ASC")
<add> end
<add>
<add> def test_nothing_raises_on_downcase_asc_arg
<add> Topic.order(id: "asc")
<add> end
<add>
<add> def test_nothing_raises_on_case_insensitive_args
<add> Topic.order(id: "DeSc")
<add> Topic.order(id: :DeSc)
<add> Topic.order(id: "aSc")
<add> Topic.order(id: :aSc)
<add> end
<add>
<ide> def test_raising_exception_on_invalid_hash_params
<del> assert_raise(ArgumentError) { Topic.order(:name, "id DESC", :id => :DeSc) }
<add> assert_raise(ArgumentError) { Topic.order(:name, "id DESC", id: :asfsdf) }
<ide> end
<ide>
<ide> def test_finding_last_with_arel_order | 3 |
Ruby | Ruby | use instance method instead of before hook | 6dce4367c2bba894bb94e27cdfe4c56fdcc2c3df | <ide><path>activesupport/lib/active_support/testing/time_helpers.rb
<ide> def unstub_object(stub)
<ide>
<ide> # Containing helpers that helps you test passage of time.
<ide> module TimeHelpers
<del> def before_setup
<del> super
<del> @simple_stubs = SimpleStubs.new
<del> end
<del>
<ide> def after_teardown #:nodoc:
<del> @simple_stubs.unstub_all!
<add> simple_stubs.unstub_all!
<ide> super
<ide> end
<ide>
<ide> def travel(duration, &block)
<ide> # end
<ide> # Time.current # => Sat, 09 Nov 2013 15:34:49 EST -05:00
<ide> def travel_to(date_or_time, &block)
<del> @simple_stubs.stub_object(Time, :now, date_or_time.to_time)
<del> @simple_stubs.stub_object(Date, :today, date_or_time.to_date)
<add> simple_stubs.stub_object(Time, :now, date_or_time.to_time)
<add> simple_stubs.stub_object(Date, :today, date_or_time.to_date)
<ide>
<ide> if block_given?
<ide> block.call
<del> @simple_stubs.unstub_all!
<add> simple_stubs.unstub_all!
<ide> end
<ide> end
<add>
<add> def simple_stubs
<add> @simple_stubs ||= SimpleStubs.new
<add> end
<ide> end
<ide> end
<ide> end | 1 |
Text | Text | add axiosist to ecosystem.md (#963) | 62db26b58854f53beed0d9513b5cf18615c64a2d | <ide><path>ECOSYSTEM.md
<ide> This is a list of axios related libraries and resources. If you have a suggestio
<ide> * [axios-debug-log](https://github.com/Gerhut/axios-debug-log) - Axios interceptor of logging requests & responses by debug.
<ide> * [axios-method-override](https://github.com/jacobbuck/axios-method-override) - Axios http request method override plugin
<ide> * [mocha-axios](https://github.com/jdrydn/mocha-axios) - Streamlined integration testing with Mocha & Axios
<add>* [axiosist](https://github.com/Gerhut/axiosist) - Axios based supertest: convert node.js request handler to axios adapter, used for node.js server unit test. | 1 |
PHP | PHP | apply fixes from styleci | 2d09053c86e4fd4f3718115dff263c26fb730aa0 | <ide><path>src/Illuminate/Database/Eloquent/Builder.php
<ide> public static function __callStatic($method, $parameters)
<ide> */
<ide> protected static function registerMixin($mixin, $replace)
<ide> {
<del> $methods = (new ReflectionClass($mixin))->getMethods(
<add> $methods = (new ReflectionClass($mixin))->getMethods(
<ide> ReflectionMethod::IS_PUBLIC | ReflectionMethod::IS_PROTECTED
<ide> );
<ide>
<del> foreach ($methods as $method) {
<del> if ($replace || ! static::hasGlobalMacro($method->name)) {
<del> $method->setAccessible(true);
<add> foreach ($methods as $method) {
<add> if ($replace || ! static::hasGlobalMacro($method->name)) {
<add> $method->setAccessible(true);
<ide>
<del> static::macro($method->name, $method->invoke($mixin));
<del> }
<add> static::macro($method->name, $method->invoke($mixin));
<ide> }
<add> }
<ide> }
<ide>
<ide> /** | 1 |
PHP | PHP | rewrite redis layer | 1ef8b9c3f156c7d4debc6c6f67b73b032d8337d5 | <ide><path>src/Illuminate/Broadcasting/Broadcasters/RedisBroadcaster.php
<ide>
<ide> use Illuminate\Support\Arr;
<ide> use Illuminate\Support\Str;
<del>use Illuminate\Contracts\Redis\Database as RedisDatabase;
<add>use Illuminate\Contracts\Redis\Factory as Redis;
<ide> use Symfony\Component\HttpKernel\Exception\HttpException;
<ide>
<ide> class RedisBroadcaster extends Broadcaster
<ide> {
<ide> /**
<ide> * The Redis instance.
<ide> *
<del> * @var \Illuminate\Contracts\Redis\Database
<add> * @var \Illuminate\Contracts\Redis\Factory
<ide> */
<ide> protected $redis;
<ide>
<ide> class RedisBroadcaster extends Broadcaster
<ide> /**
<ide> * Create a new broadcaster instance.
<ide> *
<del> * @param \Illuminate\Contracts\Redis\Database $redis
<add> * @param \Illuminate\Contracts\Redis\Factory $redis
<ide> * @param string $connection
<ide> * @return void
<ide> */
<del> public function __construct(RedisDatabase $redis, $connection = null)
<add> public function __construct(Redis $redis, $connection = null)
<ide> {
<ide> $this->redis = $redis;
<ide> $this->connection = $connection;
<ide><path>src/Illuminate/Cache/RedisStore.php
<ide> namespace Illuminate\Cache;
<ide>
<ide> use Illuminate\Contracts\Cache\Store;
<del>use Illuminate\Contracts\Redis\Database;
<add>use Illuminate\Contracts\Redis\Factory as Redis;
<ide>
<ide> class RedisStore extends TaggableStore implements Store
<ide> {
<ide> /**
<del> * The Redis database connection.
<add> * The Redis factory implementation.
<ide> *
<del> * @var \Illuminate\Contracts\Redis\Database
<add> * @var \Illuminate\Contracts\Redis\Factory
<ide> */
<ide> protected $redis;
<ide>
<ide> class RedisStore extends TaggableStore implements Store
<ide> /**
<ide> * Create a new Redis store.
<ide> *
<del> * @param \Illuminate\Contracts\Redis\Database $redis
<add> * @param \Illuminate\Contracts\Redis\Factory $redis
<ide> * @param string $prefix
<ide> * @param string $connection
<ide> * @return void
<ide> */
<del> public function __construct(Database $redis, $prefix = '', $connection = 'default')
<add> public function __construct(Redis $redis, $prefix = '', $connection = 'default')
<ide> {
<ide> $this->redis = $redis;
<ide> $this->setPrefix($prefix);
<ide><path>src/Illuminate/Contracts/Redis/Database.php
<del><?php
<del>
<del>namespace Illuminate\Contracts\Redis;
<del>
<del>interface Database
<del>{
<del> /**
<del> * Run a command against the Redis database.
<del> *
<del> * @param string $method
<del> * @param array $parameters
<del> * @return mixed
<del> */
<del> public function command($method, array $parameters = []);
<del>}
<ide><path>src/Illuminate/Contracts/Redis/Factory.php
<add><?php
<add>
<add>namespace Illuminate\Contracts\Redis;
<add>
<add>interface Factory
<add>{
<add> /**
<add> * Get a Redis connection by name.
<add> *
<add> * @param string $name
<add> * @return \Illuminate\Redis\Connections\Connection
<add> */
<add> public function connection($name = null);
<add>}
<ide><path>src/Illuminate/Foundation/Application.php
<ide> public function registerCoreContainerAliases()
<ide> 'queue.connection' => ['Illuminate\Contracts\Queue\Queue'],
<ide> 'queue.failer' => ['Illuminate\Queue\Failed\FailedJobProviderInterface'],
<ide> 'redirect' => ['Illuminate\Routing\Redirector'],
<del> 'redis' => ['Illuminate\Redis\Database', 'Illuminate\Contracts\Redis\Database'],
<add> 'redis' => ['Illuminate\Redis\RedisManager', 'Illuminate\Contracts\Redis\Factory'],
<ide> 'request' => ['Illuminate\Http\Request', 'Symfony\Component\HttpFoundation\Request'],
<ide> 'router' => ['Illuminate\Routing\Router', 'Illuminate\Contracts\Routing\Registrar', 'Illuminate\Contracts\Routing\BindingRegistrar'],
<ide> 'session' => ['Illuminate\Session\SessionManager'],
<ide><path>src/Illuminate/Queue/Connectors/RedisConnector.php
<ide>
<ide> use Illuminate\Support\Arr;
<ide> use Illuminate\Queue\RedisQueue;
<del>use Illuminate\Contracts\Redis\Database;
<add>use Illuminate\Contracts\Redis\Factory as Redis;
<ide>
<ide> class RedisConnector implements ConnectorInterface
<ide> {
<ide> /**
<ide> * The Redis database instance.
<ide> *
<del> * @var \Illuminate\Contracts\Redis\Database
<add> * @var \Illuminate\Contracts\Redis\Factory
<ide> */
<ide> protected $redis;
<ide>
<ide> class RedisConnector implements ConnectorInterface
<ide> /**
<ide> * Create a new Redis queue connector instance.
<ide> *
<del> * @param \Illuminate\Contracts\Redis\Database $redis
<add> * @param \Illuminate\Contracts\Redis\Factory $redis
<ide> * @param string|null $connection
<ide> * @return void
<ide> */
<del> public function __construct(Database $redis, $connection = null)
<add> public function __construct(Redis $redis, $connection = null)
<ide> {
<ide> $this->redis = $redis;
<ide> $this->connection = $connection;
<ide><path>src/Illuminate/Queue/Jobs/RedisJob.php
<ide> public function getJobId()
<ide> }
<ide>
<ide> /**
<del> * Get the underlying queue driver instance.
<add> * Get the underlying Redis factory implementation.
<ide> *
<del> * @return \Illuminate\Contracts\Redis\Database
<add> * @return \Illuminate\Contracts\Redis\Factory
<ide> */
<ide> public function getRedisQueue()
<ide> {
<ide><path>src/Illuminate/Queue/RedisQueue.php
<ide> use Illuminate\Support\Arr;
<ide> use Illuminate\Support\Str;
<ide> use Illuminate\Queue\Jobs\RedisJob;
<del>use Illuminate\Contracts\Redis\Database;
<add>use Illuminate\Contracts\Redis\Factory as Redis;
<ide> use Illuminate\Contracts\Queue\Queue as QueueContract;
<ide>
<ide> class RedisQueue extends Queue implements QueueContract
<ide> {
<ide> /**
<del> * The Redis database instance.
<add> * The Redis factory implementation.
<ide> *
<del> * @var \Illuminate\Contracts\Redis\Database
<add> * @var \Illuminate\Contracts\Redis\Factory
<ide> */
<ide> protected $redis;
<ide>
<ide> class RedisQueue extends Queue implements QueueContract
<ide> /**
<ide> * Create a new Redis queue instance.
<ide> *
<del> * @param \Illuminate\Contracts\Redis\Database $redis
<add> * @param \Illuminate\Contracts\Redis\Factory $redis
<ide> * @param string $default
<ide> * @param string $connection
<ide> * @param int $expire
<ide> * @return void
<ide> */
<del> public function __construct(Database $redis, $default = 'default',
<del> $connection = null, $expire = 60)
<add> public function __construct(Redis $redis, $default = 'default', $connection = null, $expire = 60)
<ide> {
<ide> $this->redis = $redis;
<ide> $this->expire = $expire;
<ide><path>src/Illuminate/Redis/Connections/Connection.php
<add><?php
<add>
<add>namespace Illuminate\Redis\Connections;
<add>
<add>use Closure;
<add>
<add>abstract class Connection
<add>{
<add> /**
<add> * The Predis client.
<add> *
<add> * @var \Predis\Client
<add> */
<add> protected $client;
<add>
<add> /**
<add> * Subscribe to a set of given channels for messages.
<add> *
<add> * @param array|string $channels
<add> * @param \Closure $callback
<add> * @param string $method
<add> * @return void
<add> */
<add> abstract public function createSubscription($channels, Closure $callback, $method = 'subscribe');
<add>
<add> /**
<add> * Get the underlying Redis client.
<add> *
<add> * @return mixed
<add> */
<add> public function client()
<add> {
<add> return $this->client;
<add> }
<add>
<add> /**
<add> * Subscribe to a set of given channels for messages.
<add> *
<add> * @param array|string $channels
<add> * @param \Closure $callback
<add> * @param string $method
<add> * @return void
<add> */
<add> public function subscribe($channels, Closure $callback)
<add> {
<add> return $this->createSubscription($channels, $callback, __FUNCTION__);
<add> }
<add>
<add> /**
<add> * Subscribe to a set of given channels with wildcards.
<add> *
<add> * @param array|string $channels
<add> * @param \Closure $callback
<add> * @return void
<add> */
<add> public function psubscribe($channels, Closure $callback)
<add> {
<add> return $this->createSubscription($channels, $callback, __FUNCTION__);
<add> }
<add>
<add> /**
<add> * Run a command against the Redis database.
<add> *
<add> * @param string $method
<add> * @param array $parameters
<add> * @return mixed
<add> */
<add> public function command($method, array $parameters = [])
<add> {
<add> return $this->client->{$method}(...$parameters);
<add> }
<add>
<add> /**
<add> * Pass other method calls down to the underlying client.
<add> *
<add> * @param string $method
<add> * @param array $parameters
<add> * @return mixed
<add> */
<add> public function __call($method, $parameters)
<add> {
<add> return $this->command($method, $parameters);
<add> }
<add>}
<ide><path>src/Illuminate/Redis/Connections/PhpRedisClusterConnection.php
<add><?php
<add>
<add>namespace Illuminate\Redis\Connections;
<add>
<add>class PhpRedisClusterConnection extends PhpRedisConnection
<add>{
<add> //
<add>}
<ide><path>src/Illuminate/Redis/Connections/PhpRedisConnection.php
<add><?php
<add>
<add>namespace Illuminate\Redis\Connections;
<add>
<add>use Closure;
<add>
<add>class PhpRedisConnection extends Connection
<add>{
<add> /**
<add> * Create a new Predis connection.
<add> *
<add> * @param \Predis\Client $client
<add> * @return void
<add> */
<add> public function __construct($client)
<add> {
<add> $this->client = $client;
<add> }
<add>
<add> /**
<add> * Evaluate a Lua script and return the result.
<add> *
<add> * @param string $script
<add> * @param int $numberOfKeys
<add> * @param dynamic $arguments
<add> * @return mixed
<add> */
<add> public function eval($script, $numberOfKeys, ...$arguments)
<add> {
<add> return $this->client->eval($script, $arguments, $numberOfKeys);
<add> }
<add>
<add> /**
<add> * Subscribe to a set of given channels for messages.
<add> *
<add> * @param array|string $channels
<add> * @param \Closure $callback
<add> * @return void
<add> */
<add> public function subscribe($channels, Closure $callback)
<add> {
<add> $this->client->subscribe((array) $channels, function ($redis, $channel, $message) use ($callback) {
<add> $callback($message, $channel);
<add> });
<add> }
<add>
<add> /**
<add> * Subscribe to a set of given channels with wildcards.
<add> *
<add> * @param array|string $channels
<add> * @param \Closure $callback
<add> * @return void
<add> */
<add> public function psubscribe($channels, Closure $callback)
<add> {
<add> $this->client->psubscribe((array) $channels, function ($redis, $pattern, $channel, $message) use ($callback) {
<add> $callback($message, $channel);
<add> });
<add> }
<add>
<add> /**
<add> * Subscribe to a set of given channels for messages.
<add> *
<add> * @param array|string $channels
<add> * @param \Closure $callback
<add> * @param string $method
<add> * @return void
<add> */
<add> public function createSubscription($channels, Closure $callback, $method = 'subscribe')
<add> {
<add> //
<add> }
<add>}
<ide><path>src/Illuminate/Redis/Connections/PredisClusterConnection.php
<add><?php
<add>
<add>namespace Illuminate\Redis\Connections;
<add>
<add>class PredisClusterConnection extends PredisConnection
<add>{
<add> //
<add>}
<ide><path>src/Illuminate/Redis/Connections/PredisConnection.php
<add><?php
<add>
<add>namespace Illuminate\Redis\Connections;
<add>
<add>use Closure;
<add>
<add>class PredisConnection extends Connection
<add>{
<add> /**
<add> * Create a new Predis connection.
<add> *
<add> * @param \Predis\Client $client
<add> * @return void
<add> */
<add> public function __construct($client)
<add> {
<add> $this->client = $client;
<add> }
<add>
<add> /**
<add> * Subscribe to a set of given channels for messages.
<add> *
<add> * @param array|string $channels
<add> * @param \Closure $callback
<add> * @param string $method
<add> * @return void
<add> */
<add> public function createSubscription($channels, Closure $callback, $method = 'subscribe')
<add> {
<add> $loop = $this->pubSubLoop();
<add>
<add> call_user_func_array([$loop, $method], (array) $channels);
<add>
<add> foreach ($loop as $message) {
<add> if ($message->kind === 'message' || $message->kind === 'pmessage') {
<add> call_user_func($callback, $message->payload, $message->channel);
<add> }
<add> }
<add>
<add> unset($loop);
<add> }
<add>}
<ide><path>src/Illuminate/Redis/Connectors/PhpRedisConnector.php
<add><?php
<add>
<add>namespace Illuminate\Redis\Connectors;
<add>
<add>use Redis;
<add>use RedisCluster;
<add>use Illuminate\Support\Arr;
<add>use Illuminate\Redis\Connections\PhpRedisConnection;
<add>use Illuminate\Redis\Connections\PhpRedisClusterConnection;
<add>
<add>class PhpRedisConnector
<add>{
<add> /**
<add> * Create a new clustered Predis connection.
<add> *
<add> * @param array $config
<add> * @param array $clusterOptions
<add> * @param array $options
<add> * @return \Illuminate\Redis\PredisConnection
<add> */
<add> public function connect(array $config, array $options)
<add> {
<add> return new PhpRedisConnection($this->createClient(array_merge(
<add> $config, $options, Arr::pull($config, 'options', [])
<add> )));
<add> }
<add>
<add> /**
<add> * Create a new clustered Predis connection.
<add> *
<add> * @param array $config
<add> * @param array $clusterOptions
<add> * @param array $options
<add> * @return \Illuminate\Redis\PredisClusterConnection
<add> */
<add> public function connectToCluster(array $config, array $clusterOptions, array $options)
<add> {
<add> $options = array_merge($options, $clusterOptions, Arr::pull($config, 'options', []));
<add>
<add> return new PhpRedisClusterConnection($this->createRedisClusterInstance(
<add> array_map([$this, 'buildClusterConnectionString'], $config), $options
<add> ));
<add> }
<add>
<add> /**
<add> * Build a single cluster seed string from array.
<add> *
<add> * @param array $server
<add> * @return string
<add> */
<add> protected function buildClusterConnectionString(array $server)
<add> {
<add> return $server['host'].':'.$server['port'].'?'.http_build_query(Arr::only($server, [
<add> 'database', 'password', 'prefix', 'read_timeout',
<add> ]));
<add> }
<add>
<add> /**
<add> * Create the Redis client instance.
<add> *
<add> * @param array $config
<add> * @return \Redis
<add> */
<add> protected function createClient(array $config)
<add> {
<add> return tap(new Redis, function ($client) use ($config) {
<add> $this->establishConnection($client, $config);
<add>
<add> if (! empty($config['password'])) {
<add> $client->auth($config['password']);
<add> }
<add>
<add> if (! empty($config['database'])) {
<add> $client->select($config['database']);
<add> }
<add>
<add> if (! empty($config['prefix'])) {
<add> $client->setOption(Redis::OPT_PREFIX, $config['prefix']);
<add> }
<add>
<add> if (! empty($config['read_timeout'])) {
<add> $client->setOption(Redis::OPT_READ_TIMEOUT, $config['read_timeout']);
<add> }
<add> });
<add> }
<add>
<add> /**
<add> * Establish a connection with the Redis host.
<add> *
<add> * @param \Redis $client
<add> * @param array $config
<add> * @return void
<add> */
<add> protected function establishConnection($client, array $config)
<add> {
<add> $client->{Arr::get($config, 'persistent', false) === true ? 'pconnect' : 'connect'}(
<add> $config['host'], $config['port'], Arr::get($config, 'timeout', 0)
<add> );
<add> }
<add>
<add> /**
<add> * Create a new redis cluster instance.
<add> *
<add> * @param array $servers
<add> * @param array $options
<add> * @return \RedisCluster
<add> */
<add> protected function createRedisClusterInstance(array $servers, array $options)
<add> {
<add> return new RedisCluster(
<add> null,
<add> array_values($servers),
<add> Arr::get($options, 'timeout', 0),
<add> Arr::get($options, 'read_timeout', 0),
<add> isset($options['persistent']) && $options['persistent']
<add> );
<add> }
<add>}
<ide><path>src/Illuminate/Redis/Connectors/PredisConnector.php
<add><?php
<add>
<add>namespace Illuminate\Redis\Connectors;
<add>
<add>use Predis\Client;
<add>use Illuminate\Support\Arr;
<add>use Illuminate\Redis\Connections\PredisConnection;
<add>use Illuminate\Redis\Connections\PredisClusterConnection;
<add>
<add>class PredisConnector
<add>{
<add> /**
<add> * Create a new clustered Predis connection.
<add> *
<add> * @param array $config
<add> * @param array $clusterOptions
<add> * @param array $options
<add> * @return \Illuminate\Redis\PredisConnection
<add> */
<add> public function connect(array $config, array $options)
<add> {
<add> return new PredisConnection(new Client($config, array_merge(
<add> ['timeout' => 10.0], $options, Arr::pull($config, 'options', [])
<add> )));
<add> }
<add>
<add> /**
<add> * Create a new clustered Predis connection.
<add> *
<add> * @param array $config
<add> * @param array $clusterOptions
<add> * @param array $options
<add> * @return \Illuminate\Redis\PredisClusterConnection
<add> */
<add> public function connectToCluster(array $config, array $clusterOptions, array $options)
<add> {
<add> return new PredisClusterConnection(new Client(array_values($config), array_merge(
<add> $options, $clusterOptions, Arr::pull($config, 'options', [])
<add> )));
<add> }
<add>}
<ide><path>src/Illuminate/Redis/Database.php
<del><?php
<del>
<del>namespace Illuminate\Redis;
<del>
<del>use Illuminate\Support\Arr;
<del>use Illuminate\Contracts\Redis\Database as DatabaseContract;
<del>
<del>abstract class Database implements DatabaseContract
<del>{
<del> /**
<del> * Get a specific Redis connection instance.
<del> *
<del> * @param string $name
<del> * @return \Predis\ClientInterface|\RedisCluster|\Redis|null
<del> */
<del> public function connection($name = 'default')
<del> {
<del> return Arr::get($this->clients, $name ?: 'default');
<del> }
<del>
<del> /**
<del> * Run a command against the Redis database.
<del> *
<del> * @param string $method
<del> * @param array $parameters
<del> * @return mixed
<del> */
<del> public function command($method, array $parameters = [])
<del> {
<del> return call_user_func_array([$this->clients['default'], $method], $parameters);
<del> }
<del>
<del> /**
<del> * Dynamically make a Redis command.
<del> *
<del> * @param string $method
<del> * @param array $parameters
<del> * @return mixed
<del> */
<del> public function __call($method, $parameters)
<del> {
<del> return $this->command($method, $parameters);
<del> }
<del>}
<ide><path>src/Illuminate/Redis/PhpRedisDatabase.php
<del><?php
<del>
<del>namespace Illuminate\Redis;
<del>
<del>use Redis;
<del>use Closure;
<del>use RedisCluster;
<del>use Illuminate\Support\Arr;
<del>
<del>class PhpRedisDatabase extends Database
<del>{
<del> /**
<del> * The host address of the database.
<del> *
<del> * @var array
<del> */
<del> public $clients;
<del>
<del> /**
<del> * Create a new Redis connection instance.
<del> *
<del> * @param array $servers
<del> * @return void
<del> */
<del> public function __construct(array $servers = [])
<del> {
<del> $clusters = (array) Arr::pull($servers, 'clusters');
<del>
<del> $options = (array) Arr::pull($servers, 'options');
<del>
<del> $this->clients = $this->createSingleClients($servers, $options);
<del>
<del> $this->createClusters($clusters, $options);
<del> }
<del>
<del> /**
<del> * Create an array of single connection clients.
<del> *
<del> * @param array $servers
<del> * @param array $options
<del> * @return array
<del> */
<del> protected function createSingleClients(array $servers, array $options = [])
<del> {
<del> $clients = [];
<del>
<del> foreach ($servers as $key => $server) {
<del> $clients[$key] = $this->createRedisInstance($server, $options);
<del> }
<del>
<del> return $clients;
<del> }
<del>
<del> /**
<del> * Create multiple clusters (aggregate clients).
<del> *
<del> * @param array $clusters
<del> * @param array $options
<del> * @return void
<del> */
<del> protected function createClusters(array $clusters, array $options = [])
<del> {
<del> $options = array_merge($options, (array) Arr::pull($clusters, 'options'));
<del>
<del> foreach ($clusters as $name => $servers) {
<del> $this->clients[$name] = $this->createAggregateClient($servers, array_merge(
<del> $options, (array) Arr::pull($servers, 'options')
<del> ));
<del> }
<del> }
<del>
<del> /**
<del> * Create a new aggregate client supporting sharding.
<del> *
<del> * @param array $servers
<del> * @param array $options
<del> * @return array
<del> */
<del> protected function createAggregateClient(array $servers, array $options = [])
<del> {
<del> return $this->createRedisClusterInstance(
<del> array_map([$this, 'buildClusterConnectionString'], $servers), $options
<del> );
<del> }
<del>
<del> /**
<del> * Subscribe to a set of given channels for messages.
<del> *
<del> * @param array|string $channels
<del> * @param \Closure $callback
<del> * @param string $connection
<del> * @return void
<del> */
<del> public function subscribe($channels, Closure $callback, $connection = null)
<del> {
<del> $this->connection($connection)->subscribe((array) $channels, function ($redis, $channel, $message) use ($callback) {
<del> $callback($message, $channel);
<del> });
<del> }
<del>
<del> /**
<del> * Subscribe to a set of given channels with wildcards.
<del> *
<del> * @param array|string $channels
<del> * @param \Closure $callback
<del> * @param string $connection
<del> * @return void
<del> */
<del> public function psubscribe($channels, Closure $callback, $connection = null)
<del> {
<del> $this->connection($connection)->psubscribe((array) $channels, function ($redis, $pattern, $channel, $message) use ($callback) {
<del> $callback($message, $channel);
<del> });
<del> }
<del>
<del> /**
<del> * Create a new redis instance.
<del> *
<del> * @param array $server
<del> * @param array $options
<del> * @return \Redis
<del> */
<del> protected function createRedisInstance(array $server, array $options)
<del> {
<del> $client = new Redis;
<del>
<del> $timeout = empty($server['timeout']) ? 0 : $server['timeout'];
<del>
<del> if (isset($server['persistent']) && $server['persistent']) {
<del> $client->pconnect($server['host'], $server['port'], $timeout);
<del> } else {
<del> $client->connect($server['host'], $server['port'], $timeout);
<del> }
<del>
<del> if (! empty($server['prefix'])) {
<del> $client->setOption(Redis::OPT_PREFIX, $server['prefix']);
<del> }
<del>
<del> if (! empty($server['read_timeout'])) {
<del> $client->setOption(Redis::OPT_READ_TIMEOUT, $server['read_timeout']);
<del> }
<del>
<del> if (! empty($server['password'])) {
<del> $client->auth($server['password']);
<del> }
<del>
<del> if (! empty($server['database'])) {
<del> $client->select($server['database']);
<del> }
<del>
<del> return $client;
<del> }
<del>
<del> /**
<del> * Create a new redis cluster instance.
<del> *
<del> * @param array $servers
<del> * @param array $options
<del> * @return \RedisCluster
<del> */
<del> protected function createRedisClusterInstance(array $servers, array $options)
<del> {
<del> return new RedisCluster(
<del> null,
<del> array_values($servers),
<del> Arr::get($options, 'timeout', 0),
<del> Arr::get($options, 'read_timeout', 0),
<del> isset($options['persistent']) && $options['persistent']
<del> );
<del> }
<del>
<del> /**
<del> * Build a single cluster seed string from array.
<del> *
<del> * @param array $server
<del> * @return string
<del> */
<del> protected function buildClusterConnectionString(array $server)
<del> {
<del> return $server['host'].':'.$server['port'].'?'.http_build_query(Arr::only($server, [
<del> 'database', 'password', 'prefix', 'read_timeout',
<del> ]));
<del> }
<del>}
<ide><path>src/Illuminate/Redis/PredisDatabase.php
<del><?php
<del>
<del>namespace Illuminate\Redis;
<del>
<del>use Closure;
<del>use Predis\Client;
<del>use Illuminate\Support\Arr;
<del>
<del>class PredisDatabase extends Database
<del>{
<del> /**
<del> * The host address of the database.
<del> *
<del> * @var array
<del> */
<del> public $clients;
<del>
<del> /**
<del> * Create a new Redis connection instance.
<del> *
<del> * @param array $servers
<del> * @return void
<del> */
<del> public function __construct(array $servers = [])
<del> {
<del> $clusters = (array) Arr::pull($servers, 'clusters');
<del>
<del> $options = array_merge(['timeout' => 10.0], (array) Arr::pull($servers, 'options'));
<del>
<del> $this->clients = $this->createSingleClients($servers, $options);
<del>
<del> $this->createClusters($clusters, $options);
<del> }
<del>
<del> /**
<del> * Create an array of single connection clients.
<del> *
<del> * @param array $servers
<del> * @param array $options
<del> * @return array
<del> */
<del> protected function createSingleClients(array $servers, array $options = [])
<del> {
<del> $clients = [];
<del>
<del> foreach ($servers as $key => $server) {
<del> $clients[$key] = new Client($server, $options);
<del> }
<del>
<del> return $clients;
<del> }
<del>
<del> /**
<del> * Create multiple clusters (aggregate clients).
<del> *
<del> * @param array $clusters
<del> * @param array $options
<del> * @return void
<del> */
<del> protected function createClusters(array $clusters, array $options = [])
<del> {
<del> $options = array_merge($options, (array) Arr::pull($clusters, 'options'));
<del>
<del> foreach ($clusters as $name => $servers) {
<del> $this->clients += $this->createAggregateClient($name, $servers, array_merge(
<del> $options, (array) Arr::pull($servers, 'options')
<del> ));
<del> }
<del> }
<del>
<del> /**
<del> * Create a new aggregate client supporting sharding.
<del> *
<del> * @param string $name
<del> * @param array $servers
<del> * @param array $options
<del> * @return array
<del> */
<del> protected function createAggregateClient($name, array $servers, array $options = [])
<del> {
<del> return [$name => new Client(array_values($servers), $options)];
<del> }
<del>
<del> /**
<del> * Subscribe to a set of given channels for messages.
<del> *
<del> * @param array|string $channels
<del> * @param \Closure $callback
<del> * @param string $connection
<del> * @param string $method
<del> * @return void
<del> */
<del> public function subscribe($channels, Closure $callback, $connection = null, $method = 'subscribe')
<del> {
<del> $loop = $this->connection($connection)->pubSubLoop();
<del>
<del> call_user_func_array([$loop, $method], (array) $channels);
<del>
<del> foreach ($loop as $message) {
<del> if ($message->kind === 'message' || $message->kind === 'pmessage') {
<del> call_user_func($callback, $message->payload, $message->channel);
<del> }
<del> }
<del>
<del> unset($loop);
<del> }
<del>
<del> /**
<del> * Subscribe to a set of given channels with wildcards.
<del> *
<del> * @param array|string $channels
<del> * @param \Closure $callback
<del> * @param string $connection
<del> * @return void
<del> */
<del> public function psubscribe($channels, Closure $callback, $connection = null)
<del> {
<del> $this->subscribe($channels, $callback, $connection, __FUNCTION__);
<del> }
<del>}
<ide><path>src/Illuminate/Redis/RedisManager.php
<add><?php
<add>
<add>namespace Illuminate\Redis;
<add>
<add>use Illuminate\Support\Arr;
<add>use InvalidArgumentException;
<add>use Illuminate\Contracts\Redis\Factory;
<add>
<add>class RedisManager implements Factory
<add>{
<add> /**
<add> * The name of the default driver.
<add> *
<add> * @var string
<add> */
<add> protected $driver;
<add>
<add> /**
<add> * The Redis server configurations.
<add> *
<add> * @var array
<add> */
<add> protected $config;
<add>
<add> /**
<add> * Create a new Redis manager instance.
<add> *
<add> * @param string $driver
<add> * @param array $config
<add> */
<add> public function __construct($driver, array $config)
<add> {
<add> $this->driver = $driver;
<add> $this->config = $config;
<add> }
<add>
<add> /**
<add> * Get a Redis connection by name.
<add> *
<add> * @param string $name
<add> * @return \Illuminate\Redis\Connection
<add> */
<add> public function connection($name = null)
<add> {
<add> $name = $name ?: 'default';
<add>
<add> if (isset($this->connections[$name])) {
<add> return $this->connections[$name];
<add> }
<add>
<add> return $this->connections[$name] = $this->resolve($name);
<add> }
<add>
<add> /**
<add> * Resolve the given connection by name.
<add> *
<add> * @param string $name
<add> * @return \Illuminate\Redis\Connection
<add> */
<add> protected function resolve($name)
<add> {
<add> $options = Arr::get($this->config, 'options', []);
<add>
<add> if (isset($this->config[$name])) {
<add> return $this->connector()->connect($this->config[$name], $options);
<add> }
<add>
<add> if (isset($this->config['clusters'][$name])) {
<add> $clusterOptions = Arr::get($this->config, 'clusters.options', []);
<add>
<add> return $this->connector()->connectToCluster(
<add> $this->config['clusters'][$name], $clusterOptions, $options
<add> );
<add> }
<add>
<add> throw new InvalidArgumentException("Redis connection [{$name}] not configured.");
<add> }
<add>
<add> /**
<add> * Get the connector instance for the current driver.
<add> *
<add> * @return mixed
<add> */
<add> protected function connector()
<add> {
<add> switch ($this->driver)
<add> {
<add> case 'predis':
<add> return new Connectors\PredisConnector;
<add> case 'phpredis':
<add> return new Connectors\PhpRedisConnector;
<add> }
<add> }
<add>
<add> /**
<add> * Pass methods onto the default Redis connection.
<add> *
<add> * @param string $method
<add> * @param array $parameters
<add> * @return mixed
<add> */
<add> public function __call($method, $parameters)
<add> {
<add> return $this->connection()->{$method}(...$parameters);
<add> }
<add>}
<ide><path>src/Illuminate/Redis/RedisServiceProvider.php
<ide> class RedisServiceProvider extends ServiceProvider
<ide> public function register()
<ide> {
<ide> $this->app->singleton('redis', function ($app) {
<del> $servers = $app['config']['database.redis'];
<add> $config = $app->make('config')->get('database.redis');
<ide>
<del> $client = Arr::pull($servers, 'client', 'predis');
<del>
<del> if ($client === 'phpredis') {
<del> return new PhpRedisDatabase($servers);
<del> } else {
<del> return new PredisDatabase($servers);
<del> }
<add> return new RedisManager(Arr::pull($config, 'client', 'predis'), $config);
<ide> });
<ide> }
<ide>
<ide><path>src/Illuminate/Support/Facades/Redis.php
<ide> namespace Illuminate\Support\Facades;
<ide>
<ide> /**
<del> * @see \Illuminate\Redis\Database
<del> * @see \Illuminate\Redis\PredisDatabase
<del> * @see \Illuminate\Redis\PhpRedisDatabase
<del> * @see \Illuminate\Contracts\Redis\Database
<add> * @see \Illuminate\Redis\RedisManager
<add> * @see \Illuminate\Contracts\Redis\Factory
<ide> */
<ide> class Redis extends Facade
<ide> {
<ide><path>tests/Cache/CacheRedisStoreTest.php
<ide> public function testGetAndSetPrefix()
<ide>
<ide> protected function getRedis()
<ide> {
<del> return new Illuminate\Cache\RedisStore(m::mock('Illuminate\Redis\PredisDatabase'), 'prefix');
<add> return new Illuminate\Cache\RedisStore(m::mock('Illuminate\Contracts\Redis\Factory'), 'prefix');
<ide> }
<ide> }
<ide><path>tests/Queue/QueueRedisQueueTest.php
<ide> public function tearDown()
<ide>
<ide> public function testPushProperlyPushesJobOntoRedis()
<ide> {
<del> $queue = $this->getMockBuilder('Illuminate\Queue\RedisQueue')->setMethods(['getRandomId'])->setConstructorArgs([$redis = m::mock('Illuminate\Redis\PredisDatabase'), 'default'])->getMock();
<add> $queue = $this->getMockBuilder('Illuminate\Queue\RedisQueue')->setMethods(['getRandomId'])->setConstructorArgs([$redis = m::mock('Illuminate\Contracts\Redis\Factory'), 'default'])->getMock();
<ide> $queue->expects($this->once())->method('getRandomId')->will($this->returnValue('foo'));
<ide> $redis->shouldReceive('connection')->once()->andReturn($redis);
<ide> $redis->shouldReceive('rpush')->once()->with('queues:default', json_encode(['job' => 'foo', 'data' => ['data'], 'id' => 'foo', 'attempts' => 1]));
<ide> public function testPushProperlyPushesJobOntoRedis()
<ide>
<ide> public function testDelayedPushProperlyPushesJobOntoRedis()
<ide> {
<del> $queue = $this->getMockBuilder('Illuminate\Queue\RedisQueue')->setMethods(['getSeconds', 'getTime', 'getRandomId'])->setConstructorArgs([$redis = m::mock('Illuminate\Redis\PredisDatabase'), 'default'])->getMock();
<add> $queue = $this->getMockBuilder('Illuminate\Queue\RedisQueue')->setMethods(['getSeconds', 'getTime', 'getRandomId'])->setConstructorArgs([$redis = m::mock('Illuminate\Contracts\Redis\Factory'), 'default'])->getMock();
<ide> $queue->expects($this->once())->method('getRandomId')->will($this->returnValue('foo'));
<ide> $queue->expects($this->once())->method('getSeconds')->with(1)->will($this->returnValue(1));
<ide> $queue->expects($this->once())->method('getTime')->will($this->returnValue(1));
<ide> public function testDelayedPushProperlyPushesJobOntoRedis()
<ide> public function testDelayedPushWithDateTimeProperlyPushesJobOntoRedis()
<ide> {
<ide> $date = Carbon\Carbon::now();
<del> $queue = $this->getMockBuilder('Illuminate\Queue\RedisQueue')->setMethods(['getSeconds', 'getTime', 'getRandomId'])->setConstructorArgs([$redis = m::mock('Illuminate\Redis\PredisDatabase'), 'default'])->getMock();
<add> $queue = $this->getMockBuilder('Illuminate\Queue\RedisQueue')->setMethods(['getSeconds', 'getTime', 'getRandomId'])->setConstructorArgs([$redis = m::mock('Illuminate\Contracts\Redis\Factory'), 'default'])->getMock();
<ide> $queue->expects($this->once())->method('getRandomId')->will($this->returnValue('foo'));
<ide> $queue->expects($this->once())->method('getSeconds')->with($date)->will($this->returnValue(1));
<ide> $queue->expects($this->once())->method('getTime')->will($this->returnValue(1));
<ide><path>tests/Redis/InteractsWithRedis.php
<ide> <?php
<ide>
<ide>
<del>use Illuminate\Redis\PredisDatabase;
<add>use Illuminate\Redis\RedisManager;
<ide>
<ide> trait InteractsWithRedis
<ide> {
<ide> trait InteractsWithRedis
<ide> private static $connectionFailedOnceWithDefaultsSkip = false;
<ide>
<ide> /**
<del> * @var PredisDatabase
<add> * @var RedisManager
<ide> */
<ide> private $redis;
<ide>
<ide> public function setUpRedis()
<ide> return;
<ide> }
<ide>
<del> $this->redis = new PredisDatabase([
<add> $this->redis = new RedisManager('predis', [
<ide> 'cluster' => false,
<ide> 'default' => [
<ide> 'host' => $host,
<ide><path>tests/Redis/PhpRedisConnectionTest.php
<del><?php
<del>
<del>use Illuminate\Redis\PhpRedisDatabase;
<del>
<del>class PhpRedisConnectionTest extends PHPUnit_Framework_TestCase
<del>{
<del> public function testPhpRedisNotCreateClusterAndOptionsAndClustersServer()
<del> {
<del> $redis = $this->getRedis();
<del>
<del> $client = $redis->connection('cluster');
<del> $this->assertNull($client, 'cluster parameter should not create as redis server');
<del>
<del> $client = $redis->connection('options');
<del> $this->assertNull($client, 'options parameter should not create as redis server');
<del>
<del> $client = $redis->connection('clusters');
<del> $this->assertNull($client, 'clusters parameter should not create as redis server');
<del> }
<del>
<del> public function testPhpRedisClusterNotCreateClusterAndOptionsServer()
<del> {
<del> $redis = $this->getRedis();
<del> $this->assertEquals(['default', 'cluster-1', 'cluster-2'], array_keys($redis->clients));
<del> }
<del>
<del> public function testPhpRedisClusterCreateMultipleClustersAndNotCreateOptionsServer()
<del> {
<del> $redis = $this->getRedis();
<del>
<del> $clusterOne = $redis->connection('cluster-1');
<del> $clusterTwo = $redis->connection('cluster-2');
<del>
<del> $this->assertInstanceOf(RedisClusterStub::class, $clusterOne);
<del> $this->assertInstanceOf(RedisClusterStub::class, $clusterTwo);
<del>
<del> $client = $redis->connection('options');
<del> $this->assertNull($client, 'options parameter should not create as redis server');
<del> }
<del>
<del> protected function getRedis()
<del> {
<del> $servers = [
<del> 'default' => [
<del> 'host' => '127.0.0.1',
<del> 'port' => 6379,
<del> 'database' => 0,
<del> ],
<del> 'options' => [
<del> 'prefix' => 'prefix:',
<del> ],
<del> 'clusters' => [
<del> 'options' => [
<del> 'prefix' => 'cluster:',
<del> ],
<del> 'cluster-1' => [
<del> [
<del> 'host' => '127.0.0.1',
<del> 'port' => 6379,
<del> 'database' => 0,
<del> ],
<del> ],
<del> 'cluster-2' => [
<del> [
<del> 'host' => '127.0.0.1',
<del> 'port' => 6379,
<del> 'database' => 0,
<del> ],
<del> ],
<del> ],
<del> ];
<del>
<del> return new PhpRedisDatabaseStub($servers);
<del> }
<del>}
<del>
<del>class PhpRedisDatabaseStub extends PhpRedisDatabase
<del>{
<del> protected function createRedisClusterInstance(array $servers, array $options)
<del> {
<del> return new RedisClusterStub();
<del> }
<del>
<del> protected function createRedisInstance(array $server, array $options)
<del> {
<del> return new RedisStub;
<del> }
<del>}
<del>
<del>class RedisStub
<del>{
<del>}
<del>
<del>class RedisClusterStub
<del>{
<del>}
<ide><path>tests/Redis/RedisConnectionTest.php
<del><?php
<del>
<del>class RedisConnectionTest extends PHPUnit_Framework_TestCase
<del>{
<del> public function testRedisNotCreateClusterAndOptionsAndClustersServer()
<del> {
<del> $redis = $this->getRedis();
<del>
<del> $client = $redis->connection('cluster');
<del> $this->assertNull($client, 'cluster parameter should not create as redis server');
<del>
<del> $client = $redis->connection('options');
<del> $this->assertNull($client, 'options parameter should not create as redis server');
<del>
<del> $client = $redis->connection('clusters');
<del> $this->assertNull($client, 'clusters parameter should not create as redis server');
<del> }
<del>
<del> public function testRedisClusterNotCreateClusterAndOptionsServer()
<del> {
<del> $redis = $this->getRedis();
<del> $this->assertEquals(['default', 'cluster-1', 'cluster-2'], array_keys($redis->clients));
<del> }
<del>
<del> public function testRedisClusterCreateMultipleClustersAndNotCreateOptionsServer()
<del> {
<del> $redis = $this->getRedis();
<del> $clusterOne = $redis->connection('cluster-1');
<del> $clusterTwo = $redis->connection('cluster-2');
<del>
<del> $this->assertCount(1, $clusterOne->getConnection());
<del> $this->assertCount(1, $clusterTwo->getConnection());
<del>
<del> $client = $redis->connection('options');
<del> $this->assertNull($client, 'options parameter should not create as redis server');
<del> }
<del>
<del> protected function getRedis()
<del> {
<del> $servers = [
<del> 'default' => [
<del> 'host' => '127.0.0.1',
<del> 'port' => 6379,
<del> 'database' => 0,
<del> ],
<del> 'options' => [
<del> 'prefix' => 'prefix:',
<del> ],
<del> 'clusters' => [
<del> 'options' => [
<del> 'prefix' => 'cluster:',
<del> ],
<del> 'cluster-1' => [
<del> [
<del> 'host' => '127.0.0.1',
<del> 'port' => 6379,
<del> 'database' => 0,
<del> ],
<del> ],
<del> 'cluster-2' => [
<del> [
<del> 'host' => '127.0.0.1',
<del> 'port' => 6379,
<del> 'database' => 0,
<del> ],
<del> ],
<del> ],
<del> ];
<del>
<del> return new Illuminate\Redis\PredisDatabase($servers);
<del> }
<del>} | 26 |
Javascript | Javascript | update example to use a module | e1f3c0cce84f597af6920a5e32be4a1673c91fab | <ide><path>src/ng/compile.js
<ide> * to illustrate how `$compile` works.
<ide> * </div>
<ide> *
<del> <example module="compile">
<add> <example module="compileExample">
<ide> <file name="index.html">
<ide> <script>
<del> angular.module('compile', [], function($compileProvider) {
<add> angular.module('compileExample', [], function($compileProvider) {
<ide> // configure new 'compile' directive by passing a directive
<ide> // factory function. The factory function injects the '$compile'
<ide> $compileProvider.directive('compile', function($compile) {
<ide> }
<ide> );
<ide> };
<del> })
<del> });
<del>
<del> function Ctrl($scope) {
<add> });
<add> })
<add> .controller('GreeterController', ['$scope', function($scope) {
<ide> $scope.name = 'Angular';
<ide> $scope.html = 'Hello {{name}}';
<del> }
<add> }]);
<ide> </script>
<del> <div ng-controller="Ctrl">
<add> <div ng-controller="GreeterController">
<ide> <input ng-model="name"> <br>
<ide> <textarea ng-model="html"></textarea> <br>
<ide> <div compile="html"></div> | 1 |
Python | Python | use self.get_temp_dir and simplify the test | 674506c409d5c1a71c9ee2a974fa94a2b94b6e84 | <ide><path>official/nlp/modeling/layers/tn_expand_condense_test.py
<ide> """Tests for ExpandCondense tensor network layer."""
<ide>
<ide> import os
<del>import shutil
<ide>
<ide> from absl.testing import parameterized
<ide> import numpy as np
<ide> def test_model_save(self, input_dim, proj_multiple):
<ide> # Train the model for 5 epochs
<ide> model.fit(data, self.labels, epochs=5, batch_size=32)
<ide>
<del> for save_path in ['/test_model', '/test_model.h5']:
<del> # Save model to a SavedModel folder or h5 file, then load model
<del> save_path = os.environ['TEST_UNDECLARED_OUTPUTS_DIR'] + save_path
<del> model.save(save_path)
<del> loaded_model = tf.keras.models.load_model(save_path)
<add> save_path = os.path.join(self.get_temp_dir(), 'test_model')
<add> model.save(save_path)
<add> loaded_model = tf.keras.models.load_model(save_path)
<ide>
<del> # Clean up SavedModel folder
<del> if os.path.isdir(save_path):
<del> shutil.rmtree(save_path)
<del>
<del> # Clean up h5 file
<del> if os.path.exists(save_path):
<del> os.remove(save_path)
<del>
<del> # Compare model predictions and loaded_model predictions
<del> self.assertAllEqual(model.predict(data), loaded_model.predict(data))
<add> # Compare model predictions and loaded_model predictions
<add> self.assertAllEqual(model.predict(data), loaded_model.predict(data))
<ide>
<ide> if __name__ == '__main__':
<ide> tf.test.main() | 1 |
Text | Text | add two white spaces | 2bed4878b86666e4a9dd947274d94a1ac8a7645e | <ide><path>guide/english/data-science-tools/pandas/index.md
<ide> import pandas as pd
<ide> ```
<ide>
<ide> ## Data frames
<del>A data frame consist of a number of rows and column. Each column represents a feature of the data set, and so has a name and a data type. Each row represents a data point through associated feature values. The pandas library allows you to manipulate the data in a data frame in various ways. pandas has a lot of possibilities, so the following is merely scratching the surface to give a feel for the library.
<add>A data frame consists of a number of rows and columns. Each column represents a feature of the data set, and so, has a name and a data type. Each row represents a data point through associated feature values. The pandas library allows you to manipulate the data in a data frame in various ways. pandas has a lot of possibilities, so the following is merely scratching the surface to give you a feel for the library.
<add>
<ide> ## Series
<del>Series is the basic data-type in pandas.A Series is very similar to a array (NumPy array) (in fact it is built on top of the NumPy array object).A Series can have axis labels, as it can be indexed by a label with no number indexing for the location of data. It can hold any valid Python Object like List,Dictionary etc.
<add>Series is the basic data-type in pandas. A Series is very similar to an array (NumPy array, in fact it is built on top of the NumPy array object). A Series can have axis labels, as it can be indexed by a label with no number indexing for the location of data. It can hold any valid Python Object like List, Dictionary etc.
<ide>
<ide> ## Loading data from a csv file
<ide> A `.csv` file is a *comma separated value* file. A very common way to store data. To load such data into a pandas data frame use the `read_csv` method: | 1 |
Ruby | Ruby | fix tests on 1.8 | 3265bbb65998d8175f3cd087f355a007bf4d2d47 | <ide><path>activesupport/test/buffered_logger_test.rb
<ide> def test_should_auto_flush_every_n_messages
<ide> def test_should_create_the_log_directory_if_it_doesnt_exist
<ide> tmp_directory = File.join(File.dirname(__FILE__), "tmp")
<ide> log_file = File.join(tmp_directory, "development.log")
<del> assert !File.exist?(tmp_directory)
<add> FileUtils.rm_rf(tmp_directory)
<ide> @logger = Logger.new(log_file)
<ide> assert File.exist?(tmp_directory)
<del> ensure
<del> FileUtils.rm_rf(tmp_directory)
<ide> end
<ide>
<ide> def test_logger_should_maintain_separate_buffers_for_each_thread
<ide><path>activesupport/test/file_watcher_test.rb
<ide> def test_overlapping_watchers
<ide>
<ide> module FSSM::Backends
<ide> class Polling
<del> def initialize(options={})
<del> @handlers = []
<del> @latency = options[:latency] || 0.1
<del> end
<del>
<del> def add_handler(handler, preload=true)
<del> handler.refresh(nil, true) if preload
<del> @handlers << handler
<del> end
<del>
<del> def run
<del> begin
<del> loop do
<del> start = Time.now.to_f
<del> @handlers.each { |handler| handler.refresh }
<del> nap_time = @latency - (Time.now.to_f - start)
<del> sleep nap_time if nap_time > 0
<del> end
<del> rescue Interrupt
<del> end
<add> def initialize_with_low_latency(options={})
<add> initialize_without_low_latency(options.merge(:latency => 0.1))
<ide> end
<add> alias_method_chain :initialize, :low_latency
<ide> end
<ide> end
<ide>
<ide> def initialize(path, watcher)
<ide> super
<ide>
<ide> monitor = FSSM::Monitor.new
<del> monitor.path(path, '**/*') do |monitor|
<del> monitor.update { |base, relative| trigger relative => :changed }
<del> monitor.delete { |base, relative| trigger relative => :deleted }
<del> monitor.create { |base, relative| trigger relative => :created }
<add> monitor.path(path, '**/*') do |p|
<add> p.update { |base, relative| trigger relative => :changed }
<add> p.delete { |base, relative| trigger relative => :deleted }
<add> p.create { |base, relative| trigger relative => :created }
<ide> end
<ide>
<ide> @thread = Thread.new do | 2 |
Python | Python | add test for | fd187d71ade16595fbc9a6064601714d19c8efa9 | <ide><path>spacy/tests/regression/test_issue1727.py
<add>from __future__ import unicode_literals
<add>import numpy
<add>from ...pipeline import Tagger
<add>from ...vectors import Vectors
<add>from ...vocab import Vocab
<add>from ..util import make_tempdir
<add>
<add>
<add>def test_issue1727():
<add> data = numpy.ones((3, 300), dtype='f')
<add> keys = [u'I', u'am', u'Matt']
<add> vectors = Vectors(data=data, keys=keys)
<add> tagger = Tagger(Vocab())
<add> tagger.add_label('PRP')
<add> tagger.begin_training()
<add>
<add> assert tagger.cfg.get('pretrained_dims', 0) == 0
<add> tagger.vocab.vectors = vectors
<add>
<add> with make_tempdir() as path:
<add> tagger.to_disk(path)
<add> tagger = Tagger(Vocab()).from_disk(path)
<add> assert tagger.cfg.get('pretrained_dims', 0) == 0 | 1 |
Python | Python | set version to v2.1.0a5 | 978d8be8f91b930615acf3043d18f86700f77124 | <ide><path>spacy/about.py
<ide> # fmt: off
<ide>
<ide> __title__ = "spacy-nightly"
<del>__version__ = "2.1.0a5.dev0"
<add>__version__ = "2.1.0a5"
<ide> __summary__ = "Industrial-strength Natural Language Processing (NLP) with Python and Cython"
<ide> __uri__ = "https://spacy.io"
<ide> __author__ = "Explosion AI"
<ide> __email__ = "[email protected]"
<ide> __license__ = "MIT"
<del>__release__ = False
<add>__release__ = True
<ide>
<ide> __download_url__ = "https://github.com/explosion/spacy-models/releases/download"
<ide> __compatibility__ = "https://raw.githubusercontent.com/explosion/spacy-models/master/compatibility.json" | 1 |
Go | Go | remove deprecated cmd function in integration-cli | 7fbbd515b1018721e91199960d1933383a8262a1 | <ide><path>integration-cli/docker_cli_build_test.go
<ide> func TestBuildNoContext(t *testing.T) {
<ide> t.Fatalf("build failed to complete: %v %v", out, err)
<ide> }
<ide>
<del> if out, _, err := cmd(t, "run", "--rm", "nocontext"); out != "ok\n" || err != nil {
<add> if out, _, err := dockerCmd(t, "run", "--rm", "nocontext"); out != "ok\n" || err != nil {
<ide> t.Fatalf("run produced invalid output: %q, expected %q", out, "ok")
<ide> }
<ide>
<ide><path>integration-cli/docker_cli_cp_test.go
<ide> const (
<ide> // Test for #5656
<ide> // Check that garbage paths don't escape the container's rootfs
<ide> func TestCpGarbagePath(t *testing.T) {
<del> out, exitCode, err := cmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath)
<add> out, exitCode, err := dockerCmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath)
<ide> if err != nil || exitCode != 0 {
<ide> t.Fatal("failed to create a container", out, err)
<ide> }
<ide>
<ide> cleanedContainerID := stripTrailingCharacters(out)
<ide> defer deleteContainer(cleanedContainerID)
<ide>
<del> out, _, err = cmd(t, "wait", cleanedContainerID)
<add> out, _, err = dockerCmd(t, "wait", cleanedContainerID)
<ide> if err != nil || stripTrailingCharacters(out) != "0" {
<ide> t.Fatal("failed to set up container", out, err)
<ide> }
<ide> func TestCpGarbagePath(t *testing.T) {
<ide>
<ide> path := filepath.Join("../../../../../../../../../../../../", cpFullPath)
<ide>
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<ide> if err != nil {
<ide> t.Fatalf("couldn't copy from garbage path: %s:%s %s", cleanedContainerID, path, err)
<ide> }
<ide> func TestCpGarbagePath(t *testing.T) {
<ide>
<ide> // Check that relative paths are relative to the container's rootfs
<ide> func TestCpRelativePath(t *testing.T) {
<del> out, exitCode, err := cmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath)
<add> out, exitCode, err := dockerCmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath)
<ide> if err != nil || exitCode != 0 {
<ide> t.Fatal("failed to create a container", out, err)
<ide> }
<ide>
<ide> cleanedContainerID := stripTrailingCharacters(out)
<ide> defer deleteContainer(cleanedContainerID)
<ide>
<del> out, _, err = cmd(t, "wait", cleanedContainerID)
<add> out, _, err = dockerCmd(t, "wait", cleanedContainerID)
<ide> if err != nil || stripTrailingCharacters(out) != "0" {
<ide> t.Fatal("failed to set up container", out, err)
<ide> }
<ide> func TestCpRelativePath(t *testing.T) {
<ide>
<ide> path, _ := filepath.Rel("/", cpFullPath)
<ide>
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<ide> if err != nil {
<ide> t.Fatalf("couldn't copy from relative path: %s:%s %s", cleanedContainerID, path, err)
<ide> }
<ide> func TestCpRelativePath(t *testing.T) {
<ide>
<ide> // Check that absolute paths are relative to the container's rootfs
<ide> func TestCpAbsolutePath(t *testing.T) {
<del> out, exitCode, err := cmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath)
<add> out, exitCode, err := dockerCmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath)
<ide> if err != nil || exitCode != 0 {
<ide> t.Fatal("failed to create a container", out, err)
<ide> }
<ide>
<ide> cleanedContainerID := stripTrailingCharacters(out)
<ide> defer deleteContainer(cleanedContainerID)
<ide>
<del> out, _, err = cmd(t, "wait", cleanedContainerID)
<add> out, _, err = dockerCmd(t, "wait", cleanedContainerID)
<ide> if err != nil || stripTrailingCharacters(out) != "0" {
<ide> t.Fatal("failed to set up container", out, err)
<ide> }
<ide> func TestCpAbsolutePath(t *testing.T) {
<ide>
<ide> path := cpFullPath
<ide>
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<ide> if err != nil {
<ide> t.Fatalf("couldn't copy from absolute path: %s:%s %s", cleanedContainerID, path, err)
<ide> }
<ide> func TestCpAbsolutePath(t *testing.T) {
<ide> // Test for #5619
<ide> // Check that absolute symlinks are still relative to the container's rootfs
<ide> func TestCpAbsoluteSymlink(t *testing.T) {
<del> out, exitCode, err := cmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath+" && ln -s "+cpFullPath+" container_path")
<add> out, exitCode, err := dockerCmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath+" && ln -s "+cpFullPath+" container_path")
<ide> if err != nil || exitCode != 0 {
<ide> t.Fatal("failed to create a container", out, err)
<ide> }
<ide>
<ide> cleanedContainerID := stripTrailingCharacters(out)
<ide> defer deleteContainer(cleanedContainerID)
<ide>
<del> out, _, err = cmd(t, "wait", cleanedContainerID)
<add> out, _, err = dockerCmd(t, "wait", cleanedContainerID)
<ide> if err != nil || stripTrailingCharacters(out) != "0" {
<ide> t.Fatal("failed to set up container", out, err)
<ide> }
<ide> func TestCpAbsoluteSymlink(t *testing.T) {
<ide>
<ide> path := filepath.Join("/", "container_path")
<ide>
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<ide> if err != nil {
<ide> t.Fatalf("couldn't copy from absolute path: %s:%s %s", cleanedContainerID, path, err)
<ide> }
<ide> func TestCpAbsoluteSymlink(t *testing.T) {
<ide> // Test for #5619
<ide> // Check that symlinks which are part of the resource path are still relative to the container's rootfs
<ide> func TestCpSymlinkComponent(t *testing.T) {
<del> out, exitCode, err := cmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath+" && ln -s "+cpTestPath+" container_path")
<add> out, exitCode, err := dockerCmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "mkdir -p '"+cpTestPath+"' && echo -n '"+cpContainerContents+"' > "+cpFullPath+" && ln -s "+cpTestPath+" container_path")
<ide> if err != nil || exitCode != 0 {
<ide> t.Fatal("failed to create a container", out, err)
<ide> }
<ide>
<ide> cleanedContainerID := stripTrailingCharacters(out)
<ide> defer deleteContainer(cleanedContainerID)
<ide>
<del> out, _, err = cmd(t, "wait", cleanedContainerID)
<add> out, _, err = dockerCmd(t, "wait", cleanedContainerID)
<ide> if err != nil || stripTrailingCharacters(out) != "0" {
<ide> t.Fatal("failed to set up container", out, err)
<ide> }
<ide> func TestCpSymlinkComponent(t *testing.T) {
<ide>
<ide> path := filepath.Join("/", "container_path", cpTestName)
<ide>
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":"+path, tmpdir)
<ide> if err != nil {
<ide> t.Fatalf("couldn't copy from symlink path component: %s:%s %s", cleanedContainerID, path, err)
<ide> }
<ide> func TestCpSymlinkComponent(t *testing.T) {
<ide>
<ide> // Check that cp with unprivileged user doesn't return any error
<ide> func TestCpUnprivilegedUser(t *testing.T) {
<del> out, exitCode, err := cmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "touch "+cpTestName)
<add> out, exitCode, err := dockerCmd(t, "run", "-d", "busybox", "/bin/sh", "-c", "touch "+cpTestName)
<ide> if err != nil || exitCode != 0 {
<ide> t.Fatal("failed to create a container", out, err)
<ide> }
<ide>
<ide> cleanedContainerID := stripTrailingCharacters(out)
<ide> defer deleteContainer(cleanedContainerID)
<ide>
<del> out, _, err = cmd(t, "wait", cleanedContainerID)
<add> out, _, err = dockerCmd(t, "wait", cleanedContainerID)
<ide> if err != nil || stripTrailingCharacters(out) != "0" {
<ide> t.Fatal("failed to set up container", out, err)
<ide> }
<ide> func TestCpVolumePath(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide>
<del> out, exitCode, err := cmd(t, "run", "-d", "-v", "/foo", "-v", tmpDir+"/test:/test", "-v", tmpDir+":/baz", "busybox", "/bin/sh", "-c", "touch /foo/bar")
<add> out, exitCode, err := dockerCmd(t, "run", "-d", "-v", "/foo", "-v", tmpDir+"/test:/test", "-v", tmpDir+":/baz", "busybox", "/bin/sh", "-c", "touch /foo/bar")
<ide> if err != nil || exitCode != 0 {
<ide> t.Fatal("failed to create a container", out, err)
<ide> }
<ide>
<ide> cleanedContainerID := stripTrailingCharacters(out)
<ide> defer deleteContainer(cleanedContainerID)
<ide>
<del> out, _, err = cmd(t, "wait", cleanedContainerID)
<add> out, _, err = dockerCmd(t, "wait", cleanedContainerID)
<ide> if err != nil || stripTrailingCharacters(out) != "0" {
<ide> t.Fatal("failed to set up container", out, err)
<ide> }
<ide>
<ide> // Copy actual volume path
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":/foo", outDir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":/foo", outDir)
<ide> if err != nil {
<ide> t.Fatalf("couldn't copy from volume path: %s:%s %v", cleanedContainerID, "/foo", err)
<ide> }
<ide> func TestCpVolumePath(t *testing.T) {
<ide> }
<ide>
<ide> // Copy file nested in volume
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":/foo/bar", outDir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":/foo/bar", outDir)
<ide> if err != nil {
<ide> t.Fatalf("couldn't copy from volume path: %s:%s %v", cleanedContainerID, "/foo", err)
<ide> }
<ide> func TestCpVolumePath(t *testing.T) {
<ide> }
<ide>
<ide> // Copy Bind-mounted dir
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":/baz", outDir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":/baz", outDir)
<ide> if err != nil {
<ide> t.Fatalf("couldn't copy from bind-mounted volume path: %s:%s %v", cleanedContainerID, "/baz", err)
<ide> }
<ide> func TestCpVolumePath(t *testing.T) {
<ide> }
<ide>
<ide> // Copy file nested in bind-mounted dir
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":/baz/test", outDir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":/baz/test", outDir)
<ide> fb, err := ioutil.ReadFile(outDir + "/baz/test")
<ide> if err != nil {
<ide> t.Fatal(err)
<ide> func TestCpVolumePath(t *testing.T) {
<ide> }
<ide>
<ide> // Copy bind-mounted file
<del> _, _, err = cmd(t, "cp", cleanedContainerID+":/test", outDir)
<add> _, _, err = dockerCmd(t, "cp", cleanedContainerID+":/test", outDir)
<ide> fb, err = ioutil.ReadFile(outDir + "/test")
<ide> if err != nil {
<ide> t.Fatal(err)
<ide><path>integration-cli/docker_cli_events_test.go
<ide> import (
<ide> )
<ide>
<ide> func TestEventsUntag(t *testing.T) {
<del> out, _, _ := cmd(t, "images", "-q")
<add> out, _, _ := dockerCmd(t, "images", "-q")
<ide> image := strings.Split(out, "\n")[0]
<del> cmd(t, "tag", image, "utest:tag1")
<del> cmd(t, "tag", image, "utest:tag2")
<del> cmd(t, "rmi", "utest:tag1")
<del> cmd(t, "rmi", "utest:tag2")
<add> dockerCmd(t, "tag", image, "utest:tag1")
<add> dockerCmd(t, "tag", image, "utest:tag2")
<add> dockerCmd(t, "rmi", "utest:tag1")
<add> dockerCmd(t, "rmi", "utest:tag2")
<ide> eventsCmd := exec.Command("timeout", "0.2", dockerBinary, "events", "--since=1")
<ide> out, _, _ = runCommandWithOutput(eventsCmd)
<ide> events := strings.Split(out, "\n")
<ide> func TestEventsUntag(t *testing.T) {
<ide>
<ide> func TestEventsPause(t *testing.T) {
<ide> name := "testeventpause"
<del> out, _, _ := cmd(t, "images", "-q")
<add> out, _, _ := dockerCmd(t, "images", "-q")
<ide> image := strings.Split(out, "\n")[0]
<del> cmd(t, "run", "-d", "--name", name, image, "sleep", "2")
<del> cmd(t, "pause", name)
<del> cmd(t, "unpause", name)
<add> dockerCmd(t, "run", "-d", "--name", name, image, "sleep", "2")
<add> dockerCmd(t, "pause", name)
<add> dockerCmd(t, "unpause", name)
<ide>
<ide> defer deleteAllContainers()
<ide>
<ide> func TestEventsPause(t *testing.T) {
<ide> func TestEventsContainerFailStartDie(t *testing.T) {
<ide> defer deleteAllContainers()
<ide>
<del> out, _, _ := cmd(t, "images", "-q")
<add> out, _, _ := dockerCmd(t, "images", "-q")
<ide> image := strings.Split(out, "\n")[0]
<ide> eventsCmd := exec.Command(dockerBinary, "run", "-d", "--name", "testeventdie", image, "blerg")
<ide> _, _, err := runCommandWithOutput(eventsCmd)
<ide> func TestEventsContainerFailStartDie(t *testing.T) {
<ide> func TestEventsLimit(t *testing.T) {
<ide> defer deleteAllContainers()
<ide> for i := 0; i < 30; i++ {
<del> cmd(t, "run", "busybox", "echo", strconv.Itoa(i))
<add> dockerCmd(t, "run", "busybox", "echo", strconv.Itoa(i))
<ide> }
<ide> eventsCmd := exec.Command(dockerBinary, "events", "--since=0", fmt.Sprintf("--until=%d", time.Now().Unix()))
<ide> out, _, _ := runCommandWithOutput(eventsCmd)
<ide> func TestEventsLimit(t *testing.T) {
<ide> }
<ide>
<ide> func TestEventsContainerEvents(t *testing.T) {
<del> cmd(t, "run", "--rm", "busybox", "true")
<add> dockerCmd(t, "run", "--rm", "busybox", "true")
<ide> eventsCmd := exec.Command(dockerBinary, "events", "--since=0", fmt.Sprintf("--until=%d", time.Now().Unix()))
<ide> out, exitCode, err := runCommandWithOutput(eventsCmd)
<ide> if exitCode != 0 || err != nil {
<ide> func TestEventsRedirectStdout(t *testing.T) {
<ide>
<ide> since := time.Now().Unix()
<ide>
<del> cmd(t, "run", "busybox", "true")
<add> dockerCmd(t, "run", "busybox", "true")
<ide>
<ide> defer deleteAllContainers()
<ide>
<ide><path>integration-cli/docker_cli_links_test.go
<ide> func TestLinksPingUnlinkedContainers(t *testing.T) {
<ide>
<ide> func TestLinksPingLinkedContainers(t *testing.T) {
<ide> var out string
<del> out, _, _ = cmd(t, "run", "-d", "--name", "container1", "busybox", "sleep", "10")
<add> out, _, _ = dockerCmd(t, "run", "-d", "--name", "container1", "busybox", "sleep", "10")
<ide> idA := stripTrailingCharacters(out)
<del> out, _, _ = cmd(t, "run", "-d", "--name", "container2", "busybox", "sleep", "10")
<add> out, _, _ = dockerCmd(t, "run", "-d", "--name", "container2", "busybox", "sleep", "10")
<ide> idB := stripTrailingCharacters(out)
<del> cmd(t, "run", "--rm", "--link", "container1:alias1", "--link", "container2:alias2", "busybox", "sh", "-c", "ping -c 1 alias1 -W 1 && ping -c 1 alias2 -W 1")
<del> cmd(t, "kill", idA)
<del> cmd(t, "kill", idB)
<add> dockerCmd(t, "run", "--rm", "--link", "container1:alias1", "--link", "container2:alias2", "busybox", "sh", "-c", "ping -c 1 alias1 -W 1 && ping -c 1 alias2 -W 1")
<add> dockerCmd(t, "kill", idA)
<add> dockerCmd(t, "kill", idB)
<ide> deleteAllContainers()
<ide>
<ide> logDone("links - ping linked container")
<ide> }
<ide>
<ide> func TestLinksIpTablesRulesWhenLinkAndUnlink(t *testing.T) {
<del> cmd(t, "run", "-d", "--name", "child", "--publish", "8080:80", "busybox", "sleep", "10")
<del> cmd(t, "run", "-d", "--name", "parent", "--link", "child:http", "busybox", "sleep", "10")
<add> dockerCmd(t, "run", "-d", "--name", "child", "--publish", "8080:80", "busybox", "sleep", "10")
<add> dockerCmd(t, "run", "-d", "--name", "parent", "--link", "child:http", "busybox", "sleep", "10")
<ide>
<ide> childIP := findContainerIP(t, "child")
<ide> parentIP := findContainerIP(t, "parent")
<ide> func TestLinksIpTablesRulesWhenLinkAndUnlink(t *testing.T) {
<ide> t.Fatal("Iptables rules not found")
<ide> }
<ide>
<del> cmd(t, "rm", "--link", "parent/http")
<add> dockerCmd(t, "rm", "--link", "parent/http")
<ide> if iptables.Exists(sourceRule...) || iptables.Exists(destinationRule...) {
<ide> t.Fatal("Iptables rules should be removed when unlink")
<ide> }
<ide>
<del> cmd(t, "kill", "child")
<del> cmd(t, "kill", "parent")
<add> dockerCmd(t, "kill", "child")
<add> dockerCmd(t, "kill", "parent")
<ide> deleteAllContainers()
<ide>
<ide> logDone("link - verify iptables when link and unlink")
<ide> func TestLinksInspectLinksStarted(t *testing.T) {
<ide> result []string
<ide> )
<ide> defer deleteAllContainers()
<del> cmd(t, "run", "-d", "--name", "container1", "busybox", "sleep", "10")
<del> cmd(t, "run", "-d", "--name", "container2", "busybox", "sleep", "10")
<del> cmd(t, "run", "-d", "--name", "testinspectlink", "--link", "container1:alias1", "--link", "container2:alias2", "busybox", "sleep", "10")
<add> dockerCmd(t, "run", "-d", "--name", "container1", "busybox", "sleep", "10")
<add> dockerCmd(t, "run", "-d", "--name", "container2", "busybox", "sleep", "10")
<add> dockerCmd(t, "run", "-d", "--name", "testinspectlink", "--link", "container1:alias1", "--link", "container2:alias2", "busybox", "sleep", "10")
<ide> links, err := inspectFieldJSON("testinspectlink", "HostConfig.Links")
<ide> if err != nil {
<ide> t.Fatal(err)
<ide> func TestLinksInspectLinksStopped(t *testing.T) {
<ide> result []string
<ide> )
<ide> defer deleteAllContainers()
<del> cmd(t, "run", "-d", "--name", "container1", "busybox", "sleep", "10")
<del> cmd(t, "run", "-d", "--name", "container2", "busybox", "sleep", "10")
<del> cmd(t, "run", "-d", "--name", "testinspectlink", "--link", "container1:alias1", "--link", "container2:alias2", "busybox", "true")
<add> dockerCmd(t, "run", "-d", "--name", "container1", "busybox", "sleep", "10")
<add> dockerCmd(t, "run", "-d", "--name", "container2", "busybox", "sleep", "10")
<add> dockerCmd(t, "run", "-d", "--name", "testinspectlink", "--link", "container1:alias1", "--link", "container2:alias2", "busybox", "true")
<ide> links, err := inspectFieldJSON("testinspectlink", "HostConfig.Links")
<ide> if err != nil {
<ide> t.Fatal(err)
<ide><path>integration-cli/docker_cli_rmi_test.go
<ide> func TestRmiWithContainerFails(t *testing.T) {
<ide> }
<ide>
<ide> // make sure it didn't delete the busybox name
<del> images, _, _ := cmd(t, "images")
<add> images, _, _ := dockerCmd(t, "images")
<ide> if !strings.Contains(images, "busybox") {
<ide> t.Fatalf("The name 'busybox' should not have been removed from images: %q", images)
<ide> }
<ide> func TestRmiWithContainerFails(t *testing.T) {
<ide> }
<ide>
<ide> func TestRmiTag(t *testing.T) {
<del> imagesBefore, _, _ := cmd(t, "images", "-a")
<del> cmd(t, "tag", "busybox", "utest:tag1")
<del> cmd(t, "tag", "busybox", "utest/docker:tag2")
<del> cmd(t, "tag", "busybox", "utest:5000/docker:tag3")
<add> imagesBefore, _, _ := dockerCmd(t, "images", "-a")
<add> dockerCmd(t, "tag", "busybox", "utest:tag1")
<add> dockerCmd(t, "tag", "busybox", "utest/docker:tag2")
<add> dockerCmd(t, "tag", "busybox", "utest:5000/docker:tag3")
<ide> {
<del> imagesAfter, _, _ := cmd(t, "images", "-a")
<add> imagesAfter, _, _ := dockerCmd(t, "images", "-a")
<ide> if nLines(imagesAfter) != nLines(imagesBefore)+3 {
<ide> t.Fatalf("before: %q\n\nafter: %q\n", imagesBefore, imagesAfter)
<ide> }
<ide> }
<del> cmd(t, "rmi", "utest/docker:tag2")
<add> dockerCmd(t, "rmi", "utest/docker:tag2")
<ide> {
<del> imagesAfter, _, _ := cmd(t, "images", "-a")
<add> imagesAfter, _, _ := dockerCmd(t, "images", "-a")
<ide> if nLines(imagesAfter) != nLines(imagesBefore)+2 {
<ide> t.Fatalf("before: %q\n\nafter: %q\n", imagesBefore, imagesAfter)
<ide> }
<ide>
<ide> }
<del> cmd(t, "rmi", "utest:5000/docker:tag3")
<add> dockerCmd(t, "rmi", "utest:5000/docker:tag3")
<ide> {
<del> imagesAfter, _, _ := cmd(t, "images", "-a")
<add> imagesAfter, _, _ := dockerCmd(t, "images", "-a")
<ide> if nLines(imagesAfter) != nLines(imagesBefore)+1 {
<ide> t.Fatalf("before: %q\n\nafter: %q\n", imagesBefore, imagesAfter)
<ide> }
<ide>
<ide> }
<del> cmd(t, "rmi", "utest:tag1")
<add> dockerCmd(t, "rmi", "utest:tag1")
<ide> {
<del> imagesAfter, _, _ := cmd(t, "images", "-a")
<add> imagesAfter, _, _ := dockerCmd(t, "images", "-a")
<ide> if nLines(imagesAfter) != nLines(imagesBefore)+0 {
<ide> t.Fatalf("before: %q\n\nafter: %q\n", imagesBefore, imagesAfter)
<ide> }
<ide><path>integration-cli/docker_cli_run_test.go
<ide> func TestRunLoopbackWhenNetworkDisabled(t *testing.T) {
<ide> }
<ide>
<ide> func TestRunNetHostNotAllowedWithLinks(t *testing.T) {
<del> _, _, err := cmd(t, "run", "--name", "linked", "busybox", "true")
<add> _, _, err := dockerCmd(t, "run", "--name", "linked", "busybox", "true")
<ide>
<ide> cmd := exec.Command(dockerBinary, "run", "--net=host", "--link", "linked:linked", "busybox", "true")
<ide> _, _, err = runCommandWithOutput(cmd)
<ide> func TestRunModeHostname(t *testing.T) {
<ide> }
<ide>
<ide> func TestRunRootWorkdir(t *testing.T) {
<del> s, _, err := cmd(t, "run", "--workdir", "/", "busybox", "pwd")
<add> s, _, err := dockerCmd(t, "run", "--workdir", "/", "busybox", "pwd")
<ide> if err != nil {
<ide> t.Fatal(s, err)
<ide> }
<ide> func TestRunRootWorkdir(t *testing.T) {
<ide> }
<ide>
<ide> func TestRunAllowBindMountingRoot(t *testing.T) {
<del> s, _, err := cmd(t, "run", "-v", "/:/host", "busybox", "ls", "/host")
<add> s, _, err := dockerCmd(t, "run", "-v", "/:/host", "busybox", "ls", "/host")
<ide> if err != nil {
<ide> t.Fatal(s, err)
<ide> }
<ide><path>integration-cli/docker_cli_start_test.go
<ide> import (
<ide> func TestStartAttachReturnsOnError(t *testing.T) {
<ide> defer deleteAllContainers()
<ide>
<del> cmd(t, "run", "-d", "--name", "test", "busybox")
<del> cmd(t, "stop", "test")
<add> dockerCmd(t, "run", "-d", "--name", "test", "busybox")
<add> dockerCmd(t, "stop", "test")
<ide>
<ide> // Expect this to fail because the above container is stopped, this is what we want
<ide> if _, err := runCommand(exec.Command(dockerBinary, "run", "-d", "--name", "test2", "--link", "test:test", "busybox")); err == nil {
<ide> func TestStartRecordError(t *testing.T) {
<ide> defer deleteAllContainers()
<ide>
<ide> // when container runs successfully, we should not have state.Error
<del> cmd(t, "run", "-d", "-p", "9999:9999", "--name", "test", "busybox", "top")
<add> dockerCmd(t, "run", "-d", "-p", "9999:9999", "--name", "test", "busybox", "top")
<ide> stateErr, err := inspectField("test", "State.Error")
<ide> if err != nil {
<ide> t.Fatalf("Failed to inspect %q state's error, got error %q", "test", err)
<ide> func TestStartRecordError(t *testing.T) {
<ide> }
<ide>
<ide> // Expect the conflict to be resolved when we stop the initial container
<del> cmd(t, "stop", "test")
<del> cmd(t, "start", "test2")
<add> dockerCmd(t, "stop", "test")
<add> dockerCmd(t, "start", "test2")
<ide> stateErr, err = inspectField("test2", "State.Error")
<ide> if err != nil {
<ide> t.Fatalf("Failed to inspect %q state's error, got error %q", "test", err)
<ide> func TestStartVolumesFromFailsCleanly(t *testing.T) {
<ide> defer deleteAllContainers()
<ide>
<ide> // Create the first data volume
<del> cmd(t, "run", "-d", "--name", "data_before", "-v", "/foo", "busybox")
<add> dockerCmd(t, "run", "-d", "--name", "data_before", "-v", "/foo", "busybox")
<ide>
<ide> // Expect this to fail because the data test after contaienr doesn't exist yet
<ide> if _, err := runCommand(exec.Command(dockerBinary, "run", "-d", "--name", "consumer", "--volumes-from", "data_before", "--volumes-from", "data_after", "busybox")); err == nil {
<ide> t.Fatal("Expected error but got none")
<ide> }
<ide>
<ide> // Create the second data volume
<del> cmd(t, "run", "-d", "--name", "data_after", "-v", "/bar", "busybox")
<add> dockerCmd(t, "run", "-d", "--name", "data_after", "-v", "/bar", "busybox")
<ide>
<ide> // Now, all the volumes should be there
<del> cmd(t, "start", "consumer")
<add> dockerCmd(t, "start", "consumer")
<ide>
<ide> // Check that we have the volumes we want
<del> out, _, _ := cmd(t, "inspect", "--format='{{ len .Volumes }}'", "consumer")
<add> out, _, _ := dockerCmd(t, "inspect", "--format='{{ len .Volumes }}'", "consumer")
<ide> n_volumes := strings.Trim(out, " \r\n'")
<ide> if n_volumes != "2" {
<ide> t.Fatalf("Missing volumes: expected 2, got %s", n_volumes)
<ide><path>integration-cli/docker_utils.go
<ide> func pullImageIfNotExist(image string) (err error) {
<ide> return
<ide> }
<ide>
<del>// deprecated, use dockerCmd instead
<del>func cmd(t *testing.T, args ...string) (string, int, error) {
<del> return dockerCmd(t, args...)
<del>}
<del>
<ide> func dockerCmd(t *testing.T, args ...string) (string, int, error) {
<ide> out, status, err := runCommandWithOutput(exec.Command(dockerBinary, args...))
<ide> if err != nil { | 8 |
PHP | PHP | apply suggestions from code review | aa83bd947ae7d18cd194797ec008714dfbce37fe | <ide><path>src/Database/QueryCompiler.php
<ide> */
<ide> namespace Cake\Database;
<ide>
<del>use Cake\Database\Expression\QueryExpression;
<ide> use Cake\Database\Exception as DatabaseException;
<add>use Cake\Database\Expression\QueryExpression;
<ide> use Closure;
<ide> use Countable;
<ide> | 1 |
PHP | PHP | apply fixes from styleci | 305c007f3d01c229b2c68b4b30c4fe0d5da341bb | <ide><path>src/Illuminate/Database/Eloquent/Relations/BelongsTo.php
<ide> public function getRelationCountHash()
<ide> */
<ide> protected function relationHasIncrementingId()
<ide> {
<del> return $this->related->getIncrementing() &&
<add> return $this->related->getIncrementing() &&
<ide> $this->related->getKeyType() === 'int';
<ide> }
<ide> | 1 |
Go | Go | add integration test for xz path issue | af2021955cb01507984cbd076edaf3caaf5b89b3 | <ide><path>integration-cli/docker_cli_build_test.go
<ide> func TestBuildSymlinkBreakout(t *testing.T) {
<ide> }
<ide> logDone("build - symlink breakout")
<ide> }
<add>
<add>func TestBuildXZHost(t *testing.T) {
<add> name := "testbuildxzhost"
<add> defer deleteImages(name)
<add>
<add> ctx, err := fakeContext(`
<add>FROM busybox
<add>ADD xz /usr/local/sbin/
<add>RUN chmod 755 /usr/local/sbin/xz
<add>ADD test.xz /
<add>RUN [ ! -e /injected ]`,
<add> map[string]string{
<add> "test.xz": "\xfd\x37\x7a\x58\x5a\x00\x00\x04\xe6\xd6\xb4\x46\x02\x00" +
<add> "\x21\x01\x16\x00\x00\x00\x74\x2f\xe5\xa3\x01\x00\x3f\xfd" +
<add> "\x37\x7a\x58\x5a\x00\x00\x04\xe6\xd6\xb4\x46\x02\x00\x21",
<add> "xz": "#!/bin/sh\ntouch /injected",
<add> })
<add>
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> defer ctx.Close()
<add>
<add> if _, err := buildImageFromContext(name, ctx, true); err != nil {
<add> t.Fatal(err)
<add> }
<add>
<add> logDone("build - xz host is being used")
<add>} | 1 |
Javascript | Javascript | remove empty willmergemixin function | 71f453292eb7d6bb253c0c7222ab8cfc9f2a739f | <ide><path>packages/ember-metal/lib/mixin.js
<ide> function mergeMixins(mixins, meta, descs, values, base, keys) {
<ide> if (props === CONTINUE) { continue; }
<ide>
<ide> if (props) {
<add> // remove willMergeMixin after 3.4 as it was used for _actions
<ide> if (base.willMergeMixin) { base.willMergeMixin(props); }
<ide> concats = concatenatedMixinProperties('concatenatedProperties', props, values, base);
<ide> mergings = concatenatedMixinProperties('mergedProperties', props, values, base);
<ide><path>packages/ember-runtime/lib/mixins/action_handler.js
<ide> const ActionHandler = Mixin.create({
<ide> );
<ide> target.send(...arguments);
<ide> }
<del> },
<del>
<del> willMergeMixin() {}
<add> }
<ide> });
<ide>
<ide> export default ActionHandler; | 2 |
PHP | PHP | add missing return docs and fix cs. | 370dbcfb573b917c963b7ba6fc3f1c44e85863a7 | <ide><path>src/Illuminate/Notifications/HasDatabaseNotifications.php
<ide> trait HasDatabaseNotifications
<ide> {
<ide> /**
<ide> * Get the entity's notifications.
<add> *
<add> * @return \Illuminate\Database\Eloquent\Relations\MorphMany
<ide> */
<ide> public function notifications()
<ide> {
<del> return $this->morphMany(DatabaseNotification::class, 'notifiable')
<del> ->orderBy('created_at', 'desc');
<add> return $this->morphMany(DatabaseNotification::class, 'notifiable')->orderBy('created_at', 'desc');
<ide> }
<ide>
<ide> /**
<ide> * Get the entity's read notifications.
<add> *
<add> * @return \Illuminate\Database\Query\Builder
<ide> */
<ide> public function readNotifications()
<ide> {
<del> return $this->notifications()
<del> ->whereNotNull('read_at');
<add> return $this->notifications()->whereNotNull('read_at');
<ide> }
<ide>
<ide> /**
<ide> * Get the entity's unread notifications.
<add> *
<add> * @return \Illuminate\Database\Query\Builder
<ide> */
<ide> public function unreadNotifications()
<ide> {
<del> return $this->notifications()
<del> ->whereNull('read_at');
<add> return $this->notifications()->whereNull('read_at');
<ide> }
<ide> } | 1 |
Mixed | Python | fix authtoken migration | 4cdb6b2959f6d13417c48781d53c4e7e685934e7 | <ide><path>docs/api-guide/settings.md
<ide> The name of a parameter in the URL conf that may be used to provide a format suf
<ide>
<ide> Default: `'format'`
<ide>
<add>## REQUIRED_MIGRATIONS
<add>
<add>This is a list of required migrations which are needed by the authtoken migration.
<add>
<add>E.g.
<add>
<add> (
<add> ('users', '0001_initial'),
<add> )
<add>
<add>Default: `'()'`
<add>
<ide> [cite]: http://www.python.org/dev/peps/pep-0020/
<ide><path>docs/topics/release-notes.md
<ide> You can determine your currently installed version using `pip freeze`:
<ide> * Request authentication is no longer lazily evaluated, instead authentication is always run, which results in more consistent, obvious behavior. Eg. Supplying bad auth credentials will now always return an error response, even if no permissions are set on the view.
<ide> * Bugfix for serializer data being uncacheable with pickle protocol 0.
<ide> * Bugfixes for model field validation edge-cases.
<add>* Bugfix for authtoken migration while using a custom user model.
<ide>
<ide> ### 2.2.1
<ide>
<ide><path>rest_framework/authtoken/migrations/0001_initial.py
<ide> from south.v2 import SchemaMigration
<ide> from django.db import models
<ide>
<add>from rest_framework.settings import api_settings
<add>
<ide>
<ide> try:
<ide> from django.contrib.auth import get_user_model
<ide>
<ide> class Migration(SchemaMigration):
<ide>
<add> depends_on = api_settings.REQUIRED_MIGRATIONS
<add>
<ide> def forwards(self, orm):
<ide> # Adding model 'Token'
<ide> db.create_table('authtoken_token', (
<ide> def backwards(self, orm):
<ide> 'name': ('django.db.models.fields.CharField', [], {'max_length': '50'})
<ide> },
<ide> "%s.%s" % (User._meta.app_label, User._meta.module_name): {
<del> 'Meta': {'object_name': 'User'},
<add> 'Meta': {'object_name': User._meta.module_name},
<ide> 'date_joined': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime.now'}),
<ide> 'email': ('django.db.models.fields.EmailField', [], {'max_length': '75', 'blank': 'True'}),
<ide> 'first_name': ('django.db.models.fields.CharField', [], {'max_length': '30', 'blank': 'True'}),
<ide><path>rest_framework/settings.py
<ide> 'URL_FORMAT_OVERRIDE': 'format',
<ide>
<ide> 'FORMAT_SUFFIX_KWARG': 'format',
<add>
<add> # Authtoken
<add> 'REQUIRED_MIGRATIONS': (),
<ide> }
<ide>
<ide> | 4 |
Python | Python | evaluate content before passing to regex.sub | 921e4ed2ee11edffd19d2ca40f10d47d2c148ea1 | <ide><path>rest_framework/tests/test_description.py
<ide>
<ide> from __future__ import unicode_literals
<ide> from django.test import TestCase
<add>from django.utils.encoding import python_2_unicode_compatible
<ide> from rest_framework.compat import apply_markdown, smart_text
<ide> from rest_framework.views import APIView
<ide> from rest_framework.tests.description import ViewWithNonASCIICharactersInDocstring
<ide> class MockView(APIView):
<ide> pass
<ide> self.assertEqual(MockView().get_view_description(), '')
<ide>
<add> def test_view_description_can_be_promise(self):
<add> """
<add> Ensure a view may have a docstring that is actually a lazily evaluated
<add> class that can be converted to a string.
<add>
<add> See: https://github.com/tomchristie/django-rest-framework/issues/1708
<add> """
<add> # use a mock object instead of gettext_lazy to ensure that we can't end
<add> # up with a test case string in our l10n catalog
<add> @python_2_unicode_compatible
<add> class MockLazyStr(object):
<add> def __init__(self, string):
<add> self.s = string
<add> def __str__(self):
<add> return self.s
<add>
<add> class MockView(APIView):
<add> __doc__ = MockLazyStr("a gettext string")
<add>
<add> self.assertEqual(MockView().get_view_description(), 'a gettext string')
<add>
<ide> def test_markdown(self):
<ide> """
<ide> Ensure markdown to HTML works as expected.
<ide><path>rest_framework/utils/formatting.py
<ide> from django.utils.html import escape
<ide> from django.utils.safestring import mark_safe
<ide> from rest_framework.compat import apply_markdown
<del>from rest_framework.settings import api_settings
<del>from textwrap import dedent
<ide> import re
<ide>
<ide>
<ide> def dedent(content):
<ide> # unindent the content if needed
<ide> if whitespace_counts:
<ide> whitespace_pattern = '^' + (' ' * min(whitespace_counts))
<del> content = re.sub(re.compile(whitespace_pattern, re.MULTILINE), '', content)
<add> content = re.sub(re.compile(whitespace_pattern, re.MULTILINE), '', unicode(content))
<ide>
<ide> return content.strip()
<ide> | 2 |
PHP | PHP | escape the delimiter before preg_match | 6b906e3d45d023a7432f3f509bc9509d1f9bf2e3 | <ide><path>src/Illuminate/Validation/Validator.php
<ide> protected function dependsOnOtherFields($rule)
<ide> */
<ide> protected function getExplicitKeys($attribute)
<ide> {
<del> $pattern = str_replace('\*', '([^\.]+)', preg_quote($this->getPrimaryAttribute($attribute)));
<add> $pattern = str_replace('\*', '([^\.]+)', preg_quote($this->getPrimaryAttribute($attribute), '/'));
<ide>
<ide> if (preg_match('/^'.$pattern.'/', $attribute, $keys)) {
<ide> array_shift($keys); | 1 |
PHP | PHP | fix route commands in consoleshell | d7f1bf52b1f4c479e0a8c38340c497c7f92060d4 | <ide><path>lib/Cake/Console/Command/ConsoleShell.php
<ide> public function startup() {
<ide> foreach ($this->models as $model) {
<ide> $this->out(" - {$model}");
<ide> }
<del> $this->_loadRoutes();
<add>
<add> if (!$this->_loadRoutes()) {
<add> $message = __d(
<add> 'cake_console',
<add> 'There was an error loading the routes config. Please check that the file exists and contains no errors.'
<add> );
<add> $this->err($message);
<add> }
<ide> }
<ide>
<ide> /**
<ide> public function main($command = null) {
<ide> }
<ide> break;
<ide> case (preg_match("/^routes\s+reload/i", $command, $tmp) == true):
<del> $router = Router::getInstance();
<ide> if (!$this->_loadRoutes()) {
<del> $this->out(__d('cake_console', "There was an error loading the routes config. Please check that the file exists and is free of parse errors."));
<add> $this->err(__d('cake_console', "There was an error loading the routes config. Please check that the file exists and is free of parse errors."));
<ide> break;
<ide> }
<del> $this->out(__d('cake_console', "Routes configuration reloaded, %d routes connected", count($router->routes)));
<add> $this->out(__d('cake_console', "Routes configuration reloaded, %d routes connected", count(Router::$routes)));
<ide> break;
<ide> case (preg_match("/^routes\s+show/i", $command, $tmp) == true):
<del> $router = Router::getInstance();
<del> $this->out(implode("\n", Hash::extract($router->routes, '{n}.0')));
<add> $this->out(print_r(Hash::combine(Router::$routes, '{n}.template', '{n}.defaults'), true));
<ide> break;
<ide> case (preg_match("/^route\s+(\(.*\))$/i", $command, $tmp) == true):
<ide> if ($url = eval('return array' . $tmp[1] . ';')) {
<ide> protected function _loadRoutes() {
<ide> CakePlugin::routes();
<ide>
<ide> Router::parse('/');
<del>
<del> foreach (array_keys(Router::getNamedExpressions()) as $var) {
<del> unset(${$var});
<del> }
<del>
<del> foreach (Router::$routes as $route) {
<del> $route->compile();
<del> }
<ide> return true;
<ide> }
<ide> | 1 |
Javascript | Javascript | remove deprecated function from tests | 954a70e591233add31d9ac2e29840de08f9bb869 | <ide><path>src/renderers/shared/stack/reconciler/__tests__/ReactCompositeComponent-test.js
<ide> describe('ReactCompositeComponent', () => {
<ide>
<ide> reactComponentExpect(instance)
<ide> .expectRenderedChild()
<del> .toBeDOMComponentWithTag('a');
<add> .toBeComponentOfType('a');
<ide>
<ide> instance._toggleActivatedState();
<ide> reactComponentExpect(instance)
<ide> .expectRenderedChild()
<del> .toBeDOMComponentWithTag('b');
<add> .toBeComponentOfType('b');
<ide>
<ide> instance._toggleActivatedState();
<ide> reactComponentExpect(instance)
<ide> .expectRenderedChild()
<del> .toBeDOMComponentWithTag('a');
<add> .toBeComponentOfType('a');
<ide> });
<ide>
<ide> it('should not thrash a server rendered layout with client side one', () => {
<ide> describe('ReactCompositeComponent', () => {
<ide> ReactTestUtils.Simulate.click(renderedChild);
<ide> reactComponentExpect(instance)
<ide> .expectRenderedChild()
<del> .toBeDOMComponentWithTag('b');
<add> .toBeComponentOfType('b');
<ide> });
<ide>
<ide> it('should rewire refs when rendering to different child types', () => {
<ide><path>src/renderers/shared/stack/reconciler/__tests__/ReactCompositeComponentDOMMinimalism-test.js
<ide> describe('ReactCompositeComponentDOMMinimalism', () => {
<ide> .expectRenderedChild()
<ide> .toBeCompositeComponentWithType(LowerLevelComposite)
<ide> .expectRenderedChild()
<del> .toBeDOMComponentWithTag('div')
<add> .toBeComponentOfType('div')
<ide> .toBeDOMComponentWithNoChildren();
<ide> };
<ide> });
<ide> describe('ReactCompositeComponentDOMMinimalism', () => {
<ide> .expectRenderedChild()
<ide> .toBeCompositeComponentWithType(LowerLevelComposite)
<ide> .expectRenderedChild()
<del> .toBeDOMComponentWithTag('div')
<add> .toBeComponentOfType('div')
<ide> .toBeDOMComponentWithChildCount(1)
<ide> .expectRenderedChildAt(0)
<del> .toBeDOMComponentWithTag('ul')
<add> .toBeComponentOfType('ul')
<ide> .toBeDOMComponentWithNoChildren();
<ide> });
<ide>
<ide><path>src/test/reactComponentExpect.js
<ide> Object.assign(reactComponentExpectInternal.prototype, {
<ide> return this;
<ide> },
<ide>
<del> /**
<del> * @deprecated
<del> * @see toBeComponentOfType
<del> */
<del> toBeDOMComponentWithTag: function(tag) {
<del> this.toBeDOMComponent();
<del> expect(this.instance().tagName).toBe(tag.toUpperCase());
<del> return this;
<del> },
<del>
<ide> /**
<ide> * Check that internal state values are equal to a state of expected values.
<ide> */ | 3 |
Python | Python | check enable attribute before using | ed22329789f0db71bec940a90616f3c04008c5c9 | <ide><path>keras/utils/io_utils.py
<ide> def is_interactive_logging_enabled():
<ide> Returns:
<ide> Boolean (True if interactive logging is enabled and False otherwise).
<ide> """
<del> return INTERACTIVE_LOGGING.enable
<add> # Use `getattr` in case `INTERACTIVE_LOGGING`
<add> # does not have the `enable` attribute.
<add> return getattr(INTERACTIVE_LOGGING, 'enable',
<add> keras_logging.INTERACTIVE_LOGGING_DEFAULT)
<ide>
<ide>
<ide> def print_msg(message, line_break=True):
<ide> """Print the message to absl logging or stdout."""
<del> # Use `getattr` in case `INTERACTIVE_LOGGING`
<del> # does not have the `enable` attribute.
<del> if INTERACTIVE_LOGGING.enable:
<add> if is_interactive_logging_enabled():
<ide> if line_break:
<ide> sys.stdout.write(message + '\n')
<ide> else: | 1 |
Text | Text | add explicit declaration of fd with null val | 069d2bd2507b62d819b590df859e0db79fb7996f | <ide><path>doc/api/stream.md
<ide> class WriteStream extends Writable {
<ide> constructor(filename) {
<ide> super();
<ide> this.filename = filename;
<add> this.fd = null;
<ide> }
<ide> _construct(callback) {
<ide> fs.open(this.filename, (err, fd) => { | 1 |
PHP | PHP | remove unecessary use | 046cb9d8324752c73df5587da84376ec30abe312 | <ide><path>src/Illuminate/Auth/Console/AuthControllerCommand.php
<ide> <?php namespace Illuminate\Auth\Console;
<ide>
<del>use Illuminate\Filesystem\Filesystem;
<ide> use Illuminate\Console\GeneratorCommand;
<ide> use Symfony\Component\Console\Input\InputArgument;
<ide>
<ide><path>src/Illuminate/Auth/Console/LoginRequestCommand.php
<ide> <?php namespace Illuminate\Auth\Console;
<ide>
<del>use Illuminate\Filesystem\Filesystem;
<ide> use Illuminate\Console\GeneratorCommand;
<del>use Symfony\Component\Console\Input\InputArgument;
<ide>
<ide> class LoginRequestCommand extends GeneratorCommand {
<ide>
<ide><path>src/Illuminate/Auth/Console/RegisterRequestCommand.php
<ide> <?php namespace Illuminate\Auth\Console;
<ide>
<del>use Illuminate\Filesystem\Filesystem;
<ide> use Illuminate\Console\GeneratorCommand;
<del>use Symfony\Component\Console\Input\InputArgument;
<ide>
<ide> class RegisterRequestCommand extends GeneratorCommand {
<ide>
<ide><path>src/Illuminate/Auth/Console/RemindersControllerCommand.php
<ide> <?php namespace Illuminate\Auth\Console;
<ide>
<del>use Illuminate\Filesystem\Filesystem;
<ide> use Illuminate\Console\GeneratorCommand;
<ide> use Symfony\Component\Console\Input\InputArgument;
<ide>
<ide><path>src/Illuminate/Foundation/Publishing/AssetPublisher.php
<ide> <?php namespace Illuminate\Foundation\Publishing;
<ide>
<del>use Illuminate\Filesystem\Filesystem;
<del>
<ide> class AssetPublisher extends Publisher {
<ide>
<ide> /**
<ide><path>src/Illuminate/Foundation/Publishing/ConfigPublisher.php
<ide> <?php namespace Illuminate\Foundation\Publishing;
<ide>
<del>use Illuminate\Filesystem\Filesystem;
<del>
<ide> class ConfigPublisher extends Publisher {
<ide>
<ide> /**
<ide><path>src/Illuminate/Foundation/Publishing/ViewPublisher.php
<ide> <?php namespace Illuminate\Foundation\Publishing;
<ide>
<del>use Illuminate\Filesystem\Filesystem;
<del>
<ide> class ViewPublisher extends Publisher {
<ide>
<ide> /** | 7 |
Java | Java | improve capacity calculcation in defaultdatabuffer | 7085a303822cda26d60e627ffefdaeaf693c5523 | <ide><path>spring-core/src/main/java/org/springframework/core/io/buffer/DefaultDataBuffer.java
<ide> /*
<del> * Copyright 2002-2016 the original author or authors.
<add> * Copyright 2002-2017 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> */
<ide> public class DefaultDataBuffer implements DataBuffer {
<ide>
<add> private static final int MAX_CAPACITY = Integer.MAX_VALUE;
<add>
<add> private static final int CAPACITY_THRESHOLD = 1024 * 1024 * 4;
<add>
<add>
<ide> private final DefaultDataBufferFactory dataBufferFactory;
<ide>
<ide> private ByteBuffer byteBuffer;
<ide> public OutputStream asOutputStream() {
<ide> }
<ide>
<ide> private void ensureExtraCapacity(int extraCapacity) {
<del> int neededCapacity = this.writePosition + extraCapacity;
<add> int neededCapacity = calculateCapacity(this.writePosition + extraCapacity);
<ide> if (neededCapacity > this.byteBuffer.capacity()) {
<ide> grow(neededCapacity);
<ide> }
<ide> }
<ide>
<del> void grow(int minCapacity) {
<add> /**
<add> * @see io.netty.buffer.AbstractByteBufAllocator#calculateNewCapacity(int, int)
<add> */
<add> private int calculateCapacity(int neededCapacity) {
<add> Assert.isTrue(neededCapacity >= 0, "'neededCapacity' must >= 0");
<add>
<add> if (neededCapacity == CAPACITY_THRESHOLD) {
<add> return CAPACITY_THRESHOLD;
<add> }
<add> else if (neededCapacity > CAPACITY_THRESHOLD) {
<add> int newCapacity = neededCapacity / CAPACITY_THRESHOLD * CAPACITY_THRESHOLD;
<add> if (newCapacity > MAX_CAPACITY - CAPACITY_THRESHOLD) {
<add> newCapacity = MAX_CAPACITY;
<add> }
<add> else {
<add> newCapacity += CAPACITY_THRESHOLD;
<add> }
<add> return newCapacity;
<add> }
<add> else {
<add> int newCapacity = 64;
<add> while (newCapacity < neededCapacity) {
<add> newCapacity <<= 1;
<add> }
<add> return Math.min(newCapacity, MAX_CAPACITY);
<add> }
<add> }
<add>
<add> void grow(int capacity) {
<ide> ByteBuffer oldBuffer = this.byteBuffer;
<ide> ByteBuffer newBuffer =
<del> (oldBuffer.isDirect() ? ByteBuffer.allocateDirect(minCapacity) :
<del> ByteBuffer.allocate(minCapacity));
<add> (oldBuffer.isDirect() ? ByteBuffer.allocateDirect(capacity) :
<add> ByteBuffer.allocate(capacity));
<ide>
<ide> // Explicit cast for compatibility with covariant return type on JDK 9's ByteBuffer
<ide> final int remaining = readableByteCount();
<ide> private static class SlicedDefaultDataBuffer extends DefaultDataBuffer {
<ide> }
<ide>
<ide> @Override
<del> void grow(int minCapacity) {
<add> void grow(int capacity) {
<ide> throw new UnsupportedOperationException(
<ide> "Growing the capacity of a sliced buffer is not supported");
<ide> } | 1 |
Python | Python | add line breaks | 4dcfafde0260e01e13e0d8675f27ab4d26be57d8 | <ide><path>spacy/en/language_data.py
<ide> # improved list from Stone, Denis, Kwantes (2010)
<ide> STOP_WORDS = set("""
<ide>
<del>a about above across after afterwards again against all almost alone along already also although always am among amongst amount an and another any anyhow anyone anything anyway anywhere are around as at
<add>a about above across after afterwards again against all almost alone along
<add>already also although always am among amongst amount an and another any anyhow
<add>anyone anything anyway anywhere are around as at
<ide>
<del>back be became because become becomes becoming been before beforehand behind being below beside besides between beyond both bottom but by
<add>back be became because become becomes becoming been before beforehand behind
<add>being below beside besides between beyond both bottom but by
<ide>
<ide> call can cannot ca could
<ide>
<ide> did do does doing done down due during
<ide>
<del>each eight either eleven else elsewhere empty enough etc even ever every everyone everything everywhere except
<add>each eight either eleven else elsewhere empty enough etc even ever every
<add>everyone everything everywhere except
<ide>
<del>few fifteen fifty first five for former formerly forty four from front full further
<add>few fifteen fifty first five for former formerly forty four from front full
<add>further
<ide>
<ide> get give go
<ide>
<del>had has have he hence her here hereafter hereby herein hereupon hers herself him himself his how however hundred
<add>had has have he hence her here hereafter hereby herein hereupon hers herself
<add>him himself his how however hundred
<ide>
<ide> i if in inc indeed into is it its itself
<ide>
<ide>
<ide> just
<ide>
<del>made make many may me meanwhile might mine more moreover most mostly move much must my myself
<add>made make many may me meanwhile might mine more moreover most mostly move much
<add>must my myself
<ide>
<del>name namely neither never nevertheless next nine no nobody none noone nor not nothing now nowhere
<add>name namely neither never nevertheless next nine no nobody none noone nor not
<add>nothing now nowhere
<ide>
<del>of off often on once one only onto or other others otherwise our ours ourselves out over own
<add>of off often on once one only onto or other others otherwise our ours ourselves
<add>out over own
<ide>
<ide> part per perhaps please put
<ide>
<ide> quite
<ide>
<ide> rather re really regarding
<ide>
<del>same say see seem seemed seeming seems serious several she should show side since six sixty so some somehow someone something sometime sometimes somewhere still such
<add>same say see seem seemed seeming seems serious several she should show side
<add>since six sixty so some somehow someone something sometime sometimes somewhere
<add>still such
<ide>
<del>take ten than that the their them themselves then thence there thereafter thereby therefore therein thereupon these they third this those though three through throughout thru thus to together too top toward towards twelve twenty two
<add>take ten than that the their them themselves then thence there thereafter
<add>thereby therefore therein thereupon these they third this those though three
<add>through throughout thru thus to together too top toward towards twelve twenty
<add>two
<ide>
<ide> under until up unless upon us used using
<ide>
<del>various very very via was we well were what whatever when whence whenever where whereafter whereas whereby wherein whereupon wherever whether which while whither who whoever whole whom whose why will with within without would
<add>various very very via was we well were what whatever when whence whenever where
<add>whereafter whereas whereby wherein whereupon wherever whether which while
<add>whither who whoever whole whom whose why will with within without would
<ide>
<ide> yet you your yours yourself yourselves
<ide> | 1 |
Javascript | Javascript | serialize errors if stack getter throws | 79a3348d142be713c2aa5dbe41524b54cb6bac03 | <ide><path>lib/internal/error-serdes.js
<ide> function TryGetAllProperties(object, target = object) {
<ide> Assign(all, TryGetAllProperties(GetPrototypeOf(object), target));
<ide> const keys = GetOwnPropertyNames(object);
<ide> ForEach(keys, (key) => {
<del> const descriptor = GetOwnPropertyDescriptor(object, key);
<add> let descriptor;
<add> try {
<add> descriptor = GetOwnPropertyDescriptor(object, key);
<add> } catch { return; }
<ide> const getter = descriptor.get;
<ide> if (getter && key !== '__proto__') {
<ide> try {
<ide> function serializeError(error) {
<ide> for (var i = 0; i < constructors.length; i++) {
<ide> const name = GetName(constructors[i]);
<ide> if (errorConstructorNames.has(name)) {
<del> try { error.stack; } catch {}
<ide> const serialized = serialize({
<ide> constructor: name,
<ide> properties: TryGetAllProperties(error)
<ide><path>test/parallel/test-worker-error-stack-getter-throws.js
<add>'use strict';
<add>const common = require('../common');
<add>const assert = require('assert');
<add>const { Worker } = require('worker_threads');
<add>
<add>const w = new Worker(
<add> `const fn = (err) => {
<add> if (err.message === 'fhqwhgads')
<add> throw new Error('come on');
<add> return 'This is my custom stack trace!';
<add> };
<add> Error.prepareStackTrace = fn;
<add> throw new Error('fhqwhgads');
<add> `,
<add> { eval: true }
<add>);
<add>w.on('message', common.mustNotCall());
<add>w.on('error', common.mustCall((err) => {
<add> assert.strictEqual(err.stack, undefined);
<add> assert.strictEqual(err.message, 'fhqwhgads');
<add> assert.strictEqual(err.name, 'Error');
<add>})); | 2 |
PHP | PHP | add mix file for loading versioned mix files | 6ea4997fcf7cf0ae4c18bce9418817ff00e4727f | <ide><path>src/Illuminate/Foundation/helpers.php
<ide> function method_field($method)
<ide> }
<ide> }
<ide>
<add>if (! function_exists('mix')) {
<add> /**
<add> * Get the path to a versioned Mix file.
<add> *
<add> * @param string $path
<add> * @return \Illuminate\Support\HtmlString
<add> *
<add> * @throws \InvalidArgumentException
<add> */
<add> function mix($path)
<add> {
<add> static $manifest;
<add> static $shouldHotReload;
<add>
<add> if (! $manifest) {
<add> if (! file_exists($manifestPath = public_path('mix-manifest.json'))) {
<add> throw new Exception('The Mix manifest does not exist.');
<add> }
<add>
<add> $manifest = json_decode(file_get_contents($manifestPath), true);
<add> }
<add>
<add> if (! starts_with($path, '/')) {
<add> $path = "/{$path}";
<add> }
<add>
<add> if (! array_key_exists($path, $manifest)) {
<add> throw new Exception(
<add> "Unable to locate Mix file: {$path}. Please check your ".
<add> "webpack.mix.js output paths and try again."
<add> );
<add> }
<add>
<add> return $shouldHotReload = file_exists(public_path('hot'))
<add> ? new HtmlString("http://localhost:8080{$manifest[$path]}")
<add> : new HtmlString(url($manifest[$path]));
<add> }
<add>}
<add>
<ide> if (! function_exists('old')) {
<ide> /**
<ide> * Retrieve an old input item. | 1 |
PHP | PHP | fix doc block errors | 72ee388de43fa16d73d209f15ee6482e942e3ca2 | <ide><path>src/Routing/RouteCollection.php
<ide> public function add(Route $route, $options) {
<ide> /**
<ide> * Takes the URL string and iterates the routes until one is able to parse the route.
<ide> *
<del> * @param string $url Url to parse.
<del> * @return array An array of request parameters parsed from the url.
<add> * @param string $url URL to parse.
<add> * @return array An array of request parameters parsed from the URL.
<add> * @throws \Cake\Routing\Error\MissingRouteException When a URL has no matching route.
<ide> */
<ide> public function parse($url) {
<ide> foreach (array_keys($this->_paths) as $path) {
<ide> protected function _getNames($url) {
<ide> * @param array $context The request context to use. Contains _base, _port,
<ide> * _host, and _scheme keys.
<ide> * @return string|false Either a string on match, or false on failure.
<add> * @throws \Cake\Routing\Error\MissingRouteException when a route cannot be matched.
<ide> */
<ide> public function match($url, $context) {
<del> // Named routes support hack.
<add> // Named routes support optimization.
<ide> if (isset($url['_name'])) {
<ide> $name = $url['_name'];
<ide> unset($url['_name']); | 1 |
PHP | PHP | getvalueattribute | b789fe1525f6062656198c91908fb01b08510136 | <ide><path>src/Illuminate/Html/FormBuilder.php
<ide> protected function getIdAttribute($name, $attributes)
<ide> * @param string $value
<ide> * @return string
<ide> */
<del> protected function getValueAttribute($name, $value = null)
<add> public function getValueAttribute($name, $value = null)
<ide> {
<ide> if (is_null($name)) return $value;
<ide> | 1 |
Javascript | Javascript | fix additional spacing issues | 5c504a1343ed2e2ab91a85dca110c8f5fa749889 | <ide><path>bonfireMDNlinks.js
<ide> *
<ide> **/
<ide>
<del>var links =
<del> {
<add>var links = {
<ide> // ========= NON MDN REFS
<ide> "Currying": "https://leanpub.com/javascript-allonge/read#pabc",
<ide> "Smallest Common Multiple": "https://www.mathsisfun.com/least-common-multiple.html",
<ide> var links =
<ide> "Roman Numerals": "http://www.mathsisfun.com/roman-numerals.html",
<ide>
<ide> // ========= GLOBAL OBJECTS
<del> "Global Array Object" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array",
<del> "Global Object" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object",
<del> "Global String Object" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String",
<del> "Boolean Objects" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean",
<del> "RegExp" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/RegExp",
<add> "Global Array Object": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array",
<add> "Global Object": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object",
<add> "Global String Object": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String",
<add> "Boolean Objects": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Boolean",
<add> "RegExp": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/RegExp",
<ide> "Global Function Object": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Function",
<del> "Arguments object" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/arguments",
<add> "Arguments object": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/arguments",
<ide> "Closures": "https://developer.mozilla.org/en-US/docs/" +
<ide> "Web/JavaScript/Closures",
<ide>
<ide> // ========= GLOBAL OBJECT METHODS
<del> "parseInt()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/parseInt",
<add> "parseInt()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/parseInt",
<ide>
<ide>
<ide> // ========= PROPERTIES/MISC
<del> "String.length" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/length",
<add> "String.length": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/length",
<ide>
<ide>
<ide> // ========== OBJECT METHODS
<del> "Object.getOwnPropertyNames()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/getOwnPropertyNames",
<del> "Object.keys()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/keys",
<del> "Object.hasOwnProperty()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/hasOwnProperty",
<add> "Object.getOwnPropertyNames()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/getOwnPropertyNames",
<add> "Object.keys()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/keys",
<add> "Object.hasOwnProperty()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/hasOwnProperty",
<ide>
<ide>
<ide> // ======== STRING METHODS
<del> "String.charAt()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/charAt",
<del> "String.charCodeAt()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/charCodeAt",
<del> "String.concat()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/concat",
<del> "String.indexOf()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/indexOf",
<del> "String.fromCharCode()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/fromCharCode",
<del> "String.lastIndexOf()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/lastIndexOf",
<del> "String.match()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/match",
<del> "String.replace()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/replace",
<del> "String.slice()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/slice",
<del> "String.split()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/split",
<del> "String.substring()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/substring",
<del> "String.substr()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/substr",
<del> "String.toLowerCase()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/toLowerCase",
<del> "String.toString()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/toString",
<del> "String.toUpperCase()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/toUpperCase",
<add> "String.charAt()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/charAt",
<add> "String.charCodeAt()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/charCodeAt",
<add> "String.concat()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/concat",
<add> "String.indexOf()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/indexOf",
<add> "String.fromCharCode()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/fromCharCode",
<add> "String.lastIndexOf()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/lastIndexOf",
<add> "String.match()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/match",
<add> "String.replace()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/replace",
<add> "String.slice()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/slice",
<add> "String.split()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/split",
<add> "String.substring()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/substring",
<add> "String.substr()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/substr",
<add> "String.toLowerCase()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/toLowerCase",
<add> "String.toString()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/toString",
<add> "String.toUpperCase()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/toUpperCase",
<ide>
<ide> // ======== ARRAY METHODS
<del> "Array.concat()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/concat",
<del> "Array.every()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/every",
<add> "Array.concat()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/concat",
<add> "Array.every()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/every",
<ide> "Array.filter()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/filter",
<del> "Array.forEach()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/forEach",
<del> "Array.indexOf()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/indexOf",
<del> "Array.isArray()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/isArray",
<del> "Array.join()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/join",
<del> "Array.lastIndexOf()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/lastIndexOf",
<del> "Array.map()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/map",
<del> "Array.pop()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/pop",
<del> "Array.push()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/push",
<del> "Array.reduce()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/Reduce",
<add> "Array.forEach()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/forEach",
<add> "Array.indexOf()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/indexOf",
<add> "Array.isArray()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/isArray",
<add> "Array.join()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/join",
<add> "Array.lastIndexOf()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/lastIndexOf",
<add> "Array.map()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/map",
<add> "Array.pop()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/pop",
<add> "Array.push()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/push",
<add> "Array.reduce()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/Reduce",
<ide> "Array.reverse()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/reverse",
<del> "Array.shift()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/shift",
<del> "Array.slice()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/slice",
<del> "Array.some()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/some",
<del> "Array.sort()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort",
<del> "Array.splice()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/splice",
<del> "Array.toString()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/toString",
<add> "Array.shift()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/shift",
<add> "Array.slice()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/slice",
<add> "Array.some()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/some",
<add> "Array.sort()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort",
<add> "Array.splice()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/splice",
<add> "Array.toString()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/toString",
<ide>
<ide> // ======== MATH METHODS
<del> "Math.max()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Math/max",
<del> "Math.min()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Math/min",
<del> "Math.pow()" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Math/pow",
<del> "Remainder" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Arithmetic_Operators#Remainder_(.25)",
<add> "Math.max()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Math/max",
<add> "Math.min()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Math/min",
<add> "Math.pow()": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Math/pow",
<add> "Remainder": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Arithmetic_Operators#Remainder_(.25)",
<ide>
<ide> // ======== GENERAL JAVASCRIPT REFERENCES
<del> "Arithmetic Operators" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Arithmetic_Operators",
<del> "Comparison Operators" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Comparison_Operators",
<del> "Details of the Object Model" : "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Details_of_the_Object_Model",
<add> "Arithmetic Operators": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Arithmetic_Operators",
<add> "Comparison Operators": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Comparison_Operators",
<add> "Details of the Object Model": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Details_of_the_Object_Model",
<ide> "For Loops": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for"
<del> };
<add>};
<ide>
<ide> module.exports = links; | 1 |
Javascript | Javascript | expand http2 util test coverage for headers | d8a226673bd9621a5a07346f8702d7dac2f25481 | <ide><path>test/parallel/test-http2-cookies.js
<ide> const server = h2.createServer();
<ide>
<ide> const setCookie = [
<ide> 'a=b',
<del> 'c=d; Wed, 21 Oct 2015 07:28:00 GMT; Secure; HttpOnly'
<add> 'c=d; Wed, 21 Oct 2015 07:28:00 GMT; Secure; HttpOnly',
<add> 'e=f'
<ide> ];
<ide>
<ide> // we use the lower-level API here
<ide><path>test/parallel/test-http2-util-headers-list.js
<ide> const regex =
<ide> })(mapToHeaders({ [name]: 'abc' }));
<ide> });
<ide>
<add>common.expectsError({
<add> code: 'ERR_HTTP2_INVALID_CONNECTION_HEADERS',
<add> message: regex
<add>})(mapToHeaders({ [HTTP2_HEADER_TE]: ['abc'] }));
<add>
<ide> assert(!(mapToHeaders({ te: 'trailers' }) instanceof Error));
<add>assert(!(mapToHeaders({ te: ['trailers'] }) instanceof Error)); | 2 |
PHP | PHP | start test cases for middleware | 5ad0d6fe5ebed4b8de37a0dbb9f40b5446bd2668 | <ide><path>src/Routing/DispatcherFilter.php
<ide> <?php
<ide> /**
<del> *
<ide> * CakePHP(tm) : Rapid Development Framework (http://cakephp.org)
<ide> * Copyright (c) Cake Software Foundation, Inc. (http://cakefoundation.org)
<ide> *
<ide> * @since 2.2.0
<ide> * @license http://www.opensource.org/licenses/mit-license.php MIT License
<ide> */
<del>
<ide> namespace Cake\Routing;
<ide>
<ide> use Cake\Core\InstanceConfigTrait;
<ide> * callback as the conditions could change during the dispatch cycle.
<ide> *
<ide> */
<del>abstract class DispatcherFilter implements EventListener {
<add>class DispatcherFilter implements EventListener {
<ide>
<ide> use InstanceConfigTrait;
<ide>
<ide><path>tests/TestCase/Routing/DispatcherFactoryTest.php
<ide> * @since 3.0.0
<ide> * @license http://www.opensource.org/licenses/mit-license.php MIT License
<ide> */
<del>namespace Cake\Routing;
<add>namespace Cake\Test\TestCase\Routing;
<ide>
<ide> use Cake\Routing\DispatcherFactory;
<ide> use Cake\TestSuite\TestCase; | 2 |
PHP | PHP | add key to reflected schema | 1dc32f2920d2452c2a26a42a34ebe04811e0d5da | <ide><path>lib/Cake/Model/Datasource/Database/Connection.php
<ide> public function describe($table) {
<ide> 'default' => $row['Default'],
<ide> 'length' => $length,
<ide> ];
<add> if (!empty($row['Key'])) {
<add> $schema[$row['Field']]['key'] = $this->_driver->keyType($row['Key']);
<add> }
<ide> }
<ide> return $schema;
<ide> }
<ide><path>lib/Cake/Model/Datasource/Database/Dialect/MysqlDialectTrait.php
<ide> public function describeTableSql($table) {
<ide> return ["SHOW FULL COLUMNS FROM " . $this->quoteIdentifier($table), []];
<ide> }
<ide>
<add>/**
<add> * Convert a platform specific index type to the abstract type
<add> *
<add> * @param string $key The key type to convert.
<add> * @return string The abstract key type (primary, unique, index)
<add> */
<add> public function keyType($key) {
<add> if ($key === 'PRI') {
<add> return 'primary';
<add> }
<add> if ($key === 'MUL') {
<add> return 'index';
<add> }
<add> if ($key === 'uni') {
<add> return 'unique';
<add> }
<add> }
<add>
<ide> /**
<ide> * Convert a MySQL column type into an abstract type.
<ide> *
<add> * The returnned type will be a type that Cake\Model\Datasource\Database\Type can handle.
<add> *
<ide> * @param string $column The column type + length
<ide> * @return array List of (type, length)
<ide> */
<ide> public function columnType($column) {
<ide> }
<ide> return 'text';
<ide> }
<add>
<ide> }
<ide><path>lib/Cake/Test/TestCase/Model/Datasource/Database/Driver/MysqlTest.php
<ide> public function testDescribeTable() {
<ide> 'null' => false,
<ide> 'default' => null,
<ide> 'length' => 20,
<add> 'key' => 'primary',
<ide> ],
<ide> 'title' => [
<ide> 'type' => 'string', | 3 |
PHP | PHP | remove recent deprecations from 3.x | 6f4c09d765b213097e7c9c4dfda296c895d4d678 | <ide><path>src/Database/Query.php
<ide> public function modifier($modifiers, $overwrite = false)
<ide> * passed as an array of strings, array of expression objects, or a single string. See
<ide> * the examples above for the valid call types.
<ide> * @param bool $overwrite whether to reset tables with passed list or not
<del> * @return $this|array
<add> * @return $this
<ide> */
<ide> public function from($tables = [], $overwrite = false)
<ide> {
<del> if (empty($tables)) {
<del> deprecationWarning('Using Query::from() to read state is deprecated. Use clause("from") instead.');
<del>
<del> return $this->_parts['from'];
<del> }
<del>
<ide> $tables = (array)$tables;
<ide>
<ide> if ($overwrite) {
<ide> public function from($tables = [], $overwrite = false)
<ide> * @param array $types associative array of type names used to bind values to query
<ide> * @param bool $overwrite whether to reset joins with passed list or not
<ide> * @see \Cake\Database\Type
<del> * @return $this|array
<add> * @return $this
<ide> */
<ide> public function join($tables = null, $types = [], $overwrite = false)
<ide> {
<del> if ($tables === null) {
<del> deprecationWarning('Using Query::join() to read state is deprecated. Use clause("join") instead.');
<del>
<del> return $this->_parts['join'];
<del> }
<del>
<ide> if (is_string($tables) || isset($tables['table'])) {
<ide> $tables = [$tables];
<ide> }
<ide><path>src/Routing/RouteBuilder.php
<ide> protected function _methodRoute($method, $template, $target, $name)
<ide> * the current RouteBuilder instance.
<ide> *
<ide> * @param string $name The plugin name
<del> * @param string $file The routes file to load. Defaults to `routes.php`. This parameter
<del> * is deprecated and will be removed in 4.0
<ide> * @return void
<ide> * @throws \Cake\Core\Exception\MissingPluginException When the plugin has not been loaded.
<ide> * @throws \InvalidArgumentException When the plugin does not have a routes file.
<ide> */
<del> public function loadPlugin($name, $file = 'routes.php')
<add> public function loadPlugin($name)
<ide> {
<ide> $plugins = Plugin::getCollection();
<ide> if (!$plugins->has($name)) {
<ide> throw new MissingPluginException(['plugin' => $name]);
<ide> }
<ide> $plugin = $plugins->get($name);
<del>
<del> // @deprecated This block should be removed in 4.0
<del> if ($file !== 'routes.php') {
<del> deprecationWarning(
<del> 'Loading plugin routes now uses the routes() hook method on the plugin class. ' .
<del> 'Loading non-standard files will be removed in 4.0'
<del> );
<del>
<del> $path = $plugin->getConfigPath() . DIRECTORY_SEPARATOR . $file;
<del> if (!file_exists($path)) {
<del> throw new InvalidArgumentException(sprintf(
<del> 'Cannot load routes for the plugin named %s. The %s file does not exist.',
<del> $name,
<del> $path
<del> ));
<del> }
<del>
<del> $routes = $this;
<del> include $path;
<del>
<del> return;
<del> }
<ide> $plugin->routes($this);
<ide>
<ide> // Disable the routes hook to prevent duplicate route issues.
<ide><path>tests/TestCase/Database/QueryTest.php
<ide> public function testRemoveJoin()
<ide> $this->assertArrayNotHasKey('authors', $query->clause('join'));
<ide> }
<ide>
<del> /**
<del> * Test join read mode
<del> *
<del> * @deprecated
<del> * @return void
<del> */
<del> public function testJoinReadMode()
<del> {
<del> $this->loadFixtures('Articles');
<del> $query = new Query($this->connection);
<del> $query->select(['id', 'title'])
<del> ->from('articles')
<del> ->join(['authors' => [
<del> 'type' => 'INNER',
<del> 'conditions' => ['articles.author_id = authors.id']
<del> ]]);
<del>
<del> $this->deprecated(function () use ($query) {
<del> $this->assertArrayHasKey('authors', $query->join());
<del> });
<del> }
<del>
<ide> /**
<ide> * Tests that types in the type map are used in the
<ide> * specific comparison functions when using a callable
<ide><path>tests/TestCase/Routing/RouteBuilderTest.php
<ide> public function testLoadPluginBadPlugin()
<ide> $routes->loadPlugin('Nope');
<ide> }
<ide>
<del> /**
<del> * Test loading routes from a missing file
<del> *
<del> * @return void
<del> */
<del> public function testLoadPluginBadFile()
<del> {
<del> $this->deprecated(function () {
<del> $this->expectException(\InvalidArgumentException::class);
<del> $this->expectExceptionMessage('Cannot load routes for the plugin named TestPlugin.');
<del> Plugin::load('TestPlugin');
<del> $routes = new RouteBuilder($this->collection, '/');
<del> $routes->loadPlugin('TestPlugin', 'nope.php');
<del> });
<del> }
<del>
<ide> /**
<ide> * Test loading routes with success
<ide> * | 4 |
Java | Java | add webapplicationinitializers for web reactive | 2b57a4d618a1e755ee9e362208291c0cde813e8d | <ide><path>spring-web-reactive/src/main/java/org/springframework/web/reactive/support/AbstractAnnotationConfigDispatcherHandlerInitializer.java
<add>/*
<add> * Copyright 2002-2016 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.web.reactive.support;
<add>
<add>import org.springframework.context.ApplicationContext;
<add>import org.springframework.context.annotation.AnnotationConfigApplicationContext;
<add>import org.springframework.http.server.reactive.ServletHttpHandlerAdapter;
<add>import org.springframework.util.ObjectUtils;
<add>import org.springframework.web.reactive.DispatcherHandler;
<add>
<add>/**
<add> * Base class for {@link org.springframework.web.WebApplicationInitializer}
<add> * implementations that register a {@link DispatcherHandler} configured with annotated
<add> * {@link org.springframework.context.annotation.Configuration @Configuration} classes in the
<add> * servlet context, wrapping it in a {@link ServletHttpHandlerAdapter}.
<add> *
<add> * <p>Concrete implementations are required to implement {@link #getConfigClasses()} and
<add> * {@link #getServletMapping()}.
<add> * Further template and customization methods are provided by
<add> * {@link AbstractDispatcherHandlerInitializer}.
<add> *
<add> * @author Arjen Poutsma
<add> * @since 5.0
<add> */
<add>public abstract class AbstractAnnotationConfigDispatcherHandlerInitializer
<add> extends AbstractDispatcherHandlerInitializer {
<add>
<add> /**
<add> * {@inheritDoc}
<add> * <p>This implementation creates an {@link AnnotationConfigApplicationContext},
<add> * providing it the annotated classes returned by {@link #getConfigClasses()}.
<add> */
<add> @Override
<add> protected ApplicationContext createApplicationContext() {
<add> AnnotationConfigApplicationContext servletAppContext = new AnnotationConfigApplicationContext();
<add> Class<?>[] configClasses = getConfigClasses();
<add> if (!ObjectUtils.isEmpty(configClasses)) {
<add> servletAppContext.register(configClasses);
<add> }
<add> return servletAppContext;
<add> }
<add>
<add> /**
<add> * Specify {@link org.springframework.context.annotation.Configuration @Configuration}
<add> * and/or {@link org.springframework.stereotype.Component @Component} classes to be
<add> * provided to the {@linkplain #createApplicationContext() application context}.
<add> * @return the configuration classes for the dispatcher servlet application context
<add> */
<add> protected abstract Class<?>[] getConfigClasses();
<add>
<add>}
<ide><path>spring-web-reactive/src/main/java/org/springframework/web/reactive/support/AbstractDispatcherHandlerInitializer.java
<add>/*
<add> * Copyright 2002-2016 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.web.reactive.support;
<add>
<add>import javax.servlet.ServletContext;
<add>import javax.servlet.ServletContextEvent;
<add>import javax.servlet.ServletContextListener;
<add>import javax.servlet.ServletException;
<add>import javax.servlet.ServletRegistration;
<add>
<add>import org.springframework.context.ApplicationContext;
<add>import org.springframework.context.ConfigurableApplicationContext;
<add>import org.springframework.http.server.reactive.HttpHandler;
<add>import org.springframework.http.server.reactive.ServletHttpHandlerAdapter;
<add>import org.springframework.util.Assert;
<add>import org.springframework.web.WebApplicationInitializer;
<add>import org.springframework.web.reactive.DispatcherHandler;
<add>import org.springframework.web.server.WebHandler;
<add>import org.springframework.web.server.adapter.HttpWebHandlerAdapter;
<add>
<add>/**
<add> * Base class for {@link org.springframework.web.WebApplicationInitializer}
<add> * implementations that register a {@link DispatcherHandler} in the servlet context, wrapping it in
<add> * a {@link ServletHttpHandlerAdapter}.
<add> *
<add> * <p>Concrete implementations are required to implement
<add> * {@link #createApplicationContext()}, which gets invoked from
<add> * {@link #registerDispatcherHandler(ServletContext)}. Further customization can be achieved by
<add> * overriding {@link #customizeRegistration(ServletRegistration.Dynamic)}.
<add> *
<add> * @author Arjen Poutsma
<add> * @since 5.0
<add> */
<add>public abstract class AbstractDispatcherHandlerInitializer implements WebApplicationInitializer {
<add>
<add> /**
<add> * The default servlet name. Can be customized by overriding {@link #getServletName}.
<add> */
<add> public static final String DEFAULT_SERVLET_NAME = "dispatcher-handler";
<add>
<add> /**
<add> * The default servlet mapping. Can be customized by overriding {@link #getServletMapping()}.
<add> */
<add> public static final String DEFAULT_SERVLET_MAPPING = "/";
<add>
<add>
<add> @Override
<add> public void onStartup(ServletContext servletContext) throws ServletException {
<add> registerDispatcherHandler(servletContext);
<add> }
<add>
<add> /**
<add> * Register a {@link DispatcherHandler} against the given servlet context.
<add> * <p>This method will create a {@link DispatcherHandler}, initializing it with the application
<add> * context returned from {@link #createApplicationContext()}. The created handler will be
<add> * wrapped in a {@link ServletHttpHandlerAdapter} servlet with the name
<add> * returned by {@link #getServletName()}, mapping it to the pattern
<add> * returned from {@link #getServletMapping()}.
<add> * <p>Further customization can be achieved by overriding {@link
<add> * #customizeRegistration(ServletRegistration.Dynamic)} or
<add> * {@link #createDispatcherHandler(ApplicationContext)}.
<add> * @param servletContext the context to register the servlet against
<add> */
<add> protected void registerDispatcherHandler(ServletContext servletContext) {
<add> String servletName = getServletName();
<add> Assert.hasLength(servletName, "getServletName() must not return empty or null");
<add>
<add> ApplicationContext applicationContext = createApplicationContext();
<add> Assert.notNull(applicationContext,
<add> "createApplicationContext() did not return an application " +
<add> "context for servlet [" + servletName + "]");
<add> refreshApplicationContext(applicationContext);
<add> registerCloseListener(servletContext, applicationContext);
<add>
<add> WebHandler dispatcherHandler = createDispatcherHandler(applicationContext);
<add> Assert.notNull(dispatcherHandler,
<add> "createDispatcherHandler() did not return a WebHandler for servlet ["
<add> + servletName + "]");
<add>
<add> ServletHttpHandlerAdapter handlerAdapter = createHandlerAdapter(dispatcherHandler);
<add> Assert.notNull(handlerAdapter,
<add> "createHttpHandler() did not return a ServletHttpHandlerAdapter for servlet ["
<add> + servletName + "]");
<add>
<add> ServletRegistration.Dynamic registration = servletContext.addServlet(servletName, handlerAdapter);
<add> Assert.notNull(registration,
<add> "Failed to register servlet with name '" + servletName + "'." +
<add> "Check if there is another servlet registered under the same name.");
<add>
<add> registration.setLoadOnStartup(1);
<add> registration.addMapping(getServletMapping());
<add> registration.setAsyncSupported(true);
<add>
<add>
<add> customizeRegistration(registration);
<add> }
<add>
<add> /**
<add> * Return the name under which the {@link ServletHttpHandlerAdapter} will be registered.
<add> * Defaults to {@link #DEFAULT_SERVLET_NAME}.
<add> * @see #registerDispatcherHandler(ServletContext)
<add> */
<add> protected String getServletName() {
<add> return DEFAULT_SERVLET_NAME;
<add> }
<add>
<add> /**
<add> * Create an application context to be provided to the {@code DispatcherHandler}.
<add> * <p>The returned context is delegated to Spring's
<add> * {@link DispatcherHandler#DispatcherHandler(ApplicationContext)}. As such,
<add> * it typically contains controllers, view resolvers, and other web-related beans.
<add> * @see #registerDispatcherHandler(ServletContext)
<add> */
<add> protected abstract ApplicationContext createApplicationContext();
<add>
<add> /**
<add> * Refresh the given application context, if necessary.
<add> */
<add> protected void refreshApplicationContext(ApplicationContext applicationContext) {
<add> if (applicationContext instanceof ConfigurableApplicationContext) {
<add> ConfigurableApplicationContext cac =
<add> (ConfigurableApplicationContext) applicationContext;
<add> if (!cac.isActive()) {
<add> cac.refresh();
<add> }
<add> }
<add> }
<add>
<add> /**
<add> * Create a {@link DispatcherHandler} (or other kind of {@link WebHandler}-derived
<add> * dispatcher) with the specified {@link ApplicationContext}.
<add> */
<add> protected WebHandler createDispatcherHandler(ApplicationContext applicationContext) {
<add> return new DispatcherHandler(applicationContext);
<add> }
<add>
<add> /**
<add> * Create a {@link ServletHttpHandlerAdapter}.
<add> * <p>Default implementation returns a {@code ServletHttpHandlerAdapter} with the provided
<add> * {@code webHandler}.
<add> */
<add> protected ServletHttpHandlerAdapter createHandlerAdapter(WebHandler webHandler) {
<add> HttpHandler httpHandler = new HttpWebHandlerAdapter(webHandler);
<add> return new ServletHttpHandlerAdapter(httpHandler);
<add> }
<add>
<add> /**
<add> * Specify the servlet mapping for the {@code ServletHttpHandlerAdapter}.
<add> * <p>Default implementation returns {@code /}.
<add> * @see #registerDispatcherHandler(ServletContext)
<add> */
<add> protected String getServletMapping() {
<add> return DEFAULT_SERVLET_MAPPING;
<add> }
<add>
<add> /**
<add> * Optionally perform further registration customization once
<add> * {@link #registerDispatcherHandler(ServletContext)} has completed.
<add> * @param registration the {@code ServletHttpHandlerAdapter} registration to be customized
<add> * @see #registerDispatcherHandler(ServletContext)
<add> */
<add> protected void customizeRegistration(ServletRegistration.Dynamic registration) {
<add> }
<add>
<add> /**
<add> * Register a {@link ServletContextListener} that closes the given application context when
<add> * the servlet context is destroyed.
<add> * @param servletContext the servlet context to listen to
<add> * @param applicationContext the application context that is to be closed when
<add> * {@code servletContext} is destroyed
<add> */
<add> protected void registerCloseListener(ServletContext servletContext,
<add> ApplicationContext applicationContext) {
<add>
<add> if (applicationContext instanceof ConfigurableApplicationContext) {
<add> ConfigurableApplicationContext context =
<add> (ConfigurableApplicationContext) applicationContext;
<add> ServletContextDestroyedListener listener = new ServletContextDestroyedListener(context);
<add> servletContext.addListener(listener);
<add> }
<add> }
<add>
<add> private static class ServletContextDestroyedListener implements ServletContextListener {
<add>
<add> private final ConfigurableApplicationContext applicationContext;
<add>
<add> public ServletContextDestroyedListener(ConfigurableApplicationContext applicationContext) {
<add> this.applicationContext = applicationContext;
<add> }
<add>
<add> @Override
<add> public void contextInitialized(ServletContextEvent sce) {
<add> }
<add>
<add> @Override
<add> public void contextDestroyed(ServletContextEvent sce) {
<add> this.applicationContext.close();
<add> }
<add> }
<add>
<add>}
<ide><path>spring-web-reactive/src/main/java/org/springframework/web/reactive/support/package-info.java
<add>/**
<add> * Support classes for Spring Web Reactive.
<add> */
<add>package org.springframework.web.reactive.support;
<ide>\ No newline at end of file
<ide><path>spring-web/src/main/java/org/springframework/http/server/reactive/support/AbstractServletHttpHandlerAdapterInitializer.java
<add>/*
<add> * Copyright 2002-2016 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.http.server.reactive.support;
<add>
<add>import javax.servlet.ServletContext;
<add>import javax.servlet.ServletException;
<add>import javax.servlet.ServletRegistration;
<add>
<add>import org.springframework.http.server.reactive.HttpHandler;
<add>import org.springframework.http.server.reactive.ServletHttpHandlerAdapter;
<add>import org.springframework.util.Assert;
<add>import org.springframework.web.WebApplicationInitializer;
<add>import org.springframework.web.context.WebApplicationContext;
<add>
<add>/**
<add> * Base class for {@link org.springframework.web.WebApplicationInitializer}
<add> * implementations that register a {@link ServletHttpHandlerAdapter} in the servlet context.
<add> *
<add> * <p>Concrete implementations are required to implement
<add> * {@link #createHttpHandler()}, as well as {@link #getServletMappings()},
<add> * both of which get invoked from {@link #registerHandlerAdapter(ServletContext)}.
<add> * Further customization can be achieved by overriding
<add> * {@link #customizeRegistration(ServletRegistration.Dynamic)}.
<add> *
<add> * @author Arjen Poutsma
<add> * @since 5.0
<add> */
<add>public abstract class AbstractServletHttpHandlerAdapterInitializer
<add> implements WebApplicationInitializer {
<add>
<add> /**
<add> * The default servlet name. Can be customized by overriding {@link #getServletName}.
<add> */
<add> public static final String DEFAULT_SERVLET_NAME = "http-handler-adapter";
<add>
<add> @Override
<add> public void onStartup(ServletContext servletContext) throws ServletException {
<add> registerHandlerAdapter(servletContext);
<add> }
<add>
<add> /**
<add> * Register a {@link ServletHttpHandlerAdapter} against the given servlet context.
<add> * <p>This method will create a {@code HttpHandler} using {@link #createHttpHandler()},
<add> * and use it to create a {@code ServletHttpHandlerAdapter} with the name returned by
<add> * {@link #getServletName()}, and mapping it to the patterns
<add> * returned from {@link #getServletMappings()}.
<add> * <p>Further customization can be achieved by overriding {@link
<add> * #customizeRegistration(ServletRegistration.Dynamic)} or
<add> * {@link #createServlet(HttpHandler)}.
<add> * @param servletContext the context to register the servlet against
<add> */
<add> protected void registerHandlerAdapter(ServletContext servletContext) {
<add> String servletName = getServletName();
<add> Assert.hasLength(servletName, "getServletName() must not return empty or null");
<add>
<add> HttpHandler httpHandler = createHttpHandler();
<add> Assert.notNull(httpHandler,
<add> "createHttpHandler() did not return a HttpHandler for servlet ["
<add> + servletName + "]");
<add>
<add> ServletHttpHandlerAdapter servlet = createServlet(httpHandler);
<add> Assert.notNull(servlet,
<add> "createHttpHandler() did not return a ServletHttpHandlerAdapter for servlet ["
<add> + servletName + "]");
<add>
<add> ServletRegistration.Dynamic registration = servletContext.addServlet(servletName, servlet);
<add> Assert.notNull(registration,
<add> "Failed to register servlet with name '" + servletName + "'." +
<add> "Check if there is another servlet registered under the same name.");
<add>
<add> registration.setLoadOnStartup(1);
<add> registration.addMapping(getServletMappings());
<add> registration.setAsyncSupported(true);
<add>
<add> customizeRegistration(registration);
<add> }
<add>
<add> /**
<add> * Return the name under which the {@link ServletHttpHandlerAdapter} will be registered.
<add> * Defaults to {@link #DEFAULT_SERVLET_NAME}.
<add> * @see #registerHandlerAdapter(ServletContext)
<add> */
<add> protected String getServletName() {
<add> return DEFAULT_SERVLET_NAME;
<add> }
<add>
<add> /**
<add> * Create the {@link HttpHandler}.
<add> */
<add> protected abstract HttpHandler createHttpHandler();
<add>
<add> /**
<add> * Create a {@link ServletHttpHandlerAdapter} with the specified {@link WebApplicationContext}.
<add> * <p>Default implementation returns a {@code ServletHttpHandlerAdapter} with the provided
<add> * {@code httpHandler}.
<add> */
<add> protected ServletHttpHandlerAdapter createServlet(HttpHandler httpHandler) {
<add> return new ServletHttpHandlerAdapter(httpHandler);
<add> }
<add>
<add> /**
<add> * Specify the servlet mapping(s) for the {@code ServletHttpHandlerAdapter} —
<add> * for example {@code "/"}, {@code "/app"}, etc.
<add> * @see #registerHandlerAdapter(ServletContext)
<add> */
<add> protected abstract String[] getServletMappings();
<add>
<add> /**
<add> * Optionally perform further registration customization once
<add> * {@link #registerHandlerAdapter(ServletContext)} has completed.
<add> * @param registration the {@code DispatcherServlet} registration to be customized
<add> * @see #registerHandlerAdapter(ServletContext)
<add> */
<add> protected void customizeRegistration(ServletRegistration.Dynamic registration) {
<add> }
<add>}
<ide><path>spring-web/src/main/java/org/springframework/http/server/reactive/support/package-info.java
<add>/*
<add> * Copyright 2002-2016 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>/**
<add> * Support classes for Spring's reactive HTTP server abstraction.
<add> */
<add>
<add>package org.springframework.http.server.reactive.support;
<ide>\ No newline at end of file | 5 |
Ruby | Ruby | pass additional options to `github.pull_requests` | e9c45ff17b406bed50677b5b5eb04d9422343f27 | <ide><path>Library/Homebrew/utils/github.rb
<ide> def write_access?(repo, user = nil)
<ide> ["admin", "write"].include?(permission(repo, user)["permission"])
<ide> end
<ide>
<del> def pull_requests(repo, base:, state: :open, **_options)
<del> url = "#{API_URL}/repos/#{repo}/pulls?#{URI.encode_www_form(base: base, state: state)}"
<add> def pull_requests(repo, **options)
<add> url = "#{API_URL}/repos/#{repo}/pulls?#{URI.encode_www_form(options)}"
<ide> open_api(url)
<ide> end
<ide> | 1 |
Javascript | Javascript | add extra brackets | 8ffc44705df6748ba979661af30336ca30a2ff47 | <ide><path>lib/dependencies/ContextDependencyTemplateAsId.js
<ide> ContextDependencyTemplateAsId.prototype.apply = function(dep, source, outputOpti
<ide> if(outputOptions.pathinfo) comment = "/*! " + requestShortener.shorten(dep.request) + " */ ";
<ide> if(dep.module && dep.module.dependencies && dep.module.dependencies.length > 0) {
<ide> if(dep.valueRange) {
<del> if(Array.isArray(dep.replaces))
<add> if(Array.isArray(dep.replaces)) {
<ide> for(var i = 0; i < dep.replaces.length; i++) {
<ide> var rep = dep.replaces[i];
<ide> source.replace(rep.range[0], rep.range[1] - 1, rep.value)
<ide> }
<add> }
<ide> source.replace(dep.valueRange[1], dep.range[1] - 1, ")");
<ide> source.replace(dep.range[0], dep.valueRange[0] - 1, "__webpack_require__(" + comment + JSON.stringify(dep.module.id) + ").resolve(" + (typeof dep.prepend === "string" ? JSON.stringify(dep.prepend) : "") + "");
<ide> } else {
<ide><path>lib/dependencies/ContextDependencyTemplateAsRequireCall.js
<ide> ContextDependencyTemplateAsRequireCall.prototype.apply = function(dep, source, o
<ide> var isAsync = dep.module && dep.module.async;
<ide> if(dep.module && (isAsync || containsDeps)) {
<ide> if(dep.valueRange) {
<del> if(Array.isArray(dep.replaces))
<add> if(Array.isArray(dep.replaces)) {
<ide> for(var i = 0; i < dep.replaces.length; i++) {
<ide> var rep = dep.replaces[i];
<ide> source.replace(rep.range[0], rep.range[1] - 1, rep.value)
<ide> }
<add> }
<ide> source.replace(dep.valueRange[1], dep.range[1] - 1, ")");
<ide> source.replace(dep.range[0], dep.valueRange[0] - 1, "__webpack_require__(" + comment + JSON.stringify(dep.module.id) + ")(" + (typeof dep.prepend === "string" ? JSON.stringify(dep.prepend) : "") + "");
<ide> } else { | 2 |
Go | Go | fix panic with dynamic buffer | 5cdf7f507771e159ddd335f3757cd33f7ba4e426 | <ide><path>pkg/tarsum/tarsum.go
<ide> func (ts *tarSum) Read(buf []byte) (int, error) {
<ide> if ts.finished {
<ide> return ts.bufWriter.Read(buf)
<ide> }
<del> if ts.bufData == nil {
<add> if len(ts.bufData) < len(buf) {
<ide> switch {
<ide> case len(buf) <= buf8K:
<ide> ts.bufData = make([]byte, buf8K)
<ide> func (ts *tarSum) Read(buf []byte) (int, error) {
<ide> ts.bufData = make([]byte, len(buf))
<ide> }
<ide> }
<del> buf2 := ts.bufData[:len(buf)-1]
<add> buf2 := ts.bufData[:len(buf)]
<ide>
<ide> n, err := ts.tarR.Read(buf2)
<ide> if err != nil {
<ide><path>pkg/tarsum/tarsum_test.go
<ide> func TestTarSums(t *testing.T) {
<ide> t.Errorf("%q :: %q", err, layer.filename)
<ide> continue
<ide> }
<add>
<add> // Read variable number of bytes to test dynamic buffer
<add> dBuf := make([]byte, 1)
<add> _, err = ts.Read(dBuf)
<add> if err != nil {
<add> t.Errorf("failed to read 1B from %s: %s", layer.filename, err)
<add> continue
<add> }
<add> dBuf = make([]byte, 16*1024)
<add> _, err = ts.Read(dBuf)
<add> if err != nil {
<add> t.Errorf("failed to read 16KB from %s: %s", layer.filename, err)
<add> continue
<add> }
<add>
<add> // Read and discard remaining bytes
<ide> _, err = io.Copy(ioutil.Discard, ts)
<ide> if err != nil {
<ide> t.Errorf("failed to copy from %s: %s", layer.filename, err) | 2 |
Javascript | Javascript | detect attach/detach from any level | fca030922360a1a2d1292f6936276ad11a7cf7d9 | <ide><path>src/core/core.controller.js
<ide> class Chart {
<ide> this.attached = false;
<ide> this._animationsDisabled = undefined;
<ide> this.$context = undefined;
<del> this._doResize = debounce(() => this.update('resize'), options.resizeDelay || 0);
<add> this._doResize = debounce(mode => this.update(mode), options.resizeDelay || 0);
<ide>
<ide> // Add the chart instance to the global namespace
<ide> instances[me.id] = me;
<ide> class Chart {
<ide> const aspectRatio = options.maintainAspectRatio && me.aspectRatio;
<ide> const newSize = me.platform.getMaximumSize(canvas, width, height, aspectRatio);
<ide> const newRatio = options.devicePixelRatio || me.platform.getDevicePixelRatio();
<add> const mode = me.width ? 'resize' : 'attach';
<ide>
<ide> me.width = newSize.width;
<ide> me.height = newSize.height;
<ide> class Chart {
<ide> callCallback(options.onResize, [me, newSize], me);
<ide>
<ide> if (me.attached) {
<del> if (me._doResize()) {
<add> if (me._doResize(mode)) {
<ide> // The resize update is delayed, only draw without updating.
<ide> me.render();
<ide> }
<ide> class Chart {
<ide> }
<ide> }
<ide>
<del> destroy() {
<add> _stop() {
<ide> const me = this;
<del> const {canvas, ctx} = me;
<ide> let i, ilen;
<del>
<ide> me.stop();
<ide> animator.remove(me);
<ide>
<del> // dataset controllers need to cleanup associated data
<ide> for (i = 0, ilen = me.data.datasets.length; i < ilen; ++i) {
<ide> me._destroyDatasetMeta(i);
<ide> }
<add> }
<add>
<add> destroy() {
<add> const me = this;
<add> const {canvas, ctx} = me;
<ide>
<add> me._stop();
<ide> me.config.clearCache();
<ide>
<ide> if (canvas) {
<ide> class Chart {
<ide> me.attached = false;
<ide>
<ide> _remove('resize', listener);
<add>
<add> // Stop animating and remove metasets, so when re-attached, the animations start from begining.
<add> me._stop();
<add> me._resize(0, 0);
<add>
<ide> _add('attach', attached);
<ide> };
<ide>
<ide><path>src/helpers/helpers.extras.js
<ide> export function throttled(fn, thisArg, updateFn) {
<ide>
<ide> /**
<ide> * Debounces calling `fn` for `delay` ms
<del> * @param {function} fn - Function to call. No arguments are passed.
<add> * @param {function} fn - Function to call.
<ide> * @param {number} delay - Delay in ms. 0 = immediate invocation.
<ide> * @returns {function}
<ide> */
<ide> export function debounce(fn, delay) {
<ide> let timeout;
<del> return function() {
<add> return function(...args) {
<ide> if (delay) {
<ide> clearTimeout(timeout);
<del> timeout = setTimeout(fn, delay);
<add> timeout = setTimeout(fn, delay, args);
<ide> } else {
<del> fn();
<add> fn.apply(this, args);
<ide> }
<ide> return delay;
<ide> };
<ide> }
<ide>
<del>
<ide> /**
<ide> * Converts 'start' to 'left', 'end' to 'right' and others to 'center'
<ide> * @param {string} align start, end, center
<ide><path>src/platform/platform.dom.js
<ide> function fromNativeEvent(event, chart) {
<ide>
<ide> function createAttachObserver(chart, type, listener) {
<ide> const canvas = chart.canvas;
<del> const container = canvas && _getParentNode(canvas);
<del> const element = container || canvas;
<ide> const observer = new MutationObserver(entries => {
<del> const parent = _getParentNode(element);
<del> entries.forEach(entry => {
<del> for (let i = 0; i < entry.addedNodes.length; i++) {
<del> const added = entry.addedNodes[i];
<del> if (added === element || added === parent) {
<del> listener(entry.target);
<add> for (const entry of entries) {
<add> for (const node of entry.addedNodes) {
<add> if (node === canvas || node.contains(canvas)) {
<add> return listener();
<ide> }
<ide> }
<del> });
<add> }
<ide> });
<ide> observer.observe(document, {childList: true, subtree: true});
<ide> return observer;
<ide> }
<ide>
<ide> function createDetachObserver(chart, type, listener) {
<ide> const canvas = chart.canvas;
<del> const container = canvas && _getParentNode(canvas);
<del> if (!container) {
<del> return;
<del> }
<ide> const observer = new MutationObserver(entries => {
<del> entries.forEach(entry => {
<del> for (let i = 0; i < entry.removedNodes.length; i++) {
<del> if (entry.removedNodes[i] === canvas) {
<del> listener();
<del> break;
<add> for (const entry of entries) {
<add> for (const node of entry.removedNodes) {
<add> if (node === canvas || node.contains(canvas)) {
<add> return listener();
<ide> }
<ide> }
<del> });
<add> }
<ide> });
<del> observer.observe(container, {childList: true});
<add> observer.observe(document, {childList: true, subtree: true});
<ide> return observer;
<ide> }
<ide>
<ide><path>test/specs/core.controller.tests.js
<ide> describe('Chart', function() {
<ide> });
<ide>
<ide> parent.removeChild(wrapper);
<del> parent.appendChild(wrapper);
<del> wrapper.style.height = '355px';
<add> setTimeout(() => {
<add> parent.appendChild(wrapper);
<add> wrapper.style.height = '355px';
<add> }, 0);
<ide> });
<ide>
<ide> // https://github.com/chartjs/Chart.js/issues/4737
<ide> describe('Chart', function() {
<ide> canvas.parentNode.style.width = '455px';
<ide> });
<ide> });
<add>
<add> it('should resize the canvas if attached to the DOM after construction with mutiple parents', function(done) {
<add> var canvas = document.createElement('canvas');
<add> var wrapper = document.createElement('div');
<add> var wrapper2 = document.createElement('div');
<add> var wrapper3 = document.createElement('div');
<add> var body = window.document.body;
<add>
<add> var chart = new Chart(canvas, {
<add> type: 'line',
<add> options: {
<add> responsive: true,
<add> maintainAspectRatio: false
<add> }
<add> });
<add>
<add> expect(chart).toBeChartOfSize({
<add> dw: 0, dh: 0,
<add> rw: 0, rh: 0,
<add> });
<add> expect(chart.chartArea).toBeUndefined();
<add>
<add> waitForResize(chart, function() {
<add> expect(chart).toBeChartOfSize({
<add> dw: 455, dh: 355,
<add> rw: 455, rh: 355,
<add> });
<add>
<add> expect(chart.chartArea).not.toBeUndefined();
<add>
<add> body.removeChild(wrapper3);
<add> chart.destroy();
<add> done();
<add> });
<add>
<add> wrapper3.appendChild(wrapper2);
<add> wrapper2.appendChild(wrapper);
<add> wrapper.style.cssText = 'width: 455px; height: 355px';
<add> wrapper.appendChild(canvas);
<add> body.appendChild(wrapper3);
<add> });
<ide> });
<ide>
<ide> describe('config.options.responsive: true (maintainAspectRatio: true)', function() { | 4 |
Text | Text | add v3.21.0-beta.4 to changelog | 13ff365e9595071639f8ee6c4c338d11a86c3733 | <ide><path>CHANGELOG.md
<ide> # Ember Changelog
<ide>
<add>### v3.21.0-beta.4 (August 3, 2020)
<add>
<add>- [#19059](https://github.com/emberjs/ember.js/pull/19059) [BUGFIX] Prevent `<base target="_parent">` from erroring in `HistoryLocation`
<add>- [#19060](https://github.com/emberjs/ember.js/pull/19060) / [#19065](https://github.com/emberjs/ember.js/pull/19065) / [#19072](https://github.com/emberjs/ember.js/pull/19072) [BUGFIX] Update rendering engine to `@glimmer/*` 0.55.3
<add>- [#19063](https://github.com/emberjs/ember.js/pull/19063) [DOC] Fix missing docs for `{{#in-element}}`
<add>
<ide> ### v3.21.0-beta.3 (July 27, 2020)
<ide>
<ide> - [#19048](https://github.com/emberjs/ember.js/pull/19048) [BUGFIX] Update router.js to ensure transition.abort works for query param only transitions. | 1 |
Python | Python | fix small issue detected by pylint | fb5486a9664fe5d1e3c3316e620837d12b89142c | <ide><path>libcloud/common/openstack_identity.py
<ide> def __init__(self, auth_url, user_id, key, tenant_name=None,
<ide> Tenant, domain and scope options are ignored as they are contained
<ide> within the app credential itself and can't be changed.
<ide> """
<del> super(OpenStackIdentity_3_0_Connection,
<add> super(OpenStackIdentity_3_0_Connection_AppCred,
<ide> self).__init__(auth_url=auth_url,
<ide> user_id=user_id,
<ide> key=key, | 1 |
Javascript | Javascript | add an & prefix to custom variant | a0c081348840bac9a6e28dc5b613248dd9d2eb89 | <ide><path>tools/ui-components/tailwind.config.js
<ide> module.exports = {
<ide> },
<ide> plugins: [
<ide> plugin(({ addVariant }) => {
<del> addVariant('aria-disabled', '[aria-disabled="true"]');
<add> addVariant('aria-disabled', '&[aria-disabled="true"]');
<ide> })
<ide> ]
<ide> }; | 1 |
Text | Text | fix minor grammatical errors in docs | 0e4811e9ce30904bbd4f3aef6446d68be6c3a658 | <ide><path>docs/api-guide/fields.md
<ide> The `default` is not applied during partial update operations. In the partial up
<ide>
<ide> May be set to a function or other callable, in which case the value will be evaluated each time it is used. When called, it will receive no arguments. If the callable has a `set_context` method, that will be called each time before getting the value with the field instance as only argument. This works the same way as for [validators](validators.md#using-set_context).
<ide>
<del>When serializing the instance, default will be used if the the object attribute or dictionary key is not present in the instance.
<add>When serializing the instance, default will be used if the object attribute or dictionary key is not present in the instance.
<ide>
<ide> Note that setting a `default` value implies that the field is not required. Including both the `default` and `required` keyword arguments is invalid and will raise an error.
<ide>
<ide><path>docs/api-guide/serializers.md
<ide> The signatures for these methods are as follows:
<ide>
<ide> Takes the object instance that requires serialization, and should return a primitive representation. Typically this means returning a structure of built-in Python datatypes. The exact types that can be handled will depend on the render classes you have configured for your API.
<ide>
<del>May be overridden in order modify the representation style. For example:
<add>May be overridden in order to modify the representation style. For example:
<ide>
<ide> def to_representation(self, instance):
<ide> """Convert `username` to lowercase.""" | 2 |
Javascript | Javascript | remove unused code | 7762f374d50d3b1df8513c826862c2056e8718f9 | <ide><path>packager/react-packager/index.js
<ide> var debug = require('debug');
<ide> var Activity = require('./src/Activity');
<ide>
<ide> exports.createServer = createServer;
<del>exports.middleware = function(options) {
<del> var server = createServer(options);
<del> return server.processRequest.bind(server);
<del>};
<del>
<ide> exports.Activity = Activity;
<del>
<del>// Renamed "package" to "bundle". But maintain backwards
<del>// compat.
<del>exports.buildPackage =
<del>exports.buildBundle = function(options, bundleOptions) {
<del> var server = createNonPersistentServer(options);
<del> return server.buildBundle(bundleOptions)
<del> .then(function(p) {
<del> server.end();
<del> return p;
<del> });
<del>};
<del>
<del>exports.buildPrepackBundle = function(options, bundleOptions) {
<del> var server = createNonPersistentServer(options);
<del> return server.buildPrepackBundle(bundleOptions)
<del> .then(function(p) {
<del> server.end();
<del> return p;
<del> });
<del>};
<del>
<del>exports.buildPackageFromUrl =
<del>exports.buildBundleFromUrl = function(options, reqUrl) {
<del> var server = createNonPersistentServer(options);
<del> return server.buildBundleFromUrl(reqUrl)
<del> .then(function(p) {
<del> server.end();
<del> return p;
<del> });
<del>};
<del>
<del>exports.getDependencies = function(options, bundleOptions) {
<del> var server = createNonPersistentServer(options);
<del> return server.getDependencies(bundleOptions)
<del> .then(function(r) {
<del> server.end();
<del> return r.dependencies;
<del> });
<del>};
<del>
<ide> exports.getOrderedDependencyPaths = function(options, bundleOptions) {
<ide> var server = createNonPersistentServer(options);
<ide> return server.getOrderedDependencyPaths(bundleOptions) | 1 |
Text | Text | update filtering observables docs | 1e4e966e5f848aaf931cb43db77c36a92ab62885 | <ide><path>docs/Filtering-Observables.md
<del>This page shows operators you can use to filter and select items emitted by Observables.
<del>
<del>* [**`filter( )`**](http://reactivex.io/documentation/operators/filter.html) — filter items emitted by an Observable
<del>* [**`takeLast( )`**](http://reactivex.io/documentation/operators/takelast.html) — only emit the last _n_ items emitted by an Observable
<del>* [**`last( )`**](http://reactivex.io/documentation/operators/last.html) — emit only the last item emitted by an Observable
<del>* [**`lastOrDefault( )`**](http://reactivex.io/documentation/operators/last.html) — emit only the last item emitted by an Observable, or a default value if the source Observable is empty
<del>* [**`takeLastBuffer( )`**](http://reactivex.io/documentation/operators/takelast.html) — emit the last _n_ items emitted by an Observable, as a single list item
<del>* [**`skip( )`**](http://reactivex.io/documentation/operators/skip.html) — ignore the first _n_ items emitted by an Observable
<del>* [**`skipLast( )`**](http://reactivex.io/documentation/operators/skiplast.html) — ignore the last _n_ items emitted by an Observable
<del>* [**`take( )`**](http://reactivex.io/documentation/operators/take.html) — emit only the first _n_ items emitted by an Observable
<del>* [**`first( )` and `takeFirst( )`**](http://reactivex.io/documentation/operators/first.html) — emit only the first item emitted by an Observable, or the first item that meets some condition
<del>* [**`firstOrDefault( )`**](http://reactivex.io/documentation/operators/first.html) — emit only the first item emitted by an Observable, or the first item that meets some condition, or a default value if the source Observable is empty
<del>* [**`elementAt( )`**](http://reactivex.io/documentation/operators/elementat.html) — emit item _n_ emitted by the source Observable
<del>* [**`elementAtOrDefault( )`**](http://reactivex.io/documentation/operators/elementat.html) — emit item _n_ emitted by the source Observable, or a default item if the source Observable emits fewer than _n_ items
<del>* [**`sample( )` or `throttleLast( )`**](http://reactivex.io/documentation/operators/sample.html) — emit the most recent items emitted by an Observable within periodic time intervals
<del>* [**`throttleFirst( )`**](http://reactivex.io/documentation/operators/sample.html) — emit the first items emitted by an Observable within periodic time intervals
<del>* [**`throttleWithTimeout( )` or `debounce( )`**](http://reactivex.io/documentation/operators/debounce.html) — only emit an item from the source Observable after a particular timespan has passed without the Observable emitting any other items
<del>* [**`timeout( )`**](http://reactivex.io/documentation/operators/timeout.html) — emit items from a source Observable, but issue an exception if no item is emitted in a specified timespan
<del>* [**`distinct( )`**](http://reactivex.io/documentation/operators/distinct.html) — suppress duplicate items emitted by the source Observable
<del>* [**`distinctUntilChanged( )`**](http://reactivex.io/documentation/operators/distinct.html) — suppress duplicate consecutive items emitted by the source Observable
<del>* [**`ofType( )`**](http://reactivex.io/documentation/operators/filter.html) — emit only those items from the source Observable that are of a particular class
<del>* [**`ignoreElements( )`**](http://reactivex.io/documentation/operators/ignoreelements.html) — discard the items emitted by the source Observable and only pass through the error or completed notification
<add>This page shows operators you can use to filter and select items emitted by reactive sources, such as `Observable`s.
<add>
<add># Outline
<add>
<add>- [`debounce`](#debounce)
<add>- [`distinct`](#distinct)
<add>- [`distinctUntilChanged`](#distinctuntilchanged)
<add>- [`elementAt`](#elementat)
<add>- [`elementAtOrError`](#elementatorerror)
<add>- [`filter`](#filter)
<add>- [`first`](#first)
<add>- [`firstElement`](#firstelement)
<add>- [`firstOrError`](#firstorerror)
<add>- [`ignoreElement`](#ignoreelement)
<add>- [`ignoreElements`](#ignoreelements)
<add>- [`last`](#last)
<add>- [`lastElement`](#lastelement)
<add>- [`lastOrError`](#lastorerror)
<add>- [`ofType`](#oftype)
<add>- [`sample`](#sample)
<add>- [`skip`](#skip)
<add>- [`skipLast`](#skiplast)
<add>- [`take`](#take)
<add>- [`takeLast`](#takelast)
<add>- [`throttleFirst`](#throttlefirst)
<add>- [`throttleLast`](#throttlelast)
<add>- [`throttleLatest`](#throttlelatest)
<add>- [`throttleWithTimeout`](#throttlewithtimeout)
<add>- [`timeout`](#timeout)
<add>
<add>## debounce
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/debounce.html](http://reactivex.io/documentation/operators/debounce.html)
<add>
<add>Drops items emitted by a reactive source that are followed by newer items before the given timeout value expires. The timer resets on each emission.
<add>
<add>This operator keeps track of the most recent emitted item, and emits this item only when enough time has passed without the source emitting any other items.
<add>
<add>### debounce example
<add>
<add>```java
<add>// Diagram:
<add>// -A--------------B----C-D-------------------E-|---->
<add>// a---------1s
<add>// b---------1s
<add>// c---------1s
<add>// d---------1s
<add>// e-|---->
<add>// -----------A---------------------D-----------E-|-->
<add>
<add>Observable<String> source = Observable.create(emitter -> {
<add> emitter.onNext("A");
<add>
<add> Thread.sleep(1_500);
<add> emitter.onNext("B");
<add>
<add> Thread.sleep(500);
<add> emitter.onNext("C");
<add>
<add> Thread.sleep(250);
<add> emitter.onNext("D");
<add>
<add> Thread.sleep(2_000);
<add> emitter.onNext("E");
<add> emitter.onComplete();
<add>});
<add>
<add>source.subscribeOn(Schedulers.io())
<add> .debounce(1, TimeUnit.SECONDS)
<add> .blockingSubscribe(
<add> item -> System.out.println("onNext: " + item),
<add> Throwable::printStackTrace,
<add> () -> System.out.println("onComplete"));
<add>
<add>// prints:
<add>// onNext: A
<add>// onNext: D
<add>// onNext: E
<add>// onComplete
<add>```
<add>
<add>## distinct
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/distinct.html](http://reactivex.io/documentation/operators/distinct.html)
<add>
<add>Filters a reactive source by only emitting items that are distinct by comparison from previous items. A `io.reactivex.functions.Function` can be specified that projects each item emitted by the source into a new value that will be used for comparison with previous projected values.
<add>
<add>### distinct example
<add>
<add>```java
<add>Observable.just(2, 3, 4, 4, 2, 1)
<add> .distinct()
<add> .subscribe(System.out::println);
<add>
<add>// prints:
<add>// 2
<add>// 3
<add>// 4
<add>// 1
<add>```
<add>
<add>## distinctUntilChanged
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/distinct.html](http://reactivex.io/documentation/operators/distinct.html)
<add>
<add>Filters a reactive source by only emitting items that are distinct by comparison from their immediate predecessors. A `io.reactivex.functions.Function` can be specified that projects each item emitted by the source into a new value that will be used for comparison with previous projected values. Alternatively, a `io.reactivex.functions.BiPredicate` can be specified that is used as the comparator function to compare immediate predecessors with each other.
<add>
<add>### distinctUntilChanged example
<add>
<add>```java
<add>Observable.just(1, 1, 2, 1, 2, 3, 3, 4)
<add> .distinctUntilChanged()
<add> .subscribe(System.out::println);
<add>
<add>// prints:
<add>// 1
<add>// 2
<add>// 1
<add>// 2
<add>// 3
<add>// 4
<add>```
<add>
<add>## elementAt
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/elementat.html](http://reactivex.io/documentation/operators/elementat.html)
<add>
<add>Emits the single item at the specified zero-based index in a sequence of emissions from a reactive source. A default item can be specified that will be emitted if the specified index is not within the sequence.
<add>
<add>### elementAt example
<add>
<add>```java
<add>Observable<Long> source = Observable.<Long, Long>generate(() -> 1L, (state, emitter) -> {
<add> emitter.onNext(state);
<add>
<add> return state + 1L;
<add>}).scan((product, x) -> product * x);
<add>
<add>Maybe<Long> element = source.elementAt(5);
<add>element.subscribe(System.out::println);
<add>
<add>// prints 720
<add>```
<add>
<add>## elementAtOrError
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/elementat.html](http://reactivex.io/documentation/operators/elementat.html)
<add>
<add>Emits the single item at the specified zero-based index in a sequence of emissions from a reactive source, or signals a `java.util.NoSuchElementException` if the specified index is not within the sequence.
<add>
<add>### elementAtOrError example
<add>
<add>```java
<add>Observable<String> source = Observable.just("Kirk", "Spock", "Chekov", "Sulu");
<add>Single<String> element = source.elementAtOrError(4);
<add>
<add>element.subscribe(
<add> name -> System.out.println("onSuccess will not be printed!"),
<add> error -> System.out.println("onError: " + error));
<add>
<add>// prints:
<add>// onError: java.util.NoSuchElementException
<add>```
<add>
<add>## filter
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/filter.html](http://reactivex.io/documentation/operators/filter.html)
<add>
<add>Filters items emitted by a reactive source by only emitting those that satisfy a specified predicate.
<add>
<add>### filter example
<add>
<add>```java
<add>Observable.just(1, 2, 3, 4, 5, 6)
<add> .filter(x -> x % 2 == 0)
<add> .subscribe(System.out::println);
<add>
<add>// prints:
<add>// 2
<add>// 4
<add>// 6
<add>```
<add>
<add>## first
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/first.html](http://reactivex.io/documentation/operators/first.html)
<add>
<add>Emits only the first item emitted by a reactive source, or emits the given default item if the source completes without emitting an item. This differs from [`firstElement`](#firstelement) in that this operator returns a `Single` whereas [`firstElement`](#firstelement) returns a `Maybe`.
<add>
<add>### first example
<add>
<add>```java
<add>Observable<String> source = Observable.just("A", "B", "C");
<add>Single<String> firstOrDefault = source.first("D");
<add>
<add>firstOrDefault.subscribe(System.out::println);
<add>
<add>// prints A
<add>```
<add>
<add>## firstElement
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/first.html](http://reactivex.io/documentation/operators/first.html)
<add>
<add>Emits only the first item emitted by a reactive source, or just completes if the source completes without emitting an item. This differs from [`first`](#first) in that this operator returns a `Maybe` whereas [`first`](#first) returns a `Single`.
<add>
<add>### firstElement example
<add>
<add>```java
<add>Observable<String> source = Observable.just("A", "B", "C");
<add>Maybe<String> first = source.firstElement();
<add>
<add>first.subscribe(System.out::println);
<add>
<add>// prints A
<add>```
<add>
<add>## firstOrError
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/first.html](http://reactivex.io/documentation/operators/first.html)
<add>
<add>Emits only the first item emitted by a reactive source, or signals a `java.util.NoSuchElementException` if the source completes without emitting an item.
<add>
<add>### firstOrError example
<add>
<add>```java
<add>Observable<String> emptySource = Observable.empty();
<add>Single<String> firstOrError = emptySource.firstOrError();
<add>
<add>firstOrError.subscribe(
<add> element -> System.out.println("onSuccess will not be printed!"),
<add> error -> System.out.println("onError: " + error));
<add>
<add>// prints:
<add>// onError: java.util.NoSuchElementException
<add>```
<add>
<add>## ignoreElement
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/ignoreelements.html](http://reactivex.io/documentation/operators/ignoreelements.html)
<add>
<add>Ignores the single item emitted by a `Single` or `Maybe` source, and returns a `Completable` that signals only the error or completion event from the the source.
<add>
<add>### ignoreElement example
<add>
<add>```java
<add>Single<Long> source = Single.timer(1, TimeUnit.SECONDS);
<add>Completable completable = source.ignoreElement();
<add>
<add>completable.doOnComplete(() -> System.out.println("Done!"))
<add> .blockingAwait();
<add>
<add>// prints (after 1 second):
<add>// Done!
<add>```
<add>
<add>## ignoreElements
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/ignoreelements.html](http://reactivex.io/documentation/operators/ignoreelements.html)
<add>
<add>Ignores all items from the `Observable` or `Flowable` source, and returns a `Completable` that signals only the error or completion event from the source.
<add>
<add>### ignoreElements example
<add>
<add>```java
<add>Observable<Long> source = Observable.intervalRange(1, 5, 1, 1, TimeUnit.SECONDS);
<add>Completable completable = source.ignoreElements();
<add>
<add>completable.doOnComplete(() -> System.out.println("Done!"))
<add> .blockingAwait();
<add>
<add>// prints (after 5 seconds):
<add>// Done!
<add>```
<add>
<add>## last
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/last.html](http://reactivex.io/documentation/operators/last.html)
<add>
<add>Emits only the last item emitted by a reactive source, or emits the given default item if the source completes without emitting an item. This differs from [`lastElement`](#lastelement) in that this operator returns a `Single` whereas [`lastElement`](#lastelement) returns a `Maybe`.
<add>
<add>### last example
<add>
<add>```java
<add>Observable<String> source = Observable.just("A", "B", "C");
<add>Single<String> lastOrDefault = source.last("D");
<add>
<add>lastOrDefault.subscribe(System.out::println);
<add>
<add>// prints C
<add>```
<add>
<add>## lastElement
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/last.html](http://reactivex.io/documentation/operators/last.html)
<add>
<add>Emits only the last item emitted by a reactive source, or just completes if the source completes without emitting an item. This differs from [`last`](#last) in that this operator returns a `Maybe` whereas [`last`](#last) returns a `Single`.
<add>
<add>### lastElement example
<add>
<add>```java
<add>Observable<String> source = Observable.just("A", "B", "C");
<add>Maybe<String> last = source.lastElement();
<add>
<add>last.subscribe(System.out::println);
<add>
<add>// prints C
<add>```
<add>
<add>## lastOrError
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/last.html](http://reactivex.io/documentation/operators/last.html)
<add>
<add>Emits only the last item emitted by a reactive source, or signals a `java.util.NoSuchElementException` if the source completes without emitting an item.
<add>
<add>### lastOrError example
<add>
<add>```java
<add>Observable<String> emptySource = Observable.empty();
<add>Single<String> lastOrError = emptySource.lastOrError();
<add>
<add>lastOrError.subscribe(
<add> element -> System.out.println("onSuccess will not be printed!"),
<add> error -> System.out.println("onError: " + error));
<add>
<add>// prints:
<add>// onError: java.util.NoSuchElementException
<add>```
<add>
<add>## ofType
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/filter.html](http://reactivex.io/documentation/operators/filter.html)
<add>
<add>Filters items emitted by a reactive source by only emitting those of the specified type.
<add>
<add>### ofType example
<add>
<add>```java
<add>Observable<Number> numbers = Observable.just(1, 4.0, 3, 2.71, 2f, 7);
<add>Observable<Integer> integers = numbers.ofType(Integer.class);
<add>
<add>integers.subscribe((Integer x) -> System.out.println(x));
<add>
<add>// prints:
<add>// 1
<add>// 3
<add>// 7
<add>```
<add>
<add>## sample
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/sample.html](http://reactivex.io/documentation/operators/sample.html)
<add>
<add>Filters items emitted by a reactive source by only emitting the most recently emitted item within periodic time intervals.
<add>
<add>### sample example
<add>
<add>```java
<add>// Diagram:
<add>// -A----B-C-------D-----E-|-->
<add>// -0s-----c--1s---d----2s-|-->
<add>// -----------C---------D--|-->
<add>
<add>Observable<String> source = Observable.create(emitter -> {
<add> emitter.onNext("A");
<add>
<add> Thread.sleep(500);
<add> emitter.onNext("B");
<add>
<add> Thread.sleep(200);
<add> emitter.onNext("C");
<add>
<add> Thread.sleep(800);
<add> emitter.onNext("D");
<add>
<add> Thread.sleep(600);
<add> emitter.onNext("E");
<add> emitter.onComplete();
<add>});
<add>
<add>source.subscribeOn(Schedulers.io())
<add> .sample(1, TimeUnit.SECONDS)
<add> .blockingSubscribe(
<add> item -> System.out.println("onNext: " + item),
<add> Throwable::printStackTrace,
<add> () -> System.out.println("onComplete"));
<add>
<add>// prints:
<add>// onNext: C
<add>// onNext: D
<add>// onComplete
<add>```
<add>
<add>## skip
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/skip.html](http://reactivex.io/documentation/operators/skip.html)
<add>
<add>Drops the first *n* items emitted by a reactive source, and emits the remaining items.
<add>
<add>### skip example
<add>
<add>```java
<add>Observable<Integer> source = Observable.just(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);
<add>
<add>source.skip(4)
<add> .subscribe(System.out::println);
<add>
<add>// prints:
<add>// 5
<add>// 6
<add>// 7
<add>// 8
<add>// 9
<add>// 10
<add>```
<add>
<add>## skipLast
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/skiplast.html](http://reactivex.io/documentation/operators/skiplast.html)
<add>
<add>Drops the last *n* items emitted by a reactive source, and emits the remaining items.
<add>
<add>### skipLast example
<add>
<add>```java
<add>Observable<Integer> source = Observable.just(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);
<add>
<add>source.skipLast(4)
<add> .subscribe(System.out::println);
<add>
<add>// prints:
<add>// 1
<add>// 2
<add>// 3
<add>// 4
<add>// 5
<add>// 6
<add>```
<add>
<add>## take
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/take.html](http://reactivex.io/documentation/operators/take.html)
<add>
<add>Emits only the first *n* items emitted by a reactive source.
<add>
<add>### take example
<add>
<add>```java
<add>Observable<Integer> source = Observable.just(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);
<add>
<add>source.take(4)
<add> .subscribe(System.out::println);
<add>
<add>// prints:
<add>// 1
<add>// 2
<add>// 3
<add>// 4
<add>```
<add>
<add>## takeLast
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/takelast.html](http://reactivex.io/documentation/operators/takelast.html)
<add>
<add>Emits only the last *n* items emitted by a reactive source.
<add>
<add>### takeLast example
<add>
<add>```java
<add>Observable<Integer> source = Observable.just(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);
<add>
<add>source.takeLast(4)
<add> .subscribe(System.out::println);
<add>
<add>// prints:
<add>// 7
<add>// 8
<add>// 9
<add>// 10
<add>```
<add>
<add>## throttleFirst
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/sample.html](http://reactivex.io/documentation/operators/sample.html)
<add>
<add>Emits only the first item emitted by a reactive source during sequential time windows of a specified duration.
<add>
<add>### throttleFirst example
<add>
<add>```java
<add>// Diagram:
<add>// -A----B-C-------D-----E-|-->
<add>// a---------1s
<add>// d-------|-->
<add>// -A--------------D-------|-->
<add>
<add>Observable<String> source = Observable.create(emitter -> {
<add> emitter.onNext("A");
<add>
<add> Thread.sleep(500);
<add> emitter.onNext("B");
<add>
<add> Thread.sleep(200);
<add> emitter.onNext("C");
<add>
<add> Thread.sleep(800);
<add> emitter.onNext("D");
<add>
<add> Thread.sleep(600);
<add> emitter.onNext("E");
<add> emitter.onComplete();
<add>});
<add>
<add>source.subscribeOn(Schedulers.io())
<add> .throttleFirst(1, TimeUnit.SECONDS)
<add> .blockingSubscribe(
<add> item -> System.out.println("onNext: " + item),
<add> Throwable::printStackTrace,
<add> () -> System.out.println("onComplete"));
<add>
<add>// prints:
<add>// onNext: A
<add>// onNext: D
<add>// onComplete
<add>```
<add>
<add>## throttleLast
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/sample.html](http://reactivex.io/documentation/operators/sample.html)
<add>
<add>Emits only the last item emitted by a reactive source during sequential time windows of a specified duration.
<add>
<add>### throttleLast example
<add>
<add>```java
<add>// Diagram:
<add>// -A----B-C-------D-----E-|-->
<add>// -0s-----c--1s---d----2s-|-->
<add>// -----------C---------D--|-->
<add>
<add>Observable<String> source = Observable.create(emitter -> {
<add> emitter.onNext("A");
<add>
<add> Thread.sleep(500);
<add> emitter.onNext("B");
<add>
<add> Thread.sleep(200);
<add> emitter.onNext("C");
<add>
<add> Thread.sleep(800);
<add> emitter.onNext("D");
<add>
<add> Thread.sleep(600);
<add> emitter.onNext("E");
<add> emitter.onComplete();
<add>});
<add>
<add>source.subscribeOn(Schedulers.io())
<add> .throttleLast(1, TimeUnit.SECONDS)
<add> .blockingSubscribe(
<add> item -> System.out.println("onNext: " + item),
<add> Throwable::printStackTrace,
<add> () -> System.out.println("onComplete"));
<add>
<add>// prints:
<add>// onNext: C
<add>// onNext: D
<add>// onComplete
<add>```
<add>
<add>## throttleLatest
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/sample.html](http://reactivex.io/documentation/operators/sample.html)
<add>
<add>Emits the next item emitted by a reactive source, then periodically emits the latest item (if any) when the specified timeout elapses between them.
<add>
<add>### throttleLatest example
<add>
<add>```java
<add>// Diagram:
<add>// -A----B-C-------D-----E-|-->
<add>// -a------c--1s
<add>// -----d----1s
<add>// -e-|-->
<add>// -A---------C---------D--|-->
<add>
<add>Observable<String> source = Observable.create(emitter -> {
<add> emitter.onNext("A");
<add>
<add> Thread.sleep(500);
<add> emitter.onNext("B");
<add>
<add> Thread.sleep(200);
<add> emitter.onNext("C");
<add>
<add> Thread.sleep(800);
<add> emitter.onNext("D");
<add>
<add> Thread.sleep(600);
<add> emitter.onNext("E");
<add> emitter.onComplete();
<add>});
<add>
<add>source.subscribeOn(Schedulers.io())
<add> .throttleLatest(1, TimeUnit.SECONDS)
<add> .blockingSubscribe(
<add> item -> System.out.println("onNext: " + item),
<add> Throwable::printStackTrace,
<add> () -> System.out.println("onComplete"));
<add>
<add>// prints:
<add>// onNext: A
<add>// onNext: C
<add>// onNext: D
<add>// onComplete
<add>```
<add>
<add>## throttleWithTimeout
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/debounce.html](http://reactivex.io/documentation/operators/debounce.html)
<add>
<add>> Alias to [debounce](#debounce)
<add>
<add>Drops items emitted by a reactive source that are followed by newer items before the given timeout value expires. The timer resets on each emission.
<add>
<add>### throttleWithTimeout example
<add>
<add>```java
<add>// Diagram:
<add>// -A--------------B----C-D-------------------E-|---->
<add>// a---------1s
<add>// b---------1s
<add>// c---------1s
<add>// d---------1s
<add>// e-|---->
<add>// -----------A---------------------D-----------E-|-->
<add>
<add>Observable<String> source = Observable.create(emitter -> {
<add> emitter.onNext("A");
<add>
<add> Thread.sleep(1_500);
<add> emitter.onNext("B");
<add>
<add> Thread.sleep(500);
<add> emitter.onNext("C");
<add>
<add> Thread.sleep(250);
<add> emitter.onNext("D");
<add>
<add> Thread.sleep(2_000);
<add> emitter.onNext("E");
<add> emitter.onComplete();
<add>});
<add>
<add>source.subscribeOn(Schedulers.io())
<add> .throttleWithTimeout(1, TimeUnit.SECONDS)
<add> .blockingSubscribe(
<add> item -> System.out.println("onNext: " + item),
<add> Throwable::printStackTrace,
<add> () -> System.out.println("onComplete"));
<add>
<add>// prints:
<add>// onNext: A
<add>// onNext: D
<add>// onNext: E
<add>// onComplete
<add>```
<add>
<add>## timeout
<add>
<add>**Available in:**  `Flowable`,  `Observable`,  `Maybe`,  `Single`,  `Completable`
<add>
<add>**ReactiveX documentation:** [http://reactivex.io/documentation/operators/timeout.html](http://reactivex.io/documentation/operators/timeout.html)
<add>
<add>Emits the items from the `Observable` or `Flowable` source, but terminates with a `java.util.concurrent.TimeoutException` if the next item is not emitted within the specified timeout duration starting from the previous item. For `Maybe`, `Single` and `Completable` the specified timeout duration specifies the maximum time to wait for a success or completion event to arrive. If the `Maybe`, `Single` or `Completable` does not complete within the given time a `java.util.concurrent.TimeoutException` will be emitted.
<add>
<add>### timeout example
<add>
<add>```java
<add>// Diagram:
<add>// -A-------B---C-----------D-|-->
<add>// a---------1s
<add>// b---------1s
<add>// c---------1s
<add>// -A-------B---C---------X------>
<add>
<add>Observable<String> source = Observable.create(emitter -> {
<add> emitter.onNext("A");
<add>
<add> Thread.sleep(800);
<add> emitter.onNext("B");
<add>
<add> Thread.sleep(400);
<add> emitter.onNext("C");
<add>
<add> Thread.sleep(1200);
<add> emitter.onNext("D");
<add> emitter.onComplete();
<add>});
<add>
<add>source.timeout(1, TimeUnit.SECONDS)
<add> .subscribe(
<add> item -> System.out.println("onNext: " + item),
<add> error -> System.out.println("onError: " + error),
<add> () -> System.out.println("onComplete will not be printed!"));
<add>
<add>// prints:
<add>// onNext: A
<add>// onNext: B
<add>// onNext: C
<add>// onError: java.util.concurrent.TimeoutException: The source did not signal an event for 1 seconds and has been terminated.
<add>``` | 1 |
Javascript | Javascript | remove unused simulated flag parameter | b4608dd24caa31b3c78c51acbb5115ad03e884b3 | <ide><path>packages/events/EventPluginHub.js
<ide> let eventQueue: ?(Array<ReactSyntheticEvent> | ReactSyntheticEvent) = null;
<ide> * Dispatches an event and releases it back into the pool, unless persistent.
<ide> *
<ide> * @param {?object} event Synthetic event to be dispatched.
<del> * @param {boolean} simulated If the event is simulated (changes exn behavior)
<ide> * @private
<ide> */
<del>const executeDispatchesAndRelease = function(
<del> event: ReactSyntheticEvent,
<del> simulated: boolean,
<del>) {
<add>const executeDispatchesAndRelease = function(event: ReactSyntheticEvent) {
<ide> if (event) {
<del> executeDispatchesInOrder(event, simulated);
<add> executeDispatchesInOrder(event);
<ide>
<ide> if (!event.isPersistent()) {
<ide> event.constructor.release(event);
<ide> }
<ide> }
<ide> };
<del>const executeDispatchesAndReleaseSimulated = function(e) {
<del> return executeDispatchesAndRelease(e, true);
<del>};
<ide> const executeDispatchesAndReleaseTopLevel = function(e) {
<del> return executeDispatchesAndRelease(e, false);
<add> return executeDispatchesAndRelease(e);
<ide> };
<ide>
<ide> function isInteractive(tag) {
<ide> function extractEvents(
<ide>
<ide> export function runEventsInBatch(
<ide> events: Array<ReactSyntheticEvent> | ReactSyntheticEvent | null,
<del> simulated: boolean,
<ide> ) {
<ide> if (events !== null) {
<ide> eventQueue = accumulateInto(eventQueue, events);
<ide> export function runEventsInBatch(
<ide> return;
<ide> }
<ide>
<del> if (simulated) {
<del> forEachAccumulated(
<del> processingEventQueue,
<del> executeDispatchesAndReleaseSimulated,
<del> );
<del> } else {
<del> forEachAccumulated(
<del> processingEventQueue,
<del> executeDispatchesAndReleaseTopLevel,
<del> );
<del> }
<add> forEachAccumulated(processingEventQueue, executeDispatchesAndReleaseTopLevel);
<ide> invariant(
<ide> !eventQueue,
<ide> 'processEventQueue(): Additional events were enqueued while processing ' +
<ide> export function runExtractedEventsInBatch(
<ide> nativeEvent,
<ide> nativeEventTarget,
<ide> );
<del> runEventsInBatch(events, false);
<add> runEventsInBatch(events);
<ide> }
<ide><path>packages/events/EventPluginUtils.js
<ide> if (__DEV__) {
<ide> /**
<ide> * Dispatch the event to the listener.
<ide> * @param {SyntheticEvent} event SyntheticEvent to handle
<del> * @param {boolean} simulated If the event is simulated (changes exn behavior)
<ide> * @param {function} listener Application-level callback
<ide> * @param {*} inst Internal component instance
<ide> */
<del>function executeDispatch(event, simulated, listener, inst) {
<add>function executeDispatch(event, listener, inst) {
<ide> const type = event.type || 'unknown-event';
<ide> event.currentTarget = getNodeFromInstance(inst);
<ide> invokeGuardedCallbackAndCatchFirstError(type, listener, undefined, event);
<ide> function executeDispatch(event, simulated, listener, inst) {
<ide> /**
<ide> * Standard/simple iteration through an event's collected dispatches.
<ide> */
<del>export function executeDispatchesInOrder(event, simulated) {
<add>export function executeDispatchesInOrder(event) {
<ide> const dispatchListeners = event._dispatchListeners;
<ide> const dispatchInstances = event._dispatchInstances;
<ide> if (__DEV__) {
<ide> export function executeDispatchesInOrder(event, simulated) {
<ide> break;
<ide> }
<ide> // Listeners and Instances are two parallel arrays that are always in sync.
<del> executeDispatch(
<del> event,
<del> simulated,
<del> dispatchListeners[i],
<del> dispatchInstances[i],
<del> );
<add> executeDispatch(event, dispatchListeners[i], dispatchInstances[i]);
<ide> }
<ide> } else if (dispatchListeners) {
<del> executeDispatch(event, simulated, dispatchListeners, dispatchInstances);
<add> executeDispatch(event, dispatchListeners, dispatchInstances);
<ide> }
<ide> event._dispatchListeners = null;
<ide> event._dispatchInstances = null;
<ide><path>packages/events/__tests__/ResponderEventPlugin-test.internal.js
<ide> const run = function(config, hierarchyConfig, nativeEventConfig) {
<ide> // At this point the negotiation events have been dispatched as part of the
<ide> // extraction process, but not the side effectful events. Below, we dispatch
<ide> // side effectful events.
<del> EventPluginHub.runEventsInBatch(extractedEvents, true);
<add> EventPluginHub.runEventsInBatch(extractedEvents);
<ide>
<ide> // Ensure that every event that declared an `order`, was actually dispatched.
<ide> expect('number of events dispatched:' + runData.dispatchCount).toBe(
<ide><path>packages/react-dom/src/events/ChangeEventPlugin.js
<ide> function manualDispatchChangeEvent(nativeEvent) {
<ide> }
<ide>
<ide> function runEventInBatch(event) {
<del> EventPluginHub.runEventsInBatch(event, false);
<add> EventPluginHub.runEventsInBatch(event);
<ide> }
<ide>
<ide> function getInstIfValueChanged(targetInst) {
<ide><path>packages/react-dom/src/test-utils/ReactTestUtils.js
<ide> function makeSimulator(eventType) {
<ide> // Normally extractEvent enqueues a state restore, but we'll just always
<ide> // do that since we we're by-passing it here.
<ide> enqueueStateRestore(domNode);
<del> runEventsInBatch(event, true);
<add> runEventsInBatch(event);
<ide> });
<ide> restoreStateIfNeeded();
<ide> }; | 5 |
Javascript | Javascript | improve grammar of warning message | 0a47e34d77bac3d998fef965471fe69c3d17d1a7 | <ide><path>lib/NodeStuffPlugin.js
<ide> class NodeStuffPlugin {
<ide> new NodeStuffInWebError(
<ide> dep.loc,
<ide> "global",
<del> "The global namespace object is Node.js feature and doesn't present in browser."
<add> "The global namespace object is a Node.js feature and isn't available in browsers."
<ide> )
<ide> );
<ide> } | 1 |
PHP | PHP | add tests for ucsplit in stringable | 650ca875d3bd7e876ed89f803c705d5d6c121d90 | <ide><path>tests/Support/SupportStringableTest.php
<ide> public function testWhenContainsAll()
<ide> }));
<ide> }
<ide>
<add> public function testUcsplitOnStringable()
<add> {
<add> $this->assertSame(['Taylor', 'Otwell'], $this->stringable('TaylorOtwell')->ucsplit()->toArray());
<add> $this->assertSame(['Hello', 'From', 'Laravel'], $this->stringable('HelloFromLaravel')->ucsplit()->toArray());
<add> $this->assertSame(['He_llo_', 'World'], $this->stringable('He_llo_World')->ucsplit()->toArray());
<add> }
<add>
<ide> public function testWhenEndsWith()
<ide> {
<ide> $this->assertSame('Tony Stark', (string) $this->stringable('tony stark')->whenEndsWith('ark', function ($stringable) { | 1 |
Text | Text | add lead, new maintainers | d6aa0aa8d35c76cc9865035e8570e9ee44600031 | <ide><path>README.md
<ide> This is our PGP key which is valid until May 24, 2017.
<ide> * Full key: https://keybase.io/homebrew/key.asc
<ide>
<ide> ## Who Are You?
<del>Homebrew's current maintainers are [Misty De Meo](https://github.com/mistydemeo), [Andrew Janke](https://github.com/apjanke), [Xu Cheng](https://github.com/xu-cheng), [Tomasz Pajor](https://github.com/nijikon), [Mike McQuaid](https://github.com/mikemcquaid), [Baptiste Fontaine](https://github.com/bfontaine), [Brett Koonce](https://github.com/asparagui), [ilovezfs](https://github.com/ilovezfs), [Martin Afanasjew](https://github.com/UniqMartin), [Dominyk Tiller](https://github.com/DomT4), [Tim Smith](https://github.com/tdsmith) and [Alex Dunn](https://github.com/dunn).
<add>Homebrew's lead maintainer is [Mike McQuaid](https://github.com/mikemcquaid).
<add>
<add>Homebrew's current maintainers are [Misty De Meo](https://github.com/mistydemeo), [Andrew Janke](https://github.com/apjanke), [Xu Cheng](https://github.com/xu-cheng), [Tomasz Pajor](https://github.com/nijikon), [Baptiste Fontaine](https://github.com/bfontaine), [Zhiming Wang](https://github.com/zmwangx), [Brett Koonce](https://github.com/asparagui), [ilovezfs](https://github.com/ilovezfs), [Martin Afanasjew](https://github.com/UniqMartin), [Uladzislau Shablinski](https://github.com/orgs/Homebrew/people/vladshablinsky), [Dominyk Tiller](https://github.com/DomT4), [Tim Smith](https://github.com/tdsmith) and [Alex Dunn](https://github.com/dunn).
<ide>
<ide> Former maintainers with significant contributions include [Jack Nagel](https://github.com/jacknagel), [Adam Vandenberg](https://github.com/adamv) and Homebrew's creator: [Max Howell](https://github.com/mxcl).
<ide> | 1 |
Javascript | Javascript | fix version check in tick processor | 421316dca17d72919a928f414af8d03e2f3a2476 | <ide><path>lib/internal/v8_prof_polyfill.js
<ide> function versionCheck() {
<ide> var firstLine = readline();
<ide> line = firstLine + '\n' + line;
<ide> firstLine = firstLine.split(',');
<del> const curVer = process.versions.v8.split(/\.-/);
<add> const curVer = process.versions.v8.split(/[.\-]/);
<ide> if (firstLine.length !== 6 && firstLine.length !== 7 ||
<ide> firstLine[0] !== 'v8-version') {
<ide> console.log('Unable to read v8-version from log file.'); | 1 |
Javascript | Javascript | use namespaced path in absolute symlinks | a4e273baf43910ba9e5c66949e56b919f8614fb9 | <ide><path>lib/internal/fs/utils.js
<ide> function preprocessSymlinkDestination(path, type, linkPath) {
<ide> path = pathModule.resolve(linkPath, '..', path);
<ide> return pathModule.toNamespacedPath(path);
<ide> }
<add> if (pathModule.isAbsolute(path)) {
<add> // If the path is absolute, use the \\?\-prefix to enable long filenames
<add> return pathModule.toNamespacedPath(path);
<add> }
<ide> // Windows symlinks don't tolerate forward slashes.
<ide> return ('' + path).replace(/\//g, '\\');
<ide> }
<ide><path>test/parallel/test-fs-symlink-longpath.js
<add>'use strict';
<add>
<add>const common = require('../common');
<add>const assert = require('assert');
<add>const path = require('path');
<add>const fs = require('fs');
<add>
<add>const tmpdir = require('../common/tmpdir');
<add>tmpdir.refresh();
<add>const tmpDir = tmpdir.path;
<add>const longPath = path.join(...[tmpDir].concat(Array(30).fill('1234567890')));
<add>fs.mkdirSync(longPath, { recursive: true });
<add>
<add>// Test if we can have symlinks to files and folders with long filenames
<add>const targetDirtectory = path.join(longPath, 'target-directory');
<add>fs.mkdirSync(targetDirtectory);
<add>const pathDirectory = path.join(tmpDir, 'new-directory');
<add>fs.symlink(targetDirtectory, pathDirectory, 'dir', common.mustCall((err) => {
<add> assert.ifError(err);
<add> assert(fs.existsSync(pathDirectory));
<add>}));
<add>
<add>const targetFile = path.join(longPath, 'target-file');
<add>fs.writeFileSync(targetFile, 'data');
<add>const pathFile = path.join(tmpDir, 'new-file');
<add>fs.symlink(targetFile, pathFile, common.mustCall((err) => {
<add> assert.ifError(err);
<add> assert(fs.existsSync(pathFile));
<add>})); | 2 |
Ruby | Ruby | require formula before using it | 0581532fdc68fc5b624cfa32b49c21552c90b4c2 | <ide><path>Library/Homebrew/extend/ENV/shared.rb
<add>require 'formula'
<add>
<ide> module SharedEnvExtension
<ide> CC_FLAG_VARS = %w{CFLAGS CXXFLAGS OBJCFLAGS OBJCXXFLAGS}
<ide> FC_FLAG_VARS = %w{FCFLAGS FFLAGS} | 1 |
Javascript | Javascript | fix error in math helpers | 0ed39c9fd702a0f281206137b01131080be8712b | <ide><path>src/core/core.helpers.js
<ide> } else {
<ide> return max;
<ide> }
<del> }, Number.MIN_VALUE);
<add> }, Number.NEGATIVE_INFINITY);
<ide> };
<ide> helpers.min = function(array) {
<ide> return array.reduce(function(min, value) {
<ide> } else {
<ide> return min;
<ide> }
<del> }, Number.MAX_VALUE);
<add> }, Number.POSITIVE_INFINITY);
<ide> };
<ide> helpers.sign = function(x) {
<ide> if (Math.sign) { | 1 |
Python | Python | add test for sequential model | 07f2253ff23d26126185fdb81e81d357fc6a7b71 | <ide><path>tests/auto/test_sequential_model.py
<add>from __future__ import absolute_import
<add>from __future__ import print_function
<add>import unittest
<add>import numpy as np
<add>np.random.seed(1337)
<add>
<add>from keras.models import Sequential
<add>from keras.layers.core import Dense, Activation, Merge
<add>from keras.utils import np_utils
<add>from keras.utils.test_utils import get_test_data
<add>
<add>input_dim = 32
<add>nb_hidden = 16
<add>nb_class = 4
<add>batch_size = 64
<add>nb_epoch = 1
<add>
<add>train_samples = 5000
<add>test_samples = 1000
<add>
<add>(X_train, y_train), (X_test, y_test) = get_test_data(nb_train=train_samples, nb_test=test_samples, input_shape=(input_dim,),
<add> classification=True, nb_class=4)
<add>y_test = np_utils.to_categorical(y_test)
<add>y_train = np_utils.to_categorical(y_train)
<add>print(X_train.shape)
<add>print(y_train.shape)
<add>
<add>class TestSequential(unittest.TestCase):
<add> def test_sequential(self):
<add> print('Test sequential')
<add> model = Sequential()
<add> model.add(Dense(input_dim, nb_hidden))
<add> model.add(Activation('relu'))
<add> model.add(Dense(nb_hidden, nb_class))
<add> model.add(Activation('softmax'))
<add> model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<add>
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=1, validation_data=(X_test, y_test))
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=2, validation_data=(X_test, y_test))
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=2, validation_split=0.1)
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=1, validation_split=0.1)
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=1, shuffle=False)
<add>
<add> model.train_on_batch(X_train[:32], y_train[:32])
<add>
<add> loss = model.evaluate(X_train, y_train, verbose=0)
<add> print('loss:', loss)
<add> if loss > 0.6:
<add> raise Exception('Score too low, learning issue.')
<add> preds = model.predict(X_test, verbose=0)
<add> classes = model.predict_classes(X_test, verbose=0)
<add> probas = model.predict_proba(X_test, verbose=0)
<add> print(model.get_config(verbose=1))
<add>
<add> print('test weight saving')
<add> model.save_weights('temp.h5', overwrite=True)
<add> model = Sequential()
<add> model.add(Dense(input_dim, nb_hidden))
<add> model.add(Activation('relu'))
<add> model.add(Dense(nb_hidden, nb_class))
<add> model.add(Activation('softmax'))
<add> model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<add> model.load_weights('temp.h5')
<add>
<add> nloss = model.evaluate(X_train, y_train, verbose=0)
<add> print(nloss)
<add> assert(loss == nloss)
<add>
<add>
<add> def test_merge_sum(self):
<add> print('Test merge: sum')
<add> left = Sequential()
<add> left.add(Dense(input_dim, nb_hidden))
<add> left.add(Activation('relu'))
<add>
<add> right = Sequential()
<add> right.add(Dense(input_dim, nb_hidden))
<add> right.add(Activation('relu'))
<add>
<add> model = Sequential()
<add> model.add(Merge([left, right], mode='sum'))
<add>
<add> model.add(Dense(nb_hidden, nb_class))
<add> model.add(Activation('softmax'))
<add>
<add> model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<add>
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_data=([X_test, X_test], y_test))
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_data=([X_test, X_test], y_test))
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_split=0.1)
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_split=0.1)
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0, shuffle=False)
<add>
<add> loss = model.evaluate([X_train, X_train], y_train, verbose=0)
<add> print('loss:', loss)
<add> if loss > 0.7:
<add> raise Exception('Score too low, learning issue.')
<add> preds = model.predict([X_test, X_test], verbose=0)
<add> classes = model.predict_classes([X_test, X_test], verbose=0)
<add> probas = model.predict_proba([X_test, X_test], verbose=0)
<add> print(model.get_config(verbose=1))
<add>
<add> print('test weight saving')
<add> model.save_weights('temp.h5', overwrite=True)
<add> left = Sequential()
<add> left.add(Dense(input_dim, nb_hidden))
<add> left.add(Activation('relu'))
<add> right = Sequential()
<add> right.add(Dense(input_dim, nb_hidden))
<add> right.add(Activation('relu'))
<add> model = Sequential()
<add> model.add(Merge([left, right], mode='sum'))
<add> model.add(Dense(nb_hidden, nb_class))
<add> model.add(Activation('softmax'))
<add> model.load_weights('temp.h5')
<add> model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<add>
<add> nloss = model.evaluate([X_train, X_train], y_train, verbose=0)
<add> print(nloss)
<add> assert(loss == nloss)
<add>
<add>
<add> def test_merge_concat(self):
<add> print('Test merge: concat')
<add> left = Sequential()
<add> left.add(Dense(input_dim, nb_hidden))
<add> left.add(Activation('relu'))
<add>
<add> right = Sequential()
<add> right.add(Dense(input_dim, nb_hidden))
<add> right.add(Activation('relu'))
<add>
<add> model = Sequential()
<add> model.add(Merge([left, right], mode='concat'))
<add>
<add> model.add(Dense(nb_hidden * 2, nb_class))
<add> model.add(Activation('softmax'))
<add>
<add> model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<add>
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_data=([X_test, X_test], y_test))
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_data=([X_test, X_test], y_test))
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_split=0.1)
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_split=0.1)
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<add> model.fit([X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0, shuffle=False)
<add>
<add> loss = model.evaluate([X_train, X_train], y_train, verbose=0)
<add> print('loss:', loss)
<add> if loss > 0.6:
<add> raise Exception('Score too low, learning issue.')
<add> preds = model.predict([X_test, X_test], verbose=0)
<add> classes = model.predict_classes([X_test, X_test], verbose=0)
<add> probas = model.predict_proba([X_test, X_test], verbose=0)
<add> print(model.get_config(verbose=1))
<add>
<add> print('test weight saving')
<add> model.save_weights('temp.h5', overwrite=True)
<add> left = Sequential()
<add> left.add(Dense(input_dim, nb_hidden))
<add> left.add(Activation('relu'))
<add>
<add> right = Sequential()
<add> right.add(Dense(input_dim, nb_hidden))
<add> right.add(Activation('relu'))
<add>
<add> model = Sequential()
<add> model.add(Merge([left, right], mode='concat'))
<add>
<add> model.add(Dense(nb_hidden * 2, nb_class))
<add> model.add(Activation('softmax'))
<add>
<add> model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<add> model.load_weights('temp.h5')
<add>
<add> nloss = model.evaluate([X_train, X_train], y_train, verbose=0)
<add> print(nloss)
<add> assert(loss == nloss)
<add>
<add>
<add> def test_merge_recursivity(self):
<add> print('Test merge recursivity')
<add>
<add> left = Sequential()
<add> left.add(Dense(input_dim, nb_hidden))
<add> left.add(Activation('relu'))
<add>
<add> right = Sequential()
<add> right.add(Dense(input_dim, nb_hidden))
<add> right.add(Activation('relu'))
<add>
<add> righter = Sequential()
<add> righter.add(Dense(input_dim, nb_hidden))
<add> righter.add(Activation('relu'))
<add>
<add> intermediate = Sequential()
<add> intermediate.add(Merge([left, right], mode='sum'))
<add> intermediate.add(Dense(nb_hidden, nb_hidden))
<add> intermediate.add(Activation('relu'))
<add>
<add> model = Sequential()
<add> model.add(Merge([intermediate, righter], mode='sum'))
<add>
<add> model.add(Dense(nb_hidden, nb_class))
<add> model.add(Activation('softmax'))
<add>
<add> model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<add>
<add> model.fit([X_train, X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_data=([X_test, X_test, X_test], y_test))
<add> model.fit([X_train, X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_data=([X_test, X_test, X_test], y_test))
<add> model.fit([X_train, X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_split=0.1)
<add> model.fit([X_train, X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_split=0.1)
<add> model.fit([X_train, X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<add> model.fit([X_train, X_train, X_train], y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0, shuffle=False)
<add>
<add> loss = model.evaluate([X_train, X_train, X_train], y_train, verbose=0)
<add> print('loss:', loss)
<add> if loss > 0.6:
<add> raise Exception('Score too low, learning issue.')
<add> preds = model.predict([X_test, X_test, X_test], verbose=0)
<add> classes = model.predict_classes([X_test, X_test, X_test], verbose=0)
<add> probas = model.predict_proba([X_test, X_test, X_test], verbose=0)
<add> print(model.get_config(verbose=1))
<add>
<add> model.save_weights('temp.h5', overwrite=True)
<add> model.load_weights('temp.h5')
<add>
<add> nloss = model.evaluate([X_train, X_train, X_train], y_train, verbose=0)
<add> print(nloss)
<add> assert(loss == nloss)
<add>
<add>
<add> def test_merge_overlap(self):
<add> print('Test merge overlap')
<add> left = Sequential()
<add> left.add(Dense(input_dim, nb_hidden))
<add> left.add(Activation('relu'))
<add>
<add> model = Sequential()
<add> model.add(Merge([left, left], mode='sum'))
<add>
<add> model.add(Dense(nb_hidden, nb_class))
<add> model.add(Activation('softmax'))
<add>
<add> model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<add>
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=1, validation_data=(X_test, y_test))
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=2, validation_data=(X_test, y_test))
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=2, validation_split=0.1)
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=1, validation_split=0.1)
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<add> model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=1, shuffle=False)
<add>
<add> model.train_on_batch(X_train[:32], y_train[:32])
<add>
<add> loss = model.evaluate(X_train, y_train, verbose=0)
<add> print('loss:', loss)
<add> if loss > 0.6:
<add> raise Exception('Score too low, learning issue.')
<add> preds = model.predict(X_test, verbose=0)
<add> classes = model.predict_classes(X_test, verbose=0)
<add> probas = model.predict_proba(X_test, verbose=0)
<add> print(model.get_config(verbose=1))
<add>
<add> model.save_weights('temp.h5', overwrite=True)
<add> model.load_weights('temp.h5')
<add>
<add> nloss = model.evaluate(X_train, y_train, verbose=0)
<add> print(nloss)
<add> assert(loss == nloss)
<add>
<add>
<add>
<add>if __name__ == '__main__':
<add> print('Test Sequential model')
<add> unittest.main()
<ide><path>tests/manual/check_models.py
<del>from __future__ import absolute_import
<del>from __future__ import print_function
<del>from keras.datasets import mnist
<del>from keras.models import Sequential
<del>from keras.layers.core import Dense, Activation, Merge
<del>from keras.utils import np_utils
<del>import numpy as np
<del>
<del>nb_classes = 10
<del>batch_size = 128
<del>nb_epoch = 1
<del>
<del>max_train_samples = 5000
<del>max_test_samples = 1000
<del>
<del>np.random.seed(1337) # for reproducibility
<del>
<del># the data, shuffled and split between tran and test sets
<del>(X_train, y_train), (X_test, y_test) = mnist.load_data()
<del>
<del>X_train = X_train.reshape(60000,784)[:max_train_samples]
<del>X_test = X_test.reshape(10000,784)[:max_test_samples]
<del>X_train = X_train.astype("float32")
<del>X_test = X_test.astype("float32")
<del>X_train /= 255
<del>X_test /= 255
<del>
<del># convert class vectors to binary class matrices
<del>Y_train = np_utils.to_categorical(y_train, nb_classes)[:max_train_samples]
<del>Y_test = np_utils.to_categorical(y_test, nb_classes)[:max_test_samples]
<del>
<del>#########################
<del># sequential model test #
<del>#########################
<del>print('Test sequential')
<del>model = Sequential()
<del>model.add(Dense(784, 50))
<del>model.add(Activation('relu'))
<del>model.add(Dense(50, 10))
<del>model.add(Activation('softmax'))
<del>
<del>model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_data=(X_test, Y_test))
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_data=(X_test, Y_test))
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_split=0.1)
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_split=0.1)
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0, shuffle=False)
<del>
<del>score = model.evaluate(X_train, Y_train, verbose=0)
<del>print('score:', score)
<del>if score < 0.25:
<del> raise Exception('Score too low, learning issue.')
<del>preds = model.predict(X_test, verbose=0)
<del>classes = model.predict_classes(X_test, verbose=0)
<del>
<del>model.get_config(verbose=1)
<del>
<del>###################
<del># merge test: sum #
<del>###################
<del>print('Test merge: sum')
<del>left = Sequential()
<del>left.add(Dense(784, 50))
<del>left.add(Activation('relu'))
<del>
<del>right = Sequential()
<del>right.add(Dense(784, 50))
<del>right.add(Activation('relu'))
<del>
<del>model = Sequential()
<del>model.add(Merge([left, right], mode='sum'))
<del>
<del>model.add(Dense(50, 10))
<del>model.add(Activation('softmax'))
<del>
<del>model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<del>
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_data=([X_test, X_test], Y_test))
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_data=([X_test, X_test], Y_test))
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_split=0.1)
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_split=0.1)
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0, shuffle=False)
<del>
<del>score = model.evaluate([X_train, X_train], Y_train, verbose=0)
<del>print('score:', score)
<del>if score < 0.22:
<del> raise Exception('Score too low, learning issue.')
<del>preds = model.predict([X_test, X_test], verbose=0)
<del>classes = model.predict_classes([X_test, X_test], verbose=0)
<del>
<del>model.get_config(verbose=1)
<del>
<del>###################
<del># merge test: concat #
<del>###################
<del>print('Test merge: concat')
<del>left = Sequential()
<del>left.add(Dense(784, 50))
<del>left.add(Activation('relu'))
<del>
<del>right = Sequential()
<del>right.add(Dense(784, 50))
<del>right.add(Activation('relu'))
<del>
<del>model = Sequential()
<del>model.add(Merge([left, right], mode='concat'))
<del>
<del>model.add(Dense(50*2, 10))
<del>model.add(Activation('softmax'))
<del>
<del>model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<del>
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_data=([X_test, X_test], Y_test))
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_data=([X_test, X_test], Y_test))
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_split=0.1)
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_split=0.1)
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<del>model.fit([X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0, shuffle=False)
<del>
<del>score = model.evaluate([X_train, X_train], Y_train, verbose=0)
<del>print('score:', score)
<del>if score < 0.22:
<del> raise Exception('Score too low, learning issue.')
<del>preds = model.predict([X_test, X_test], verbose=0)
<del>classes = model.predict_classes([X_test, X_test], verbose=0)
<del>
<del>model.get_config(verbose=1)
<del>
<del>##########################
<del># test merge recursivity #
<del>##########################
<del>print('Test merge recursivity')
<del>
<del>left = Sequential()
<del>left.add(Dense(784, 50))
<del>left.add(Activation('relu'))
<del>
<del>right = Sequential()
<del>right.add(Dense(784, 50))
<del>right.add(Activation('relu'))
<del>
<del>righter = Sequential()
<del>righter.add(Dense(784, 50))
<del>righter.add(Activation('relu'))
<del>
<del>intermediate = Sequential()
<del>intermediate.add(Merge([left, right], mode='sum'))
<del>intermediate.add(Dense(50, 50))
<del>intermediate.add(Activation('relu'))
<del>
<del>model = Sequential()
<del>model.add(Merge([intermediate, righter], mode='sum'))
<del>
<del>model.add(Dense(50, 10))
<del>model.add(Activation('softmax'))
<del>
<del>model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<del>
<del>model.fit([X_train, X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_data=([X_test, X_test, X_test], Y_test))
<del>model.fit([X_train, X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_data=([X_test, X_test, X_test], Y_test))
<del>model.fit([X_train, X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_split=0.1)
<del>model.fit([X_train, X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_split=0.1)
<del>model.fit([X_train, X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<del>model.fit([X_train, X_train, X_train], Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0, shuffle=False)
<del>
<del>score = model.evaluate([X_train, X_train, X_train], Y_train, verbose=0)
<del>print('score:', score)
<del>if score < 0.19:
<del> raise Exception('Score too low, learning issue.')
<del>preds = model.predict([X_test, X_test, X_test], verbose=0)
<del>classes = model.predict_classes([X_test, X_test, X_test], verbose=0)
<del>
<del>model.get_config(verbose=1)
<del>
<del>model.save_weights('temp.h5')
<del>model.load_weights('temp.h5')
<del>
<del>score = model.evaluate([X_train, X_train, X_train], Y_train, verbose=0)
<del>print('score:', score)
<del>
<del>######################
<del># test merge overlap #
<del>######################
<del>print('Test merge overlap')
<del>left = Sequential()
<del>left.add(Dense(784, 50))
<del>left.add(Activation('relu'))
<del>
<del>model = Sequential()
<del>model.add(Merge([left, left], mode='sum'))
<del>
<del>model.add(Dense(50, 10))
<del>model.add(Activation('softmax'))
<del>
<del>model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
<del>
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_data=(X_test, Y_test))
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_data=(X_test, Y_test))
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=0, validation_split=0.1)
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=False, verbose=0, validation_split=0.1)
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0)
<del>model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=0, shuffle=False)
<del>
<del>score = model.evaluate(X_train, Y_train, verbose=0)
<del>print('score:', score)
<del>if score < 0.22:
<del> raise Exception('Score too low, learning issue.')
<del>preds = model.predict(X_test, verbose=0)
<del>classes = model.predict_classes(X_test, verbose=0)
<del>
<del>model.get_config(verbose=1) | 2 |
Python | Python | add timehistory callback to bert | 533d1e6b26f7fe3e4e355ec1a1aeb3277a059c8a | <ide><path>official/modeling/model_training_utils.py
<ide> def _run_callbacks_on_batch_end(batch, logs):
<ide> train_steps(train_iterator,
<ide> tf.convert_to_tensor(steps, dtype=tf.int32))
<ide> train_loss = _float_metric_value(train_loss_metric)
<del> _run_callbacks_on_batch_end(current_step, {'loss': train_loss})
<ide> current_step += steps
<add> _run_callbacks_on_batch_end(current_step - 1, {'loss': train_loss})
<ide>
<ide> # Updates training logging.
<ide> training_status = 'Train Step: %d/%d / loss = %s' % (
<ide><path>official/nlp/bert/common_flags.py
<ide> def define_common_bert_flags():
<ide> flags.DEFINE_bool('hub_module_trainable', True,
<ide> 'True to make keras layers in the hub module trainable.')
<ide>
<add> flags_core.define_log_steps()
<add>
<ide> # Adds flags for mixed precision and multi-worker training.
<ide> flags_core.define_performance(
<ide> num_parallel_calls=False,
<ide><path>official/nlp/bert/run_classifier.py
<ide> def metric_fn():
<ide> epochs,
<ide> steps_per_epoch,
<ide> eval_steps,
<del> custom_callbacks=None)
<add> custom_callbacks=custom_callbacks)
<ide>
<ide> # Use user-defined loop to start training.
<ide> logging.info('Training using customized training loop TF 2.0 with '
<ide> def run_bert(strategy,
<ide> if not strategy:
<ide> raise ValueError('Distribution strategy has not been specified.')
<ide>
<add> if FLAGS.log_steps:
<add> custom_callbacks = [keras_utils.TimeHistory(
<add> batch_size=FLAGS.train_batch_size,
<add> log_steps=FLAGS.log_steps,
<add> logdir=FLAGS.model_dir,
<add> )]
<add> else:
<add> custom_callbacks = None
<add>
<ide> trained_model = run_bert_classifier(
<ide> strategy,
<ide> model_config,
<ide> def run_bert(strategy,
<ide> train_input_fn,
<ide> eval_input_fn,
<ide> run_eagerly=FLAGS.run_eagerly,
<del> use_keras_compile_fit=FLAGS.use_keras_compile_fit)
<add> use_keras_compile_fit=FLAGS.use_keras_compile_fit,
<add> custom_callbacks=custom_callbacks)
<ide>
<ide> if FLAGS.model_export_path:
<ide> # As Keras ModelCheckpoint callback used with Keras compile/fit() API
<ide><path>official/nlp/bert/run_squad.py
<ide> from official.nlp.bert import tokenization
<ide> from official.nlp.data import squad_lib as squad_lib_wp
<ide> from official.utils.misc import distribution_utils
<add>from official.utils.misc import keras_utils
<ide>
<ide>
<ide> flags.DEFINE_string('vocab_file', None,
<ide> def main(_):
<ide> all_reduce_alg=FLAGS.all_reduce_alg,
<ide> tpu_address=FLAGS.tpu)
<ide> if FLAGS.mode in ('train', 'train_and_predict'):
<del> train_squad(strategy, input_meta_data, run_eagerly=FLAGS.run_eagerly)
<add> if FLAGS.log_steps:
<add> custom_callbacks = [keras_utils.TimeHistory(
<add> batch_size=FLAGS.train_batch_size,
<add> log_steps=FLAGS.log_steps,
<add> logdir=FLAGS.model_dir,
<add> )]
<add> else:
<add> custom_callbacks = None
<add>
<add> train_squad(
<add> strategy,
<add> input_meta_data,
<add> custom_callbacks=custom_callbacks,
<add> run_eagerly=FLAGS.run_eagerly,
<add> )
<ide> if FLAGS.mode in ('predict', 'train_and_predict'):
<ide> predict_squad(strategy, input_meta_data)
<ide>
<ide><path>official/utils/flags/_benchmark.py
<ide> from official.utils.flags._conventions import help_wrap
<ide>
<ide>
<add>def define_log_steps():
<add> flags.DEFINE_integer(
<add> name="log_steps", default=100,
<add> help="Frequency with which to log timing information with TimeHistory.")
<add>
<add> return []
<add>
<add>
<ide> def define_benchmark(benchmark_log_dir=True, bigquery_uploader=True):
<ide> """Register benchmarking flags.
<ide>
<ide> def define_benchmark(benchmark_log_dir=True, bigquery_uploader=True):
<ide> "human consumption, and does not have any impact within "
<ide> "the system."))
<ide>
<del> flags.DEFINE_integer(
<del> name='log_steps', default=100,
<del> help='For every log_steps, we log the timing information such as '
<del> 'examples per second. Besides, for every log_steps, we store the '
<del> 'timestamp of a batch end.')
<add> define_log_steps()
<ide>
<ide> if benchmark_log_dir:
<ide> flags.DEFINE_string(
<ide><path>official/utils/flags/core.py
<ide> def core_fn(*args, **kwargs):
<ide> # We have define_base_eager for compatibility, since it used to be a separate
<ide> # function from define_base.
<ide> define_base_eager = define_base
<add>define_log_steps = register_key_flags_in_core(_benchmark.define_log_steps)
<ide> define_benchmark = register_key_flags_in_core(_benchmark.define_benchmark)
<ide> define_device = register_key_flags_in_core(_device.define_device)
<ide> define_image = register_key_flags_in_core(_misc.define_image)
<ide><path>official/utils/misc/keras_utils.py
<ide> import time
<ide>
<ide> from absl import logging
<del>import tensorflow as tf
<del>from tensorflow.core.protobuf import rewriter_config_pb2
<add>import tensorflow.compat.v2 as tf
<ide> from tensorflow.python import tf2
<ide> from tensorflow.python.profiler import profiler_v2 as profiler
<ide>
<ide> def on_batch_end(self, batch, logs=None):
<ide>
<ide> self.timestamp_log.append(BatchTimestamp(self.global_steps, now))
<ide> logging.info(
<del> "TimeHistory: %.2f examples/second between steps %d and %d",
<add> 'TimeHistory: %.2f examples/second between steps %d and %d',
<ide> examples_per_second, self.last_log_step, self.global_steps)
<ide>
<ide> if self.summary_writer:
<ide> def set_session_config(enable_eager=False,
<ide> if enable_eager:
<ide> tf.compat.v1.enable_eager_execution(config=config)
<ide> else:
<del> sess = tf.Session(config=config)
<del> tf.keras.backend.set_session(sess)
<add> sess = tf.compat.v1.Session(config=config)
<add> tf.compat.v1.keras.backend.set_session(sess)
<ide>
<ide>
<ide> def get_config_proto_v1(enable_xla=False): | 7 |
Python | Python | respect compiler customizations | fecbb3d3500d6a6b7503eaec2de05c998798ef9c | <ide><path>numpy/distutils/tests/test_system_info.py
<ide> def have_compiler():
<ide> """ Return True if there appears to be an executable compiler
<ide> """
<ide> compiler = ccompiler.new_compiler()
<add> compiler.customize(None)
<ide> try:
<ide> cmd = compiler.compiler # Unix compilers
<ide> except AttributeError:
<ide> def test_temp2(self):
<ide> def test_compile1(self):
<ide> # Compile source and link the first source
<ide> c = ccompiler.new_compiler()
<add> c.customize(None)
<ide> previousDir = os.getcwd()
<ide> try:
<ide> # Change directory to not screw up directories
<ide> def test_compile2(self):
<ide> # Compile source and link the second source
<ide> tsi = self.c_temp2
<ide> c = ccompiler.new_compiler()
<add> c.customize(None)
<ide> extra_link_args = tsi.calc_extra_info()['extra_link_args']
<ide> previousDir = os.getcwd()
<ide> try: | 1 |
Python | Python | fix str vs bytes and int issues in ma tests | 2a7e659e45db24162bdc49b1a83b7c69796c6be2 | <ide><path>numpy/ma/tests/test_core.py
<ide> def test_basic_ufuncs (self):
<ide> def test_count_func (self):
<ide> "Tests count"
<ide> ott = array([0., 1., 2., 3.], mask=[1, 0, 0, 0])
<del> self.assertTrue(isinstance(count(ott), int))
<add> if sys.version_info[0] >= 3:
<add> self.assertTrue(isinstance(count(ott), np.integer))
<add> else:
<add> self.assertTrue(isinstance(count(ott), int))
<ide> assert_equal(3, count(ott))
<ide> assert_equal(1, count(1))
<ide> assert_equal(0, array(1, mask=[1]))
<ide> ott = ott.reshape((2, 2))
<ide> assert isinstance(count(ott, 0), ndarray)
<del> assert isinstance(count(ott), types.IntType)
<add> if sys.version_info[0] >= 3:
<add> assert isinstance(count(ott), np.integer)
<add> else:
<add> assert isinstance(count(ott), types.IntType)
<ide> assert_equal(3, count(ott))
<ide> assert getmask(count(ott, 0)) is nomask
<ide> assert_equal([1, 2], count(ott, 0))
<ide><path>numpy/ma/tests/test_mrecords.py
<ide> def setup(self):
<ide> "Generic setup"
<ide> ilist = [1,2,3,4,5]
<ide> flist = [1.1,2.2,3.3,4.4,5.5]
<del> slist = ['one','two','three','four','five']
<add> slist = asbytes_nested(['one','two','three','four','five'])
<ide> ddtype = [('a',int),('b',float),('c','|S8')]
<ide> mask = [0,1,0,0,1]
<del> if sys.version_info[0] >= 3:
<del> slist = list(map(asbytes, slist))
<ide> self.base = ma.array(list(zip(ilist,flist,slist)),
<ide> mask=mask, dtype=ddtype)
<ide>
<ide> def test_set_fields(self):
<ide> assert_equal(mbase.c.mask, [1]*5)
<ide> assert_equal(mbase.c.recordmask, [1]*5)
<ide> assert_equal(ma.getmaskarray(mbase['c']), [1]*5)
<del> assert_equal(ma.getdata(mbase['c']), ['N/A']*5)
<add> assert_equal(ma.getdata(mbase['c']), [asbytes('N/A')]*5)
<ide> assert_equal(mbase._mask.tolist(),
<ide> np.array([(0,0,1),(0,1,1),(0,0,1),(0,0,1),(0,1,1)],
<ide> dtype=bool))
<ide><path>numpy/ma/tests/test_old_ma.py
<ide> def test_testUfuncs1 (self):
<ide> def test_xtestCount (self):
<ide> "Test count"
<ide> ott = array([0.,1.,2.,3.], mask=[1,0,0,0])
<del> self.assertTrue( isinstance(count(ott), types.IntType))
<add> if sys.version_info[0] >= 3:
<add> self.assertTrue( isinstance(count(ott), numpy.integer))
<add> else:
<add> self.assertTrue( isinstance(count(ott), types.IntType))
<ide> self.assertEqual(3, count(ott))
<ide> self.assertEqual(1, count(1))
<ide> self.assertTrue (eq(0, array(1,mask=[1])))
<ide> ott=ott.reshape((2,2))
<ide> assert isinstance(count(ott,0),numpy.ndarray)
<del> assert isinstance(count(ott), types.IntType)
<add> if sys.version_info[0] >= 3:
<add> assert isinstance(count(ott), numpy.integer)
<add> else:
<add> assert isinstance(count(ott), types.IntType)
<ide> self.assertTrue (eq(3, count(ott)))
<ide> assert getmask(count(ott,0)) is nomask
<ide> self.assertTrue (eq([1,2],count(ott,0))) | 3 |
Text | Text | add version_scheme documentation | cc2a90ec8d4dc8b26a7658f4424cf6949d0bf4cb | <ide><path>share/doc/homebrew/Formula-Cookbook.md
<ide> Where a dependent of a formula fails against a new version of that dependency it
<ide>
<ide> [`revision`](http://www.rubydoc.info/github/Homebrew/brew/master/Formula#revision%3D-class_method)s are also used for formulae that move from the system OpenSSL to the Homebrew-shipped OpenSSL without any other changes to that formula. This ensures users aren’t left exposed to the potential security issues of the outdated OpenSSL. An example of this can be seen in [this commit](https://github.com/Homebrew/homebrew/commit/6b9d60d474d72b1848304297d91adc6120ea6f96).
<ide>
<add>## Version Scheme Changes
<add>
<add>Sometimes formulae have version schemes that change such that a direct comparison between two versions no longer produces the correct result. For example, a project might be version `13` and then decide to become `1.0.0`. As `13` is translated to `13.0.0` by our versioning system by default this requires intervention.
<add>
<add>Where a version scheme of a formula fails to recognise a new version as newer it must receive a [`version_scheme`](http://www.rubydoc.info/github/Homebrew/brew/master/Formula#version_scheme%3D-class_method). An example of this can be seen [here](https://github.com/Homebrew/homebrew-core/pull/4006).
<add>
<ide> ## Double-check for dependencies
<ide>
<ide> When you already have a lot of formulae installed, it's easy to miss a common dependency. You can double-check which libraries a binary links to with the `otool` command (perhaps you need to use `xcrun otool`): | 1 |
PHP | PHP | length() back in | d71bd108b29637d259f816f461eb122a1bd8a13f | <ide><path>src/Illuminate/Support/Str.php
<ide> public static function limit($value, $limit = 100, $end = '...')
<ide> return mb_substr($value, 0, $limit, 'UTF-8').$end;
<ide> }
<ide>
<add> /**
<add> * Return the length of the given string.
<add> *
<add> * @param string $value
<add> * @return string
<add> */
<add> public static function length($value)
<add> {
<add> return mb_strlen($value);
<add> }
<add>
<ide> /**
<ide> * Convert the given string to lower-case.
<ide> * | 1 |
Ruby | Ruby | fix typos in ar. lots of them | 55686dd1be37d6eb57dc086e4bc72922f49b2e17 | <ide><path>activerecord/lib/active_record/attribute_methods.rb
<ide> module ClassMethods
<ide> # Generates all the attribute related methods for columns in the database
<ide> # accessors, mutators and query methods.
<ide> def define_attribute_methods # :nodoc:
<del> # Use a mutex; we don't want two thread simaltaneously trying to define
<add> # Use a mutex; we don't want two thread simultaneously trying to define
<ide> # attribute methods.
<ide> @attribute_methods_mutex.synchronize do
<ide> return if attribute_methods_generated?
<ide><path>activerecord/lib/active_record/connection_adapters/abstract/database_limits.rb
<ide> def table_name_length
<ide> # limit is enforced by rails and Is less than or equal to
<ide> # <tt>index_name_length</tt>. The gap between
<ide> # <tt>index_name_length</tt> is to allow internal rails
<del> # opreations to use prefixes in temporary opreations.
<add> # operations to use prefixes in temporary operations.
<ide> def allowed_index_name_length
<ide> index_name_length
<ide> end
<ide><path>activerecord/lib/active_record/connection_adapters/abstract/transaction.rb
<ide> def initialize(connection, parent, options = {})
<ide> @joinable = options.fetch(:joinable, true)
<ide> end
<ide>
<del> # This state is necesarry so that we correctly handle stuff that might
<add> # This state is necessary so that we correctly handle stuff that might
<ide> # happen in a commit/rollback. But it's kinda distasteful. Maybe we can
<ide> # find a better way to structure it in the future.
<ide> def finishing?
<ide><path>activerecord/lib/active_record/explain.rb
<ide> def collecting_queries_for_explain # :nodoc:
<ide> yield
<ide> return current[:available_queries_for_explain]
<ide> ensure
<del> # Note that the return value above does not depend on this assigment.
<add> # Note that the return value above does not depend on this assignment.
<ide> current[:available_queries_for_explain] = original
<ide> end
<ide>
<ide><path>activerecord/test/cases/associations/eager_test.rb
<ide> def test_finding_with_includes_on_has_many_association_with_same_include_include
<ide> end
<ide> end
<ide>
<del> def test_finding_with_includes_on_has_one_assocation_with_same_include_includes_only_once
<add> def test_finding_with_includes_on_has_one_association_with_same_include_includes_only_once
<ide> author = authors(:david)
<ide> post = author.post_about_thinking_with_last_comment
<ide> last_comment = post.last_comment
<ide><path>activerecord/test/cases/attribute_methods_test.rb
<ide> def test_array_content
<ide> end
<ide>
<ide> def test_read_attributes_before_type_cast
<del> category = Category.new({:name=>"Test categoty", :type => nil})
<del> category_attrs = {"name"=>"Test categoty", "id" => nil, "type" => nil, "categorizations_count" => nil}
<add> category = Category.new({:name=>"Test category", :type => nil})
<add> category_attrs = {"name"=>"Test category", "id" => nil, "type" => nil, "categorizations_count" => nil}
<ide> assert_equal category_attrs , category.attributes_before_type_cast
<ide> end
<ide>
<ide><path>activerecord/test/cases/batches_test.rb
<ide> def setup
<ide> Post.count('id') # preheat arel's table cache
<ide> end
<ide>
<del> def test_each_should_excecute_one_query_per_batch
<add> def test_each_should_execute_one_query_per_batch
<ide> assert_queries(Post.count + 1) do
<ide> Post.find_each(:batch_size => 1) do |post|
<ide> assert_kind_of Post, post
<ide><path>activerecord/test/cases/dup_test.rb
<ide> def test_dup_after_initialize_callbacks
<ide> def test_dup_validity_is_independent
<ide> repair_validations(Topic) do
<ide> Topic.validates_presence_of :title
<del> topic = Topic.new("title" => "Litterature")
<add> topic = Topic.new("title" => "Literature")
<ide> topic.valid?
<ide>
<ide> duped = topic.dup
<ide><path>activerecord/test/cases/nested_attributes_test.rb
<ide> def test_reject_if_with_blank_nested_attributes_id
<ide> def test_first_and_array_index_zero_methods_return_the_same_value_when_nested_attributes_are_set_to_update_existing_record
<ide> Man.accepts_nested_attributes_for(:interests)
<ide> man = Man.create(:name => "John")
<del> interest = man.interests.create :topic => 'gardning'
<add> interest = man.interests.create :topic => 'gardening'
<ide> man = Man.find man.id
<ide> man.interests_attributes = [{:id => interest.id, :topic => 'gardening'}]
<ide> assert_equal man.interests.first.topic, man.interests[0].topic | 9 |
Java | Java | fix error message in nativeviewhierarchy | 117dcd9c58a0c9de6f63097d42b67f097cfc7dcf | <ide><path>ReactAndroid/src/main/java/com/facebook/react/uimanager/NativeViewHierarchyManager.java
<ide> public synchronized void updateLayout(
<ide> parentViewGroupManager = (ViewGroupManager) parentViewManager;
<ide> } else {
<ide> throw new IllegalViewOperationException(
<del> "Trying to use view with tag " + tag +
<add> "Trying to use view with tag " + parentTag +
<ide> " as a parent, but its Manager doesn't extends ViewGroupManager");
<ide> }
<ide> if (parentViewGroupManager != null | 1 |
Go | Go | use consistent command description | 2b0927c9ac23b66d8a05761801d1c5f882ba8dfb | <ide><path>api/client/attach.go
<ide> import (
<ide> //
<ide> // Usage: docker attach [OPTIONS] CONTAINER
<ide> func (cli *DockerCli) CmdAttach(args ...string) error {
<del> cmd := Cli.Subcmd("attach", []string{"CONTAINER"}, "Attach to a running container", true)
<add> cmd := Cli.Subcmd("attach", []string{"CONTAINER"}, Cli.DockerCommands["attach"].Description, true)
<ide> noStdin := cmd.Bool([]string{"#nostdin", "-no-stdin"}, false, "Do not attach STDIN")
<ide> proxy := cmd.Bool([]string{"#sig-proxy", "-sig-proxy"}, true, "Proxy all received signals to the process")
<ide>
<ide><path>api/client/build.go
<ide> const (
<ide> //
<ide> // Usage: docker build [OPTIONS] PATH | URL | -
<ide> func (cli *DockerCli) CmdBuild(args ...string) error {
<del> cmd := Cli.Subcmd("build", []string{"PATH | URL | -"}, "Build a new image from the source code at PATH", true)
<add> cmd := Cli.Subcmd("build", []string{"PATH | URL | -"}, Cli.DockerCommands["build"].Description, true)
<ide> tag := cmd.String([]string{"t", "-tag"}, "", "Repository name (and optionally a tag) for the image")
<ide> suppressOutput := cmd.Bool([]string{"q", "-quiet"}, false, "Suppress the verbose output generated by the containers")
<ide> noCache := cmd.Bool([]string{"#no-cache", "-no-cache"}, false, "Do not use cache when building the image")
<ide><path>api/client/commit.go
<ide> import (
<ide> //
<ide> // Usage: docker commit [OPTIONS] CONTAINER [REPOSITORY[:TAG]]
<ide> func (cli *DockerCli) CmdCommit(args ...string) error {
<del> cmd := Cli.Subcmd("commit", []string{"CONTAINER [REPOSITORY[:TAG]]"}, "Create a new image from a container's changes", true)
<add> cmd := Cli.Subcmd("commit", []string{"CONTAINER [REPOSITORY[:TAG]]"}, Cli.DockerCommands["commit"].Description, true)
<ide> flPause := cmd.Bool([]string{"p", "-pause"}, true, "Pause container during commit")
<ide> flComment := cmd.String([]string{"m", "-message"}, "", "Commit message")
<ide> flAuthor := cmd.String([]string{"a", "#author", "-author"}, "", "Author (e.g., \"John Hannibal Smith <[email protected]>\")")
<ide><path>api/client/cp.go
<ide> func (cli *DockerCli) CmdCp(args ...string) error {
<ide> "cp",
<ide> []string{"CONTAINER:PATH LOCALPATH|-", "LOCALPATH|- CONTAINER:PATH"},
<ide> strings.Join([]string{
<del> "Copy files/folders between a container and your host.\n",
<del> "Use '-' as the source to read a tar archive from stdin\n",
<add> Cli.DockerCommands["cp"].Description,
<add> "\nUse '-' as the source to read a tar archive from stdin\n",
<ide> "and extract it to a directory destination in a container.\n",
<ide> "Use '-' as the destination to stream a tar archive of a\n",
<ide> "container source to stdout.",
<ide><path>api/client/create.go
<ide> func (cli *DockerCli) createContainer(config *runconfig.Config, hostConfig *runc
<ide> //
<ide> // Usage: docker create [OPTIONS] IMAGE [COMMAND] [ARG...]
<ide> func (cli *DockerCli) CmdCreate(args ...string) error {
<del> cmd := Cli.Subcmd("create", []string{"IMAGE [COMMAND] [ARG...]"}, "Create a new container", true)
<add> cmd := Cli.Subcmd("create", []string{"IMAGE [COMMAND] [ARG...]"}, Cli.DockerCommands["create"].Description, true)
<ide> addTrustedFlags(cmd, true)
<ide>
<ide> // These are flags not stored in Config/HostConfig
<ide><path>api/client/diff.go
<ide> import (
<ide> //
<ide> // Usage: docker diff CONTAINER
<ide> func (cli *DockerCli) CmdDiff(args ...string) error {
<del> cmd := Cli.Subcmd("diff", []string{"CONTAINER"}, "Inspect changes on a container's filesystem", true)
<add> cmd := Cli.Subcmd("diff", []string{"CONTAINER"}, Cli.DockerCommands["diff"].Description, true)
<ide> cmd.Require(flag.Exact, 1)
<ide>
<ide> cmd.ParseFlags(args, true)
<ide><path>api/client/events.go
<ide> import (
<ide> //
<ide> // Usage: docker events [OPTIONS]
<ide> func (cli *DockerCli) CmdEvents(args ...string) error {
<del> cmd := Cli.Subcmd("events", nil, "Get real time events from the server", true)
<add> cmd := Cli.Subcmd("events", nil, Cli.DockerCommands["events"].Description, true)
<ide> since := cmd.String([]string{"#since", "-since"}, "", "Show all events created since timestamp")
<ide> until := cmd.String([]string{"-until"}, "", "Stream events until this timestamp")
<ide> flFilter := opts.NewListOpts(nil)
<ide><path>api/client/exec.go
<ide> import (
<ide> //
<ide> // Usage: docker exec [OPTIONS] CONTAINER COMMAND [ARG...]
<ide> func (cli *DockerCli) CmdExec(args ...string) error {
<del> cmd := Cli.Subcmd("exec", []string{"CONTAINER COMMAND [ARG...]"}, "Run a command in a running container", true)
<add> cmd := Cli.Subcmd("exec", []string{"CONTAINER COMMAND [ARG...]"}, Cli.DockerCommands["exec"].Description, true)
<ide>
<ide> execConfig, err := runconfig.ParseExec(cmd, args)
<ide> // just in case the ParseExec does not exit
<ide><path>api/client/export.go
<ide> import (
<ide> //
<ide> // Usage: docker export [OPTIONS] CONTAINER
<ide> func (cli *DockerCli) CmdExport(args ...string) error {
<del> cmd := Cli.Subcmd("export", []string{"CONTAINER"}, "Export the contents of a container's filesystem as a tar archive", true)
<add> cmd := Cli.Subcmd("export", []string{"CONTAINER"}, Cli.DockerCommands["export"].Description, true)
<ide> outfile := cmd.String([]string{"o", "-output"}, "", "Write to a file, instead of STDOUT")
<ide> cmd.Require(flag.Exact, 1)
<ide>
<ide><path>api/client/history.go
<ide> import (
<ide> //
<ide> // Usage: docker history [OPTIONS] IMAGE
<ide> func (cli *DockerCli) CmdHistory(args ...string) error {
<del> cmd := Cli.Subcmd("history", []string{"IMAGE"}, "Show the history of an image", true)
<add> cmd := Cli.Subcmd("history", []string{"IMAGE"}, Cli.DockerCommands["history"].Description, true)
<ide> human := cmd.Bool([]string{"H", "-human"}, true, "Print sizes and dates in human readable format")
<ide> quiet := cmd.Bool([]string{"q", "-quiet"}, false, "Only show numeric IDs")
<ide> noTrunc := cmd.Bool([]string{"#notrunc", "-no-trunc"}, false, "Don't truncate output")
<ide><path>api/client/images.go
<ide> import (
<ide> //
<ide> // Usage: docker images [OPTIONS] [REPOSITORY]
<ide> func (cli *DockerCli) CmdImages(args ...string) error {
<del> cmd := Cli.Subcmd("images", []string{"[REPOSITORY[:TAG]]"}, "List images", true)
<add> cmd := Cli.Subcmd("images", []string{"[REPOSITORY[:TAG]]"}, Cli.DockerCommands["images"].Description, true)
<ide> quiet := cmd.Bool([]string{"q", "-quiet"}, false, "Only show numeric IDs")
<ide> all := cmd.Bool([]string{"a", "-all"}, false, "Show all images (default hides intermediate images)")
<ide> noTrunc := cmd.Bool([]string{"#notrunc", "-no-trunc"}, false, "Don't truncate output")
<ide><path>api/client/import.go
<ide> import (
<ide> //
<ide> // Usage: docker import [OPTIONS] file|URL|- [REPOSITORY[:TAG]]
<ide> func (cli *DockerCli) CmdImport(args ...string) error {
<del> cmd := Cli.Subcmd("import", []string{"file|URL|- [REPOSITORY[:TAG]]"}, "Create an empty filesystem image and import the contents of the\ntarball (.tar, .tar.gz, .tgz, .bzip, .tar.xz, .txz) into it, then\noptionally tag it.", true)
<add> cmd := Cli.Subcmd("import", []string{"file|URL|- [REPOSITORY[:TAG]]"}, Cli.DockerCommands["import"].Description, true)
<ide> flChanges := opts.NewListOpts(nil)
<ide> cmd.Var(&flChanges, []string{"c", "-change"}, "Apply Dockerfile instruction to the created image")
<ide> message := cmd.String([]string{"m", "-message"}, "", "Set commit message for imported image")
<ide><path>api/client/info.go
<ide> import (
<ide> //
<ide> // Usage: docker info
<ide> func (cli *DockerCli) CmdInfo(args ...string) error {
<del> cmd := Cli.Subcmd("info", nil, "Display system-wide information", true)
<add> cmd := Cli.Subcmd("info", nil, Cli.DockerCommands["info"].Description, true)
<ide> cmd.Require(flag.Exact, 0)
<ide>
<ide> cmd.ParseFlags(args, true)
<ide><path>api/client/inspect.go
<ide> var funcMap = template.FuncMap{
<ide> //
<ide> // Usage: docker inspect [OPTIONS] CONTAINER|IMAGE [CONTAINER|IMAGE...]
<ide> func (cli *DockerCli) CmdInspect(args ...string) error {
<del> cmd := Cli.Subcmd("inspect", []string{"CONTAINER|IMAGE [CONTAINER|IMAGE...]"}, "Return low-level information on a container or image", true)
<add> cmd := Cli.Subcmd("inspect", []string{"CONTAINER|IMAGE [CONTAINER|IMAGE...]"}, Cli.DockerCommands["inspect"].Description, true)
<ide> tmplStr := cmd.String([]string{"f", "#format", "-format"}, "", "Format the output using the given go template")
<ide> inspectType := cmd.String([]string{"-type"}, "", "Return JSON for specified type, (e.g image or container)")
<ide> cmd.Require(flag.Min, 1)
<ide><path>api/client/kill.go
<ide> import (
<ide> //
<ide> // Usage: docker kill [OPTIONS] CONTAINER [CONTAINER...]
<ide> func (cli *DockerCli) CmdKill(args ...string) error {
<del> cmd := Cli.Subcmd("kill", []string{"CONTAINER [CONTAINER...]"}, "Kill a running container using SIGKILL or a specified signal", true)
<add> cmd := Cli.Subcmd("kill", []string{"CONTAINER [CONTAINER...]"}, Cli.DockerCommands["kill"].Description, true)
<ide> signal := cmd.String([]string{"s", "-signal"}, "KILL", "Signal to send to the container")
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide><path>api/client/load.go
<ide> import (
<ide> //
<ide> // Usage: docker load [OPTIONS]
<ide> func (cli *DockerCli) CmdLoad(args ...string) error {
<del> cmd := Cli.Subcmd("load", nil, "Load an image from a tar archive or STDIN", true)
<add> cmd := Cli.Subcmd("load", nil, Cli.DockerCommands["load"].Description, true)
<ide> infile := cmd.String([]string{"i", "-input"}, "", "Read from a tar archive file, instead of STDIN")
<ide> cmd.Require(flag.Exact, 0)
<ide>
<ide><path>api/client/login.go
<ide> import (
<ide> //
<ide> // Usage: docker login SERVER
<ide> func (cli *DockerCli) CmdLogin(args ...string) error {
<del> cmd := Cli.Subcmd("login", []string{"[SERVER]"}, "Register or log in to a Docker registry server, if no server is\nspecified \""+registry.IndexServer+"\" is the default.", true)
<add> cmd := Cli.Subcmd("login", []string{"[SERVER]"}, Cli.DockerCommands["login"].Description+".\nIf no server is specified \""+registry.IndexServer+"\" is the default.", true)
<ide> cmd.Require(flag.Max, 1)
<ide>
<ide> var username, password, email string
<ide><path>api/client/logout.go
<ide> import (
<ide> //
<ide> // Usage: docker logout [SERVER]
<ide> func (cli *DockerCli) CmdLogout(args ...string) error {
<del> cmd := Cli.Subcmd("logout", []string{"[SERVER]"}, "Log out from a Docker registry, if no server is\nspecified \""+registry.IndexServer+"\" is the default.", true)
<add> cmd := Cli.Subcmd("logout", []string{"[SERVER]"}, Cli.DockerCommands["logout"].Description+".\nIf no server is specified \""+registry.IndexServer+"\" is the default.", true)
<ide> cmd.Require(flag.Max, 1)
<ide>
<ide> cmd.ParseFlags(args, true)
<ide><path>api/client/logs.go
<ide> import (
<ide> //
<ide> // docker logs [OPTIONS] CONTAINER
<ide> func (cli *DockerCli) CmdLogs(args ...string) error {
<del> cmd := Cli.Subcmd("logs", []string{"CONTAINER"}, "Fetch the logs of a container", true)
<add> cmd := Cli.Subcmd("logs", []string{"CONTAINER"}, Cli.DockerCommands["logs"].Description, true)
<ide> follow := cmd.Bool([]string{"f", "-follow"}, false, "Follow log output")
<ide> since := cmd.String([]string{"-since"}, "", "Show logs since timestamp")
<ide> times := cmd.Bool([]string{"t", "-timestamps"}, false, "Show timestamps")
<ide><path>api/client/pause.go
<ide> import (
<ide> //
<ide> // Usage: docker pause CONTAINER [CONTAINER...]
<ide> func (cli *DockerCli) CmdPause(args ...string) error {
<del> cmd := Cli.Subcmd("pause", []string{"CONTAINER [CONTAINER...]"}, "Pause all processes within a container", true)
<add> cmd := Cli.Subcmd("pause", []string{"CONTAINER [CONTAINER...]"}, Cli.DockerCommands["pause"].Description, true)
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide> cmd.ParseFlags(args, true)
<ide><path>api/client/port.go
<ide> import (
<ide> //
<ide> // Usage: docker port CONTAINER [PRIVATE_PORT[/PROTO]]
<ide> func (cli *DockerCli) CmdPort(args ...string) error {
<del> cmd := Cli.Subcmd("port", []string{"CONTAINER [PRIVATE_PORT[/PROTO]]"}, "List port mappings for the CONTAINER, or lookup the public-facing port that\nis NAT-ed to the PRIVATE_PORT", true)
<add> cmd := Cli.Subcmd("port", []string{"CONTAINER [PRIVATE_PORT[/PROTO]]"}, Cli.DockerCommands["port"].Description, true)
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide> cmd.ParseFlags(args, true)
<ide><path>api/client/ps.go
<ide> func (cli *DockerCli) CmdPs(args ...string) error {
<ide> psFilterArgs = filters.Args{}
<ide> v = url.Values{}
<ide>
<del> cmd = Cli.Subcmd("ps", nil, "List containers", true)
<add> cmd = Cli.Subcmd("ps", nil, Cli.DockerCommands["ps"].Description, true)
<ide> quiet = cmd.Bool([]string{"q", "-quiet"}, false, "Only display numeric IDs")
<ide> size = cmd.Bool([]string{"s", "-size"}, false, "Display total file sizes")
<ide> all = cmd.Bool([]string{"a", "-all"}, false, "Show all containers (default shows just running)")
<ide><path>api/client/pull.go
<ide> import (
<ide> //
<ide> // Usage: docker pull [OPTIONS] IMAGENAME[:TAG|@DIGEST]
<ide> func (cli *DockerCli) CmdPull(args ...string) error {
<del> cmd := Cli.Subcmd("pull", []string{"NAME[:TAG|@DIGEST]"}, "Pull an image or a repository from a registry", true)
<add> cmd := Cli.Subcmd("pull", []string{"NAME[:TAG|@DIGEST]"}, Cli.DockerCommands["pull"].Description, true)
<ide> allTags := cmd.Bool([]string{"a", "-all-tags"}, false, "Download all tagged images in the repository")
<ide> addTrustedFlags(cmd, true)
<ide> cmd.Require(flag.Exact, 1)
<ide><path>api/client/push.go
<ide> import (
<ide> //
<ide> // Usage: docker push NAME[:TAG]
<ide> func (cli *DockerCli) CmdPush(args ...string) error {
<del> cmd := Cli.Subcmd("push", []string{"NAME[:TAG]"}, "Push an image or a repository to a registry", true)
<add> cmd := Cli.Subcmd("push", []string{"NAME[:TAG]"}, Cli.DockerCommands["push"].Description, true)
<ide> addTrustedFlags(cmd, false)
<ide> cmd.Require(flag.Exact, 1)
<ide>
<ide><path>api/client/rename.go
<ide> import (
<ide> //
<ide> // Usage: docker rename OLD_NAME NEW_NAME
<ide> func (cli *DockerCli) CmdRename(args ...string) error {
<del> cmd := Cli.Subcmd("rename", []string{"OLD_NAME NEW_NAME"}, "Rename a container", true)
<add> cmd := Cli.Subcmd("rename", []string{"OLD_NAME NEW_NAME"}, Cli.DockerCommands["rename"].Description, true)
<ide> cmd.Require(flag.Exact, 2)
<ide>
<ide> cmd.ParseFlags(args, true)
<ide><path>api/client/restart.go
<ide> import (
<ide> //
<ide> // Usage: docker restart [OPTIONS] CONTAINER [CONTAINER...]
<ide> func (cli *DockerCli) CmdRestart(args ...string) error {
<del> cmd := Cli.Subcmd("restart", []string{"CONTAINER [CONTAINER...]"}, "Restart a container", true)
<add> cmd := Cli.Subcmd("restart", []string{"CONTAINER [CONTAINER...]"}, Cli.DockerCommands["restart"].Description, true)
<ide> nSeconds := cmd.Int([]string{"t", "-time"}, 10, "Seconds to wait for stop before killing the container")
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide><path>api/client/rm.go
<ide> import (
<ide> //
<ide> // Usage: docker rm [OPTIONS] CONTAINER [CONTAINER...]
<ide> func (cli *DockerCli) CmdRm(args ...string) error {
<del> cmd := Cli.Subcmd("rm", []string{"CONTAINER [CONTAINER...]"}, "Remove one or more containers", true)
<add> cmd := Cli.Subcmd("rm", []string{"CONTAINER [CONTAINER...]"}, Cli.DockerCommands["rm"].Description, true)
<ide> v := cmd.Bool([]string{"v", "-volumes"}, false, "Remove the volumes associated with the container")
<ide> link := cmd.Bool([]string{"l", "#link", "-link"}, false, "Remove the specified link")
<ide> force := cmd.Bool([]string{"f", "-force"}, false, "Force the removal of a running container (uses SIGKILL)")
<ide><path>api/client/rmi.go
<ide> import (
<ide> //
<ide> // Usage: docker rmi [OPTIONS] IMAGE [IMAGE...]
<ide> func (cli *DockerCli) CmdRmi(args ...string) error {
<del> cmd := Cli.Subcmd("rmi", []string{"IMAGE [IMAGE...]"}, "Remove one or more images", true)
<add> cmd := Cli.Subcmd("rmi", []string{"IMAGE [IMAGE...]"}, Cli.DockerCommands["rmi"].Description, true)
<ide> force := cmd.Bool([]string{"f", "-force"}, false, "Force removal of the image")
<ide> noprune := cmd.Bool([]string{"-no-prune"}, false, "Do not delete untagged parents")
<ide> cmd.Require(flag.Min, 1)
<ide><path>api/client/run.go
<ide> func (cid *cidFile) Write(id string) error {
<ide> //
<ide> // Usage: docker run [OPTIONS] IMAGE [COMMAND] [ARG...]
<ide> func (cli *DockerCli) CmdRun(args ...string) error {
<del> cmd := Cli.Subcmd("run", []string{"IMAGE [COMMAND] [ARG...]"}, "Run a command in a new container", true)
<add> cmd := Cli.Subcmd("run", []string{"IMAGE [COMMAND] [ARG...]"}, Cli.DockerCommands["run"].Description, true)
<ide> addTrustedFlags(cmd, true)
<ide>
<ide> // These are flags not stored in Config/HostConfig
<ide><path>api/client/save.go
<ide> import (
<ide> //
<ide> // Usage: docker save [OPTIONS] IMAGE [IMAGE...]
<ide> func (cli *DockerCli) CmdSave(args ...string) error {
<del> cmd := Cli.Subcmd("save", []string{"IMAGE [IMAGE...]"}, "Save an image(s) to a tar archive (streamed to STDOUT by default)", true)
<add> cmd := Cli.Subcmd("save", []string{"IMAGE [IMAGE...]"}, Cli.DockerCommands["save"].Description+" (streamed to STDOUT by default)", true)
<ide> outfile := cmd.String([]string{"o", "-output"}, "", "Write to a file, instead of STDOUT")
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide><path>api/client/search.go
<ide> func (r ByStars) Less(i, j int) bool { return r[i].StarCount < r[j].StarCount }
<ide> //
<ide> // Usage: docker search [OPTIONS] TERM
<ide> func (cli *DockerCli) CmdSearch(args ...string) error {
<del> cmd := Cli.Subcmd("search", []string{"TERM"}, "Search the Docker Hub for images", true)
<add> cmd := Cli.Subcmd("search", []string{"TERM"}, Cli.DockerCommands["search"].Description, true)
<ide> noTrunc := cmd.Bool([]string{"#notrunc", "-no-trunc"}, false, "Don't truncate output")
<ide> trusted := cmd.Bool([]string{"#t", "#trusted", "#-trusted"}, false, "Only show trusted builds")
<ide> automated := cmd.Bool([]string{"-automated"}, false, "Only show automated builds")
<ide><path>api/client/start.go
<ide> func (cli *DockerCli) forwardAllSignals(cid string) chan os.Signal {
<ide> //
<ide> // Usage: docker start [OPTIONS] CONTAINER [CONTAINER...]
<ide> func (cli *DockerCli) CmdStart(args ...string) error {
<del> cmd := Cli.Subcmd("start", []string{"CONTAINER [CONTAINER...]"}, "Start one or more containers", true)
<add> cmd := Cli.Subcmd("start", []string{"CONTAINER [CONTAINER...]"}, Cli.DockerCommands["start"].Description, true)
<ide> attach := cmd.Bool([]string{"a", "-attach"}, false, "Attach STDOUT/STDERR and forward signals")
<ide> openStdin := cmd.Bool([]string{"i", "-interactive"}, false, "Attach container's STDIN")
<ide> cmd.Require(flag.Min, 1)
<ide><path>api/client/stats.go
<ide> func (s *containerStats) Display(w io.Writer) error {
<ide> //
<ide> // Usage: docker stats CONTAINER [CONTAINER...]
<ide> func (cli *DockerCli) CmdStats(args ...string) error {
<del> cmd := Cli.Subcmd("stats", []string{"CONTAINER [CONTAINER...]"}, "Display a live stream of one or more containers' resource usage statistics", true)
<add> cmd := Cli.Subcmd("stats", []string{"CONTAINER [CONTAINER...]"}, Cli.DockerCommands["stats"].Description, true)
<ide> noStream := cmd.Bool([]string{"-no-stream"}, false, "Disable streaming stats and only pull the first result")
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide><path>api/client/stop.go
<ide> import (
<ide> //
<ide> // Usage: docker stop [OPTIONS] CONTAINER [CONTAINER...]
<ide> func (cli *DockerCli) CmdStop(args ...string) error {
<del> cmd := Cli.Subcmd("stop", []string{"CONTAINER [CONTAINER...]"}, "Stop a container by sending SIGTERM and then SIGKILL after a\ngrace period", true)
<add> cmd := Cli.Subcmd("stop", []string{"CONTAINER [CONTAINER...]"}, Cli.DockerCommands["stop"].Description+".\nSending SIGTERM and then SIGKILL after a grace period", true)
<ide> nSeconds := cmd.Int([]string{"t", "-time"}, 10, "Seconds to wait for stop before killing it")
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide><path>api/client/tag.go
<ide> import (
<ide> //
<ide> // Usage: docker tag [OPTIONS] IMAGE[:TAG] [REGISTRYHOST/][USERNAME/]NAME[:TAG]
<ide> func (cli *DockerCli) CmdTag(args ...string) error {
<del> cmd := Cli.Subcmd("tag", []string{"IMAGE[:TAG] [REGISTRYHOST/][USERNAME/]NAME[:TAG]"}, "Tag an image into a repository", true)
<add> cmd := Cli.Subcmd("tag", []string{"IMAGE[:TAG] [REGISTRYHOST/][USERNAME/]NAME[:TAG]"}, Cli.DockerCommands["tag"].Description, true)
<ide> force := cmd.Bool([]string{"f", "#force", "-force"}, false, "Force")
<ide> cmd.Require(flag.Exact, 2)
<ide>
<ide><path>api/client/top.go
<ide> import (
<ide> //
<ide> // Usage: docker top CONTAINER
<ide> func (cli *DockerCli) CmdTop(args ...string) error {
<del> cmd := Cli.Subcmd("top", []string{"CONTAINER [ps OPTIONS]"}, "Display the running processes of a container", true)
<add> cmd := Cli.Subcmd("top", []string{"CONTAINER [ps OPTIONS]"}, Cli.DockerCommands["top"].Description, true)
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide> cmd.ParseFlags(args, true)
<ide><path>api/client/unpause.go
<ide> import (
<ide> //
<ide> // Usage: docker unpause CONTAINER [CONTAINER...]
<ide> func (cli *DockerCli) CmdUnpause(args ...string) error {
<del> cmd := Cli.Subcmd("unpause", []string{"CONTAINER [CONTAINER...]"}, "Unpause all processes within a container", true)
<add> cmd := Cli.Subcmd("unpause", []string{"CONTAINER [CONTAINER...]"}, Cli.DockerCommands["unpause"].Description, true)
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide> cmd.ParseFlags(args, true)
<ide><path>api/client/version.go
<ide> type versionData struct {
<ide> //
<ide> // Usage: docker version
<ide> func (cli *DockerCli) CmdVersion(args ...string) (err error) {
<del> cmd := Cli.Subcmd("version", nil, "Show the Docker version information.", true)
<add> cmd := Cli.Subcmd("version", nil, Cli.DockerCommands["version"].Description, true)
<ide> tmplStr := cmd.String([]string{"f", "#format", "-format"}, "", "Format the output using the given go template")
<ide> cmd.Require(flag.Exact, 0)
<ide>
<ide><path>api/client/volume.go
<ide> import (
<ide> //
<ide> // Usage: docker volume <COMMAND> <OPTS>
<ide> func (cli *DockerCli) CmdVolume(args ...string) error {
<del> description := "Manage Docker volumes\n\nCommands:\n"
<add> description := Cli.DockerCommands["volume"].Description + "\n\nCommands:\n"
<ide> commands := [][]string{
<ide> {"create", "Create a volume"},
<ide> {"inspect", "Return low-level information on a volume"},
<ide><path>api/client/wait.go
<ide> import (
<ide> //
<ide> // Usage: docker wait CONTAINER [CONTAINER...]
<ide> func (cli *DockerCli) CmdWait(args ...string) error {
<del> cmd := Cli.Subcmd("wait", []string{"CONTAINER [CONTAINER...]"}, "Block until a container stops, then print its exit code.", true)
<add> cmd := Cli.Subcmd("wait", []string{"CONTAINER [CONTAINER...]"}, Cli.DockerCommands["wait"].Description, true)
<ide> cmd.Require(flag.Min, 1)
<ide>
<ide> cmd.ParseFlags(args, true)
<ide><path>cli/common.go
<ide> type CommonFlags struct {
<ide> TLSOptions *tlsconfig.Options
<ide> TrustKey string
<ide> }
<add>
<add>// Command is the struct contains command name and description
<add>type Command struct {
<add> Name string
<add> Description string
<add>}
<add>
<add>var dockerCommands = []Command{
<add> {"attach", "Attach to a running container"},
<add> {"build", "Build an image from a Dockerfile"},
<add> {"commit", "Create a new image from a container's changes"},
<add> {"cp", "Copy files/folders between a container and the local filesystem"},
<add> {"create", "Create a new container"},
<add> {"diff", "Inspect changes on a container's filesystem"},
<add> {"events", "Get real time events from the server"},
<add> {"exec", "Run a command in a running container"},
<add> {"export", "Export a container's filesystem as a tar archive"},
<add> {"history", "Show the history of an image"},
<add> {"images", "List images"},
<add> {"import", "Import the contents from a tarball to create a filesystem image"},
<add> {"info", "Display system-wide information"},
<add> {"inspect", "Return low-level information on a container or image"},
<add> {"kill", "Kill a running container"},
<add> {"load", "Load an image from a tar archive or STDIN"},
<add> {"login", "Register or log in to a Docker registry"},
<add> {"logout", "Log out from a Docker registry"},
<add> {"logs", "Fetch the logs of a container"},
<add> {"pause", "Pause all processes within a container"},
<add> {"port", "List port mappings or a specific mapping for the CONTAINER"},
<add> {"ps", "List containers"},
<add> {"pull", "Pull an image or a repository from a registry"},
<add> {"push", "Push an image or a repository to a registry"},
<add> {"rename", "Rename a container"},
<add> {"restart", "Restart a container"},
<add> {"rm", "Remove one or more containers"},
<add> {"rmi", "Remove one or more images"},
<add> {"run", "Run a command in a new container"},
<add> {"save", "Save an image(s) to a tar archive"},
<add> {"search", "Search the Docker Hub for images"},
<add> {"start", "Start one or more stopped containers"},
<add> {"stats", "Display a live stream of container(s) resource usage statistics"},
<add> {"stop", "Stop a running container"},
<add> {"tag", "Tag an image into a repository"},
<add> {"top", "Display the running processes of a container"},
<add> {"unpause", "Unpause all processes within a container"},
<add> {"version", "Show the Docker version information"},
<add> {"volume", "Manage Docker volumes"},
<add> {"wait", "Block until a container stops, then print its exit code"},
<add>}
<add>
<add>// DockerCommands stores all the docker command
<add>var DockerCommands = make(map[string]Command)
<add>
<add>func init() {
<add> for _, cmd := range dockerCommands {
<add> DockerCommands[cmd.Name] = cmd
<add> }
<add>}
<ide><path>docker/docker.go
<ide> func main() {
<ide> help := "\nCommands:\n"
<ide>
<ide> for _, cmd := range dockerCommands {
<del> help += fmt.Sprintf(" %-10.10s%s\n", cmd.name, cmd.description)
<add> help += fmt.Sprintf(" %-10.10s%s\n", cmd.Name, cmd.Description)
<ide> }
<ide>
<ide> help += "\nRun 'docker COMMAND --help' for more information on a command."
<ide><path>docker/flags.go
<ide> package main
<ide>
<del>import flag "github.com/docker/docker/pkg/mflag"
<add>import (
<add> "sort"
<add>
<add> "github.com/docker/docker/cli"
<add> flag "github.com/docker/docker/pkg/mflag"
<add>)
<ide>
<ide> var (
<ide> flHelp = flag.Bool([]string{"h", "-help"}, false, "Print usage")
<ide> flVersion = flag.Bool([]string{"v", "-version"}, false, "Print version information and quit")
<ide> )
<ide>
<del>type command struct {
<del> name string
<del> description string
<del>}
<del>
<del>type byName []command
<add>type byName []cli.Command
<ide>
<ide> func (a byName) Len() int { return len(a) }
<ide> func (a byName) Swap(i, j int) { a[i], a[j] = a[j], a[i] }
<del>func (a byName) Less(i, j int) bool { return a[i].name < a[j].name }
<add>func (a byName) Less(i, j int) bool { return a[i].Name < a[j].Name }
<add>
<add>var dockerCommands []cli.Command
<ide>
<ide> // TODO(tiborvass): do not show 'daemon' on client-only binaries
<del>// and deduplicate description in dockerCommands and cli subcommands
<del>var dockerCommands = []command{
<del> {"attach", "Attach to a running container"},
<del> {"build", "Build an image from a Dockerfile"},
<del> {"commit", "Create a new image from a container's changes"},
<del> {"cp", "Copy files/folders between a container and the local filesystem"},
<del> {"create", "Create a new container"},
<del> {"diff", "Inspect changes on a container's filesystem"},
<del> {"events", "Get real time events from the server"},
<del> {"exec", "Run a command in a running container"},
<del> {"export", "Export a container's filesystem as a tar archive"},
<del> {"history", "Show the history of an image"},
<del> {"images", "List images"},
<del> {"import", "Import the contents from a tarball to create a filesystem image"},
<del> {"info", "Display system-wide information"},
<del> {"inspect", "Return low-level information on a container or image"},
<del> {"kill", "Kill a running container"},
<del> {"load", "Load an image from a tar archive or STDIN"},
<del> {"login", "Register or log in to a Docker registry"},
<del> {"logout", "Log out from a Docker registry"},
<del> {"logs", "Fetch the logs of a container"},
<del> {"pause", "Pause all processes within a container"},
<del> {"port", "List port mappings or a specific mapping for the CONTAINER"},
<del> {"ps", "List containers"},
<del> {"pull", "Pull an image or a repository from a registry"},
<del> {"push", "Push an image or a repository to a registry"},
<del> {"rename", "Rename a container"},
<del> {"restart", "Restart a container"},
<del> {"rm", "Remove one or more containers"},
<del> {"rmi", "Remove one or more images"},
<del> {"run", "Run a command in a new container"},
<del> {"save", "Save an image(s) to a tar archive"},
<del> {"search", "Search the Docker Hub for images"},
<del> {"start", "Start one or more containers"},
<del> {"stats", "Display a live stream of container(s) resource usage statistics"},
<del> {"stop", "Stop a container"},
<del> {"tag", "Tag an image into a repository"},
<del> {"top", "Display the running processes of a container"},
<del> {"unpause", "Unpause all processes within a container"},
<del> {"version", "Show the Docker version information"},
<del> {"volume", "Manage Docker volumes"},
<del> {"wait", "Block until a container stops, then print its exit code"},
<add>
<add>func init() {
<add> for _, cmd := range cli.DockerCommands {
<add> dockerCommands = append(dockerCommands, cmd)
<add> }
<add> sort.Sort(byName(dockerCommands))
<ide> }
<ide><path>docker/flags_experimental.go
<ide>
<ide> package main
<ide>
<del>import "sort"
<add>import (
<add> "sort"
<add>
<add> "github.com/docker/docker/cli"
<add>)
<ide>
<ide> func init() {
<del> dockerCommands = append(dockerCommands, command{"network", "Network management"})
<add> dockerCommands = append(dockerCommands, cli.Command{Name: "network", Description: "Network management"})
<ide>
<ide> //Sorting logic required here to pass Command Sort Test.
<ide> sort.Sort(byName(dockerCommands)) | 44 |
Javascript | Javascript | standardize test environment | ecbf93d922ae168ca915cc463ab859d457245333 | <ide><path>test/dsv/csv-test.js
<ide> var vows = require("vows"),
<del> d3 = require("../../"),
<ide> load = require("../load"),
<del> xhr = require("../env-xhr"),
<ide> assert = require("../env-assert");
<ide>
<ide> var suite = vows.describe("d3.csv");
<ide>
<ide> suite.addBatch({
<ide> "csv": {
<del> topic: load("dsv/csv")
<del> .expression("d3.csv")
<del> .sandbox({XMLHttpRequest: xhr, document: {}, window: {}}),
<add> topic: load("dsv/csv").expression("d3.csv").document(),
<ide>
<ide> "on a sample file": {
<ide> topic: function(csv) {
<ide> suite.addBatch({
<ide> assert.deepEqual(csv, [{"Hello":"42","World":"\"fish\""}]);
<ide> },
<ide> "overrides the mime type to text/csv": function(csv) {
<del> assert.equal(xhr._last._info.mimeType, "text/csv");
<add> assert.equal(XMLHttpRequest._last._info.mimeType, "text/csv");
<ide> }
<ide> },
<ide>
<ide><path>test/dsv/tsv-test.js
<ide> var vows = require("vows"),
<del> d3 = require("../../"),
<ide> load = require("../load"),
<del> xhr = require("../env-xhr"),
<ide> assert = require("../env-assert");
<ide>
<ide> var suite = vows.describe("d3.tsv");
<ide>
<ide> suite.addBatch({
<ide> "tsv": {
<del> topic: load("dsv/tsv")
<del> .expression("d3.tsv")
<del> .sandbox({XMLHttpRequest: xhr, document: {}, window: {}}),
<add> topic: load("dsv/tsv").expression("d3.tsv").document(),
<ide>
<ide> "on a sample file": {
<ide> topic: function(tsv) {
<ide> suite.addBatch({
<ide> assert.deepEqual(tsv, [{"Hello":42,"World":"\"fish\""}]);
<ide> },
<ide> "overrides the mime type to text/tab-separated-values": function(tsv) {
<del> assert.equal(xhr._last._info.mimeType, "text/tab-separated-values");
<add> assert.equal(XMLHttpRequest._last._info.mimeType, "text/tab-separated-values");
<ide> }
<ide> },
<ide>
<ide><path>test/env-xhr.js
<ide> var fs = require("fs");
<ide>
<del>module.exports = function XMLHttpRequest() {
<add>global.XMLHttpRequest = function XMLHttpRequest() {
<ide> var self = this,
<ide> info = self._info = {},
<ide> headers = {},
<ide><path>test/event/timer-test.js
<ide> var suite = vows.describe("d3.timer");
<ide>
<ide> suite.addBatch({
<ide> "timer": {
<del> topic: load("event/timer")
<del> .expression("d3.timer")
<del> .sandbox({document: {}, window: {}, setTimeout: setTimeout, clearTimeout: clearTimeout}),
<add> topic: load("event/timer").expression("d3.timer").document(),
<ide>
<ide> "with no delay": {
<ide> topic: delay(),
<ide><path>test/interpolate/array-test.js
<ide> var suite = vows.describe("d3.interpolateArray");
<ide>
<ide> suite.addBatch({
<ide> "interpolateArray": {
<del> topic: load("interpolate/array")
<del> .expression("d3.interpolateArray")
<del> .sandbox({document: null, window: null}),
<del>
<add> topic: load("interpolate/array").expression("d3.interpolateArray").document(),
<ide> "interpolates defined elements": function(interpolate) {
<ide> assert.deepEqual(interpolate([2, 12], [4, 24])(.5), [3, 18]);
<ide> },
<ide><path>test/interpolate/interpolate-test.js
<ide> var suite = vows.describe("d3.interpolate");
<ide>
<ide> suite.addBatch({
<ide> "interpolate": {
<del> topic: load("interpolate/interpolate")
<del> .sandbox({document: null, window: null}),
<del>
<add> topic: load("interpolate/interpolate").document(),
<ide> "interpolates numbers": function(d3) {
<ide> assert.equal(d3.interpolate(2, 12)(.4), 6);
<ide> assert.equal(d3.interpolate("2px", 12)(.4), 6);
<ide> suite.addBatch({
<ide> }
<ide> },
<ide> "interpolators": {
<del> topic: load("interpolate/interpolate")
<del> .sandbox({document: null, window: null}),
<del>
<add> topic: load("interpolate/interpolate").document(),
<ide> "can register a custom interpolator": function(d3) {
<ide> d3.interpolators.push(function(a, b) {
<ide> return a == "one" && b == "two" && d3.interpolateNumber(1, 2);
<ide><path>test/interpolate/object-test.js
<ide> var suite = vows.describe("d3.interpolateObject");
<ide>
<ide> suite.addBatch({
<ide> "interpolateObject": {
<del> topic: load("interpolate/object")
<del> .expression("d3.interpolateObject")
<del> .sandbox({document: null, window: null}),
<del>
<add> topic: load("interpolate/object").expression("d3.interpolateObject").document(),
<ide> "interpolates defined properties": function(interpolate) {
<ide> assert.deepEqual(interpolate({a: 2, b: 12}, {a: 4, b: 24})(.5), {a: 3, b: 18});
<ide> },
<ide><path>test/layout/force-test.js
<ide> var suite = vows.describe("d3.layout.force");
<ide>
<ide> suite.addBatch({
<ide> "force": {
<del> topic: load("layout/force")
<del> .expression("d3.layout.force")
<del> .sandbox({document: {}, window: {}, setTimeout: setTimeout, clearTimeout: clearTimeout}),
<add> topic: load("layout/force").expression("d3.layout.force").document(),
<ide>
<ide> "default instance": {
<ide> topic: function(force) {
<ide><path>test/load.js
<ide> var smash = require("smash"),
<del> jsdom = require("jsdom");
<add> jsdom = require("jsdom"),
<add> xhr = require("./env-xhr");
<ide>
<ide> module.exports = function() {
<ide> var files = [].slice.call(arguments).map(function(d) { return "src/" + d; }),
<ide> module.exports = function() {
<ide>
<ide> topic.document = function(_) {
<ide> var document = jsdom.jsdom("<html><head></head><body></body></html>");
<del> sandbox = {document: document, window: document.createWindow()};
<add>
<add> // Monkey-patch createRange support to JSDOM.
<add> document.createRange = function() {
<add> return {
<add> selectNode: function() {},
<add> createContextualFragment: jsdom.jsdom
<add> };
<add> };
<add>
<add> sandbox = {
<add> XMLHttpRequest: XMLHttpRequest,
<add> document: document,
<add> window: document.createWindow(),
<add> setTimeout: setTimeout,
<add> clearTimeout: clearTimeout
<add> };
<ide> return topic;
<ide> };
<ide>
<ide><path>test/scale/identity-test.js
<ide> var suite = vows.describe("d3.scale.identity");
<ide>
<ide> suite.addBatch({
<ide> "identity": {
<del> topic: load("scale/identity")
<del> .expression("d3.scale.identity")
<del> .sandbox({document: null, window: null}),
<add> topic: load("scale/identity").expression("d3.scale.identity").document(),
<ide>
<ide> "domain and range": {
<ide> "are identical": function(identity) {
<ide><path>test/scale/linear-test.js
<ide> var suite = vows.describe("d3.scale.linear");
<ide>
<ide> suite.addBatch({
<ide> "linear": {
<del> topic: load("scale/linear", "interpolate/hsl") // beware instanceof d3_Color
<del> .sandbox({document: null, window: null}),
<add> topic: load("scale/linear", "interpolate/hsl").document(), // beware instanceof d3_Color
<ide>
<ide> "domain": {
<ide> "defaults to [0, 1]": function(d3) {
<ide><path>test/scale/log-test.js
<ide> var suite = vows.describe("d3.scale.log");
<ide>
<ide> suite.addBatch({
<ide> "log": {
<del> topic: load("scale/log", "interpolate/hsl") // beware instanceof d3_Color
<del> .sandbox({document: null, window: null}),
<add> topic: load("scale/log", "interpolate/hsl").document(), // beware instanceof d3_Color
<ide>
<ide> "domain": {
<ide> "defaults to [1, 10]": function(d3) {
<ide><path>test/scale/pow-test.js
<ide> var suite = vows.describe("d3.scale.pow");
<ide>
<ide> suite.addBatch({
<ide> "pow": {
<del> topic: load("scale/pow", "interpolate/hsl") // beware instance of d3_Colorr
<del> .sandbox({document: null, window: null}),
<add> topic: load("scale/pow", "interpolate/hsl").document(), // beware instance of d3_Colorr
<ide>
<ide> "domain": {
<ide> "defaults to [0, 1]": function(d3) {
<ide><path>test/scale/sqrt-test.js
<ide> var suite = vows.describe("d3.scale.sqrt");
<ide>
<ide> suite.addBatch({
<ide> "sqrt": {
<del> topic: load("scale/sqrt", "interpolate/hsl") // beware instanceof d3_Color
<del> .sandbox({document: null, window: null}),
<add> topic: load("scale/sqrt", "interpolate/hsl").document(), // beware instanceof d3_Color
<ide>
<ide> "domain": {
<ide> "defaults to [0, 1]": function(d3) {
<ide><path>test/svg/axis-test.js
<ide> var vows = require("vows"),
<del> d3 = require("../../"),
<add> ordinal = require("../../").scale.ordinal,
<ide> load = require("../load"),
<del> assert = require("../env-assert"),
<del> document = d3.selection().node()._ownerDocument,
<del> window = document.defaultView;
<add> assert = require("../env-assert");
<ide>
<ide> var suite = vows.describe("d3.svg.axis");
<ide>
<ide> suite.addBatch({
<ide> "axis": {
<del> topic: load("svg/axis")
<del> .expression("d3.svg.axis")
<del> .sandbox({document: document, window: window}),
<add> topic: load("svg/axis").document(),
<ide>
<ide> "scale": {
<del> "defaults to a linear scale": function(axis) {
<del> var a = axis(), x = a.scale();
<add> "defaults to a linear scale": function(d3) {
<add> var a = d3.svg.axis(), x = a.scale();
<ide> assert.deepEqual(x.domain(), [0, 1]);
<ide> assert.deepEqual(x.range(), [0, 1]);
<ide> assert.equal(x(0.5), 0.5);
<ide> },
<del> "can be defined as a scale object": function(axis) {
<del> var x = d3.scale.linear(), a = axis().scale(x);
<add> "can be defined as a scale object": function(d3) {
<add> var x = d3.scale.linear(), a = d3.svg.axis().scale(x);
<ide> assert.equal(a.scale(), x);
<ide> },
<del> "can be a polylinear scale": function(axis) {
<del> var a = axis().scale(d3.scale.linear().domain([0, 1, 10]).range([2, 20, 200])),
<add> "can be a polylinear scale": function(d3) {
<add> var a = d3.svg.axis().scale(d3.scale.linear().domain([0, 1, 10]).range([2, 20, 200])),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> path = g.selectAll("path");
<ide> assert.equal(path.attr("d"), "M2,6V0H200V6");
<ide> },
<del> "can be an ordinal scale": function(axis) {
<del> var a = axis().scale(d3.scale.ordinal().domain(["A", "B", "C"]).rangeBands([10, 90])),
<add> "can be an ordinal scale": function(d3) {
<add> var a = d3.svg.axis().scale(ordinal().domain(["A", "B", "C"]).rangeBands([10, 90])),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> path = g.selectAll("path");
<ide> assert.equal(path.attr("d"), "M10,6V0H90V6");
<ide> },
<del> "can be an ordinal scale with explicit range": function(axis) {
<del> var a = axis().scale(d3.scale.ordinal().domain(["A", "B", "C"]).range([10, 50, 90])),
<add> "can be an ordinal scale with explicit range": function(d3) {
<add> var a = d3.svg.axis().scale(ordinal().domain(["A", "B", "C"]).range([10, 50, 90])),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> path = g.selectAll("path");
<ide> assert.equal(path.attr("d"), "M10,6V0H90V6");
<ide> }
<ide> },
<ide>
<ide> "orient": {
<del> "defaults to bottom": function(axis) {
<del> var a = axis();
<add> "defaults to bottom": function(d3) {
<add> var a = d3.svg.axis();
<ide> assert.equal(a.orient(), "bottom");
<ide> },
<del> "defaults to bottom when an invalid orientation is specified": function(axis) {
<del> var a = axis().orient("invalid");
<add> "defaults to bottom when an invalid orientation is specified": function(d3) {
<add> var a = d3.svg.axis().orient("invalid");
<ide> assert.equal(a.orient(), "bottom");
<ide> },
<del> "coerces to a string": function(axis) {
<del> var a = axis().orient({toString: function() { return "left"; }});
<add> "coerces to a string": function(d3) {
<add> var a = d3.svg.axis().orient({toString: function() { return "left"; }});
<ide> assert.equal(a.orient(), "left");
<ide> },
<del> "supports top orientation": function(axis) {
<del> var a = axis().orient("top"),
<add> "supports top orientation": function(d3) {
<add> var a = d3.svg.axis().orient("top"),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> tick = g.select("g:nth-child(3)"),
<ide> text = tick.select("text"),
<ide> suite.addBatch({
<ide> assert.equal(line.attr("y2"), -6);
<ide> assert.equal(path.attr("d"), "M0,-6V0H1V-6");
<ide> },
<del> "supports right orientation": function(axis) {
<del> var a = axis().orient("right"),
<add> "supports right orientation": function(d3) {
<add> var a = d3.svg.axis().orient("right"),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> tick = g.select("g:nth-child(3)"),
<ide> text = tick.select("text"),
<ide> suite.addBatch({
<ide> assert.equal(line.attr("x2"), 6);
<ide> assert.equal(path.attr("d"), "M6,0H0V1H6");
<ide> },
<del> "supports bottom orientation": function(axis) {
<del> var a = axis().orient("bottom"),
<add> "supports bottom orientation": function(d3) {
<add> var a = d3.svg.axis().orient("bottom"),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> tick = g.select("g:nth-child(3)"),
<ide> text = tick.select("text"),
<ide> suite.addBatch({
<ide> assert.equal(line.attr("y2"), 6);
<ide> assert.equal(path.attr("d"), "M0,6V0H1V6");
<ide> },
<del> "supports left orientation": function(axis) {
<del> var a = axis().orient("left"),
<add> "supports left orientation": function(d3) {
<add> var a = d3.svg.axis().orient("left"),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> tick = g.select("g:nth-child(3)"),
<ide> text = tick.select("text"),
<ide> suite.addBatch({
<ide> },
<ide>
<ide> "tickSize": {
<del> "defaults to six pixels": function(axis) {
<del> var a = axis();
<add> "defaults to six pixels": function(d3) {
<add> var a = d3.svg.axis();
<ide> assert.equal(a.tickSize(), 6);
<ide> },
<del> "can be defined as a number": function(axis) {
<del> var a = axis().tickSize(3);
<add> "can be defined as a number": function(d3) {
<add> var a = d3.svg.axis().tickSize(3);
<ide> assert.equal(a.tickSize(), 3);
<ide> },
<del> "coerces input value to a number": function(axis) {
<del> var a = axis().tickSize("3");
<add> "coerces input value to a number": function(d3) {
<add> var a = d3.svg.axis().tickSize("3");
<ide> assert.strictEqual(a.tickSize(), 3);
<ide> },
<del> "affects the generated domain path": function(axis) {
<del> var a = axis().tickSize(3),
<add> "affects the generated domain path": function(d3) {
<add> var a = d3.svg.axis().tickSize(3),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> path = g.select("path.domain");
<ide> assert.equal(path.attr("d"), "M0,3V0H1V3");
<ide> },
<del> "affects the generated tick lines": function(axis) {
<del> var a = axis().tickSize(3),
<add> "affects the generated tick lines": function(d3) {
<add> var a = d3.svg.axis().tickSize(3),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> line = g.selectAll("g line");
<ide> line.each(function() {
<ide> assert.equal(d3.select(this).attr("y2"), 3);
<ide> });
<ide> },
<del> "if negative, labels are placed on the opposite end": function(axis) {
<del> var a = axis().tickSize(-80),
<add> "if negative, labels are placed on the opposite end": function(d3) {
<add> var a = d3.svg.axis().tickSize(-80),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> line = g.selectAll("g line"),
<ide> text = g.selectAll("g text");
<ide> suite.addBatch({
<ide> assert.equal(d3.select(this).attr("y"), 3);
<ide> });
<ide> },
<del> "with two arguments, specifies end tick size": function(axis) {
<del> var a = axis().tickSize(6, 3),
<add> "with two arguments, specifies end tick size": function(d3) {
<add> var a = d3.svg.axis().tickSize(6, 3),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> path = g.selectAll("path");
<ide> assert.equal(path.attr("d"), "M0,3V0H1V3");
<ide> },
<del> "with three arguments, specifies end and minor tick sizes": function(axis) {
<del> var a = axis().tickSubdivide(3).tickSize(6, 3, 9),
<add> "with three arguments, specifies end and minor tick sizes": function(d3) {
<add> var a = d3.svg.axis().tickSubdivide(3).tickSize(6, 3, 9),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> path = g.selectAll("path"),
<ide> line = g.select(".minor");
<ide> suite.addBatch({
<ide> },
<ide>
<ide> "tickPadding": {
<del> "defaults to three pixels": function(axis) {
<del> var a = axis();
<add> "defaults to three pixels": function(d3) {
<add> var a = d3.svg.axis();
<ide> assert.equal(a.tickPadding(), 3);
<ide> },
<del> "can be defined as a number": function(axis) {
<del> var a = axis().tickPadding(6);
<add> "can be defined as a number": function(d3) {
<add> var a = d3.svg.axis().tickPadding(6);
<ide> assert.equal(a.tickPadding(), 6);
<ide> },
<del> "coerces input value to a number": function(axis) {
<del> var a = axis().tickPadding("6");
<add> "coerces input value to a number": function(d3) {
<add> var a = d3.svg.axis().tickPadding("6");
<ide> assert.strictEqual(a.tickPadding(), 6);
<ide> },
<del> "affects the generated tick labels": function(axis) {
<del> var a = axis().tickSize(2).tickPadding(7),
<add> "affects the generated tick labels": function(d3) {
<add> var a = d3.svg.axis().tickSize(2).tickPadding(7),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> text = g.selectAll("g text");
<ide> text.each(function() {
<ide> suite.addBatch({
<ide> },
<ide>
<ide> "ticks": {
<del> "defaults to [10]": function(axis) {
<del> var a = axis();
<add> "defaults to [10]": function(d3) {
<add> var a = d3.svg.axis();
<ide> assert.deepEqual(a.ticks(), [10]);
<ide> },
<del> "can be defined as any arguments": function(axis) {
<del> var b = {}, a = axis().ticks(b, 42), t = a.ticks();
<add> "can be defined as any arguments": function(d3) {
<add> var b = {}, a = d3.svg.axis().ticks(b, 42), t = a.ticks();
<ide> assert.equal(t[0], b);
<ide> assert.equal(t[1], 42);
<ide> assert.equal(t.length, 2);
<ide> },
<del> "passes any arguments to the scale's ticks function": function(axis) {
<del> var x = d3.scale.linear(), b = {}, a = axis().ticks(b, "%").scale(x), aa = [],
<add> "passes any arguments to the scale's ticks function": function(d3) {
<add> var x = d3.scale.linear(), b = {}, a = d3.svg.axis().ticks(b, "%").scale(x), aa = [],
<ide> g = d3.select("body").html("").append("svg:g");
<ide> x.ticks = function() { aa.push(arguments); return [42]; };
<ide> g.call(a);
<ide> suite.addBatch({
<ide> assert.equal(aa[0][0], b);
<ide> assert.equal(aa[0][1], "%");
<ide> },
<del> "passes any arguments to the scale's tickFormat function": function(axis) {
<add> "passes any arguments to the scale's tickFormat function": function(d3) {
<ide> var b = {},
<ide> x = d3.scale.linear(),
<del> a = axis().scale(x).ticks(b, "%"),
<add> a = d3.svg.axis().scale(x).ticks(b, "%"),
<ide> g = d3.select("body").html("").append("svg:g"),
<ide> aa = [];
<ide>
<ide> suite.addBatch({
<ide> assert.equal(aa[0][0], b);
<ide> assert.equal(aa[0][1], "%");
<ide> },
<del> "affects the generated ticks": function(axis) {
<del> var a = axis().ticks(20, "%"),
<add> "affects the generated ticks": function(d3) {
<add> var a = d3.svg.axis().ticks(20, "%"),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> t = g.selectAll("g");
<ide> assert.equal(t[0].length, 21);
<ide> assert.equal(t[0][0].textContent, "0%");
<ide> },
<del> "only substitutes precision if not specified": function(axis) {
<del> var a = axis().ticks(20, ".5%"),
<add> "only substitutes precision if not specified": function(d3) {
<add> var a = d3.svg.axis().ticks(20, ".5%"),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> t = g.selectAll("g");
<ide> assert.equal(t[0].length, 21);
<ide> suite.addBatch({
<ide> },
<ide>
<ide> "tickValues": {
<del> "defaults to null": function(axis) {
<del> var a = axis().tickValues();
<add> "defaults to null": function(d3) {
<add> var a = d3.svg.axis().tickValues();
<ide> assert.isNull(a);
<ide> },
<del> "can be given as array of positions": function(axis) {
<del> var l = [1, 2.5, 3], a = axis().tickValues(l), t = a.tickValues();
<add> "can be given as array of positions": function(d3) {
<add> var l = [1, 2.5, 3], a = d3.svg.axis().tickValues(l), t = a.tickValues();
<ide> assert.equal(t, l);
<ide> assert.equal(t.length, 3);
<ide> },
<del> "does not change the tick arguments": function(axis) {
<del> var b = {}, a = axis().ticks(b, 42).tickValues([10]), t = a.ticks();
<add> "does not change the tick arguments": function(d3) {
<add> var b = {}, a = d3.svg.axis().ticks(b, 42).tickValues([10]), t = a.ticks();
<ide> assert.equal(t[0], b);
<ide> assert.equal(t[1], 42);
<ide> assert.equal(t.length, 2);
<ide> },
<del> "does not change the arguments passed to the scale's tickFormat function": function(axis) {
<add> "does not change the arguments passed to the scale's tickFormat function": function(d3) {
<ide> var x = d3.scale.linear(),
<del> a = axis().scale(x).ticks(10).tickValues([1, 2, 3]),
<add> a = d3.svg.axis().scale(x).ticks(10).tickValues([1, 2, 3]),
<ide> g = d3.select("body").html("").append("svg:g"),
<ide> aa = [];
<ide>
<ide> suite.addBatch({
<ide> assert.equal(aa[0].length, 1);
<ide> assert.equal(aa[0][0], 10);
<ide> },
<del> "affects the generated ticks": function(axis) {
<del> var a = axis().ticks(20),
<add> "affects the generated ticks": function(d3) {
<add> var a = d3.svg.axis().ticks(20),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> t = g.selectAll("g");
<ide> assert.equal(t[0].length, 21);
<ide> }
<ide> },
<ide>
<ide> "tickSubdivide": {
<del> "defaults to zero": function(axis) {
<del> var a = axis();
<add> "defaults to zero": function(d3) {
<add> var a = d3.svg.axis();
<ide> assert.equal(a.tickSubdivide(), 0);
<ide> },
<del> "coerces input value to a number": function(axis) {
<del> var a = axis().tickSubdivide(true);
<add> "coerces input value to a number": function(d3) {
<add> var a = d3.svg.axis().tickSubdivide(true);
<ide> assert.strictEqual(a.tickSubdivide(), 1);
<ide> },
<del> "does not generate minor ticks when zero": function(axis) {
<del> var g = d3.select("body").html("").append("svg:g").call(axis());
<add> "does not generate minor ticks when zero": function(d3) {
<add> var g = d3.select("body").html("").append("svg:g").call(d3.svg.axis());
<ide> assert.isTrue(g.selectAll(".minor").empty());
<ide> },
<del> "affects the generated minor ticks": function(axis) {
<del> var a = axis().tickSubdivide(3),
<add> "affects the generated minor ticks": function(d3) {
<add> var a = d3.svg.axis().tickSubdivide(3),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> t = g.selectAll("line.tick.minor");
<ide> assert.equal(t[0].length, 30);
<ide> suite.addBatch({
<ide> },
<ide>
<ide> "tickFormat": {
<del> "defaults to null": function(axis) {
<del> var a = axis();
<add> "defaults to null": function(d3) {
<add> var a = d3.svg.axis();
<ide> assert.isTrue(a.tickFormat() == null);
<ide> },
<del> "when null, uses the scale's tick format": function(axis) {
<del> var x = d3.scale.linear(), a = axis().scale(x),
<add> "when null, uses the scale's tick format": function(d3) {
<add> var x = d3.scale.linear(), a = d3.svg.axis().scale(x),
<ide> g = d3.select("body").html("").append("svg:g");
<ide>
<ide> x.tickFormat = function() {
<ide> suite.addBatch({
<ide> var t = g.selectAll("g text");
<ide> assert.equal(t.text(), "foo-0");
<ide> },
<del> "affects the generated tick labels": function(axis) {
<del> var a = axis().tickFormat(d3.format("+.2%")),
<add> "affects the generated tick labels": function(d3) {
<add> var a = d3.svg.axis().tickFormat(d3.format("+.2%")),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> t = g.selectAll("g text");
<ide> assert.equal(t.text(), "+0.00%");
<ide> },
<del> "can be set to a constant": function(axis) {
<del> var a = axis().tickFormat("I'm a tick!"),
<add> "can be set to a constant": function(d3) {
<add> var a = d3.svg.axis().tickFormat("I'm a tick!"),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> t = g.selectAll("g text");
<ide> assert.equal(t.text(), "I'm a tick!");
<ide> },
<del> "can be set to a falsey constant": function(axis) {
<del> var a = axis().tickFormat(""),
<add> "can be set to a falsey constant": function(d3) {
<add> var a = d3.svg.axis().tickFormat(""),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> t = g.selectAll("g text");
<ide> assert.equal(t.text(), "");
<ide> }
<ide> },
<ide>
<ide> "enter": {
<del> "generates a new domain path": function(axis) {
<del> var a = axis(),
<add> "generates a new domain path": function(d3) {
<add> var a = d3.svg.axis(),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> path = g.selectAll("path.domain");
<ide> assert.equal(path[0].length, 1);
<ide> assert.equal(path.attr("d"), "M0,6V0H1V6");
<ide> assert.isNull(path.node().nextSibling);
<ide> },
<del> "generates new tick marks with labels": function(axis) {
<del> var a = axis(),
<add> "generates new tick marks with labels": function(d3) {
<add> var a = d3.svg.axis(),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> x = d3.scale.linear(),
<ide> tick = g.selectAll("g"),
<ide> suite.addBatch({
<ide> },
<ide>
<ide> "update": {
<del> "updates the domain path": function(axis) {
<del> var a = axis(),
<add> "updates the domain path": function(d3) {
<add> var a = d3.svg.axis(),
<ide> g = d3.select("body").html("").append("svg:g").call(a);
<ide> a.scale().domain([0, 2]).range([1, 2]);
<ide> a.tickSize(3);
<ide> suite.addBatch({
<ide> assert.equal(path.attr("d"), "M1,3V0H2V3");
<ide> assert.domEqual(path.node().nextSibling, null);
<ide> },
<del> "enters, exits and updates tick marks": function(axis) {
<del> var a = axis(),
<add> "enters, exits and updates tick marks": function(d3) {
<add> var a = d3.svg.axis(),
<ide> g = d3.select("body").html("").append("svg:g").call(a),
<ide> x = d3.scale.linear().domain([1, 1.5]);
<ide> a.scale().domain(x.domain());
<ide><path>test/svg/brush-test.js
<ide> var vows = require("vows"),
<del> d3 = require("../../"),
<add> linear = require("../../").scale.linear,
<ide> load = require("../load"),
<del> assert = require("../env-assert"),
<del> jsdom = require("jsdom").jsdom,
<del> document = jsdom("<html><head></head><body></body></html>"),
<del> window = document.createWindow();
<add> assert = require("../env-assert");
<ide>
<ide> var suite = vows.describe("d3.svg.brush");
<ide>
<ide> suite.addBatch({
<ide> "brush": {
<del> topic: load("svg/brush")
<del> .expression("d3.svg.brush")
<del> .sandbox({document: document, window: window}),
<add> topic: load("svg/brush").expression("d3.svg.brush").document(),
<ide>
<ide> "x": {
<ide> "defaults to null": function(brush) {
<ide> suite.addBatch({
<ide> assert.isNull(brush().extent());
<ide> },
<ide> "returns a one-dimensional array if only x is defined": function(brush) {
<del> var b = brush().x(d3.scale.linear());
<add> var b = brush().x(linear());
<ide> assert.deepEqual(b.extent(), [0, 0]);
<ide> },
<ide> "takes a one-dimensional array if only x is defined": function(brush) {
<del> var b = brush().x(d3.scale.linear()).extent([0.1, 0.4]);
<add> var b = brush().x(linear()).extent([0.1, 0.4]);
<ide> assert.deepEqual(b.extent(), [0.1, 0.4]);
<ide> },
<ide> "returns a one-dimensional array if only y is defined": function(brush) {
<del> var b = brush().y(d3.scale.linear());
<add> var b = brush().y(linear());
<ide> assert.deepEqual(b.extent(), [0, 0]);
<ide> },
<ide> "takes a one-dimensional array if only y is defined": function(brush) {
<del> var b = brush().y(d3.scale.linear()).extent([0.1, 0.4]);
<add> var b = brush().y(linear()).extent([0.1, 0.4]);
<ide> assert.deepEqual(b.extent(), [0.1, 0.4]);
<ide> },
<ide> "returns a two-dimensional array if x and y are defined": function(brush) {
<del> var b = brush().x(d3.scale.linear()).y(d3.scale.linear());
<add> var b = brush().x(linear()).y(linear());
<ide> assert.deepEqual(b.extent(), [[0, 0], [0, 0]]);
<ide> },
<ide> "takes a two-dimensional array if x and y are defined": function(brush) {
<del> var b = brush().x(d3.scale.linear()).y(d3.scale.linear()).extent([[0.1, 0.2], [0.3, 0.4]]);
<add> var b = brush().x(linear()).y(linear()).extent([[0.1, 0.2], [0.3, 0.4]]);
<ide> assert.deepEqual(b.extent(), [[0.1, 0.2], [0.3, 0.4]]);
<ide> },
<ide> "preserves the set extent exactly": function(brush) {
<ide> var lo = new Number(0.1),
<ide> hi = new Number(0.3),
<del> b = brush().x(d3.scale.linear()).extent([lo, hi]),
<add> b = brush().x(linear()).extent([lo, hi]),
<ide> extent = b.extent();
<ide> assert.strictEqual(extent[0], lo);
<ide> assert.strictEqual(extent[1], hi);
<ide><path>test/xhr/html-test.js
<ide> var vows = require("vows"),
<del> jsdom = require("jsdom").jsdom,
<del> d3 = require("../../"),
<ide> load = require("../load"),
<del> xhr = require("../env-xhr"),
<del> assert = require("../env-assert"),
<del> document = d3.selection().node()._ownerDocument,
<del> window = document.defaultView;
<del>
<del>// Monkey-patch createRange support to JSDOM.
<del>document.createRange = function() {
<del> return {
<del> selectNode: function() {},
<del> createContextualFragment: jsdom
<del> };
<del>};
<add> assert = require("../env-assert");
<ide>
<ide> var suite = vows.describe("d3.html");
<ide>
<ide> suite.addBatch({
<ide> "html": {
<del> topic: load("xhr/html").expression("d3.html").sandbox({
<del> XMLHttpRequest: xhr,
<del> document: document,
<del> window: window
<del> }),
<add> topic: load("xhr/html").expression("d3.html").document(),
<ide>
<ide> "on a sample file": {
<ide> topic: function(html) {
<del> var cb = this.callback;
<del> html("test/data/sample.html", function(error, document) {
<del> cb(null, document);
<del> });
<add> html("test/data/sample.html", this.callback);
<ide> },
<ide> "invokes the callback with the loaded html": function(document) {
<ide> assert.equal(document.getElementsByTagName("H1")[0].textContent, "Hello & world!");
<ide> },
<ide> "override the mime type to text/html": function(document) {
<del> assert.equal(xhr._last._info.mimeType, "text/html");
<add> assert.equal(XMLHttpRequest._last._info.mimeType, "text/html");
<ide> }
<ide> },
<ide>
<ide> "on a file that does not exist": {
<ide> topic: function(html) {
<del> var cb = this.callback;
<add> var callback = this.callback;
<ide> html("//does/not/exist.html", function(error, document) {
<del> cb(null, document);
<add> callback(null, document);
<ide> });
<ide> },
<ide> "invokes the callback with undefined when an error occurs": function(document) {
<ide><path>test/xhr/json-test.js
<ide> var vows = require("vows"),
<ide> load = require("../load"),
<del> xhr = require("../env-xhr"),
<ide> assert = require("../env-assert");
<ide>
<ide> var suite = vows.describe("d3.json");
<ide>
<ide> suite.addBatch({
<ide> "json": {
<del> topic: load("xhr/json").sandbox({
<del> XMLHttpRequest: xhr,
<del> document: {},
<del> window: {}
<del> }),
<add> topic: load("xhr/json").expression("d3.json").document(),
<ide>
<ide> "on a sample file": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.json("test/data/sample.json", function(error, json) {
<del> cb(null, json);
<del> });
<add> topic: function(json) {
<add> json("test/data/sample.json", this.callback);
<ide> },
<ide> "invokes the callback with the loaded JSON": function(json) {
<ide> assert.deepEqual(json, [{"Hello":42,"World":"\"fish\""}]);
<ide> },
<ide> "overrides the mime type to application/json": function(json) {
<del> assert.equal(xhr._last._info.mimeType, "application/json");
<add> assert.equal(XMLHttpRequest._last._info.mimeType, "application/json");
<ide> }
<ide> },
<ide>
<ide> "on a file that does not exist": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.json("//does/not/exist.json", function(error, json) {
<del> cb(null, json);
<add> topic: function(json) {
<add> var callback = this.callback;
<add> json("//does/not/exist.json", function(error, json) {
<add> callback(null, json);
<ide> });
<ide> },
<ide> "invokes the callback with undefined when an error occurs": function(json) {
<ide><path>test/xhr/text-test.js
<ide> var vows = require("vows"),
<ide> load = require("../load"),
<del> xhr = require("../env-xhr"),
<ide> assert = require("../env-assert");
<ide>
<ide> var suite = vows.describe("d3.text");
<ide>
<ide> suite.addBatch({
<ide> "text": {
<del> topic: load("xhr/text").sandbox({
<del> XMLHttpRequest: xhr,
<del> document: {},
<del> window: {}
<del> }),
<add> topic: load("xhr/text").expression("d3.text").document(),
<ide>
<ide> "on a sample file": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.text("test/data/sample.txt", function(error, text) {
<del> cb(null, text);
<del> });
<add> topic: function(text) {
<add> text("test/data/sample.txt", this.callback);
<ide> },
<ide> "invokes the callback with the loaded text": function(text) {
<ide> assert.equal(text, "Hello, world!\n");
<ide> },
<ide> "does not override the mime type by default": function(text) {
<del> assert.isUndefined(xhr._last._info.mimeType);
<add> assert.isUndefined(XMLHttpRequest._last._info.mimeType);
<ide> }
<ide> },
<ide>
<ide> "with a custom mime type": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.text("test/data/sample.txt", "text/plain+sample", function(error, text) {
<del> cb(null, text);
<del> });
<add> topic: function(text) {
<add> text("test/data/sample.txt", "text/plain+sample", this.callback);
<ide> },
<ide> "observes the optional mime type": function(text) {
<del> assert.equal(xhr._last._info.mimeType, "text/plain+sample");
<add> assert.equal(XMLHttpRequest._last._info.mimeType, "text/plain+sample");
<ide> }
<ide> },
<ide>
<ide> "on a file that does not exist": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.text("//does/not/exist.txt", function(error, text) {
<del> cb(null, text);
<add> topic: function(text) {
<add> var callback = this.callback;
<add> text("//does/not/exist.txt", function(error, text) {
<add> callback(null, text);
<ide> });
<ide> },
<ide> "invokes the callback with undefined when an error occurs": function(text) {
<ide><path>test/xhr/xhr-test.js
<ide> var vows = require("vows"),
<ide> load = require("../load"),
<del> xhr = require("../env-xhr"),
<ide> assert = require("../env-assert");
<ide>
<ide> var suite = vows.describe("d3.xhr");
<ide>
<ide> suite.addBatch({
<ide> "xhr": {
<del> topic: load("xhr/xhr").sandbox({
<del> XMLHttpRequest: xhr,
<del> document: {},
<del> window: {}
<del> }),
<add> topic: load("xhr/xhr").expression("d3.xhr").document(),
<ide>
<ide> "on a sample text file": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.xhr("test/data/sample.txt", function(error, req) {
<del> cb(null, req);
<del> });
<add> topic: function(xhr) {
<add> xhr("test/data/sample.txt", this.callback);
<ide> },
<ide> "makes an asynchronous HTTP request": function(req) {
<ide> assert.equal(req._info.url, "test/data/sample.txt");
<ide> suite.addBatch({
<ide> },
<ide>
<ide> "when a custom mime type is specified": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.xhr("test/data/sample.txt", "text/plain", function(error, req) {
<del> cb(null, req);
<del> });
<add> topic: function(xhr) {
<add> xhr("test/data/sample.txt", "text/plain", this.callback);
<ide> },
<ide> "observes the optional mime type": function(req) {
<ide> assert.equal(req._info.mimeType, "text/plain");
<ide> }
<ide> },
<ide>
<ide> "on a file that does not exist": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.xhr("//does/not/exist.txt", function(error, req) {
<del> cb(null, req);
<add> topic: function(xhr) {
<add> var callback = this.callback;
<add> xhr("//does/not/exist.txt", function(error, req) {
<add> callback(null, req);
<ide> });
<ide> },
<ide> "invokes the callback with undefined when an error occurs": function(req) {
<ide><path>test/xhr/xml-test.js
<ide> var vows = require("vows"),
<ide> load = require("../load"),
<del> xhr = require("../env-xhr"),
<ide> assert = require("../env-assert");
<ide>
<ide> var suite = vows.describe("d3.xml");
<ide>
<ide> suite.addBatch({
<ide> "xml": {
<del> topic: load("xhr/xml").sandbox({
<del> XMLHttpRequest: xhr,
<del> document: {},
<del> window: {}
<del> }),
<add> topic: load("xhr/xml").expression("d3.xml").document(),
<ide>
<ide> "on a sample file": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.xml("test/data/sample.xml", function(error, xml) {
<del> cb(null, xml);
<del> });
<add> topic: function(xml) {
<add> xml("test/data/sample.xml", this.callback);
<ide> },
<ide> "invokes the callback with the loaded xml": function(xml) {
<ide> assert.deepEqual(xml, {_xml: "<?xml version=\"1.0\" encoding=\"UTF-8\" ?>\n<hello>\n <world name=\"Earth\"/>\n</hello>\n"});
<ide> },
<ide> "does not override the mime type by default": function(xml) {
<del> assert.isUndefined(xhr._last._info.mimeType);
<add> assert.isUndefined(XMLHttpRequest._last._info.mimeType);
<ide> }
<ide> },
<ide>
<ide> "with a custom mime type": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.xml("test/data/sample.txt", "application/xml+sample", function(error, xml) {
<del> cb(null, xml);
<del> });
<add> topic: function(xml) {
<add> xml("test/data/sample.txt", "application/xml+sample", this.callback);
<ide> },
<ide> "observes the optional mime type": function(xml) {
<del> assert.equal(xhr._last._info.mimeType, "application/xml+sample");
<add> assert.equal(XMLHttpRequest._last._info.mimeType, "application/xml+sample");
<ide> }
<ide> },
<ide>
<ide> "on a file that does not exist": {
<del> topic: function(d3) {
<del> var cb = this.callback;
<del> d3.xml("//does/not/exist.xml", function(error, xml) {
<del> cb(null, xml);
<add> topic: function(xml) {
<add> var callback = this.callback;
<add> xml("//does/not/exist.xml", function(error, xml) {
<add> callback(null, xml);
<ide> });
<ide> },
<ide> "invokes the callback with undefined when an error occurs": function(xml) { | 21 |
Javascript | Javascript | add test failure for after-resolve async | c45d8c25939076517b4481ded61bb55aeb51e9ff | <ide><path>test/Integration.test.js
<ide> describe("Integration", function() {
<ide> this.plugin("normal-module-factory", function(nmf) {
<ide> nmf.plugin("after-resolve", function(data, callback) {
<ide> data.resource = data.resource.replace(/extra\.js/, "extra2.js");
<del> callback(null, data);
<add> setTimeout(function() {
<add> callback(null, data);
<add> }, 50);
<ide> });
<ide> });
<ide> } | 1 |
Text | Text | update changelog from 6.x | bca31670cb89f5bd6e05f16336f13627936b5831 | <ide><path>CHANGELOG.md
<add><a name="6.10.0"></a>
<add># [6.10.0](https://github.com/videojs/video.js/compare/v6.9.0...v6.10.0) (2018-05-11)
<add>
<add>### Features
<add>
<add>* add 'autoSetup' option ([#5123](https://github.com/videojs/video.js/issues/5123)) ([592c255](https://github.com/videojs/video.js/commit/592c255)), closes [#5094](https://github.com/videojs/video.js/issues/5094)
<add>* copy properties from <video-js> to the media el from ([#5039](https://github.com/videojs/video.js/issues/5039)) as ([#5163](https://github.com/videojs/video.js/issues/5163)) ([c654c7d](https://github.com/videojs/video.js/commit/c654c7d))
<add>* update the players source cache on sourceset from ([#5040](https://github.com/videojs/video.js/issues/5040)) as ([#5156](https://github.com/videojs/video.js/issues/5156)) ([72f84d5](https://github.com/videojs/video.js/commit/72f84d5))
<add>
<add>### Bug Fixes
<add>
<add>* **time-display:** restore hidden label text for screen readers. ([#5157](https://github.com/videojs/video.js/issues/5157)) ([baa6b56](https://github.com/videojs/video.js/commit/baa6b56)), closes [#5135](https://github.com/videojs/video.js/issues/5135)
<add>* `sourceset` and browser behavior inconsistencies from ([#5054](https://github.com/videojs/video.js/issues/5054)) as ([#5162](https://github.com/videojs/video.js/issues/5162)) ([e1d26d8](https://github.com/videojs/video.js/commit/e1d26d8))
<add>* Reduce the multiple-announcement by screen readers of the new name of a button when its text label changes. ([#5158](https://github.com/videojs/video.js/issues/5158)) ([79fed25](https://github.com/videojs/video.js/commit/79fed25)), closes [#5023](https://github.com/videojs/video.js/issues/5023)
<add>* Remove spaces from element IDs and ARIA attributes in the Captions Settings Dialog ([#5153](https://github.com/videojs/video.js/issues/5153)) ([e076cde](https://github.com/videojs/video.js/commit/e076cde)), closes [#4688](https://github.com/videojs/video.js/issues/4688) [#4884](https://github.com/videojs/video.js/issues/4884)
<add>* Remove unnecessary ARIA role on the Control Bar. ([#5154](https://github.com/videojs/video.js/issues/5154)) ([9607712](https://github.com/videojs/video.js/commit/9607712)), closes [#5134](https://github.com/videojs/video.js/issues/5134)
<add>
<add><a name="6.9.0"></a>
<add># [6.9.0](https://github.com/videojs/video.js/compare/v6.8.0...v6.9.0) (2018-04-20)
<add>
<add>### Features
<add>
<add>* Queue playback events when the playback rate is zero and we are seeking ([#5061](https://github.com/videojs/video.js/issues/5061)) ([eaf3c98](https://github.com/videojs/video.js/commit/eaf3c98)), closes [#5024](https://github.com/videojs/video.js/issues/5024)
<add>
<add>### Bug Fixes
<add>
<add>* fire sourceset on initial source append ([#5038](https://github.com/videojs/video.js/issues/5038)) ([#5072](https://github.com/videojs/video.js/issues/5072)) ([00e7f7b](https://github.com/videojs/video.js/commit/00e7f7b))
<add>* let the tech preload auto on its own ([#4861](https://github.com/videojs/video.js/issues/4861)) ([#5065](https://github.com/videojs/video.js/issues/5065)) ([c04dac4](https://github.com/videojs/video.js/commit/c04dac4)), closes [#4660](https://github.com/videojs/video.js/issues/4660)
<add>* options.id is now applied correctly to the player dom element ([#5090](https://github.com/videojs/video.js/issues/5090)) ([dd45dc0](https://github.com/videojs/video.js/commit/dd45dc0)), closes [#5088](https://github.com/videojs/video.js/issues/5088)
<add>* wait till play event to listen for user activity ([#5093](https://github.com/videojs/video.js/issues/5093)) ([9f8ce2d](https://github.com/videojs/video.js/commit/9f8ce2d)), closes [#5076](https://github.com/videojs/video.js/issues/5076)
<add>* **time-display:** Use formatTime for a consistent default instead of hardcoded string ([#5055](https://github.com/videojs/video.js/issues/5055)) ([363af84](https://github.com/videojs/video.js/commit/363af84))
<add>
<add>### Code Refactoring
<add>
<add>* move seekbar event handler bindings into a function ([#5097](https://github.com/videojs/video.js/issues/5097)) ([7c3213c](https://github.com/videojs/video.js/commit/7c3213c))
<add>* move sourceset code out of tech ([#5049](https://github.com/videojs/video.js/issues/5049)) ([e2b9d58](https://github.com/videojs/video.js/commit/e2b9d58))
<add>
<add>### Documentation
<add>
<add>* **debugging:** fix markup typo ([#5086](https://github.com/videojs/video.js/issues/5086)) ([8c77aa0](https://github.com/videojs/video.js/commit/8c77aa0))
<add>* **guides:** add debugging section to index ([#5100](https://github.com/videojs/video.js/issues/5100)) ([20546d3](https://github.com/videojs/video.js/commit/20546d3))
<add>
<add>### Tests
<add>
<add>* fix queue playing events test for ie8 (for real this time) ([#5110](https://github.com/videojs/video.js/issues/5110)) ([5dec1a0](https://github.com/videojs/video.js/commit/5dec1a0))
<add>* fix queued events test with playbackrate in IE8 ([#5105](https://github.com/videojs/video.js/issues/5105)) ([c4a05eb](https://github.com/videojs/video.js/commit/c4a05eb))
<add>
<ide> <a name="6.8.0"></a>
<ide> # [6.8.0](https://github.com/videojs/video.js/compare/v6.7.4...v6.8.0) (2018-03-19)
<ide> | 1 |
Ruby | Ruby | improve error handling | c3bf9bda58aba514259d45972a5172e8801d5c65 | <ide><path>Library/Homebrew/dev-cmd/update-test.rb
<ide> def update_test
<ide> elsif date = ARGV.value("before")
<ide> Utils.popen_read("git", "rev-list", "-n1", "--before=#{date}", "origin/master").chomp
<ide> elsif ARGV.include?("--to-tag")
<del> Utils.popen_read("git", "tag", "--list", "--sort=-version:refname").lines[1].chomp
<add> previous_tag =
<add> Utils.popen_read("git", "tag", "--list", "--sort=-version:refname").lines[1]
<add> unless previous_tag
<add> safe_system "git", "fetch", "--tags"
<add> previous_tag =
<add> Utils.popen_read("git", "tag", "--list", "--sort=-version:refname").lines[1]
<add> end
<add> previous_tag.to_s.chomp
<ide> else
<ide> Utils.popen_read("git", "rev-parse", "origin/master").chomp
<ide> end
<add> odie "Could not find start commit!" if start_commit.empty?
<add>
<ide> start_commit = Utils.popen_read("git", "rev-parse", start_commit).chomp
<add> odie "Could not find start commit!" if start_commit.empty?
<add>
<ide> end_commit = Utils.popen_read("git", "rev-parse", "HEAD").chomp
<add> odie "Could not find end commit!" if end_commit.empty?
<ide>
<ide> puts "Start commit: #{start_commit}"
<ide> puts "End commit: #{end_commit}" | 1 |
Python | Python | fix the backward for deepspeed | cd5565bed336305399397624699aaeb3d6196e18 | <ide><path>src/transformers/trainer.py
<ide> def training_step(self, model: nn.Module, inputs: Dict[str, Union[torch.Tensor,
<ide> with amp.scale_loss(loss, self.optimizer) as scaled_loss:
<ide> scaled_loss.backward()
<ide> elif self.deepspeed:
<del> # calling on DS engine (model_wrapped == DDP(Deepspeed(PretrainedModule)))
<del> self.model_wrapped.module.backward(loss)
<add> self.deepspeed.backward(loss)
<ide> else:
<ide> loss.backward()
<ide> | 1 |
PHP | PHP | add test covering scenario described in | 1d24a2f211e67328b9ad9ae4240bd51599b703d4 | <ide><path>tests/TestCase/ORM/Association/HasManyTest.php
<ide> class HasManyTest extends TestCase
<ide> *
<ide> * @var array
<ide> */
<del> public $fixtures = ['core.Comments', 'core.Articles', 'core.Authors'];
<add> public $fixtures = [
<add> 'core.Comments',
<add> 'core.Articles',
<add> 'core.Authors',
<add> ];
<ide>
<ide> /**
<ide> * Set up
<ide> public function testSaveAppendSaveStrategy()
<ide> public function testSaveDefaultSaveStrategy()
<ide> {
<ide> $authors = $this->getTableLocator()->get('Authors');
<del> $authors->hasMany('Articles', ['saveStrategy' => 'append']);
<del> $this->assertEquals('append', $authors->getAssociation('articles')->getSaveStrategy());
<add> $authors->hasMany('Articles', ['saveStrategy' => HasMany::SAVE_APPEND]);
<add> $this->assertEquals(HasMany::SAVE_APPEND, $authors->getAssociation('articles')->getSaveStrategy());
<ide> }
<ide>
<ide> /**
<ide> public function testSaveDefaultSaveStrategy()
<ide> public function testSaveReplaceSaveStrategyDependent()
<ide> {
<ide> $authors = $this->getTableLocator()->get('Authors');
<del> $authors->hasMany('Articles', ['saveStrategy' => 'replace', 'dependent' => true]);
<add> $authors->hasMany('Articles', ['saveStrategy' => HasMany::SAVE_REPLACE, 'dependent' => true]);
<ide>
<ide> $entity = $authors->newEntity([
<ide> 'name' => 'mylux',
<ide> public function testSaveReplaceSaveStrategyDependent()
<ide> $this->assertFalse($authors->Articles->exists(['id' => $articleId]));
<ide> }
<ide>
<add> /**
<add> * Test that the associated entities are unlinked and deleted when they are dependent
<add> *
<add> * TODO in the future this should change and apply the finder.
<add> *
<add> * @return void
<add> */
<add> public function testSaveReplaceSaveStrategyDependentWithConditions()
<add> {
<add> $this->getTableLocator()->clear();
<add> $this->setAppNamespace('TestApp');
<add>
<add> $authors = $this->getTableLocator()->get('Authors');
<add> $authors->hasMany('Articles', [
<add> 'finder' => 'published',
<add> 'saveStrategy' => HasMany::SAVE_REPLACE,
<add> 'dependent' => true,
<add> ]);
<add> $articles = $authors->Articles->getTarget();
<add> $articles->updateAll(['published' => 'N'], ['author_id' => 1, 'title' => 'Third Article']);
<add>
<add> $entity = $authors->get(1, ['contain' => ['Articles']]);
<add> $data = [
<add> 'name' => 'updated',
<add> 'articles' => [
<add> ['title' => 'First Article', 'body' => 'New First', 'published' => 'N']
<add> ]
<add> ];
<add> $entity = $authors->patchEntity($entity, $data, ['associated' => ['Articles']]);
<add> $entity = $authors->save($entity, ['associated' => ['Articles']]);
<add>
<add> // Should only have one article left as we 'replaced' the others.
<add> $this->assertCount(1, $entity->articles);
<add> $this->assertCount(1, $authors->Articles->find()->toArray());
<add>
<add> $others = $articles->find('all')
<add> ->where(['Articles.author_id' => 1])
<add> ->orderAsc('title')
<add> ->toArray();
<add> $this->assertCount(
<add> 1,
<add> $others,
<add> 'Record not matching condition should stay. But does not'
<add> );
<add> $this->assertSame('First Article', $others[0]->title);
<add> }
<add>
<ide> /**
<ide> * Test that the associated entities are unlinked and deleted when they have a not nullable foreign key
<ide> * | 1 |
Javascript | Javascript | improve benchmark runner output | 520a3bd36555316ed84db5172e48f771973a683c | <ide><path>benchmarks/iframe_runner.js
<ide> BenchWarmer.prototype = {
<ide>
<ide> run: function() {
<ide> if (this.profile) {
<del> console.profile(this.emberPath + ": " + this.name);
<del> for (var i=0; i<1000; i++) {
<del> this.fn();
<del> }
<del> console.profileEnd(this.emberPath + ": " + this.name);
<del> if (this.next) { this.next.run(); }
<add> var self = this;
<add>
<add> var count = parseInt(this.profile, 10);
<add>
<add> setTimeout(function() {
<add> self.setup();
<add> console.profile(self.emberPath + ": " + self.name);
<add> for (var i=0; i<count; i++) {
<add> self.fn();
<add> }
<add> console.profileEnd(self.emberPath + ": " + self.name);
<add> self.teardown();
<add> if (self.next) { self.next.run(); }
<add> }, 1);
<ide> } else {
<ide> this.benchmark.run();
<ide> }
<ide><path>benchmarks/runner.js
<ide> function makeiframe(emberPath, suitePath, profile, callback) {
<ide> var iframe = jQuery("<iframe>").appendTo("body")[0];
<ide> var write = function(str) { iframe.contentDocument.write(str); };
<ide>
<add> var name = emberPath + ": " + suitePath;
<add> iframe.name = name;
<add>
<add> write("<title>" + name + "</title>");
<ide> write("<script src='../tests/jquery-1.7.1.js'></script>");
<ide> write("<script src='" + emberPath + "'></script>");
<ide> write("<script src='benchmark.js'></script>");
<ide> function makeiframe(emberPath, suitePath, profile, callback) {
<ide> var bench, before;
<ide>
<ide> var logger = function(string) {
<del> jQuery("[data-ember-path='" + emberPath + "']").html(string);
<add> jQuery("[data-ember-path='" + emberPath + "']").html(emberPath + ": " + string);
<ide> };
<ide>
<ide> setTimeout(function() {
<ide> function makeiframe(emberPath, suitePath, profile, callback) {
<ide> callback(bench);
<ide> }
<ide> });
<del> }, 1000);
<add> }, 2000);
<ide> }
<ide>
<ide> jQuery(function() { | 2 |
Javascript | Javascript | simplify test checks | 8ae10601b12b621f76be576918962fca25d52dd7 | <ide><path>test/integration/config/test/index.test.js
<ide> describe('Configuration', () => {
<ide>
<ide> await check(
<ide> () => browser.elementByCss('.hello-world').getComputedCss('font-size'),
<del> {
<del> test(content) {
<del> return content === '100px'
<del> },
<del> }
<add> '100px'
<ide> )
<ide> } finally {
<ide> if (browser) {
<ide> describe('Configuration', () => {
<ide> browser
<ide> .elementByCss('.hello-world')
<ide> .getComputedCss('background-color'),
<del> {
<del> test(content) {
<del> return content === 'rgba(0, 0, 255, 1)'
<del> },
<del> }
<add> 'rgba(0, 0, 255, 1)'
<ide> )
<ide> } finally {
<ide> if (browser) {
<ide> describe('Configuration', () => {
<ide> try {
<ide> browser = await webdriver(context.appPort, '/webpack-css')
<ide>
<del> await check(
<del> async () => {
<del> const pTag = await browser.elementByCss('.hello-world')
<del> const initialFontSize = await pTag.getComputedCss('font-size')
<del> return initialFontSize
<del> },
<del> {
<del> test(content) {
<del> return content === '100px'
<del> },
<del> }
<del> )
<add> await check(async () => {
<add> const pTag = await browser.elementByCss('.hello-world')
<add> const initialFontSize = await pTag.getComputedCss('font-size')
<add> return initialFontSize
<add> }, '100px')
<ide>
<ide> const pagePath = join(
<ide> __dirname,
<ide> describe('Configuration', () => {
<ide> await check(
<ide> () =>
<ide> browser.elementByCss('.hello-world').getComputedCss('font-size'),
<del> {
<del> test(content) {
<del> return content === '200px'
<del> },
<del> }
<add> '200px'
<ide> )
<ide> } finally {
<ide> // Finally is used so that we revert the content back to the original regardless of the test outcome
<ide> describe('Configuration', () => {
<ide> await check(
<ide> () =>
<ide> browser.elementByCss('.hello-world').getComputedCss('font-size'),
<del> {
<del> test(content) {
<del> return content === '100px'
<del> },
<del> }
<add> '100px'
<ide> )
<ide> }
<ide> } finally {
<ide> describe('Configuration', () => {
<ide> browser = await webdriver(context.appPort, '/webpack-css')
<ide> await check(
<ide> () => browser.elementByCss('.hello-world').getComputedCss('color'),
<del> {
<del> test(content) {
<del> return content === 'rgba(255, 255, 0, 1)'
<del> },
<del> }
<add> 'rgba(255, 255, 0, 1)'
<ide> )
<ide>
<ide> try {
<ide> file.replace('yellow', 'red')
<ide> await check(
<ide> () => browser.elementByCss('.hello-world').getComputedCss('color'),
<del> {
<del> test(content) {
<del> return content === 'rgba(255, 0, 0, 1)'
<del> },
<del> }
<add> 'rgba(255, 0, 0, 1)'
<ide> )
<ide> } finally {
<ide> file.restore()
<ide> await check(
<ide> () => browser.elementByCss('.hello-world').getComputedCss('color'),
<del> {
<del> test(content) {
<del> return content === 'rgba(255, 255, 0, 1)'
<del> },
<del> }
<add> 'rgba(255, 255, 0, 1)'
<ide> )
<ide> }
<ide> } finally {
<ide><path>test/integration/css/test/index.test.js
<ide> describe('CSS Support', () => {
<ide> browser.eval(
<ide> `window.getComputedStyle(document.querySelector('.red-text')).color`
<ide> ),
<del> {
<del> test(content) {
<del> return content === 'rgb(128, 0, 128)'
<del> },
<del> }
<add> 'rgb(128, 0, 128)'
<ide> )
<ide>
<ide> // ensure text remained | 2 |
Text | Text | fix typo in part-4-using-data.md | 9466f65afd75ce9557f183be226c70a4dccdfd51 | <ide><path>docs/tutorials/essentials/part-4-using-data.md
<ide> Redux action objects are required to have a `type` field, which is normally a de
<ide>
<ide> By default, the action creators generated by `createSlice` expect you to pass in one argument, and that value will be put into the action object as `action.payload`. So, we can pass an object containing those fields as the argument to the `postUpdated` action creator.
<ide>
<del>We also know that the reducer is responsible for determing how the state should actually be updated when an action is dispatched. Given that, we should have the reducer find the right post object based on the ID, and specifically update the `title` and `content` fields in that post.
<add>We also know that the reducer is responsible for determining how the state should actually be updated when an action is dispatched. Given that, we should have the reducer find the right post object based on the ID, and specifically update the `title` and `content` fields in that post.
<ide>
<ide> Finally, we'll need to export the action creator function that `createSlice` generated for us, so that the UI can dispatch the new `postUpdated` action when the user saves the post.
<ide> | 1 |
Mixed | Javascript | remove createelement and pre-allocate arrays | 1f3cf3c28d85b650d7b74022871e2a141c3fc3fb | <ide><path>docs/getting-started/v3-migration.md
<ide> Chart.js 3.0 introduces a number of breaking changes. Chart.js 2.0 was released
<ide> * `Chart.chart.chart`
<ide> * `Chart.types`
<ide> * `DatasetController.addElementAndReset`
<add>* `DatasetController.createMetaData`
<add>* `DatasetController.createMetaDataset`
<ide> * `Element.getArea`
<ide> * `Element.height`
<ide> * `Element.initialize`
<ide> Chart.js 3.0 introduces a number of breaking changes. Chart.js 2.0 was released
<ide> * `helpers.log10` was renamed to `helpers.math.log10`
<ide> * `Chart.Animation.animationObject` was renamed to `Chart.Animation`
<ide> * `Chart.Animation.chartInstance` was renamed to `Chart.Animation.chart`
<del>* `DatasetController.createMetaData` and `DatasetController.createMetaDataset` were replaced with `DatasetController.createElement`
<ide> * `DatasetController.updateElement` was renamed to `DatasetController.updateElements`
<ide> * `TimeScale.getLabelCapacity` was renamed to `TimeScale._getLabelCapacity`
<ide> * `TimeScale.tickFormatFunction` was renamed to `TimeScale._tickFormatFunction`
<ide><path>src/core/core.datasetController.js
<ide> function applyStack(stack, value, dsIndex, allOther) {
<ide> }
<ide>
<ide> function convertObjectDataToArray(data) {
<del> var keys = Object.keys(data);
<del> var adata = [];
<del> var i, ilen, key;
<add> const keys = Object.keys(data);
<add> const adata = new Array(keys.length);
<add> let i, ilen, key;
<ide> for (i = 0, ilen = keys.length; i < ilen; ++i) {
<ide> key = keys[i];
<del> adata.push({
<add> adata[i] = {
<ide> x: key,
<ide> y: data[key]
<del> });
<add> };
<ide> }
<ide> return adata;
<ide> }
<ide> helpers.extend(DatasetController.prototype, {
<ide> }
<ide> },
<ide>
<del> createElement: function(type) {
<del> return type && new type();
<del> },
<del>
<ide> /**
<ide> * @private
<ide> */
<ide> _dataCheck: function() {
<del> var me = this;
<del> var dataset = me.getDataset();
<del> var data = dataset.data || (dataset.data = []);
<add> const me = this;
<add> const dataset = me.getDataset();
<add> const data = dataset.data || (dataset.data = []);
<ide>
<ide> // In order to correctly handle data addition/deletion animation (an thus simulate
<ide> // real-time charts), we need to monitor these data modifications and synchronize
<ide> helpers.extend(DatasetController.prototype, {
<ide> },
<ide>
<ide> addElements: function() {
<del> var me = this;
<del> var meta = me._cachedMeta;
<del> var metaData = meta.data;
<del> var i, ilen, data;
<add> const me = this;
<add> const meta = me._cachedMeta;
<add> let i, ilen, data;
<ide>
<ide> me._dataCheck();
<ide> data = me._data;
<add> const metaData = meta.data = new Array(data.length);
<ide>
<ide> for (i = 0, ilen = data.length; i < ilen; ++i) {
<del> metaData[i] = metaData[i] || me.createElement(me.dataElementType);
<add> metaData[i] = new me.dataElementType();
<ide> }
<ide>
<del> meta.dataset = meta.dataset || me.createElement(me.datasetElementType);
<add> if (me.datasetElementType) {
<add> meta.dataset = new me.datasetElementType();
<add> }
<ide> },
<ide>
<ide> buildOrUpdateElements: function() {
<ide> helpers.extend(DatasetController.prototype, {
<ide> const vId = vScale.id;
<ide> const labels = iScale._getLabels();
<ide> const singleScale = iScale === vScale;
<del> const parsed = [];
<del> let i, ilen, item;
<add> const parsed = new Array(count);
<add> let i, ilen, index;
<ide>
<del> for (i = start, ilen = start + count; i < ilen; ++i) {
<del> item = {};
<del> item[iId] = singleScale || iScale._parse(labels[i], i);
<del> item[vId] = vScale._parse(data[i], i);
<del> parsed.push(item);
<add> for (i = 0, ilen = count; i < ilen; ++i) {
<add> index = i + start;
<add> parsed[i] = {
<add> [iId]: singleScale || iScale._parse(labels[index], index),
<add> [vId]: vScale._parse(data[index], index)
<add> };
<ide> }
<ide> return parsed;
<ide> },
<ide> helpers.extend(DatasetController.prototype, {
<ide> const {xScale, yScale} = meta;
<ide> const xId = xScale.id;
<ide> const yId = yScale.id;
<del> const parsed = [];
<del> let i, ilen, item;
<del> for (i = start, ilen = start + count; i < ilen; ++i) {
<del> item = data[i];
<del> parsed.push({
<del> [xId]: xScale._parse(item[0], i),
<del> [yId]: yScale._parse(item[1], i)
<del> });
<add> const parsed = new Array(count);
<add> let i, ilen, index, item;
<add>
<add> for (i = 0, ilen = count; i < ilen; ++i) {
<add> index = i + start;
<add> item = data[index];
<add> parsed[i] = {
<add> [xId]: xScale._parse(item[0], index),
<add> [yId]: yScale._parse(item[1], index)
<add> };
<ide> }
<ide> return parsed;
<ide> },
<ide> helpers.extend(DatasetController.prototype, {
<ide> const {xScale, yScale} = meta;
<ide> const xId = xScale.id;
<ide> const yId = yScale.id;
<del> const parsed = [];
<del> let i, ilen, item;
<del> for (i = start, ilen = start + count; i < ilen; ++i) {
<del> item = data[i];
<del> parsed.push({
<del> [xId]: xScale._parseObject(item, 'x', i),
<del> [yId]: yScale._parseObject(item, 'y', i)
<del> });
<add> const parsed = new Array(count);
<add> let i, ilen, index, item;
<add>
<add> for (i = 0, ilen = count; i < ilen; ++i) {
<add> index = i + start;
<add> item = data[index];
<add> parsed[i] = {
<add> [xId]: xScale._parseObject(item, 'x', index),
<add> [yId]: yScale._parseObject(item, 'y', index)
<add> };
<ide> }
<ide> return parsed;
<ide> },
<ide> helpers.extend(DatasetController.prototype, {
<ide> */
<ide> insertElements: function(start, count) {
<ide> const me = this;
<del> const elements = [];
<add> const elements = new Array(count);
<ide> const data = me._cachedMeta.data;
<ide> let i;
<ide>
<del> for (i = start; i < start + count; ++i) {
<del> elements.push(me.createElement(me.dataElementType));
<add> for (i = 0; i < count; ++i) {
<add> elements[i] = new me.dataElementType();
<ide> }
<ide> data.splice(start, 0, ...elements);
<add>
<ide> me._parse(start, count);
<ide> me.updateElements(data, start, count);
<ide> }, | 2 |
Javascript | Javascript | fix typo in fs.readfile of lying size=0 stat | 4eb2804db9e3bbc06ebda6296620ce5bf6f7b02e | <ide><path>lib/fs.js
<ide> fs.readFile = function(path, encoding_) {
<ide> fs.close(fd, function(er) {
<ide> if (size === 0) {
<ide> // collected the data into the buffers list.
<del> buffer = Buffer.concat(buffer.length, pos);
<add> buffer = Buffer.concat(buffers, pos);
<ide> }
<ide>
<ide> if (encoding) buffer = buffer.toString(encoding); | 1 |
Python | Python | handle no dagrun in dagruniddep | fa3872a89f8de5fc50f85d55d59bbd224109b9da | <ide><path>airflow/ti_deps/deps/dagrun_id_dep.py
<ide> def _get_dep_statuses(self, ti, session, dep_context=None):
<ide> """
<ide> dagrun = ti.get_dagrun(session)
<ide>
<del> if not dagrun.run_id or not match(f"{DagRunType.BACKFILL_JOB.value}.*", dagrun.run_id):
<add> if not dagrun or not dagrun.run_id or not match(f"{DagRunType.BACKFILL_JOB.value}.*", dagrun.run_id):
<ide> yield self._passing_status(
<del> reason=f"Task's DagRun run_id is either NULL "
<add> reason=f"Task's DagRun doesn't exist or the run_id is either NULL "
<ide> f"or doesn't start with {DagRunType.BACKFILL_JOB.value}")
<ide> else:
<ide> yield self._failing_status(
<ide><path>tests/ti_deps/deps/test_dagrun_id_dep.py
<ide> def test_dagrun_id_is_not_backfill(self):
<ide> dagrun.run_id = None
<ide> ti = Mock(get_dagrun=Mock(return_value=dagrun))
<ide> self.assertTrue(DagrunIdDep().is_met(ti=ti))
<add>
<add> def test_dagrun_is_none(self):
<add> """
<add> Task instances which don't yet have an associated dagrun.
<add> """
<add> ti = Mock(get_dagrun=Mock(return_value=None))
<add> self.assertTrue(DagrunIdDep().is_met(ti=ti)) | 2 |
Python | Python | update tf test_step to match train_step | 44eaa2b3036c5c9a83ed781e08e3dc50aae193a9 | <ide><path>src/transformers/modeling_tf_utils.py
<ide> def compile(
<ide> logger.warning(
<ide> "No loss specified in compile() - the model's internal loss computation will be used as the "
<ide> "loss. Don't panic - this is a common way to train TensorFlow models in Transformers! "
<del> "Please ensure your labels are passed as the 'labels' key of the input dict so that they are "
<add> "Please ensure your labels are passed as keys in the input dict so that they are "
<ide> "accessible to the model during the forward pass. To disable this behaviour, please pass a "
<ide> "loss argument, or explicitly pass loss=None if you do not want your model to compute a loss."
<ide> )
<ide> def test_step(self, data):
<ide> # the input dict (and loss is computed internally)
<ide> if y is None and "labels" in x:
<ide> y = x["labels"] # Stops confusion with metric computations
<add> elif y is None and "input_ids" in x:
<add> # Just make any kind of dummy array to make loss work
<add> y = tf.zeros(tf.shape(x["input_ids"])[0], dtype=tf.int64)
<ide> y_pred = self(x, training=False)
<ide> self.compiled_loss(y, y_pred, sample_weight, regularization_losses=self.losses)
<ide> # Updates stateful loss metrics. | 1 |
Javascript | Javascript | use rest param & reflect.apply in makecallback | 0a0fbd54983d0760167232511ae95c8ea4f614b3 | <ide><path>lib/fs.js
<ide> function makeCallback(cb) {
<ide> throw new errors.TypeError('ERR_INVALID_CALLBACK');
<ide> }
<ide>
<del> return function() {
<del> return cb.apply(undefined, arguments);
<add> return function(...args) {
<add> return Reflect.apply(cb, undefined, args);
<ide> };
<ide> }
<ide> | 1 |
Javascript | Javascript | update prerender test for windows | fd1636ae4dc2183018c6464d092eba0ee7fd69f2 | <ide><path>test/integration/prerender/pages/something.js
<ide> export async function unstable_getStaticProps({ params }) {
<ide> world: 'world',
<ide> params: params || {},
<ide> time: new Date().getTime(),
<add> random: Math.random(),
<ide> },
<ide> revalidate: false,
<ide> }
<ide> }
<ide>
<del>export default ({ world, time, params }) => {
<add>export default ({ world, time, params, random }) => {
<ide> return (
<ide> <>
<ide> <p>hello: {world}</p>
<ide> <span>time: {time}</span>
<add> <div>{random}</div>
<ide> <div id="params">{JSON.stringify(params)}</div>
<ide> <div id="query">{JSON.stringify(useRouter().query)}</div>
<ide> <Link href="/">
<ide><path>test/integration/prerender/test/index.test.js
<ide> const runTests = (dev = false) => {
<ide> )
<ide> const escapedBuildId = buildId.replace(/[|\\{}()[\]^$+*?.-]/g, '\\$&')
<ide>
<add> Object.keys(manifest.dynamicRoutes).forEach(key => {
<add> const item = manifest.dynamicRoutes[key]
<add>
<add> if (item.dataRouteRegex) {
<add> item.dataRouteRegex = normalizeRegEx(item.dataRouteRegex)
<add> }
<add> if (item.routeRegex) {
<add> item.routeRegex = normalizeRegEx(item.routeRegex)
<add> }
<add> })
<add>
<ide> expect(manifest.version).toBe(1)
<ide> expect(manifest.routes).toEqual(expectedManifestRoutes())
<ide> expect(manifest.dynamicRoutes).toEqual({
<ide><path>test/lib/next-test-utils.js
<ide> export function getBrowserBodyText(browser) {
<ide> }
<ide>
<ide> export function normalizeRegEx(src) {
<del> return new RegExp(src).source
<add> return new RegExp(src).source.replace(/\^\//g, '^\\/')
<ide> } | 3 |
Python | Python | update newton_raphson.py (#891) | 066f37402d87e31b35a7d9439b394445a1404461 | <ide><path>maths/newton_raphson.py
<ide> def newton_raphson(f, x0=0, maxiter=100, step=0.0001, maxerror=1e-6,logsteps=Fal
<ide> if error < maxerror:
<ide> break
<ide> else:
<del> raise ValueError("Itheration limit reached, no converging solution found")
<add> raise ValueError("Iteration limit reached, no converging solution found")
<ide> if logsteps:
<ide> #If logstep is true, then log intermediate steps
<ide> return a, error, steps
<ide> def newton_raphson(f, x0=0, maxiter=100, step=0.0001, maxerror=1e-6,logsteps=Fal
<ide> plt.xlabel("step")
<ide> plt.ylabel("error")
<ide> plt.show()
<del> print("solution = {%f}, error = {%f}" % (solution, error))
<ide>\ No newline at end of file
<add> print("solution = {%f}, error = {%f}" % (solution, error)) | 1 |
Python | Python | add test for scalar indexing | 2459acb46ddbb268114779e1428dba687ab5650d | <ide><path>numpy/core/tests/test_scalarmath.py
<ide> def test_int_from_long(self):
<ide> # val2 = eval(val_repr)
<ide> # assert_equal( val, val2 )
<ide>
<add>class TestIndexing(NumpyTestCase):
<add> def test_basic(self):
<add> for t in types:
<add> value = t(1)
<add> x = value
<add> assert_array_equal(x[...],value)
<add> assert_array_equal(x[()],value)
<add>
<ide> if __name__ == "__main__":
<ide> NumpyTest().run() | 1 |
PHP | PHP | fix fatal error caused by renamed method | 669e6d7a03af074ad876ad612390152abcb96530 | <ide><path>src/Shell/I18nShell.php
<ide> public function main() {
<ide> $choice = strtolower($this->in('What would you like to do?', ['E', 'H', 'Q']));
<ide> switch ($choice) {
<ide> case 'e':
<del> $this->Extract->execute();
<add> $this->Extract->main();
<ide> break;
<ide> case 'h':
<ide> $this->out($this->OptionParser->help()); | 1 |
Python | Python | fix success message [ci skip] | 2c9804038da8bf9d6b8032c31ef5cfff4e901718 | <ide><path>spacy/cli/init_pipeline.py
<ide> def init_vectors_cli(
<ide> nlp.to_disk(output_dir)
<ide> msg.good(
<ide> "Saved nlp object with vectors to output directory. You can now use the "
<del> "path to it in your config as the 'vectors' setting in [initialize.vocab].",
<add> "path to it in your config as the 'vectors' setting in [initialize].",
<ide> output_dir.resolve(),
<ide> )
<ide> | 1 |
PHP | PHP | reset eager loaders before triggering before find | 2d59a95bfef4195d7170288cdfcbc00b1a1c1095 | <ide><path>src/ORM/Query.php
<ide> public function applyOptions(array $options)
<ide> public function cleanCopy()
<ide> {
<ide> $clone = clone $this;
<add> $clone->setEagerLoader(clone $this->getEagerLoader());
<ide> $clone->triggerBeforeFind();
<ide> $clone->enableAutoFields(false);
<ide> $clone->limit(null);
<ide> public function cleanCopy()
<ide> $clone->formatResults(null, true);
<ide> $clone->setSelectTypeMap(new TypeMap());
<ide> $clone->decorateResults(null, true);
<del> $clone->setEagerLoader(clone $this->getEagerLoader());
<ide>
<ide> return $clone;
<ide> }
<ide><path>tests/TestCase/ORM/QueryRegressionTest.php
<ide> public function testCountWithBind()
<ide> $this->assertEquals(1, $count);
<ide> }
<ide>
<add> /**
<add> * Test count() with inner join containments.
<add> *
<add> * @return void
<add> */
<add> public function testCountWithInnerJoinContain()
<add> {
<add> $this->loadFixtures('Articles', 'Authors');
<add> $table = TableRegistry::get('Articles');
<add> $table->belongsTo('Authors')->setJoinType('INNER');
<add>
<add> $result = $table->save($table->newEntity([
<add> 'author_id' => null,
<add> 'title' => 'title',
<add> 'body' => 'body',
<add> 'published' => 'Y'
<add> ]));
<add> $this->assertNotFalse($result);
<add>
<add> $table->eventManager()
<add> ->on('Model.beforeFind', function (Event $event, $query) {
<add> $query->contain(['Authors']);
<add> });
<add>
<add> $count = $table->find()->count();
<add> $this->assertEquals(3, $count);
<add> }
<add>
<ide> /**
<ide> * Tests that bind in subqueries works.
<ide> * | 2 |
Text | Text | fix nits in report docs | 60aaf2c2144d68eae063e09b5216338341d83633 | <ide><path>doc/api/process.md
<ide> added: v11.8.0
<ide> reports for the current process. Additional documentation is available in the
<ide> [report documentation][].
<ide>
<del>## process.report.directory
<add>### process.report.directory
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> Node.js process.
<ide> console.log(`Report directory is ${process.report.directory}`);
<ide> ```
<ide>
<add>### process.report.filename
<add><!-- YAML
<add>added: REPLACEME
<add>-->
<add>
<add>* {string}
<add>
<add>Filename where the report is written. If set to the empty string, the output
<add>filename will be comprised of a timestamp, PID, and sequence number. The default
<add>value is the empty string.
<add>
<add>```js
<add>console.log(`Report filename is ${process.report.filename}`);
<add>```
<add>
<ide> ### process.report.getReport([err])
<ide> <!-- YAML
<ide> added: v11.8.0
<ide> console.log(data);
<ide>
<ide> Additional documentation is available in the [report documentation][].
<ide>
<del>## process.report.filename
<del><!-- YAML
<del>added: REPLACEME
<del>-->
<del>
<del>* {string}
<del>
<del>Filename where the report is written. If set to the empty string, the output
<del>filename will be comprised of a timestamp, PID, and sequence number. The default
<del>value is the empty string.
<del>
<del>```js
<del>console.log(`Report filename is ${process.report.filename}`);
<del>```
<del>
<del>## process.report.reportOnFatalError
<add>### process.report.reportOnFatalError
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> memory errors or failed C++ assertions.
<ide> console.log(`Report on fatal error: ${process.report.reportOnFatalError}`);
<ide> ```
<ide>
<del>## process.report.reportOnSignal
<add>### process.report.reportOnSignal
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> signal specified by `process.report.signal`.
<ide> console.log(`Report on signal: ${process.report.reportOnSignal}`);
<ide> ```
<ide>
<del>## process.report.reportOnUncaughtException
<add>### process.report.reportOnUncaughtException
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide> If `true`, a diagnostic report is generated on uncaught exception.
<ide> console.log(`Report on exception: ${process.report.reportOnUncaughtException}`);
<ide> ```
<ide>
<del>## process.report.signal
<add>### process.report.signal
<ide> <!-- YAML
<ide> added: REPLACEME
<ide> -->
<ide>
<ide> * {string}
<ide>
<ide> The signal used to trigger the creation of a diagnostic report. Defaults to
<del>`SIGUSR2`.
<add>`'SIGUSR2'`.
<ide>
<ide> ```js
<ide> console.log(`Report signal: ${process.report.signal}`);
<ide><path>doc/api/report.md
<ide> to intercept external triggers for report generation. Defaults to
<ide> `filename` specifies the name of the output file in the file system.
<ide> Special meaning is attached to `stdout` and `stderr`. Usage of these
<ide> will result in report being written to the associated standard streams.
<del>In cases where standard streams are used, the value in `'directory'` is ignored.
<add>In cases where standard streams are used, the value in `directory` is ignored.
<ide> URLs are not supported. Defaults to a composite filename that contains
<ide> timestamp, PID and sequence number.
<ide>
<ide> process.report.reportOnFatalError = true;
<ide> process.report.reportOnSignal = true;
<ide> process.report.reportOnUncaughtException = false;
<ide>
<del>// Change the default signal to `SIGQUIT` and enable it.
<add>// Change the default signal to 'SIGQUIT' and enable it.
<ide> process.report.reportOnFatalError = false;
<ide> process.report.reportOnUncaughtException = false;
<ide> process.report.reportOnSignal = true; | 2 |
Javascript | Javascript | add veggies to the showcase | cf8bb75c19e49db303c0477c08463e1af6017a4a | <ide><path>website/src/react-native/showcase.js
<ide> var apps = [
<ide> link: 'https://itunes.apple.com/cn/app/tong-xing-wang/id914254459?mt=8',
<ide> author: 'Ho Yin Tsun Eugene',
<ide> },
<add> {
<add> name: 'Veggies in Season',
<add> icon: 'https://s3.amazonaws.com/veggies-assets/icon175x175.png',
<add> link: 'https://itunes.apple.com/es/app/veggies-in-season/id1088215278?mt=8',
<add> author: 'Victor Delgado',
<add> },
<ide> {
<ide> name: 'WEARVR',
<ide> icon: 'http://a2.mzstatic.com/eu/r30/Purple69/v4/4f/5a/28/4f5a2876-9530-ef83-e399-c5ef5b2dab80/icon175x175.png', | 1 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.