content
stringlengths
0
557k
url
stringlengths
16
1.78k
timestamp
timestamp[ms]
dump
stringlengths
9
15
segment
stringlengths
13
17
image_urls
stringlengths
2
55.5k
netloc
stringlengths
7
77
Time Indicator Time Indicator element is a line indicating the current time displayed in the Day View, Work-Week View, Full Week View and Timeline View views. The line can go across the entire view, or be restricted to a current date, as specified by the SchedulerTimeIndicatorDisplayOptions.Visibility property. The table below lists the main properties affecting the element’s appearance. See Also Feedback
https://docs.devexpress.com/WPF/114791/controls-and-libraries/scheduler-legacy/visual-elements/scheduler-control/time-indicator
2021-09-17T01:08:11
CC-MAIN-2021-39
1631780053918.46
[]
docs.devexpress.com
Recommended Utility RAIL Recommended Utility This optional software component can be enabled to include a recommended collection of other optional RAIL utilities. This RAIL utility attempts to identify those RAIL utilities most commonly used by applications. If a list of utilities different from what is auto-selected here is sought, this component must be removed from the project with the custom list manually being added to the project. Common RAIL utilities recommended for general use include: - Initialization - Callbacks - Protocol - Power Amplifier (PA) - Packet Trace Interface (PTI) - Received Signal Strength Indicator (RSSI) - Direct Memory Access (DMA) - This primarily accelerates radio initialization, so it is currently included only for dynamic multiprotocol (DMP) scenarios.
https://docs.silabs.com/rail/latest/rail-util-recommended
2021-09-17T00:58:03
CC-MAIN-2021-39
1631780053918.46
[]
docs.silabs.com
Filter: All Files Submit Search Enabling Peg Selection Mode. How to enable the Peg Selection Mode In the Tools toolbar, select the Transform tool or press Shift + T. In the Tool Properties view, click the Peg Selection Mode button. In the Camera view, select an element parented to a peg.
https://docs.toonboom.com/help/harmony-15/essentials/motion-path/enable-peg-selection-mode.html
2021-09-17T02:13:17
CC-MAIN-2021-39
1631780053918.46
[]
docs.toonboom.com
A string is any sequence of characters. The following values can all be treated as strings: String functions perform operations on text data. Based on column or string literal inputs, these functions render outputs that provide a subset of the input, convert to a different format or case, or compute metrics about the input. These functions are very useful for manipulating strings. For more information on these expressions, see Text Matching.
https://docs.trifacta.com/plugins/viewsource/viewpagesrc.action?pageId=109906331
2021-09-16T23:47:46
CC-MAIN-2021-39
1631780053918.46
[]
docs.trifacta.com
MathType Tab has disappeared from Microsoft Word 2007 for Windows PLEASE NOTE: These steps do not apply to Word for Mac, nor for Word for Windows except as noted here! Applicability The information on this page applies to: MathType 6.0 and later for WindowsWord 2007 for Windows This is a continuation of TechNote 133… If you have not first read and followed the steps in that article, please go to TechNote 133 and go through the process described there. Solution on Windows for Word 2007 Launch Microsoft Word. From the Office button, located at the top left corner of the Word window – – choose Word 1: If you're using MathType 6.0 through 6.9d, the first item will be named MathType Commands 6 for Word.dotm. Note 2: Depending on your version of MathType, you may also see a 3rd file listed, named MathPage.wll. If this file is present, it should be checked too. If the items from the previous step are not present, proceed to the next step. If they are present, the "MathType Commands" item should be checked (selected) if you're using MathType 7 and later. If you're using MathType 6.0 through 6.9d, all items should be checked. If the items from step 5 are not present, that's OK. Please try this: Click the Add button. This will open a window titled Add Template. In the window, navigate to one of these paths: C:\Program Files\MathType\Office Support C:\Program Files\MathType\Office Support\32 C:\Program Files (x86)\MathType\Office Support C:\Program Files (x86)\MathType\Office Support\32 You will see the following items in the folder (among others): MathType Commands 6 for Word.dotm (if not using MathType 7 or later) MathType Commands.dotm (if using MathType 7 or later) WordCmds.dot Select one of them at a time, and click Open. This will bring it into the Add-Ins box. You will then need to click Add one more time to bring in the other add-in (either MathType Commands….dotm or WordCmds.dot). If all 3 files shown above are now present, with a checkmark beside them, click OK. If MathPage.wll is not there, click Add, go to one of these 4 paths (if the first one isn't on your computer, go to the next one in order, etc.), change Files of type to Word Add-ins (*.wll), click to select MathPage.wll, and click Open: C:\Program Files\MathType\MathPage C:\Program Files\MathType\MathPage\32 C:\Program Files (x86)\MathType\MathPage C:\Program Files (x86)\MathType\MathPage\32.dotm to have a checkmark.) This should restore the MathType tab in Word. Sometimes it takes a few seconds to load it in. If you have meticulously worked through this process and the MathType tab is still not on the ribbon in Word 2007, 2007
https://docs.wiris.com/en/mathtype/office_tools/support_notices/tn133-word2007?do=login&sectok=899f0d7fe4e3ef93d3462ef6b1f2a1b3
2021-09-17T01:25:29
CC-MAIN-2021-39
1631780053918.46
[]
docs.wiris.com
Ontraport's new email editor OntraMail does not support Deadline Funnel email timers. The only supported method is to use the Deadline Funnel animated email countdown timer using the legacy Ontraport RAW HTML email editor. Video Tutorial: Add the Email Countdown in the RAW HTML Editor 1. In the Deadline Funnel admin, navigate to Campaigns > Edit Campaign > Emails > Email Timer Code and click to copy the HTML Code you'll need to add the email countdown to your emails: 2. In the Ontraport HTML Email editor, click 'Source' to edit your raw HTML: 3. Copy and paste the Deadline Funnel HTML Code into the Ontraport Source box where you want the animated countdown to appear and click 'Save': 4. Click 'Preview' to see a preview of your email with the animated countdown timer:
https://docs.deadlinefunnel.com/en/articles/4160381-how-to-add-email-countdown-code-to-ontraport
2021-09-16T23:54:21
CC-MAIN-2021-39
1631780053918.46
[array(['https://downloads.intercomcdn.com/i/o/229374901/a130973d5e71bae46cb1f1a8/image.png', None], dtype=object) array(['https://downloads.intercomcdn.com/i/o/229376206/a26174a317f637261974cfd9/image.png', None], dtype=object) array(['https://downloads.intercomcdn.com/i/o/229389286/66b2bd430c2b1cd44aae03fb/image.png', None], dtype=object) array(['https://downloads.intercomcdn.com/i/o/229390975/02f898c8976df7413aff4ffa/image.png', None], dtype=object) array(['https://downloads.intercomcdn.com/i/o/229392094/b86a9d7ec30cd1f4c3eea1a4/image.png', None], dtype=object) array(['https://downloads.intercomcdn.com/i/o/229392353/cb55827d33591ccc147cfb32/image.png', None], dtype=object) ]
docs.deadlinefunnel.com
About Any, click File > Import > Anypoint Studio > Anypoint Connector Project from External Location, choose a URL or a .zip file, and complete the wizard to locate and import the project. Determine resource access - Each resource has a different access method, such as REST, SOAP, FTP, or the Java SDK features. Choose an authentication mechanism - Mule supports OAuth V1 or V2, and username and password authentication (known as connection management), which can be used for @ attributes to help others understand the features and use of your connector. - DevKit Features Features DevKit provides: Visual design and implementation using Anypoint Studio with an Eclipse-based interface that simplifies and speeds up development. Maven support. Connector packaging tools. Authentication support for multiple types of authentication, including OAuth and username and password authentication. DataSense support to acquire remote metadata. Extensive testing capability. Examples, training, and support to simplify development startup. Batch, Query Pagination, and DataSense Query Language support. DevKit is a annotations-based tool, with a wide set of available annotations to support its features. For a complete list of DevKit annotations, see the Annotation Reference. What is a Connector? An Anypoint connector is an extension module that eases the interaction between a Mule application and external resources, such as Anypoint Platform. Connector Architecture Connectors operate within Mule applications, which are built up from Mule Flows, and external resources, which are the targeted resources. A Mule connector has two operational sides. The Mule-facing side communicates with a resource’s target-facing client side to enable content to travel between the Mule applications, and the external target-facing resource. Mule-Facing Functionality From the Mule-facing side, a connector consists of: Main Java class. Java code that you annotate with the @Connector attribute. See the Annotation Reference for information about Anypoint Connector DevKit annotations. See Java annotations for information on how annotations work. Connector attributes. Properties of the @Connector class that you annotate with the @Configurable attribute. Methods. Functionality that you annotate with the @Processor attribute. Additional annotations define authentication-related functionality, such as connection management. Annotations allow you to control the layout of the Anypoint Studio dialogues for the connector as well. The data model and exceptions that either raise or propagate are also Mule-facing classes. DevKit generates a scaffold connector when you create your Anypoint Connector project in Studio. This scaffold connector includes the @Connector class, the @Configurable attributes, the @Processor methods, and authentication logic to build out your connector. Target-Facing Functionality The target facing or client facing side of a connector depends on the client technology that enables access to the resource. This functionality consists of a class library and one or more classes that @Connector classes use to access client functionality. This functionality is called the client class. The client class in turn generally depends on other classes to actually implement calls to the targeted resource. Depending on your target, some of these classes may be generated or provided for you. For example, if you have a Java client library, or are working with a SOAP or REST services, most of the client code is implemented there. In other cases, you have to write the code yourself. Coding a Connector DevKit lets you build connectors from scratch. Before creating your own connector, check the Anypoint Exchange tools that helps building a connector using a WSDL file. Code Sample The following is an example connector that Anypoint Studio creates for you as a starting point: /** * (c) 2003-2015 MuleSoft, Inc. The software in this package * is published under the terms of the CPAL v1.0 license, * a copy of which has been included with this distribution * in the LICENSE.md file. */ package org.mule.modules.myproject; import org.mule.api.annotations.ConnectionStrategy; import org.mule.api.annotations.Connector; import org.mule.api.annotations.Configurable; import org.mule.api.annotations.Processor; import org.mule.api.annotations.param.Default; import org.mule.modules.myproject.strategy.ConnectorConnectionStrategy; /** * Anypoint Connector * * @author MuleSoft, Inc. */ @Connector(name="my-project", schemaVersion="1.0", friendlyName="MyProject") public class MyProjectConnector { /** * Configurable */ @Configurable @Default("value") private String myProperty; @ConnectionStrategy ConnectorConnectionStrategy connectionStrategy; /** * Custom processor * * {@sample.xml ../../../doc/my-project-connector.xml.sample my-project:my-processor} * * @param content Content to be processed * @return Some string */ @Processor public String myProcessor(String content) { /* * MESSAGE PROCESSOR CODE GOES HERE */ return content; } /** * Set property * * @param myProperty My property */ public void setMyProperty(String myProperty) { this.myProperty = myProperty; } /** * Get property */ public String getMyProperty() { return this.myProperty; } public ConnectorConnectionStrategy getConnectionStrategy() { return connectionStrategy; } public void setConnectionStrategy(ConnectorConnectionStrategy connectionStrategy) { this.connectionStrategy = connectionStrategy; } }
https://docs.mulesoft.com/connector-devkit/3.6/
2021-09-17T01:55:10
CC-MAIN-2021-39
1631780053918.46
[array(['_images/devkitoverviewarchitecture.png', 'DevKitOverviewArchitecture'], dtype=object) array(['_images/devkitoverviewarchitecture.png', 'DevKitOverviewArchitecture'], dtype=object)]
docs.mulesoft.com
Perform these steps if you want to turn off external authentication and revert to using the internal authentication LDAP in Apigee Edge. - Open /opt/apigee/customer/application/management-server.propertiesin a text editor. - Set the conf_security_authentication.user.storeproperty to ldap. conf_security_authentication.user.store=ldap - OPTIONALLY, only applicable if you were using a non-email address username or a different password in your external LDAP for your sysadmin user. Follow the steps you previously followed in Configuration required for different sysadmin credentials, but substitute the external LDAP username with your Apigee Edge sysadmin user's email address. - Restart the Management Server: /opt/apigee/apigee-service/bin/apigee-service edge-management-server restart - Verify that the server is running: /opt/apigee/apigee-service/bin/apigee-all status - Important: An Edge organization administrator must take the following actions after external authentication is turned off: - Make sure there are no users in Apigee Edge that should not be there. You need to manually remove those users. - Communicate to users that because the external authentication has been turned off, they need to either start using whatever their original password was (if they remember) or complete a "forgot password" process in order to log in.
https://docs.apigee.com/private-cloud/v4.51.00/disabling-external-authentication
2021-09-17T01:48:09
CC-MAIN-2021-39
1631780053918.46
[]
docs.apigee.com
November 2020 These features and Databricks platform improvements were released in November 2020. Note Releases are staged. Your Databricks account may not be updated until a week or more after the initial release date. Databricks Runtime 6.6 series support ends November 26, 2020 Support for Databricks Runtime 6.6, Databricks Runtime 6.6 for Machine Learning, and Databricks Runtime 6.6 for Genomics ended on November 26. See Databricks runtime support lifecycle. MLflow Model Registry GA November 18 - December 1, 2020: Version 3.33 MLflow Model Registry is now GA. Several improvements have been made since Model Registry was released for Public Preview: - Audit logging for actions on model registry objects. Actions in Model Registry are now captured in audit logs. See the mlflowModelRegistryentry in the Request parameters table for the logged actions and parameters. - Comments for model versions. You can now add comments to model versions, allowing you to use Model Registry for team discussions to help manage your model productionization pipeline. - Tags on models and model versions. You can create tags for models and model versions, and search for them using the API. - Improvements to the URL of the registered models page. The URL of this page now keeps its history, so you can navigate with the browser back and forward buttons as you make queries and view models from this page. You can also share the URL with colleagues who will see the same view. Filter experiment runs based on whether a registered model is associated November 18 - December 1, 2020: Version 3.33 When viewing runs for an experiment, you can now filter runs based on whether they created a model version or not. For more information, see Filter runs. Partner integrations gallery now available through the Data tab November 18 - December 1, 2020: Version 3.33 The Partner Integrations gallery has moved from the Account menu to the Add Data tab. For more information, see Partner data integrations. Cluster policies now use allowlist and blocklist as policy type names November 18 - December 1, 2020: Version 3.33 Cluster policies now use “allowlist” and “blocklist” as policy types, replacing “whitelist” and “blacklist.” See Cluster policy definitions. Note that this was originally announced as a version 3.31 feature, which was incorrect. Automatic retries when the creation of a job cluster fails Important This update was reverted following the release of version 3.33. November 18 - December 1, 2020: Version 3.33 Databricks now automatically retries the creation of job clusters when specific recoverable errors occur. Job runs remain in RunLifeCycleState: PENDING until successful cluster launch. Each attempt has a different cluster_id and name. When cluster creation succeeds, the run transitions to RunLifeCycleState: RUNNING. Databricks SQL (Public Preview) November 18, 2020 Databricks is pleased to introduce Databricks SQL, an intuitive environment for running ad-hoc queries and creating dashboards on data stored in your data lake. Databricks SQL empowers your organization to operate a multi-cloud lakehouse architecture that provides data warehousing performance with data lake economics while providing a delightful SQL analytics user experience. Databricks SQL: - Integrates with the BI tools you use today, like Tableau and Microsoft Power BI, to query the most complete and recent data in your data lake. - Complements existing BI tools with a SQL-native interface that allows data analysts and data scientists to query data lake data directly within Databricks. - Enables you to share query insights through rich visualizations and drag-and-drop dashboards with automatic alerting for important data changes. - Uses SQL endpoints to bring reliability, quality, scale, security, and performance to your data lake, so you can run traditional analytics workloads using your most recent and complete data. See the Databricks SQL guide for details. Contact your Databricks representative to request access. Web terminal available on Databricks Community Edition November 5, 2020: Version 3.32 Web terminal is now enabled by default on Databricks Community Edition. For more information, see Web terminal. Single Node clusters now support Databricks Container Services November 4-10, 2020: Version 3.32 You can now use Databricks Container Services on Single Node clusters. For more information, see Single Node clusters and Customize containers with Databricks Container Services. Databricks Runtime 7.4 GA November 3, 2020 Databricks Runtime 7.4, Databricks Runtime 7.4 ML, and Databricks Runtime 7.4 for Genomics are now generally available. For information, see the full release notes at Databricks Runtime 7.4 (Unsupported), Databricks Runtime 7.4 for Machine Learning (Unsupported), and Databricks Runtime 7.4 for Genomics (Unsupported). Databricks JDBC driver update November 3, 2020 A new version of the Databricks JDBC driver has been released. The new version contains a number of bug fixes, most notably, the driver now returns the correct number of modified rows from DML operations when it is provided by Databricks Runtime. Databricks Connect 7.3 (Beta) November 3, 2020 Databricks Connect 7.3 is now available as a Beta release.
https://docs.databricks.com/release-notes/product/2020/november.html
2021-09-17T00:44:59
CC-MAIN-2021-39
1631780053918.46
[]
docs.databricks.com
Storage Stratis 2.1.0 The latest version of the Stratis local storage management utility now supports per-pool encryption of devices that form a pool data tier. It is possible to encrypt the pool or to activate the pool’s individual encrypted devices using a key in the kernel keyring. The stratisd daemon of version 2.1.0 provides the following new D-Bus interfaces: org.storage.stratis2.manager.r1- Provides an extended CreatePoolmethod to support an optional argument for encryption. Also, it supplies a number of methods for key management. org.storage.stratis2.pool.r1- Supports explicit initialization of a cache tier. Also, it supports a new Encryptedproperty. org.storage.stratis2.FetchProperties.r1- Supports an additional HasCacheproperty. org.storage.stratis2.Report.r1- Supports a set of ad-hoc reports about Stratis. The interface and the names by which the reports can be accessed are not stable. Any report is only in the JSON format. The stratis command-line utility of version 2.1.0, requires stratisd of the same version. Users can observe the following changes in stratis: The command for creating pools now allows also encryption. New pool init_cachecommand for initializing a cache. keyis a new sub-command for key management tasks. reportis a new sub-command for displaying of reports generated by stratisd. The output of the pool listcommand now includes a Properties column. Each entry in this column is a string encoding the following properties of the pool: Whether or not it has a cache. Whether or not it is encrypted. All commands now verify that stratisis communicating with a compatible version of stratisd. If stratisdis of incompatible version, stratiswill fail with an appropriate error. The following are significant implementation details: Each block device in an encrypted pool’s data tier is encrypted with a distinct, randomly chosen Media Encryption Key (MEK) on initialization. All devices from a single encrypted pool share a single passphrase that is supplied through the kernel keyring. This release requires the cryptsetuputility of version 2.3. Storage Instantiation Daemon has been introduced Storage Instantiation Daemon (SID) provides a system-level infrastructure for convenient handling of storage-device-related events through modules provided by other developers. Fedora 33 introduces a package with SID. At first, this daemon will be disabled by default and will provide limited functionality. Further Fedora updates will enhance the SID functionality. The general theme running across benefits of this Fedora update is centralization of solutions that address storage issues with udev. This change brings the following benefits: Identifying specific Linux storage devices and their dependencies Collecting information and state tracking Central infrastructure for storage event processing Improving recognition of the storage events and their sequences Centralized solution for delayed actions on storage devices and groups of devices Single notion of device readiness shared among various storage subsystems Enhanced possibilities to store and retrieve storage-device-related records when compared to the udevdatabase Centralized solution for scheduling triggers with associated actions defined on groups of storage devices Direct support for generic device grouping dmraid-activation.service no longer depends on systemd-udev-settle.service The dmraid package is necessary for supporting firmware-based Redundant Array of Independent Disks (RAID) sets of non-Intel® systems and Fedora only support these RAID sets when they are already configured in BIOS during the OS installation. The dmraid package provides the dmraid-activation.service that required an obsoleted service systemd-udev-settle.service in the default Fedora installation. The systemd-udev-settle.service service waited a long time for detection of all devices. As a result, a system booting was significantly prolonged. To solve this problem, dmraid-activation.service now disables itself if no supported RAID sets are found when the service runs for the first time. Fedora Workstation now uses Btrfs by default The default partitioning scheme on Fedora Workstation now uses Btrfs. See Distribution-wide Changes for more information.
https://docs.fedoraproject.org/bn/fedora/f33/release-notes/sysadmin/Storage/
2021-09-17T00:39:14
CC-MAIN-2021-39
1631780053918.46
[]
docs.fedoraproject.org
Date: Mon, 29 Dec 2003 15:22:48 -0600 From: Troy <[email protected]> To: [email protected] Subject: Fuzzy out of focus console w/new flatscreen monitor on bootup Message-ID: <[email protected]> Next in thread | Raw E-Mail | Index | Archive | Help I am using GRUB to boot into Freebsd 4.9 Stable. I recently purchased a new flatscreen monitor and when I boot up the console appears to be very fuzzy. I installed the NVIDIA driver and put the proper settings into the XF86Config and it's clear and in 1600x1200 for X windows. While researching, I found that in Linux, folks have used a switch in grub vga=795 when booting their kernel that gets the console to be much clearer. It doesn't appear that Freebsd supports that switch in grub. Any way to get the equiv switch so the console is clean? Thanks, -Troy Want to link to this message? Use this URL: <>
https://docs.freebsd.org/cgi/getmsg.cgi?fetch=585659+0+archive/2003/freebsd-questions/20031231.freebsd-questions
2021-09-17T00:14:50
CC-MAIN-2021-39
1631780053918.46
[]
docs.freebsd.org
Manage notebooks Note The CLI feature is unavailable on Databricks on Google Cloud as of this release. You can manage notebooks using the UI, the CLI, and by invoking the Workspace API. This article focuses on performing notebook tasks using the UI. For the other methods, see Databricks CLI and Workspace API. Create a notebook Create a notebook in any folder You can create a new notebook in any folder (for example, in the Shared folder) following these steps: In the sidebar, click Workspace. Do one of the following: Next to any folder, click the on the right side of the text and select Create > Notebook. In the workspace or a user folder, click and select Create > Notebook. Follow steps 2 through 4 in Use the Create button. Open a notebook In your workspace, click a . The notebook path displays when you hover over the notebook title. Delete a notebook See Folders and Workspace object operations for information about how to access the workspace menu and delete notebooks or other items in the workspace. Copy notebook path To copy a notebook file path without opening the notebook, right-click the notebook name or click the to the right of the notebook name and select Copy File Path. Rename a notebook To change the title of an open notebook, click the title and edit inline or click File > Rename. Control access to a notebook If your Databricks account has the Databricks Premium Plan, you can use Workspace access control to control who has access to a notebook. Notebook external formats Databricks supports several notebook external formats: - Source file: A file containing only source code statements with the extension .scala, .py, .sql, or .r. - HTML: A Databricks notebook with the extension .html. - DBC archive: A Databricks archive. - RMarkdown: An R Markdown document with the extension .Rmd. In this section: Import a notebook You can import an external notebook from a URL or a file. You can also import a ZIP archive of notebooks exported in bulk from a Databricks workspace. Click Workspace in the sidebar. Do one of the following: Next to any folder, click the on the right side of the text and select Import. In the Workspace or a user folder, click and select Import. Specify the URL or browse to a file containing a supported external format or a ZIP archive of notebooks exported from a Databricks workspace. Click Import. - If you choose a single notebook, it is exported in the current folder. - If you choose a DBC or ZIP archive, its folder structure is recreated in the current folder and each notebook is imported. Export a notebook In the notebook toolbar, select File > Export and a format. Export all notebooks in a folder Note When you export a notebook as HTML, IPython notebook, or archive (DBC), and you have not cleared the results, the results of running the notebook are included. To export all folders in a workspace folder as a ZIP archive: - Click Workspace in the sidebar. Do one of the following: - Next to any folder, click the on the right side of the text and select Export. - In the Workspace or a user folder, click and select Export. - Select the export format: - DBC Archive: Export a Databricks archive, a binary format that includes metadata and notebook command results. - Source File: Export a ZIP archive of notebook source files, which can be imported into a Databricks workspace, used in a CI/CD pipeline, or viewed as source files in each notebook’s default language. Notebook command results are not included. - HTML Archive: Export a ZIP archive of HTML files. Each notebook’s HTML file can be imported into a Databricks workspace or viewed as HTML. Notebook command results are included. Auto-eviction is enabled by default. To disable auto-eviction for a cluster, set the Spark property spark.databricks.chauffeur.enableIdleContextTracking false. Attach a notebook to a cluster To attach a notebook to a cluster, you need the Can Attach To cluster-level permission. Important As long as a notebook is attached to a cluster, any user with the Can Run permission on the notebook has implicit permission to access. You can also detach notebooks from a cluster using the Notebooks tab on the cluster details page. When you detach a notebook from a cluster, the execution context is removed and all computed variable values are cleared from the notebook. Tip Databricks recommends that you detach unused notebooks from a cluster. This frees up memory space on the driver. View all notebooks attached to a cluster The Notebooks tab on the cluster details page displays all of the notebooks that are attached to a cluster. The tab also displays the status of each attached notebook, along with the last time a command was run from the notebook. Schedule a notebook To schedule a notebook job to run periodically: In the notebook, click at the top right. If no jobs exist for this notebook, the Schedule dialog appears. If jobs already exist for the notebook, the Jobs List dialog appears. To display the Schedule dialog, click Add a schedule. In the Schedule dialog, optionally enter a name for the job. The default name is the name of the notebook. Select Manual to run your job only when manually triggered, or Scheduled to define a schedule for running the job. If you select Scheduled, use the drop-downs to specify the frequency, time, and time zone. In the Cluster drop-down, select the cluster to run the task. If you have Allow Cluster Creation permissions, by default the job runs on a new job cluster. To edit the configuration of the default job cluster, click Edit at the right of the field to display the cluster configuration dialog. If you do not have Allow Cluster Creation permissions, by default the job runs on the cluster that the notebook is attached to. If the notebook is not attached to a cluster, you must select a cluster from the Cluster drop-down. Optionally, enter any Parameters to pass to the job. Click Add and specify the key and value of each parameter. Parameters set the value of the notebook widget specified by the key of the parameter. Use Task parameter variables to pass a limited set of dynamic values as part of a parameter value. Optionally, specify email addresses to receive Email Alerts on job events. See Alerts. Click Submit. Manage scheduled notebook jobs To display jobs associated with this notebook, click the Schedule button. The jobs list dialog displays, showing all jobs currently defined for this notebook. To manage jobs, click at the right of a job in the list. From this menu, you can edit, clone, view, pause, resume, or delete a scheduled job. When you clone a scheduled job, a new job is created with the same parameters as the original. The new job appears in the list with the name “Clone of <initial job name>”. How you edit a job depends on the complexity of the job’s schedule. Either the Schedule dialog or the Tasks tab of the Jobs page displays, allowing you to edit the schedule, cluster, parameters, and so on. Distribute.
https://docs.gcp.databricks.com/notebooks/notebooks-manage.html
2021-09-17T00:24:52
CC-MAIN-2021-39
1631780053918.46
[array(['../_images/copy-file-path.png', 'Copy notebook path'], dtype=object) array(['../_images/notebook-context-evicted.png', 'Notebook context evicted'], dtype=object) array(['../_images/notebook-detached.png', 'Notebook detached'], dtype=object) array(['../_images/notebooks.png', 'Cluster details attached notebooks'], dtype=object) array(['../_images/job-list-menu.png', 'Job list menu'], dtype=object)]
docs.gcp.databricks.com
Concurrent CDI Bean Loading Since Payara Server 5.184 It is possible to concurrently load CDI beans and potentially speed up an application’s load time by enabling Weld’s Multi-Threaded Bean Loading feature. Enable Concurrent CDI Bean Loading Concurrent CDI bean loading is configured on a per-configuration basis using the set on the cdi-service.enable-concurrent-deployment property like this: asadmin set configs.config.server-config.cdi-service.enable-concurrent-deployment=true It is possible to configure the number of pre-loader threads like this: asadmin set configs.config.server-config.cdi-service.pre-loader-thread-pool-size=4 When configuring the number of the pre-loader threads, use a sensible value that is consistent with the number of CPU cores available for the server instance, otherwise this setting may introduce performance degradation on the deployment times.
https://docs.payara.fish/community/docs/5.2021.1/documentation/payara-server/app-deployment/concurrent-cdi-bean-loading.html
2021-09-17T01:12:18
CC-MAIN-2021-39
1631780053918.46
[]
docs.payara.fish
About the Welcome Screen T-HFND-001-003 When you start Toon Boom Harmony, the Welcome screen appears. If a scene is already open, you can display the Welcome screen by selecting Help > Show Welcome Screen. The Welcome screen allows you to: - Create scenes - Choose your scene resolution - Quickly open recently edited scenes - Manage your scene resolution presets - Watch Harmony video tutorials - Open the online support page - Browse for and open a scene from your documents
https://docs.toonboom.com/help/harmony-14/essentials/project-creation/about-welcome-screen.html
2021-09-17T02:38:13
CC-MAIN-2021-39
1631780053918.46
[array(['../Resources/Images/HAR/Stage/GettingStarted/WelcomeScreeen-Essentials.png', None], dtype=object) ]
docs.toonboom.com
a GameObject, component or asset. The object obj is destroyed immediately after the current Update loop, or t seconds from now if a time is specified. If obj is a Component, this method removes the component from the GameObject and destroys it. If obj is a GameObject, it destroys the GameObject, all its components and all transform children of the GameObject. Actual object destruction is always delayed until after the current Update loop, but is always done before rendering. Note: When destroying MonoBehaviour scripts, OnDisable and OnDestroy are called before the script is removed..
https://docs.unity3d.com/2020.3/Documentation/ScriptReference/Object.Destroy.html
2021-09-17T01:49:17
CC-MAIN-2021-39
1631780053918.46
[]
docs.unity3d.com
You are viewing version 2.25 of the documentation, which is no longer maintained. For up-to-date documentation, see the latest version. Cloud Admin Quickstart (Preview) InfoArmory Cloud is currently in Early Access and available to a limited number of design partners. If you’re interested, request early access on the Armory Cloud page. Logging in to the Armory Cloud Console Armory provides a set of credentials to log in to the Cloud Console. These are separate from the user accounts that app developers use to access the Armory Enterprise UI and API. The Cloud Console is where you go to make changes to the Armory Cloud environment and what it has access to. For example, you can add deployment targets or other resources in the Cloud Console. When app developers log in to Armory Enterprise, they can use these resources in their deployment pipelines. As a design partner, Armory provides credentials for you to use for logging in to the Cloud Console. From here, you can select which environment you want to configure or open Armory Enterprise for that environment. Configuring the environment If you are setting up the Armory Cloud environment for the first time, you need the following: Required - AWS permissions that allow you to create AssumeRoles. Armory Cloud provides a Cloud Formation template for you to use to create the required AssumeRole. This AssumeRole gets used for actions like provisioning new infrastructure to bake an image. - Artifact sources you want to connect to. Artifacts are external JSON objects like an image, a file stored somewhere, or a binary blob in a bucket. - Deployment targets you want your app developers to have access to. - Secrets, such as AWS Secret Access Keys. Armory Cloud encrypts these after you add them. When configuring resources like deployment targets, you can reference the secrets by a name you assign. Optional - Packer files to perform customized image baking. To start, click Configure for the environment you want to add. Secrets For development and other non-production environments, you can choose to enter secrets in plaintext into the Cloud Console. The best practice, though, is to secure your secrets using the Secrets UI in the Armory Cloud Console. Secrets are values such as an access token for GitHub. Armory Cloud stores and transmits these secrets securely if they are entered into the Secrets UI. Once entered, they are encrypted and not visible in the Armory Cloud Console. to start: - need to provide values for things. Packer templates help you bake images that remain consistent, immutable, and repeatable, which gives your app developers the ability to deploy without worrying about unintended configuration drift and other common issues. By default, Armory ships a set of default Packer templates for Amazon Machine Images (AMI) that your app developers can use when crafting their delivery pipelines. You have the option of adding custom Packer template files to do more customized baking though. For general information about Packer templates, see Templates. Adding a custom Packer file is optional, and you can skip this section if you do not want to provide any custom Packer templates for your app developers. Any templates you do add here. Notifications Configure notifications to alert your app devs about certain pipeline events, such as pipeline completion or failure. Slack Before you can configure Slack notifications for Armory Cloud, you need to configure a bot user in Slack: - Go to. - Create a new app for your workspace. - Create a bot: - For OAuth & Permissions, assign the bot write permissions for chat. - By default, the bot uses the name you gave the app. You will use this name when configuring Slack notifications in Armory Cloud. - Although not required, Armory recommends that you set the bot to always be shown as online. - Install the app in your workspace and note the Bot User OAuth Token. Save this token somewhere safe. You need to add it to the Cloud Console as a secret. The token authorizes the bot to post messages to your Slack workspace in any channel it is invited to. - Invite the bot to your channels. After you complete the configuration in Slack, finish setting notifications in the Cloud Console: - In the Armory Cloud Console, go to the Secrets page. - Add the Bot User OAuth Token from Slack as a secret. Give it a descriptive name. - Go to Notifications > Slack. - Enable the feature. - Provide the display name for the Slack bot. This is the user name that sends notifications in Slack. It should match the name the bot has in Slack. - Select the Bot User OAuth Token from the list of secrets. Save the configuration. Accessing Armory Enterprise Armory Enterprise is where your app developers create their delivery pipelines. Once the environments are configured, your app developers can access Armory Cloud to start deploying applications. The URL to access the Armory platform is available on the Cloud Console homepage when you click Launch.
https://v2-25.docs.armory.io/docs/armory-cloud-preview/armory-cloud-admin/admin-cloud-quickstart/
2021-09-17T00:47:23
CC-MAIN-2021-39
1631780053918.46
[]
v2-25.docs.armory.io
Creating Reports Last updated on November 7, 2019 You can create a report and download it immediately as a .xlsx file if it contains 5,000 records. If the report contains 50,000 records, you will receive the .xlsx file via email. When creating a report you also have the option to schedule it. To create a report: On the left pane, click Reports. The Reports dialog appears. From here you can either click the add Add New Report button or click + Add New Report on the right pane. The Details tab appears on the right pane, complete the following fields: Click Download. The report is created and is now listed in the Downloads History list.
https://devint-docs.openx.com/publishers/reports-creating-reports-supply/
2021-09-17T00:47:35
CC-MAIN-2021-39
1631780053918.46
[]
devint-docs.openx.com
Description: Export the Item inventory from NetSuite and update it in Magento. This data flow uses the “novamodule Magento1 Inventory Export” saved search to obtain the Item inventory. This data flow retrieves the item quantity records which are updated since it's last run. Steps to Configure Advanced Settings: - Choose the saved Search - click on the refresh button and choose the “novamodule Magento1 Inventory Export” saved search and save the setting. - When the data flow runs for the first time OR the customer wants to update the inventory for the entire catalog, then check the “Check this check-box in order to export inventory levels for the entire catalog”. - If the customer wants to get the quantity from specific locations, then choose the locations in the field “Select one or more NetSuite locations to pick the inventory quantity from (Multi-Select)”. Click on refresh if no locations are found. Please sign in to leave a comment.
https://docs.celigo.com/hc/en-us/articles/115004860007-Inventory-Add-Update-Inventory-from-NetSuite-to-Magento
2021-09-17T01:05:29
CC-MAIN-2021-39
1631780053918.46
[array(['/hc/article_attachments/115008974527/Picture1.png', 'Picture1.png'], dtype=object) array(['/hc/article_attachments/115008974667/Picture2.png', 'Picture2.png'], dtype=object) ]
docs.celigo.com
Create and test a device account (Surface Hub) Creating a Surface Hub device account (also known as a resource account/room mailbox) allows the Surface Hub to receive, approve, or decline meeting requests and join meetings. Once the device account is provisioned on a Surface Hub, people can add this account to a meeting invitation the same way that they would invite a conference room. You can configure the device account during the Out-of-Box Experience (OOBE) setup. If needed, you can also change it later in Settings > Surface Hub > Accounts. Configuration overview This table explains the main steps and configuration decisions when you create a device account. Note The Surface Hub device account doesn't support third-party federated Identity Providers (IdPs) and must authenticate via Active Directory or Azure Active Directory. Detailed configuration steps We recommend setting up your Surface Hub device accounts using remote Windows PowerShell. Microsoft provides SkypeRoomProvisioningScript.ps1, a script that will help create new resource accounts, or validate existing resource accounts you have in order to help you turn them into compatible Surface Hub device accounts. If you prefer, you can choose an option from the table below, and follow the detailed PowerShell steps based on your organization deployment. Account verification and testing There are two methods available that you can use to validate and test a Surface Hub device account: SkypeRoomProvisioningScript.ps1 and the Surface Hub Hardware Diagnostic app. The account provisioning script can validate a previously-created device account using PowerShell from your PC. The Surface Hub Hardware Diagnostic app is installed on your Surface Hub and provides detailed feedback about signin and communication failures. Both are valuable tools to test newly created device accounts and should be used to ensure optimal account availability.
https://docs.microsoft.com/en-us/surface-hub/create-and-test-a-device-account-surface-hub
2021-09-16T23:46:46
CC-MAIN-2021-39
1631780053918.46
[]
docs.microsoft.com
You are viewing documentation for version: 3.2 | This version works with Simplicity Studio 5 only. If you have Simplicity Studio 4, switch to 2.13. | For additional versions, see Version History. sl_bt_evt_cte_receiver_connection_iq_reportCTE Receiver IQ sample report from connection CTE packets. Data Structures struct sl_bt_evt_cte_receiver_connection_iq_report_s Data structure of the connection_iq_report event. Macros #define sl_bt_evt_cte_receiver_connection_iq_report_id 0x004500a0 Identifier of the connection_iq_report event. Detailed Description IQ sample report from connection CTE packets. Data Structure Documentation ◆ sl_bt_evt_cte_receiver_connection_iq_report_s struct sl_bt_evt_cte_receiver_connection_iq_report_s Data structure of the connection_iq_report event. Data Fields uint16_t status Status of CTE IQ sampling uint8_t connection Connection handle or periodic advertising synchronization handle uint8_t phy The PHY on which the packet is received. 1: 1M PHY 2: 2M PHY uint8_t channel The channel on which the CTE packet was received int8_t rssi RSSI in the received CTE packet. Unit: dBm uint8_t rssi_antenna_id The ID of the antenna on which RSSI was measured uint8_t cte_type The CTE type 0: AoA CTE response 1: AoD CTE response with 1us slots 2: AoD CTE response with 2us slots uint8_t slot_durations Slot durations 1: Switching and sampling slots are 1 us each 2: Switching and sampling slots are 2 us each uint16_t event_counter The event counter of the connection uint8array samples IQ samples of the received CTE packet. I and Q samples follow each other alternately (I, Q, I, Q, ...)
https://docs.silabs.com/bluetooth/latest/group-sl-bt-evt-cte-receiver-connection-iq-report
2021-09-17T01:38:24
CC-MAIN-2021-39
1631780053918.46
[]
docs.silabs.com
You're viewing Apigee Edge documentation. View Apigee X documentation. Enroll in the beta release of the audience management feature To manage audiences, you must enroll in the beta release of the audience management feature. From the landing page of your portal, in the beta enrollment banner click Enroll. Understand audiences Using audiences, you can segment portal users or developer teams to control access to the following resources: - Pages in your portal - Published API products The following figure shows how audiences are used to control access to a set of resources. As shown in the figure, as an authenticated portal user, User A is able access the resources available through the public API program. In addition, as a member of Team A, User A inherits the entitlements from the Beta Users audience and is able to access the resources available through this restricted API program. The following sections describe how to manage audiences and configure audience entitlements, and the audiences that are available by default. About the default audiences The following two audiences are defined, by default. Explore the Audiences page To access the Audiences page: - Select Publish > Portals in the side navigation bar to display the list of portals. - Click the row of the portal for which you want to view audiences. - Click Audiences on the portal landing page. Alternatively, you can select Audiences in the portal drop-down in the top navigation bar. - Click the Audiences tab. The Audiences page is displayed. As highlighted in the figure, the Audiences page enables you to: - View details for all audiences, including: - Name - Description - Total number of team and individual portal user assignments - Creation date - Add an audience - Edit and delete an audience You can also manage the resource entitlements for an audience when creating resources and manage the default visibility for specific portal resources. Add an audience To add an audience: - Access the Audiences page. - Click +. - Enter the name and description of the audience. - Click OK. - Manage the assignments for an audience. Manage the assignments for an audience When adding or editing an audience, you can manage the developer teams and individual portal users assigned. To manage the assignments for an audience: - Access the Audiences page. - Click the row of the audience for which you want to manage audience assignments. To add a team or individual portal user to an audience: a. Click + in the Assignments section. b. Select one or more developer teams or individual portal users in the Add assignments dialog. Enter a string in the Search box to filter the list. Click All to select all items on the list or None to deselect all items. c. Click Add. To delete an audience assignment, click . Manage the resource entitlements for an audience When creating a portal page or publishing an API product, you can restrict access to the resource by assigning one or more audiences. When the visibility of that resource is set to Restricted Access, access is limited to the audiences to which the resource is assigned. For more information, see: When viewing and editing an audience, you can view the resource entitlements assigned. Edit an audience To edit an audience: - Access the Audiences page. - Click the row of the audience that you want to edit. - To edit the audience details: - In Audience details section, click . - Edit the name and description of the audience. - Click Save. - Manage the developer teams and individual portal users assigned to an audience. - In the Entitlements section, view the resource entitlements for an audience. See Manage the resource entitlements for an audience. Delete an audience To delete an audience: - Access the Audiences page. - Position the cursor over the audience that you want to delete to display the actions menu. - Click - Click Delete to confirm the operation.
https://docs.apigee.com/api-platform/publish/portal/portal-audience?hl=ca
2021-09-17T00:34:55
CC-MAIN-2021-39
1631780053918.46
[array(['https://docs.apigee.com/api-platform/images/audiences.png?hl=ca', None], dtype=object) array(['https://docs.apigee.com/api-platform/images/audiences-page.png?hl=ca', None], dtype=object) ]
docs.apigee.com
Holds all information that may be obtainable from a loot table roll, allowing for identifying key information. Not all parameters are present at all times, for obvious reasons. For example, information related to an entity will not be available if the loot table being rolled is the one for a block. It might be required for you to import the package if you encounter any issues (like casting an Array), so better be safe than sorry and add the import at the very top of the file. ZenScriptCopy import crafttweaker.api.loot.LootContext;
https://docs.blamejared.com/1.16/de/vanilla/api/loot/LootContext/
2021-09-17T01:07:03
CC-MAIN-2021-39
1631780053918.46
[]
docs.blamejared.com
Custom serialization in Silverlight Unlike in the desktop framework, Silverlight doesn't have a good extensibility point for replacing the serializer which is used to serialize / deserialize operation parameters (since the class DataContractSerializerOperationBehavior is not public in SL4). I wrote a post about how to replace the default serializer using the existing extensibility points in SL4 (endpoint behavior, message formatters). Check it out at!
https://docs.microsoft.com/en-us/archive/blogs/silverlightws/custom-serialization-in-silverlight
2021-09-17T01:35:57
CC-MAIN-2021-39
1631780053918.46
[]
docs.microsoft.com
This framework since version 6.0, it now follows the Eclipse framework hierarchy for managing your test assets. You no longer need to open .tst files one at a time. Instead, you can manage all your .tst files in projects within a workspace. A workspace corresponds to a directory on the local machine. SOAtest will ask you for the desired location of workspace at startup and will remember that location in future runs. When you start SOAtest 9.x, an Eclipse workspace is automatically created in <user_home_dir>/parasoft/workspace. For instance: /home/username/parasoft/workspace (Linux), C:\Users\username\parasoft\workspace (Windows). In the following sections, we will go through several ways of creating new projects that may be useful to existing SOAtest or WebKing users. Other ways of creating new projects, (e.g. from a WSDL), can be found in the SOAtest Tutorial. To ensure that tests can easily be shared across your organization, a designated team member—usually a team lead or manager—must decide which project setup strategy to use. The entire team should then adopt the same strategy. By default, SOAtest 9.x ships with CVS source control support. Support for additional source controls can be added by providing appropriate plugins to Eclipse. To create a project consisting of test suites that are checked into source control: .projectand .parasoftfiles into source control. They will be visible in the Navigator view and should be shared by the entire team. .metadatafolder into source control. We strongly recommend copying the old tests into your new workspace. Doing—but it will allow them to remain in the same location on your file system. This procedure is explained in Leaving Tests in the Original Location. To create a project that copies existing test suites to a new location on the file system: .projectfolder, and the .parasoftfiles into source control. They will be visible in the Navigator view and should be shared by the entire team. .metadatafolder into source control. To create a project that leaves existing test suites at the same location on the file system: Once a team member creates a project, that team member can create a Team Project Set File (.psf) that can then be shared with other members of the team. This allows every team member to create their Eclipse Projects in the same uniform way. This is a necessary step for importing tasks from the automated nightly test process. To create a Team Project Set File for a project created from CVS, complete the following: To create a Project from the Team Project Set File, complete the following: Existing SOAtest or WebKing users can import preferences from previous versions extensions in the installation directory of SOAtest or WebKing. To import existing preferences, complete the following: Parasoft SOAtest is based on the Eclipse IDE and has a different look and feel than older versions. Except for the changes outlined below, however, the user interface design layout, forms and settings have largely remained unchanged and should remain familiar to existing users. The Test Case Explorer can have multiple Eclipse projects open at the same time. Each project can have multiple Test Suites open at the same time. In previous versions of SOAtest, only one Test Suite could be open at any given time. At the top right corner of the Test Case Explorer are the following menu buttons: In previous versions, if you wanted to open the configuration panel for a test node (e.g., an "Editor"), you would select that node in the Tests tab. With SOAtest 9.x, you double-click on an item’s Test Case Explorer simultaneously. When an Editor is modified in SOAtest 9.x, an asterisk “*” displays on the Editor tab denoting that the Editor is now “dirty.” Modifications to the Editor must be explicitly saved using the Save toolbar button or the Ctrl-S keyboard shortcut. In previous versions of SOAtest (5.x and earlier), environments were displayed in a separate tab below the Tests tab. Environments are now part of the tree view in the Test Case Explorer. To run a test, you can right-click on the test’s node and select Test Using ‘Example Configuration’ from the shortcut menu.’ Alternatively, you can press F9 on your keyboard, or click the Test toolbar button. In previous versions of SOAtest and WebKing, you had to explicitly save test suite (. tst) files. In SOAtest 6.x and later, user actions in the Test Case Explorer are automatically saved. For instance, adding new Test to the Test Case Explorer will be automatically saved. Note that once tests are saved in the latest version of SOAtest, they cannot be opened in older versions of SOAtest. Failures that occur during the test execution now display in the Quality Tasks view. What was previously displayed in the Messages Log now displays in the Console view. If you have the appropriate source control plugins installed into the Eclipsepace\mylocalsettings.properties" The -publish argument will add reports to Development Testing Platform (DTP) so that test and analysis data can be merged, correlated, and analyzed to expose deeply hidden defect patterns. As the DTP processes data, it creates actionable findings that can be downloaded and imported into your IDE (requires DTP 5.3.x or later). You can also use the -publishteamserver option to publish reports to Team Server, which provides backwards compatibility with Concerto and older versions of DTP. For a detailed list of changes, see the topic on Command Line Interface Migration. There have been upgrades to the HP QC Integration from 6.2 to 9.x. The steps to connect the two products must be redone in order to ensure that the correct behavior is continuing.
https://docs.parasoft.com/plugins/viewsource/viewpagesrc.action?pageId=29408223
2021-09-17T00:43:03
CC-MAIN-2021-39
1631780053918.46
[]
docs.parasoft.com
Overview This document will guide the user during SIOS Protection Suite for Linux installation (SPS). Follow the quick decision matrix to understand how to install SPS for SAP environment. Note: The link above is provided here so it can be copied. Feedback Thanks for your feedback. Post your comment on this topic.
https://docs.us.sios.com/spslinux/9.5.1/en/topic/sios-protection-suite-for-linux-in-the-aws-cloud-sap
2021-09-17T00:10:30
CC-MAIN-2021-39
1631780053918.46
[]
docs.us.sios.com
When you installed AutoCAD Map 3D, the tutorial sample data was installed on your computer in the \Program Files\AutoCAD Map 3D 2010\Help\Map 3D Tutorials folder. You need that sample data to use the tutorials. Copy the Map 3D Tutorials folder to My Documents. That way, if you change the sample files, the original versions remain unchanged and can be used again and again. To make a copy of the sample data A new folder is displayed in My Documents, for example C:\My Documents\Map 3D Tutorials.
http://docs.autodesk.com/MAP/2010/ENU/AutoCAD%20Map%203D%202010%20User%20Documentation/HTML%20Help/files/WS73099cc142f487551e5a0cb10850d4cd7c-7493.htm
2019-01-16T09:41:21
CC-MAIN-2019-04
1547583657151.48
[]
docs.autodesk.com
The easiest way to search for a member is by typing part of their name in the ‘Member’ space at the top of the screen: Under the Member tab there is an option to Search for a member: Search for the member by their first or last name, or even email address.
https://docs.influxhq.com/members/searching/
2019-01-16T10:38:04
CC-MAIN-2019-04
1547583657151.48
[array(['https://influxhqcms.s3.amazonaws.com/sites/5372211a4673aeac8b000002/assets/549344584673aea2ca01ce04/Member_search.png', None], dtype=object) array(['https://influxhqcms.s3.amazonaws.com/sites/5372211a4673aeac8b000002/assets/54b348f14673ae91eb07a5ec/Member_search.png', None], dtype=object) array(['https://influxhqcms.s3.amazonaws.com/sites/5372211a4673aeac8b000002/assets/53bc7ac14673ae18f80003a1/Just_do_it.png', None], dtype=object) ]
docs.influxhq.com
If you do not wish to connect logins using Salt Edge Connect, you can perform all the login actions using our API. In order to connect a new login, you should use the logins create route described in the API reference. After connecting the login, your application will receive the corresponding callbacks so you can fetch your newly created login. If the login you wish to connect is an interactive one, you will receive the interactive callback specified in your app’s profile, which means that you have to use the interactive route and follow the instructions specified in the API reference. If the login had an error while connecting, and it had an InvalidCredentials error during the fetching process, you should reconnect the login and send the correct credentials in your request body. When the login is connected correctly, and you received all the accounts and transactions details, you can refresh the login anytime, and keep the login data up-to-date. API Connect requires high security standards for data handling on client’s side. This integration method is only available for the certified and/or selected partners. For more information, feel free to contact us.
https://docs.saltedge.com/v3_services/guides/api_connect/
2019-01-16T09:41:44
CC-MAIN-2019-04
1547583657151.48
[]
docs.saltedge.com
> Add a new section in the .htaccess file when installing a plugin Some plugins, during their installation, create.
https://docs.bitnami.com/virtual-machine/apps/dreamfactory/administration/use-htaccess/
2019-01-16T11:14:22
CC-MAIN-2019-04
1547583657151.48
[]
docs.bitnami.com
The Preview Window Contents UC Connector uses the Custom Server module, a Universal Routing Server (URS) component built into the UC Connector itself, to handle the Preview interaction. By using the Custom Server module, UC Connector processes proprietary CUSTOMLIB protocol messages sent from the routing strategy, in order to initiate the Preview interaction with the third-party UC client. HTTP is used as the transport method for both supported UC platforms. The following diagram shows how the embedded Custom Server handles the Preview Interaction between URS and the UC platform. The Overall Preview Interaction Depending on how the routing strategy is configured, the Preview interaction can be sent to a particular individual Knowledge Worker, or in a round-robin manner (consecutively) to a pool of available Knowledge Worker resources, continuing until one of them accepts the interaction. You can also design the routing strategy to broadcast notifications to several Knowledge Workers in a group, where the interaction is then sent to the first Knowledge Worker that accepts the preview. The call flow for the overall interaction is as follows: - Customer interaction is initiated towards the Knowledge Worker - Based on the Knowledge Worker presence status in Genesys (possibly mapped from the corresponding agent status on the Microsoft side), URS selects an available Knowledge Worker and initiates an Interaction Preview with that user—the Preview window appears on their device. If the audio-on-preview option is configured, the specified audio will also play to alert Knowledge Workers who may not be at their desk that an interaction Preview has arrived. - A countdown timer appears in the preview window (the length of the timeout period is configurable). The Knowledge Worker must respond before this timer runs out, otherwise the call is returned to URS, where the strategy can select a new Knowledge Worker. - On accepting the Preview, an incoming call notification window appears on the Knowledge Worker device (typically, the device is also ringing). If accepted, the voice call between customer and Knowledge Worker is established. The Preview Notification Window Shown below is a sample Preview window for an incoming voice call, as it appears on the desktop of a Knowledge Worker. Information about the interaction also appears in the UC Connector GUI in another browser window (not shown). The Preview window can display any user information available to the routing strategy—in other words, any customer information stored or collected earlier in the interaction, in order to give relevant details of the interaction to the Knowledge Worker. In this case, customer information such as phone number and service level are included. Feedback Comment on this article:
https://docs.genesys.com/Documentation/UCC/latest/Deployment/previewwindow
2019-01-16T09:41:29
CC-MAIN-2019-04
1547583657151.48
[]
docs.genesys.com
changes.mady.by.user Kalle Korhonen Saved on Jun 14, 2012 More people using tapestry-security means more improvements, feature requests. We are still staying on top of it and keep our backlog clean with the following issues fixed in 0.4.6, the best ever tapestry-security release yet: Read more at tapestry-security guide and enjoy, Tynamo Team Powered by a free Atlassian Confluence Open Source Project License granted to Codehaus. Evaluate Confluence today.
http://docs.codehaus.org/pages/diffpages.action?pageId=229738672&originalId=229738679
2015-01-30T06:24:40
CC-MAIN-2015-06
1422115856087.17
[]
docs.codehaus.org
- Check out Groovy from SVN or grab a source distribution - Install Ant (1.7.0 preferred) - Type 'ant clean test' on the command line to compile, test and make jars - Type 'ant install' to build doco and zips - Type 'ant deploy' to publish to repositories Note: to publish you will need to have appropriate permissions and stored your credentials in a ~/.m2/settings.xml fill of the form:
http://docs.codehaus.org/pages/viewpage.action?pageId=77737
2015-01-30T06:43:27
CC-MAIN-2015-06
1422115856087.17
[]
docs.codehaus.org
User Guide - Quick Help - Tips and shortcuts - Phone - Voice commands - Messages - Files and attachments - Media - Ring tones, sounds, and alerts - Browser - Calendar - Contacts - Clock - Tasks and memos - Typing - Keyboard - Language - Screen display - GPS technology - Compass - Wireless coverage indicators Indicators in the upper-right corner of the home screen display the wireless coverage level for the area that you are using your BlackBerry smartphone in. For more information about wireless coverage areas, contact your wireless service provider. Related information Previous topic: My smartphone responds slowly or freezes Next topic: Troubleshooting: Mobile network Previous topic: Turn on flashing LED notification for wireless coverage Next topic: How to: Wi-Fi technology Previous topic: Turn on, turn off, or check the status of a network connection Next topic: Troubleshooting: Wi-Fi technology Previous topic: About using Wi-Fi with VPN and software tokens Was this information helpful? Send us your comments.
http://docs.blackberry.com/en/smartphone_users/deliverables/38289/1773471.jsp
2015-01-30T06:29:19
CC-MAIN-2015-06
1422115856087.17
[]
docs.blackberry.com
... For.20.0-rc2. Do a dry run Release! Build locally and prepare the staging repository at Codehaus Nexus: ... and to Codehaus (provided a remote codehaus initialized to ssh://[email protected]/izpack.git): Close, promote and release from the staging repository at Codehaus Nexus: ... Now you can reset the local repository to the original state (or using FETCH_HEAD if the remote repository is still blocked and unchanged). ... - Use WebDAV at to push releases from the Maven repository to Example 5.0.0-rc1: Copy izpack/izpack-dist/target/izpack-dist-5.0.0-rc1.jarto webdavs://dav): IzPack.org website:, automatically deployed to Twitter (ping @jponge): @izpack Update the Codehaus GIT repository After deploying a snapshot or release push all differences from GitHub to Codehaus to have it in sync:
http://docs.codehaus.org/pages/diffpages.action?pageId=178520173&originalId=233052804
2015-01-30T06:43:32
CC-MAIN-2015-06
1422115856087.17
[]
docs.codehaus.org
About encrypted media files You can now synchronize and import encrypted media files from your BlackBerry smartphone. Encrypted media files aren't supported on the BlackBerry PlayBook tablet. If you're synchronizing or importing encrypted media files from a smartphone that has encryption turned on, you must keep encryption turned on to continue using the BlackBerry Desktop Software. If you turn off encryption and want to continue using the media features, you need to delete encrypted media from your smartphone. If your smartphone has built-in media storage that is smaller than 128 MB, you can import encrypted media to your computer, but you'll need to use the files feature to copy files from your computer to your smartphone. Was this information helpful? Send us your comments.
http://docs.blackberry.com/en/smartphone_users/deliverables/43033/1645241.jsp
2015-01-30T06:47:37
CC-MAIN-2015-06
1422115856087.17
[]
docs.blackberry.com
You might be required to pay a recurring subscription fee for some apps and games, or for certain features and additional items within an app or game that you have already downloaded. Your subscription automatically renews until you cancel it. The BlackBerry World storefront is designed to send you a reminder email before and after each subscription is renewed.
http://docs.blackberry.com/en/smartphone_users/deliverables/50498/amc1344624438112.html
2015-01-30T06:24:43
CC-MAIN-2015-06
1422115856087.17
[]
docs.blackberry.com
The Documents menu provides access to the documentation management structures and functions. Summarized: - All Documents – is where you view, add, edit, delete and otherwise manage documents. - New Document – is a convenience menu item which provides direct access to creating a new document. - Categories – provides means to add and manage document categories. Note that these are independent and specific to Documents, they are separate from categories used for posts. - Tags – provides means to add and manage document tags. As with document categories, these are independent from the tags used on posts. It’s worth mentioning that Documents combine some of the goodies you already know from Posts and Pages. Documents have categories and tags just like Posts, but Documents also allow to establish a hierarchy among them as do Pages. Anyhow, Documents are entirely separate entities, although you will use WordPress’ familiar features to manage them. All Documents Here is where you are presented with an overview on all your documents. You can add new documents, edit existing documents or delete those that are not required. Some actions can be applied in bulk, for that simply select one or more documents using the checkbox provided for each entry, then select the desired action in the box that shows Bulk Actions and click Apply. A form will show up where you can change settings for those documents in bulk. To add a new document, either use the button labeled New Document provided on top of the screen, or use the menu item of the same name found in the Documents menu. You can also search documents using the field provided on top of the list. Document Categories Document Categories are independent and specific to Documents, they are separate from the categories used for Posts. Here you can create new categories and modify existing ones. Please note that you can also create and assign categories while editing a document. Document Tags Here you can add and manage your Document Tags. As with Document Categories, these tags are independent from the tags used on Posts. You can assign tags directly when you edit a document and they will be added here automatically. Documents and Document Categories in Menus Documents and Document Categories can be added to menus. You will find the Document and Document Categories providing published items that can be added to menus. To add a document directly to a menu, under Appearance > Menus > Edit Menus look for the Documents container (as shown here – click the entry to show its items and options), select the items you want to add and click the Add to Menu button. If you do not find the Documents container, click the Screen Options and check the entry for Documents to make it appear. The same can be done with Document Categories which you will find in the same section. Just look for the Document Categories container and review the Screen Options if it does not appear.
http://docs.itthinx.com/document/documentation/documents/
2018-07-15T22:56:04
CC-MAIN-2018-30
1531676589022.38
[array(['http://docs.itthinx.com/wp-content/uploads/2015/04/All-Documents-showing-Categories-and-Tags.png', None], dtype=object) array(['http://docs.itthinx.com/wp-content/uploads/2015/04/Documents-Bulk-Edit.png', None], dtype=object) array(['http://docs.itthinx.com/wp-content/uploads/2015/04/Document-Categories.png', None], dtype=object) array(['http://docs.itthinx.com/wp-content/uploads/2015/04/Document-Tags.png', None], dtype=object) ]
docs.itthinx.com
Advanced Restore Options (NAS Options) Use this dialog box to select advanced NAS options for the restore operation. Note that all the options described in this help may not be available in the dialog box. The options displayed in the dialog box are applicable to the file server being used. Recursive Restore Specifies that the restore will also include all files and subdirectories of the directory specified, rather than just the specified directory. Exclusive Restore Prevents the concurrent execution of multiple backup or restore operations on the same file system. Overwrite If selected, specifies that if an item to be restored already exists, that item will be overwritten by the restored item. If cleared: - backed up files/directories whose names are the same as those in the restore path are not restored unless they are newer than the existing file/directory. - backed up files/directories whose names are different from those in the restore path are restored. Restore User and Group Quotas Controls the restore of user and group quotas. If set to No, only tree (hierarchical) quota records will be restored (user or group quotas are ignored). Restore Enable 8.3 Names Specifies whether to restore file names from tape. If set to Yes, any 8.3 file name that was backed up is restored. However, this can cause a naming conflict. If a naming conflict occurs, the ONStor Gateway posts a warning message and uses a new file name. If set to No, a file name is generated by the file system, which can result in a name that is different from the one that was backed up. Restore Subtree Quotas Controls the restore of tree (hierarchical) quotas. If set to No, this variable restores no tree quota information from tape. If set to Yes, the tree quota information is restored. Restoring tree quota information involves restoring only the default and limit values for a tree. Restore QTree Configuration and Usage Controls whether to restore the tree or directory quota information from tape to a live file system. If set to Yes, tree and directory quota configuration information and usage conditions are restored. If set to No, tree and directory quota configuration and usage conditions are left on tape and log messages are generated.
http://docs.snapprotect.com/netapp/v10/article?p=products/nas_ndmp/help/restore_adv_nas.htm
2018-07-15T23:11:56
CC-MAIN-2018-30
1531676589022.38
[]
docs.snapprotect.com
Integrating Visio 2007 and Excel 2007 Summary: Visio 2007 and Excel 2007 provide built-in integration capabilities. See an example of linking data in Visio to Excel by using the Visio Data Selector and an example of creating and populating an Excel Bill of Materials by using Automation code. (10 printed pages) October 2006 Applies to: Microsoft Office Visio 2007, Microsoft Office Excel 2007 Contents Generate an Excel Bill of Materials from Data Stored in Shapes by Using the Data Reports Tool Generate a Custom Excel Bill of Materials from Data Stored in Shapes by Using Automation Code The Visio Drawing The Excel Bill of Materials Code Example to Extract Data from the Visio Drawing and Fill in the BOM About the Author Additional Resources Generate an Excel Bill of Materials from Data Stored in Shapes by Using the Data Reports Tool From within Microsoft Office Visio 2007, you can report your data out in a variety of ways. One example is reporting data from Visio 2007 to Microsoft Office Excel 2007. If you create a drawing and add data manually to the shape or to your shape data fields, you can generate an Excel 2007 report summarizing that data. In Figure 1, data was associated with the equipment shapes in the diagram by using shape data fields and displayed by using data graphics. Figure 1. A network diagram with associated shape data ### To report on the data contained in the diagram On the Data menu, click Reports. In the Reports dialog box, select the report to run. You can modify an existing report or create and save a new report. Figure 2. The Reports dialog box enables you to run or modify existing reports or create a new report Select Modify or New to choose the fields on which you want to report. In the Report Definition Wizard, choose which shapes you want to include in your report—Shapes on all pages, Shapes on the current page, or Selected shapes. Figure 3. Specify the shapes in your report On the next screen of the Report Definition Wizard, specify which fields you want to export to your Excel 2007 report. On the next screen, type a title for your report and indicate how you want the data to be sorted and formatted. Run the report; choose Excel as the reporting format. Figure 4. Data reported to Excel from the Visio 2007 network diagram Generate a Custom Excel Bill of Materials from Data Stored in Shapes by Using Automation Code The following example code generates an Excel Bill of Materials (BOM) from data that is extracted from a Visio drawing. Visio drawings are more than just pictures. Visio shapes can have custom data, called Shape Data, defined for any shape. Data can be stored within Shape Data. You can use Automation code to extract and organize this data before writing it into the Excel workbook. The Visio Drawing In this example the data to be captured in the BOM is a count of the number of shapes, by type, within the drawing. We are interested only in a specific kind of shape, those that are of types Part 1, Part 2, Part 3, and Part 4. We are not interested in other kinds of shapes, such as the connectors or the text on the page. To enable us to detect this condition easily, we have assigned to these shapes a user-defined cell named User.PartName. An example can be seen in Figure 5, which shows the Visio ShapeSheet spreadsheet for the selected shape. This user cell simplifies the search criteria. Instead of searching through the shapes looking for a list of shapes by part names, we can make a single comparison to look for the existence of the user cell. Shapes that do not have this cell, such as the connectors in this drawing, are ignored. Figure 5. The Visio drawing from which the BOM material will be extracted The Excel Bill of Materials The Excel BOM file has been preconfigured with layout and formulas already in the file. Figure 6 shows the file as it looks in its preconfigured state. After the example code runs, the count for each part that has the cell User.PartName is inserted into the column labeled Quantity. The Document Name and Date are also added to the BOM. The final result is shown in Figure 7. Figure 6. The Excel BOM template Figure 7. The completed BOM Code Example to Extract Data from the Visio Drawing and Fill in the BOM using System; using System.Data; using System.Data.OleDb; using System.Collections; using System.Windows.Forms; using Visio = Microsoft.Office.Interop.Visio; using Excel = Microsoft.Office.Interop.Excel; BOM creation is initiated by an external event, such as a user clicking a button on a form (not shown in this example), which calls the GenerateBOM method. This is a high-level entry point that opens a Visio drawing, extracts information from the drawing, and then writes the data to the BOM file. The readVisioDrawing method reads the data from the Visio drawing. The createExcelFile method opens the preconfigured BOM shown previously and updates the part data. private static string _DocumentName = string.Empty; internal static void GenerateBOM( string drawingFileName, string excelFileName) { // Get the name of the Visio document to be opened. _DocumentName = System.IO.Path.GetFileNameWithoutExtension(drawingFileName); // Read the information from the drawing. readVisioDrawing(drawingFileName); // Create the Excel file. createExcelFile(excelFileName); } The readVisioDrawing method opens the Visio drawing and then searches all shapes on all pages to find every shape that has the User.PartName cell. Individual shapes are processed within the processDrawingShape method. The information from the Visio drawing is accumulated in a hash table. private static Hashtable _PartNameQuantity = new Hashtable(); private static void readVisioDrawing( string drawingFileName) { // Open the drawing in Visio. Visio.Application visioApplication = new Visio.Application(); Visio.Document partsDocument = visioApplication.Documents.Open(drawingFileName); // Gather the information from the drawing. _PartNameQuantity.Clear(); foreach (Visio.Page thisPage in partsDocument.Pages) { foreach (Visio.Shape thisShape in thisPage.Shapes) { processDrawingShape(thisShape); } } // Close the drawing. visioApplication.Quit(); } The processDrawingShape method reads the part identifier and updates the count in the hash table. It checks for the User.PartName cell in each shape. If the cell exists, it gets the part name data from the cell. If the cell does not exist, an error is generated and no counts are incremented for this shape. As a last step, the method checks to see if this shape has subshapes, to determine if it is a group shape. If the shape does have subshapes, the processDrawingShape method makes a recursive call to itself to process the subshapes of the group. private const string _VisioPartNameCell = "User.PartName"; private const string _VisioPropCellPrefix = "Prop."; private static Hashtable _PartNameQuantity = new Hashtable(); private static void processDrawingShape( Visio.Shape thisShape) { try { // Get the part name from the shape. string partName = thisShape.get_Cells(_VisioPartNameCell). get_ResultStr((short)Visio.VisUnitCodes.visNoCast).ToUpper(); // Reflect this shape in the part name / quantity table. if (_PartNameQuantity.Contains(partName)) _PartNameQuantity[partName] = (int)_PartNameQuantity[partName] + 1; else _PartNameQuantity.Add(partName, 1); } catch { // Ignore shapes without a part name. } // Process any subshapes within a group shape. foreach (Visio.Shape thisSubShape in thisShape.Shapes) { processDrawingShape(thisSubShape); } } The createExcelFile method creates the new Excel file based on the preconfigured BOM template file _ExcelTemplateName. This file is first copied to excelFileName so that the original template file is not changed. The updatePartsTable method fills in the quantity values in the BOM with the information extracted from the Visio drawing. The updateDocumentInformation method adds the document name and date to the BOM. private const string _ExcelTemplateName = "ExcelBOM.xls"; private static void createExcelFile( string excelFileName) { // Copy the Excel file template to the new file name. System.IO.File.Copy( Application.StartupPath + "\\" + _ExcelTemplateName, excelFileName, true); // Update the parts table information. updatePartsTable(excelFileName); // Update the document information. updateDocumentInformation(excelFileName); } The updatePartsTable method updates the parts table information in the worksheet. Because this information is in a worksheet that has a header row, we can use ADO.NET to access the data through a dataset. The parts table is a named range of the worksheet that contains the part name and corresponding quantity. It has a header row as the first row in the named range. private const string _PartsTableConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties=\"Excel 8.0;HDR=Yes\""; private const string _PartsTableName = "PartNameQuantity"; private const string _PartsTableNameColumn = "Part Name"; private const string _PartsTableQuantityColumn = "Quantity"; private const string _PartsTableSelectString = "Select * From [{0}]"; private const string _PartsTableUpdateString = "UPDATE [{0}] SET [{1}] = ? WHERE [{2}] = ?"; private static void updatePartsTable( string excelFileName) { // Open the data portion of the Excel file as a dataset. OleDbConnection connection = new OleDbConnection( string.Format(_PartsTableConnectionString, excelFileName)); connection.Open(); // Access the part quantity table. OleDbDataAdapter partQuantityAdapter = new OleDbDataAdapter( string.Format(_PartsTableSelectString, _PartsTableName), connection); DataSet partQuantityDataset = new DataSet(); partQuantityAdapter.Fill(partQuantityDataset, _PartsTableName); if (partQuantityDataset.Tables.Count > 0) { // Update the records with the quantities from the drawing. foreach (DataRow thisRow in partQuantityDataset.Tables[0].Rows) { string partName = thisRow[_PartsTableNameColumn].ToString().ToUpper(); if (_PartNameQuantity.Contains(partName)) thisRow[_PartsTableQuantityColumn] = (int)_PartNameQuantity[partName]; else thisRow[_PartsTableQuantityColumn] = 0; } // Generate the update command. partQuantityAdapter.UpdateCommand = new OleDbCommand( string.Format(_PartsTableUpdateString, _PartsTableName, _PartsTableQuantityColumn, _PartsTableNameColumn), connection); partQuantityAdapter.UpdateCommand.Parameters.Add( "@" + _PartsTableQuantityColumn, OleDbType.Numeric). SourceColumn = _PartsTableQuantityColumn; partQuantityAdapter.UpdateCommand.Parameters.Add( "@" + _PartsTableNameColumn, OleDbType.VarChar, 255). SourceColumn = _PartsTableNameColumn; // Push the updated data back to Excel. partQuantityAdapter.Update(partQuantityDataset, _PartsTableName); } // Close the connection. connection.Close(); } The updateDocumentInformation method updates the document name and data into the worksheet. Because the cells in the worksheet are not in a table that has a header row, the Excel application programming interface (API) is used to change the cells directly. private const string _DocumentNameCell = "DocumentName"; private const string _DocumentDateCell = "DocumentDate"; private static void updateDocumentInformation( string excelFileName) { object optional = System.Reflection.Missing.Value; try { // Open the Excel file and use the Excel API. // Excel stays hidden during this process. Excel.Application excelApplication = new Excel.ApplicationClass(); Excel.Workbook excelWorkbook = excelApplication.Workbooks.Open(excelFileName, optional, optional, optional, optional, optional, optional, optional, optional, optional, optional, optional, optional, optional, optional); // Update the records with the values from the drawing. Excel.Worksheet excelWorksheet = (Excel.Worksheet)excelWorkbook.Worksheets[1]; excelWorksheet.get_Range(_DocumentNameCell, optional). Value2 = _DocumentName; excelWorksheet.get_Range(_DocumentDateCell, optional). Value2 = System.DateTime.Today; // Save and close. excelWorkbook.Save(); excelWorkbook.Close(false, optional, optional); excelApplication.Quit(); } catch (Exception ex) { MessageBox.Show(ex.Message); } } more information about automating Excel and using the Visio object model to traverse shapes in a drawing, see the following resources: For more information about Visio 2007 integration, see the following articles: Integrating Visio 2007: Introduction to Integrating Visio with Other Microsoft Programs Integrating Visio 2007 and Access 2007 Integrating Visio 2007 and Active Directory Integrating Visio 2007 and Microsoft SQL Server 2005 Integrating Visio 2007 and MOM, Exchange, and Reporting Services Integrating Visio 2007 and Project 2007 Integrating Visio 2007 and SharePoint Products and Technologies
https://docs.microsoft.com/en-us/previous-versions/office/developer/office-2007/aa701255(v=office.12)
2018-07-16T00:03:16
CC-MAIN-2018-30
1531676589022.38
[array(['images/aa701255.5801a976-525c-48fc-9286-35d7ec9b0f8e%28en-us%2coffice.12%29.gif', 'A network diagram with associated shape data A network diagram with associated shape data'], dtype=object) array(['images/aa701255.07dc03b0-e076-480b-8e04-94246b01fd99%28en-us%2coffice.12%29.gif', 'Visio drawing from which the BOM is extracted Visio drawing from which the BOM is extracted'], dtype=object) array(['images/aa701255.302a76a1-4a5f-4953-83ec-3da828bfd468%28en-us%2coffice.12%29.gif', 'The Excel BOM template The Excel BOM template'], dtype=object) array(['images/aa701255.5703b4a7-132f-4e0e-97cf-f9b527235658%28en-us%2coffice.12%29.gif', 'The completed BOM The completed BOM'], dtype=object) ]
docs.microsoft.com
pwd ls("list") it shows the contents of the current directory when you type mkdir ("make dir") it creates a new SUBDIRECTORY inside the current directory mkdir code cd("change dir") moves you into a different directory cd codewould move you into a directory named code cdall on its own and press the return key. This will send you back to your home directory. mkdir code cd code ls(and note that it's empty) to RUN a program you type ruby and then the name of the source file The Recipe Metaphor codesubdirectory using pwd hello.rbusing touch hello.rb hello.rbin your favorite text editor Inside this file, put the following source code: puts "Hello, World!" Save the file Go back to the terminal Run this file using ruby hello.rb What happens? Is this what you expected? /#
http://docs.railsbridge.org/learn-to-code/the_command_line.deck
2018-07-15T22:56:39
CC-MAIN-2018-30
1531676589022.38
[]
docs.railsbridge.org
Office Business Applications: Pricing Exception Management Solution Summary: Learn to implement an internal Price Exception Management solution. Examine how you can use Office Business Applications to establish interoperability between SAP and the 2007 Microsoft Office system. (19 printed pages) Bas Kamphuis, Microsoft Corporation Erika Ehrli, Microsoft Corporation Leanne Brodzinski, Microsoft Corporation Jean Wijsman, Microsoft Corporation Mike Lockhart, Microsoft Corporation Randy Hongo, Microsoft Corporation January 2008 Applies to: 2007 Microsoft Office System, Microsoft Office SharePoint Server 2007 Contents Connecting Back-End Systems and Real-World Work Processes by Using Office Business Applications Overview of the Pricing Exception Management Solution BCWeb Conceptual Architecture Conclusion Acknowledgments Additional Resources View video: Office Business Applications: Price Exception Management. Connecting Back-End Systems and Real-World Work Processes by Using Office Business Applications Organizations and IT departments invest in line-of-business (LOB) applications such as enterprise resource planning (ERP), customer relationship management (CRM), and supply chain management (SCM) to manage work processes and line-of-business information. These applications play a critical role in different workflows and scenarios across the organization. However, the back-end LOB applications and the tools and programs used by information workers are rarely connected. Much of the ad-hoc way that people work occurs in Microsoft Office system applications. Usually, this is disconnected from back-end systems. In many scenarios, LOB applications are seen as overhead since they do not map to the real way that people work. The 2007 Microsoft Office system is a unified solutions platform for building Office Business Applications that makes LOB applications, enterprise data, and workflows accessible and relevant to users. Office Business Applications are a new breed of applications that use the Microsoft Office system to find and surface LOB information. Office Business Applications enable businesses to extend the Microsoft Office clients and servers into business processes running in LOB applications. Figure 1 shows the conceptual architecture of Office Business Applications. Figure 1. Office Business Applications The goal of this article is to show you how. Overview of the Pricing Exception Management Solution In Microsoft, sales employees in the field can request special deals, product promotions, on-price list promotions, and off-price list promotions for different markets. For example, Jean Philippe Bagel, a corporate vice president in France, can request 10 percent off Microsoft Office Professional Plus 2007 on an open license to a specific customer. At the same time, Armando Pinto, a sales representative in Brazil, can request 10 percent off of all Portuguese versions of the software. If both price promotions are approved, customers would get the following discounts: 10 percent off of Microsoft Office Professional 10 percent off of Portuguese software 20 percent off of the Portuguese version of Microsoft Office Professional Price exception management is the set of processes and tools that help manage price list promotions, special deals, and product promotions applicable to product lines in a company. Microsoft sells software, hardware, games, and many other product lines and needs a price exception management solution to orchestrate requests and manage business rules and a routing and approval workflow. The Microsoft Information Technology group implemented a Price Exception Management solution (named BCWeb) based on Office Business Applications that helps simulate pricing discounts, manage business rules, customize routing and approval workflows, and update the LOB data stored in SAP. One of the major goals of this system is to improve field productivity by simplifying the process and removing training requirements for complex back-end systems, in this case SAP. In the past, sales fields needed knowledge of IT systems to create a promotion. Sales fields were also required to use different promotion types in different programs. BCWeb integrates different promotion types into a simplified Web application. It also allows users to specify business rules using Microsoft Office Excel 2007 files and to get notifications in Microsoft Office Outlook 2007. Additional goals of this system include: Reduce revenue leakage by having a business case documented for each promotion field. Obtain visibility decisions and understand who required a discount and why. Understand promotion effectiveness and ensure the field is making informed decisions. Provide a configurable set of files to easily change empowerment guidelines by having a single master data of workflow routing and approvals. Specific business process in price-execution domain, however, pattern addresses larger solutions space Scenario Overview: A Day in the Life Madeleine Kelly, a technical specialist in Romania, supports the subsidiary sales goals by introducing additional incentives for small to medium business customers for Microsoft Office SharePoint Server (MOSS) 2007. Her goal is to create a promotion price based on the published price list with the following parameters: 5 percent price discount on any License or License and Software Assurance offerings The promotion will be effective for three months 35 percent growth target in the sales between fiscal year 2007 and fiscal year 2008 If the sales representative wants to move forward with a promotion request, she submits a business case and the Price Exception Management solution starts a routing and approval workflow. After the promotion discount is approved, the system updates the back-end LOB data. Figure 2 shows a high-level workflow process. Figure 2. Price exception management high-level process As shown in Figure 2, the Price Exception Management solution workflow includes four steps: Create the promotion request. A sales representative uses a Web-based application to select promotion parameters such as the prices, countries, and distributors. The sales representative can also run price simulations and approval routing simulations before entering a business case and submitting a request. Submit the business case. After the sales representative is satisfied with the promotion request, she completes another Web form where she defines the rationale and return of investment details. After the information is ready, the system drops an XML file to a document repository and starts the route and approval process. Route and approve request. The route and approval process defines a set of configurable rules and decision tables for routing and approval. Based on these settings, the system generates a custom workflow for each promotion request. Next, it sends notifications to the set of approvers involved in the custom workflow and orchestrates the custom routing and approval process until completion. Execute a price range. After a promotion request approval is complete, the system notifies the sales manager and SAP conditions are automatically created and applied for the effective date. Figure 3. Price exception management solution high-level workflow BCWeb Conceptual Architecture BCWeb showcases MOSS 2007, Windows Workflow Foundation, SAP interoperability, and the Open XML File Formats. Operationally, the implementation of the Price Exception Management solution using the previous technologies can be the best and most flexible solution for the following reasons: The front-end user interface provides the following: A UI experience that is much like a wizard The use of ntelligent filtering to help improve the user experience Because there is no SAP UI, a Microsoft ASP.NET 2.0 and front-end SharePoint site avoid the use of confusing SAP user ID maintenance. MOSS 2007 provides the platform for custom workflows. A custom-built workflow routing and approval component is generic and allows configuration of up to five levels of workflow approval. BCWeb has three main components that orchestrate the entire promotion request workflow: BCWeb front-end application for SAP. AnASP.NET Web application that provides a set of Web forms that help capture price exception details and information on the business case for doing discounts on price lists, and helps calculate the percentage of discount. SAP Pricing Engine. System that connects with SAP to run real-time price simulations based on user preferences. It uses the SAP .NET Connector and custom Web services to pull real-time price information from SAP, and provides sales executives with a light front-end interface to SAP. Workflow Routing and Approval Process system. Configurable, rules-based empowerment and workflow system that defines business rules by using Microsoft Office Excel 2007. Users can use decision tables to capture approval and routing domain values. Additionally, a Microsoft Visual Studio 2005 state machine workflow uses the Open XML Open Packing conventions to extract routing and approval information from Excel 2007 files. Then, it compiles and deploys custom workflow templates to a SharePoint site. Figure 4. BCWeb architecture diagram BCWeb Front-End Application for SAP The BCWeb front-end application is implemented as an ASP.NET 2.0 Web application. This application is the starting point for all promotion requests and provides a simple set of entry forms to help sales employees create promotion requests and submit a business case. It also provides Web services that pull real-time data from SAP. Creating the Promotion Request The BCWeb front-end application provides a set of Web forms to help users define promotion request settings, as shown in Figure 5. Figure 5. BCWeb application Web forms The BCWeb front-end application connects with SAP every day in real-time by using the SAP .NET Connector. This application provides a set of custom Web services that follow Service Oriented Architecture principles. BCWeb calls the SAP Web services to determine the valid price lists and customers and the possible SKUs for a promotion based on inputs provided during on-screen selections. It also calls the SAP Web services to run pricing simulations that show what the possible impact of the promotion is and to create or change the promotion condition records in SAP. Every night, a scheduled job runs in SAP to extract promotion intelligent filtering files to provide possible values for all the selection options on the front-end Web forms s and to generate an XML data repository. The file is dropped in the application directory and is cached in memory to improve performance. The BCWeb front-end Web forms use intelligent filtering by displaying dynamic drop-down list controls that bind data to the XML file through an XMLDataSource object. This implementation helps reconcile records and promotions automatically. The Web application displays the prices, countries, distributors, and other fields to filter data. Based on user selection, a different Web service connects to SAP to generate price simulation information. This execution is run by the SAP Pricing Engine component. Additionally, before submitting a request, sales representatives use this application to run routing simulations and visualize the routing and approvers that may be involved in a discount process. Submitting the Business Case After a sales representative is satisfied with the price and routing simulation information, she fills the business case form and submits the business case. The BCWeb front-end application provides a set of Web forms that help users define business case information as shown in Figure 6. Figure 6. BCWeb business case submission The BCWeb application helps capture the rational for a price discount and return on investment information. The application serializes pricing information and the business case into an XML file and stores the file in a SharePoint library. Finally, through a custom API call, the BCWeb application uses a Windows Communication Foundation TCP/IP call to invoke and submit a request object. This object serializes the document stored on the document library, document headers fields, and parameters used to kick-start the workflow routing and approval process components. Workflow Routing and Approval Process So far, we explained how the BCWeb front-end application helps with the first two steps of the high-level workflow: creating a promotion request and submitting the business case. The next step of the high-level workflow is routing and approving requests. Even when this represents a single step in the high-level workflow, this is the central part of the system. Routing and Approving Requests Microsoft has a set of policies and rules that define the set of actors that must approve different types of requests such as negotiated deals and price promotions. In the past, approvers had to use multiple systems to approve different types of requests. In multiple occasions, some requests were approved by the wrong person or not in time. Microsoft needed a new system to help route and orchestrate the many different and concurrent approval workflows for promotion requests. The Microsoft IT department implemented a workflow routing and approval process to provide an extensible and agile solution that provides a consistent UI for approvers. Figure 7 shows how the workflow routing and approval process manages approval processing within a larger process. Figure 7. Workflow routing and approval process The workflow routing and approval process is extensible, by using APIs, to receive different types of requests. It collects and archives decisions for auditing and for consumption by and for systems that consume the decisions, such as price publication and order processing. The workflow routing and approval process also implements the concept of an Approval Routing Domain, which is routing rules for each request type, such as price promotions, special deals. Routing approval workflows can change from one promotion type to another. The workflow routing and approval process system provides configurable files to define custom settings for three different approval workflows: Serial. Requires one approver at each level in order. Any approver can reject and therefore stop the process. Parallel. Requires one approver in each department in any order. Hybrid. Combines a serial approval routing workflow and a parallel approval routing workflow. Figure 8. Different configurable approval workflows. Workflow Routing and Approval Process Architecture The workflow routing and approval process system integrates different Microsoft products and technologies to implement the business rules engine and a workflow engine: Business rules. Excel 2007 allows rule administrators to capture business rules. The workflow routing and approval process uses Excel 2007 as the interface for decision sets. Excel 2007 works as a business rules engine and helps manage the rules capture and decision process for promotion approvals and notifications. The workflow routing and approval process uses Excel 2007 to define the complete set of routing and notification rules. Workflow engine. The workflow routing and approval process uses the rules engine provided by the Windows Workflow Foundation and hosts workflow in MOSS 2007. In the first section of the article, we explained that organizations can build custom Office Business Applications by defining components and systems that integrate Microsoft Office system programs, servers, services, tools, and technologies with LOB information. The following section of the article explains how the Microsoft IT department used Excel 2007, MOSS 2007, and custom APIs to define a rules engine and a custom routing and approval workflow. Figure 9. Workflow routing and approval process architecture Excel 2007 as a Business Rules Engine Rule administrators are the people that manage and write business rules. In the past, legacy systems constrained an administrator's ability to manage rules. Excel 2007 provides a comfortable environment for rules administrators and requires minimal investment in training. New rule administrators must understand how the business rules tables work, but are already familiar with Excel. To add to the benefits, implementing decision tables by using Excel 2007 represents low costs for companies that already invested in 2007 Microsoft Office systems licenses. The workflow routing and approval process defines two different configurable sets of business rules in Excel 2007: Routing rules. Rule administrators use Excel 2007 to define maximum approval levels. A participant spreadsheet identifies individual approvers for each matching condition (country/region, segment, and level). Rule administrators also have the flexibility to define child rule sets. This flexibility makes it easier for Rule Administrators to manage the rules, as they are able to collapse repeated rules into one Excel worksheet and reference them from other worksheets. For example, specifying the different maximum approval levels for different countries and segments. Notification rules. Rule administrators can define the e-mail templates to use for different notification events and participation roles. Figure 10. Routing and notification rules in Excel 2007 Rule administrators can change the spreadsheets anytime. However, to validate that business rules are correct, the workflow routing and approval process runs a nightly provisioning process to validate routing and notification rules. All approvers listed in the Excel 2007 spreadsheets should be valid approvers listed in Active Directory directory service. The nightly provisioning process also validates against the rules, as administrators can specify applicable dates for approvers. Running this process helps reduce the risk of possible errors in approval workflows. This functionality was implemented to fulfill a business requirement. To comply with Sarbanes-Oxley principles, we had to verify that an approver was valid for that day to make a decision for a given request. Workflow Engine Windows SharePoint Services 3.0 and MOSS 2007 provide a new platform for defining and running workflows. The workflow platform is based on the Windows Workflow Foundation, a Windows platform component that provides a rules engine, a programming infrastructure, and tools for development and execution of workflow-based applications. For more information, see Developer Introduction to Workflows for Windows SharePoint Services 3.0 and SharePoint Server 2007. The Microsoft IT department used the Microsoft Visual Studio 2005 Workflow Designer as a Workflow designer, the Microsoft .NET Framework Windows Workflow Foundation as a workflow engine, and MOSS 2007 as a run-time host for the workflow routing and approval process. The Price Exception Management solution uses a state machine workflow. State machine workflows are designed to work in event-driven scenarios. A state machine workflow contains two or more states, with one state being the active state at any given time. Figure 11 shows the workflow routing and approval process state machine workflow definition in Visual Studio 2005. Figure 11. Workflow routing and approval process state machine workflow The ultimate goal of the workflow routing and approval process is to generate a compiled set of "rules.dll" libraries that contain rules definitions for every single promotion request. The workflow routing and approval process deploys each rules.dll to the global assembly cache of a server running MOSS 2007 that hosts and persists multiple workflow instances. The workflow routing and approval process defines a six steps process to import and execute rules: Import a data package. Business rules administrators use an administration console to import the rules defined in the Excel 2007 spreadsheets. That is, they can import new rules by using a smart client application on the rules administrator's desktop. The tool generates an import package (ARD) that contains three different files: Business rules. An Excel 2007 file that defines the set of routing and notification rules Configuration parameters. An XML file that provides a schema definition for custom configuration parameters. E-mail templates. XML files and XSLT files that define optional templates that can override predefined e-mail templates. Convert to intermediate data format. The workflow routing and approval process components convert the Excel 2007 rules to an intermediate data structure. The workflow routing and approval process component reduces the time of the importing by parsing the underlying Open XML File Formats files that make up an .xlsx file. What previously took several minutes on the client using the Office primary interop assemblies was reduced to a few seconds. For more information see, Introducing the Office (2007) Open XML File Formats. Validate rule set. We explained earlier that the workflow routing and approval process runs a nightly provisioning process to validate routing and notification rules. Provisioning happens separately from validating a workflow routing and approval process ARD package. Provisioning reruns the rules against in-flight requests. Workflow routing and approval process provisions both nightly and when an ARD is imported. Nightly provisioning evaluates all pending requests; while import provisioning is specific to requests for the new imported ARD. This process validates invalid domain values for attributes, use of undefined columns, circular references between child rule sets, invalid actions for a given rule set type, and invalid approver aliases. Optimize "Find ALL" rules. When a sales representative creates a promotion request, she defines a set of fields that define the required conditions for a promotion. Each rule set has unique request types, countries, amounts, and approvers. The workflow routing and approval process APIs search for all applicable rules that match a given condition. Promotion requests need to "find all" matches for a given condition. Parsing large files rule sets with multiple approvers is a custom scenario needs an optimized search algorithm. The execution of a "find all" optimized search algorithm helps improve the performance of rules and that satisfy the conditions of a search. This optimization helps "find all" matching rules for routing and approver determination. Generate Custom Rules. After the rules are optimized and validated, the workflow routing and approval process components use the Code Document Object Model (CodeDOM) Quick Reference to generate Workflow Rules CodeDom. The Workflow CodeDom provides custom rule sets that define the behavior of workflows that hosted on a server running MOSS 2007. Compile Rules Set.dll. The Workflow CodeDom is compiled into a Microsoft .NET Framework–based assembly (.dll) for better performance. There is one .dll file for each RuleSetType (Routing Level determination, Approver determination, or E-mail Template determination) per ARD package. Windows Workflow Foundation and MOSS 2007 run the rules compiled in the .dll file. This greatly improves the execution of multiple promotion request routing and notification workflows. Execute Price Change The last step of the high-level price execution management process is executing the price change in SAP. BCWeb uses the workflow routing and approval process to define the routing and approval process for user-defined pricing concessions. After each custom workflow is deployed to MOSS 2007, the real-time routing and approval workflow instance starts running. Each approver defined in the workflow rule set gets the appropriate notifications and makes decisions to approve or reject a promotion request until completion. If a price promotion is approved, the workflow routing and approval process calls a Web service hosted by BCWeb, which in turn communicates with SAP to execute a price change. A SAP condition is created automatically and applied for an effective date. Every month, price analysts run a process in SAP to regenerate all the price points and apply all discounts coming from the multiple approved price promotion requests. Conclusion The 2007 Microsoft Office system provides a platform for building composite solutions, named Office Business Applications. This platform is designed to support cross-functional processes and allow information workers to become more productive. Office Business Applications allow global information workers to gain insight, collaborate, make related time decisions, and take action. The Microsoft IT Price Execution Management solution showcases how Office Business Applications can help connect back-end systems with the 2007 Microsoft Office system to take advantage of existing technology investments, and to define custom workflows and solutions that greatly improve business processes and return on investments for organizations. Acknowledgments We want to thank the following people for their contributions to this article: Amit Garg, Pradeep Balakrishnan, Joyjeet Majumdar, Radha Ramaswamy, Dona Mukherjee, Sundaresan Krishnamurthy, Tim Ragain, Padma Chigurupati, Clay Sales, Kiruthika Baskar, Nathan Helgren, Robyn Kilponen, Mark Antone, Lisa Cranfill, Lisa Jackson, Duke Hafferman, Donna Taylor, Eric Morris, Chris Lawson, Ram Sastry, Aisling Duggan, Nagaraj Venkataramu, Yossi Avnon, Adonis Acuario, Emil Tso, Gordon McIntosh, Russ Martin, Prakash Duggirala, Sirisha Chalasani, Anil Kumar Busa, Sridhar Reddy Gandra Cathy Swenton, Robert Peacey, Alexey Morozov, Clint Simmons, Aaron Wiley, Randy Neumaier, Venumadhava rao Yandapally, Mukundan M K, Pavan Adhyapak, Adam Johnson, Patrick O'Brien, Kamal Desilva, John Degel, Betina Duro, Subodh Kirtane, Andrew Whitaker, Anthony Young, Au Casiano, Demian Crumb, Dinesh Srinivasan, Glenn Sawatsky, Greg Grivas, Kevin Neher, Leanne Brodzinski, Manjunath Khatawkar, Niket Kalgaonkar, Ryan Perlman, Lyanne Ma, Bill Dow, and Jerre McQuinn. Additional Resources For more information, see the following resources: Microsoft Office Developer Center Office Business Applications Developer Portal SharePoint Server Developer Center Office Business Application Reference Application Pack for Price Management Workflow Resource Center for SharePoint Server Microsoft-SAP Technical Guidance - Windows Communication Foundation Windows Workflow Foundation Office Open XML Formats Resource Center Service Oriented Architecture (SOA) Dynamic Source Code Generation and Compilation Sarbanes-Oxley: The Role of Microsoft Business Solutions Technology In Supporting Compliance
https://docs.microsoft.com/en-us/previous-versions/office/developer/office-2007/bb977552(v=office.12)
2018-07-16T00:03:08
CC-MAIN-2018-30
1531676589022.38
[array(['images/bb977552.bcf0119f-da33-48f1-8b2e-8bcfad853c53%28en-us%2coffice.12%29.gif', 'Office Business Applications Office Business Applications'], dtype=object) array(['images/bb977552.5e10458b-2b5f-449a-9542-95ae1d5ede0f%28en-us%2coffice.12%29.gif', 'Price exception management high-level workflow Price exception management high-level workflow'], dtype=object) array(['images/bb977552.96cb7eb3-57c3-47ff-b968-1cdddd28a4c2%28en-us%2coffice.12%29.gif', 'Price exception management workflow Price exception management workflow'], dtype=object) array(['images/bb977552.3217ca38-2748-4689-b70e-9160678526d6%28en-us%2coffice.12%29.gif', 'BCWeb architecture diagram BCWeb architecture diagram'], dtype=object) array(['images/bb977552.55da25ae-a1a0-49a9-a2e8-f5c88cd68047%28en-us%2coffice.12%29.gif', 'BCWeb application Web forms BCWeb application Web forms'], dtype=object) array(['images/bb977552.b16f941f-56ed-4c62-a29e-99dedee1ced2%28en-us%2coffice.12%29.gif', 'BCWeb business case submission BCWeb business case submission'], dtype=object) array(['images/bb977552.fe5b339f-def8-4a5d-8cbc-f7fae56389ad%28en-us%2coffice.12%29.gif', 'Workflow routing and approval process Workflow routing and approval process'], dtype=object) array(['images/bb977552.4738dad4-7dfe-4487-8f89-0ef44bcdb5f7%28en-us%2coffice.12%29.gif', 'Configurable approval workflows Configurable approval workflows'], dtype=object) array(['images/bb977552.67f3626b-0402-4d77-9315-5078af7f6537%28en-us%2coffice.12%29.gif', 'Workflow routing and approval system architecture Workflow routing and approval system architecture'], dtype=object) array(['images/bb977552.8f3501e8-3612-4142-9055-30b0951a8bc9%28en-us%2coffice.12%29.gif', 'Routing and notification rules in Excel 2007 Routing and notification rules in Excel 2007'], dtype=object) array(['images/bb977552.26d1a94f-1089-401d-9493-3cd0b335a2a7%28en-us%2coffice.12%29.gif', 'WRAP state machine workflow WRAP state machine workflow'], dtype=object) ]
docs.microsoft.com
Deploy software updates Applies to: System Center Configuration Manager (Current Branch) The software update deployment phase is the process of deploying the software updates. No matter how you deploy software updates, the updates are typically: - Added to a software update group. - Downloaded to distribution points. - The update group is deployed to clients. After you create the deployment, an associated software update policy is sent to client computers. The software update content files are downloaded from a distribution point to the local cache on client computers. The software updates are then available for installation by the client. Clients on the Internet download content from Microsoft Update. Note If a distribution point is not available, you can configure a client on the intranet to client computers, and then you manage software updates on clients by using automatic deployment. Manually deploy software updates You can select software updates in the Configuration Manager console and manually start. For detailed steps, see Manually deploy software updates. Note Starting in Configuration Manager version 1706 Office 365 client updates have moved to the Office 365 Client Management >Office 365 Updates node. This will not impact your ADR configuration but does affect manual deployment of Office 365 updates. Automatically deploy software updates Automatic software updates deployment is configured by using an automatic deployment rule (ADR). This is a common method of deployment for monthly software updates (typically known as "Patch Tuesday") and for managing definition updates. When the rule runs, software updates are removed from the software update group (if using an existing update group), the software updates that meet a specified criteria (for example, all security software updates released in the last month) are added to a software update group, the content files for the software updates are downloaded and copied to distribution points, and the software updates are deployed to clients in the target collection. The following list provides the general workflow to automatically deploy software updates: - Create an ADR that specifies deployment settings. - The software updates are added to a software update group. - The software update group is deployed to the client computers in the target collection, if it is specified. You must determine what deployment strategy to use in your environment. For example, you might create the ADR and target a collection of test clients. After you verify that the software updates are installed on the test group, you can add a new deployment to the rule or change the collection in the existing deployment to a target collection that includes a larger set of clients. The software update objects that are created by the ADRs are interactive. - Software updates that were deployed by using an ADR are automatically deployed to new clients added to the target collection. - New software updates added to a software update group are automatically deployed to the clients in the target collection. - You can enable or disable deployments at any time for the ADR. After you create an ADR, you can add additional deployments to the rule. This can help you manage the complexity of deploying different updates to different collections. Each new deployment has the full range of functionality and deployment monitoring experience, and each new deployment that you add: - Uses the same update group and package, which is created when the ADR first runs - Can specify a different collection - Supports unique deployment properties including: - Activation time - Deadline - Show or hide end-user experience - Separate alerts for this deployment For detailed steps, see Automatically deploy software updates
https://docs.microsoft.com/en-us/sccm/sum/deploy-use/deploy-software-updates
2018-07-15T23:23:42
CC-MAIN-2018-30
1531676589022.38
[]
docs.microsoft.com
ICertAdmin2 interface The ICertAdmin2 interface is one of two interfaces that provide administration functionality for properly authorized clients. The ICertAdmin2 interface is used to perform the following tasks: - Authorize or deny a certificate request. - Revoke an issued certificate. - Trigger the generation of a certificate revocation list (CRL). - Get the current CRL for the server. - Determine whether a certificate is valid. - Get an archived key. - Get a certification authority (CA) display name, property, or property flag. - Publish one or several CRLs. - Get or set configuration information. - Determine which roles are set. - Import a certificate or key. Methods The ICertAdmin2 interface has these methods.
https://docs.microsoft.com/en-us/windows/desktop/api/certadm/nn-certadm-icertadmin2
2018-07-15T23:48:42
CC-MAIN-2018-30
1531676589022.38
[]
docs.microsoft.com
History Timeline You can view a timeline of changes for a CI and for its related records, relationships, baselines, and proposed changes for the CI. Timelines are available for CIs in the Configuration Item [cmdb_ci] table or a descendant of this table, if auditing is enabled for the tables. Role required: The ACL for this view is based on the roles defined in the glide.history.role system property, which by default is set to itil. Also, the user must have read access to the History Set [sys_history_set] table, which by default is granted to admin. You can open a timeline when you view the history of a CI. You can specify the time period, time range, and properties that are displayed in the timeline. You can view either what has changed in a particular change set, or view the entire CI to better troubleshoot any issues. You can also display a timeline of changes to the CI's related records, and export and compare snapshots of the CI at any point in time. CI changes are represented by bubbles in different shapes and colors along the timeline. The shape of each bubble represents a different type of change and the color of each bubble specifies whether the change is valid or invalid. CI baselines are represented by black circles that you can hover over to display more details. Click the ? icon to display bubble shape and color definitions, and point to a bubble to display details about the change set. A change to a relationship is considered valid only if it was applied through change management. If the change was applied via the Proposed Changes framework, it is valid. For additional validation steps, see Create or edit a planned validation script. Figure 1. History Timeline view Figure 2. Timeline bubbles Note: Proposed changes that do not have a planned start date are placed at future points of time. Timeline navigator Use the handles on both ends of the timeline navigator to extend or to shorten the time period that is shown. You can scroll to a different period of time by clicking on the bottom part of the timeline navigator and then dragging the navigator to the left or right. Zoom By default, the timeline for the last month is shown. Next to the Zoom label above the timeline, you can select another time interval. You can select intervals from a minute to the entire period of data. If there are many changes of the CI during the time period, the bubbles displayed might get too crowded. You can zoom in or out to spread the bubbles in either method: Change the time interval on the timeline. As you shorten the time interval, you zoom in, and as you lengthen the time interval, you zoom out. Select the section of the timeline that you want to zoom into. Property filter You can filter the bubbles that are displayed. By default, all bubbles are displayed, representing changes to all of the CI's properties. You can limit the view to display only the bubbles in which selected properties have changed and exclude bubbles in which only unselected properties changed. The Detail and Summary views highlight properties within your filter scope that have changed. The changed properties are highlighted in light blue. In the Summary view, you can choose to include all the properties of the CI, or only properties that have changed. If you choose to display all properties in the summary view, then changed properties are listed before unchanged properties. Summary view The Summary view displays snapshots of the CI's represented by each bubble. Each snapshot displays the changes to the CI's fields and relationships according to the change set. It displays old and new values before and after the change, and any relationships that were added or deleted. Use the > and < buttons on both sides of the snapshot display to scroll through the next and previous change set records in a chronological order. Detail view The Detail view displays snapshots of the CI that correspond with the bubbles. Each snapshot includes the fields that are within the property filter scope, displaying the properties that have changed with a light blue background. Click on a bubble to display its corresponding snapshot of the CI. The data that is displayed is read-only. Use the > and < buttons on both sides to scroll through the next and previous change set records in a chronological order. View timeline of changes to related recordsOn the timeline of changes for a CI record, you can also view a timeline of changes for the CI's related records.Export a snapshot of a CIYou can export a snapshot of a CI from its timeline.Compare CI snapshotsYou can compare the properties and relationships of a CI at two different points in its timeline. Related TasksControl access to historyChange the number of history entriesRelated ConceptsDifferences Between Audit and History SetsTracking changes to reference fieldsTracking insertsRelated ReferenceHistory ListHistory CalendarTracking CI Relationships
https://docs.servicenow.com/bundle/kingston-platform-administration/page/administer/security/concept/c_HistoryTimeline.html
2018-07-15T23:14:42
CC-MAIN-2018-30
1531676589022.38
[]
docs.servicenow.com
Function Curves By default, when a drawing layer or peg is added to a scene, no function curves are created. You will generally create the ones you need keyframes directly on the function curve instead of doing it positions of the aircraft with the camera's peg layer, but ignore the angle, scale and skew. By default, all function curves can only be used and modified using their original parameter. If you want another layer or parameter to use the same function curve, you must share it. There are two ways to share a function:. You can also create Velobased functions for certain effects, like changes in rotation or size over time. When you do this,_20<< You can convert - In the Timeline view, double-click on a layer. The Layer Properties Editor opens. - In the Transformation tab, click the Function Arrow button beside the local function information. - Create a 3D Path, Bezier, or Velobased curve. You can click the Function button to open the Function editor. -. - In the Timeline view, open the Layer Properties editor of the other layer that contains the parameters you want to link to the shared function. - In the Layer Properties editor or view, attach the parameter to the shared function the same way you did for the first layer. The two parameters are both linked to the same function curve and follow the same path. If you modify the curve, both parameters will update.
https://docs.toonboom.com/help/harmony-12/advanced/Content/_CORE/_Workflow/028_Animation_Paths/054_H1_Function_Curves.html
2018-07-15T23:20:09
CC-MAIN-2018-30
1531676589022.38
[array(['../../../Resources/Images/_ICONS/Home_Icon.png', None], dtype=object) array(['../../../Resources/Images/HAR/_Skins/stagePremium.png', None], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/_Skins/stageAdvanced.png', 'Toon Boom Harmony 12 Stage Advanced Online Documentation'], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/_Skins/stageEssentials.png', None], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/_Skins/controlcenter.png', 'Installation and Control Center Online Documentation Installation and Control Center Online Documentation'], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/_Skins/scan.png', None], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/_Skins/stagePaint.png', None], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/_Skins/stagePlay.png', None], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/_Skins/Activation.png', None], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/_ICONS/download.png', None], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/Stage/Paths/HAR11/HAR11_animationPaths_bezierCurve.png', None], dtype=object) array(['../../../Resources/Images/HAR/Stage/Paths/an_velobased_001.png', None], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/Stage/Paths/HAR12/HAR12_select_layer.png', None], dtype=object) array(['../../../Resources/Images/HAR/Stage/SceneSetup/HAR12/HAR12_transformation_tab_ADV.png', None], dtype=object) array(['../../../Resources/Images/HAR/Stage/Paths/HAR12/HAR12_create_fcurve_ADV.png', None], dtype=object) array(['../../../../Skins/Default/Stylesheets/Images/transparent.gif', 'Closed'], dtype=object) array(['../../../Resources/Images/HAR/Stage/Paths/HAR12/HAR12_share_func.png', None], dtype=object) array(['../../../Resources/Images/HAR/Stage/Paths/HAR12/HAR12_share2.png', None], dtype=object) array(['../../../Resources/Images/HAR/Stage/Paths/HAR11/HAR11_animationPaths_shareFxns2.png', None], dtype=object) array(['../../../Resources/Images/HAR/Compositing/HAR11/Draw_layerprop_transformTab_bezier.png', None], dtype=object) ]
docs.toonboom.com
. Effect on virtual machines at protected site None Site Recovery Manager shuts down virtual machines in reverse priority order. the a datacenter network. Interruption of recovery plan You can cancel a test at any time. You can cancel the recovery in some cases. Parent topic: Creating, Testing, and Running Recovery Plans Related concepts Testing a Recovery Plan Performing a Planned Migration or Disaster Recovery By Running a Recovery Plan How Site Recovery Manager Interacts with DPM and DRS During Recovery How Site Recovery Manager Interacts with Storage DRS or Storage vMotion How Site Recovery Manager Interacts with vSphere High Availability Protecting Microsoft Cluster Server and Fault Tolerant Virtual Machines Export Recovery Plan Steps View and Export Recovery Plan History Related tasks Create, Test, and Run a Recovery Plan Cancel a Test or Recovery Delete a Recovery Plan
https://docs.vmware.com/en/Site-Recovery-Manager/5.5/com.vmware.srm.admin.doc/GUID-A276B4AC-D0F0-4659-8241-F5D987525ED0.html
2018-07-15T23:40:05
CC-MAIN-2018-30
1531676589022.38
[]
docs.vmware.com
You can use a unique host name for a load balancer by providing an address and a placeholder in your container settings. The placeholder determines the location of an automatically generated part of the URL. This value is unique for each host name. The address supports the %s format character to specify where the placeholder is located. If the placeholder is not used, it is positioned as a prefix or suffix of the host name, depending on the system configuration. It is recommended you use a load balancer that can target requests to each node if you build an application which includes a service that must be publicly exposed and which must also scale in and out. After you provision the application, the load balancer configuration is updated whenever the service is scaled in or out by vRealize Automation. Network tab. - In the Address text box, enter the location of the placeholder. The address host acts as a virtual host. To access the address host, you can add mapping information in the etc/hosts file or use a DNS that maps the container address to the host name. - In the Container Port text box, enter the port number used to expose the service. Use the sample format provided in the form. If your container application exposes more than one port, specify which internal port or ports can expose the service. - Click Save.
https://docs.vmware.com/en/vRealize-Automation/7.4/com.vmware.vra.prepare.use.doc/GUID-1342E39B-79E0-4495-AA34-75714E166905.html
2018-07-15T23:29:24
CC-MAIN-2018-30
1531676589022.38
[]
docs.vmware.com
changes.mady.by.user Homepage Saved on Mar 05, 2018 Saved on Mar 06, 2018 JWST user documentation, informally known as "JDox," is available as a collection of articles on the Web. Unlike conventional HST handbooks, JDox is intended as an agile, user-friendly source of information that follows the Wikipedia-like Every Page is Page One (EPPO) philosophy. Our goal is to provide short, focused, well-linked articles that provide the kinds of information found in traditional HST instrument handbooks, data handbooks, and calls for proposals. All JDox articles are separated into four sections: (1) JWST Observatory and Instrumentation, (2) JWST Observation Planning, (3) JWST Opportunities and Policies, and (4) JWST Data Calibration and Analysis. These articles provide details about the observatory and instruments, descriptions of tools used for proposing, advice on observing strategies, “cookbooks” that guide users through the proposal preparation process, as well as information about calibration and analysis of JWST data. Downloadable PDF collections of the documentation are provided as a courtesy, made available and updated when feasible. The online documentation is the authority, and will be updated with the latest information. A This page has a list of recently updated JDox articles: JWST JDox Latest Updates JWST Observatory and Instrumentation JWST Opportunities and Policies JWST Observation Planning JWST Data Calibration and Analysis JDox is made possible by the following contributing authors: O. Fox, S. LaMassa, J. Lotz, D. Coe, D. Soderblom, W. Blair, S. Alberts, C. Alves de Oliveria, T. Beck, S. Birkmann, B. Blacker, T. Böeker, M. L. Boyer, G. Brammer, B. Brooks, R. Brown, H. Bushouse, A. Canipe, C. Chen, M. Correnti, M. Cracraft, A. Deshpande, R. Diaz, R. Downes, N. Earl, H. Ferguson, P. Ferruit, J. Filippazzo, P. Forshay, J. Fraine, A. Fullerton, M. Garcia Marin, G. Giardino, K. Gilbert, J. Girard, K. Gordon, C. Gosmeyer, P. Goudfrooij, T. Greene, B. Hagan, A. Henry, B. Hilbert, D. Hines, S. Holfeltz, B. Holler, G. Kanarek, D. Karakla, S. Kassin, S. Kendrew, T. Keyes, A. Koekemoer, D. Law, J. Lee, J. Leisenring, N. Lützgendorf , L. McCuen, S. Milam, A. Moro-Martin, K. Murray, J. Muzerolle, E. Nelan, B. Nickson, M. Oboryshko, M. Pena-Guerrero, M. Perrin, K. Peterson, N. Pirzkal, K. Pontoppidan, C. Proffitt, E. Puga, L. Pueyo, S. Ravindranath, T. Rawle, N. Reid, M. Ressler, A. Riedel, G. Rieke, M. Rieke, J. Rigby, C. Ritchie, M. Robberto, J. Sahlmann, B. Sargent, E. Schlawin, R. Shaw, M. Sirianni, A. Sivaramakrishnan, G. Sloan, G. Sonneborn, J. Stansberry, K. Stevenson, L. Strolger, T. Temim, D. Thatte, L. Ubeda, J. Valenti, K. Volk, G. Wahlgren, B. Williams, C. Willmer, E. Wislowski, B. York Space Telescope Science Institute HELP DESK JWST WEBSITE Report website problems The NASA James Webb Space Telescope, developed in partnership with ESA and CSA, is operated by AURA’s Space Telescope Science Institute.
https://jwst-docs.stsci.edu/pages/diffpagesbyversion.action?pageId=15663174&selectedPageVersions=31&selectedPageVersions=32
2018-07-15T23:19:03
CC-MAIN-2018-30
1531676589022.38
[]
jwst-docs.stsci.edu
Ticket #2165 (new defect) gtk layout differences between 2007.2 and 2008.testing Description Our application: Displayed fine on 2007.2 running: cacao: 0.98+hg20071001-r8 classpath-common: 0.97.2-r5 classpath: 0.97.2-r5 classpath-gtk: 0.97.2-r5 But fails to display properly on the latest 2008.testing with: cacao: 0.99.3-r5.1 classpath-common: 0.97.2-r8.1 classpath: 0.97.2-r8.1 classpath-gtk: 0.97.2-r8.1 Most noticeable differences include the failure to draw the splash screen correctly (is drawn centrally on 2007.2 and in the top left of the screen on 2008) and any pop-up dialogues (in 2008 they often needlessly take up the entire width of the screen. The open file dialogue in 2007.2 is shown as two boxes, one on top of the other, whereas in 2008 they are squashed next to one another). Upgrading cacao and classpath on the 2007.2 installation broke our application in a similar manner. Almost identical problems were suffered when running the application with JamVM. The gvSIG application is available to anyone who would like to conduct testing. Change History comment:2 Changed 10 years ago by iknowjoseph Thanks Zecke, This small java application would display the file open dialogue on 2007.2, but not on 2008.x: Juan Lucas, can we provide an example of the splash screen? Thanks, Joseph When you say Gtk+ you actually mean AWT of GNU classpath implemented with Gtk+? If you want someone else to even consider looking into it you should create a reduction. The most simple java program exposing the issue (for every of your issues).
http://docs.openmoko.org/trac/ticket/2165
2018-07-15T23:15:57
CC-MAIN-2018-30
1531676589022.38
[]
docs.openmoko.org
Table of Contents The MARC Record Attribute Definitions support the ingesting, indexing, searching, filtering, and delivering of bibliographic record attributes. To Access the MARC Record Attributes, click Administration → Server Administration → MARC Record Attributes The MARC Editor includes Fixed Field Drop-down Context Menus, which make it easier for catalogers to select the right values for fixed fields in both Bibliographic and Authority records. You can use the MARC Record Attributes interface to modify these dropdowns to make them better suited for catalogers in your consortium. To edit these menus, you can follow these steps:
http://docs.evergreen-ils.org/reorg/3.1/staff_client_admin/_marc_record_attributes.html
2018-07-15T23:22:34
CC-MAIN-2018-30
1531676589022.38
[]
docs.evergreen-ils.org
Advanced Restore Options (Map) Specify the restore of individual files to a specified destination using an external map file. The map file provides the list of files to be restored and the destination to which the files are to be restored. (See Restore Data Using a Map File in Books Online for detailed information.) Use map file Specifies whether the restore operation must be performed using a mapping file. Enable this option to specify the map file. Map File Path Specifies the name of the map file. Use the space to enter the path and name of the map file. Use the browse button to browse and select the appropriate map file. Restore unmapped files Specifies whether the restore operation must restore the unmapped files that were selected for restore, to the specified restore destination. i.e., in-place restore or out-of-place restore. Clear this option to restore only those files that are included in the specified map file. Rename all restore files with suffix Specify a common suffix to be added to each file as it is restored. For Windows platforms, this suffix is appended to the filename (i.e., before the extension); for Unix/Macintosh platforms, the suffix is appended after the extension.
http://docs.snapprotect.com/netapp/v10/article?p=en-us/universl/restore/map.htm
2018-07-15T23:12:03
CC-MAIN-2018-30
1531676589022.38
[]
docs.snapprotect.com
Returns a two column table showing distinct values in the first column, and summed data in the second. SppSumTableColumnByGroup([Table Array],[Group By Column],[Column To Sum]) Where: Table Array is a Table Array (such as the data in a standard table, or the result of a QueryDataValues function. Group By Column is the column in the Table Array that will be grouped. Column To Sum is the column in the Table Array that will be summed for each unique value in the Group By Column. This table shows an example order and where some items may of been added to an order more than once and there quantities need adding together.
http://docs.driveworkspro.com/Topic/SppSumTableColumnByGroup
2019-03-18T20:38:16
CC-MAIN-2019-13
1552912201672.12
[]
docs.driveworkspro.com
Service Command Line¶ The MPF service cli is a fast way to debug or troubleshoot your machine during development and operation. - Start your game (e.g. using mpf both) - Start the service cli from within your game folder using mpf service. Your game will go into service mode and you can run diagnostics commands. Once you are done the game will continue and exit service mode. You can use tab to complete commands and arguments. exit/quit¶ Exit service cli. Game will reset and start. See mpf service command line reference.
http://docs.missionpinball.org/en/latest/tools/service_cli/index.html
2019-03-18T20:17:04
CC-MAIN-2019-13
1552912201672.12
[]
docs.missionpinball.org
Recent Email Sent Count¶ This Condition located on the Ministry category tab in Search Builder allows you to find people based on how many emails they have sent through TouchPoint for a specified number of days. Enter a whole number as the Value for the number of emails and for the number of days. Use Case You may want to see which of your users are sending frequent emails via TouchPoint. This Condition will tell you that for a certain number of days to look back. This tracks both mass emails and emails sent to single individuals. See also
http://docs.touchpointsoftware.com/SearchBuilder/QB-RecentEmailSentCount.html
2019-03-18T19:21:34
CC-MAIN-2019-13
1552912201672.12
[]
docs.touchpointsoftware.com
If you want to upgrade your installation of Zinnia from a previous release, it’s easy, but you need to be cautious. The whole process takes less than 15 minutes. The first thing to do is a to dump your data for safety reasons. $ python manage.py dumpdata --indent=2 zinnia > dump_zinnia_before_migration.json The main problem with the upgrade process is the database. The Zinnia’s models can have changed with new or missing fields. That’s why Zinnia use South‘s migrations to facilitate this step. So we need to install the South package. $ easy_install south South needs to be registered in your project’s settings as an INSTALLED_APPS. Once it is done, use syncdb to finish the installtaion of South in your project. $ python manage.py syncdb Now we will install the previous migrations of Zinnia to synchronize the current database schema with South. $ python manage.py migrate zinnia --fake We are now ready to upgrade Zinnia. If you want to use the latest stable version use easy_install with this command : $ easy_install -U zinnia or if you prefer to upgrade from the development release, use pip like that : $ pip install -U -e git://github.com/Fantomas42/django-blog-zinnia.git#egg=django-blog-zinnia The database should probably be updated to the latest database schema of Zinnia, South will be useful. $ python manage.py migrate zinnia The database is now up to date, and ready to use.
https://django-blog-zinnia.readthedocs.io/en/v0.8.1/upgrading.html
2019-03-18T19:36:33
CC-MAIN-2019-13
1552912201672.12
[]
django-blog-zinnia.readthedocs.io
You enable and configure certificate authentication from the VMware Identity Manager administration console. Prerequisites Obtain the root certificate and intermediate certificates from the CA that signed the certificates presented by your users. administration console Identity & Access Management tab, select Setup. - On the Connectors page, select the Worker link for the connector that is being configured. - Click Auth Adapters and then click CertificateAuthAdapter. - Configure the Certificate Authentication Adapter page.Note: An asterisk indicates a required field. The other fields are optional. - Click Save. What to do next Add the certificate authentication method to the default access policy. Go to the Identity & Access Management > Manage > Policies page and edit the default policy rules to add Certificate. See Managing Authentication Methods to Apply to Users..
https://docs.vmware.com/en/VMware-Identity-Manager/2.8/com.vmware.wsp-administrator_28/GUID-26C15BA2-3B76-4E53-ABBB-E83271F1B9CB.html
2019-03-18T19:23:50
CC-MAIN-2019-13
1552912201672.12
[]
docs.vmware.com
Welcome to AXEMAS documentation!¶ Development Framework for MultiPlatform hybrid mobile applications. AXEMAS handles the whole navigation of the application and transition between views, while it permits to implement the views content in HTML itself. AXEMAS works using sections, each Section represents the content of the view and is loaded from an HTML file or from an external URL. Whenever native code requires to be attached to a section, it is possible to attach a SectionController to a Section itself. Getting Started¶ To Install AXEMAS you need Python 2.7 with the pip package manager installed as AXEMAS uses Python to generate project skeletons. To install pip follow the Pip Install Guidelines. Then you can install AXEMAS toolkit using: $ pip install axemas To create a new AXEMAS project you can then use the axemas-quickstart command, it will automatically create a new AXEMAS project: $ gearbox axemas-quickstart -n ProjectName -p com.company.example See Quickstarting a New Application for additional details on the gearbox command. Basic Project Introduction¶ By default AXEMAS will create for you a basic application for iOS and Android. Content of the application will be available inside www directory and the application will load www/sections/index/index.html on startup. The application can be run by simply opening in Android Studio or XCode the android and ios projects inside the newly created application directory and then pressing the Run button inside the IDE. To start customizing the application and providing your own code, you can open the www directory in your favourite editor and start editing sections. From the index.html section, you can then use the JavaScript API to push and pop additional sections and implement your whole Application. Binding to Native Code¶ The previous code shows how to load HTML based sections and rely on the JavaScript API to implement your web application. When more advanced features or interaction with the hardware is needed you might need to get to native code level. AXEMAS has been designed specifically to make it as easy as possible to work with native code, the main difference with frameworks like Cordova is explicitly that AXEMAS makes native a first citizen of your application. HTML sections loaded by your application are explicitly declared inside the application code itself and the application window is explicitly create using makeApplicationRootController from the NavigationSectionsManager Inside your AppDelegate for iOS: @implementation AppDelegate - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions { self.window.rootViewController = [NavigationSectionsManager makeApplicationRootController:@[@{ @"title": @"Home", @"url": @"www/home.html", }]]; [self.window makeKeyAndVisible]; return YES; } @end Or in your AXMActivity subclass onCreate() method for Android: public class MainActivity extends AXMActivity { @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); if (savedInstanceState == null) { JSONObject data = new JSONObject(); try { data.put("url", "www/home.html"); data.put("title", "Home"); } catch (JSONException e) { e.printStackTrace(); } NavigationSectionsManager.makeApplicationRootController(this, data); } } } The binding between the HTML Sections and native code is performed using SectionControllers, to link a section to a section controller is as easy as registering the controller class for the specified route: [NavigationSectionsManager registerController:[MySecionController class] forRoute:@"www/mysection.html"]; NavigationSectionsManager.registerController(this, MySecionController.class, "www/mysection.html"); To get started using SectionControllers read the iOS API and Android API. Contents: - iOS API - Android API - JavaScript API - Utilities - AXEMAS CookBook - Quickstarting a New Application - Maintain AXEMAS
https://axemas.readthedocs.io/en/latest/
2019-03-18T19:45:16
CC-MAIN-2019-13
1552912201672.12
[]
axemas.readthedocs.io
The Spell Checker options page contains the CodeRush Classic Spell Checker options. The Enabled option specifies the spell checker availability. Other options are organized into the following groups. Context Options of the Context group specify which elements are checked by spell checker. The following elements are available for spell check: Options This group includes the main spell checker options. Culture Specifies the current culture. Check designer files Specifies whether to check spelling in designer files. Ignored elements: Specify text elements that should be ignored by spell checker. The following elements can be ignored: Dictionaries The Dictionaries group enables you to manage spell checker dictionaries. The dictionary list is located at the left part of the group. The selected dictionary options are located to the right of the list. To add a dictionary, specify its name and type, and click Add. Spell checker supports the following dictionary types: The Delete button enables you to remove the selected dictionary. See How to: Add a Dictionary for Spell Checker for step-by-step instruction of adding a dictionary to Spell Checker. This product is designed for outdated versions of Visual Studio. Although Visual Studio 2015 is supported, consider using the CodeRush extension with Visual Studio 2015 or higher.
https://docs.devexpress.com/CodeRush/9209/coderush-options/editor/spell-checker
2019-03-18T19:29:13
CC-MAIN-2019-13
1552912201672.12
[]
docs.devexpress.com
Decommissioning a Node Why you might be interested in this guide - You are decomissioning a host running calico/node or removing it from your cluster. - You are renaming a Node. - You are receiving an error about an IP address already in use. -/node container should be stopped on the corresponding host and it should be ensured that it will not be restarted. - You must have calicoctl configured and operational to run the commands listed here. Removing a Calico Node resource Note: Removing a Node resource will also remove the Workload Endpoint, Host Endpoint, and IP Address resources and any other sub configuration items associated with that Node. Warning - Calico Node resource See the example below for how to remove a node with the calicoctl command. Caution See the Warning above calicoctl delete node <nodeName> Removing multiple Calico: v1 kind: node metadata: name: node-02 - apiVersion: v1 kind: node metadata: name: node-03 To delete the nodes listed in the file pass it like below. Caution See the Warning above calicoctl delete -f nodes_to_delete.yaml
https://docs.projectcalico.org/v2.2/usage/decommissioning-a-node
2019-03-18T19:32:31
CC-MAIN-2019-13
1552912201672.12
[]
docs.projectcalico.org
Contents Now Platform Capabilities Previous Topic Next Topic Attachment limit properties Subscribe Log in to subscribe to topics and get notified when content changes. ... SAVE AS PDF Selected Topic Topic & Subtopics All Topics in Contents Share Attachment limit properties Several properties control email attachment limits. Properties All the properties are located in the System Property [sys_properties] table. Setting any of the following properties to an excessively large value may cause performance issues. Table 1. Attachment limit properties Name Description glide.email.inbound.max_attachment_count Sets the maximum number of attachments allowed per inbound email. Type: integer Default value: 30 Learn more: Inbound Email Attachment Processing glide.email.inbound.max_total_attachment_size_bytes Sets the maximum total attachment size in bytes allowed per inbound email. Type: integer Default value: 18874368 Learn more: Inbound Email Attachment Processing glide.email.outbound.max_attachment_count Sets the maximum number of attachments allowed per outbound email. Type: integer Default value: 30 Learn more: Outbound Email Attachment Processing glide.email.outbound.max_total_attachment_size_bytes Sets the maximum total attachment size in bytes allowed per outbound email. To send an email, the system must encode the contents of the email. This process may significantly increase the size of the email, including any attachments. It is best to set this property to a value well below the maximum email size. Type: integer Default value: 18874368 Learn more: Outbound Email Attachment Processing Note: A different property, com.glide.attachment.max_size, sets the maximum file size allowed for any attachment in the system and overrides any larger values of glide.email.inbound.max_total_attachment_size_bytes and glide.email.outbound.max_total_attachment_size_bytes. Inbound email attachment processing For inbound emails, the system enforces the maximum number and size of attachments as set by the glide.email.inbound.max_attachment_count and glide.email.inbound.max_total_attachment_size_bytes properties. When the attachments for an inbound email exceed either value, the system logs a warning and discards the excess attachments. The order in which the system processes the attachments determines which attachments are discarded. This order may not be consistent from email to email. Outbound email attachment processing For outbound emails, the system enforces the maximum number and size of attachments as set by the glide.email.outbound.max_attachment_count and glide.email.outbound.max_total_attachment_size_bytes properties. Email records are created from various sources and may exceed the configured attachment limits. Emails that are ready to be sent from the Email [sys_email] table are subject to the outbound attachment limits. Emails that exceed either limit trigger a warning in the email system log and are sent with attachments up to the maximum number or total file size. The log message for such an email might look like this: Maximum combined attachment size exceeded. (max:15728640 bytes). One or more attachment records ignored. Emails for notifications, scheduled reports, and exported tables Notifications can be set to include all the attachments from the record that triggers the notification. If the attachments exceed either of the outbound email attachment limits, the system excludes the excess attachments from the email and logs a warning message. Reports can be scheduled for email distribution as attachments. Large reports may exceed the outbound attachment size limit. In this case, the system sends the scheduled report email without the report attached and logs a warning message. To avoid the issue, send links to large reports instead of sending the reports as attachments. If a user attempts to export numerous records from a list that exceeds a configured warning threshold, a dialog box offers the option to email the exported records as an attachment. If the attachment exceeds the outbound attachment size limit, the system sends the email without the exported record list attached and logs a warning message. On this page Send Feedback Previous Topic Next Topic
https://docs.servicenow.com/bundle/istanbul-servicenow-platform/page/administer/notification/reference/r_AttachmentLimitProperties.html
2019-03-18T20:15:16
CC-MAIN-2019-13
1552912201672.12
[]
docs.servicenow.com
A Distributed Execution Manager (DEM) runs the business logic of custom models, interacting with the database and with external databases and systems as required. specificprocesses workflows before they are run, including checking preconditions for workflows, used in the implementation of the RunOneOnly feature, and creating the workflow execution history. One DEM Orchestrator instance is designated as the active Orchestrator that performs these tasks. Because the DEM Orchestrator is essential to run workflows, install at least one additional Orchestrator instance on a separate machine for redundancy. The Orchestrator is automatically installed on the machine that also runs the Manager Service. The additional DEM Orchestrator monitors the status of the active Orchestrator so that it can take over if the active Orchestrator goes offline.
https://docs.vmware.com/en/vRealize-Automation/7.0/com.vmware.vrealize.automation.doc/GUID-1C5EE1A9-1464-4AB9-B14F-001738D8597F.html
2019-03-18T20:10:48
CC-MAIN-2019-13
1552912201672.12
[]
docs.vmware.com
A storage pool is a collection of SSDs. You can combine SSDs to create a storage pool, which enables you to share the SSDs and SSD spares across multiple Flash Pool aggregates, at the same time. Storage pools consist of allocation units, which you can use to provide SSDs and SSD spares to aggregates or to increase the existing SSD size. After you add an SSD to a storage pool, you can no longer use the SSD as an individual disk. You must use the storage pool to assign or allocate the storage provided by the SSD.
http://docs.netapp.com/ontap-9/topic/com.netapp.doc.onc-sm-help-900/GUID-4245DF08-5537-46D6-B9EA-98B8B2A525BD.html
2019-03-18T19:46:10
CC-MAIN-2019-13
1552912201672.12
[]
docs.netapp.com
, and find its Neutron port ID. nova boot [...] neutron port-list # [...]
https://docs.projectcalico.org/v2.2/usage/openstack/floating-ips
2019-03-18T19:24:45
CC-MAIN-2019-13
1552912201672.12
[]
docs.projectcalico.org
Has Membership Documents¶ This condition located on the Miscellaneous category tab in Search Builder has a value of True or False and allows you to find everyone that has (or does not have) an uploaded document on the Profile > Document tab on their people record. Use Case You could combine this with the Join Date condition to find everyone who joined your church in the past year to see if their their decision information sheet was uploaded to their record. See also
http://docs.touchpointsoftware.com/SearchBuilder/QB-HasMemberDocs.html
2019-03-18T19:21:16
CC-MAIN-2019-13
1552912201672.12
[]
docs.touchpointsoftware.com
This topic explains how you can quickly extend the scope and comprehensiveness of your functional testing by parameterizing tests with values that are stored in data sources or extracted from other tests. Parameterization can be applied to test inputs as well as data validation. Sections include: - Parameterizing Tests with Values Extracted from Another Test - Parameterizing Tools with Variables - Understanding How SOAtest Performs Functional Testing Using Data Sources - Adding a Data Source - Adding a Data Source at the Test Suite, Project, or Global Level - Generating a Data Source Template for Populating Message Elements - Cutting, Copying, and Pasting Data Sources - Performing Functional Tests Using Data Sources - Using Interpreted Data Sources - Populating and Parameterizing Elements with Data Source Values
https://docs.parasoft.com/pages/viewpage.action?pageId=33854225
2019-03-18T19:51:42
CC-MAIN-2019-13
1552912201672.12
[]
docs.parasoft.com
Note Please make sure that the amagama-manage command is accessible in order to be able to use it. amaGama is managed through the amagama-manage command. Try running it with no arguments for usage help: $ amagama-manage The amagama-manage command exposes several management subcommands, each having it’s own --help option that displays its usage information: $ amagama-manage SUBCOMMAND --help See below for the available subcommands. These are the available management subcommands for amaGama: This subcommand benchmarks the application by querying for all strings in the given file. Note For more information please check the help of this subcommand. This subcommand is used to import translations into amaGama from bilingual translation files. Please refer to the importing translations section for a complete usage example. This subcommand is used to optimize the database for deployment. It has no options: $ amagama-manage deploy_db This will permanently alter the database. Continue? [n] y Succesfully altered the database for deployment. This subcommand is used to drop the tables for one or more source languages from the amaGama database: $ amagama-manage dropdb -s fr -s de This will permanently destroy all data in the configured database. Continue? [n] y Succesfully dropped the database for 'fr', 'de'. This subcommand is used to create the tables in the database for one or several source languages. It can be run several times to specify additional source languages. The following example creates the tables for english and french: $ amagama-manage initdb -s en -s fr Succesfully initialized the database for 'en', 'fr'. This subcommand is used to print out some figures about the amaGama database. It has no options: $ amagama-manage tmdb_stats Complete database (amagama): 400 MB Complete size of sources_en: 234 MB Complete size of targets_en: 160 MB sources_en (table only): 85 MB targets_en (table only): 66 MB sources_en sources_en_text_idx 83 MB targets_en targets_en_unique_idx 79 MB sources_en sources_en_text_unique_idx 53 MB targets_en targets_en_pkey 16 MB sources_en sources_en_pkey 13 MB
http://docs.translatehouse.org/projects/amagama/en/latest/managing.html
2019-03-18T19:43:17
CC-MAIN-2019-13
1552912201672.12
[]
docs.translatehouse.org
Warning: -file- is being assigned a //# sourceMappingURL, but already has one. A warning. JavaScript execution won't be halted.. Setting a source map by using a comment in the file: //# sourceMappingURL= Or, alternatively, you can set a header to your JavaScript file: X-SourceMap: /path/to/file.js.map © 2005–2018 Mozilla Developer Network and individual contributors. Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://docs.w3cub.com/javascript/errors/already_has_pragma/
2019-03-18T19:39:21
CC-MAIN-2019-13
1552912201672.12
[]
docs.w3cub.com
The Carbon Fluid skin is available for selection from the Web Theme. To select a different Skin follow the steps below: DriveWorks Administrator is required to be run as an administrator when modifying the theme files. To do this: The credentials used to Login must have the setting Administer Group Security applied. See the section To Edit a Team's Members and Permissions in the topic Security Settings for more information. The address bar of your browser will display The setup page below will now be displayed. See also Welcome to DriveWorks Pro 11 -What's New
http://docs.driveworkspro.com/Topic/WhatsNewDriveWorks11CarbonFluid
2019-03-18T20:34:00
CC-MAIN-2019-13
1552912201672.12
[]
docs.driveworkspro.com
Contents Service Management Previous Topic Next Topic Email notifications installed with Facilities Service Management Subscribe Log in to subscribe to topics and get notified when content changes. ... SAVE AS PDF Selected Topic Topic & Subtopics All Topics in Contents Share Email notifications installed with Facilities Service Management Email notifications are a way to send selected users email or SMS notifications about specific activities in Facilities Service Management. Facilities Service Management adds the following email notifications. Notification Description Facilities Request is assigned Sends an email message to the facilities staff member who is assigned to the facilities request. On this page Send Feedback Previous Topic Next Topic
https://docs.servicenow.com/bundle/istanbul-service-management-for-the-enterprise/page/product/facilities-service-management/reference/r_EmailNotsInstWithFacServMgmt.html
2019-03-18T20:16:06
CC-MAIN-2019-13
1552912201672.12
[]
docs.servicenow.com
CondorPy¶ Contents: - User Manual - Overview of HTCondor - API Reference Description¶ Condorpy is a wrapper for the command line interface (cli) of HTCondor and enables creating submitting and monitoring HTCondor jobs from Python. HTCondor must be installed to use condorpy. Getting Started¶ >>> from condorpy import Job, Templates >>> job = Job('job_name', Templates.vanilla_transfer_files) >>> job.>> jobs.>> job.>> job.>> job.submit() Acknowledgements¶ This material was developed as part of the CI-Water project which was supported by the National Science Foundation under Grant No. 1135482
https://condorpy.readthedocs.io/en/latest/
2020-10-20T02:38:29
CC-MAIN-2020-45
1603107869785.9
[]
condorpy.readthedocs.io
DKAN Overview¶ DKAN is a open data platform with a full suite of cataloging, publishing and visualization features that allows governments, nonprofits and universities to easily publish data to the public. DKAN is maintained by CivicActions. See examples of DKAN sites from around the world. DKAN is a Drupal-based open data portal based on CKAN, the first widely adopted open source open data portal software. CKAN stands for Comprehensive Knowledge Archive Network. It has inspired at least one other variant - JKAN, which is built on Jekyll.
https://dkan.readthedocs.io/en/latest/introduction/index.html
2020-10-20T03:13:42
CC-MAIN-2020-45
1603107869785.9
[]
dkan.readthedocs.io
Extending Delta Data Migration to include customizations You can add custom forms to the Delta Data Migration package so that you can migrate the data in these custom forms. You can do this manually or with the Custom Form Instruction Generation tool. You can also update a field mapping file to correct customer-defined fields in the BMC Remedy reserved range. Manually adding custom forms to the package - Open the Custom_Form_Instructions.xml instruction file in the <migratorInstallFolder>\DeltaDataMigration\Packages\Custom folder. The file will contain information that is similar to the instructions XML example. - Provide your custom form name and the unique field IDs (unique index field IDs) in their respective tags. Follow the same process for all of the forms that you need to add. - Open the Custom_Form_Package.xml package file in the <migratorInstallFolder>\DeltaDataMigration\Packages\Custom folder. Provide the instruction XML file names in the package XML file: - Save the instruction and package XML files. Now, you are ready to run the migrate and compare scripts for the custom package. The new package will run in parallel in a separate command window in the same way as the Delta Data Migration out-of-the-box package files. Adding custom forms to the package by using the Custom Form Instruction Generation tool If you do not have the list of your custom (non-BMC) regular forms, run the following batch files as outlined in the following procedure: - The migratorFindCustomForms.bat utility — Finds all of your custom forms from the AR System server that are not recognized as BMC Software forms. The utility generates a CSV file that includes the list of all custom form names with their unique indexes. - The migratorCSV2Instructions.bat utility — Uses the generated CSV file as the input, and creates the Custom_Form_Instructions.xml file for the custom forms in the CSV file. To add custom forms to the package by using the Custom Form Instruction Generation tool - Navigate to the <MigratorInstallFolder>\Migrator\migrator\DeltaDataMigration\Utilities\migratorUtilities folder. Run the migratorFindCustomForms utility by using the following syntax: For example: - Open the Customforms.csv output file in a text editor or spreadsheet application. - If a form is included in the list but should not be migrated, remove the entire line. Forms that are used for testing or to keep temporary data should not be included. If you are not sure, it is better to include the form in the migration. Migrating a form multiple times is permitted. Note You can save the names of forms to be excluded in a separate file, then you can use that file the next time you run migratorFindCustomForms. - Save the changes you made to the Customforms.csv file. Run the migratorCSV2Instructions utility by using the following syntax: This utility generates an instruction file that BMC Remedy Migrator reads and uses for the migration. - Verify that the output file is Custom_Form_Instructions.xml. - Open Custom_Form_Instructions.xml file, and ensure that the name inside of the xml file has the same name "Custom_Form_Instructions." - Copy the Custom_Form_Instructions.xml files to the Packages\Custom directory. (You can overwrite the same file in the directory.) Now, the custom package is ready to be used. On the DDM user interface, when you select Custom, this custom package is selected, and the migration for the custom forms is executed. Updating a field mapping file if you ran ARCHG. - Under the packages folder, open the custom package folder. - Open the instruction file that needs its form mapping information to be updated. For some reason, if you have more than one instruction file, open the file which contains the form for which the delta data mapping is available. - Create a mapping (.arm) file and map the custom source field ID to the destination field ID (which has a new ID after running the ARCHGIDutility). The following figure shows an example of a mapping file: - Add the mapping file name to the form reference in the instruction file, as shown in the following example: Where to go from here Post-migration procedures In Step 3, Consider adjusting the path from: <migratorInstallFolder>\DeltaDataMigration\Custom folder., to: <migratorInstallFolder>\DeltaDataMigration\Packages\Custom folder. The Custom was not found under the DDM folder on a 10/1 product download/install. Thanks David, I fixed it, too, in Step 1.
https://docs.bmc.com/docs/itsm81/extending-delta-data-migration-to-include-customizations-225970409.html
2020-10-20T03:33:54
CC-MAIN-2020-45
1603107869785.9
[]
docs.bmc.com
#include <descriptionimp.hxx> Definition at line 28 of file descriptionimp.hxx. Definition at line 38 of file descriptionimp.cxx. Definition at line 44 of file descriptionimp.cxx. This method is called for all characters that are contained in the current element. The default is to ignore them. Reimplemented from SvXMLImportContext. Definition at line 72 of file descriptionimp.cxx. EndElement is called before a context will be destructed, but after an elements context has been parsed. It may be used for actions that require virtual methods. The default is to do nothing. Reimplemented from SvXMLImportContext. Definition at line 48 of file descriptionimp.cxx. References Any, SvXMLImportContext::GetLocalName(), xmloff::token::IsXMLToken(), msText, mxShape, and xmloff::token::XML_TITLE. Definition at line 32 of file descriptionimp.hxx. Referenced by Characters(), and EndElement(). Definition at line 31 of file descriptionimp.hxx. Referenced by EndElement().
https://docs.libreoffice.org/xmloff/html/classSdXMLDescriptionContext.html
2020-10-20T03:09:10
CC-MAIN-2020-45
1603107869785.9
[]
docs.libreoffice.org
Design Configuration The Design Configuration makes it easy to edit design-related rules and configuration settings by displaying the settings on a single page. Design Configuration Edit the design configuration On the Admin sidebar, go to Content > Design > Configuration. Find the store view that you want to configure. Then, click Edit in the Action column. The page displays the current design settings for the store view. To change the Default Theme, set Applied Theme to the theme that you want to apply to the view. If no theme is specified, the system default theme is used. Some third-party extensions modify the system default theme. If the theme is to be used for only a specific device, do the following: Under Design Rule section under User Agent Rules, click Add New User Agent Rule. In the Search String column, enter the browser ID for the specific device. A search string can be either a normal expression or Perl Compatible Regular Expression (PCRE). To learn more, see: User Agent. The following search string identifies Firefox: /^mozilla/i In the Theme Name column, choose the theme that is to be used for the specified device. User-Agent Rules Repeat the process to enter additional devices. Search strings are matched in the order they are entered. Under Other Settings, expand each section. Then, follow the instructions in the linked topics to edit the settings as needed. Edit Design Configuration When complete, click Save Configuration.
https://docs.magento.com/user-guide/v2.3/design/configuration.html
2020-10-20T04:06:36
CC-MAIN-2020-45
1603107869785.9
[]
docs.magento.com
CCPA Compliance Guide. This topic provides a high-level outline of the steps required for Magento merchants to comply with privacy regulations such as the California Consumer Privacy Act (CCPA). GDPR and CCPA If your business is required to comply with both the General Data Protection Regulation (GDPR) and the. Compliance Roadmap A coordinated effort is required to develop and implement a plan to address compliance. Use this roadmap as a guide to mobilize resources and prioritize tasks so you can move ahead on multiple fronts. The process is essentially the same for all installations of Magento, with the following exception: Adobe Commerce Cloud: Merchants with stores hosted on Adobe Commerce Cloud can ask their Magento Technical Account Manager or Customer Support for help responding to consumer requests. On-Premise Magento: Merchants with on-premise installations of Magento must develop their own processes and tools to respond to and manage consumer requests related to privacy regulations. Step 1: Assemble a cross-functional team to address regulation compliance. Assemble a team that represents the following functional roles in your business, and schedule a training session to bring them up to speed on the pending legislation. Then, assign required tasks to stakeholders by role. - Business Strategy & Operations - Legal - Information Technology - User Experience - Customer Service - Administrative Support From a business perspective, you must determine if your company will extend these privacy protection measures only to consumers in California, or make them available to all consumers, regardless of location. Step 2: Take inventory of your digital properties. Stakeholders: Information Technology, Legal, Administrative Support Take inventory of your digital properties, including all integrations and those who have access to your consumer data. Determine what public and private personal information is collected through your website(s) and mobile application(s). For example, a standard Magento database stores the following types of public and private personal information: Public: Wish Lists, Product Reviews Private: Customer Information, Order Information, Reward Points, Gift Registry, Address Book, Store Credit, Payment Methods, Billing Agreements, Newsletter Subscriptions, Invitations. If your Magento installation has been customized, additional personal information might be collected. Personal information might also reside in cookies, tags, and other technologies that collect information. Identify the parties with whom you share data. Your list will include service providers and third parties such as advertising networks, internet service providers, data analytics providers, government entities, operating systems and platforms, social networks, and consumer data resellers who do not directly collect personal information from your consumers. Service Providers: Those: Those with whom you share or sell consumer data. For example, you might share consumer data with an advertising network in exchange for advertising. Step 3: Map the customer journey and data collection process in your store(s). Stakeholders: User Experience, Information Technology, Administrative Support Identify each point in the customer journey where personal information is collected, and the type of information that is collected at each step. Visitors to your site must be notified in advance, or at the point of data collection. For example, a Magento of Magento: Storefront Data Entry Points Step 4: Establish procedures and mechanisms to respond to customer requests. Busineses within the scope of CCPA (Brands): Magento merchants collect and store personal information from their customers and guests who make purchases in their store(s). Data Processor (Technology Vendors): Adobe (Magento Commerce) acts as a processor of the personal data that is stored as part the services provided to merchants. As a processor, Adobe processes personal data in accordance with the merchant’s permission and instructions, as set forth in the license agreement. Merchants are responsible to do the following: Identify the parties involved in the Data Subject Access Request (DSAR), and verify the identity of doing so is not possible (in which case, provided that the merchant notifies the customer to explain the reason for the delay. Develop a mechanism to present the required notifications in your store, and to collect consumer response. Establish procedures to respond to and document each of the following requests: Requests to Know: Visitors to your store must be informed of any arrangement(s) that you have to sell or share their personal information with third parties, and be given the opportunity - Merchants whose stores are hosted on Adobe Commerce Cloud should contact Magento Support for assistance deleting personal information. Contact your Magento Technical Account Manager or Customer Support for more information. - Merchants running installations of Magento on premise must implement their own process and script to delete personal information upon request. Step 5: Write the content for the required customer notifications. the same language(s): - Specific pieces of personal information that you have collected about the consumer - Categories of personal information that you have collected about the consumer - Categories of sources from which the personal information is collected - Categories of personal information about the consumer that you have sold or disclosed for a business purpose - Categories of third parties to whom the personal information was sold or disclosed for a business purpose - The reasons why your business collects and/or sells personal information Send the content to the team, and if possible, your legal counsel for review. Determine where the notices will appear, how they will function (for each visit, appear only when user is authenticated, or on click-through) and their position and format in relation to other content. Pass the approved content to your development team. Step 6: Review your agreements with service providers. Stakeholders: Legal, Administrative Support Review and if necessary, update all service provider contracts to reflect CCPA requirements. Step 7: Update your privacy policy. Stakeholders: Legal, Administrative Support Review your current privacy policy and consider what, if any, additional disclosures are necessary. Use of Personal Information: You must disclose what personal information is collected, as well set forth in the company’s privacy policy. When a business receives requests from minors in this age range, it must inform them of their right to opt out at a later date, and explain how to do it. Merchants are prohibited from storing the personal data of children on our platform or systems. If there is reason to believe collected data belongs to a minor, it must be removed from our platform immediately to avoid breach of Magento license terms. Step 8: Document all related procedures and maintain records. Stakeholders: Customer Service, Administrative Support For a period of 24 months after each individual rights request is received, maintain a record of the request and your company’s response.
https://docs.magento.com/user-guide/v2.3/stores/compliance-ccpa-guide.html
2020-10-20T03:57:48
CC-MAIN-2020-45
1603107869785.9
[]
docs.magento.com
Examples for Creating External Tables A newer version of this documentation is available. Use the version menu above to view the most up-to-date release of the Greenplum 5.x documentation. Examples for Creating External Tables These &
https://gpdb.docs.pivotal.io/5240/admin_guide/external/g-creating-external-tables---examples.html
2020-10-20T02:22:11
CC-MAIN-2020-45
1603107869785.9
[]
gpdb.docs.pivotal.io
Managing VPN profiles A VPN profile contains the information that you need to log in to your organization's network over a VPN or Wi-Fi connection. Depending on your organization, you might have more than one VPN profile on your BlackBerry device. For more information on VPN profiles, contact your administrator. Add a VPN profile. - On the home screen, swipe down from the top of the screen. - Tap Settings > Network Connections > VPN > Add. - Complete the fields. If you don't have the required information, contact your administrator. - Tap Save. Was this information helpful? Send us your comments.
http://docs.blackberry.com/en/smartphone_users/deliverables/57147/als1342455150603.jsp
2014-12-18T05:57:34
CC-MAIN-2014-52
1418802765616.69
[]
docs.blackberry.com
Authentication Authentication Options - S3 Signature Authentication - Module name: riak_cs_s3_auth - Documentation - Keystone Authentication - Module name: riak-cs_keystone_auth - Documentation S3 Passthru Authentication - Module name: riak_cs_s3_passthru_auth - This module requires a valid user key_idto be included in the Authorizationheader value, but no signature is required. For example, a valid header using this authentication module would look like this: Authorization: AWS 4REM9H9ZKMXW-DZDC8RV. Warning: This module is only intended for use in development or testing scenarios. Selecting an authentication method is done by adding or changing the auth_module key in the Riak CS app.config file. For example, to instruct Riak CS to use S3-style request signing as the means of authentication, ensure that the following is contained in the app.config in the riak_cs section: {riak_cs, [ %% Other configs {auth_module, riak_cs_s3_auth}, %% Other configs ]} S3-style authentication is used by default. S3 Authentication The primary authentication scheme available to use with Riak CS is the S3 authentication scheme. A signature is calculated using several elements from each request and the user's key_id and key_secret. This signature is included in the Authorization header of the request. Once a request is received by the server, the server also calculates the signature for the request and compares the result with the signature presented in then Authorization header. If they match then the request is authenticated; otherwise, the authentication fails. Full details are available in the S3 authentication scheme documentation. Query String Authentication Riak CS also supports authentication using a query parameter. This allows issuing of pre-signed requests that can be used to grant public access to private Riak CS data. It also supports an expiry timestamp so that the pre-signed URL can be invalidated after a certain period of time. The signature in the query string secures the request and you can specify any future expiration time in epoch or UNIX time. - Create a query - Specify an expiration time for the query - Sign it with your signature - Place the data in an HTTP request - Distribute the request to a user or embed the request in a web page Query String Parameters Example For example, a query URL is similar to the following example. Keystone Authentication More information on using Keystone for authentication with Riak CS can be found in using Riak CS with Keystone.
http://docs.basho.com/riakcs/latest/cookbooks/Authentication/
2014-12-18T05:34:47
CC-MAIN-2014-52
1418802765616.69
[]
docs.basho.com
. For example, when an object is constructed, the MetaClass's invokeConstructor()is called. One feature of the invokeConstructor allows us to create groovy objects using a map argument to set the properties of the object (new X([prop1: value1, prop2: value2])). These solutions perform complete replacements, where as a more scoped solution can be found at Using the Proxy Meta Class. InvokeHelper Solution This technique installs the meta class at runtime using the InvokerHelper to gain access to the registry which allows us to change the meta class instance that is in use. The behaviour for objects created before the change depends on whether they are Java or Groovy objects. Groovy objects "cache" their metaclass, so once they're created, changing the registry has no effect. However, Java objects don't have a cache, so accesses to them always start with a trip to the registry, meaning any registry changes affect existing and new instances alike. Solution. Precedence So what would happen if you used both techniques. Assume that the package convention class exists in your class path and you create and set another meta class. The answer is that the last setMetaClass that you did applies to the usages of all instance of the effected type.
http://docs.codehaus.org/pages/diffpagesbyversion.action?pageId=71503&selectedPageVersions=12&selectedPageVersions=13
2014-12-18T05:54:59
CC-MAIN-2014-52
1418802765616.69
[]
docs.codehaus.org
. - Magnolia CMS provides full Groovy support starting with version 4.3, see Boris' blog post about Groovy support in Magnolia CMS (and Greg's video!) Web frameworks - NanoWeb - RIFE - Struts 2 Create Struts 2 actions in Groovy - Simple Advanced Groovy templating, provides a lightweight version of Struts Tiles - AribaWeb Full stack component-based framework for creating AJAX-enabled apps; embeds Groovy with full support for scripting components and business objects. - Groovy Tapestry - GvTags Template engine with tag lib support for Groovy and JSP tag library - Woko: POJOs on the Web! Full stack DDD framework for building Object Oriented, Multi Profile webapps, with support for Groovy - OOWeb Lightweight, fast, object-oriented web framework. Can be embedded by groovy. - Gracelets A combination of Groovy and JSF/Facelets allowing you to write simple, compact and reloadable pages, controllers and libraries all inside the Servlet/JSF framework. Learn more about it here. - Jaiwls Component oriented Framework with bundled Servlet-Webserver (Jetty) and HTML Widget library. Can use static compiled java-classes or runtime Groovy-scripts. Server Software - Easy Other - GroovyRules is a JSR-94 compliant lightweight rules engine that permits defining rules in Groovy - - jlabgroovy : a Matlab-like scientific programming environment based on Groovy with extensive plotting support, access to Java scientific libraries and user-friendly user interface
http://docs.codehaus.org/pages/viewpage.action?pageId=228164093
2014-12-18T05:57:02
CC-MAIN-2014-52
1418802765616.69
[]
docs.codehaus.org
these bindings do not overlap with the default surefire bindings. To use the Maven Failsafe Plugin you need to add the following to your pom.xml file. <project> [...] <build> <plugins> <plugin> <artifactId>maven-failsafe-plugin</artifactId> <version>2.6</version> <executions> <execution> <goals> <goal>integration-test</goal> <goal>verify</goal> </goals> </execution> </executions> </plugin> </plugins> </build> [...] </project> build if necessary. Here is an example using jetty for hosting an integration test environment: <project> [...] <build> <plugins> <plugin> <artifactId>maven-failsafe-plugin</artifactId> <version>2.6</version> <executions> <execution> <goals> <goal>integration-test</goal> <goal>verify</goal> </goals> </execution> </executions> </plugin> <plugin> <groupId>org.mortbay.jetty</groupId> <artifactId>maven-jetty-plugin</artifactId> <version>6.1.16</version> <configuration> <scanIntervalSeconds>10</scanIntervalSeconds> <stopPort>8005</stopPort> <stopKey>STOP</stopKey> <contextPath>/</contextPath> <> </project> ?
http://docs.codehaus.org/pages/viewpage.action?pageId=63286
2014-12-18T05:52:43
CC-MAIN-2014-52
1418802765616.69
[]
docs.codehaus.org
Getting Set Up Preliminary Notes There are some open tickets (31, 11) about getting the websites repository building on Fedora with Python 3. However, it should be able to be built on Fedora 30, using Python 2 packages, for now. Fedora 29 Alternative Note, the dependencies below may not currently resolve due to package renaming in Fedora 30. If the dependencies do not resolve in Fedora 30, running Flask from a Fedora 29 container is an alternative. An example to run the Fedora 29 container using podman is as follows: # Expose the default port for Flask, 5000. podman run --expose 5000 --net=host --privileged -v /path/to/websites/repo:/path/to/websites/repo -ti registry.fedoraproject.org/fedora:29 Dependencies dnf install \ git \ python-flask \ python-frozen-flask \ python-flask-assets \ python-rjsmin \ python-cssmin \ python-flask-babel \ python-flask-htmlmin \ python-cssutils \ rubygem-sass \ babel \ python3-jinja2 \ python-pyyaml \ python-dateutil \ python-dogpile-cache \ python-requests \ python-zanata-client Pull Strings Now we need to pull the current translations. After installing the dependencies above, you can cd into sites/getfedora.org/ and run: ./scripts/pull-translations.sh. Once the dependencies are installed and translations are pulled, you can do one of two things: Use the Development Server One option is to use the Flask built-in development server. This is handy because it prevents you from needing to build the websites every time you change something. However, it bypasses the Frozen-Flask system which creates are static sites, entirely, so it’s not entirely an accurate representation of what goes live. To use the development server: export FLASK_APP=main.py flask run --reload Now you many visit to get to the site. Use Apache Alternatively, you may simply run python main.py. You’ll get a statically built site in the ./build/ directory. However, the HTML files are all language-code-suffixed as Apache/httpd MultiViews system expects (e.g. index.html.en). As a result you cannot just run a simple HTTP server (like python -m SimpleHTTPServer) in the ./build/ directory, unfortunately. You can, however, set up an Apache on your system, and point it to the ./build/ directory.
https://docs.fedoraproject.org/uk/websites/setup/
2021-04-10T14:48:43
CC-MAIN-2021-17
1618038057142.4
[]
docs.fedoraproject.org
Use persistent queues to help prevent data loss Persistent queuing lets you store data in an input queue to disk. In a Splunk Cloud deployment, persistent queues can help prevent data loss if a forwarder that you configured to send data to your Splunk Cloud instance backs up. In a Splunk Enterprise deployment, persistent queues work for either forwarders or indexers. You can't configure persistent queues directly on a Splunk Cloud instance. By default, forwarders and indexers have an in-memory input queue of 500 KB. If the input stream runs at a faster rate than the forwarder or indexer can process, to a point where the input queue on the forwarder maxes out, undesired consequences occur. In the case where you send network data over the UDP protocol, that data drops off of the queue and gets lost. For other types of data inputs, the application that generates the data can get backed up. By implementing persistent queues, you can help prevent this data drop or loss from happening. With persistent queuing, after the in-memory queue is full, the forwarder or indexer writes the input stream to files on disk. It then processes data from the in-memory and disk queues until it reaches the point when it can again start processing directly from the data stream. While persistent queues help prevent data loss if processing gets backed up, you can still lose data if the forwarder or indexer crashes. For example, the forwarder holds some input data in the in-memory queue as well as in the persistent queue files. The in-memory data can get lost if the forwarder crashes. Similarly, data that is in the parsing or indexing pipeline but that has not yet been written to disk can get lost in a crash. When can you use persistent queues? Persistent queuing is available for certain types of inputs, but not all. Generally speaking, persistent queuing is available for inputs of an ephemeral nature, such as network inputs, but isn't available for inputs that have their own form of persistence, such as monitoring files. Persistent queues are available for these input types: - Network inputs that use the TCP protocol - Network inputs that use the UDP protocol - First-In, First-Out (FIFO) inputs - Scripted inputs - Windows Event Log inputs - HTTP Event Collector tokens Persistent queues aren't available for these input types: - Monitor inputs - Batch inputs - File system change monitor - splunktcp input from Splunk forwarders Configure a persistent queue Use the inputs.conf configuration file to configure a persistent queue. You can configure the persistent queue on the universal forwarder that you configured to send data to Splunk Cloud. You can also configure persistent queues on Splunk Enterprise indexers. Use the same procedure directly on the indexer or forwarder that sends data to the indexer. Inputs don't share queues. You configure a persistent queue in the stanza for the specific input. - On the machine that forwards data to Splunk Cloud, use a text editor to open the $SPLUNK_HOME/etc/system/local/inputs.conf file for editing. - Locate or add the input stanza where you want to enable persistent queuing. - Specify the following setting within that input stanza: persistentQueueSize = <integer>(KB|MB|GB|TB) - Save the file and close it - Restart the forwarder. For more information about the inputs.conf file, see inputs.conf in the Splunk Enterprise Admin Manual. Example of configuring a persistent queue Here's an example of specifying a 100 megabyte persistent queue for a TCP network input: [tcp://9994] persistentQueueSize=100MB Here is another example for specifying a 1GB persistent queue for a Windows Event Log input: [WinEventLog] persistentQueueSize=1GB The Windows Event Log monitor accepts a persistent queue configuration for the default Windows Event Log stanza only. You cannot configure a persistent queue for a specific Event Log channel. You can configure a persistent queue for a specific Windows host monitoring input. Persistent queue location The persistent queue has a hardcoded location, which varies according to the type of input. For network inputs, the persistent queue is located at $SPLUNK_HOME/var/run/splunk/[tcpin|udpin]/pq__<port>. Put two underscores in the file name: pq__, not pq_. See the following examples: - The persistent queue for TCP port 2012 is $SPLUNK_HOME/var/run/splunk/tcpin/pq__2012. - The persistent queue for UDP port 2012 is $SPLUNK_HOME/var/run/splunk/udpin/pq__2012. For FIFO inputs, the persistent queue resides in $SPLUNK_HOME/var/run/splunk/fifoin/<encoded path>. For scripted inputs, the persistent queue resides in $SPLUNK_HOME/var/run/splunk/exec/<encoded path>. The FIFO scripted input stanza in the inputs.conf file!
https://docs.splunk.com/Documentation/Splunk/7.1.3/Data/Usepersistentqueues
2021-04-10T15:29:38
CC-MAIN-2021-17
1618038057142.4
[array(['/skins/OxfordComma/images/acrobat-logo.png', 'Acrobat logo'], dtype=object) ]
docs.splunk.com
Setting up an Amazon S3 Compatible Object Store¶ Installation and configuration of the TeamDrive S3 Daemon s3d is only required if TSHS (see TeamDrive Scalable Hosting Storage for details) is not enabled and you have an Amazon S3 compatible object store you wish to use as a secondary storage tier. Currently, Amazon S3 and OpenStack are supported. TeamDrive Hosting Service using an S3-compatible object store s3d is a process that runs in the background and provides secondary storage by transferring files to the object store. It does this by monitoring the hosted data directory structure and transferring files to the object store when a file reaches an age as specified in the service’s configuration. The configuration settings also allows you to specify what files are eligible to be transferred via the use of pattern matching. Note Because all object store requests will be signed using the current timestamp, it’s essential that the system time is accurate when running s3d. Make sure that the NTP service is installed and running. See the chapters about NTP configuration in the installation guides for details. Configuring s3d¶ The configuration of s3d is performed by changing the relevant configuration settings using the Host Server Administration Console. Log into your Host Servers Administration Console at and click on Settings. The S3 daemon Settings all begin with S3. The following information is needed by s3d to connect to the object store. - S3Brand - This setting specifies the type of S3 storage. Valid options are: Amazon or OpenStack. - S3Server - Your object store’s domain name, e.g. s3.amazonaws.com. - S3DataBucketName - The name of the Bucket in the object store that will contain the Space data. The bucket must already exist. Warning If you are setting up multiple TeamDrive Hosting servers it is important that they do not use the same Bucket. Doing so can result in data loss. - S3AccessKey - Your S3 access (public) key, used to access the specified bucket. - S3SecretKey - Your S3 secret (private) key used to access the specified bucket. - S3SyncActive - Set this to True to enable the synchronization of data stored by the Host Server (Space data) to the specified bucket on an S3 compatible object store. Note that the synchronization won’t start until the s3d service has been started (see below). - S3Options - These options control the way S3 is accessed, for example the number of parallel threads during upload, whether to use multipart upload, etc. The options may also contain S3Brand specific settings. - S3EnableRedirect - When S3 redirect is enabled, the Host Server will redirect the Client to a download directly from the object store, when appropriate. This helps to offload traffic from the Host Server to the object store. If set to False, the Host Server fetches the requested object from the object store and serves it to the Client directly. Starting and Stopping the s3d service¶ You can use the /etc/init.d/s3d init script to start and stop s3d. The configuration setting S3SyncActive needs to be set to True, otherwise s3d will abort with a corresponding error message. Warning Enabling the S3 Daemon means that any new data in existing Spaces and new Spaces will be transferred to the object store. Currently, there is no automatic way to return back to a pure file-based Host Server setup — data that was moved to S3 stays in S3. Disabling the S3 secondary storage tier would result in Clients no longer being able to access their Space data. After starting s3d with the command service s3d start, check the log file /var/log/s3d.log for startup messages. You can use service s3d status to check if s3d is up and running. s3d should be added to the processes to be started at boot time. To do this execute the following commands as root: [root@hostserver ~]# chkconfig s3d on Optional configuration parameters¶ The following optional configuration options can be modified in the S3Options configuration setting: - MinBlockSize - The minimum block size that can be used for multi-part upload. The valid range of this parameter will be determined by the implementation of the object store being used. - MaxUploadParts - The maximum number of parts a file can be divided into for upload.. - BucketAsSubdomain - Amazon S3 uses the bucket name as part of the domain name (BucketAsSubdomain=1), while other S3-compatible object stores (e.g. OpenStack) include the bucket name as part of the path name after the domain (BucketAsSubdomain=0). This option usually does not have to be set explicitly, as the S3Brand setting determines this value automatically. OpenStack configuration parameters¶ If the S3Brand is set to OpenStack then 2 additional parameters are required in S3Options, to enable the generation of temporary URLs. Temporary URLs give clients temporary direct access to objects in the object store, helping to reduce network traffic on the Host Server. This requires the setting S3EnableRedirect to be set to True. - OpenStackAuthPath - This is the path component of the OpenStack Authorization URL. For example if the OpenStack Authorization URL is then the OpenStackAuthPath would be /v1/AUTH_a422b2-91f3-2f46-74b7-d7c9e8958f5d30. - OpenStackAuthKey - Your OpenStack temp URL key used to generate temporary URLs. Your OpenStack administrator should be able to provide you with the OpenStack Authorization URL and the corresponding temp URL key. Warning The generation of new temp URL keys will invalidate older keys. It is important that once the temp URL key has been set for your TeamDrive Hosting server that no new keys are being generated. Enabling Object Store Traffic Usage Processing¶ When S3 storage has been enabled, TeamDrive Clients access the data in the object store directly. The traffic required by these operations is recorded by reading the S3 access log files. This means that setting S3SyncActive to True changes the way Traffic usage is calculated. When S3SyncActive to False, the Hosting Service is able to record all traffic usage in the pspace database. When S3SyncActive to True the S3 access logs myst be downloaded and parsed to get the required information. Note The traffic usage processing expects the Amazon S3 access log format. The calculation of traffic usage requires an external Perl-based tool named aws to be installed. Install the aws tool as described on its home page at. The tools s3get, s3ls and s3delete need to be in the PATH of the apache user. In addition, the following configuration settings have to be defined: - S3LogBucketName: - Name of the bucket that contains the S3 access logs. Note that if this setting is empty, the object store traffic will not be calculated. You must configure your object store to save the access logs to this bucket. Please refer to your object store documentation to determine how this is done. - S3ToProcessPath: - Local path in the filesystem to download above access logs for further processing. By default, this path is set to /var/opt/teamdrive/td-hostserver/s3-logs-incoming/ - S3ArchiveLogs: - If set to True, the processed access logs will be moved to the folder defined in the next setting S3ProcessedPath. - S3ProcessedPath: - If logs need to be kept for own additional analysing, then the S3 log files will be moved to this directory once traffic usage Processing is complete. By default, the task that calculates the S3 traffic runs every 10 minutes.
https://docs.teamdrive.net/HostServer/3.0.013.5/html/TeamDrive-Host-Server-Admin-Guide-en/Setting_up_an_Amazon_S3_Object_Store.html
2021-04-10T14:13:33
CC-MAIN-2021-17
1618038057142.4
[array(['../_images/TeamDrive-HostServer-Overview-S3.png', '../_images/TeamDrive-HostServer-Overview-S3.png'], dtype=object)]
docs.teamdrive.net
Vendor profiles are sets of DHCP options required by particular devices. For example, a VoIP phone might need a very specific set of DHCP options. Vendor options are available in Address Manager in the same locations as other DHCP deployment options and are assigned as deployment options. DHCP vendor options are the deployment options specific to a client vendor type and allows the DHCP client and server to exchange vendor-specific information. To add DHCP vendor options, you must first create a vendor profile and define a vendor option. option definitions, the options themselves can be set at the appropriate level within Address Manager. - Configuration - Server Group - Server - IPv4 block - IPv4 network - DHCPv4 range - IPv4 Address - Text - Unsigned Integer 8bit - Signed Integer 8bit - Unsigned Integer 16bit - Signed Integer 16bit - Unsigned Integer 32bit - Signed Integer 32bit - Unsigned Integer 64bit (Windows) - Boolean - IPv4 mask - String - Binary (Windows) - Encapsulated (Windows)
https://docs.bluecatnetworks.com/r/Address-Manager-Administration-Guide/DHCP-vendor-profiles-and-options/8.3.1
2021-04-10T14:00:05
CC-MAIN-2021-17
1618038057142.4
[]
docs.bluecatnetworks.com
All Release Notes Configure your stores' product distribution channels 14 July 2020 Enhancement Settings You can now add product distribution channels to your stores in the store settings. With product distribution channels defined in your store, only the prices defined for this particular channel are delivered to the storefront application with the product data for this store. Prices linked to other channels are omitted by the platform already reducing effort for the storefront application for achieving the same. Learn how to use product distribution channels in stores to filter out the prices sent to your storefront application in the corresponding API release notes. Please note: We plan to progressively add more store settings, for example, allowing to associate supply channels with a store.
https://docs.commercetools.com/merchant-center/releases/2020-07-14-configure-your-stores-product-distribution-channels
2021-04-10T14:04:35
CC-MAIN-2021-17
1618038057142.4
[]
docs.commercetools.com
Difference between revisions of "Release Notes/099/2018.20000" Revision as of 18:03, 8 March 2019 For the latest Official build, jump to Build 2018.27910 See our Spring 2018 Official Announcement for an overview of new 2018.22800 - May 11, 2018[edit] New Features[edit] - PBR MAT - Added 'Final Specular Color' and 'Final Diffuse Color' as options for outputting to color buffers. - Web Render TOP - Added option 'Use DAT' for specifying a DAT with text/html data scheme. Bug Fixes and Improvements[edit] - Audio File In CHOP / Movie File In TOP - Fixed issues with playing back audio that has more than 8 channels. - Fixed bug where re-initing cloned network dropped connected comp outputs to multi-input OPs. - Panel Viewers and Perform Mode .width .height are now affected by parent post-scaling options (children scale, align grid, fit, etc). - TDAbleton 1.9 - Level data transmitter from Ableton to TouchDesigner, including spectrum analysis rack. Also includes related TD Component for receiving this data. - Rack devices and related TD Component for easy control of TouchDesigner from Ableton. - Cleanup and modularization of M4L devices. - OpenVR - Upgraded to SDK version 1.0.14. - RealSense TOP - Upgrade to libRealSense SDK 2.11.0. Experimental Builds 2018.20000 / 2017.30000 - April 20, 2018[edit] For earlier release note in this branch reset to Experimental 2018.20000 / 2017.30000 Release Notes Official Builds 2017.17040 and earlier - May 11, 2018[edit] For earlier release notes refer to Official 2017.10000 Release Notes
https://docs.derivative.ca/index.php?title=Release_Notes/099/2018.20000&diff=14951&oldid=14950
2021-04-10T15:19:00
CC-MAIN-2021-17
1618038057142.4
[]
docs.derivative.ca
This docs six steps you can take to get up to speed with QA efforts (in rough order of precedence): Create a FAS account and apply to the 'qa' group Introduce yourself to the team! Create a Bugzilla Account Join #fedora-qa IRC channel on Freenode Attend the onboarding call What are you looking to do? Whether you are looking to test a stable release, a new package from updates-testing, a Branched pre-release, or the leading change in a recent or pending release. If you want to: Work on an upcoming stable release Work on testing new package updates addresses for your Fedora account and the Bugzilla account are the same. Our bug reporting practices provide some good background for filing bugs. If you want to discuss the bugs before reporting, QA members can be found on the test mailing list and the #fedora-qa IRC channel..
https://docs.fedoraproject.org/ur/qa-docs/
2021-04-10T15:36:33
CC-MAIN-2021-17
1618038057142.4
[]
docs.fedoraproject.org
Sign document with embedded and encrypted data in QR-code signatures GroupDocs.Signature provides ability to put into QR-code signature encrypted text or embed custom data objects. This feature is implemented over data serialization and encryption. Also there is ability to provide custom data serialization and encryption to secure your signature data the most. Here are feature summary - ability to encrypt QR-code signature text - ability to embed standard QR-code entries (like email, V-Card, web, Address) - ability to embedded custom objects into QR-Code signatures with data encryption - ability to specify custom objects encryption - ability to implement custom data serialization Please find in topics below examples and details for each of these features
https://docs.groupdocs.com/signature/java/sign-document-with-embedded-and-encrypted-data-in-qr-code-signatures/
2021-04-10T15:17:57
CC-MAIN-2021-17
1618038057142.4
[]
docs.groupdocs.com
UWF master 服务脚本UWF master servicing script UWF 主服务脚本 (UwfServicingMasterScript.cmd) 位于\Windows\System32 文件夹。The UWF master servicing script (UwfServicingMasterScript.cmd) is located in the \Windows\System32 folder. UwfServicingMasterScript.cmdUwfServicingMasterScript.cmd 完整 UWF 主服务脚本如下所示:The full UWF master servicing script follows: REM servicing of the device with UWF installed. The script will REM call UWF manager application to update the system with the REM latest available updates. REM The script will detect whether the update operation REM ended successfully or requires a reboot. REM REM The script will change the "SERVICING" state of the device REM only when the update operation results in a "SUCCESS". REM A state change of the device requires a reboot. REM REM If the update operation requires a "REBOOT" the script will REM reboot device without changing the "SERVICING" state. The REM Will then run again on the following reboot until REM the update operation either return a "SUCCESS" or a "ERROR" REM REM Any third-party script that needs to run before the state REM change should run in the UPDATE_SUCCESS block REM REM Environment : REM It is expected that UWF is turned "OFF", "SERVICING" mode REM enabled and all other preconditions REM for servicing are in place. REM REM REM echo UpdateAgent starting. uwfmgr servicing update-windows if ERRORLEVEL 3010 goto UPDATE_REBOOT if ERRORLEVEL 0 goto UPDATE_SUCCESS echo UpdateAgent returned error =%ERRORLEVEL% :UPDATE_ERROR uwfmgr servicing disable echo Restarting system goto UPDATE_EXIT :UPDATE_REBOOT echo UpdateAgent requires a reboot. echo UpdateAgent restarting system goto UPDATE_EXIT :UPDATE_SUCCESS echo UpdateAgent returned success. REM REM echo UpdateAgent executing OEM script REM OEM can call their custom scripts REM at this point through a "call". REM REM The OEM script should hand control REM back to this script once it is done. REM REM Any error recovery for OEM script REM should be handled outside of this script REM post a reboot. REM uwfmgr servicing disable echo Restarting system goto UPDATE_EXIT :UPDATE_EXIT echo UpdateAgent exiting. shutdown -r -t 5 EXIT /B 相关主题Related topics 服务 UWF 保护设备Service UWF-protected devices 统一的写入筛选器Unified Write Filter
https://docs.microsoft.com/zh-cn/windows-hardware/customize/enterprise/uwf-master-servicing-script
2021-04-10T16:01:21
CC-MAIN-2021-17
1618038057142.4
[]
docs.microsoft.com
[ English | français | English (United Kingdom) | Deutsch | ] Advanced.
https://docs.openstack.org/openstack-ansible/latest/reference/configuration/advanced-config.html
2021-04-10T15:03:38
CC-MAIN-2021-17
1618038057142.4
[]
docs.openstack.org
Crate x25519_dalek_fiat[−][src] x25519-dalek A pure-Rust implementation of x25519 elliptic curve Diffie-Hellman key exchange, with curve operations provided by curve25519-dalek. About This is a thin fork of the x25519-dalek project, and its main difference is replacing the original curve25519-dalek dependency with curve25519-dalek-fiat. This allows using a formally verified backend supplied by the fiat-crypto project, where primitive curve operations are extracted from Coq proofs of arithmetic correctness.: use rand_core::OsRng; use x25519_dalek::{EphemeralSecret, PublicKey}; let alice_secret = EphemeralSecret::new(OsRng); let alice_public = PublicKey::from(&alice_secret); Bob does the same: let bob_secret = EphemeralSecret::new(OsR = "1.1") See also - crypto_box: pure Rust public-key authenticated encryption compatible with the NaCl family of encryption libraries (libsodium, TweetNaCl) which uses x25519-dalekfor key agreement Note that docs will only build on nightly Rust until feature(external_doc) is stabilized.
https://docs.rs/x25519-dalek-fiat/0.1.0/x25519_dalek_fiat/
2021-04-10T14:39:05
CC-MAIN-2021-17
1618038057142.4
[array(['https://travis-ci.org/dalek-cryptography/x25519-dalek.svg?branch=master', None], dtype=object) array(['https://raw.githubusercontent.com/dalek-cryptography/x25519-dalek/master/res/bubblesort-zines-secret-messages-cover.jpeg', None], dtype=object) ]
docs.rs
- Save Time with Emailed Document Templates Save Time with Emailed Document Templates Updated by Lila Carsten more efficient. Instead of copy / paste - just select a template, personalize it for the client, and fire away. How do I use the templates? Do you have GSuite or Outlook 365? These templates also work with our @Mail feature - which allows you to send and receive email from your GSuite or Outlook365 account without ever leaving shopVOX. Learn more about our @Mail feature here. This is great for: - cold emails to potential clients - follow up on quote requests and contact form submissions - followups asking for feedback, referrals, and testimonials (here's a good article from the blog about how to get these). example... who ever is the "Primary Contact" on that transaction or the person who is sending the email and their signature etc., {}}} Quote Reminder Hi {{contact_name}} – I thought I’d reach out to see if you are ready to move forward with your project and if you have any questions or concerns regarding the quote that was sent earlier this week. I can imagine that you are extremely busy, so I’ll keep this short and sweet. If you are ready to proceed, simply reply to this email with a quick approval and we will get the ball rolling. If you need more time, that’s no problem. We are here for you, anytime! Just let me know what we can do to help. Thanks! {{{user_signature}}}}}} Proof Reminder Hi {{contact_name}} – Just a quick reminder, your project is currently in our pre-production queue awaiting your approval. I wanted to check in with you to see if you had a chance to review the proof that I sent earlier this week. Do you have any questions or is there anything I can do to help with this part of the process? I’ve attached the proof here for your convenience. Let me know if this is ready for production or if you’d like to see any changes. Cheers! {{}}} Payment Reminder Hello {{contact_name}} – Just checking in on the invoice we sent last month. It appears that your account shows an open balance in the amount of $__.__. Attached is a copy of the invoice. Kindly submit payment at your earliest convenience by calling (888)888-8888 or clicking the link to pay online with a credit card, or by sending a check. If your payment is in transit, let us know and we’ll keep an eye out for it. We appreciate your business and look forward to hearing from you. Thank you! {{{user_signature}}} Deposit Payment Receipt Hi {{contact_name}} – Your payment has been received. Attached is a copy of your receipt and paid invoice for your records. Thank you for your payment! Your project is ready to move on to the next stage in our process, which is proofing. You will soon receive an email with your proof for your approval. Once your proof is approved, we will start production and update you with an estimated time line for completion. Feel free to contact me at any time if you have any questions. We are always here to assist! Cheers! {{}}} Statement of Invoices Hi {{contact_name}} – It is so good to hear from you! Attached is a copy of your invoice statement that you requested. Please let me know if there is anything else I can do for you, I’m happy to help! Thanks! {{{user_signature}}} Asking for Feedback Hi {{contact_first_name}} - Just wanted to follow up and make sure you were 110% satisfied with your order. Were you satisfied with everything? Would love to hear your feedback - even if it's just one line. Thanks, {{{user_signature}}} Asking for a Referral {{contact_first_name}} - really glad you like the XXXXXX. We really enjoyed working on this project with you. As you may know, we’re a small shop and we get most of our business through referrals. Referrals allow us to work really hard to deliver 110% to our customers. Since your happy with the job, I'd like to ask you to introduce us to others like yourself. We primarily serve small businesses - so it could be a friend who runs a business, one of your vendors, maybe your neighbor. We take special care of all referrals. We give you 20% of your next order. We also give them 20% of their first order. Who are one or two people you think we could help? I appreciate you. Thanks, {{{user_signature}}} Asking for a Testimonial {{contact_first_name}} – We’re glad you like your shirt! We really enjoyed working on this project with you. Would you be willing to let us feature your shirts in our portfolio with a testimonial from you? We’d be happy to promote your business and provide a link back to your website. Would the testimonial below be alright to use? Testimonial: "When I needed XXXXX, the guys at XXXXX had my back. After a bad experience with my last printer, I was skeptical. But I told them what I needed and they had a proof to me within 24 hours. I love the design they created. I received XXXXX within a week and was impressed with the quality of the printing. I'd definitely recommend XXXXX for anything you need. Thanks, {{{user_signature}}}
https://docs.shopvox.com/article/si1hc7an10-save-time-with-emailed-document-templates-video
2021-04-10T14:42:21
CC-MAIN-2021-17
1618038057142.4
[array(['https://files.helpdocs.io/2vohp0m02q/articles/si1hc7an10/1555383147031/image.png', None], dtype=object) ]
docs.shopvox.com
Stake EPS to earn fees from Ellipsis Protocol. The EPS token supply is one billion (1,000,000,000) tokens to be emitted over five years. 55% Liquidity provider rewards: Continuously minted over five years with a progressively decreasing rate. 25% veCRV airdrop: Distributed weekly based on a veCRV snapshot. 20% team / development fund: Vested for one year with a continuous release.
https://docs.ellipsis.finance/eps-tokens
2021-04-10T14:13:19
CC-MAIN-2021-17
1618038057142.4
[]
docs.ellipsis.finance
NOTICE: Our WHMCS Addons are discontinued and not supported anymore. Most of the addons are now released for free in github - You can download them from GitHub and conribute your changes ! :) Monitoring Plugins :: Vmware plugin The plugin is written according to VMware’s SOAP API and can Power ON, Power OFF & Reboot servers.In order to get started, we will first need to add hosts. By “hosts” we mean a physical server that runs ESXi and runs guest machines. From the plugins menu, click “Manage Hosts” button – Fill in all the required data, and click “save” to add hosts. Here is a brief explanation about required settings on that page – After our host is set, we can manage actions (what to do when a server is down) for each hosts. From the plugins menu click “Manage actions” button – In the next screen, you can set actions like email alerts and server reboots. Our recommended settings are as follows – - Open WHMCS Support Ticket after 5 minutes of failure. - Reboot Server after 10 Minutes of failure. - Send Email alert (linked to SMS Gateway) after 15 minutes of failure (In case server didn’t came back from reboot). Now that the plugin is set and configured, you can link servers to that host.
https://docs.jetapps.com/monitoring-plugins-vmware-plugin
2021-04-10T14:59:17
CC-MAIN-2021-17
1618038057142.4
[array(['https://docs.jetapps.com/wp-content/plugins/lazy-load/images/1x1.trans.gif', 'vmware monitoring plugin'], dtype=object) ]
docs.jetapps.com
Consider the changes in API between the different versions There are three upgrade paths. You must choose which one to follow. If you are still using WebForms, consider upgrading to Microsoft MVC to get benefits for all performance improvemenets and be more future proof. All Litium web controls for websites (CMS) need to be upgraded in the project and web controls for other areas (products, customer, ecommerce) may need changes as well to adapt the new websites API. Verify that there is an upgraded version of the add-on that supports the new Litium version. The supported versions information is displayed on the add-on download page. If an add-on is not supported yet, contact Litium support to get information about add-on compatibility and upgrade options. Consider the following when upgrading a solution that has integrations: This add-on was used for Litium versions earlier than 5 to import product information. Since Litium 5 this feature is built-in, so all import and export files need to be adjusted to be compatible with the Litium 5 import/export format. Since Litium 5 pricelists are imported/exported through the built-in import/export feature, so all import and export files need to be adjusted to be compatible with the Litium 5 import/export format. In Litium 4.8 it was possible to have relations between products in some assortments. This is no longer possible. A workaround is to rename the relations to “AssortmentName” and “RelationTypeName”. The fields in the field framework now support several languages. This means you can use the same fields for multi-language sites, instead of having to create language-specific fields for each language version. There are two options to deploy an upgrade: These panels are loaded in a frame for re-factored areas (Websites, Products, Customers and Media). They will still work after the platform upgrade first after the panel itself is updated in the upgrade project. Panels implemented through templates are no longer supported in the re-factored areas (Websites, Products, Customers and Media). These need to be implemented as field types. More information on creating custom field types can be found in this article. These areas no longer use Lucene for search, so other solutions need to be implemented, for example by using a data service. If you are upgrading from a version earlier than 6, metadata for files in the Media area must be set up as new fields in the field framework. In Customers back office, the search function should be used to view organizations instead of the tree view to the left. The child organizations in Litium 7 are defined as organization pointers. In Litium 7 an organization can point to multiple other organizations, so any organizational structure can be represented. The Newsletter area has been removed from Litium. E-mail marketing can now be handled through the Apsis Connector add-on. The direct Active Directory connection for logins was removed in Litium 6. Use Active Directory Federation Service (ADFS) for login instead. Instructions on setting this up as an external login provider can be found here. When upgrading between different versions of Litium you might need to change the target framework version (e.g. 4.8 to 7 etc.). If you have many projects, changing the target framework version could take a lot of time, but with this PowerShell script the task only takes a few seconds for all projects. Start the Package Manager Console, that is a PowerShell console inside Visual Studio, from Tools > NuGet Package Manager. Run the following PowerShell script in the Package Manager Console. The script below has the target framework version set to 4.7.2. If you need to change to another version you can change the version number in the script. function update-solution {param ($item = $dte.Solution.Projects)$item | % {if ($_.Type -eq $null){$p = $_.Object} else {$p = $_}if ($p.Type -eq 'Unknown') {if ($p.ProjectItems -ne $null) {update-solution $p.ProjectItems}}elseif ($p.Type -ne $null) {$prop = $p.Properties.Item('TargetFrameworkMoniker');if ($prop.Value -ne ".NETFramework,Version=v4.7.2") {$name = $p.ProjectName;Write-Host "Change .Net framework to 4.7.2 for project $name" -ForegroundColor DarkYellow;try {$prop.Value = ".NETFramework,Version=v4.7.2"} catch {Set-ItemProperty ((get-project $name).FullName) -name IsReadOnly -value $false;$prop.Value = ".NETFramework,Version=v4.7.2"}}}}} & update-solution About Litium Join Litium Support System status
https://docs.litium.com/documentation/upgrading-to-litium-7/preparing-for-upgrade
2021-04-10T15:18:48
CC-MAIN-2021-17
1618038057142.4
[]
docs.litium.com
2.2.2 Formulas A formula is sequence of values, cell references, names, functions, or operators in a cell that together produce a new value. Formulas are stored in a tokenized representation known as "parsed expressions." In this section, formula is a synonym for parsed expression. A parsed expression is converted into a textual formula at runtime for display and user editing. Cell formulas are specified by the Formula record (section 2.4.127). Array formulas are specified by the Array record (section 2.4.4). Shared formulas are specified by the ShrFmla record (section 2.4.260). Formulas that are part of a revision as specified in the Shared Workbooks overview (section 2.2.11) are specified by the pe.rgce field or the peOld.rgce field of the RRDDefName record (section 2.4.225), or by the xpe.rgce field or the xpeOld.rgce field of the RRDChgCell record (section 2.4.223). A parsed expression contains a sequence of parse tokens, each of which is either an operand token (section 2.2.2.2), an operator token (section 2.2.2.1), a control token (section 2.2.2.3), a display token (section 2.2.2.4), or a mem token (section 2.2.2.5). All tokens are stored as Parse Things (Ptg (section 2.5.198.25)). With the exception of control tokens (section 2.2.2.3), display tokens (section 2.2.2.4), and mem tokens (section 2.2.2.5) that are described in subsequent sections, parsed expressions are stored in Rgce (section 2.5.198.104) using Reverse-Polish notation. Reverse-Polish notation is a logical system for the specification of mathematical formulas in which operands are followed by operators. Inside an Rgce, the operands and operators are represented by an array of Ptg structures (section 2.5.198.25) of variable lengths. The first one or two bytes of a Ptg structure (section 2.5.198.25) contain the token type that determines which specific Ptg type (section 2.5.198.25) the Ptg is, as specified in the Ptg structure The remainder of the structure varies according to the token type. Evaluation of a formula specified in Reverse-Polish notation is usually based around an evaluation stack. The expression is parsed from beginning to end, and operands are pushed onto the stack as they are encountered. When operators are encountered, the required number of operands is popped from the stack and the result of the operation is pushed back onto the stack. Evaluation begins with an empty stack, and when the evaluation is finished, there will be exactly one value left on the stack. The value is the result of the evaluation. Subsequent subsections refer to a stack as described by this model.
https://docs.microsoft.com/en-us/openspecs/office_file_formats/ms-xls/e7625cc8-3da9-4154-b449-49cf1bbd9703
2021-04-10T15:19:11
CC-MAIN-2021-17
1618038057142.4
[]
docs.microsoft.com
Train Series Release Notes¶ 3.1.0¶ New Features¶ support cassandra cluster configuration with loadbalancing policy DCAwareRoundRobinPolicy. ‘local_data_center’ defalut value is ‘’.. Support cassandra connection timeout option which set timeout when creating a new connection. Configuration option db_per_tenant added for InfluxDB to allow data points to be written to dedicated tenant database where the database_name prefixes the tenant ID, e.g. monasca_tenantid. Merge monasca-log-api source code into the monasca-api and enable logs endpoints. Introduce configuration options that allow to enable/disable metrics and logs endpoints. Dimensions names and values can be scoped by a timerange which can make dimension related queries to large databases much faster because only the relevant shards are searched. Users that upgrade their Monasca Grafana Datasource plugin to version 1.3.0 will benefit from this feature. Upgrade Notes¶ Configuration option legacy_kafka_client_enabled added to allow working with both legacy kafka-python and new Confluent Kafka client. Please set message format version for the Kafka brokers to 0.9.0.0 to avoid performance issues until all consumers are upgraded. Changes InfluxDB data from in-memory to disk storage (see). If upgrading an existing InfluxDB install please follow the instructions for migrating existing data here: Upgrade InfluxDB to 1.7.6 from 1.3.9. This provides a number of stability, performance and bug fix improvements. Full release notes available here: Upgrade Storm to 1.2.2 from 1.1.3. Upgrade Apache Kafka from version 1.0.1 to 2.0.1. Please consult official upgrading notes for complete information on upgrading from previous versions. Security Issues¶ InfluxDB 1.7.6 fixes a security issue in which monasca-api leaks dimensions across projects.
https://docs.openstack.org/releasenotes/monasca-api/train.html
2021-04-10T14:39:50
CC-MAIN-2021-17
1618038057142.4
[]
docs.openstack.org
Requirements About Calico for Windows Because the Kubernetes and Calico control components do not run on Windows yet, a hybrid Linux/Windows cluster is required. Calico for Windows standard installation is distributed as a .zip archive. What’s supported in this release ✓ Install: Manifest install for Kubernetes clusters ✓ Platforms: Kubernetes, EKS ✓ Networking: - Kubernetes, on-premises: Calico CNI with BGP or VXLAN - EKS: VPC CNI, or Calico CNI with BGP or VXLAN Requirements CNI and networking options The following table summarizes the networking options and considerations. Note: If Calico CNI with VXLAN is used, BGP must be disabled. See the installation reference. Datastores Whether you use etcd or Kubernetes datastore (kdd), the datastore for the Windows node/Kubernetes cluster must be the same as the datastore for the Linux control node. (You cannot mix datastores in a Calico for Windows implementation.) Kubernetes version - Versions 1.20, 1.19, or 1.18 Earlier versions may work, but we do not actively test Calico for Windows against them, and they may have known issues and incompatibilities. Linux platform - At least one Linux Kubernetes worker node to run Calico’s cluster-wide components that meets Linux system requirements, and is installed with Calico v3.12.0+. - VXLAN or BGP without encapsulation is supported if using Calico CNI. IPIP (default encapsulation mode) is not supported. Use the following command to turn off IPIP. calicoctl patch felixconfiguration default -p '{"spec":{"ipipEnabled":false}}' - If using Calico IPAM, strict affinity of IPAM configuration must be set to true. calicoctl ipam configure --strictaffinity=true Note: Calico for Windows requires four Linux worker nodes in order to meet high-availability requirements for Typha. Windows platform - Windows versions: - Windows Server 1903 (AKA 19H1) build 18317 or greater - Windows Server 2019 / 1809 (RS5) or greater, with some limitations - Windows Server 2019 with DSR support: - OS 1809: Build 17763.1432, binary version: 10.0.17763.1432 - OS 1903: Build 18362.1049, binary version: 10.0.18362.1049 - OS 1909: Build 18363.1049, binary version: 10.0.18363.1049 - Powershell for the installer - Make sure the Docker service is installed and running. Install Docker on Windows node. - If you are using Calico BGP networking, the RemoteAccess service must be installed for the Windows BGP Router. - Windows nodes support only a single IP pool type (so, if using a VXLAN pool, you should only use VXLAN throughout the cluster). - TLS v1.2 enabled. For example: [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12 Next steps Install Calico for Windows
https://docs.projectcalico.org/getting-started/windows-calico/kubernetes/requirements
2021-04-10T14:46:54
CC-MAIN-2021-17
1618038057142.4
[]
docs.projectcalico.org
Welcome to Splunk Enterprise 7.3.3 was first released on June 4, 2019. features topic lists computing platforms, browsers, and features for which Splunk has deprecated or removed support in this release. What's New in 7.3.0 What's New in 7.3.1 Splunk Enterprise 7.3.1 was released on July 31, 2019. It introduces enhancements to several features and resolves the issues described in Fixed issues..0, 7.3.1 Feedback submitted, thanks!
https://docs.splunk.com/Documentation/Splunk/7.3.1/ReleaseNotes/MeetSplunk
2021-04-10T15:31:46
CC-MAIN-2021-17
1618038057142.4
[array(['/skins/OxfordComma/images/acrobat-logo.png', 'Acrobat logo'], dtype=object) ]
docs.splunk.com