Columns: content (string, 0 to 557k chars); url (string, 16 to 1.78k chars); timestamp (timestamp[ms]); dump (string, 9 to 15 chars); segment (string, 13 to 17 chars); image_urls (string, 2 to 55.5k chars); netloc (string, 7 to 77 chars)
There appear to be a few efforts to get Scala to work with Maven, so I thought it would be useful to collect all the ideas here and see what code is currently available. Once that is done I would be happy to review any of the plugins, and maybe we can collect them into one project, give access to everyone who participated, and make one decent toolchain for Scala. If folks put their names here, along with where their implementations are, I'll review the code and make some suggestions about improvements and what might be merged. It probably wouldn't be too hard to make something decent quickly.
http://docs.codehaus.org/pages/viewpage.action?pageId=119865971
2014-03-07T13:55:26
CC-MAIN-2014-10
1393999642530
[]
docs.codehaus.org
have developed, for the Diputación Foral de Gipuzkoa, Department of Land Planning, uses from.
http://docs.codehaus.org/pages/viewpage.action?pageId=75425
2014-03-07T13:54:18
CC-MAIN-2014-10
1393999642530
[]
docs.codehaus.org
Provides a method to apply global parameters to the Media Manager. In the Media Manager, click the Options button at the top of the Toolbar. The Media Manager Options configuration allows setting of parameters used globally for the Media Manager: control the file types allowed for uploading, MIME type checking, Enable Flash Uploader, MIME type blacklisting, and more options for the Media Manager. This section shows the permissions configuration for the Media Manager. The screen shows as follows. To change the permissions for the Media Manager component, do the following. At the top left you will see the toolbar. The functions are:
http://docs.joomla.org/index.php?title=Help32:Components_Media_Manager_Options&diff=cur&oldid=81969
2014-03-07T13:56:56
CC-MAIN-2014-10
1393999642530
[]
docs.joomla.org
If you get errors like this in your site header: Warning: session_start() [function.session-start]: open(xxx) failed: Permission denied (13) in xxx/wp-content/plugins/woocommerce/woocommerce.php on line 80 then your server is to blame. PHP sessions may not be set up correctly, or the sessions directory (usually /tmp) is not writable. You need to contact your hosting provider to resolve this.
http://docs.woothemes.com/document/session_start-warnings/
2014-03-07T13:50:48
CC-MAIN-2014-10
1393999642530
[]
docs.woothemes.com
java.lang.Object
  org.springframework.webflow.execution.repository.support.AbstractFlowExecutionRepository

public abstract class AbstractFlowExecutionRepository

Abstract base class for flow execution repository implementations. Does not make any assumptions about the storage medium used to store active flow executions. Mandates the use of a FlowExecutionStateRestorer, used to rehydrate a flow execution after it has been obtained from storage on resume. The configured FlowExecutionStateRestorer should be compatible with the chosen FlowExecution implementation and its FlowExecutionFactory.

Field: protected final org.apache.commons.logging.Log logger

Constructor: protected AbstractFlowExecutionRepository(ConversationManager conversationManager), where conversationManager is the conversation manager to use.

public ConversationManager getConversationManager()

public boolean getAlwaysGenerateNewNextKey(): whether a new FlowExecutionKey should always be generated before each put call.

public void setAlwaysGenerateNewNextKey(boolean alwaysGenerateNewNextKey): sets whether a new FlowExecutionKey should always be generated before each put call. By setting this to false a FlowExecution can remain identified by the same key throughout its life.

public FlowExecutionKey getKey(FlowExecution execution): specified by getKey in interface FlowExecutionKeyFactory; execution is the flow execution.

public FlowExecutionKey parseFlowExecutionKey(java.lang.String encodedKey) throws FlowExecutionRepositoryException: specified by parseFlowExecutionKey in interface FlowExecutionRepository; encodedKey is the string-encoded key, as produced by FlowExecutionKey.toString().

public FlowExecutionLock getLock(FlowExecutionKey key) throws FlowExecutionRepositoryException: specified by getLock in interface FlowExecutionRepository; key is the identifier of the flow execution to lock. Throws FlowExecutionRepositoryException if a problem occurred accessing the lock object. Typical usage:
FlowExecutionLock lock = repository.getLock(key);
lock.lock();
try {
    FlowExecution execution = repository.getFlowExecution(key);
    // do work
} finally {
    lock.unlock();
}

public void removeFlowExecution(FlowExecution flowExecution) throws FlowExecutionRepositoryException: specified by removeFlowExecution in interface FlowExecutionRepository; flowExecution is the flow execution. Throws FlowExecutionRepositoryException if the flow execution could not be removed.

protected abstract java.io.Serializable nextSnapshotId(java.io.Serializable executionId): returns the next snapshot id for the FlowExecution instance. Called when getting a flow execution key.

public abstract FlowExecution getFlowExecution(FlowExecutionKey key) throws FlowExecutionRepositoryException: specified by getFlowExecution in interface FlowExecutionRepository. Returns the FlowExecution indexed by the provided key. The returned flow execution represents the restored state of an executing flow from a point in time. This should be called to resume a persistent flow execution. Before calling this method, you should acquire the lock for the keyed flow execution. Throws FlowExecutionRepositoryException if no flow execution was indexed with the key provided.

public abstract void putFlowExecution(FlowExecution flowExecution) throws FlowExecutionRepositoryException: specified by putFlowExecution in interface FlowExecutionRepository. Places the FlowExecution in this repository under the provided key. This should be called to save or update the persistent state of an active (but paused) flow execution. Before calling this method, you should acquire the lock for the keyed flow execution. Throws FlowExecutionRepositoryException if the flow execution could not be stored.

protected ConversationParameters createConversationParameters(FlowExecution flowExecution): creates a conversation parameters object; flowExecution is the new flow execution.

protected Conversation getConversation(FlowExecutionKey key) throws NoSuchFlowExecutionException: returns the conversation for the FlowExecution with the provided key; throws NoSuchFlowExecutionException when the conversation for the identified flow execution cannot be found.

protected Conversation getConversation(java.io.Serializable executionId) throws NoSuchConversationException: executionId is the flow execution id; throws NoSuchConversationException when the conversation for the identified flow execution cannot be found.

protected void assertKeySet(FlowExecution execution) throws java.lang.IllegalStateException: execution is the flow execution; throws java.lang.IllegalStateException if a key has not yet been assigned as expected.
http://docs.spring.io/spring-webflow/docs/2.3.x/javadoc-api/org/springframework/webflow/execution/repository/support/AbstractFlowExecutionRepository.html
2014-03-07T13:58:19
CC-MAIN-2014-10
1393999642530
[]
docs.spring.io
About connecting to a Wi-Fi network If you are in a Wi-Fi® coverage area and your wireless service plan supports it, you can connect to a Wi-Fi network so that your BlackBerry® device uses the Wi-Fi network instead of the mobile network to send and receive email messages, visit web pages, and so on. You can connect to Wi-Fi networks at home or at work, or to hotspots available in public places, such as libraries, airports, hotels, coffee shops, and so on. Your wireless service provider might provide an application for your device that allows you to log in to a hotspot. If you do not have an application on your device, you might have to set up your own account online and log in manually. If you have a wireless access point or router that uses Wi-Fi Protected Setup™, you can connect to a Wi-Fi network using the Push Button Setup method or PIN method. You can save the connection information for a Wi-Fi network in a profile, so that the next time you are within range of that network, your device connects to it automatically. You can also change the profile so that you are prompted to connect manually. When your device is connected to a Wi-Fi network, the Wi-Fi profile name appears at the top of the Home screen.
http://docs.blackberry.com/en/smartphone_users/deliverables/16221/About_connecting_to_a_Wi-Fi_network_118037_11.jsp
2014-03-07T13:51:04
CC-MAIN-2014-10
1393999642530
[]
docs.blackberry.com
- Define the roadmap as concretely as possible - Jumpstart the collaboration process and practices. Meeting location: the hotel most of us booked - ?
http://docs.codehaus.org/pages/viewpage.action?pageId=159285272
2014-03-07T13:56:26
CC-MAIN-2014-10
1393999642530
[]
docs.codehaus.org
About Wi-Fi diagnostic reports Wi-Fi® diagnostic reports provide Wi-Fi configuration and connection information for your BlackBerry® device. If you cannot connect to a Wi-Fi network or access services such as email messaging, your wireless service provider or administrator might ask you to submit a Wi-Fi diagnostic report. Your wireless service provider or administrator can use the report to help you troubleshoot the problem.
http://docs.blackberry.com/en/smartphone_users/deliverables/11499/About_Wi-Fi_diagnostic_reports_225109_11.jsp
2014-03-07T13:53:43
CC-MAIN-2014-10
1393999642530
[]
docs.blackberry.com
Set up speed dial for a contact - From the Home screen, press the Send key. - Press the Menu key. - Click View Speed Dial List. - Click an unassigned key. - Click New Speed Dial. - Click a contact.
http://docs.blackberry.com/en/smartphone_users/deliverables/19565/Set_up_speed_dial_for_a_contact_26302_11.jsp
2014-03-07T13:50:57
CC-MAIN-2014-10
1393999642530
[]
docs.blackberry.com
Contribution Overview Grapplet provides a way to run Groovy in an applet, adding extra functionality to JS objects and arrays; for example, arrays behave like Java Lists, so all GDK methods available to List and Collection can be used on JS arrays. Grapplet will automatically look for all <script> tags in the page that have their language property set to "text/x-groovy". This was inspired by a post on Dion's blog: Running Ruby in the browser via script type="text/ruby". Once Grapplet is running on a page, you can evaluate any Groovy script by calling evaluateScript(). Team Members Andres Almiray [aalmiray at users dot sourceforge dot net] Download Distributions Pending. Installing Pending. Pre-requisites None Documentation In order to run Grapplet it needs to be signed; follow the instructions below. 1. Generate a key pair in a keystore. All you have to do is issue the following command: keytool -genkey -keystore groovy -storepass groovy -keypass groovy -alias groovy 2. Trust your own certificate. Unless you want to spend some bucks on this experiment I recommend you selfcert your certificate. To selfcert your newly created certificate, issue the following command: keytool -selfcert -keystore groovy -storepass groovy -keypass groovy Sign the jar, which adds the signature entries to the jar's manifest: jarsigner -keystore groovy -storepass groovy -keypass groovy grapplet-0.1.jar groovy 5. Verify your jar (just in case). You may verify that your jar has indeed been signed and includes the certificate; for more information on jarsigner's output refer to the command's help (jarsigner -help): jarsigner -verify -verbose -certs -keystore groovy grapplet-0.1.jar 6. Configure your local security settings. For this step you must touch $JRE_HOME/lib/security/java.policy and $JRE_HOME/lib/security/java.security; on Windows $JRE_HOME usually points to "c:/Program Files/Java/jdk1.x.x/". 1. Add the following lines at the end of java.policy: grant { java.lang.RuntimePermission "usePolicy"; }; 2. Create a file named '.java.policy' at $USER_HOME with the following contents: keystore "file:${user.home}/groovy"; grant signedBy "groovy" { permission java.security.AllPermission; }; grant codeBase "" { permission java.security.AllPermission; }; You can install the Java plugin jar in your local Maven2 repo with the following command: mvn install:install-file -DgroupId=com.sun.java-plugin -Dversion=<jdkversion> -Dpackaging=jar -DartifactId=java-plugin -Dfile=$JDK_HOME/jre/lib/plugin.jar where <jdkversion> is the version number of the selected JDK. Grapplet has version 1.6.0 configured; if you change the version you'll have to update pom.xml. After you have the required dependencies installed, you may generate the package by typing mvn package
http://docs.codehaus.org/display/GROOVY/Grapplet?focusedCommentId=171343886
2014-03-07T13:51:28
CC-MAIN-2014-10
1393999642530
[]
docs.codehaus.org
BYOH Prerequisites and installation Configure BYOH data centers to isolate workloads. You need to install DataStax Enterprise on all the nodes: nodes in the Hadoop cluster and additional nodes outside the Hadoop cluster. You configure the additional nodes in one or more BYOH data centers to isolate workloads. Run sequential data loads, not random OLTP loads or Solr data loads, in a BYOH data center. BYOH nodes need to be able to communicate with the HDFS Data Nodes located outside the BYOH data center. During the installation procedure, you install only the Hadoop components you need in the BYOH data center: Task Trackers/Node Managers and optional clients, MapReduce, Hive, and Pig. Install Hadoop on the same paths on all nodes. CLASSPATH variables in the Node Type drop-down. - On packaged installations on the Hadoop cluster only, remove the init.d startup files for DataStax Enterprise and DataStax Enterprise Agent. For example, as root, stop DSE processes if they started up automatically, and then remove the files: $ sudo /etc/init.d/dse stop $ sudo /etc/init.d/datastax-agent stop $ sudo rm -rf /etc/init.d/dse $ sudo rm /etc/init.d/datastax-agent Removing the startup files prevents accidental start up of DataStax Enterprise on the Hadoop cluster. - Deploy only the BYOH nodes in a virtual data center. - After configuring the cassandra.yaml and dse.yaml files as described in the instructions for deploying the data center, data center is optional, but not recommended. Separating workloads: Use separate data centers to deploy mixed workloads. Within the same data center, do not mix nodes that run DSE Hadoop integrated job and task trackers with external Hadoop services. In the BYOH mode, run external Hadoop services on the same nodes as Cassandra. You can enable CFS on these Cassandra nodes as a startup option, but this is not recommended.
http://docs.datastax.com/en/datastax_enterprise/4.5/datastax_enterprise/byoh/byohInstall.html
2016-10-20T21:18:16
CC-MAIN-2016-44
1476988717954.1
[]
docs.datastax.com
Shoko Server - Integrity Tab The Integrity Tab checks the hash values of the files in your collection against their original hash values from when they were first imported, to detect corrupted files. Files whose current hash no longer matches are marked as corrupted and displayed in the results panel below. With this utility you can replace corrupted files in your collection, preventing playback errors from happening in the future. Managing Previous Integrity Checks You can view previous integrity checks by clicking the drop-down box and selecting the previous integrity check you'd like to view. You can also delete an integrity check by clicking the red X button; you'll be prompted to confirm deletion.
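The check itself is conceptually simple. The short Python sketch below only illustrates the idea of re-hashing files and comparing against the values recorded at import time; it is not Shoko's actual implementation, and the sha256 algorithm and the imported_hashes mapping are assumptions made for the example.

import hashlib
from pathlib import Path

def file_hash(path, chunk_size=1 << 20):
    # Hash the file in chunks so large video files do not need to fit in memory.
    h = hashlib.sha256()  # assumed algorithm; Shoko records its own hash type
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_corrupted(imported_hashes):
    # imported_hashes maps each file path to the hash recorded when it was first imported.
    corrupted = []
    for path, original in imported_hashes.items():
        if not Path(path).exists() or file_hash(path) != original:
            corrupted.append(path)  # flagged for replacement, as the Integrity Tab does
    return corrupted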
https://docs.shokoanime.com/server/config/integrity.html
2018-05-20T13:58:38
CC-MAIN-2018-22
1526794863570.21
[]
docs.shokoanime.com
Installation This part of the documentation covers the installation of SQLAlchemy-Utils. Supported platforms SQLAlchemy-Utils has been tested against the following Python platforms. - cPython 2.6 (unsupported since 0.32) - cPython 2.7 - cPython 3.3 - cPython 3.4 - cPython 3.5 - cPython 3.6 Installing an official release You can install the most recent official SQLAlchemy-Utils version using pip: pip install sqlalchemy-utils # Use `pip3` instead of `pip` for Python 3.x Installing the development version To install the latest version of SQLAlchemy-Utils, you first need to obtain a copy of the source. You can do that by cloning the git repository: git clone git://github.com/kvesteri/sqlalchemy-utils.git Then you can install the source distribution using pip: cd sqlalchemy-utils pip install -e . # Use `pip3` instead of `pip` for Python 3.x
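After installation, a quick smoke test confirms the package imports and works. The SQLite URL below is an arbitrary example chosen for this sketch, not something the installation guide prescribes.

from sqlalchemy_utils import create_database, database_exists

url = "sqlite:///demo.db"      # any SQLAlchemy database URL works here
if not database_exists(url):   # False until the database (here, a file) exists
    create_database(url)
print(database_exists(url))    # True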
http://sqlalchemy-utils.readthedocs.io/en/latest/installation.html
2018-05-20T13:40:15
CC-MAIN-2018-22
1526794863570.21
[]
sqlalchemy-utils.readthedocs.io
class OEDefaultConfTest : public OEConfTestBase This class represents OEDefaultConfTest. This is the default implementation of OEConfTestBase. It never combines connection tables into multi-conformer molecules. This function is set as the conformer test to use by all of the oemolistream constructors. The following methods are publicly inherited from OEConfTestBase: bool CombineMols(OEMCMolBase &m1, OEMolBase &m2) Adds m2 as a new conformer in m1 and returns whether the combination was successful. OEConfTestBase *CreateCopy() const Deep copy constructor that returns a copy of the object. The memory for the returned OEDefaultConfTest object is dynamically allocated and owned by the caller. bool HasCompareMols() const Always returns false for this object to indicate that the implementation of the OEConfTestBase.CompareMols method is “trivial” and always returns false, indicating molecule copying can be elided in OEReadMolecule.
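As a rough sketch of where a conformer test plugs in when reading molecules with the Python toolkit; the file name is made up, and the reader calls are assumptions inferred from the surrounding OEChem documentation rather than taken from the text above.

from openeye import oechem

ims = oechem.oemolistream("input.sdf")       # hypothetical multi-record input file
ims.SetConfTest(oechem.OEDefaultConfTest())  # the default test; records are never combined

for mol in ims.GetOEMols():
    # With OEDefaultConfTest every connection table stays a single-conformer molecule.
    print(mol.GetTitle(), mol.NumConfs())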
https://docs.eyesopen.com/toolkits/python/oechemtk/OEChemClasses/OEDefaultConfTest.html
2018-05-20T13:47:27
CC-MAIN-2018-22
1526794863570.21
[]
docs.eyesopen.com
Saving and Loading Audio You can load audio files as an AudioClip using the ES3.LoadAudio method. MP3 files are only supported on mobile, and Ogg Vorbis files are only supported on standalone platforms. WAV, XM, IT, MOD and S3M files are supported on all platforms except WebGL. As it requires file access, this method is not supported on WebGL. A FileNotFoundException will be thrown if the file does not exist; in this case, you can use ES3.FileExists to check whether the data exists before loading. Saving Audio Files It’s not currently possible to save audio to a compressed format, as Unity lacks the required encoders to do so. However, it’s possible to save and load an AudioClip in Easy Save’s format using the normal ES3.Save and ES3.Load methods. As the data is uncompressed, the file size will be larger than with compressed formats.
https://docs.moodkie.com/easy-save-3/es3-guides/saving-and-loading-audio/
2018-05-20T13:55:25
CC-MAIN-2018-22
1526794863570.21
[]
docs.moodkie.com
The following recommendations are based on Hortonworks’ experience in production data centers: Server platform Typically, dual-socket servers are optimal for Hadoop deployments. For medium to large clusters, using these servers is a better choice than entry-level servers, because of their load-balancing and parallelization capabilities. In terms of density, select server hardware that fits into a low number of rack units. Typically, 1U or 2U servers are used in 19” racks or cabinets. Storage options For general-purpose Hadoop applications, we recommend using a relatively large number of hard drives (typically eight to twelve SATA LFF drives) per server. Currently, typical capacity in production environments is around 2 TB per drive. Highly I/O intensive environments may require using 12 x 2 TB SATA drives. The optimal balance between cost and performance is generally achieved with 7,200 RPM SATA drives. If your current or predicted storage is experiencing a significant growth rate, you should also consider using 3 TB disks. SFF disks are being adopted in some configurations for better disk bandwidth. We recommend that you monitor your cluster for any potential disk failures because more disks will increase the rate of disk failures. If you do have a large number of disks per server, we recommend that you use two disk controllers, so that the I/O load can be shared across multiple cores. Hortonworks strongly recommends only using either SATA or SAS interconnects. On an HDFS cluster using a low-cost reliable storage option, you will observe that the old data stays on the cluster indefinitely and your storage demands grow quickly. With 12-drive systems, you typically get 24 TB or 36 TB per node. Using this storage capacity in a node is only practical with Hadoop release 1.0.0 or later (because the failures are handled gracefully, allowing machines to continue serving from their remaining disks). Hadoop is storage intensive and seek efficient, but does not require fast and expensive hard drives. If your workload pattern is not I/O intensive, it is safe to add only four or six disks per node. Note that power costs are proportional to the number of disks and not storage capacity per disk. We therefore recommend that you add disks to increase storage only and not simply for seeks. Your disk drives should have good MTBF numbers, as slave nodes in Hadoop suffer routine probabilistic failures. Your slave nodes do not need expensive support contracts that offer services like replacement of disks within two hours or less. Hadoop is designed to adapt to slave node disk failure. Treat maintenance activity for the slave nodes as an ongoing task rather than an emergency. It is good to be able to swap out disks without taking the server out of the rack, though switching them off (briefly) is an inexpensive operation in a Hadoop cluster. Memory sizing It is critical to provide sufficient memory to keep the processors busy without swapping and without incurring excessive costs for non-standard motherboards. Depending on the number of cores, your slave nodes typically require 24 GB to 48 GB of RAM for Hadoop applications. For large clusters, this amount of memory provides sufficient extra RAM (approximately 4 GB) for the Hadoop framework and for your query and analysis processes (HBase and/or Map/Reduce). To detect and correct random transient errors introduced due to thermodynamic effects and cosmic rays, we strongly recommend using error correcting code (ECC) memory. 
Error-correcting RAM allows you to trust the quality of your computations. Some parts (chip-kill/chip spare) have been shown to offer better protection than traditional designs, as they show less recurrence of bit errors. (See DRAM Errors in the Wild: A Large-Scale Field Study, Schroeder et al., 2009.) If you want to retain the option of adding more memory to your servers in the future, ensure there is space to do this alongside the initial memory modules. Memory provisioning Memory can also be provisioned at commodity prices on low-end server motherboards. It is typical to over-provision memory. The unused RAM will be consumed either by your Hadoop applications (typically when you run more processes in parallel) or by the infrastructure (used for caching disk data to improve performance). Processors Although it is important to understand your workload pattern, for most systems we recommend using medium clock speed processors with less than two sockets. For most workloads, the extra performance per node is not cost-effective. For large clusters, use at least two quad-core CPUs for the slave machines. Power considerations Power is a major concern when designing Hadoop clusters. Instead of automatically purchasing the biggest and fastest nodes, analyze the power utilization for your existing hardware. We have observed huge savings in pricing and power by avoiding the fastest CPUs, redundant power supplies, etc. Vendors today are building machines for cloud data centers that are designed to reduce cost and power and are lightweight. Supermicro, Dell, and HP all have such product lines for cloud providers. So if you are buying in large volume, we recommend evaluating these stripped-down “cloud servers”. For slave nodes, a single power supply unit (PSU) is sufficient, but for master servers use redundant PSUs. Server designs that share PSUs across adjacent servers can offer increased reliability without increased cost. Some co-location sites bill based on the maximum-possible power budget and not the actual budget. In such a location the benefits of the power saving features of the latest CPUs are not realized completely. We therefore recommend checking the power billing options of the site in advance. Network This is the most challenging parameter to estimate because Hadoop workloads vary a lot. The key is buying enough network capacity at reasonable cost so that all nodes in the cluster can communicate with each other at reasonable speeds. Large clusters typically use dual 1 GB links for all nodes in each 20-node rack and 2 x 10 GB interconnect links per rack going up to a pair of central switches. A good network design considers the possibility of unacceptable congestion at critical points in the network under realistic loads. Generally accepted oversubscription ratios are around 4:1 at the server access layer and 2:1 between the access layer and the aggregation layer or core. Lower oversubscription ratios can be considered if higher performance is required. Additionally, we also recommend having 1 GE oversubscription between racks. It is critical to have dedicated switches for the cluster instead of trying to allocate a VC in existing switches - the load of a Hadoop cluster would impact the rest of the users of the switch. It is also equally critical to work with the networking team to ensure that the switches suit both Hadoop and their monitoring tools. Design the networking so as to retain the option of adding more racks of Hadoop/HBase servers. Getting the networking wrong can be expensive to fix. 
The quoted bandwidth of a switch is analogous to the miles per gallon ratings of an automobile - you are unlikely to replicate it. Deep buffering is preferable to low latency in switches. Enabling Jumbo Frames across the cluster improves bandwidth through better checksums and may also provide packet integrity.
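As a quick back-of-the-envelope check of the per-node storage figures quoted earlier in this section, the sketch below computes raw capacity for the 12-drive configurations; the three-way replication used for the usable-capacity estimate is a common HDFS default and an assumption here, not something the text specifies.

def node_raw_tb(drives, tb_per_drive):
    # Raw capacity of one slave node.
    return drives * tb_per_drive

for tb_per_drive in (2, 3):  # the 12 x 2 TB and 12 x 3 TB configurations mentioned above
    raw = node_raw_tb(12, tb_per_drive)
    usable = raw / 3.0       # assumed HDFS replication factor of 3
    print("12 x %d TB drives: %d TB raw, ~%.0f TB usable per node" % (tb_per_drive, raw, usable))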
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.2/bk_cluster-planning/content/hardware-for-slave.1.html
2018-05-20T14:00:12
CC-MAIN-2018-22
1526794863570.21
[]
docs.hortonworks.com
Create a Node Template. Note Windows Azure nodes are supported starting in HPC Pack 2008 R2 with Service Pack 1 (SP1). Unmanaged server nodes are supported starting in HPC Pack 2008 R2 with SP3. Important If you want to create a node template to deploy an operating system image, you will need to create a new operating system image, or load an existing one. For more information, see Node Template Images. To create a node template In the Deployment. Note In a node template, the Mount Share and Install Windows tasks have optional properties that can be used to specify user or administrator credential information. For security purposes, any credentials that are specified in the node template are removed during export. When you import the template, you can re-specify the credentials by editing the template. For more information about node template tasks and their properties, see Understanding Node Templates. Important As a best practice for security in an HPC cluster, if you save or export information about an HPC cluster into XML files, we recommend that you track the location of those files and protect them from unauthorized use. For more information, see Security Considerations for File and Folder Permissions in Microsoft HPC Pack. For information about using HPC Cluster Manager, see Overview of HPC Cluster Manager.
https://docs.microsoft.com/en-us/previous-versions/orphan-topics/ws.10/ff919334(v=ws.10)
2018-05-20T14:53:12
CC-MAIN-2018-22
1526794863570.21
[]
docs.microsoft.com
Welcome to Healthcare.ai! The aim of healthcareai is to streamline machine learning in healthcare with two main goals: - Allow one to easily create models based on tabular data, and deploy a good model that pushes predictions to a database (MSSQL, MySQL, SQLite, CSV). - Provide tools related to data cleaning, manipulation, and imputation. Where's the Source Code? Find our code at our Github Repo Do I Want This Package or the R package? Choose this Python package if several of the following apply: - You love Python - You're working with 5M+ rows - You can tolerate some quirks Otherwise, the R package is recommended, as it currently has more features and R tends to be more newbie-friendly. R is where we're putting most of our time.
http://healthcareai-py.readthedocs.io/en/latest/
2018-05-20T13:41:16
CC-MAIN-2018-22
1526794863570.21
[array(['./img/GitHub-Mark-120px-plus.png', 'Our Github repo'], dtype=object) ]
healthcareai-py.readthedocs.io
Configure cloud hybrid search - roadmap Learn how to configure cloud hybrid search for SharePoint Server by setting up a cloud Search service application in your SharePoint Server environment and connecting it to your search index in Office 365. This article describes how you set up cloud hybrid search in an environment with SharePoint Server and SharePoint Online for Office 365 for enterprises. With the cloud hybrid search solution, you add crawled metadata from all your content, including on-premises content, to your search index in Office 365. When users search in Office 365, they get search results from both on-premises and Office 365 content. Note If you are an Office 365 Dedicated customer, setting up cloud hybrid search requires engagement of SharePoint Service Engineering staff. Contact your Microsoft Service Delivery Manager for assistance. If you aren't sure what type of customer you are, you can safely disregard this note. Before you start To complete the configuration steps you'll need these items: The hardware and software that's needed in a SharePoint Server hybrid environment. An on-premises server or virtual machine for cloud hybrid search that has: Minimum 100 GB storage, 16 GB RAM, and four 1.8 GHz CPUs. SharePoint Server installed. Is a member of a Windows Server Active Directory domain. (SharePoint Server 2013 only) You must have at least Service Pack 1 and the January 2016 public update installed. The accounts that are needed in a SharePoint Server hybrid environment, a search account for cloud hybrid search in SharePoint Server, and a managed account for default content access in SharePoint Server. Ensure that the account for default content access has at least read access to the content to crawl. Your company's or organization's SharePoint Online portal URL, such as https://<yourtenantname>.sharepoint.com The search architecture plan you made for cloud hybrid search. If you'll use the Hybrid Picker in the SharePoint Online admin center wizard to help you configure, ensure that the application farm that hosts the SharePoint Server Central Administration website has .NET 4.6.3 installed. If you'll use the CreateCloudSSA.ps1 and Onboard-CloudHybridSearch.ps1 Microsoft PowerShell scripts to help you configure, find them in the Microsoft Download Center. Follow these steps: If you already completed step 1 when you configured a different hybrid solution, skip that step and go to the next. Create a cloud Search service application in SharePoint Server The cloud SSA lets you crawl and add metadata from on-premises content to the search index in Office 365. Each search farm can have only one cloud SSA, but can have multiple SSAs in combination with the cloud SSA. You can't convert an existing SSA to a cloud SSA. Note If your organization restricts computers from connecting to the internet, you need to allow access to the endpoints (FQDNs) that cloud hybrid search uses. Include the endpoints in your outbound allow lists. The endpoints are listed in the SharePoint Online section of the article Office 365 URLs and IP address ranges and are marked for use with Hybrid Search. Use the Hybrid Picker to connect your SharePoint Server and Office 365 environments and create the cloud Search service application. On the application server that hosts the SharePoint Server Central Administration website: Log on to the console as a farm administrator. Connect to Office 365 as a global administrator. Navigate to the Hybrid Picker download page to download, install, and start the Hybrid Picker wizard. 
Follow the prompts in the Hybrid Picker and select the hybrid search feature. The Hybrid Picker lets you choose between a cloud SSA with the default search architecture on the application server that hosts the SharePoint Server Central Administration website, or a cloud SSA with a search architecture on two application servers (supports high availability). The Hybrid Picker saves you time because it also connects the cloud SSA to your Office 365 tenant (step 3). Alternative methods for creating a cloud Search service application You can also create the cloud SSA as follows: You can download the CreateCloudSSA.ps1 PowerShell script from the Microsoft Download Center and run it. The script lets you choose between a cloud SSA with the default search architecture on the application server that hosts the SharePoint Server Central Administration website, or a cloud SSA with a search architecture on two application servers (supports high availability). You can use the SharePoint Central Administration website, just like you would for an SSA. With this method you get a cloud SSA and the default search architecture installed on the application server that hosts the SharePoint Server Central Administration website. To create a cloud SSA by running the CreateCloudSSA.ps1 PowerShell script, follow the instructions below. Note When you installed SharePoint Server, the user account from which you ran the installation was granted the appropriate permissions to run Windows PowerShell cmdlets. On the application server that hosts the SharePoint Server Central Administration website, follow these steps: Make sure you're using the same user account as when you installed SharePoint Server. This account is granted the appropriate permissions to run Windows PowerShell cmdlets. Start the Windows PowerShell console with administrator privileges: Click Start, type PowerShell, and then right-click Windows PowerShell and select Run as administrator. Run the CreateCloudSSA.ps1 PowerShell script. When prompted, type: The host name of the search server in SharePoint Server. If you've planned highly available search, the host name of the second search server. The Search service account in this format: domain\username. A name of your choice for the cloud SSA. The name of the database server in SharePoint Server. Verify that you see the message that the cloud SSA was successfully created. Can I make my own Windows PowerShell script for creating a cloud SSA? If you want to make your own PowerShell script for creating a cloud SSA, first study the CreateCloudSSA.ps1 PowerShell script we've provided. Notice that the difference between creating a cloud SSA and an SSA is the value of the property CloudIndex. You set CloudIndex: true when you create a cloud SSA (you can't change this value later). When CloudIndex is true, crawled metadata is not added to the on-premises search index. However, this doesn't mean that the metadata is added to the Office 365 search index; you have to onboard the cloud SSA to cloud hybrid search for that to happen (see Connect your cloud Search service application to your Office 365 tenant). Ensure that your PowerShell script: Tests that the Search service account is a managed account, and makes it a managed account if it isn't. Includes -CloudIndex $true as an argument when it uses the New-SPEnterpriseSearchServiceApplication PowerShell cmdlet. 
Connect your cloud Search service application to your Office 365 tenant Note If you used the Hybrid Picker to create a cloud Search service application, then you can skip this step. This section shows you how to onboard your cloud SSA and Office 365 tenant to cloud hybrid search and covers: Connecting your cloud SSA and your Office 365 tenant - When your cloud SSA and your Office 365 tenant are correctly connected, the cloud hybrid search solution is ready to add crawled metadata from on-premises content to the search index in Office 365. When you've onboarded your cloud SSA, check to see that your cloud SSA has the value 1 for the property IsHybrid. You check by running this PowerShell command: $ssa.GetProperty("CloudIndex"). Configuring server-to-server authentication - Server-to-server authentication allows servers to access and request resources from one another on behalf of users. On the application server that hosts the SharePoint Server Central Administration website, follow these steps: Ensure that the date and time of the server are synchronized with the other servers in the SharePoint Server farm. Download and install the Microsoft Online Services Sign-In Assistant for IT Professionals RTW from the Microsoft Download Center. Download version 1.1.166.0 or newer of the Azure Active Directory Module for Windows PowerShell. Click Run to run the installer package. Download the OnBoard-CloudHybridSearch.ps1 PowerShell script from the Microsoft Download Center. If your environment is Office 365 Business, Office 365 Enterprise, Office 365 Education, Office 365 operated by 21Vianet, Office 365 Germany, or Office 365 US Government Defense, open an elevated PowerShell prompt, and run the OnBoard-CloudHybridSearch.ps1 PowerShell script as follows: Import-Module MSOnline .\OnBoard-CloudHybridSearch.ps1 -PortalUrl <SPOTenantPortalUrl> -CloudSsaId <CloudSSANameCreatd> SPOTenantPortalUrl is the URL of your company's or organization's SharePoint Online portal, and CloudSsaId is the name of the cloud SSA that you created earlier. If your environment is Office 365 US Government Communication, open an elevated PowerShell prompt, and run the OnBoard-CloudHybridSearch.ps1 PowerShell script as follows: Import-Module MSOnline .\OnBoard-CloudHybridSearch.ps1 -PortalUrl <SPOTenantPortalUrl> -CloudSsaId <CloudSSANameCreatd> -IsPortalForUSGovernment $true SPOTenantPortalUrl is the URL of your company's or organization's SharePoint Online portal, and CloudSsaId is the name of the cloud SSA that you created earlier. When prompted, type the global admin credentials for your Office 365 tenant. Set up search architecture in SharePoint Server for cloud hybrid search If you planned to use the default search architecture that you get when creating a cloud SSA, you can skip this step. Otherwise, ensure that you have prepared the servers you need for your planned search architecture for cloud hybrid search, and follow the guidance for setting up your planned search architecture. This guidance is applicable also for cloud hybrid search. Create a content source for cloud hybrid search to crawl We recommend that you start with a small on-premises content source, for example a small file share, to test. You can add more on-premises content sources later. Verify that the user account that is performing this procedure is an administrator for the cloud SSA. On the home page of Central Administration, in the Application Management section, click Manage service applications. 
On the Manage Service Applications page, click the cloud SSA. To set the priority of this content source, in the Content Source Priority section, on the Priority list, select Normal or High. Click OK. Set up a separate Search Center in Office 365 to validate hybrid search results After you've set up cloud hybrid search and completed a full crawl of your on-premises content, your existing Search Center in Office 365 as well as Office Delve will automatically show both on-premises and online search results. Before you start the full crawl, we recommend that you create a new, separate Search Center. Set it up to show the mixed on-premises and online search results. This way you can validate and tune the new search experience in the separate Search Center, while you keep the existing Search Center unchanged. Follow these steps to set up a separate Search Center in Office 365: Create a result source that retrieves search results from the search index of this tenant, but limits search results to Office 365 content by using a Query Transform. Change the default query transform to "{?{searchTerms} NOT IsExternalContent:true}". This works because content that has the managed property IsExternalContent set to true (see About the IsExternalContent managed property) in the SharePoint Online search schema is on-premises content. Modify the Search Results Web Part in your Office 365 Search Center to use the result source that you just created. Your users get the original search experience in this Search Center. Create a second Office 365 Search Center that uses the default result source. This Search Center has hybrid search results when you've run a full crawl. Validate and tune your new search experience in this Search Center. Set up access so only testers and administrators have access to the second Office 365 Search Center. Here's an example of a validation environment: On-premises content. During crawl, content is added to the Office 365 index. Office 365 content. During crawl, content is added to the Office 365 index. Default (or existing) Office 365 Search Center. This Search Center uses the custom result source that limits search results to only Office 365 content. Second Office 365 Search Center, where you validate and tune how hybrid search results are shown. This Search Center uses the default result source and shows search results from both on-premises and Office 365 content. About the IsExternalContent managed property An important part in this environment is the custom result source you use in the default or existing Office 365 Search Center. This result source keeps the search experience unchanged while you validate and tune how hybrid search results are displayed. An important piece in this custom result source is the IsExternalContent managed property in the SharePoint Online search schema. Before you set up cloud hybrid search, this managed property is empty. But after you've set up cloud hybrid search and crawled your on-premises content, this property is set to true for all on-premises content. You can therefore limit search results to show only Office 365 content with NOT IsExternalContent:true. Start a full crawl of on-premises content for cloud hybrid search Start a full crawl of the content source. See Start, pause, resume, or stop a crawl in SharePoint Server 2013 or follow these steps: Verify that the user account that is performing this procedure is an administrator for the Cloud Search service application. 
On the home page of the SharePoint Central Administration website, in the Application Management section, click Manage service applications. On the Manage Service Applications page, click the cloud Search service application. On the Search Administration page, in the Crawling section, click Content Sources. On the Manage Content Sources page, in the list of content sources, point to the name of the content source that you want to crawl, click the arrow, and then click Start Full Crawl. The value in the Status column changes to Crawling Full for the selected content source. Verify that cloud hybrid search works After the full crawl completes, verify that your on-premises content shows up in the search results in your validation Search Center in Office 365. Log in to Office 365 with your work or school account. Make sure that: You have access to the validation Search Center. You have access to the content in the content source that you have crawled. If you performed step 1 of this roadmap, you should have access. Your organization hasn't assigned user access rights to the on-premises content by using one of the default security groups in Windows Server Active Directory (AD), for example the Domain Users security group; see Plan cloud hybrid search for SharePoint. Search for IsExternalContent:1 in the validation Search Center. The results you get should show content from the on-premises content source that you've crawled. Verify that your on-premises content shows up in the search results. Tune cloud hybrid search After you've set up cloud hybrid search and verified that you get search results from on-premises content in your validation Search Center in Office 365, set up the search experiences that you planned. You might find this guidance useful: With cloud hybrid search you manage the search schema in SharePoint Online in Office 365, just as you would in an Office 365 environment. Learn how in Manage the Search Center in SharePoint Online. You manage how search results are displayed from the search schema in SharePoint Online; see Manage the Search Center in SharePoint Online. If you've set up site search in SharePoint Server to get search results from Office 365, you also manage how these results are displayed from the search schema in SharePoint Online. Enable previews of on-premises search results in cloud hybrid search. Show results from Office 365 in on-premises SharePoint with cloud hybrid search. To publish your SharePoint Server site and make it accessible to your users, follow the best practices in Plan for Internet, intranet, and extranet publishing sites in SharePoint Server. To open a link from a search result that comes from on-premises content, users have to either be connected to the on-premises intranet via a Virtual Private Network (VPN) connection or be logged on to where the content is stored. Alternatively, you enable users to open such links by installing a reverse proxy device for SharePoint Server. After setting up and validating the planned search experiences, you might want to clear your search index in Office 365 for metadata from the on-premises content you've used during this work. This works differently from what you might be familiar with from SharePoint Server. In the SharePoint Central Administration website you can use the option "Index reset" for an SSA to remove all content from the search index. 
This option does not work for cloud hybrid search because there is no direct communication between the cloud SSA in SharePoint Server and the search index in Office 365. If you only want to remove some on-premises metadata, remove that on-premises content source, or create a crawl rule that doesn't crawl the URL of a file. If you need to remove all metadata from on-premises content from the search index in Office 365, open a ticket with Microsoft Support. Related Topics Plan cloud hybrid search for SharePoint
https://docs.microsoft.com/en-us/SharePoint/hybrid/configure-cloud-hybrid-searchroadmap
2018-05-20T14:46:31
CC-MAIN-2018-22
1526794863570.21
[array(['../sharepointserver/media/9f9528f3-ee79-46b2-8113-d7b10be675ba.png', 'The illustration shows how content enters the Office 365 index from both a SharePoint Server content farm and from Office 365. The standard search center in Office 365 only retrieves Office 365 results from the search index, while the validation search ce'], dtype=object) ]
docs.microsoft.com
To enable high availability you must manually configure the Falcon server. When the primary Falcon server is down, the system administrator must manually start the backup Falcon server. Then the backup Falcon server continues where the primary server stopped. The Falcon server stores its data in the startup.properties file that is located in the <falcon_home>/conf directory. You should configure the startup properties as follows to achieve high availability: *.config.store.uri: This location should be a directory on HDFS. *.retry.recorder.path: This location should be an NFS-mounted directory that is owned by Falcon and has permissions set to 755 (rwx/r-x/r-x). *.falcon.graph.storage.directory: This location should also be an NFS-mounted directory that is owned by Falcon, and with permissions set to 755. Falcon conf directory: The default location of this directory is <falcon_home>/conf, which is symbolically linked to /etc/falcon/conf. This directory value must point to an NFS-mounted directory to ensure that the changes made on the primary Falcon server are populated to the backup server. To set up an NFS-mounted directory, follow these steps: The example input uses 240.0.0.10 for the NFS server, 240.0.0.12 for the primary Falcon server, and 240.0.0.13 for the backup Falcon server. On the server that hosts the NFS mount directory, log in as root and perform the following steps. Install and start NFS: yum install nfs-utils nfs-utils-lib chkconfig nfs on service rpcbind start service nfs start Create a directory that holds the Falcon data: mkdir -p /hadoop/falcon/data Add the following lines to the file /etc/exports to share the data directories: /hadoop/falcon/data 240.0.0.12(rw,sync,no_root_squash,no_subtree_check) /hadoop/falcon/data 240.0.0.13(rw,sync,no_root_squash,no_subtree_check) Export the shared data directories: exportfs -a Logged in as root, install the nfs-utils package and its library on each of the Falcon servers: yum install nfs-utils nfs-utils-lib Create the NFS mount directory, and then mount the data directories: mkdir -p /hadoop/falcon/data mount 240.0.0.10:/hadoop/falcon/data /hadoop/falcon/data To prepare the Falcon servers for high availability: Logged in as root on each of the Falcon servers, ensure that the properties *.retry.recorder.path and *.falcon.graph.storage.directory point to a directory within the NFS-mounted directory: for example, the /hadoop/falcon/data directory shown in the previous example. Logged in as the falcon user, start the primary Falcon server, not the backup server: <falcon_home>/bin/falcon-start When the primary Falcon server fails, you must manually fail over to the backup server: Logged in as the falcon user, ensure that the Falcon process is not running on the backup server: <falcon-home>/bin/falcon-stop Logged in as root, update the client.properties files on all of the Falcon client nodes and set the property falcon.url to the fully qualified domain name of the backup server. If Transport Layer Security (TLS) is disabled, use port 15000: falcon.url=http://<back-up-server>:15000/ ### if TLS is disabled If TLS is enabled, use port 15443: falcon.url=https://<back-up-server>:15443/ ### if TLS is enabled Logged in as the falcon user, start the backup Falcon server: <falcon-home>/bin/falcon-start
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_data-movement-and-integration/content/ch_data_movement_config_ha.html
2018-05-20T14:24:09
CC-MAIN-2018-22
1526794863570.21
[]
docs.hortonworks.com
Anypoint Connector DevKit 3.8.x Release Notes New Features License management for MuleSoft Certified Connectors. Developers can allow customers to try their connector in design time, yet ensure their connector only works with a valid license in deployment. SOAP Session Management for SOAP Connect enabling developers to inject session tokens in HTTP cookies/headers, or SOAP headers. Further DevKit SOAP Connect functionality includes SOAP message body enrichment with the session token. Operation Filtering to hide operations specified in the WSDL such as login/logout, which is necessary in order to exclude operations from the operations dropdown exposed to the application developer. Caching metadata improves fetching of metadata during application design, and decreases the wait time for connector users. Connector APIDoc documentation is provided intelligently at design time, serving as a contextual help feature. Improved APIDoc documentation generator creates documentation for multiple connection configurations automatically and covers filters and transformers too. In addition, it does not require or support code snippets referencing the xml.sample file such as <@sample.xml … >. See Creating Connector Reference Documentation. In migrating a connector project to DevKit 3.8, all the @sample tags referencing the sample files must be removed. A simple Maven command is available for the DevKit user to run connector functional tests (CTF) on CloudHub. Read through the Test Execution section of the connector certification information for details. Functional test framework supports @Source tests. JDK 8 support for DevKit and connector project creation. To use JDK 8, please add the following property in your pom.xml file. <properties> ... <jdk.version>1.8</jdk.version> ... </properties> Fixed Issues This release includes several fixes; the most important are as follows: Supports byte-type and short-type parameters in processor. Supports Gregorian Calendar in DSQL query. CTF doesn’t throw NullPointerException when a process returns a legitimate empty body. WSDLProvider shows a specific error message rather than a generic error message, helping developers handle the error easily. Resolves an issue causing connector metadata not to appear in Anypoint Studio 5.2. CTF generates well-formatted JSON for metadata. Properly supports MetaDataKeyParam.BOTH and MetaDataKeyParam.INPUT in processors.
https://docs.mulesoft.com/release-notes/connector-devkit/anypoint-connector-devkit-3.8.0-release-notes
2020-08-03T17:45:32
CC-MAIN-2020-34
1596439735823.29
[]
docs.mulesoft.com
Creating a learning circle We ask facilitators and participants a few questions at different times during the life cycle of a learning circle. We want the questions to be thoughtful and the information gathered to be useful. Here’s an overview of what we ask and when we ask it. The diagram below shows the timeline for a single learning circle: learning circle timeline Learning circle creation When creating a learning circle, we ask facilitators: - “What do you hope to achieve by facilitating this learning circle?” - “Is there anything that we can help you with as you get started?” The answers to these questions are shared with a Welcome Committee, which includes both P2PU staff and veteran facilitators from around the world. learning circle creation Additionally, the facilitator has the opportunity to ask a single question of prospective participants. learning circle creation custom question You should use this if there is one other question you want to ask participants. If you have a number of additional questions, we recommend you send those to learners in a separate email through the dashboard, or reach out to us about ways to streamline the process. Learner signup During signup, learners are asked what their goal is for taking a learning circle. Facilitators have the option of adding an additional question. Finally, learners can opt into general P2PU communications. This is what the signup form looks like: learning circle signup Managing learning circles On your facilitator dashboard you’ll find all the tools required to manage individual learning circles, including: an editable sign-up page, a messaging system that you can use to communicate with participants, a feedback tool for sharing information with your colleagues and P2PU, and a report feature that collects and aggregates feedback from learning circle participants. Learner surveys At the end of a learning circle, learners are asked to complete a survey. The survey serves as a baseline for assessing learning circle quality and a prompt for feedback about the online course. These are the questions we’re asking learners in the survey: - What did you hope to achieve when you joined this learning circle? - To what extent did you achieve this? (1-5 scale) - Do you feel more confident about what you just learned in the course? (1-5 scale) - How do you intend to apply what you learned? - How well did the online course work as a learning circle? (1-5 stars and reason) - Please share any other impressions you had about the online course. - How likely are you to recommend participating in a learning circle to a friend or colleague? (1-5 stars and reason) Learners will receive an email right before the last meeting asking them to complete the survey. Facilitators can also find the link for the learner survey on their dashboard and manually share it with learners: learning circle dash Facilitator survey Facilitators will also receive an email right before the last meeting asking them to complete the survey. This is what the facilitator survey looks like: - What did you hope to achieve when you signed up to facilitate this learning circle? - To what extent did you achieve this? (1-5 scale) - Did anything about the learning circle surprise you? - Do you have any stories from the learning circle you want to share with the P2PU community? - About how many people showed up for the first, second and last meetings? - How well did the online course work as a learning circle? 
(1-5 stars and reason) - How likely are you to recommend facilitating a learning circle to a friend or colleague? (1-5 scale and reason) A link to the facilitator survey is also available on the facilitator’s learning circle dashboard.
https://learning-circles-user-manual.readthedocs.io/en/latest/engagement.html
2020-08-03T18:01:35
CC-MAIN-2020-34
1596439735823.29
[]
learning-circles-user-manual.readthedocs.io
Organizers¶
We offer additional functionality for institutions that are running learning circles across multiple locations (“teams”). Each team is led by an organizer, who works closely with P2PU to ensure that new facilitators in their area have everything they need to run learning circles. Teams are granted the following features and functionality: - Customizable learning circle team website - Team profiles featured on the P2PU site - Weekly email with team updates - Proprietary course management - Aggregated learning circle feedback and data Current learning circle teams are visible at. If you would like to start a new team, read on!

Team Sites¶
Each team has a unique learning circle landing page, which features all learning circles happening across a team. The URL will be set to p2pu.org/[your team], with [your team] being whatever an organizer determines is most memorable and descriptive for their audience. The top of the team site features information unique to the team, including a header image, logo, introductory about text, and links for a dedicated website and contact email. All learning circles affiliated with that team will appear below the header, accompanied by a learner-facing FAQ: Organizers can edit the top half of the team site directly from their dashboard by clicking ‘edit team information’. Changes that organizers make through this form should update almost immediately. Other changes (like editing the team name and URL) need to be handled by P2PU directly.

Team Profiles¶
All P2PU account holders can update their profile through account settings (). Profiles of team members are featured in a few different ways: First, team sites feature a carousel of all team members. Facilitators who have uploaded a photo will appear first, followed by others with a placeholder image like this: Organizers should encourage facilitators to add an image so that they can be fully featured on their team site! Additionally, organizer profiles will also be featured on P2PU’s team directory (). Changes made to team membership - both adding/removing members and updating profile images - will not be immediately reflected on the team site but will be updated when P2PU next pushes changes to the website (generally at least once a week). You can send an email to [email protected] if you need these updates visible sooner.

Team Activity Updates¶
Team organizers receive a weekly update every Monday. This update contains information about past, present, and upcoming learning circle meetings on your team, including learning circle reports, weekly feedback from facilitators, and information about who is signing up. Facilitators can opt into receiving this weekly update through their account settings. Additionally, all team members (both organizers and facilitators) will see a new block on their dashboard that highlights upcoming learning circles across the team.

Proprietary Course Management¶
P2PU is constantly moderating the list of online courses available at. One of our goals with course moderation is to ensure that facilitators never encounter a course that they cannot freely access. In addition to deleting courses outright, we will flag a course as “proprietary” if it comes from a pay-to-play vendor like Lynda or GALE. These courses will no longer appear on the public course page search, but they will continue to appear for logged in users who are on the same team as the person who added the course (the assumption being that team members have access to the same proprietary materials).

Aggregated Data Collection¶
During a learning circle, P2PU collects a variety of information from both facilitators and participants (see). All information is aggregated by team and shared with organizers on a regular basis.

Creating and Managing a Team¶
If you would like to start a new team, please create a P2PU account and contact [email protected]. Once set up, you will see a new block on your learning circle dashboard called Team Management, where you can view team members, invite new team members, and edit your team site. There are three ways to invite people to your team:

Option 1: Automatically through email domain¶
If your team is associated with an organization that has its own email domain, we can save the email domain in your team settings so that any new facilitator with a validated matching email address will automatically receive an invitation to join your team. For example, if your team members all have staff emails such as [email protected], you can set bigpubliclibrary.org as your team domain so that when someone with an email address like [email protected] registers on the P2PU platform, they’ll get invited to join the team. Reach out to P2PU if you would like us to set this up for you.

Option 2: Email invitation¶
You can send invitations directly via email. If the recipient already has a P2PU account, they will be prompted to join your team. If they do not, they will be asked to create a P2PU account and then be prompted to join your team.

Option 3: Invitation link¶
As an organizer, you can also generate a unique link that allows anyone with the link to join your team. You can regenerate this link whenever you want, making all previous links inactive. Invitations that have been sent via Option 1 or Option 2 will be visible in the “pending invitations” tab until the user accepts or rejects the invitation.
https://learning-circles-user-manual.readthedocs.io/en/latest/organizer.html
2020-08-03T18:35:36
CC-MAIN-2020-34
1596439735823.29
[array(['_images/team-page-editable-fields.jpg', '_images/team-page-editable-fields.jpg'], dtype=object) array(['_images/team-page-bottom.png', '_images/team-page-bottom.png'], dtype=object) array(['_images/team-page-edit.png', '_images/team-page-edit.png'], dtype=object) array(['_images/team-page-facilitator-profile.png', '_images/team-page-facilitator-profile.png'], dtype=object) array(['_images/organizer-weekly-update.png', '_images/organizer-weekly-update.png'], dtype=object) array(['_images/2019-08-13-team-management.png', '_images/2019-08-13-team-management.png'], dtype=object)]
learning-circles-user-manual.readthedocs.io
If you wish to supply parts yourself, click "Consign part" next to the part number you will be supplying. After placing your order, you will be provided with a shipping address for the parts. Consigned parts still need to be marked as "Place" even if they are not being purchased by us. We keep track of which parts need to be fit for pick & place programming and for quoting purposes. For consigned parts, the sender pays the shipping fees; we do not provide shipping labels. If parts are purchased by us from an international vendor, the sender will also be responsible for all duties and taxes. See How do I consign parts to the factory for further information. Please note that we unfortunately cannot accept fully consigned kits. See for more details.
http://docs.circuithub.com/en/articles/62966-can-i-consign-parts
2020-08-03T17:20:38
CC-MAIN-2020-34
1596439735823.29
[]
docs.circuithub.com
Download assets
All users can simultaneously download multiple assets and folders accessible to them from Brand Portal. This way, approved brand assets can be securely distributed for offline use. Read on to know how to download approved assets from Brand Portal, and what to expect from the download performance. Only Administrators can download expired assets. For more information about expired assets, see Manage digital rights of assets.

Steps to download assets
To download assets or folders containing assets from Brand Portal, follow these steps: - From the Brand Portal interface, do one of the following: - Select the folders or assets you want to download. From the toolbar at the top, click the Download icon. If the assets you are downloading also include licensed assets, you are redirected to the Copyright Management page. On this page, select the assets, click Agree, and then click Download. If you choose to disagree, licensed assets are not downloaded. License-protected assets have a license agreement attached to them, which is done by setting the asset's metadata property in AEM Assets. The Download dialog box appears with the Asset(s) option selected by default. If the assets you are downloading are image files, and you select only the Asset(s) option in the Download dialog but are not authorized by the administrator to access the original renditions of image files, then no image files are downloaded and a notice appears, stating that you have been restricted by the administrator from accessing original renditions. - To download a single folder or an asset, hover the pointer over the folder or the asset. From the quick action thumbnails available, click the Download icon. - To download the renditions of assets in addition to the assets, select Rendition(s). However, to allow auto-generated renditions to download along with custom renditions, deselect Exclude Auto Generated Renditions, which is selected by default. To download only the renditions, deselect Asset(s). By default, only the assets are downloaded. However, original renditions of image files are not downloaded if you are not authorized by the administrator to have access to the original renditions of image files. To preview (or download) dynamic renditions of any asset, ensure that Dynamic Media is enabled and the asset's Pyramid TIFF rendition exists at the AEM author instance from where the assets have been published. When an asset is published to Brand Portal, its Pyramid TIFF rendition is also published. There is no way of generating the Pyramid TIFF rendition from Brand Portal. - To speed up the download of asset files from Brand Portal, select the Enable download acceleration option and follow the wizard. To know more about faster download of assets, refer to the guide on accelerating downloads from Brand Portal. - To apply a custom image preset to the asset and its renditions, select Dynamic Rendition(s). Specify custom image preset properties (size, format, color space, resolution, and image modifier) to apply the custom image preset while downloading the asset and its renditions. To download only the dynamic renditions, deselect Asset(s). - To preserve the Brand Portal folder hierarchy while downloading assets, select Create separate folder for each asset. By default, the Brand Portal folder hierarchy is ignored and all assets are downloaded into one folder on your local system. - To send an email notification to users with a link for downloading the assets, select the email notification option. - Click Download.

Expected download performance
File download experience may vary for users at different client locations, depending on factors such as local Internet connectivity and server latency. The expected download performance for a 2 GB file observed at different client locations is as follows, with the Brand Portal server located in Oregon, United States: Note: Cited data are observed under test conditions, which may vary for users at different locations witnessing varied latency and bandwidth.
https://docs.adobe.com/content/help/en/experience-manager-brand-portal/using/download/brand-portal-download-users.html
2020-08-03T18:59:00
CC-MAIN-2020-34
1596439735823.29
[]
docs.adobe.com
# Quick start

Now that we are ready to roll we can create our first Feathers application. In this quick start guide we'll create our first Feathers REST and real-time API server and a simple website to use it from scratch. It will show how easy it is to get started with Feathers even without a generator or boilerplate. Let's create a new folder for our application:

mkdir feathers-basics
cd feathers-basics

Since any Feathers application is a Node application, we can create a default package.json using npm.

# Installing Feathers

Feathers can be installed like any other Node module by installing the @feathersjs/feathers package through npm. The same package can also be used with a module loader like Webpack or Browserify and in React Native.

npm install @feathersjs/feathers --save

Note: All Feathers core modules are in the @feathersjs namespace.

# Our first app

Now we can create a Feathers application with a simple messages service that allows us to create new messages and find all existing ones. We can run it (the commands and code for these steps are sketched at the end of this quick start) and should see:

A new message has been created { id: 0, text: 'Hello Feathers' }
A new message has been created { id: 1, text: 'Hello again' }
All messages [ { id: 0, text: 'Hello Feathers' }, { id: 1, text: 'Hello again' } ]

Here we implemented only find and create but a service can also have a few other methods, specifically get, update, patch and remove. We will learn more about service methods and events throughout this guide but this sums up some of the most important concepts that Feathers is built on.

# An API server

Ok, so we created a Feathers application and a service and we are listening to events, but it is only a simple NodeJS script that prints some output and then exits. What we really want is to host it as an API web server. This is where Feathers transports come in. A transport takes a service like the one we created above and turns it into a server that other clients (like a web or mobile application) can talk to. In the following example we will take our existing service and use @feathersjs/express, which uses Express to automatically turn our services into a REST API, and @feathersjs/socketio, which uses Socket.io to do the same as a websocket real-time API (as we will see in a bit, this is where the created event we saw above comes in handy).

npm install @feathersjs/socketio @feathersjs/express --save

Now you can run the server (again, see the sketch at the end of this quick start). Note: The server will stay running until you stop it by pressing Control + C in the terminal. And visit localhost:3030/messages to see an array with the one message we created on the server. Pro Tip: The built-in JSON viewer in Firefox or a browser plugin like JSON viewer for Chrome makes it nicer to view JSON responses in the browser. This is the basic setup of a Feathers API server. The app.use calls probably look familiar if you have used Express before. The app.configure calls set up the Feathers transport to host the API. app.on('connection') and app.publish are used to set up event channels which send real-time events to the proper clients (everybody that is connected to our server in this case). You can learn more about channels after finishing this guide in the channels API.

# In the browser

Now we can look at one of the really cool features of Feathers. It works the same way in a web browser! This means that we could take our first app example from above and run it just the same as a website.
Since we already have a server running however, let's go a step further and create a Feathers app that talks to our messages service on the server using a real-time Socket.io connection. In the same folder, add the following index.html page: <!DOCTYPE html> <html lang="en"> <head> <meta charset="UTF-8"> <title>Feathers Example</title> <link rel="stylesheet" href="//unpkg.com/[email protected]/public/base.css"> <link rel="stylesheet" href="//unpkg.com/[email protected]/public/chat.css"> </head> <body> <main id="main" class="container"> <h1>Welcome to Feathers</h1> <form class="form" onsubmit="sendMessage(event.preventDefault())"> <input type="text" id="message-text" placeholder="Enter message here"> <button type="submit" class="button button-primary">Send message</button> </form> <h2>Here are the current messages:</h2> </main> <script src="//unpkg.com/@feathersjs/client@^4.3.0/dist/feathers.js"></script> <script src="//cdnjs.cloudflare.com/ajax/libs/socket.io/2.0.4/socket.io.js"></script> <script type="text/javascript"> // Set up socket.io const socket = io(''); // Initialize a Feathers app const app = feathers(); // Register socket.io to talk to our server app.configure(feathers.socketio(socket)); // Form submission handler that sends a new message async function sendMessage () { const messageInput = document.getElementById('message-text'); // Create a new message with the input field value await app.service('messages').create({ text: messageInput.value }); messageInput.value = ''; } // Renders a single message on the page function addMessage (message) { document.getElementById('main').innerHTML += `<p>${message.text}</p>`; } const main = async () => { // Find all existing messages const messages = await app.service('messages').find(); // Add existing messages to the list messages.forEach(addMessage); // Add any newly created message to the list in real-time app.service('messages').on('created', addMessage); }; main(); </script> </body> </html> If you now go to localhost:3030 you will see a simple website that allows to create new messages. It is possible to open the page in two tabs and see new messages show up on either side in real-time. You can verify that the messages got created by visiting localhost:3030/messages. # What's next? In this chapter we created our first Feathers application and a service that allows to create new messages, store them in-memory and return all messages. We then hosted that service as a REST and real-time API server and used Feathers in the browser to connect to that server and create a website that can send new messages, show all existing messages and update with new messages in real-time. Even though we are using just NodeJS and Feathers from scratch without any additional tools, it was not a lot of code for what we are getting. In the next chapter we will look at the Feathers CLI which can create a similar Feathers application with a recommended file structure and things like authentication and database connections set up for us automatically.
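The code listings that accompanied the steps above (the shell commands and the app.js file) did not survive in this copy of the guide. As a rough reconstruction, here is a minimal sketch of the API server version described above. It assumes Feathers v4, an in-memory MessageService class, and the file name app.js; these details are assumptions, not confirmed by the text.

// Assumed setup (not shown in the text above):
//   npm init --yes
//   npm install @feathersjs/feathers @feathersjs/express @feathersjs/socketio --save
// Run with: node app.js
const feathers = require('@feathersjs/feathers');
const express = require('@feathersjs/express');
const socketio = require('@feathersjs/socketio');

// A minimal in-memory messages service implementing only `find` and `create`
class MessageService {
  constructor() {
    this.messages = [];
  }

  async find() {
    // Return all stored messages
    return this.messages;
  }

  async create(data) {
    // Create a new message with an incrementing id and store it
    const message = { id: this.messages.length, text: data.text };
    this.messages.push(message);
    return message;
  }
}

// Create an Express-compatible Feathers application
const app = express(feathers());

// Parse JSON bodies and expose services as a REST API
app.use(express.json());
app.configure(express.rest());
// Also expose services through Socket.io for real-time events
app.configure(socketio());
// Serve index.html from the current folder (assumption: it sits next to app.js)
app.use(express.static(__dirname));

// Register the messages service on the `/messages` path
app.use('/messages', new MessageService());
// Express-style error handler
app.use(express.errorHandler());

// Send all real-time events to every connected client
app.on('connection', connection => app.channel('everybody').join(connection));
app.publish(() => app.channel('everybody'));

// Log every newly created message (the `created` event mentioned above)
app.service('messages').on('created', message =>
  console.log('A new message has been created', message)
);

// Create a message on the server, then start listening on port 3030
app.service('messages')
  .create({ text: 'Hello world from the server' })
  .then(() => {
    app.listen(3030).on('listening', () =>
      console.log('Feathers server listening on localhost:3030')
    );
  });

With a server like this running, visiting localhost:3030/messages should return the JSON array of messages, and the index.html page shown above (served from the same folder) can talk to it over Socket.io.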
https://docs.feathersjs.com/guides/basics/starting.html
2020-08-03T17:28:51
CC-MAIN-2020-34
1596439735823.29
[]
docs.feathersjs.com
Example Usage¶
Once you have loaded your data, you can analyze it using all the capabilities available in xarray. Here are a few quick examples.

Land Masks¶
xmitgcm simply reads the MDS data directly from the disk; it does not attempt to mask land points, which are usually filled with zero values. To mask land, you can use xarray's where function together with the hFac variables related to MITgcm's partially filled cells. For example, with the global_oce_latlon dataset, an unmasked average of salinity gives:

ds.S.mean()
>>> <xarray.DataArray 'S' ()> array(18.85319709777832, dtype=float32)

This value is unrealistically low because it includes all of the zeros inside the land, which should be masked. To take the masked average, instead do:

ds.S.where(ds.hFacC>0).mean()
>>> <xarray.DataArray ()> array(34.73611831665039)

This is a more correct value.

Volume Weighting¶
However, it is still not properly volume weighted. To take a volume-weighted average, you can do:

volume = ds.hFacC * ds.drF * ds.rA
(ds.S * volume).sum() / volume.sum()
>>> <xarray.DataArray ()> array(34.779126627139945)

This represents the correct mean ocean salinity. A different land mask and volume weighting is required for variables located at the u and v points.
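As a hedged illustration of that last point, assuming the dataset also carries the usual xmitgcm grid variables for the u points (hFacW, drF, and the western-face cell area rAw) and a u-point variable here called U, none of which are guaranteed by the text above, the analogous volume-weighted mean would look like:

# Sketch only: U, hFacW, drF, and rAw are assumed to be present in ds.
# hFacW is the open (wet) fraction at the western cell face and rAw is the face area.
volume_u = ds.hFacW * ds.drF * ds.rAw
# Land points have hFacW == 0, so they contribute nothing to either sum.
u_mean = (ds.U * volume_u).sum() / volume_u.sum()

The same pattern would apply at the v points, using hFacS and rAs instead.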
https://xmitgcm.readthedocs.io/en/latest/examples.html
2021-04-11T01:26:21
CC-MAIN-2021-17
1618038060603.10
[]
xmitgcm.readthedocs.io
As issues are received, Evergreen creates a holding statement in the OPAC based on what is set up in the Caption and Patterns of the subscription. The system-generated holdings can only be edited by changing caption and pattern information; there is no ability to edit the statement as free text.
http://docs.evergreen-ils.org/2.7/_holdings.html
2017-04-23T10:03:54
CC-MAIN-2017-17
1492917118519.29
[]
docs.evergreen-ils.org
Contributing to psiTurk¶
Note: This guide is copied more or less from the contributor guidelines of the gunicorn project. Alterations were made for the nature of this particular project. An up-to-date copy of this guide always resides here. Want to contribute to psiTurk? Awesome! Here are instructions to get you started. We want to improve these as we go, so please provide feedback.

Contribution guidelines¶

Pull requests are always welcome¶
We want to keep psiTurk lean, focused, and usable. We don't want it to do everything for everybody. This means that we might decide against incorporating a new feature. However, there might be a way to implement that feature on top of psiTurk.

Discuss your design on the mailing list¶
We recommend discussing your plans in our Google group before starting to code - especially for more ambitious contributions. This gives other contributors a chance to point you in the right direction, give feedback on your design, and maybe point out if someone else is working on the same thing.

Create issues...¶
Any significant improvement should be documented as a github issue. Fork the repo and make changes on your fork in a new branch. Make sure you include relevant updates or additions to documentation when creating or modifying features. Write clean code. Pull request descriptions should be as clear as possible and include a reference to all the issues that they address. Commits that fix or close an issue should include a reference like Closes #XXX or Fixes #XXX, which will automatically close the issue when merged. Add your name to the THANKS file, but make sure the list is sorted and your name and email address match your git configuration.

Contributing to the docs¶
Our docs are currently hosted at readthedocs. Readthedocs uses Sphinx as the backend for their documentation, so in order to update the docs you will first have to install Sphinx, simply by typing easy_install -U Sphinx on the command line. There's a Makefile in the docs directory, so you can generate the docs by running make on the command line; for example, make html will generate the html docs in _build/html. Running make with no arguments will show you the available subcommands. All documentation files are in the docs folder and are formatted as reStructured Text. A good, detailed manual for the reStructured Text syntax can be found here. Some essentials: The index page is the main page that users will see when they open the docs. It is also how readthedocs generates the sidebar that contains the names of all the individual pages in the documentation, so it is important that this is formatted correctly. The most important feature is the toctree. The toctree just looks like this:

.. toctree::

   forward
   install
   quickstart
   recording

Sphinx will go through the pages listed in the toctree, search for subject headers, and create both the links for the index page and the sidebar in the correct format, in the order that the pages are listed. For this reason, it is also very important that subject headers be used correctly on the individual pages. For example, the forward page has a title that looks like this:

Forward
=======

and subtitles that look like this:

What is psiTurk?
~~~~~~~~~~~~~~~~

It actually doesn't matter what character you use for the underline; it can be any of = - ` ' " : ~ ^ _ * + # < > but it must be consistent, since all headers with the same character will be at the same level. For convenience, we are using ===== to mean title and ~~~~~ to mean subheader.
Some other basic things in rST: Links look like this: `Link text <http://example.com>`_, with the actual page in angle brackets. If the link is to another page within the docs, you only need to include the name of the page. Whenever you include a code example, put this line before it:

.. code:: javascript

All pages on readthedocs.org (including this one) have a link to "Edit on Github." This can be a great way to "steal" formatting ideas for your documentation edits.

Decision process¶

How are decisions made?¶
In general, all decisions affecting psiTurk, big and small, follow the same 3 steps: - Step 1: Open a pull request. Anyone can do this. - Step 2: Discuss the pull request. Anyone can do this. - Step 3: Accept or refuse a pull request. The little dictators do this (see below, "Who decides what?").

Who decides what?¶
psiTurk, like gunicorn, follows the timeless, highly efficient and totally unfair system known as Benevolent dictator for life. In the case of psiTurk, there are multiple little dictators, who are the core members of the gureckislab research group and alumni. The dictators can be emailed at [email protected]. For new features from outside contributors, the hope is that friendly consensus can be reached in the discussion on a pull request. In cases where it isn't, the original project creators John McDonnell and/or Todd Gureckis will intervene to decide. The little dictators are not required to create pull requests when proposing changes to the project.

Is it possible to become a little dictator if I'm not in the Gureckis lab?¶
Yes, we will accept new dictators from people especially engaged and helpful in improving the project.
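Putting those pieces together, here is a small illustrative skeleton for a new documentation page. This is only a sketch: the page title, subsection name, link target, and code snippet are placeholders, not anything prescribed by this guide.

My New Page
===========

A short introductory paragraph for the page.

A subsection
~~~~~~~~~~~~

See the `project homepage <http://example.com>`_ for background.

.. code:: javascript

    // a short javascript example goes here
    console.log("hello psiTurk");

Remember to add the page's file name (without the .rst extension) to the toctree in the index page so it shows up in the sidebar.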
http://psiturk.readthedocs.io/en/latest/contribute.html
2017-04-23T09:58:49
CC-MAIN-2017-17
1492917118519.29
[]
psiturk.readthedocs.io
Around The Campfire Last Edit: Apr 28th, 2015 No art submitted yet Campfire Story (Revised 2012) “It is a longer drive than if we rented horses and rode into the forests, but for comfort and safety if it should storm we will drive the car.” Jack told the girls, as Paul stood nearby and said not a word pro or con. Katie was all for their camping the weekend up in the forested foothills to the Wasatch Mountain range. Shanna was the only one of the four who felt uneasy, even though she wanted to go and trusted Jack to act with respect to her virginity. She had asked her father what he thought about it. Her father spoke sternly and of morals, he was against the idea of two girls and two boys traveling out of town and it showing on her face as a scowl. Jack tried to cheer her spirits with telling her of the beauty she would soon see. The respect she had for her father and his wisdom about so many things came as a reminded warning. Shanna, the girlfriend to Jack, each was of a different race, he Italian-American and she a black American had vastly different rules and exceptions to the rules. Paul and Katie were the instigators of the weekend vacation, they already as disrespectful to each set of parents, and as much of what worried, concerned parents want for their children. Worried still, Shanna sat sullen in the passenger front seat as Jack sped along the highway leading to the forest park entrance. The promised hour-long ride became a two-hour excursion, first by paved roads in the park and then to jeep trails until their car could not advance without causing it damage. Four youthful people then trekked a half mile into the woods, Jack leaving the common trail for one he said would lead to a small lake and a peaceful setting. Upon their arrival at where Jack wished them to be, he and Paul worked feverishly to erect the camping tent for the girls to sleep. In the mountains sundown comes sooner and quicker as the sun slips behind the peaks. A warm campfire and roasting hot dogs offered some comfort to an already worried daughter as of what her father would say to her when she returned home. Jack snuggled next to Shanna, while Paul and Katie frolicked; rolling on the ground, he mostly on top of her and acting as if Jack and Shanna were not there at all. Feeling more than ignored by two overly passionate friends, Shanna got up and asked where she might go to use a rest room. Paul and Katie laughed a lot hearing Shanna ask a foolish question. As before Jack supplied the logical answer, Paul suggested to Shanna she walk out into the dark and find a friendly tree. Shanna looked at Jack and the best he could offer her was to give a shrugged shoulder as if to say, find a tree. She then looked at the tent; it nicely set up for to sleep but had nothing where she could relieve an urge. October nights tend to get chilly cold rather fast after sundown. The damp nighttime chill in the air only made her urge worse. In a short amount of time, she felt it grow to something undeniable, getting up she stood and looked at the others before deciding to trudge into the darkness, and feeling a want of privacy, went a fair distance from camp and the light from the fire. Meanwhile, Jack and Paul had plans of roughing it a bit, they wanting sleep in there bedrolls, lying under the stars. As Katie went to do the same as Shanna, Paul and Jack looked with wonder at each other as to why Shanna made such sudden stir. 
Their eyes questioning as a mutually wondering, they laughed, hearing Katie moaning about tree bark scratching her back. A week yet before Halloween and a half hour since Katie returned from her excursion away from camp, there was no sign of Shanna returning. Jack said she was angry and playing an early trick on them, but Jack was worried. The dark made worse by a moonless night. The two fellows were thinking about looking for Shanna. Katie sat quiet; se knew the real reason and kept it to herself. When Jack asked why she was so quiet, she gave a shrug to her shoulders, as then winked at Jack for her own reasons. She walked then jogged, and feeling lost worse, she began running. Very unsure as which direction was right with haste, she soon had wandered a long way from the camp. She had looked hard for a place with two trees close together, a place where she might hold on both to squat and relieve that growing urge. She having found that proper place and the urge in her near bursting, she gave up and just did as had Katie suggested, bracing next to a tree until feeling relieved. Shanna wandered too far; she called out to the darkness for Jack but heard only a stray Owl hooted back. She was not appreciative of her own scent left there on the roots of that tree, as her thoughts turned to where she was, and of the others, and more so, as of where again was the camp. Hesitant, Shanna began to wander, the darkness not helping her walk a straight line, she had run far and the others were well out of sight. Wholly undecided her fear began to cloud her better judgment and she began to move deeper into the dark forest. It took not too long a time before she realized the fact that she was lost. She would run with legs as feet moving her as they had in the recent sixteen-kilometer foot race. The faster she ran, the more she bumped into trees and they deflecting her into a changed direction. There were thickets too, brambles, bushes and the like that scratched and made her bleed. When she stopped and felt winded, the distant glow of a campfire lit the upper branches of the tall trees it offered Shanna hope for a reprieve from a horrible situation. Delighted at the thought of finding her friends again, or at the least of a person or persons to help her find the others, Shanna jogged toward the blazing light of a big bonfire. When she came closer, as walking from out a thicket, she discovered there where she stood as looking at some large rears belonging to horses all tied to a tethering line strung between two trees. As Shanna moved carefully along behind the horses, she noted that most of those there had no saddle, ridden there by bareback riders. The horses acted as quite nervous. Shanna knew horses, she an equestrian rider since she was ten. The breeds of the horses and their colors did not vary, so many the same, and all she noted being stallions still! Seeing this Shanna thought those by the fire must be some rich folk. They as having come all that far out into the forest on horseback, some bareback doing what seemed as something very novel. A second as better look at the group of horses standing there, each appeared as if a champion mounts. All of the horses looked to her as if young animals, they exceedingly muscular and those there that saw her became aroused, displaying each his well endowed male member. She seeing but the rears, rumps, flanks, and hanging male organs of the horses, the large bodies blocked her view, as well her walking easily a direct line to the campfire. 
Nervous and uneasy horses flick-whipped their tails at Shanna, some flicking with such force it seemed to her the animals were trying to dissuade her from coming closer, or even to her going by the campfire. She walked quickly along as behind the horses, until she passed to the end of the line closet to a large as tall Oak tree. As Shanna stood boldly looking down from where she stood to the roaring big bonfire and some dozen plus women seated in a circle, the horse all turned there heads to gawk at this young college student. Had she known more about those horses or the people around the large campfire, she would have known an unexpected person was a welcomed fool in a very dangerous place. Summarily, Shanna began counting the number of mounts, there being nineteen extremely very healthy horses. She liked horses herself, having gone with Jack riding quite often. As she stood looking at the fine horseflesh, from out of the dark walking toward her was a manly black figure. “Hello,” said Shanna, she turned her back to the fire and talking to the man coming her way likely saved her then from the impending dangers. “Shush young one, lest you become as fools fodder for the coven!” The man said as he came closer, his manner of walking taking note by Shanna, she staring at him. “I was lost in the forest from my friends and I…, oh holy cow, what are you?” Shanna stopped her ranting and seeing the man clearer, she stunned had to ask the natural question. “Do not be afraid, what you see is an aberration from a spell cast by master mistress, Miss Bella. She uses me for travel as her wild ride here once each month to the coven witches Sabbath meetings. When she gets here she makes of me a black satyr, giving me duties, one is to guard the horses, a second is as security if a victims were to try as escape from their demonic baptism. The third use happens rarely, but it is a good bit of fun for me, the coven witches get drunk and horny, they have need of me and my services! As most of the time I am for her a black goat, what she makes use of as a herd sire on her farm, she has her times when in dire need of a male and I become a satyr for her for as long as she feels horny.” The satyr man said to Shanna, he explaining in generalities what he was and of the whys. “S…satyrs are things of wild and bestial sex raping of women, nymphs and the like!” Shanna said in pointing out what little she knew of satyrs and the daily lifestyles. “Not true,” said the satyr, “if I was that bestial I might have rushed at you, tossed you to the ground and while you screamed, I would molest you until the witch coven arrived and made use of you in a more painful manner.” Shanna went to walk past the satyr goat man, she was about to give a duck-under the rope-line tether, when the satyr grabbed her by the arm. His long and curving Scimitar horns and the light from the fire on his face sent fear into her heart, sure then he would rape her silly. Instead, Shanna received a hard body slam, falling face down to the ground. "Owe," said Shanna with a groan, she with her hefty chest and pretty face slammed down, met a small pile of sharp edged pinecones. "Excuse me dear girl, but where you are thinking of going? I would expect the welcome you might get by the fire would put you as another like to those tied to the tethering rope!" The satyr said, his tone of voice similar to her father, as Shanna felt the need to respect someone of authority. Shanna lay on the ground and looked up at the satyr man. 
“This is all some fool joke!” Shanna said with a smile, and gave a short "Ha," as if to show her personal indignation of humor at such a poor time. She spun around to a seated position, her sweater soiled then, as were her legs by the forest muck and moss. She began to brush herself off in the brilliant glow of the nearby roaring big bonfire. "I trust you know what they seated around by the fire would do if they knew a virgin girl were close at hand. I fool you not by saying they are a coven of witches, here once a month for enjoying a devilish Sabbath meeting and or sacrifice. They come here for privacy each month, bringing along a victim to join us as beasts, so then, unlucky for you to have wandered here on the one night when the witches arrived without a victim!" Shanna looked then toward the fire and saw all were seated Indian style about a dozen or so women some young and beautiful as more were elderly, a mass of wrinkled skin. She hedged her view a little further, seeing there two horns protruded skyward. As she sneaked a closer peek, saw there a very large black goat seated on his furry flat rump. Inching closer to the peak of the hill and the tethering line, the black big goat, sat holding its forelegs out forward into the air. The big animal seemed to be enjoying where as how he sat. He, she noted was quite big at being male. The large goat sported a massive erection, it stood erect, the tip pointing at an angle toward the night sky. Then she saw a head bobbing up and down as it quite near to where the goat sat. Shanna saw there a young woman lying prostrate flat on her belly before the goat. She would prop up at an angle using her arms as support. She with tongue licked at the goat along the length of its erection. Shanna thought first the woman was paying erotic homage to the beast, but as she did her tricks to his cock, he began spurting cupreous amounts of semen. Some of what all came out the erect penis spat into the air as some of thinner consistency, poured down over his shaft making the thing to glisten in the glow of the campfire. Disgusted from what she saw, Shanna again turned around look there at the satyr, seeing as taking note, he being a lot alike to the black goat. She felt sickened, and was about to get up and walk to the group anyway, when from the campfire was heard a blood-curdling scream. The scream stopped Shanna in her tracks. She turning her full attention back to the circle of witches, wondering what happened then. She again knelt down low, peering ever so carefully as not to expose her self, preferring no possible involvement in the worse troubles of some poor soul. "Careful, what you hear is the maiden witch feeling her devil coven indoctrination. She had to suckle and lick the goat devil at as on his cock, until her blew a load. Once he felt as warmed up, he would then shove his cock down her throat, the cock extending to demonic length; it comes out her anus, gagging her. He with his cock would thrust at her, the sensation as said like having swallowed a Python snake. Her sexual hazing usually lasts for an hour, so get used to her occasional wailing screams.” “And you are ah…or were…” Shanna asked of the black satyr. Matthew or you can refer to me as just Matt if you please miss," said the satyr to Shanna, “No family name?” Shanna retorted quickly. 
The satyr named Matt sighed, as he replied, “No…, no reason to even try and recall it, She did this to me eons ago, and considering her powers, I expect this is what I shall be for her for quite some years more. At the first when she changed me I held some hope of her giving me a reprieve, but not so.: Matt shrugging said, “Early on I hated what she did to me, and somewhere along the passing of time I became accustomed to being a stud buck. The being a herd sire buck has its pleasurable side, and when Miss Belle has her occasional desire arise, she makes me a satyr and for two, three days and once for a wild seven days she kept me rigid and able to the duty.” “That is gross, she did this, made you as part animal and you like it?” Shanna grumbling said, and knelt there looking at him as he as her until she noticed something pointy, pink, and peeking out of his furry sheath. “Can you keep it in your pants?” Shanna said as she turned her head away, having seen Matt getting a goat-like arousal just starting. “Maybe if I had pants, but what you see am me, this is all of me. I am comfortable naked, as are they here who are as horses. You noticed some of them gaining their arousals and full erections when noticing you and your rich as musky scent. You are near time to your monthly period are you not?” Matt replying to Shanna and her moralistic outburst, she forgetting the hard truth he was for the most of time living as and being a male goat. Shanna continued to pout, keeping her back toward the friendly satyr. Looking toward the line up of horses standing there tethered, waiting for their mounts to return and ride them all home, Shanna seeing those closest to her as sporting their huge maleness for the same reasons as the satyr had asked of her. Stunned and fascinated for a long moment, Shanna sighed, then said, “Men, males they are all alike, thinking of just one thing and nothing else!” “Why not, as being animals we have nothing more pleasurable to do. Terry, the stud there closest to you has learned to like what he feels from mating; it is different for him since he was born a human female. The silver bridles sued by the witches makes only male horses, stallions, no matter if the person made to wear them was male first or female, they all become as stallions! Terry is his name as was Trina was her name from birth until transformed!” “Impossible,” said Shanna. “Do you see me here…, am I not real enough? If you doubt my satyr form then let me show you what I do best, my seasoned, pleasuring good fun!” Matt said as he challenged the young and tough and a very moralistic girl as to her own deeper desires. Shanna did not like his prodding her with sexual suggestions. She stood up with the thought to walk away, as along the dark path leading away from the bonfire and the tethered horses. “Sit down…,” Matt said, as he added, “Dear girl’, you let any… of those witches catch a glimpse of you here and the whole lot of them… will be up here and take you down there to meet the demon. If that is what you wish, and you think he is some sort of gentleman with young ladies, if that would thrill you seeing a demon, then you got some paranoiac death wish! So go then to them, then stand up, and yell your last few human sounds. Such a headstrong foolish girl, you see me, and believe me not when I can tell you accounts as of feelings when transformed by a witch, her spell, it is something weird, horrifying, and sensually arousing all to once! My changing was painless as it happened. 
Those changed into horses and especially the women who became male and felt the pangs of a stallion felt horrid pains. Miss Bella wanted me to remember my past and know what I am in the present. I cried seeing my bodily form changing, and told then by her as how she would force me to live as I do, the lust-passions of a male animal urged me to want, and want for more, more, and more, me as a male goat insatiable in the extreme...! Indeed, I now like being a goat. My mistress is a powerful witch, given broad abilities that many of the other witches only dream to own. Considering what I am at the farm, she has made it possible for me with a smaller head and brain to think as would a man but know I am a goat." Shanna sat there and in tree shadowed dimmer glow from the roaring campfire, she looking, and thinking, saw how Matt could control his arousal. She began changing her attitude about Matt, he being a friend to her when as if he gave her to the witches they might reward him. "Please do not fear me by what you see; there is much of the animal in me, mostly goat, the real me a buck-sire.” Matt the satyr said to Shanna, “We, you and I are alike, both black, you by race and I of breed, a Boer goat, am herd sire. So many years ago, I arrived here to your country as a visitor. Originally from Ireland, and what happened to me was by the wild draw luck in the cards of life. Everything worked against me, as of my return to a beloved homeland. During my travels, I came upon this lovely Inn and thought to stay there a couple of days to enjoy the surrounding countryside. Ah, but my being away from home masked the commonly seen the markings of where devils lurk. The atmosphere offering tranquility the likes I had not seen in America until I happened to stop here. It began in earnest the witchery on that third morning of my staying at the Inn. Miss Bella approached me, and together we talked about what I saw, and she asking if I found the Inn a restful place where to relax, would I prefer to stay there with her and live…! It began on that evening Mistress Bella, as with her black powers she did seduce me. At the first, I found my Bella a wild and wonderful woman of extreme sensuous talents. She entrapped me by arousing my manly self, and when feeling willing of anything she wanted all it took to begin was a friendly toast of wine and she began to transform me. She took a kindly man and made a lust beast of him in one span of a night until the morning when I awoke to see this, my form. I screamed from horror, she popping into my mouth her breast nipple and massaging my belly she told me to suckle. My beast in me took control and a sexual rutting man-goat lay with her in her bed. She coaxed me to want to love it, and as the day passed, more of me transferred to being animal, goat, and bestial of delights. What you see is when I am at the peak of being human; this is what I am now and cannot ever go home to Ireland." The roaring fire and screeching cries of the witches as they helped hazing and torment given to a young woman asking to become a witch in their coven, Shanna felt needy of some manly comforting. As she turned and snuggled close to Matt, reaching an arm around his furry waist, as her hand pressed at him front midsection, her index finger happened to poke inside his sheath. Shanna felt his cock rise up and the oily feel of the bestial cock-head touch her finger. Matt saw her wide eyes, and stunned expression, and why not considering what all she had seen there that night. 
"Come,” as Matt said to Shanna, “Forget now what by chance you touched. Know that I care, and not just for want of any sexual pleasures or satyr delights. I care for those who are horses, spelled, cursed all to be as know, remembering what they were as of what they are, forced to live as, some made to breed, they like I are of a sad kind. I wish for you in this short while to learn to know me for what I was and am, to appreciate, and think kindly of us, the lost!" Shanna listened and began in her heart to feel for Matt and the others a sense of compassion. Compassion not passion, to her there was a distinct difference, as she wanted to show him her friendship while keep her virginity intact. Then they hearing another more guttural scream come from nearer the bonfire; Matt covered her eyes using his furry hand. Rising higher on her knees, Shanna stood looking Matt directly inn the face, she said, “Thank you for caring about me!” It was at that moment she decided to give in a little to him and his passions, she putting her hand to the lips of him, a black male and satiric goat. “What now, where can I go and could you escape with me?” Shanna asked in that tender moment. “Nay-baa…damn, oh my dear friend,” Matt stammered, he wishing to say something from the heart only to trip over his being goatish did give a horny felt bleat as he spoke to her. He continuing to talk said, “But what you see is what I am and being like a satyr in form, I would not be acceptable in a human society. The myths would taint the ideals of people and I like this would be for all a thing and not a man. My having been changed of form, too much of me has learned to like living as do an animal, me a very virile male goat so much of the time. What I am is a body, mind and spirit changed made bestial of desire and being of this as something permanent! What wild and beastly sensual frolic she derives from those there by the fire. As soon, she shall come up here and my Mistress Bella will make me of what she has needed. I shall stand here like the others, as deemed, me then being in my large male goat form and she shall ride on my back to where she calls home. The meetings bring on her a sadistic zeal, when comes the morning she shall return me to this satyr form and insist I do to her many lustful acts. Inescapable of her want for passion, lusts, and sensations shall I be while we touch. Ah..., I sigh from longings wishing to die and cease this interminable sense of lust and desire, this was not the real me before I came to this country. When she and I have returned to the Inn, Bella comes, she refreshed, smelling sweat, is as horny. She loves the way I in this form can service her. Though I then was tired from the night run and exhilaration, though weary or no then, she insists from me we get it on several times." Indeed, as of all what he told Shanna of his past and of the evil danger that sat just a mere couple hundred or so feet away. Shanna sat engrossed and listening then to what she was still questioning as real. As then from near the bonfire, came a short if shrill scream. This scream died off quickly, as the entire group of women jumped up, and began to dance wildly around the raging bonfire. Scared and inquisitive, Shanna peered out at the bonfire with apprehension and renewed fear. "By all that is of the unholy it is done, they did it again to another one!" Spoke another voice there, one with a deeper tone. 
The hearing of some other voice there made Shanna whirl her head around, she looking, was scared for a witch seeing her as there, and she to them as a vulnerable virgin. "Be calm, take due care lest if they around the campfire were to discover you, you too then would be likened to one of us!” “I need to help her Joey,” Matt said, he looking at the stallion standing closest to him and Shanna kneeling together. “See there, this fine stallion was as is Joey, as his witch still uses his real name for him. He talks often of it to me, as likes to reminisce. What he would tell is of that fated night and how he, she the witch captured him. He wanted to see a neat movie, it at some x-rated creeper at a cinema for one night only. Youth stubborn to his own wishes he waited until all at home went to bed. He thought to be cute and sneaky, snuck then out of the house. He walked the road trying to hitchhike, when an elderly woman stopped to offer him a ride. He thought it perfectly fine, as she being elderly looked to him so trustingly. They talked, and she offered to help him along to where he wanted to get to if he would help her unload what she had in her car. It seemed then a reasonable request for her letting him ride to town and get to see his movie. His witch drove to an estate farm, he remembers of it. She drove to the stables and before she got out of the car she leaned over to him and he though to give him an affectionate kiss, she slipped over his face and head a silver threaded bridle. He remembered nothing of what all happened until his rude awakening. Joey awoke lying in a barn stall, a halter snug on the head, he of body and future as one changed completely of form, but not then of mind. He of his new natural state is as a Pinto stallion. She has need of a strong as young male horse to mount and to ride to the Sabbath meetings. Why she has him as a Pinto and when comes the night of the meeting she changes him again, he as you see him an all black stallion like are the horses for the other witches. Teenager that he was, his becoming an animal a horse and stallion, his witch makes use of him as a stud breeder. As what he would tell you were…,” Matt was saying when the tall stately stallion swung his head to give a stared look at Matt and Shanna. Joey the horse then said, “I will tell her, dear girl what you see of me is a brute animal, I have had to adjust and learn what she my witch wants of me to do. I hesitated at the first, but after several mating sessions with mares, I changed of mind and desire, as realizing for me this being as horse and animal is…the real me." Shanna sat kneeling by Matt, her arm and hand laid carefully around him and not this time with her finger poking at his sheath. She marveled at seeing the mouth and lips of a horse move so corresponding to his verbalizing of words. She sat tolerant enough to listen as hearing of plights worse than she ever imagined. Joey a teenager stood there the equal to a mature horse, undoubtedly a stallion as he spoke his penis dropped from out its sheath and it swelled into a massive erection. She listening watched, seeing how Joey stiffening his penis it lifted to give a bump at his furred belly. He relaxing it, the meaty cock swung quickly down as flopping back, bumping at his hocks. Joey did his penis flicking several times until he finished speaking, as from his stiffened penis, he them masturbated a stream of ejaculated stench. 
Sexuality a definite part of what all there were by design of their witch masters, of what Joey spoke thought it seemed still as not real, he, Matt, the other horses were something Shanna could not deny. Then Matt began to speak, he said, "Joey got bold and tried to escape once, but was caught and the horror witch, Andorra had him made him senseless for it. He was alert while remembering, forced to act as would a stallion, did breed, she keeping him like that for three years before allowing him his rational mind its return. See too there down on the end of the tethered line, see the dapple-gray, his name be Ernie. He too tried to escape, but his witch mistress wanted a fully active human mind in her mount, she making use of him as does Mistress Bella does with yours truly. Escaping here is as née to impossible. One or another of the witches assuredly will find the then wandering animal. The witches then either geld the escapee, and or they divine upon them to be as they became for a much longer length of life. They remain, as are animals of form but still quite the human in a rational level of mind. Either outcome assures the witches they get to own a humiliated animal doing what it must. A few of those standing wear the silver strand bridles on their heads. They though spelled to become as horses, as when comes the light of daytime, the witch returns them to their own homes, their beds. As then they live a life of apprehension waiting for the next month when one night late the witch comes again and claims them to ride to the witches Sabbath," Matt saying tried to explain, looked then a sad, his head drooped, a cloven hoof showed his anxious feelings as he pawed at the forest dirt. "I would help you all escape! If I were to untie the tether all can run away, they cannot recapture all of you then!" Shanna said, she starting to reach for the knot tying the line tether to a tree trunk. "No dear girl, we are what we are! Joey and Eric are now horses and shall be whether they ran or stayed. The others who wear the leather bridles with the silver star, the star works like global positioning, it sends our a signal that only devils can use to find an escapee. As caught, and not mattering whether the captive of a witch or human, the removal of the bridle causes the wearer to lose their will to endure. Once having lost of all personal resolve, they the captive animals become as then willing to do as whatever a witch would wish. If they were to run and regain their freedom and then were made captive of some human, sent then to the smithy and shod with iron shoes, they would remain, as is their horse self! Now look at me here! I was a man, seduced with sexual perversions to be now living as and liking my being a goat. Bella likes her lovers in this way, allowing still the remembrance of fond memories and a working mind. She comes almost daily, is all horny and lusting, we do it in the barn or by night she takes me to her bedroom, and there...well you can imagine. Dear girl, if you could see your face right now, such a look of innocence, but youth or age means nothing to those witches. They use people as little more than objects of, and/or for their wicked need, and urges." "That is unbelievable," Said Shanna, "and how can the loss and numbers of missing people go as unnoticed, would such not raise some sort of concern?" Matt almost made a grin on his goat lips as he said, "Misery or bliss, Ernie there he told me was a government worker as a census taker. 
He walking went door to door dutifully doing his interviews. He was nearer to fifty as his age when by chance he walked onto the property of his witch, the interview was a short one he said. A glamour hazing of him by the witch, he remembers drinking an elixir. Then the fear filled walk to the stables and once there, he put into a box stall spent four days of his life in horror and agony. Rejuvenated his age and physically transformed, his fears faded by purpose from what he drank, those can do nothing but accept the fact they are then animals owned of a witch. What his mistress witch made of him, he became a spirited mature two-year-old Arabian stallion. The use of an elixir permanently transfixes a body into a different form, changed of age. Those, they remain interminably the same age. Ernie is a young and exceedingly virile a horse stallion, made use of for a mount but kept daily constantly at stud! We the changed have nowhere to go. The witches do with their changelings as does Bella with me. She keeps me, her sex pet in a pen or pastured. I then when at her farm become a complete male goat, doing as needs required by my beast-driven desires. Such sensations of brute passions a human cannot understand. As then by a scent to my nose, am doomed and forcibly persuaded to react, doing as would any a buck goat enthused for his chance to feel and be at rut. The others too, those being permanent as horses or other animals have a reasonably good life, all kept healthy after having become they are! The witches take pride of owning beautiful horses. They adorn their mounts with silver ornaments, ride them in parades, and go to their monthly meeting. She the Queen of the Witch coven came last year to a July Sabbath. She we know as Berea and coming to the meetings riding the back of a huge boar hog, another lost and doomed man!" Matt told Shanna. Shanna asked, "But you all are males, stallions, two as geldings, a goat and..., why would they want me as an animal, look I am female!" Shanna announced, saying it rather boldly, flaunting at Matt her generous bosom as part showing her gender. "Ha, wrong assumption dear girl! Take note of the two golden colored horses near the far end of the tether. These two are sisters, both being by day as young women working at Scotch Inn. They come regularly, having I suspect accepted their lifestyle as maidens and horses. They try never to escape but speak kindly about the witches that hold them here. I suspect they like it when every other two years or so the witch makes them stay a mare and are bred to then bear a foal. The Inn too is a dangerous place to stay, as for many who sleep there and then go to a free breakfast find them changed into donkeys. No matter as if they are male or female. The witch spell binds them to liking what they became, as then they do with you as their needs dictate." Matt explaining of what it is like to be living of life with a witch. Shanna scooted herself closer to the large tree that held the tethered line, it holding each mount standing at ready as willing to serve their witch. Her lower lip was shivering, showing just how scared she was, feeling the danger and coming to realize the reality of withes, devils, and of God in Heaven. With legs tucked tightly against her well-endowed chest, she wrapped her arms around her knees, and sat gawking a stare at first the face of a black satyr-goat, her gaze trailing off to eying a look at Joey and his sheath backed up to his large balls. "I feel sickened!" 
Shanna said, as tears welled in her eyes and a fear that she might be doomed into some animal form. Then the sound of laughter and a groaning of something likely now less than human, as came from near the bonfire. Matt picked his shaggy head up to try to see, as well did all the entire line of horses, and took a sudden interest. "Oh dung heap, they did it; she is as damned as are we!" a horse groaned, he seemingly knew of who the witches had near the fire. "Quiet or they will be up here showing off their victim and likely claim this young girl here for the next to be damned!" Matt chided a friend horse, it speaking as from the middle of the horses. Excited too, Ernie began to stomp his fore hoofs, flick his tail, and became rigidly erect where it counted. Matt lets out a begrudged sigh, "Dumb bust it, poor Tina Wilkes, she was such a nice young woman too, and our Veterinarian!" The scene of all the witches surrounding the one they had done their devilish desires upon had them step to the outer ring of the circle. All stood quietly by as near to the fire was a sleek and sassy looking, red roan horse. She was a marked duplicate of Ernie, they were as Arabians, but for Tina she was a mare. Shanna peered over the knoll of where she sat next to the large tree, and close to Matt. "Tina Wilkes, I know her, she sings at church from time to time, has a nice voice too! Is that her now, did they make of her to be a horse?” Shanna asking said, as Joey chimed in to her knowing Tina, as he muttering said, “And not anymore!” “Not any more means what?” Shanna asked Joey. Joey looked more at Shanna as he said to explain, “She sang in the church choir, but not anymore, I imagine they do not let horses into church!” "Yes, I overheard some talk about Tina discovering Auntie Mayer doing her worst to some salesperson, as she made him to lick out..., ah pay homage to her. She then changed him of form to a bull Mastiff dog to service her when she felt a need. Tina had come to that stables on her monthly appointment but unannounced. She saw there Auntie Mayer of who otherwise could have cared less might dare to come and be a witness. Her stable is quite large and the breeding to horses there makes her rich. She comes often to the meetings bringing naive travelers, men, women, couples, even elderly people, and none ever leave the Sabbath in an upright stance!" Said Matt, he looking at one of the horses sporting an erection too, said then, "See there the black stud, he is Kyle and has had quite the love crush for Tina. He then is quite able being as a stallion to show her a good time." The new mare stood pawing the ground and obviously nervous, as she had her full faculties, knowing and realizing, understanding what everyone said. She then too, knowing of the sensations after becoming a young mare. She was realizing what it felt to be as one of those she so lovingly cared for, she a Veterinarian the past years since graduated and having come to the mountain town. "Damn it, you are all lusting for her! From where I sit I can see a lot of elongated erections, some swinging as libido dictates, you are acting just like animals!" Exclaimed Shanna, aggravated by a broadening situation she could not begin to fathom, her voice grated and showing her fears, she spoke a bit too loud for her own good. "Shut up girl, if they hear you, Tina will have another stable-mate, do understand me?" Matt nearly exploded at Shanna, his gravelly satyr-goat voice telling of both concerns and of anger. 
"She is right Matt; as I would not mind having another nice neat and sleek mare for to fill my paddock." Ernie spoke up, he giving Shanna a wink of his large brown and liquid eye. "I can see your big erection from here, you are damn little more than some rank and bestial animal! You act and lust like one, getting all thrilled at the sight of some poor female made to live her life in some hollow beastly body." Shanna showing her feelings for Tina, being upset, she returned the insult to Ernie. "Quiet you two, the witches hear us chattering and our ability to speak will up and vanish. Shanna for pity sake, do remain quiet. Ernie, you too, cool your raging passion, remember we did not ask or want to be, as are, neither did Tina!" Matt chided all, trying to keep what they had and let Shanna have the fortune to return to her home and family as something other than a pet. Ernie snorted, he whinnied loudly, as if wanting, he gave a call at trumpeting his desire to mate and own yet another mare in his growing herd. Hearing Ernie sounding off as excited, the witches all turned round, pointing and laugh. One of the witches got up and took leave from the Sabbath meeting, she coming to fetch Ernie. She had a plan for seeing him mate there by the fire with Tina the new mare. Her plan, to impress the goat devil there with for her master, he as had before would bestow on her great honors, while make the transformation of Tina to being a mare as something permanent. Seen then was the witch named Mabel, a young witch of the coven who came walking up the knoll to fetch down Ernie. The entire coven was hooting and carrying on like a pack of sex fiends, making lurid comments about Ernie and his long Arabian shaft and the like. Ernie did look eager, he having remained for such a long time as horse and stallion. He had accepted his horse self along with the way of life, and made use of constantly by the witch who owned him, he a stallion at stud too! The other horses became nervous and excited when Mabel loosed the rein to Ernie, she led him prancing down the rocky knoll, to meet and then mate Tina. "This is disgusting," said Shanna rather loudly, "Everything they do and you all here hinges on sex and things sensual!" Matt waited a moment until Mabel was a bit further away before he spoke. "Dear girl, do prey tell what else is there for an animal to do? We can graze only so long before we get our fill! Then what, we do not play golf, go swimming, bowling, out hiking with our friends, or sit around a campfire telling jokes. We are now as animals, those of us who are permanently such, other than eating, we have sensual naked bodies that feel and do things in rather instinctive ways. Sensual pleasures are but a way to work off the pangs and anxieties we get from the weather, or horny beasts, and even some perverts that come to us by night to thrill their zoo pleasures. Sexual sensation is a healthy part of living life, and I cannot damn Ernie for wanting to enjoy a neatly made mare. Tina there, or any other mare they might wish him to mate." Matt scolded Shanna, he wanted to keep her safe from harm, but as nice as she being as black too looked to him, he wished her to keep her humanly beauty. The noise from Tina Wilkes meeting a stallion she doctored; not having any inkling he was once a man, she began to kick up her hind legs at his courting advances. Bella approached Tina, putting a black leather and silver studded bridle over the mare' head. 
She tightened the straps and once seemingly set, Bella called upon some power to quell the mare to know only equine passion. This did the trick as a very calm and willing Tina Wilkes, then an Arabian mare, stood hind legs spread, tail held high and to the side of her rump, vulva winking, she pissed a trickle to entice her stallion, was then ready as willing to accept her mate. Every one of the witches stood and cooed as the vibrant stallion climbed aboard his new mare. Ernie had a lot of experience, just two jabs and his eyes lighted up like light bulbs. Snorting he squiggled himself into a comfortable position, then doing what came natural. A wild eyed and spellbound mare stood mentally horrified over what she felt. Her education and skills were no longer of much or any use to her then with her being a mare. "Matt, you see that, you see what Ernie is now doing to Tina, Matt, oh Matt!" Shanna said she seemingly enthused at watching two beautiful horses doing what was natural since time began. Matt laughed softly, and said to Shanna, "Now who is the pervert?" Shanna slumped down to her seated position again, she began to pout, and reaching into her pocket, she took something out. Matt cocking his black and shaggy head to give a look at Shanna; he sniffing the air made his yellow square eyes widen, he said, “What do you have there?” Munch, munch, "Dark Chocolate," Shanna slurred out the words, her mouth filled with a gooey slime of some candy. "Are...you...eating...something...made...of...chocolate...and dark chocolate...too?" Matt speaking ever so slowly, and with stealth and caution he did ask of Shanna. Shanna swallowed the lump of candy, licked her lips, and then responded, "Yup, its some pieces of Halloween candy my parents were giving out this year, why?" Matt shook his shaggy head, "Chocolate, and the dark kind at that, what luck, what damn good luck; how many more of those have you been holding out?" Matt asked his voice sound somewhat irritated by what Shanna was eating. "Behold, my dear girl, you hold in your pocket the one talisman that can give protection to someone spelled into being an animal; or cursed to live as one the rest of their natural life. I say Chocolate, and it dark with more cocoa powder makes the very best for what ails someone such as we here. How many do you have in your pockets?" Matt asked Shanna, the urgency in his voice told of just how important this revelation might become. Shanna began to search her fall jacket, it having several pockets stitched in different places. "1...4...6...10...eleven, eleven in total, but now what, can these be of some good to you?" Shanna asked Matt, she looking wide-eyed and willing to help, if she could! Matt gave a sigh, then lifting his head to give a check on the equine orgy and Sabbath goers just below the knoll. "Please, dear girl, can you remove the wrap on the chocolates separately, giving one each to everyone tied to the tether line?" Matt asked of Shanna, his voice then calm, but showing some urgency. "Yes, sure, but what good are my treats for these here?" Shanna asked, she beginning to sit up, as she peered out at Tina and Ernie in the midst of their second round of fun and frolic. "Chocolate, I have learned works as a talisman on those spellbound, be they anything, but in our situation as animals. If you give a spellbound animal a piece of chocolate, and the best of them is Hershey, the spell is broken. However, what is better than best, they cannot ever or never be bound by such a spell again. 
It is a “REDEMPTION,” from our bestial existence!" Matt spoke of his plan as being excited his goat voice carried some, alerting those as horses of something to stop their insane, spellbound, situations. "Oh, wait," Matt, said he turning and with setting his fore hoofs on the crest of the knoll, he began to verbally count. "Eleven candies..., and...many more needy souls," he sighed, looking down along the tether line, wondering who should be saved from a life of bestial existence. "I'm sorry I ate the other two, never could have thought they would...!" Shanna said, her voice picking up the sadness of the moment. "Matt?" said Joey with a slurring manner, "I will stay behind, let another take mine, my balls are on fire to live this life, I could never enjoy being human again, disgraced, and it has been what... fourteen years since I became a Pinto; why everything I knew or had is gone, deemed worthless, now!" "Ernie, he is busy with Tina, begin the giving with they who were sisters, then to Richard is number three, Kyle four, Joey five, Martin six, Stan seven, Beverly eight... she too became a stallion, Frank nine, and me..., as number ten; and we have all along the tether given your soul saving body reprieving chocolates!" Matt counting, took a tally of all there, who unknowing of what they would eat as given would save them from living on in their equine doom. Luckier for some that the witches led away a few of their mounts, it gave most then some there a chance at a better life. Those horses led to stand by the bonfire, added cause and affect to Tina, urging her to a fuller submission as would a mare to her stallion. They who be standing aroused by their witches, need be deemed as left as they are, lusty, beastly, and living on as a tortured humans in animal form. Matt moved his hind-like goatish legs slip in the dusty dirt until he sat seated on his flat butt. "That settles that, I gave them all a piece of chocolate..., except Joey, he abstained, kept his lips tightly closed." Shanna said, she nearly exploding with her excitement, her voice loud enough for all the tether horses to hear, and as more than the horses heard her shrill voice. "No thanks,” said Matt, “My life too is as well gone a long time hence! I am what Bella wanted, a sex machine animal for some thirty-two years. My dear do give my piece of the chocolate to Ernie, or whoever, the witches return to the tethering line first. I am willing to remain here." Matt layback letting his legs spread and his big goat balls lie on the cold ground. He looked very sad, but seemed as resolute to stay a sex toy for Bella. He gave but a hand waving motion, as if telling Shanna to give another horse there, his assigned piece of her chocolate candies. As offered to each horse took and lip gobbled up the sweet and tasty treat, some of less alert state did not know it was going to work on them as a talisman. One by one, the horses all shrank in size until ten naked people lay or set next to the others, all thanking Shanna, caring little about them being naked. The rounds of thanks became a bit too noisy, alerting the orgy and witches below the knoll turning to look at seeing what up there was happening. Matt lying there and Shanna kneeling beside him, she looking at him began to pet. Her hands wandered to what she wanted to feel of Jack and know pleasuring too, knowing what her father would say, but willing to do it anyway. She saw the protruded furry sheath on Matt and worked her petting to pet at it. 
His eyes looked at where she touched him, he shaking his head as if to suggest “No, be careful!” As carefully, Shanna coaxed the sheath to give forth its inner red shaft, the thing as bestial as that of any male goat, but bigger, longer, the perked head a nut shape that she felt duty-bound to finger, trace, and pluck at the rounded opening. Noises and harsh words said by those nearer the bonfire, some having noted their horse as gone, let loose from the tethering line. Aroused and liking what Shanna was doing to him, Matt jumped to his hoofs, looked toward the bonfire and yelled, calling to Tina. He told her to run, to come and gain her deliverance from a life of sex, grass, hay, and oats. Hearing Matt, Tina hunched and lurched forward, jerking off and loose one Ernie the Arabian stud. All there by the bonfire and up the hillside heard his shaft pop out of the mare, sounding like a large cork coming out a bottle of old wine. A wild scramble of four new horse legs and huffing in big equine breaths, Tina crested the peak of the knoll just ahead of the witches. Shanna removed the wrap from the eleventh piece of dark chocolate, and shoved it in the gasping mouth of one veterinary mare horse. Tina chewed it up, munching with some sense of promise and delight. The coven of witches nearly flew to encircle those lying together in a huddle of fear. They took hold of Shanna, and placed then a magical controlling noose about the neck of Matt. Bella was the most disgusted of all there that be as witches. Pointing her arthritic index finger at Matt, she damned him for being the crux in this coo. She then turned and stepped to look Shanna in the face, eying the young woman, she took a grip on her arm, Bella announced, "This one shall take the place of our loving Tina; you my dear will become the foremost brood mare in this county!" At that moment, an aroused Ernie topped the knoll, his giant male shaft flagging, slapping the outside of each broad thigh upon every stride step he took. "Now get, get, and get the hell out of my sight!" Bella looked over her shoulder and screamed, as all who were again as humans, they naked stood up and then ran off into the black night. Then Matt, Shanna, and a re-morphing Tina were there along with Ernie the Arabian stallion, and three other horses still doomed to remain as horses. Turning then very slowly, Bella looked over the three who together had ruined her night of wild plans. She first gave an evil eye at the stallion Ernie. "Yer thar' stud, yer as I shall leave Ye are; nothing more be needed now, I can see yer' aroused and affrighted," Bella grumped, and then gave the horse a slap on his strong muscular neck. Hearing that, Ernie strode a slow few strides to stand and face the group, wondering as watching of what would come next. Bella then cast her eye down upon Matt. Matt tried his very best to look and act as if he were lacking any knowledge as to how the coo became ratified. Bleating twice, and then with his red shaft still out to where Shanna had coaxed it, being highly nervous he began to urinate. He stood stiffly for a moment, until with a great amount of satyr-goat like agility he twisted his head down to let the hot stream of urine douse his head and horns. Bella laughed seeing this very normal male goat action. "Yer not that stupid yet, thar Matthew, I know yer can speaketh, and yer wits are still thar and alive," Bella said, eying at the big black satyr Matt as he raised his head from out the stinking shower. 
Then Bella with her dark orb eyes fell upon Shanna, "Yer a cute beauty, different than the normal folk we see come into these are woods. African decent ye are most likely, such skin has sheen to its blackness! Ye are the culprit; ye gave them all the blessed candy. It be the goat who knew enough to inform about the trick. So it is for ye to bare my anger, and yer not going to like what I have planned." Bella said as she stood speaking and eying Shanna, as from the bonfire came then all the other witches, they all grumbling over the loss of each one's favorite mount. Witch women, some young, and many old, all angry some held an aura of broad power. Bella waved her arm for the other witches to come and see the reason they all were afoot there, them deep into the dark recesses of the forest. "Bella, what do we have here?' Said one witch, she standing above all the others, near to seven feet tall and dressed in a sheer, tight form fitted gown with a cape made of the same see-through material. Scared of the witches from he had seen them do with their abilities and pleasure taking of the unwitting and unsuspecting. Matt held his erection flared stiff, bolstering itself to an even harder state. Nervous as well, for what trouble she helped caused upon the Witchery Sabbath meeting, she, Shanna stood there rubbing her hands along the thigh-tight blue jeans. Wide-eyed and her fearful young eyes were a sharp contrast to her normally warm as smiling complexion, Shanna stood there fearful of what might befall her. "These two are mine to do with as I do please!" Bella announced to her counterparts there, they group coming to stand close, surrounding the girl with her satyr-goat. "Bella be reasonable...," said the tall witch, she eying Shanna as one might visually inspect an animal at auction. "No, hold up there, that satyr goat is mine, I rode him here and he is what will take me home!" A witch yelled, she pushing her way past the taller witch and came to stand by Matt, she beginning to pet his head as if feeling some sordid style of affection for him. "Shush Celina, my buck goat has his full and working human faculties, he knows what he is, what he was, and foremost, he has his plans to make as much trouble for us as what I made for him." Remarked Bella, her dull farmer like drawled, a mid-western manner of speech, she could if desired change to him back to being a college educated person. "Hey, I lost my cute American stud tonight, reared as broke and trained him since he came selling his carpet sweepers, and he, we enjoyed our night-time romps here. Now he is gone, his charm removed and I shall never have him again, damned shall you soon be there girl!" So then grumbled the taller witch, as she reached her bony hand to try to grasp at, and take personal hold of Shanna. As that power wheeling and bony-fingered hand took a grasp at the tight T-shirt covering the ample breasts belonging to Shanna, Bella took a swipe at and broke the hold. "Penelope, this one is not yours to do with as ye please. She belongs to all of us to make sure she knows our dissention of her blatant act of making our lives a good bit more un-blessed." Bella rebutted her tall witch friend, and with her hand grasping the bony hand holding on Shanna, she broke that hold. Together the coven standing around then cried out in unison, "They are ours to do with as we please!" "Not so fast!" a booming voice said, it turned heads, as the words seemed to come from off in the dark. 
"I demand a say now too, the goat I know to being more a kindred friend, but the girl, she I would wish to know as a lover!" The coven parted as a dark form sauntered through their midst; Shanna gasped when she saw him, he coming to stand there beside Penelope, both the same height were tall and imposing. "Yes our Master and Ye the high councilor to the devil council," said Bella, she bowing to and as well all there bowing of falling prostate before this dark form. Tall Penelope fell to her knees, as kneeling she still stood as tall as was Shanna still standing there by Matt that big satyr-goat. "S" curve-shaped horns stood high from where they joined the skull of a huge, as well muscular form of one big and black colored male goat. Seeing him there so close, Shanna realized this as being, the goat the witches sat around at there Sabbath orgy. He, this master of the coven held such a place of honor, even the hag witches bowed to him as if they were nothing and he were royalty. Scared for being that close to the grand master, a soft "Baa...," Matt bleated, from his many visits to the various Sabbath meetings, he had witness what this goat did to people and taken of some away as burnt offerings to his master. "Holy shit, what in God's great name are you?" Shanna remarked, she eying the creature, as from the large cloven hoofs, up his legs to see the huge two dangling goat testicles and his groin; it sporting a sheath the size common to a draft stallion. She eyed him as she cocked her head back to look along his torso and seeing how shaggy his black goat wooly self was. Eying down to give a good look at this his prize, the grand master of the coven did shy away at hearing Shanna use the name of his worst enemy used in his presence. As the echo of that young female voice faded out into the forest, the devil goat turned back to look at his prize. He smiling, he too reached out a foreleg with a bony cloven hoof that transitioned instantly becoming a human hand, and did pointing the index finger. The creature pointed at Shanna, he reaching closer until touching her, poked harshly at the nipple to one of her shapely breasts. “You have beauty more than just skin deep to some, and you my dear girl are one of those rarer females with an untarnished soul. Although clean in spirit even a pure soul can become as soiled. It is time you begin to feel then a want to become soiled, and by me!" The grand master witch said to Shanna, and by the touch at her one nipple, it swelled and stood puckered, causing her T-shirt to tent, a definitive note as how turned on this pure girl became. Shrieking at the touch and trying to turn away from the goat creature touching her again, Shanna felt some sordid sense of power engulf her breast. All there close to Shanna heard a burbling sound as if the breast were filling with milk. The same strange sensation flowed through her chest to a moment later caused a similar affecting feeling in the other breast. "No baa...ck off," Shanna said as her one nipple puckered, protruding outward, looking then to her, and of all watching as if her breast had on it a full teat likened to those usually seen on a nanny goat udder. Shanna saw as felt it pucker. The happening as seen and felt, she swooned when feeling the other breast do the same. Looking then up into the face of that devil goat as her sensations hit, and opening her mouth she said aloud, "Baa...no way...baa!" 
She hearing what she said as how she said it, sweaty-nervous girlish hands then clamped over her mouth, she astonished of what had issued from. Shanna looked toward Matt, and began to wonder if she too might join Matt in his being as a goat. The tall goat creature then smiled, he tilting his great goat shaped head to look closer as down at Shanna, he winking one of his rectangle shaped yellow eyes at her. He saying to her then, "Ah but Shanna is a fine name! As finer, still for what I would wish for you, as I on occasion like to have some sensually pleasured times. Being as I am needs for one to keep me groomed and in fine form. You dear Shanna, it shall be for you who shall comb me, brush as fluff my wool, and when I feel the urge, you shall come willingly, readily giving your beauty and body to me for me to pleasure use in bestial as lusty sordid manner of lovemaking." Spoke the tall goat creature and Demon, as he expected taking ownership of Shanna. Matt looked up and over at Shanna. He stoutly bearing a monster of his own had that male goat erection. The red shaft wagged as it pulsed from his wild sensual gyrations. The bestiality put there into his soul by Bella helped well to capture his soul in a tightened grasp by the devil there. Highly aroused, being as if born a goat, Matt strained against thinking domineering ideas about Shanna, as if she were properly a nanny for breeding and impregnation. Bella crawled on her knees to one side as the tall goat creature stepped a half stride closer to Shanna. He stood then with his fur covered belly touching her chest and breasts. He with his head cocked and with one eye looking down at his probable prey, he took his hand and index finger again, and touched it to the forehead of Shanna. "Verflucht und gesegnet, nehme ich thee Shanna, um einer von meinen, die Königin meiner Herde zu sein!" The goat creature said, his words translated called to that his Master, asking as if of Shanna to be as such a part of his growing herd of females and male goats alike. The rush of temporal thoughts about what mat be for her future scared her. She shuddered at the want placed in her mind by the goat devil. She was righteously scared at the building delight in her, gaining an understanding of his delight becoming as her desires. She strong in her religious faith knew in her heart if she bowed to the will of this devil, her life as form would change to be as his consort and a sex slave. Shanna thought hard to fight against the sense of want, she worried her future was turning black. In the distance were the friends of Shanna. The three were looking for their lost friend, as well calling her name. They came with each an unanticipated knowledge of fear. They were nearing where the witches had their wild Sabbath, it an orgy, and there that devil goat. The Master as well Bella looked as pleased, he making the motion to the others there that they go and seek as find replacement mounts for those who ran away. Beating the bushes with a hefty stick, Paul in the lead made his way through the forest undergrowth, he continually calling loudly, "Shanna, Shanna!" A football field distance away from Paul and Jack walked along a trampled path through the forest. He too called out the name of Shanna, as he walked onward. He moved with apprehension, seeing along the path dark had landmines, he stepped on as in many piles of rounded road-apples, the common leavings left by running horses. Friend Katie like her good friend Shanna was a tough cookie! 
She walked along another pathway, but stayed to the outside of it, she having forlornly stepped once into a heap of something smelling awful. The threesome walked along in the forest, they calling out for their lost friend; each knowing where was the others. Little did they know or would they think it possible that as they walked they came closer to their meeting perdition. Matt turned and gave a cocked head goat look to Shanna, as if asking her not to cry out to her friends. Shanna noted the odd expression of Matt. She had an idea as of what. He had gained more empathy with her, a similarity coming from who owned him and how his owner used him. Bella stood there looking off into the dark forest; she licked her lips with anxious delight, expecting soon those who came searching to give rides to some witches heading back to their homes. "Bella..., please, do not spoil this exceptionally emotional moment for my sweet new protégé. Ah, but I see she longs to express her fears, her building sensations, her pouted nipples; she is feeling the urges common to a nanny goat coming into her yearning time. Come closer to me; press your body to mine." The coven Grande Master devil said, he the familiar sent monthly to bring evil focus to women as willing to pleasure pleasured. Shanna tried to hold herself back, pondering still her delight for Matt and liking him more than she thought of Jack. Fearful, she stepped back from the goat monster rather than to come closer to his darker regions. Wanting her freedom, Shanna took steps backing away far enough out of reach of the goat creature. She wanted to cry out, to call to her three good friends and warn them of the horrible impending danger. He had his own ideas for her. He showed her his lust openly, wanting her soul, if he could force her to beg him to pleasure her sensuality, for him pleasuring of her. Pleasures felt as given a demon of his status and caliber he would never get when in the dark dungeon that was his home. The growing ever louder calls for Shanna meant her three worried friends continued their coming closer to grave dangers. Shanna felt the alluring pull of the demon trying to work his will upon her mind. Fearful she would again utter that pleading tone baaing rather than speaking a defying word "No," she slowly reached inside her jeans pocket and retrieved a small as broken piece of her dark chocolate. The Grande Master stood staring at Shanna, his mental ability reaching at her human mind, wishing to pluck from it favored memories of when she felt, or had some lurid pangs when one of his realms came close. Instead, Shanna fell to her knees and folding reverently her fingers, prayed, pleading to higher powers for aid and deliverance. "Come darling, it is not the time for religious foolishness. The melding of your spirit and body into a form bent on lust and of things sensual is the now!" A coaxing Devil spoke to her of urges he wanted placed into Shanna. He expected some anxious apprehension on her part, as she like he knew what would happen then and when their bodies touched. Hope and faith are great advantages, but passion can go either way, as what Shanna did to stroke Matt aroused and then to finger his maleness, her angelic purity faded enough for the Demon to grasp at her soul. "No..., baa, no..., baa, oh please...baa, Matt..., baa, baa... Matt please do something to help me, no..., baa, baa...," Shanna pleaded for help, she looking to Matt but he could do nothing other than stand his ground. 
Defiled of his soul a long time hence, he was powerless to do anything to try to stop the Demon, as if he were to try, the monster would kill Matt. A half cloven hand reached out with a shaggy black furry arm to cuddle closer his prey. The Grande Master Goat devil had massive strength in his form. As he moved to pull Shanna close, knowing of her apprehension but not of what she just ate. Shanna gained a vision. Her vision was of what would come as accordingly to the will and mindset of this devil standing there before her. In her mind, Shanna saw her rolling about on green grass, in a wide pasture, and she feeling the steel hard rod of the devil goat cock rammed inside her body. The vision gained in intensity, she began to fantasize the situation, reeling with uneasy feeling, but finding her will failing, and the lust to of an animal, it building inside her chest. Goosebumps from fears realized popped all over her body, Shanna felt this and then realized they were more than just from shivering fear, she felt fur sprouting all over on her skin. Horrified while at the same moment feeling elated to the becoming of how she might look, savoring the thought when partial, or completely, she would stand before the devil, she too being in part the lusty form of a nanny goat. "Master, she is fighting you, she ate something, it likely the talisman that released all the mounts to being human again. She is determined to fight and only a potion administered now can capture her body as then mind and soul!" Bella exclaimed, she telling of as reminding the lust-blinded devil of something he was ignoring. Matt yelled, "Yea...Shanna, fight it, less when if you let him touch his red rod and it enters the gates into your heart you shall become his toy!" Matt said in a whispering of his words, but loud enough to inform Shanna and let the Demon know too. The Demon was not without disgust for the big satyr-goat, he flicked some quick powerful thought at Matt, sending the satyr into a raving fit. Frothing salvia from his mouth, Matt felt his lust forced upon him as he sensed his already massive erection was gaining girth and length. Instantly Matt had a stiffer an erection, his sense of passion reaching such a quick heightened need he turned, rising upright on his hind legs and walked toward Bella. Matt came sexually into contact with Bella, she his witch, flighty but horny, she spread her cloak and let the big buck slide into her his shaft then thickening as mightier. As if planned in the annals of Hell, the moment when Matt and his hot, oily, red rod touched the nether lips of Bella, the Devil expected to feel the same coming from him impaling Shanna. His hope for the moment did fade, as Shanna and her faith won the moment, she slapping away his goatish cock, and when it returned to her face, she sunk her teeth and bit the end enough to draw black blood.
http://docs-lab.com/submissions/579/around-the-campfire
2017-04-23T09:50:35
CC-MAIN-2017-17
1492917118519.29
[]
docs-lab.com
Restrict shipping

One question I get a lot is "How can I restrict shipping when X?", where X can be anything, like when a certain product is in the cart, or for specific countries/states. There are a couple of ways you can restrict shipping under those different conditions. Here I will show two options: 1) prevent shipping rates from showing up, 2) add shipping validation rules.

Prevent Shipping Rates From Showing Up

Restrict shipping with zip codes / states / countries
If you want to restrict shipping to certain geographical locations, or only want to ship to certain areas, I recommend using the WooCommerce Advanced Shipping Shipping Zones extension. With that extension, you can set up a shipping zone of zip codes, states or countries, which you can then use as a 'shipping zone' condition. Set up a condition like the following in order to restrict the current shipping rate to only the selected shipping zone. This way there is no way the user can order without being inside the shipping zone.

It also works the other way around: restricting shipping to everywhere except the East coast of the US. This way you can exclude the East coast from that shipping rate.

Restrict shipping of specific products
If you have some products that cannot be shipped, you can make sure the shipping rate doesn't show up in a very similar way as above. With the following condition I will prevent the Flying Ninja product from shipping with the current shipping rate (remember to add the same condition to the other rates if you want to prevent shipping entirely).

Restrict products to a specific area
It could also be that you only want to prevent shipping of a specific product to a specific area. In that case you can combine the condition above with the shipping zone condition (or other 'country', 'state', 'zipcode' etc. conditions) to prevent a specific product from shipping to that area.

Message showing up
When you've set up your shipping rates and a customer wants to check out from a location that your configuration doesn't allow shipping to, they will get a "No shipping available" message. This message is from WooCommerce Core and can be changed as described here: How to Change the "No Shipping methods available" Message.

Setting Up Shipping Validation Rules

Another way of restricting/preventing shipping is by setting up shipping validation rules. A shipping validation rule is fired when someone actually presses the 'Order now' button after they've entered all their data. It produces exactly the same type of message as when you try to check out without entering your name, for example: you will get a notice saying 'Name is required. Please fill in [...]'. I've written a blog post about how to set up shipping validation rules on the Shop Plugins blog: Setting Up Shipping Validation Rules in WooCommerce. (A shipping validation rule looks like this)
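For developers who prefer code over plugin settings, a similar effect can be achieved with a checkout validation hook. The snippet below is an illustrative sketch, not taken from the article or from the Advanced Shipping plugin: the hook woocommerce_after_checkout_validation and the error object it receives are standard WooCommerce APIs, while the blocked-country list is a made-up example.

add_action( 'woocommerce_after_checkout_validation', function( $data, $errors ) {
    // Hypothetical list of country codes we refuse to ship to.
    $blocked = array( 'AQ' );
    $country = ! empty( $data['shipping_country'] ) ? $data['shipping_country'] : $data['billing_country'];
    if ( in_array( $country, $blocked, true ) ) {
        // Adds a checkout error, shown the same way as the built-in "field is required" notices.
        $errors->add( 'shipping', 'Sorry, we are unable to ship orders to your location.' );
    }
}, 10, 2 );

Dropping this into a child theme's functions.php (or a small custom plugin) blocks the order at the same point where the plugin's own validation rules fire.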
http://docs.shopplugins.com/article/45-restrict-shipping
2017-04-23T10:03:51
CC-MAIN-2017-17
1492917118519.29
[]
docs.shopplugins.com
Note: The N@ classifier cannot be used in compound matches within the CLI or top file; it is only recognized in the nodegroups master config file parameter.

A simple list of minion IDs would traditionally be defined like this:

nodegroups:
  group1: L@host1,host2,host3

They can now also be defined as a YAML list, like this:

nodegroups:
  group1:
    - host1
    - host2
    - host3

New in version 2016.11.0.

To use nodegroups in Jinja logic for SLS files, the pillar_opts option in /etc/salt/master must be set to True. This will pass the master's configuration as Pillar data to each minion.

Note: If the master's configuration contains any sensitive data, this will be passed to each minion. Do not enable this option if you have any configuration data that you do not want to end up on your minions. Also, if you make changes to your nodegroups, you might need to run salt '*' saltutil.refresh_pillar after restarting the master.

Once pillar_opts is set to True, you can find the nodegroups under the "master" pillar. To make sure that only the correct minions are targeted, you should use each matcher from the nodegroup definition. For example, to check if a minion is in the 'webserver' nodegroup:

nodegroups:
  webserver: 'G@os:Debian and L@minion1,minion2'

{% if grains.id in salt['pillar.get']('master:nodegroups:webserver', []) and grains.os in salt['pillar.get']('master:nodegroups:webserver', []) %}
...
{% endif %}

Note: If you do not include all of the matchers used to define a nodegroup, Salt might incorrectly target minions that meet some of the nodegroup requirements, but not all of them.
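Once a nodegroup is defined in the master config, targeting it from the command line and from the top file is straightforward. The lines below are a short illustration (the group name webserver and the state name apache are placeholders):

# Target every minion in the 'webserver' nodegroup from the CLI
salt -N webserver test.ping

# Use the nodegroup in the top file
base:
  webserver:
    - match: nodegroup
    - apache

Note that inside the top file the nodegroup is matched with "- match: nodegroup" rather than with the N@ classifier, which, as noted above, is only recognized in the master configuration.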
https://docs.saltstack.com/en/latest/topics/targeting/nodegroups.html
2017-04-23T09:52:25
CC-MAIN-2017-17
1492917118519.29
[]
docs.saltstack.com
Launch debugging any existing Debug Log file

The Logs folder in your project contains the log files you have downloaded from Salesforce (you can download them via the Main Menu ⇒ View ⇒ Logs Window ⇒ Download Logs). Open the necessary log file in Debug mode in the IDE and look through all the important information. To do this, go to the Main Menu: Debug ⇒ Start debugging from a log file. This option gives you almost unlimited possibilities: you can run the Debugger even on a Production org, using only the logs and the current version of the code.

NB: The log file should have the minimal required log level. If the log level of your log file is insufficient, you will see an error message with the list of the required debug log levels.

Last modified: 2018/02/21 16:14
https://docs.welkinsuite.com/?id=windows:how_does_it_work:tools_in_tws:debugger:how_to_debug:launch_debugging_file
2018-06-17T21:49:01
CC-MAIN-2018-26
1529267859817.15
[array(['/lib/exe/fetch.php?media=windows:how_does_it_work:tools_in_tws:debugger:how_to_debug:debug-log-file.png', 'Debug a log file from the Main Menu Debug a log file from the Main Menu'], dtype=object) array(['/lib/exe/fetch.php?media=windows:how_does_it_work:tools_in_tws:debugger:how_to_debug:debug-log-_level.png', "The log level isn't correct The log level isn't correct"], dtype=object) ]
docs.welkinsuite.com
This Specification Flow Task inserts an image into an Excel document to replace an existing shape that has a specific title. The Replace Image task works by replacing a Shape in the Excel document. To create a shape in the master document of the file that is to have the image replaced, follow the steps below:
http://docs.driveworkspro.com/Topic/SppTaskReplaceImagesInExcelDocument
2018-06-17T22:14:56
CC-MAIN-2018-26
1529267859817.15
[]
docs.driveworkspro.com
revPreviewVideo
if the hilite of button "Preview Only" then revPreviewVideo

Use the revPreviewVideo command to see the image from a video camera on the screen. You must use the revInitializeVideoGrabber command to open the video grabber before you can use the revPreviewVideo command. The revPreviewVideo command shows the input from the video source in the video grabber window, but does not record it. To save the video as a file while displaying it, use the revRecordVideo command instead. If the video grabber was already recording video to a file, the revPreviewVideo command stops the recording.

Important! If you are using QuickTime for video capture, execute the revVideoGrabIdle command periodically while previewing or recording video. Otherwise, you may experience poor quality in the video capture.

To stop displaying the video input, use the revStopPreviewingVideo command.

Important! The revPreview.
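As a quick illustration of how these commands fit together, here is a minimal handler sketch. It is not taken from the dictionary entry itself; the button name and the polling loop are assumptions, and only the four rev* commands named above are relied on.

on mouseUp
   revInitializeVideoGrabber          -- the video grabber must be opened first
   revPreviewVideo                    -- show the camera input without recording it
   repeat while the hilite of button "Preview Only"
      revVideoGrabIdle                -- call periodically while previewing (QuickTime capture)
      wait 10 milliseconds with messages
   end repeat
   revStopPreviewingVideo             -- stop displaying the video input
end mouseUp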
http://docs.runrev.com/Command/revPreviewVideo
2018-06-17T21:49:38
CC-MAIN-2018-26
1529267859817.15
[]
docs.runrev.com
Repair Remote Web Access Published: March 10, 2011 Applies To: Windows Small Business Server 2011 Standard Problem Remote Web Access displays an error message that says, “Your computer can’t connect to the remote computer because the Remote Desktop Gateway server is temporarily unavailable. Try reconnecting later or contact your network administrator for assistance.” Solution Restart the Remote Desktop Gateway service. To start the Remote Desktop Gateway service Click Start, click Administrative Tools, right-click Services, and then click Run as administrator. In the Services (Local) list, right-click Remote Desktop Gateway, and then click Start.
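If you prefer the command line, the same service can be restarted from an elevated Command Prompt or PowerShell session. This is a sketch rather than part of the original article; the service short name for Remote Desktop Gateway is typically TSGateway on this platform, but verify it first (for example with sc query or Get-Service) before relying on it.

net stop TSGateway
net start TSGateway

or, in PowerShell:

Restart-Service -Name TSGateway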
https://docs.microsoft.com/en-us/previous-versions/windows/it-pro/windows-server-essentials-sbs/gg680313(v=ws.11)
2018-06-17T22:06:21
CC-MAIN-2018-26
1529267859817.15
[]
docs.microsoft.com
your self..
https://docs.mulesoft.com/anypoint-studio/v/7.1/datasense-perceptive-flow-design-concept
2018-06-17T22:14:58
CC-MAIN-2018-26
1529267859817.15
[array(['./_images/payload+autocomplete.png', 'payload+autocomplete'], dtype=object) ]
docs.mulesoft.com
Connecting Your Account

First, head to your Yaguara Integrations and select 'Connect Integration' for Instagram. Please note that the Instagram account has to be a business account in order to connect. Next, you will be taken to a Facebook / Instagram authentication page; select "Continue as [me]".

Available Metrics
- Total followers
- New followers
- Impressions
- Total posts
- Website clicks
- Phone call clicks
- Text message clicks
- Get directions clicks

Still have questions? Reach out to us at [email protected] or start a chat with us!
http://docs.yaguara.co/en/articles/3494315-instagram
2021-01-16T06:36:47
CC-MAIN-2021-04
1610703500028.5
[]
docs.yaguara.co
After installing the operating system, prepare the networking connections and configuration for your TeamForge site. Note: You must have root access to all the hosts you will be setting up for your site.
- Use the NetworkManager to list the DNS servers you want to use for resolving Internet addresses.
- Open the appropriate ports, and close all other ports. See Port Requirements.
- Use the hostname command to verify that the machine name is resolvable on the network.
  hostname
  bigbox.supervillain.org
- Use the nslookup command to verify that your hostname maps to the right IP address.
  nslookup bigbox.supervillain.org
  Server: 204.16.107.137
  Address: 204.16.107.137#53
  Tip: If there is any doubt about what the system's real IP address is, use the /sbin/ifconfig command.
- If you are installing behind a proxy, specify your proxy settings.
  export http_proxy=http://<PROXY_USERNAME>:<PROXY_PASSWD>@<PROXY_HOST>:<PROXY_PORT>
  export no_proxy=localhost,127.0.0.0/8,<hostname>
- Use a tool such as Nessus to scan your server for potential vulnerabilities. See Port Requirements for detailed security recommendations.
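The exact port list comes from the Port Requirements page; purely as an illustration of the mechanics (the port numbers below are placeholders, not TeamForge's actual list), opening ports on a RHEL/CentOS host that uses firewalld looks like this:

firewall-cmd --permanent --add-port=80/tcp
firewall-cmd --permanent --add-port=443/tcp
firewall-cmd --reload

On hosts still using iptables, equivalent rules would be added with the iptables command and saved with the distribution's usual service scripts.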
https://docs.collab.net/teamforge181/setupnetworking.html
2021-01-16T06:57:24
CC-MAIN-2021-04
1610703500028.5
[]
docs.collab.net
1. What was decided upon? (e.g. what has been updated or changed?)
As of July 1, 2019, the library implemented unlimited renewal for standard loans (28/90/365 day loans). This means that Vanderbilt faculty, staff, and students can renew library loans as many times as needed, as long as the item has not been requested (recalled) by another user. Renewal can be done online or at any circulation desk. Items that do not fall under the standard loan policy, such as laptops, leisure reading items, and reserved items, will not have unlimited renewal.
2. Why was this decided? (e.g. explain why this decision was reached. It may help to explain the way a procedure used to be handled pre-Alma)
3. Who decided this? (e.g. what unit/group)
The proposal was brought up by the Access Services Committee and approved by the Committee of Library Directors and the Alma Systems Committee.
4. When was this decided?
May 28, 2019.
5. Additional information or notes.
https://docs.library.vanderbilt.edu/2019/07/02/unlimited-renew-implemented-for-standard-loans/
2021-01-16T06:45:45
CC-MAIN-2021-04
1610703500028.5
[]
docs.library.vanderbilt.edu
Keyboard Support

A critical requirement for software accessibility is keyboard support as a complete alternative to pointing devices (mouse, etc.). Keyboard support is comprised of command key, focus key, and keyboard navigation. RadListBox seamlessly switches between mouse and keyboard navigation, allowing item selection, deletion, moving and reordering.

Setting the control's KeyboardNavigationSettings (introduced in R2 2016 - ver.2016.2.504) allows you to associate an activation combination (CommandKey + FocusKey), which moves focus directly to the RadListBox and enables keyboard navigation. When there are multiple RadListBox controls on the page, give each one its own activation combination so that the shortcut focuses the intended ListBox and enables keyboard navigation. Certain keyboard combinations are reserved and used as shortcuts in the browsers.

Example: Setting the KeyboardNavigationSettings for RadListBox

<telerik:RadListBox RunAt="server">
    <KeyboardNavigationSettings CommandKey="Alt" FocusKey="M" />
    <Items>
        <telerik:RadListBoxItem></telerik:RadListBoxItem>
        <telerik:RadListBoxItem></telerik:RadListBoxItem>
    </Items>
</telerik:RadListBox>

Arrow Navigation / Selection
Down Arrow / Up Arrow keys select the next/previous item.
Shift + Arrow Keys selects multiple items.
Ctrl + Space adds or removes the currently active item to or from the selection.
Del deletes the currently selected item (if AllowDelete is set to "true").

Transfer
Ctrl + Right Arrow transfers the selected item/s to the destination RadListBox.
Ctrl + Left Arrow transfers the selected item/s to the source RadListBox when the selection is in the destination listbox.

Reorder
Ctrl + Shift + Up Arrow reorders upwards.
Ctrl + Shift + Down Arrow reorders downwards.

Mark Matches
STARTS WITH (EnableMarkMatches="true")
Start typing to highlight the matching items. Esc clears the mask. Backspace removes a single character.
FIRST LETTER (EnableMarkMatches="false")
Any key navigates to a matching item. A subsequent press on the same key iterates over the matches.

Marking matches works only for the built-in items, because it relies on their Text and structure. This means that using the ItemTemplate will disable the Mark Matches functionality. For the feature to work, the TabIndex property of the control must be set, otherwise it cannot receive focus and the keyboard events.
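To tie the last note together with the earlier example, the following markup is an illustrative sketch (the ID, item text and key choice are placeholders, not taken from the Telerik documentation): it sets TabIndex so the control can receive focus and keyboard events, together with its own activation combination.

<telerik:RadListBox RunAt="server" ID="RadListBox2" TabIndex="1">
    <KeyboardNavigationSettings CommandKey="Alt" FocusKey="L" />
    <Items>
        <telerik:RadListBoxItem Text="First item" />
        <telerik:RadListBoxItem Text="Second item" />
    </Items>
</telerik:RadListBox>

With this in place, pressing Alt + L moves focus to this listbox, after which the arrow-key, transfer and reorder shortcuts listed above apply to it.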
https://docs.telerik.com/devtools/aspnet-ajax/controls/listbox/accessibility-and-internationalization/keyboard-support
2021-01-16T06:38:02
CC-MAIN-2021-04
1610703500028.5
[]
docs.telerik.com
Gets the properties associated with a PII entities detection job. For example, you can use this operation to get the job status.

See also: AWS API Documentation

See 'aws help' for descriptions of global parameters.

describe-pii-entities-detection-job
--job-id <value>
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]

--job-id (string) The identifier that Amazon Comprehend generated for the job. The operation returns this identifier in its response.

PiiEntitiesDetectionJobProperties -> (structure) Provides information about a PII entities detection job.
JobId -> (string) The identifier assigned to the PII entities detection job.
JobName -> (string) The name that you assigned the PII entities detection job.
JobStatus -> (string) The current status of the PII entities detection job. If the status is FAILED, the Message field shows the reason for the failure.
Message -> (string) A description of the status of a job.
SubmitTime -> (timestamp) The time that the PII entities detection job was submitted for processing.
EndTime -> (timestamp) The time that the PII entities detection job completed.
InputDataConfig -> (structure) The input properties for a PII entities detection job.
OutputDataConfig -> (structure) The output data configuration that you supplied when you created the PII entities detection job.
S3Uri -> (string) When you use the PiiOutputDataConfig object with asynchronous operations, you specify the Amazon S3 location where you want to write the output data.
KmsKeyId -> (string) ID for the AWS Key Management Service (KMS) key that Amazon Comprehend uses to encrypt the output results from an analysis job.
RedactionConfig -> (structure) Provides configuration parameters for PII entity redaction. This parameter is required if you set the Mode parameter to ONLY_REDACTION. In that case, you must provide a RedactionConfig definition that includes the PiiEntityTypes parameter.
PiiEntityTypes -> (list) An array of the types of PII entities that Amazon Comprehend detects in the input text for your request. (string)
MaskMode -> (string) Specifies whether the PII entity is redacted with the mask character or the entity type.
MaskCharacter -> (string) A character that replaces each character in the redacted PII entity.
LanguageCode -> (string) The language code of the input documents.
DataAccessRoleArn -> (string) The Amazon Resource Name (ARN) that gives Amazon Comprehend read access to your input data.
Mode -> (string) Specifies whether the output provides the locations (offsets) of PII entities or a file in which PII entities are redacted.
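A concrete invocation, following the synopsis above, looks like this (the job ID is a made-up placeholder; use the identifier returned when the job was started):

aws comprehend describe-pii-entities-detection-job \
    --job-id 1234567890abcdef1234567890abcdef

The command prints the PiiEntitiesDetectionJobProperties structure described above as JSON, so the current JobStatus can be read directly from the output or extracted with the --query global option.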
https://docs.aws.amazon.com/cli/latest/reference/comprehend/describe-pii-entities-detection-job.html
2021-01-16T06:51:26
CC-MAIN-2021-04
1610703500028.5
[]
docs.aws.amazon.com
Spooler Administration

MultiValue spooler administration involves three classes of actions:
- Using shell commands such as SETPTR to control the sending of print jobs to the printer.
- Issuing shell commands to manipulate the jobs, form queues and despool processes.
- Running the SP-JOBS and SP-STATUS menu-driven utilities to manipulate the jobs, form queues, and despool processes.

Form Queue Names And Numbers

Throughout this manual we refer to form queue names and form queue numbers. A form queue is a collection of globals that define and control a pseudo printer. A despool process later comes along and sends the output to a real printer. A form queue actually has three identifiers:
- The user-specified form queue name (or form queue number).
- The system-assigned form queue number.
- The system-assigned spooler global subscript.

UniVerse-like MultiValue platforms tend to use form queue numbers. Pick-like MultiValue platforms tend to use form queue names. Caché MultiValue supports both. By default, the form queue globals are indexed under form queue number. Each form queue number has a corresponding form queue name. If the form queue name is numeric, or starts with a letter (other than F, perhaps followed by Q or N, followed by a number), you can use the form queue name, or any of the formats listed below, to identify a form queue to one of the SP- commands:
- A form queue reference in the format nn, for example, 99, describes form queue number 99.
- A form queue reference in the format Fnn, for example, F99, describes form queue number 99.
- A form queue reference in the format FQnn or FNnn, for example FN99 or FQ99, describes form queue number 99.
- A form queue reference in the format FQname or FNname, for example FQHP7210 or FNHP7210, describes a form queue named HP7210.
- Anything else, for example, "STANDARD", describes the form queue named STANDARD.

In the above rules, the first three result in a reference to a form queue number, and the last two result in a reference to a form queue name. A form queue number will always have a corresponding form queue name, and a form queue name will always have a corresponding form queue number. In all the administrative commands that ask for a FORM QUEUE you can specify the form queue number or the form queue name. For example, using the commands SP-CREATE and SP-ASSIGN:

SP-ASSIGN 3=F12 (HS
means assign print channel 3 to form queue number 12, with "H" and "S" options.

SP-CREATE HP7210 CACHE NULL
creates a form queue whose name is HP7210. The form queue number is allocated automatically as the next free unused form queue number.

SP-CREATE F15 CACHE NULL
creates a form queue whose name is F15 and whose form queue number is 15.

SP-FORM F15 HP7210
Referring to the preceding example, this renames the form queue named F15, which is also form queue number 15, to HP7210, and it retains form queue number 15.

Form Queue Groups

You can optionally create one or more form queue groups. A form queue group consists of a list of existing form queues that can be used as a collection of form queues. A form queue group may be used by the despool process to send spooler output to all form queues in the group, rather than just one form queue. Form queues that are grouped together should share some commonality, such that grouping them together makes administration easier. You use SP-CREATE to create a form queue group by specifying GROUP as the second argument, then listing multiple existing form queues. Illegal and duplicate form queue names are ignored.
You can create a form queue group without specifying its form queues by specifying the empty string ("") as the third argument, as follows:

SP-CREATE MYFQGRP GROUP ""

You can use the SP-DEVICE command to add, remove, or replace form queues from the form queue group.

MultiValue De-Spool Devices

The MultiValue spooler allows for a despool process to send print jobs from the spooler tables to an external resource, for example a printer. The despool process is controlled using the commands SP-START, SP-STOP, SP-SUSPEND, SP-RESUME and SP-KILL. The despool process writes to a device as defined by the form queue. This device is specified when creating a form queue with the SP-CREATE command. You can change the device associated with a form queue using the SP-DEVICE command. When defining an output device for the despooler, you use a similar syntax as defined for the ObjectScript OPEN statement: Device{:parameters}{:timeout}. You can specify any device supported by Caché, and so you can write spooler output to printers, files, tapes, terminal screens, programs, and so on. For further information on supported devices, see the Caché I/O Device Guide.

The &HOLD& File

You implicitly create the &HOLD& file the first time you use the SETPTR command and specify mode 3 (the sixth argument). The spooler creates &HOLD& as a directory-type file in the current working directory (identified in the @PATH variable). Normally &HOLD& should be a directory-type file, but it can be pre-created as an anode-type if that is preferred. For example, on a new account, the following commands show this happening:

NEWACCOUNT:SETPTR 0,100,30,2,2,3,BRIEF,BANNER NEXT,INFORM
Creating &HOLD& file.
NEWACCOUNT:ED VOC &HOLD&
&HOLD&
6 lines long.
----:P
0001: F
0002: &HOLD&
0003: ^DICT.HOLD
0004:
0005:
0006:
Bottom at line 6.
----:EXK
NEWACCOUNT:

In the definition of &HOLD&, attribute 2 means a directory will have been created in the Mgr/namespace installation directory where the file resides. For example, the account NEWACCOUNT by default uses namespace NEWACCOUNT, and so the &HOLD& file will be a directory called something like C:\InterSystems\Cache\Mgr\NEWACCOUNT\&HOLD&. This action can be overridden. On a new account, simply create the type of &HOLD& file you want, for example:

NEWACCOUNT:CREATE-FILE &HOLD& DIR C:\TEMP\MYHOLDFILE
or
NEWACCOUNT:CREATE-FILE &HOLD& ANODE

If the &HOLD& file has already been created, you can simply delete it with DELETE-FILE and manually re-create it as shown. The &HOLD& file can therefore be any regular MultiValue file of type DIR (points to an operating system file path), or of type ANODE (a special Caché type suitable for large items). A &HOLD& file cannot be an INODE file.

Form Queue Control

Caché allows for a subroutine to be called either before, or after, or both before and after a job is printed on a form queue.
Caché also provides for finer control over the routines called by allowing them to be specified separately per form queue via the SP–PREAMBLE and SP–POSTAMBLE commands. The API of the pre/post–amble routine called is: SUBROUTINE AMBLER (OCCASION, JOBNO, FQNAME, GLOBAL, OUTPUT) where: “AMBLER” is the name of the routine which will handle the queue “OCCASION” is the string “PRE” or “POST” indicating why the routine was called; this allows the same routine to handle both actions “JOBNO” is the job number being output “FQNAME” is the spooler global subscript. To convert this subscript to a form queue name or number, do the following: $XECUTE 'SET %EXTERNALNAME = ' : GLOBAL : '("' : FQNAME : '","NAME")'Copy code to clipboard “GLOBAL” is the name of the Caché global holding the spooler data “OUTPUT” is the string of characters that should be sent to the spool device before the job starts (if OCCASION is “PRE”) or after the job finishes (if it is “POST”). By default, all accounts use the same spooler. Since the preamble and/or postamble subroutine might be called from an account other than where it was defined, the subroutine should be cataloged globally. This is easier to manage than cataloging locally or normally in every account that might need to use it. Auxiliary Printing Auxiliary printing occurs when a spool job is directed to a printer connected directly to the terminal or personal computer rather than to that controlled by the spooler. Using this facility, an application can print data to the spooler and the job will appear on the normal spooler tables in the assigned form queue. When the print job is closed, the spool job that was created will be sent to the user’s auxiliary printer. Auxiliary printing is initiated by the SP-AUX command using the “A” option: USER:SP-ASSIGN =HP7210 A To check if the spool job will be directed to the local printer, look for the “AUX” option on the job in the form queue. For example: USER:SP-LOOK FORM QUEUES Chan Q# Q name Width Lines Top Bot P# Options 0 1 HP7210 132 66 3 3 1 AUX, INFORM Caché must know the control code sequences to be sent to the terminal to turn on and off auxiliary printing. These are contained in the terminal definition file, TERMDEFS, documented elsewhere. Each terminal type has an entry in the TERMDEFS file. The control definitions used by Caché are “mc5” to turn on auxiliary printing and “mc4” to turn off auxiliary printing. These match those used by most terminal emulators. Without these definitions in the terminal definition file, auxiliary printing will not work. The SP-AUX command can be used to print existing print jobs to the auxiliary printer. For example, the following command causes Caché to print existing jobs 22 and 23 to the own auxiliary printer: USER:SP-AUX 22 23 If your terminal definition does not include codes to turn on and off the printer an error message will be displayed, such as: Error. No auxiliary printer control sequences available for terminal. Printer Control Characters When a print job is sent to a physical printer, a new page is specified by an ASCII 12 character, the form feed control character. A new line is specified by a two-character sequence: an ASCII 13 (carriage return) and an ASCII 10 (line feed). These are the default values. Some printers require other printer control character sequences. For example, a new page may require both a form feed (ASCII 12) and a carriage return (ASCII 13) character. A new line may require just a line feed (ASCII 10) character. 
If your printer is not formatting correctly, you can use the SP-CONTROL command to change the printer control character defaults.
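To make the pre/postamble hook described in the Form Queue Control section above concrete, here is a minimal sketch of a subroutine matching the documented calling interface. The routine name SPAMBLE and the PCL escape sequences are hypothetical examples, not values defined by Caché; substitute the control codes your printer expects, name the routine in MVCACHEPRINTER or via SP-PREAMBLE/SP-POSTAMBLE, and catalog it globally as described above.
SUBROUTINE SPAMBLE(OCCASION, JOBNO, FQNAME, GLOBAL, OUTPUT)
* Hypothetical pre/postamble handler for a PCL-style printer.
* OCCASION is "PRE" or "POST"; the despooler sends OUTPUT to the spool device.
OUTPUT = ""
IF OCCASION = "PRE" THEN
   * switch the printer to landscape before the job starts
   OUTPUT = CHAR(27) : "&l1O"
END ELSE
   * reset the printer after the job finishes
   OUTPUT = CHAR(27) : "E"
END
RETURN
END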
https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GVSP_SPOOLER_ADMIN
2021-01-16T05:02:19
CC-MAIN-2021-04
1610703500028.5
[]
docs.intersystems.com
Submission testing of a Power BI visual Before you publish your visual to AppSource, it must pass the tests listed in this article. Test your visual before you submit it. If your visual doesn't pass the required test cases, it will be rejected. For more information about the publishing process, see Publish Power BI visuals to Partner Center. Testing a new version of a published visual If you're testing or debugging a new version of an already published visual, you can override the AppSource version with a local file version, by enabling Developer mode in Power BI Desktop. To enable Developer mode, follow these steps: Open Power BI Desktop. Select File > Options and settings. Select Options. In the Options window, from the CURRENT FILE list, select Report settings. In Developer Mode, select the Turn on developer mode for this session option. Note In Power BI Desktop, Developer mode is only valid for one session. If you open a new Power BI Desktop instance for testing, you'll need to enable Developer mode again. General test cases Verify that your visual passes the general test cases. Optional browser testing The AppSource team validates visual on the most current Windows versions of Google Chrome, Microsoft Edge, and Mozilla Firefox browsers. Optionally, test your visual in the following browsers. Desktop testing Test your visual in the current version of Power BI Desktop. Performance testing Your visual should perform at an acceptable level. Use developer tools to validate performance. Don't rely on visual cues and the console time logs. Next steps For more information about the publishing process, see Publish Power BI visuals to Partner Center. More questions? Ask the Power BI Community.
https://docs.microsoft.com/en-us/power-bi/developer/visuals/submission-testing
2021-01-16T07:16:57
CC-MAIN-2021-04
1610703500028.5
[]
docs.microsoft.com
New to Telerik Reporting? Download free 30-day trial Localization The service renders the report with the request thread culture. By default the request thread respects the host's default locale. In order the request thread to respect the clients/browsers preferred language settings you should add in the root web.config file the following globalization element: Then the user agent (the browser) preferred language will be used as culture.
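The globalization element itself did not survive extraction of this page. For an ASP.NET host, a typical entry in the root web.config that makes the request thread honor the browser's preferred language looks like the sketch below; treat the exact attribute values as an assumption rather than the vendor's verbatim sample.
<configuration>
  <system.web>
    <!-- let the request thread culture follow the browser's Accept-Language header -->
    <globalization culture="auto" uiCulture="auto" />
  </system.web>
</configuration>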
https://docs.telerik.com/reporting/telerik-reporting-rest-service-localization
2021-01-16T05:22:35
CC-MAIN-2021-04
1610703500028.5
[]
docs.telerik.com
However, you do not need to be stressed or left in the dark about matters related to self-employment and taxes, because the guide below covers the essential elements to keep in mind when filing your self-employment taxes. The first element to look at is your income sources. The second element to consider is deductions. Research the deductions you can benefit from, keeping in mind that they apply only to expenses incurred for business reasons; some of them are worth knowing about before your filing day. The next element is your business structure: many people select a sole proprietorship because it is simple and involves less paperwork. The fourth element to examine is tax preparers. Reach out to tax professionals as well, and consider asking for assistance with tax planning, business planning, and bookkeeping for private-contractor taxes.
http://docs-prints.com/2020/09/19/the-beginners-guide-to-chapter-1-2/
2021-01-16T05:10:25
CC-MAIN-2021-04
1610703500028.5
[]
docs-prints.com
How to Insert a Document into another Document There is often a need to insert one document into another. For example to insert a document at a bookmark, merge field or at a custom text marker. At the moment, there is no single method in Aspose.Words that can do this in one line of code. However, a document in Aspose.Words is represented by a tree of nodes; the object model is rich and the task of combining documents is just a matter of moving nodes between the document trees. This article shows how to implement a method for inserting one document into another and using it in a variety of scenarios. Insert a Document at Any Location To insert the content of one document to another at an arbitrary location the following simple InsertDocument method can be used. This technique will be referred to by other scenarios described below. This is a method that inserts the contents of one document at a specified location in another document. This is a method that inserts the contents of one document at a specified location in another document. This method preserves the section breaks and section formatting of the inserted document. Insert a Document at a Bookmark Use the InsertDocument method shown above to insert documents in bookmarked places of the main template. To do this, just create a bookmarked paragraph where you want the document to be inserted. This bookmark should not enclose multiple paragraphs or text that you want to appear in the resulting document after the generation. Just set an empty paragraph and bookmark it. You can even put a small description of the inserted content inside this paragraph. Invokes the InsertDocument method shown above to insert a document at a bookmark. Insert a Document During Mail Merge This example relies on the InsertDocument method shown at the beginning of the article to insert a document into a merge field during mail merge execution. The following example demonstrates how to use the InsertDocument method to insert a document into a merge field during a mail merge. If a document to be inserted is stored as binary data in the database field (BLOB field), use the following example. A slight variation to the above example to load a document from a BLOB database field instead of a file. Insert a Document During Replace Sometimes, there is a requirement to insert documents to places marked with some text. For example, the template can contain paragraphs with the text [INTRODUCTION], [CONCLUSION] and so forth. In the resulting document, these paragraphs should be replaced with the content taken from external documents. This can be achieved with the following code, which also uses the InsertDocument method. The following example shows how to insert the content of one document into another during a customized find and replace operation.
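The code listings referenced in this article were stripped from this capture. As a rough reconstruction of the approach described (iterate the source document's sections, import each body-level node into the destination document, and insert it after the target node), a Java helper could look like the sketch below; method names follow the Aspose.Words for Java API, but treat the handling of the trailing empty section paragraph and the formatting mode as assumptions rather than the article's exact listing.
import com.aspose.words.*;

public class InsertDocumentHelper {
    // Inserts the content of docToInsert after insertionDestination,
    // which must be a block-level node (a paragraph or a table).
    public static void insertDocument(Node insertionDestination, Document docToInsert) throws Exception {
        if (insertionDestination.getNodeType() != NodeType.PARAGRAPH
                && insertionDestination.getNodeType() != NodeType.TABLE)
            throw new IllegalArgumentException("The destination node should be either a paragraph or a table.");

        CompositeNode destinationParent = insertionDestination.getParentNode();
        NodeImporter importer = new NodeImporter(docToInsert,
                insertionDestination.getDocument(), ImportFormatMode.KEEP_SOURCE_FORMATTING);

        for (Section srcSection : docToInsert.getSections()) {
            for (Node srcNode : srcSection.getBody()) {
                // Skip the empty paragraph that terminates each section body.
                if (srcNode.getNodeType() == NodeType.PARAGRAPH) {
                    Paragraph para = (Paragraph) srcNode;
                    if (para.isEndOfSection() && !para.hasChildNodes())
                        continue;
                }
                Node newNode = importer.importNode(srcNode, true);
                destinationParent.insertAfter(newNode, insertionDestination);
                insertionDestination = newNode;
            }
        }
    }
}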
https://docs.aspose.com/words/java/how-to-insert-a-document-into-another-document/
2021-01-16T06:38:18
CC-MAIN-2021-04
1610703500028.5
[]
docs.aspose.com
The tungsten_find_seqno command was added in versions 5.4.0 and 6.1.0 The tungsten_find_seqno assists with locating event information in the THL and producing a dsctl set command as output. The tungsten_find_seqno command performs the following steps: Get the Replicator sequence number to search for from the CLI and validate Validate paths and commands Load all available service names and validate any specified service against that list Check if this Replicator is in the Primary role Locate the supplied seqno in the available THL Parse the THL found, if any Generate the dsctl command and display Table 8.53. tungsten_find_seqno Options Below is a sample session: shell> tungsten_find_seqno 4dsctl set -reset -seqno 4 -epoch 2 -event-id "mysql-bin.000030:0000000000001981;-1" -source-id "db1"
https://docs.continuent.com/tungsten-clustering-6.1/cmdline-tools-tungsten_find_seqno.html
2021-01-16T05:56:50
CC-MAIN-2021-04
1610703500028.5
[]
docs.continuent.com
dotnet test This article applies to: ✔️ .NET Core 2.1 SDK and later versions Name dotnet test - .NET test driver used to execute unit tests. Synopsis dotnet test [<PROJECT> | <SOLUTION> | <DIRECTORY> | <DLL>] [-a|--test-adapter-path <ADAPTER_PATH>] [->] [-r|--results-directory <RESULTS_DIR>] [--runtime <RUNTIME_IDENTIFIER>] [-s|--settings <SETTINGS_FILE>] [-t|--list-tests] [-v|--verbosity <LEVEL>] [[--] ="16.8.3" /> <PackageReference Include="xunit" Version="2.4.1" /> <PackageReference Include="xunit.runner.visualstudio" Version="2.4. Arguments PROJECT | SOLUTION | DIRECTORY | DLL - Path to the test project. - Path to the solution. - Path to a directory that contains a project or a solution. - Path to a test project .dll file. If not specified, it searches for a project or a solution in the current directory. Options -a|--test-adapter-path <ADAPTER_PATH> Path to a directory to be searched for additional test adapters. Only .dll files with suffix .TestAdapter.dllare inspected. If not specified, the directory of the test .dll is searched. - preview preview SDK) The type of crash dump to be collected. Implies --blame-crash. --blame-crash-collect-always(Available since .NET 5.0 preview SDK) Collects a crash dump on expected as well as unexpected test host exit. --blame-hang(Available since .NET 5.0 preview SDK) Run the tests in blame mode and collects a hang dump when a test exceeds the given timeout. --blame-hang-dump-type <DUMP_TYPE>(Available since .NET 5.0 preview SDK) The type of crash dump to be collected. It should be full, mini, or none. When noneis specified, test host is terminated on timeout, but no dump is collected. Implies --blame-hang. --blame-hang-timeout <TIMESPAN>(Available since .NET 5.0 preview and later, on Linux with netcoreapp3.1 and later, and on macOS with net5.0 or later. Implies --blameand --blame-hang. -c|--configuration <CONFIGURATION> Defines the build configuration. The default value is Debug, but your project's configuration could override this default SDK setting. -> Forces the use of dotnet. -. --interactive Allows the command to stop and wait for user input or action. For example, to complete authentication. Available since .NET Core 3.0 SDK. -l|--logger <LOGGER> Specifies a logger for test results. Unlike MSBuild, dotnet test doesn't accept abbreviations: instead of -l "console;v=d"use -l "console;verbosity=detailed". Specify the parameter multiple times to enable multiple loggers. - report tests that were in progress when the test host crashed: dotnet test --blame.
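As a quick orientation to the options discussed above, a few common invocations are shown below; the project path and the filter expression are placeholders.
# run the tests in the project found in the current directory
dotnet test

# run a specific test project with detailed console output
dotnet test ./MyProject.Tests/MyProject.Tests.csproj --logger "console;verbosity=detailed"

# run only tests whose fully qualified name contains a given string
dotnet test --filter "FullyQualifiedName~MyNamespace.MyTests"

# collect a blame sequence file to identify a test that crashes the test host
dotnet test --blame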
https://docs.microsoft.com/en-us/dotnet/core/tools/dotnet-test?tabs=netcore2x
2021-01-16T06:25:53
CC-MAIN-2021-04
1610703500028.5
[]
docs.microsoft.com
Upgrade Discovery Bot from Enterprise A2019.15 to later versions Upgrade Discovery Bot from Enterprise A2019.15 (On-Premises) to a later version for the latest features and enhancements. Enterprise A2019.15 provides a separate installer (executable file) for Discovery Bot On-Premises users to install and use Discovery Bot from the Enterprise Control Room. Starting from A2019.16, the Discovery Bot installer is integrated with the Enterprise A2019 installer. To upgrade from Enterprise A2019.15 to a later version, you must first uninstall Discovery Bot from your machine and install the Enterprise Control Room. Procedure In your Windows device, go to Apps & features. Search for Discovery Bot. Click Uninstall. Install the latest version of the Enterprise Control Room. Installing Enterprise Control Room On-Premises From your local machine, log in to your Enterprise Control Room as an administrator. The Discovery Bot tab is available for use from the left panel.
https://docs.automationanywhere.com/bundle/enterprise-v2019/page/discovery-bot/topics/discovery-bot-upgrade.html
2021-01-16T06:13:52
CC-MAIN-2021-04
1610703500028.5
[]
docs.automationanywhere.com
Usage statistics Automation Anywhere collects usage statistics from Enterprise A2019 for product improvements. Usage statistics provides information on where users might be facing issues with the product, the most used or least used features, and so on. This information helps in product improvement which in turn helps in improving the customer experience. By default, the Usage statistics option is enabled in the Enterprise Control Room and usage data is collected. The data collected is stored in Automation Anywhere Enterprise and the service provider cloud. The collected data is typically generic information such as anonymized user details, customer name, and user navigation workflows such as menus the users access and clicks. An administrator can disable the Usage statistics option by changing the settings in the Enterprise Control Room. Note: The option can be disabled only for On-Premises deployments. Disable usage statisticsAn administrator can disable the Usage statistics option by changing the settings in the Enterprise Control Room.
https://docs.automationanywhere.com/bundle/enterprise-v2019/page/enterprise-cloud/topics/control-room/administration/settings/cloud-usage-statistics.html
2021-01-16T05:15:07
CC-MAIN-2021-04
1610703500028.5
[]
docs.automationanywhere.com
Apache Spark Overview Apache Spark is a distributed, in-memory data processing engine designed for large-scale data processing and analytics. Cloudera Data Platform (CDP) supports only the YARN cluster manager. When run on YARN, Spark application processes are managed by the YARN ResourceManager and NodeManager roles. Spark Standalone is not supported. For detailed API information, see the Apache Spark project site. CDP supports Apache Spark, Apache Livy for local and remote access to Spark through the Livy REST API, and Apache Zeppelin for browser-based notebook access to Spark. The Spark LLAP connector is not supported.
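Since only the YARN cluster manager is supported, a typical way to launch an application is spark-submit in YARN mode, as in the sketch below; the jar path is a placeholder and the example class ships with Spark.
# submit the bundled SparkPi example to YARN in cluster deploy mode
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  /path/to/spark-examples.jar 10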
https://docs.cloudera.com/runtime/7.2.2/spark-overview/topics/spark-overview.html
2021-01-16T06:41:21
CC-MAIN-2021-04
1610703500028.5
[]
docs.cloudera.com
For adding a VMware Cloud on AWS NSX Manager as a Data Source in to vRealize Network Insight, you need a refresh token. Procedure - Log in to the VMware Cloud services console. - Under My Services, click VMware Cloud on AWS. - Select the desired Software-Defined Data Center (SDDC). - Click the Support tab. - Make a note of the NSX Manager IP address. - Click on the organization name on the top banner.Note: Ensure that the organization resides in the selected SDDC. - Generate the API token.For procedure, see Generate API Tokens.Note: To generate the API token, you must have the Administrator and the NSX Cloud Admin privileges. Results You can use this token for authenticating all VMware Cloud on AWS SDDCs on the organization.
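If you want to sanity-check the refresh token before adding the data source, one common approach (an assumption on my part, not a step in the vRealize Network Insight procedure itself) is to exchange it for a short-lived access token against the VMware Cloud Services API:
# exchange the CSP refresh token for an access token; a JSON response confirms the token is valid
curl -s -X POST \
  "https://console.cloud.vmware.com/csp/gateway/am/api/auth/api-tokens/authorize" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "refresh_token=<your-api-token>"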
https://docs.vmware.com/en/VMware-vRealize-Network-Insight/6.0/com.vmware.vrni.using.doc/GUID-7DDCE1C0-FFA7-4BA7-8C07-10C882D9B6BB.html
2021-01-16T05:02:31
CC-MAIN-2021-04
1610703500028.5
[]
docs.vmware.com
Importing projects¶ If you have an already created Integration project file, you can import it to your WSO2 Integration Studio workspace. - Open WSO2 Integration Studio, navigate to File -> Import, select Existing WSO2 Projects into workspace, and click Next: If you have a ZIP file of your project, browse for the archive file, or if you have an extracted project folder, browse for the root directory: Info Select Copy projects into workspace check box if you want to save the project in the workspace. Click Finish , and see that the project files are imported in the project explorer.
https://ei.docs.wso2.com/en/latest/micro-integrator/develop/importing-projects/
2021-01-16T05:35:50
CC-MAIN-2021-04
1610703500028.5
[]
ei.docs.wso2.com
If you have an existing dataservice, data can be replicated from a standalone MySQL server into the service. The replication is configured by creating a service that reads from the standalone MySQL server and writes into the cluster through a connector attached to your dataservice. By writing through the connector, changes to the underlying dataservice topology can be handled. Additionally, using a replicator that writes data into an existing data service can be used when migrating from an existing service into a new Tungsten Cluster service. For more information on initially provisioning the data for this type of operation, see Section 5.12.2, “Migrating from MySQL Native Replication Using a New Service”. In order to configure this deployment, there are two steps: Create a new replicator on the source server that extracts the data. Create a new replicator that reads the binary logs directly from the external MySQL service through the connector There are also the following requirements: The host on which you want to replicate to must have Tungsten Replicator 5.3. When writing into the Primary through the connector, the user must be given the correct privileges to write and update the MySQL server. For this reason, the easiest method is to use the tungsten user, and ensure that that user has been added to the user.map: tungsten secret alpha Install the Tungsten Replicator package (see Section 2.3.2, “Using the RPM package files”), or download the compressed tarball and unpack it on host1: shell> cd /opt/replicator/softwareshell> tar zxf tungsten-replicator- 6.1.10-4.tar.gz Change to the Tungsten Replicator staging directory: shell> cd tungsten-replicator- 6.1.10-4 Configure the replicator on host1 First we configure the defaults and a cluster alias that points to the Primaries and Replicas within the current Tungsten Cluster service that you are replicating from: Click the link below to switch examples between Staging and INI methods shell> ./tools/tpm configure alpha \ --master=host1 \ --install-directory=/opt/continuent \ --replication-user=tungsten \ --replication-password=password \ --enable-batch-service=true shell> vi /etc/tungsten/tungsten.ini [alpha] master=host1 install-directory=/opt/continuent replication-user=tungsten replication-password=password enable-batch-service=true. --enable-batch-service=true (in [Tungsten Replicator 6.1 Manual]) enable-batch-service=true (in [Tungsten Replicator 6.1 Manual]) This option enables batch mode for a service, which ensures that replication services that are writing to a target database using batch mode in heterogeneous deployments (for example Hadoop, Amazon Redshift or Vertica). Setting this option enables the following settings on each host: On a Primary mysql-use-bytes-for-string is set to false. colnames filter is enabled (in the binlog-to-q stage to add column names to the THL information. pkey filter is enabled (in the binlog-to-q and q-to-dbms stage), with the addPkeyToInserts and addColumnsToDeletes filter options set to true. This ensures that rows have the right primary key information. enumtostring filter is enabled (in the q-to-thl stage), to translate ENUM values to their string equivalents. settostring filter is enabled (in the q-to-thl stage), to translate SET values to their string equivalents. On a Replica mysql-use-bytes-for-string is set to true. pkey filter is enabled ( q-to-dbms stage). Primary -service beta 6.1.10 build 4 Finished status command...
https://docs.continuent.com/tungsten-clustering-6.1/deployment-replicatorin.html
2021-01-16T05:40:19
CC-MAIN-2021-04
1610703500028.5
[]
docs.continuent.com
When creating a DataFlow, it is often necessary to transfer data from one instance of NiFi to another. In this case, the remote instance of NiFi can be thought of as a Process Group. For this reason, NiFi provides the concept of a Remote Process Group. From the User Interface, the Remote Process Group looks similar to the Process Group. However, rather than showing information about the inner workings and state of a Remote Process Group, such as queue sizes, the information rendered about a Remote Process Group is related to the interaction that occurs between this instance of NiFi and the remote instance. The image above shows the different elements that make up a Remote Process Group. Here, we provide an explanation of the icons and details about the information provided. Transmission Status: The Transmission Status indicates whether or not data Transmission between this instance of NiFi and the remote instance is currently enabled. The icon shown will be the Transmission Enabled icon ( ) if any of the Input Ports or Output Ports is currently configured to transmit or the Transmission Disabled icon ( ) if all of the Input Ports and Output Ports that are currently connected are stopped. Remote Instance Name: This is the name of the NiFi instance that was reported by the remote instance. When the Remote Process Group is first created, before this information has been obtained, the URL of the remote instance will be shown here instead. Remote Instance URL: This is the URL of the remote instance that the Remote Process Group points to. This URL is entered when the Remote Process Group is added to the canvas and it cannot be changed. Secure Indicator: This icon indicates whether or not communications with the remote NiFi instance are secure. If communications with the remote instance are secure, this will be indicated by the "Locked" icon ( ). If the communications are not secure, this will be indicated by the "Unlocked" icon ( ). If the communications are secure, this instance of NiFi will not be able to communicate with the remote instance until an administrator for the remote instance grants access. Whenever the Remote Process Group is added to the canvas, this will automatically initiate a request to have a user for this instance of NiFi created on the remote instance. This instance will be unable to communicate with the remote instance until an administrator on the remote instance adds the user to the system and adds the "NiFi" role to the user. In the event that communications are not secure, the Remote Process Group is able to receive data from anyone, and the data is not encrypted while it is transferred between instances of NiFi. 5-Minute Statistics: Two statistics are shown for Remote Process Groups: Sent and Received. Both of these are in the format <count> (<size>) where <count> is the number of FlowFiles that have been sent or received in the previous five minutes and <size> is the total size of those FlowFiles' content. Comments: The Comments that are provided for a Remote Process Group are not comments added by the users of this NiFi but rather the Comments added by the administrators of the remote instance. These comments indicate the purpose of the NiFi instance as a whole. Last Refreshed Time: The information that is pulled from a remote instance and rendered on the Remote Process Group in the User Interface is periodically refreshed in the background. 
This element indicates the time at which that refresh last happened, or if the information has not been refreshed for a significant amount of time, the value will change to indicate Remote flow not current. NiFi can be triggered to initiate a refresh of this information by right-clicking on the Remote Process Group and choosing the "Refresh flow" menu item.
https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.0/bk_user-guide/content/remote_group_anatomy.html
2018-05-20T12:13:07
CC-MAIN-2018-22
1526794863410.22
[array(['figures/1/images/remote-group-anatomy.png', None], dtype=object)]
docs.hortonworks.com
Set-NetVirtualizationProviderRoute Syntax Set-NetVirtualizationProviderRoute [-InterfaceIndex <UInt32[]>] [-DestinationPrefix <String[]>] [-NextHop <String[]>] [-Metric <UInt32>] [-CimSession <CimSession[]>] [-ThrottleLimit <Int32>] [-AsJob] [-PassThru] [<CommonParameters>] Set-NetVirtualizationProviderRoute -InputObject <CimInstance[]> [-Metric <UInt32>] [-CimSession <CimSession[]>] [-ThrottleLimit <Int32>] [-AsJob] [-PassThru] [<CommonParameters>] Description The Set-NetVirtualizationProviderRoute cmdlet updates the route metric for a Provider Route in a Hyper-V Network Virtualization environment. A Provider Route is identified by its destination prefix, interface index, and next hop gateway. Examples Example 1: Change a Provider Route metric PS C:\> Set-NetVirtualizationProviderRoute -DestinationPrefix "0.0.0.0/0" -InterfaceIndex 17 -Metric 23 -NextHop "10.1.1.1" This command specifies a Provider Route for the interface that has the index 17, with the specified destination prefix and next hop gateway. The command resets the metric for this route to 23. Example 2: Change a Provider Route metric for a subnet PS C:\>Set-NetVirtualizationProviderRoute -DestinationPrefix "10.10.0.0/16" -InterfaceIndex 23 -Metric 2 This command specifies a Provider Route for the interface that has the index 23 and the destination prefix 10.10.0.0/16. The command changes the metric for this route to 2. The Metric parameter specifies an integer value for the route.
https://docs.microsoft.com/en-us/powershell/module/netwnv/set-netvirtualizationproviderroute?view=winserver2012r2-ps
2018-05-20T12:43:25
CC-MAIN-2018-22
1526794863410.22
[]
docs.microsoft.com
Using SQL¶ The SQL library helps you build and serialize SQL queries in Swift. It has an extensible, protocol-based design and supports DQL, DML, and DDL: - DQL: Data Query Language ( - DML: Data Manpulation Language ( INSERT, DELETE, etc) - DDL: Data Definition Language ( CREATE, ALTER, etc) This library's goal is to help you build SQL queries in a type-safe, consistent way. It can be used by ORMs to serialize their data types into SQL. It can also be used to generate SQL in a more Swifty way. The rest of this guide will give you an overview into using the SQL library manually. Data Query¶ DQL (data query language) is used to fetch data from one or more tables. This is done via the SELECT statement. Let's take a look how we would serialize the following SQL query. SELECT * FROM users WHERE name = ? This query selects all rows from the table users where the name is equal to a parameterized value. You can serialize literal values into data queries as well, but parameterization is highly recommended. Let's build this query in Swift. // Create a new data query for the table. var users = DataQuery(table: "users") // Create the "name = ?" predicate. let name = DataPredicate(column: "name", comparison: .equal, value: .placeholder) // Add the predicate as a single item (not group) to the query. users.predicates.append(.predicate(name)) // Serialize the query. let sql = GeneralSQLSerializer.shared.serialize(query: users) print(sql) // "SELECT * FROM `users` WHERE (`name` = ?)" Here we are using the shared GeneralSQLSerializer to serialize the query. You can also implement custom serializers. Data Manipulation¶ DML (data manipulation language) is used to mutate data in a table. This is done via statements like INSERT, UPDATE, and DELETE. Let's take a look how we would serialize the following SQL query. INSERT INTO users (name) VALUES (?) This query inserts a new row into the table users where the name is equal to a parameterized value. You can serialize literal values into data manipulation queries as well, but parameterization is highly recommended. Let's build this query in Swift. // Create a new data manipulation query for the table. var users = DataManipulationQuery(statement: .insert, table: "users") // Create the column + value. let name = DataManipulationColumn(column: "name", value: .placeholder) // Add the column + value to the query. users.columns.append(name) // Serialize the query. let sql = GeneralSQLSerializer.shared.serialize(query: user) print(sql) // "INSERT INTO `users` (`name`) VALUES (?)" That's all it takes to generate an INSERT query. Let's take a look at how this query would serialize if we use the .update statement instead. // Change the statement type users.statement = .update // Serialize the query. let sql = GeneralSQLSerializer.shared.serialize(query: user) print(sql) // "UPDATE `users` SET `name` = ?" You can see that SQL has generated an equivalent UPDATE query with the appropriate syntax. Data Definition¶ DDL (data definition language) is used to create, update, and delete schemas in the database. This is done via statements like CREATE TABLE, DROP TABLE, etc. Let's take a look at how we would serialize the following SQL query. CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT) This example is using SQLite-like syntax, but that is not required. Let's generate this query in Swift. // Create a new data definition query for the table. var users = DataDefinitionQuery(statement: .create, table: "users") // Create and append the id column. 
let id = DataDefinitionColumn(name: "id", dataType: "INTEGER", attributes: ["PRIMARY KEY"]) users.addColumns.append(id) // Create and append the name column. let name = DataDefinitionColumn(name: "name", dataType: "TEXT") users.addColumns.append(name) // Serialize the query. let sql = GeneralSQLSerializer.shared.serialize(query: user) print(sql) // "CREATE TABLE `users` (`id` INTEGER PRIMARY KEY, `name` TEXT)" That's all it takes to generate a CREATE query. Serializer¶ The default GeneralSQLSerializer that comes with this library generates a fairly "standard" SQL syntax. However, each flavor of SQL (SQLite, MySQL, PostgreSQL, etc) all have specific rules for their syntax. To deal with this, all serializers are backed by the SQLSerializer protocol. All serialization methods are defined on this protocol and come with a default implementation. Custom serializers can conform to this protocol and implement only the methods they need to customize. Let's take a look at how we would implement a PostgreSQLSerializer that uses $x placeholders instead of ?. /// Postgres-flavor SQL serializer. final class PostgreSQLSerializer: SQLSerializer { /// Keeps track of the current placeholder count. var count: Int /// Creates a new `PostgreSQLSerializer`. init() { self.count = 1 } /// See `SQLSerializer`. func makePlaceholder() -> String { defer { count += 1 } return "$\(count)" } } Here we've implemented PostgreSQLSerializer and overriden one method from SQLSerializer, makePlaceholder(). Now let's use this serializer to serialize the data query from a previous example. // Data query from previous example let users: DataQuery = ... // Serialize the query. let sql = PostgreSQLSerializer().serialize(query: users) print(sql) // "SELECT * FROM `users` WHERE (`name` = $1)" That's it, congratulations on implementing a custom serializer. API Docs¶ Check out the API docs for more in-depth information about SQL's APIs.
https://docs.vapor.codes/3.0/sql/overview/
2018-05-20T11:57:52
CC-MAIN-2018-22
1526794863410.22
[]
docs.vapor.codes
Test-SPInfo Path Form Template Syntax Test-SPInfoPathFormTemplate [-Path] <String> [-AssignmentCollection <SPAssignmentCollection>] [<CommonParameters>] Description The Test-SPInfoPathFormTemplate cmdlet validates that an InfoPath form template can be browser-enabled. For permissions and the most current information about Windows PowerShell for SharePoint Products, see the online documentation at (). Examples ---------------EXAMPLE-------------- C:\PS>Test-SPInfoPathFormTemplate -Identity formName.xsn This example validates an InfoPath form template for a specified name. Required Parameters Specifies the path and name of the InfoPath form template to install. The type must be a valid path and file name of a form template, in the form: - C:\folder_name\formtemplate_name - \\server_name\folder_name\formtemplate.
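Note that the example above uses -Identity while the syntax block only defines -Path; an invocation consistent with the documented syntax would look like the following, where the file path is a placeholder.
Test-SPInfoPathFormTemplate -Path "C:\FormTemplates\formName.xsn"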
https://docs.microsoft.com/en-us/powershell/module/sharepoint-server/Test-SPInfoPathFormTemplate?view=sharepoint-ps
2018-05-20T12:44:02
CC-MAIN-2018-22
1526794863410.22
[]
docs.microsoft.com
Pull exceptions Pulling ignores versions when certain conditions occur. Table 1. Pull exceptions table Issue Description Matched an exclusion policy An exclusion policy prevents pulling changes for records matching the policy conditions. The pull identifies the changes but does not include versions for these records. Private properties A private property is excluded from all Update Sets and pulls. Collisions A collision is detected when the pulled version and the current local version both include modifications to the same record. You must resolve all collisions before you can pull. Previously resolved collisions When you resolved a collision by accepting either the pulled version or local version of a record, the pull remembers your decision and accepts the version that you indicated as a "previously resolved collision". Skipped Pulls skip versions where there is a problem with the version record such as a corrupt or missing version.When to use Team DevelopmentTeam dashboardLimitations on updating records
https://docs.servicenow.com/bundle/kingston-application-development/page/build/team-development/reference/r_PullExceptions.html
2018-05-20T12:10:27
CC-MAIN-2018-22
1526794863410.22
[]
docs.servicenow.com
sqlalchemy-redshift¶ Amazon Redshift dialect for SQLAlchemy. Usage¶ The DSN format is similar to that of regular Postgres: >>> import sqlalchemy as sa >>> sa.create_engine('redshift+psycopg2://[email protected]:5439/database') Engine(redshift+psycopg2://[email protected]:5439/database) See the RedshiftDDLCompiler documentation for details on Redshift-specific features the dialect supports. Releasing¶ To perform a release, you will need to be an admin for the project on GitHub and on PyPI. Contact the maintainers if you need that access. You will need to have a ~/.pypirc with your PyPI credentials and also the following settings: [zest.releaser] create-wheels = yes To perform a release, run the following: python3.6 -m venv ~/.virtualenvs/dist workon dist pip install -U pip setuptools wheel pip install -U tox zest.releaser fullrelease # follow prompts, use semver ish with versions. The releaser will handle updating version data on the package and in CHANGES.rst along with tagging the repo and uploading to PyPI. 0.7.1 (2018-01-17)¶ - Fix incompatibility of reflection code with SQLAlchemy 1.2.0+ (Issue #138) 0.7.0 (2017-10-03)¶ - Do not enumerate search_path with external schemas (Issue #120) - Return constraint name from get_pk_constraint and get_foreign_keys - Use Enums for Format, Compression and Encoding. Deprecate string parameters for these parameter types (Issue #133) - Update included certificate with the transitional ACM cert bundle (Issue #130) 0.6.0 (2017-05-04)¶ - Support role-based access control in COPY and UNLOAD commands (Issue #88) - Increase max_identifier_length to 127 characters (Issue #96) - Fix a bug where table names containing a period caused an error on reflection (Issue #97) - Performance improvement for reflection by caching table constraint info (Issue #101) - Support BZIP2 compression in COPY command (Issue #110) - Allow tests to tolerate new default column encodings in Redshift (Issue #114) - Pull in set of reserved words from Redshift docs (Issue #94 <> _) 0.5.0 (2016-04-21)¶ - Support reflecting tables with foriegn keys to tables in non-public schemas (Issue #70) - Fix a bug where DISTKEY and SORTKEY could not be used on column names containing spaces or commas. This is a breaking behavioral change for a command like __table_args__ = {‘redshift_sortkey’: (‘foo, bar’)}. Previously, this would sort on the columns named foo and bar. Now, it sorts on the column named foo, bar. (Issue #74) 0.4.0 (2015-11-17)¶ - Change the name of the package to sqlalchemy_redshift to match the naming convention for other dialects; the redshift_sqlalchemy package now emits a DeprecationWarning and references sqlalchemy_redshift. The redshift_sqlalchemy compatibility package will be removed in a future release. (Issue #58) - Fix a bug where reflected tables could have incorrect column order for some CREATE TABLE statements, particularly for columns with an IDENTITY constraint. (Issue #60) - Fix a bug where reflecting a table could raise a NoSuchTableErrorin cases where its schema is not on the current search_path(Issue #64) - Add python 3.5 to the list of versions for integration tests. (Issue #61) 0.3.1 (2015-10-08)¶ - Fix breakages to CopyCommand introduced in 0.3.0: Thanks solackerman. (Issue #53) - When format is omitted, no FORMAT AS ... is appended to the query. This makes the default the same as a normal redshift query. - fix STATUPDATE as a COPY parameter 0.3.0 (2015-09-29)¶ - Fix view support to be more in line with SQLAlchemy standards. 
get_view_definition output no longer includes a trailing semicolon and views no longer raise an exception when reflected as Table objects. (Issue #46) - Rename RedShiftDDLCompiler to RedshiftDDLCompiler. (Issue #43) - Update commands (Issue #52) - Expose optional TRUNCATECOLUMNS in CopyCommand. - Add all other COPY parameters to CopyCommand. - Move commands to their own module. - Support inserts into ordered columns in CopyCommand. 0.2.0 (2015-09-04)¶ - Use SYSDATE instead of NOW(). Thanks bouk. (Issue #15) - Default to SSL with hardcoded AWS Redshift CA. (Issue #20) - Refactor of CopyCommand including support for specifying format and compression type. (Issue #21) - Explicitly require SQLAlchemy >= 0.9.2 for ‘dialect_options’. (Issue #13) - Refactor of UnloadFromSelect including support for specifying all documented redshift options. (Issue #27) - Fix unicode issue with SORTKEY on python 2. (Issue #34) - Add support for Redshift DELETEstatements that refer other tables in the WHEREclause. Thanks haleemur. (Issue #35) - Raise NoSuchTableErrorwhen trying to reflect a table that doesn’t exist. (Issue #38) 0.1.2 (2015-08-11)¶ 0.1.0 (2015-05-11)¶ - First version of sqlalchemy-redshift that can be installed from PyPI Contents:
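To complement the connection example in the Usage section above, here is a small sketch of declaring a table with Redshift-specific dialect options; the redshift_distkey and redshift_sortkey keys are the ones referenced in the changelog, the column names are illustrative, and the package must be installed so the dialect arguments are recognized.
import sqlalchemy as sa

metadata = sa.MetaData()

# Declare a table with Redshift storage hints; RedshiftDDLCompiler emits the
# corresponding DISTSTYLE/DISTKEY/SORTKEY clauses when the table is created.
events = sa.Table(
    "events", metadata,
    sa.Column("user_id", sa.Integer),
    sa.Column("created_at", sa.DateTime),
    redshift_diststyle="KEY",
    redshift_distkey="user_id",
    redshift_sortkey="created_at",
)

engine = sa.create_engine("redshift+psycopg2://[email protected]:5439/database")
# metadata.create_all(engine)  # uncomment to emit CREATE TABLE against the cluster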
http://sqlalchemy-redshift.readthedocs.io/en/stable/
2018-05-20T11:26:26
CC-MAIN-2018-22
1526794863410.22
[]
sqlalchemy-redshift.readthedocs.io
Note Please participate in the MPF User Survey 2018. Tutorial step 7: Add your trough¶ At this point you have a flipping machine with a display, but you don’t have a “working” pinball machine since you can’t start or play games. So the next two steps in this tutorial, we’re going to get your first two ball devices set up—your trough and plunger lane. (A ball device is anything in MPF that holds a ball. 1. Read about ball devices¶ In MPF, a “ball device” is any physical mechanism in your machine that holds a ball. You can read more about ball devices in the Ball Devices documentation, which we recommend that you do now to familiarize yourself with the concepts. (You don’t have to understand everything about them for now, just skim through that link so you get the basics.) 2. Add your trough and/or drain¶ Now that you understand what a ball device is, lets add your first ball device, which is going to be trough (or drain) device which collects balls that drain from the playfield and stores them while they’re not in play. Since there are so many different types of ball drain and trough configurations, we can’t write a single tutorial that walks you through all of them. Instead, we have several tutorials. :) So your next step is to visit the Troughs / Ball Drains documentation which lists all the options (with pictures), as well as links to step-by-step guides which walk you through the setup of the particular type of trough or ball drain you have in your machine. 3. Enable debugging so you can see cool stuff in the log¶ Once you have your trough or drain device (or devices, in some cases) set up, add one more setting to that device: debug: yes This setting causes MPF to write detailed debugging information about this ball device to the log file. You have to run MPF with the -v (verbose) option to see this. This will come in handy in the future as you’re trying to debug things, and it’s nice because you can just turn on debugging for the things you’re troubleshooting at that moment which helps keep the debug log from filling up with too much gunk. For example, if you have a modern style trough with a jam switch, you’d add the debug setting like this: ball_devices: bd_trough: ball_switches: s_trough1, s_trough2, s_trough3, s_trough4, s_trough5, s_trough6, s_trough_jam eject_coil: c_trough_eject tags: trough, home, drain jam_switch: s_trough_jam eject_coil_jam_pulse: 15ms 4. Don’t test yet¶ Since the trough or drain device works hand-in-hand with the plunger lane, and since we haven’t set up a plunger lane yet, it’s not worth testing your config at this point. We’ll get the plunger lane set up in the next step. If you’re following along with the example tutorial configurations, at this point there could be some significant divergence between the examples and your machine since the examples are based on a Demolition Man machine with a modern opto-based trough. We still have the examples which you can try, and they’ll work fine because they use the “virtual” platform which doesn’t connect to real hardware. So you can run them and follow along, but just be aware that they might not match your own files exactly. The complete machine config is in the mpf-examples/tutorial folder with the name step7.yaml. You can run this file directly by switching to that folder and then running the following command: C:\mpf-examples\tutorial>mpf both -c step7
http://docs.missionpinball.org/en/latest/tutorial/7_trough.html
2018-05-20T12:13:03
CC-MAIN-2018-22
1526794863410.22
[]
docs.missionpinball.org
method resolve Documentation for method resolve assembled from the following types: class IO::Path (IO::Path) method resolve Defined as: method resolve(IO::Path: :$completely --> IO::Path) Returns a new IO::Path object with all symbolic links and references to the parent directory (..) resolved. This means that the filesystem is examined for each directory in the path, and any symlinks found are followed. # bar is a symlink pointing to "/baz" my $io = "foo/./bar/..".IO.resolve; # now "/" (the parent of "/baz") If :$completely, which defaults to False, is set to a true value, the method will fail with X::IO::Resolve if it cannot completely resolve the path, otherwise, it will resolve as much as possible, and will merely perform cleanup of the rest of the path. The last part of the path does NOT have to exist to :$completely resolve the path. NOTE: Currently (April 2017) this method doesn't work correctly on all platforms, e.g. Windows, since it assumes POSIX semantics.
https://docs.perl6.org/routine/resolve
2018-05-20T11:46:30
CC-MAIN-2018-22
1526794863410.22
[]
docs.perl6.org
Managing Objects in a Versioning-Suspended Bucket Topics You suspend versioning to stop accruing new versions of the same object in a bucket. You might do this because you only want a single version of an object in a bucket, or you might not want to accrue charges for multiple versions. When you suspend versioning, existing objects in your bucket do not change. What changes is how Amazon S3 handles objects in future requests. The topics in this section explain various object operations in a versioning-suspended bucket.
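For reference, suspending versioning on a bucket is a single API call; with the AWS CLI it looks like the following, where the bucket name is a placeholder.
# suspend versioning on an existing bucket
aws s3api put-bucket-versioning \
    --bucket my-bucket \
    --versioning-configuration Status=Suspended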
http://docs.aws.amazon.com/AmazonS3/latest/dev/VersionSuspendedBehavior.html
2017-03-23T06:19:26
CC-MAIN-2017-13
1490218186780.20
[]
docs.aws.amazon.com
The USS Franklin D. Roosevelt (CVB/CVA-42)* (Photo: Christening of the USS Franklin D. Roosevelt , April 29, 1945). The USS Roosevelt (DDG-80) was not the first naval ship named for Franklin Roosevelt. On April 29, 1945, seventeen days after President Roosevelt's death, Eleanor Roosevelt attended the christening of the new Midway -class aircraft carrier USS Roosevelt (CVB/CVA-42). Originally named the USS Coral Sea , the ship was renamed the USS Franklin D. Roosevelt as a memorial to the fallen Commander-in-Chief; it was a fitting tribute because President Roosevelt had been personally involved in the design of the new class of carriers and had a lifelong affection for the Navy. At the ceremony, Eleanor Roosevelt said, "I know that my husband would have felt very keenly and appreciated the thought of having this super-carrier given his name. It's no secret that he loved the Navy and would have liked always to be associated with it. He would watch this ship with great pride. So today I hope this ship will always do its duty in winning the war. I pray God to bless this ship and its personnel and keep them safe, and bring them home victorious" ( NY Times , April 30, 1945). At the time of her completion, the USS Roosevelt was one of the largest naval ships ever built. She was more than 900 feet long, weighed 45,000 tons, and could carry over 80 aircraft; it took over 17 months to build the Roosevelt , at an expense of $90,000,000. The new carrier was not finished before the end of World War II, but she was an integral part of the Fleet during the Cold War and helped enforce American containment policy against Communism. During the Greek Civil War the USS Roosevelt was on hand to aid the Greek government and participated in various other NATO operations. She assisted in the evacuation of American citizens from Cuba during the 1958 revolution and was prepared to do the same in the Middle East during the 1973 Yom Kippur War. The only combat the Roosevelt saw was in Vietnam where she was deployed between June 1966 and February 1967; over 7,000 sorties were carried out off her deck during that time. In addition, the USS Roosevelt was often involved in special training missions; she was the first carrier to successfully launch and receive a jet airplane and took part in record-breaking flights with both long-range fighters and helicopters. The USS Roosevelt was decommissioned on September 30, 1977. *The USS Roosevelt was initially designated as a "Battle" aircraft carrier (CVB) but during the Korean War (around 1952) carriers took on an attack role and the Roosevelt was designated as an "Attack" aircraft carrier (CVA). When the Roosevelt was decommissioned, she carried the designation "CV," signifying her ability to handle a variety of missions.
http://docs.fdrlibrary.marist.edu/ussroos4.html
2017-03-23T06:08:22
CC-MAIN-2017-13
1490218186780.20
[]
docs.fdrlibrary.marist.edu
Macro Editor¶ Macros are a quick way to customize and extend Canopy. They can help you to automate tasks which are frequent or complicated. Starting with the Record Macro command in the Tools menu, you can record a series of actions and then modify the recorded macro to suit your needs. The Macro Editor window provides a convenient way to make such changes. For more information about writing and recording macros, see Recording, Editing, and Writing Macros.
http://docs.enthought.com/canopy/quick-start/macro_editor.html
2017-03-23T06:07:41
CC-MAIN-2017-13
1490218186780.20
[array(['../_images/macro_editor.png', '../_images/macro_editor.png'], dtype=object) ]
docs.enthought.com
External Programs You may need to run the external tools (ex : compiler, interpreter or web browser) from Notepad++ (via Run dialog) by passing the current edited document as argument. To do so, you have to use environment variables. The NppExec plugin will give you plenty of extra flexibility. Contents File level environment variables The usage of environment variable is : $( ENVIRONMENT_VARIABLE ) Say the current file you edit in Notepad++ is: E:\my Web\main\welcome.html There are more environment variables containing information on the current session: FULL_CURRENT_PATH : E:\my Web\main\welcome.html CURRENT_DIRECTORY : E:\my Web\main\ FILE_NAME : welcome.html NAME_PART : welcome EXT_PART : html SYS.<var> : system environment variable, e.g. $(SYS.PATH) Note that you should put the double quote around the path environment variables : "$(ENVIRONMENT_VARIABLE)" since they may contain some white space. Examples firefox "$(FULL_CURRENT_PATH)" iexplore "$(FULL_CURRENT_PATH)" These 2 user commands are also included in npp.3.0.installer.exe (or later version). You can launch them by Ctrl+Alt+Shift+X and Ctrl+Alt+Shift+I respectively. Document level environment variables There are still more variables: CURRENT_WORD : it gives the word(s) you selected in Notepad++. CURRENT_LINE : it gives the current line number where your cursor position is in Notepad++. CURRENT_COLUMN : it gives the current column number where your cursor position is in Notepad++. NPP_DIRECTORY : this variable contains the absolute path of Notepad++'s directory. And even more if you go through the NppExec plugin: PLUGINS_CONFIG_DIR : full path of the plugins configuration directory #N : full path of the Nth opened document (N=1,2,3...) #0 : full path to notepad++.exe LEFT_VIEW_FILE : current file path-name in primary (left) view RIGHT_VIEW_FILE : current file path-name in secondary (right) view Knowing the plugin directory enables you to call any plugin function by passing the plugin name, function name and possibly arguments to rundll.exe. Or to unload a rogue plugin using regsvr /u. examples $(NPP_DIRECTORY)\notepad++.exe $(CURRENT_WORD) For the examples 1 ~ 3, we pass the argument URL + the current selected word(s) to the default browser in order to perform the search on Internet. Whereas for the 4th example, it'll be useful if you want to open a file from current document. Consider a php file which contains a line: include("../myFuncs.php"); Selecting ../myFuncs.php then typing the hot key you assigne to this command will open myFuncs.php in Notepad++, of course the mentioned file should exist in the indcated path. The variable CURRENT_WORD brings you a flexible solution to configure your external commands Running a command When you use NppExec to run a command, the following variables are set for the command to use, before it is actually issued: CWD : current working directory of NppExec (use "cd" to change it) ARGC : number of arguments passed to the NPP_EXEC command ARGV : all arguments passed to the NPP_EXEC command after the script name ARGV[0] : script name - first parameter of the NPP_EXEC command ARGV[N] : Nth argument (N=1,2,3...) RARGV : all arguments in reverse order (except the script name) RARGV[N] : Nth argument in reverse order (N=1,2,3...) INPUT : this value is set by the 'inputbox' command INPUT[N] : Nth field of the $(INPUT) value (N=1,2,3...) 
The external process, if aware of NppExec, can set some variables as well (new in v0.3.1): OUTPUT : this value can be set by the child process, see npe_console v+ OUTPUT1 : first line in $(OUTPUT) OUTPUTL : last line in $(OUTPUT)
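Putting several of these variables together, a Run-dialog command that executes the current file with Python from its own directory might look like the line below (assuming python is on your PATH).
cmd /K cd /D "$(CURRENT_DIRECTORY)" && python "$(FILE_NAME)"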
http://docs.notepad-plus-plus.org/index.php/External_Programs
2017-03-23T06:07:46
CC-MAIN-2017-13
1490218186780.20
[]
docs.notepad-plus-plus.org
To delete an alert or many alerts, click the selection check-box for all alerts you would like to delete. Then click the "Delete Selected" button at the top of the screen. To report a problem with this documentation or provide feedback, please contact the DIG mailing list. © 2008-2015 GPLS and others. The Evergreen Project is a member of the Software Freedom Conservancy.
http://docs.evergreen-ils.org/2.7/_deleting_an_address_alert.html
2017-03-23T06:12:17
CC-MAIN-2017-13
1490218186780.20
[]
docs.evergreen-ils.org
- 1. Ensure that the DaemonSet update strategy is “OnDelete” - 2. Upgrade the Portworx spec - 3. Upgrade Portworx pods This guide walks through upgrading Portworx deployed as a DaemonSet in a Kubernetes cluster. In the current version, Portworx recommends following an update strategy of ‘OnDelete’. With the ‘OnDelete’ update strategy, after you update a DaemonSet template, new DaemonSet pods will only be created when you manually delete old DaemonSet pods. Doing so gives end users more control on when they are ready to upgrade Portworx on a particular node. Users are expected to migrate application pods using Portworx volumes to another node before deleting the old Portworx pod. Follow the below sequence to upgrade Portworx in your cluster. 1. Ensure that the DaemonSet update strategy is “OnDelete” - Check the current update strategy using command: $ kubectl get ds portworx -n kube-system -o yaml | grep -A 3 updateStrategy: - If the updateStrategy type is RollingUpdate, change it to OnDelete. - Edit the spec using command: $ kubectl edit ds portworx -n kube-system - This will open the spec in an editor. Change updateStrategy to OnDelete and save the file. This section in your spec should look like below: updateStrategy: type: OnDelete 2. Upgrade the Portworx spec - Change the image of the Portworx DaemonSet - Set the image with command: $ kubectl set image ds portworx portworx=portworx/px-enterprise:1.2.9 -n kube-system - Alternatively, you can also change the image in the DaemonSet spec file and apply it using $ kubectl apply -f <px-spec.yaml>. Update the ClusterRole permissions in the Portworx spec using the below: cat <<EOF | kubectl apply -f - kind: ClusterRole apiVersion: rbac.authorization.k8s.io/v1alpha1 metadata: name: node-get-put-list-role rules: - apiGroups: [""] resources: ["nodes"] verbs: ["get", "update", "list"] - apiGroups: [""] resources: ["pods"] verbs: ["get", "list"] EOF 3. Upgrade Portworx pods It is not recommended to delete the Portworx pod while an application is actively issuing I/O. This can induce race conditions in docker causing it to hang. The following procedure should be followed: - Cordon the node where you want to upgrade Portworx: $ kubectl cordon <node-name> - Delete application pods running on this node that are using Portworx volumes: $ kubectl delete pod <pod-name> - Since application pods are expected to be managed by a controller like Deployment or StatefulSet, Kubernetes will spin up a new replacement pod on another node. - Delete the Portworx pod running on this node. This will start a new Portworx pod on this node with the new version you set above. (Note: Cordoning a Kubernetes node doesn’t affect DaemonSet pods.) - A new Portworx pod with the new version will be initiated on this node. This pod will stay in the initializing state. - Reboot the node. (The Portworx 1.2.9 release requires a reboot of the host to perform the upgrade of our kernel driver.) - Uncordon the node once it comes up. In the 1.2.10 release, Portworx will be automating parts of the above procedure by closer integration with the Kubernetes API.
https://docs.portworx.com/scheduler/kubernetes/upgrade.html
2017-08-16T23:37:26
CC-MAIN-2017-34
1502886102757.45
[]
docs.portworx.com
Properties

- Value: Used to get or set the control value. It accepts an object of type DateTime, where only the time part is taken under consideration.
- Culture: Determines the language of the drop down and the editable area. From here you can control whether the format is 12 hours ("en-US") or 24 hours ("en-UK").
- CloseButtonText: Gets or sets the text of the button in the drop down.
- RowHeight: Gets or sets the height of the rows in the hour/minutes tables in the drop down.
- ColumnsCount: Gets or sets the number of the columns in the hour/minutes tables in the drop down.
- HeadersHeight: Gets or sets the size of the headers (clock, hours and minutes headers) in the drop down.
- ButtonPanelHeight: Sets or gets the height of the buttons panel at the bottom of the drop down.
- TableWidth: Gets or sets the width of the hours and minutes tables in the drop down.
- ClockPosition: Gets or sets the position of the clock according to the tables. Currently possible options are ClockBeforeTables, ClockAboveTables and HideClock.
- TimeTables: Allows you to choose between one table or two tables to display both hours and minutes.
- Step: Gets or sets the minutes step.
- ReadOnly: Makes the control read only - you can't type in it or change its value from the UI, and the drop down cannot be opened.
- NullText: This property determines the text displayed in RadTimePicker when the Value is null.
- MaxValue: Gets or sets the maximal time value assigned to the control.
- MinValue: Gets or sets the minimal time value assigned to the control.
- TimePickerElement: Gives access to the RadTimePickerElement.

Events

- ValueChanging: Cancelable event, occurs when the value is changing.
- ValueChanged: Occurs when the value is changed.
- TimeCellFormatting: Occurs when a cell is being created/shown. Used for formatting the cell's appearance.

See Also
- Structure
- Localization
- Free Form Date Time Parsing
http://docs.telerik.com/devtools/winforms/editors/timepicker/structure-properties-and-events
2017-08-16T23:48:03
CC-MAIN-2017-34
1502886102757.45
[]
docs.telerik.com
Overview

Thank you for choosing Telerik RadSparkline! The RadSparkline control is an information graphic. Functionality-wise, the RadSparkline control is comparable to RadChart; however, in order to maximize performance, the sparklines do not utilize as many visual indicators and do not display x or y axes or multiple axes. In order to use the RadSparkline control in your projects you have to add references to Telerik.Windows.Controls.Charting.dll, Telerik.Windows.Controls.dll, Telerik.Windows.Controls.DataVisualization.dll and Telerik.Windows.Data.dll.

Types of Sparklines

Currently, the Sparkline control has the following subtypes:
- Line, which is of type RadLinearSparkline. This type of sparkline represents a set of points, connected by a line.
- Scatter, which is of type RadScatterSparkline. This represents the data points as a set of scattered, separate points.
- Area, which is of type RadAreaSparkline. The area represents a series of data points, connected by a line, as well as the space defined by the line and the median (which is usually the 0-value axis; however, the value may be specified to be different from 0).
- Column, which is of type RadColumnSparkline. This type is very similar to a bar and is a precise match of the value of each data point. Its vertical direction is an indicator of the value - positive or negative.
- Win/Loss, which is of type RadWinLossSparkline. This type of sparkline is very similar to the column type, with the only difference being that all data points, or bars, are of equal size. There is no visual indicator of the precise value of each data point, but simply of its positive or negative nature.
http://docs.telerik.com/devtools/wpf/controls/radsparkline/radsparkline_overview
2017-08-16T23:48:06
CC-MAIN-2017-34
1502886102757.45
[array(['images/sparklines_wpf.png', 'sparklines wpf'], dtype=object)]
docs.telerik.com
awk Portability Issues

- Assigning a value to TEXTDOMAIN does not cause problems, because TEXTDOMAIN is not special in other awk implementations.
- Non-GNU versions of awk treat marked strings as the concatenation of a variable named _ with the string following it. Typically, the variable _ has the null string ("") as its value, leaving the original string constant as the result.
- By supplying dummy replacement functions for dcgettext(), dcngettext(), and bindtextdomain(), the awk program can still run on other implementations, although its messages are not translated.
- The use of positional specifications in printf or sprintf() is not portable. To support gettext() at the C level, many systems' C versions of sprintf() do support positional specifiers. But it works only if enough arguments are supplied in the function call. Many versions of awk pass printf formats and arguments unchanged to the underlying C library version of sprintf(), but only one format and argument at a time. What happens if a positional specification is used is anybody's guess. However, because the positional specifications are primarily for use in translated format strings, and because non-GNU awks never retrieve the translated string, this should not be a problem in practice.
http://docs.ruanjiadeng.com/gnu/gawk/I18N-Portability.html
2017-08-16T23:34:35
CC-MAIN-2017-34
1502886102757.45
[]
docs.ruanjiadeng.com
Information for "Sample Data Specifications for 1.6" Basic information Display titleSample Data Specifications for 1.6 Default sort keySample Data Specifications for 1.6 Page length (in bytes)4,223 Page ID6411:49, 12 December 2009 Latest editorChris Davenport (Talk | contribs) Date of latest edit12:27, 12 December 2009 Total number of edits4 Total number of distinct authors2 Recent number of edits (within past 30 days)0 Recent number of distinct authors0 Retrieved from ‘’
https://docs.joomla.org/index.php?title=Sample_Data_Specifications_for_1.6&action=info
2015-06-30T00:14:41
CC-MAIN-2015-27
1435375090887.26
[]
docs.joomla.org
Legacy Public Cloud Guides The legacy guides in this section may be out of date. They cover using Ansible with a range of public cloud platforms. They explore particular use cases in greater depth and provide a more “top-down” explanation of some basic features. Guides for using public clouds are moving into collections. We are migrating these guides into collections. Please update your links for the following guides: Amazon Web Services Guide
https://docs.ansible.com/ansible/5/scenario_guides/cloud_guides.html
2022-05-16T08:09:09
CC-MAIN-2022-21
1652662510097.3
[]
docs.ansible.com
Security Disclosures

Background
Continual is committed to ensuring the safety and security of our customers and employees.

Legal Posture
Continual will not engage in legal action against individuals who submit vulnerability reports through our Vulnerability Reporting inbox. We openly accept reports for the currently listed Continual products. We agree not to pursue legal action against individuals who:
- Engage in testing of systems/research without harming Continual or its customers.
- Engage in vulnerability testing within the scope of our vulnerability disclosure program.
- Test on products without affecting customers, or receive permission/consent from customers before engaging in vulnerability testing against their devices/software, etc.
- Adhere to the laws of their location and the location of Continual. For example, violating laws that would only result in a claim by Continual (and not a criminal claim) may be acceptable as Continual is authorizing the activity (reverse engineering or circumventing protective measures) to improve its system.
- Refrain from disclosing vulnerability details to the public before a mutually agreed-upon timeframe expires.

Responsible Disclosure Policy

How to Submit a Vulnerability
To submit a vulnerability report to Continual's Security Team, please email [email protected].

Preference, Prioritization, and Acceptance Criteria
We will use the following criteria to prioritize and triage submissions.

What we would like to see from you:
- Well-written reports in English will have a higher probability of resolution.

What you can expect from Continual:
- If necessary, Continual may bring in a neutral third party to assist in determining how best to handle the vulnerability.
https://docs.continual.ai/security-disclosures/
2022-05-16T09:31:47
CC-MAIN-2022-21
1652662510097.3
[]
docs.continual.ai
What is GDNative?

Introduction
GDNative is a Godot-specific technology that lets the engine interact with native shared libraries at run-time. You can use it to run native code without compiling it with the engine.

Differences between GDNative and C++ modules
You can use both GDNative and C++ modules to run C or C++ code in a Godot project. They also both allow you to integrate third-party libraries into Godot. The one you should choose depends on your needs.

Advantages of GDNative
Unlike modules, GDNative doesn't require compiling the engine's source code, making it easier to distribute your work. It gives you access to most of the API available to GDScript and C#, allowing you to code game logic with full control regarding performance. It's ideal if you need high-performance code you'd like to distribute as an add-on in the asset library. Also:
- GDNative is not limited to C and C++. Thanks to third-party bindings, you can use it with many other languages.
- You can use the same compiled GDNative library in the editor and exported project. With C++ modules, you have to recompile all the export templates you plan to use if you require its functionality at run-time.
- GDNative only requires you to compile your library, not the whole engine. That's unlike C++ modules, which are statically compiled into the engine. Every time you change a module, you need to recompile the engine. Even with incremental builds, this process is slower than using GDNative.

Advantages of C++ modules
We recommend C++ modules in cases where GDNative isn't enough:
- C++ modules provide deeper integration into the engine. GDNative's access is limited to what the scripting API exposes.
- You can use C++ modules to provide additional features in a project without carrying native library files around. This extends to exported projects.
- C++ modules are supported on all platforms. In contrast, GDNative isn't supported on HTML5 and the Universal Windows Platform (UWP) yet.
- C++ modules can be faster than GDNative, especially when the code requires a lot of communication through the scripting API.

Supported languages
The Godot developers officially support the following language bindings for GDNative:
- C++ (tutorial)
Note: There are no plans to support additional languages with GDNative officially. That said, the community offers several bindings for other languages (see below).
The bindings below are developed and maintained by the community:
Note: Not all bindings mentioned here may be production-ready. Make sure to research options thoroughly before starting a project with one of those. Also, double-check whether the binding is compatible with the Godot version you're using.

Version compatibility
Unlike Godot itself, GDNative has stricter version compatibility requirements as it relies on low-level ptrcalls to function. GDNative add-ons compiled for a given Godot version are only guaranteed to work with the same minor release series. For example, a GDNative add-on compiled for Godot 3.4 will only work with Godot 3.4, 3.4.1, 3.4.2… but not Godot 3.3 or 3.5.
https://docs.godotengine.org/ja/stable/tutorials/scripting/gdnative/what_is_gdnative.html
2022-05-16T09:26:44
CC-MAIN-2022-21
1652662510097.3
[]
docs.godotengine.org
Mythic_CLI - This holds all of the code for the PyPi package, mythic, that you can use to script up actions.
Mythic_Translator_Container - This holds all of the code for the PyPi package, mythic_translator_container, that you can use to build your own translation container.
Mythic_PayloadType_Container - This holds all of the code for the PyPi package, mythic_payloadtype_container, that you can use to create your own payload type docker image or when turning a VM into your own payload type container.
Mythic_C2Profile_Container - This holds all of the code for the PyPi package, mythic_c2_container, that you can use to create your own c2 profile docker image or when turning an arbitrary host into a c2 profile service.
Mythic_DockerTemplates - This holds all of the code and resources that are used to make all of the Docker images hosted on DockerHub. This is helpful if you want to see what's actually happening for a specific container or you want to use one of these as a starting point for your own containers.

In Mythic/Payload_Types/, make a new folder that matches the name of your agent. Inside of this folder make a file called Dockerfile. This is where you have a choice to make - either use the default Payload Type docker container as a starting point and make your additions from there, or use a different container base. You can start your Dockerfile off with one of the provided base images:
- python:3.8-buster - only has python3.8 installed
- mono:latest - with python 3.8.6 manually installed, along with the System.Management.Automation.dll added in (v2 and v4)
- karalabe/xgo-latest - with python 3.8.6 manually installed
The mythic_payloadtype_container PyPi versions can be found here: Current PayloadType Versions.

The Mythic/Payload_Types/[agent name] folder is mapped to /Mythic in the docker container. Editing the files on disk results in the edits appearing in the docker container and vice versa. Your agent code goes in Mythic/Payload_Types/[agent name]/agent_code/. You can have any folder structure or files you want here. The Mythic/Payload_Types/[agent name]/mythic folder contains all information for interacting with Mythic. Inside of the mythic folder there's a subfolder agent_functions where all of your agent-specific building/command information lives. For a reference, see the Example_Payload_Type folder.

Between mythic-cli and the one docker-compose file, the dynaconf and mythic-payloadtype-container packages are the ones responsible for the Mythic/.env configuration being applied to your container. The rabbitmq_password is shared over to your agent as well. By default, this is a randomized value stored in the Mythic/.env file and shared across containers, but you will need to manually share this over with your agent either via an environment variable (MYTHIC_RABBITMQ_PASSWORD) or by editing the rabbitmq_password field in your rabbitmq_config.json file.

Agents can also be installed from a repository (mythic-cli install github <url>). You can check if your docker-compose file is aware of your agent via mythic-cli payload list. If it's not aware, you can simply do mythic-cli payload add <payload name>. Now you can start just that one container via mythic-cli payload start <payload name>. You can check on everything with sudo ./mythic-cli status, and use sudo ./mythic-cli logs payload_type_name to see the output from the container to potentially troubleshoot. An alternative to editing the rabbitmq_config.json file in step 6 is to use environment variables. These can be exported from your mythic-cli and .env file via: sudo ./mythic-cli config payload.

To turn a VM into your payload type container, copy your agent folder onto the host (e.g. /pathA), along with the contents of the Example_Payload_Type folder. Essentially, your /pathA path will be the new Payload_Types/[agent name] folder.

Edit the rabbitmq_config.json with the parameters you need:
- The host value should be the IP address of the main Mythic install.
- The name value should be the name of the payload type (this is tied into how the routing is done within rabbitmq). For Mythic's normal docker containers, this is set to hostname because the hostname of the docker container is set to the name of the payload type. For this case though, that might not be true. So, you can set this value to the name of your payload type instead (this must match your agent name exactly).
- The container_files_path should be the absolute path to the folder in step 3 (/pathA in this case).
Set your PYTHONPATH variable, adding your /pathA and /pathA/mythic. Then run python3.8 mythic_service.py and now you should see this container pop up in the UI. If you are relying on the default rabbitmq_password, then you need to make sure that the password from Mythic/.env after you start Mythic for the first time is copied over to your VM. You can either add this to your rabbitmq_config.json file or set it as an environment variable (MYTHIC_RABBITMQ_PASSWORD).
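As a rough illustration of the rabbitmq_config.json edits described above, the Python sketch below writes a minimal config file with the documented fields. The server IP, agent name and agent path are placeholder assumptions, and your mythic_payloadtype_container version may expect additional keys, so treat this as a starting point rather than a definitive template.

```python
# Sketch: generate a rabbitmq_config.json for a payload type running on a separate VM.
# All concrete values below are placeholders (assumptions), not taken from the docs above.
import json
import os

config = {
    "host": "10.0.0.5",                # IP address of the main Mythic install (placeholder)
    "name": "my_agent",                # must exactly match your payload type / agent name
    "container_files_path": "/pathA",  # absolute path to your agent folder on this VM
    # Prefer passing the password via the MYTHIC_RABBITMQ_PASSWORD environment variable;
    # fall back to embedding the value copied from Mythic/.env if you must.
    "rabbitmq_password": os.environ.get("MYTHIC_RABBITMQ_PASSWORD", "copy-from-Mythic/.env"),
}

with open("rabbitmq_config.json", "w") as f:
    json.dump(config, f, indent=2)

print("Wrote rabbitmq_config.json for payload type:", config["name"])
```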
https://docs.mythic-c2.net/customizing/payload-type-development
2022-05-16T07:34:24
CC-MAIN-2022-21
1652662510097.3
[]
docs.mythic-c2.net
Mace Assassin for Genesis 3 Females is the new sci-fi outfit from Midnight_stories. An elegant and simplistic design. It has a Full Body Suit and Shoes, plus left and right hand guns. There are 8 material presets and 8 shader presets so you can mix and match. The guns are wearable presets, so drag and drop and she's gripping the gun, plus there are 16 material and 8 shader presets for them. There are also 10 full body wearable presets for easy loading. There are 4K and 8K maps for super details. This is future proof, so it will fit any shape. And as an added free bonus you get a full set of gun poses to match. Great for all your sci-fi and assassin renders.
http://docs.daz3d.com/doku.php/public/read_me/index/43937/start
2022-05-16T09:38:18
CC-MAIN-2022-21
1652662510097.3
[]
docs.daz3d.com
Catalyze Deployment

Travis CI can automatically deploy to Catalyze after a successful build. Before configuring your .travis.yml you need to:

1. Find your Catalyze git remote:
   - Make sure your Catalyze environment is associated.
   - Get the git remote by running git remote -v from within the associated repository.
   - Edit your .travis.yml:

     deploy:
       provider: catalyze
       target: "ssh://[email protected]:2222/app1234.git"

2. Set up a deployment key to Catalyze for Travis CI:
   - Install the Travis CI command line client.
   - Get the public SSH key for your Travis CI project and save it to a file by running travis pubkey > travis.pub
   - Add the key as a deploy key using the catalyze command line client within the associated repo. For example: catalyze deploy-keys add travisci ./travis.pub code-1, where code-1 is the name of your service.

3. Set up Catalyze as a known host for Travis CI:
   - List your known hosts by running cat ~/.ssh/known_hosts.
   - Find and copy the line from known_hosts that includes the git remote found in Step 1. It'll look something like [git.catalyzeapps.com]:2222 ecdsa-sha2-nistp256 BBBB12abZmKlLXNo...
   - Update your before_deploy step in .travis.yml to update the known_hosts file:

     before_deploy: echo "[git.catalyzeapps.com]:2222 ecdsa-sha2-nistp256 BBBB12abZmKlLXNo..." >> ~/.ssh/known_hosts

Deploying a subset of your Files

To only deploy the build folder, for example, set skip_cleanup: true and path: "build":

  deploy:
    provider: catalyze
    target: "ssh://[email protected]:2222/app1234.git"
    skip_cleanup: true
    path: "build"
https://docs.travis-ci.com/user/deployment/catalyze/
2017-08-16T21:40:57
CC-MAIN-2017-34
1502886102663.36
[]
docs.travis-ci.com
Revision history of "JElementMenu:::fetchElement/1.6 (content was: "__NOTOC__ =={{JVer|1.6}} JElementMenu::fetchElement== ===Description=== {{Description:JElementMenu::fetchElement}} <span class="editsection" style="font-size:76..." (and the only contributor was "Doxiki2"))
https://docs.joomla.org/index.php?title=JElementMenu::fetchElement/1.6&action=history
2015-07-28T14:34:25
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Difference between revisions of "JUserHelper::getCryptedPassword"CryptedPassword Description Formats a password using the current encryption. Description:JUserHelper::getCryptedPassword [Edit Descripton] public static function getCryptedPassword ( $plaintext $salt= '' $encryption= 'md5-hex' $show_encrypt=false ) - Returns string The encrypted password. - Defined on line 296 of libraries/joomla/user/helper.php - Referenced by See also JUserHelper::getCryptedPassword source code on BitBucket Class JUserHelper Subpackage User - Other versions of JUserHelper::getCryptedPassword SeeAlso:JUserHelper::getCryptedPassword [Edit See Also] User contributed notes <CodeExamplesForm />
https://docs.joomla.org/index.php?title=API17:JUserHelper::getCryptedPassword&diff=cur&oldid=58002
2015-07-28T14:46:59
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Difference between revisions of "Getting Started Page Index/2.5" From Joomla! Documentation Revision as of 08:20, 23 September 2012 (view source)Tom Hutchison (Talk | contribs)m (Doing more, remove 2.5 reference)← Older edit Revision as of 07:02, 4 October 2012 (view source) Tom Hutchison (Talk | contribs) m (change category)Newer edit → Line 60: Line 60: --> --> <noinclude> <noinclude> −[[Category:Navigation templates]]+[[Category:Navigation boxes]] {{Navbox {{Navbox |name=Getting_Started_Page_Index/2.5 |name=Getting_Started_Page_Index/2.5 Revision as of 07:02, 4 October 2012 Index to the other documents in this series Start Here: Introduction to Getting Started with Joomla! for version 2.5 Hands-on a Joomla! site: Use a site that has already been set up. Install a copy of Joomla! on your own computer, known as localhost Use the Demo on the Joomla! site. Hands-on adding and altering Articles: Hands-on how to begin to edit an Article. Hands-on how to create a new Article. Hands-on altering and manipulating Articles: Hands-on adding links to other pages Hands-on adding a table Hands-on adding a picture Hands-on splitting a long article Manipulating Articles using the Front-end Hands-on: Starting to manage a Joomla! site Doing and learning more: Do more with Joomla! Beyond the basics: doing more Joomla! Books, links and helpful resources Retrieved from ‘’ Category: Navigation boxes
https://docs.joomla.org/index.php?title=Getting_Started_Page_Index/2.5&diff=76174&oldid=75662
2015-07-28T14:33:46
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Revision history of "Why does the formatting disappear after enabling SEF URLs?" View logs for this page Diff selection: Mark the radio boxes of the revisions to compare and hit enter or the button at the bottom. Legend: (cur) = difference with latest revision, (prev) = difference with preceding revision, m = minor edit.
https://docs.joomla.org/index.php?title=Why_does_the_formatting_disappear_after_enabling_SEF_URLs%3F&action=history
2015-07-28T14:47:58
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Pizza Bugs and Fun 2 From Joomla! Documentation Contents)
https://docs.joomla.org/index.php?title=Pizza_Bugs_and_Fun_2&oldid=63034
2015-07-28T14:23:51
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Difference between revisions of "Glossary" From Joomla! Documentation Latest revision as of 15:25, 24 February 2014 Translation Hints
https://docs.joomla.org/index.php?title=Talk:Glossary&diff=cur&oldid=62283
2015-07-28T13:25:29
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Information for "Version 2.5.5 FAQ" Basic information Display titleCategory:Version 2.5.5 FAQ Default sort keyVersion 2.5.5 FAQ Page length (in bytes)577 Page ID2419310:26, 18 June 2012 Latest editorTom Hutchison (Talk | contribs) Date of latest edit13:20, 13 September 2012 Total number of edits3 Total number of distinct authors3 Recent number of edits (within past 30 days)0 Recent number of distinct authors0 Page properties Transcluded template (1)Template used on this page: Template:Cat info (view source) Retrieved from ‘’
https://docs.joomla.org/index.php?title=Category:Version_2.5.5_FAQ&action=info
2015-07-28T15:00:00
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Changes related to "JDOC:Docs Wiki Roadmap" ← JDOC:Docs Wiki Roadmap This is a list of changes made recently to pages linked from a specified page (or to members of a specified category). Pages on your watchlist are bold. 27 July 2015 03:38JDOC:Documentation Translators (diff; hist; +15) B2z 22 July 2015 08:39JDOC:Documentation Translators (diff; hist; +25) Welkson Ramos 21 July 2015 20:26JDOC:Documentation Translators (diff; hist; +20) Mrs.siam
https://docs.joomla.org/index.php?title=Special:RecentChangesLinked&limit=500&target=JDOC%3ADocs_Wiki_Roadmap
2015-07-28T14:00:44
CC-MAIN-2015-32
1438042981921.1
[array(['/extensions/CleanChanges/images/Arr_.png', None], dtype=object) array(['/extensions/CleanChanges/images/showuserlinks.png', 'Show user links Show user links'], dtype=object) array(['/extensions/CleanChanges/images/Arr_.png', None], dtype=object) array(['/extensions/CleanChanges/images/showuserlinks.png', 'Show user links Show user links'], dtype=object) array(['/extensions/CleanChanges/images/Arr_.png', None], dtype=object) array(['/extensions/CleanChanges/images/showuserlinks.png', 'Show user links Show user links'], dtype=object) ]
docs.joomla.org
Changes related to "International Enhancements for Version 1.6" ← International Enhancements for Version 1.6 This is a list of changes made recently to pages linked from a specified page (or to members of a specified category). Pages on your watchlist are bold. No changes during the given period matching these criteria.
https://docs.joomla.org/Special:RecentChangesLinked/International_Enhancements_for_Version_1.6
2015-07-28T14:09:38
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Can I add registration fields? From Joomla! Documentation Since Joomla! 1.6 you can add extended registration fields by using a profile plugin. A sample profile plugin is included in the standard installation, offering a number of commonly requested fields such as mailing address, telephone number and date-of-birth. See: What is a profile plugin? See also: Creating a profile plugin
https://docs.joomla.org/index.php?title=Can_I_add_registration_fields%3F&direction=next&oldid=85842
2015-07-28T14:13:08
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Difference between revisions of "Developer Tutorials Project/Opentask" From Joomla! Documentation JDOC:Developer Tutorials Project Revision as of 10:33, 11 February 2013 To Do List These are a few 'to do's' to get this project started - Organisation of project - Need maintainers of example files on GitHub - Dev sign-ups use talk page - Identify current articles to include in Dev Tutorials Project
https://docs.joomla.org/index.php?title=JDOC:Developer_Tutorials_Project/Opentask&diff=81060&oldid=81058
2015-07-28T14:01:43
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Source code: Lib/http/cookies.py

See also: Module http.cookiejar — Cookie handling for HTTP clients.

class http.cookies.SimpleCookie([input]): This class derives from BaseCookie and overrides value_decode() and value_encode() to be the identity and str() respectively.

BaseCookie.value_decode(val): Return a decoded value from a string representation. Return value can be any type. This method does nothing in BaseCookie — it exists so it can be overridden.

Morsel.value: The value of the cookie.
Morsel.coded_value: The encoded value of the cookie — this is what should be sent.
Morsel.key: The name of the cookie.
Morsel.set(key, value, coded_value): Set the key, value and coded_value attributes. Deprecated since version 3.5: The undocumented LegalChars parameter is ignored and will be removed in a future version.
Morsel.isReservedKey(K): Whether K is a member of the set of keys of a Morsel.
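For a concrete feel for these classes, here is a short, self-contained example using the module's SimpleCookie subclass; it shows the key, value and coded_value attributes of the resulting Morsel objects and the Set-Cookie style output (the cookie names and values are arbitrary examples):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "12345"          # creates a Morsel for the "session" cookie
cookie["session"]["path"] = "/"      # set a cookie attribute on the Morsel
cookie["session"]["httponly"] = True

morsel = cookie["session"]
print(morsel.key, morsel.value, morsel.coded_value)   # -> session 12345 12345

# Render in the format used for HTTP Set-Cookie headers
print(cookie.output())

# Parse a Cookie header value received from a client
received = SimpleCookie()
received.load("session=12345; theme=dark")
print(received["theme"].value)                        # -> dark
```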
https://docs.python.org/dev/library/http.cookies.html
2015-07-28T13:21:22
CC-MAIN-2015-32
1438042981921.1
[]
docs.python.org
Difference between revisions of "Page Class Suffix" From Joomla! Documentation Revision as of 23:23, 24 February 2014 Usage If you enter a Page Class Suffix with a leading space, a new CSS class will be created. If the parameter does not have a leading space, the CSS classes associated with this Menu Item (for example, "componentheading") will be modified. The first method is normally preferred, since then you don't break any of the existing styling for the page elements, and you only need to add new CSS code for the new styling. If you don't use a leading space, you will need to copy all of the styling code for the Menu Item classes and duplicate it for the new CSS class before making your CSS changes. - See the tutorial Using Class Suffixes.
https://docs.joomla.org/index.php?title=Page_Class_Suffix&diff=108523&oldid=108088&rcid=&curid=211
2015-07-28T13:42:49
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
django-terms

Adds a definition or a link for specialized terms, site-wide.

Live example
(To learn more about the connection between django-terms and Criminocorpus, read the Bounty paragraph)

Table of contents
- Requirements
- Installation
- Usage
- Settings
- Troubleshooting
- Translations
- Bounty & donations
- Getting help
https://django-terms.readthedocs.org/en/latest/
2015-07-28T13:17:32
CC-MAIN-2015-32
1438042981921.1
[array(['https://raw.github.com/BertrandBordage/django-terms/master/example_project/screenshot.png', 'https://raw.github.com/BertrandBordage/django-terms/master/example_project/screenshot.png'], dtype=object) array(['https://www.openhub.net/p/django-terms/widgets/project_thin_badge.gif', 'https://www.openhub.net/p/django-terms/widgets/project_thin_badge.gif'], dtype=object) array(['https://pypip.in/v/django-terms/badge.png', 'https://pypip.in/v/django-terms/badge.png'], dtype=object) array(['https://pypip.in/d/django-terms/badge.png', 'https://pypip.in/d/django-terms/badge.png'], dtype=object) array(['https://travis-ci.org/BertrandBordage/django-terms.png', 'https://travis-ci.org/BertrandBordage/django-terms.png'], dtype=object) array(['https://coveralls.io/repos/BertrandBordage/django-terms/badge.png', 'https://coveralls.io/repos/BertrandBordage/django-terms/badge.png'], dtype=object) ]
django-terms.readthedocs.org
Creating a basic index file From Joomla! Documentation.. temlate is in then the css files will go in). Body Section . Module Positions. End Finish it off - one last bit: </html> View the full source code of this template. Custom Images. Custom CSS.
https://docs.joomla.org/index.php?title=Creating_a_basic_index_file&direction=prev&oldid=79157
2015-07-28T14:17:37
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Revision history of "JDocumentHTML::setBufferBuffer/1.6 (content was: "__NOTOC__ =={{JVer|1.6}} JDocumentHTML::setBuffer== ===Description=== Set the contents a document include. {{Description:JDocumentHTML::setBuffer}} <span class=..." (and the only contributor was "Doxiki2"))
https://docs.joomla.org/index.php?title=JDocumentHTML::setBuffer/1.6&action=history
2015-07-28T14:20:14
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Information for "JComponentHelper/getParams" Basic information Display titleAPI15:JComponentHelper/getParams Default sort keyJComponentHelper/getParams Page length (in bytes)1,348’
https://docs.joomla.org/index.php?title=API15:JComponentHelper/getParams&action=info
2015-07-28T14:28:26
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
Difference between revisions of "Extensions packing" From Joomla! Documentation Revision as of 06:51, 24 Renekorss (talk| contribs) 20 months ago. (Purge) It's strongly recommended to use Package ( and higher) for multiple extensions. This avoids unpacking and is easily installable. Test it Please do test the zip/tar file and install it in a Joomla test server before release your file.
https://docs.joomla.org/index.php?title=Extensions_packing&diff=next&oldid=31138
2015-07-28T15:00:12
CC-MAIN-2015-32
1438042981921.1
[]
docs.joomla.org
The Alivia Dress is a lightweight summertime outfit made to fit a girl character. This product includes a Dress, Necklace and Bracelets for a cute and unique style for your character. The dForce Alivia Candy dress comes in 16 Patterns and Colors to be perfect for your date night, fancy, or dinner-with-friends scene!
http://docs.daz3d.com/doku.php/public/read_me/index/72347/start
2022-08-08T07:40:05
CC-MAIN-2022-33
1659882570767.11
[]
docs.daz3d.com
Under the city that never sleeps lies a labyrinth of tunnels, some as deep as 800 feet below… In the Central City Sewer, there's plenty of room for a dragon or troll, science mutations gone wrong, Spring-heeled Jack, alligators… or even the pizza rat! It's no wonder it's closed to the public. This environment set comes with Inner and Outer Halls, a Center Hub, Smoke and Standing Water, an Upper Room and even Graffiti LIE Tags for authenticity. Don't worry, the Central City Sewer is safe for now… although in your capable hands that might not last too long.
http://docs.daz3d.com/doku.php/public/read_me/index/72831/start
2022-08-08T07:20:03
CC-MAIN-2022-33
1659882570767.11
[]
docs.daz3d.com
Accounts
Accounts in Aztec.

A Technical Primer on Accounts
There are two main parts to each Aztec account, the account (private/public key pair) and signers. The account is associated with a privacy key that can be used to decrypt value notes (assets on Aztec). The signers are associated with spending keys (or signing keys) which can be used to send notes from the associated account. An account is just a private/public key pair until it is registered (see the account registration section below). Before an account is registered, the private key is used to decrypt account notes as well as send value notes. The account will not have any registered spending keys or an account alias until it is registered. Once an account is registered, value notes that are sent to the account can either be marked to be spent by the account private key or the spending keys. It is a best practice for a sender to mark value notes as spendable by the spending keys when an account is registered. The SDK abstracts away much of this complexity to make it easier for developers to follow best practices. For example, when you look up an account by alias (meaning the account has been registered), a transfer automatically marks the notes as spendable by the recipient's spending keys. You can still choose to mark value notes sent to an account as spendable by the account private key, but this is not recommended because the recipient may have shared that key with an application in order to allow it to decrypt value notes to calculate their note balances.

Compared to Ethereum
Accounts in Aztec work differently than accounts in Ethereum. Aztec uses a different curve than Ethereum for SNARK-efficient operations. This means that you cannot use an Ethereum private key directly for signing Aztec transactions or a public key for deriving an account address. Specifically, Aztec uses the Grumpkin curve; see the yellow paper for more information. In zk.money, Aztec accounts are generated using Ethereum accounts by having a user sign a message and deriving the Aztec keys from the signed message. This ensures that as long as someone has access to their Ethereum account, they will be able to access their Aztec account by signing a message. Different messages are used to generate different keys (account decryption key and spending key).

Users And Accounts
Users in Aztec will use the main account to receive notes and decrypt balances and the signer (or spending key) to spend notes or initiate bridged Ethereum interactions.

Account
The privacy account is the first account that is generated for an Aztec user. The private key associated with this account can be used to decrypt notes. The private key can also be used to register a distinct spending key. This allows for account abstraction by creating a separation between the key required to decrypt notes (privacy key) and the key required to spend notes (spending key). If a spending key has not been registered, the account private key can be used. Accounts can be identified by their alias or their public key. You can read more about aliases below. You can also find more information in the SDK section on account keys.

Spending keys (signer)
An account should register a signer with a new spending key on the network in order to take advantage of account abstraction. When an account is first registered, you can pick a human-readable alias, a spending key and a recovery key.
If the spending key is lost, a recovery flow can be initiated by the recovery account specified when the new spending key was registered (account recovery). Registering a spending key has an associated fee as it typically includes a token (or ETH) deposit and posts transactions to the network. You can add as many spending keys to an account as you want. This allows you to spend notes from the same account from multiple devices without having to share sensitive private keys across devices. Read more about creating and using spending keys in the SDK docs here.

Account Registration
To register a new account, you need to choose an alias and a new spending public key. Optionally, you can include a recovery account public key and a deposit. Generally, an account with a registered spending key is considered safer than an account that only uses the default account keys. An account without a spending key uses the default account private key for note decryption as well as spending notes. When a spending key is registered, the default private key can only be used for decrypting notes and spending must be done with a registered spending key. Most users will typically use an account with a registered spending key and are thus considered "safe". There are use cases (airdrops) where you might want to use an account that has not yet registered a spending key and is using the default account key for both note decryption and spending. So it is possible to use the system without registering your account. When you use an unregistered account, your notes are marked as spendable by the account key. It is the sender that defines whether notes are marked spendable with the account key. A sender can check whether an account has registered spending keys before specifying the spending key. You cannot mix the spending of these notes. You can send unspent notes from the default account to yourself, but marked as spendable by the signing key. The SDK tries to abstract much of this complexity away and presents everything to a developer as if this notion does not exist (e.g. the account balance is the sum of all notes, registered or not). If you want to know exactly what you can spend in one transaction, you have to tell the SDK whether you're interested in the unregistered or registered balances. When actually creating the zero knowledge proof, the SDK infers which balance you're drawing from based on whether you give it a spending key or the account key. Read more about account registration with the SDK on this page.

Account Alias
The main privacy account public key is associated with a human-readable alias when the account registers a new signing key (see below). The alias can be anything (20 alphanumeric, lowercase characters or less) as long as it hasn't been claimed yet. Do not forget your alias. If you forget your alias, you will not be able to share it with other users to make it easy for them to send you asset notes. There is no way to recover a forgotten alias, but you can register a new account with a new alias and transfer your notes to the new account. If you forget your alias you can still transfer and withdraw asset notes.

Account Migration
Account migration allows you to keep your alias and just update your account (privacy) keys. This will update the public key associated with your alias as well as the key that is used to decrypt your account notes. This can only be done one time.
Account Recovery If you lose access to all of your spending keys for an account, the designated recovery account can help you recover access and register a new spending key that you have access to. This recovery information is created and registered with the account during the account registration step. Read more about account recovery with the SDK on this page. Frequently Asked Questions What happens if I lose my Aztec account private key? Your Aztec account private keys are derived from an Ethereum signature we ask you to sign when you register with us. As long as you still control your original Ethereum account you can re-derive your Aztec account keys. What happens if I lose my Aztec account private key AND my Ethereum account private key? At the current time, your funds would be lost. Our protocol architecture supports Aztec account social recovery but implementation into our front-end software is still under development. What is the zk.money username for? zk.money username/alias lets other users easily lookup your encryption public key so they can send you assets. This name has to be unique and is limited to 20 characters, lowercase, alphanumeric. Please note that this isn’t an ENS domain. I've registered to the platform, but zk.money prompts registration once again when I try to log in again? Please follow these instructions: Step 1: Clear browser cache. For Chrome, this is the link: chrome://settings/cookies/detail?site=zk.money Step 2: Make sure you are signing in with the Metamask account you used to register zkmoney username. 💡 Sign in with Metamask account you used to register your zkmoney username (You might have previously used a different Metamask account for funding/depositing). Step 3:
https://docs.aztec.network/how-aztec-works/accounts
2022-08-08T06:32:24
CC-MAIN-2022-33
1659882570767.11
[]
docs.aztec.network