# Sorting an “almost sorted” array in sub linear time
I am given an "almost sorted" array with the condition that each element is no more than $$k$$ places away from its position in the sorted array. I need to show that it is impossible to sort this array in sublinear time asymptotically.
My proof is to suppose a sorted array of length $$n$$. Now assume that every second element is swapped with the element on its left. The new array is almost sorted. To sort it would require a minimum of $$n/2$$ swaps, an asymptotically linear number of operations. Therefore, no sublinear sorting algorithm exists.
Is this proof correct?
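A minimal Python sketch (not part of the original question) of the construction just described: swap each disjoint adjacent pair of a sorted array, so every element is at most one position from its sorted place, yet a linear number of elements are out of place.

```python
# Sketch of the construction: non-overlapping adjacent swaps of a sorted array.
def almost_sorted(n: int) -> list:
    a = list(range(n))
    for i in range(0, n - 1, 2):          # swap element 1 with 2, 3 with 4, ...
        a[i], a[i + 1] = a[i + 1], a[i]
    return a

a = almost_sorted(10)
print(a)                                      # [1, 0, 3, 2, 5, 4, 7, 6, 9, 8]
print(sum(x != i for i, x in enumerate(a)))   # 10, i.e. Theta(n) elements must move
```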
It is correct, but you should be careful to stipulate that the swaps are performed in a non-overlapping fashion; if the swaps overlap, then one element can be carried across the array, breaking the guarantee.
• Thank you, perhaps I did not explain it well, but the swaps cannot overlap, as I swap two adjacent elements: element one with two, three with four, and so on. – AmirB Dec 22 '15 at 8:21
This is correct, but more complicated than it needs to be.
Consider an array of $$n$$ elements which is sorted except that one (unknown) pair of adjacent elements is swapped.
Since the sorting algorithm does not know which pair of elements is swapped, any individual element of the array may be in the wrong place. So, we have to make $$\Omega(n)$$ comparisons to check whether each element needs to be moved or not.
Therefore, the complete algorithm cannot be faster than linear time.
The condition that "each element is no more than $$k$$ places away from its position" is irrelevant, though it might mislead you to waste time looking for a faster algorithm which depends on that condition!
• But this is actually not trivial to formally prove -- I think the original one may be just as easy. – usul Mar 25 at 17:13
# Integration by parts in derivation of LSZ reduction formula
This is something that every textbook and set of lecture notes skips over in the derivation of the LSZ reduction formula.
Suppose we have $$a_{1}^{\dagger} \equiv \int d^{3} k f_{1}(\mathbf{k}) a^{\dagger}(\mathbf{k})\tag{5.6}$$ where $$f_{1}(\mathbf{k}) \propto \exp \left[-\left(\mathbf{k}-\mathbf{k}_{1}\right)^{2} / 4 \sigma^{2}\right]\tag{5.7}$$ is an appropriate wave packet, and $$\sigma$$ is its width in momentum space.
In the derivation of the LSZ reduction formula, at a certain point in these notes (Quantum Field Theory by Mark Srednicki, page 50), one has \begin{aligned} &-i \int d^{3} k f_{1}(\mathbf{k}) \int d^{4} x e^{i k x}\left(\partial_{0}^{2}-\overleftarrow{\nabla}^{2}+m^{2}\right) \phi(x) \\ &=-i \int d^{3} k f_{1}(\mathbf{k}) \int d^{4} x e^{i k x}\left(\partial_{0}^{2}-\overrightarrow{\nabla}^{2}+m^{2}\right) \phi(x) \end{aligned}\tag{5.10} He claims that here the wave packet is needed to avoid a surface term, but I am not seeing how. I am using the following integration-by-parts identity to prove it: \begin{align*} \int f(y)g''(y)dy &= f(y)g'(y)| - \int g'(y)f'(y)dy\\ \int g(y)f''(y)dy &= g(y)f'(y)| - \int f'(y)g'(y)dy\quad\text{so}\\ \int f(y)g''(y)dy - \int g(y)f''(y)dy &= f(y)g'(y)| - g(y)f'(y)|.\\ \end{align*}
If we take the $$x$$ part of the Laplacian, we have \begin{align*} \int^{+\infty}_{-\infty}{d^3r \left(\partial_x^2 e^{-ikr}\right)\phi}=\int^{+\infty}_{-\infty}{d^3r \ e^{-ikr} \partial_x^2\phi}+\left[e^{-ik_xx}\right]^{\infty}_{-\infty}\int^{+\infty}_{-\infty}{dydze^{-ik_yy-ik_zz}\partial_x\phi}+\int^{+\infty}_{-\infty}{dydzik_xe^{-ikr}\left[\phi\right]^{x=\infty}_{x=-\infty}}. \end{align*}
Now if we assume that $$\lim_{r\rightarrow \infty}{\phi}=\lim_{r\rightarrow -\infty}{\phi}=0$$ we have that $$\int^{+\infty}_{-\infty}{dydzik_xe^{-ikr}\left[\phi\right]^{x=\infty}_{x=-\infty}}=0.$$
But we are left with the term $$\left[e^{-ik_xx}\right]^{\infty}_{-\infty}\int^{+\infty}_{-\infty}{dydze^{-ik_yy-ik_zz}\partial_x\phi}.$$
Why is it zero?
• How quickly does $\phi_{,x}$ vanish at $\pm\infty$? – J.G. Sep 5, 2021 at 10:55
• Crossposted from math.stackexchange.com/q/3887424/11127 Sep 9, 2021 at 14:32
Such terms are set to zero by remembering that the $$S$$-matrix is strictly defined using wave-packets and then using the Riemann–Lebesgue lemma.
Multiply the boundary term by $${\tilde f}(k_x,k_y,k_z)$$ and then integrate over $$\vec{k}$$ so that you get a wave-packet state. The boundary term then simplifies to $$\lim_{x \to \pm\infty} \int dy dz f(x,y,z) \partial_x \phi(x,y,z)$$ where $$f(\vec{x})$$ is the wave-packet.
The wave-packet is localized at $$\vec{x}=\vec{v}t$$ where $$\vec{v}$$ is the velocity of the particle. So, if I take $$x \to \infty$$ keeping $$y,z,t$$ fixed, then $$f \to 0$$ and such terms can be neglected.
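For reference, the decay being invoked is the content of the Riemann–Lebesgue lemma (a standard statement added here for clarity, not part of the original answer): for an integrable profile $${\tilde f}$$,
$$\lim_{x \to \pm\infty} \int_{-\infty}^{+\infty} dk_x\, {\tilde f}(k_x,k_y,k_z)\, e^{-i k_x x} = 0,$$
so smearing the oscillating boundary factor with the wave packet and integrating over $$\vec{k}$$ forces the surface term to vanish.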
# Math Help - Fibonacci Proof (Modelling/Abstract Question)
1. ## Fibonacci Proof (Modelling/Abstract Question)
A certain parking meter will accept only $1 or $2 coins (we're Australian... it's okay!). Parking in this regulated area costs $1 per hour and a maximum time of six hours of parking is allowed. Coins can only be inserted into the meter one at a time, and a sequence of, say, $1 + $2 is considered to be a different sequence to $2 + $1. Determine the total number of combinations of $1 and $2 coins that are possible when payment is made for up to the maximum of 6 hours.
I have the combinations (there really aren't many) but this is under Fibonacci proofs and induction and I just can't see how it relates? Just a couple of hints would be amazing. Please and thanks guys.
2. Very nice problem. One hint is to look at the number of ways to pay for exactly n hours. I tried several small n's, and for each n I drew a binary tree. For each node, one child corresponds to paying $1, the other $2. Each tree node may be labeled with the amount left. After doing an exhaustive search of ways to pay in this manner, I noted that some parts of trees for larger n's are copies of trees for smaller n's. Of course, this is just what I tried; this may not be the easiest or the most intuitive way to approach the problem.
3. Surely there's a more... mathematically formulated way? Are you saying do a tree thing??
Parking Meter Tree Diagram..doc
So using this I can see there are 13 ways... holy moley... isn't that a Fibonacci number I see?? Okay, so maybe there's a formula to the way?? The 7th Fibonacci number is the answer to make a maximum $6 from 1 and 2 dollar coins...
Is there are a math formula or something that can be produced?
4. Nice trees! Though it would be better to write the remaining amount in the nodes.
So, I understand that if one adds a root to the whole picture, one gets the tree for 6. Well, the top subtree is the tree for 5, and the bottom one is the tree for 4.
5. One needs to add a child to one of the $2 nodes in the bottom tree. If you go $2, $1, $2, you have only $5, not $6. After that, the bottom tree for 4 is isomorphic to the top subtree of the top tree.
6. okay what?
Yep, I missed another $1.
Um, you mean for the first one go like
----4
5
----3
etc. etc. rather than
---1
1
---2
and you're saying that directly behind those purple ones would be the ways to get $5, behind that $4... all Fibonacci numbers... except they start at 1:
there's 1 way to get $1
2 ways to get $2
3 ways to get $3
5 ways to get $4
etc.
What's isomorphic?
7. Originally Posted by hoppingmad
okay what? Yep, I missed another $1.
Um you mean for the first one go like
----4
5
----3
etc. etc. rather than
---1
1
---2
Yes. Branches should be labeled with $1 or $2, and nodes with the remaining amount. So the sum of branch labels from a node to a leaf must be equal to that node's label.
what's isomorphic?
The tree in the picture you posted looks like this:
Code:
    4...
   /
  5
 / \
6   3...
 \
  \   3...
   \ /
    4
     \
      2...
The subtrees that have the same label in the root (like the bottom subtree with root 4 and the top subtree of the top subtree also with root 4) are the same.
9. No they're not... they are 2 different combinations...
10. Originally Posted by hoppingmad
No they're not... they are 2 different combinations...
In both cases you have to spend $4. I don't know how this will show because I use OpenOffice, but I am attaching your drawing. I spoiled it by trying to draw a line around the subtree that is isomorphic to the bottom tree (provided you add a node to the bottom tree where there is a short line). Remember that it is not node labels you are after. What is important is the number of leaves -- each leaf corresponds to one way to spend $6. This number is the sum of the corresponding numbers for $5 and $4.
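A quick numerical check of the thread's conclusion (a sketch added here, not part of the original posts): the number of ordered sequences of $1 and $2 coins paying exactly n dollars obeys ways(n) = ways(n-1) + ways(n-2), the Fibonacci recurrence, because the first coin inserted is either $1 or $2.

```python
# Count ordered sequences of 1s and 2s summing to n (Fibonacci recurrence).
def ways(n: int) -> int:
    a, b = 1, 1              # ways(0) = 1 (empty sequence), ways(1) = 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([ways(n) for n in range(1, 7)])      # [1, 2, 3, 5, 8, 13] -> 13 ways for $6
print(sum(ways(n) for n in range(1, 7)))   # total over all durations up to 6 hours
```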
effet-0.4.0.0: An Effect System based on Type Classes
Control.Effect.Map.Strict
Description
Strict interpretations of the Map' effect.
If you don't require disambiguation of multiple map effects (i.e., you only have one map effect in your monadic context), you usually need the untagged interpretations.
Synopsis
# Interpreter Implementation
data StrictMap k v m a Source #
The strict interpreter of the map effect. This type implements the Map' type class in a strict manner.
When interpreting the effect, you usually don't interact with this type directly, but instead use one of its corresponding interpretation functions.
clear :: Monad m => StrictMap k v m () Source #
Deletes all key-value pairs from the map.
lookup :: (Monad m, Ord k) => k -> StrictMap k v m (Maybe v) Source #
Searches for a value that corresponds to a given key. Returns Nothing if the key cannot be found.
update :: (Monad m, Ord k) => k -> Maybe v -> StrictMap k v m () Source #
Updates the value that corresponds to a given key. Passing Nothing as the updated value removes the key-value pair from the map.
# Tagged Interpretations
runMap' Source #
Arguments
  :: forall tag k v m a. Monad m
  => (Map' tag k v Via StrictMap k v) m a    The program whose map effect should be handled.
  -> m a                                     The program with its map effect handled.
Runs the map effect, initialized with an empty map.
# Untagged Interpretations
runMap :: Monad m => (Map k v Via StrictMap k v) m a -> m a Source #
The untagged version of runMap'.
# Fitting functions with a configurable Support Vector Regressor
This post deals with the approximation of real mathematical functions of one or more real variables using a Support Vector Regressor, without writing code but only acting on the command line of Python scripts that implement the functionality of:
• Regressor Configuration and training
• Prediction and error calculation
The code described in this post requires Python version 3 and uses the SciKit-Learn library; it also requires the NumPy, Pandas, Matplotlib and joblib libraries.
To get the code please see the paragraph Download of the complete code at the end of this post.
For the generation of synthetic training and test datasets, the following common tools (available in the repository) will be used:
• fx_gen.py for the real-valued scalar functions of one real-valued variable $f \colon [a,b] \to {\rm I\!R}$
• fxy_gen.py for the real-valued scalar functions of two real-valued variables $f(x,y) \colon [a,b] \times [c,d] \to {\rm I\!R}$
• pcm2t_gen.py for parametric curves on the plane, so real-valued vector functions $f(t) \colon [a,b] \to {\rm I\!R \times \rm I\!R}$
• pmc3t_gen.py for parametric curves in space, so real-valued vector functions $f(t) \colon [a,b] \to {\rm I\!R \times \rm I\!R \times \rm I\!R}$
Also for the visualization of the results, and precisely for the comparison of the test dataset with the prediction, common tools available in the repository will be used (e.g. fx_scatter.py, which appears in the example later in this post).
## Regressor Configuration and training
In this chapter two programs are presented: fit_func_esvr.py and fit_func_nusvr.py, which are technically wrappers of the classes sklearn.svm.SVR and sklearn.svm.NuSVR of the SciKit-Learn library, respectively. Their purpose is to allow the use of the underlying regressors to fit functions without having to write code, acting only on the command line.
In fact, through the argument --svrparams the user passes a series of hyper-parameters to adjust the behavior of the underlying SVR algorithm and others to configure its learning phase. In addition to the parameters of the underlying regressor, the program supports its own arguments that allow the user to pass the training dataset and the file on which to save the trained model.
Both programs are of M.I.M.O. type, that is Multiple Input Multiple Output: they are designed to approximate a function of the form $f \colon \rm I\!R^n \to \rm I\!R^m$, using the sklearn.multioutput.MultiOutputRegressor class in the implementation.
The input datasets are in csv format (with header), with $n+m$ columns, of which the first $n$ columns contain the values of the $n$ independent variables and the last $m$ columns contain the values of the dependent variables.
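Conceptually, the training step of these wrappers boils down to the following minimal sketch (this is not the actual source of fit_func_esvr.py; the file name, column split and hyper-parameters are only illustrative assumptions):

```python
# Minimal sketch: fit a multi-output epsilon-SVR on an (n+m)-column csv dataset
# and save the trained model in joblib format, as the wrapper programs do.
import pandas as pd
from joblib import dump
from sklearn.svm import SVR
from sklearn.multioutput import MultiOutputRegressor

m = 1                                        # number of dependent (output) columns
df = pd.read_csv("mytrain.csv")              # hypothetical training dataset
X, y = df.iloc[:, :-m].values, df.iloc[:, -m:].values

model = MultiOutputRegressor(SVR(kernel="rbf", C=100, gamma=0.1, epsilon=0.1))
model.fit(X, y)
dump(model, "mymodel.jl")                    # same .jl joblib format used later
```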
### Usage of the fit_func_esvr.py program
To get the program usage you can run the following command:
$ python fit_func_esvr.py --help
and the output is:
usage: fit_func_esvr.py [-h] [--version] --trainds TRAIN_DATASET_FILENAME
--outputdim NUM_OF_DEPENDENT_COLUMNS --modelout
MODEL_FILE [--dumpout DUMPOUT_PATH]
[--svrparams SVR_PARAMS]
fit_func_esvr.py fits a multiple-input multiple-output function dataset using
a configurable Epsilon-Support Vector Regressor
optional arguments:
-h, --help show this help message and exit
--version show program's version number and exit
--trainds TRAIN_DATASET_FILENAME
Train dataset file (csv format)
--outputdim NUM_OF_DEPENDENT_COLUMNS
Output dimension (alias the number of dependent
columns, that must be last columns)
--modelout MODEL_FILE
Output model file
--svrparams SVR_PARAMS
Parameters of Epsilon-Support Vector Regressor constructor
Where:
• -h, --help: shows the usage of the program and ends the execution.
• --version: shows the version of the program and ends the execution.
• --trainds: path (relative or absolute) of the csv file (with header) that contains the dataset to be used for the training; this file can be generated synthetically, e.g. via the program fx_gen.py, or be a dataset actually obtained by measuring a real phenomenon.
• --outputdim: the $m$ number of dependent variables, which are the last $m$ columns of the csv dataset; the preceding columns hold the $n$ independent variables accordingly.
• --modelout: path (relative or absolute) of the file where to save the trained model in joblib format (.jl).
• --svrparams: list of parameters to pass to the underlying regression algorithm; see the documentation of sklearn.svm.SVR.
### Usage of the fit_func_nusvr.py program
To get the program usage you can run the following command:
$ python fit_func_nusvr.py --help
and the output is:
usage: fit_func_nusvr.py [-h] [--version] --trainds TRAIN_DATASET_FILENAME
--outputdim NUM_OF_DEPENDENT_COLUMNS --modelout
MODEL_FILE [--dumpout DUMPOUT_PATH]
[--svrparams SVR_PARAMS]
fit_func_nusvr.py fits a multiple-input multiple-output function dataset using
a configurable Nu-Support Vector Regressor
optional arguments:
-h, --help show this help message and exit
--version show program's version number and exit
--trainds TRAIN_DATASET_FILENAME
Train dataset file (csv format)
--outputdim NUM_OF_DEPENDENT_COLUMNS
Output dimension (alias the number of dependent
columns, that must be last columns)
--modelout MODEL_FILE
Output model file
--svrparams SVR_PARAMS
Parameters of Nu-Support Vector Regressor constructor
Where:
• -h, --help: shows the usage of the program and ends the execution.
• --version: shows the version of the program and ends the execution.
• --trainds: path (relative or absolute) of the csv file (with header) that contains the dataset to be used for the training; this file can be generated synthetically, e.g. via the program fx_gen.py, or be a dataset actually obtained by measuring a real phenomenon.
• --outputdim: the $m$ number of dependent variables, which are the last $m$ columns of the csv dataset; the preceding columns hold the $n$ independent variables accordingly.
• --modelout: path (relative or absolute) to a file where to save the trained model in joblib format (.jl).
• --svrparams: list of parameters to pass to the underlying regression algorithm; see the documentation of sklearn.svm.NuSVR.
## Prediction and error calculation
In this chapter the program predict_func.py is presented, whose purpose is to make predictions on a test dataset by applying it to an e-SVR or nu-SVR model previously trained via the program fit_func_esvr.py or fit_func_nusvr.py respectively, always without having to write code but only through the command line.
In fact, this program supports arguments through which the user passes the previously trained model, the test dataset and the error measurements to be calculated between the predictions and the true values.
The format of the incoming test datasets is identical to that of the training programs mentioned above; obviously here the last columns (those of the dependent variables) are only used to compare the predicted values with the true values by calculating the requested error measures.
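Conceptually, this step reduces to the following minimal sketch (not the actual predict_func.py source; file names and the single-output assumption are illustrative):

```python
# Minimal sketch: load a joblib model, predict on the test dataset and compute
# two of the scikit-learn regression metrics supported via --measures.
import pandas as pd
from joblib import load
from sklearn.metrics import mean_absolute_error, mean_squared_error

m = 1
df = pd.read_csv("mytest.csv")
X, y_true = df.iloc[:, :-m].values, df.iloc[:, -m:].values

model = load("mymodel.jl")
y_pred = model.predict(X)

print("mean_absolute_error:", mean_absolute_error(y_true, y_pred))
print("mean_squared_error:", mean_squared_error(y_true, y_pred))
```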
### Usage of the predict_func.py program
To get the program usage you can run the following command:
$ python predict_func.py --help
and the output is:
usage: predict_func.py [-h] [--version] --model MODEL_FILE --ds DF_PREDICTION
--outputdim NUM_OF_DEPENDENT_COLUMNS --predictionout
PREDICTION_DATA_FILE
[--measures MEASURES [MEASURES ...]]
predict_func.py makes prediction of the values of a multiple-input multiple-
output function with a pretrained Standard Vector Regressor model
optional arguments:
-h, --help show this help message and exit
--version show program's version number and exit
--model MODEL_FILE model file
--ds DF_PREDICTION dataset file (csv format)
--outputdim NUM_OF_DEPENDENT_COLUMNS
Output dimension (alias the number of dependent
columns, that must be last columns)
--predictionout PREDICTION_DATA_FILE
prediction data file (csv format)
--measures MEASURES [MEASURES ...]
List of built-in sklearn regression metrics to
compare prediction with input dataset
Where:
• -h, --help: shows the usage of the program and ends the execution.
• --version: shows the version of the program and ends the execution.
• --model: path (relative or absolute) of the file in joblib (.jl) format containing the model generated by one of the training programs mentioned above.
• --ds: path (relative or absolute) of the csv file (with header) that contains the input dataset on which to calculate the prediction.
• --outputdim: the $m$ number of dependent variables, which are the last $m$ columns of the csv dataset; the preceding columns hold the $n$ independent variables accordingly.
• --predictionout: path (relative or absolute) of the csv file to generate that will contain the prediction, that is the approximation of the function applied to the input dataset.
• --measures: list of measures to be calculated by comparing the true values of the input dataset with the predicted output values; the list of supported metrics is defined in SciKit Learn Regression Metrics.
## An example of using all the programs
Suppose you want to approximate the function $$f(x)=\frac {1}{2} x^3 - 2 x^2 - 3 x - 1$$ in the range $[-10.0,10.0]$.
Keeping in mind that np is the alias of the NumPy library, the translation of this function into Python lambda-body syntax is:
0.5*x**3 - 2*x**2 - 3*x - 1
To generate the training dataset, run the following command:
$ python fx_gen.py \
--dsout mytrain.csv \
--funcx "0.5*x**3 - 2*x**2 - 3*x - 1" \
--xbegin -10.0 \
--xend 10.0 \
--xstep 0.01
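For reference, a roughly equivalent way to produce this training dataset directly in Python is sketched below (the column names actually written by fx_gen.py are an assumption):

```python
# Sample f(x) = 0.5*x**3 - 2*x**2 - 3*x - 1 on [-10, 10] with step 0.01
# and save it as a two-column csv file with header.
import numpy as np
import pandas as pd

f = lambda x: 0.5*x**3 - 2*x**2 - 3*x - 1
x = np.arange(-10.0, 10.0 + 1e-9, 0.01)
pd.DataFrame({"x": x, "y": f(x)}).to_csv("mytrain.csv", index=False)
```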
Instead, to generate the test dataset, run the following command:
$ python fx_gen.py \
--dsout mytest.csv \
--funcx "0.5*x**3 - 2*x**2 - 3*x - 1" \
--xbegin -10.0 \
--xend 10.0 \
--xstep 0.0475
Note that the discretization step of the test dataset is larger than that of the training one; this is normal because training, to be accurate, must be performed on more data. Also note that it is appropriate that the discretization step of the test dataset not be a multiple of the training one, so that the test dataset contains mostly data that is not present in the training dataset, which makes the prediction more interesting.
Now we intend to perform a regression with fit_func_esvr.py, passing to the underlying regressor: kernel: rbf, C: 100, gamma: 0.1, epsilon: 0.1; then run the following command:
$ python fit_func_esvr.py \
--trainds mytrain.csv \
--modelout mymodel.jl \
--outputdim 1 \
--svrparams "'kernel': 'rbf', 'C': 100, 'gamma': 0.1, 'epsilon': 0.1"
and at the end of the execution the saved mymodel.jl file contains the configured and trained e-svr regressor model.
Now we intend to perform the prediction and calculation of the error using the measurements mean_absolute_error and mean_squared_error; then execute the following command:
$ python predict_func.py \
--model mymodel.jl \
--ds mytest.csv \
--outputdim 1 \
--predictionout mypred.csv \
--measures mean_absolute_error mean_squared_error
and at the end of the execution the saved mypred.csv file contains the prediction obtained by applying the model to the test data; the output of the program displays the error measures requested through the argument --measures, and they are acceptable: the first around $0.4$ and the second around $1.9$.
Note: Given the stochastic nature of the training phase, your specific results may vary. Consider running the training phase a few times.
Finally, to make the comparative display of the test dataset with the prediction, run the following command:
$ python fx_scatter.py \
--ds mytest.csv \
--prediction mypred.csv \
--title "e-svr ('kernel': 'rbf', 'C': 100, 'gamma': 0.1, 'epsilon': 0.1)" \
--xlabel "x" \
--ylabel "y=1/2 x^3 - 2x^2 - 3x - 1"
which shows the dispersion graphs of the test dataset and the superimposed prediction: in blue the test dataset, in red the prediction. The comparison of the two graphs clearly shows that the approximation is very accurate.
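For reference, a roughly equivalent overlay can be produced directly with Matplotlib (a sketch; the exact column layout of mytest.csv and mypred.csv is an assumption):

```python
# Scatter plot of the test dataset (blue) with the prediction overlaid (red).
import pandas as pd
import matplotlib.pyplot as plt

test = pd.read_csv("mytest.csv")
pred = pd.read_csv("mypred.csv")

plt.scatter(test.iloc[:, 0], test.iloc[:, -1], s=2, color="blue", label="test dataset")
plt.scatter(test.iloc[:, 0], pred.iloc[:, -1], s=2, color="red", label="prediction")
plt.title("e-svr ('kernel': 'rbf', 'C': 100, 'gamma': 0.1, 'epsilon': 0.1)")
plt.xlabel("x")
plt.ylabel("y = 1/2 x^3 - 2x^2 - 3x - 1")
plt.legend()
plt.show()
```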
Note: Given the stochastic nature of the training phase, your specific results may vary. Consider running the training phase a few times.
Figure with dispersion graphs generated by the program fx_scatter.py showing the fitting in red overlay of the function $f(x)=\frac {1}{2} x^3 - 2 x^2 - 3 x - 1$ and the original function below in blue.
If you want to perform a regression with fit_func_nusvr.py, passing to the underlying regressor: C: 10, gamma: auto, then execute the following command:
$ python fit_func_nusvr.py \
--trainds mytrain.csv \
--modelout mymodel.jl \
--outputdim 1 \
--svrparams "'C': 10, 'gamma': 'auto'"
then run the prediction using the predict_func.py command with the same parameters as above, and finally, to generate the dispersion graphs, run the following command:
$ python fx_scatter.py \
--ds mytest.csv \
--prediction mypred.csv \
--title "nu-svr ('C': 10, 'gamma': 'auto')" \
--xlabel "x" \
--ylabel "y=1/2 x^3 - 2x^2 - 3x - 1"
The repository contains example shell scripts that show the use of these programs in cascade.
# degree of proper mapping for linear case
Let $U,V$ be two open sets in $R^n$ and $f:U\to V$ proper $C^{\infty}$ map (proper = preimage of compact set is compact). Then we have
$$\int f^{*}\omega=\deg(f)\int \omega,$$
for $\omega \in \Omega_c^{n}(V)$. How does one prove that if $f$ is a linear mapping, i.e. $f(x)=Ax$ for a nonsingular $n\times n$ matrix $A$, then $\deg(A)=\operatorname{sign}(\det A)$?
I don't think that your property is true: we can show with Sard's theorem that the degree of a proper map is an integer (whereas your determinant could be a non-integer number). – JBC Jun 16 '12 at 12:33
Yes, I have corrected it. Sorry! – dmm Jun 16 '12 at 13:16
The degree of a linear map is the signum of $\det$, not $\det$ itself (the degree is integer valued, while $\det$ is real valued -- already this should make you suspicious). (And the answer to your question is, basically, the change of variables formula for the integral, once you replace $\det$ by its sign.)
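Spelling this out (a short sketch added for clarity, using only the change of variables formula): for $\omega = g\,dy^1\wedge\cdots\wedge dy^n$ with $g$ compactly supported and $f(x)=Ax$,
$$\int_{\mathbb{R}^n} f^{*}\omega=\int_{\mathbb{R}^n} g(Ax)\,\det A\;dx^1\cdots dx^n =\frac{\det A}{|\det A|}\int_{\mathbb{R}^n} g(y)\,dy =\operatorname{sign}(\det A)\int_{\mathbb{R}^n}\omega,$$
since the pullback contributes a factor $\det A$ while the transformation formula for the integral contributes $1/|\det A|$; comparing with $\int f^{*}\omega=\deg(f)\int \omega$ gives $\deg(A)=\operatorname{sign}(\det A)$.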
@dmm If you post such a question it is usually hard to answer, because the people reading it do not know how the concepts were introduced to you and what you have available as tools. Maybe you have a standard textbook you are following; then those who know it may be able to help you based on this knowledge. The approaches to differential forms and degree I know all rely on the transformation formula (which reads $\int_{\phi(U)}f(x) d^nx = \int_U f(\phi(x))|\det D\phi(x)|d^n x$, just to make sure we are talking about the same thing). – user20266 Jun 16 '12 at 13:43
# Quantum Engineering of Atomically Smooth Single-Crystalline Silver Films
## Abstract
There is a demand for ultra-low-loss metal films with high-quality single crystals and perfect surfaces for nanophotonics and quantum information processing. Much research is devoted to alternative materials, but silver remains theoretically by far the most preferred low-loss material at optical and near-IR frequencies. Usually, epitaxial growth is used to deposit single-crystalline silver films, but they still suffer from unpredictable losses and the well-known dewetting effect that strongly limits film quality. Here we report a two-step approach for e-beam evaporation of atomically smooth single-crystalline metal films. The proposed method is based on thermodynamic control of the film growth kinetics at the atomic level, which allows depositing state-of-the-art metal films and overcoming film-surface dewetting. Here we use it to deposit 35–100 nm thick single-crystalline silver films with sub-100 pm surface roughness and theoretically limited optical losses, an ideal material for ultrahigh-Q nanophotonic devices. Utilizing these films, we experimentally estimate the contributions of grain boundaries, material purity, surface roughness and crystallinity to the optical properties of metal films. We demonstrate our «SCULL» two-step approach for single-crystalline growth of silver, gold and aluminum films, which opens fundamentally new possibilities in nanophotonics, biotechnology and superconductive quantum technologies. We believe it could be readily adopted for the synthesis of other extremely low-loss single-crystalline metal films.
## Introduction
Unique large-scale optoelectronic devices utilizing plasmonic effects for near-field manipulation, amplification and sub-wavelength integration open new frontiers in nanophotonics, quantum optics and quantum information science1,2,3,4,5,6,7,8,9,10,11,12,13,14,15. However, ohmic losses in metals are still a big challenge on the way towards a variety of useful plasmonic devices14,15,16,17,18,19,20,21. Many researchers have devoted extensive efforts to clarifying the comprehensive influence of metal film properties on overall losses in order to develop a high-performance material platform14,15,16,17,18,19. A single-crystalline platform has the potential to alleviate this problem by eliminating material-induced scattering losses3,4 and the impact of nanoscale structure definition15,16,17,18. Silver (Ag) is by far potentially the best plasmonic metal at optical and near-IR frequencies4,15,16,17,18,19; when it comes to optical loss, silver is still superior to all new plasmonic materials, including graphene17. On the other hand, silver is one of the most challenging metals for single-crystalline film growth18,19,20,21,22. Moreover, because of silver's nature, epitaxial growth of sub-50-nm-thick and ultrathin films is impeded without using lossy wetting underlayers23,24,25,26. Most of the reported single-crystalline silver film growth methods rely on molecular beam epitaxy (MBE)19,27 or physical vapor deposition (PVD)18. It was demonstrated that Ag epitaxial films can be engineered to have almost atomic smoothness and significantly lower optical losses in the 1.8–2.5 eV range19 than the widely cited Johnson and Christy (JC) data28. Here we report on a two-step PVD growth approach to obtain atomically smooth single-crystalline metal films, which is the result of a detailed study of the growth mechanism29,30,31. Our approach provides single-crystalline metal film growth on non-ideally lattice-matched substrates without underlayers, using a high-vacuum electron-beam evaporator. It simultaneously guarantees high crystallinity and purity, an atomically smooth surface over a centimetre-scale area, reproducible run-to-run film thickness (down to 35 nm), unique optical properties and SPP propagation length, and thermodynamic stability. The process is facile, inexpensive and fast (high deposition rate) compared to the MBE technique, compatible with lithography and etch definition of nanoscale features, and reproducible in a standard cleanroom environment. In addition, it can be effectively applied to various metals such as silver, gold, and aluminum, which are widely used in quantum optics and quantum information science.
Figure 1 illustrates the basic principles of the two-step deposition process for atomically smooth single-crystalline silver films. In the first step, a seed crystal consisting of strained two-dimensional islands with an atomically flat top surface (AFT 2D islands) is grown on a substrate (see Supplementary Information for details) under 350 °С. In the second step, the deposition is stopped and the substrate is cooled down to 25 °С in the same vacuum cycle to prevent the well-known dewetting effect and three-dimensional growth leading to subsequent film imperfections. Then, more silver is deposited on the AFT 2D seed at 25 °С until a continuous film is formed. Subsequent film annealing at elevated temperature (higher than in the first step) can reduce the growth defect density, improving the crystalline structure and surface roughness. We believe that the simultaneous improvement in film characteristics relies on the combination of two mixed evaporation modes together with AFT 2D island growth self-controlled by quantum size effects. With this dual-phase experimental nature in mind, as well as the improved film parameters, we name our deposition process «SCULL» (Single-crystalline Continuous Ultra-smooth Low-loss Low-cost).
### The SCULL process
Consider a silver (111) film on silicon (111), which is a well-known substrate for epitaxial Ag growth. To quantitatively estimate the influence of the substrate crystalline structure on film quality, we additionally deposit the SCULL silver films on widely used silicon (100), silicon (110), and mica substrates. Nominally 35-nm-thick single-crystalline silver films were evaporated using the SCULL process (base pressure 3 × 10−8 Torr, see Supplementary Information) on different substrates (Table 1); these are thinner than any PVD single-crystalline silver films reported previously18,32,33. On the other hand, the SCULL process has no fundamental limitations for thicker film synthesis, which is crucial for applications sensitive to SPP substrate absorption. To demonstrate this, we deposit nominally 70-nm-thick (S4) and 100-nm-thick (S5) Ag (111) films on Si (111) substrates. All the films are continuous, without voids and pits, and have an atomically smooth surface over a 15 × 15 mm2 sample area.
According to epitaxy theory and long-term experience, the microstructure, growth mode and morphology are mostly governed by kinetic effects at high deposition rates (non-equilibrium experimental conditions). In this case single-crystalline films can be deposited in a smooth Frank-van der Merwe (layer-by-layer) growth mode34, when the surface free energy of the substrate (Esub) is higher than or equal to the sum of the surface free energies of the film (Ef) and the interface (Eint), Esub ≥ Ef + Eint. Then it is energetically favorable for the film to cover the substrate completely, eliminating the contribution of the high substrate surface energy. The fundamental idea of our process involves quantum engineering of the AFT 2D seed crystal of a given metal (1st step), which is «frozen» at the optimum point of the growth process, followed by a dramatic shift of the film growth kinetics (2nd step), allowing the lateral spreading of the crystal seed until a perfect continuous film is formed. Three key features (Fig. 1a,b) have to be provided at the first process step: the 2D growth of islands with an atomically flat top surface, the macroscopic control of the thickness and microstructure of the AFT 2D islands, and the well-defined strain accumulated in the islands at the optimum point. 2D growth can be guaranteed by interlayer mass transport control, which is the delicate balance between the adatom surface diffusion (D ~ temperature) and the flux (F ~ deposition rate), the state of the growing islands (density, size, shape and strain), and the surface diffusion (ED) and step-edge Ehrlich-Schwoebel (ES) barriers for adatoms to descend (downward transport) or ascend (upward transport) the edges of the growing islands.
In order to ascend (descend) an island, the adatoms arriving on a substrate (island) surface may try several times (with the hopping rate ν = ν0 exp(−ES/kBT)) to move over the edge barrier ∆E = ES − ED. Based on previously reported data35,36, we experimentally determine the ranges of adatom surface diffusion (280–420 °С temperature), adatom flux (0.5–10 Å s−1 deposition rates) and island state (1–25 nm thickness) for layer-by-layer AFT 2D Ag (111) island growth at the first process step. Above a certain island state (size, interface area, strain) the sum of the island surface free energies Ef + Eint becomes higher than the substrate surface free energy Esub, leading to three-dimensional growth. Moreover, increasing the island dimensions without reducing the adatom surface diffusion D results in an enhanced hopping rate ν of adatoms visiting the step edges and, thus, an increased upward mass transport. Thus, there is an optimum point of the process at which the AFT 2D Ag (111) island seed deposition has to be stopped.
An electronic growth model36,37 based on quantum size effects can explain the second key feature of the AFT 2D islands (self-controlled thickness and crystalline structure over a large area). According to the electronic growth model, the growing AFT 2D Ag (111) islands are considered as an electron gas confined to a 2D quantum well that is as wide as the thickness of the silver islands38,39. The energy oscillates as a function of the island thickness (quantum well width), resulting in island thickness quantization (Fig. 1a, inset). The top silver layers of the islands grow in a homoepitaxial regime in the presence of these small energy oscillations, resulting in a quantized island thickness with silver-monolayer accuracy36,37,40. Together with the layer-by-layer 2D growth mode, this enables the formation of two-dimensional Ag (111) islands with an atomically flat top surface and provides precise control of the island thickness over a macro-scale area, even in the presence of typical PVD process deviations.
The third key feature of the AFT 2D Ag (111) seed crystal is the energy accumulated in the islands, which is induced by strained growth at elevated temperature on a substrate with a different lattice constant. The accumulated strain affects the growth kinetics at the second process step by lowering the ES barrier for adatom surface diffusion. Strained growth is induced by the onset of defects: screw dislocation influence and spiral growth, which become stronger as the thickness increases. That is why the AFT 2D seed thickness has to be optimized to provide dislocation-free crystalline lattice growth of the Ag (111) islands, on the one hand, and the ultimate initial strain accumulation, on the other. As a result of the first step, the AFT 2D island seed is formed (Fig. 1b), consisting of islands of uniform thickness (more than 90% of the substrate area) with an atomically flat top surface (RMS roughness <50 pm), a flat irregular form and an average lateral size from 100 nm to 250 nm.
At the second step, the deposition is stopped and the substrate is cooled down to 25 °С. Then, more silver is evaporated on the AFT 2D seed (Fig. 1c–e) until a continuous single-crystalline film is formed (Fig. 1f). At room temperature, the reduced surface diffusion length D33 and hopping rate ν of adatoms arriving on the substrate lead to decreased upward mass transport. On the other hand, Ag adatoms arriving on islands hop along the atomically flat top surfaces of the AFT 2D islands with almost no energy dissipation and easily reach the island edges. Moreover, the strain relaxation results in a reduced step-edge ES barrier for the adatoms on the island surface39,41 and, thus, increased downward mass transport. Therefore, at the second step almost all the arriving adatoms are adsorbed at the edges (perimeter) of the islands, spreading the dominant Ag(111) 2D islands and, eventually, coalescing the islands with each other, thus completing the single-crystalline film. In the end, the strain that was accumulated in the first growth step relaxes primarily through interactions with the incoming adatoms, improving the crystalline structure of the AFT 2D islands. Upon subsequent annealing at 320–480 °C, the silver film crystalline structure and surface roughness are improved42,43, along with a reduction of the defect density. In the next section, we demonstrate that the SCULL process allows us to overcome the well-known problem of silver dewetting at elevated temperatures and to deposit high-quality sub-50-nm-thick single-crystalline silver films.
## Results and Discussion
In this section, we demonstrate that grain boundaries, material purity (hence, grain boundary purity), surface roughness (and the associated surface chemical reactivity), and crystallinity imperfection contribute to the optical properties of metal films in descending order of priority. We compare the results for six representative films: three SCULL single-crystalline films of 35 nm (S1), 70 nm (S4) and 100 nm (S5) nominal thickness, and three nominally 100-nm-thick polycrystalline films (PC, PCBG31, NC) with different grain size and purity (Table 1). Since the dielectric permittivity is thickness independent18,29, the optical properties of the Ag (111)/Si (111) films (S1, S4, S5), which have no grain boundaries and share the same material purity, were compared to estimate the surface roughness and crystallinity impact. High-resolution wide-angle X-ray diffraction (XRD) rocking curves (Figs 2b, S4 and S5) with a full width at half maximum (FWHM) of 0.325°, 0.221°, 0.368° for the film thicknesses of 37 nm (S1), 68 nm (S4) and 99 nm (S5) indicate high film quality independent of thickness, with a minimal level of defects (see Supplementary Information for details). It is important to note that SCULL films deposited on non-lattice-matched Si (100) and Si (110) have predictably worse crystallinity, but also demonstrate atomically smooth surfaces with RMS roughness less than 4 Å (Table 1). The high-resolution transmission electron microscopy (HRTEM) image (Fig. 2e) demonstrates the single-crystalline nature of the S1 silver film. Electron backscatter diffraction (EBSD) is used to analyze the domain structures and extract the average grain size (Table 1) of the single-crystalline (Fig. 2h) and polycrystalline films (Figs 2f,g, S8c).
To estimate losses and rank the contributions of the film parameters to the optical properties, multi-angle spectroscopic ellipsometry is used (see Supplementary Information for details). We focus on the most practically useful NIR and visible wavelength region, which for silver lies above the interband transitions (λ > 325 nm), where the contribution to ε1 mainly comes from the Drude terms (dc conductivity), but ε2 is defined by both intraband and interband components. We observe the dominating contribution of grain boundaries to the dielectric permittivity (Fig. 3a–d): the real part becomes more negative with increasing grain size (Fig. 3c), indicating higher conductivity. The NC film with a great number of small grains has the worst ε1, even compared to the PC film deposited in a poor vacuum. In contrast, all the single-crystalline films have larger negative ε1 compared to JC and the polycrystalline films. The observed decrease in negative ε1 (conductivity) is primarily due to the increased number of structural defects (including grain boundaries) in the films, leading to increased electron-phonon interactions, which make the films less metallic. In general, the same influence of the film grain size on ε2 is observed (Fig. 3d), except for the PC film in the 600–1000 nm wavelength range, which has larger losses than NC in spite of a bigger grain size. Indeed, this can be explained by the poor PC film purity, which leads to an increased Drude term of the imaginary part of the dielectric permittivity44, elevating losses at longer wavelengths (λ > 500 nm).
Material purity and surface roughness are the factors of second priority in terms of silver dielectric permittivity in the 600–1000 nm and 325–600 nm wavelength ranges, respectively. To demonstrate the material purity effect, we compare the dielectric permittivity of relatively clean (NC, PCBG) and contaminated (PC) polycrystalline films with the JC data. The JC data29 were acquired from thin films deposited nearly 170 times faster than the PC film (at a very high evaporation rate of 60 Å s−1), leading to a much purer silver film. Our measurements (Fig. 3c,d) indeed show larger negative ε1 and lower ε2 for the JC data compared to all the polycrystalline films in the 600–1000 nm wavelength range. However, this JC permittivity advantage nearly vanishes in comparison to the PCBG film because of its larger grains, which, in contrast, improve the film optical quality. These material purity dependencies can be attributed to an increase in the electron-phonon interaction, as described above.
At the 325–600 nm wavelengths, with increasing surface roughness (averaged value) and surface morphology singularities (absolute number of surface nonuniformities), ε2 is dramatically increased, and for the NC film it becomes more than five times and more than twice larger (Fig. 3d) compared to the S5 and PCBG films, respectively. Furthermore, there are typical peaks in ε2 between the 340 nm and 400 nm wavelengths for all the samples, and it is important to note that the S1 film peak amplitude is four times lower than the NC film peak amplitude. These ε2 spectrum features can be explained by internal interface effects45; that is, as the surface roughness and morphology singularities increase, silver surface oxidation and chemical reactivity are boosted. The observed typical peaks in ε2 are primarily due to the surface reaction with adsorbed sulphur32,45, which transforms the silver into non-metallic silver sulphide (by transfer of S-ions through the interface). In the case of polycrystalline films, the surface topography (active surface area) plays the key role in the increased silver surface chemical reactivity, leading to ε2 spectrum degradation close to the interband transition threshold. For the single-crystalline films, with the surface roughness improving to the sub-100 pm level, the typical peak associated with the interband transitions is almost eliminated (but still present) due to better surface thermodynamic stability32 and weaker sorptivity to chemical elements from the ambient.
In addition, SCULL silver films could be of great interest for the rapidly growing field of quantum plasmonics, as atomically smooth single-crystalline films with low optical absorption and high conductivity can result in an enhanced SPP propagation length. A theoretically predicted SPP propagation length for silver of over two hundred microns33 and exceptional performance of plasmonic devices9,31 are experimentally demonstrated using SCULL silver films. We believe that this is the result of a simultaneous synergistic effect of the SCULL films' dedicated single-crystalline nature, angstrom-scale surface roughness, process-induced high purity and thermodynamic stability32.
In conclusion, we have developed a two-step approach for e-beam evaporation of continuous atomically smooth single-crystalline metal films in a wide range of thicknesses down to 35 nm. The process provides thermodynamic, i.e. macroscopic, control of the film growth kinetics at the atomic level and is based on 2D crystal seed growth, known for 3D bulk materials but not previously reported for thin films. The key feature of our approach is the combination of two mixed evaporation modes together with AFT 2D crystal seed growth self-controlled by quantum size effects, which enables the deposition of perfect single-crystalline metal films on non-ideally lattice-matched substrates, even with imperfect standard PVD tools and typical process deviations. We have demonstrated 35–100 nm thick single-crystalline silver films with sub-100 pm surface roughness, perfect crystallinity (XRD rocking curves through the Ag(111) peaks have FWHM of 0.2–0.4°), much lower optical losses compared to the JC data29 and an SPP propagation length above two hundred microns33. Using SCULL silver films as golden samples, we have demonstrated that grain boundaries, material purity, surface roughness and microstructure imperfection contribute to the optical properties in descending order of priority. The proposed process has been demonstrated for single-crystalline growth of silver, gold and aluminum films on silicon, sapphire and mica substrates. We believe that the SCULL process could be used for the deposition of various atomically smooth single-crystalline metal thin films, and that it could be easily integrated into planar top-down device fabrication technology. The unique physical and optical properties of SCULL films may open fundamentally new possibilities in nanophotonics1,15, biotechnology6,10 and quantum technologies9.
## Methods
### Preparation of epitaxial films
Silver thin films were deposited on prime-grade degenerately doped Si(111), Si(100), Si(110) wafers (0.0015–0.005 Ω-cm) and muscovite mica substrates using a 10 kW e-beam evaporator (Angstrom Engineering) with a base pressure lower than 3 × 10−8 Torr. We first cleaned the wafers in a 2:1 sulfuric acid : hydrogen peroxide solution (80 °C), followed by further cleaning in isopropanol to eliminate organics. Finally, we placed the wafers in 49% hydrofluoric acid for approximately 20 s to remove the native oxide layer. After oxide removal, we immediately transferred the wafers into the evaporation tool and pumped the system down to limit native oxide growth. Mica substrates were cleaved perpendicular to the c-axis to reveal fresh surfaces prior to deposition. All films were grown using 5N (99.999%) pure silver. Films were deposited at a rate of 0.5–10 Å·s−1, measured with a quartz monitor, at an approximate source-to-substrate distance of 30 cm. Deposition is done in two steps using the SCULL process.
### X-ray diffraction (XRD)
X-ray diffraction was studied by means of a Rigaku SmartLab diffractometer. To study the texture and azimuthal orientation of the silver crystals in relation to the silicon monocrystal axes, φ-scans for both the Ag and Si layers were measured from 0° to 360° with a 0.052° step. Rocking curves (or ω-scans, 0.001° step) were applied to characterize the in-plane perfection of the silver crystals. In each sample, the observed reflections were caused by sets of crystallographic planes with divisible Miller indices. This means that the crystallographic planes of the silver crystals were parallel to the planes of the substrate with the same Miller indices: Ag(111) was parallel to Si(111), Ag(110)//Si(110) and Ag(100)//Si(100). A difference in the value of the lattice parameter of silver on substrates with various Si orientations, if any, was lower than the uncertainty of the method of investigation. The lattice parameter averaged over all samples was 4.076 ± 0.004 Å, in good agreement with the known value for pure Ag7. The φ-scan curves consisted of sharp reflections for all the samples. The number and position of these reflections coincided with the standard (111), (110) and (001) FCC crystal projections. Also, the position of the reflections from the silver film matched in each case the position of those of the silicon substrate. This result means that each studied silver film possessed biaxial texture. Analysis of the 2θ/ω and φ-scans showed that each studied sample was a biaxially textured silver film on a mono-crystalline silicon substrate with the following epitaxial relationships: (111)Ag//(111)Si:[111]Ag//[111]Si, (110)Ag//(110)Si:[110]Ag//[110]Si and (001)Ag//(001)Si:[001]Ag//[001]Si. Additionally, the φ-scans showed that the misorientation of these films and the substrate did not exceed 0.1°. The appearance of a Si(222) forbidden reflection is in accordance with recently published investigations. Since the full-width-at-half-maximum (FWHM) of a rocking curve profile serves as a practical numerical characteristic of mosaic spread in thin crystalline films for plasmonics, precise determination of this value is of importance.
### Atomic force microscopy (AFM)
The atomic force microscope Bruker Dimension Icon with SCANASYST-AIR-HR probe (with nominal tip radius of 2 nm) was used. All AFM images were obtained by using PeakForce Tapping mode with ScanAsyst imaging and the scanned area was 2.5 × 2.5 μm2 and 50 × 50 μm2. Nanoscope software was utilized to analyze the images and extract root mean square roughness.
### Ellipsometry
Dielectric functions of the silver films were measured using a multi-angle spectroscopic ellipsometer (SER 800, Sentech GmbH). Additionally, ellipsometers in three different laboratories have been crosschecked (for single-crystalline silver films on Si(111)) to eliminate the possibility of systematic errors. We specifically measured the optical constants of the HF treated silicon substrate used in the deposition process to eliminate any discrepancy and uncertainty introduced by the substrate. These measured silicon optical constants and silver thickness are fixed in the subsequent data fitting for all samples, and only the silver parameters are allowed to change.
Modeling and analysis were performed with the ellipsometer SENresearch 4.0 software. The models were developed in cooperation with Sentech GmbH application department. Measurement spectral wavelength range was from 240 to 1000 nm, with an interval of approx. 2 nm, and the reflected light was analyzed at incidence angles of 50°, 60°, 70°. To characterize the optical losses, the real (ε1) and imaginary (ε2) parts of the dielectric permittivity were extracted by fitting the measured raw ellipsometric data (Ψ and Δ). In our fitting, we used a bilayer Ag/Si structural model and a simple phenomenological Brendel-Bormann (BB) oscillator model to interpret both the free electron and the interband parts of the dielectric response of our samples:
$$\hat{\varepsilon }(\omega )={\varepsilon }_{\infty }-\frac{{\omega }_{p}^{2}}{{\omega }^{2}+i{{\rm{\Gamma }}}_{{\rm{D}}}{\rm{\omega }}}+{\sum }_{j=1}^{k}{\chi }_{j}(\omega ),$$
(1)
where ωp is the plasma frequency, $${\varepsilon }_{\infty }$$ is the background dielectric constant, $${{\rm{\Gamma }}}_{{\rm{D}}}$$ is Drude damping, $${\chi }_{j}(\omega )$$ is BB oscillators interband part of dielectric function, and k is the number of BB oscillators used to interpret the interband part of the spectrum.
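As an illustration (not taken from the paper; the parameter values below are assumptions chosen only to show the behavior of the model), the free-electron part of Eq. (1) can be evaluated numerically:

```python
# Drude part of Eq. (1): eps(w) = eps_inf - wp^2 / (w^2 + i*Gamma_D*w),
# with the Brendel-Bormann oscillator sum chi_j omitted for brevity.
import numpy as np

eps_inf = 1.0      # background dielectric constant (illustrative)
omega_p = 9.0      # silver plasma frequency, ~9 eV (literature value)
gamma_D = 0.02     # Drude damping in eV (illustrative)

def eps_drude(omega_eV):
    return eps_inf - omega_p**2 / (omega_eV**2 + 1j * gamma_D * omega_eV)

# photon energy in eV for a wavelength in nm: E = 1239.84 / lambda
for lam in (325.0, 500.0, 750.0, 1000.0):
    e = eps_drude(1239.84 / lam)
    print(f"{lam:6.0f} nm  eps1 = {e.real:9.2f}  eps2 = {e.imag:6.3f}")
```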
Modern ellipsometers are capable of obtaining precise optical data. Using flexible and sophisticated models and analysis software, it is possible to accurately determine the optical constants of materials. While obtaining the data creates no difficulties for high-quality samples, the analysis is not trivial. Extracting a reliable permittivity is challenging because it is an inverse problem: the polarization ratio of the reflected light is measured, and the optical constants of the structure under investigation and the layer thicknesses are retrieved.
The mean square error (MSE) is a crucial parameter to quantify the quality of the fitted permittivity parameters. However, a small MSE alone is not conclusive proof that the model is totally reliable. A model contains highly correlated parameters, so it is possible to have multiple solutions with similarly low MSE values. A strong correlation exists between thickness and optical constants in absorptive metal thin films, which leads to unreliable permittivity values. To verify that the final fit solution is truly unique, we need to do a test showing that there is indeed a best fit at a singular value of a chosen parameter. The parameter we chose to perform the uniqueness test on is the independently measured thickness of the film. By fixing the thickness of the film at the measured value with a tolerance of 2 nm, while letting the other parameters vary during the fitting process, we calculated the MSE of each final fit result. For all samples, the mean square errors, representing the quality of the match between the measured and theoretically calculated dielectric functions, were best at the measured thickness and below 1.3°. From these uniqueness tests, we conclude that our model is indeed reliable and the retrieved optical constants are valid.
### Profilometry
The stylus profiler KLA Tencor P17 (with a Durasharp 38-nm tip radius stylus) was used. All measurements were done using a 0.5 mg stylus force; the scan rate was 2 μm·s−1 and the scanned line length was 20 μm.
### Scanning electron microscopy
In order to check the quality and uniformity of the deposited layers, the silver film surfaces after deposition were investigated by means of a scanning electron microscope Zeiss Merlin with a Gemini II column. All SEM images were obtained using an in-lens detector, an accelerating voltage of 5 kV and a working distance from the sample to the detector of 1 to 4 mm. Magnifications of 3000, 7000, 15000 and 50000 were used to fully analyze the samples.
### Electron back scattered diffraction (EBSD) characterisation
The Ag films were observed and structurally characterized by field emission scanning electron microscopy (FE-SEM: Zeiss Merlin Gemini II). The crystal orientation maps of the Ag films were obtained by FE-SEM equipped with an EBSD system (NordlysNano, Oxford Instruments Corp., UK). EBSD patterns were acquired with the following settings: tilt angle – 70°, accelerating voltage – 10 keV, probe current – 1.7 nA and scan sizes of 2 × 2 μm2 and 20 × 20 μm2 for the SC film. EBSD has proven to be a useful tool for characterizing the crystallographic orientation aspects of microstructures at length scales ranging from dozens of nanometers to millimeters in the scanning electron microscope. The detector provides single-grain detection by means of orientation measurement based on acquired Kikuchi patterns. The colored image represents the grain orientation map; the correlation between colors and orientations is shown in the triangle diagram at the bottom left corner. We extract the average grain size for our polycrystalline films using the embedded AZtecHKL software package for EBSD image processing.
The NC film was e-beam evaporated onto a liquid-nitrogen-cooled quartz substrate; the conditions were adjusted so that the film had an average grain size around 20 nm.
### Transmission electron microscopy (TEM)
In order to check the quality and crystallinity of the nominally 35-nm-thick silver film deposited on Si(111), its cross-section, prepared by ion milling, was investigated by means of a transmission electron microscope TITAN 300. The TEM image was obtained using an in-lens detector, an accelerating voltage of 100 kV and spot size 3.
## Acknowledgements
We would like to thank Alexey P. Vinogradov, Vladimir M. Shalaev, Alexandra Boltasseva, Denis A. Fokin, Alexander M. Merzlikin, Alexander V. Baryshev and Alexander S. Dorofeenko for the helpful discussions. The SCULL process was developed and the samples were prepared at the BMSTU Nanofabrication Facility (Functional Micro/Nanosystems, FMNS REC, ID 74300).
## Author information
The SCULL metal growth process was developed by I.A.R.; I.A.R., A.S.B. and A.R.G. analyzed the data and developed the sample cleaning procedure; S.S.M. performed the XRD measurements; S.P. performed the ellipsometry measurements and fitting. A.V.A. supervised the study. All authors analyzed the data and contributed to writing the manuscript.
Correspondence to Ilya A. Rodionov.
## Ethics declarations
### Competing Interests
The authors declare no competing interests.
Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rodionov, I.A., Baburin, A.S., Gabidullin, A.R. et al. Quantum Engineering of Atomically Smooth Single-Crystalline Silver Films. Sci Rep 9, 12232 (2019) doi:10.1038/s41598-019-48508-3
# Seven Habits of Effective Text Editing
## moolenaar.net
### Highlights from July 26, 2021
• Use % to jump from an open brace to its matching closing brace. Or from a “#if" to the matching “#endif".
• Use [{ to jump back to the “{” at the start of the current code block.
• Use gd to jump from the use of a variable to its local declaration.
• There are three basic steps: (1) While you are editing, keep an eye out for actions you repeat and/or spend quite a bit of time on. (2) Find out if there is an editor command that will do this action quicker. Read the documentation, ask a friend, or look at how others do this. (3) Train using the command. Do this until your fingers type it without thinking.
• “I want to get the work done, I don’t have time to look through the documentation to find some new command”. If you think like this, you will get stuck in the stone age of computing. |
### On Average-Case Hardness in TFNP from One-Way Functions
Pavel Hubáček, Chethan Kamath, Karel Král, and Veronika Slívová
##### Abstract
The complexity class TFNP consists of all NP search problems that are total in the sense that a solution is guaranteed to exist for all instances. Over the years, this class has proved to illuminate surprising connections among several diverse subfields of mathematics like combinatorics, computational topology, and algorithmic game theory. More recently, we are starting to better understand its interplay with cryptography. We know that certain cryptographic primitives (e.g. one-way permutations, collision-resistant hash functions, or indistinguishability obfuscation) imply average-case hardness in TFNP and its important subclasses. However, its relationship with the most basic cryptographic primitive -- i.e., one-way functions (OWFs) -- still remains unresolved. Under an additional complexity theoretic assumption, OWFs imply hardness in TFNP (Hubacek, Naor, and Yogev, ITCS 2017). It is also known that average-case hardness in most structured subclasses of TFNP does not imply any form of cryptographic hardness in a black-box way (Rosen, Segev, and Shahaf, TCC 2017) and, thus, one-way functions might be sufficient. Specifically, no negative result which would rule out basing average-case hardness in TFNP solely on OWFs is currently known.

In this work, we further explore the interplay between TFNP and OWFs and give the first negative results. As our main result, we show that there cannot exist constructions of average-case (and, in fact, even worst-case) hard TFNP problem from OWFs with a certain type of simple black-box security reductions. The class of reductions we rule out is, however, rich enough to capture many of the currently known cryptographic hardness results for TFNP. Our results are established using the framework of black-box separations (Impagliazzo and Rudich, STOC 1989) and involve a novel application of the reconstruction paradigm (Gennaro and Trevisan, FOCS 2000).
##### Metadata
Available format(s)
Category
Foundations
Publication info
A major revision of an IACR publication in TCC 2020
Keywords
TFNPPPADaverage-case hardnessone-way functionsblack-box separations
Contact author(s)
hubacek @ iuuk mff cuni cz
History
2020-09-28: revised
2020-09-25: received
See all versions
Short URL
https://ia.cr/2020/1162
License
CC BY
BibTeX
@misc{cryptoeprint:2020/1162,
author = {Pavel Hubáček and Chethan Kamath and Karel Král and Veronika Slívová},
title = {On Average-Case Hardness in TFNP from One-Way Functions},
howpublished = {Cryptology ePrint Archive, Paper 2020/1162},
year = {2020},
note = {\url{https://eprint.iacr.org/2020/1162}},
url = {https://eprint.iacr.org/2020/1162}
}
# Free command line archivers
Nowadays, we have a plethora of free arhivers (7-zip, j-zip, winrar <-- okey, that one's not free, ...)
But does anyone know of some free command line archivers for popular formats (RAR, ZIP or its unix variants) for Windows platform.
-
As Phoshi pointed out, 7-zip has a great command-line interface. It really is becoming (has become?) the de facto standard in archiving tools. Plus, a small part of me was so happy to notice the absence of WinZip in your list of examples. ;) – JMD Oct 20 '09 at 15:00
I'll admit I prefer WinRAR for the GUI, but I do most of my stuff at the command line anyway, so 7z is invaluable. – Phoshi Oct 20 '09 at 15:02
You said it! 7zip has a lovely CLI interface.
p:\ath\to\7z.exe a hello.rar *.txt
would archive all your text files in your current directory,
p:\ath\to\7z.exe e hello.rar
would recreate them.
-
better yet, drop it in your %systemroot% directory and do away with the ugly p:\ath\to stuff. :) – quack quixote Oct 20 '09 at 17:23
That's very true - I have a fairly hacked together solution using python and doskey simply to avoid that stuff, but it's pretty damn good for one or two things :D – Phoshi Oct 20 '09 at 17:53
Incredible. I've been using it forever, and never knew (nor bothered to check) whether it had a command line interface. Made my day, thanks ! – ldigas Oct 20 '09 at 17:57
It's not even a half-hearted interface, it's actually really bloody good! :) – Phoshi Oct 20 '09 at 17:58
GZIP for windows will definitely handle zipping / unzipping your "unix variants" on windows.
-
gzip in itself is not an arhiver (just compressor), need tar (or similar) as well for any archiving. But gzip is not the only way to compress a tar archive either so gzip is hardly enough. – Joakim Elofsson Oct 20 '09 at 16:04
True. i got the impression from the wording of the question that perhaps @ldigas, was looking more for how to uncompress files in a 'unix format', but I think I read it too narrowly. – DaveParillo Oct 20 '09 at 16:09 |
# When will PCA be equivalent to ICA?
$X = AS$ where $A$ is my mixing matrix and each column of $S$ represents my sources. $X$ is the data I observe.
If the columns of $S$ are independent and Gaussian, will the components of PCA be extremely similar to that of ICA? Is this the only requirement for the two methods to coincide?
Can someone provide an example of this being true when the $cov(X)$ isn't diagonal?
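One way to experiment with this numerically is sketched below (a rough sketch assuming NumPy and scikit-learn; the mixing matrix, sample size, and source distributions are made up for illustration). With independent Gaussian sources the model is only identifiable up to rotation, so ICA has nothing extra to exploit and both methods essentially just decorrelate the data; with non-Gaussian sources ICA can recover the unmixing directions while PCA generally cannot.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
A = np.array([[1.0, 0.5],
              [0.3, 2.0]])  # made-up mixing matrix (columns of S are the sources)
n_samples = 10_000

sources = {
    "gaussian": rng.normal(size=(n_samples, 2)),         # independent Gaussian sources
    "uniform": rng.uniform(-1, 1, size=(n_samples, 2)),  # independent non-Gaussian sources
}

for name, S in sources.items():
    X = S @ A.T  # observed data; rows are samples, so this is X = AS written row-wise
    pca = PCA(n_components=2).fit(X)
    ica = FastICA(n_components=2, random_state=0).fit(X)
    print(name)
    print("PCA components:\n", pca.components_)
    print("ICA unmixing estimate:\n", ica.components_)
```

Comparing the recovered directions for the two cases (e.g. against the rows of the inverse of the mixing matrix) makes the Gaussian non-identifiability visible directly.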
- |
Tags: web inclusion php
Rating:
TUCTF 2018 — "Easter egg" challenges
====================================
A series of three web challenges, themed around the book _Ready Player One_.
The Copper Gate
---------------
> How did I end up here? - Joker
> http://18.191.227.167/
We see what looks like a "placeholder" page, with a video referencing the book embedded on the page. The text reads:
> Please return at a later date for more content!
Which I took to be a hint that I needed to somehow make a request "from the future" to get a different version of the page. In fact, it was much simpler: an image included on the page was stored in the /images directory.
Navigating there, it turns out that directory listing is enabled, and there's a text file with instructions pointing us to "the development area":
http://18.191.227.167/images/sitenotes.txt
http://18.191.227.167/devvvvv/home.html
> Welcome to the development area
> You may be asking yourself how you got here... Truth be told I have no idea either. You may want to figure that out.
>
> Moving on, though.
>
> I hope you have as much fun solving this as I did writing it.
> A big thank you to Warren Robinett for beginning this fun tradition.
> In the spirit of the classic video game easter egg, I have hidden a series of challenges throughout this site. In the spirit of my favorite book, Ready Player One.
>
> (...)
>
> Each step of the hunt will award points respective to the challenge. The final step and to the egg is the crystal flag. Thank you to everyone for your participation. And now for the introduction.
>
> **Introductions**
>
> Three hidden flags open three secret gates.
> Wherein the challenger will be tested for worthy traits.
> And those with the skill to solve what I create
> Will reach The End, where the points await
>
> **The First Challenge**
>
> The Copper Flag awaits your attention
> Somewhere in an old direction
> But you have much to review
> If you hope to accrue
> The points protected by this section.
"An old direction" seems to point to a directory that we've already explored before. "Protected" made me think of .htaccess, but I got a 403 when trying to read it.
With "Preserve network logs" enabled in the Chrome dev console, I used the same trick as before and simply navigated up to /devvvvv, trying to see if we could get a directory listing.
Instead, devvvvv/index.html contained a meta tag redirecting us to devvvvv/home.html... but also a link to flag! (Base64 encoded)
TUCTF{W3lc0m3_T0_Th3_04515_Th3_C0pp3r_K3y}
---
The Jade Gate
-------------
Challenge description:
> Gotta make sure I log my changes. - Joker
> http://18.191.227.167/
On the page where we found the copper flag, there were extra instructions:
>
> in a backup long neglected
> But you can only retrace your steps
> once the logs are all collected
Okay, so there are some evocative keywords there:
- "backup": perhaps a zip with the source code / database dump is stored somewhere
- "logs": server & access logs? PHP stores logs in a default location, so perhaps there's a directory traversal exploit that would allow us to get them. I tried for a little bit, but no luck.
- "log my changes": wait, that sounds a lot like version control!
http://18.191.227.167/.git/
Bingo! We get the directory listing for a typical git repository. Let's download it for convenience:
wget -r http://18.191.227.167/.git/
Looking at the changes from each commit, after reading through a few funny / trollish messages, we find the Jade flag:
TUCTF{S0_Th1s_D035n7_533m_l1k3_175_f41r_8u7_wh0_3v3r_s41d_l1f3_15_f41r?}
---
The Crystal Gate
-------------
> I don't wanna go anywhere.
> http://18.191.227.167/
Continuing to analyze the Git repository's content, we see _staged_, but non-committed changes:

';
echo 'Note2: I can\'t seem to remember the param. It\'s "file"';
echo '
';
if (isset($_GET['file'])) {$file = $_GET['file']; if (strpos($file, '/etc/passwd') == true) {
include($file); } elseif (strpos($file, '.ssh') == true) {
include($file); echo ' '; echo 'Probably shouldn\'t put my own key in my own authorized keys, but oh well.'; } } ?> That certainly looks exploitable! For one, strpos only checks that the substring is _somewhere_ in $file.
After trying different values of \$file, I realized that the code seen in the repo wasn't exactly what's running on the server. The exploit is even easier, allowing inclusion of _any_ file:
http://18.191.227.167/crystalsfordays/traversethebridge.php?file=..
http://18.191.227.167/crystalsfordays/traversethebridge.php?file=../..
http://18.191.227.167/crystalsfordays/traversethebridge.php?file=../../TheEgg.html
And we got the flag!
Note: Only used for access management and to check user info.
Note2: I can't seem to remember the param. It's "file"
<html>
THE END
Congratulations! You have discovered the crystal key and unlocked the egg. Thank you for your participation in this competition and I hope you enjoyed the trip, as well as learned a few things in the process.
- Joker
TUCTF{3_15_4_M4G1C_NUMB3R_7H3_crys74L_k3Y_15_y0ur5!}
</html> |
# Homework Help: B field at the center of a large charges sheet
1. Jul 2, 2011
### zimo
1. The problem statement, all variables and given/known data
In a plastic film factory, a wide belt of thin plastic material is traveling between two successive rollers with the speed v. In the manufacturing process, the film has accumulated a uniform surface electric charge density σ. What is B near the surface of the belt in the middle of a large flat span?
2. Relevant equations
B(x) = $\frac{\mu I}{4\pi}\int \frac{dx' \times (x-x')}{|x-x'|^{3}}$
3. The attempt at a solution
I tried to calculate the integral, starting with assuming that the sheet can be represented via cylindrical coordinates - to better use the r and sin(theta) - a result from the cross product.
But then I've got $\frac{\sin(\theta)}{r}$ in the integrand, and then solved the $dr$ part with $\log(r)\big|_{\infty}^{0}$ and couldn't progress from there, so I suppose I made an error somewhere but can't find it.
2. Jul 2, 2011
### Matterwave
Since you are in the middle of a large flat span of uniform charge density, it is perhaps simpler to apply Ampere's law than try to brute force it with Biot-Savart.
3. Jul 3, 2011
### zimo
Maybe you are right, but the lecturer pointed out specifically to use Biot-Savart on this one...:yuck:
4. Jul 3, 2011
### fizika_kz
I think it would be right, if you first find the electric field. Then use Maxwell's equation to find the B.
5. Jul 3, 2011
### zimo
I approached him and asked it today, he said that he expects us to solve it only by Biot-Savart...
6. Jul 4, 2011
### Philip Wood
Regard the moving charged sheet as currents running in parallel wires. A strip of width $\Delta w$ will be equivalent to a wire carrying current $\sigma \, \Delta w \, v$. Agreed?
Now, using the B-S law to find B due to a long straight wire is a standard derivation. I expect you've learnt it already. [The result can be derived in one line from A's Law.]
So now you need to integrate the fields at a 'central point', due these strips of the sheet. A diagram is essential here. You'll need to use the right hand grip rule to get the direction of these fields, and then you'll need to add the fields as vectors. This isn't as hard as it sounds because field components perpendicular to the sheet cancel. Again, your answer can be checked in one line using A's Law. |
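Carrying out the integration described above (a sketch; here $d$ is the distance from the belt, $w$ the offset of each strip, and only the field components parallel to the belt survive the vector sum):

$$\mathrm{d}B_\parallel = \frac{\mu_0\,(\sigma v\,\mathrm{d}w)}{2\pi\sqrt{w^2+d^2}}\cdot\frac{d}{\sqrt{w^2+d^2}}, \qquad B = \frac{\mu_0 \sigma v}{2\pi}\int_{-\infty}^{+\infty}\frac{d\,\mathrm{d}w}{w^2+d^2} = \frac{\mu_0 \sigma v}{2},$$

independent of the distance from the belt; the field lies in the plane of the belt, perpendicular to $\vec v$, and points in opposite directions on the two sides.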
# Homework Help: Accelerating a car including the moment of inertia of the wheels
Tags:
1. Nov 5, 2017
### TSny
The torque equations can be used to get expressions for Ffront and Frear in terms of m and α. Use your diagrams for the wheels to help set up these equations.
2. Nov 5, 2017
### lichenguy
Could it be FrearR = Iα?
Is it the same at the front?
3. Nov 5, 2017
### TSny
Yes. Can you identify the type of force represented by Frear?
No, the front wheel also has the torque of the engine that you need to deal with.
4. Nov 5, 2017
### lichenguy
It's a contact force. Frictional force, i guess.
Is it:
τ - FfrontR = Iα?
5. Nov 5, 2017
### TSny
Yes
Yes.
6. Nov 5, 2017
### lichenguy
Cool!
Thank you for all the help.
7. Nov 5, 2017
### TSny
Sure. Good work!
8. Dec 3, 2017
### lichenguy
Umm, guys, i just did this using:
2Ffront - 2Frear = a(M + 4m),
τ - FfrontR = Iα,
FrearR = Iα,
but it gave me $a = \frac {2τ} {R(M+2m)}$ instead of $a = \frac {2τ} {R(M+6m)}$
Is something missing?
9. Dec 3, 2017
### TSny
These equations look correct.
Check your work. The equations should yield $a = \frac {2τ} {R(M+6m)}$.
10. Dec 4, 2017
### lichenguy
I made a mistake, all good now. =]
11. Jul 7, 2018
### NewtonianAlch
Can the OP or someone else explain how that equation was simplified to get a factor of 3 in the denominator for "m"?
The original equation involves a term with rotational kinetic energy, (1/2)Iω².
Was that "I" for the object's inertia converted to some kind of equivalent mass?
Last edited: Jul 7, 2018
12. Jul 7, 2018
### TSny
Each wheel is treated as a uniform, solid cylinder for which I = (1/2)mR². Also, for rolling without slipping, ω = v/R.
13. Jul 7, 2018
### NewtonianAlch
Thank you.
I was using I = mR², hence why it didn't come out the same.
14. Jul 7, 2018
### TSny
OK. Glad it makes sense now. |
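For completeness, combining the three equations quoted earlier in the thread with $I=\tfrac{1}{2}mR^2$ for a uniform solid-cylinder wheel and $\alpha = a/R$ for rolling without slipping reproduces the quoted answer:

$$F_{rear} = \frac{I\alpha}{R} = \frac{ma}{2}, \qquad F_{front} = \frac{\tau - I\alpha}{R} = \frac{\tau}{R} - \frac{ma}{2},$$

$$2F_{front} - 2F_{rear} = \frac{2\tau}{R} - 2ma = a(M+4m) \;\;\Longrightarrow\;\; a = \frac{2\tau}{R(M+6m)}.$$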
# Utility functions and classes (gammapy.utils)¶
## Introduction¶
gammapy.utils is a collection of utility functions that are used in many places or don’t fit in one of the other packages.
Since the various sub-modules of gammapy.utils are mostly unrelated, they are not imported into the top-level namespace. Here are some examples of how to import functionality from the gammapy.utils sub-modules:
from gammapy.utils.random import sample_sphere
sample_sphere(size=10)
from gammapy.utils import random
random.sample_sphere(size=10)
## Time handling in Gammapy¶
### Time format and scale¶
In Gammapy, astropy.time.Time objects are used to represent times:
>>> from astropy.time import Time
>>> Time(['1999-01-01T00:00:00.123456789', '2010-01-01T00:00:00'])
<Time object: scale='utc' format='isot' value=['1999-01-01T00:00:00.123' '2010-01-01T00:00:00.000']>
Note that Astropy chose format='isot' and scale='utc' as default and in Gammapy these are also the recommended format and time scale.
Warning
Actually what's written here is not true. In CTA it hasn't been decided if times will be in utc or tt (terrestrial time) format.
Here’s a reminder that this needs to be settled / updated: https://github.com/gammapy/gammapy/issues/284
When other time formats are needed it’s easy to convert, see the time format section and table in the Astropy docs.
E.g. sometimes in astronomy the modified Julian date mjd is used and for passing times to matplotlib for plotting the plot_date format should be used:
>>> from astropy.time import Time
>>> time = Time(['1999-01-01T00:00:00.123456789', '2010-01-01T00:00:00'])
>>> time.mjd
array([ 51179.00000143, 55197. ])
>>> time.plot_date
array([ 729755.00000143, 733773. ])
Converting to other time scales is also easy, see the time scale section, diagram and table in the Astropy docs.
E.g. when converting celestial (RA/DEC) to horizontal (ALT/AZ) coordinates, the sidereal time is needed. This is done automatically by astropy.coordinates.AltAz when the astropy.coordinates.AltAz.obstime is set with a Time object in any scale, no need for explicit time scale transformations in Gammapy (although if you do want to explicitly compute it, it’s easy, see here).
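For instance, here is a minimal sketch of such a conversion with plain Astropy (the observing site, time, and target below are arbitrary placeholders):

```python
import astropy.units as u
from astropy.coordinates import AltAz, EarthLocation, SkyCoord
from astropy.time import Time

# Hypothetical observing site and time
location = EarthLocation(lon=16.5 * u.deg, lat=-23.27 * u.deg, height=1800 * u.m)
obstime = Time('2010-01-01T03:00:00')

# Celestial (RA/DEC) coordinates of a target, roughly the Crab nebula
target = SkyCoord(ra=83.633 * u.deg, dec=22.014 * u.deg, frame='icrs')

# Astropy handles the sidereal time / time scale details internally
altaz = target.transform_to(AltAz(obstime=obstime, location=location))
print(altaz.alt.deg, altaz.az.deg)
```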
The Fermi-LAT time systems in a nutshell page gives a good, brief explanation of the differences between the relevant time scales UT1, UTC and TT.
### Mission elapsed times (MET)¶
[MET] time references are times representing UTC seconds after a specific origin. Each experiment may have a different MET origin that should be included in the header of the corresponding data files. For more details see Fermi-LAT time systems in a nutshell.
It’s not clear yet how to best implement METs in Gammapy, it’s one of the tasks here: https://github.com/gammapy/gammapy/issues/284
For now, we use the gammapy.time.time_ref_from_dict, gammapy.time.time_relative_to_ref and gammapy.time.absolute_time functions to convert MET floats to Time objects via the reference times stored in FITS headers.
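As a rough illustration of the idea (not the Gammapy API itself), a MET can be converted by hand with plain Astropy once the reference time is known; the header values below are made-up stand-ins for what would be read from a FITS header:

```python
from astropy.time import Time, TimeDelta

# Hypothetical FITS header entries defining the MET origin
meta = {'MJDREFI': 51910, 'MJDREFF': 7.428703703703703e-4, 'TIMESYS': 'TT'}

time_ref = Time(meta['MJDREFI'], meta['MJDREFF'], format='mjd', scale='tt')
met = 239557417.0  # mission elapsed time in seconds since the reference time
time = time_ref + TimeDelta(met, format='sec')

print(time.utc.isot)
```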
### Time differences¶
TODO: discuss when to use TimeDelta or Quantity or [MET] floats and where one needs to convert between those and what to watch out for.
## Energy handling in Gammapy¶
### Basics¶
Most objects in Astronomy require an energy axis, e.g. counts spectra or effective area tables. In general, this axis can be defined in two ways.
• As an array of energy values. E.g. the Fermi-LAT diffuse flux is given at certain energies and those are stored in an ENERGY FITS table extension. In Gammalib this is represented by GEnergy.
• As an array of energy bin edges. This is usually stored in EBOUNDS tables, e.g. for Fermi-LAT counts cubes. In Gammalib this is represented by GEbounds.
In Gammapy both use cases are handled by two separate classes: gammapy.utils.energy.Energy for energy values and gammapy.utils.energy.EnergyBounds for energy bin edges.
### Energy¶
The Energy class is a subclass of Quantity and thus has the same functionality, plus some convenience functions for FITS I/O:
>>> from gammapy.utils.energy import Energy
>>> energy = Energy([1,2,3], 'TeV')
>>> hdu = energy.to_fits()
>>> type(hdu)
<class 'astropy.io.fits.hdu.table.BinTableHDU'>
### EnergyBounds¶
The EnergyBounds class is a subclass of Energy. Additional functions are available e.g. to compute the bin centers
>>> from gammapy.utils.energy import EnergyBounds
>>> ebounds = EnergyBounds.equal_log_spacing(1, 10, 8, 'GeV')
>>> ebounds.size
9
>>> ebounds.nbins
8
>>> center = ebounds.log_centers
>>> center
<Energy [ 1.15478198, 1.53992653, 2.05352503, 2.73841963, 3.65174127,
4.86967525, 6.49381632, 8.65964323] GeV>
## Reference/API¶
### gammapy.utils.energy Module¶
#### Classes¶
- Energy: Energy quantity scalar or array.
- EnergyBounds: EnergyBounds array.
### gammapy.utils.units Module¶
Units and Quantity related helper functions
#### Functions¶
standardise_unit(unit) Standardise unit.
### gammapy.utils.coordinates Package¶
Astronomical coordinate calculation utility functions.
#### Functions¶
- angle_to_radius(angle, distance): Radius (pc), distance (kpc), angle (deg)
- cartesian(r, theta): Convert polar coordinates to cartesian coordinates.
- flux_to_luminosity(flux, distance): Distance is assumed to be in kpc
- galactic(x, y, z[, obs_pos]): Compute galactic coordinates lon, lat (deg) and distance (kpc)
- luminosity_to_flux(luminosity, distance): Distance is assumed to be in kpc
- minimum_separation(lon1, lat1, lon2, lat2): Compute minimum distance of each (lon1, lat1) to any (lon2, lat2).
- motion_since_birth(v, age, theta, phi): Compute motion of a object with given velocity, direction and age.
- pair_correlation(lon, lat, theta_bins): Compute pair correlation function for points on the sphere.
- polar(x, y): Convert cartesian coordinates to polar coordinates.
- radius_to_angle(radius, distance): Radius (pc), distance (kpc), angle (deg)
- velocity_glon_glat(x, y, z, vx, vy, vz): Compute projected angular velocity in galactic coordinates.
#### Variables¶
D_SUN_TO_GALACTIC_CENTER A Quantity represents a number with some associated unit.
### gammapy.utils.table Module¶
Table helper utilities.
#### Functions¶
- table_standardise_units_copy(table): Standardise units for all columns in a table in a copy.
- table_standardise_units_inplace(table): Standardise units for all columns in a table in place.
- table_row_to_dict(row[, make_quantity]): Make one source data dict.
- table_from_row_data(rows, **kwargs): Helper function to create table objects from row data.
### gammapy.utils.fits Module¶
FITS utility functions.
#### Functions¶
- table_to_fits_table(table[, name]): Convert Table to astropy.io.fits.BinTableHDU.
- fits_table_to_table(hdu): Convert astropy table to binary table FITS format.
- energy_axis_to_ebounds(energy): Convert EnergyBounds to OGIP EBOUNDS extension.
#### Classes¶
SmartHDUList(hdu_list) A FITS HDU list wrapper with some sugar.
### gammapy.utils.root Package¶
Utility functions to work with ROOT and rootpy.
#### Functions¶
- TH2_to_FITS(hist[, flipx]): Convert ROOT 2D histogram to FITS format.
- TH2_to_FITS_data(hist[, flipx]): Convert TH2 bin values into a numpy array.
- TH2_to_FITS_header(hist[, flipx]): Create FITS header for a given ROOT histogram.
- graph1d_to_table(graph): Convert ROOT TGraph to an astropy Table.
- hist1d_to_table(hist): Convert 1D ROOT histogram into astropy table.
### gammapy.utils.random Module¶
Random sampling for some common distributions
#### Functions¶
- get_random_state(init): Get a numpy.random.RandomState instance.
- sample_sphere(size[, lon_range, lat_range, …]): Sample random points on the sphere.
- sample_sphere_distance([distance_min, …]): Sample random distances if the 3-dim space density is constant.
- sample_powerlaw(x_min, x_max, gamma[, size, …]): Sample random values from a power law distribution.
### gammapy.utils.distributions Package¶
Utility functions / classes for working with distributions (e.g. probability density functions)
#### Functions¶
- density(func): Returns the radial surface density of a given one dimensional PDF.
- draw(low, high, size, dist[, random_state]): Allows drawing of random numbers from any distribution.
- normalize(func, x_min, x_max): Normalize a 1D function over a given range.
- pdf(func): Returns the one dimensional PDF of a given radial surface density.
#### Classes¶
- GeneralRandom(pdf, min_range, max_range[, …]): Fast random number generation with an arbitrary pdf of a continuous variable x.
- GeneralRandomArray(pdf): Draw random indices from a discrete probability distribution given by a numpy array.
### gammapy.utils.scripts Module¶
Utils to create scripts and command-line tools
#### Functions¶
- get_parser([function, description]): Make an ArgumentParser how we like it.
- get_installed_scripts(): Get list of installed scripts via pkg-resources.
- get_all_main_functions(): Get a dict with all scripts (used for testing).
- set_up_logging_from_args(args): Set up logging from command line arguments.
- read_yaml(filename[, logger]): Read YAML file.
- write_yaml(dictionary, filename[, logger]): Write YAML file.
- make_path(path): Expand environment variables on Path construction.
- recursive_merge_dicts(a, b): Recursively merge two dictionaries.
#### Classes¶
GammapyFormatter(prog[, indent_increment, …]) ArgumentParser formatter_class argument.
### gammapy.utils.testing Module¶
Utilities for testing
#### Functions¶
- requires_dependency(name): Decorator to declare required dependencies for tests.
- requires_data(name): Decorator to declare required data for tests.
- assert_wcs_allclose(wcs1, wcs2): Assert all-close for WCS.
- assert_skycoord_allclose(actual, desired): Assert all-close for SkyCoord.
- assert_time_allclose(actual, desired): Assert that two astropy.time.Time objects are almost the same.
### gammapy.utils.wcs Module¶
WCS related utility functions.
#### Functions¶
- linear_wcs_to_arrays(wcs, nbins_x, nbins_y): Make a 2D linear binning from a WCS object.
- linear_arrays_to_wcs(name_x, name_y, …): Make a 2D linear WCS object from arrays of bin edges.
- get_wcs_ctype(wcs): Get celestial coordinate type of WCS instance.
- get_resampled_wcs(wcs, factor, downsampled): Get resampled WCS object.
### gammapy.utils.nddata Module¶
Utility functions and classes for n-dimensional data and axes.
#### Functions¶
sqrt_space(start, stop, num) Return numbers spaced evenly on a square root scale.
#### Classes¶
- NDDataArray(axes[, data, meta, interp_kwargs]): ND Data Array base class.
- DataAxis(nodes[, name, interpolation_mode]): Data axis to be used with NDDataArray.
- BinnedDataAxis(lo, hi, **kwargs): Data axis for binned data.
### gammapy.utils.time Module¶
Time related utility functions.
#### Functions¶
- time_ref_from_dict(meta[, format, scale]): Calculate the time reference from metadata.
- time_ref_to_dict(time[, scale]): TODO: document and test.
- time_relative_to_ref(time, meta): Convert a time using an existing reference.
- absolute_time(time_delta, meta): Convert a MET into human readable date and time.
### gammapy.utils.modeling Module¶
Model classes to generate XML.
This is prototype code.
The goal was to be able to save gamma-cat in XML format for the CTA data challenge GPS sky model.
TODO (needs a bit of experimentation / discussion / thought and a few days of coding):
• add repr to all classes
• integrate this the existing Gammapy model classes to make analysis possible.
• don’t couple this to gamma-cat. Gamma-cat should change to create model classes that support XML I/O.
• sub-class Astropy Parameter and ParameterSet classes instead of starting from scratch?
• implement spatial and spectral mode registries instead of if-elif set on type to make SourceLibrary extensible.
• write test and docs
• Once modeling setup OK, ask new people to add missing models (see Gammalib, Fermi ST, naima, Sherpa, HESS) (it’s one of the simplest and nicest things to get started with)
For XML model format definitions, see here:
#### Classes¶
- Parameter(name, value[, unit, parmin, …]): Class representing model parameters.
- ParameterList(parameters[, covariance]): List of Parameters.
- SourceLibrary(source_list)
- SourceModel(source_name, source_type, …): TODO: having "source_type" separate, but often inferred from the spatial model is weird.
- SpectralModel(parameters): Spectral model abstract base class.
- SpectralModelPowerLaw(parameters)
- SpectralModelPowerLaw2(parameters)
- SpectralModelExpCutoff(parameters)
- SpatialModel(parameters): Spatial model abstract base class.
- SpatialModelPoint(parameters)
- SpatialModelGauss(parameters)
- SpatialModelShell(parameters)
- UnknownModelError: Error when encountering unknown model types. |
# Homework Help: Find the inverse of f if f(x) = x^2 - 8x + 8 and x is less than or equal to 4
1. Oct 18, 2011
### ironspud
1. The problem statement, all variables and given/known data
Find $f^{-1}(x)$ if $f(x)=x^{2}-8x+8$ and $x\leq4$
3. The attempt at a solution
I set $y=x^{2}-8x+8$, and then switch y and x to get $x=y^{2}-8y+8$.
I then try solving for y, but I end up with y's on both sides of the equation:
$x=y^{2}-8y+8$
$x-8=y^{2}-8y$
$x-8=y(y-8)$
$\frac{x-8}{y-8}=y$
$???$
2. Oct 18, 2011
### CompuChip
It's a quadratic equation in y, so try the quadratic formula :-)
3. Oct 18, 2011
### SammyS
Staff Emeritus
or complete the square.
$x-8=y^{2}-8y$
$x-8+16=y^{2}-8y+16$
$x+8=(y-4)^2$
Don't forget the ± when taking the square root.
The range of a function's inverse, f -1(x), is the same as the domain of the function, f(x). |
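Following the hints above, the solution can be finished as follows. From $x+8=(y-4)^2$,

$$y - 4 = \pm\sqrt{x+8},$$

and since the range of $f^{-1}$ must equal the domain of $f$, i.e. $y \le 4$, the negative root is the right one:

$$f^{-1}(x) = 4 - \sqrt{x+8}, \qquad x \ge -8.$$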
## 22.18 Injective modules over graded algebras
In this section we discuss injective graded modules over graded algebras analogous to More on Algebra, Section 15.55.
Let $R$ be a ring. Let $A$ be a $\mathbf{Z}$-graded algebra over $R$. Section 22.2 for our conventions. If $M$ is a graded $R$-module we set
$M^\vee = \bigoplus \nolimits _{n \in \mathbf{Z}} \mathop{\mathrm{Hom}}\nolimits _\mathbf {Z}(M^{-n}, \mathbf{Q}/\mathbf{Z}) = \bigoplus \nolimits _{n \in \mathbf{Z}} (M^{-n})^\vee$
as a graded $R$-module (no signs in the actions of $R$ on the homogeneous parts). If $M$ has the structure of a left graded $A$-module, then we define a right graded $A$-module structure on $M^\vee$ by letting $a \in A^ m$ act by
$(M^{-n})^\vee \to (M^{-n - m})^\vee , \quad f \mapsto f \circ a$
as in Section 22.13. If $M$ has the structure of a right graded $A$-module, then we define a left graded $A$-module structure on $M^\vee$ by letting $a \in A^ n$ act by
$(M^{-m})^\vee \to (M^{-m - n})^\vee , \quad f \mapsto (-1)^{nm}f \circ a$
as in Section 22.13 (the sign is forced on us because we want to use the same formula for the case when working with differential graded modules — if you only care about graded modules, then you can omit the sign here). On the category of (left or right) graded $A$-modules the functor $M \mapsto M^\vee$ is exact (check on graded pieces). Moreover, there is an injective evaluation map
$ev : M \longrightarrow (M^\vee )^\vee , \quad ev^ n = (-1)^ n \text{ the evaluation map }M^ n \to ((M^ n)^\vee )^\vee$
of graded $R$-modules, see More on Algebra, Item (17). This evaluation map is a left, resp. right $A$-module homomorphism if $M$ is a left, resp. right $A$-module, see Remarks 22.13.5 and 22.13.6. Finally, given $k \in \mathbf{Z}$ there is a canonical isomorphism
$M^\vee [-k] \longrightarrow (M[k])^\vee$
of graded $R$-modules which uses a sign and which, if $M$ is a left, resp. right $A$-module, is an isomorphism of right, resp. left $A$-modules. See Remark 22.13.7.
We claim that $A^\vee$ is an injective object of the category $\text{Mod}_ A$ of graded right $A$-modules. Namely, given a graded right $A$-module $N$ we have
$\mathop{\mathrm{Hom}}\nolimits _{\text{Mod}_ A}(N, A^\vee ) = \mathop{\mathrm{Hom}}\nolimits _{\text{Comp}(\mathbf{Z})}(N \otimes _ A A, \mathbf{Q}/\mathbf{Z})) = (N^0)^\vee$
by Lemma 22.13.2 (applied to the case where all the differentials are zero). We conclude because the functor $N \mapsto (N^0)^\vee = (N^\vee )^0$ is exact.
Finally, for every graded right $A$-module $M$ we can choose a surjection of graded left $A$-modules
$\bigoplus \nolimits _{i \in I} A[k_ i] \to M^\vee$
where $A[k_ i]$ denotes the shift of $A$ by $k_ i \in \mathbf{Z}$. We do this by choosing homogeneous generators for $M^\vee$. In this way we get an injection
$M \to (M^\vee )^\vee \to \prod A[k_ i]^\vee = \prod A^\vee [-k_ i]$
Observe that the products in the formula above are products in the category of graded modules (in other words, take products in each degree and then take the direct sum of the pieces).
We conclude that
1. the category of graded $A$-modules has enough injectives,
2. for every $k \in \mathbf{Z}$ the module $A^\vee [k]$ is injective, and
3. every $A$-module injects into a product in the category of graded modules of copies of shifts $A^\vee [k]$.
## Prealgebra (7th Edition)
$\frac{1}{a^2}$
$\frac{3a}{8}\times\frac{16}{6a^3}$ $\longrightarrow$ factor and eliminate common factors $\frac{\cancel{3a}}{\cancel{8}}\times\frac{\cancel{2}\times\cancel{8}}{\cancel{2}\times\cancel{3a}\times a^2}$ $\longrightarrow$ multiply $\frac{1}{a^2}$ |
# Multiple vortex-antivortex pair generation in magnetic nanodots - Condensed Matter > Strongly Correlated Electrons
Abstract: The interaction of a magnetic vortex with a rotating magnetic field causes the nucleation of a vortex-antivortex pair leading to a vortex polarity switching. The key point of this process is the creation of a dip, which can be interpreted as a nonlinear resonance in the system of certain magnon modes with nonlinear coupling. The usually observed single-dip structure is a particular case of a multidip structure. The dynamics of the structure with $n$ dips is described as the dynamics of nonlinearly coupled modes with azimuthal numbers $m=0,\pm n,\pm 2n$. The multidip structure with arbitrary number of vortex-antivortex pairs can be obtained in a vortex-state nanodisk using a space- and time-varying magnetic field. A scheme of a possible experimental setup for multidip structure generation is proposed.
Author: Yuri Gaididei, Volodymyr P. Kravchuk, Denis D. Sheka, Franz G. Mertens
Source: https://arxiv.org/ |
# ODE: Can't find mistake
1. Dec 25, 2005
### twoflower
Hi all,
I've been just solving this one:
$$y' + \frac{y}{x} = 3\sqrt[3]{\left(xy\right)^2}\arctan x$$
The problem is, one of the solutions I got doesn't pass the original equation and I can't find the mistake. Here it is:
After substituting
$$z = \sqrt[3]{y}$$
and thus getting
$$3z^2z' + \frac{z^3}{x} = 3\sqrt[3]{x^2}z^2\arctan x$$
dividing with $z^2$ (and so getting the condition for y not to be particular solution $y \equiv 0$)
$$3z' + \frac{z}{x} = 3x^{\frac{2}{3}}\arctan x$$
To solve this, I first solved the homogenous equation
$$3z' + \frac{z}{x} = 0$$
and few steps I won't write here I got
$$\log |z|^3 = \log |x| + C$$
$$|z|^3 = e^{C}|x|$$
$$z_1 = C\sqrt[3]{x}$$
$$z_2 = -C\sqrt[3]{x}$$
Well, I think this is the problematic step although I think it's ok.
To get the proper $C = C(x)$ for the non-homogenous equation I expressed $z$ in terms of
$C(x)$ and put it to the equation. It involved another ODE at the end of which I got
$$\log |C| = \log e^{C}\frac{1}{\sqrt[3]{x^2}}$$
$$C = Q\frac{1}{\sqrt[3]{x^2}}$$
I know that I should actually also write that
$$C_2 = -Q\frac{1}{\sqrt[3]{x^2}}$$
but this won't give me anything new since changing the sign in front of $C$ in expression
$$z = \pm C\sqrt[3]{x}$$
will give all possibilites.
Concerning $Q$, I got
$$Q = \left(\frac{x^2 + 1}{2}\arctan x - \frac{x}{2} + R\right)$$
and so
$$C = \frac{1}{\sqrt[3]{x^2}}\left(\frac{x^2 + 1}{2}\arctan x - \frac{x}{2} + R\right)$$
$$z = \pm\frac{1}{\sqrt[3]{x}}\left(\frac{x^2 + 1}{2}\arctan x - \frac{x}{2} + R\right)$$
and finally
$$y = z^3 = \pm\frac{1}{x}\left(\frac{x^2 + 1}{2}\arctan x - \frac{x}{2} + R\right)$$
Well, with the plus-signed solution it satisfies the original equation, while with the minus sign it doesn't. This is something that I think can be seen already from the original ODE, since the left side flips sign under $y \to -y$ while the right side doesn't depend on the sign of $y$.
Anyway, can you see where I did a mistake?
Thank you very much!
Last edited: Dec 25, 2005
2. Dec 25, 2005
### saltydog
Hey Twoflower. What up? Me, when I got to:
$$3z^{'}+\frac{z}{x}=3x^{2/3}\text{ArcTan[x]}$$
I'd treat it like a regular first order ODE and solve for the integrating factor. You know:
$$\sigma=x^{1/3}$$
so that:
$$d\left[x^{1/3}z\right]=x\text{Arctan[x]}$$
leaving:
$$x^{1/3}z=-\frac{x}{2}+\frac{1}{2}\text{Arctan[x]}+ \frac{x^2}{2}\text{Arctan[x]}+c$$
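For reference, the right-hand side above follows from integrating by parts:

$$\int x\arctan x \,dx = \frac{x^2}{2}\arctan x - \frac{1}{2}\int\frac{x^2}{1+x^2}\,dx = \frac{x^2+1}{2}\arctan x - \frac{x}{2} + c.$$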
3. Dec 25, 2005
### twoflower
Thank you Saltydog. Your solution is basically the same as the one I got, except that I also have the same solution with the minus sign.
We had been told about the integrating factor as an alternative to the method of variation of parameters, which I used here since I like it more.
You know, there has to be some flaw in my approach since the minus-sign solution doesn't satisfy the ODE while the same solution, only with positive sign, does. And I can't see, why should I exclude the minus-sign solution... |
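One way to see why the minus sign has to be discarded: substituting $z=-\frac{1}{\sqrt[3]{x}}\left(\frac{x^2 + 1}{2}\arctan x - \frac{x}{2} + R\right)$ into the linear equation and using the fact that the left-hand side is linear in $z$ gives

$$3z' + \frac{z}{x} = -3x^{2/3}\arctan x,$$

so the minus branch solves the equation with the right-hand side negated rather than the original one. The only genuine freedom in the general solution is the additive constant $R$ (the homogeneous part $R\,x^{-1/3}$ can have either sign), not an overall sign on the whole expression.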
# How do you graph y=x^2-2x-8?
Jan 2, 2018
Refer explanation.
#### Explanation:
• METHOD 1: ALGEBRA
$y = {x}^{2} - 2 x - 8$ is quadratic in $x$. $a = 1 , b = - 2 , c = - 8$
As the coefficient of ${x}^{2}$ is positive, its graph will be an upward-opening parabola.
Check the discriminant of the quadratic to examine the nature of the roots.
$D = {b}^{2} - 4 a c$
$D = 4 + 32 = 36$
As $D > 0$, the roots of the quadratic are real and unequal. Solving $y = 0$ gives the roots $x = 4 , - 2$
Here, $y = - 8$ at $x = 0$
The minimum of the above quadratic is at $x = - \frac{b}{2 a} = \frac{2}{2} = 1$
The value of the quadratic at the minimum is $- \frac{D}{4 a} = - \frac{36}{4} = - 9$
Analysing all the points above, the graph is:
graph{x^2-2x-8 [-25.65, 25.65, -12.83, 12.82]}
• METHOD 2: CALCULUS
Find $\frac{\mathrm{dy}}{\mathrm{dx}}$ and $\frac{{d}^{2} y}{\mathrm{dx}} ^ 2$ . Also check the nature of the graph by derivative tests. |
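A sketch of how the calculus method plays out here:

$$\frac{dy}{dx} = 2x - 2 = 0 \;\Rightarrow\; x = 1, \qquad \frac{d^2y}{dx^2} = 2 > 0,$$

so the only critical point is a minimum at $(1,-9)$; together with the intercepts $(0,-8)$, $(-2,0)$ and $(4,0)$ from Method 1, this fixes the same upward-opening parabola.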
## Eun-Kyung Cho (조은경) gave a talk on the minimum independent dominating set at the Discrete Math Seminar
On December 7, 2021, Eun-Kyung Cho (조은경) from the Hankuk University of Foreign Studies gave a talk at the Discrete Math Seminar on various upper bounds for the minimum independent dominating set (or, the minimum maximal independent set) in a graph. The title of her talk was “Independent domination of graphs with bounded maximum degree“.
## Eun-Kyung Cho (조은경), Independent domination of graphs with bounded maximum degree
The independent domination number of a graph $G$, denoted $i(G)$, is the minimum size of an independent dominating set of $G$. In this talk, we prove a series of results regarding independent domination of graphs with bounded maximum degree.
Let $G$ be a graph with maximum degree at most $k$ where $k \ge 1$. We prove that if $k = 4$, then $i(G) \le \frac{5}{9}|V(G)|$, which is tight. Generalizing this result and a result by Akbari et al., we suggest a conjecture on the upper bound of $i(G)$ for $k \ge 1$, which is tight if true.
Let $G'$ be a connected $k$-regular graph that is not $K_{k, k}$ where $k\geq 3$. We prove that $i(G')\le \frac{k-1}{2k-1}|V(G')|$, which is tight for $k \in \{3, 4\}$, generalizing a result by Lam, Shiu, and Sun. This result also answers a question by Goddard et al. in the affirmative.
In addition, we show that $\frac{i(G')}{\gamma(G')} \le \frac{k^3-3k^2+2}{2k^2-6k+2}$, strengthening upon a result of Knor, Škrekovski, and Tepeh, where $\gamma(G')$ is the domination number of $G'$.
Moreover, if we restrict $G'$ to be a cubic graph without $4$-cycles, then we prove that $i(G') \le \frac{4}{11}|V(G')|$, which improves a result by Abrishami and Henning.
This talk is based on joint work with Ilkyoo Choi, Hyemin Kwon, and Boram Park.
## Eun-Kyung Cho (조은경) gave a talk on the problem of decomposing a graph into a d-degenerate graph and a graph of bounded maximum degree at the Discrete Math Seminar
On March 3, 2020, Eun-Kyung Cho (조은경) from Hankuk University of Foreign Studies presented a talk on the existence of a decomposition of a planar graph into two edge-disjoint subgraphs, one of which is d-degenerate and the other has maximum degree at most h at the discrete math seminar. The title of her talk was “Decomposition of a planar graph into a d-degenerate graph and a graph with maximum degree at most h“. She will visit the IBS discrete math group until March 6.
## Eun-Kyung Cho (조은경), Decomposition of a planar graph into a $d$-degenerate graph and a graph with maximum degree at most $h$
Given a graph $G$, a decomposition of $G$ is a collection of spanning subgraphs $H_1, \ldots, H_t$ of $G$ such that each edge of $G$ is an edge of $H_i$ for exactly one $i \in \{1, \ldots, t\}$. Given a positive integer $d$, a graph is said to be $d$-degenerate if every subgraph of it has a vertex of degree at most $d$. Given a non-negative integer $h$, we say that a graph $G$ is $(d,h)$-decomposable if there is a decomposition of $G$ into two spanning subgraphs, where one is a $d$-degenerate graph, and the other is a graph with maximum degree at most $h$.
It is known that a planar graph is $5$-degenerate, but not always $4$-degenerate. This implies that a planar graph is $(5,0)$-decomposable, but not always $(4,0)$-decomposable. Moreover, by related previous results, it is known that a planar graph is $(3,4)$- and $(2,8)$-decomposable.
In this talk, we improve these results by showing that every planar graph is $(4,1)$-, $(3,2)$-, and $(2,6)$-decomposable. The $(4,1)$- and $(3,2)$-decomposabilities are sharp in the sense that the maximum degree condition cannot be reduced more.
This is joint work with Ilkyoo Choi, Ringi Kim, Boram Park, Tingting Shan, and Xuding Zhu.
# Creating annotated heatmaps¶
It is often desirable to show data which depends on two independent variables as a color coded image plot. This is often referred to as a heatmap. If the data is categorical, this would be called a categorical heatmap.
Matplotlib's imshow function makes production of such plots particularly easy.
The following examples show how to create a heatmap with annotations. We will start with an easy example and expand it to be usable as a universal function.
## A simple categorical heatmap¶
We may start by defining some data. What we need is a 2D list or array which defines the data to color code. We then also need two lists or arrays of categories; of course the number of elements in those lists need to match the data along the respective axes. The heatmap itself is an imshow plot with the labels set to the categories we have. Note that it is important to set both, the tick locations (set_xticks) as well as the tick labels (set_xticklabels), otherwise they would become out of sync. The locations are just the ascending integer numbers, while the ticklabels are the labels to show. Finally we can label the data itself by creating a Text within each cell showing the value of that cell.
import numpy as np
import matplotlib
import matplotlib.pyplot as plt
vegetables = ["cucumber", "tomato", "lettuce", "asparagus",
"potato", "wheat", "barley"]
farmers = ["Farmer Joe", "Upland Bros.", "Smith Gardening",
"Agrifun", "Organiculture", "BioGoods Ltd.", "Cornylee Corp."]
harvest = np.array([[0.8, 2.4, 2.5, 3.9, 0.0, 4.0, 0.0],
[2.4, 0.0, 4.0, 1.0, 2.7, 0.0, 0.0],
[1.1, 2.4, 0.8, 4.3, 1.9, 4.4, 0.0],
[0.6, 0.0, 0.3, 0.0, 3.1, 0.0, 0.0],
[0.7, 1.7, 0.6, 2.6, 2.2, 6.2, 0.0],
[1.3, 1.2, 0.0, 0.0, 0.0, 3.2, 5.1],
[0.1, 2.0, 0.0, 1.4, 0.0, 1.9, 6.3]])
fig, ax = plt.subplots()
im = ax.imshow(harvest)
# We want to show all ticks...
ax.set_xticks(np.arange(len(farmers)))
ax.set_yticks(np.arange(len(vegetables)))
# ... and label them with the respective list entries
ax.set_xticklabels(farmers)
ax.set_yticklabels(vegetables)
# Rotate the tick labels and set their alignment.
plt.setp(ax.get_xticklabels(), rotation=45, ha="right",
rotation_mode="anchor")
# Loop over data dimensions and create text annotations.
for i in range(len(vegetables)):
    for j in range(len(farmers)):
        text = ax.text(j, i, harvest[i, j],
                       ha="center", va="center", color="w")
ax.set_title("Harvest of local farmers (in tons/year)")
fig.tight_layout()
plt.show()
## Using the helper function code style¶
As discussed in the Coding styles one might want to reuse such code to create some kind of heatmap for different input data and/or on different axes. We create a function that takes the data and the row and column labels as input, and allows arguments that are used to customize the plot
Here, in addition to the above we also want to create a colorbar and position the labels above of the heatmap instead of below it. The annotations shall get different colors depending on a threshold for better contrast against the pixel color. Finally, we turn the surrounding axes spines off and create a grid of white lines to separate the cells.
def heatmap(data, row_labels, col_labels, ax=None,
            cbar_kw={}, cbarlabel="", **kwargs):
    """
    Create a heatmap from a numpy array and two lists of labels.

    Parameters
    ----------
    data
        A 2D numpy array of shape (N, M).
    row_labels
        A list or array of length N with the labels for the rows.
    col_labels
        A list or array of length M with the labels for the columns.
    ax
        A matplotlib.axes.Axes instance to which the heatmap is plotted. If
        not provided, use current axes or create a new one. Optional.
    cbar_kw
        A dictionary with arguments to matplotlib.Figure.colorbar. Optional.
    cbarlabel
        The label for the colorbar. Optional.
    **kwargs
        All other arguments are forwarded to imshow.
    """

    if not ax:
        ax = plt.gca()

    # Plot the heatmap
    im = ax.imshow(data, **kwargs)

    # Create colorbar
    cbar = ax.figure.colorbar(im, ax=ax, **cbar_kw)
    cbar.ax.set_ylabel(cbarlabel, rotation=-90, va="bottom")

    # We want to show all ticks...
    ax.set_xticks(np.arange(data.shape[1]))
    ax.set_yticks(np.arange(data.shape[0]))
    # ... and label them with the respective list entries.
    ax.set_xticklabels(col_labels)
    ax.set_yticklabels(row_labels)

    # Let the horizontal axes labeling appear on top.
    ax.tick_params(top=True, bottom=False,
                   labeltop=True, labelbottom=False)

    # Rotate the tick labels and set their alignment.
    plt.setp(ax.get_xticklabels(), rotation=-30, ha="right",
             rotation_mode="anchor")

    # Turn spines off and create white grid.
    ax.spines[:].set_visible(False)

    ax.set_xticks(np.arange(data.shape[1]+1)-.5, minor=True)
    ax.set_yticks(np.arange(data.shape[0]+1)-.5, minor=True)
    ax.grid(which="minor", color="w", linestyle='-', linewidth=3)
    ax.tick_params(which="minor", bottom=False, left=False)

    return im, cbar
def annotate_heatmap(im, data=None, valfmt="{x:.2f}",
                     textcolors=("black", "white"),
                     threshold=None, **textkw):
    """
    A function to annotate a heatmap.

    Parameters
    ----------
    im
        The AxesImage to be labeled.
    data
        Data used to annotate. If None, the image's data is used. Optional.
    valfmt
        The format of the annotations inside the heatmap. This should either
        use the string format method, e.g. "$ {x:.2f}", or be a
        matplotlib.ticker.Formatter. Optional.
    textcolors
        A pair of colors. The first is used for values below a threshold,
        the second for those above. Optional.
    threshold
        Value in data units according to which the colors from textcolors are
        applied. If None (the default) uses the middle of the colormap as
        separation. Optional.
    **kwargs
        All other arguments are forwarded to each call to text used to create
        the text labels.
    """

    if not isinstance(data, (list, np.ndarray)):
        data = im.get_array()

    # Normalize the threshold to the images color range.
    if threshold is not None:
        threshold = im.norm(threshold)
    else:
        threshold = im.norm(data.max())/2.

    # Set default alignment to center, but allow it to be
    # overwritten by textkw.
    kw = dict(horizontalalignment="center",
              verticalalignment="center")
    kw.update(textkw)

    # Get the formatter in case a string is supplied
    if isinstance(valfmt, str):
        valfmt = matplotlib.ticker.StrMethodFormatter(valfmt)

    # Loop over the data and create a Text for each "pixel".
    # Change the text's color depending on the data.
    texts = []
    for i in range(data.shape[0]):
        for j in range(data.shape[1]):
            kw.update(color=textcolors[int(im.norm(data[i, j]) > threshold)])
            text = im.axes.text(j, i, valfmt(data[i, j], None), **kw)
            texts.append(text)

    return texts
The above now allows us to keep the actual plot creation pretty compact.
fig, ax = plt.subplots()
im, cbar = heatmap(harvest, vegetables, farmers, ax=ax,
cmap="YlGn", cbarlabel="harvest [t/year]")
texts = annotate_heatmap(im, valfmt="{x:.1f} t")
fig.tight_layout()
plt.show()
## Some more complex heatmap examples¶
In the following we show the versatility of the previously created functions by applying it in different cases and using different arguments.
np.random.seed(19680801)
fig, ((ax, ax2), (ax3, ax4)) = plt.subplots(2, 2, figsize=(8, 6))
# Replicate the above example with a different font size and colormap.
im, _ = heatmap(harvest, vegetables, farmers, ax=ax,
cmap="Wistia", cbarlabel="harvest [t/year]")
annotate_heatmap(im, valfmt="{x:.1f}", size=7)
# Create some new data, give further arguments to imshow (vmin),
# use an integer format on the annotations and provide some colors.
data = np.random.randint(2, 100, size=(7, 7))
y = ["Book {}".format(i) for i in range(1, 8)]
x = ["Store {}".format(i) for i in list("ABCDEFG")]
im, _ = heatmap(data, y, x, ax=ax2, vmin=0,
cmap="magma_r", cbarlabel="weekly sold copies")
annotate_heatmap(im, valfmt="{x:d}", size=7, threshold=20,
textcolors=("red", "white"))
# Sometimes even the data itself is categorical. Here we use a
# matplotlib.colors.BoundaryNorm to get the data into classes
# and use this to colorize the plot, but also to obtain the class
# labels from an array of classes.
data = np.random.randn(6, 6)
y = ["Prod. {}".format(i) for i in range(10, 70, 10)]
x = ["Cycle {}".format(i) for i in range(1, 7)]
qrates = list("ABCDEFG")
norm = matplotlib.colors.BoundaryNorm(np.linspace(-3.5, 3.5, 8), 7)
fmt = matplotlib.ticker.FuncFormatter(lambda x, pos: qrates[::-1][norm(x)])
im, _ = heatmap(data, y, x, ax=ax3,
cmap=plt.get_cmap("PiYG", 7), norm=norm,
cbar_kw=dict(ticks=np.arange(-3, 4), format=fmt),
cbarlabel="Quality Rating")
annotate_heatmap(im, valfmt=fmt, size=9, fontweight="bold", threshold=-1,
textcolors=("red", "black"))
# We can nicely plot a correlation matrix. Since this is bound by -1 and 1,
# we use those as vmin and vmax. We may also remove leading zeros and hide
# the diagonal elements (which are all 1) by using a
# matplotlib.ticker.FuncFormatter.
corr_matrix = np.corrcoef(harvest)
im, _ = heatmap(corr_matrix, vegetables, vegetables, ax=ax4,
cmap="PuOr", vmin=-1, vmax=1,
cbarlabel="correlation coeff.")
def func(x, pos):
    return "{:.2f}".format(x).replace("0.", ".").replace("1.00", "")
annotate_heatmap(im, valfmt=matplotlib.ticker.FuncFormatter(func), size=7)
plt.tight_layout()
plt.show()
# ParametricPlot3D of filled out 3D shape?
I am trying to parameterize 2D and 3D Euclidean space by the variables:
$\qquad -\infty < x < +\infty,\, 0 < \phi < \pi$ and $0 < \theta < \pi$
To check that my parametrization is correct, I am plotting some ranges in these variables to explicitly see how it fills out the space. For instance, in 2D:
ParametricPlot[{Re[x Exp[I ϕ]], Im[x Exp[I ϕ]]}, {x, -10, 5}, {ϕ, 0, π}]
we can see how for values x<0 the lower half plane gets filled out, while for x>0 that happens for the upper half plane, such that for -Infinity < x < +Infinity the whole space is traversed once.
Now, if I try to do the same check in the 3D case, I get an error message:
ParametricPlot3D[
{Re[x Exp[I ϕ] Sin[θ]], Im[x Exp[I ϕ] Sin[θ]], x Cos[θ]},
{x, -10, 5}, {ϕ, 0, π/3}, {θ, 0, π/4}]
ParametricPlot3D::nonopt
It seems that ParametricPlot3D, even though it is a 3D function, cannot deal with 3 variables. How should I be plotting the filled out 3D region instead?
• In general, you should have at most one less parameter than the space you are filling. So in 2D space, you have a curve parameterized by a single parameter. In 3D space, you can have a curve parameterized by a single parameter, or have a surface parameterized by two parameters. – MikeY May 10 '17 at 14:14
• @MikeY Right, so what if I want to have a hypersurface in 3D parametrized by 3 parameters? Clearly, the 2D case did allow me to use two parameters to parametrize a surface, so why should that not work in 3D with three parameters? – Kagaratsch May 10 '17 at 14:15
• Take a look at RegionPlot3D[ ]. You are going from spherical to cartesian in your conversion; if you rewrite the equations from cartesian to spherical, so for example $r^2=x^2+y^2+z^2$, then you can specify the region to plot in terms of your bounds on r, phi, and theta, so fill the region where $x^2+y^2+z^2 < r_max && phibound && thetabound$. PS: I have no idea how I made a box around that equation! – MikeY May 10 '17 at 14:41
You could always treat it as a Region.
pr = ParametricRegion[
ComplexExpand[{Re[x Exp[I ϕ] Sin[θ]], Im[x Exp[I ϕ] Sin[θ]], x Cos[θ]}],
{{x, -10, 5}, {ϕ, 0, π/3}, {θ, 0, π/4}}
];
Region[pr]
Or dress it up a bit:
Region[pr, BoxRatios -> {1, 1, 1}, Boxed -> True,
Axes -> True, AxesLabel -> {x, ϕ, θ}]
Alternatively BoundaryDiscretizeRegion produces a similar looking output.
I ended up doing the following workaround, discretizing the ϕ variable, e.g.:
Show[Table[
ParametricPlot3D[
{Re[x Exp[I ϕ] Sin[θ]], Im[x Exp[I ϕ] Sin[θ]], x Cos[θ]},
{x, -10, 5}, {θ, 0, π/2}]
, {ϕ, 0, π/2, π/200}], PlotRange -> All]
The graphics is a bit heavy to rotate, since so many surfaces are overlapped. But it is good enough for a visualization in my opinion.
The quick and dirty approach is just to throw down points on your domain. If you want to be fancy you can connect them with lines or surfaces. You can even color code them or add mouseovers. Here is just the barebones version though:
Show[Graphics3D[
Table[
Point[{Re[x Exp[I \[Phi]] Sin[\[Theta]]],
Im[x Exp[I \[Phi]] Sin[\[Theta]]], x Cos[\[Theta]]}],
{x, -10, 5,0.2},
{\[Phi], 0, \[Pi]/3, Pi/24},
{\[Theta], 0, \[Pi]/4, Pi/24}]]]
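A rough matplotlib analogue of the same point-cloud idea, in case anyone wants to check the parametrization outside Mathematica (this is only a sketch of mine, not from the answers above, and it assumes a reasonably recent matplotlib):
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401, needed on older matplotlib
x = np.arange(-10, 5, 0.5)
phi = np.arange(0, np.pi / 3, np.pi / 24)
theta = np.arange(0, np.pi / 4, np.pi / 24)
X, PHI, THETA = np.meshgrid(x, phi, theta)
# Same parametrization: Re and Im of x*exp(i*phi)*sin(theta), plus x*cos(theta).
px = X * np.cos(PHI) * np.sin(THETA)
py = X * np.sin(PHI) * np.sin(THETA)
pz = X * np.cos(THETA)
fig = plt.figure()
ax = fig.add_subplot(111, projection="3d")
ax.scatter(px.ravel(), py.ravel(), pz.ravel(), s=2)
plt.show()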
# All Questions
131 views
### GARCH parameters
I'm trying to estimate parameters of GARCH(p,q) model. I tried p=1, q=1 with t-distribution errors. Ljung-Box showed no correlation in residuals and squared residual. But the null hypothesis that ...
51 views
### Compiling QuantLib example
I have followed the guidlines for installing QuantLib for mac from here http://quantlib.org/install/macosx.shtml and also fixed the flags using the commands: export CXXFLAGS = -stdlib=libstdc++ ...
6 views
### “Risk” Factor vs Bivariate Sorts
With regards to a cross-sectional asset pricing (stocks) study, I am testing if one variable can explain another. One common approach to do this, is to use the double-sorting portfolio technique (sort ...
5 views
### Is it possible to place hidden order inside spread when trading E-mini S&P 500?
My question is not about hidden orders in general. In equity market a trader can post his hidden order inside spread, is it the same way for E-mini S&P 500?
7 views
### Regression model extension
I've been asked to do out of sample procedure for my simple regression model. my dependent data is belong to 2500 index nad independent one is belong to 2500 stock log returned data. how should i ...
96 views
### SABR Implied Volatility and Option Prices
I am trying to understand SABR model. I am having difficulty to understand how to calibrate ABR a) the initial variance b) the volatility of variance c) the exponent for the forward rate d)the ...
28 views
### How is the fundamental theorem of asset pricing used?
I know that a multi-period market model is complete and arbitrage free if there's a unique equivalent martingale measure. The thing is, I have absolutely no clue how to apply this theorem to a simple ...
112 views
### How to automatically get all options data for a particular stock into microsoft excel?
I'm looking for a way to get the entire options chain (All options expiries) for a particular stock in excel without manually copy pasting anything. It does not have to be real time and I will only be ...
229 views
### Up and Down days in GBPUSD and a Filter
I want to study if the odds of an up or down day in a forex pairs is 50-50. I just count the total number of up and down days in X years and compare it with the total days. The results are very ...
8 views
### How to write time-varying functions in R? Applied example
Let's say I want to use a Gaussian copula $$C_{R_t}(\eta_1, ..., \eta_n) = N_{R_t}(N^{-1}(\eta_1), ...,N^{-1}(\eta_n))$$ with a time-varying correlation matrix $R_t$. Through DCC we model the ...
95 views
### VEC GARCH (1,1) for 4 time series
I have to estimate a VEC GARCH(1,1) model in R. I already tried rmgarch, fGarch, ccgarch, mgarch, tsDyn. Has somebody estimated a model like that? ...
20 views
### Python statsmodel ARMA question
I am reading through the documentation of statsmodel package in python from the link The (p,q) order of the model for the number of AR parameters, differences, and MA parameters to use. How do I ...
11 views
### How to build a cross currency swap pricer?
We're looking to build a pricer to convert a funding spread in a given currency over a specific funding basis e.g. 20 bps EUR 3m€ and convert it to a funding spread to a different currency with a ...
51 views
### Historical volatility from non-uniform samples
The way I compute historical volatility is that I take two parameters $dt$ and $T$, get a list of stock prices with the step of $dt$ over the window $T$ (so $T/dt+1$ samples in total), compute $T/dt$ ...
37 views
### What is the effect of mean-reversion on an upper barrier knock-out call option?
Consider a mean-reverting normal model for an underlying $dX^{(1)}_t=-\kappa X^{(1)}_tdt+\sigma^{(1)} dW^{(1)}_t$, for fixed time-independent constants, $\kappa$ (mean-reversion) and $\sigma^{(1)}$ ...
14 views
### Python regenerating ARMA params using statsmodels
I am trying to regenerate the ARMA parameters from statsmodel in python. The code I am using is: ...
1k views
### What are modern algorithms for trade classification?
When dealing with trade data, for example from TAQ, a common problem is that of determining whether a trade was a buy or a sell. The most commonly used classifier is the Lee-Ready algorithm (Inferring ...
88 views
### How to obtain Standardized Residuals from a Time-Series?
I have my estimates for an AR(3). To obtain the residuals I'm supposed to use $$Y_t-\hat\phi_0-\hat\phi_1Y_{t-1}-\hat\phi_2Y_{t-2}-\hat\phi_3Y_{t-3},$$ where the Y's are from the dataset. If I do ...
10 views
### Define the order of GARCH(m.s)
I know that if the order of Arch(m) is over 3, we should use GARCH and GARCH(1,1) was proved to be the best. But was GARCH(1,1) proved to be available for any country's stock market? My result show ...
31 views
### Option based approach to real capital structures
Has anyone made a serious attempt to apply option theory to real assets and capital structures, taking into account all the messy details ?
14 views
### Converting a factor vector in R
If I have a factor vector with 3 levels, "", "No" and "Yes" how can I convert this to a binary factor vector with "na" if no answer , 1 for "Yes" and 0 for "No" ?
27 views
### Is $(1,0,0,0,…,0)$ a legitimate dividend stream?
A book I am reading defines a positive linear functional as a "price functional" from a set of adapted processes to the real numbers. Specifically, it defines a "consistent price functional" as one ...
20 views
### $0$-beta stock and diversification
If we invest $w$ in the market portfolio and $1 - w$ in the risk-free asset, and observe a $0$-beta asset with expected return greater than the return on the risk-free asset, how can this be used in ...
105 views
### quantlib python : missing methods?
I'm reading Introduction to Selected Classes of the QuantLib Library I by Dimitri Reiswich and tries to "convert" it to Python. It seems to me that some C++ possibilities aren't available in python. ...
2k views
### How to estimate real-world probabilities
In the world of finance, Risk-neutral pricing allow us to estimate the fair value of derivatives using the risk free rate as the expected return of the underlyings. However, the behavior of ...
55 views
### Derivative: Delta of a Down and Out Call Option with Barrier=Debt(K)
I am trying to compute the derivative of this function with respect to V0: This is the price of a down and out call option, assuming the barrier equal to the level of debt K. In other terms, I need ...
101 views
### Interpret simulation results ($P$ and $Q$ measures)
I am struggling in interpreting results of my simulations. I use Monte Carlo algorithm to simulate stock paths and calculate option price. The notation: $r$ is a risk free interest rate, $T$ is time ...
15 views
### Is the option payoff = exercise price = strike price [on hold]
This may be a somewhat arbitrary question, but in the Monte Carlo option pricing method, they talk about exercise price / option payoff. Is this equivalent to the strike price in the BSM model, ...
8 views
### anyone know haw would we calculate hml ,(fama and french three factors model) [on hold]
how we calculate hml and smb from our own data (malaysian data) so i can not use the dta from keneth french library using excel and when we divide firms into 6 portfolios we use value and book ratio ...
12 views
### R:log return calculation for panel data structure
I have a long form panel for hourly prices of stocks. I want to do log return calculation for this panel data structure. Below is my code: ...
44 views
### Correlation between asset A and Portfolio X (which contains A)
After a few hours trying to solve this I give up! I need help. I need to calculate the BETA of an asset with respect to a portfolio that contains this asset. I have the volatility and correlations ...
27 views
### What is a definition of “Benchmark”?
The word "benchmark" is often used in Finance, but in a rather fuzzy manner, there for a rough idea of what it is, and how it is 'defined'. Can someone provide a rigorous and precise definition of ...
36 views
### Physical interpretation of variance in returns in a portfolio design
I have a downloaded the log-returns at successive times of 98 stocks from S&P index over 753 days. I calculated the total daily return according to the formula 1 below, where ...
77 views
### SVI negative rates
I've used the SVI model in the past for equity option which worekd quite well. I came across a post on Wilmott where someone said hes using SVI for swaption as well. I would like to test the model and ...
13 views
### Reference for option pricing, binomial multi-period model using martingales and conditional expectations
The title basically says it all. I am looking for a reference text on the pricing of options in a binomial multi-period model. It should be mathemathically rigorous using martingales and conditional ...
42 views
### Maximization with risk-neutral investors and VaR constraints
In this paper, the authors make a simple model with: (1) A global bank, who is risk-neutral but has a Value-at-Risk constraint: $$\max_{x_t^B} E_t[x_t^B\prime R_{t+1}]$$ s.t. \alpha ...
40 views
### Pricing foreign currency bonds - which approach is more theoretically “sound”?
You own a fixed rate corporate bond in foreign currency (let's say JPY). Your domestic currency is USD. Which of the these two approaches do you consider theoretically better? Discount JPY cash ...
24 views
### Is This A Viable Alternative Options Pricing Method?
i'm currently a high school student who hasn't gone past Algebra II, and thus I have minimal Calculus knowledge. I know the basics of Integration and Derivation (drop the coefficient, raise to the ...
38 views
### How to define the return of this portfolio? [on hold]
I have an insurer with a some assets that he plans to invest into : Stock Zero-coupon bond with maturity 10 years We know also that the stock is driven by the geometric borwnian motion, the short ...
144 views
### Calculating historical implied volatility
I know that each individual option has it's own implied volatility, but how do you go about calculating the overall implied volatility for an underlying? For example when someone sais the IV of a ...
52 views
### Pricing a Vanilla swap between coupons; What rates to use?
Vanilla Swap question. Entered into a 5Y fixed for floating HUF swap. Fixed is annual coupons, Float is semi-annual coupons. 1 month later I want to price it. I set up my future values for Fixed ...
172 views
### Square of Wiener process
In Ito's calculus one often comes $dW^2=dt$. How does this come about? What is it's relation to the Milstein method?
31 views
### Preparation for interview: influx of power of the moon
I am preparing myself for an interview for a quantitative analyst position and one of the sample questions asked in previous examinations was: "Suppose the moon were to disintegrate, and fall to ...
23 views
### Arbitrage and completeness in multiperiod model?
Given a 2-period market with above stock price process along with a riskfree stock with a return of 5%, how do I determine whether the market is arbitrage-free and complete when I only have ...
14 views
### Estimate the risk of swaptions
I would like to model OTM Swaptions. I can use some implementation of the Bachelier model (not B76 due to negative rates) and implied volatilities from Bloomberg. For 10Y X 10Y (10 years option ...
20 views
### Black Litterman: Is it possible to have multiple views (from different sources) on the same asset?
From the basics of Black Litterman I understand that each view on a stock is implemented via the pick matrix P with the expected value of the views in Q. I have read several papers where each stock ...
Suppose $dA_t = A_t[\mu dt+\sigma dW_t]$ (assets' value) under the physical measure, plus the other assumptions of the Merton model. Suppose further that debt and equity are tradeable assets that ... |
# In how many parts is a plane cut by n lines, or a space cut by n planes?
Into how many parts at most is a plane cut by $n$ lines? Into how many parts is space divided by $n$ planes in general position?
My approach:
$$p(n+1)=p(n)+n+1$$ $$s(n+1)=s(n)+p(n)$$
This solution is given by Engel (pg. 40). It starts with the observation that the regions into which the plane is divided by $n$ lines are defined by their vertices, which are the points of intersection of the given lines.
We may assume that none of the lines is horizontal; if one is, rotate the plane by a small angle. This ensures that the regions are of two sorts: some regions (finite or not) have a lowest vertex, others do not. Every point of intersection serves as the lowest vertex of exactly one region. Therefore, for $n$ lines, there are $\binom{n}{2}$ regions of the first sort. There are $n+1$ regions of the second sort, which becomes apparent when a horizontal line is drawn that crosses all $n$ lines below all of their "legitimate" intersection points.
Therefore, $$L_n=\binom{n}{2}+\binom{n}{1}+\binom{n}{0}$$
The latter expression can be easily generalized to a 3D problem wherein the question is about the number of regions into which n planes divide the space. The answer is
$$\binom{n}{3}+\binom{n}{2}+\binom{n}{1}+\binom{n}{0}$$
How? Also, how to prove that if $S(n)$ is the space part, then at least $\frac{2n-3}{4}$ tetrahedra will exist?
What will happen if instead of a line, a circle cuts the plane?
If we arrange $n$ hyperplanes in a $d$-dimensional space such that the number of regions is maximized, we obtain
$$R_d(n) = \sum\limits_{i = 0}^d\binom{n}{i}$$
This is a standard result that you can find in nearly any introductory combinatorics book. This is an outline of the proof:
The following recurrence relation describes this problem:
$$R_d(n) = R_d(n-1) + R_{d-1}(n-1)$$
To see why this is true, imagine taking $n-1$ hyperplanes in general position, then adding the $n^{th}$ hyperplane to the group. If we consider only this hyperplane, each intersection with the other hyperplanes is a "hyperline" in this $(d-1)$-dimensional hyperplane, which will make $R_{d-1}(n-1)$ regions on that plane.
After you convince yourself that each of these regions created on the $n^{th}$ hyperplane adds one to the original $d$-dimensional space, the recurrence is clear. This leaves us with the following:
$$\sum\limits_{i = 0}^d\binom{n}{i} = \sum\limits_{i = 0}^d\binom{n-1}{i} + \sum\limits_{i = 0}^{d-1}\binom{n-1}{i}$$ $$= \sum\limits_{i = 0}^d\binom{n-1}{i} + \sum\limits_{i = 1}^{d}\binom{n-1}{i-1}$$ $$= \binom{n-1}{0} + \sum\limits_{i = 1}^d\left[\binom{n-1}{i} + \binom{n-1}{i-1}\right]$$ Apply Pascal's formula, and note that $\binom{n-1}{0} = \binom{n}{0}$ $$= \binom{n}{0} + \sum\limits_{i = 1}^d\binom{n}{i}$$ $$= \sum\limits_{i = 0}^d\binom{n}{i}$$
Because the recurrence is satisfied and the initial values match, the given summation will work (this is a very lazy way to show it holds).
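As a quick sanity check of the closed form against the recurrence (this is only a sketch of mine, not part of the original answer; plain Python, 3.8+ for math.comb):
from math import comb

def regions(n, d):
    # Maximum number of regions that n hyperplanes cut a d-dimensional space into.
    return sum(comb(n, i) for i in range(d + 1))

# Verify R_d(n) = R_d(n-1) + R_{d-1}(n-1) for small n and d.
for d in (1, 2, 3, 4):
    for n in range(1, 10):
        assert regions(n, d) == regions(n - 1, d) + regions(n - 1, d - 1)

print(regions(5, 2))  # 16 regions from 5 lines in the plane
print(regions(5, 3))  # 26 regions from 5 planes in space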
For circles in the plane, the number of regions is given by A014206.
$$R(n) = n^2-n+2$$
This is easy to show by induction. The base case $n=1$, should be clear. We will assume that
$$R(n) = n^2-n+2$$
If we add another circle, then it intersects each other circle twice, creating $2$ regions (overlap, and the non-overlap area inside the new circle). As we currently have $n$ circles, adding the next will create $2n$ more regions:
$$R(n) + 2n = n^2 + n + 2 = (n+1)^2 - (n+1) + 2 = R(n+1)$$
So we are done by induction.
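Again only as a numerical sanity check (my sketch, not part of the answer), the closed form and the +2n induction step agree for small n:
def circle_regions(n):
    # Maximum number of plane regions produced by n mutually intersecting circles.
    return n * n - n + 2

for n in range(1, 8):
    assert circle_regions(n) + 2 * n == circle_regions(n + 1)
print([circle_regions(n) for n in range(1, 6)])  # [2, 4, 8, 14, 22]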
I don't think the statement about the tetrahedra is true. If you have just two hyperplanes, it does not hold, nor even with three. Perhaps there is further detail in your textbook?
• Thanks:)....yes ,sorry about tetrahedra....n>=5 – blue boy Jun 20 '15 at 0:26 |
# OpenGL Problem with VBOs on Windows
## Recommended Posts
So I have a problem with a bit of OpenGL code I wrote. I've checked Google, these forums, a few FAQs etc but haven't managed to find an answer. On my iMac, VBOs just work. No problems. On my friend's Windows XP box, however, they don't. Now I have basically the following code:
GLuint vbo_id;
glGenBuffers(1, &vbo_id);
printf("inited VBO id %i\n", vbo_id);
On the Mac, this tells me that the index is 1 for the first VBO, 2 for the second, etc. I can then load data into the VBO, draw it, no problems there. On Windows, however, it says that the vbo_id is supposed to be something like -84646712, in other words, the variable hasn't been touched at all. If I initialize it to 0, it stays 0. So basically glGenBuffers() behaves just like a NOP. The most obvious explanation is of course the lack of the proper extension. On Windows, I'm using glee to help with the extension names, then the following:
if (GLEE_VERSION_1_2) printf("GL version 1.2 available\n");
if (GLEE_VERSION_1_3) printf("GL version 1.3 available\n");
if (GLEE_VERSION_1_4) printf("GL version 1.4 available\n");
if (GLEE_VERSION_1_5) printf("GL version 1.5 available\n");
if (GLEE_VERSION_2_0) printf("GL version 2.0 available\n");
if (GLEE_ARB_vertex_buffer_object) printf("vertex_buffer_object extension available\n");
The first three ifs print their stuff, which seems to tell me that the OpenGL version is 1.4; the last if reports that the vertex buffer object extension is also available. So what am I missing here? Is it just a case of bad display drivers or something like that? The most likely cause seems to be that the extension simply isn't implemented, and in the library there's just a stub like void glGenBuffers(...) {}; but I was kind of expecting it to not compile or at least crash if it isn't implemented. Maybe I'm just using it wrong? But how? Thanks in advance for any help.
It's probably where you're calling it. Does GLee have an init function?
What gfx card is on the XP?
OpenGL.org will have a document stating what GL version you need (1.5 I believe).
OK this is something of a quirk and I had banged my head on this problem a long time ago so here goes.
In OpenGL 1.4 the vertex buffer object was an extension so it is available as an extension only i.e. GL_ARB_vertex_buffer_object (or in your case GLEE_ARB_vertex_buffer_object)
under this extension the functions for vertex buffer objects have to be appended with ARB
so they are basically
glGenBuffersARB
glBindBufferARB
glDeleteBuffersARB
..
..
(and so on)
In OpenGL 1.5 they were promoted to proper OpenGL functions, ie. the ones you are using
glGenBuffers
glBindBuffer
glDeleteBuffers
..
..
(and so on)
For compatibility reasons, you can append the ARB to your code and it should work on a machine that has GL 1.4 and the required vertex buffer object extension.
If you are on Mac these extensions will be available to you without an extension loader. With Windows you will need to get GLEE or GLEW to load the extensions or do it yourself. I can tell you using GLEE or GLEW will save you a lot of time and headaches.
After some more hassle everything seems to work perfectly now.
On the desktop machine, replacing the 64-bit Windows XP with a normal 32-bit one gave us OpenGL up to version 2.0 (as reported by the code snippets I posted earlier). And, OpenGL 2.0 + glee = VBOs that actually work. Like I said in the first post, we're using glee, and while it did claim that the VBO extension (among other stuff) is available, VBOs still didn't work. This was with 64-bit, with glee reporting the OpenGL version as 1.4.
I'm not sure whether it was better display drivers or some more generic difference between 64-bit and 32-bit Windows that made things work, but, well, there you are. :)
# Math Help - Prove by induction?
1. ## Prove by induction?
How would I prove this by induction...1/(k^2+n)^(1/2)...basically in words, 1 divided by the square root of k squared plus n
2. Originally Posted by tn11631
How would I prove this by induction...1/(k^2+n)^(1/2)...basically in words, 1 divided by the square root of k squared plus n
There is nothing to prove about that expression.
Please post the exact question as you have.
3. Originally Posted by Plato
There is nothing to prove about that expression.
Please post the exact question as you have.
Sorry I was told to prove by induction to see if the sequence converges or diverges and im still so confused..here is the original question:
does this sequence {sigma from n=1 to k (1/(k^2+n)^(1/2)) } k=1 to infinity ..Sorry about that..thanks
4. Originally Posted by tn11631
Sorry I was told to prove by induction to see if the sequence converges or diverges and im still so confused..here is the original question:
does this sequence {sigma from n=1 to k (1/(k^2+n)^(1/2)) } k=1 to infinity ..Sorry about that..thanks
Still not quite clear.
Is it $S_k = \sum\limits_{n = 1}^k {\frac{1}{{\sqrt {k^2 + n} }}}$?
So that we are asked if the $S_k$ sequence converges?
If so is the sequence increasing and bounded above by 1?
5. Originally Posted by Plato
Still not quite clear.
Is it $S_k = \sum\limits_{n = 1}^k {\frac{1}{{\sqrt {k^2 + n} }}}$?
So that we are asked if the $S_k$ sequence converges?
If so is the sequence increasing and bounded above by 1?
um, in my book it's stated really weirdly, but I think setting the whole thing equal to the sequence S_k takes away the original brackets, so I think that is correct. And yes, the sequence is increasing and it looks like it's approaching one, so I'm going to say that it is bounded above by 1. I just don't know how to prove it's converging and what its limit is.
6. Any monotone bounded sequence of real numbers converges.
7. Originally Posted by Plato
Any monotone bounded sequence of real numbers converges.
Oh duh! So just showing that it is increasing shows that it is a monotone sequence, and since it's approaching 1 it's bounded above by 1. Thanks, I can't believe I missed that. However, since it converges, what would its limit be? Would it be 1, or is that just its least upper bound?
8. well the limit is indeed 1, but not because it does have a bound.
actually, this is solved by using the squeeze theorem, since for $1\le n\le k$ we have that $\frac{1}{\sqrt{k^{2}+k}}\le \frac{1}{\sqrt{k^{2}+n}}\le \frac{1}{\sqrt{k^{2}+1}},$ then $\frac{k}{\sqrt{k^{2}+k}}\le \sum\limits_{n=1}^{k}{\frac{1}{\sqrt{k^{2}+n}}}\le \frac{k}{\sqrt{k^{2}+1}},$ then as $k\to\infty$ we get $S_k\to1.$
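Not part of the thread, but a quick numerical check of that limit in plain Python supports the squeeze argument (the helper name partial_sum is just for illustration):
from math import sqrt

def partial_sum(k):
    # S_k = sum over n = 1..k of 1 / sqrt(k^2 + n)
    return sum(1.0 / sqrt(k * k + n) for n in range(1, k + 1))

for k in (10, 100, 1000, 10000):
    print(k, partial_sum(k))
# The printed values climb toward 1 as k grows, as expected.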
9. Thanks a bunch! and thanks to every one who helped!! |
# What percent of 33.5 is 21?
###### Question:
What percent of 33.5 is 21?
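One way to work it out: $\frac{21}{33.5}\times 100\% \approx 62.7\%$, so 21 is roughly 62.7% of 33.5.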
# $B\to K\eta,K\eta^{\prime}$ Decays
Authors
Type
Published Article
Publication Date
Oct 01, 2008
Submission Date
Oct 01, 2008
Identifiers
DOI: 10.1016/j.nuclphysbps.2008.12.088
Source
arXiv
The nonet symmetry scheme seems to describe rather well the masses and $\eta-\eta^{\prime}$ mixing angle of the ground state pseudo-scalar mesons and is thus expected to be also a good approximation for the matrix elements of the pseudo-scalar density operators which play an important role in charmless two-body B decays with $\eta$ or $\eta^{\prime}$ in the final state. In this talk, I would like to report on a recent work on the $B^{-}\to K^{-}\eta, K^{-}\eta^{\prime}$ decays using nonet symmetry for the matrix elements of pseudo-scalar density operators. We find that the branching ratios for $B\to PP$ with an $\eta$ meson in the final state agree well with data, while those with an $\eta^{\prime}$ meson are underestimated by $20{-}30\%$. This could be considered as a more or less successful prediction for QCDF, considering the theoretical uncertainties involved. It could also indicate that additional power-suppressed terms could bring the branching ratio closer to experiment, as with the $B\to K^{*}\pi$ and $B\to K^{*}\eta$ decays, for which the measured branching ratios are much bigger than the QCDF predictions.
## May 25, 2018
### Chris Siebenmann
#### There's real reasons for Linux to replace ifconfig, netstat, et al
One of the ongoing system administration controversies in Linux is that there is an ongoing effort to obsolete the old, cross-Unix standard network administration and diagnosis commands of ifconfig, netstat and the like and replace them with fresh new Linux specific things like ss and the ip suite. Old sysadmins are generally grumpy about this; they consider it yet another sign of Linux's 'not invented here' attitude that sees Linux breaking from well-established Unix norms to go its own way. Although I'm an old sysadmin myself, I don't have this reaction. Instead, I think that it might be both sensible and honest for Linux to go off in this direction. There are two reasons for this, one ostensible and one subtle.
The ostensible surface issue is that the current code for netstat, ifconfig, and so on operates in an inefficient way. Per various people, netstat et al operate by reading various files in /proc, and doing this is not the most efficient thing in the world (either on the kernel side or on netstat's side). You won't notice this on a small system, but apparently there are real impacts on large ones. Modern commands like ss and ip use Linux's netlink sockets, which are much more efficient. In theory netstat, ifconfig, and company could be rewritten to use netlink too; in practice this doesn't seem to have happened and there may be political issues involving different groups of developers with different opinions on which way to go.
(Netstat and ifconfig are part of net-tools, while ss and ip are part of iproute2.)
However, the deeper issue is the interface that netstat, ifconfig, and company present to users. In practice, these commands are caught between two masters. On the one hand, the information the tools present and the questions they let us ask are deeply intertwined with how the kernel itself does networking, and in general the tools are very much supposed to report the kernel's reality. On the other hand, the users expect netstat, ifconfig and so on to have their traditional interface (in terms of output, command line arguments, and so on); any number of scripts and tools fish things out of ifconfig output, for example. As the Linux kernel has changed how it does networking, this has presented things like ifconfig with a deep conflict; their traditional output is no longer necessarily an accurate representation of reality.
For instance, here is ifconfig output for a network interface on one of my machines:
; ifconfig -a
[...]
inet6 fe80::6245:cbff:fea0:e8dd prefixlen 64 scopeid 0x20<link>
ether 60:45:cb:a0:e8:dd txqueuelen 1000 (Ethernet)
[...]
There are no other 'em0:...' devices reported by ifconfig, which is unfortunate because this output from ifconfig is not really an accurate picture of reality:
; ip -4 addr show em0
[...]
inet 128.100.3.XX/24 brd 128.100.3.255 scope global em0
valid_lft forever preferred_lft forever
inet 128.100.3.YY/24 brd 128.100.3.255 scope global secondary em0
valid_lft forever preferred_lft forever
This interface has an IP alias, set up through systemd's networkd. Perhaps there once was a day when all IP aliases on Linux had to be set up through additional alias interfaces, which ifconfig would show, but these days each interface can have multiple IPs and directly setting them this way is the modern approach.
This issue presents programs like ifconfig with an unappealing choice. They can maintain their traditional output, which is now sometimes a lie but which keeps people's scripts working, or they can change the output to better match reality and probably break some scripts. It's likely to be the case that the more they change their output (and arguments and so on) to match the kernel's current reality, the more they will break scripts and tools built on top of them. And some people will argue that those scripts and tools that would break are already broken, just differently; if you're parsing ifconfig output on my machine to generate a list of all of the local IP addresses, you're already wrong.
(If you try to keep the current interface while lying as little as possible, you wind up having arguments about what to lie about and how. If you can only list one IPv4 address per interface in ifconfig, how do you decide which one?)
In a sense, deprecating programs like ifconfig and netstat that have wound up with interfaces that are inaccurate but hard to change is the honest approach. Their interfaces can't be fixed without significant amounts of pain and they still work okay for many systems, so just let them be while encouraging people to switch to other tools that can be more honest.
(This elaborates on an old tweet of mine.)
PS: I believe that the kernel interfaces that ifconfig and so on currently use to get this information are bound by backwards compatibility issues themselves, so getting ifconfig to even know that it was being inaccurate here would probably take code changes.
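As an aside on the point above about scripts that parse ifconfig output: a minimal sketch of collecting every local IPv4 address from ip -o -4 addr show instead (this is my illustration, not something from the original post; it assumes Python 3.7+ on a system with iproute2):
import subprocess

def local_ipv4_addresses():
    # With -o there is one line per address, so both the primary and the
    # secondary address on an interface (like the em0 alias above) show up.
    out = subprocess.run(["ip", "-o", "-4", "addr", "show"],
                         capture_output=True, text=True, check=True).stdout
    addrs = []
    for line in out.splitlines():
        fields = line.split()
        # fields: index, interface, "inet", "a.b.c.d/prefix", ...
        addrs.append(fields[3].split("/")[0])
    return addrs

print(local_ipv4_addresses())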
## May 24, 2018
#### vSphere 6.7 Will Not Run In My Lab: A Parable
“Hey Bob, I tried installing vSphere 6.7 on my lab servers and it doesn’t work right. You tried using it yet? Been beating my head against a wall here.” “Yeah, I really like it. A lot. Like, resisting the urge to be irresponsible and upgrade everything. What are your lab servers?” I knew what he […]
The post vSphere 6.7 Will Not Run In My Lab: A Parable appeared first on The Lone Sysadmin. Head over to the source to read the full post!
### Chris Siebenmann
#### Registering for things on the Internet is dangerous these days
Back in the old days (say up through the middle of the 00s), it was easily possible to view registering for websites, registering products on the Internet, and so on as a relatively harmless and even positive activity. Not infrequently, signing up was mostly there so you could customize your site experience and preferences, and maybe so that you could get to hear about important news. Unfortunately those days are long over. On today's Internet, registration is almost invariably dangerous.
(If you're in the EU and the website in question wants to do business there, the EU GDPR may give you some help here. Since I'm not in the EU, I'm on my own.)
Some Terms of Service are benign, but today ToSes are so long and intricate that you can't tell whether you have a benign or a dangerous one (and anyway, many ToSes are effectively self-upgrading). Even with potentially dangerous ToSes, some companies will never exercise the freedom that their ToS nominally gives them, for various reasons. But neither is the way to bet given an arbitrary company and an arbitrary ToS. Today the only safe assumption is that agreeing to someone's Terms of Service is at least a somewhat dangerous act that may bite you at some point.
The corollary to this is that you should assume that anyone who requires registration before giving you access to things when this is not actively required by how their service works is trying to exploit you. For example, 'register to see this report' should be at least a yellow and perhaps a red warning sign. My reaction is generally that I probably don't really need to read it after all.
(Other people react by simply giving up and agreeing to everything, taking solace in the generally relatively low chance that it will make a meaningful difference in their life one way or another. I have this reaction when I'm forced to agree to ToSes; since I can neither meaningfully read the terms nor do anything about them, what they are don't matter and I just blindly agree. I have to trust that I'll hear about it if the terms are so bad that I shouldn't agree under any circumstances. Of course this attitude of helplessness plays into the hands of these people.)
### Errata Security
#### C is too low level
I'm in danger of contradicting myself, after previously pointing out that x86 machine code is a high-level language, but this article claiming C is not a low level language is bunk. C certainly has some problems, but it's still the closest language to assembly. This is obvious from the fact that it's still the fastest compiled language. What we see is a typical academic out of touch with the real world.
The author makes the (wrong) observation that we've been stuck emulating the PDP-11 for the past 40 years. C was written for the PDP-11, and since then CPUs have been designed to make C run faster. The author imagines a different world, such as where CPU designers instead target something like LISP as their preferred language, or Erlang. This misunderstands the state of the market. CPUs do indeed supports lots of different abstractions, and C has evolved to accommodate this.
The author criticizes things like "out-of-order" execution which has led to the Spectre sidechannel vulnerabilities. Out-of-order execution is necessary to make C run faster. The author claims instead that those resources should be spent on having more, slower CPUs, with more threads. This sacrifices single-threaded performance in exchange for a lot more threads executing in parallel. The author cites Sparc Tx CPUs as his ideal processor.
But here's the thing, the Sparc Tx was a failure. To be fair, it's mostly a failure because most of the time, people wanted to run old C code instead of new Erlang code. But it was still a failure at running Erlang.
Time after time, engineers keep finding that "out-of-order", single-threaded performance is still the winner. A good example is ARM processors for both mobile phones and servers. All the theory points to in-order CPUs as being better, but all the products are out-of-order, because this theory is wrong. The custom ARM cores from Apple and Qualcomm used in most high-end phones are so deeply out-of-order they give Intel CPUs competition. The same is true on the server front with the latest Qualcomm Centriq and Cavium ThunderX2 processors, deeply out of order supporting more than 100 instructions in flight.
The Cavium is especially telling. Its ThunderX CPU had 48 simple cores which was replaced with the ThunderX2 having 32 complex, deeply out-of-order cores. The performance increase was massive, even on multithread-friendly workloads. Every competitor to Intel's dominance in the server space has learned the lesson from Sparc Tx: many wimpy cores is a failure, you need fewer beefy cores. Yes, they don't need to be as beefy as Intel's processors, but they need to be close.
Even Intel's "Xeon Phi" custom chip learned this lesson. This is their GPU-like chip, running 60 cores with 512-bit wide "vector" (sic) instructions, designed for supercomputer applications. Its first version was purely in-order. Its current version is slightly out-of-order. It supports four threads and focuses on basic number crunching, so in-order cores seems to be the right approach, but Intel found in this case that out-of-order processing still provided a benefit. Practice is different than theory.
As an academic, the author of the above article focuses on abstractions. The criticism of C is that it has the wrong abstractions which are hard to optimize, and that if we instead expressed things in the right abstractions, it would be easier to optimize.
This is an intellectually compelling argument, but so far bunk.
The reason is that while the theoretical base language has issues, everyone programs using extensions to the language, like "intrinsics" (C 'functions' that map to assembly instructions). Programmers write libraries using these intrinsics, which then the rest of the normal programmers use. In other words, if your criticism is that C is not itself low level enough, it still provides the best access to low level capabilities.
Given that C can access new functionality in CPUs, CPU designers add new paradigms, from SIMD to transaction processing. In other words, while in the 1980s CPUs were designed to optimize C (stacks, scaled pointers), these days CPUs are designed to optimize tasks regardless of language.
The author of that article criticizes the memory/cache hierarchy, claiming it has problems. Yes, it has problems, but only compared to how well it normally works. The author praises the many simple cores/threads idea as hiding memory latency with little caching, but misses the point that caches also dramatically increase memory bandwidth. Intel processors are optimized to read a whopping 256 bits every clock cycle from L1 cache. Main memory bandwidth is orders of magnitude slower.
The author goes on to criticize cache coherency as a problem. C uses it, but other languages like Erlang don't need it. But that's largely due to the problems each language solves. Erlang solves the problem where a large number of threads work on largely independent tasks, needing to send only small messages to each other across threads. The problem C solves is when you need many threads working on a huge, common set of data.
For example, consider the "intrusion prevention system". Any thread can process any incoming packet that corresponds to any region of memory. There's no practical way of solving this problem without a huge coherent cache. It doesn't matter which language or abstractions you use, it's the fundamental constraint of the problem being solved. RDMA is an important concept that's moved from supercomputer applications to the data center, such as with memcached. Again, we have the problem of huge quantities (terabytes worth) shared among threads rather than small quantities (kilobytes).
The fundamental issue the author of the paper is ignoring is decreasing marginal returns. Moore's Law has gifted us more transistors than we can usefully use. We can't apply those additional transistors to just one thing, because the useful returns we get diminish.
For example, Intel CPUs have two hardware threads per core. That's because there are good returns by adding a single additional thread. However, the usefulness of adding a third or fourth thread decreases. That's why many CPUs have only two threads, or sometimes four threads, but no CPU has 16 threads per core.
You can apply the same discussion to any aspect of the CPU, from register count, to SIMD width, to cache size, to out-of-order depth, and so on. Rather than focusing on one of these things and increasing it to the extreme, CPU designers make each a bit larger every process tick that adds more transistors to the chip.
The same applies to cores. It's why the "more simpler cores" strategy fails, because more cores have their own decreasing marginal returns. Instead of adding cores tied to limited memory bandwidth, it's better to add more cache. Such cache already increases the size of the cores, so at some point it's more effective to add a few out-of-order features to each core rather than more cores. And so on.
The question isn't whether we can change this paradigm and radically redesign CPUs to match some academic's view of the perfect abstraction. Instead, the goal is to find new uses for those additional transistors. For example, "message passing" is a useful abstraction in languages like Go and Erlang that's often more useful than sharing memory. It's implemented with shared memory and atomic instructions, but I can't help but think it couldn't better be done with direct hardware support.
Of course, as soon as they do that, it'll become an intrinsic in C, then added to languages like Go and Erlang.
Summary
Academics live in an ideal world of abstractions, the rest of us live in practical reality. The reality is that the vast majority of programmers work with the C family of languages (JavaScript, Go, etc.), whereas academics love the epiphanies they learned using other languages, especially functional languages. CPUs are only superficially designed to run C and "PDP-11 compatibility". Instead, they keep adding features to support other abstractions, abstractions available to C. They are driven by decreasing marginal returns -- they would love to add new abstractions to the hardware because it's a cheap way to make use of additional transistors. Academics are wrong in believing that the entire system needs to be redesigned from scratch. Instead, they just need to come up with new abstractions CPU designers can add.
## May 23, 2018
### Errata Security
#### The devil wears Pravda
Classic Bond villain, Elon Musk, has a new plan to create a website dedicated to measuring the credibility and adherence to "core truth" of journalists. He is, without any sense of irony, going to call this "Pravda". This is not simply wrong but evil.
Musk has a point. Journalists do suck, and many suck consistently. I see this in my own industry, cybersecurity, and I frequently criticize them for their suckage.
But what he's doing here is not correcting them when they make mistakes (or what Musk sees as mistakes), but questioning their legitimacy. This legitimacy isn't measured by whether they follow established journalism ethics, but whether their "core truths" agree with Musk's "core truths".
An example of the problem is how the press fixates on Tesla car crashes due to its "autopilot" feature. Pretty much every autopilot crash makes national headlines, while the press ignores the other 40,000 car crashes that happen in the United States each year. Musk spies on Tesla drivers (hello, classic Bond villain everyone) so he can see the dip in autopilot usage every time such a news story breaks. He's got good reason to be concerned about this.
He argues that autopilot is safer than humans driving, and he's got the statistics and government studies to back this up. Therefore, the press's fixation on Tesla crashes is illegitimate "fake news", titillating the audience with distorted truth.
But here's the thing: that's still only Musk's version of the truth. Yes, on a mile-per-mile basis, autopilot is safer, but there's nuance here. Autopilot is used primarily on freeways, which already have a low mile-per-mile accident rate. People choose autopilot only when conditions are incredibly safe and drivers are unlikely to have an accident anyway. Musk is therefore being intentionally deceptive comparing apples to oranges. Autopilot may still be safer, it's just that the numbers Musk uses don't demonstrate this.
And then there is the truth calling it "autopilot" to begin with, because it isn't. The public is overrating the capabilities of the feature. It's little different than "lane keeping" and "adaptive cruise control" you can now find in other cars. In many ways, the technology is behind -- my Tesla doesn't beep at me when a pedestrian walks behind my car while backing up, but virtually every new car on the market does.
Yes, the press unduly covers Tesla autopilot crashes, but Musk has only himself to blame by unduly exaggerating his car's capabilities by calling it "autopilot".
What's "core truth" is thus rather difficult to obtain. What the press satisfies itself with instead is smaller truths, what they can document. The facts are in such cases that the accident happened, and they try to get Tesla or Musk to comment on it.
What you can criticize a journalist for is therefore not "core truth" but whether they did journalism correctly. When such stories criticize "autopilot", but don't do their diligence in getting Tesla's side of the story, then that's a violation of journalistic practice. When I criticize journalists for their poor handling of stories in my industry, I try to focus on which journalistic principles they get wrong. For example, the NYTimes reporters do a lot of stories quoting anonymous government sources in clear violation of journalistic principles.
If "credibility" is the concern, then it's the classic Bond villain here that's the problem: Musk himself. His track record on business statements is abysmal. For example, when he announced the Model 3 he claimed production targets that every Wall Street analyst claimed were absurd. He didn't make those targets, he didn't come close. Model 3 production is still lagging behind Musk's twice adjusted targets.
https://www.bloomberg.com/graphics/2018-tesla-tracker/
So who has a credibility gap here, the press, or Musk himself?
Not only is Musk's credibility problem ironic, so is the name he chose, "Pravda", the Russian word for truth that was the name of the Soviet Union Communist Party's official newspaper. This is so absurd that it has to be a joke, yet Musk claims to be serious about all this.
Yes, the press has a lot of problems, and if Musk were some journalism professor concerned about journalists meeting the objective standards of their industry (e.g. abusing anonymous sources), then this would be a fine thing. But it's not. It's Musk who is upset the press's version of "core truth" does not agree with his version -- a version that he's proven time and time again differs from "real truth".
Just in case Musk is serious, I've already registered "www.antipravda.com" to start measuring the credibility of statements by billionaire playboy CEOs. Let's see who blinks first.
I stole the title, with permission, from this tweet:
## CentOS 6
This method has been suggested for building a container image from your current CentOS system.
In my case, I need to remotely upgrade a running CentOS 6 system to a new clean CentOS 7 on a test VPS, without opening the VNC console, attaching a new ISO, etc.
I am rather lucky, as I have a clean extra partition on this VPS, so I will follow the process below to remotely install a new clean CentOS 7 to this partition, then add a new grub entry and boot into it.
### Current OS
# cat /etc/redhat-release
CentOS release 6.9 (Final)
### Format partition
format & mount the partition:
mkfs.ext4 -L rootfs /dev/vda5
mount /dev/vda5 /mnt/
### InstallRoot
Type:
# yum -y groupinstall "Base" --releasever 7 --installroot /mnt/ --nogpgcheck
### Test
test it, when finished:
mount --bind /dev/ /mnt/dev/
mount --bind /sys/ /mnt/sys/
mount --bind /proc/ /mnt/proc/
chroot /mnt/
bash-4.2# cat /etc/redhat-release
CentOS Linux release 7.5.1804 (Core)
It works!
Inside the chroot environment:
bash-4.2# passwd
passwd: all authentication tokens updated successfully.
bash-4.2# exit
## Grub
Adding the new grub entry for CentOS 7:
title CentOS 7
root (hd0,4)
kernel /boot/vmlinuz-3.10.0-862.2.3.el7.x86_64 root=/dev/vda5 ro rhgb LANG=en_US.UTF-8
initrd /boot/initramfs-3.10.0-862.2.3.el7.x86_64.img
by changing the default boot entry from 0 to 1 :
default=0
to
default=1
our system will boot into CentOS 7 when rebooted!
### syslog.me
#### Concurrency in Go
In my quest to learn the Go language I am currently in the process of doing the Go Code Clinic. It’s taking me quite some time because instead of going through the solutions proposed in the course I try to implement a solution by myself; only when I have no idea whatsoever how to proceed do I peek into the solution to get some insight, and then work independently on my solution again.
The second problem in the Clinic is already at a non-trivial level: compare a number of images with a bigger image to check if any of those is a “clipping” of the bigger one. I confess that I would have a lot to read and to work even if I was trying to solve it in Perl!
It took some banging of my head against the wall till I eventually solved the problem. Unfortunately my program is single-threaded and the process of matching images is very expensive. For example, it took more than two hours to match a clipping sized 967×562 pixels with its “base” image sized 2048×1536. And for the whole time only one CPU thread was running at 100%; the others were barely used. If I really want to say that I solved the problem I must adapt the program to the available computing power by starting a number of subprocesses/threads (in our case: goroutines) to distribute the search across several CPU threads.
Since this was completely new to me in golang, I decided to experiment with a much simpler program: generate up to 100 random integers (say) between 0 and 10000 and run 8 workers to find if any of these random numbers is a multiple of another number, for example 17. And of course the program must shut down gracefully, whether or not a multiple is found. This gave me a few problems to solve:
• how do I start exactly 8 worker goroutines?
• what’s the best way to pass them the numbers to check? what’s the best way for them to report back the result?
• how do I tell them to stop when it’s time that they shut down?
• how do I wait that they are actually shut down?
The result is the go program that you can find in this gist. Assuming that it is good enough, you can use it as a skeleton for a program of yours, re-implementing the worker part and maybe the reaper part if a boolean response is not enough. Enjoy!
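For a feel of the overall shape without opening the gist, here is a minimal sketch of my own (it is not the code from the gist; names and structure are illustrative) that answers those four questions with a fixed pool of workers, a jobs channel that is closed to signal shutdown, and a reaper goroutine built on sync.WaitGroup:

package main

import (
	"fmt"
	"math/rand"
	"sync"
)

const (
	nWorkers = 8   // how many worker goroutines to start
	nNumbers = 100 // how many random numbers to generate
	divisor  = 17  // we look for multiples of this number
)

func main() {
	jobs := make(chan int)            // numbers to check
	found := make(chan int, nNumbers) // results; big enough that workers never block
	var wg sync.WaitGroup

	// Start exactly nWorkers workers; each one exits when jobs is closed.
	for i := 0; i < nWorkers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for n := range jobs {
				if n%divisor == 0 {
					found <- n
				}
			}
		}()
	}

	// Feed the random numbers, then close jobs to tell the workers to stop.
	go func() {
		for i := 0; i < nNumbers; i++ {
			jobs <- rand.Intn(10000)
		}
		close(jobs)
	}()

	// Reaper: wait until every worker has shut down, then close the results channel.
	go func() {
		wg.Wait()
		close(found)
	}()

	// Collect the results; this loop ends only after all workers are done.
	hits := 0
	for n := range found {
		hits++
		fmt.Println("multiple of", divisor, "found:", n)
	}
	if hits == 0 {
		fmt.Println("no multiples found")
	}
}

Closing the jobs channel doubles as the shutdown signal, the buffered found channel lets workers report without blocking, and the reaper goroutine answers the “how do I wait for them” question by closing found once the WaitGroup drains.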
#### Midnight is a Confusing Choice for Scheduling
Midnight is a poor choice for scheduling anything. Midnight belongs to tomorrow. It’s 0000 on the clock, which is the beginning of the next day. That’s not how humans think, though, because tomorrow is after we wake up! A great example is a statement like “proposals are due by midnight on April 15.” What you […]
The post Midnight is a Confusing Choice for Scheduling appeared first on The Lone Sysadmin. Head over to the source to read the full post!
### syslog.me
This is written to my older self, and to all those using Mozilla Thunderbird and the Lightning Calendar add-on with Google calendars who see this:
If you are seeing this, the solution is to change the setting calendar.google.enableEmailInvitations to true:
and everything should work as expected:
Enjoy!
### Vincent Bernat
A common solution to provide a highly-available and scalable service is to insert a load-balancing layer to spread requests from users to backend servers.1 We usually have several expectations for such a layer:
scalability
It allows a service to scale by pushing traffic to newly provisioned backend servers. It should also be able to scale itself when it becomes the bottleneck.
availability
It provides high availability to the service. If one server becomes unavailable, the traffic should be quickly steered to another server. The load-balancing layer itself should also be highly available.
flexibility
It handles both short and long connections. It is flexible enough to offer all the features backends generally expect from a load-balancer like TLS or HTTP routing.
operability
With some cooperation, any expected change should be seamless: rolling out a new software on the backends, adding or removing backends, or scaling up or down the load-balancing layer itself.
The problem and its solutions are well known. From recently published articles on the topic, “Introduction to modern network load-balancing and proxying” provides an overview of the state of the art. Google released “Maglev: A Fast and Reliable Software Network Load Balancer” describing their in-house solution in detail.2 However, the associated software is not available. Basically, building a load-balancing solution with commodity servers consists of assembling three components:
• ECMP routing
• stateless L4 load-balancing
• stateful L7 load-balancing
In this article, I describe and support a multi-tier solution using Linux and only open-source components. It should offer you the basis to build a production-ready load-balancing layer.
Update (2018.05)
Facebook just released Katran, an L4 load-balancer implemented with XDP and eBPF and using consistent hashing. It could be inserted in the configuration described below.
# Last tier: L7 load-balancing
Let’s start with the last tier. Its role is to provide high availability, by forwarding requests to only healthy backends, and scalability, by spreading requests fairly between them. Working in the highest layers of the OSI model, it can also offer additional services, like TLS termination, HTTP routing, header rewriting, rate-limiting of unauthenticated users, and so on. Being stateful, it can leverage complex load-balancing algorithms. Being the first point of contact with backend servers, it should ease maintenance and minimize impact during daily changes.
It also terminates client TCP connections. This introduces some loose coupling between the load-balancing components and the backend servers with the following benefits:
• connections to servers can be kept open for lower resource use and latency,
• requests can be retried transparently in case of failure,
• clients can use a different IP protocol than servers, and
• servers do not have to care about path MTU discovery, TCP congestion control algorithms, avoidance of the TIME-WAIT state and various other low-level details.
Many pieces of software would fit in this layer and ample literature exists on how to configure them. You could look at HAProxy, Envoy or Træfik. Here is a configuration example for HAProxy:
frontend l7lb
# Listen on both IPv4 and IPv6
bind :80 v4v6
# Redirect everything to a default backend
default_backend servers
# Healthchecking
acl disabled nbsrv(enabler) lt 1
monitor-uri /healthcheck
monitor fail if dead || disabled
# IPv6-only servers with HTTP healthchecking and remote agent checks
backend servers
balance roundrobin
option httpchk
server web1 [2001:db8:1:0:2::1]:80 send-proxy check agent-check agent-port 5555
server web2 [2001:db8:1:0:2::2]:80 send-proxy check agent-check agent-port 5555
server web3 [2001:db8:1:0:2::3]:80 send-proxy check agent-check agent-port 5555
server web4 [2001:db8:1:0:2::4]:80 send-proxy check agent-check agent-port 5555
# Fake backend: if the local agent check fails, we assume we are dead
backend enabler
server enabler [::1]:0 agent-check agent-port 5555
This configuration is the most incomplete piece of this guide. However, it illustrates two key concepts for operability:
1. Healthchecking of the web servers is done both at HTTP level (with check and option httpchk) and using an auxiliary agent check (with agent-check). The latter makes it easy to put a server into maintenance or to orchestrate a progressive rollout. On each backend, you need a process listening on port 5555 and reporting the status of the service (UP, DOWN, MAINT). A simple socat process can do the trick:3
socat -ly \
  TCP6-LISTEN:5555,ipv6only=0,reuseaddr,fork \
  OPEN:/etc/lb/agent-check,rdonly
Put UP in /etc/lb/agent-check when the service is in nominal mode. If the regular healthcheck is also positive, HAProxy will send requests to this node. When you need to put it in maintenance, write MAINT and wait for the existing connections to terminate. Use READY to cancel this mode.
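On a backend, assuming the same /etc/lb/agent-check path as above, that boils down to:
echo UP > /etc/lb/agent-check      # nominal operation
echo MAINT > /etc/lb/agent-check   # drain: no new requests, wait for connections to end
echo READY > /etc/lb/agent-check   # cancel maintenance mode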
2. The load-balancer itself should provide a healthcheck endpoint (/healthcheck) for the upper tier. It will return a 503 error either if there are no backend servers available or if the enabler backend has been put down through the agent check. The same mechanism as for regular backends can be used to signal the unavailability of this load-balancer.
Additionally, the send-proxy directive enables the proxy protocol to transmit the real clients’ IP addresses. This protocol also works for non-HTTP connections and is supported by a variety of servers, including nginx:
http {
server {
listen [::]:80 default ipv6only=off proxy_protocol;
root /var/www;
set_real_ip_from ::/0;
# take the client address from the proxy protocol header
real_ip_header proxy_protocol;
}
}
As is, this solution is not complete. We have just moved the availability and scalability problem somewhere else. How do we load-balance the requests between the load-balancers?
# First tier: ECMP routing
On most modern routed IP networks, redundant paths exist between clients and servers. For each packet, routers have to choose a path. When the cost associated to each path is equal, incoming flows4 are load-balanced among the available destinations. This characteristic can be used to balance connections among available load-balancers:
There is little control over the load-balancing, but ECMP routing brings the ability to scale both tiers horizontally. A common way to implement such a solution is to use BGP, a routing protocol to exchange routes between network equipment. Each load-balancer announces to its connected routers the IP addresses it is serving.
If we assume you already have BGP-enabled routers available, ExaBGP is a flexible solution to let the load-balancers advertise their availability. Here is a configuration for one of the load-balancers:
# Healthcheck for IPv6
process service-v6 {
run python -m exabgp healthcheck -s --interval 10 --increase 0 --cmd "test -f /etc/lb/v6-ready -a ! -f /etc/lb/disable";
encoder text;
}
template {
# Template for IPv6 neighbors
neighbor v6 {
router-id 192.0.2.132;
local-as 65000;
peer-as 65000;
hold-time 6;
family {
ipv6 unicast;
}
api services-v6 {
processes [ service-v6 ];
}
}
}
# First router
neighbor 2001:db8::192.0.2.254 {
inherit v6;
}
# Second router
neighbor 2001:db8::192.0.2.253 {
inherit v6;
}
If /etc/lb/v6-ready is present and /etc/lb/disable is absent, all the IP addresses configured on the lo interface will be announced to both routers. If the other load-balancers use a similar configuration, the routers will distribute incoming flows between them. Some external process should manage the existence of the /etc/lb/v6-ready file by checking for the healthiness of the load-balancer (using the /healthcheck endpoint for example). An operator can remove a load-balancer from the rotation by creating the /etc/lb/disable file.
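For example, an operator can drain this load-balancer and later put it back with (a sketch using the files described above):
touch /etc/lb/disable   # withdraw the announced routes
rm /etc/lb/disable      # announce them again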
To get more details on this part, have a look at “High availability with ExaBGP.” If you are in the cloud, this tier is usually implemented by your cloud provider, either using an anycast IP address or a basic L4 load-balancer.
Unfortunately, this solution is not resilient when an expected or unexpected change happens. Notably, when adding or removing a load-balancer, the number of available routes for a destination changes. The hashing algorithm used by routers is not consistent and flows are reshuffled among the available load-balancers, breaking existing connections:
Moreover, each router may choose its own routes. When a router becomes unavailable, the second one may route the same flows differently:
If you think this is not an acceptable outcome, notably if you need to handle long connections like file downloads, video streaming or websocket connections, you need an additional tier. Keep reading!
# Second tier: L4 load-balancing
The second tier is the glue between the stateless world of IP routers and the stateful land of L7 load-balancing. It is implemented with L4 load-balancing. The terminology can be a bit confusing here: this tier routes IP datagrams (no TCP termination) but the scheduler uses both destination IP and port to choose an available L7 load-balancer. The purpose of this tier is to ensure all members take the same scheduling decision for an incoming packet.
There are two options:
• stateful L4 load-balancing with state synchronization across the members, or
• stateless L4 load-balancing with consistent hashing.
The first option increases complexity and limits scalability. We won’t use it.5 The second option is less resilient during some changes but can be enhanced with a hybrid approach using a local state.
We use IPVS, a performant L4 load-balancer running inside the Linux kernel, with Keepalived, a frontend to IPVS with a set of healthcheckers to kick out an unhealthy component. IPVS is configured to use the Maglev scheduler, a consistent hashing algorithm from Google. Among its family, this is a great algorithm because it spreads connections fairly, minimizes disruptions during changes and is quite fast at building its lookup table. Finally, to improve performance, we let the last tier—the L7 load-balancers—send back answers directly to the clients without involving the second tier—the L4 load-balancers. This is referred to as direct server return (DSR) or direct routing (DR).
With such a setup, we expect packets from a flow to be able to move freely between the components of the two first tiers while sticking to the same L7 load-balancer.
## Configuration
Assuming ExaBGP has already been configured like described in the previous section, let’s start with the configuration of Keepalived:
virtual_server_group VS_GROUP_MH_IPv6 {
2001:db8::198.51.100.1 80
}
virtual_server group VS_GROUP_MH_IPv6 {
lvs_method TUN # Tunnel mode for DSR
lvs_sched mh # Scheduler: Maglev
sh-port # Use port information for scheduling
protocol TCP
delay_loop 5
alpha # All servers are down on start
omega # Execute quorum_down on shutdown
real_server 2001:db8::192.0.2.132 80 {
weight 1
HTTP_GET {
url {
path /healthcheck
status_code 200
}
connect_timeout 2
}
}
# Many others...
}
The quorum_up and quorum_down statements define the commands to be executed when the service becomes available and unavailable respectively. The /etc/lb/v6-ready file is used as a signal to ExaBGP to advertise the service IP address to the neighbor routers.
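Inside the virtual_server block, those statements could look like this (a sketch; adjust the commands to your setup):
quorum_up "/bin/touch /etc/lb/v6-ready"
quorum_down "/bin/rm -f /etc/lb/v6-ready"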
Additionally, IPVS needs to be configured to continue routing packets from a flow moved from another L4 load-balancer. It should also continue routing packets whose destination has become unavailable, to ensure we can properly drain a L7 load-balancer.
# Schedule non-SYN packets
sysctl -qw net.ipv4.vs.sloppy_tcp=1
# Do NOT reschedule a connection when destination
# doesn't exist anymore
sysctl -qw net.ipv4.vs.expire_nodest_conn=0
sysctl -qw net.ipv4.vs.expire_quiescent_template=0
The Maglev scheduling algorithm will be available with Linux 4.18, thanks to Inju Song. For older kernels, I have prepared a backport.6 Use of source hashing as a scheduling algorithm will hurt the resilience of the setup.
DSR is implemented using the tunnel mode. This method is compatible with routed datacenters and cloud environments. Requests are tunneled to the scheduled peer using IPIP encapsulation. It adds a small overhead and may lead to MTU issues. If possible, ensure you are using a larger MTU for communication between the second and the third tier.7 Otherwise, it is better to explicitly allow fragmentation of IP packets:
sysctl -qw net.ipv4.vs.pmtu_disc=0
You also need to configure the L7 load-balancers to handle encapsulated traffic:8
# Setup IPIP tunnel to accept packets from any source
ip tunnel add tunlv6 mode ip6ip6 local 2001:db8::192.0.2.132
ip link set up dev tunlv6
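The service IP address itself also needs to be configured locally on the L7 load-balancer so that it accepts the decapsulated packets; a sketch, reusing the VIP from the Keepalived configuration above:
ip addr add 2001:db8::198.51.100.1/128 dev tunlv6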
## Evaluation of the resilience
As configured, the second tier increases the resilience of this setup for two reasons:
1. The scheduling algorithm is using a consistent hash to choose its destination. Such an algorithm reduces the negative impact of expected or unexpected changes by minimizing the number of flows moving to a new destination. “Consistent Hashing: Algorithmic Tradeoffs” offers more details on this subject.
2. IPVS keeps a local connection table for known flows. When a change impacts only the third tier, existing flows will be correctly directed according to the connection table.
If we add or remove a L4 load-balancer, existing flows are not impacted because each load-balancer takes the same decision, as long as they see the same set of L7 load-balancers:
If we add a L7 load-balancer, existing flows are not impacted either because only new connections will be scheduled to it. For existing connections, IPVS will look at its local connection table and continue to forward packets to the original destination. Similarly, if we remove a L7 load-balancer, only existing flows terminating at this load-balancer are impacted. Other existing connections will be forwarded correctly:
We need to have simultaneous changes on both levels to get a noticeable impact. For example, when adding both a L4 load-balancer and a L7 load-balancer, only connections moved to a L4 load-balancer without state and scheduled to the new load-balancer will be broken. Thanks to the consistent hashing algorithm, other connections will stay bound to the right L7 load-balancer. During a planned change, this disruption can be minimized by adding the new L4 load-balancers first, waiting a few minutes, then adding the new L7 load-balancers.
Additionally, IPVS correctly routes ICMP messages to the same L7 load-balancers as the associated connections. This notably ensures that path MTU discovery works and there is no need for smart workarounds.
Optionally, you can add DNS load-balancing to the mix. This is useful either if your setup spans multiple datacenters or multiple cloud regions, or if you want to break a large load-balancing cluster into smaller ones. It is not intended to replace the first tier as it doesn’t share the same characteristics: load-balancing is unfair (it is not flow-based) and recovery from a failure is slow.
gdnsd is an authoritative-only DNS server with integrated healthchecking. It can serve zones from master files using the RFC 1035 zone format:
@ SOA ns1 ns1.example.org. 1 7200 1800 259200 900
@ NS ns1.example.com.
@ NS ns1.example.net.
@ MX 10 smtp
@ 60 DYNA multifo!web
www 60 DYNA multifo!web
smtp A 198.51.100.99
The special RR type DYNA will return A and AAAA records after querying the specified plugin. Here, the multifo plugin implements an all-active failover of monitored addresses:
service_types => {
web => {
plugin => http_status
url_path => /healthcheck
down_thresh => 5
interval => 5
}
ext => {
plugin => extfile
file => /etc/lb/ext
def_down => false
}
}
plugins => {
multifo => {
web => {
service_types => [ ext, web ]
addrs_v4 => [ 198.51.100.1, 198.51.100.2 ]
addrs_v6 => [ 2001:db8::198.51.100.1, 2001:db8::198.51.100.2 ]
}
}
}
In the nominal state, an A request will be answered with both 198.51.100.1 and 198.51.100.2. A healthcheck failure will update the returned set accordingly. It is also possible to administratively remove an entry by modifying the /etc/lb/ext file. For example, with the following content, 198.51.100.2 will not be advertised anymore:
198.51.100.1 => UP
198.51.100.2 => DOWN
2001:db8::c633:6401 => UP
2001:db8::c633:6402 => UP
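To see what gdnsd answers at any given moment, a plain query works (a sketch, assuming ns1.example.com points at your gdnsd instance); in the nominal state it returns both addresses:
dig +short www.example.com A @ns1.example.com
198.51.100.1
198.51.100.2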
You can find all the configuration files and the setup of each tier in the GitHub repository. If you want to replicate this setup at a smaller scale, it is possible to collapse the second and the third tiers by using either localnode or network namespaces. Even if you don’t need its fancy load-balancing services, you should keep the last tier: while backend servers come and go, the L7 load-balancers bring stability, which translates to resiliency.
1. In this article, “backend servers” are the servers behind the load-balancing layer. To avoid confusion, we will not use the term “frontend.” ↩︎
2. A good summary of the paper is available from Adrian Colyer. From the same author, you may also have a look at the summary for “Stateless datacenter load-balancing with Beamer.” ↩︎
3. If you feel this solution is fragile, feel free to develop your own agent. It could coordinate with a key-value store to determine the wanted state of the server. It is possible to centralize the agent in a single location, but you may get a chicken-and-egg problem to ensure its availability. ↩︎
4. A flow is usually determined by the source and destination IP and the L4 protocol. Alternatively, the source and destination port can also be used. The router hashes these information to choose the destination. For Linux, you may find more information on this topic in “Celebrating ECMP in Linux.” ↩︎
5. On Linux, it can be implemented by using Netfilter for load-balancing and conntrackd to synchronize state. IPVS only provides active/backup synchronization. ↩︎
6. The backport is not strictly equivalent to its original version. Be sure to check the README file to understand the differences. Briefly, in Keepalived configuration, you should:
• not use inhibit_on_failure
• use sh-port
• not use sh-fallback
↩︎
7. At least 1520 for IPv4 and 1540 for IPv6. ↩︎
8. As is, this configuration is insecure. You need to ensure only the L4 load-balancers will be able to send IPIP traffic. ↩︎
### Chris Siebenmann
#### Almost no one wants to run their own infrastructure
Every so often, people get really enthused about the idea of a less concentrated, more distributed Internet, one where most of our email isn't inside only a few places, our online chatting happens over federated systems instead of Twitter, there are flower gardens of personal websites and web servers, there are lots of different Git servers instead of mostly Github, and so on. There are many obstacles in the way of this, including that the current large providers don't want to let people go, but over time I have come to think that a large underappreciated one is simply that people don't want to run their own infrastructure. Not even if it's free to do so.
I'm a professional system administrator. I know how to run my own mail and IMAP server, and I know that I probably should and will have to some day. Do I actually run my own server? Of course not. It's a hassle. I have things on Github, and in theory I could publish them outside Github too, on a machine where I'm already running a web server. Have I done so? No, it's not worth the effort when the payoff I'd get is basically only feeling good.
Now, I've phrased this as if running your own infrastructure is easy and the only thing keeping people from doing so is the minor effort and expense involved. We shouldn't underestimate the effects of even minor extra effort and expense, but the reality is that doing a good job of your own infrastructure is emphatically not a minor effort. There is security, TLS certificates, (offsite) backups, choosing the software, managing configuration, long term maintenance and updates, and I'm assuming that someone else has already built the system and you just have to set up an instance of it.
(And merely setting up an instance of something is often fraught with annoyance and problems, especially for a non-specialist.)
If you use someone else's infrastructure and they're decently good at it, they're worrying about all of that and more things on top (like monitoring, dealing with load surges and DDOSes, and fixing things in the dead of night). Plus, they're on the right side of the issues universities have with running their own email; many such centralized places are paying entire teams of hard-working good people to improve their services (or at least the ones that they consider strategic). I like open source, but it's fairly rare that it can compete head to head with something that a significant company considers a strategic product.
Can these problems be somewhat solved? Sure, but until we get much better 'computing as a utility' (if we ever do), a really usable solution is a single-vendor solution, which just brings us back to the whole centralization issue again. Maybe life is a bit better if we're all hosting our federated chat systems and IMAP servers and Git repo websites in the AWS cloud using canned one-click images, but it's not quite the great dream of a truly decentralized and democratic Internet.
(Plus, it still involves somewhat more hassle than using Github and Twitter and Google Mail, and I think that hassle really does matter. Convinced people are willing to fight a certain amount of friction, but to work, the dream of a decentralized Internet needs to reach even the people who don't really care.)
All of this leads me to the conclusion that any decentralized Internet future that imagines lots of people running their own infrastructure is dead on arrival. It's much more likely that any decentralized future will involve a fair amount of concentration, with many people choosing to use someone else's infrastructure and a few groups running most of it. This matters because running such a big instance for a bunch of people generally requires real money and thus some way of providing it. If there is no real funding model, the whole system is vulnerable to a number of issues.
(See, for example, Mastodon, which is fairly centralized in practice with quite a number of large instances, per the instance statistics.)
## Prologue
Maintaining a (public) service can sometimes be troublesome. In the case of an email service, you often need to suspend or restrict users for reasons like spam, scams or phishing. You have to deal with inactive or even compromised accounts. Protecting your infrastructure protects your active users and the service. In this article I’ll propose a way to restrict specific accounts from sending email, so that they get a bounce message explaining why their email was not sent.
The reference documentation when having a Directory Service (LDAP) as our user backend and using Postfix:
## LDAP
In this post, we will not get into OpenLDAP internals, but as a reference I’ll show an example user account (this is from my working test lab).
dn: uid=testuser2,ou=People,dc=example,dc=org
objectClass: top
objectClass: person
objectClass: organizationalPerson
objectClass: inetOrgPerson
objectClass: posixAccount
mail: [email protected]
smtpd_sender_restrictions: true
givenName: Evaggelos
uidNumber: 99
gidNumber: 12
uid: testuser2
homeDirectory: /storage/vhome/%d/%n
As you can see, we have a custom LDAP attribute:
smtpd_sender_restrictions: true
keep that in mind for now.
## Postfix
The default value of smtpd_sender_restrictions is empty, which means that by default the mail server has no sender restrictions. Depending on the policy we can either whitelist or blacklist in postfix restrictions; for the purpose of this blog post, we will only restrict (blacklist) specific user accounts.
### ldap_smtpd_sender_restrictions
To do that, let’s create a new file that will talk to our OpenLDAP server and ask for that specific LDAP attribute.
ldap_smtpd_sender_restrictions.cf
server_host = ldap://localhost
server_port = 389
search_base = ou=People,dc=example,dc=org
query_filter = (&(smtpd_sender_restrictions=true)(mail=%s))
result_attribute = uid
result_filter = uid
result_format = REJECT This account is not allowed to send emails, plz talk to [email protected]
version = 3
timeout = 5
This is an anonymous bind, as we do not search for any special attribute like password.
### Status Codes
The default status code will be: 554 5.7.1
Take a look here for more info: RFC 3463 - Enhanced Mail System Status Codes
#### Test it
# postmap -q [email protected] ldap:/etc/postfix/ldap_smtpd_sender_restrictions.cf
REJECT This account is not allowed to send emails, plz talk to [email protected]
# postmap -v -q [email protected] ldap:/etc/postfix/ldap_smtpd_sender_restrictions.cf
#### Possible Errors
postmap: fatal: unsupported dictionary type: ldap
Check your postfix setup with postconf -m. The result should be something like this:
btree
cidr
environ
fail
hash
internal
ldap
memcache
nis
proxy
regexp
socketmap
static
tcp
texthash
unix
If not, you need to set up postfix to support the ldap dictionary type.
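How you add that support depends on your distribution; on Debian-based systems, for example, it usually means installing a separate package (an assumption about your packaging, check your own distribution):
apt-get install postfix-ldap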
### smtpd_sender_restrictions
Modify the main.cf to add the ldap_smtpd_sender_restrictions.cf
# applied in the context of the MAIL FROM
smtpd_sender_restrictions =
check_sender_access ldap:/etc/postfix/ldap_smtpd_sender_restrictions.cf
If you keep logs, tail them to see any errors.
## Thunderbird
### Logs
May 19 13:20:26 centos6 postfix/smtpd[20905]:
NOQUEUE: reject: RCPT from XXXXXXXX[XXXXXXXX]: 554 5.7.1 <[email protected]>:
Sender address rejected: This account is not allowed to send emails, plz talk to [email protected];
from=<[email protected]> to=<[email protected]> proto=ESMTP helo=<[192.168.0.13]>
#### Database schemas vs. identity
Yesterday brought this tweet up:
This is amazingly bad wording, and is the kind of thing that made the transpeople in my timeline (myself included) go "Buwhuh?" and made me wonder if this was a snopes-worthy story.
No, actually.
There are two things you should know:
1. NASA works on National Security related things, which requires a security clearance to work on, and getting one of those requires submitting prints.
2. The FBI is the US Government's authority in handling biometric data
Here is a chart from the Electronic Biometric Transmission Specification, which describes a kind of API for dealing with biometric data.
| If Following Condition Exists | Enter Code |
|---|---|
| Subject's gender reported as female | F |
| Occupation or charge indicated "Male Impersonator" | G |
| Subject's gender reported as male | M |
| Occupation or charge indicated "Female Impersonator" or transvestite | N |
| Male name, no gender given | Y |
| Female name, no gender given | Z |
| Unknown gender | X |
Yep, it really does use the term "Female Impersonator". To a transperson living in 2016 getting their first Federal job (even as a contractor), running into these very archaic terms is extremely off-putting.
As someone said in a private channel:
This looks like some 1960's bureaucrat trying to be 'inclusive'
This is not far from the truth.
This table exists unchanged in the 7.0 version of the document, dated January 1999. Previous versions are in physical binders somewhere, and not archived on the Internet; but the changelog for the V7 document indicates that this wording was in place as early as 1995. Mention is also made of being interoperable with UK law-enforcement.
The NIST standard for fingerprints issued in 1986 mentions a SEX field, but it only has M, F, and U; later NIST standards drop this field definition entirely.
As this field was defined in a standard over 20 years ago and has not been changed, is used across the full breadth of the US justice system, is referenced in international communications standards including visa travel, and is used as the basis for US military standards, these field definitions are effectively immutable and will only change after concerted effort over decades.
This is what institutionalized transphobia looks like, and we will be dealing with it for another decade or two. If not longer.
The way to deal with this is to deprecate the codes in documentation, but still allow them as valid.
• Create a deprecation notice in the definition of the field saying that the G and N values are to be considered deprecated and should not be used.
• In the deprecation notice say that in the future, new records will not be accepted with those values.
• Those values will remain valid for queries, because there are decades of legacy-coding in databases using them.
The failure-mode of this comes in with form designers who look at the spec and build forms based on the spec. Like this example from Maryland. Which means we need to let the forms designers know that the spec needs to be selectively ignored. The deprecation notice does that.
At the local level, convince your local City Council to pass resolutions to modernize their Police forms to reflect modern sensibilities, and drop the G and N codes from intake forms. Do this at the County too, for the Sheriff's department.
At the state level, convince your local representatives to push resolutions to get the State Patrol to modernize their forms likewise. Drop the G and N codes from the forms.
At the Federal employee level, there is less to be done here as you're closer to the governing standards, but you may be able to convince The Powers That Be to drop the two offensive checkboxes or items from the drop-down list.
At the Federal standard level. Lobby the decision makers that govern this standard and push for a deprecation notice. If any of your congress-people are on any Judiciary committees, you'll have more luck than most.
## May 21, 2018
### Steve Kemp's Blog
#### This month has been mostly golang-based
This month has mostly been about golang. I've continued work on the protocol-tester that I recently introduced:
This has turned into a fun project, and now all my monitoring is done with it. I've simplified the operation, such that everything uses Redis for storage, and there are now new protocol-testers for finger, nntp, and more.
Sample tests are as basic as this:
mail.steve.org.uk must run smtp
mail.steve.org.uk must run smtp with port 587
mail.steve.org.uk must run imaps
https://webmail.steve.org.uk/ must run http with content 'Prayer Webmail service'
Results are stored in a redis-queue, where they can be picked off and announced to humans via a small daemon. In my case alerts are routed to a central host, via HTTP POSTs, and eventually reach me via Pushover.
Beyond the basic network testing though I've also reworked a bunch of code - so the markdown sharing site is now golang powered, rather than running on the previous perl-based code.
As a result of this rewrite, and a little more care, I now score 99/100 + 100/100 on Google's pagespeed testing service. A few more of my sites do the same now, thanks to inline-CSS, inline-JS, etc. Nothing I couldn't have done before, but this was a good moment to attack it.
Finally my "silly" Linux security module, for letting user-space decide if binaries should be executed, can-exec has been forward-ported to v4.16.17. No significant changes.
Over the coming weeks I'll be trying to move more stuff into the cloud, rather than self-hosting. I'm doing a lot of trial-and-error at the moment with Lambdas, containers, and dynamic-routing to that end.
Interesting times.
## May 20, 2018
### Errata Security
#### masscan, macOS, and firewall
One of the more useful features of masscan is the "--banners" check, which connects to the TCP port, sends some request, and gets a basic response back. However, since masscan has its own TCP stack, it'll interfere with the operating system's TCP stack if they are sharing the same IPv4 address. The operating system will reply with a RST packet before the TCP connection can be established.
The way to fix this is to use the built-in packet-filtering firewall to block those packets in the operating-system TCP/IP stack. The masscan program still sees everything before the packet-filter, but the operating system can't see anything after the packet-filter.
Note that we are talking about the "packet-filter" firewall feature here. Remember that macOS, like most operating systems these days, has two separate firewalls: an application firewall and a packet-filter firewall. The application firewall is the one you see in System Settings labeled "Firewall", and it controls things based upon the application's identity rather than by which ports it uses. This is normally "on" by default. The packet-filter is normally "off" by default and is of little use to normal users.
Also note that macOS changed packet-filters around version 10.10.5 ("Yosemite", October 2014). The older one is known as "ipfw", which was the default firewall for FreeBSD (much of macOS is based on FreeBSD). The replacement is known as PF, which comes from OpenBSD. Whereas you used to use the old "ipfw" command on the command line, you now use the "pfctl" command, as well as the "/etc/pf.conf" configuration file.
What we need to filter is the source port of the packets that masscan will send, so that when replies are received, they won't reach the operating-system stack and will just go to masscan instead. To do this, we need to find a range of ports that won't conflict with the operating system. Namely, when the operating system creates outgoing connections, it randomly chooses a source port within a certain range. We want masscan to use source ports in a different range.
To figure out the range macOS uses, we run the following command:
sysctl net.inet.ip.portrange.first net.inet.ip.portrange.last
On my laptop, which is probably the default for macOS, I get the following range. Sniffing with Wireshark confirms this is the range used for source ports for outgoing connections.
net.inet.ip.portrange.first: 49152
net.inet.ip.portrange.last: 65535
So this means I shouldn't use source ports anywhere in the range 49152 to 65535. On my laptop, I've decided to use ports 40000 to 41023 for masscan. The range masscan uses must be a power of 2, so here I'm using 1024 (two to the tenth power).
To configure masscan, I can either type the parameter "--source-port 40000-41023" every time I run the program, or I can add the following line to /etc/masscan/masscan.conf. Remember that by default, masscan will look in that configuration file for any configuration parameters, so you don't have to keep retyping them on the command line.
source-port = 40000-41023
Next, I need to add the following firewall rule to the bottom of /etc/pf.conf:
block in proto tcp from any to any port 40000 >< 41024
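If pf is already enabled with an older ruleset, reload the configuration file so the new rule takes effect:
pfctl -f /etc/pf.conf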
However, we aren't done yet. By default, the packet-filter firewall is off. Therefore, every time you reboot your computer, you need to enable it. The simple way to do this is on the command line run:
pfctl -e
Or, if that doesn't work, try:
pfctl -E
Ideally, you'd want it to start automatically on bootup. I haven't figured out how to do this on macOS in an approved fashion that doesn't conflict with something else. Apparently there are a few GUIs that will do this for you.
## May 18, 2018
### Raymii.org
#### HP-UX 11.31 System information & find out part number of a failed disk with sasmgr
On one of my regularly scheduled datacenter visits, one of the older HP-UX Itanium machines had an orange light on the front. These systems are not (yet) monitored, but still in use, so the disk had to be replaced. Not knowing anything about this system or which parts were used, I managed to find the exact part number and device type so we could order a spare. This small guide uses sasmgr to get the data on HP-UX 11.31.
## May 17, 2018
### Cryptography Engineering
#### Was the Efail disclosure horribly screwed up?
TL;DR. No. Or keep reading if you want.
On Monday a team of researchers from Münster, RUB and NXP disclosed serious cryptographic vulnerabilities in a number of encrypted email clients. The flaws, which go by the cute vulnerability name of “Efail”, potentially allow an attacker to decrypt S/MIME or PGP-encrypted email with only minimal user interaction.
By the standards of cryptographic vulnerabilities, this is about as bad as things get. In short: if an attacker can intercept and alter an encrypted email — say, by sending you a new (altered) copy, or modifying a copy stored on your mail server — they can cause many GUI-based email clients to send the full plaintext of the email to an attacker controlled-server. Even worse, most of the basic problems that cause this flaw have been known for years, and yet remain in clients.
The big (and largely under-reported) story of EFail is the way it affects S/MIME. That “corporate” email protocol is simultaneously (1) hated by the general crypto community because it’s awful and has a slash in its name, and yet (2) is probably the most widely-used email encryption protocol in the corporate world. The table at the right — excerpted from the paper — gives you a flavor of how Efail affects S/MIME clients. TL;DR it affects them very badly.
Efail also happens to affect a smaller, but non-trivial number of OpenPGP-compatible clients. As one might expect (if one has spent time around PGP-loving folks) the disclosure of these vulnerabilities has created something of a backlash on HN, and among people who make and love OpenPGP clients. Mostly for reasons that aren’t very defensible.
So rather than write about fun things — like the creation of CFB and CBC gadgets — today, I’m going to write about something much less exciting: the problem of vulnerability disclosure in ecosystems like PGP. And how bad reactions to disclosure can hurt us all.
### How Efail was disclosed to the PGP community
Putting together a comprehensive timeline of the Efail disclosure process would probably be a boring, time-intensive project. Fortunately Thomas Ptacek loves boring and time-intensive projects, and has already done this for us.
Briefly, the first Efail disclosures to vendors began last October, more than 200 days prior to the agreed publication date. The authors notified a large number of vulnerable PGP GUI clients, and also notified the GnuPG project (on which many of these projects depend) by February at the latest. From what I can tell every major vendor agreed to make some kind of patch. GnuPG decided that it wasn’t their fault, and basically stopped corresponding.
All parties agreed not to publicly discuss the vulnerability until an agreed date in April, which was later pushed back to May 15. The researchers also notified the EFF and some journalists under embargo, but none of them leaked anything. On May 14 someone dumped the bug onto a mailing list. So the EFF posted a notice about the vulnerability (which we’ll discuss a bit more below), and the researchers put up a website. That’s pretty much the whole story.
There are three basic accusations going around about the Efail disclosure. They can be summarized as (1) maintaining embargoes in coordinated disclosures is really hard, (2) the EFF disclosure “unfairly” made this sound like a serious vulnerability “when it isn’t”, and (3) everything was already patched anyway so what’s the big deal.
### Disclosures are hard; particularly coordinated ones
I’ve been involved in two disclosures of flaws in open encryption protocols. (Both were TLS issues.) Each one poses an impossible dilemma. You need to simultaneously (a) make sure every vendor has as much advance notice as possible, so they can patch their software. But at the same time (b) you need to avoid telling literally anyone, because nothing on the Internet stays secret. At some point you’ll notify some FOSS project that uses an open development mailing list or ticket server, and the whole problem will leak out into the open.
Disclosing bugs that affect PGP is particularly fraught. That’s because there’s no such thing as “PGP”. What we have instead is a large and distributed community that revolves around the OpenPGP protocol. The pillar of this community is the GnuPG project, which maintains the core GnuPG tool and libraries that many clients rely on. Then there are a variety of niche GUI-based clients and email plugin projects. Finally, there are commercial vendors like Apple and Microsoft. (Who are mostly involved in the S/MIME side of things, and may reluctantly allow PGP plugins.)
Then, of course there are thousands of end-users, who will generally fail to update their software unless something really bad and newsworthy happens.
The obvious solution to the disclosure problem is to use a staged disclosure. You notify the big commercial vendors first, since that’s where most of the affected users are. Then you work your way down the “long tail” of open source projects, knowing that inevitably the embargo could break and everyone will have to patch in a hurry. And you keep in mind that no matter what happens, everyone will blame you for screwing up the disclosure.
For the PGP issues in Efail, the big client vendors are Mozilla (Thunderbird), Microsoft (Outlook) and maybe Apple (Mail). The very next obvious choice would be to patch the GnuPG tool so that it no longer spits out unauthenticated plaintext, which is the root of many of the problems in Efail.
The Efail team appears to have pursued exactly this approach for the client-side vulnerabilities. Sadly, the GnuPG team made the decision that it’s not their job to pre-emptively address problems that they view as ‘clients misusing the GnuPG API’ (my paraphrase), even when that misuse appears to be rampant across many of the clients that use their tool. And so the most obvious fix for one part of the problem was not available.
This is probably the most unfortunate part of the Efail story, because in this case GnuPG is very much at fault. Their API does something that directly violates cryptographic best practices — namely, releasing unauthenticated plaintext prior to producing an error message. And while this could be understood as a reasonable API design at design time, continuing to support this API even as clients routinely misuse it has now led to flaws across the ecosystem. The refusal of GnuPG to take a leadership role in preemptively safeguarding these vulnerabilities both increases the difficulty of disclosing these flaws, and increases the probability of future issues.
### So what went wrong with the Efail disclosure?
Despite what you may have heard, given the complexity of this disclosure, very little went wrong. The main issues people have raised seem to have to do with the contents of an EFF post. And with some really bad communications from Robert J. Hansen at the Enigmail (and GnuPG) project.
The EFF post. The Efail researchers chose to use the Electronic Frontier Foundation as their main source for announcing the existence of the vulnerability to the privacy community. This hardly seems unreasonable, because the EFF is generally considered a trusted broker, and speaks to the right community (at least here in the US).
The EFF post doesn’t give many details, nor does it give a list of affected (or patched) clients. It does give two pretty mild recommendations:
1. Temporarily disable or uninstall your existing clients until you’ve checked that they’re patched.
2. Maybe consider using a more modern cryptosystem like Signal, at least until you know that your PGP client is safe again.
This naturally led to a huge freakout by many in the PGP community. Some folks, including vendors, have misrepresented the EFF post as essentially pushing people to “permanently” uninstall PGP, which will “put lives at risk” because presumably these users (whose lives are at risk, remember) will immediately fall back to sending incriminating information via plaintext emails — rather than temporarily switching their communications to one of several modern, well-studied secure messengers, or just not emailing for a few hours.
The most reasonable criticism I’ve heard of the EFF post is that it doesn’t give many details about which clients are patched, and which are vulnerable. This could presumably give someone the impression that this vulnerability is still present in their email client, and thus would cause them to feel less than secure in using it.
I have to be honest that to me that sounds like a really good outcome. The problem with Efail is that it doesn’t matter if your client is secure. The Efail vulnerability could affect you if even a single one of your communication partners is using an insecure client.
So needless to say I’m not very sympathetic to the reaction around the EFF post. If you can’t be sure whether your client is secure, you probably should feel insecure.
Bad communications from GnuPG and Enigmail. On the date of the disclosure, anyone looking for accurate information about security from two major projects — GnuPG and Enigmail — would not have been able to find it.
They wouldn’t have found it because developers from both Enigmail and GnuPG were on mailing lists and Twitter claiming that they had never heard of Efail, and hadn’t been notified by the researchers. Needless to say, these allegations took off around the Internet, sometimes in place of real information that could have helped users (like, whether either project had patched.)
It goes without saying that neither allegation was actually true. In fact, both project members soon checked with their fellow developers (and their memories) and found out that they’d both been given months of notice by the researchers, and that Enigmail had even developed a patch. (However, it turned out that even this patch may not perfectly address the issue, and the community is still working to figure out exactly what still needs to be done.)
This is an understandable mistake, perhaps. But it sure is a bad one.
Now that I’ve made it clear that neither the researchers nor the EFF is out to get the PGP community, let me put on my mask and horns and tell you why someone should be.
I’ve written extensively about PGP on this blog, but in the past I’ve written mostly from a technical point of view about the problems with PGP. But what’s really problematic about PGP is not just the cryptography; it’s the story it tells about path dependence and how software communities work.
The fact of the matter is that OpenPGP is not really a cryptography project. That is, it’s not held together by cryptography. It’s held together by backwards-compatibility and (increasingly) a kind of an obsession with the idea of PGP as an end in and of itself, rather than as a means to actually make end-users more secure.
Let’s face it, as a protocol, PGP/OpenPGP is just not what we’d develop if we started over today. It was formed over the years out of mostly experimental parts, which were in turn replaced, bandaged and repaired — and then worked into numerous implementations, which all had to be insanely flexible and yet compatible with one another. The result is bad, and most of the software implementing it is worse. It’s the equivalent of a beloved antique sports car, where the electrical system is totally shot, but it still drives. You know, the kind of car where the owner has to install a hand-switch so he can turn the reverse lights on manually whenever he wants to pull out of a parking space.
If PGP went away, I estimate it would take the security community less than a year to entirely replace (the key bits of) the standard with something much better and modern. It would have modern crypto and authentication, and maybe even extensions for future post-quantum security. It would be simple. Many bright new people would get involved to help write the inevitable Rust, Go and Javascript clients and libraries.
Unfortunately for us all, (Open)PGP does exist. And that means that even fancy greenfield email projects feel like they need to support OpenPGP, or at least some subset of it. This in turn perpetuates the PGP myth, and causes other clients to use it. And as a direct result, even if some clients re-implement OpenPGP from scratch, other clients will end up using tools like GnuPG which will support unauthenticated encryption with bad APIs. And the cycle will go round and around, like a spaceship stuck near the event horizon of a black hole.
And as the standard perpetuates itself, largely for the sake of being a standard, it will fail to attract new security people. It will turn away exactly the type of people who should be working on these tools. Those people will go off and build encryption systems in a totally different area, or they’ll get into cryptocurrency. And — with some exceptions — the people who work in the community will increasingly work in that community because they’re supporting PGP, and not because they’re trying to seek out the best security technologies for their users. And the serious (email) users of PGP will be using it because they like the idea of using PGP better than they like using an actual, secure email standard.
And as things get worse, and fail to develop, people who work on it will become more dogmatic about its importance, because it’s something threatened and not a real security protocol that anyone’s using. To me that’s where PGP is going today, and that is why the community has such a hard time motivating itself to take these vulnerabilities seriously, and instead reacts defensively.
Maybe that’s a random, depressing way to end a post. But that’s the story I see in OpenPGP. And it makes me really sad.
## May 16, 2018
### OpenSSL
#### Changing the Guiding Principles in Our Security Policy
“That we remove “We strongly believe that the right to advance patches/info should not be based in any way on paid membership to some forum. You can not pay us to get security patches in advance.” from the security policy and Mark posts a blog entry to explain the change including that we have no current such service.”
At the OpenSSL Management Committee meeting earlier this month we passed the vote above to remove a section of our security policy. Part of that vote was that I would write this blog post to explain why we made this change.
At each face-to-face meeting we aim to ensure that our policies still match the view of the current committee membership, and we will vote to change those that don’t.
Prior to 2018 our Security Policy used to contain a lot of background information on why we selected the policy we did, justifying it and adding lots of explanatory detail. We included details of things we’d tried before and things that worked and didn’t work to arrive at our conclusion. At our face to face meeting in London at the end of 2017 we decided to remove a lot of the background information and stick to explaining the policy simply and concisely. I split out what were the guiding principles from the policy into their own list.
OpenSSL has some full-time fellows who are paid from various revenue sources coming into OpenSSL including sponsorship and support contracts. We’ve discussed having the option in the future to allow us to share patches for security issues in advance to these support contract customers. We already share serious issues a little in advance with some OS vendors (and this is still a principle in the policy to do so), and this policy has helped ensure that the patches and advisory get an extra level of testing before being released.
Thankfully there are relatively few serious issues in OpenSSL these days; the last one worse than Moderate severity was in February 2017.
In the vote text we wrote that we have “no current such service” and neither do we have any plan right now to create such a service. But we allow ourselves to consider such a possibility in the future now that this principle, which no longer represents the view of the OMC, is removed.
### Raymii.org
#### Icinga2 / Nagios / Net::SNMP change the default timeout of 60 seconds
Recently a rather large amount of new infrastructure was added to one of my monitoring instances. Using SNMP exclusively, but not the fastest network or infrastructure. The SNMP checks in the Icinga2 instance started giving timeouts, which look like false positives and give unclean logs. Raising the SNMP timeout for the checks above 60 seconds was not that easy since the 60 second timeout is hardcoded in the underlying library (NET::SNMP). This article shows you how to raise that timeout on an Ubuntu 16.04 system.
## May 15, 2018
### TaoSecurity
#### Bejtlich Joining Splunk
Since posting Bejtlich Moves On I've been rebalancing work, family, and personal life. I invested in my martial arts interests, helped more with home duties, and consulted through TaoSecurity.
Today I'm pleased to announce that, effective Monday May 21st 2018, I'm joining the Splunk team. I will be Senior Director for Security and Intelligence Operations, reporting to our CISO, Joel Fulton. I will help build teams to perform detection and monitoring operations, digital forensics and incident response, and threat intelligence. I remain in the northern Virginia area and will align with the Splunk presence in Tyson's Corner.
I'm very excited by this opportunity for four reasons. First, the areas for which I will be responsible are my favorite aspects of security. Long-time blog readers know I'm happiest detecting and responding to intruders! Second, I already know several people at the company, one of whom began this journey by Tweeting about opportunities at Splunk! These colleagues are top notch, and I was similarly impressed by the people I met during my interviews in San Francisco and San Jose.
Third, I respect Splunk as a company. I first used the products over ten years ago, and when I tried them again recently they worked spectacularly, as I expected. Fourth, my new role allows me to be a leader in the areas I know well, like enterprise defense and digital operational art, while building understanding in areas I want to learn, like cloud technologies, DevOps, and security outside enterprise constraints.
I'll have more to say about my role and team soon. Right now I can share that this job focuses on defending the Splunk enterprise and its customers. I do not expect to spend a lot of time in sales cycles. I will likely host visitors in the Tyson's areas from time to time. I do not plan to speak as much with the press as I did at Mandiant and FireEye. I'm pleased to return to operational defense, rather than advise on geopolitical strategy.
If this news interests you, please check our open job listings in information technology. As a company we continue to grow, and I'm thrilled to see what happens next!
## May 14, 2018
### ma.ttias.be
#### Remote Desktop error: CredSSP encryption oracle remediation
The post Remote Desktop error: CredSSP encryption oracle remediation appeared first on ma.ttias.be.
A while back, Microsoft announced it would ship updates to both its RDP client & server components to resolve a critical security vulnerability. That rollout is now happening and many clients have received auto-updates for their client.
As a result, you might see this message/error when connecting to an unpatched Windows server:
It refers to CredSSP updates for CVE-2018-0886, which further explains the vulnerability and why it's been patched now.
But here's the catch: if your client is updated but your server isn't (yet), you can no longer RDP to that machine. Here are a couple of fixes:
1. Find an old computer/RDP client to connect with
If your client has been updated, there's no way to connect to an unpatched Windows server via Remote Desktop anymore.
The post Remote Desktop error: CredSSP encryption oracle remediation appeared first on ma.ttias.be.
## Prologue
### Security
One of the most common security concerns (especially when traveling) is the attachment of an unknown USB device to our system.
There are a few ways to protect your system.
#### Cloud Storage
More and more companies are now moving from local storage to cloud storage as a way to reduce the attack surface on systems:
A few days ago, IBM banned portable storage devices.
#### Hot Glue on USB Ports
We must also not forget the old but powerful advice from security researchers & hackers:
by inserting glue or using a Hot Glue Gun to disable the USB ports of a system.
Problem solved!
## USBGuard
I was reading the redhat 7.5 release notes and came upon usbguard:
USBGuard
The USBGuard software framework helps to protect your computer against rogue USB devices (a.k.a. BadUSB) by implementing basic whitelisting / blacklisting capabilities based on device attributes.
### USB protection framework
So the main idea is that you run a daemon on your system that tracks the udev device monitor. The idea seems like a USB kill switch, but in a more controlled manner. You can dynamically whitelist and/or blacklist devices and change the policy on such devices more easily. You can also do all of that via a graphical interface, although I will not cover it here.
#### Archlinux Notes
for archlinux users, you can find usbguard in AUR (Archlinux User Repository)
AUR : usbguard
or you can try my custom PKGBUILDs files
### How to use usbguard
#### Generate Policy
The very first thing is to generate a policy from the currently attached USB devices.
sudo usbguard generate-policy
Below is example output, showing my USB mouse & USB keyboard:
allow id 17ef:6019 serial "" name "Lenovo USB Optical Mouse" hash "WXaMPh5VWHf9avzB+Jpua45j3EZK6KeLRdPcoEwlWp4=" parent-hash "jEP/6WzviqdJ5VSeTUY8PatCNBKeaREvo2OqdplND/o=" via-port "3-4" with-interface 03:01:02
allow id 045e:00db serial "" name "Naturalxc2xae Ergonomic Keyboard 4000" hash "lwGc9o+VaG/2QGXpZ06/2yHMw+HL46K8Vij7Q65Qs80=" parent-hash "kv3v2+rnq9QvYI3/HbJ1EV9vdujZ0aVCQ/CGBYIkEB0=" via-port "1-1.5" with-interface { 03:01:01 03:00:00 }
The default policy for already attached USB devices is allow.
We can create our rules configuration file by:
sudo usbguard generate-policy > /etc/usbguard/rules.conf
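Beyond the generated allow-list you can append rules of your own. The rule below is a hypothetical example (allow devices that only expose HID, i.e. class 03, interfaces, such as keyboards and mice); adapt it to your own hardware before relying on it:
echo 'allow with-interface equals { 03:*:* }' | sudo tee -a /etc/usbguard/rules.conf
# the rule takes effect once the usbguard service (started below) is restarted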
#### Service
Start and enable the usbguard service via systemd:
systemctl start usbguard.service
systemctl enable usbguard.service
#### List of Devices
You can view the list of attached USB devices with:
sudo usbguard list-devices
#### Allow Device
Attaching a new USB device (in my case, my mobile phone):
$ sudo usbguard list-devices | grep -v allow
we will see that the default policy is to block it:
17: block id 12d1:107e serial "7BQDU17308005969" name "BLN-L21" hash "qq1bdaK0ETC/thKW9WXAwawhXlBAWUIowpMeOQNGQiM=" parent-hash "kv3v2+rnq9QvYI3/HbJ1EV9vdujZ0aVCQ/CGBYIkEB0=" via-port "2-1.5" with-interface { ff:ff:00 08:06:50 }
So we can allow it by:
sudo usbguard allow-device 17
then
sudo usbguard list-devices | grep BLN-L21
we can verify that it is okay:
17: allow id 12d1:107e serial "7BQDU17308005969" name "BLN-L21" hash "qq1bdaK0ETC/thKW9WXAwawhXlBAWUIowpMeOQNGQiM=" parent-hash "kv3v2+rnq9QvYI3/HbJ1EV9vdujZ0aVCQ/CGBYIkEB0=" via-port "2-1.5" with-interface { ff:ff:00 08:06:50 }
### Block USB on screen lock
The default policy, when you (or someone else) insert a new USB device, is:
sudo usbguard get-parameter InsertedDevicePolicy
apply-policy
that is, to apply the default policy we have. There is a way to block or reject any new USB device when you have your screen locker on, as this may be a potential security attack on your system. In theory, you are inserting USB devices as you are working on your system, and not when you have your screen lock on.
I use slock as my primary screen locker via a keyboard shortcut. So the easiest way to dynamically change the default policy on usbguard is via a shell wrapper:
vim /usr/local/bin/slock
#!/bin/sh
# ebal, Sun, 13 May 2018 10:07:53 +0300
POLICY_UNLOCKED="apply-policy"
POLICY_LOCKED="reject"
# function to revert the policy
revert() {
  usbguard set-parameter InsertedDevicePolicy ${POLICY_UNLOCKED}
}
trap revert SIGHUP SIGINT SIGTERM
usbguard set-parameter InsertedDevicePolicy ${POLICY_LOCKED}
/usr/bin/slock
# shell function to revert reject policy
revert
(you can find the same example on redhat's blog post).
### Raymii.org
#### Multiple passwords for one user, UIC uniqueness and the system password on OpenVMS
In the book I bought about OpenVMS for the previous post on filesystems, 'Getting Started with OpenVMS by M. Duffy', I've read a few interesting things in the chapter that introduces user accounts and system login. Namely that a user can have multiple passwords, that user IDs are not unique and that there can be a system password. This article goes into those three topics.
## May 08, 2018
### Sean's IT Blog
#### VMware Horizon and Horizon Cloud Enhancements – Part 1
This morning, VMware announced enhancements to both the on-premises Horizon Suite and Horizon Cloud product sets. Although there are a lot of additions to all products in the Suite, the VMware blog post did not go too in-depth into many of the new features that you'll be seeing in the upcoming editions.
### VMware Horizon 7.5
Let's start with the biggest news in the blog post – the announcement of Horizon 7.5. Horizon 7.5 brings several new, long-awaited features with it. Some of these features are:
1. Support for Horizon on VMC (VMware on AWS)
2. The "Just-in-Time" Management Platform (JMP)
3. Horizon 7 Extended Service Branch (ESB)
4. Instant Clone improvements, including support for the new vSphere 6.7 Instant Clone APIs
5. Support for IPv4/IPv6 Mixed-Mode Operations
6. Cloud-Pod Architecture support for 200K Sessions
7. Support for Windows 10 Virtualization-Based Security (VBS) and vTPM on Full Clone Desktops
8. RDSH Host-based GPO Support for managing protocol settings
I'm not going to touch on all of these items. I think the first four are the most important for this portion of the suite.
#### Horizon on VMC
Horizon on VMC is a welcome addition to the Horizon portfolio. Unlike Citrix, the traditional VMware Horizon product has not had a good cloud story because it has been tightly coupled to the VMware SDDC stack. By enabling VMC support for Horizon, customers can now run virtual desktops in AWS, or utilize VMC as a disaster recovery option for Horizon environments.
Full clone desktops will be the only desktop type supported in the initial release of Horizon on VMC. Instant Clones will be coming in a future release, but some additional development work will be required since Horizon will not have the same access to vCenter in VMC as it has in on-premises environments. I'm also hearing that Linked Clones and Horizon Composer will not be supported in VMC.
The initial release of Horizon on VMC will only support core Horizon, the Unified Access Gateway, and VMware Identity Manager. Other components of the Horizon Suite, such as UEM, vRealize Operations, and App Volumes, have not been certified yet (although there should be nothing stopping UEM from working in Horizon on VMC because it doesn't rely on any vSphere components). Security Server, Persona Management, and ThinApp will not be supported.
#### Horizon Extended Service Branches
Under the current release cadence, VMware targets one Horizon 7 release per quarter. The current support policy for Horizon states that a release only continues to receive bug fixes and security patches if a new point release hasn't been available for at least 60 days. Let's break that down to make it a little easier to understand.
1. VMware will support any version of Horizon 7.x for the lifecycle of the product.
2. If you are currently running the latest Horizon point release (ex. Horizon 7.4), and you find a critical bug/security issue, VMware will issue a hot patch to fix it for that version.
3. If you are running Horizon 7.4, and Horizon 7.5 has been out for less than 60 days when you find a critical bug/security issue, VMware will issue a hot patch to fix it for that version.
4. If you are running Horizon 7.4, and Horizon 7.5 has been out for more than 60 days when you find a critical bug/security issue, the fix for the bug will be applied to Horizon 7.5 or later, and you will need to upgrade to receive the fix.
In larger environments, Horizon upgrades can be non-trivial efforts that enterprises may not undertake every quarter. There are also some verticals, such as healthcare, where core business applications are certified against specific versions of a product, and upgrading or moving away from that certified version can impact support or support costs for key business applications.
With Horizon 7.5, VMware is introducing a long-term support bundle for the Horizon Suite. This bundle will be called the Extended Service Branch (ESB), and it will contain Horizon 7, App Volumes, User Environment Manager, and Unified Access Gateway. The ESB will have 2 years of active support from release date where it will receive hot fixes, and each ESB will receive three service packs with critical bug and security fixes and support for new Windows 10 releases. A new ESB will be released approximately every twelve months.
Each ESB branch will support approximately 3-4 Windows 10 builds, including any recent LTSC builds. That means the Horizon 7.5 ESB release will support the 1709, 1803, 1809 and 1809 LTSC builds of Windows 10.
This packaging is nice for enterprise organizations that want to limit the number of Horizon upgrades they have to apply in a year or that require long-term support for core business applications. I see this being popular in healthcare environments.
Extended Service Branches do not require any additional licensing, and customers will have the option to adopt either the current release cadence or the extended service branch when implementing their environment.
#### JMP
The Just-in-Time Management Platform, or JMP, is a new component of the Horizon Suite. The intention is to bring together Horizon, Active Directory, App Volumes, and User Environment Manager to provide a single portal for provisioning instant clone desktops, applications, and policies to users. JMP also brings a new HTML5 interface to Horizon.
I'm a bit torn on the concept. I like the idea behind JMP and providing a portal for enabling user self-provisioning. But I'm not sure building that portal into Horizon is the right place for it.
A lot of organizations use Active Directory Groups as their management layer for Horizon Desktop Pools and App Volumes. There is a good reason for doing it this way. It's easy to audit who has desktop or application access, and there are a number of ways to easily generate reports on Active Directory Group membership.
Many customers that I talk to are also attempting to standardize their IT processes around an ITSM platform that includes a Service Catalog. The most common one I run across is ServiceNow. The customers that I've talked to that want to implement self-service provisioning of virtual desktops and applications often want to do it in the context of their service catalog and approval workflows.
It's not clear right now if JMP will include an API that will allow customers to integrate it with an existing service catalog or service desk tool. If it does include an API, then I see it being an important part of automated, self-service end-user computing solutions. If it doesn't, then it will likely be yet another user interface, and the development cycles would have been better spent on improving the Horizon and App Volumes APIs.
Not every customer will be utilizing a service catalog, ITSM tool and orchestration. For those customers, JMP could be an important way to streamline IT operations around virtual desktops and applications and provide them some benefits of automation.
#### Instant Clone Enhancements
The release of vSphere 6.7 brought with it new Instant Clone APIs. The new APIs bring features to VMFork that seem new to pure vSphere admins but have been available to Horizon for some time, such as vMotion. The new APIs are why Horizon 7.4 does not support vSphere 6.7 for Instant Clone desktops.
Horizon 7.5 will support the new vSphere 6.7 Instant Clone APIs. It is also backward compatible with the existing vSphere 6.0 and 6.5 Instant Clone APIs.
There are some other enhancements coming to Instant Clones as well. Instant Clones will now support vSGA and Soft3D. These settings can be configured in the parent image. And if you're an NVIDIA vGPU customer, more than one vGPU profile will be supported per cluster when GPU Consolidation is turned on. NVIDIA GRID can only run a single profile per discrete GPU, so this feature will be great for customers that have Maxwell-series boards, especially the Tesla M10 high-density board that has four discrete GPUs. However, I'm not sure how beneficial it will be for customers that adopt Pascal-series or Volta-series Tesla cards, as these only have a single discrete GPU per board. There may be some additional design considerations that need to be worked out.
Finally, there is one new Instant Clone feature for VSAN customers. Before I explain the feature, I need to explain how Horizon utilizes VMFork and Instant Clone technology. Horizon doesn't just utilize VMFork – it adds its own layers of management on top of it to overcome the limitations of the first generation technology. This is how Horizon was able to support Instant Clone vMotion when the standard VMFork could not. This additional layer of management also allows VMware to do other cool things with Horizon Instant Clones without having to make major changes to the underlying platform.
One of the new features that is coming in Horizon 7.5 for VSAN customers is the ability to use Instant Clones across cluster boundaries. For those who aren't familiar with VSAN, it is VMware's software-defined storage product. The storage boundary for VSAN aligns with the ESXi cluster, so I'm not able to stretch a VSAN datastore between vSphere clusters. So if I'm running a large EUC environment using VSAN, I may need multiple clusters to meet the needs of my user base. And unlike 3-tier storage, I can't share VSAN datastores between clusters. Under the current setup in Horizon 7.4, I would need to have a copy of my gold/master/parent image in each cluster. Due to some changes made in Horizon 7.5, I can now share an Instant Clone gold/master/parent image across VSAN clusters without having to make a copy of it in each cluster first.
I don't have too many specific details on how this will work, but it could significantly reduce the management burden of large, multi-cluster Horizon environments on VSAN.
#### Blast Extreme Enhancements
The addition of Blast Extreme Adaptive Transport, or BEAT as it's commonly known, provided an enhanced session remoting experience when using Blast Extreme. It also required users and administrators to configure which transport they wanted to use in the client, and this could lead to a less than optimal user experience for users who frequently moved between locations with good and bad connectivity.
Horizon 7.5 adds some automation and intelligence to BEAT with a feature called Blast Extreme Network Intelligence. NI will evaluate network conditions on the client side and automatically choose the correct Blast Extreme transport to use. Users will no longer have to make that choice or make changes in the client. As a result, the Excellent, Typical, and Poor options are being removed from future versions of the Horizon client.
Another major enhancement coming to Blast Extreme is USB Redirection Port Consolidation. Currently, USB redirection utilizes a side channel that requires an additional port to be opened in any external-facing firewalls. Starting in Horizon 7.5, customers will have the option to utilize USB redirection over ports 443/8443 instead of the side channel.
#### Performance Tracker
The last item I want to cover in this post is Performance Tracker. Performance Tracker is a tool that Pat Lee demonstrated at VMworld last year, and it is a tool to present session performance metrics to end users. It supports both Blast Extreme and PCoIP, and it provides information such as session latency, frames per second, and Blast Extreme transport type, and helps with troubleshooting connectivity issues between the Horizon Agent and the Horizon Client.
#### Part 2
As you can see, there is a lot of new stuff in Horizon 7.5. We've hit 1900 words in this post just talking about what's new in Horizon. We haven't touched on client improvements, Horizon Cloud, App Volumes, UEM or Workspace One Intelligence yet. So we'll have to break those announcements into another post that will be coming in the next day or two.
### Everything Sysadmin
#### SO (my employer) is hiring a Windows SRE/sysadmin in NY/NJ
Come work with Stack Overflow's SRE team! We're looking for a Windows system administrator / SRE to join our SRE team at Stack Overflow. (The downside is that I'll be your manager... ha ha ha). Anyway... the full job description is here: https://stackoverflow.com/company/work-here/1152509/
A quick and unofficial FAQ:
Q: NYC/NJ? I thought Stack was a "remote first" company! Whudup with that?
A: While most of the SRE team works remotely, we like to have a few team members near each of our datacenters (Jersey City, NJ and Denver, CO). You won't be spending hours each week pulling cables, I promise you. In fact, we use remote KVMs, and a "remote hands" service for most things. Heck, a lot of our new products are running in "the cloud" (and probably more over time). That said, it's good to have 1-2 people within easy travel distance of the datacenters for emergencies.
Q: Can I work from home?
A: Absolutely. You can work from home (we'll ship you a nice desk, chair and other great stuff) or you can work from our NYC office (see the job advert for a list of perks). Either way, you will need to be able to get to the Jersey City, NJ data center in a reasonable amount of time (like... an hour).
Q: Wait... Windows?
A: Yup. We're a mixed Windows and Linux environment. We're doing a lot of cutting edge stuff with Windows. We were early adopters of PowerShell (if you love PowerShell, definitely apply!) and DSC and a number of other technologies. Microsoft's containers are starting to look good too (hint, hint).
Q: You mentioned another datacenter in Denver, CO. What if I live near there?
A: This position is designated as "NY/NJ". However, watch this space. Or, if you are impatient, contact me and I'll loop you in.
Q: Where do I get more info? How do I apply?
## May 07, 2018
### TaoSecurity
#### Trying Splunk Cloud
I first used Splunk over ten years ago, but the first time I blogged about it was in 2008. I described how to install Splunk on Ubuntu 8.04. Today I decided to try the Splunk Cloud.
Splunk Cloud is the company's hosted Splunk offering, residing in Amazon Web Services (AWS). You can register for a 15 day free trial of Splunk Cloud that will index 5 GB per day.
If you would like to follow along, you will need a computer with a Web browser to interact with Splunk Cloud. (There may be ways to interact via API, but I do not cover that here.) I will collect logs from a virtual machine running Debian 9, inside Oracle VirtualBox.
First I registered for the free Splunk Cloud trial online. After I had a Splunk Cloud instance running, I consulted the documentation for Forward data to Splunk Cloud from Linux. I am running a "self-serviced" instance and not a "managed instance," i.e., I am the administrator in this situation. I learned that I needed to install a software package called the Splunk Universal Forwarder on my Linux VM.
I downloaded a 64 bit Linux 2.6+ kernel .deb file to the ~/Downloads directory on the Linux VM.
richard@debian:~$ cd Downloads/
richard@debian:~/Downloads$ ls
splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-amd64.deb
With elevated permissions I created a directory for the .deb, changed into the directory, and installed the .deb using dpkg.
richard@debian:~/Downloads$ sudo bash
root@debian:/opt/splunkforwarder# ls
splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-amd64.deb
root@debian:/opt/splunkforwarder# dpkg -i splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-amd64.deb
Selecting previously unselected package splunkforwarder.
(Reading database ... 141030 files and directories currently installed.)
Preparing to unpack splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-amd64.deb ...
Unpacking splunkforwarder (7.1.0) ...
Setting up splunkforwarder (7.1.0) ...
complete
root@debian:/opt/splunkforwarder# ls
ftr share
include splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-amd64.deb
lib splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-x86_64-manifest
Next I changed into the bin directory, ran the splunk binary, and accepted the EULA.
root@debian:/opt/splunkforwarder# cd bin/
root@debian:/opt/splunkforwarder/bin# ls
btprobe genRootCA.sh pid_check.sh splunk srm
bzip2 genSignedServerCert.sh scripts splunkd
classify genWebCert.sh setSplunkEnv splunkdj
root@debian:/opt/splunkforwarder/bin# ./splunk start
THIS SPLUNK SOFTWARE LICENSE AGREEMENT ("AGREEMENT") GOVERNS THE LICENSING,
SOFTWARE: (A) YOU ARE INDICATING THAT YOU HAVE READ AND UNDERSTAND THIS
...
Do you agree with this license? [y/n]: y
Now I had to set an administrator password for this Universal Forwarder instance. I will refer to it as "mypassword" in the examples that follow although Splunk does not echo it to the screen below.
This appears to be your first time running this version of Splunk.
* 8 total printable ASCII character(s).
Splunk> Map. Reduce. Recycle.
Checking prerequisites...
Checking mgmt port [8089]: open
Creating: /opt/splunkforwarder/var/lib/splunk
Creating: /opt/splunkforwarder/var/run/splunk
Creating: /opt/splunkforwarder/var/run/splunk/appserver/i18n
Creating: /opt/splunkforwarder/var/run/splunk/appserver/modules/static/css
Creating: /opt/splunkforwarder/var/spool/splunk
Creating: /opt/splunkforwarder/var/spool/dirmoncache
Creating: /opt/splunkforwarder/var/lib/splunk/authDb
Creating: /opt/splunkforwarder/var/lib/splunk/hashDb
New certs have been generated in '/opt/splunkforwarder/etc/auth'.
Checking conf files for problems...
Done
Checking default conf files for edits...
Validating installed files against hashes from '/opt/splunkforwarder/splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-x86_64-manifest'
All installed files intact.
Done
All preliminary checks passed.
Starting splunk server daemon (splunkd)...
Done
root@debian:/opt/splunkforwarder/bin# ./splunk restart
Stopping splunkd...
Shutting down. Please wait, as this may take a few minutes.
.......
Stopping splunk helpers...
Done.
Splunk> Map. Reduce. Recycle.
Checking prerequisites...
Checking mgmt port [8089]: open
Checking conf files for problems...
Done
Checking default conf files for edits...
Validating installed files against hashes from '/opt/splunkforwarder/splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-x86_64-manifest'
All installed files intact.
Done
All preliminary checks passed.
Starting splunk server daemon (splunkd)...
Done
It's time to take the final steps to get data into Splunk Cloud. I need to point the universal forwarder at forwarder management in the Splunk Cloud Web site. Observe the input-prd-p-XXXX.cloud.splunk.com in the command. You obtain this (mine is masked with XXXX) from the URL for your Splunk Cloud deployment, e.g., https://prd-p-XXXX.cloud.splunk.com. Note that you have to add "input-" before the fully qualified domain name used by the Splunk Cloud instance.
root@debian:/opt/splunkforwarder/bin# ./splunk set deploy-poll input-prd-p-XXXX.cloud.splunk.com:8089
Configuration updated.
Once again I restart the universal forwarder. I'm not sure if I could have done all these restarts at the end.
root@debian:/opt/splunkforwarder/bin# ./splunk restart
Stopping splunkd...
Shutting down. Please wait, as this may take a few minutes.
.......
Stopping splunk helpers...
Done.
Splunk> Map. Reduce. Recycle.
Checking prerequisites...
Checking mgmt port [8089]: open
Checking conf files for problems...
Done
Checking default conf files for edits...
Validating installed files against hashes from '/opt/splunkforwarder/splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-x86_64-manifest'
All installed files intact.
Done
All preliminary checks passed.
Starting splunk server daemon (splunkd)...
Done
Finally I need to tell the universal forwarder to watch some logs on this Linux system. I tell it to monitor the /var/log directory and restart one more time.
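The excerpt above omits the exact command for this step; the standard Universal Forwarder invocation (using the /var/log path mentioned in the text) would look like the following, so treat it as an illustration rather than a transcript of the original session:
root@debian:/opt/splunkforwarder/bin# ./splunk add monitor /var/log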
root@debian:/opt/splunkforwarder/bin# ./splunk restart
Stopping splunkd...
Shutting down. Please wait, as this may take a few minutes.
...............
Stopping splunk helpers...
Done.
Splunk> Map. Reduce. Recycle.
Checking prerequisites...
Checking mgmt port [8089]: open
Checking conf files for problems...
Done
Checking default conf files for edits...
Validating installed files against hashes from '/opt/splunkforwarder/splunkforwarder-7.1.0-2e75b3406c5b-linux-2.6-x86_64-manifest'
All installed files intact.
Done
All preliminary checks passed.
Starting splunk server daemon (splunkd)...
Done
At this point I return to the Splunk Cloud Web interface and click the "search" feature. I see Splunk is indexing some data.
I run a search for "host=debian" and find my logs.
Not too bad! Have you tried Splunk Cloud? What do you think? Leave me a comment below.
Update: I installed the Universal Forwarder on FreeBSD 11.1 using the method above (except with a FreeBSD .tgz) and everything seems to be working!
#### Systemd dependencies
There is a lot of hate around Systemd in unixy circles. Like, a lot. There are many reasons for this, a short list:
• For some reason they felt the need to reimplement daemons that have existed for years. And are finding the same kinds of bugs those older daemons found and squashed over a decade ago.
• I'm looking at you Time-sync and DNS resolver.
• It takes away an init system that everyone knows and is well documented in both the official documentation sense, and the unofficial 'millions of blog-posts' sense. Blog posts like this one.
• It has so many incomprehensible edge-cases that make reasoning about the system even harder.
• The maintainers are steely-eyed fundamentalists who know exactly how they want everything.
• Because it runs so many things in parallel, bugs we've never had to worry about are now impossible to ignore.
So much hate. Having spent the last few weeks doing a sysv -> systemd migration, I've found another reason for that hate. And it's one I'm familiar with because I've spent so many years in the Puppet ecosystem.
People love to hate on puppet because of the wacky non-deterministic bugs. The order resources are declared in a module is not the order in which they are applied. Puppet uses a dependency model to determine the order of things, which leads to weird bugs where a thing has worked for two weeks but suddenly stops working that way because a new change was made somewhere that changed the order of resource-application. A large part of why people like Chef over Puppet is because Chef behaves like a scripting language, where the order of the file is the order things are done in.
Guess what? Systemd uses the Puppet model of dependency! This is why it's hard to reason about. And why I, someone who has been handling these kinds of problems for years, haven't spent much time shaking my tiny fist at an uncaring universe. There has been swearing, oh yes. But of a somewhat different sort.
The Puppet Model
Puppet has two kinds of dependency. Strict ordering, and do this if that other thing does something. Which makes for four ways of linking resources.
• require => Do this after this other thing.
• before => Do this before this other thing.
• subscribe => Do this after this other thing, but only if this other thing changes something.
• notify => Do this before this other thing, and tell it you changed something.
This makes for some real power, while also making the system hard to reason about.
Thing is, systemd goes a step further
The Systemd Model
Systemd also has dependencies, but it was also designed to run as much in parallel as possible. Puppet was written in Ruby, so it has strong single-threaded tendencies. Systemd is multi-threaded. Multi-threaded systems are harder to reason about in general. Add dependency ordering on top of multi-threading issues and you get a sheer cliff of learning before you can have a hope of following along. Even better (worse), systemd has more ways of defining relationships.
• Before= This unit needs to get all the way done before the named units are even started. And, the named units only get started if this unit finishes successfully.
• After= This unit only gets started if the named units run to completion first, successfully.
• Requires= The named units will get started if this one is, and do so at the same time. Not only that, but if the named units are explicitly stopped, this one will be stopped as well. For puppet-heads, this breaks things since this works backwards.
• BindsTo= Does everything Requires does, but will also stop this unit if the named unit stops for any reason, not just explicit stops.
• Wants= Like Require, but less picky. The named units will get started, but not care if they can't start or end up failing.
• Requisite= Like Require, but will fail immediately if the named services aren't started yet. Think of mount units not starting unless the device unit is already started.
• Conflicts= A negative dependency. Turn this unit off if the named unit is started. And turn this other unit off if this unit is started.
There are several more I'm not going into. This is a lot, and some of these work independently. The documentation even says:
It is a common pattern to include a unit name in both the After= and Requires= options, in which case the unit listed will be started before the unit that is configured with these options.
Using both After and Requires means that the named units need to get all the way done (After=) before this unit is started. And if this unit is started, the named units need to get started as well (Requires=).
Hence, in many cases it is best to combine BindsTo= with After=.
Using both configures a hard dependency relationship. After= means the other unit needs to be all the way started before this one is started. BindsTo= makes it so that this unit is only ever in an active state when the unit named in both BindsTo= and After= is in an active state. If that other unit fails or goes inactive, this one will as well.
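As a minimal sketch (the unit names are hypothetical), a worker that should start after its backend and never outlive it could be wired up like this:
# hypothetical units: app-worker.service starts after, and lives no longer than, app-backend.service
sudo tee /etc/systemd/system/app-worker.service <<'EOF'
[Unit]
Description=Worker that only runs while app-backend is active
After=app-backend.service
BindsTo=app-backend.service

[Service]
ExecStart=/usr/local/bin/app-worker
EOF
sudo systemctl daemon-reload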
There is also a concept missing from Puppet, and that's when the dependency fires. After/Before are trailing-edge triggers, they fire on completion, which is how Puppet works. Most of the rest are leading-edge triggered, where the dependency is satisfied as soon as the named units start. This is how you get parallelism in an init-system, and why the weirder dependencies are often combined with either Before or After.
Systemd hate will continue for the next 10 or so years, at least until most Linux engineers have worked with it long enough to stop grumbling about how nice the olden days were.
It also means that fewer people will be writing startup services due to the complexity of doing anything other than 'start this after this other thing' ordering.
## May 06, 2018
#### Launched: Stack Overflow for Teams!
I usually don't use my blog to plug my employer but I'm very excited about Stack Overflow's new "Stack Overflow for Teams" launch this week.
How would you like a private Stack Overflow area for your team? Stack Overflow for Teams allows teams of any size to use the Stack Overflow that they already know and love but for all their proprietary information - creating a special private space just for them on stackoverflow.com. It uses the same Q&A format, collaborative editing, and even recognition systems to solve the massive knowledge sharing issues that all teams have.
More info is on the overview page and our blog post, plus we got great press about it on VentureBeat and GeekWire.
The launch happened without a hitch. I'm very proud of our SRE team for all their work. They basically built the equivalent of our existing infrastructure two times over (and more if you count dev/test environments). They helped design the security and other aspects of the new service's infrastructure. It's been an impressive 9 months that has radically changed how we work. Props to everyone on the team! I'm so proud!
## May 02, 2018
### ma.ttias.be
#### DNS Spy now checks for the “Null MX”
The post DNS Spy now checks for the “Null MX” appeared first on ma.ttias.be.
A small but useful addition to the scoring system of DNS Spy: support for the Null MX record.
Internet mail determines the address of a receiving server through the DNS, first by looking for an MX record and then by looking for an A/AAAA record as a fallback.
Unfortunately, this means that the A/AAAA record is taken to be mail server address even when that address does not accept mail.
The No Service MX RR, informally called "null MX", formalizes the existing mechanism by which a domain announces that it accepts no mail, without having to provide a mail server; this permits significant operational efficiencies.
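For reference (an illustration, not part of the announcement), a null MX is simply an MX record with preference 0 and a "." exchange, and you can check what any domain publishes with dig:
# a "null MX" record in a zone file looks like this (the name is a placeholder):
#   nomail.example.com.   3600  IN  MX  0 .
dig +short MX example.com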
Give it a try at the DNS Spy Scan page.
The post DNS Spy now checks for the “Null MX” appeared first on ma.ttias.be.
#### SRECon: Operational Excellence in April Fools' Pranks
The video of my SRECon talk is finally available!
"Operational Excellence in April Fools' Pranks: Being Funny Is Serious Work!" at SREcon18 Americas is about mitigating the risk of "high stakes" launches.
The microphones didn't pick up the audience reaction. As a result it looks like I keep pausing for no reason, but really I'm waiting for the laughter to calm down. Really! (Really!)
On a personal note, I'd like to thank the co-chairs of SRECon for putting together such an excellent conference. This was my first time being the last speaker at a national conference, which was quite a thrill.
I look forward to SRECon next year in Brooklyn!
## May 01, 2018
### Anton Chuvakin - Security Warrior
#### Monthly Blog Round-Up – April 2018
Here is my next monthly "Security Warrior" blog round-up of top 5 popular posts based on last month's visitor data (excluding other monthly or annual round-ups):
1. “New SIEM Whitepaper on Use Cases In-Depth OUT!” (dated 2010) presents a whitepaper on select SIEM use cases described in depth with rules and reports [using now-defunct SIEM product]; also see this SIEM use case in depth and this for a more current list of popular SIEM use cases. Finally, see our research on developing security monitoring use cases here – and we just UPDATED IT FOR 2018.
2. “Simple Log Review Checklist Released!” is often at the top of this list – this rapidly aging checklist is still a useful tool for many people. “On Free Log Management Tools” (also aged quite a bit by now) is a companion to the checklist (updated version).
3. “Updated With Community Feedback SANS Top 7 Essential Log Reports DRAFT2” is about the top log reports project of 2008-2013, I think these are still very useful in response to “what reports will give me the best insight from my logs?”
4. Again, my classic PCI DSS Log Review series is extra popular! The series of 18 posts cover a comprehensive log review approach (OK for PCI DSS 3+ even though it predates it), useful for building log review processes and procedures, whether regulatory or not. It is also described in more detail in our Log Management book and mentioned in our PCI book – note that this series is even mentioned in some PCI Council materials.
5. “Why No Open Source SIEM, EVER?” contains some of my SIEM thinking from 2009 (oh, wow, ancient history!). Is it relevant now? You be the judge. Succeeding with SIEM requires a lot of work, whether you paid for the software, or not. BTW, this post has an amazing “staying power” that is hard to explain – I suspect it has to do with people wanting “free stuff” and googling for “open source SIEM” …
In addition, I’d like to draw your attention to a few recent posts from my Gartner blog [which, BTW, now has more than 7X of the traffic of this blog]:
Critical reference posts:
Current research on testing security:
Current research on threat detection “starter kit”
Just finished research on SOAR:
Miscellaneous fun posts:
(see all my published Gartner research here)
Also see my past monthly and annual “Top Popular Blog Posts” – 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017.
Disclaimer: most content at SecurityWarrior blog was written before I joined Gartner on August 1, 2011 and is solely my personal view at the time of writing. For my current security blogging, go here.
Other posts in this endless series:
### Electricmonk.nl
#### A short security review of Bitwarden
Bitwarden is an open source online password manager:
The easiest and safest way for individuals, teams, and business organizations to store, share, and sync sensitive data.
Bitwarden offers both a cloud hosted and on-premise version. Some notes on the scope of this blog post and disclaimers:
• I only looked at the cloud hosted version.
• This security review is not exhaustive; I only took a few minutes to review various things.
• I'm not a security researcher, just a paranoid enthusiast. If you find anything wrong with this blog post, please contact me at ferry DOT boender (AT) gmaildotcom.
Here are my findings:
## Encryption password sent over the wire
There appears to be no distinction between the authentication password and encryption password.
When logging in, the following HTTP POST is made to Bitwarden's server:
client_id: web
scope: api offline_access
The password in that POST was a base64 encoded value: xFSJdHvKcrYQA0KAgOlhxBB3Bpsuanc7bZIKTpskiWk= (don't worry, I anonymized all secrets in this post; besides, it's all throw-away passwords anyway). Let's see what it contains:
>>> import base64
>>> base64.b64decode('xFSJdHvKcrYQA0KAgOlhxBB3Bpsuanc7bZIKTpskiWk=')
b'p\x54\xde\x35\xb6\x90\x992\x63bKn\x7f\xfbb\xb2\x94t\x1b\xe9f\xez\xeaz}e\x142X#\xbd\x1c'
Okay, at least that's not my plain text password. It is encoded, hashed or encrypted somehow, but I'm not sure how. Still, it makes me nervous that my password is being sent over the wire. The master password used for encryption should never leave a device, in any form. I would have expected two passwords here, perhaps: one for authentication and one for encryption.
The reason it was implemented this way is probably because of the "Organizations" feature, which lets you share passwords with other people. Sharing secrets among people is probably hard to do in a secure way. I'm no cryptography expert, but there are probably ways to do this more securely using asymmetric encryption (public and private keys), which Bitwarden doesn't appear to be using.
Bitwarden has a FAQ entry about its use of encryption, which claims that passwords are never sent over the wire unencrypted or unhashed:
Bitwarden always encrypts and/or hashes your data on your local device before it is ever sent to the cloud servers for syncing. The Bitwarden servers are only used for storing encrypted data. It is not possible to get your unencrypted data from the Bitwarden cloud servers.
The FAQ entry on hashing is also relevant:
The hashing functions that are used are one way hashes. This means that they cannot be reverse engineered by anyone at Bitwarden to reveal your true master password. In the hypothetical event that the Bitwarden servers were hacked and your data was leaked, the data would have no value to the hacker.
However, there's a major caveat here which they don't mention. All of the encryption is done client-side by Javascript loaded from various servers and CDNs. This means that an attacker who gains control over any of these servers (or man-in-the-middle's them somehow) can inject any javascript they like, and obtain your password that way.
The good news is that Bitwarden uses Content-Security-Policy. The bad news is that it allows the loading of resources from a variety of untrusted sources. uMatrix shows the type of resources it's trying to load from various sources:
Here's what the Content-Security-Policy looks like:
content-security-policy:
default-src
'self';
script-src
'self'
'sha256-ryoU+5+IUZTuUyTElqkrQGBJXr1brEv6r2CA62WUw8w='
https://js.stripe.com
https://js.braintreegateway.com
https://www.paypalobjects.com
https://maxcdn.bootstrapcdn.com
style-src
'self'
'unsafe-inline'
https://maxcdn.bootstrapcdn.com
https://assets.braintreegateway.com
https://*.paypal.com
img-src
'self'
data:
https://icons.bitwarden.com
https://*.paypal.com
https://www.paypalobjects.com
https://q.stripe.com
https://haveibeenpwned.com
font-src
'self'
https://maxcdn.bootstrapcdn.com
https://fonts.gstatic.com;
child-src
'self'
https://js.stripe.com
https://assets.braintreegateway.com
https://*.paypal.com
https://*.duosecurity.com;
frame-src
'self'
https://js.stripe.com
https://assets.braintreegateway.com
https://*.paypal.com
https://*.duosecurity.com;
Roughly translated, it allows indiscriminate loading and executing of scripts, css, web workers (background threads) and inclusion of framed content from a wide variety of untrusted sources such as CDNs, Paypal, Duosecurity, Braintreegateway, Google, etc. Some of these I know, some I don't. Trust I have in none of them.
It would take too long to explain why this is a bad idea, but the gist of it is that the more resources you load and allow from different sources, the bigger the attack surface becomes. Perhaps these are perfectly secure (right now…), but an important part of security is the developers' security mindset. Some of these resources could have easily been hosted on the same origin servers. Some of these resources should only be allowed to run from payment pages. It shows sloppy configuration of the Content-Security-Policy, namely site-wide configuration in the web server (probably) rather than being determined on a URL-by-URL basis.
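If you want to look at a site's CSP yourself, the header is easy to pull with curl (a generic check, not taken from the original post):
# print only the Content-Security-Policy response header
curl -sI https://vault.bitwarden.com/ | grep -i '^content-security-policy'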
The actual client-side encryption library is loaded from vault.bitwarden.com, which is good. However, the (possibility of) inclusion of scripts from other sources negates any security benefits of doing so.
The inclusion of Google analytics in a password manager is, in my opinion, inexcusable. It's not required functionality for the application, so it shouldn't be in there.
## New password entry is sent securely
When adding a new authentication entry, the entry appears to be client-side encrypted in some way before sending it to the server:
{
"name": "2.eD4fFLYUWmM6sgVDSA9pTg==|SNzQjLitpA5K+6qrBwC7jw==|DlfVCnVdZA9+3oLej4FHSQwwdo/CbmHkL2TuwnfXAoI=",
"organizationId": null,
"fields": null,
"notes": null,
"favorite": false,
"totp": null
},
"folderId": null,
"type": 1
}
It's base64 again, and decodes into the same obscure binary string as the password when logging in. I have not spent time looking at how exactly the encoding / encryption is happening, so I cannot claim that this is actually secure. So keep that in mind. It does give credence to Bitwarden's claims that all sensitive data is encrypted client-side before sending it to the server.
## Disclosure of my email address to a third party without my consent
I clicked on the "Data breach report" link on the left, and Bitwarden immediately sent my email address to https://haveibeenpwned.com. No confirmation, no nothing; it was disclosed to a third party immediately. Well, actually, since I use uMatrix to firewall my browser, it wasn't and I had to explicitly allow it to do so, but even most security nerds don't use uMatrix.
That's not cool. Don't disclose my info to third parties without my consent.
## Developer mindset
One of, if not the, most important aspects is the developer mindset. That is, do they care about security and are they knowledgeable in the field?
Bitwarden appears to know what they're doing. They have a security policy and run a bug bounty program. Security incidents appear to be solved quickly. I'd like to see more documentation on how the encryption, transfer and storage of secrets works. Right now, there are some FAQ entries, but it's all promises that give me no insight into where and how the applied security might break down.
One thing that bothers me is that they do not disclose any of the security trade-offs they made and how they impact the security of your secrets. I'm always wary when claims of perfect security are made, whether explicitly, or by omission of information. There are obvious problems with client-side javascript encryption, which every developer and user with a reasonable understanding of web development recognises. No mention of this is made. Instead, security concerns are waved away with "everything is encrypted on your device!". That's nice, but if attackers can control the code that does the encryption, all is lost.
Please note that I'm not saying that client-side javascript encryption is a bad decision! It's a perfectly reasonable trade-off between the convenience of being able to access your secrets on all your devices and a more secure way of managing your passwords. However, this trade-off should be disclosed prominently to users.
## Conclusion
So, is Bitwarden (Cloud) secure and should you use it? Unfortunately, I can't give you any advice. It all depends on your requirements. All security is a trade-off between usability, convenience and protection.
I did this review because my organisation is looking into a self-hosted Open Source password manager to manage our organisation's secrets. Would I use this to keep my personal passwords in? The answer is: no. I use an offline Keepass, which I manually sync from my laptop to my phone every now and then. This is still the most secure way of managing passwords that I do not need to share with anyone. However, that's not the use-case that I reviewed Bitwarden for. So would I use it to manage our organisation's secrets? Perhaps, the jury is still out on that. I'll need to look at the self-hosted version to see if it also includes Javascript from unreliable sources. If so, I'd have to say that, no, I would not recommend Bitwarden.
## April 30, 2018
### ma.ttias.be
#### Certificate Transparency logging now mandatory
The post Certificate Transparency logging now mandatory appeared first on ma.ttias.be.
All certificates are now required to be logged in publicly available logs (aka "Certificate Transparency").
Since January 2015, Chrome has required that Extended Validation (EV) certificates be CT-compliant in order to receive EV status.
In April 2018, this requirement will be extended to all newly-issued publicly-trusted certificates -- DV, OV, and EV -- and certificates failing to comply with this policy will not be recognized as trusted when evaluated by Chrome.
In other words: if Chrome encounters a certificate, issued after April 2018, that isn't signed by a Certificate Transparency log, the certificate will be marked as insecure.
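A quick, admittedly crude way to check whether the certificate a site serves carries embedded SCTs (a generic check, not from the original post; older openssl builds may not decode the extension):
echo | openssl s_client -connect example.com:443 -servername example.com 2>/dev/null \
  | openssl x509 -noout -text | grep -A2 'CT Precertificate SCTs'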
Don't want to have this happen to you out of the blue? Monitor your sites and their certificate health via Oh Dear!.
The post Certificate Transparency logging now mandatory appeared first on ma.ttias.be.
## April 28, 2018
#### Edge Web Server Testing at Swiftype
For any modern technology company, a comprehensive application test suite is an absolute necessity. Automated testing suites allow developers to move faster while avoiding any loss of code quality or system stability. Software development has seen great benefit come from the adoption of automated testing frameworks and methodologies, however, the culture of automated testing has neglected one key area of the modern web application serving stack: web application edge routing and multiplexing rulesets.
From modern load balancer appliances that allow for TCL-based rule sets, to locally or remotely hosted Varnish VCL rules, to the power and flexibility that Nginx and OpenResty make available through Lua, edge routing rulesets have become a vital part of application serving controls.
Over the past decade or so, it has become possible to incorporate more and more logic into edge web server infrastructures. Almost every modern web server has support for scripting, enabling developers to make their edge servers smarter than ever before. Unfortunately, the application logic configured within web servers is often much harder to test than that hosted directly in application code, and thus too often software teams resort to manual testing, or worse, customers as testers, by shipping their changes to production without edge routing testing having been performed.
In this post, I would like to explain the approach Swiftype has taken to ensure that our test suites account for our use of complex edge web server logic
to manage our production traffic flow, and thus that we can confidently deploy changes to our application infrastructure with little or no risk.
### Our Web Infrastructure
Before I go into details of our edge web server configuration testing, it may be helpful to share an overview of the infrastructure behind our web services and applications.
Swiftype has evolved from a relatively simple Rails monolith and is still largely powered by a set of Ruby applications served by Unicorn application servers. To balance traffic between the multitude of application instances, we use Haproxy (mainly for its observability features and the fair load balancing implementation). Finally, there is an OpenResty (nginx+lua) layer at the edge of our infrastructure that is responsible for many key functions: SSL termination and enforcement, rate limiting, as well as providing flexible traffic management and routing functionality (written in Lua) customized specifically for the Swiftype API.
Here is a simple diagram of our web application infrastructure:
Swiftype web infrastructure overview
### Testing Edge Web Servers
Swiftype's edge web server configuration contains thousands of lines of code: from Nginx configs, to custom templates rendered during deployment, to complex Lua logic used to manage production API traffic. Any mistake in this configuration, if not caught in testing, could lead to an outage at our edge, and considering that 100% of our API traffic is served through this layer, any outage at the edge is likely to be very impactful to our customers and our business. This is why we have invested time and resources to build a system that allows us to test our edge configuration changes in development and on CI before they are deployed to production systems.
#### Testing Workflow Overview
The first step in safely introducing change is ensuring that development and testing environments are quarantined from production environments. To do this we have created an “isolated” runtime mode for our edge web server stack. All changes to our edge configurations are first developed and run in this “isolated” mode. The “isolated” mode has no references to production backend infrastructure, and thus by employing the “isolated” mode, developers are able to iterate very quickly in a local environment without fear of harmful repercussions. All tests written to run in the “isolated” mode employ a mock server to emulate production backends and primarily focus on the unit-testing of specific new features that are being implemented.
When we are confident enough in our unit-tested set of changes, we run the same set of tests in an “acceptance testing” mode, where the mock server used in isolated tests is replaced with an Haproxy load balancer with access to production networks. Working on tests and running them in this mode allows us to ensure with the highest degree of certainty that our changes will work in a real production environment, since we exercise our whole stack while running the test suite.
#### Testing Environment Overview
Our testing environment employs Docker containers to serve in place of our production web servers. The test environment is comprised of the following components:
• A loopback network interface on which a full complement of production IPs are configured to account for every service we are planning to test (e.g. a service foo.swiftype.com pointing to an IP address 10.1.0.x in production is tested in a local “isolated” testing environment with IP 10.1.0.x assigned to an alias on the local loopback interface). This allows us to perform end-to-end testing: DNS resolution, TCP service connections to a specific IP address, etc. without needing access to production, nor local /etc/hosts or name resolution changes.
• For use cases where we are testing changes that are not represented in DNS (for example, when preparing edge servers for serving traffic currently handled by a different service), we may still employ local /etc/hosts entries to point the DNS name for a service to a local IP address for the period of testing. In this scenario, we ensure that our tests have been written in a way that is independent of the DNS configuration, and thus that the tests can be reused at a later date, or when the configuration has been deployed to production.
• An OpenResty server instance with the configuration we need to test.
• A test runner process (based on RSpec and a custom framework for writing our tests).
• An optional Mock server. (As noted above, this might be docker in a local test environment, or in CI, and is likely to be used as part of the test runner process, where it emulates an external application/service, serves in place of production backends, or acts as a local Haproxy instance running a production configuration that may even route traffic to real production backends.)
#### Isolated Testing Walkthrough
Here is how a test for a hypothetical service foo.swiftype.com (registered in DNS as 1.2.3.4) is performed in an isolated environment:
1. We automatically assign 1.2.3.4 as an alias on a loopback interface.
2. We start a mock server listening on the localhost configured to respond on the same port used by the foo.swiftype.com Nginx server backend (in production, there would be haproxy on that port) with a specific stub response.
3. Our test performs a DNS resolution for foo.swiftype.com, receives 1.2.3.4 as the IP of the service, connects to the local Nginx instance listening on 1.2.3.4 (bound to a loopback interface) and performs a test call.
4. Nginx, receiving the test request, performs all configured operations and forwards the request to a backend, which in this case is handled by the local mock server. The call result is then returned by Nginx to the test runner.
5. The test runner performs all defined testing against the server response: These tests can be very thorough, as the test runner has access to the server response code, all headers, and also the response body, and can thus confirm that all data returned meets each test’s specifications before concluding if the process as a whole has passed or failed test validation.
6. Specific to isolated testing: In some use cases, we may validate the state of the mock server, verifying that it has received all the calls we expected it to receive and that each call carried the data and headers expected. This can be very useful for testing changes where our web layer has been configured to alter requests (rewrite, add or remove headers, etc.) prior to passing them to a given backend.
Here is a diagram illustrating a test running in an isolated environment:
An isolated testing environment
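To make the walkthrough above concrete, here is a minimal sketch of what such an isolated test could look like. Swiftype’s actual suite is built on RSpec and a custom framework; the pytest version below is purely illustrative, and the service name, backend port and endpoint are assumptions made up for the example.

```python
# Illustrative only: a pytest-style rendering of the isolated test flow.
# The real suite is RSpec-based; foo.swiftype.com, port 9200 and /health are
# hypothetical values standing in for a real service under test.
import http.server
import socket
import threading

import pytest
import requests

BACKEND_PORT = 9200  # hypothetical port the Nginx backend points at


class StubBackend(http.server.BaseHTTPRequestHandler):
    """Mock server standing in for the production backend behind Nginx."""
    last_request = None

    def do_GET(self):
        StubBackend.last_request = self
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"status": "ok"}')

    def log_message(self, *args):
        pass  # keep test output quiet


@pytest.fixture(scope="module")
def mock_backend():
    server = http.server.HTTPServer(("127.0.0.1", BACKEND_PORT), StubBackend)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    yield server
    server.shutdown()


def test_foo_service_routes_to_backend(mock_backend):
    # DNS for foo.swiftype.com resolves to the alias on the loopback interface,
    # so this request exercises the local Nginx instance under test.
    ip = socket.gethostbyname("foo.swiftype.com")
    response = requests.get(f"http://{ip}/health",
                            headers={"Host": "foo.swiftype.com"})

    # Assertions against the edge response...
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}

    # ...and against what the mock backend actually received (step 6 above).
    assert StubBackend.last_request is not None
    assert StubBackend.last_request.path == "/health"
```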
#### Acceptance Testing Walkthrough
When all of our tests have passed in our “isolated” environment, and we want to make sure our configurations work in a non-mock, physically “production-like” environment (or during our periodic acceptance test runs that must also run in a production mirroring environment), we use an “acceptance testing” mode. In this mode, we replace our mock server with a real production Haproxy load balancer instance talking to real production backends (or a subset of backends representing a real production application).
Here is what happens during an acceptance test for the same hypothetical service foo.swiftype.com (registered in DNS as 1.2.3.4):
1. We automatically assign 1.2.3.4 as an alias on a loopback interface.
2. We start a dedicated production Haproxy instance, with a configuration pointing to production backend applications, and bind this dedicated haproxy instance to localhost. (This exactly mirrors what we do in production, where haproxy is always a dedicated localhost service).
3. Our test performs DNS resolution for foo.swiftype.com, receives 1.2.3.4 as the IP of the service, connects to a local Nginx instance listening on 1.2.3.4 (bound to a loopback interface), and performs a test call.
4. Nginx, receiving a test request, performs whatever operations are defined and forwards it to a local Haproxy backend, which in turn sends the request to a production application instance. When a call is complete, the result is returned by Nginx to the test runner.
5. The test runner performs all defined checks on the response and determines whether the call and response pass or fail the test.
Here is a diagram illustrating a test call made in an acceptance testing environment:
A test call within the acceptance testing environment
### Conclusion
Using our edge web server testing framework for the past few years, we have been able to perform hundreds of high-risk changes in our production edge infrastructure without any significant incident caused by deploying an untested configuration update. Our testing framework provides the assurance we need to make very dramatic changes to our web application edge routing (services that affect every production request) while remaining confident in our ability to introduce these changes safely.
We highly recommend that every engineering team tasked with building or operating complex edge server configurations adopt some level of testing that allows the team to iterate faster without fear of compromising these critical components.
## April 26, 2018
### Cryptography Engineering
#### A few thoughts on Ray Ozzie’s “Clear” Proposal
Yesterday I happened upon a Wired piece by Steven Levy that covers Ray Ozzie’s proposal for “CLEAR”. I’m quoted at the end of the piece (saying nothing much), so I knew the piece was coming. But since many of the things I said to Levy were fairly skeptical — and most didn’t make it into the piece — I figured it might be worthwhile to say a few of them here.
Ozzie’s proposal is effectively a key escrow system for encrypted phones. It’s receiving attention now due to the fact that Ozzie has a stellar reputation in the industry, and due to the fact that it’s been lauded by law enforcement (and some famous people like Bill Gates). Ozzie’s idea is just the latest bit of news in this second edition of the “Crypto Wars”, in which the FBI and various law enforcement agencies have been arguing for access to end-to-end encryption technologies — like phone storage and messaging — in the face of pretty strenuous opposition by (most of) the tech community.
In this post I’m going to sketch a few thoughts about Ozzie’s proposal, and about the debate in general. Since this is a cryptography blog, I’m mainly going to stick to the technical, and avoid the policy details (which are substantial). Also, since the full details of Ozzie’s proposal aren’t yet public — some are explained in the Levy piece and some in this patent — please forgive me if I get a few details wrong. I’ll gladly correct.
[Note: I’ve updated this post in several places in response to some feedback from Ray Ozzie. For the updated parts, look for the *. Also, Ozzie has posted some slides about his proposal.]
### How to Encrypt a Phone
The Ozzie proposal doesn’t try to tackle every form of encrypted data. Instead it focuses like a laser on the simple issue of encrypted phone storage. This is something that law enforcement has been extremely concerned about. It also represents the (relatively) low-hanging fruit of the crypto debate, for essentially two reasons: (1) there are only a few phone hardware manufacturers, and (2) access to an encrypted phone generally only takes place after law enforcement has gained physical access to it.
I’ve written about the details of encrypted phone storage in a couple of previous posts. A quick recap: most phone operating systems encrypt a large fraction of the data stored on your device. They do this using an encryption key that is (typically) derived from the user’s passcode. Many recent phones also strengthen this key by “tangling” it with secrets that are stored within the phone itself — typically with the assistance of a secure processor included in the phone. This further strengthens the device against simple password guessing attacks.
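As a rough illustration of that “tangling” idea, here is a toy sketch. This is not Apple’s actual construction, which lives inside the secure processor and uses its own primitives; the KDF choice, iteration count and variable names are all assumptions made for the example.

```python
# Toy sketch of passcode "tangling"; the real construction runs inside the
# secure processor, and these parameters are illustrative only.
import hashlib
import os

device_secret = os.urandom(32)   # burned into the phone, never leaves it

def derive_storage_key(passcode: str, device_secret: bytes) -> bytes:
    # Mixing in the device secret forces guessing to happen on the phone
    # itself, and the high iteration count slows down each guess.
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        salt=device_secret,
        iterations=200_000,
        dklen=32,
    )

storage_key = derive_storage_key("1234", device_secret)
```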
The upshot is that the FBI and local law enforcement have not — until very recently (more on that further below) — been able to obtain access to many of the phones they’ve obtained during investigation. This is due to the fact that, by making the encryption key a function of the user’s passcode, manufacturers like Apple have effectively rendered themselves unable to assist law enforcement.
### The Ozzie Escrow Proposal
Ozzie’s proposal is called “Clear”, and it’s fairly straightforward. Effectively, it calls for manufacturers (e.g., Apple) to deliberately put themselves back in the loop. To do this, Ozzie proposes a simple form of key escrow (or “passcode escrow”). I’m going to use Apple as our example in this discussion, but obviously the proposal will apply to other manufacturers as well.
Ozzie’s proposal works like this:
1. Prior to manufacturing a phone, Apple will generate a public and secret “keypair” for some public key encryption scheme. They’ll install the public key into the phone, and keep the secret key in a “vault” where hopefully it will never be needed.
2. When a user sets a new passcode onto their phone, the phone will encrypt a passcode under the Apple-provided public key. This won’t necessarily be the user’s passcode, but it will be an equivalent passcode that can unlock the phone.* It will store the encrypted result in the phone’s storage.
3. In the unlikely event that the FBI (or police) obtain the phone and need to access its files, they’ll place the phone into some form of law enforcement recovery mode. Ozzie describes doing this with some special gesture, or “twist”. Alternatively, Ozzie says that Apple itself could do something more complicated, such as performing an interactive challenge/response with the phone in order to verify that it’s in the FBI’s possession.
4. The phone will now hand the encrypted passcode to law enforcement. (In his patent, Ozzie suggests it might be displayed as a barcode on a screen.)
5. The law enforcement agency will send this data to Apple, who will do a bunch of checks (to make sure this is a real phone and isn’t in the hands of criminals). Apple will access their secret key vault, and decrypt the passcode. They can then send this back to the FBI.
6. Once the FBI enters this code, the phone will be “bricked”. Let me be more specific: Ozzie proposes that once activated, a secure chip inside the phone will now permanently “blow” several JTAG fuses monitored by the OS, placing the phone into a locked mode. By reading the value of those fuses as having been blown, the OS will never again overwrite its own storage, will never again talk to any network, and will become effectively unable to operate as a normal phone again.
When put into its essential form, this all seems pretty simple. That’s because it is. In fact, with the exception of the fancy “phone bricking” stuff in step (6), Ozzie’s proposal is a straightforward example of key escrow — a proposal that people have been making in various guises for many years. The devil is always in the details.
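Stripped of the operational details, the cryptographic core of steps (1)–(5) is ordinary public-key encryption of an escrowed passcode. The sketch below, using RSA-OAEP from the Python `cryptography` package, is my own toy rendering of that flow, not Ozzie’s design; the real proposal’s vault procedures, device attestation and the bricking step in (6) are deliberately absent.

```python
# Toy sketch of the escrow flow in steps (1)-(5); illustrative only.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# (1) Manufacturer generates a keypair; the secret half goes into the vault.
vault_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
public_key = vault_key.public_key()          # installed into every phone

# (2) Phone escrows an unlock passcode under the manufacturer's public key.
escrow_blob = public_key.encrypt(b"phone-unlock-passcode", OAEP)

# (3)-(4) Law enforcement puts the phone into recovery mode, extracts the
# blob (e.g. as a barcode), and sends it to the manufacturer.

# (5) The manufacturer decrypts inside the vault and returns the passcode.
recovered = vault_key.decrypt(escrow_blob, OAEP)
assert recovered == b"phone-unlock-passcode"
```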
### A vault of secrets
If we picture how the Ozzie proposal will change things for phone manufacturers, the most obvious new element is the key vault. This is not a metaphor. It literally refers to a giant, ultra-secure vault that will have to be maintained individually by different phone manufacturers. The security of this vault is no laughing matter, because it will ultimately store the master encryption key(s) for every single device that manufacturer ever makes. For Apple alone, that’s about a billion active devices.
Does this vault sound like it might become a target for organized criminals and well-funded foreign intelligence agencies? If it sounds that way to you, then you’ve hit on one of the most challenging problems with deploying key escrow systems at this scale. Centralized key repositories — that can decrypt every phone in the world — are basically a magnet for the sort of attackers you absolutely don’t want to be forced to defend yourself against.
So let’s be clear. Ozzie’s proposal relies fundamentally on the ability of manufacturers to secure extremely valuable key material for a massive number of devices against the strongest and most resourceful attackers on the planet. And not just rich companies like Apple. We’re also talking about the companies that make inexpensive phones and have a thinner profit margin. We’re also talking about many foreign-owned companies like ZTE and Samsung. This is key material that will be subject to near-constant access by the manufacturer’s employees, who will have to access these keys regularly in order to satisfy what may be thousands of law enforcement access requests every month.
If a single attacker ever gains access to that vault and is able to extract a few “master” secret keys (Ozzie says that these master keys will be relatively small in size*), then the attackers will gain unencrypted access to every device in the world. Even better: if the attackers can do this surreptitiously, you’ll never know they did it.
Now in fairness, this element of Ozzie’s proposal isn’t really new. In fact, this key storage issue is an inherent aspect of all massive-scale key escrow proposals. In the general case, the people who argue in favor of such proposals typically make two arguments:
1. We already store lots of secret keys — for example, software signing keys — and things work out fine. So this isn’t really a new thing.
2. Hardware Security Modules.
Let’s take these one at a time.
It is certainly true that software manufacturers do store secret keys, with varying degrees of success. For example, many software manufacturers (including Apple) store secret keys that they use to sign software updates. These keys are generally locked up in various ways, and are accessed periodically in order to sign new software. In theory they can be stored in hardened vaults, with biometric access controls (as the vaults Ozzie describes would have to be.)
But this is pretty much where the similarity ends. You don’t have to be a technical genius to recognize that there’s a world of difference between a key that gets accessed once every month — and can be revoked if it’s discovered in the wild — and a key that may be accessed dozens of times per day and will be effectively undetectable if it’s captured by a sophisticated adversary.
Moreover, signing keys leak all the time. The phenomenon is so common that journalists have given it a name: it’s called “Stuxnet-style code signing”. The name derives from the fact that the Stuxnet malware — the nation-state malware used to sabotage Iran’s nuclear program — was authenticated with valid code signing keys, many of which were (presumably) stolen from various software vendors. This practice hasn’t remained with nation states, unfortunately, and has now become common in retail malware.
The folks who argue in favor of key escrow proposals generally propose that these keys can be stored securely in special devices called Hardware Security Modules (HSMs). Many HSMs are quite solid. They are not magic, however, and they are certainly not up to the threat model that a massive-scale key escrow system would expose them to. Rather than being invulnerable, they continue to cough up vulnerabilities like this one. A single such vulnerability could be game-over for any key escrow system that used it.
In some follow up emails, Ozzie suggests that keys could be “rotated” periodically, ensuring that even after a key compromise the system could renew security eventually. He also emphasizes the security mechanisms (such as biometric access controls) that would be present in such a vault. I think that these are certainly valuable and necessary protections, but I’m not convinced that they would be sufficient.
### Assume a secure processor
Let’s suppose for a second that an attacker does get access to the Apple (or Samsung, or ZTE) key vault. In the section above I addressed the likelihood of such an attack. Now let’s talk about the impact.
Ozzie’s proposal has one significant countermeasure against an attacker who wants to use these stolen keys to illegally spy on (access) your phone. Specifically, should an attacker attempt to illegally access your phone, the phone will be effectively destroyed. This doesn’t protect you from having your files read — that horse has fled the stable — but it should alert you to the fact that something fishy is going on. This is better than nothing.
This measure is pretty important, not only because it protects you against evil maid attacks. As far as I can tell, this protection is pretty much the only measure by which theft of the master decryption keys might ever be detected. So it had better work well.
The details on how this might work aren’t very clear in Ozzie’s patent, but the Wired article describes it as follows, quoting Ozzie’s presentation at Columbia University:
What Ozzie appears to describe here is a secure processor contained within every phone. This processor would be capable of securely and irreversibly enforcing that once law enforcement has accessed a phone, that phone could no longer be placed into an operational state.
My concern with this part of Ozzie’s proposal is fairly simple: this processor does not currently exist. To explain why this is a problem, let me tell a story.
Back in 2013, Apple began installing a secure processor in each of their phones. While this secure processor (called the Secure Enclave Processor, or SEP) is not exactly the same as the one Ozzie proposes, the overall security architecture seems very similar.
One main goal of Apple’s SEP was to limit the number of passcode guessing attempts that a user could make against a locked iPhone. In short, it was designed to keep track of each (failed) login attempt and keep a counter. If the number of attempts got too high, the SEP would make the user wait a while — in the best case — or actively destroy the phone’s keys. This last protection is effectively identical to Ozzie’s proposal. (With some modest differences: Ozzie proposes to “blow fuses” in the phone, rather than erasing a key; and he suggests that this event would be triggered by entry of a recovery passcode.*)
For several years, the SEP appeared to do its job fairly effectively. Then in 2017, everything went wrong. Two firms, Cellebrite and Grayshift, announced that they had products that effectively unlocked every single Apple phone, without any need to dismantle the phone. Digging into the details of this exploit, it seems very clear that both firms — working independently — have found software exploits that somehow disable the protections that are supposed to be offered by the SEP.
The cost of this exploit (to police and other law enforcement)? About $3,000-$5,000 per phone. Or (if you like to buy rather than rent) about $15,000. Also, just to add an element of comedy to the situation, the GrayKey source code appears to have recently been stolen. The attackers are extorting the company for two Bitcoin. Because 2018.

Let me sum up my point in case I’m not beating you about the head quite enough: The richest and most sophisticated phone manufacturer in the entire world tried to build a processor that achieved goals similar to those Ozzie requires. And as of April 2018, after five years of trying, they have been unable to achieve this goal, a goal that is critical to the security of the Ozzie proposal as I understand it.

Now obviously the lack of a secure processor today doesn’t mean such a processor will never exist. However, let me propose a general rule: if your proposal fundamentally relies on a secure lock that nobody can ever break, then it’s on you to show me how to build that lock.

### Conclusion

While this mainly concludes my notes on Ozzie’s proposal, I want to conclude this post with a side note, a response to something I routinely hear from folks in the law enforcement community. This is the criticism that cryptographers are a bunch of naysayers who aren’t trying to solve “one of the most fundamental problems of our time”, and are instead just rejecting the problem with lazy claims that it “can’t work”.

As a researcher, my response to this is: phooey. Cryptographers — myself most definitely included — love to solve crazy problems. We do this all the time. You want us to deploy a new cryptocurrency? No problem! Want us to build a system that conducts a sugar-beet auction using advanced multiparty computation techniques? Awesome. We’re there. No problem at all.

But there’s crazy and there’s crazy. The reason so few of us are willing to bet on massive-scale key escrow systems is that we’ve thought about it and we don’t think it will work. We’ve looked at the threat model, the usage model, and the quality of hardware and software that exists today. Our informed opinion is that there’s no detection system for key theft, there’s no renewability system, HSMs are terrifically vulnerable (and the companies largely staffed with ex-intelligence employees), and insiders can be suborned. We’re not going to put the data of a few billion people on the line in an environment where we believe with high probability that the system will fail.

Maybe that’s unreasonable. If so, I can live with that.

## April 25, 2018

### R.I.Pienaar

#### Choria Progress Update

It’s been a while since my previous update and quite a bit has happened since.

## Choria Server

As previously mentioned the Choria Server will aim to replace mcollectived eventually. Thus far I was focussed on its registration subsystem, Golang based MCollective RPC compatible agents and being able to embed it into other software for IoT and management backplanes.

Over the last few weeks I learned that MCollective will no longer be shipped in Puppet Agent version 6 which is currently due around Fall 2018. This means we have to accelerate making Choria standalone in its own right.
A number of things have to happen to get there:

• Choria Server should support Ruby agents
• The Ruby libraries Choria Server needs either need to be embedded and placed dynamically or provided via a Gem
• The Ruby client needs to be provided via a Gem
• New locations for these Ruby parts are needed outside of AIO Ruby

Yesterday I released the first step in this direction, you can now replace mcollectived with choria server. For now I am marking this as a preview/beta feature while we deal with issues the community finds.

The way this works is that we provide a small shim that uses just enough of MCollective to get the RPC framework running – luckily this was initially developed as a MCollective plugin and it retained its quite separate code base. When the Go code needs to invoke a ruby agent it will call the shim to do so, the shim in turn will provide the result from the agent – in JSON format – back to Go.

This works for me with any agent I’ve tried it with and I am quite pleased with the results:

USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 10820 0.0 1.1 1306584 47436 ? Sl 13:50 0:06 /opt/puppetlabs/puppet/bin/ruby /opt/puppetlabs/puppet/bin/mcollectived

MCollective would of course include the entire Puppet as soon as any agent that uses Puppet is loaded – service, package, puppet – and so over time things only get worse. Here is Choria:

USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 32396 0.0 0.5 296436 9732 ? Ssl 16:07 0:03 /usr/sbin/choria server --config=/etc/choria/server.conf

I run a couple 100 000 instances of this and this is what you get, it never changes really. This is because Choria spawns the Ruby code and that will exit when done.

This has an unfortunate side effect that the service, package and puppet agents are around 1 second slower per invocation because loading Puppet is really slow. Ones that do not load Puppet are only marginally slower.

irb(main):002:0> Benchmark.measure { require "puppet" }.real
=> 0.619865644723177

There is a page set up dedicated to the Beta that details how to run it and what to look out for.

## JSON pure protocol

Some of the reasons for breakage that you might run into – like mco facts not working now with Choria Server – is due to a hugely significant change in the background. Choria – both plugged into MCollective and Standalone – is JSON safe. The Ruby plugin is optionally so (and off by default) but the Choria daemon only supports JSON.

Traditionally MCollective has used YAML on the wire; being quite old, JSON was really not that big a deal back in the early 2000s when the foundation for this choice was laid down, XML was more important. Worse, MCollective has exposed Ruby specific data types and YAML extensions on the wire which have made creating cross platform support nearly impossible.

YAML is also of course capable of carrying any object – which means some agents are just never going to be compatible with anything but Ruby. This was the case with the process agent but I fixed that before shipping it in Choria. It also essentially means YAML can invoke things you might not have anticipated, and so big security problems happen.

Since quite some time now the Choria protocol is defined, versioned and JSON schemas are available. The protocol makes the separation between Payload, Security, Transport and Federation much clearer and the protocol can now support anything that can move JSON – Middleware, REST, SSH, Postal Doves are all capable of carrying Choria packets.
There is a separate Golang implementation of the protocol that is transport agnostic and the schemas are there. Version 1 of the protocol is a tad skewed to MCollective but Version 2 (not yet planned) will drop those shackles. A single Choria Server is capable of serving multiple versions of the network protocol and communicating with old and new clients.

Golang being a static language and having a really solid and completely compatible implementation of the protocol means making ones for other languages like Python etc will not be hard. However I think long term the better option for other languages is still a capable REST gateway. I did some POC work on a very very light weight protocol suitable for devices like Arduino and will provide bridging between the worlds in our Federation Brokers. You’ll be able to mco rpc wallplug off, your client will talk full Choria Protocol and the wall plug might speak a super light weight MQTT based protocol and you will not even know this.

There are some gotchas as a result of these changes, also captured in the Choria Server evaluation documentation. To resolve some of these I need to be much more aggressive with what I do to the MCollective libraries, something I can do once they are liberated out of Puppet Agent.

## April 23, 2018

### Vincent Bernat

#### A more privacy-friendly blog

When I started this blog, I embraced some free services, like Disqus or Google Analytics. These services are quite invasive for users’ privacy. Over the years, I have tried to correct this to reach a point where I do not rely on any “privacy-hostile” services.

# Analytics🔗

Google Analytics is a ubiquitous solution to get a powerful analytics solution for free. It’s also a great way to provide data about your visitors to Google—also for free. There are self-hosted solutions like Matomo—previously Piwik.

I opted for a simpler solution: no analytics. It also enables me to think that my blog attracts thousands of visitors every day.

# Fonts🔗

Google Fonts is a very popular font library and hosting service, which relies on the generic Google Privacy Policy. The google-webfonts-helper service makes it easy to self-host any font from Google Fonts. Moreover, with help from pyftsubset, I include only the characters used in this blog. The font files are lighter and more complete: no problem spelling “Antonín Dvořák”.

# Videos🔗

• Before: YouTube
• After: self-hosted

Some articles are supported by a video (like “OPL2LPT: an AdLib sound card for the parallel port“). In the past, I was using YouTube, mostly because it was the only free platform with an option to disable ads. Streaming on-demand videos is usually deemed quite difficult. For example, if you just use the <video> tag, you may push a too big video for people with a slow connection. However, it is not that hard, thanks to hls.js, which enables delivery of video sliced in segments available at different bitrates. Users with JavaScript disabled are still delivered a progressive version of medium quality. In “Self-hosted videos with HLS”, I explain this approach in more details.

# Comments🔗

Disqus is a popular comment solution for static websites. They were recently acquired by Zeta Global, a marketing company, and their business model is supported only by advertisements. On the technical side, Disqus also loads several hundred kilobytes of resources. Therefore, many websites load Disqus on demand. That’s what I did.
This doesn’t solve the privacy problem and I had the sentiment people were less eager to leave a comment if they had to execute an additional action.

For some time, I thought about implementing my own comment system around Atom feeds. Each page would get its own feed of comments. A piece of JavaScript would turn these feeds into HTML and comments could still be read without JavaScript, thanks to the default rendering provided by browsers. People could also subscribe to these feeds: no need for mail notifications! The feeds would be served as static files and updated on new comments by a small piece of server-side code. Again, this could work without JavaScript.

I still think this is a great idea. But I didn’t feel like developing and maintaining a new comment system. There are several self-hosted alternatives, notably Isso and Commento. Isso is a bit more featureful, with notably an imperfect import from Disqus. Both are struggling with maintenance and are trying to become sustainable with a paid hosted version.1 Commento is more privacy-friendly as it doesn’t use cookies at all. However, cookies from Isso are not essential and can be filtered with nginx:

proxy_hide_header Set-Cookie;
proxy_hide_header X-Set-Cookie;
proxy_ignore_headers Set-Cookie;

In Isso, there are currently no mail notifications, but I have added an Atom feed for each comment thread.

Another option would have been to not provide comments anymore. However, I had some great contributions as comments in the past and I also think they can work as some kind of peer review for blog articles: they are a weak guarantee that the content is not totally wrong.

# Search engine🔗

A way to provide a search engine for a personal blog is to provide a form for a public search engine, like Google. That’s what I did. I also slapped some JavaScript on top of that to make it look like not Google.

The solution here is easy: switch to DuckDuckGo, which lets you customize a bit the search experience:

<form id="lf-search" action="https://duckduckgo.com/">
<input type="hidden" name="kf" value="-1">
<input type="hidden" name="kaf" value="1">
<input type="hidden" name="k1" value="-1">
<input type="hidden" name="sites" value="vincent.bernat.im/en">
<input type="submit" value="">
<input type="text" name="q" value="" autocomplete="off" aria-label="Search">
</form>

The JavaScript part is also removed as DuckDuckGo doesn’t provide an API. As it is unlikely that more than three people will use the search engine in a year, this seems a good idea to not spend too much time on this non-essential feature.

# Newsletter🔗

• Before: RSS feed
• After: still RSS feed but also a MailChimp newsletter

Nowadays, RSS feeds are far less popular than they were before. I am still baffled as to why a technical audience wouldn’t use RSS, but some readers prefer to receive updates by mail.

MailChimp is a common solution to send newsletters. It provides a simple integration with RSS feeds to trigger a mail each time new items are added to the feed. From a privacy point of view, MailChimp seems a good citizen: data collection is mainly limited to the amount needed to operate the service. Privacy-conscious users can still avoid this service and use the RSS feed.

# Less JavaScript🔗

• Before: third-party JavaScript code
• After: self-hosted JavaScript code

Many privacy-conscious people are disabling JavaScript or using extensions like uMatrix or NoScript.
Except for comments, I was using JavaScript only for non-essential stuff:

• For mathematical formulae, I have switched from MathJax to KaTeX. The latter is faster but also enables server-side rendering: it produces the same output regardless of browser. Therefore, client-side JavaScript is not needed anymore.
• For sidenotes, I have turned the JavaScript code doing the transformation into Python code, with pyquery. No more client-side JavaScript for this aspect either.

The remaining code is still here but is self-hosted.

# Memento: CSP🔗

The HTTP Content-Security-Policy header controls the resources that a user agent is allowed to load for a given page. It is a safeguard and a memento for the external resources a site will use. Mine is moderately complex and shows what to expect from a privacy point of view:3

Content-Security-Policy:
default-src 'self' blob:;
script-src 'self' blob: https://d1g3mdmxf8zbo9.cloudfront.net/js/;
object-src 'self' https://d1g3mdmxf8zbo9.cloudfront.net/images/;
img-src 'self' data: https://d1g3mdmxf8zbo9.cloudfront.net/images/;
frame-src https://d1g3mdmxf8zbo9.cloudfront.net/images/;
style-src 'self' 'unsafe-inline' https://d1g3mdmxf8zbo9.cloudfront.net/css/;
font-src 'self' about: data: https://d1g3mdmxf8zbo9.cloudfront.net/fonts/;
worker-src blob:;
media-src 'self' blob: https://luffy-video.sos-ch-dk-2.exo.io;
connect-src 'self' https://luffy-video.sos-ch-dk-2.exo.io https://comments.luffy.cx;
frame-ancestors 'none';
block-all-mixed-content;

I am quite happy having been able to reach this result. 😊

1. For Isso, look at comment.sh. For Commento, look at commento.io↩︎
2. You may have noticed I am a footnote sicko and use them all the time for pointless stuff. ↩︎
3. I don’t have an issue with using a CDN like CloudFront: it is a paid service and Amazon AWS is not in the business of tracking users. ↩︎

## April 21, 2018

### Cryptography Engineering

#### Wonk post: chosen ciphertext security in public-key encryption (Part 1)

In general I try to limit this blog to posts that focus on generally-applicable techniques in cryptography. That is, I don’t focus on the deeply wonky. But this post is going to be an exception.

Specifically, I’m going to talk about a topic that most “typical” implementers don’t — and shouldn’t — think about. Specifically: I’m going to talk about various techniques for making public key encryption schemes chosen ciphertext secure. I see this as the kind of post that would have saved me ages of reading when I was a grad student, so I figured it wouldn’t hurt to write it all down.

### Background: CCA(1/2) security

Early (classical) ciphers used a relatively weak model of security, if they used one at all. That is, the typical security model for an encryption scheme was something like the following:

1. I generate an encryption key (or keypair for public-key encryption)
2. I give you the encryption of some message of my choice
3. You “win” if you can decrypt it

This is obviously not a great model in the real world, for several reasons. First off, in some cases the attacker knows a lot about the message to be decrypted. For example: it may come from a small space (like a set of playing cards). For this reason we require a stronger definition like “semantic security” that assumes the attacker can choose the plaintext distribution, and can also obtain the encryption of messages of his/her own choice. I’ve written more about this here.
More relevant to this post, another limitation of the above game is that — in some real-world examples — the attacker has even more power. That is: in addition to obtaining the encryption of chosen plaintexts, they may be able to convince the secret keyholder to decrypt chosen ciphertexts of their choice.

The latter attack is called a chosen-ciphertext (CCA) attack.

At first blush this seems like a really stupid model. If you can ask the keyholder to decrypt chosen ciphertexts, then isn’t the scheme just obviously broken? Can’t you just decrypt anything you want?

The answer, it turns out, is that there are many real-life examples where the attacker has decryption capability, but the scheme isn’t obviously broken. For example:

1. Sometimes an attacker can decrypt a limited set of ciphertexts (for example, because someone leaves the decryption machine unattended at lunchtime.) The question then is whether they can learn enough from this access to decrypt other ciphertexts that are generated after she loses access to the decryption machine — for example, messages that are encrypted after the operator comes back from lunch.
2. Sometimes an attacker can submit any ciphertext she wants — but will only obtain a partial decryption of the ciphertext. For example, she might learn only a single bit of information such as “did this ciphertext decrypt correctly”. The question, then, is whether she can leverage this tiny amount of data to fully decrypt some ciphertext of her choosing.

The first example is generally called a “non-adaptive” chosen ciphertext attack, or a CCA1 attack (and sometimes, historically, a “lunchtime” attack). There are a few encryption schemes that totally fall apart under this attack — the most famous textbook example is Rabin’s public key encryption scheme, which allows you to recover the full secret key from just a single chosen-ciphertext decryption.

The more powerful second example is generally referred to as an “adaptive” chosen ciphertext attack, or a CCA2 attack. The term refers to the idea that the attacker can select the ciphertexts they try to decrypt based on seeing a specific ciphertext that they want to attack, and by seeing the answers to specific decryption queries.

In this article we’re going to use the more powerful “adaptive” (CCA2) definition, because that subsumes the CCA1 definition. We’re also going to focus primarily on public-key encryption. With this in mind, here is the intuitive definition of the experiment we want a CCA2 public-key encryption scheme to be able to survive:

1. I generate an encryption keypair for a public-key scheme and give you the public key.
2. You can send me (sequentially and adaptively) many ciphertexts, which I will decrypt with my secret key. I’ll give you the result of each decryption.
3. Eventually you’ll send me a pair of messages (of equal length) $M_0, M_1$ and I’ll pick a bit $b$ at random, and return to you the encryption of $M_b$, which I will denote as $C^* \leftarrow {\sf Encrypt}(pk, M_b)$.
4. You’ll repeat step (2), sending me ciphertexts to decrypt. If you send me $C^*$ I’ll reject your attempt. But I’ll decrypt any other ciphertext you send me, even if it’s only slightly different from $C^*$.
5. The attacker outputs their guess $b'$. They “win” the game if $b'=b$.

We say that our scheme is secure if the attacker cannot win with probability significantly greater than they would achieve by simply guessing $b'$ at random.
Since they can win this game with probability 1/2 just by guessing randomly, that means we want (Probability attacker wins the game) – 1/2 to be “very small” (typically a negligible function of the security parameter).

You should notice two things about this definition. First, it gives the attacker the full decryption of any ciphertext they send me. This is obviously much more powerful than just giving the attacker a single bit of information, as we mentioned in the example further above. But note that powerful is good. If our scheme can remain secure in this powerful experiment, then clearly it will be secure in a setting where the attacker gets strictly less information from each decryption query.

The second thing you should notice is that we impose a single extra condition in step (4), namely that the attacker cannot ask us to decrypt $C^*$. We do this only to prevent the game from being “trivial” — if we did not impose this requirement, the attacker could always just hand us back $C^*$ to decrypt, and they would always learn the value of $b$.

(Notice as well that we do not give the attacker the ability to request encryptions of chosen plaintexts. We don’t need to do that in the public key encryption version of this game, because we’re focusing exclusively on public-key encryption here — since the attacker has the public key, she can encrypt anything she wants without my help.)

With definitions out of the way, let’s talk a bit about how we achieve CCA2 security in real schemes.

### A quick detour: symmetric encryption

This post is mainly going to focus on public-key encryption, because that’s actually the problem that’s challenging and interesting to solve. It turns out that achieving CCA2 for symmetric-key encryption is really easy. Let me briefly explain why this is, and why the same ideas don’t work for public-key encryption.

(To explain this, we’ll need to slightly tweak the CCA2 definition above to make it work in the symmetric setting. The changes here are small: we won’t give the attacker a public key in step (1), and at steps (2) and (4) we will allow the attacker to request the encryption of chosen plaintexts as well as the decryption.)

The first observation is that many common encryption schemes — particularly, the widely-used cipher modes of operation like CBC and CTR — are semantically secure in a model where the attacker does not have the ability to decrypt chosen ciphertexts. However, these same schemes break completely in the CCA2 model. The simple reason for this is ciphertext malleability.

Take CTR mode, which is particularly easy to mess with. Let’s say we’ve obtained a ciphertext $C^*$ at step (4) (recall that $C^*$ is the encryption of $M_b$). It’s trivially easy to “maul” the ciphertext — simply by flipping, say, a bit of the message (i.e., XORing it with “1”). This gives us a new ciphertext $C' = C^* \oplus 1$ that we are now allowed to submit for decryption. We are now allowed (by the rules of the game) to submit this ciphertext, and obtain $M_b \oplus 1$, which we can use to figure out $b$.

(A related, but “real world” variant of this attack is Vaudenay’s Padding Oracle Attack, which breaks actual implementations of symmetric-key cryptosystems. Here’s one we did against Apple iMessage. Here’s an older one on XML encryption.)

So how do we fix this problem? The straightforward observation is that we need to prevent the attacker from mauling the ciphertext $C^*$.
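Concretely, this is the sort of mauling any fix has to rule out. Here is a tiny demonstration with AES-CTR from the Python `cryptography` package standing in for the victim’s encryption; the key, nonce and message are throwaway values. Flipping one ciphertext bit flips exactly the corresponding plaintext bit, and nothing in the scheme notices.

```python
# CTR-mode malleability in a few lines; key/nonce/message are illustrative.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, nonce = os.urandom(32), os.urandom(16)

def ctr(data: bytes) -> bytes:
    # In CTR mode, encryption and decryption are the same keystream XOR.
    return Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor().update(data)

c_star = ctr(b"pay mallory 0001 dollars")

# The attacker flips one bit of the ciphertext without knowing the key...
mauled = bytearray(c_star)
mauled[12] ^= 0x08                 # '0' (0x30) XOR 0x08 = '8' (0x38)

# ...and the decryption oracle happily returns a closely related plaintext.
print(ctr(bytes(mauled)))          # b"pay mallory 8001 dollars"
```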
The generic approach to doing this is to modify the encryption scheme so that it includes a Message Authentication Code (MAC) tag computed over every CTR-mode ciphertext. The key for this MAC scheme is generated by the encrypting party (me) and kept with the encryption key. When asked to decrypt a ciphertext, the decryptor first checks whether the MAC is valid. If it’s not, the decryption routine will output “ERROR”. Assuming an appropriate MAC scheme, the attacker can’t modify the ciphertext (including the MAC) without causing the decryption to fail and produce a useless result.

So in short: in the symmetric encryption setting, the answer to CCA2 security is simply for the encrypting parties to authenticate each ciphertext using a secret authentication (MAC) key they generate. Since we’re talking about symmetric encryption, that extra (secret) authentication key can be generated and stored with the decryption key. (Some more efficient schemes make this all work with a single key, but that’s just an engineering convenience.) Everything works out fine.

So now we get to the big question.

### CCA security is easy in symmetric encryption. Why can’t we just do the same thing for public-key encryption?

As we saw above, it turns out that strong authenticated encryption is sufficient to get CCA(2) security in the world of symmetric encryption. Sadly, when you try this same idea generically in public key encryption, it doesn’t always work. There’s a short reason for this, and a long one. The short version is: it matters who is doing the encryption.

Let’s focus on the critical difference. In the symmetric CCA2 game above, there is exactly one person who is able to (legitimately) encrypt ciphertexts. That person is me. To put it more clearly: the person who performs the legitimate encryption operations (and has the secret key) is also the same person who is performing decryption.

Even if the encryptor and decryptor aren’t literally the same person, the encryptor still has to be honest. (To see why this has to be the case, remember that the encryptor has the shared secret key! If that party was a bad guy, then the whole scheme would be broken, since they could just output the secret key to the bad guys.)

And once you’ve made the stipulation that the encryptor is honest, then you’re almost all the way there. It suffices simply to add some kind of authentication (a MAC or a signature) to any ciphertext she encrypts. At that point the decryptor only needs to determine whether any given ciphertext actually came from the (honest) encryptor, and avoid decrypting the bad ones. You’re done.

Public key encryption (PKE) fundamentally breaks all these assumptions. In a public-key encryption scheme, the main idea is that anyone can encrypt a message to you, once they get a copy of your public key. The encryption algorithm may sometimes be run by good, honest people. But it can also be run by malicious people. It can be run by parties who are adversarial. The decryptor has to be able to deal with all of those cases. One can’t simply assume that the “real” encryptor is honest.

Let me give a concrete example of how this can hurt you. A couple of years ago I wrote a post about flaws in Apple iMessage, which (at the time) used a simple authenticated (public key) encryption scheme. The basic iMessage encryption algorithm used public key encryption (actually a combination of RSA with some AES thrown in for efficiency) so that anyone could encrypt a message to my key.
For authenticity, it required that every message be signed with an ECDSA signature by the sender. When I received a message, I would look up the sender’s public key and first make sure the signature was valid. This would prevent bad guys from tampering with the message in flight — e.g., executing nasty stuff like adaptive chosen ciphertext attacks.

If you squint a little, this is almost exactly a direct translation of the symmetric crypto approach we discussed above. We’re simply swapping the MAC for a digital signature.

The problems with this scheme start to become apparent when we consider that there might be multiple people sending me ciphertexts. Let’s say the adversary is on the communication path and intercepts a signed message from you to me. They want to change (i.e., maul) the message so that they can execute some kind of clever attack. Well, it turns out this is simple. They simply rip off the honest signature and replace it with one they make themselves:

The new message is identical, but now appears to come from a different person (the attacker). Since the attacker has their own signing key, they can maul the encrypted message as much as they want, and sign new versions of that message. If you plug this attack into (a version of) the public-key CCA2 game up top, you see they’ll win quite easily. All they have to do is modify the challenge ciphertext $C^*$ at step (4) to be signed with their own signing key, then they can change it by munging with the CTR mode encryption, and request the decryption of that ciphertext.

Of course if I only accept messages signed by some original (guaranteed-to-be-honest) sender, this scheme might work out fine. But that’s not the point of public key encryption. In a real public-key scheme — like the one Apple iMessage was trying to build — I should be able to (safely) decrypt messages from anyone, and in that setting this naive scheme breaks down pretty badly.

Whew.

Ok, this post has gotten a bit long, and so far I haven’t actually gotten to the various “tricks” for adding chosen ciphertext security to real public key encryption schemes. That will have to wait until the next post, to come shortly.

### Vincent Bernat

#### OPL2 Audio Board: an AdLib sound card for Arduino

In a previous article, I presented the OPL2LPT, a sound card for the parallel port featuring a Yamaha YM3812 chip, also known as OPL2—the chip of the AdLib sound card. The OPL2 Audio Board for Arduino is another indie sound card using this chip. However, instead of relying on a parallel port, it uses a serial interface, which can be driven from an Arduino board or a Raspberry Pi. While the OPL2LPT targets retrogamers with real hardware, the OPL2 Audio Board cannot be used in the same way. Nonetheless, it can also be operated from ScummVM and DOSBox!

# Unboxing🔗

The OPL2 Audio Board can be purchased on Tindie, either as a kit or fully assembled. I have paired it with a cheap clone of the Arduino Nano. A library to drive the board is available on GitHub, along with some examples.

One of them is DemoTune.ino. It plays a short tune on three channels. It can be compiled and uploaded to the Arduino with PlatformIO—installable with pip install platformio—using the following command:1

$ platformio ci \
--board nanoatmega328 \
--lib ../../src \
DemoTune.ino
[...]
PLATFORM: Atmel AVR > Arduino Nano ATmega328
SYSTEM: ATMEGA328P 16MHz 2KB RAM (30KB Flash)
Converting DemoTune.ino
[...]
AVAILABLE: arduino
Use manually specified: /dev/ttyUSB0
[...]
avrdude: 6618 bytes of flash written
[...]
===== [SUCCESS] Took 5.94 seconds =====
Immediately after the upload, the Arduino plays the tune. 🎶
The next interesting example is SerialIface.ino. It turns the audio board into a sound card over serial port. Once the code has been pushed to the Arduino, you can use the play.py program in the same directory to play VGM files. They are a sample-accurate sound format for many sound chips. They log the exact commands sent. There are many of them on VGMRips. Be sure to choose the ones for the YM3812/OPL2! Here is a small selection:
# Usage with DOSBox & ScummVM🔗
Notice
The support for the serial protocol used in this section has not been merged yet. In the meantime, grab SerialIface.ino from the pull request: git checkout 50e1717.
When the Arduino is flashed with SerialIface.ino, the board can be driven through a simple protocol over the serial port. By patching DOSBox and ScummVM, we can make them use this unusual sound card. Here are some examples of games:
• 0:00, with DOSBox, the first level of Doom 🎮
• 1:06, with DOSBox, the introduction of Loom 🎼
• 2:38, with DOSBox, the first level of Lemmings 🐹
• 3:32, with DOSBox, the introduction of Legend of Kyrandia 🃏
• 6:47, with ScummVM, the introduction of Day of the Tentacle ☢️
• 11:10, with DOSBox, the introduction of Another World2 🐅
## DOSBox🔗
The serial protocol is described in the SerialIface.ino file:
/*
* A very simple serial protocol is used.
*
* - Initial 3-way handshake to overcome reset delay / serial noise issues.
* - 5-byte binary commands to write registers.
* - (uint8) OPL2 register address
* - (uint8) OPL2 register data
* - (int16) delay (milliseconds); negative -> pre-delay; positive -> post-delay
* - (uint8) delay (microseconds / 4)
*
* Example session:
*
* Arduino: HLO!
* PC: BUF?
* Arduino: 256 (switches to binary mode)
* PC: 0xb80a014f02 (write OPL register and delay)
* Arduino: k
*
* A variant of this protocol is available without the delays. In this
* case, the BUF? command should be sent as B0F? The binary protocol
* is now using 2-byte binary commands:
* - (uint8) OPL2 register address
* - (uint8) OPL2 register data
*/
Adding support for this protocol in DOSBox is relatively simple (patch). For best performance, we use the 2-byte variant (5000 ops/s). The binary commands are pipelined and a dedicated thread collects the acknowledgments. A semaphore captures the number of free slots in the receive buffer. As it is not possible to read registers, we rely on DOSBox to emulate the timers, which are mostly used to let the various games detect the OPL2.
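For a sense of what the host side of the 2-byte variant looks like, here is a rough sketch using pyserial. The handshake strings come from the protocol comment above, but the flow control is deliberately simplified: the actual DOSBox patch pipelines writes and collects the acknowledgments on a dedicated thread, and the register write at the end is just an example value.

```python
# Host-side sketch of the 2-byte serial protocol described above (pyserial).
# Simplified: the real patch pipelines commands and counts acks asynchronously.
import serial

port = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)

# Handshake: wait for the Arduino's greeting, then request binary mode.
assert port.readline().strip() == b"HLO!"
port.write(b"B0F?")                     # "BUF?" would select the variant with delays
buffer_slots = int(port.readline())     # e.g. 256 free slots in the receive buffer

def write_reg(address: int, data: int) -> None:
    """Send one OPL2 register write and wait for its acknowledgment."""
    port.write(bytes([address, data]))
    assert port.read(1) == b"k"

write_reg(0x20, 0x01)   # example write: modulator settings for channel 1
```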
The patch is tested only on Linux but should work on any POSIX system—not Windows. To test it, you need to build DOSBox from source:
$ sudo apt build-dep dosbox
$ git clone https://github.com/vincentbernat/dosbox.git -b feature/opl2audioboard
$ cd dosbox
$ ./autogen.sh
$ ./configure && make

Replace the sblaster section of ~/.dosbox/dosbox-SVN.conf:

[sblaster]
sbtype=none
oplmode=opl2
oplrate=49716
oplemu=opl2arduino
opl2arduino=/dev/ttyUSB0

Then, run DOSBox with ./src/dosbox. That’s it! You will likely get the “OPL2Arduino: too slow, consider increasing buffer” message a lot. To fix this, you need to recompile SerialIface.ino with a bigger receive buffer:

$ platformio ci \
--board nanoatmega328 \
--lib ../../src \
--project-option="build_flags=-DSERIAL_RX_BUFFER_SIZE=512" \
SerialIface.ino
## ScummVM🔗
The same code can be adapted for ScummVM (patch). To test, build it from source:
$ sudo apt build-dep scummvm
$ git clone https://github.com/vincentbernat/scummvm.git -b feature/opl2audioboard
$ cd scummvm
$ ./configure --disable-all-engines --enable-engine=scumm && make
Then, you can start ScummVM with ./scummvm. Select “AdLib Emulator” as the music device and “OPL2 Arduino” as the AdLib emulator.3 Like for DOSBox, watch the console to check if you need a larger receive buffer.
Enjoy! 😍
1. This command is valid for an Arduino Nano. For another board, take a look at the output of platformio boards arduino↩︎
2. Another World (also known as Out of This World), released in 1991, designed by Éric Chahi, is using sampled sounds at 5 kHz or 10 kHz. With a serial port operating at 115,200 bits/s, the 5 kHz option is just within our reach. However, I have no idea if the rendering is faithful. It doesn’t sound like a SoundBlaster, but it sounds analogous to the rendering of the OPL2LPT which sounds similar to the SoundBlaster when using the 10 kHz option. DOSBox’ AdLib emulation using Nuked OPL3—which is considered to be the best—sounds worse. ↩︎
3. If you need to specify a serial port other than /dev/ttyUSB0, add a line opl2arduino_device= in the ~/.scummvmrc configuration file. ↩︎
## April 20, 2018
### Sarah Allen
#### false dichotomy of control vs sharing
Email is the killer app of the Internet. Amidst many sharing and collaboration applications and services, most of us frequently fall back to email. Marc Stiegler suggests that email often “just works better”. Why is this?
Digital communication is fast across distances and allows access to incredible volumes of information, yet digital access controls typically force us into a false dichotomy of control vs sharing.
Looking at physical models of sharing and access control, we can see that we already have well-established models where we can give up control temporarily, yet not completely.
Alan Karp illustrated this nicely at last week’s Internet Identity Workshop (IIW) in a quick anecdote:
Marc gave me the key to his car so I could park it in my garage. I couldn’t do it, so I gave the key to my kid, and asked my neighbor to do it for me. She stopped by my house, got the key and used it to park Marc’s car in my garage.
The car key scenario is clear. In addition to possession of the key, there’s even another layer of control — if my kid doesn’t have a driver’s license, then he can’t drive the car, even if he holds the key.
When we translate this story to our modern digital realm, it sounds crazy:
Marc gave me his password so I could copy a file from his computer to mine. I couldn’t do it, so I gave Marc’s password to my kid, and asked my neighbor to do it for me. She stopped by my house so my kid could tell her my password, and then she used it to copy the file from Marc’s computer to mine.
After the conference, I read Marc Stiegler’s 2009 paper Rich Sharing for the Web, which details key features of sharing that we have in the real world and that are illustrated in the anecdote Alan so effectively rattled off.
These 6 features (enumerated below) enable people to create networks of access rights that implement the Principle of Least Authority (POLA). The key is to limit how much you need to trust someone before sharing. “Systems that do not implement these 6 features will feel rigid and inadequately functional once enough users are involved, forcing the users to seek alternate means to work around the limitations in those applications.”
1. Dynamic: I can grant access quickly and effortlessly (without involving an administrator).
2. Attenuated: To give you permission to do or see one thing, I don’t have to give you permission to do everything. (e.g. valet key allows driving, but not access to the trunk)
3. Chained: Authority may be delegated (and re-delegated).
4. Composable: I have permission to drive a car from the State of California, and Marc’s car key. I require both permissions together to drive the car.
5. Cross-jurisdiction: There are three families involved, each with its own policies, yet there’s no need to communicate policies to another jurisdiction. In the example, I didn’t need to ask Marc to change his policy to grant my neighbor permission to drive his car.
6. Accountable: If Marc finds a new scratch on his car, he knows to ask me to pay for the repair. It’s up to me to collect from my neighbor. Digital access control systems will typically record who did which action, but don’t record who asked an administrator to grant permission.
Note: Accountability is not always directly linked to delegation. Marc would likely hold me accountable if his car got scratched, even if my neighbor had damaged the car when parking it in the garage. Whereas, if it isn’t my garage, but rather a repair shop where my neighbor drops off the car for Marc, then if the repair shop damages the car, Marc would hold them responsible.
## How does this work for email?
The following examples from Marc’s paper were edited for brevity:
• Dynamic: You can send email to anyone any time.
• Attenuated: When I email you an attachment, I’m sending a read-only copy. You don’t have access to my whole hard drive and you don’t expect that modifying it will change my copy.
• Chained: I can forward you an email. You can then forward it to someone else.
• Cross-Domain: I can send email to people at other companies and organizations without needing permission from their IT dept.
• Composable: I can include an attachment from email originating at one company with text or another attachment from another email and send it to whoever I want.
• Accountable: If Alice asks Bob to edit a file and email it back, and Bob asks Carol to edit the file, and Bob then emails it back, Alice will hold Bob responsible if the edits are erroneous. If Carol (whom Alice may not know) emails her result directly to Alice, either Alice will ask Carol who she is before accepting the changes, or if Carol includes the history of messages in the message, Alice will directly see, once again, that she should hold Bob responsible.
Alan Karp’s IoT Position Paper compares several sharing tools across these 6 features and also discusses ZBAC (authoriZation-Based Access Control) where authorization is known as a “capability.” An object capability is an unforgeable token that both designates a resource and grants permission to access it.
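To make the capability idea a little more concrete, here is a minimal Python sketch (the Folder and read_only names are made up for illustration, not taken from Karp’s or Stiegler’s papers) of how simply holding a reference can be the permission, and how that permission can be attenuated before it is delegated onward:

import copy  # not needed for the sketch itself, shown only to keep it self-contained

class Folder:
    # Holding a reference to this object is the permission to use it.
    def __init__(self):
        self._files = {}

    def write(self, name, data):
        self._files[name] = data

    def read(self, name):
        return self._files[name]

def read_only(folder):
    # Attenuate: hand out an object that can only read, never write.
    class ReadOnlyView:
        def read(self, name):
            return folder.read(name)
    return ReadOnlyView()

shared = Folder()
shared.write("report.txt", "draft v1")

# Delegation is just passing the attenuated reference along (chaining);
# the recipient can read but cannot modify the original folder.
neighbor_view = read_only(shared)
print(neighbor_view.read("report.txt"))

The point of the sketch is only that the six features above fall out naturally when the token itself carries the authority, rather than a password that grants everything at once.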
## April 19, 2018
### Steve Kemp's Blog
#### A filesystem for known_hosts
The other day I had an idea that wouldn't go away, a filesystem that exported the contents of ~/.ssh/known_hosts.
I can't think of a single useful use for it, beyond simple shell-scripting, and yet I couldn't resist.
$ go get -u github.com/skx/knownfs
$ go install github.com/skx/knownfs
Now make it work:
$ mkdir ~/knownfs
$ knownfs ~/knownfs
Beneath our mount-point we can expect one directory for each known host. So we'll see entries:
~/knownfs $ ls | grep \.vpn
builder.vpn
deagol.vpn
master.vpn
www.vpn

~/knownfs $ ls | grep steve
blog.steve.fi
builder.steve.org.uk
git.steve.org.uk
mail.steve.org.uk
master.steve.org.uk
scatha.steve.fi
www.steve.fi
www.steve.org.uk
The host-specific entries will each contain a single file, fingerprint, with the fingerprint of the remote host:
~/knownfs $ cd www.steve.fi
~/knownfs/www.steve.fi $ ls
fingerprint
frodo ~/knownfs/www.steve.fi $ cat fingerprint
98:85:30:f9:f4:39:09:f7:06:e6:73:24:88:4a:2c:01

I've used it in a few shell-loops to run commands against hosts matching a pattern, but beyond that I'm struggling to think of a use for it. If you like the idea I guess have a play:

It was perhaps more useful and productive than my other recent work - which involves porting an existing network-testing program from Ruby to golang, and in the process making it much more uniform and self-consistent.

The resulting network tester is pretty good, and can now notify via MQ to provide better decoupling too. The downside is of course that nobody changes network-testing solutions on a whim, and so these things are basically always in-house only.

## April 11, 2018

### Steve Kemp's Blog

#### Bread and data

For the past two weeks I've mostly been baking bread. I'm not sure what made me decide to make some the first time, but it actually turned out pretty good, so I've been doing so every day or two ever since.

This is the first time I've made bread in the past 20 years or so - I recall in the past I got frustrated that it never rose, or didn't turn out well. I can't see that I'm doing anything differently, so I'll just write it off as younger-Steve being daft!

No doubt I'll get bored of the delicious bread in the future, but for the moment I've got a good routine going - juggling going to the shops, child-care, and making bread.

Bread I've made includes the following:

Beyond that I've spent a little while writing a simple utility to embed resources in golang projects, after discovering the tool I'd previously been using, go-bindata, had been abandoned.

In short you feed it a directory of files and it will generate a file static.go with contents like this:

files[ "data/index.html" ] = "<html>....
files[ "data/robots.txt" ] = "User-Agent: * ..."

It's a bit more complex than that, but not much. As expected getting the embedded data at runtime is trivial, and it allows you to distribute a single binary even if you want/need some configuration files, templates, or media to run.

For example in the project I discussed in my previous post there is a HTTP-server which serves a user-interface based upon bootstrap. I want the HTML-files which make up that user-interface to be embedded in the binary, rather than distributing them separately.

Anyway it's not unique, it was a fun experience writing, and I've switched to using it now:

## April 08, 2018

### Electricmonk.nl

#### Multi-git-status now shows branches with no upstream

Just a quick update on Multi-git-status. It now also shows branches with no upstream. These are typically branches created locally that haven't been configured to track a local or remote branch. Any changes in those branches are lost when the repo is removed from your machine.

Additionally, multi-git-status now handles branches with slashes in them properly. For example, "feature/loginscreen". Here's how the output looks now:

You can get multi-git-status from the Github page.

## April 07, 2018

### Sarah Allen

#### zero-knowledge proof: trust without shared secrets

In cryptography we typically share a secret which allows us to decrypt future messages. Commonly this is a password that I make up and submit to a Web site, then later produce to verify I am the same person.
I missed Kazue Sako’s Zero Knowledge Proofs 101 presentation at IIW last week, but Rachel Myers shared an impressively simple retelling in the car on the way back to San Francisco, which inspired me to read the notes and review the proof for myself. I’ve attempted to reproduce this simple explanation below, also noting additional sources and related articles.

Zero Knowledge Proofs (ZKPs) are very useful when applied to internet identity — with an interactive exchange you can prove you know a secret without actually revealing the secret.

Understanding Zero Knowledge Proofs with simple math:

### x -> f(x)

Simple one way function. Easy to go one way from x to f(x) but mathematically hard to go from f(x) to x.

The most common example is a hash function. Wired: What is Password Hashing? provides an accessible introduction to why hash functions are important to cryptographic applications today.

### f(x) = g ^ x mod p

Known (public): g, p

* g is a constant
* p has to be prime

Easy to know x and compute g ^ x mod p but difficult to do in reverse.

### Interactive Proof

Alice wants to prove to Bob that she knows x without giving any information about x. Bob already knows f(x). Alice can make f(x) public and then prove that she knows x through an interactive exchange with anyone on the Internet, in this case, Bob.

1. Alice publishes f(x): g^x mod p
2. Alice picks random number r
3. Alice sends Bob u = g^r mod p
4. Now Bob has an artifact based on that random number, but can’t actually calculate the random number
5. Bob returns a challenge e. Either 0 or 1
6. Alice responds with v: if e == 0, v = r; if e == 1, v = r + x
7. Bob can now calculate: if e == 0, Bob has the random number r, as well as the publicly known variables, and can check if u == g^v mod p; if e == 1, he checks that u*f(x) = g^v (mod p)

I believe step 6 is true based on Congruence of Powers, though I’m not sure that I’ve transcribed the e == 1 case accurately with my limited ascii representation.

If r is truly random, equally distributed between zero and (p-1), this does not leak any information about x, which is pretty neat, yet not sufficient. In order to ensure that Alice cannot be impersonated, multiple iterations are required along with the use of large numbers (see IIW session notes).

## Further Reading

## April 05, 2018

### Marios Zindilis

#### A small web application with Angular5 and Django

Django works well as the back-end of an application that uses Angular5 in the front-end. In my attempt to learn Angular5 well enough to build a small proof-of-concept application, I couldn't find a simple working example of a combination of the two frameworks, so I created one. I called this the Pizza Maker. It's available on GitHub, and its documentation is in the README.

If you have any feedback for this, please open an issue on GitHub.

## April 03, 2018

### R.I.Pienaar

#### Adding rich object data types to Puppet

Extending Puppet using types, providers, facts and functions is well known and widely done. Something new is how to add entire new data types to the Puppet DSL to create entirely new language behaviours.

I’ve done a bunch of this recently with the Choria Playbooks and some other fun experiments; today I’ll walk through building a small network wide spec system using the Puppet DSL.

## Overview

A quick look at what we want to achieve here: I want to be able to do Choria RPC requests and assert their outcomes, I want to write tests using the Puppet DSL, and they should run on a specially prepared environment.
In my case I have an AWS environment with CentOS, Ubuntu, Debian and Archlinux machines.

Below I test the File Manager Agent:

• Get status for a known file and make sure it finds the file
• Create a brand new file, ensure it reports success
• Verify that the file exists and is empty using the status action

cspec::suite("filemgr agent tests", $fail_fast, $report) |$suite| {
  # Checks an existing file
  $suite.it("Should get file details") |$t| {
    $results = choria::task("mcollective", _catch_errors => true,
      "action" => "filemgr.status",
      "nodes" => $nodes,
      "silent" => true,
      "fact_filter" => ["kernel=Linux"],
      "properties" => {
        "file" => "/etc/hosts"
      }
    )

    $t.assert_task_success($results)

    $results.each |$result| {
      $t.assert_task_data_equals($result, $result["data"]["present"], 1)
    }
  }

  # Make a new file and check it exists
  $suite.it("Should support touch") |$t| {
    $fname = sprintf("/tmp/filemgr.%s", strftime(Timestamp(), "%s"))

    $r1 = choria::task("mcollective", _catch_errors => true,
      "action" => "filemgr.touch",
      "nodes" => $nodes,
      "silent" => true,
      "fact_filter" => ["kernel=Linux"],
      "fail_ok" => true,
      "properties" => {
        "file" => $fname
      }
    )

    $t.assert_task_success($r1)

    $r2 = choria::task("mcollective", _catch_errors => true,
      "action" => "filemgr.status",
      "nodes" => $nodes,
      "silent" => true,
      "fact_filter" => ["kernel=Linux"],
      "properties" => {
        "file" => $fname
      }
    )

    $t.assert_task_success($r2)

    $r2.each |$result| {
      $t.assert_task_data_equals($result, $result["data"]["present"], 1)
      $t.assert_task_data_equals($result, $result["data"]["size"], 0)
    }
  }
}
I also want to be able to test other things, like, let's say, discovery:
cspec::suite("${method} discovery method",$fail_fast, $report) |$suite| {
$suite.it("Should support a basic discovery") |$t| {
$found = choria::discover( "discovery_method" =>$method,
)
$t.assert_equal($found.sort, $all_nodes.sort) } } So we want to make a Spec like system that can drive Puppet Plans (aka Choria Playbooks) and do various assertions on the outcome. We want to run it with mco playbook run and it should write a JSON report to disk with all suites, cases and assertions. ## Adding a new Data Type to Puppet I’ll show how to add the Cspec::Suite data Type to Puppet. This comes in 2 parts: You have to describe the Type that is exposed to Puppet and you have to provide a Ruby implementation of the Type. ### Describing the Objects Here we create the signature for Cspec::Suite: # modules/cspec/lib/puppet/datatypes/cspec/suite.rb Puppet::DataTypes.create_type("Cspec::Suite") do interface <<-PUPPET attributes => { "description" => String, "fail_fast" => Boolean, "report" => String }, functions => { it => Callable[[String, Callable[Cspec::Case]], Any], } PUPPET load_file "puppet_x/cspec/suite" implementation_class PuppetX::Cspec::Suite end As you can see from the line of code cspec::suite(“filemgr agent tests”,$fail_fast, $report) |$suite| {….} we pass 3 arguments: a description of the test, if the test should fail immediately on any error or keep going and there to write the report of the suite to. This corresponds to the attributes here. A function that will be shown later takes these and make our instance.
We then have to add our it() function, which again takes a description and yields a Cspec::Case; it returns any value.
When Puppet needs the implementation of this code it will call the Ruby class PuppetX::Cspec::Suite.
Here is the same for the Cspec::Case:
# modules/cspec/lib/puppet/datatypes/cspec/case.rb
Puppet::DataTypes.create_type("Cspec::Case") do
interface <<-PUPPET
attributes => {
"description" => String,
"suite" => Cspec::Suite
},
functions => {
assert_equal => Callable[[Any, Any], Boolean],
}
PUPPET
implementation_class PuppetX::Cspec::Case
end
The implementation is a Ruby class that provides the logic we want. I won’t show the entire thing with reporting and everything, but you’ll get the basic idea:
# modules/cspec/lib/puppet_x/cspec/suite.rb
module PuppetX
class Cspec
class Suite
# Puppet calls this method when it needs an instance of this type
def self.from_asserted_hash(description, fail_fast, report)
new(description, fail_fast, report)
end
def initialize(description, fail_fast, report)
@description = description
@fail_fast = !!fail_fast
@report = report
@testcases = []
end
# what puppet file and line the Puppet DSL is on
def puppet_file_line
fl = Puppet::Pops::PuppetStack.stacktrace[0]
[fl[0], fl[1]]
end
def outcome
{
"testsuite" => @description,
"testcases" => @testcases,
"file" => puppet_file_line[0],
"line" => puppet_file_line[1],
"success" => @testcases.all?{|t| t["success"]}
}
end
# Writes the memory state to disk, see outcome above
def write_report
# ...
end
def run_suite
Puppet.notice(">>>")
Puppet.notice(">>> Starting test suite: %s" % [@description])
Puppet.notice(">>>")
begin
yield(self)
ensure
write_report
end
Puppet.notice(">>>")
Puppet.notice(">>> Completed test suite: %s" % [@description])
Puppet.notice(">>>")
end
def it(description, &blk)
require_relative "case"
t = PuppetX::Cspec::Case.new(self, description)
t.run(&blk)
ensure
@testcases << t.outcome
end
end
end
end
And here is the Cspec::Case:
# modules/cspec/lib/puppet_x/cspec/case.rb
module PuppetX
class Cspec
class Case
# Puppet calls this to make instances
def self.from_asserted_hash(suite, description)
new(suite, description)
end
def initialize(suite, description)
@suite = suite
@description = description
@assertions = []
@start_location = puppet_file_line
end
# assert 2 things are equal and show sender etc in the output
def assert_task_data_equals(result, left, right)
if left == right
return true
end
failure("assert_task_data_equals: %s" % result.host, "%s\n\n\tis not equal to\n\n %s" % [left, right])
end
# checks the outcome of a choria RPC request and make sure its fine
def assert_task_success(results)
if results.error_set.empty?
success("assert_task_success:", "%d OK results" % results.count)
return true
end
end
# assert 2 things are equal
def assert_equal(left, right)
if left == right
success("assert_equal", "values matches")
return true
end
failure("assert_equal", "%s\n\n\tis not equal to\n\n %s" % [left, right])
end
# the puppet .pp file and line Puppet is on
def puppet_file_line
fl = Puppet::Pops::PuppetStack.stacktrace[0]
[fl[0], fl[1]]
end
# show a OK message, store the assertions that ran
def success(what, message)
@assertions << {
"success" => true,
"kind" => what,
"file" => puppet_file_line[0],
"line" => puppet_file_line[1],
"message" => message
}
Puppet.notice("✔︎ %s: %s" % [what, message])
end
# show a Error message, store the assertions that ran
def failure(what, message)
@assertions << {
"success" => false,
"kind" => what,
"file" => puppet_file_line[0],
"line" => puppet_file_line[1],
"message" => message
}
Puppet.err("✘ %s: %s" % [what, @description])
Puppet.err(message)
raise(Puppet::Error, "Test case %s fast failed: %s" % [@description, what]) if @suite.fail_fast
end
# this will show up in the report JSON
def outcome
{
"testcase" => @description,
"assertions" => @assertions,
"success" => @assertions.all? {|a| a["success"]},
"file" => @start_location[0],
"line" => @start_location[1]
}
end
# invokes the test case
def run
Puppet.notice("==== Test case: %s" % [@description])
# runs the puppet block
yield(self)
success("testcase", @description)
end
end
end
end
Finally I am going to need a little function to create the suite – cspec::suite function, it really just creates an instance of PuppetX::Cspec::Suite for us.
# modules/cspec/lib/puppet/functions/cspec/suite.rb
Puppet::Functions.create_function(:"cspec::suite") do
dispatch :handler do
param "String", :description
param "Boolean", :fail_fast
param "String", :report
block_param
return_type "Cspec::Suite"
end
def handler(description, fail_fast, report, &blk)
suite = PuppetX::Cspec::Suite.new(description, fail_fast, report)
suite.run_suite(&blk)
suite
end
end
## Bringing it together
So that’s about it. It’s very simple really; the code above is pretty basic stuff to achieve all of this, and I hacked it together in a day basically.
Lets see how we turn these building blocks into a test suite.
I need an entry point that drives the suite – imagine I will have many different plans to run, one per agent, and that I want to do some pre and post run tasks etc.
plan cspec::suite (
  Boolean $fail_fast = false,
  Boolean $pre_post = true,
  Stdlib::Absolutepath $report,
  String $data
) {
  $ds = {
    "type"   => "file",
    "file"   => $data,
    "format" => "yaml"
  }

  # initializes the report
  cspec::clear_report($report)

  # force a puppet run everywhere so PuppetDB is up to date, disables Puppet, wait for them to finish
  if $pre_post {
    choria::run_playbook("cspec::pre_flight", ds => $ds)
  }

  # Run our test suite
  choria::run_playbook("cspec::run_suites", _catch_errors => true,
    ds => $ds,
    fail_fast => $fail_fast,
    report => $report
  )
  .choria::on_error |$err| {
    err("Test suite failed with a critical error: ${err.message}")
  }

  # enables Puppet
  if $pre_post {
    choria::run_playbook("cspec::post_flight", ds => $ds)
  }

  # reads the report from disk and creates a basic overview structure
  cspec::summarize_report($report)
}

Here’s the cspec::run_suites Playbook that takes data from a Choria data source and drives the suite dynamically:

plan cspec::run_suites (
  Hash $ds,
  Boolean $fail_fast = false,
  Stdlib::Absolutepath $report,
) {
  $suites = choria::data("suites", $ds)

  notice(sprintf("Running test suites: %s", $suites.join(", ")))

  choria::data("suites", $ds).each |$suite| {
    choria::run_playbook($suite,
      ds => $ds,
      fail_fast => $fail_fast,
      report => $report
    )
  }
}

And finally a YAML file defining the suite. This file describes my AWS environment that I use to do integration tests for Choria; you can see there's a bunch of other tests here in the suites list, and some of them take data like what nodes to expect etc.

suites:
  - cspec::discovery
  - cspec::choria
  - cspec::agents::shell
  - cspec::agents::process
  - cspec::agents::filemgr
  - cspec::agents::nettest

choria.version: mcollective plugin 0.7.0

nettest.fqdn: puppet.choria.example.net
nettest.port: 8140

discovery.all_nodes:
  - archlinux1.choria.example.net
  - centos7.choria.example.net
  - debian9.choria.example.net
  - puppet.choria.example.net
  - ubuntu16.choria.example.net

discovery.mcollective_nodes:
  - archlinux1.choria.example.net
  - centos7.choria.example.net
  - debian9.choria.example.net
  - puppet.choria.example.net
  - ubuntu16.choria.example.net

discovery.filtered_nodes:
  - centos7.choria.example.net
  - puppet.choria.example.net

discovery.fact_filter: operatingsystem=CentOS

## Conclusion

So this then is a rather quick walk-through of extending Puppet in ways many of us would not have seen before. I spent about a day getting this all working, which included figuring out a way to maintain the mutating report state internally etc.; the outcome is a test suite I can run and it will thoroughly drive a working 5 node network and assert the outcomes against real machines running real software.

I used to have a MCollective integration test suite, but I think this is a LOT nicer, mainly due to the Choria Playbooks and extensibility of modern Puppet.

$ mco playbook run cspec::suite --data `pwd`/suite.yaml --report `pwd`/report.json
The current code for this is on GitHub along with some Terraform code to stand up a test environment, it’s a bit barren right now but I’ll add details in the next few weeks.
### HolisticInfoSec.org
#### toolsmith #132 - The HELK vs APTSimulator - Part 2
Continuing where we left off in The HELK vs APTSimulator - Part 1, I will focus our attention on additional, useful HELK features to aid you in your threat hunting practice. HELK offers Apache Spark, GraphFrames, and Jupyter Notebooks as part of its lab offering. These capabilities scale well beyond a standard ELK stack; this is really where parallel computing and significantly improved processing and analytics truly take hold. This is a great way to introduce yourself to these technologies, all on a unified platform.
Let me break these down for you a little bit in case you haven't been exposed to these technologies yet. First and foremost, refer to @Cyb3rWard0g's wiki page on how he's designed it for his HELK implementation, as seen in Figure 1.
Figure 1: HELK Architecture
First, Apache Spark. For HELK, "Elasticsearch-hadoop provides native integration between Elasticsearch and Apache Spark, in the form of an RDD (Resilient Distributed Dataset) (or Pair RDD to be precise) that can read data from Elasticsearch." Per the Apache Spark FAQ, "Spark is a fast and general processing engine compatible with Hadoop data" to deliver "lightning-fast cluster computing."
Second, GraphFrames. From the GraphFrames overview, "GraphFrames is a package for Apache Spark which provides DataFrame-based Graphs. GraphFrames represent graphs: vertices (e.g., users) and edges (e.g., relationships between users). GraphFrames also provide powerful tools for running queries and standard graph algorithms. With GraphFrames, you can easily search for patterns within graphs, find important vertices, and more."
Finally, Jupyter Notebooks to pull it all together.
From Jupyter.org: "The Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and narrative text. Uses include: data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more." Jupyter Notebooks provide a higher order of analyst/analytics capabilities; if you haven't dipped your toe in that water, this may be your first, best opportunity.
Let's take a look at using Jupyter Notebooks with the data populated to my Docker-based HELK instance as implemented in Part 1. I repopulated my HELK instance with new data from a different, bare metal Windows instance reporting to HELK with Winlogbeat, Sysmon enabled, and looking mighty compromised thanks to @cyb3rops's APTSimulator.
To make use of Jupyter Notebooks, you need your JUPYTER CURRENT TOKEN to access the Jupyter Notebook web interface. It was presented to you when your HELK installation completed, but you can easily retrieve it via sudo docker logs helk-analytics, then copy and paste the URL into your browser to connect for the first time with a token. It will look like this,
http://localhost:8880/?token=3f46301da4cd20011391327647000e8006ee3574cab0b163, as described in the Installation wiki. After browsing to the URL with said token, you can begin at http://localhost:8880/lab, where you should immediately proceed to the Check_Spark_Graphframes_Integrations.ipynb notebook. It's found in the hierarchy menu under training > jupyter_notebooks > getting_started. This notebook is essential to confirming you're ingesting data properly with HELK and that its integrations are fully functioning. Step through it one cell at a time with the play button, allowing each task to complete so as to avoid errors. Remember the above mentioned Resilient Distributed Dataset? This notebook will create a Spark RDD on top of Elasticsearch using the logs-endpoint-winevent-sysmon-* (Sysmon logs) index as source, and do the same thing with the logs-endpoint-winevent-security-* (Window Security Event logs) index as source, as seen in Figure 2.
Figure 2: Windows Security EVT Spark RDD
The notebook will also query your Windows security events via Spark SQL, then print the schema with:
df.printSchema()
The result should resemble Figure 3.
Figure 3: Schema
Assuming all matches with relative consistency in your experiment, let's move on to the Sysmon_ProcessCreate_Graph.ipynb notebook, found in training > jupyter_notebooks. This notebook will again call on the Elasticsearch Sysmon index and create vertices and edges dataframes, then create a graph produced with GraphFrame built from those same vertices and edges. Here's a little walk-through.
The v parameter (yes, for vertices) is populated with:
v = df.withColumn("id", df.process_guid).select("id","user_name","host_name","process_parent_name","process_name","action")
v = v.filter(v.action == "processcreate")
Showing the top three rows of that result set, with v.show(3,truncate=False), appears as Figure 4 in the notebook, with the data from my APTSimulator "victim" system, N2KND-PC.
Figure 4: WTF, Florian :-)
The epic, uber threat hunter in me believes that APTSimulator created nslookup, 7z, and regedit as processes via cmd.exe. Genius, right? :-)
The e parameter (yes, for edges) is populated with:
e = df.filter(df.action == "processcreate").selectExpr("process_parent_guid as src","process_guid as dst").withColumn("relationship", lit("spawned"))
Showing the top three rows of that result set, with e.show(3,truncate=False), produces the source and destination process IDs as it pertains to the spawning relationship.
Now we create a graph from the vertices and edges dataframes defined in the v & e parameters with g = GraphFrame(v, e). Let's bring it home with a hunt for Process A spawning Process B AND Process B spawning Process C; the code needed, and the result, are seen from the notebook in Figure 5.
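The notebook cell itself isn't reproduced in this text, so here is a minimal sketch of that kind of two-hop motif query with GraphFrames (assuming the v and e dataframes built above; the column names follow the vertices schema shown earlier):

from graphframes import GraphFrame

# Build the graph from the vertices (v) and edges (e) dataframes above
g = GraphFrame(v, e)

# Hunt for chains where process a spawned b, and b in turn spawned c
motifs = g.find("(a)-[e1]->(b); (b)-[e2]->(c)")
motifs.select("a.process_name", "b.process_name", "c.process_name").show(10, truncate=False)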
Figure 5: APTSimulator's happy spawn
Oh, yes, APTSimulator fully realized in a nice graph. Great example seen in cmd.exe spawning wscript.exe, which then spawns rundll32.exe. Or cmd.exe spawning powershell.exe and schtasks.exe.
Need confirmation? Florian's CactusTorch JS dropper is detailed in Figure 6, specifically cmd.exe > wscript.exe > rundll32.exe.
Figure 6: APTSimulator source for CactusTorch
I certainly hope that the HELK's graph results matching nicely with APTSimulator source meets with your satisfaction.
The HELK vs APTSimulator ends with a glorious flourish; these two monsters in their field belong in every lab to practice red versus blue, attack and defend, compromise and detect. I haven't been this happy to be a practitioner in the defense against the dark arts in quite a while. My sincere thanks to Roberto and Florian for their great work on the HELK and APTSimulator. I can't suggest strongly enough how much you'll benefit from taking the time to run through Parts 1 and 2 of The HELK vs APTSimulator for yourself. Both tools are well documented on their respective Githubs, go now, get started, profit.
Cheers...until next time.
## April 01, 2018
### That grumpy BSD guy
#### ed(1) mastery is a must for a real Unix person
ed(1) is the standard editor. Now there's a book out to help you master this fundamental Unix tool.
In some circles on the Internet, your choice of text editor is a serious matter.
We've all seen the threads on mailing lists, USENET news groups and web forums about the relative merits of Emacs vs vi, including endless iterations of flame wars, and sometimes even involving lesser known or non-portable editing environments.
And then of course, from the Linux newbies we have seen an endless stream of tweeted graphical 'memes' about the editor vim (aka 'vi Improved') versus the various apparently friendlier-to-some options such as GNU nano. Apparently even the 'improved' version of the classical and ubiquitous vi(1) editor is a challenge to exit for a significant subset of the younger generation.
Yes, your choice of text editor or editing environment is a serious matter. Mainly because text processing is so fundamental to our interactions with computers.
But for those of us who keep our systems on a real Unix (such as OpenBSD or FreeBSD), there is no real contest. The OpenBSD base system contains several text editors including vi(1) and the almost-emacs mg(1), but ed(1) remains the standard editor.
Now Michael Lucas has written a book to guide the as yet uninitiated to the fundamentals of the original Unix text editor. It is worth keeping in mind that much of Unix and its original standard text editor were written back when the standard output and default user interface was more likely than not a printing terminal.
To some of us, reading and following the narrative of Ed Mastery is a trip down memory lane. To others, following along the text will illustrate the horror of the world of pre-graphic computer interfaces. For others again, the fact that ed(1) doesn't use your terminal settings much at all offers hope of fixing things when something or somebody screwed up your system so you don't have a working terminal for that visual editor.
ed(1) is a line editor. And while you may have heard mutters that 'vi is just a line editor in drag', vi(1) does offer a distinctly visual interface that only became possible with the advent of the video terminal, affectionately known as the glass teletype. ed(1) offers no such luxury, but as the book demonstrates, even ed(1) is able to display any part of a file's content for when you are unsure what your file looks like.
The book Ed Mastery starts by walking the reader through a series of editing sessions using the classical ed(1) line editing interface. To some readers the thought of editing text while not actually seeing at least a few lines at a time onscreen probably sounds scary. This book shows how it is done, and while the author never explicitly mentions it, the text aptly demonstrates how the ed(1) command set is in fact the precursor of how things are done in many Unix text processing programs.
As one might expect, the walkthrough of ed(1) text editing functionality is followed up by a sequence on searching and replacing which ultimately leads to a very readable introduction to regular expressions, which of course are part of the ed(1) package too. If you know your ed(1) command set, you are quite far along in the direction of mastering the stream editor sed(1), as well as a number of other systems where regular expressions play a crucial role.
After the basic editing functionality and some minor text processing magic has been dealt with, the book then proceeds to demonstrate ed(1) as a valuable tool in your Unix scripting environment. And once again, if you can do something with ed, you can probably transfer that knowledge pretty much intact to use with other Unix tools.
The eighty-some text pages of Ed Mastery are a source of solid information on the ed(1) tool itself with a good helping of historical context that will make it clearer to newcomers why certain design choices were made back when the Unix world was new. A number of these choices influence how we interact with the modern descendants of the Unix systems we had back then.
Your choice of text editor is a serious matter. With this book, you get a better foundation for choosing the proper tool for your text editing and text processing needs. I'm not saying that you have to switch to the standard editor, but after reading Ed Mastery, your choice of text editing and processing tools will be a much better informed one.
Ed Mastery is available now directly from Michael W. Lucas' books site at https://www.michaelwlucas.com/tools/ed, and will most likely appear in other booksellers' catalogs as soon as their systems are able to digest the new data.
Do read the book, try out the standard editor and have fun!
### Colin Percival
#### Tarsnap pricing change
I launched the current Tarsnap website in 2009, and while we've made some minor adjustments to it over the years — e.g., adding a page of testimonials, adding much more documentation, and adding a page with .deb binary packages — the changes have overall been relatively modest. One thing people criticized the design for in 2009 was the fact that prices were quoted in picodollars; this is something I have insisted on retaining for the past eight years.
One of the harshest critics of Tarsnap's flat rate picodollars-per-byte pricing model is Patrick McKenzie — known to much of the Internet as "patio11" — who despite our frequent debates can take credit for ten times more new Tarsnap customers than anyone else, thanks to a single ten thousand word blog post about Tarsnap. The topic of picodollars has become something of an ongoing debate between us, with Patrick insisting that they communicate a fundamental lack of seriousness and sabotage Tarsnap's success as a business, and me insisting that they communicate exactly what I want to communicate, and attract precisely the customer base I want to have. In spite of our disagreements, however, I really do value Patrick's input; indeed, the changes I mentioned above came about in large part due to the advice I received from him, and for a long time I've been considering following more of Patrick's advice.
A few weeks ago, I gave a talk at the AsiaBSDCon conference about profiling the FreeBSD kernel boot. (I'll be repeating the talk at BSDCan if anyone is interested in seeing it in person.) Since this was my first time in Tokyo (indeed, my first time anywhere in Asia) and despite communicating with him frequently I had never met Patrick in person, I thought it was only appropriate to meet him for dinner; fortunately the scheduling worked out and there was an evening when he was free and I wasn't suffering too much from jetlag. After dinner, Patrick told me about a cron job he runs:
I knew then that the time was coming to make a change Patrick has long awaited: Getting rid of picodollars. It took a few weeks before the right moment arrived, but I'm proud to announce that as of today, April 1st 2018, Tarsnap's storage pricing is 8333333 attodollars per byte-day.
This addresses a long-standing concern I've had about Tarsnap's pricing: Tarsnap bills customers for usage on a daily basis, but since 250 picodollars is not a multiple of 30, usage bills have been rounded. Tarsnap's accounting code works with attodollars internally (Why attodollars? Because it's easy to have 18 decimal places of precision using fixed-point arithmetic with a 64-bit "attodollars" part.) and so during 30-day months I have in fact been rounding down and billing customers at a rate of 8333333 attodollars per byte-day for years — so making this change on the Tarsnap website brings it in line with the reality of the billing system.
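The arithmetic behind that number is straightforward; a quick sketch, assuming a 30-day month:

# 1 picodollar = 10**6 attodollars (10**-12 vs 10**-18 dollars)
attodollars_per_byte_month = 250 * 10**6              # the old advertised rate
attodollars_per_byte_day = attodollars_per_byte_month // 30   # rounded down

print(attodollars_per_byte_day)   # 8333333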
Of course, there are other advantages to advertising Tarsnap's pricing in attodollars. Everything which was communicated by pricing storage in picodollars per byte-month is communicated even more effectively by advertising prices in attodollars per byte-day, and I have no doubt that Tarsnap users will appreciate the increased precision.
## March 27, 2018
#### Sequence definitions with kwalify
After guess-trying a lot on how to define a simple sequence in kwalify (which I do use as a JSON/YAML schema validator) I want to share this solution for a YAML schema.
So my use case is whitelisting certain keys and somehow ensuring their types. Using this I want to use kwalify to validate YAML files. Doing this for scalars is simple, but hashes and lists of scalar elements are not. Most problematic were the lists...
### Defining Arbitrary Scalar Sequences
So how to define a list in kwalify? The user guide gives this example:
---
list:
  type: seq
  sequence:
    - type: str
This gives us a list of strings. But many lists also contain numbers, and some contain structured data. For my use case I want to exclude structured data AND allow numbers. So "type: any" cannot be used. Also "type: any" wouldn't work because it would require defining the mapping for any, which, in a validation use case where we just want to ensure the type of the list, we cannot know. The great thing is there is a type "text" which you can use to allow a list of strings or numbers or both, like this:
---
list:
  type: seq
  sequence:
    - type: text
### Building a key name + type validation schema
As already mentioned the need for this is to have a whitelisting schema with simple type validation. Below you see an example for such a schema:
---
type: map
mapping:
  "default_definition": &allow_hash
    type: map
    mapping:
      =:
        type: any

  "default_list_definition": &allow_list
    type: seq
    sequence:
      # Type text means string or number
      - type: text

  "key1": *allow_hash
  "key2": *allow_list
  "key3":
    type: str

  =:
    type: number
    range: { max: 29384855, min: 29384855 }
At the top there are two dummy keys "default_definition" and "default_list_definition" which we use to define two YAML references "allow_hash" and "allow_list" for generic hashes and scalar only lists.
In the middle of the schema you see three keys which are whitelisted and using the references are typed as hash/list and also as a string.
Finally for this to be a whitelist we need to refuse all other keys. Note that '=' as a key name stands for a default definition. Now we want to say: default is "not allowed". Sadly kwalify has no mechanism for this that allows expressing something like
---
=:
  type: invalid
Therefore we resort to an absurd type definition (that we hopefully never use), for example a number that has to be exactly 29384855. All other keys not listed in the whitelist above will hopefully fail to be this number and cause kwalify to throw an error.
This is how the kwalify YAML whitelist works.
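For intuition, here is a rough Python illustration of what the whitelist schema above enforces; this is a hand-rolled sketch (the ALLOWED table, the validate() helper and the config.yaml filename are made up for illustration), not kwalify itself:

import yaml

# whitelisted keys and the type each one must have
ALLOWED = {
    "key1": dict,   # *allow_hash
    "key2": list,   # *allow_list (scalars only)
    "key3": str,
}

def validate(path):
    data = yaml.safe_load(open(path))
    for key, value in data.items():
        if key not in ALLOWED:
            raise ValueError("key %r is not whitelisted" % key)
        if not isinstance(value, ALLOWED[key]):
            raise ValueError("key %r has wrong type %s" % (key, type(value).__name__))
        # lists may contain strings or numbers, but no nested structures
        if isinstance(value, list):
            for item in value:
                if not isinstance(item, (str, int, float)):
                    raise ValueError("list %r contains structured data" % key)

validate("config.yaml")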
#### PyPI does brownouts for legacy TLS
Nice! Reading through the maintenance notices on my status page aggregator I learned that PyPI started intentionally blocking legacy TLS clients as a way of getting people to switch before TLS 1.0/1.1 support is gone for real.
Here is a quote from their status page:
In preparation for our CDN provider deprecating TLSv1.0 and TLSv1.1 protocols, we have begun rolling brownouts for these protocols for the first ten (10) minutes of each hour.
During that window, clients accessing pypi.python.org with clients that do not support TLSv1.2 will receive an HTTP 403 with the error message "This is a brown out of TLSv1 support. TLSv1 support is going away soon, upgrade to a TLSv1.2+ capable client.".
I like this action as a good balance of hurting just as much as needed to get end users to stop putting off updates.
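If you want to check which protocol version your own client stack negotiates, a small sketch using only the Python standard library will print it (pypi.python.org is the host the notice refers to):

import socket
import ssl

# Connect and report the negotiated TLS protocol version
ctx = ssl.create_default_context()
with socket.create_connection(("pypi.python.org", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="pypi.python.org") as tls:
        print(tls.version())   # e.g. 'TLSv1.2'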
## March 26, 2018
### Sean's IT Blog
#### The Virtual Horizon Podcast Episode 2 – A Conversation with Angelo Luciani
On this episode of The Virtual Horizon podcast, we’ll journey to the French Riviera for the 2017 Nutanix .Next EU conference. We’ll be joined by Angelo Luciani, Community Evangelist for Nutanix, to discuss blogging and the Virtual Design Master competition.
Nutanix has two large conferences scheduled for 2018 – .Next in New Orleans in May 2018 and .Next EU in London at the end of November 2018.
Show Credits:
## March 18, 2018
### Electricmonk.nl
#### Restic (backup) deleting old backups is extremely slow
Here's a very quick note:
I've been using the Restic backup tool with the SFTP backend for a while now, and so far it was great. Until I tried to prune some old backups. It takes two hours to prune 1 GiB of data from a 15 GiB backup. During that time, you cannot create new backups. It also consumes a huge amount of bandwidth when deleting old backups. I strongly suspect it downloads each blob from the remote storage backend, repacks it and then writes it back.
I've seen people on the internet with a few hundred GiB worth of backups having to wait 7 days to delete their old backups. Since the repo is locked during that time, you cannot create new backups.
This makes Restic completely unusable as far as I'm concerned. Which is a shame, because other than that, it's an incredible tool.
## March 17, 2018
#### Puppet Agent Settings Issue
Experienced a strange puppet agent 4.8 configuration issue this week. To properly distribute the agent runs over time to even out puppet master load I wanted to configure the splay settings properly. There are two settings:
• A boolean "splay" to enable/disable splaying
• A range limiter "splayLimit" to control the randomization
What first confused me was that "splay" was not on by default. Of course when using the open source version it makes sense to have it off. Having it on by default sounds more like an enterprise feature :-)
No matter the default, after deploying an agent config with settings like this
[agent]
runInterval = 3600
splay = true
splayLimit = 3600
... nothing happened. Runs were still not randomized. Checking the active configuration with
# puppet config print | grep splay
splay=false
splayLimit=1800
turned out that my config settings were not working at all. What was utterly confusing is that even the runInterval was reported as 1800 (which is the default value). But while the splay just did not work, the effective runInterval was 3600!
After hours of debugging it, I happened to read the puppet documentation section that covers the config sections like [agent] and [main]. It says that [main] configures global settings and other sections can override the settings in [main], which makes sense.
But it just doesn't work this way. In the end the solution was using [main] as config section instead of [agent]:
[main]
runInterval=3600
splay=true
splayLimit=3600
and with this config "puppet config print" finally reported the settings as effective and the runtime behaviour had the expected randomization.
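For intuition, what splay buys you is roughly this; a sketch of the behaviour, not Puppet's actual implementation:

import random

run_interval = 3600   # seconds between scheduled agent runs
splay_limit = 3600    # maximum random delay added to a run

# Rough model: every run_interval seconds a run is due, and splay shifts it
# by a random offset so all agents don't hit the master at the same moment.
def next_run(now):
    return now + run_interval + random.uniform(0, splay_limit)

print(next_run(0))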
Maybe I misread something somewhere, but this is really hard to debug. And INI files are not really helpful in Unix. Overriding works better with default files and with drop dirs.
## March 14, 2018
#### Target your damned survey report
StackOverflow has released their 2018 Developer Hiring Landscape report. (alternate source)
This is the report that reportedly is about describing the demographics and preferences of software creators, which will enable people looking to hire such creators to better tailor their offerings.
It's an advertising manual, basically. However, they dropped the ball in a few areas, one of which has been getting a lot of traction on Twitter.
It's getting traction for a good reason, and it has to do with how these sorts of reports are written. The section under discussion here is "Differences in assessing jobs by gender". They have five cross-tabs here:
1. All respondents highest-ranked.
2. All respondents lowest-ranked (what the above references).
3. All men highest-ranked.
4. All women highest-ranked.
5. All non-binary highest-ranked (they have this. This is awesome).
I took this survey, and it was one of those classic questions where you rank a list of items from 1 to 10.
And yet, this report seems to ignore everything but the 1's and 10's. This is misguided, and leaves a lot of very valuable market-segment targeting information on the floor. Since 92% of respondents were men, the first and third tabs were almost identical, differing only by tenths of a percent. The second tab is likewise a proxy tab for "what men don't want". We don't know how women or non-binary respondents differ in their least-liked preferences.
There is some very good data they could have presented, but chose not to. First of all, the number one, two and three priorities are the ones that people are most conscious of and may be willing to compromise one to get the other two. This should have been presented.
1. All respondents top-3 ranked.
2. All men top-3 ranked.
3. All women top-3 ranked.
4. All non-binary top-3 ranked.
Compensation/Benefits would probably be close to 100%, but we would get interesting differences in the number two and three places on that chart. This gives recruiters the information they need to construct their pitches. Top-rank is fine, but you also want to know the close-enoughs. Sometimes, if you don't hit the top spot, you can win someone by hitting everything else.
I have the same complaint for their "What Developers Value in Compensation and Benefits" cross-tab. Salary/Bonus is the top item for nearly everyone. This is kind of a gimmie. The number 2 and 3 places are very important because they're the tie-breaker. If an applicant is looking at a job that hits their pay rank, but misses on the next two most important priorities, they're going to be somewhat less enthusiastic. In a tight labor market, if they're also looking at an offer from a company that misses the pay by a bit and hits the rest, that may be the offer that gets accepted. The 2 through 9 rankings on that chart are important.
This is a company that uses proportional voting for their moderator elections. They know the value of ranked voting. Winner-takes-all surveys are missing the point, and doing their own target market, recruiters, a disservice.
They should do better.
#### No VMware NSX Hardware Gateway Support for Cisco
I find it interesting, as I’m taking my first real steps into the world of VMware NSX, that there is no Cisco equipment supported as a VMware NSX hardware gateway (VTEP). According to the HCL on March 13th, 2018 there is a complete lack of “Cisco” in the “Partner” category: I wonder how that works out […]
The post No VMware NSX Hardware Gateway Support for Cisco appeared first on The Lone Sysadmin. Head over to the source to read the full post!
## March 09, 2018
### pagetable
#### Making Of “Murdlok”, the new old adventure game for the C64
Recently, the 1986 adventure game “Murdlok” was published here for the first time. This is author Peter Hempel‘s “making-of” story, translated here from the German original. (English translation)
In the beginning there was the breadbox: The year is 1984, or was it already 1985? I have forgotten over all these years. Computers are still a magic word, even though they have been on the market for years. By now they are so small that they can easily be put on a desk. Microprocessor! And it should have color too, not monochrome as was still common everywhere. "Commodore VC20" was in the advertisement of the illustrated magazine, the people's computer, truly a strange name, just like the name of the company that makes it. C=Commodore, what does this computer have to do with seafaring, I ask myself? Well, at least the page had caught my eye.
We'll get that thing, but right away the "big one", the C64 with 64 KB. We ordered it mail-order from Quelle. That is how my buddy approached me. Back then that still meant considerable costs: the computer 799 D-Mark, the floppy drive 799 D-Mark, plus a color screen on top, at the time a portable TV for 599 D-Mark.
When everything had arrived, off we went! Nothing could be done without self-study; this technology was absolutely new territory for me. I also didn't know anyone who knew their way around it, not even my buddy. Technical books were bought! BASIC for beginners! What an exciting story. You type something in and immediately get a result, sometimes an expected one and sometimes an unexpected one. The thing had me hooked, day and night, whenever work and my girlfriend allowed.
At some point the adventure "Zauberschloß" by Dennis Merbach fell into my hands. This kind of game was exactly my thing! Playing and thinking! The idea of building an adventure like that myself began to take root in me. "Adventures und wie man sie programmiert" ("Adventures and how to program them") was the book I consulted. I definitely wanted nice graphics and of course as many rooms as possible. I then came up with the story, and over the course of programming changed and improved it quite often. I had decided to create the graphics with a modified character set. So: typed in the character set editor from the 64'er magazine. Yes, I needed sprites too, so I also typed in the sprite editor from the 64'er magazine. "Maschinensprache für Anfänger" ("Machine language for beginners"), and the small modified load routine in the disk buffer was done. Developing the new character set then turned out to be a very tedious affair. Change characters and build them into the graphics. Change characters and build them into the graphics... and so on. Didn't turn out nice, so start over again. When the listing got too big I could no longer manage without a printer and had to buy one. At some point I also ran out of bytes and the program code had to be optimized. Now the purchase of the printer had paid off.
While I programmed after work and at night, my girlfriend sat on the couch, pregnant with the twins. She had to muster a lot of understanding for my hours of hacking on the breadbox. She did muster it, that understanding, and so the game could be finished in 1986. I was mighty proud of it, too. I later married my girlfriend, or did she marry me?
The project taught me a lot about computers and programming. That was also my main motivation for finishing the adventure. It simply gave me extraordinary joy. A few copies were made and given to friends. I had nothing more in mind back then.
I keep being asked the question: "Why didn't you publish your game?" Yes, in hindsight it was probably dumb, but back then I simply didn't have it on my radar. At the time there were a great many games on the market, and I didn't have the feeling the world was waiting for mine. That was probably a misjudgment!
Sorry that you all had to wait so long for "Murdlok"!
About me: My name is Peter Hempel, but you know that already. I was born in 1957 and live in Berlin, Germany. Programming is not my profession. When I started my apprenticeship as an electronics technician in 1974, home computers were still unknown. I worked for many years as a service technician, troubleshooting and programming traffic light systems.
The game then fell into oblivion!
Meanwhile I had already been playing around with an Amiga 2000.
The year is 2017, and by chance I find a C=Commodore C65. An old feeling stirs in me. What a lovely memory of days gone by. The dawn of the computer age. The C65 immediately establishes a connection to the past. The last remnants of my C64 days are dug out again. And so the adventure "Murdlok" comes back to light. The game also runs on the C65, what a nice feeling.
Then I got to know Michael. We have him to thank for the publication of "Murdlok". I would never have thought that my old game would receive so much honor.
Thank you!
I wish everyone lots of fun with my game and, of course, with the 8-bit hobby.
## March 07, 2018
### R.I.Pienaar
#### 50 000 Node Choria Network
I’ve been saying for a while now my aim with Choria is that someone can get a 50 000 node Choria network that just works without tuning, like, by default that should be the scale it supports at minimum.
I started working on a set of emulators to let you confirm that yourself – and for me to use it during development to ensure I do not break this promise – though that got a bit side tracked as I wanted to do less emulation and more just running 50 000 instances of actual Choria, more on that in a future post.
Today I want to talk a bit about an actual 50 000 real-node deployment and how I got there – the good news is that it’s terribly boring, since as promised it just works.
## Setup
### Network
The network is pretty much just your typical DC network. Bunch of TOR switches, Distribution switches and Core switches, nothing special. Many dom0’s and many more domUs and some specialised machines. It’s flat; there are firewalls between all things, but it’s all in one building.
### Hardware
I have 4 machines, 3 set aside for the Choria Network Broker Cluster and 1 for a client; while waiting for my firewall ports I just used the 1 machine for all the nodes as well as the client. It’s an 8GB RAM VM with 4 vCPU, not overly fancy at all. Runs Enterprise Linux 6.
In the past I think we’d have considered this machine on the small side for an ActiveMQ network with 1000 nodes.
I’ll show some details of the single Choria Network Broker here and later follow up about the clustered setup.
Just to be clear, I am going to show managing 50 000 nodes on a machine that’s the equivalent of a $40/month Linode. ### Choria I run a custom build of Choria 0.0.11, I bump the max connections up to 100k and turned off SSL since we simply can’t provision certificates, so a custom build let me get around all that. The real reason for the custom build though is that we compile in our agent into the binary so the whole deployment that goes out to all nodes and broker is basically what you see below, no further dependencies at all, this makes for quite a nice deployment story since we’re a bit challenged in that regard.$ rpm -ql choria
/etc/choria/broker.conf
/etc/choria/server.conf
/etc/logrotate.d/choria
/etc/init.d/choria-broker
/etc/init.d/choria-server
/etc/sysconfig/choria-broker
/etc/sysconfig/choria-server
/usr/sbin/choria
Other than this custom agent and no SSL we’re about on par with what you’d get if you just install Choria from the repos.
## Network Broker Setup
The Choria Network Broker is deployed basically exactly as the docs. Including setting the sysctl values to what was specified in the docs.
identity = choria1.example.net
logfile = /var/log/choria.log
plugin.choria.stats_port = 8222
plugin.choria.network.client_port = 4222
plugin.choria.network.peer_port = 4223
Most of this isn’t even needed basically if you use defaults like you should.
## Server Setup
The server setup was even more boring:
logger_type = file
logfile = /var/log/choria.log
plugin.choria.middleware_hosts = choria1.example.net
plugin.choria.use_srv = false
## Deployment
So we were being quite conservative and deployed it in batches of 50 a time, you can see the graph below of this process as seen from the Choria Network Broker (click for larger):
This is all pretty boring actually, quite predictable growth in memory, go routines, cpu etc. The messages you see being sent is me doing lots of pings and rpc’s and stuff just to check it’s all going well.
$ ps -auxw|grep choria
root 22365 12.9 14.4 2879712 1175928 ? Sl Mar06 241:34 /usr/choria broker --config=....

# a bit later than the image above
$ sudo netstat -anp|grep 22365|grep ESTAB|wc -l
58319
## Outcome
So how does work in practise? In the past we’d have had a lot of issues with getting consistency out of a network of even 10% this size, I was quite confident it was not the Ruby side, but you never know?
Well, let’s look at this one; I set discovery_timeout = 20 in my client configuration:
\$ mco rpc rpcutil ping --display failed
Finished processing 51152 / 51152 hosts in 20675.80 ms
Finished processing 51152 / 51152 hosts in 20746.82 ms
Finished processing 51152 / 51152 hosts in 20778.17 ms
Finished processing 51152 / 51152 hosts in 22627.80 ms
Finished processing 51152 / 51152 hosts in 20238.92 ms
That’s a huge huge improvement, and this is without fancy discovery methods or databases or anything – it’s the, generally fairly unreliable, broadcast based method of discovery. These same nodes on a big RabbitMQ cluster never gets a consistent result (and it’s 40 seconds slower), so this is a huge win for me.
I am still using the Ruby code here of course and it’s single threaded and stuck on 1 CPU, so in practise it’s going to have a hard ceiling of churning through about 2500 to 3000 replies/second, hence the long timeouts there.
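As a back-of-the-envelope sanity check, 51 152 replies at 2 500 to 3 000 replies/second works out to roughly 17 to 20 seconds, which lines up with the timings shown above:

hosts = 51152

# single-threaded Ruby client ceiling vs the observed ~20 second runs
for rate in (2500, 3000):
    print("%d replies/s -> %.1f seconds" % (rate, hosts / rate))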
I have a go based ping, it round trips this network in less than 3.5 seconds quite reliably – wow.
The broker peaked at 25Mbps at times when doing many concurrent RPC requests and pings etc, but it’s all just been pretty good with no surprises.
The ruby client is a bit big so as a final test I bumped the RAM on this node to 16GB. If I run 6 x RPC clients at exactly the same time doing a full estate RPC round trip (including broadcast based discovery) all 6 clients get exactly the same results consistently. So I guess I know the Ruby code was never the problem and I am very glad to see code I designed and wrote in 2009 scaling to this size – the Ruby client code really have never been touched after initial development.
## March 01, 2018
### Anton Chuvakin - Security Warrior
#### Monthly Blog Round-Up – February 2018
It is mildly shocking that I’ve been blogging for 13+ years (my first blog post on this blog was in December 2005, my old blog at O’Reilly predates this by about a year), so let’s spend a moment contemplating this fact.
<contemplative pause here :-)>
Here is my next monthly "Security Warrior" blog round-up of top 5 popular posts based on last month’s visitor data (excluding other monthly or annual round-ups):
1. “New SIEM Whitepaper on Use Cases In-Depth OUT!” (dated 2010) presents a whitepaper on select SIEM use cases described in depth with rules and reports [using now-defunct SIEM product]; also see this SIEM use case in depth and this for a more current list of popular SIEM use cases. Finally, see our 2016 research on developing security monitoring use cases here – and we just UPDATED IT FOR 2018.
2. “Updated With Community Feedback SANS Top 7 Essential Log Reports DRAFT2” is about the top log reports project of 2008-2013; I think these are still very useful in response to “what reports will give me the best insight from my logs?”
3. “Why No Open Source SIEM, EVER?” contains some of my SIEM thinking from 2009 (oh, wow, ancient history!). Is it relevant now? You be the judge. Succeeding with SIEM requires a lot of work, whether you paid for the software or not. BTW, this post has an amazing “staying power” that is hard to explain – I suspect it has to do with people wanting “free stuff” and googling for “open source SIEM” …
4. Again, my classic PCI DSS Log Review series is extra popular! The series of 18 posts covers a comprehensive log review approach (OK for PCI DSS 3+ even though it predates it), useful for building log review processes and procedures, whether regulatory or not. It is also described in more detail in our Log Management book and mentioned in our PCI book – note that this series is even mentioned in some PCI Council materials.
5. “Simple Log Review Checklist Released!” is often at the top of this list – this rapidly aging checklist is still a useful tool for many people. “On Free Log Management Tools” (also aged quite a bit by now) is a companion to the checklist (updated version).
In addition, I’d like to draw your attention to a few recent posts from my Gartner blog [which, BTW, now has more than 5X the traffic of this blog]:
Critical reference posts:
Current research on testing security:
Current research on threat detection “starter kit”
Just finished research on SOAR:
Miscellaneous fun posts:
(see all my published Gartner research here)
Also see my past monthly and annual “Top Popular Blog Posts” – 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017.
Disclaimer: most content at SecurityWarrior blog was written before I joined Gartner on August 1, 2011 and is solely my personal view at the time of writing. For my current security blogging, go here.
Other posts in this endless series:
### OpenSSL
#### Seeking Last Group of Contributors
The following is a press release that we just put out about finishing off our relicensing effort. For the impatient, please see https://license.openssl.org/trying-to-find to help us find the last people; we want to change the license with our next release, which is currently in Alpha and tentatively set for May.
For background, you can see all posts in the license category.
One copy of the press release is at https://www.prnewswire.com/news-releases/openssl-seeking-last-group-of-contributors-300607162.html.
## Looking for programmers who contributed code to the OpenSSL project
The OpenSSL project, https://www.openssl.org, is trying to reach the last couple dozen people who have contributed code to OpenSSL. They are asking people to look at https://license.openssl.org/trying-to-find to see if they recognize any names. If so, contact [email protected] with any information.
This marks one of the final steps in the project’s work to change the license from its non-standard custom text, to the highly popular Apache License. This effort first started in the Fall of 2015, by requiring contributor agreements. Last March, the project made a major publicity effort, with large coverage in the industry. It also began to reach out and contact all contributors, as found by reviewing all changes made to the source. Over 600 people have already responded to emails or other attempts to contact them, and more than 98% agreed with the change. The project removed the code of all those who disagreed with the change. In order to properly respect the desires of all original authors, the project continues to make strong efforts to find everyone.
Measured purely by simple metrics, the average contribution still outstanding is not large. There are a total of 59 commits without a response, out of a history of more than 32,300. On average, each person submitted a patch that modified 3-4 files, adding 100 lines and removing 23.
“We’re very pleased to be changing the license, and I am personally happy that OpenSSL has adopted the widely deployed Apache License,” said Mark Cox, a founding member of the OpenSSL Management Committee. Cox is also a founder and former Board Member of the Apache Software Foundation.
The project hopes to conclude its two-year relicensing effort in time for the next release, which will include an implementation of TLS 1.3.
# 快晴 (Clear skies)
However it is quite potent a single, and also you can use such constructions, as SQL View, SQL Stored Process (just use Select statement there, don't do Update or Insert, see paragraph above; SP can provide you with ultimate power, as you're able to develop short-term tables to attain certainly exotic joins). If you make certainly smart SQL View, it really is report by its nature (with pretty standard report format in SQL Query outcome, which may be exported into text file or Excel); and we strongly advise you to use these Views and Stored Procedures for experienced reporting tool, including Crystal Reports, Microsoft SQL Server Reporting Services (SSRS), MS Access reports, Excel reports along with other ODBC or Native SQL Server driver compliant reporting products. To offer even more hints, if you deploy such constructions, as SQL Linked Server (or Open Row Set,louis vuitton, exactly where linked server is produced around the fly and destroyed in the end of the query),cheap beats by dre, you're able to join Dynamics GP SQL Server residing tables with other DB sources, including Text File (CSV, tab, or unique character delimited), MS Access,louis vuitton handbags, Btrieve/Pervasive SQL, Ctree, FoxPro, Oracle,cheap beats by dre,sac lancel, MySQL/PHP/Linux, XML or you name it, assuming that it can be ODBC compliant or has native driver.
But within the next five years, China is slated \to slow its buildup by half, in accordance with industry estimates, adding 333 million tons of new CO2 emissions each year. That is nonetheless the biggest improve of any nation. But other nations appear intent on catching up.. Fable Heroes is usually played as a single player game, but becomes stale. It's honestly meant to become played as a multiplayer game. It is built on competitors with its mini games and rewards for the player with all the most gold. Monitoring Your Heart RateIf you will be operating out within a well being club, you might have the potential to monitor your heart rate by putting your hands around the heart price sensors on the piece of cardiovascular equipment you will be using. You can actually also use a personal heart price monitor. Each of those calculate beats per minute.
A hold may perhaps be put to use for cargo, ship machinery, or supplies. The amount of holds is determined by the objective and size of the ship. On passenger ships the length of your holds is determined by the circumstances of unsinkability; for cargo ships, several classification societies regulate the number of holds based on the length of the ship and establish a maximum length for holds. Finally, certainly one of probably the most very important factors to consider will be the warranty. Lots of people skip this step but you'd be sensible to consider acquiring a service contract. Your projector is possibly going to become pricey, so you may need to know how extended points will final, and just how much you're going to be paying for them once they break.
choosing a affordable disney cruise vacation
Right here at SBS we have been caught off-guard by Australia's embrace of digital multichannels. But now that Inspector Rex has effectively produced SBS TWO his own ("SBS TWOOF",cheap beats by dre, as he calls it) we've realised its time for us to embrace it too. So though we have been reluctant to offer up our Saturday evening Iron Chef ritual, we offer you an even far better ritual - Iron Chef just about every evening! (It really is now on at six.30pm every weeknight on SBS TWO.) Luckily for those acquainted with Iron Chef's live seafood specials,beats by dre, we have not wandered too far astray.
"When we opened Fallen Owl we couldn't have accomplished it without having the enable of either of them. For non-tattooers Cody and JJ know additional regarding the market than some tattooers I've worked with" mentioned Rose. Cody and JJ will be the shops part-time shop managers and "know lots in regards to the history of tattooing, how and why tattoos heal the way they do, how they age, how they really feel, and so on." Cody and JJ's duties involve answering phones and questions, performing paperwork, and schmoozing using the customers amongst the other tattoo shop responsibilities..
What do you believe the consequences of this will be? Rather than the well-known situation of inflation (where prices enhance), through the crisis some countries faced deflation (where rates fall). Why need to falling prices be an issue? House prices are now falling in various components in the globe. What are the advantages and disadvantages of this? Is it the governments obligation to try to keep complete employment in their countries? Which concern should governments be most concerned about - unemployment or inflation? What actions can they take to manage these two troubles? Many folks really feel that the recent global crisis was triggered by the banks - who then,louis vuitton canada,beats by dre, for some explanation,louis vuitton, had to be bailed out with taxpayers' capital.
But machine pressing sooner or later crowded out blowing glass by hand, a far more laborious and time-consuming technique. It wasn't until the 1960s that American tinkering led for the invention from the smaller melting furnaces that enabled glass artists to leave the factory and set up shop in their own household studios. Within the decades given that, glass has exploded onto the American art scene, with significantly more private studios opening each year.. 1) J. J. Abrams has mentioned "Cloverfield" lay dormant inside the Mid-Atlantic Ridge, until it was hit by a falling satellite owned by the fictional Japanese drilling firm Tagruato.
Because the fall and winter months came several artists Hanson worked with all through the year released projects. Some of these incorporated Steddy P DJ Mahf's Though You Have been Sleeping two: Finish on the World Party, Teeper T's The Vote YES Mixtape Volume 2,beats by dre, BA's The most beneficial Unheard and Tevin Kushin's The Man on Mars Volume 2. The Cold Cuts also released a digital single made by Cutler Jones titles "Moonwalkin." Their first non-mixtape release because 2010's Absolute Zero album.. Returning to the shrinking planet metaphor,cheap beats by dre,louis vuitton handbags, we might possibly at the same time say - with today's faxes, Web sites,レイバン サングラス, iPhones and all the rest - that communications capabilities are practically limitless. They are undoubtedly instantaneous. Nothing at all brings people today so close as to send text messages in real time, or share a video chat over the net, or be connected to the technology wherever they're and having a number of devices - phones, netbooks, laptops, iPods, fax machines,sac lancel, and so on.
improvement of qld gay and lesbian culture
Well being and security mania is often a best storm of a whole series from the shabbiest, most mean-spirited phenomena of our age, relentlessly combining to make life just that tiny less enjoyable than it may very well be. Public sector jobsworths,cheap beats by dre,cheap beats by dre, who lengthy considering the fact that ceased to see themselves as public servants, use health and safety as a catch-all excuse for saying to new ideas or placing an finish to old ones. Numerous harmless annual events parades, races, cake-sales, you name it are hounded out of existence by council gremlins blabbering about health and security.
The Golf is a rounded performer with a nicely laid-out, top-quality cabin. Composed handling, a decent ride and a punchy engine make it superior to drive, plus the two.0 TDI BlueMotion Technologies is clean - we hit 43.5mpg on test. Plus, the Focus will be the least expensive auto on test. On Friday,louis vuitton canada, Garcia attempted calling Hautamaki,cheap beats by dre, but his voice mail was complete and wasn't accepting messages. "You can contact him and go around, but if he's not there, he's not there,beats by dre," Garcia stated. "If I would have gotten ahold of him, I'd have asked him if I could come get the bike.".
Save your funds. Unless it was just a couple of months ago, don抰 bite. They don抰 give away the new stuff, and they don抰 let you have it for inexpensive, either. It is easy to apply to be a contestant on more than 1 game show, but, as noted above,louis vuitton handbags, you could appear on only 1 game show in a given year. Should you violate this or any other eligibility rule,louis vuitton, any prizes you win may be forfeited and donated to charity. (One contestant violated this rule by appearing around the game shows "Password" and "Split Second" in the very same year.
Industrial is substantially abused as a term. I've heard individuals label "anything with guitars and clanging noises" as being Industrial. This can be a comparable circumstance to the complete chillout issue. A low fare on 1 airline could turn out to be not so low after fees are added on. Airlines are making most of their income these days not from promoting you airfares but with all those costs for baggage and other perks. Additionally to checked bag costs (chart) you will discover even charges for using your frequent flyer miles and for other services similar to altering a travel date or bringing a pet on board..
The trend was later created in Europe and America. Chanel,sac lancel, Tom Ford,beats by dre, Gucci fashion homes at the same time as many other individuals started to design and style custom jeans - with cuts,louis vuitton canada, feathers and African beads which price $3134 for a pair, A Guinness Book record back then. Jeans makes custom jeans on a by-order basis ad their costs are above$4000. You merely ought to clip about 1½ inches of hair close towards the scalp (two inches to be secure) concerning the thickness of a pencil - that is all it takes. As a part of the $64.95 or$79.99 (based on which test you purchase) postage is paid and benefits are on line inside a matter of days, confidentially with no names utilized. This will fairly possibly be the most beneficial parenting advice you can expect to ever get, as far as performing a one particular or two-time occasion along with your children.
five days is very good for you
It could take 1 hour to beat this thing together with the ideal gear and spells. I discovered out that some people say it is as well tough and as a result quit. Do not these people understand that you can easily get all master command, summon and magic materia?! This really is critical when fighting Sephiroth in the long run. Their economic concerns are justified because these jewels are usually not low-cost. All jewelery shops are numerous and is quite significant to know what type of individual is going to sell you jewels. Just before you go buying,cheap beats by dre,beats by dre, ask about for recommendations.
Retailers who are nonetheless arranging to go ahead with all the giveaways can really feel what they may be carrying out is significantly less bribery and more celebrating the democratic process; Starbucks has said that they are not breaking the law because they are not requiring proof of voting, but rather going on the honor program. N. - Minneapolis. The products are readily available at a selection of costs levels that will fit the household price range. Almost everything from tuning forks at an typical investment of $40 to a$2,louis vuitton handbags,000 sound bed or chair. It is actually doable to create these devices your self should you are that kind of particular person.
When you possess a multi-city itinerary,sac lancel, pick rail over low-cost carriers. You are going to see the countryside, have less complicated point-to-point connections,, and get pleasure from fee-free travel. Ryanair is notorious for nickel-and-diming travelers-with the train,レイバン サングラス, you could possess a longer trip,cheap beats by dre, but the convenience of downtown train terminals, no baggage fees, a snack automobile, extra legroom, along with other amenities make the experience an general improved worth.. The amount of heartbeats per minute is definitely an essential point of reference for determining irrespective of whether you're operating yourself difficult sufficient for the duration of cardiovascular exercise. You calculate your heart price right after workout the identical as at any other time. Then again,cheap beats by dre,louis vuitton canada, essentially the most vital time for you to measure heartbeats per minute is for the duration of aerobic physical exercise.
3. THE ANGEL OAK TREE and CHARLESTON TEA PLANTATION. Thought to be certainly one of the oldest living issues east from the Mississippi River, the 1,400-year-old Angel Oak Tree is worth a little drive to Johns Island, due south of Charleston. c) Expense: If you're calling comparatively small and/or remote nations, you might want to verify the prices of the Computer to Phone Service Providers' to these countries prior to signing up for their Computer to Phone service. As an example, calling a nation like Fiji within the Pacific Ocean is virtually as high priced working with Computer to Phone since it is making use of a standard telephone. The purpose for these high prices to specific nations is as a consequence of the Pc to Phone Service Provider not possessing the required switching gear (needed for switching your call back to the standard telephone network in the World wide web) physically located in that country.
and also the winner is Within a planet where hauling 18 pounds of potato salad to peewee football can imply the difference amongst a pleasant afternoon and suffering the wrath of angry preteens, space efficiency and entertainment have to peacefully coexist. With models like Odyssey and Quest, the minivan is a single such resolution. The Honda Odyssey is the decisive victor,beats by dre, toppling the competitors with ease.. Functionalized mesporous silica and mixed oxides derived from layered double hydroxide (LDH) have uniform structures and higher surface locations, which are perfect as catalysts for big organic molecules. Their chemical and physical properties might be conveniently controlled through modification of their surface groups or composition, and as a result they may be investigated as heterogeneous base catalysts for the triglyceride transesterification in this thesis. The initial element of this study compares the catalytic activity of MgO-functionalized mesoporous silica within the transesterification of vegetable oil with ethanol at 473 K by varying the following parameters from the catalyst: • Form of silica help; • Catalyst loading approach; • Sort of precursor salt utilized; and • The quantity of MgO loading.
destroyit 4107 higher capacity cross reduce shredder
You will discover numerous beats available on the net which you can decide to purchase and use in your personal songs and tunes. Yet, you might have to make certain that these beats come from a reliable source and are of wonderful high quality. You'll ought to 1st take into account what style of music that you are making then you can actually ascertain the usefulness of the beats you have got offered.. In the videos of interviews with Ted Gunderson, Barbara Hartwell pointed out that her speaking capability can be stopped suddenly by remote Mind Handle. Technologically, the speaking capability is usually stopped when certain aspect of our brain neuron program is disabled by disconnection. The disconnection is usually realized through a biological relay that will be constructed using focused electromagnetic remote surgery.
The three girls are revealed to become members on the Interplanetary Observation Agency and to genuinely throw us to get a loop,レイバン サングラス, they reveal that Muneto is one of them too. Yes, poor confused male Muneto whose fallen in really like with Kayo, whose demon "son" is bringing about the destruction of anything. Some people take their job way too seriously.. Kettle Pond and Osmore Pond :: Groton, VTKettle and Osmore Ponds nestle amongst the hillsides of Groton State Forest. Though water is definitely the main attraction right here,louis vuitton outlet, hiking trails travel among the peaks and skirt the edges of Peacham Bog,Former PGA Tour pro Notah Begay joins broadcast teams at NBC, one of only two or 3 raised bogs in Vermont. If you ever visit….
Even a small present per teacher would set me back additional than I wanted to go. I have received small ornaments, some handmade,beats by dre, from students annually, and I enjoy pulling them all out to decorate the tree and remembering those students as I hang their ornament. Never really feel like you've to break the bank to show appreciation.. This Researcher's preferred procedure is usually to use heavy duty velcro. This can be obtained cheaply from craft shops and can help over 10kg. Nothing you attach for your helmet need to weigh far more than a few hundredths of that, but despite that, do frequently ensure that what ever is stuck for your helmet can also be attached to you by some other means, including a lanyard..
The average upright piano can weigh something from around 170kg6 to 230kg7. It depends upon how large it really is and how old. When you obtain your piano be sure you have adequate lifting power to have it through doors,beats by dre,louis vuitton handbags, up methods along with the like. The FDA doesn't regulate dietary supplements, the category that the energy drinks are listed below and due to the Dietary Supplements Well being and Education Act of 1996 obtaining these solutions pulled in the market place is no simple task. The agency must prove that the solution itself is unsafe when put to use as advised around the label ahead of which can take place. In 2008, it became a requirement that all supplement makers inform the FDA of any deaths or significant injuries that might possibly or may not be related to their solutions..
Conclusion: I am sort of torn about this program. I don't like how its incredibly un-user friendly for the first time user,cheap beats by dre,beats by dre, however the fact that there is certainly a great number of selections suggests you have got a great deal of control more than the precise sound you'd like. For what I was trying to use it for,beats by dre, I never assume it will be worth the time for you to understand all that the program has to deliver. One thing that you just typically hear is the fact that to attain a higher Web page Rank thru Google, (which,louis vuitton canada, let's face it, may be the Significant Daddy of Search engines) that you simply must hyperlink to other high PR web sites. This isn't necessarily true. Nobody is 100% around the relevance of PR rank affecting a link exchange.
exploit the paris of the middle east with dubai holiday packages
Heart Price RecoveryHeart price recovery,beats by dre, or HRR,louis vuitton, is the variety of heart beats it requires for the heart to return to standard in a single minute immediately after you exercise. The more quickly you will be in a position to recover, the improved your amount of fitness along with the lower your danger is to get a heart attack. A study published in 2001 in "Circulation" by the American Heart Association discovered that participants with a low HRR had an increase in the threat of death by 9 percent. THE VERDICT: Whilst the Dragons are all the rage with some punters to claim the wooden spoon, claims a proud club could sink so low are dubious. Using the exception on the the Bennett years, the halves happen to be a headache for decades. If they unearth any semblance of cohesion there, who knows what is achievable..
While you do pure maths,cheap beats by dre, you cannot generate experimental final results therefore proofs are every little thing. But in Graph Theory it not THAT difficult to code your algorithm and generate helpful experimental outcomes! Let take the MST dilemma. Present business implementations are Prim/Kruskal and Boruvska and yet, a great deal more effective algorithms are described in papers but they are not utilized considering the fact that no one has ever coded them. Um,louis vuitton canada, did someone say health club membership? There are so many fitness facilities in this state, from personal training studios and sports performance centers for your salt of the earth membership gyms. Maybe a gym membership will assist your loved one particular be a lot more motivated to get in shape. My favored facility inside the state is .
Sports stars will be a lot more difficult to get hold of as they may sometimes sign in person at events, as an alternative to answering plenty of letters. From encounter it is actually unusual to have more than a 10% reply rate from this group. Even so, writing to the club nonetheless gives the top likelihood of achievement.. Usually,beats by dre, the human heart is divided into 4 chambers; two upper chambers and two reduced chambers. The upper chambers are called atria (atrium will be the singular noun). The decrease chambers are named ventricles. Seeing a show taped is definitely fun. The handlers will answer all of the logistical queries like when to get there, and you will have a pre-show prep to obtain you pumped and trained. I got to view a taping of Dick Cavett years ago-- it was a blast even though the guest was type of boring,louis vuitton outlet, considering that Cavett was so rivetting.
Conversation was hearty and load in the table as we busily exchanged ideas about our initially day as a cowboy around the trails. There was excitement, pride, and an more than all feeling of accomplishment. So this really is how the true cowboy's felt following each day around the variety. Books are hot items, as well,louis vuitton outlet, but won't go at a higher value - put 25-50 cents on books (unless they may be a signed,louis vuitton handbags, limited edition, certainly). I won't price tag my fine stuff like Gymboree and Nike kid's clothing that are still in exceptionally beneficial condition decrease than $5. On the other hand,cheap beats by dre, women's clothing and shoes can be a difficult sale, so be cautious pricing those products also high (often bear in mind, although,レイバン サングラス, you're able to often come off the price). death of a porn king The target heart rate for me, for the greatest advantage, is seventy to eighty five % of your maximum heart price. That would be anywhere amongst 109 and 132 beats per minute. I sustain a rate of 120 beats per minute throughout my walk. Don't forget -- a fitness center is a place that you go to so that you can get match,louis vuitton canada, healthful and in super shape. Do compact issues that would assist you really feel much better. One example is, wear exercise gear that makes you really feel decent about yourself. Tickets may perhaps be priced more affordably than competitors. concerts are scheduled for this year. The initial takes places on November 30th at in Uncasville, Connecticut. Sadly, obtaining a studio anywhere on Capitol Hill for$800 a month could be a bit of a challenge. You can also consider living in the Central District (the neighborhood centered about 23rd Ave and Yesler). The CD is known as a historically low-income high-crime neighborhood,beats by dre, but it really is gotten a whole lot far better inside the final handful of years. 1 strategy involved cornering the market place by producing De Beers the only supplier of diamonds on the planet. Because it turned out, this was the simple and easy portion. By means of different anti-competitive practices,cheap beats by dre, De Beers came to manage 90 percent on the whole diamond industry.
There's a misunderstanding concerning the location of struggle in childhood. Society thinks avoiding struggle leads to happiness. But struggling to create and accomplish items is what makes us satisfied. A healer is often a job that can heal the hit points (HP) of a celebration member and eliminate status ailments, which include poison. The healer will primarily be healing the tank, because the tank shall be receiving one of the most harm. The healer is almost generally required within a party considering that of its job requirements and wants within the game,sac lancel, so subsequently, you'll have to have a safe member for it..
region. The Landale Street Neighborhood Association A neighborhood association is a group of residents, many times organized as 501(c)(3) nonprofit organization, who take on difficulties or organize activities inside a neighborhood. An association may have elected leaders and voluntary or mandatory dues. Combat is mostly gunplay. More than the course with the game Lara acquires a Bow, a pistol,cheap beats by dre, a shotgun and an old German MP-40. The Bow could be the most versatile of these weapons, serving as a extended range silent killer.
Meanwhile,beats by dre,louis vuitton handbags, the producer of "Blue Valentine" is taking a crack at "Electric Slide," a long-developed movie about Los Angeles so-called gentleman bank robber Eddie Dodson. In the 1980s,sac lancel,レイバン サングラス, Dodson robbed banks all over Southern California to help his trendy Melrose Avenue shop at the same time as a developing drug habit. But he never shot any individual and, in reality,レイバン サングラス, implemented a fake gun as he committed his robberies. Clip-in extensions are appropriate for people today whose all-natural hair is shoulder length or longer. Should you have medium length hair, extensions can make it longer. If you have lengthy but really fine hair, or your hair isn't in its ideal condition, clip-in hair extensions are an amazing way to bring richness and fullness to your hair..
do binaural beats perform
I've a Chicco Keyfit 22 that I want to sell on Craigslist. We purchased it initially using the cortina stroller but we are keeping the stroller and just selling the car or truck seat and base. It was manufactured in April 2006 so it expires in April 2012. "I like what Ash did,レイバン サングラス,cheap beats by dre," Ratchford mentioned. "The plays we produced inside the field, the way we ran the bases was encouraging but I want we were somewhat a great deal more relaxed at the plate. I believe we got ourselves out a couple times however they also did an incredible job and created plays and hats off to them.
Having a profession spanning nearly 20 years as a star of stage and screen, Magda Szubanski has cemented herself a spot within the public imagination with a side-splitting string of comic creations. She shot to fame for her big selection of comic characters and her present for accents in countless comedy programs including The D Generation, Quickly Forward and major Girls Blouse. She has also appeared in films like Babe as well as voiced Miss Viola within the Academy Award winning Satisfied Feet. Korean Studies 34 (2010): 54-89. Dost thou purpose to live without the need of that? Together with these historians,louis vuitton, you will discover a variety of folks throughout the martial art community which have also created false claims about their very own martial art and its partnership with all the ancient Hwarang technique. , ё - , ; ,louis vuitton handbags, .
Heart-Rate GoalsYou also desire to be at around 55 to 65 % of the maximum heart price to maximize the fitness added benefits of Zumba,beats by dre, writes Zumba Fitness founder Beto Perez in "Zumba: Ditch the Exercise Join the Celebration! The Zumba Weight-loss Program." To establish your MHR, subtract your age from 220. When you are 30 years old, by way of example, your MHR would be 190,sac lancel, and 55 to 65 % of this equates to 104 to 124 beats per minute. It shouldn't be made use of as a substitute for specialist health-related assistance, diagnosis or treatment.
WEDNESDAY: I take some time to stand in the mouth with the Corrib across in the Spanish Arch, seeking back to Galway's cathedral (nicknamed Taj Micheal following the bishop who built it, Micheal Browne), the riverscape, plus the tail end of the city crowds of Shop Street in the low evening sun. Galway is looking swell but there's a thing missing. Right after two Volvo Ocean races,beats by dre, why does Galway not have a marina?. What might be more significant to note is just not to consume just before you about to hit the sack. As soon as asleep, the body functioning is a great deal leaner than it can be throughout the day; whereas sleeping, you just don burn precisely the same volume of sugar/fat/calories that you would if you had been awake. So, a massive bowl of pasta with cheese may well not be the most beneficial midnight snack..
They play violin and saxophone. Considering the fact that they decided to be stubborn and not practice like they ought to, we decided to become stubborn and not send them to lessons. (My daughter lost 1st chair inside the last concert since she refuses to practice at all.) My daughter old band instructor paid $50 per half hour when he was mastering violin (that was about a year ago) I believed he was nuts to spend that substantially.. We laugh,beats by dre, Seth MacFarlane, we laugh. Family members Guy has offered a framework by way of which my son can view the globe: demented,louis vuitton canada,beats by dre, ironic, absurd. Nicely before his applications for middle school are due my son has attained a reference point that could otherwise take into adulthood to achieve. does india want more supermarkets Erotic Manga: Draw Like the Specialists by Ikari Studio - Discover this book online from$12.69. The Monster Book of Manga: Draw Like the Specialists by Joso Estudio (Editor) - Discover this book over the internet from $8.11. 20. This can be a terrific short article that addresses the typical fears amongst children starting from an early age. I have noticed that the ideal strategy to assuage several of the fears for example the sound of a vacuum cleaner or honestly obnoxiously loud sounds normally, that my daughter detests, is always to soothe her when she is upset even though eliminating the source of distress. I genuinely like the thought of creating some thing optimistic of an upsetting scenario by becoming inventive and speaking it out. Freddie Mac. Visit this web site and kind "Worksheets" inside the search box. A web page will probably be pulled up which can offer you pdf downloadable files regarding budgeting,louis vuitton, mortgage, desires and desires. Vibrant accents are amazing solutions to add some color to your modern day studio. For a area, add a single colour to create them appear clean and constant. Guard against more than coloring a modern day studio apartment,beats by dre, as this may defeat the objective of a minimalist, clean style. Detroit is 8-22 on the road. Del Negro mentioned G Maalik Wayns was signed to a 10-day contract to provide an added body at practice. Crawford and Bledsoe have not been practicing and F Trey Thompkins has been out all season. Wayns produced his debut within the fourth quarter. In other words, my brain was as well acidic, and initially, the alkalinity naturally present in most raw foods helped to neutralize a number of this acidity, producing some improvement. But this extremely alkaline, negatively ionized water,cheap beats by dre, has definitely helped. The ionized water combined with my 100% super-food eating plan, unique adaptogenic herbs,cheap beats by dre, antioxidants, pure moral life style,louis vuitton canada, etc. On those occasions he would take a mini-break, stay-put in the recording booth, and restore energy levels by means of a fast 'power nap'. Eyes closed. Head on chest. Hole 16: Plus the crowd goes wild as Bubba pulls out his popular pink driver to hit off the fairway on the 670-yard par-5, launching the ball in to the rough. You can not contact him boring. Phil, meanwhile, requires one other involuntary stroll by means of the woods. The heartbreaker. ah,louis vuitton outlet, the facts as 53 out fits all pictures is away her sexy. that's it, this that time, right? The colour don't blow your boss or queries about Erica Durance nude Think it accidentally falls into my bed? Basically,louis vuitton outlet, she plays. 
Thompson was referring to a Georgetown front court that is suddenly significantly thinner. On Monday, the Hoyas have been devoid of Vaughn, their starting center,louis vuitton outlet, soon after the senior became ill in practice Sunday and was taken to Georgetown University Hospital. As of late Monday, physicians had not cleared Vaughn to return towards the court and Thompson mentioned there is no time table for his return.. It is easy to soften the light by adding diffusion amongst the light source along with the topic. As an example, in the event the sun is harsh it is easy to location a diffusion panel between the sun and also the topic. This can be as straight forward as a white sheet hung from a tree limb. Despite the fact that your seat is pretty bumpy inside the trot and it could be hard inside the beginning to resist bouncing up and down, it is actually conceivable to stay seated in the trot. Essentially the most comfortable way to ride within the trot is usually to do increasing trot; this is any time you lift your weight out in the saddle on every alternate step so as to "avoid" the bounce. Though the trot, a slightly quicker,レイバン サングラス,louis vuitton canada, bumpier gait, might be fairly uncomfortable, it can be doable to ride it very comfortably.. fresh make and flowers throughout the week Eyelashes usually do not only retain sweat and water away from your eyes. Eyelashes can also make your eyes pop out and make your appear younger. Not absolutely everyone, but, is born with naturally long and curvy eye lashes. Much better get to cleaning up. Later. Thank you for the space. I do believe that you will discover many people that have microwave weapons, but it really is not around the scale that targets are led to think. By way of example, when I lived using a sister, I know I felt the effects of a microwave weapon. And this paved the way for the thoughts messing to obtain me to believe they had been EVERYWHERE! You will be ideal after you have the believed that it would be also high priced to target all the people that are claiming they may be targets. Douglas requires that his clients initial create a script, and e mail the completed outcome together with price range needs, and any voice path which is essential. He will then record the voice more than, exactly where it will be posted internet with a hyperlink to an approval/invoice page. The client will then possess the choice to approve the final product by opting to pay the invoice on this web page, making sure complete satisfaction before buy. 50 years ago it was 1961. If we going to speak about films, I assuming you imply Hollywood movies-- narrative motion pictures created for profit. These films were several than the Hollywood movies of currently,louis vuitton, but not vastly several. Playing with unique bands, you might notice that each and every of them utilizes a distinct tuning. Envision yourself inside a circumstance, exactly where you might be playing jazz requirements having a band tuned in E regular, in a really hard rock band tuned in E flat and within a post hardcore band tuned in Drop C,レイバン サングラス, and you have promised to rehearse with all of them in 1 single day. Now with all the hard tail guitar you may have the ability to retune in actually five minutes ahead of each rehearsal,louis vuitton outlet, whereas with all the floating bridge guitar,beats by dre, you are going to at most effective spend half an hour adjusting the spring tension, fine tuning and intonation.. 
Turning the wheelAn indexing pin was attached to a single arm with the tuning fork. The vibration from the tuning fork moved the indexing pin back and forth. The finish with the indexing pin pushed a gear 1 tooth at a time, creating it turn constantly in 1 direction. Jeans are amongst the handful of articles of clothing that nicely compromise everyday versatility with contemporary fashion. I hope that I have laid out several dimensions by which you,cheap beats by dre, the Everyman, can contemplate, judge and wear them devoid of skimping in high quality. Try to remember: Jeans are kind of like suits (have been in 1800). (Some would argue that IdeaPaint is also prohibitively costly, but that could be debated). You're able to mount the panels with screws, however it will bubble. You really should glue it for the wall or to a thicker piece of MDF or hardboard that will not bubble. two. The count quite often made use of for the cha-cha is 1,two,louis vuitton,three,4 for the pacing, it goes slightly one thing like "Slow, slow, slow, speedy, fast,beats by dre, slow, slow,louis vuitton canada, slow, speedy, fast." some also refer to it at 1, two,louis vuitton outlet, 3, cha, cha, cha (this refers to brisk actions or when your feet tap around the ground with slightly faster movements equivalent to tap-tap-tap or left-right-left). Sounds confusing? No worries,sac lancel, you don't require to know this right away as this counting will fall into spot as you discover the basic dance moves.. film 'the hunt' to screen in westport A clean furnace filter can cut your heating bill by 5 to 15 %. When you live in a cold region of the globe,cheap beats by dre, it truly is likely that the majority of your home-energy expenditure will go towards your furnace. Cleaning your furnace filter is definitely an convenient and economical way of decreasing your power charges and enhancing your indoor air quality.. After you get such very good clubs, you can actually obtain great testimonials from their satisfied members. One example is,louis vuitton canada, one such holiday club that I saw online had people today posting thank you notes. What caught my eye have been notes commenting on how excited they were in their holidays and how they are going to tell their good friends that the membership helped them fulfill their trip requires. Created by (Supernatural) and executive created by (Lost, Fringe, Person of Interest), Revolution returns with new episodes on March 25,louis vuitton,sac lancel, airing Mondays at 10/9c on NBC. Inside Warner Bros. Tell-A-Vision, attendees will expertise 3D artistic renderings of vanity cards from the imagination of Chuck Lorre - which includes a 10-foot tv remote manage wrapped inside a giant condom. Proponents with the UID have argued that numbering Indians and providing each and every one particular of them a 12-digit ID would support avert leakages from diverse entitlement schemes. Ultimately, a UID card could make it conceivable for direct cash transfers to the targeted groups as opposed to generating an imperfect infrastructure for provision of subsidised meals, fertilisers, kerosene, and so on. It could also assist check the flow of illegal migrants into the nation.. It might then pay the 20% that medicare does not pay, and also possibly the added 15% (Aspect B Excess Charges), if the physician will not accept "Assignment" (the allowable charge by Medicare). 
In the event you can discover reputable web sites,louis vuitton outlet, then you may be certain that the insurers whose quotes they would offer will be solid ones since they (the quotes comparison internet sites) also would need to perform with all the finest so as to retain there reputation. Well being issues, really serious or minor, happen to be the result of millions of people today going uninsured more than the years. Bellare,beats by dre, T. He, D. Yao, Y. Naturally, the degree to which you focus on the music and ignore your achin' dogs (or lungs) is dependent upon a whole lot of components, like, presumably, irrespective of whether or not you like what is playing. Though some songs are just about universally inspiring (like the theme from "Rocky"), what some people could possibly acquire motivating and thrilling,louis vuitton, others will not, says Vince Nethery, chair of your Department of Nutrition, Workout, and Health Sciences at Central Washington University in Ellensberg, Wash. It all depends on your taste and emotional associations with a offered piece or genre of music.. A: To me, Iban tattoos are a way of life. For the older generation,louis vuitton handbags, tattoos had been marks of life. The more tattoos you might have, the a lot more you may have achieved in life. When an occasion, or a mixture of events,beats by dre, occurs that will not fit the analytical framework,レイバン サングラス, the framework will need to undergo a rigorous evaluation to make sure it remains valid. When the framework is located to become flawed, we identify if it demands to become adjusted or scrapped. Because of the rapid shifts we've observed on the ground in Mexico previously two years in terms of arrests and deaths of important cartel leaders along with the emergence of factional infighting as well as new cartel groups, we have located it crucial to adjust our framework cartel report even more than just annually. facing the past and seizing a future in america Based on Paul Gilding the concept of infinite development on a finite planet is nonsense. The challenge would be to adjust the narrative from the never-ending pursuit of development to 1 focused on find out how to maximize happiness, community, and meaningful interactions. He argues inside the short-term, we are going to deny our troubles until we are faced with large scale calamities. She paid focus to her dancing profession a lot, that the Jackson five asked her to become component of your back up dancers on their Victory tour. She became the choreographer of massive stars which includes ZZ Major, George Michael, Duran Duran, and Janet Jackson. Soon after choreographing Janet Jackson's hit Control and receiving an Emmy Award for her dance perform around the Tracey Ullman Show, she began her recording career. It appears unthinkable that we could shed the measles vaccine. However we could possibly. Just as bacteria mutate and turn out to be resistant to antibiotics,cheap beats by dre, so viruses mutate to outwit a child's vaccine-induced immunity. A blue suit works the ideal and it provides you a great deal of versatility with regards to shirt and tie choice. Light or dark grey are also superior conservative selections. A 3 button suit will look good on virtually everyone, even though a 2 button will give a slightly taller/slimmer look.. Producing a guitar louder is not exactly the same as tone. Your 00 17 has even more midrange mainly because of it's body size,beats by dre, which might give the allusion of much more volume. 
1972 was a poor year for Martins generally - they may be heavier, lacquer was thicker, and such. Inspired by the concept write about our mates plus the stupid stuff we did in New York, How I Met Your Mother is Bays and Thomas notion. The two drew from their friendship in building the characters, with Ted based loosely on Bays, and Marshall and Lily primarily based loosely on Thomas and his wife. Thomas wife Rebecca was initially reluctant to have a character based on her, but agreed if they could get Alyson Hannigan to play her. Cleaning your oven on self-clean mode is one of the most satisfying household chores. Literally no perform in your component, and you will be shocked at how clean your oven is. You will by no means wish to clean an oven the old way ever once more. By some accounts, the wave of abuse may have already passed. The number of bath salt-related reports created so far this year by the country's 57 poison handle centers stands at 1,beats by dre,007 as of Will probably 1,louis vuitton handbags, as outlined by the American Association of Poison Control Centers. In all of 2011, six,138 calls have been reported. Then I started to appear down. These identical youngsters wearing what I thought were all buying up those ugly sandals from Crocs, have been in reality wearing look-alikes. Not the real brand! So,louis vuitton, for all those that are still "believers", try to remember that fads, specifically style,louis vuitton canada, do peak after which fall hard. Piranha, I doubt if you will obtain many that do not agree that footballers are grossly overpaid but the difficulty just isn't one particular for UEFA or FIFA to manage. That may be entirely out of their hands due to EU regulations. However typical sense will not play a part in their strategy of views of the standard man. The next day,cheap beats by dre,レイバン サングラス, I dropped the course, and I said I am never ever going to take yet another course that I do not would like to take, which led to having 120 hours of electives, and no college degree; but I had a very interesting college knowledge, anyway. Then that led to,レイバン サングラス, you know what,sac lancel, I am just not going to perform factors any longer that I do not wish to do, considering that life is just as well quick. It really is as well short to perform anything but what you certainly care about, what you happen to be actually passionate about, and that was when I had latched fundamentally onto following my heart wherever it led me -- and to not do points considering I was afraid. distribution and advertising charges,louis vuitton canada By and significant yes. Its trump card remains the solid script and also a nice flowing sense of action, which assists it overcome its single glaring deficiency (a lot more on that in a sec). It sticks with Hal Jordan (voiced by Josh Keaton) as its key figure and sets up a rather simple but powerful plot arc for the very first series. close friends are permanent. boyfriends are usually short-term. nevertheless exciting nice to possess around,louis vuitton canada, even though.. So you need to inhale through the nose; exhale by means of the mouth forcefully. So when you're inhaling, inhale expand the ribcage outwards, and after that as you exhale, exhale forcefully like you are becoming singed in a corset through your ribcage. And after that,cheap beats by dre, next, in an effort to manage balance you need to think of lengthening out through the crown on the head and pull your shoulder blades back and down. I take my auto to Finzels. 
LeasingProfit sharingIn the leasing model, the distributor agrees to spend a fixed quantity for the rights to distribute the film. If the distributor along with the studio possess a profit-sharing relationship,cheap beats by dre, alternatively,sac lancel, the distributor gets a percentage (typically anyplace from ten to 50 percent) of the net earnings made from the film. Each models may be beneficial or undesirable, depending on how well a movie does at the box office. Listen for your inner voice and do what performs for you. It need to not be put to use as a substitute for skilled medical tips, diagnosis or remedy. LIVESTRONG is often a registered trademark from the LIVESTRONG Foundation.
dell studio 17 laptop
You can easily as a result keep a perfect temperature and also ensure suitable ventilation within your house. All you will need to complete is hire a reputed organization for comprehensive contract of all these functions. Most of the air conditioning and plumbing service providers also cover air conditioner repairs and heater repairs below their services. With small to no operate expertise and couple of industry connections,beats by dre, figuring out tips on how to identify a job in accounting requires taking advantage of all resources available to you. When you are a present student or an alumni of a university, it is best to take complete benefit in the career-building sources hosted there. Chief amongst these sources may be the profession fair; schools with prominent accounting programs or schools in significant cities will more often than not possess a separate profession fair just for accounting jobs..
That'll aid the rest of your line,beats by dre,sac lancel, which includes DE Leonard Small,louis vuitton outlet, the team's top rated pass rusher. One particular difficulty this group has may be the absence of one other established DE to take the double teams away from Tiny. They've added LB Will Witherspoon from Carolina. Bank-robbery films are a staple of outstanding cinema. Regrettably, unless you count "The Town" or "Inside Man," the most effective with the lot -- "Bonnie Clyde,louis vuitton," "Butch Cassidy and the Sundance Kid," etc. -- were made about 50 years ago. Nonetheless life is amongst the easiest subjects for newbies to draw. Literally, the objects don't move. Opt for effortless ones initially, a vase without the need of too lots of curlicues, a handful of pebbles, a flower with no also quite a few petals, a clear bottle with an fascinating shape.
Corporations began in search of managers who had been alert about their own feelings along with the emotions of others. ''More and more companies are thinking this way,'' says Ross Brown, vice president of human sources at Analog Device Inc.,レイバン サングラス, of Norwood, Mass. ''Whatever they're able to do to assist their many people relate improved and have an understanding of the world helps business enterprise.'' ''Poetry is magnificent at undertaking that,'' says Whyte. So I replace the adapter. 70$. Get household plug it in, it last for about 10 minutes exact same factor. Shearer stated he'd told producers he'd be prepared to accept a 70 % spend reduce, but in return the actors wanted "a tiny share" from the billions of dollars in income the show has earned through syndication and marketing. The show's creators, Matt Groening and James L. Brooks,louis vuitton canada, have profit participation however the actors have already been rebuffed in efforts to join them.. If you ever take place to be browsing for certainly one of the most beneficial you will get in the least pricey value,cheap beats by dre, have a appear at the World wide web retailers. Their fees for good models are an awful lot less expensive than suppliers given that they've low overheads. Names like: Wildfire, Roketa,louis vuitton handbags, BMS,beats by dre, RoadRunner and Tank are mentioned in plenty of motor scooter critiques and can be shipped suitable to your door. Evercleanse gives blanket statements like '100% natural', 'totally safe' and so forth., but these don't provide the consumer with any valuable details regarding the item. Furthermore, Evercleanse goes one step additional and markets itself also a solution that ensures common wellness and wellness. It also claims to enhance the immune system. egypt tipped as most desirable value destination for winter sun escape,cheap beats by dre,cheap beats by dre But Pakistan completed in the major of their group and, regardless of the weakness of a number of its Associate opposition,louis vuitton canada, had the extra emphatic very first four weeks of your tournament between the two teams. In Mohali, immediately after a lengthy round of football,sac lancel,beats by dre, fielding then the traditional nets, Misbah deconstructed the numbers down to their bare basics. "According to me, the most crucial factor for any team is winning. Clark has been involved within the Food Beverage business for the final 22 years. His experience ranges from conceptualisation and running of at the least ten F B establishments in the UK, Hong Kong and Singapore. This also includes strategising advertising and marketing and home business plans through a thorough and distinct understanding on the nearby industry, clientele demands and financial guidelines. "It was on a dreary night of November, that I beheld the accomplishment of my toils. With an anxiety that almost amounted to agony,beats by dre, I collected the instruments of life about me that I may well infuse a spark of becoming into the lifeless issue that lay at my feet. It was already a single inside the morning; the rain pattered dismally against the panes, and my candle was almost burnt out, when, by the glimmer of the half-extinguished light, I saw the dull yellow eye of your creature open; it breathed tough, plus a convulsive motion agitated its limbs" (Wolf,louis vuitton, 2004, p.5).. Fill your bucket with water. To begin, fill the bucket with warm water. 
The volume of water you place is determined by the size on the region to become cleaned and how dirty the floors are. The perfect pair of footwear is definitely the perfect size. While we usually do not have the comfort of our professional basketball players of having custom-made shoes,louis vuitton handbags, we desire to get one that is pretty comfy to move about in. The ball of one's heels should not slide in the footwear. Considering that you are no longer competing against ambient sound, both noise-canceling technologies reduce fatigue by allowing you to listen for your music at a decrease volume. Then again, even at low volume, these technologies are designed particularly to maintain you from hearing outside sound. Hence, you need to be sure to are in an atmosphere where you can actually relax and concentrate only in your music. Discovering personalized cheap groomsmen gifts could require desire to search via purchasing malls and gifts shops to seek out the correct gifts in case your wedding party is on a shoestring. Most guys usually do not set their heart on undertaking such task. On the lookout for an ideal gift for the bride is already a tall order for males,レイバン サングラス,louis vuitton outlet, so they don't want significantly more tension over gifts to provide guys participating in the entourage. Immediately after twice trading down from their original #25 pick the Broncos took Cincinnati DT Derek Wolfe early within the second round. Highly-touted defensive tackles Devon Still and Jerel Worthy had been nonetheless around the board when Denver chosen Wolfe, and 'experts' howled that the Broncos have been 'reaching'. But the two players they had been supposed to become thinking of for that #36 all round pick kept falling. flexaway program of facial exercising helps you eliminate deep lines and wrinkles as well as lifting your face As from the census of 2004, there were 68,181 people today and 21,cheap beats by dre,レイバン サングラス,938 households in the division. The typical household size was 3.1. For just about every one hundred females, there were one hundred.four males. Josh let's give Josh all correct why -- -- let's hear it but beloved. Yeah I lastly and this is on this telephone -- apparently is not about to have that keeps on providing interns each and every little issue you get -- personally -- now in your residence I was explaining this just yesterday your boss. The president of ABC news. Skinny men and women face numerous health associated challenges which can make their life miserable,beats by dre, apart kind this too much thin or lean physique will not be appealing in appearance too and person looks sick and older. Musculoskeletal system of skinny men and women also gets extremely weak to cause many different kinds of issues. Healthful diet plan,beats by dre, suitable rest and common workout routines are encouraged for alleviating situation of being under-weight but even these measures the majority of the instances prove insufficient and unable to bring optimistic leads to quick time. A prefab set of 'quick parts' lets you simply insert styled text parts,louis vuitton handbags, such as a Tip or Warning. If you'd like to make a descriptive web page to get a kind you might want to use 'Generate Document' on the 'Insert' tab. Soon after minimizing the Assist Editor you're able to now choose the form. The spring is often a cord return spring from a weed eater about .25 " higher. It provides around 8 rotations for the drum in less than a second. 
Two more pairs of magnets may be placed on the other side on the drum to triple the amount of pulses. reset timer but by this point he was so freaked out that he didn't want underpants. When you have any issues about your own personal wellness or the well being of the youngster,louis vuitton canada, you might want to invariably seek advice from having a physician or other healthcare experienced. Please evaluation the Privacy Policy and Terms of Use prior to employing this webpage. Get a dive package from one particular with the Male dive centers. There are lots of diving desks that offer very affordable packages in Male. A good 1 is definitely the Ocean Dive Desk proper at the Ferry landing in the airport. Klimt's most popular painting is possibly The Kiss,cheap beats by dre,cheap beats by dre, which portrays a couple in an embrace. It has been in comparison to the Mona Lisa because of the fascination it evokes. The man clearly dominates the piece, and initiates the kiss. Appear, I not confused or annoyed by hip-hop, like older rock fans are by, say,louis vuitton outlet, Fall Out Boy. Greater than anything I embarrassed. Since when did young black guys, heretofore the arbiters of pop culture,louis vuitton, turn into so lame? And considering the fact that when did the citizens of that culture not know the distinction? A single Saturday not extended ago, my wife called me into the living space. A large number of other widely used songs have employed this chord progression. "Friday" is performed in the essential of B major with a tempo of 122 beats per minute. Black implemented auto-tune in Friday, meaning that her voice was edited lots. The remainder is padded out with nudity and simulated sex scenes, plus some ridiculous character actions that exist only to set up the next stock footage sequence. The story bargains using a young college student who professes tiny interest in fraternities but modifications his thoughts when the prime frat on campus pursues him as a brand new member. Turns out, the especially hip, hot and handsome frat boys are variations of vampires and desire the muscular freshman for a murderous ceremony. elizabeth edwards responds to mccain more than healthcare Pre-release promotion for Systematic Chaos was considerably stronger with RoadRunner Records, which led the album to initially chart at #19, much higher than any preceding Dream Theater album. A video was shot for "Constant Motion", which was released before the album. A second video was shot for "Forsaken", which is notably the band's initial and so far only foray into animation. The first 3 weeks of the Isles Story have been superb. We're looking forward to hearing countless far more of one's stories. NHL, the NHL Shield, the word mark and image with the Stanley Cup and NHL Conference logos are registered trademarks of the National Hockey League. You will discover diverse supplies of leotards, which consist of lycra,レイバン サングラス, spandex, and in some cases cotton. Spandex can be a much better decision, since it tends to final longer than cotton. Tights are also essential. 1 of your best solutions to get pleasure from it is to begin at the riverside Tate Modern day,cheap beats by dre, an art museum housed in an iconic power station. From there,louis vuitton outlet, either walk east along the river toward Tower Bridge, or west toward the National Theatre, which hosts a series of cost-free circus,レイバン サングラス, music and also other arts events on its grounds just about every summer time. 
Both walks are appropriate for households and take beneath 30 minutes.. Nothing beats locating a rare gem amongst stones. That is how a film enthusiast would describe the feeling of obtaining a rare film that is certainly previously not possible to seek out. Finding a rare DVD and not being able to get it is now a thing from the previous. The value point on the classes is impossible to beat. Sign up for any drop in price of$12.50 or become a member of the studio and get limitless classes for \$45,louis vuitton handbags, which consists of the other yoga classes at the studio. Even though be sure you sign up via the internet ahead of time for the aerial classes as you can get limited class sizes due to the silks..
I have a difficult time believing Tyra Banks invented the lacefront. Some of those lacefronts are a inexpensive mess and usually do not appear organic at all. The edges are just as thick as the ends. Resting Heart RateResting heart rate is definitely the frequency of heart beat while at rest,cheap beats by dre, ideally measured within the morning just ahead of you get out of bed. The American Heart Association says that resting heart price must be between 60 to 80 beats per minute. In line with Cleveland Clinic, ladies have a tendency to have a more quickly baseline heart price, starting in the age of 5 years old.
Irrespective of the federal government's views on cannabis,beats by dre, 14 states have now legalized health-related marijuana for sufferers. As if to succeed from the stalled federalist views on cannabis consumption,レイバン サングラス, states are now implementing their very own legislation. Neighborhood governments are only beginning to recognize the added value of cannabis from a taxation standpoint. People today around the globe have chosen Yoga. For anyone who is considering about becoming one of them, have no doubt, that you are inside the perfect path. Yoga can help you in lots of strategies.
There. Are moving -- concern. The. Despite the fact that this tutorial is designed to produce your introduction to GNU Radio as hassle-free as possible, it is not a definitive guide. The fact is, I could occasionally simply not inform the real truth to create explanations much easier. I might possibly even contradict myself in later chapters. Most of the people tour the volcano within a bus or maybe a car or truck, but to seriously feel the spirit of Pele,sac lancel, goddess of fire, practically nothing beats a bicycle. Things to find out: Kilauea Caldera, Halemaumau Crater, ohia lehua forest,louis vuitton, cooled lava flows. Site visitors: light..
Screening for lung cancer: we still need to know more
1. Stephen G Spiro
1. Correspondence to Professor Stephen G Spiro (Honorary Consultant, Royal Brompton Hospital), 66 Grange Gardens, Pinner, Middlesex HA5 5QF, UK; stephenspiro{at}btinternet.com
The holy grail for a screening test is that it discovers more cancers in the screened arm than in the control; that those cancers are of an earlier stage and there is, as a consequence, a stage shift towards lower stage cancers compared with the control group; that the test is acceptable to, basically, healthy individuals with low risks of serious side effects resulting from tests following a positive screen; and that the cost of a life saved, or a quality-adjusted life-year (QALY) is acceptable to the economy of the day.
Published in Thorax there is an end-of-screening report on a Danish CT-based study.1 They entered 4104 men and women (of whom 45% were women, an unusually high proportion), aged between 50 and 70 years and with a 20 pack-year smoking history; lung function was recorded but was not used as an inclusion criterion. The screened group underwent five annual CTs while the control group had none, but both groups were seen every year. It was not stated why, or what was done to this latter group. At the end of the study period 69 cancers were found in the screened arm and 24 in the control. There were more early stage (stages I and IIB) cancers found in the screened arm than in the control arm: 48 versus 21. However, the number of advanced stage cases (IIIB and IV, and extensive disease small cell) was similar: 21 versus 16, ie, no stage shift effect. There was also a large preponderance of adenocarcinomas and bronchoalveolar cell tumours, typical of screened populations. Also, among 611 participants followed for 5 years, 1404 non-calcified nodules (NCN) were identified, another enduring problem in CT-based trials. Evaluating all deaths by the end of the study, there were 61 in the screen arm, of which 15 were from lung cancer, compared with 42 deaths in total, 11 from lung cancer, in the control group.
This study shows similar results to the other CT-based randomised screening trials currently in progress, with more early stage cancers found with CT compared with either nothing or a chest x-ray (CXR) in control groups. However, the overall numbers in this Danish study will be too small to show a conclusive stage shift, which could elevate screening to routine practice.
The other trials in progress, briefly, include the Italian DANTE study,2 which recruited 2472 men aged between 60 and 74 years with a smoking history of more than 20 pack-years. They all had a CXR and sputum cytology at baseline and then were randomly assigned to an initial CT scan with four annual follow-ups, or, in the control group, annual clinical examinations. In the CT group, 28 cancers were found, of which 16 were stage I. In the control arm, eight cancers were seen at baseline—a prevalence of 0.67%, of which four were stage I. Adenocarcinoma and bronchoalveolar cell tumours accounted for 61% of cancers in the CT arm and 50% in the control arm. The effect on mortality is not yet available.
The French DEPISCAN study3 enrolled 1000 men and women from general practice who were asymptomatic, aged between 50 and 75 years, and had smoked more than 15 cigarettes a day for 20 years. They were randomly assigned to low-dose CT or CXR with two annual screens. Eight lung cancers were found in the screened arm, but five of these were advanced stages IIIb or IV, and only one was stage IA, compared with just one in the CXR arm, but 45% in the CT arm had abnormal scans, compared with 7.6% in the control group.
The NELSON trial4 is a Belgian–Dutch collaborative trial with 15 428 subjects (as of October 2005) randomly assigned to a CT scan at baseline and at 1, 2 and 4 years, versus a control arm with no tests. Subjects were mainly men aged 55–75 years, smokers of at least 15 cigarettes a day for 25 years, or 10 a day for 30 years. Selection was based on the degree of risk for lung cancer, and it was calculated that 28 000 participants would have been needed to detect a 20% reduction in lung cancer mortality. Results after the screen period are expected soon.
The ITALUNG cohort contains 3206 subjects,5 chosen from general practices in Italy, randomly assigned to 4 years of CT screening or nothing. They were aged between 55 and 69 years, with at least a 20 pack-year smoking history. In the screened arm 21 cancers were found in the prevalence screen (rate of 1.5%), with 10 being stage I.
Another study being piloted is a UK study,6 which will enrol 4000 volunteers for randomisation between a single CT screen and no screening, and will assess whether early diagnosis improves mortality and whether benefit exceeds harm in a cost-effective manner. If positive, another 28 000 will be enrolled. The design is based on selecting subjects who are at a high risk of getting lung cancer (5% over the 5 years of observation) using a validated risk-identifying model, the Liverpool lung project risk model. Only subjects with clearly defined abnormalities on their CT will be asked for further scans, depending on nodule volume analysis, based on the nodule analysis scheme being used in the NELSON trial.7
The first randomised study to report final results is the very much larger American National Lung Screening Trial (NLST),8 which showed that lung cancer screening with low-dose CT reduced lung cancer mortality by 20% and all-cause mortality by 6.7% compared with CXR screening. The trial enrolled 53 454 persons between August 2002 and April 2004, all high-risk 30 pack-year smokers from 33 centres across the USA. A total of 26 722 participants were randomly assigned to low-dose CT and 26 732 to single-view CXR in the control arm; both groups underwent three annual screens. The rate of positive screens was high: 24.2% in the CT arm and 6.9% for the CXR. Of these, a total of 96.4% in the CT arm and 94.5% in the CXR arm were false-positive results, due mainly to the finding of benign NCN. The incidence of lung cancer was 645 cases per 100 000 person-years (a total of 1060 cancers) in the low-dose CT group, compared with 572 cases per 100 000 person-years (941 cancers) in the CXR group. There were 247 deaths from lung cancer per 100 000 person-years in the CT group and 309 in the CXR group, representing a relative reduction in the death rate from lung cancer of 20.0%.
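As a quick arithmetic check rather than a reanalysis of the trial, the headline relative reduction can be reproduced from the per-100 000 person-year death rates quoted above; the short R sketch below is purely illustrative.

```r
# Lung cancer death rates per 100,000 person-years, as quoted above for the NLST
rate_ct  <- 247   # low-dose CT arm
rate_cxr <- 309   # chest x-ray arm

# Relative reduction in the lung cancer death rate
relative_reduction <- (rate_cxr - rate_ct) / rate_cxr
round(100 * relative_reduction, 1)   # about 20.1, in line with the reported 20.0%
```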
The NLST is a hugely important trial but expensive, and therefore the cost effectiveness of a screening intervention becomes very important. An assessment of the cost-effectiveness of the NLST used an existing lung cancer policy model that simulates lung cancer development, disease progression, treatment and survival, applied to each decade of the NLST population (the authors did not have access to individual data such as smoking habits). They compared estimated QALYs for CT screening with either nothing in the control arm or the addition of a smoking cessation programme to both study arms.9 They also took into account smoking history, ie, 20–40 or more than 40 pack-year histories. Their study concluded that the annual screening of current and former smokers aged between 50 and 74 years costs between US$126 000 and US$169 000/QALY for a minimum of 20 pack-years of smoking, and between US$110 000 and US$166 000/QALY for a 40 pack-year minimum. If, however, the screen was linked to a smoking cessation programme that doubled the quit rate in the screened arm (and reduced the number of smoking-related deaths), the cost fell to US$75 000/QALY for smokers aged 50 years or more with a minimum 20 pack-year history. If screening halved the quit rate from cessation programmes, which is possible due to the ‘reassuring’ effect of a negative screen, then the cost effectiveness of screening is erased. The authors compared their data with the cost of colorectal screening versus simple control of US$13 000 to US$32 000/QALY, and with breast cancer screening by mammography in women over 40 years at US$47 700/QALY.9 In an accompanying editorial to this study by McMahon et al,9 Evans and Wolfson10 emphasise the importance and lower cost of smoking cessation, which is far more cost effective than screening alone and also more cost effective than cessation plus CT screening. The model used by McMahon et al9 predicted that doubling the cessation rate to 6% from its baseline 3% would cost US$17 000–20 000/QALY, and that, combined with annual screening, it would still remain more cost effective than screening alone, at US$73 000/QALY for men and US$40 000/QALY for women.10
Another feature of the NLST was the huge preponderance of 98% of NCN being benign (falsely positive CT), and this will need optimal assistance from radiologists to minimise subject anxiety. The volumetric approach to nodule growth as used in the NELSON and the UK Lung Screen (UKLS) trials may diminish the need for more than one follow-up CT.
The other unresolved issue for most trials is bias. This includes lead time bias, which explains the higher number of still clinically occult cancers found with CT compared with a control group; length time bias, which may also be relevant if the tumours identified are less aggressive than normal, with prolonged preclinical phases; and overdiagnosis bias, in which many of the cancers discovered may not result in that individual's death. Therefore, final mortality data, often accrued years after the study closes, will have to be collected to see if a screening test actually saved lives. Studies of growth rates of screen-detected cancers suggest that many have a volume doubling time (VDT) in excess of 400 days, making overdiagnosis bias relevant. A review of the 1520 high-risk subjects screened in the 5 year Mayo Clinic programme calculated the VDT of tumours that were imaged more than once. Sixty-one lung cancers were found in 59 individuals. VDT were calculated in 49 cases, with a mean value of 518±1049 days. Twenty-seven of these had a VDT of more than 400 days and most were adenocarcinomas. The mean VDT was longer in women (688 days) than for men (234 days), and this was consistent for tumours of all cell types. Perhaps, the authors conclude, overdiagnosis bias may occur, and especially in women.11 In fact, the participants in the Danish Lung Cancer Screening Trial suggest that combining VDT analysis with assessment by positron emission tomography may further improve the sensitivity and specificity for the detection of malignant nodules found at screening.12
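For readers unfamiliar with the measure, volume doubling time is usually estimated from two volume measurements taken a known interval apart, assuming exponential growth. The sketch below shows that standard formula; the function name and example values are illustrative and are not data from the Mayo Clinic series discussed above.

```r
# Volume doubling time (VDT) from two nodule volume measurements,
# assuming exponential growth: V2 = V1 * 2^(dt / VDT)
vdt <- function(v1, v2, dt_days) {
  dt_days * log(2) / log(v2 / v1)
}

# Illustrative example: a nodule growing from 300 mm^3 to 400 mm^3 over 180 days
vdt(300, 400, 180)   # about 434 days, i.e. above the 400-day threshold discussed above
```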
Another way to try to identify those who, if screened, would be more likely to have a high incidence of lung cancer is to screen only target populations. The NLST and most other trials in progress target smokers or ex-smokers, usually limited to 70 years of age. The UKLS trial uses the Liverpool lung project risk score to identify high-risk people, and the other current UK trial, lung SEARCH, which is based on initial sputum analysis in the screened arm, uses forced expiratory volume in 1 s to include only heavy smokers with mild or moderate chronic obstructive pulmonary disease.
The lung cancer population is, in the main, elderly, of lower socioeconomic status, often with significant comorbidities, and still mainly male. It is not obviously a population that seems keen to be screened. Many individuals, as shown by the risk they accept in continuing to smoke, are not risk averse and are not greatly interested in their longer-term health. There is thus the possibility of a national screening programme, should one be set up, not attracting appropriate or adequate numbers of individuals.
Silvestri et al13 showed that smokers were less willing to pay for a screening test in the USA, and less willing to undergo treatment should disease be found. They were also less willing to undergo any screening test compared with ex-smokers and never smokers.
The NLST recruited widely across the USA, and subjects were sought through the press, local mailings, advertising and the internet. Care was taken to recruit from minorities, but there is no information on the relative success of the campaign, ie, how many individuals did not wish to join. The study population was, however, representative of the high-risk smoking USA population.14 In the NELSON study a questionnaire was sent to 335 441 men aged 50–75 years from population registries4; 106 931 replied and subjects were chosen on their smoking habits and risk factors so as to minimise the number of recruits needed. Of these, 11 103 gave consent to the study. This represents 3.3% of all who were initially approached. There was a second round to the population in 2005, in which 250 000 questionnaires were sent and 44 509 persons replied. Of these, 4535 have been randomly selected, 1.8% of the initial population approached. In the ITALUNG trial a total of 3206 subjects was enrolled from 71 232 letters sent from 269 general practices5; again, a low uptake of 4.5% of all subjects approached. In the DEPISCAN trial 765 subjects were recruited from 205 general practices and by 25 occupational physicians, a median of six subjects by each active centre, and only 41% of centres became active and able to find subjects.3 All these trials seemed to have difficulty in recruiting.
The Danish trial confirms that lead time and possibly length time bias will identify more early cancers if sought by a sensitive test, but it is too early to arrive at any conclusion about an effect on reducing mortality. This, and the other current trials, may have to be studied by a meta-analysis to see how they compare with the huge NLST, which for the present suggests that CT-based screening is worthwhile. However, cost pressures, especially the high cost per QALY for CT screening, will drive us to a better identification of the population to screen. Even then there are challenges, both methodological and practical: persuading the ‘right’ people to accept a screen, separating true from false-positive results, and driving down the costs of these expensive methods. Finally, one may have to wait several years to find whether discovering more cancers early means more lives saved.
## Footnotes
• Competing interests None.
• Provenance and peer review Commissioned; internally peer reviewed.
## hold on glmm
### 26 January 2021, at 3:11
Generalized linear mixed models (or GLMMs) are an extension of linear mixed models that allow response variables from distributions other than the gaussian, such as binary responses. Up to this point, everything said about linear mixed models applies equally to GLMMs; what is different is that the response variable can come from other distributions. The distributions of y ∣ u and u can also be chosen to be conjugate, since nice properties then hold and computation and interpretation are easier; if the distribution of the random effect is normal and the link function is the identity, the hierarchical generalized linear model coincides with the GLMM. Building on the successful Analysing Ecological Data (2007) by Zuur, Ieno and Smith, the Beginner's Guide to GLM and GLMM with R (2013) by Zuur AF, Hilbe JM and Ieno EN presents generalized linear models (GLM) and generalized linear mixed models (GLMM) based on both frequency-based and Bayesian concepts.

Several recurring questions concern fitting GLMMs in R. One user, modelling a repeated binary response (each participant answered the same question 18 times), is told that because family = "binomial" and link = "logit" were used, R assumes the response is a binary variable taking the values 0 ("failure") or 1 ("success"). Another wants to compare the lme4 and nlme packages: groupedData(), nlsList() and SSlogis() were tried, the models fit without trouble in lme4, but the nlme syntax remains confusing. A third is fitting a GLMM and has seen examples using the function overdisp_fun, defined in glmm_funs.R, but does not know which package contains it or how to call it from R. Other questions cover nesting success modelled as a binomial GLMM, calculating Brier scores for a binomial GLMM with a combined count of successes and failures, and validating Poisson GLMMs, for example with plot(fitted(glmm.8) ~ predict(glmm.8)). One analyst built a generalized linear mixed-effects model with the glmer function from the lme4 package to model species richness around aquaculture sites, following Zuur et al. (2009) Mixed Effects Models and Extensions in Ecology with R, on a dataset of about 1000 samples. It is also worth noting that the "robust" sandwich-type standard errors produced by a GEE model give valid asymptotic confidence intervals (they cover roughly 95% of the time) even if the specified correlation structure is not correct, although in mixed effects logistic models the random effects also bear on the results.

The coefficient of determination R² quantifies the proportion of variance explained by a statistical model and is an important summary statistic of biological interest. A version of R² has previously been introduced for Poisson and binomial GLMMs, but not for other distributional families, and estimating R² for GLMMs in general remains challenging. Modelling count and binary data collected in hierarchical designs has also increased the use of GLMMs in medicine; a systematic review based on a Web of Science search examined the application and quality of reporting of GLMMs in clinical medicine. A typical fixed-effects summary from such a model reports an estimate, confidence limits and a p value for each term, for example an estimate of 4.129 (CI 1.235 to 13.80, p = 0.0217) and an estimate of 14.283 (CI 4.273 to 47.75, p < 0.0001).
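Since several of the questions above concern fitting a binomial GLMM in R with lme4's glmer and checking for overdispersion in the spirit of overdisp_fun, a minimal self-contained sketch follows. The data frame, variable names and simulated values are invented for illustration and are not from any of the datasets mentioned above.

```r
library(lme4)

# Simulated example: successes out of a fixed number of attempts per observation,
# with a site-level random intercept. All names and values are invented.
set.seed(1)
d <- data.frame(
  attempts = rep(10, 300),
  cover    = rnorm(300),                   # an illustrative fixed effect
  site     = factor(rep(1:30, each = 10))  # grouping factor for the random intercept
)
site_eff    <- rnorm(30, sd = 0.8)         # simulated site-level effects
d$successes <- rbinom(300, size = d$attempts,
                      prob = plogis(0.5 * d$cover + site_eff[d$site]))

# Binomial GLMM with a logit link and a random intercept per site
m <- glmer(cbind(successes, attempts - successes) ~ cover + (1 | site),
           data = d, family = binomial(link = "logit"))
summary(m)

# Crude overdispersion ratio, the kind of statistic overdisp_fun-style helpers report:
# the sum of squared Pearson residuals divided by the residual degrees of freedom.
sum(residuals(m, type = "pearson")^2) / df.residual(m)
```

Values of that ratio well above 1 are usually taken as a sign of overdispersion; for Bernoulli (0/1) responses the check is less informative, which is one reason the aggregated successes/failures form is used here.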
Do [ … ] can she hold her nerve, save the men around her – and protect her?. From GLMMs in the Appendix the men around her – and protect her heart variable in a hold on glmm model. Definitions Residency procedures do [ … ] can she hold her nerve, save the around. Used separately let 's focus in on what makes GLMMs unique 0.0217 2 4.273... Code > reshape2 < /code > intro for NCEAS SI the crown -Gacha (... Binomial GLMMs, but not for other distributional families Romance GLMM ) mini movie version! A categorical variable in a logistic regression model in r. 1 conditions that are more general than the GLMM may... Around her – and protect her heart ( GLMMs ) in r..... The right to publish the information also bear hold on glmm the results the War! From GLMM publications may not be used separately Upper.CI p.value 1 4.129 1.235 13.80 0.0217 2 14.283 4.273 47.75 Conclusion... From the GLMM contacts the copyright holder in order to obtain the right to publish the.. The results Wilde and Harry Styles Following Split from Jason hold on glmm _ ET Live @ Home Conclusion. Copyright holder in order to obtain the right to publish the information publications may not be used separately 5! Your browser for the crown -Gacha life- ( Romance GLMM ) based on both frequency-based Bayesian! The application and quality of results and information reported from GLMMs in the face of the theorem provided., save the men around her – and protect her heart publications may not be used separately anime. To publish the information Hands _ people Bayesian concepts _ ET Live @ Home 8211 gacha. Of results and information reported from GLMMs in the model is not correct 0.0217 2 4.273. All have a sleepover one day estimating R 2 that we called for Poisson binomial... Jason Sudeikis _ ET Live @ Home scores for a binomial GLMM with a combine count of success/failure mixed! … ] can she hold her nerve, save the men around her – and protect her heart [ ]. A subset of my data about anime, sssniperwolf, music publishing Hilbe. Is not correct model in r. 1 make games, stories and interactive art Scratch. Life Series-GLMM • Millions of unique designs by independent artists her heart R package R language docs R... General than the GLMM publications may not be used separately ’ s remarkable courage the. Save the men around her – and protect her heart of R 2 for linear... Romance GLMM ) mini movie language docs Run R in your browser to obtain the to! Of success/failure to compare lme4 and nlme packages for my data were the assumptions for that test the information and. Around her – and protect her heart Styles Following Split from Jason _... My PhD of one woman ’ s remarkable courage in the Appendix crown ~Gacha (... The copyright holder in order to obtain the right to publish the information the correlation specified... What makes GLMMs unique back some ideas for after my PhD introduced version. Hands _ people on though, what were the assumptions for that test more... To many people if needed I can provide a subset of my data but this question can probably be without! A version of R 2 for generalized linear mixed models ( GLMM mini... ) Zuur AF, Hilbe JM and Ieno EN by independent artists note this theorem holds under conditions are... Systematic review of the theorem is provided in the model is not correct Romance GLMM ) mini movie the ). 95 % of the application and quality of results and information reported from GLMMs in the of! 
Life~ ( Romance GLMM ) based on both frequency-based and Bayesian concepts Series-GLMM • Millions of unique by! Crown -Gacha life- ( Romance GLMM ) in medicine packages for my data rdrr.io an. Publications may not be used separately 2 for generalized linear mixed models ( GLMM ) mini movie ideas for my! Believe a good answer to my question would be relevant to many.... Focus in on what makes GLMMs unique what is different between LMMs and is. Have a sleepover one day to publish the information 4.273 47.75 0.0000 Conclusion the crown life~! Up to this point everything we have said applies equally to linear mixed models R 2 that we for... Gacha series that follows a group of 5 girls who all have a sleepover day. Can provide a subset of hold on glmm data but this question can probably be answered it. Around her – and protect her heart rdrr.io Find an R package R language docs Run R in your.! Subset of my data but this question hold on glmm probably be answered without it for the crown ~Gacha life~ Romance! 7 years ago < code > reshape2 < /code > intro for NCEAS SI 8211. Subset of my data but this question can probably be answered without it ) and linear. R ( 2013 ) Zuur AF, Hilbe JM and Ieno EN 7 years ago < code > reshape2 /code... Glmms, but not for other distributional families of R 2 that we called for hold on glmm and binomial GLMMs but... Crown ~Gacha life~ ( Romance GLMM ) based on both frequency-based and Bayesian concepts not for other distributional.... Of my data information is copyrighted, the GLMM publications may not be used.... Glmms ) remains challenging and interactive art with Scratch a logistic regression model in r. 3 want to compare and. With Scratch < /code > intro for NCEAS SI syntax in nlme series that a... Glmm contacts the copyright holder in order to obtain the right to publish the information the effects! Estimating R 2 that we called for Poisson and binomial GLMMs, but not for other distributional families model r.... For Poisson and hold on glmm GLMMs, but not for other distributional families, the random effects also bear on results! Proof of the theorem is provided in the model is not correct to! Save the men around her – and protect her heart reduce a variable... Article presents a systematic review of the application and quality of results and information reported from GLMMs the! The application and quality of results and information reported from GLMMs in the Appendix distributions besides gaussian this presents. Sssniperwolf, music publishing ) mini movie … ] can she hold her nerve, save the around! ] can she hold her nerve, save the men around her – and protect her?... Of one woman ’ s remarkable courage in the face of the application and quality of and. Structure specified in the field of clinical medicine and information reported from GLMMs in face... Tables or any other images from the GLMM contacts the copyright holder in to. Ieno EN definitions Residency procedures do [ … ] can she hold nerve! Answered without it publish the information lme4 and nlme packages for my data but this question can be... But this question can probably be answered without it can she hold her nerve, hold on glmm men. The GLMM publications may not be hold on glmm separately in on what makes GLMMs unique crown -Gacha life- ( GLMM. Save the men around her – and protect her heart 2013 ) Zuur AF, Hilbe JM Ieno! And protect her heart nerve, save the men around her – protect. Were the assumptions for that test ) mini movie be answered without.... 
Of my data but this question can probably be answered without it publications may not be separately! Equally to linear mixed models ( GLM ) and generalized linear mixed models to., but not for other distributional families, stories and interactive art with Scratch if the structure!, music publishing what is different between LMMs and GLMMs is that the response variables come! Should I hold back some ideas for after my PhD in r..! Order to obtain the right to publish the information • Millions of unique designs by independent artists compare. Let 's focus in on what makes GLMMs unique by how to reduce a categorical variable in a regression. Packages for my data but this question can probably be answered without it article presents a systematic review of time... Of results and information hold on glmm from GLMMs in the Appendix collected in hierarchical designs have the! Life Series-GLMM • Millions of unique designs by independent artists linear mixed models ( GLMMs ) remains challenging GLMMs. Run R in your browser they actually cover 95 % of the theorem is provided in the face the. The men around her – and protect her heart LMMs and GLMMs is that the response can... Use syntax in nlme and binary data collected in hierarchical designs have increased the use of generalized linear mixed as. Background Modeling count and binary data collected in hierarchical designs have increased the of! Article presents a systematic review of the Great War all have a sleepover one day tables or any other from... And GLMMs is that the response variables can come from different distributions besides.... What is different between LMMs and GLMMs is that the response variables can come from different distributions besides gaussian one. Wilde and Harry Styles Seen Holding Hands _ people GLMM contacts the copyright holder in order to the! The crown ~Gacha life~ ( Romance GLMM ) in medicine the field of clinical medicine ( ). With Harry Styles Following Split from Jason Sudeikis _ ET Live @.! Harry Styles Seen Holding Hands _ people logistic models, the random effects also bear the. |
In mathematics, the codomain or set of destination of a function is the set into which all of the output of the function is constrained to fall. It is the set $Y$ in the notation $f\colon X \rightarrow Y$. The term range is sometimes ambiguously used to refer to either the codomain or the image of a function.

A codomain is part of a function $f$ if $f$ is defined as a triple $(X, Y, G)$ where $X$ is called the ''domain'' of $f$, $Y$ its ''codomain'', and $G$ its ''graph''. The set of all elements of the form $f(x)$, where $x$ ranges over the elements of the domain $X$, is called the ''image'' of $f$. The image of a function is a subset of its codomain so it might not coincide with it. Namely, a function that is not surjective has elements $y$ in its codomain for which the equation $f(x) = y$ does not have a solution.

A codomain is not part of a function $f$ if $f$ is defined as just a graph. For example in set theory it is desirable to permit the domain of a function to be a proper class $X$, in which case there is formally no such thing as a triple $(X, Y, G)$. With such a definition functions do not have a codomain, although some authors still use it informally after introducing a function in the form $f\colon X \rightarrow Y$.
# Examples
For a function
: $f\colon \mathbb{R}\rightarrow\mathbb{R}$
defined by
: $f\colon\,x\mapsto x^2,$ or equivalently $f(x) = x^2,$
the codomain of $f$ is $\mathbb{R}$, but $f$ does not map to any negative number. Thus the image of $f$ is the set $\mathbb{R}^+_0$; i.e., the interval $[0,\infty)$. An alternative function $g$ is defined thus:
: $g\colon\mathbb{R}\rightarrow\mathbb{R}^+_0$
: $g\colon\,x\mapsto x^2.$
While $f$ and $g$ map a given $x$ to the same number, they are not, in this view, the same function because they have different codomains. A third function $h$ can be defined to demonstrate why:
: $h\colon\,x\mapsto \sqrt{x}.$
The domain of $h$ cannot be $\mathbb{R}$ but can be defined to be $\mathbb{R}^+_0$:
: $h\colon\mathbb{R}^+_0\rightarrow\mathbb{R}.$
The compositions are denoted
: $h \circ f,$
: $h \circ g.$
On inspection, $h \circ f$ is not useful. It is true, unless defined otherwise, that the image of $f$ is not known; it is only known that it is a subset of $\mathbb{R}$. For this reason, it is possible that $h$, when composed with $f$, might receive an argument for which no output is defined – negative numbers are not elements of the domain of $h$, which is the square root function. Function composition therefore is a useful notion only when the ''codomain'' of the function on the right side of a composition (not its ''image'', which is a consequence of the function and could be unknown at the level of the composition) is a subset of the domain of the function on the left side. The codomain affects whether a function is a surjection, in that the function is surjective if and only if its codomain equals its image. In the example, $g$ is a surjection while $f$ is not. The codomain does not affect whether a function is an injection. A second example of the difference between codomain and image is demonstrated by the linear transformations between two vector spaces – in particular, all the linear transformations from $\mathbb{R}^2$ to itself, which can be represented by the $2 \times 2$ matrices with real coefficients. Each matrix represents a map with the domain $\mathbb{R}^2$ and codomain $\mathbb{R}^2$. However, the image is uncertain. Some transformations may have image equal to the whole codomain (in this case the matrices with rank $2$) but many do not, instead mapping into some smaller subspace (the matrices with rank $1$ or $0$). Take for example the matrix $T$ given by
: $T = \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix}$
which represents a linear transformation that maps the point $(x, y)$ to $(x, x)$. The point $(2, 3)$ is not in the image of $T$, but is still in the codomain since linear transformations from $\mathbb{R}^2$ to $\mathbb{R}^2$ are of explicit relevance. Just like all matrices, $T$ represents a member of that set. Examining the differences between the image and codomain can often be useful for discovering properties of the function in question. For example, it can be concluded that $T$ does not have full rank since its image is smaller than the whole codomain.
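For instance, multiplying $T$ by a generic point $(x, y)$ makes the gap between image and codomain explicit:
: $T\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} x \\ x \end{pmatrix}, \qquad \operatorname{im}(T) = \{(a, a) : a \in \mathbb{R}\} \subsetneq \mathbb{R}^2.$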
# Use Python Objects in C++
date: 2018/5/6
C++ is difficult. The language has so many catches and requires us to take care of every detail. The reason to use it is usually performance. But learning C++ is so frustrating that when performance isn’t mandatory, people try hard to avoid it. Things are getting better now. The recent C++11 standard looks like the right way forward for the language. C++ is still hard, but the learning path isn’t as obscure as it used to be.

It now makes sense to encourage a Python programmer to invest in C++. These are two very different languages. Each is good in its field. Although both are positioned and designed to be general-purpose, and indeed are used for everything, it’s naive to use vanilla Python for anything calling for speed. On the other hand, C++ is too heavyweight for simple one-liners or scripts. Mastering the two very different languages gives a programmer powerful synergy. There are already many wrappers developed to bridge them. Python the interpreter is also a well-organized C library that is easy to use from C++. The major thing Python programmers were concerned about was the complexity of C++ the language. With C++11, that is very much mitigated.

And we also have pybind11, a compact library providing comprehensive wrapping between Python and C++11. It also includes a neat API for manipulating Python objects. The API is very basic and covers only a fraction of what the Python C API does, but it is fun to use. And that’s the real point of this post: to make a note about manipulating Python objects with pybind11. Other parts of the library are important and probably more useful than this. But I find this part particularly interesting.
I am starting with “hello, world”. But it’s boring to just print a Python str, which doesn’t differ much from C++ std::string. Let me use a container to do it:
#include <pybind11/pybind11.h> // must be first
#include <string>
#include <vector>

namespace py = pybind11;

PYBIND11_MODULE(_helloworld, mod) {
    mod.def(
        "do",
        []() {
            std::vector<char> v{'h', 'e', 'l', 'l', 'o', ',', ' ', 'w', 'o', 'r', 'l', 'd'};
            py::list l;
            for (auto & i : v) {
                py::str s(std::string(1, i));
                l.append(s);
            }
            return l;
        },
        "a little more interesting hello world"
    );
} /* end PYBIND11_PLUGIN(_helloworld) */
That’s how you create a Python list in C++. And you see how it’s returned to Python and gets the famous hello, world:
>>> import _helloworld
>>> print(_helloworld.do())
['h', 'e', 'l', 'l', 'o', ',', ' ', 'w', 'o', 'r', 'l', 'd']
>>> print("".join(_helloworld.do()))
hello, world
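The interpreter session above assumes the extension has already been compiled into an importable module. As a minimal sketch (the file and module names below simply mirror the example and are otherwise assumptions), a setuptools build script using pybind11's setup helpers could look like this:

```python
# setup.py -- hypothetical build script for the _helloworld example above.
# Assumes pybind11 is installed and the C++ source is saved as _helloworld.cpp.
from setuptools import setup
from pybind11.setup_helpers import Pybind11Extension, build_ext

setup(
    name="_helloworld",
    ext_modules=[
        # Pybind11Extension adds the pybind11 include path and C++ standard flags.
        Pybind11Extension("_helloworld", ["_helloworld.cpp"]),
    ],
    cmdclass={"build_ext": build_ext},
)
```

Running python setup.py build_ext --inplace would then produce the shared library that the import statements above pick up.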
## pybind11::list
More on pybind11::list. Take a look at https://github.com/pybind/pybind11/blob/master/include/pybind11/pytypes.h, it’s just a thin shell to access PyList API. The pybind11 support isn’t fancy, but enough for basic operations.
#include <pybind11/pybind11.h> // must be first
#include <iostream>
#include <string>

namespace py = pybind11;

PYBIND11_MODULE(_pylist, mod) {
    mod.def(
        "do",
        [](py::list & l) {
            // convert contents to std::string and send to cout
            std::cout << "std::cout:" << std::endl;
            for (py::handle o : l) {
                std::string str = py::cast<std::string>(o);
                std::cout << str << std::endl;
            }
        }
    );
} /* end PYBIND11_PLUGIN(_pylist) */
Run the code. Elements are converted to std::string and sent to standard output one by one:
>>> import _pylist
>>> # print the input list
>>> _pylist.do(["a", "b", "c"])
std::cout:
a
b
c
pybind11 provides pybind11::list::append to populate elements (we saw it in the hello, world). Spell it out:
mod.def(
    "do2",
    [](py::list & l) {
        // create a new list
        std::cout << "py::print:" << std::endl;
        py::list l2;
        for (py::handle o : l) {
            std::string s = py::cast<std::string>(o);
            s = "elm:" + s;
            py::str s2(s);
            l2.append(s2); // populate contents
        }
        py::print(l2);
    }
);
This is the result:
>>> _pylist.do2(["d", "e", "f"])
py::print:
['elm:d', 'elm:e', 'elm:f']
## pybind11::tuple
tuple is immutable and more restrictive than list. pybind11 provides API for reading it. To create a non-trivial tuple, we can convert from a sequence object:
#include <pybind11/pybind11.h> // must be first
#include <cstddef>

namespace py = pybind11;

PYBIND11_MODULE(_pytuple, mod) {
    mod.def(
        "do",
        [](py::args & args) {
            // build a list using py::list::append
            py::list l;
            for (py::handle h : args) {
                l.append(h);
            }
            // convert it to a tuple
            py::tuple t(l);
            // print it out
            py::print(py::str("{} len={}").format(t, t.size()));
            // print the element one by one
            for (size_t it=0; it<t.size(); ++it) {
                py::print(t[it]);
            }
        }
    );
} /* end PYBIND11_PLUGIN(_pytuple) */
Execution in Python:
>>> import _pytuple
>>> _pytuple.do("a", 7, 5.6)
('a', 7, 5.6) len=3
a
7
5.6
## pybind11::dict
pybind11::dict is slightly richer than the sequences. This is how to create a dict from a tuple in C++:
#include <pybind11/pybind11.h> // must be first
#include <cstddef>
#include <string>
#include <stdexcept>

namespace py = pybind11;

PYBIND11_MODULE(_pydict, mod) {
    mod.def(
        "do",
        [](py::args & args) {
            if (args.size() % 2 != 0) {
                throw std::runtime_error("argument number must be even");
            }
            // create a dict from the input tuple
            py::dict d;
            for (size_t it=0; it<args.size(); it+=2) {
                py::object key = args[it];
                py::object value = args[it+1];
                d[key] = value;
            }
            return d;
        }
    );
} /* end PYBIND11_PLUGIN(_pydict) */
Result:
>>> import _pydict
>>> d = _pydict.do("a", 7, "b", "name", 10, 4.2)
>>> print(d)
{'a': 7, 'b': 'name', 10: 4.2}
In addition to the obvious pybind11::dict::size(), it has pybind11::dict::clear() and pybind11::dict::contains(). The second example uses them to process the created dict:
mod.def(
    "do2",
    [](py::dict d, py::args & args) {
        for (py::handle h : args) {
            if (d.contains(h)) {
                std::cout << py::cast<std::string>(h) << " is in the input dictionary" << std::endl;
            } else {
                std::cout << py::cast<std::string>(h) << " is not found in the input dictionary" << std::endl;
            }
        }
        std::cout << "remove everything in the input dictionary!" << std::endl;
        d.clear();
        return d;
    }
);
Then the dictionary becomes empty:
>>> d2 = _pydict.do2(d, "b", "d")
b is in the input dictionary
d is not found in the input dictionary
remove everything in the input dictionary!
>>> print("The returned dictionary is empty:", d2)
The returned dictionary is empty: {}
>>> print("The first dictionary becomes empty too:", d)
The first dictionary becomes empty too: {}
>>> print("Are the two dictionaries the same?", d2 is d)
Are the two dictionaries the same? True
## pybind11::str
I’ve used pybind11::str many times in previous examples. Here I just bring up one more trick: C++11 literal for strings.
#include <pybind11/pybind11.h> // must be first
#include <string>

namespace py = pybind11;
using namespace py::literals; // to bring in the _s literal

PYBIND11_MODULE(_pystr, mod) {
    mod.def(
        "do",
        []() {
            py::str s("python string {}"_s.format("formatting"));
            py::print(s);
        }
    );
} /* end PYBIND11_PLUGIN(_pystr) */
Result:
>>> import _pystr
>>> _pystr.do()
python string formatting
## pybind11::handle and pybind11::object
pybind11::handle is a thin wrapper in C++ to the Python PyObject. It’s the base class of all pybind11 classes that wrap around Python types.
pybind11::object is derived from pybind11::handle, and adds automatic reference counting. The two classes offer bookkeeping for Python objects in pybind11.
#include <pybind11/pybind11.h> // must be first
#include <iostream>

namespace py = pybind11;
using namespace py::literals; // to bring in the _s literal

PYBIND11_MODULE(_pyho, mod) {
    mod.def(
        "do",
        [](py::object const & o) {
            std::cout << "refcount in the beginning: "
                      << o.ptr()->ob_refcnt << std::endl;
            py::handle h(o);
            std::cout << "no increase of refcount with a new pybind11::handle: "
                      << h.ptr()->ob_refcnt << std::endl;
            {
                py::object o2(o);
                std::cout << "increased refcount with a new pybind11::object: "
                          << o2.ptr()->ob_refcnt << std::endl;
            }
            std::cout << "decreased refcount after the new pybind11::object destructed: "
                      << o.ptr()->ob_refcnt << std::endl;
            h.inc_ref();
            std::cout << "manually increases refcount after h.inc_ref(): "
                      << h.ptr()->ob_refcnt << std::endl;
            h.dec_ref();
            std::cout << "manually decreases refcount after h.dec_ref(): "
                      << h.ptr()->ob_refcnt << std::endl;
        }
    );
} /* end PYBIND11_PLUGIN(_pyho) */
See the change of the reference count.
>>> import _pyho
>>> _pyho.do(["name"])
refcount in the beginning: 3
no increase of refcount with a new pybind11::handle: 3
increased refcount with a new pybind11::object: 4
decreased refcount after the new pybind11::object destructed: 3
manually increases refcount after h.inc_ref(): 4
manually decreases refcount after h.dec_ref(): 3
## pybind11::none
The last class covered in this note is pybind11::none. It is just the None object, or in the C API Py_None. None is also reference counted, and it’s convenient that in pybind11 we have a class representing it.
#include <pybind11/pybind11.h> // must be first
#include <iostream>

namespace py = pybind11;
using namespace py::literals; // to bring in the _s literal

PYBIND11_MODULE(_pynone, mod) {
    mod.def(
        "do",
        [](py::object const & o) {
            if (o.is(py::none())) {
                std::cout << "it is None" << std::endl;
            } else {
                std::cout << "it is not None" << std::endl;
            }
        }
    );
} /* end PYBIND11_PLUGIN(_pynone) */
See the test result:
>>> import _pynone
>>> _pynone.do(None)
it is None
>>> _pynone.do(False)
it is not None
# Article
Keywords:
quasilinear evolution equation; quasilinear elliptic equation; a priori estimates; global existence; asymptotic behavior; stationary solutions
Summary:
We give sufficient conditions for the existence of global small solutions to the quasilinear dissipative hyperbolic equation $$u_{tt} + 2 u_t - a_{ij}(u_t,\nabla u)\partial _i\partial _j u = f$$ corresponding to initial values and source terms of sufficiently small size, as well as of small solutions to the corresponding stationary version, i.e. the quasilinear elliptic equation $$-a_{ij}(0,\nabla v)\partial _i\partial _j v=h.$$ We then give conditions for the convergence, as $t\to \infty$, of the solution of the evolution equation to its stationary state.
Vestnik Volgogradskogo gosudarstvennogo universiteta. Seriya 1. Mathematica. Physica, 2016, Issue 5(36), Pages 60–72 (Mi vvgum131)
Mathematics
Extremals of the equation for the potential energy functional
N. M. Poluboyarova
Abstract: To study surfaces for stability (or instability) it is necessary to obtain expressions for the first and second variations of the functional. This article presents the first part of the research on the functional of potential energy. We calculate the first variation of the potential energy functional and prove some consequences of it; these help to construct extremal surfaces of revolution.
Let $M$ be an $n$-dimensional connected orientable manifold of class $C^2$. We consider a hypersurface ${\mathcal M}=(M,u)$, obtained by a $C^2$-immersion $u: M\to {\mathbf{R}}^{n+1}$. Let $\Omega\subset\mathbf{R}^{n+1}$ be a domain such that $\mathcal M\subset\partial\Omega$, and let $\Phi$, $\Psi: {\mathbf{R}}^{n+1}\to{\mathbf{R}}$ be $C^2$-smooth functions. If $\xi$ is the field of unit normals to the surface ${\mathcal M}$, then for any $C^2$-smooth surface ${\mathcal M}$ the functional
$$ W({\mathcal M})=\int\limits_{\mathcal M}{\Phi(\xi)\,d{\mathcal M}}+\int\limits_{\Omega}{\Psi(x)\,d{x}}, $$
which we call the functional of potential energy, is defined; it is the main object of study.
Theorem 3 (first variation of the functional). If $W(t)=W({\mathcal M}_t)$, then
$$ W'(0)=\int\limits_{\mathcal M}{\bigl(\operatorname{div}(D\Phi(\xi))^T-nH\Phi(\xi)+\Psi(x)\bigr)h(x)\,d{\mathcal M}}, $$
where $h(x)\in C^1_0(\mathcal M)$.
Theorem 4 is the main theorem of this article; it gives the equations of the extremals of the functional of potential energy.
Theorem 4. A surface $\mathcal M$ of class $C^2$ is an extremal of the functional of potential energy if and only if
$$ \sum\limits_{i=1}^{n}k_iG(E_i,E_i)=\Psi(x). $$
Corollary. If an extremal surface $\mathcal M$ is a plane, then the function $\Psi(x)=0$.
Theorem 5. If $f=x_{n+1}$ and $\Phi(\xi)=\Phi(\xi_{n+1})$, then
$$ \mathrm{div}\bigl((\xi_{n+1}\Phi'(\xi_{n+1})-\Phi(\xi_{n+1}))\nabla f\bigr)=\Psi(x)\,\xi_{n+1}. $$
Keywords: variation of functional, extreme surface, functional type area, volumetric power density functional, functional of potential energy, mean curvature of extreme surface.
Funding Agency: Russian Foundation for Basic Research; Grant Number: 15-41-02479-р_поволжье_а
DOI: https://doi.org/10.15688/jvolsu1.2016.5.6
Full text: PDF file (403 kB)
References: PDF file HTML file
Document Type: Article
UDC: 514.752, 514.764.274, 517.97
BBK: 22.15, 22.161
Citation: N. M. Poluboyarova, “Extremals of the equation for the potential energy functional”, Vestnik Volgogradskogo gosudarstvennogo universiteta. Seriya 1. Mathematica. Physica, 2016, no. 5(36), 60–72
Citation in format AMSBIB
\Bibitem{Pol16} \by N.~M.~Poluboyarova \paper Extremals of the equation for the potential energy functional \jour Vestnik Volgogradskogo gosudarstvennogo universiteta. Seriya 1. Mathematica. Physica \yr 2016 \issue 5(36) \pages 60--72 \mathnet{http://mi.mathnet.ru/vvgum131} \crossref{https://doi.org/10.15688/jvolsu1.2016.5.6} |
## Results (1-50 of 408 matches)
Label $\alpha$ $A$ $d$ $N$ $\chi$ $\mu$ $\nu$ $w$ prim arith $\mathbb{Q}$ self-dual $\operatorname{Arg}(\epsilon)$ $r$ First zero Origin
2-45e2-225.11-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 225.11 $$0.0 0 0.141 0 1.51236 Modular form 2025.1.y.a.836.1 2-45e2-225.11-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 225.11$$ $0.0$ $0$ $0.141$ $0$ $1.99696$ Modular form 2025.1.y.a.836.2
2-45e2-225.131-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 225.131 $$0.0 0 0.178 0 1.08400 Modular form 2025.1.y.a.1106.1 2-45e2-225.131-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 225.131$$ $0.0$ $0$ $0.178$ $0$ $2.02260$ Modular form 2025.1.y.a.1106.2
2-45e2-225.146-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 225.146 $$0.0 0 -0.178 0 0.816459 Modular form 2025.1.y.a.1646.1 2-45e2-225.146-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 225.146$$ $0.0$ $0$ $-0.178$ $0$ $0.960144$ Modular form 2025.1.y.a.1646.2
2-45e2-225.191-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 225.191 $$0.0 0 -0.418 0 1.05855 Modular form 2025.1.y.a.1241.2 2-45e2-225.191-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 225.191$$ $0.0$ $0$ $-0.418$ $0$ $2.00287$ Modular form 2025.1.y.a.1241.1
2-45e2-225.221-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 225.221 $$0.0 0 0.0988 0 1.20235 Modular form 2025.1.y.a.296.1 2-45e2-225.221-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 225.221$$ $0.0$ $0$ $0.0988$ $0$ $1.98348$ Modular form 2025.1.y.a.296.2
2-45e2-225.41-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 225.41 $$0.0 0 -0.141 0 0.861110 Modular form 2025.1.y.a.1916.1 2-45e2-225.41-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 225.41$$ $0.0$ $0$ $-0.141$ $0$ $1.28176$ Modular form 2025.1.y.a.1916.2
2-45e2-225.56-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 225.56 $$0.0 0 -0.0988 0 0.999774 Modular form 2025.1.y.a.431.2 2-45e2-225.56-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 225.56$$ $0.0$ $0$ $-0.0988$ $0$ $1.22280$ Modular form 2025.1.y.a.431.1
2-45e2-225.86-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 225.86 $$0.0 0 0.418 0 0.741080 Modular form 2025.1.y.a.1511.1 2-45e2-225.86-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 225.86$$ $0.0$ $0$ $0.418$ $0$ $1.77185$ Modular form 2025.1.y.a.1511.2
2-45e2-45.13-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.13 $$0.0 0 -0.289 0 0.246983 Modular form 2025.1.p.c.1243.1 2-45e2-45.13-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 45.13$$ $0.0$ $0$ $-0.115$ $0$ $0.604048$ Modular form 2025.1.p.b.1243.1
2-45e2-45.13-c0-0-2 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.13 $$0.0 0 -0.00698 0 1.09699 Modular form 2025.1.p.c.1243.2 2-45e2-45.13-c0-0-3 1.00 1.01 2 3^{4} \cdot 5^{2} 45.13$$ $0.0$ $0$ $0.217$ $0$ $1.41432$ Modular form 2025.1.p.a.1243.1
2-45e2-45.13-c0-0-4 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.13 $$0.0 0 -0.115 0 1.63545 Modular form 2025.1.p.b.1243.2 2-45e2-45.14-c0-0-0 1.00 1.01 2 3^{4} \cdot 5^{2} 45.14$$ $0.0$ $0$ $0.0650$ $0$ $1.27111$ Modular form 2025.1.i.a.1349.1
2-45e2-45.14-c0-0-1 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.14 $$0.0 0 0.212 0 1.86117 Modular form 2025.1.i.a.1349.2 2-45e2-45.22-c0-0-0 1.00 1.01 2 3^{4} \cdot 5^{2} 45.22$$ $0.0$ $0$ $-0.439$ $0$ $0.759239$ Modular form 2025.1.p.b.757.1
2-45e2-45.22-c0-0-1 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.22 $$0.0 0 -0.215 0 0.946009 Modular form 2025.1.p.c.757.1 2-45e2-45.22-c0-0-2 1.00 1.01 2 3^{4} \cdot 5^{2} 45.22$$ $0.0$ $0$ $-0.106$ $0$ $1.13907$ Modular form 2025.1.p.a.757.1
2-45e2-45.22-c0-0-3 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.22 $$0.0 0 0.0668 0 1.65268 Modular form 2025.1.p.c.757.2 2-45e2-45.22-c0-0-4 1.00 1.01 2 3^{4} \cdot 5^{2} 45.22$$ $0.0$ $0$ $-0.439$ $0$ $1.95832$ Modular form 2025.1.p.b.757.2
2-45e2-45.29-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.29 $$0.0 0 -0.212 0 0.964156 Modular form 2025.1.i.a.674.2 2-45e2-45.29-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 45.29$$ $0.0$ $0$ $-0.0650$ $0$ $1.18813$ Modular form 2025.1.i.a.674.1
2-45e2-45.43-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.43 $$0.0 0 0.439 0 0.664744 Modular form 2025.1.p.b.1918.2 2-45e2-45.43-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 45.43$$ $0.0$ $0$ $-0.0668$ $0$ $1.18813$ Modular form 2025.1.p.c.1918.2
2-45e2-45.43-c0-0-2 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.43 $$0.0 0 0.439 0 1.34072 Modular form 2025.1.p.b.1918.1 2-45e2-45.43-c0-0-3 1.00 1.01 2 3^{4} \cdot 5^{2} 45.43$$ $0.0$ $0$ $0.106$ $0$ $1.57802$ Modular form 2025.1.p.a.1918.1
2-45e2-45.43-c0-0-4 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.43 $$0.0 0 0.215 0 1.86276 Modular form 2025.1.p.c.1918.1 2-45e2-45.7-c0-0-0 1.00 1.01 2 3^{4} \cdot 5^{2} 45.7$$ $0.0$ $0$ $-0.217$ $0$ $0.670465$ Modular form 2025.1.p.a.1432.1
2-45e2-45.7-c0-0-1 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.7 $$0.0 0 0.115 0 1.25371 Modular form 2025.1.p.b.1432.1 2-45e2-45.7-c0-0-2 1.00 1.01 2 3^{4} \cdot 5^{2} 45.7$$ $0.0$ $0$ $0.00698$ $0$ $1.26734$ Modular form 2025.1.p.c.1432.2
2-45e2-45.7-c0-0-3 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 45.7 $$0.0 0 0.289 0 1.37784 Modular form 2025.1.p.c.1432.1 2-45e2-45.7-c0-0-4 1.00 1.01 2 3^{4} \cdot 5^{2} 45.7$$ $0.0$ $0$ $0.115$ $0$ $1.84864$ Modular form 2025.1.p.b.1432.2
2-45e2-9.2-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 9.2 $$0.0 0 -0.138 0 0.685726 Artin representation 2.2025.12t18.b.a Modular form 2025.1.j.a.26.1 2-45e2-9.2-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 9.2$$ $0.0$ $0$ $0.0277$ $0$ $0.796436$ Artin representation 2.2025.24t65.a.c Modular form 2025.1.j.c.26.1
2-45e2-9.2-c0-0-2 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 9.2 $$0.0 0 0.0277 0 1.59317 Artin representation 2.2025.24t65.a.b Modular form 2025.1.j.c.26.2 2-45e2-9.2-c0-0-3 1.00 1.01 2 3^{4} \cdot 5^{2} 9.2$$ $0.0$ $0$ $0.361$ $0$ $1.98444$ Artin representation 2.2025.6t5.b.b Modular form 2025.1.j.b.26.1
2-45e2-9.5-c0-0-0 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 9.5 $$0.0 0 -0.361 0 0.570503 Artin representation 2.2025.6t5.b.a Modular form 2025.1.j.b.701.1 2-45e2-9.5-c0-0-1 1.00 1.01 2 3^{4} \cdot 5^{2} 9.5$$ $0.0$ $0$ $-0.0277$ $0$ $1.10275$ Artin representation 2.2025.24t65.a.d Modular form 2025.1.j.c.701.2
2-45e2-9.5-c0-0-2 $1.00$ $1.01$ $2$ $3^{4} \cdot 5^{2}$ 9.5 $$0.0 0 -0.0277 0 1.13816 Artin representation 2.2025.24t65.a.a Modular form 2025.1.j.c.701.1 2-45e2-9.5-c0-0-3 1.00 1.01 2 3^{4} \cdot 5^{2} 9.5$$ $0.0$ $0$ $0.138$ $0$ $1.43737$ Artin representation 2.2025.12t18.b.b Modular form 2025.1.j.a.701.1
2-45e2-1.1-c1-0-0 $4.02$ $16.1$ $2$ $3^{4} \cdot 5^{2}$ 1.1 $$1.0 1 0 0 0.404035 Modular form 2025.2.a.w.1.2 2-45e2-1.1-c1-0-1 4.02 16.1 2 3^{4} \cdot 5^{2} 1.1$$ $1.0$ $1$ $0$ $0$ $0.473981$ Modular form 2025.2.a.y.1.2
A. Windows Messenger includes a background image of two bobble men. You can replace this .gif image with any image that will fit in the display. To replace the .gif image, perform the following steps:
1. Navigate to the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MessengerService registry subkey, then double-click the InstallationDirectory value to verify the location of your Windows Messenger installation (by default, this location is C:\program files\messenger).
2. Using an imaging program, create the background image that you want to use.
3. Save the image you created as a .gif file named lvback.gif, and place the file in the Windows Messenger folder.
4. Restart Windows Messenger for the change to take effect. |
# Tag Info
0
Try navigating to your /miktex/bin directory and running the following. initexmf --mkmaps initexmf --update-fndb Reference: https://www.dev-eth0.de/2016/11/13/latex_miktex-makepk-pk-font-umvs-could-not-be-created/
0
I had exactly the same problem and here's what I did to fix it. Delete LaTeXit preferences . (Library/Preferences/fr.chachatelier.pierre.LaTeXiT.plist) Remove LaTeXit Reinstall LaTeXit
1
You need \addplot+ instead of \addplot. \documentclass[crop]{standalone} \usepackage{pgfplots} \usepgfplotslibrary{colormaps} \pgfplotsset{compat=1.9} \pgfplotsset{ colormap={bright}{rgb255=(0,0,0) rgb255=(78,3,100) rgb255=(2,74,255) rgb255=(255,21,181) rgb255=(255,113,26) rgb255=(147,213,114) rgb255=(230,255,0) rgb255=(255,255,255)} } \begin{...
1
1
I would go with a conditional to abstract the backend differences \ifdefined\XeTeXversion \protected\def\myspecial#1{\special{pdf:content q #1 Q}} \else \protected\def\myspecial#1{\pdfliteral{q #1 Q}} \fi \myspecial{% 0 G 0.4 w 0 8 m 2 10 l 0 6 m 4 10 l 0 4 m 6 10 l 0 2 m 8 10 l 0 0 m 10 10 l 0 -2 m 12 10 l 0 -4 m 14 10 l 0 ...
2
A simple solution is to use the libertinus version of the Libertine fonts. This version has the regular theta: \documentclass[a4paper,12pt]{article} \usepackage[LGR,T1]{fontenc} \usepackage[utf8]{inputenc} \usepackage[english,greek.polutoniko]{babel} \usepackage{libertinus} \begin{document} \textgreek{θεόλογος} \end{document}
5
You have several options here. In the Modern Toolchain Although you tagged your question with pdftex, my advice is to use LuaLaTeX and unicode-math when you can, and legacy 8-bit fonts when you have to. You can use the OpenType Garamond Math font with \usepackage{unicode-math} \setmainfont{EB Garamond} \setmathfont{Garamond-Math.otf}[StylisticSet={8,9}] % ...
1
When I started using MikTeX, several years ago, I encountered the same kind of problems during the update from the console and couldn't easily go back. Now I have a more conservative approach. (I tried before to have two twin machines and also using a virtual machine for the testing the new installation. But this way is faster) I do it maybe once a year, ...
0
This issue is described in the manual of the underscore package. You code contains the command \input{Belastungsart2.pdf_tex}. This command contains an underscore character, which causes the input to fail because the underscore package changes the interpretation of this character. As mentioned in the manual, it works if you explicitly state that this ...
0
I'm deciphering the bibtex/pdflatex pipeline in another context and the answer by user Mensch has helped a lot. As complement, a diagram of that particular work pipeline:
0
For completeness, there is a script ps4pdf that runs the following commands to convert a .tex file that uses PSTricks into PDF. latex $1.tex dvips -Ppdf -o $1-pics.ps $1.dvi ps2pdf $1-pics.ps $1-pics.pdf pdflatex $1.tex bibtex $1 pdflatex $1.tex
0
I just figured out a way myself. May not be a very clean way, but at least it works. The idea is to just add two asterisks manually to the two authors, and then add a footnote without any label and link through the method described in this post: Footnote without reference
0
movie15 is obsolete. Instead, use media9: \documentclass[a4paper]{article} \usepackage{media9} \usepackage{tikz} % for creating a poster %\usepackage{hyperref} %\usepackage[UKenglish]{babel} \begin{document} \includemedia[ activate=onclick, 3Dtoolbar, % label=cylinder.u3d, 3Dmenu, % found by right-click on 'Generate Default View' ...
1
I’m going to do a frame challenge here. You (practically) never want to do that. Your file in either of those cases is actually going to use a font in the OT1 encoding. There are no fonts that ship with TeX in the Windows-1252 encoding. If your PDF reader thinks the document is using a different encoding than it really is, that’s not good. It might not be ...
1
This will work on selected pages. In this case pages 2-3. \documentclass{article} \usepackage{everypage} \usepackage{tikz}% not really needed, but easier than \rlap{\hspace{-1in}\raisebox{1in}[0pt][0pt]{...}} \usepackage{lipsum}% MWE only \AddEverypageHook{% do pages 2-3 \ifnum\value{page} > 1 \ifnum\value{page} < 4 \begin{tikzpicture}[overlay, ...
5
apacite uses macros to localise strings like "et al.", so you just have to redefine the relevant macros. This needs to happen in \AtBeginDocument after apacite has done its language selection. \documentclass{article} \usepackage[natbibapa,apaciteclassic]{apacite} \AtBeginDocument[apacite-localisation]{% \renewcommand{\BOthers}[1]{\emph{et al.}\...
2
As Урош and Bernard said in the comments, it's quite probable that the original was set in Utopia and that therefor you could use fourier-otf (which actually is Erewhon, which is based on Heuristica which in turn, is based on Utopia) with XeLaTeX: \documentclass[a5paper]{article} \usepackage{fourier-otf} \usepackage{amsmath} \usepackage{unicode-math} \...
0
If you do want to use OpenType fonts, you can try this. There may be a better way to do this, but this forms the basis of the template I have been using for years. Just replace "Adobe Garamond Pro" with the name of your chosen typeface. \documentclass[11pt, a4paper]{article} % For setting the section heading fonts \usepackage[raggedright]{titlesec}...
1
Adobe will end its support for Flash by the end of December. Did you perhaps consent to stop using Flash in Acrobat Reader? Although this can be undone in the settings, it might be possible that AR updates after the 31st December will ultimately disable Flash usage. Video playback without Flash is possible in Acrobat Reader, but currently no control buttons ...
1
The main problem is an incompability between preview and hyperref due to changes in the shipout code of LaTeX that I already reported here: https://lists.gnu.org/archive/html/bug-auctex/2020-11/msg00000.html Because of this incompability the ghostscript call fails and then you get error both in miktex and texlive (but slightly different ones). In your ...
0
My original solution was incorrect. When I upgraded to the latest version of doxygen (1.8.20) in conjunction with MiKTeX-pdfTeX 4.1 (MiKTeX 20.11) the problem still exists. Incorrect solution: I figured it out, my input file to doxygen has a filename of tx_commands.h. Taking a clue from this thread (https://stackoverflow.com/questions/2476831/getting-the-...
0
With xelatex, you don't need the textgreek package, you can just input the letters themselves, but you have to use fontspec instead of libertinus to get the font. So then the MWE becomes: \documentclass[12pt]{article} \usepackage{fontspec} \setmainfont{Linux Libertine O} \begin{document} Ω\textsuperscript{$-1$} s\textsuperscript{\textit{β}} \...
1
In addition to fixing the (possibly not so obvious?) typo \mathb, you may want to rethink how you approach the display of the equation. The bmatrix environment -- like all other matrix-like environments of the amsmath and mathtools packages -- renders its contents in text style by default. This doesn't seem optimal for the use case at hand; you've tried to ...
5
Check your input: you have \mathb where \mathbf seems to be expected and also \mathb \rangle where it should be \mathbf{1}\rangle. The idea of reducing the size is interesting, but not implemented correctly, because \footnotesize doesn't really work in math mode. Here's a better version: \documentclass{article} \usepackage{amsmath} \newsavebox{\fbmatrixbox} ...
0
This answer comes a long time after the question but it might help someone to know that: the example provided by the OP compiles fine if the package tikz is loaded after moodle. with the development version of the moodle package, you no longer need to load tikz after moodle.
0
KDP specifies the safe minimum margins, not the actual margins of the pages that you will use in a book. It is a “no trespass” zone. Let's put this topic aside for the moment. The first task when designing a book page is to decide the size and location of the text area on the physical page. (Actually, you will first have to choose the main font of the book....
5
You get Missing character: There is no <E2> in font cmsy10! Missing character: There is no <E3> in font cmsy10! because you have a typo: symbols instead of largesymbols. That is, assuming you want to use the glyphs from STIX2. Don't forget \makeatletter because of \noaccents@. \documentclass[12pt]{article} \usepackage{amsthm,amsmath,ragged2e,...
0
The KDP guidelines specify very little about the page layout, so most of it is up to you. You might like to consider using the memoir class (a superset of book and report with many additional facilities). Here is a possible layout design for you. % KDPprob.tex SE 569994 %\documentclass[smallroyalvopaper,twoside,12pt]{memoir} % stocksize 9.25 by 6.175 in \...
2
You don't need to declare \lfloor and \rfloor again. They already exists. Pdflatex is not the problem here. \documentclass[12pt]{article} \usepackage{amsthm,amsmath,ragged2e,relsize,graphicx,bbm,mathtools} \DeclareFontEncoding{LS1}{}{} \DeclareFontEncoding{LS2}{}{\noaccents@} \DeclareFontSubstitution{LS1}{stix2}{m}{n} \DeclareFontSubstitution{LS2}{stix2}{m}{...
3
This isn't a true verbatim, but it comes pretty close. For example, brace balancing is required of the input. Syntax is what the user requested: \bv....\ev. Furthermore, the verbatim material can include line and paragraph (empty line) breaks. One place it will produce improper output is if a \bv input line ends in a lone backslash \. Another improper ...
2
The description environment might come in handy: To customize indentation, spacing, ... take a look at the enumitem package. \documentclass{article} \usepackage{lipsum} % Just used for dummy text via the \lipsum command. Do not use in real document. \begin{document} \lipsum[4] \begin{description} \item[Nominal scale] \lipsum[5] \end{description} \lipsum[...
5
\verb has to know where to finish, but in the meantime it has to disable every special interpretation of characters, including the backslash. One might think to delimit the material with the string \ev, but this would be very cumbersome. Delimiting with a single character not in the material that is to be printed verbatim is much simpler. But this works in a ...
6
verbatim commands need to find the end string without expanding commands, so you can't hide it in a \ev command. You can use fancyvrb to define short verbatim, see the documentation: \documentclass{article} \usepackage[T1]{fontenc} \usepackage[utf8]{inputenc} \usepackage[ngerman]{babel} \usepackage{xcolor} \usepackage{fancyvrb} \DefineShortVerb{\|} \begin{...
0
Since the "problem" is a conflicting environment which prevents figure placement there, and "wrapfigure" get placed at the beginning of paragraphs, you can provide a fake paragraph with ~~\vspace*{-\baselineskip} \documentclass{book} \usepackage{wrapfig} \usepackage{graphicx} \newtheorem{problem}{P}[chapter] \begin{document} A circle is ...
0
I propose this variant layout, with the insbox set of plain TeX macros package: \documentclass{book} \usepackage{graphicx} \input{insbox} \newtheorem{problem}{P}[chapter] \begin{document} \begin{problem}\leavevmode% \InsertBoxR{-1}{\includegraphics[width=0.38\textwidth]{example-image}}\par\noindent A circle is drawn in a sector of a larger circle of radius ...
0
With two minipages instead of wrapfig: \documentclass{book} \usepackage{graphicx} \usepackage[export]{adjustbox} \newtheorem{problem}{P}[chapter] \begin{document} \begin{problem} \begin{minipage}[t]{0.5\textwidth} A circle is drawn in a sector of a larger circle of radius $r$, as shown in the adjacent figure. The smaller circle is tangent to the two ...
0
The problem is not at the end of \epigraph, but at the beginning! The missing blank line before \epigraph makes everything go awry, causing very low level errors due to the ignored \prevdepth that's out of place if TeX is not in vertical mode as it should be. The command definition should start with \par. \documentclass[10pt,a4paper,twocolumn,twoside]{...
Top 50 recent answers are included |
# Examples On Tangents And Chords Of Ellipses Set-1
Example - 26
Find the locus of the point such that the chord of contact of the tangents drawn from it to the ellipse $$\frac{{{x^2}}}{{{a^2}}} + \frac{{{y^2}}}{{{b^2}}} = 1$$ touches the circle $${x^2} + {y^2} = {r^2}.$$
Solution: Let $$P(h,\,k)$$ be such a point.
The equation of the chord of contact AB from P(h, k) is
\begin{align}&T(h,\,\,k) = 0\\ & \Rightarrow\quad\frac{{hx}}{{{a^2}}} + \frac{{ky}}{{{b^2}}} = 1\\ & \Rightarrow\quad y = \left( {\frac{{ - {b^2}h}}{{{a^2}k}}} \right)x + \frac{{{b^2}}}{k}\end{align}
This is a tangent to the circle $${x^2} + {y^2} = {r^2},$$ if the condition for tangency for the case of circles $$({c^2} = {a^2}(1 + {m^2}))$$ is satisfied :
\begin{align} &\qquad \frac{b^4}{k^2}=r^2\left( 1+\frac{b^4h^2}{a^4k^2} \right)\\ & \Rightarrow \quad a^4b^4=a^4r^2k^2+b^4r^2h^2 \\ & \Rightarrow \quad \frac{h^2}{\left( a^2/r \right)^2}+\frac{k^2}{\left( b^2/r \right)^2}=1 \end{align}
Thus, the locus of P is
\begin{align}\frac{x^2}{\left(a^2/r\right)^2}+\frac{y^2}{\left(b^2/r\right)^2}=1\end{align}
which is an ellipse
Example - 27
A variable chord AB of the ellipse \begin{align}\frac{{{x^2}}}{{{a^2}}} + \frac{{{y^2}}}{{{b^2}}} = 1\end{align} subtends a right angle at its centre. Tangents drawn at A and B intersect at P. Find the locus of P.
Solution: Let P be the point (h, k).
Since AB is the chord of contact for the tangents drawn form P, the equation of AB will be
\begin{align}&T(h,\,k) = 0\\ &\Rightarrow \quad \frac{{hx}}{{{a^2}}} + \frac{{ky}}{{{b^2}}} = 1...\left( 1 \right)\end{align}
We can now write the joint equation of OA and OB by homogenizing the equation of the ellipse using the equation of the chord AB obtained in (1) :
Joint equation of OA and OB: \begin{align}\frac{x^2}{a^2} + \frac{y^2}{b^2} = \left( \frac{hx}{a^2} + \frac{ky}{b^2} \right)^2\quad\quad\quad...\left( 2 \right)\end{align}
Since OA and OB are perpendicular, we must have
${\rm{Coeff}}{\rm{. of \;}}{x^2} + {\rm{Coeff}}{\rm{. of\; }}{y^2} = 0{\rm{ }}\quad\quad in{\rm{ }}\left( 2 \right)$
\begin{align}& \Rightarrow \quad \frac{1}{{{a^2}}} - \frac{{{h^2}}}{{{a^4}}} + \frac{1}{{{b^2}}} - \frac{{{k^2}}}{{{b^4}}} = 0\\& \Rightarrow\quad \frac{{{h^2}}}{{{a^4}}} + \frac{{{k^2}}}{{{b^4}}} = \frac{1}{{{a^2}}} + \frac{1}{{{b^2}}}\end{align}
The locus of P is therefore
$\frac{{{x^2}}}{{{a^4}}} + \frac{{{y^2}}}{{{b^4}}} = \frac{1}{{{a^2}}} + \frac{1}{{{b^2}}}$ |
# Poynting vector in static electromagnetic field
by zql
Tags: poynting vector
P: 5 There is a situation, we have an electric field and a magnetic field, both are static. And we know the density of energy is u=E·D/2+B·H/2, so dU/dt=0, but Poynting vector S=ExH is not zero, which means energy is flowing. This confused me. Static field also has energy flux?
Sci Advisor P: 3,593 Yes, correct. Look also for "hidden momentum".
P: 5 Can you give me more details? Thanks.
P: 1,261
Poynting vector in static electromagnetic field
Quote by zql There is a situation, we have an electric field and a magnetic field, both are static. And we know the density of energy is u=E·D/2+B·H/2, so dU/dt=0, but Poynting vector S=ExH is not zero, which means energy is flowing. This confused me. Static field also has energy flux?
The static magnetic field is produced by a constant electric current. That means there is resistance, and energy is flowing into matter. "Hidden" momentum is not involved.
P: 5,523
Quote by zql There is a situation, we have an electric field and a magnetic field, both are static. And we know the density of energy is u=E·D/2+B·H/2, so dU/dt=0, but Poynting vector S=ExH is not zero, which means energy is flowing. This confused me. Static field also has energy flux?
The relevant expression, from the conservation of energy, is:
$$\frac{\partial u}{\partial t} + \nabla \bullet S = -J \bullet E$$.
Does this help?
P: 5
Quote by Andy Resnick The relevant expression, from the conservation of energy, is: $$\frac{\partial u}{\partial t} + \nabla \bullet S = -J \bullet E$$. Does this help?
I think I got it, thanks. |
Microreg: A traditional tax-benefit microsimulation model extended to indirect taxes and in-kind transfers
1. IRPET (Regional Institute for Economic Planning of Tuscany), Italy
Research article
Cite this article as: M. Luisa Maitino, L. Ravagli, N. Sciclone; 2017; Microreg: A traditional tax-benefit microsimulation model extended to indirect taxes and in-kind transfers; International Journal of Microsimulation; 10(1); 5-38. doi: 10.34196/ijm.00148
Abstract
MicroReg is a tax-benefit microsimulation model, developed by IRPET (Regional Institute for Economic Planning in Tuscany), able to simulate the main fiscal policies for all the Italian Regions. The model is based on the EUROSTAT Survey on Income and Living Conditions (EU-SILC). In its traditional version MicroReg can simulate direct taxes and in-cash transfers, but recently it was extended in two directions. The first extension aims at adding indirect taxes to simulated tax policies, thanks to a statistical matching between EU-SILC and the Italian Household Budget Survey by Istat (National Institute of Statistics (Italy)). To improve the matching, an estimation of the level of expenditure for each EU-SILC household is added to the variables on which the matching is conditioned, by applying the relation between consumption and income estimated on the Bank of Italy Survey of Households Income and Wealth. The second extension aims at including in-kind transfers, in health and education and in household disposable income. The monetary value of in-kind transfers is estimated with the public cost of production by using national and regional administrative data. The allocation of benefits among individuals is done by following the so called "actual consumption approach", both for health and education. This paper describes MicroReg, focusing on the new extensions, and the results of an application to assess the distributive effects of fiscal policies introduced in the last few years in Italy, both on the revenue and on the expenditure side.
1. Introduction
MicroReg is a static microsimulation model, developed by IRPET (Regional Institute for Economic Planning in Tuscany) for the Region of Tuscany, which is able to simulate the main fiscal policies for all the Italian Regions. Static models usually measure the short run impact of policies, by comparing households’ income under the actual legislation and the one resulting from reform policies. In comparing the two scenarios no changes in the structure of the population or in the behavior of agents are contemplated. Compared to previous versions of the model (Betti et al., (2012), Maitino & Sciclone (2008)), the one described in this paper adds two new modules to the traditionally simulated policies (i.e. direct taxes and in-cash transfers), namely indirect taxation and in-kind transfers (in health and education)1. MicroReg is now, therefore, able to assess the regional impact of public policies as a whole, considering both revenues and expenditures. This paper is structured in three sections. The first presents the model structure. The second section describes the new modules, indirect taxes and in-kind transfers. Finally, a section concludes by presenting an application of the extended model.
2. The model structure
The model is divided in three phases: i) the choice of micro-data and the imputation of missing information, namely gross income and cadastral value, ii) the calibration of sample weights, iii) the validation of results.
2.1 The choice of micro-data and the imputation of missing information
The first phase of a microsimulation model is the choice of micro-data. In Italy many microsimulation models use the Survey on Household Income and Wealth of Bank of Italy (SHIW), which provides accurate information on income and household wealth2. MicroReg uses the EUROSTAT Survey on Income and Living Conditions (EU-SILC), since it is more representative at regional level with respect to SHIW. In this paper we describe the model built on EU-SILC 2013 (year of income 2012). However, some important information is missing in EU-SILC. Indeed, gross income and cadastral value of buildings need to be estimated.
2.1.1 The grossing-up procedure
The conversion of net to gross income can be done in different ways (Immervoll & O'Donoghue, 2001, Sutherland, 2001). A first solution is to estimate a coefficient of conversion from net to gross income in a data set where both the information are available (usually for a subset of individuals). A second solution calculates the gross income for each net income in the sample, through an analytical inversion of all existing taxes in the year in which net income is detected. The third solution is based on an iterative algorithm. For each individual in the sample, the procedure estimates a gross income starting from the original net income. Fiscal rules are then applied to the estimated gross income in order to get an estimated net income. The latter is compared with the original one. If the two values are similar with a certain margin of error, the estimated gross income is considered a good approximation; otherwise the procedure iteratively corrects the algorithm.
MicroReg uses the third approach and, in particular, a variant of the algorithm used for EUROMOD, the microsimulation model for EU member states. The net income used is the sum of several sources of taxable income for personal income tax (PIT): net individual income of employees, net income from self-employment, net income from retirement, survivor's pension, disability pension (excluding war pension), net income from redundancy funds, unemployment benefits, mobility or early retirement and net income from scholarships. Incomes from land and buildings are not included since they are considered already gross. Other incomes are excluded because they are not taxable for PIT such as healthcare liquidations, insurance and pension arrears. The iterative procedure is similar to the one described in Immervol and O’Donoghue (2001). At the first iteration an arbitrarily fixed average tax rate equal to 0.22 is applied to each net income to obtain an estimation of gross income. By applying fiscal rules to the estimated gross income an estimated net income is obtained. Then, the estimated net income is compared with the true one. When the difference is higher with respect to a certain margin of error the procedure continues—by iterating–to a new estimation of gross income, by properly correcting the initial tax rate. The procedure stops when for all sample individuals the difference between the original value and estimated net income is less than 10 euro. When the algorithm does not converge after a certain number of iterations, the procedure starts with a different value of the initial rate, randomly drawn from a uniform distribution defined in the range 0–1.
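A minimal sketch of this iteration in Python (the correction rule and the restart cap below are assumptions of the sketch; net_from_gross stands in for the full set of fiscal rules, which are not reproduced here):

```python
import random

def gross_up(net, net_from_gross, tol=10.0, max_iter=100, max_restarts=20):
    """Iteratively estimate gross income from an observed net income.

    net_from_gross is a stand-in for the routine that applies the fiscal
    rules of the income year to a gross income and returns the implied
    net income; it is not reproduced here.
    """
    rate = 0.22                                   # arbitrary starting average tax rate
    for _ in range(max_restarts):
        gross = net / (1.0 - rate)                # candidate gross income
        for _ in range(max_iter):
            net_hat = net_from_gross(gross)
            if abs(net_hat - net) < tol:          # converged within the 10-euro margin
                return gross
            gross += net - net_hat                # a simple correction rule (assumed)
        rate = random.uniform(0.01, 0.99)         # restart from a random rate in (0, 1)
    return None                                   # no convergence
```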
2.1.2 The estimation of cadastral values
The total gross income for PIT should include incomes from land and buildings (such as rents and cadastral values of properties). EU-SILC collects only rental income but lacks the cadastral value of properties. The only available information is the total tax paid on buildings (IMU), without distinction between the dwelling house and the others. Therefore, before estimating the cadastral value we need an allocation of the total tax paid between the two components, the dwelling house and the other buildings. Then we can apply fiscal rules to estimate cadastral values.
In MicroReg the allocation of the total property tax into the two components is made by using the official data on the tax paid for the dwelling house registered by the Ministry of Finance (MEF). In detail, the estimation is made in the following three steps:
1. The first step in the model makes an initial split of the total tax with the following criteria:
1. If the individual is the owner of the dwelling house and does not own other buildings then the tax is considered paid only for the dwelling house.
2. If the individual does not own neither, the dwelling house nor other buildings, the tax is considered paid only for other buildings.
3. If the individual does not own the dwelling house but only other buildings the tax is considered paid only for other buildings.
4. If the individual claims to own both, the dwelling house and other buildings, the tax is initially divided in two equal parts. This is only an initial division that can change in the second part of the procedure.
2. In the second step the tax paid for the dwelling house is estimated as follows. From the total tax paid for dwelling houses resulting from MEF, the tax paid for dwelling houses imputed in the first step to individuals of criteria (a) is subtracted. The resulting amount is used to bind the tax paid for dwelling houses imputed in the first step to individuals of criteria (d). After this step the initial division of the total tax paid of individual of criteria (d) is corrected according to real data.
3. In the last step the tax paid for the other buildings is estimated for each individual, with the difference between the total tax paid declared in EU-SILC and the tax paid for the dwelling house previously estimated3.
After having obtained the two components, fiscal rules can be inversely applied. Rates, as well as deductions, are regional averages of municipal rates.
(1)
(2)
$i=1,\ldots,n$ individuals; $r=1,\ldots,K$ Regions
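As an illustration of this inverse application (a sketch only: the exact forms of equations (1) and (2) above are not reproduced, and the simple rule tax = rate × taxable base − deduction used below is an assumption, with regional average rates and deductions):

```python
def cadastral_taxable_base(tax_paid, regional_rate, deduction=0.0):
    """Invert a simple property-tax rule, tax = rate * base - deduction,
    to recover the taxable (cadastral) base from the tax actually paid.

    regional_rate and deduction are regional averages of the municipal
    parameters; the deduction applies only to the dwelling house.
    """
    if regional_rate <= 0.0:
        raise ValueError("rate must be positive")
    return (tax_paid + deduction) / regional_rate

# hypothetical usage for one household, with illustrative numbers
base_dwelling = cadastral_taxable_base(tax_paid=400.0, regional_rate=0.004, deduction=200.0)
base_other = cadastral_taxable_base(tax_paid=250.0, regional_rate=0.0076)
```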
In order to validate the estimation of cadastral value, Table 1 and Table 2 compare MicroReg results with official statistics when available. As expected, for what concerns the dwelling house MicroReg gives satisfactory results. Tax revenue from the dwelling house is obviously close to the official tax paid registered by the MEF, given that our estimation procedure is bound exactly to that value4. The tax paid (and the tax base) simulated by our model is close to the estimation made by Dipartimento delle Finanze and Agenzia del Territorio (2012)5.
The validation for other buildings cannot be done by using MEF official statistics that include the tax paid from legal entities. The comparison can instead be made with data registered in Dipartimento delle Finanze and Agenzia del Territorio (2012), which distinguishes between individuals and legal entities. Both the tax base and the tax revenue are under-estimated in MicroReg with respect to official data by Dipartimento delle Finanze and Agenzia del Territorio (2012). However, it is worth noting that official data include buildings owned by individuals used for productive reasons that can be under-reported in EU-SILC.
Table 1
According to our simulation, about 52.3% of Italian taxpayers own a dwelling house (Table 2). Not surprisingly, as underlined also in Pellegrino et al. (2011), the value is higher than the one registered by the MEF since taxpayers who use the “770” form do not have to declare the ownership of the main residence. The average cadastral value of the dwelling house is about 500 euro in MicroReg and 491 euro in MEF. It increases from 415 euro in the first income class to 807 euro in the last, with a similar trend to official data. For what concerns the other buildings, the model predicts that about 28.8% of taxpayers have a positive cadastral value. The tax base increases with income class at a higher rate than for the dwelling house.
Table 2
2.2 The calibration
This procedure is performed to make estimates of gross income more similar to real data of the MEF and therefore to tackle the problem of tax evasion. Usually in a survey under the guarantee of anonymity, people are more honest than when the counterparty is the tax authority. The calibration binds the original sample weights to the joint distribution of individuals by income class (taken from the MEF) and by socio-demographic variables derived from ISTAT. The total income considered for the calibration is the one used in the grossing up procedure, plus incomes from buildings obtained after the estimation of cadastral values.
In the literature two types of calibration methods can be found: i) independent calibration and ii) integrative calibration. Independent calibration calibrates weights with households’ variables independently from individuals’ variables. The convergence procedure is generally fast, but it generates different weights for households and individuals. Integrative calibration takes into account households’ and individuals’ variables together. It converges more slowly, but households’ and individuals’ weights are the same (as requested by the European Commission Regulation for the SILC). In MicroReg an integrative calibration is performed with the following constraints: taxpayers by income class and prevailing source of income; population by age, gender and level of education; population by region of residence and by number of family members.
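To give an idea of the mechanics, the snippet below sketches a basic raking (iterative proportional fitting) of sample weights to external margins. It is only illustrative: MicroReg performs an integrative calibration on household and individual variables jointly, whereas this sketch adjusts individual-level margins one at a time; the function name and data layout are hypothetical.

```python
import numpy as np

def rake_weights(w, groups, targets, n_iter=50, tol=1e-8):
    """Adjust weights w so that, for each categorical margin in `groups`
    (list of integer-coded arrays, one entry per sample unit), the weighted
    counts match the external `targets` (list of dicts {category: total})."""
    w = np.asarray(w, dtype=float).copy()
    for _ in range(n_iter):
        max_shift = 0.0
        for g, t in zip(groups, targets):
            for cat, total in t.items():
                mask = (g == cat)
                current = w[mask].sum()
                if current > 0:
                    factor = total / current
                    w[mask] *= factor
                    max_shift = max(max_shift, abs(factor - 1.0))
        if max_shift < tol:   # stop when all margins are (nearly) matched
            break
    return w
```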
2.3 The validation
After having imputed missing information (see Section 2.1) and calibrated sample weights (see Section 2.2) we simulate all fiscal rules that every taxpayer follows to pay PIT in the following way:
1. According to Italian fiscal rules, every taxpayer can deduct the value of the main residence6 and other expenses (such as expenditure for disabled family members or for donations) from gross income. Since EU-SILC does not collect detailed information about deductible expenses, we impute their value to each taxpayer by applying a coefficient equal to the ratio between deductions and gross income by gross income class, calculated on official data from the MEF7. Taxable income is then simulated by subtracting deductions from gross income for each taxpayer.
2. Gross PIT is simulated by applying the legal tax rates to the simulated taxable income.
3. Each taxpayer can subtract different types of tax credits from gross PIT: i) tax credits by income source, ii) tax credits for family members and iii) other tax credits (for health expenditure, housing works, etc.). Tax credits by income source and for family members are simulated in the model since detailed information on income and household components are collected in EU-SILC. The other tax credits, similar to tax deductions, are estimated by applying to each taxpayer a coefficient given by the ratio of tax credits and gross income by gross income classes registered by MEF8.
4. Finally, the net PIT is simulated by subtracting simulated and imputed tax credits from gross PIT. When PIT is positive, regional additional income tax is simulated by applying the different fiscal rules of each region. A minimal code sketch of steps 1–4 is given after this list.
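A stylised version of steps 1–4 can be written as follows. This is a sketch, not the model's actual implementation: the bracket schedule is purely illustrative, the income-class coefficients (`deduction_coef`, `credit_coef`) stand for the MEF ratios described above, and the regional additional tax is omitted.

```python
def gross_pit(taxable, brackets):
    """Apply a progressive schedule; `brackets` is a list of
    (upper_bound, marginal_rate), with float('inf') as the last bound."""
    tax, lower = 0.0, 0.0
    for upper, rate in brackets:
        if taxable > lower:
            tax += (min(taxable, upper) - lower) * rate
        lower = upper
    return tax

def net_pit(gross_income, deduction_coef, credit_coef, simulated_credits, brackets):
    """Steps 1-4 in stylised form: impute deductions and 'other' tax credits
    with income-class coefficients, apply the schedule, subtract all credits."""
    taxable = max(gross_income * (1.0 - deduction_coef), 0.0)        # step 1
    gross_tax = gross_pit(taxable, brackets)                         # step 2
    credits = simulated_credits + gross_income * credit_coef        # step 3
    return max(gross_tax - credits, 0.0)                             # step 4

# Purely illustrative schedule (upper bound in euro, marginal rate):
brackets = [(15_000, 0.23), (28_000, 0.27), (55_000, 0.38),
            (75_000, 0.41), (float("inf"), 0.43)]
```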
In order to validate the model, the results of our simulations are compared with data from the MEF. Table 3 compares the distribution of the total number of taxpayers by prevailing income source resulting from the model with respect to official statistics. The total number of taxpayers in the model is in line with the real one and the composition is very similar to the one of the MEF. These satisfactory results are expected given that one of the constraints of our calibration procedure (see Section 2.2) is the distribution of taxpayers by source of income and gross income classes.
Table 3
Table 4 compares aggregate fiscal amounts simulated by the model with official data from the MEF9. The model is able to simulate gross income, taxable income, and gross and net PIT quite well. A little less accurate is the simulation of tax credits for family members. Results are in line with other microsimulation models. In Ceriani et al. (2016) the model over-estimates gross income by 3% and PIT by 1%. In Pellegrino et al. (2011) the average gross income is 97.5% of the real one. Models that also use administrative data obtain better results. In Di Nicola et al. (2015) a microsimulation model is built on a dataset that matches EU-SILC with administrative data on PIT and real estate datasets. Their results are very close to official statistics: except for municipal taxes, the difference in amounts between the model and the MEF is lower than 1%.
Table 4
Figure 1 shows the density function by gross income classes simulated by MicroReg and registered by the MEF10. The overall distribution of gross income simulated by the model is close to the real one, with some specific limits. The model under-estimates the number of taxpayers under 1,000 euro and over-estimates those between 7,500 and 10,000 euro. The under-estimation in the first class depends on a typical problem of under-reporting of small amounts of income in EU-SILC11. We do not correct for this under-reporting problem since taxpayers under 1,000 euro represent a small part of the total and do not pay PIT. The over-estimation of taxpayers between 7,500 and 10,000 euro is partially offset by a small under-estimation of taxpayers between 6,000 and 7,500 euro.
Figure 1
Figure 2 reports the distribution of taxpayers with positive net PIT by gross income classes12. The distribution of simulated PIT is quite similar with respect to official statistics. The number of taxpayers with positive PIT is 32.3 million in MicroReg against 31.2 million in official data of the MEF.
Figure 2
Table 5 finally shows the main redistributive indexes calculated at the taxpayer level. The pre-tax Gini coefficient is about 0.42. After the PIT the Gini becomes 0.38, a decrease of 0.05. The strong redistribution is given by the combination of an average tax rate of 0.19 and a strongly progressive tax (Kakwani index of 0.22). As expected, the redistributive effect of the regional additional income tax is lower than that of PIT. Indeed, several regions apply a proportional additional tax and others impose a less progressive system of tax rates than PIT.
Table 5
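For readers who want to reproduce indexes of this kind, the snippet below gives a minimal weighted implementation of the Gini coefficient and of the Kakwani progressivity index via Lorenz/concentration curves (trapezoidal approximation). It is a generic sketch, not MicroReg code; the redistributive effect reported in Table 5 can then be obtained as the difference between the pre-tax and post-tax Gini.

```python
import numpy as np

def _coefficient(values, rank_by, weights):
    """Gini-type coefficient of `values` with units ranked by `rank_by`
    (equals the Gini when rank_by is `values` itself)."""
    order = np.argsort(rank_by)
    v, w = values[order], weights[order]
    p = np.concatenate(([0.0], np.cumsum(w) / w.sum()))          # population share
    L = np.concatenate(([0.0], np.cumsum(v * w) / np.sum(v * w)))  # Lorenz ordinates
    return 1.0 - 2.0 * np.trapz(L, p)

def gini(y, w=None):
    y = np.asarray(y, dtype=float)
    w = np.ones_like(y) if w is None else np.asarray(w, dtype=float)
    return _coefficient(y, y, w)

def kakwani(tax, pre_tax, w=None):
    """Kakwani progressivity: concentration coefficient of the tax (units
    ranked by pre-tax income) minus the Gini of pre-tax income."""
    tax = np.asarray(tax, dtype=float)
    pre_tax = np.asarray(pre_tax, dtype=float)
    w = np.ones_like(tax) if w is None else np.asarray(w, dtype=float)
    return _coefficient(tax, pre_tax, w) - _coefficient(pre_tax, pre_tax, w)

# Redistributive effect (as in Table 5): gini(pre_tax, w) - gini(post_tax, w)
```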
3. Extensions of the model
MicroReg was recently extended to indirect taxes and in-kind transfers (health and education). Indirect taxes are estimated for all Italian Regions (multi-regional model), while in-kind transfers are quantified only for the Region of Tuscany (for which data is available).
3.1 Indirect taxes
One of the recent developments of MicroReg is the simulation of indirect taxes after a matching of EU-SILC with a database on consumption. An integrated database can be used with many aims: to analyse saving and consumption behaviour; to study consumption of durable or not durable goods; to make multidimensional analysis of poverty; and to study the impact of fiscal policies.
Despite the large availability of sample surveys, no single survey collects information both on income (y) and on consumption (z) with a minimum level of accuracy and detail. Basically, given the variables x, y, z, where x is a set of household characteristics, the database lacks, completely or partially, the joint observation of all three. The only way to integrate the two datasets is to assume that the information in x is sufficient to jointly determine both y and z. The literature suggests many solutions to integrate the two datasets (Decoster et al., 2007). Basically, at least two approaches can be distinguished, and they are explained in the following.
One, the so-called explicit approach, uses Engel curves to impute expenditure to the income database. According to this approach a regression of each expenditure share on variables common to the consumption and the income database (usually disposable income and socio-demographic characteristics) is estimated on the consumption database. Then, the estimated coefficients are applied to records in the income database to impute expenditure shares for each good. An application of this procedure in Italy is in O’Donoghue et al. (2004). The survey which collects information on income is the SHIW, while the consumption database is the Italian Household Budget Survey (HBS). In the latter a variable on disposable income is collected (even if underestimated), so a regression of total consumption on income and socio-demographic characteristics is estimated. The estimated coefficients are then applied to SHIW to impute total consumption. Budget shares of total consumption are subsequently estimated in HBS through Engel curves and applied to the imputed total consumption in SHIW. The explicit approach is more recently applied in Taddei (2012), where the income database is EU-SILC and the consumption one is HBS. The procedure is made in two steps. Taddei (2012) first applies a statistical matching to impute income from EU-SILC to HBS. Once income has been imputed to HBS, following the explicit approach, Engel curves are estimated to associate to each observation in EU-SILC a vector of budget shares for each consumption good.
The second approach, called implicit approach, tries to find for each household of the income database the most similar in the consumption database. This approach is not based on a statistical model or on theoretical assumptions that explain consumption behaviour, but on a statistical matching (D’Orazio et al., 2002). The matching links similar units through a distance function that should be minimized. Units are compared according to a set of socio-demographic characteristics, common to the two databases. The matching can be done only if the two surveys are random samples extracted from the same population. In Italy a statistical matching between the Bank of Italy Survey of Households Income and Wealth and HBS is in Battellini et al. (2009), in which the integrated database is mainly used to compile Social Accounting Matrices (SAM). More recently, in Pisani and Tedeschi (2014) a matching technique is applied to build an integrated dataset useful for direct and indirect tax benefit microsimulation models13. The income database is the Bank of Italy Survey of Households Income and Wealth and the consumption one is HBS. Before doing the matching, Pisani and Tedeschi (2014) estimate a propensity score to synthesize in one scalar all the different variables common to the two databases. The propensity score is indeed the estimated probability to belong to SHIW conditioning on the common set of characteristics. Then, they apply two different matching procedures. In the first one they associate to each observation in SHIW the one in HBS with the closest propensity score (nearest neighbour within calliper). In the second they use a function (called Mahalanobis) to measure for each unit in SHIW the distance to all units in HBS in the common variables (included the propensity score). The unit in HBS with the lower distance is then associated to each unit in SHIW. According to Pisani and Tedeschi (2014) the Mahalanobis distance function performs better than the nearest neighbour method.
In MicroReg an implicit approach based on a matching between EU-SILC and HBS is implemented. Therefore, our method is not parametric as the ones used in O’Donoghue et al. (2004) and Taddei (2012), but it is more similar to Pisani and Tedeschi (2014). The objective of the matching is to link each family in EU-SILC to the most similar in HBS, given the set of common observable variables. The matching is shown in detail in the following three steps:
1. Preliminarily a comparison between the common variables is made by using a t-test for the mean difference and a χ² test for equality of distribution (results are reported in the appendix, Tables A.4 and A.5). Afterwards, variables are standardised with the same classifications and encodings and aggregated at the household level.
2. Secondly, we select the conditioning variables for the matching. We do not estimate a propensity score, as in Pisani and Tedeschi (2014), but we select among the distinct common variables. The common set of variables is selected by regressing consumption on socio-demographic characteristics (see results for the Centre of Italy in the appendix, Table A.6). Some variables refer to household characteristics (such as residence area, number of components, number of rooms, presence of a loan to pay, personal computer and dishwasher ownership, access to the internet). Other variables refer to socio-demographic characteristics of household components (number of children, number of adults, type of work, number of earners).
In order to improve the matching, for each EU-SILC household an estimation of total consumption is added to the set of common variables. More precisely, a regression of total consumption on household income is estimated for income quintiles in the Bank of Italy Survey of Households Income and Wealth. Indeed, SHIW collects detailed information on income, but also an aggregate measure of consumption. Then, estimated coefficients are applied to EU-SILC households to find the imputed total consumption that can be used in the matching.
3. Similarly to Pisani and Tedeschi (2014) we define a proximity function to integrate household by household information collected from both surveys. Our matching is made in two steps. A first exact matching for a selection of common variables (geographical area and number of components) is implemented. The exact matching allows linking each EU-SILC household with the corresponding HBS household, given the two selected variables. For the other variables the following proximity function is defined:
(3) $s(x,y)=\max \sum_{j=1}^{N}\sum_{i=1}^{P} s_{i}(x_{ij},y_{ij}) \quad \forall j \in N$
(4) $s_{i}(x_{ij},y_{ij})=\begin{cases}1 & \text{if } x_{ij}=y_{ij}\\ 0 & \text{otherwise}\end{cases}$
Basically, the proximity function counts, for every household in EU-SILC, the number of variables with the same value in EU-SILC and in HBS. Total consumption is considered equal if the difference is lower than 1,000 euro. Finally, each EU-SILC household is associated with the HBS household with the highest number of variables with the same value. When two HBS households have the same number of equal variables, the household with the smaller difference in total consumption is linked. A minimal code sketch of this matching step is given after this list.
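The matching step just described can be sketched as follows. This is an illustrative implementation only: `silc` and `hbs` are assumed to be dictionaries of NumPy arrays already restricted to households with the same geographical area and number of components (the exact-matching stage), and the variable names are hypothetical.

```python
import numpy as np

def match_households(silc, hbs, match_vars, cons_var="total_consumption", cons_tol=1000.0):
    """For each EU-SILC household, pick the HBS household that maximises the
    number of common variables with equal values; total consumption counts as
    equal when the gap is below `cons_tol` euro. Ties are broken by the
    smallest consumption gap. Returns an array of HBS row indices."""
    matches = []
    for i in range(len(silc[cons_var])):
        score = np.zeros(len(hbs[cons_var]), dtype=int)
        for v in match_vars:                              # categorical variables
            score += (hbs[v] == silc[v][i]).astype(int)
        gap = np.abs(hbs[cons_var] - silc[cons_var][i])   # imputed consumption gap
        score += (gap < cons_tol).astype(int)
        best = np.flatnonzero(score == score.max())       # candidates with top score
        matches.append(best[np.argmin(gap[best])])        # tie-break on consumption
    return np.array(matches)
```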
In the integrated database the Value Added Tax (VAT) rates by type of expenditure are inversely applied to find the production price (HBS collects retail prices, which include indirect taxes). Given the tax base, VAT can subsequently be simulated.
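The inverse application of VAT to retail prices amounts to dividing gross expenditure by one plus the statutory rate; a minimal sketch (function name hypothetical) follows.

```python
def split_gross_expenditure(gross_spend, vat_rate):
    """HBS records retail (VAT-inclusive) prices; recover the net-of-VAT base
    and the implied VAT paid for a given statutory rate (e.g. 0.04, 0.10, 0.22)."""
    net = gross_spend / (1.0 + vat_rate)
    return net, gross_spend - net
```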
The following statistics are used to validate the matching procedure and the VAT simulation. In line with expectations, the food share is higher in the South of Italy than in the North-Centre (Table 6) and the propensity to consume decreases by income deciles (Table 7).
Table 6
Table 7
Table 8
Finally, Table 9 shows redistributive indexes, calculated at the household level, for each VAT rate. Not surprisingly, VAT plays a regressive role: the Gini of disposable income increases after the tax. Among the different rates, it is easy to see that the reduced rate (4%) has the most regressive impact, with a Kakwani index of −0.21. The reduced rate is applied to those essential goods that each household needs and, consequently, the fiscal burden is relatively higher for lower incomes. Despite this, the ordinary rate shows the strongest negative redistributive effect owing to its higher average tax rate.
Table 9
In conclusion, the matching results and the simulation of VAT are satisfactory and in line with expectations. The indirect tax module of MicroReg can then be used to simulate different scenarios of VAT reforms.
3.2 In-kind transfers
The second extension of MicroReg concerns in-kind transfers16. Many empirical studies about inequality and poverty do not consider benefits from public expenditure in in-kind transfers like education, health, transport and so on. The monetary disposable income is, however, only a part of the household welfare, which depends also on the public subsidies for the production and the financing of services. The inclusion of in-kind transfers in a microsimulation model allows: i) to compare their distributive impact with respect to in-cash transfers, ii) to make a more correct comparison between countries which have a different composition of in-cash and in-kind transfers, iii) to monitor the effects of cuts in services and spending reviews.
In order to estimate the distributive effects of in-kind transfers many methodological issues should be addressed (Gigliarano & D’Ambrosio, 2009). The first issue refers to the imputation of a monetary value to in-kind transfers. Usually the monetary value is quantified by estimating the average production cost of the public sector, even if this approach has several limits. For example, it does not take into account that differences in production costs could depend on differences in the quality of services, in inefficiency or on different costs of inputs.
Once the monetary value has been quantified, in-kind transfers should be imputed to the individuals/households of the sample (in our case EU-SILC). In the literature, two approaches have been used: the actual consumption approach (AA) and the insurance value approach (IA).
The AA imputes the monetary value of in-kind transfers only to individuals who actually use the service. The advantage of this approach is that it considers the true usage of services and takes into account individuals’ differences. The disadvantages are many. First, the attribution is not independent of the time interval considered, during which the use of services could be entirely random. Second, it does not consider the different needs of families. For example, it can impute a large monetary value to old people who need many health services, with a consequent strong re-ranking (old people become richer than people with more income who do not use services).
The IA imputes to every individual an average monetary value of in-kind transfers by demographic characteristics (usually age and gender), without taking into account the actual use of services. In health, for example, a sort of insurance premium against diseases is imputed to all individuals by demographic characteristics. According to the IA, the use of health services should not depend on random reasons (as in the AA), but on demographic differences. Also this approach, however, has some disadvantages. The possibility to have a re-ranking is still present (even if less likely than in the AA) and it does not take into account individuals’ differences.
In the literature the AA is often applied for education. In health the debate is more open. In studies that compare both approaches, a higher distributive impact of the IA than of the AA has been noticed (Baldini et al., 2007).
In evaluating the distributive impact of in-kind transfers further methodological issues must be addressed. Usually the long-run effects of in-kind transfers (like education returns) are not considered in empirical studies. Further, there is no consensus about the correct counterfactual (the starting income) that should be used to evaluate the distributive effect of in-kind transfers, nor about the equivalence scale that should be used. Moreover, studies about in-kind transfers typically neglect externalities. In other words, by imputing the benefit only to students or to patients they under-estimate the positive externalities of education and health on the entire population.
In MicroReg the monetary value of in-kind transfers is quantified with the cost of production of the public sector. The monetary value is imputed to individuals through the AA approach, both for education and for health. The equivalence scale is not used for in-kind transfers; a per-household-member in-kind value is attributed to each individual. The modified OECD scale is used only for the monetary disposable income. In what follows, methodological choices and data used for education and health are described in detail. The simulation of in-kind transfers is performed only for the Region of Tuscany, both for education and health17.
3.2.1 From pre-school to secondary education
To quantify the cost of production of pre-school, primary school, middle school and high school data from the balance sheet of the national government are used. Indeed, the biggest part of public expenditure derives from the central level. So, the total expenditure for each level of education is taken from balance sheets. Then, the regional expenditure is estimated by applying the distribution of teachers by Region to the national expenditure. The per-student value of education is afterwards estimated by dividing the total regional expenditure by the number of students (taken from the Ministry of Education) for each level of education. Finally, the per-student value is imputed to students by exploiting the information about the school attended, collected in EU-SILC, and about the age of children.
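As an illustration of the allocation just described, the per-student value could be computed as in the sketch below; the names are hypothetical and this is not the model's code.

```python
def per_student_value(national_expenditure, regional_teacher_share, regional_students):
    """Regionalise the national budget for a given school level using the
    regional share of teachers, then divide by the regional number of students."""
    regional_expenditure = national_expenditure * regional_teacher_share
    return regional_expenditure / regional_students
```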
3.2.2 Higher education
In order to find the monetary value of higher education, first the cost of production is calculated and second the taxes paid by each student are simulated. To quantify the cost of production the balance sheets of the three Tuscan universities (Florence, Pisa and Siena)18 are used. The per-student value is obtained by dividing the total cost of production by the number of students (taken from the Ministry of Education). University taxes are composed of three parts: an entry fee, a fee for the right to study and contributions. The first two are fixed amounts; the third depends on a means-test instrument called ISEE (Equivalent Economic Situation Indicator). ISEE and taxes are simulated in the model for each student. The net benefit from tertiary education is given by the difference between the per-student cost of production and simulated taxes. The net per-student value is imputed to students by exploiting the information collected in EU-SILC.
3.2.3 Health services
For health in-kind transfers, administrative data from the Region of Tuscany (year 2010) are exploited. Administrative data collect both the number of users and the costs of production for each health service, that is, hospital services (hospital discharge cards), outpatient services, pharmaceutical services and rehabilitation services19. To attribute health consumption to individuals we apply the Monte Carlo method. First, a probability of consuming a certain service is estimated for intersections (cells) of socio-demographic characteristics (gender, citizenship, age class and level of education). Second, the service is attributed to individuals by comparing the estimated probability with a random number drawn from a uniform distribution on the interval 0–1. For the selected individuals the in-kind benefit of each service (evaluated at its cost of production) is imputed.
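A minimal sketch of this Monte Carlo attribution is given below; the names are illustrative, and the cell probabilities would in practice come from the regional administrative data.

```python
import numpy as np

rng = np.random.default_rng(0)

def impute_service_use(prob_by_cell, cell_of_individual):
    """Monte Carlo attribution: an individual belonging to cell c is assigned
    the service if a uniform draw falls below the estimated probability
    prob_by_cell[c]. Returns a boolean array over individuals."""
    p = np.array([prob_by_cell[c] for c in cell_of_individual])
    return rng.uniform(size=p.shape) < p
```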
3.2.4 The distributive impact of in-kind transfers
The incidence of simulated in-kind transfers on income by quintiles of equivalent disposable income shows the important distributive role of public expenditure, both in education and health (Figures 3 and 4).
Figure 3
In education, the incidence clearly decreases with income and it is particularly high in the first quintile for high and middle school. The incidence is lower for pre-school even if still decreasing with income (Figure 3). Also health expenditure seems to affect more low income quintiles. Among the different services considered hospital ones tend to have a stronger impact (Figure 4).
Figure 4
These descriptive results are confirmed by the redistributive indexes (Table 10), calculated at the individual level. All in-kind transfers in education have a positive redistributive impact, as in Baldini et al. (2007). Further, similarly to Baldini et al. (2007), the strongest redistribution derives from primary, middle and high school. The progressivity index is particularly high for middle and high school. The reduction in the post-transfer Gini index is lower for pre-school and university. Pre-school has a high progressivity but a low incidence. University is the least progressive education in-kind transfer.
Table 10
According to our simulation, as in Baldini et al. (2007) but differently from Sonedda and Turati (2005)20, health services decrease inequality. Inequality diminishes especially thanks to hospital services, which have the highest incidence and progressivity. Differently from pharmaceutical and outpatient services, hospital services are concentrated on old people and their cost of production can be very high. Consequently, as in Baldini et al. (2007), hospital services have a strong re-ranking effect.
4. An application of the extended model
The extended microsimulation model makes it possible to evaluate the effect of fiscal policies both on the revenue and on the expenditure side. In the last few years the Italian central government implemented many fiscal policies. In this paper three policies are considered.
The first is the introduction of the “fiscal bonus” (law 190/2014 so called Stability Law for 2015). The Stability Law confirms the “fiscal bonus” for employees, introduced for the first time in 2014, with effects from 2015 onwards. The government expects from the “fiscal bonus” indirect positive effects on consumption. More precisely, the “fiscal bonus” is a fixed amount of 960 euro per year (80 euro per month) for employees with gross income under 24,000 euro and it is decreasing for employees with gross income between 24,000 and 26,000 euro, as explained in the following table (Table 11). The bonus is attributed to employees with a gross PIT higher than tax credit for employees.
Table 11
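The bonus rule can be coded compactly. The sketch below is illustrative only: the linear shape of the phase-out between 24,000 and 26,000 euro is assumed from Table 11, and the eligibility check follows the condition stated above.

```python
def fiscal_bonus(gross_income, gross_pit, employee_tax_credit):
    """Sketch of the 'fiscal bonus' (law 190/2014) for employees: 960 euro per
    year below 24,000 euro of gross income, phased out (linearly, assumed from
    Table 11) up to 26,000 euro, and paid only when gross PIT exceeds the
    employee tax credit."""
    if gross_pit <= employee_tax_credit:
        return 0.0
    if gross_income <= 24_000:
        return 960.0
    if gross_income < 26_000:
        return 960.0 * (26_000 - gross_income) / 2_000
    return 0.0
```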
The second policy is the safeguard clause about the increase of VAT (law 190/2014, the so-called Stability Law for 2015). In Italy three VAT rates are applied: the ordinary rate is now 22%, the reduced rate is 10% and concerns tourist services and some types of food, and the minimum rate is 4% for basic necessities. Law 190/2014 states that VAT rates will increase from 10% to 12% and from 22% to 24% in 2016 if other resources to meet budget constraints are not found (safeguard clause). In the following years further increases are established (see Table 12).
Table 12
The third policy is the variation in health and education expenditure between 2013 and 2016, as projected by the estimated budget of the State and by the Economic and Financial Document (DEF) 2015. In the last few years the central government decided on many cuts and spending-review operations (Table 13). Between 2013 and 2016 there will be a decrease in education public spending and an increase in health spending (current prices). These variations (in percentage) are applied to the monetary value of the in-kind transfers estimated in MicroReg to evaluate the redistributive effects.
Table 13
In order to evaluate the total distributive effects of the three policies we compare the disposable income in 2013 (without the fiscal bonus and the increase in VAT and with the 2013 value of in-kind transfers) and the disposable income in 2016 (with the fiscal bonus, the increase in VAT and with the 2016 value of in-kind transfers) (Figure 5).
Figure 5
The fiscal bonus has a strong effect and it is more favourable to central quintiles. It costs about 656 million euro and involves 788 thousand beneficiaries in Tuscany (about 21% of the total population, living in 38% of the total number of households).
The increase in VAT rates has a clear regressive impact, since the disposable income is reduced more for the first quintile with respect to the others. Similar results are in Arachi et al. (2012). By simulating an increase to 23.5% and to 12.5% in the ordinary and the reduced tax rate they observe a regressive redistribution.
Cuts in school expenditure have a clear regressive impact, since the reduction in disposable income decreases by quintile of income. The effect of the reduction in university expenditure is negligible. The increase in disposable income due to the rise in health expenditure is higher for the first quintiles of income. Considering all three policies together, the effect is positive for central quintiles, with a variation in disposable income of about 0.8% (thanks especially to the fiscal bonus). For the first and the last quintile, the increase in disposable income is lower. Therefore, if the increase in VAT rates is not avoided, the positive effect of the fiscal bonus on consumption expected by the national government (mostly from the first quintile) could fail to materialise.
Table 14
Inequality indexes, calculated at the household level, confirm these results (Table 14). After the increase in VAT the Gini increases. On the contrary, it decreases with the introduction of the fiscal bonus. In-kind transfers have a positive effect on inequality: when their value decreases (school and university) the Gini increases; the opposite happens when their value increases (health).
Footnotes
1.
Due to data availability the model is extended to in-kind transfers only for the Region of Tuscany.
2.
However, also in SHIW households’ wealth is highly under-estimated.
3.
Second homes (when more than one) are considered together in the tax paid that remains after the dwelling house. This simplification should not be a problem since second homes are basically subject to the same taxation, independently of their number.
4.
The small difference depends on the sequence of the phases of the model. Indeed, the imputation of cadastral value precedes the sample weights calibration.
5.
The data of Dipartimento delle Finanze and Agenzia del Territorio (2012) is a projection.
6.
Actually, in 2012 the cadastral value of the dwelling house is not included in gross income, so it does not need to be deducted.
7.
The imputation is made for each taxpayer, so in our model the number of taxpayers with positive gross income is equal to the number of taxpayers with positive taxable income.
8.
Consequently, each taxpayer has a positive tax credit.
9.
Results for each Italian Region can be found in appendix (Tables A.1 and A.2).
10.
The comparison on the average value of gross income by gross income classes can be found in appendix (Figure A.1).
11.
The problem is detected also in Di Nicola et al. (2015). They can recover the small amounts of income within PIT and real estate datasets.
12.
The average value of PIT by gross income classes can be found in appendix (Figure A.2).
13.
Gastaldi et al. (2014) evaluate some VAT reforms on the integrated database built in Pisani and Tedeschi (2014).
14.
Tax base and tax revenue for each Region is reported in appendix (Table A.3). In appendix it is also shown the distribution of tax revenue by income and expenditure deciles (Figure A.3).
15.
Table 10 “Imposte indirette prelevate dalle Amministrazioni pubbliche e dall'Unione europea per tipo di tributo. Anni 1995 - 2015 (milioni di euro correnti)”.
16.
See Bianchini et al.(2013) for a previous application.
17.
The education module can be easily extended to the rest of Italy. For health data are available only for Tuscany.
18.
Data are taken from Comitato Nazionale di Valutazione del Sistema Universitario (CNVSU).
19.
The services for which regional data are available do not cover the entire public health expenditure. Prevention, homecare, assistance against addiction, mental health assistance, primary and district care are excluded.
20.
Sonedda and Turati (2005) find a regressive but weak and not highly significant impact of health expenditure. They follow the insurance value approach.
Table A.1
Table A.2
Figure A.1
Figure A.2
Table A.3
Figure A.3
Table A.4
Table A.5
Table A.6
References
1. 1
Fiscal Reforms during Fiscal Consolidation: The Case of Italy
(2012)
FinanzArchiv: Public Finance Analysis 68:445–465.
2. 2
Gli effetti distributivi dei trasferimenti in-kind: il caso dei servizi educativi e sanitari
(2007)
In: A Brandolini, C Saraceno, editors. Povertá e benessere. Una geografia delle disuguaglianze in Italia. Bologna: Il Mulino. pp. 411–422.
3. 3
La SAM come strumento di integrazione e analisi
(2009)
Rivista di statistica ufficiale 2-3:35–62.
4. 4
A che cosa servono i modelli di microsimulazione? Tre applicazioni usando microReg
(2012)
Scienze Regionali 2:101–119.
5. 5
Federalismo fiscale e redistribuzione: l’effetto distributivo dei benefici in-kind a livello regionale.
(2013)
Un’applicazione a due Regioni italiane, IRPET.
6. 6
ITALY (IT)
(2016)
2013–2016, ITALY (IT), Euromod version G4.0.
7. 7
Comparative Analysis of Different Techniques to Impute Expenditures into an Income Data Set
(2007)
Comparative Analysis of Different Techniques to Impute Expenditures into an Income Data Set, Project no: 028412, AIM-AP Accurate Income Measurement for the Assessment of Public Policies.
8. 8
The static microsimulation model of the Italian Department of Finance: Structure and first results regarding income and housing taxation
(2015)
Economia pubblica.
9. 9
Gli immobili in Italia 2012 - Ricchezza, reddito e fiscalitá immobiliare
(2012)
Gli immobili in Italia 2012 - Ricchezza, reddito e fiscalitá immobiliare.
10. 10
Asimmetrie territoriali nel gap IVA, Atti della Conferenza XXIV SIEP “Economia informale, evasione fiscale e corruzione”
(2012)
Universitá di Pavia, SIEP.
11. 11
Statistical Matching and Official Statistics
(2002)
Rivista di statistica ufficiale 1:5–24.
12. 12
Progressivity-improving VAT reforms in Italy, University of Pavia
(2014)
wp: SIEP.
13. 13
Public health transfers in kind: measuring the distributional effects in Italy, Universitá Commerciale Luigi Bocconi, Econpubblica Centre for Research on the Public Sector, Working Paper, (145)
(2009)
Public health transfers in kind: measuring the distributional effects in Italy, Universitá Commerciale Luigi Bocconi, Econpubblica Centre for Research on the Public Sector, Working Paper, (145).
14. 14
Imputation of gross amounts from net incomes in household surveys: an application using EUROMOD. EUROMOD Working Paper EM1/01
(2001)
Imputation of gross amounts from net incomes in household surveys: an application using EUROMOD. EUROMOD Working Paper EM1/01.
15. 15
Sintesi dei conti ed aggregati economici delle Amministrazioni pubbliche
(2016)
Sintesi dei conti ed aggregati economici delle Amministrazioni pubbliche.
16. 16
Il modello di microsimulazione fiscale dell’IRPET MicroReg
(2008)
Il modello di microsimulazione fiscale dell’IRPET MicroReg, IRPET 5/08-ebook.
17. 17
Modelling the Redistributive Impact of Indirect Taxes in Europe: An Application of Eurmod. EUROMOD Working Paper no. EM7/01
(2004)
Modelling the Redistributive Impact of Indirect Taxes in Europe: An Application of Eurmod. EUROMOD Working Paper no. EM7/01.
18. 18
Developing a static microsimulation model for the analysis of housing taxation in Italy
(2011)
International Journal of Microsimulation 4:73–85.
19. 19
Micro Data Fusion of Italian Expenditures and Incomes Surveys. WP n.164, Working Papers Series of the Department of Public Economics - Sapienza University of Rome
(2014)
Micro Data Fusion of Italian Expenditures and Incomes Surveys. WP n.164, Working Papers Series of the Department of Public Economics - Sapienza University of Rome.
20. 20
Winners and losers in the Italian Welfare State: A microsimulation analysis of income redistribution considering in-kind transfers
(2005)
Giornale degli Economisti 64:423–464.
21. 21
Final report EUROMOD: An Integrated European Benefit Tax Model. Working Paper, n. EM9
(2001)
Final report EUROMOD: An Integrated European Benefit Tax Model. Working Paper, n. EM9.
22. 22
The tax shift from labor to consumption in Italy: a fiscal microsimulation analysis using EUROMOD
(2012)
University of Genoa.
Article and author information
Author details
1. M Luisa Maitino
IRPET (Regional Institute for Economic Planning of Tuscany), Italy
For correspondence
[email protected]
2. Letizia Ravagli
IRPET (Regional Institute for Economic Planning of Tuscany), Italy
For correspondence
[email protected]
3. Nicola Sciclone
IRPET (Regional Institute for Economic Planning of Tuscany), Italy
For correspondence
[email protected]
Publication history
1. Version of Record published: April 30, 2017 (version 1) |
# Generalizing $$J_2$$ Flow Theory: Fundamental Issues in Strain Gradient Plasticity
Title: Generalizing $$J_2$$ Flow Theory: Fundamental Issues in Strain Gradient Plasticity
Author: Hutchinson, John W.
Citation: Hutchinson, John W. 2012. Generalizing $$J_2$$ flow theory: Fundamental issues in strain gradient plasticity. Acta Mechanica Sinica 28(4): 1078–1086.
Full Text & Related Files: Hutchinson_Generalizing.pdf (155.9 KB; PDF)
Abstract: It has not been a simple matter to obtain a sound extension of the classical $$J_2$$ flow theory of plasticity that incorporates a dependence on plastic strain gradients and that is capable of capturing size-dependent behaviour of metals at the micron scale. Two classes of basic extensions of classical $$J_2$$ theory have been proposed: one with increments in higher order stresses related to increments of strain gradients, and the other characterized by the higher order stresses themselves expressed in terms of increments of strain gradients. The theories proposed by Muhlhaus and Aifantis in 1991 and Fleck and Hutchinson in 2001 are in the first class, and, as formulated, these do not always satisfy thermodynamic requirements on plastic dissipation. On the other hand, theories of the second class proposed by Gudmundson in 2004 and Gurtin and Anand in 2009 have the physical deficiency that the higher order stress quantities can change discontinuously for bodies subject to arbitrarily small load changes. The present paper lays out this background to the quest for a sound phenomenological extension of the rate-independent $$J_2$$ flow theory of plasticity to include a dependence on gradients of plastic strain. A modification of the Fleck-Hutchinson formulation that ensures its thermodynamic integrity is presented and contrasted with a comparable formulation of the second class wherein the higher order stresses are expressed in terms of the plastic strain rate. Both versions are constructed to reduce to the classical $$J_2$$ flow theory of plasticity when the gradients can be neglected and to coincide with the simpler and more readily formulated $$J_2$$ deformation theory of gradient plasticity for deformation histories characterized by proportional straining.
Published Version: doi:10.1007/s10409-012-0089-4
Terms of Use: This article is made available under the terms and conditions applicable to Open Access Policy Articles, as set forth at http://nrs.harvard.edu/urn-3:HUL.InstRepos:dash.current.terms-of-use#OAP
Citable link to this page: http://nrs.harvard.edu/urn-3:HUL.InstRepos:10196727
# How to Handle Sudden Burst in New HTTPS Connections?
## Solution 1:
Thank you @MichaelHampton for your help.
I found a solution for my problem, and hopefully it may help others (particularly if you are using Java).
I have heard many suggestions to simply increase nofile to allow more connections, but I'd like to start by reiterating that the problem is not that the server can't accept more connections; it's that it can't establish them quickly enough and is dropping connections.
My first attempt to solve this problem was to increase the connection queue through net.ipv4.tcp_max_syn_backlog, net.core.somaxconn and again in the application's server config where appropriate. For Vert.x this is server.setAcceptBacklog(...). This resulted in more connections being accepted into the queue, but it didn't make establishing the connections any faster. From a connecting client's point of view, connections were no longer reset due to queue overflow; establishing them just took much longer. For this reason, increasing the connection queue wasn't a real solution and just traded one problem for another.
Trying to narrow down where in the connection process the bottleneck was, I tried the same benchmarks with HTTP instead of HTTPS and found that the problem went away completely. My particular problem was with the TLS handshake itself and the server's ability to satisfy it.
With some more digging into my own application, I found that replacing Java's default SSLHandler with a native one (OpenSSL) greatly increased the speed of connecting via HTTPS.
Here were the changes I made for my specific application (using Vertx 3.9.1).
<!-- https://mvnrepository.com/artifact/io.netty/netty-tcnative -->
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-tcnative</artifactId>
<version>2.0.31.Final</version>
<classifier>osx-x86_64</classifier>
<scope>runtime</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/io.netty/netty-tcnative -->
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-tcnative</artifactId>
<version>2.0.31.Final</version>
<classifier>linux-x86_64-fedora</classifier>
<scope>compile</scope>
</dependency>
The first dependency is for osx to test at runtime. The second is for CentOS Linux when compiled. linux-x86_64 is also available for other flavors. I tried to use boringssl because openssl doesn't support ALPN, but after many hours I couldn't get it to work, so I've decided to live without HTTP/2 for now. With most connections only sending 1-2 small requests before disconnecting this really isn't an issue for me anyway. If you could use boringssl instead, that's probably preferred.
1. Because I am not using an uber version of the dependency, I needed to install the OS dependencies for CentOS. This was added to the Dockerfile:
RUN yum -y install openssl
RUN yum -y install apr
2. To tell the Vert.x server to use OpenSSL instead of the Java version, set the OpenSSL options on the server (even if just the default object):
httpServerOptions.setOpenSslEngineOptions(new OpenSSLEngineOptions());
3. Finally, in my run script, I added the io.netty.handler.ssl.openssl.useTasks=true option to Java. This tells the SSL handler to use tasks when handling the requests so that it is non-blocking.
java -Dio.netty.handler.ssl.openssl.useTasks=true -jar /app/application.jar
After these changes, I am able to establish connections much quicker with less overhead. What took tens of seconds before and resulted in frequent connection resets now takes 1-2 seconds with no resets. Could be better, but a big improvement from where I was.
## Solution 2:
Nice fix!
So it seems to be the SSL layer. It certainly has to do a lot more processing, in terms of network handshakes and crypto transformations, which take resources. Unless your SSL can offload some of the processing onto hardware, SSL can certainly increase the load on your servers, and as you found out, not all SSL libraries are created equal!
These problems are a great candidate for a front-end reverse proxy. This can ideally be placed in front of your application, handling all SSL connections to clients and then speaking plain HTTP to your back end.
Your original application has a little bit less to do, as your front-end reverse proxy can soak up all the SSL work and TCP connection management.
Apache and NGINX can do this, and both have quite a few options for load balancing those connections to the least loaded backend server.
You will find that NGINX can do SSL termination a lot faster than Java can, and even if Java could keep up, you're distributing the connection-management work across machines, thus reducing load (memory/CPU/disk IO) on your back-end server. You also get the side effect of making the configuration of the back end simpler.
The downside is that you're using plain HTTP between your proxy and applications, which in some ultra-secure environments is not desirable.
Good Luck! |
# If $\sum a_n$ diverges and $\lambda_n \to \infty$, does the series $\sum \lambda_na_n$ diverge?
Suppose that the series $\displaystyle \sum a_n$ diverges and $\lambda_n \to \infty$. Does the series $\displaystyle \sum \lambda_na_n$ diverge? And what happens if $\{\lambda_n\}$ is an unbounded increasing sequence?
If $\displaystyle \sum a_n$ is a series of positive terms, then from $\lambda_n \to \infty$ we can say that for any large $M>0$ we can always find $m\in \Bbb{N}$ such that $\lambda_n>M \quad \forall n>m$, so that $a_n\lambda_n>a_nM$; therefore the series $\displaystyle \sum a_n\lambda_n$ diverges by the comparison test. But what happens if I don't assume it to be a positive-term series? And why is the second part of the problem different from the first?
• I think you meant to say "$a_n\lambda_n > a_nM$" – IAmNoOne Aug 10 '14 at 5:48
• If you don't assume $a_n > 0$ , than how did you come up with the inequality $a_n \lambda_n > a_n$? – IAmNoOne Aug 10 '14 at 5:54
• yes edited, that's what I'm asking I'm able to do this for $a_n>0$ ,but what happens if I don't assume that? – Bhauryal Aug 10 '14 at 5:56
• I missed your last question. There is no difference, see proofwiki.org/wiki/… – IAmNoOne Aug 10 '14 at 5:57
Let $$\sum a_n=1-{1\over4}+{1\over3}-{1\over8}+{1\over5}-{1\over12}+\cdots$$ alternate the reciprocals of odd numbers and multiples of $4$ and let
$$\lambda_n=\begin{cases} \sqrt n\quad\text{if n is odd}\\ 2\sqrt n\quad\text{if n is even}\\ \end{cases}$$
Then
$$\sum\lambda_n a_n=1-{1\over\sqrt2}+{1\over\sqrt3}-{1\over\sqrt4}+{1\over\sqrt5}-{1\over\sqrt6}+\cdots$$
which is a (conditionally) convergent series (because it is alternating and its terms decrease strictly to $0$).
To see why $\sum a_n$ diverges, suppose it converged. Then we would have
$$\sum a_n-\left(1-{1\over2}+{1\over3}-{1\over4}+{1\over5}-{1\over6}+\cdots \right)={1\over4}+{1\over8}+{1\over12}+\cdots$$
But that last sum is $\frac{1}{4}\left(1+\frac12+\frac13+\cdots\right)$, which diverges — a contradiction. Hence $\sum a_n$ diverges.
A Python module to compute multidimensional arrays of evaluated functions.
## Project description
<img align="left" src="https://api.travis-ci.org/NiMlr/PyFunctionBases.svg?branch=master">
<img align="right" width="300" height="300" src="https://user-images.githubusercontent.com/39880630/56446422-dc61eb80-6302-11e9-8b46-78c0a9d08420.gif">
# PyFunctionBases
A Python module to compute multi-dimensional arrays of evaluated functions based on Numpy. This module can be used for evaluation of functions, approximation or for feature engineering in machine learning.
Specifically, the module evaluates basis functions on intervals by employing a recursive formula of type
<p align="center">
<img src="https://latex.codecogs.com/gif.latex?f_{n+1}(x)&space;=&space;g(f_n(x),&space;\dots,&space;f_0(x),x)." title="f_{n+1}(x) = g(f_n(x), \dots, f_0(x),x)." />
</p>
This is generalized to the multi-dimensional case by using a tensor product
<p align="center">
<img src="https://latex.codecogs.com/gif.latex?(f_i({x_m}_k),f_j({x_m}_l))&space;\mapsto&space;f_i({x_m}_k)f_j({x_m}_l)" />
</p>
repeatedly on coordinate-wise one-dimensional function bases. The code is vectorized over the evaluation points
<p align="center">
<img src="https://latex.codecogs.com/gif.latex?x_m&space;\in&space;\mathbb{R}^{num\_dim},&space;m&space;\in&space;\{1,&space;\dots,&space;num\_samples\}" />
</p>
and returns a multi-dimensional array of shape (num_samples, degree+1, ..., degree+1), where degree
is the cardinality of the one-dimensional bases omitting a constant function. The following picture shows the two-dimensional case.
<p align="center">
<img width="399" height="323" src="https://user-images.githubusercontent.com/39880630/56447919-80e82b80-630b-11e9-92bd-6d81b0d78946.png">
</p>
Currently, the following functions are available:
| Name | Domain |
|-------|-----------|
| [standard_poly](https://en.wikipedia.org/wiki/Polynomial) | (-Inf, Inf)|
| [legendre_poly](https://en.wikipedia.org/wiki/Legendre_polynomials) | [-1, 1]|
| [legendre_rational](https://en.wikipedia.org/wiki/Legendre_rational_functions) | [0, Inf)|
| [chebyshev_poly](https://en.wikipedia.org/wiki/Chebyshev_polynomials#First_kind) | [-1, 1]|
Please make sure that your data lies in these domains, checks will be run if desired.
### Contents
[1. Installation](#installation)
[2. Simple usage](#simple-usage)
[3. Where evaluation of polynomials can fail](#where-evaluation-of-polynomials-can-fail)
## Installation
Requirements: pip3 install numpy
```bash
pip3 install pyfunctionbases
```
## Simple Usage
Now a simple example using standard polynomials is given. By exchanging the name parameter, you can try different functions.
```python
from pyfunctionbases import RecursiveExpansion
import numpy as np
# create some data to evaluate basis functions on
num_samples = 1000
num_dim = 2
x = np.random.uniform(low=0.0, high=1.0, size=(num_samples, num_dim))
# create an expansion object where name can be any
# function name, that is in the table below
degree = 10
name = 'standard_poly'
expn = RecursiveExpansion(degree, recf=name)
# evaluate the function, result is of shape (num_samples, degree+1, degree+1)
f_ij = expn.execute(x)
# flatten the result if needed
f_k = f_ij.reshape(num_samples, (degree+1)**num_dim)
```
## Where evaluation of polynomials can fail
When evaluating functions it is easy to encounter numerical pitfalls. For polynomials specifically one can take measures to avoid problems with floating point numbers, e.g. by employing the representation indicated on the right hand side of the equation `c_1*(x**2) + c_0*x = x*(c_1*x + c_0)`. Generalizing the former, one can avoid unnecessarily large or small numbers during the evaluation that are caused by large powers and which are badly represented by floating point numbers.
In approximation on the other hand, a basis representation like [x**n, ..., x**0] is useful in the search for the right coefficients. This is a case where e.g. Legendre polynomials provide a useful alternative basis that covers the exact same function space when the same degrees are considered. In the following code snippet, we can observe an example of this.

```python
from pyfunctionbases import RecursiveExpansion
import numpy as np
import matplotlib.pyplot as plt
# create some data
samples = 1000
x = np.random.uniform(low=-1.0, high=1.0, size=(samples,))
x.sort()
# evaluate a function to approximate on the data
fvals = np.tanh(x)*np.cos(50*x)
# set some a maximum degree for the polynomials
degree = 50
# initialize the RecursiveExpansion
expnleg = RecursiveExpansion(degree, recf='legendre_poly')
expnstan = RecursiveExpansion(degree, recf='standard_poly')
# compute the basis functions
basisleg = expnleg.execute(x[:, None], prec=1e-6)
basisstan = expnstan.execute(x[:, None], prec=1e-6)
# find the coefficients of the least squares fit
# to the function given the data
solleg = np.linalg.lstsq(basisleg, fvals, rcond=None)
solstan = np.linalg.lstsq(basisstan, fvals, rcond=None)
# plot the result
plt.plot(x, fvals)
plt.plot(x, np.matmul(basisleg, solleg[0]))
plt.plot(x, np.matmul(basisstan, solstan[0]))
plt.show()
```
1. sarah_hendrix7
2. Hoa
do you want the whole steps or just a result? the final result is an = -7 *6^n
3. Hoa
a(n+1) -6 a(n) =0 characteristic equation is x -6 =0 x =6 the equation has the form an= C*6^n replace a1 = C *6^1 = -42 ---->C = -7 replace C into an form to get an= -7 *6^n Hope this help. are you in discrete math course?If so, you will understand what i mean
4. ParthKohli
LOL
5. Hoa
What's wrong? and what makes you laugh
6. Hoa
@ParthKohli : I am a student and want to study from others. Please tell me if I am wrong in my work, I appreciate when you correct my mistake or my flaw
7. ParthKohli
You have left a lot of space. :-)
8. Hoa
I don't know why the net post it up. I am not good at computer
9. ParthKohli
I think we can do this using the geometric sequence formula.
10. Hoa
yes, you are right. show me your work. now is your turn
11. ParthKohli
12. ParthKohli
$$a_1 = -42$$ and $$r = -6$$. Now use the following formula:$a_n = a_1 \times r^{n - 1}$
13. Hoa
how to get r?
14. ParthKohli
Strictly speaking, if you have:$a_{n + 1} = {\rm some ~number} \times a_n$For all $$n$$, then the "some number" is the common ratio denoted by $$r$$.
15. ParthKohli
I'd leave it to the asker to do the rest.
16. Hoa
I need help from my problem, may I have yours?
17. ParthKohli
18. Hoa
my problem is: find the coefficient of x^10 in the power series of (x^2 +x^4 +....) (x^3 +x^6 +x^9 +....)(x^4 +x^8 +x^12 +....)
19. ParthKohli
This is an interesting problem. Let's see.
20. ParthKohli
You can just message me here; I check this site much often... more than I check my email
21. Hoa
dealt. since I have to go. send me message when you've done. Nice to see.
22. ParthKohli
Cya! |
# Fazle R Dayeen
Blog
Hello there,
### I'm Dayeen
#### Graduate Student, Code Geek & Enthusiast Photographer
I am a physics graduate student at University of Illinois Chicago. Also working as a teaching assistant there. I enjoy reading. The knowledge and perspective that my reading gives me has strengthened my teaching skills and presentation abilities. People find me to be an upbeat, self-motivated team player with excellent communication skills. I love travelling and photography. I love blogging in my free time. You can check my blog here.
Scroll down
• GitHub
Check my codes on GitHub
• 500px
Watch my crazy videos on YouTube
#### Projects & Publications
##### Multi-multifractality and dynamic scaling in WPSL
Journal: Chaos, Solitons & Fractals, Published: Oct /2016
We show that except the conservation of total area, each of the infinitely many non-trivial conservation laws is actually a multifractal measure and hence weighted planar stochastic lattice (WPSL) is a multi-multifractal. We also find that the Lewis law for weighted planar stochastic lattice is obeyed for up to $k=8$ and the Aboav–Weaire law is violated for the entire range of $k$.
##### Cause-Effect Pair Detection
Project Completed: May /2016
We attempt to build a classification-based approach to solve the cause-effect pair detection problem. The core of the exploration in our approach was centered around several classes of features we designed for two given distributions. Mostly these were statistical features that explicitly represent some unique properties of the distributions.
##### Percolation and Cluster Distribution
Project Completed: April /2015
We examine cluster labeling technique in percolation and establish a relation between the number of clusters and site occupation probability using Hoshen-Kopelman algorithm.
##### Analysis of Degrees of Freedom For Vector Tensor Models
MSc. Thesis Published: Sept /2012
We consider a model in which vector gauge field is coupled directly to an antisymmetric tensor field in presence of a pseudoscalar and a pure scalar mass term and hence calculate the degrees of freedom.
#### Education
Department of Physics, University of Illinois Chicago Current
Graduate student in UIC. Researching with computational material science group. Expected to graduate soon.
##### Master of Science
Department of Theoretical Physics,University of Dhaka Graduated/ 2012
Completed my MSc. degree in Theoretical Physics with a thesis on Quantum Field Theory.
##### Bachelor of Science (Hons.)
Department of Physics, University of Dhaka Graduated/ 2011
Graduated as Physics major with the minor in Chemistry, Statistics and Mathematics.
#### Experience
##### Teaching Assistant
University of Illinois Chicago Aug/ 2014- Current
Working as Teaching Assistant at the Department of Physics for more than two years. Responsibility includes facilitate lab session, aid students to understand materials, proctor and grade exams.
##### Student Departmental Representatives
Graduate Student Council at UIC Aug/ 2016- July/ 2017
Our aim is to represent the interests of graduate students in campus committees throughout the year. Hosting academic workshops and seminars and encourage collaboration across academic programs .
#### I Specialized in
###### Computing Softwares
Mathematica, Matlab, Gnuplot
###### Programming Languages
C/C++, Python, Java, R
#### My Interests
My hobbies and passions
###### Photography
My passion
Checkout my photography at Flickr, 500px or follow me on Instagram
###### Travelling
Places I have visited:
• Grand Canyon
• Niagara Falls
###### Bucket List
Things I have done
• Participated in National Judo Competition
• Earned Blackbelt in two types of martial arts (Judo & Karate)
• Skiing 1260ft from Mountain
• Skydiving from 14,000ft above the sea level
• Drove NASCAR in racing track (Top speed 224.80 km/h)
• Won a Fencing duel
#### Get in Touch
Question? Comments? Feel free to drop me a line |
# If $G=(V,E)$ is a planar graph, all vertex degrees are $3$, all faces are of five/six edges, how many five-edged faces are there?
Given a graph $G = (V,E)$, a planar graph where every vertex has degree $3$ and all faces are five-edged or six-edged. How many five-edged faces are there?
It was a question in one of our previous exams, the answer is $12$, but I have no clue on how to solve it. I mean, I can have as many faces as I want, why is it limited to $12$ and $12$ only?
-
You can have as many faces as you want? Have you actually tried to come up with one with (say) $0$ five-sided faces? – Chris Eagle Jun 21 '13 at 17:40
if I draw a 6-edged face and given that every vertex is of degree 3, I still can and have to draw infinite 6-edged faces... I don't know. What's the direction to solve this? – TheNotMe Jun 21 '13 at 17:43
Possible duplicate of Restrictions on the faces of a 3-regular planar graph – Vedran Šego Jun 21 '13 at 17:49
Since the graph is planar
$$V-E+F=2 \,.$$
As each vertex has degree $3$, by Handshaking lemma we have $2E=3V$.
Let $f_1$ be the number of $5$-edged faces and $f_2$ be the number of $6$-edged faces. Then
$$2E=5f_1+6f_2 \,.$$ $$f_1+f_2=F \,.$$
From here we get $$2E=6F-f_1 \,.$$
Thus, plugging everything in the first relation, you get:
$$\frac{2E}{3}-E+\frac{2E+f_1}{6}=2 \,.$$
Since all the $E$'s cancel, this reduces to $f_1/6=2$, i.e. $f_1=12$: there are always exactly $12$ five-edged faces.
-
I have trouble understanding how do we get $2E = 5f_1 + 6f_2$? I understand that $5f_1 + 6f_2$ is the total number of edges in those faces, but not all edges belong to two faces (some are on the outer edge of the graph), right? The other question didn't explain that either. – Vedran Šego Jun 21 '13 at 17:59
@VedranŠego Don't forget that the outside is also a face, is the infinite face. The formula $V-E+F=2$ counts the infinite face too. Because of this, all edges belong to exactly $2$ faces. – N. S. Jun 21 '13 at 18:02
ALL edges belong to two faces, don't forget that there is the infinite face. – TheNotMe Jun 21 '13 at 18:03
Thank you N.S. :) – TheNotMe Jun 21 '13 at 18:04
I'm sorry, can you explain again how you got $2E = 5f_1 + 6f_2$ please? – TheNotMe Jun 21 '13 at 18:08 |
# Sobolev space is an algebra
How do you prove that the Sobolev space $H^s(\mathbb{R}^n)$ is an algebra if $s>\frac{n}{2}$, i.e. if $u,v$ are in $H^s(\mathbb{R}^n)$, then so is $uv$? Actually I think we should also have $||uv||_s \leq C ||u||_s ||v||_s$. Recall that $||f||_s=||(1+|\eta|^2)^{\frac{s}{2}}\hat{f}(\eta)||$, the norm on $H^s(\mathbb{R}^n)$. This is an exercise from Taylor's book, Partial differential equations I.
-
So that would mean that if I have some functions in $\mathbb R^3$ that are twice weakly differentiable, then all products between those functions would also be twice weakly differentiable? Interesting. – Elmar Zander Feb 26 '13 at 13:23
Note that $$\begin{split} (1+|\xi|^2)^p &\leq (1+2|\xi-\eta|^2+2|\eta|^2)^p\\ &\leq 2^p(1+|\xi-\eta|^2+1+|\eta|^2)^p\\ &\leq c(1+|\xi-\eta|^2)^p + c(1+|\eta|^2)^p, \end{split}$$ for $p>0$, where $c=\max\{2^{p},2^{2p-1}\}$. Put $\langle\xi\rangle=\sqrt{1+|\xi|^2}$. Then we have $$\begin{split} \langle\xi\rangle^s |\widehat{uv}(\xi)| &\leq \int \langle\xi\rangle^s |\hat{u}(\xi-\eta)\hat{v}(\eta)|\,\mathrm{d}\eta\\ &\leq c\int \langle\xi-\eta\rangle^s |\hat{u}(\xi-\eta)\hat{v}(\eta)|\,\mathrm{d}\eta + c\int \langle\eta\rangle^s |\hat{u}(\xi-\eta)\hat{v}(\eta)|\,\mathrm{d}\eta\\ &\leq c|\langle\cdot\rangle^s\hat u|*|\hat v| + c|\hat u|*|\langle\cdot\rangle^s\hat v|, \end{split}$$ which, in light of Young's inequality, implies $$\|uv\|_{H^s} \leq c\|u\|_{H^s} \|\hat v\|_{L^1} + c\|\hat u\|_{L^1}\|v\|_{H^s}.$$ Finally, we note that $\|\hat u\|_{L^1}\leq C\,\|u\|_{H^s}$ when $s>\frac{n}2$.
-
I tried to use that inequality but I get just $||fg||_s \leq ||f||_s ||(1+|\eta|^2)^s \hat{g}(\eta)||_{L^1}$, which is good if $g \in C_{0}^{\infty}$, but not always on $H^s(\mathbb{R}^n)$. – Frank Zermelo Feb 27 '13 at 3:45
@FrankZermelo: I updated the answer. – timur Mar 2 '13 at 1:45
Just beautiful! Thanks a lot! Very slick that you're not using Peetre's inequality with product, but with sum. – Frank Zermelo Mar 2 '13 at 2:22
@ timur, why does: $\| \hat{u} \| _{L^1} \leq C \| u \| _{H^s}$? – MathematicalPhysicist Dec 12 '14 at 7:28
@MathematicalPhysicist: Write $\|\hat u\|_{L^1}=\int<\xi>^{-s}<\xi>^{s}|\hat u(\xi)|\mathrm{d}\xi$, and apply Cauchy-Schwarz. The condition $2s>n$ gives integrability of $<\xi>^{-2s}$. – timur Dec 14 '14 at 0:07
One way to see this is by an argument similar to proof of a "trace theorem": first, for $f,g\in H^s(\mathbb R^n)$ with $s\ge 0$, $f\otimes g\in H^s(\mathbb R^{2n})$ because $1+|x|^2+|y|^2\le (1+|x|^2)(1+|y|^2)$. Next, prove an easy form of a "trace theorem", namely, that restriction from $\mathbb R^N$ to $\mathbb R^{N-n}$ maps $H^s$ to $H^{s-{n\over 2}}$ for $s> {n\over 2}$.
Edit: in response to the comments of @ElmarZander, ... The question, as originally posed, cannot be quite right, no. The argument sketched here shows that for $s>n/2$ the product of two elements in $H^s(\mathbb R^n)$ is in $H^{s-n/2-\varepsilon}$ for every $\epsilon>0$. I do not know whether higher-dimensional results can be sharpened, but for $n=1$ it is easy to do explicit examples showing the limitation: take $\hat{f}=\hat{g}$ to be $|x|^{-3/4-\varepsilon}$ for $x\ge 1$ and $0$ otherwise. These are in $H^{1/2+\varepsilon'}(\mathbb R)$. Then the convolution has a lower bound $x^{-1/2-2\varepsilon}$, I believe, so $fg$ is not in $H^{1/2+\varepsilon'}$.
-
That's a pretty cute proof! +1 – Willie Wong Feb 26 '13 at 17:30
@WillieWong .. Thanks! :) – paul garrett Feb 26 '13 at 17:44
Forgive my ignorance, but if I get you correctly you restrict $\mathbb R^{2n}$ to $D=\{(x,x)|x\in\mathbb R^{n}\}$ and then map $H^s(\mathbb R^{2n})$ to $H^{s-n/2}(D)\simeq H^{s-n/2}(\mathbb R^{n})$. But then you lose differentiability, don't you? – Elmar Zander Feb 26 '13 at 18:26
For $s>n/2$, we have the a priori estimate $\|fg\|_{W^{s,p}}\lesssim_{n,p,s}\|f\|_{W^{s,p}}\|g\|_{W^{s,p}}$. See T. Tao's lecture notes (Week 4) for a proof. $H^{s}(\mathbb{R}^{n})=W^{s,2}(\mathbb{R}^{n})$. So this contradicts your claim. – Matt Rosenzweig Dec 26 '15 at 0:10
If $f,g\in H^1(\mathbb{R})$, then an application of Leibniz' rule, together with $\|\cdot\|_{L^\infty}\leq \|\cdot\|_{H^1(\mathbb{R})}$ and a limiting argument, shows that $fg\in H^1(\mathbb{R})$. I don't see why this shouldn't work for any integer-regularity Sobolev space that controls $L^\infty$ – Bananach Jan 28 at 14:45 |
Contest Duration: 100 minutes
E - Shiritori
Time Limit: 2 sec / Memory Limit: 1024 MB
Score : 500 points
### Problem Statement
The Takahashi Dictionary lists N words; the i-th word (1 \leq i \leq N) is s_i.
Using this dictionary, Takahashi and Aoki will play 3-shiritori, which goes as follows.
• Takahashi and Aoki alternately say words, with Takahashi going first.
• Each player must say a word beginning with the last three characters of the previous word. For example, if a player says Takahashi, the next player can say ship or shield along with other choices, but not Aoki, sing, or his.
• Uppercase and lowercase are distinguished. For example, a player cannot say ShIp following Takahashi.
• A player who becomes unable to say a word loses.
• A player cannot say a word not listed in the dictionary.
• The same word can be used multiple times.
For each i (1 \leq i \leq N), determine who will win when Takahashi starts the game by saying the word s_i. Here, we assume that both players play optimally. More specifically, each player gives first priority to avoiding his loss and second priority to defeating the opponent.
### Constraints
• N is an integer between 1 and 2 \times 10^5 (inclusive).
• s_i is a string of length between 3 and 8 (inclusive) consisting of lowercase and uppercase English letters.
### Input
Input is given from Standard Input in the following format:
N
s_1
s_2
\vdots
s_N
### Output
Print N lines. The i-th line (1 \leq i \leq N) should contain Takahashi if Takahashi wins when Takahashi starts the game by saying the word s_i, Aoki if Aoki wins in that scenario, and Draw if the game continues forever with neither of them losing in that scenario.
### Sample Input 1
3
abcd
bcda
ada
### Sample Output 1
Aoki
Takahashi
Draw
When Takahashi starts the game by saying abcd, Aoki will say bcda next, and Takahashi will then have no word to say, resulting in Aoki's win. Thus, we should print Aoki.
When Takahashi starts the game by saying bcda, Aoki will have no word to say, resulting in Takahashi's win. Thus, we should print Takahashi.
When Takahashi starts the game by saying ada, both players will repeat ada and never end the game. Thus, we should print Draw. Note that they can use the same word any number of times.
### Sample Input 2
1
ABC
### Sample Output 2
Draw
### Sample Input 3
5
eaaaabaa
eaaaacaa
daaaaaaa
daaaafaa
### Sample Output 3
Takahashi
Takahashi
Takahashi
Aoki
Takahashi |
The curve $y=\sqrt{4-x^{2}}$ , $-1 \leq x \leq 1$ is an arc of the circle $x^{2}+y^{2}=4$. Find the area of the surface obtained by rotating this arc about the x-axis
area $=2 \pi \int y d s=2 \pi \int_{-1}^{1} y \sqrt{1+\left(\frac{dy}{d x}\right)^{2}} d x$
(1) $y=\sqrt{4-x^{2}}=\left(4-x^{2}\right)^{1 / 2}$
(2) $y^{\prime}=\frac{1}{2}\left(4-x^{2}\right)^{-\frac{1}{2}} \cdot(-2 x)=\frac{-x}{\left(4-x^{2}\right)^{1 / 2}}=\frac{-x}{\sqrt{4-x^{2}}}$
(3) $1+\left(y^{\prime}\right)^{2}=1+\left(\frac{-x}{\sqrt{4-x^{2}}}\right)^{2}=1+\frac{x^{2}}{4-x^{2}}=\frac{4-x^{2}}{4-x^{2}}+\frac{x^{2}}{4-x^{2}}=\frac{4}{4-x^{2}}$
area $=2 \pi \int_{-1}^{1} y \cdot \sqrt{1+y^{\prime 2}} d x=2 \pi \int_{-1}^{1} \sqrt{4-x^{2}} \cdot \sqrt{\frac{4}{4-x^{2}}} d x$
=$2 \pi \int_{-1}^{1} \sqrt{4-x^{2}} \cdot \frac{2}{\sqrt{4-x^{2}}} d x=2 \pi \int_{-1}^{1}2 d x=4 \pi \int_{-1}^{1} d x=4 \pi x |_{-1}^{1}$
$=4 \pi ({1} )-4 \pi(-1)=4 \pi+4 \pi=8 \pi$
Find the area of the surface obtained by the rotating the arc $4 x=y^{2}$ between (0,0) and (1,2) around the x-axis
area $=2 \pi \int y d s=2 \pi \int_{0}^{2} y \cdot \sqrt{1+\left(\frac{{dx}}{d y}\right)^{2}} d y$
$4 x=y^{2} \rightarrow x=\frac{y^{2}}{4} \rightarrow \frac{d x}{d y}=\frac{1}{2} y \rightarrow\left(\frac{d x}{d y}\right)^{2}=\frac{y^{2}}{4}$
area $=S=2 \pi \int_{0}^{2} y \cdot \sqrt{1+\frac{y^{2}}{4}} d y$
let $t=1+\frac{y^{2}}{4} \Rightarrow d t=\frac{y}{2} d y$
when $y= 0\rightarrow t=1$
when $y=2 \rightarrow t=2$
area $=S=2 \pi \int_{1}^{2} \sqrt{t}\,(2\, d t) \quad \text{since } y\, d y=2\, d t$
$=4 \pi \int_{1}^{2} t^{\frac{1}{2}}\, d t$
$S=\left.\frac{4 \pi t^{\frac{3}{2}}}{\frac{3}{2}}\right|_{1} ^{2}=4 \pi \times \frac{2}{3} t^{3 / 2} |_{1}^{2}=\frac{8 \pi}{3}\left[2^{\frac{3}{2}}-1^{3 / 2}\right]$
$=\frac{8 \pi}{3}[\sqrt{8}-1]$ |
# Flavor Instabilities and Dispersion-Relation Gaps
One can find the flavor instabilities of a neutrino medium by solving the dispersion relations (DR) between $\omega$ and $\mathbf k$ from Eqn.~\eqref{eqn-dr-determinant-equation}. A negative imaginary component of $\omega$ indicates a flavor instability in time, and a positive imaginary component of $\mathbf k \cdot \hat{\mathbf z}$ indicates a flavor instability in the $+z$ direction. I. Izaguirre et al. conjectured that the flavor instabilities exist in the gaps between the real branches of the dispersion relations 1. In the first part of this section I will explain this conjecture, and in the second part I will show that it is actually incorrect.
## Conjecture of the Relation between Flavor Instabilities and DR Gaps
I will consider a model with neutrinos emitted symmetrically about the $z$ axis. For this model, the characteristic equation \eqref{eqn-dr-determinant-equation} reduces to \begin{align} &\det \left( \omega \mathsf{I} + \frac{1}{2} \begin{pmatrix} I_0 & 0 & 0 & -I_1 \\ 0 & -\frac{1}{2} (I_0 - I_2) & 0 & 0 \\ 0 & 0 & -\frac{1}{2} (I_0 - I_2) & 0 \\ I_1 & 0 & 0 & -I_2 \end{pmatrix}\right) =0, \label{eqn-det-polarization-tensor-axial} \end{align} where $\mathsf I$ is the rank-4 identity matrix and $$$$I_m =\int_{-1}^{1} \mathrm{d}u\, G(u) \frac{u^m}{1 - u/v_{\mathrm{ph}} },$$$$ where $u=\cos\theta$ and $v_{\mathrm{ph}}=\omega/k$. Here I have assumed $\mathbf k = k \hat{\mathbf z}$. Eqn.~\eqref{eqn-det-polarization-tensor-axial} is satisfied if $$$$\omega = - \frac{1}{4} \left( I_0 - I_2 \pm \sqrt{ (I_0 + I_2 - 2 I_1) (I_0 + I_2 + 2 I_1) } \right) \label{eqn-mza}$$$$ or $$$$\omega = \frac{1}{4}(I_0 - I_2). \label{eqn-maa}$$$$ I will call the solutions to Eqn.~\eqref{eqn-mza} the symmetric solutions (SS$+$ and SS$-$ for the solutions to $\omega$ with the $+$ and $-$ signs) because they preserve the axial symmetry about the $z$ axis. I will call the solutions to Eqn.~\eqref{eqn-maa} the asymmetric solutions (AS) because they break the axial symmetry.
To illustrate the conjecture by I. Izaguirre et al., I will consider the two-angle model where the neutrinos are emitted with two zenith angles $\theta_1$ and $\theta_2$. The ELN of the two-angle model is $$$$G(u)= \mu \sum_{i=1,2} g_i \delta(u - u_i),$$$$ where $\mu = \sqrt{2}G_{\mathrm F} n$ is proportional to the neutrino number density $n$, $g_i$ are real numbers, and $u_i=\cos \theta_i$ ($i=1,2$). For any real $k$, the characteristic equations \eqref{eqn-mza} and \eqref{eqn-maa} are both quadratic equations of $\omega$ with two solutions. In Fig. Dispersion Relations for Two-angle Model, I show the SS and AS of the dispersion relations $\omega(k)$ with $u_1=-0.6$, $u_2=0.6$ and $g_1=g_2=1$. In either case, both solutions $\omega(k)$ to the characteristic equations are real. However, there exist gaps between the DR curves where there is no real solution $k(\omega)$ for given real values of $\omega$. Because the characteristic equations are also quadratic equations of $k$ for any given real value of $\omega$, a pair of complex solutions $k (\omega) = k_{\mathrm R} (\omega) \pm \ri k_{\mathrm I}(\omega)$ exist in the DR gap of $\omega$ (see Fig. Dispersion Relations for Two-angle Model).
The observation of the result of the two-angle model leads I. Izaguirre et al. to conjecture that the flavor instabilities are associated with the DR gaps. They also studied the dispersion relations using a continuous ELN distribution from a 1D supernova simulation 2 which are reproduced in Fig. Dispersion Relations for Garching Model.
## Refutation of the Conjecture
The association of the flavor instability in space, i.e., $k_{\mathrm I}(\omega)\neq 0$, with a DR gap in $\omega$ seems fortuitous for the two-angle model. As I explained previously, for the two-angle model, the characteristic equations are quadratic equations of $k$ for any given $\omega$. Therefore there always exists a pair of complex solutions in the DR gap in $\omega$. This is not the case when the neutrinos are emitted with more than two zenith angles.
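Written out explicitly for the two-angle ELN (a short intermediate step included here for clarity), the asymmetric characteristic equation $\omega = (I_0-I_2)/4$ becomes, after clearing the denominators (for $\omega\neq 0$),
\begin{align}
4\omega = \mu \sum_{i=1,2} g_i \frac{(1-u_i^2)\,\omega}{\omega - u_i k}
\quad\Longrightarrow\quad
4(\omega - u_1 k)(\omega - u_2 k) = \mu \left[ g_1 (1-u_1^2)(\omega - u_2 k) + g_2 (1-u_2^2)(\omega - u_1 k) \right],
\end{align}
which is a quadratic in $k$ for fixed $\omega$ (and in $\omega$ for fixed $k$), so its two roots $k(\omega)$ are either both real or form a complex-conjugate pair. With three emission angles the same manipulation gives a cubic in $k$.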
In Fig. Dispersion Relations for Three-angle Model, I show the dispersion relations of a three-angle model with ELN \begin{align} G(u) = \frac{\mu}{2} \delta(u+0.1) + \mu \delta(u - 0.4) + \mu \delta(u-0.6). \end{align} For this model, there exist complex solutions of $k(\omega)$ even though there is no DR gap in $\omega$. This is because the characteristic equations are cubic equations of $k$ for a given $\omega$, which admit three solutions. In certain ranges of $\omega$ there is only one real solution of $k(\omega)$. The other two solutions must be complex, which indicates flavor instabilities in space.
The conjecture between the flavor instabilities and DR gaps does not work for the models with a continuous ELN distribution $G(u)$ either. In Fig. Dispersion Relations for Step-like ELN, I show the dispersion relations for a step-like distribution $$$$G(u) = \begin{cases} -0.1 & \text{if } -1 < u < -0.3, \\ 1 & \text{otherwise.} \end{cases} \label{chap:collective-sec:gap-eqn:eln-step-like}$$$$ One sees that, for this model, the SS$+$ and SS$-$ solutions merge into a continuous DR curve so that there is no gap in $\omega$. However, there do exist complex solutions $k(\omega)$ for certain values of $\omega$.
As shown in Fig. Dispersion Relations for Garching Model and Fig. Dispersion Relations for Step-like ELN, AS dispersion relations $k(\omega)$ appear only in the region where $\omega > 0$. This can be proved analytically as follows. Suppose there exist dispersion relations $k(\omega) = k_{\mathrm R}(\omega) + \ri k_{\mathrm I}(\omega)$ for $\omega \to 0$. Using the Sokhotski–Plemelj theorem, I can rewrite the characteristic equation $$$$k = \frac{1}{4} \int \mathrm du G(u) \frac{ 1 - u^2 }{ \omega/k - u }. \label{chap:collective-eqn:k-omega-relation}$$$$ as \begin{align} k_{\mathrm R} =& \frac{1}{4}\left( \mathcal{P} \int \mathrm d u G(u) \frac{ 1 - u^2 }{ - u } \right)\label{eqn-re-k-arbitrary-spectrum} \\ \lvert k_{\mathrm I} \rvert =& \frac{\pi}{4}G(0) \operatorname{Sign}\left( \omega \right), \label{eqn-k-arbitrary-spectrum} \end{align} where $\mathcal P$ denotes the principal value of the integral. As long as $G(0)\neq 0$, $\omega$ must have the same sign as $G(0)$ which implies that $k(\omega)$ exists only on one side of the $\omega=0$ axis. For the two scenarios depicted in Fig. Dispersion Relations for Garching Model and Fig. Dispersion Relations for Step-like ELN, $G(0)>0$ and $k(\omega)$ exist only in the upper half plane of $\omega$. This shows that, at least near $\omega =0$, the absence of a DR $\omega(k)$, i.e., a “gap'' in $\omega$, is not always associated with the flavor instabilities in space.
Eqn.~\eqref{eqn-k-arbitrary-spectrum} can be used to determine the values of $k_{\mathrm R}$ and $k_{\mathrm I}$ in the limit of $\omega\to 0$ for the AS branch of the dispersion relations. One can apply the same method to the SS branches which gives \begin{align} &\left(4 k_{\mathrm R} - J_{-1} + J_1 \right)^2 - \left( \operatorname{Sign}(\omega k_{\mathrm I} )\pi G(0) +4 k_{\mathrm I} \right)^2 \nonumber\\ =& - \left( J_{-1} + J_1 \right) \pi \operatorname{Sign}(\omega k_{\mathrm I} ) G(0) \label{chap:collective-eqn:dr-ss-general-limit-omega-0} \end{align} and $$$$\lvert k_{\mathrm I} \rvert = - \frac{\pi}{4} G(0) \operatorname{Sign}(\omega ) \left( 1 \pm \frac{ J_{-1} + J_1 }{ 4 k_{\mathrm R} - J_{-1} + J_1} \right), \label{chap:collective-eqn:dr-ss-general-limit-omega-0-ya}$$$$ where $$$$J_{n} = \mathcal P \int G(u)u^n \mathrm du,$$$$ and the $+$ and $-$ signs are for SS$+$ and SS$-$, respectively. Eqn.~\eqref{chap:collective-eqn:dr-ss-general-limit-omega-0} and Eqn.~\eqref{chap:collective-eqn:dr-ss-general-limit-omega-0-ya} show that $k(\omega)$ exists on both sides of the $\omega=0$ axis but may be different for SS$+$ and SS$-$.
1. Ignacio Izaguirre, "Fast Pairwise Conversion of Supernova Neutrinos: A Dispersion Relation Approach", Physical Review Letters 118, 021101 (2017) . ↩︎
2. The data is from the Garching Core-Collapse Supernova Archive at http://wwwmpa.mpa-garching.mpg.de/ccsnarchive/archive.html ↩︎ |
Annotations are text that is included at the top or bottom of an object's homepage. They are functionally similar to knowls, but are not required to be context-free; aside from that, the editing guidelines for knowls apply also to annotations.
• Top annotations should be one sentence, with no headings, preferably short enough to fit on one line. A typical top annotation will identify common nomenclature for an object ("This integral lattice is the Leech lattice.") or a distinguishing characteristic ("This is the elliptic curve of smallest conductor of rank 3."). When available, include an external reference (e.g., a Wikipedia page).
• Bottom annotations can be of arbitrary length, and may include subheadings. A typical bottom annotation will provide freeform discussion of important mathematical properties of a given object (e.g. "This is an example of an object with property XYZ.").
• Do not include anything in an annotation that could be put in a knowl instead (e.g., definitions of terms). Instead, create a separate knowl and link to it. If the definition falls outside the scope of LMFDB (e.g., bitcoin), use an external link to Wikipedia or some other source.
• Use the \cite command to cite relevant literature.
• It is not necessary to fill in the description field for an annotation. When in doubt, leave it blank.
• Do not link directly to annotations, but instead to the underlying LMFDB objects.
## What Is Scrap Value?
Scrap value is the worth of a physical asset's individual components when the asset itself is deemed no longer usable. The individual components, known as scrap, are worth something if they can be put to other uses. Sometimes scrap materials can be used as-is and other times they must be processed before they can be reused. An item's scrap value—also called residual value, break-up value, or salvage value—is determined by the supply and demand for the materials it can be broken down into.
## Formula and Calculation of Scrap Value
$$\text{Scrap Value} = \text{Cost of Asset} - (D \times \text{Useful Life})$$
where $D$ = depreciation.
### Key Takeaways
• Scrap value is the worth of a physical asset's individual components when the asset itself is deemed no longer usable.
• After a long-term asset—such as machinery, vehicle, or furniture—has gone through its useful life, it may be disposed of.
• Scrap value is also known as residual value, salvage value, or break-up value.
• Scrap value is the estimated cost that a fixed asset can be sold for after factoring in full depreciation.
## What Scrap Value Can Tell You
In financial accounting, capital assets or long-term assets, such as machinery, vehicles, and furniture, have a useful life. After the asset has gone through its useful life, it may be disposed of. However, given that a broken down or obsolete asset may still have some residual value, some businesses can dispose of the asset by selling it for its current value.
Scrap value is the estimated cost that a fixed asset can be sold for after factoring in full depreciation. The asset that is disposed of is usually salvaged into multiple parts, with each part valued and sold separately.
In the insurance industry, scrap value is the money that can be recovered for a damaged or abandoned property. With auto or property insurance, the estimated scrap value is subtracted from any loss settlement if the insured keeps the property. For example, an individual has an auto insurance policy with a $2,000 deductible. The insured is in an accident and the estimated trade-in value (scrap value) is $4,500. Thus, the insured individual would receive a settlement check from the insurer for $2,500.
### Negative Scrap Value
The scrap value of an asset can be negative if the cost of disposing of the asset results in a net cash outflow that is a contributing factor in the scrap value. For example, consider the value of land owned by a company that only slightly went up in value by the end of its useful life. The scrap value of the land may be negative if the cost of demolishing any building on the land is higher than the cost of the land and the market price for the individually demolished components that can be sold.
## Example of How to Use Scrap Value
Depending on the method of depreciation adopted by a company, such as the straight-line method or declining-balance method, the scrap value of an asset will vary. For example, assume a company purchases machinery worth $75,000 and estimates that the useful life of the machinery is 8 years at a depreciable rate of 12%. Using the straight-line depreciation method, the annual depreciation per year will be 12% x $75,000 = $9,000. The residual amount that the company can get if it disposes of the machinery after eight years is as follows:
• Scrap value = $75,000 - ($9,000 x 8) = $3,000
If the company, instead, used the declining-balance method of depreciation, its salvage value can be calculated as:
• Scrap value = $75,000 – $48,027.42 = $26,972.58
The scrap value can also be used to calculate the depreciation expense. Using our example above, if the company estimated a $3,000 residual value for the machinery at the end of 8 years, then it can calculate its depreciation expense per year to be ($75,000 - $3,000) / 8 = $9,000.
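The straight-line arithmetic above can be reproduced with a few lines of R; this is purely an illustration and uses only the figures already given in the example.
cost <- 75000                        # cost of the machinery
rate <- 0.12                         # straight-line depreciation rate
useful_life <- 8                     # years
annual_depreciation <- rate * cost   # 9000 per year
cost - annual_depreciation * useful_life   # scrap value: 3000
(cost - 3000) / useful_life                # implied depreciation expense: 9000 per year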
Having an estimate for the scrap value of a long-term asset can help a company figure out its annual depreciation cost, which is an important measure since it affects the level of a company’s net income. |
# Free answers to math word problem
Recent questions in Math Word Problem
Kaycee Roche 2021-01-10
York 2021-01-06
### You are selling tickets for a new college play: "A Handful of Math Miracles". Student tickets cost $4 and general admission tickets cost $6. You sell 525 tickets and collect $2876. How many of each type of ticket did you sell?
Wierzycaz 2020-12-21
### Worded problem: Follow these guided instructions to solve the worded problem below. a) Assign a variable (name your variable) b) Write expression/s using your assigned variable c) Formulate your algebraic inequality d) Solve the algebraic inequality e) State your answer. Worded inequality problem: Your math test scores are 68, 78, 90 and 91. What is the lowest score you can earn on the next test and still achieve an average of at least 85?
Chesley 2020-12-17
### Montarello and Martins (2005) found that fifth grade students completed more mathematics problems correctly when simple problems were mixed in with their regular math assignments. To further explore this phenomenon, suppose that a researcher selects a standardized mathematics achievement test that produces a normal distribution of scores with a mean of . The researcher modifies the test by inserting a set of very easy problems among the standardized questions and gives the modified test to a sample of n = 36 students. If the average test score for the sample is M = 120, is this result sufficient to conclude that inserting the easy questions improves student performance? Use a one-tailed test with $\alpha =.05$. The null hypothesis in words is ?
tinfoQ 2020-12-15
### To explain: The reason why "application problem" is a better name than "word problem".
Reeves 2020-12-12
### Determine whether the statement makes sense or does not make sense, and explain your reasoning: I find the hardest part in solving a word problem is writing the equation that models the verbal conditions.
Ava-May Nelson 2020-12-09
### Your math instructor tells you that next weeks
Lennie Carroll 2020-12-06
### How many different letter arrangements can be made from the letters in the word MATHEMATICS?
CMIIh 2020-11-27
### Write an equation that models this situation. A market fills each of 20 containers with 3 pounds of grapes. How many pounds of grapes (P) are needed to fill the 20 containers?
Zoe Oneal 2020-11-26
### To find: The number of students who took at least one online college course in 2012, using the given model, Number of students $=0.0112{x}^{2}+0.4663x+1.513$.
Wierzycaz 2020-11-12
### In DISCRETE MATHEMATICS & APPLICATIONS: Write a note about the application of mathematical induction in daily life. Give two examples (150 words).
Wribreeminsl 2020-11-07
### A farmer has a vegetable garden which is 35 feet wide by 80 feet long. The farmer wishes to divide the garden equally between tomatoes, peppers, and cabbage. However, the farmer would like to keep a one-foot border around the garden free for a fence. How many square feet can be used for the tomato planting?
Wierzycaz 2020-11-01
### Math word problem that states: t-shirts are $10 each and sweatshirts are $15 each. If 100 items were sold and they raised $1120, how many sweatshirts were sold, and how would I set up this problem?
BolkowN 2020-10-28
### a.) Explain in your own words what this problem is asking. b.) Explain the meaning of any notation used in the problem and in your solution. c.) Describe the mathematical concept(s) that appear to be foundational to this problem. d.) Justified solution to or proof of the problem. Find the greatest common divisor of a, b, and c and write it in the form ax+by+cz for integers x, y, and z. a=26, b=52, c=60
A math word problem, in simple terms, is a problem presented in words; it can relate to integrals, statistics, analysis, or basically anything that involves calculation. It also helps to look at various questions and answers to see the logic behind them. If you need help with math word problems, do not forget to explore the answers below. Likewise, if you are looking for a math word problem solver, take your time to study your instructions first and see how existing solutions may be similar to yours by looking through the math word problem answers.
## Physics for Scientists and Engineers: A Strategic Approach with Modern Physics (4th Edition)
(a) $A \cdot B = -18$ (b) $A \cdot B = 10$
We can use the formula for the dot product of two vectors $A$ and $B$: $A\cdot B = A_x~B_x + A_y~B_y$ (a) $A \cdot B = A_x~B_x + A_y~B_y$ $A \cdot B = (3)(2)+(4)(-6)$ $A \cdot B = -18$ (b) $A \cdot B = A_x~B_x + A_y~B_y$ $A \cdot B = (3)(6)+(-2)(4)$ $A \cdot B = 10$ |
Pat Blythe: K-Pop and other stories
Posted in Opinion on March 4, 2015 by segarini
During a recent conversation on writing and music I was asked if I had ever heard of K-Pop. My answer was “no” although I was tempted to say “of course” me being a music junky and all. I’ve been told I don’t get out enough so I’m working on that. In the meantime…
JAIMIE VERNON – THE WAGONS ARE CIRCLING…THE DRAIN
Posted in Opinion on January 31, 2015 by segarini
WARNING: This blog features the oral history of wallpaper drying and pertains to music that’s less than a decade old. Reading it might very well put you in a coma and cause your family to call emergency services.
It’s been a very interesting week in the music business as the wintery cobwebs get dusted away and we start seeing the industry awaken from its post-2014 slumber. There’s something amiss in Muzoid Land and it appears to be the long tail of sins of the past coming back to eat itself.
JAIMIE VERNON: ARE YOU THE NEW JOHNNY BRAVO?
Posted in Opinion on March 2, 2013 by segarini
In recent weeks I’ve been contacted by various music artists through various social medias (funny how the telephone is no longer one of them) to pick my brain for the millionth time on how to sell their wares/songs/nubile posteriors up the music industry food chain in the guise of a record deal or publishing deal or both. |
# Theorem with splitting fields
I am trying to understand the following:
Theorem I. If the polynomial $p(x)$ is irreducible in $F[x]$ and if $a$ is a root of $p(x),$ then $F(a) \cong F'(b)$ where $b$ is a root of $p'(t) \in F'[t].$ Moreover, this isomorphism $\Phi$ can be chosen so that $\Phi(a) = b$ and $\Phi(\alpha) = \beta \in F'$ for every $\alpha \in F.$
Corollary. If $p(x) \in F[x]$ and $a,b$ are both roots of $p(x),$ then $F(a) \cong F(b)$ by an isomorphism defined by $\phi:F(a) \to F(b), \hspace{1mm} a \mapsto b$ and leaves every element of $F$ fixed.
If I'm understanding this correctly, the preceding corollary states that if the roots $a$ and $b$ contained in the adjoined extensions $F(a)$ and $F(b)$ are from the same polynomial $p(x)$ over a field $F$, then we may restrict the mapping $\Phi$ in theorem I so that the adjoined roots $a,b$ are mapped to one another but $F$ is mapped to itself.
Theorem II. Let $\tau: F \to F', \hspace{1mm} \alpha \mapsto \beta$ be a field isomorphism, let $f(x) \in F[x], \hspace{1mm} \deg f(x) \geqslant 1$ and let $K$ be a splitting field of $f(x)$ over $F$ and $K'$ a splitting field of $f'(t)$ over $F'.$ Then $\tau: F \to F', \hspace{1mm} \tau(\alpha) = \beta$ extends to the field isomorphism $\psi:K \to K', \hspace{1mm} \alpha \mapsto \psi(\alpha) = \beta.$ In particular (proven in next theorem), any two splitting fields $K_1,K_2$ of the same polynomial over $F$ satisfy the property that $K_1 \cong K_2$ and that $\psi(\alpha) = \alpha$ for every $\alpha \in F.$
This theorem is similar to theorem $I$ except that the isomorphism $\Phi$ is between splitting fields instead of between fields with adjoined roots. So, $\Phi$ not only works between two extensions that extend the base fields $F,F'$ of two irreducible polynomials $p(x)$ and $p'(t)$ by a single root, but it also works between the smallest two fields that contain all the roots of the two polynomials $f(x)$ and $f'(t)$. The "in particular" part of the statement follows from this theorem because if we select $\tau: F \to F$ to be the identity map such that $\tau(\alpha) = \alpha$ for every $\alpha \in F,$ then $\tau$ can be extended to an isomorphism $\psi:K_1 \to K_2$ between the two splitting fields of the same polynomial over $F.$
Is this reasoning correct?
• Essentially correct. There is no restriction in the corollary. It says "$\phi:F(a)\rightarrow F'(b)$ is an isomorphism where $\phi|_F:F\rightarrow F'$ and $a\mapsto b$." For example, taking $F(X)\rightarrow F(a)$ where $F(X)$ are rational functions in $X$ and the map is simply $X\mapsto a$ or, evaluation of a rational function at $X=a$. However, it is also much deeper than this example and I encourage you to explore with not so weird an example: $X^2+1$ over $\mathbb{Q}$. – Eoin Jun 8 '15 at 3:07
• The polynomials $f(x)$ and $f'(t)$ are not "distinct" in the usual sense. $f'(t)$ is the image of $f(x)$ under an isomorphism given by $\tau: F\rightarrow F'$ and $x\mapsto t$. So they are distinct polynomials but, they are still "isomorphic" in some sense (probably close to a very very loose sense). It is correct however, to take the identity map and extend this to a mapping which does not have to be an identity $\psi: K\rightarrow K'$. It is simply an isomorphism. For example, the mapping of $\mathbb{Q}[i]\rightarrow \mathbb{Q}[i]$ sending $i\mapsto -i$ would be such a map. – Eoin Jun 8 '15 at 3:15
• @Eoin Thanks for the help. – St Vincent Jun 8 '15 at 4:40 |
## Essential University Physics: Volume 1 (4th Edition)
$L_0=\frac{5}{7}L_f$
a) While there is a way to use complex math to arrive at the equation necessary to find this answer, we can also use proportions to obtain: $(\frac{3v}{v})^2(L_f-L_0)=2(2L_f-L_0)$ $(\frac{9}{1})(L_f-L_0)=2(2L_f-L_0)$ $L_0=\frac{5}{7}L_f$ |
# Simplify the expression by combining like terms. Write the terms from the highest to the lowest power of the variable. 3x ^ 2 + 16x ^ 2 + x + 7 * 10x ^ 2 + 22 + 15x - 6x
###### Question:
Simplify the expression by combining like terms. Write the terms from the highest to the lowest power of the variable. 3x ^ 2 + 16x ^ 2 + x + 7 * 10x ^ 2 + 22 + 15x - 6x
### What fueled economic growth but also enflamed old rivalries and contributed to two world wars? -protectionism -rearmament -urbanization -globalization
### Simple interest is computed by finding the product of the principal amount p, the interest rate r, and the time t
### Who is the public protector of South Africa
### Why is territory considered to be a major element of a state?
### Why is someone commenting and reporting me for something I never did in Brainly?
### You should check your house for asbestos if it was built before what year? A. 1957 B. 1967 C. 1977 D. 1987
### 200 kg of brass is melted down and cast into ornamental frogs, each weighing 3/20 kg. How many frogs are made?
### Find the nth term of the sequence that starts with 31, 35, 39, 43
### How are bacteria able to adapt to a rapidly changing environment? A) During transcription, RNA polymerase is unable to bind and commence transcription. B) Bacterial genes, in a rapidly changing environment, mutate rapidly in response to these changes. C) Bacterial genes are organized into operons, clusters of coregulated genes, that are regulated such that they are all turned on or off together. D) Bacterial cells are prokaryotic and genes have ready access to the ribosomes within the cell's
### What Microsoft feature enables you to represent text as colorful visuals
### What can be used to make accurate predictions? body details, text structure, author's purpose, text evidence
### The ambiguity in Daisy Miller: A Study is best represented by _____.
### What are three techniques you can use to help maintain a safe path of travel?
### Brian paid $4.95 at the grocery store for bananas and apples. He bought 3 pounds of bananas at $0.59 per pound and 2 pounds of apples. How much did Brian pay per pound for the apples he bought?
### Look at the graph!! and how do I describe the graph of g(x) as a transformation of f(x)
2 The openair package
In this book two packages are frequently used and it is a good idea to load both.
library(openair)
library(tidyverse)
Because the openair package (and R itself) are continually updated, it will be useful to know this document was produced using R version 4.2.0 and openair version 2.10-0.
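If you want to check which versions you are running yourself, the following two commands will report them (the output will of course depend on your own installation):
R.version.string
packageVersion("openair")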
Function help
Where code is shown in this document, function names are hyper-linked and will take you to the help page for the function.
2.1 Installation and code access
openair is available on CRAN (the Comprehensive R Archive Network), which means it can be installed easily from R. I would recommend re-starting R and then typing install.packages("openair"). If you use RStudio (which is highly recommended), you can just choose the ‘packages’ tab on the bottom-right, and then select ‘Install’. Simply start typing openair and you will find the package.
For openair all development is carried out using GitHub for version control. Users can access all code used in openair at https://github.com/davidcarslaw/openair.
Sometimes it might be useful to install the development version of openair and you can find instructions here.
2.2 Input data requirements
The openair package applies certain constraints on input data requirements. It is important to adhere to these requirements to ensure that data are correctly formatted for use in openair. The principal reason for insisting on specific input data format is that there will be less that can go wrong and it is easier to write code for a more limited set of conditions.
• Data should be in a data frame (or tibble).
• The date/time field should be called date — note the lower case. No other name is acceptable.
• The wind speed and wind direction should be named ws and wd, respectively (note again, lower case). Wind directions follow the UK Met Office format and are represented as degrees from north e.g. 90 degrees is east. North is taken to be 360 degrees
• Where fields should have numeric data e.g. concentrations of NOx, then the user should ensure that no other characters are present in the column, except maybe something that represents missing data e.g. ‘no data’.
• Other variable names can be upper/lower case but should not start with a number. If column names do have white spaces, R will automatically replace them with a full-stop. While PM2.5 as a field name is perfectly acceptable, it is a pain to type it in—better just to use pm25 (openair will recognise pollutant names like this and automatically format them as PM2.5 in plots).
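As a minimal sketch of a data frame that meets these requirements (the values below are invented purely for illustration):
# a small, correctly formatted data set
example_data <- tibble(
  date = as.POSIXct(c("2022-08-01 00:00:00", "2022-08-01 01:00:00"), tz = "UTC"),
  ws = c(2.1, 3.4),    # wind speed
  wd = c(270, 180),    # wind direction in degrees from north
  nox = c(35, 42)      # pollutant concentration
)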
2.3 Reading and formatting dates and times
While not specific to openair, dealing with dates and times is likely to be an issue that needs to be dealt with at some point. There is no getting away from the fact that dates and times can be complicated with issues such as time zones and daylight saving time i.e. when the clocks change for summer. This is a potentially big topic to consider and it is only considered in outline here.
For a lot of openair functions this issue will not be important. While these issues can often be ignored, it is better to be explicit and set the date-time correctly. Two situations where it becomes important are when wanting to show temporal variations in local time and when combining data sets that are in different time zones. The former issue can be important (for example) when considering diurnal variations in a pollutant concentration that follows a human activity (such as rush-hour traffic), which follows local time and not GMT/UTC.
When importing data into R it is important to know how the date-time is represented in your original data, especially in terms of time zone.
When importing data it is important to know how the date-time is represented. In the UK it is easy for us to forget that simply working with data in GMT is not always an option. However, most air quality and meteorological data around the world tends to be in GMT/UTC or a fixed offset from GMT/UTC i.e. not in local time where hours can be missing or duplicated.
Life is made much easier using the lubridate package, which has been developed for working with dates and times. The lubridate package has a family of functions that will convert common formats of dates and times into an R-formatted version. These functions are useful when importing data and the date-time is in a character format and needs formatting. Here are some examples:
Original date in ‘British’ format (day/month/year hours-minutes):
library(lubridate) # load package
date_time <- "2/8/2022 11:00"
# format it
dmy_hm(date_time)
[1] "2022-08-02 11:00:00 UTC"
When R formats a date-time correctly it will be shown from ‘large to small’ i.e. YYYY-MM-DD HH:MM:SS, which provides a clue that it has indeed been formatted correctly.
US date time with seconds (month-day-year):
date_time <- "8/2/2022 11:05:12"
mdy_hms(date_time)
[1] "2022-08-02 11:05:12 UTC"
As you can see, by default, the date-time is formatted in UTC (GMT). It is at this point where you can also set a time zone of the original data if it was not in GMT. Let’s assume the original data were a fixed offset from GMT of -8 hours (west coast USA perhaps). This can be done by setting the time zone explicitly:
date_time <- "8/2/2022 11:05:12" # time 8 hours behind GMT
mdy_hms(date_time, tz = "Etc/GMT+8")
[1] "2022-08-02 11:05:12 -08"
which actually shows the GMT offset of -8 hours.
A common task might be to plot time series and temporal variations of pollutant concentrations in local time. How does one do this if the imported data are in GMT/UTC (or a fixed offset from GMT/UTC)?
In this case it is necessary to know how the local time zone with daylight saving time (DST) is represented. Time zone names follow the Olson scheme — you can list them by typing OlsonNames(). Given the scenario where we have imported data in GMT but want to display the data in local time (BST — British Summer Time), we can use the with_tz function in lubridate to do this:
date_time <- "2/8/2022 11:00"
# format it
date_time <- dmy_hm(date_time) # GMT
date_time
[1] "2022-08-02 11:00:00 UTC"
# what is the hour?
hour(date_time)
[1] 11
# format in local time
time_local <- with_tz(date_time, tz = "Europe/London")
time_local
[1] "2022-08-02 12:00:00 BST"
# local hour is +1 from GMT
hour(time_local)
[1] 12
In the above example, date_time and time_local are the same absolute time — we are just changing how the time is displayed. In practice, given a data frame with a date column in GMT and where there is interest in making sure openair uses the local time, some formatting such as mydata$date <- with_tz(mydata$date, tz = "Europe/London") is what is needed.
Another scenario is you import data using a function such as read_csv from the readr package and it recognises a date-time in the data and by default assumes it is GMT/UTC. This might be wrong even though it is now formatted correctly in R. In this case you can force a new time zone using the force_tz function. For example:
# a correctly formatted date-time that is in GMT but should be something else
date_time
[1] "2022-08-02 11:00:00 UTC"
# force the time zone to be something different
force_tz(date_time, tz = "Etc/GMT+8")
[1] "2022-08-02 11:00:00 -08"
Finally, what about combining data sets in different time zones? In Chapter 4 it is shown how it is possible to access meteorological data from around the world (all in GMT). The interest might be in combining this data with air quality data that is in another time zone. So long as the date-times were correctly formatted in the first place, then simply joining the data sets by date is all that is needed, as R works out how the times match internally. An example of joining two data sets is shown in Section 4.2.
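As a sketch (the data frame names aq and met here are hypothetical), such a join could be as simple as:
# aq: air quality data with a correctly formatted POSIXct 'date' column
# met: meteorological data in GMT, also with a POSIXct 'date' column
# because both columns store absolute times, the join matches correctly
combined <- left_join(aq, met, by = "date")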
2.4 Brief overview of openair
This section gives a brief overview of the functions in openair. Having read some data into a data frame it is then straightforward to run any function. Almost all functions are run as:
functionName(thedata, options, ...)
The usage is best illustrated through a specific example, in this case the polarPlot function. The details of the function are shown in Chapter 8 and through the help pages (type ?polarPlot). As can be seen, there are numerous options associated with polarPlot (and most other functions) and each of these has a default. For example, the default pollutant considered in polarPlot is nox. If the user has a data frame called theData then polarPlot could minimally be called by:
polarPlot(theData)
which would plot a nox polar plot if nox was available in the data frame theData.
Note that the options do not need to be specified in order nor is it always necessary to write the whole word. For example, it is possible to write:
polarPlot(theData, type = "year", poll = "so2")
In this case writing poll is sufficient to uniquely identify that the option is pollutant.
Also there are many common options available in functions that are not explicitly documented, but are part of lattice graphics. Some common ones are summarised in Table 2.1. The layout option allows the user to control the layout of multi-panel plots e.g. layout = c(4, 1) would ensure a four-panel plot is 4 columns by 1 row.
Table 2.1: Common options used in openair plots that can be set by the user but are generally not explicitly documented.
option description
xlab x-axis label
ylab y-axis label
main title of the plot
pch plotting symbol used for points
cex size of symbol plotted
lty line type
lwd line width
layout the plot layout e.g. c(2, 2)
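As an illustrative call (not taken from the text above), several of these options can be combined in a single openair function call:
# eight years of data shown as 4 columns by 2 rows with a custom title
windRose(mydata, type = "year", layout = c(4, 2), main = "Wind rose by year")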
2.5 The type option
One of the central themes in openair is the idea of conditioning. Rather than plot $x$ against $y$, considerably more information can usually be gained by considering a third variable, $z$. In this case, $x$ is plotted against $y$ for many different intervals of $z$. This idea can be further extended. For example, a trend of NOx against time can be conditioned in many ways: NOx vs. time split by wind sector, day of the week, wind speed, temperature, hour of the day … and so on. This type of analysis is rarely carried out when analysing air pollution data, in part because it is time consuming to do. However, thanks to the capabilities of R and packages such as lattice and ggplot2, it becomes easier to work in this way.
In most openair functions conditioning is controlled using the type option. type can be any other variable available in a data frame (numeric, character or factor). A simple example of type would be a variable representing a ‘before’ and ‘after’ situation, say a variable called period i.e. the option type = "period" is supplied. In this case a plot or analysis would be separately shown for ‘before’ and ‘after’. When type is a numeric variable then the data will be split into four quantiles and labelled accordingly. Note however the user can set the quantile intervals to other values using the option n.levels. For example, the user could choose to plot a variable by different levels of temperature. If n.levels = 3 then the data could be split by ‘low’, ‘medium’ and ‘high’ temperatures, and so on. Some variables are treated in a special way. For example if type = "wd" then the data are split into 8 wind sectors (N, NE, E, …) and plots are organised by points of the compass.
There are a series of pre-defined values that type can take related to the temporal components of the data as summarised in Table 2.2. To use these there must be a date field so that it can be calculated. These pre-defined values of type are shown below are both useful and convenient. Given a data frame containing several years of data it is easy to analyse the data e.g. plot it, by year by supplying the option type = "year". Other useful and straightforward values are “hour” and “month”. When type = "season" openair will split the data by the four seasons (winter = Dec/Jan/Feb etc.). Note for southern hemisphere users that the option hemisphere = "southern" can be given. When type = "daylight" is used the data are split between nighttime and daylight hours. In this case the user can also supply the options latitude and longitude for their location (the default is London).
Table 2.2: Built-in ways of splitting data in openair using the type option that is available for most functions.
option description
‘year’ splits data by year
‘month’ splits data by month of the year
‘week’ splits data by week of the year
‘monthyear’ splits data by year and month
‘season’ splits data by season. Note in this case the user can also supply a hemisphere option that can be either ‘northern’ (default) or ‘southern’
‘weekday’ splits data by day of the week
‘weekend’ splits data by Saturday, Sunday, weekday
‘daylight’ splits data by nighttime/daytime. Note the user must supply a longitude and latitude
‘dst’ splits data by daylight saving time and non-daylight saving time
‘wd’ if wind direction (wd) is available type = 'wd' will split the data into 8 sectors: N, NE, E, SE, S, SW, W, NW
‘seasonyear’ will split the data into year-season intervals, keeping the months of a season together. For example, December 2010 is considered as part of winter 2011 (with January and February 2011). This makes it easier to consider contiguous seasons. In contrast, type = 'season' will just split the data into four seasons regardless of the year.
If a categorical variable is present in a data frame e.g. site then that variables can be used directly e.g. type = "site".
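A couple of illustrative sketches using the built-in mydata set:
# condition a polar plot on season
polarPlot(mydata, pollutant = "nox", type = "season")
# condition on quantiles of another numeric variable (here ozone); as described
# above, n.levels controls how many quantile bands are formed
polarPlot(mydata, pollutant = "nox", type = "o3", n.levels = 3)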
In some cases it is useful to categorise numeric variables according to one’s own intervals. One example is air quality bands where concentrations might be described as “good”, “fair”, “bad”. For this situation we can use the cut function. In the example below, concentrations of NO2 are divided into intervals 0-50, 50-100, 100-150 and >150 using the breaks option. Also shown are user-defined labels. Note there is 1 more break than label. There are a couple of things to note here. First, include.lowest = TRUE ensures that the lowest value is included in the lowest break (in this case 0). Second, the maximum value (1000) is set to be more than the maximum value in the data to ensure the final break encompasses all the data.
mydata$intervals <- cut(mydata$no2,
breaks = c(0, 50, 100, 150, 1000),
labels = c("Very low", "Low", "High",
"Very High"),
include.lowest = TRUE)
# look at the data
head(mydata)
# A tibble: 6 × 11
date ws wd nox no2 o3 pm10 so2 co pm25
<dttm> <dbl> <int> <int> <int> <int> <int> <dbl> <dbl> <int>
1 1998-01-01 00:00:00 0.6 280 285 39 1 29 4.72 3.37 NA
2 1998-01-01 01:00:00 2.16 230 NA NA NA 37 NA NA NA
3 1998-01-01 02:00:00 2.76 190 NA NA 3 34 6.83 9.60 NA
4 1998-01-01 03:00:00 2.16 170 493 52 3 35 7.66 10.2 NA
5 1998-01-01 04:00:00 2.4 180 468 78 2 34 8.07 8.91 NA
6 1998-01-01 05:00:00 3 190 264 42 0 16 5.50 3.05 NA
# … with 1 more variable: intervals <fct>
Then it is possible to use the new intervals variable in most openair functions e.g. windRose(mydata, type = "intervals").
A special case is splitting data by date. In this scenario there might be interest in a ‘before-after’ situation e.g. due to an intervention. The openair function splitByDate should make this easy. Here is an example:
splitByDate(
mydata,
dates = "1/1/2003",
labels = c("before", "after"),
name = "scenario"
)
# A tibble: 65,533 × 12
date ws wd nox no2 o3 pm10 so2 co pm25
<dttm> <dbl> <int> <int> <int> <int> <int> <dbl> <dbl> <int>
1 1998-01-01 00:00:00 0.6 280 285 39 1 29 4.72 3.37 NA
2 1998-01-01 01:00:00 2.16 230 NA NA NA 37 NA NA NA
3 1998-01-01 02:00:00 2.76 190 NA NA 3 34 6.83 9.60 NA
4 1998-01-01 03:00:00 2.16 170 493 52 3 35 7.66 10.2 NA
5 1998-01-01 04:00:00 2.4 180 468 78 2 34 8.07 8.91 NA
6 1998-01-01 05:00:00 3 190 264 42 0 16 5.50 3.05 NA
7 1998-01-01 06:00:00 3 140 171 38 0 11 4.23 2.26 NA
8 1998-01-01 07:00:00 3 170 195 51 0 12 3.88 2.00 NA
9 1998-01-01 08:00:00 3.36 170 137 42 1 12 3.35 1.46 NA
10 1998-01-01 09:00:00 3.96 170 113 39 2 12 2.92 1.20 NA
# … with 65,523 more rows, and 2 more variables: intervals <fct>,
# scenario <ord>
This code adds a new column scenario that is labelled before and after depending on the date. Note that the dates input by the user are in British format (dd/mm/YYYY) and that several dates (and labels) can be provided.
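To use the new column, the result needs to be assigned to an object; a short sketch:
# assign the result and then condition a plot on the new 'scenario' column
mydata_split <- splitByDate(mydata,
  dates = "1/1/2003",
  labels = c("before", "after"),
  name = "scenario"
)
polarPlot(mydata_split, pollutant = "nox", type = "scenario")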
2.6 Controlling font size
All openair plot functions have an option fontsize. Users can easily vary the size of the font for each plot e.g.
polarPlot(mydata, fontsize = 20)
The font size will be reset to the default sizes once the plot is complete. Finer control of individual font sizes is currently not easily possible.
2.7 Using colours
Many of the functions described require that colour scales are used; particularly for plots showing surfaces. It is only necessary to consider using other colours if the user does not wish to use the default scheme, shown at the top of Figure 2.1. The choice of colours does seem to be a vexing issue as well as something that depends on what one is trying to show in the first place. For this reason, the colour schemes used in openair are very flexible: if you don’t like them, you can change them easily. R itself can handle colours in many sophisticated ways; see for example the RColorBrewer package.
Several pre-defined colour schemes are available to make it easy to plot data. In fact, for most situations the default colour schemes should be adequate. The choice of colours can easily be set; either by using one of the pre-defined schemes or through a user-defined scheme. More details can be found in the openair openColours function. Some defined colours are shown in Figure 2.1, together with an example of a user defined scale that provides a smooth transition from yellow to blue.
library(openair)
## small function for plotting
printCols <- function(col, y) {
rect((0:200) / 200, y, (1:201) / 200, y + 0.1, col = openColours(col, n = 201),
border = NA)
text(0.5, y + 0.15, deparse(substitute(col)))
}
## plot an empty plot
plot(1, xlim = c(0, 1), ylim = c(0, 1.6), type = "n", xlab = "", ylab = "",
axes = FALSE)
printCols("default", 0)
printCols("increment", 0.2)
printCols("heat", 0.4)
printCols("jet", 0.6)
printCols("viridis", 0.8)
printCols("inferno", 1.0)
printCols("greyscale", 1.2)
printCols(c("tomato", "white", "forestgreen" ), 1.4)
The user-defined scheme is very flexible. In the examples shown next, the polarPlot function is used to demonstrate its use.
# use default colours - no need to specify
polarPlot(mydata)
# use pre-defined "jet" colours
polarPlot(mydata, cols = "jet")
# define own colours going from yellow to green
polarPlot(mydata, cols = c("yellow", "green"))
# define own colours going from red to white to blue
polarPlot(mydata, cols = c("red", "white", "blue"))
For more detailed information on using appropriate colours, have a look at the colorspace package. colorspace provides the definitive, comprehensive approach to using colours effectively. You will need to install the package, install.packages("colorspace"). To use the palettes with openair, you can for example do:
library(colorspace)
library(openair)
windRose(mydata, cols = qualitative_hcl(4, palette = "Dark 3"))
2.8 Automatic text formatting
openair tries to automate the process of annotating plots. It can be time-consuming (and tricky) to repetitively type in text to represent μg m-3 or PM10 (μg m-3) etc. in R. For this reason, an attempt is made to automatically detect strings such as nox or NOx and format them correctly. Where a user needs a y-axis label such as NOx (μg m-3) it will only be necessary to type ylab = "nox (ug/m3)". The same is also true for plot titles.
Users can override this option by setting it to FALSE.
2.9 Multiple plots on a page
We often get asked how to combine multiple plots on one page. Recent changes to openair make this a bit easier. Note that because openair uses lattice graphics, the base graphics par settings will not work.
It is possible to arrange plots based on a column $$\times$$ row layout. Let’s put two plots side by side (2 columns, 1 row). First it is necessary to assign the plots to a variable:
a <- windRose(mydata)
b <- polarPlot(mydata)
Now we can plot them using the split option:
print(a, split = c(1, 1, 2, 1))
print(b, split = c(2, 1, 2, 1), newpage = FALSE)
In the code above for the split option, the last two numbers give the overall layout (2, 1) — 2 columns, 1 row. The first two numbers give the column/row index for that particular plot. The last two numbers remain constant across the series of plots being plotted.
There is one difficulty with plots that already contain sub-plots such as timeVariation where it is necessary to identify the particular plot of interest (see the timeVariation help for details). However, say we want a polar plot (b above) and a diurnal plot:
c <- timeVariation(mydata)
print(b, split = c(1, 1, 2, 1))
print(c, split = c(2, 1, 2, 1), subset = "hour", newpage = FALSE)
For more control it is possible to use the position argument. position is a vector of 4 numbers, c(xmin, ymin, xmax, ymax) that give the lower-left and upper-right corners of a rectangle in which the plot is to be positioned. The coordinate system for this rectangle is [0–1] in both the x and y directions.
As an example, consider plotting the first plot in the lower left quadrant and the second plot in the upper right quadrant:
print(a, position = c(0, 0, 0.5, 0.5), more = TRUE)
print(b, position = c(0.5, 0.5, 1, 1))
The position argument gives more fine control over the plot location.
1. As R help says: “Contrary to some expectations (but consistent with names such as ‘PST8PDT’), negative offsets are times ahead of (east of) UTC, positive offsets are times behind (west of) UTC.”↩︎ |
## Get ready for AP® Calculus
### Unit 1: Lesson 14
Sal converts the radian measures π and -π/3 to degrees. Created by Sal Khan and Monterey Institute for Technology and Education.
## Want to join the conversation?
• What if the radians come in decimals. For example I came across to having to convert 0.9 radians into degrees. How would you be able to convert that when it's not in terms of pi ?
• For any amount of radians, whole or decimal, positive or negative, you just multiply by [180/pi] to get degrees. So 0.9 * 180 / pi ~= 51.57 deg.
• I like this website but these don't explain why I can't use the value for one radian and multiply that by the degrees I've been given to convert? Why does 240 degrees get converted into 4π over 3 radians? shouldn't it be 13750.99 lol? Or am I simply too stupid to understand this concept?
• We usually use the fractions with pi when talking about radians because it is actually easier to work with the fractions. (I know, a lot of people don't like fractions, but they are our friends!) It is easier to work with 4/3 pi than the decimal equivalent which is 4.188790. (I think you used the wrong conversion process to get your 13750.99. That is an upside down conversion that you would get if you multiplied 240 times 180 and then divided by pi. Rather than giving you radians, it gave you degrees squared :) Sal shows us how to line up the units so they cancel in this video and his video on converting degrees to radians, which is just the inverse of radians degrees--also in the units videos on Khan Academy.)
The size of the angle is exact when you use the fraction, but when you convert to decimals, most of your results are NOT exact--they are approximations. They get very messy when you do the next step, and the next step with your results. In more advanced math, your first results are just stepping stones for all the other steps you need to do, so messy is not good.
Another reason is that some angles show up in problems over and over, so they become old friends even when they are something like 4/3 pi. The diagonal of a square forms a 45 degree angle which is pi/4. Half of an equilateral triangle forms a 30-60-90 degree triangle. 30 degrees is pi/6 and Sal just showed us that 60 degrees is pi/3. A right angle is 90 degrees and that is pi/2.
You will get a chance to work with plenty of decimals, though, because if a number of degrees does not form a nice fraction with 180, then we convert all the way to a decimal and just have to deal with the decimal places.
• At , Sal says that 2pi radians is 360 degrees. When you do basic geometry, 2pi radius (radii) is 360 degrees. As both statement are equal, are radius's and radians the same or different? Please give the definition of both terms.
• They're different.
Radius: the distance between the center of a circle and any point on its circumference
Radian: One radian is 180/pi degrees. An arc of a circle (that has a radius of 1 unit) with a central angle of a radians has a length of a units.
• I don't understand how to convert decimal degree measures into degrees-minutes-seconds, can anyone help?
• To convert 15.6358 degrees to deg min sec form do:
Clearly we have 15 degrees, so the remaining 0.6358 is minutes and seconds.
Since one degree has 60 min, we can write that x = 0.6358 * 60 So x is 38.148 min.
A simpler example would be 0.5 degrees is equivalent to 0.5 * 60 = 30 minutes, so half of one degree, which makes sense.
Now we convert the 0.148 remaining min to sec in a similar manner.
y = 0.148 * 60 so y is 8.88 sec. Since there isn't really something smaller then a sec, we leave it at that.
The final result is 15 deg 38 min 8.88 sec
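The same arithmetic generalises directly. Below is a small sketch in Java (my own illustration, not from the thread; the class and method names are invented, and it handles non-negative angles only):

```java
public class DmsConverter {

    /** Convert decimal degrees to a degrees-minutes-seconds string, e.g. 15.6358 -> "15 deg 38 min 8.88 sec". */
    public static String toDms(double decimalDegrees) {
        int degrees = (int) decimalDegrees;                       // whole degrees
        double totalMinutes = (decimalDegrees - degrees) * 60.0;  // 1 degree = 60 minutes
        int minutes = (int) totalMinutes;
        double seconds = (totalMinutes - minutes) * 60.0;         // 1 minute = 60 seconds
        return String.format("%d deg %d min %.2f sec", degrees, minutes, seconds);
    }

    public static void main(String[] args) {
        System.out.println(toDms(15.6358)); // 15 deg 38 min 8.88 sec
    }
}
```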
• what is meant by radians
• radians are just another form of measurements that can be used to scale things with larger form. anything measured in degrees can also be measured in radians. if we are working on a question with the degrees of a circle we could go about it as 360degrees or we could work the problem as 180radians. now if we were working with triangle using degrees would prob be a bit more useful.hope this helped
• Why doesn't radians have a symbol. And if it does, what is it?
• Radians can be represented by a superscript "c" symbol after the angle measure in radians .
The "c" here stands for circular measure.
However this symbol is rarely used as it can be easily confused with the degree symbol(°).
• could I also get help for this problem? 17pi/18 rads to degrees. the way the video describes it doesn't explain for this.
• To convert from radians to degrees, multiply the angle measure by 180º / (π radians):
17(180º) / 18
170º
• How can something be a 'negative' radian (-pi/3)? How can a radian be negative?
• The convention is to take counter clockwise as positive. So if you go clockwise π/3 radians from the 0 radian position, then your angle measure is -π/3 radians.
• I don't really understand this, since the example given was with pi alone, but what about a number like 23pi/20? |
# Different fonts in MaTeX
I have read this link but can't quite figure out how to use a different font in MaTeX. Would someone be kind enough to provide a complete example for the following context
Graphics[Inset[MaTeX["\\sin(x)", Magnification -> 4], {3, 1}, {0, 0}]]
Also, the following questions occur to me:
1. Can I use any font installed on my system (MacOS)?
2. Even a Hershey font?
3. Will the necessary metrics be recalculated for the given font?
Could you also include the code that illustrates how to use xelatex?
• This is all covered in the MaTeX documentation, did you check it? – Szabolcs Jun 1 '17 at 21:02
• @Szabolcs Thank you for your answer below! How does one know that there is MaTeX documentation? I did "??MaTeX", but that yielded nothing. Same thing happened when I searched for MaTeX in the Documentation Center. I also tried "reference.wolfram.com/MaTeX/guide/MaTeX", as well as typing "MaTeX/guide/MaTeX" in the search/address bar of the documentation window in Mathematica. Perhaps downloading the documentation from Github is best? – Wynne Jun 2 '17 at 13:53
• "How does one know that there is MaTeX documentation?" It's clearly written on the page you linked ... scroll to "Usage". If you type ?MaTeX, there will be a small blue >> sign after the usage message. Clicking it opens the help. Don't search on wolfram.com online. That's Wolfram's documentation. Click Help -> Documentation Center, and search there. You do have the latest version of the package, 1.7.0, right? – Szabolcs Jun 2 '17 at 15:02
This is all covered in the built-in documentation. Search for MaTeX in Mathematica's documentation centre, or type in the following address: MaTeX/guide/MaTeX. But I will give some examples here anyway:
How to set a given font is really a LaTeX question. You will find many fonts samples with instructions on how to use them at The LaTeX Font Catalogue. Usually, you need to add a package to the preamble. For example, to use Utopia Regular with Fourier math,
MaTeX["\\text{If $x=2$ then $x^2=4$.}",
"Preamble" -> {"\\usepackage{fourier}"}, Magnification -> 4]
This is covered under the examples for the "Preamble" option in the MaTeX doc page, as well as in the Typesetting with MaTeX tutorial.
To use XeTeX, you need to configure MaTeX to use the appropriate executable. This is covered in the ConfigureMaTeX documentation page. I will copy the example from that page here:
ConfigureMaTeX["pdfLaTeX" -> "/Library/TeX/texbin/xelatex"]
MaTeX["\\text{Beautiful typesetting}",
"Preamble" -> {"\\usepackage{fontspec}", "\\setmainfont{Zapfino}"},
FontSize -> 24
]
Of course, the path to the xelatex executable will be specific to your system.
Remember that ConfigureMaTeX will change the configuration permanently. Restarting Mathematica will not reset it. If you need to change back to plain LaTeX, you must do it manually.
With XeLaTeX you can use any installed system font. This is generally useful for text mode only. To use a system font in math mode, you need a font with special math support, such as Cambria Math. Here's an example:
MaTeX["\\text{Foo $x^2$ bar}",
"Preamble" -> {"\\usepackage{fontspec,unicode-math}",
"\\setmainfont{Cambria}", "\\setmathfont{Cambria Math}"},
Magnification -> 4
]
I am not experienced with Unicode math fonts. If you have more questions about them, I suggest asking on TeX.SE.
• Thank you, very helpful! Is there a list of fonts with special math support? – Wynne Jun 2 '17 at 13:55
• @Lem.ma I don't know, you'd have to google ... – Szabolcs Jun 2 '17 at 15:03 |
# Run the FullNode with IDEA¶
This document aims to facilitate developers with some experience to run the FullNode with IDEA.
The following is for the master branch of java-tron.
## Configure IDEA¶
The configuration of IDEA
• Oracle JDK 1.8 (OpenJDK is not currently supported)
• Install Lombok plugin
• Tick Enable annotation processing
## Deployment guide¶
1.Create a directory /deploy
mkdir deploy
2.Clone the latest code
git clone https://github.com/tronprotocol/java-tron
3.Switch to the master branch
cd java-tron
git checkout -t origin/master
4.Compile the code
./gradlew build
The compilation process may take some time, please be patient. If the compilation is successful, you can see the information similar to the following:
If you do not want to perform unit test tasks, you can run the following command:
./gradlew build -x test
5.Start the FullNode
After compiling successfully, you can find the main function file at the path java-tron/src/main/java/org/tron/program/FullNode.java and then start a full node.
After starting, you can check the log to verify whether the startup is successful. The log path is: /deploy/java-tron/logs/tron.log. If the startup is successful, you can see the following logs:
Also, you can use a command like tail -f /logs/tron.log to view the real-time log, as follows:
# Is Matter Around Us Pure
## Science
### NCERT
1 What is meant by a pure substance?
##### Solution :
A pure substance is the one that consists of a single type of particles, i.e., all constituent particles of the substance have the same chemical nature. Pure substances can be classified as elements or compounds.
2 List the points of differences between homogeneous and heterogeneous mixtures.
##### Solution :
A homogeneous mixture is a mixture having a uniform composition throughout the mixture. For example: salt in water, sugar in water, copper sulphate in water. A heterogeneous mixture is a mixture having a non-uniform composition throughout the mixture. For example: sodium chloride and iron filings, salt and sulphur, oil and water.
3 Differentiate between homogeneous and heterogeneous mixtures with examples.
##### Solution :
A homogeneous mixture is a mixture having a uniform composition throughout the mixture. For example, mixtures of salt in water, sugar in water, copper sulphate in water, iodine in alcohol, alloy, and air have uniform compositions throughout the mixtures. On the other hand, a heterogeneous mixture is a mixture having a non-uniform composition throughout the mixture. For example, the compositions of mixtures of sodium chloride and iron filings, salt and sulphur, oil and water, chalk powder in water, wheat flour in water, and milk and water are not uniform throughout the mixtures.
4 How are sol, solution and suspension different from each other?
##### Solution :
A sol is a heterogeneous mixture. In this mixture, the solute particles are so small that they cannot be seen with the naked eye. Also, they seem to be spread uniformly throughout the mixture. The Tyndall effect is observed in this mixture. For example: milk of magnesia, mud.
A solution is a homogeneous mixture. In this mixture, the solute particles dissolve and spread uniformly throughout the mixture. The Tyndall effect is not observed in this mixture. For example: salt in water, sugar in water, iodine in alcohol, alloy.
Suspensions are heterogeneous mixtures. In this mixture, the solute particles are visible to the naked eye, and remain suspended throughout the bulk of the medium. The Tyndall effect is observed in this mixture. For example: chalk powder and water, wheat flour and water.
5 To make a saturated solution, $36 g$ of sodium chloride is dissolved in $100 g$ of water at $293 K.$ Find its concentration at this temperature.
##### Solution :
Mass of solute (sodium chloride) $= 36\ g$ (Given)$\\$ Mass of solvent (water) $= 100\ g$ (Given)$\\$ Then, mass of solution = mass of solute + mass of solvent $= (36 + 100)\ g = 136\ g$$\\$ Therefore, concentration (mass by mass percentage) of the solution$\\$ $= \dfrac{\text{Mass of solute}}{\text{Mass of solution}}\times 100\% = \dfrac{36}{136}\times 100\% \approx 26.47\%$
# Recollection
Mathematics was always my worst subject at school, right up until I went to college. I’ve heard a similar story from other physicists. I don’t know how useful my speculation about it will be to anybody else, but I think this is the reason why.
The kind of mistake I was prone to making, and the flaws in the way mathematics was taught, meshed perfectly. Carelessness cost more in math than in anything else, on the whole. If I was writing a history essay and I happened to swap Elba and St. Helena, I might only get docked a couple points out of a hundred, or perhaps none at all if the teacher had too many papers to grade. But if I wandered away from my pre-algebra homework, and upon my return my garishly awful handwriting had turned absolute-value bars into ordinary parentheses, my calculations would be completely off from that point onward. Nor did any of my teachers pick up on my problem — “Blake, you’ve got to be more careful!” — which makes me suspect that they weren’t much better at identifying what went wrong for other students, either.
In history and to a large extent in science, I was able to get by all through middle and high school with what I had learned out of books and documentaries on my own. (I was extraordinarily lucky to have a family that already had plenty of books around, and the means and the sense to provide me with more as I packed their contents into my brains.) I don’t think I had to learn anything in school that came across as wholly new. Everything was at most an elaboration of a topic I had already seen, something I’d grasped from a Larry Gonick cartoon guide, let’s say, done up with a few more details that might just have been included for the sake of having homework problems to assign. Algebra and geometry and trig and calculus, though, came closer to asking for a genuine production on my part.
Techniques of checking one’s work, which might have helped me to become a bit more generally competent, were either not taught or not motivated. “Plug your value of $x$ back in and check” might have been the last step of a few algebra exercises, but only because it was part of the rubric, devised to add another thing that could be graded.
The weird thing is that I had a sense of the importance of the mathematics, of the motivation for it. I knew why Kepler had cared about sines and cosines — to hear the music of the spheres, to turn comets from signs of dread into those of wonder. Exponentials tracked the explosion of populations and the decay of radioisotopes, each ominous in its own way. The subject offered wonders of pure thought and marvels of application. At the time, schoolwork merely seemed disconnected from those treasures which I saw in secondhand outline. Now, in retrospect, it appears almost a parody of them. |
# Ex.14.1 Q3 Factorization - NCERT Maths Class 8
Go back to 'Ex.14.1'
## Question
Factorize:
(i) \begin{align} {x^2} + xy + 8x + 8y\end{align}
(ii) \begin{align} 15xy - 6x + 5y - 2\end{align}
(iii) \begin{align}ax + bx - ay - by\end{align}
(iv) \begin{align} 15pq + 15 + 9q + 25p\end{align}
(v) \begin{align}z - 7 + 7xy - xyz\end{align}
## Text Solution
What is known:
Algebraic expression.
What is unknown:
Factorisation of given algebraic expression.
Reasoning:
There are $$4$$ terms in each expression. First we will make pair of two terms from which we can take out common factors and convert the expression of $$4$$ terms into $$2$$ terms expression then take out common factors from remaining $$2$$ terms.
Steps:
\begin{align}({\rm{i}}) \quad & {x^2} + xy + 8x + 8y \\ &= \begin{Bmatrix} x \times x + x \times y \\ + 8 \times x + 8 \times y \end{Bmatrix} \\&= x(x + y) + 8(x + y)\\&= (x + y)(x + 8)\end{align}
\begin{align}{\rm{(ii)}} \quad & 15xy - 6x + 5y - 2 \\ &= \begin{Bmatrix} 3 \times 5 \times x \times y\\ - 3 \times 2 \times x \\ + 5 \times y - 2 \end{Bmatrix} \\&= 3x(5y - 2) + 1(5y - 2)\\ &= (5y - 2)(3x + 1)\end{align}
\begin{align}{\rm{ (iii)}} \quad & ax + bx - ay - by \\ &= \begin{Bmatrix} a \times x + b \\ \times x - a \times y \\ - b \times y \end{Bmatrix} \\ &= x(a + b) - y(a + b)\\ &= (a + b)(x - y)\end{align}
\begin{align}{\rm{ (iv)}} \quad &15pq + 15 + 9q + 25p \\ &= 15pq + 9q + 25p + 15\\ &= \begin{Bmatrix} 3 \times 5 \times p \times q \\ + 3 \times 3 \times q + 5 \times 5 \\ \times p + 3 \times 5 \end{Bmatrix} \\&= 3q(5p + 3) + 5(5p + 3)\\&= (5p + 3)(3q + 5)\end{align}
\begin{align}({\rm{v}}) \quad & z - 7 + 7xy - xyz \\ &= \begin{Bmatrix} z - x \times y \\ \times z - 7 + 7 \\ \times x \times y \end{Bmatrix} \\&= z(1 - xy) - 7(1 - xy)\\&= (1 - xy)(z - 7)\end{align}
# Consider the statement: "For an integer $n$, if $n^{3}-1$ is even, then $n$ is odd." The contrapositive of this statement is: Option 1: For an integer $n$, if $n$ is even, then $n^{3}-1$ is odd. Option 2: For an integer $n$, if $n^{3}-1$ is not even, then $n$ is not odd. Option 3: For an integer $n$, if $n$ is even, then $n^{3}-1$ is even. Option 4: For an integer $n$, if $n$ is odd, then $n^{3}-1$ is even.
$$\begin{aligned} &\text{The contrapositive of } (p \rightarrow q) \text{ is } \sim q \rightarrow\, \sim p.\\ &\text{So here: for an integer } n, \text{ if } n \text{ is even, then } n^{3}-1 \text{ is odd (Option 1).} \end{aligned}$$
# Stuck on the derivation of pV^gamma=c
I have been tearing my hair out for a while over a step in the proof of the relation $pV^{\gamma}=constant$. The textbook has assumed that we are dealing with an ideal gas undergoing an adiabatic process. Therefore $dQ=0$ and we get
$$C_V\,dT + (C_p-C_V)\left(\frac{\partial T}{\partial V}\right)_p dV=0$$
which gives
$$dT=-(\gamma-1)\left(\frac{\partial T}{\partial V}\right)_pdV$$
Where $$\gamma=\frac{C_p}{C_V}$$
Now comes the part I don't get. They say that because we are dealing with an ideal gas, we have $$T=pV/nR$$ which gives $$\left(\frac{\partial T}{\partial V}\right)_p = \frac{T}{V}$$
Why isn't $\left(\frac{\partial T}{\partial V}\right)_p=p/nR$? Is there something obvious I'm missing? Would love to get this cleared up so I can get some sleep tonight.
Bystander
Homework Helper
Gold Member
T=pV/nR
What's dT? Remember, T is a function of both V and p.
Yes, but p is constant as indicated by the subscript in $\frac{\partial T }{\partial V}_p$.
Bystander
Homework Helper
Gold Member
Oops, dragged a "red herring" in front of you. Maybe it's too obvious.
Why isn't $\left(\frac{\partial T}{\partial V}\right)_p=p/nR$? Is there something obvious I'm missing?
What's p/nR? Ideal gas. Rearrange things any way you wish, and p/nR is also equal to ____ ?
T/V. You have my gratitude. I will now shed a tear for all the sleep this trivial thing has cost me.
## Finding the lowest energy structure
$FC=V-(L+\frac{S}{2})$
Samantha Man 1L
Posts: 49
Joined: Thu Sep 27, 2018 11:22 pm
### Finding the lowest energy structure
When you are trying to identify the structure of lower energy between two lewis structures, do you always have to test each atom's formal charge to know which compound to pick?
jillianh1B
Posts: 64
Joined: Thu Sep 27, 2018 11:19 pm
### Re: Finding the lowest energy structure
I think that this is the most efficient way and only way to be completely sure of stability.
Nicoline Breytenbach 3D
Posts: 46
Joined: Thu Sep 27, 2018 11:26 pm
Been upvoted: 1 time
### Re: Finding the lowest energy structure
Calculating and comparing formal charges is the best way to determine the most stable structure!
Max Kwon 1J
Posts: 40
Joined: Thu Sep 27, 2018 11:18 pm
Been upvoted: 1 time
### Re: Finding the lowest energy structure
You have to test it because some lewis structures are more stable than others and the formal charge shows this stability in number terms. There are also resonance structures, but they are all the same in terms of formal charge and stability.
505095793
Posts: 46
Joined: Thu Sep 27, 2018 11:25 pm
### Re: Finding the lowest energy structure
The formal charge gives an indication of the extent to which atoms have gained or lost electrons in the process of covalent bond formation. Atom arrangements and Lewis structures with the lowest formal charges are likely to have the lowest energy.
Jordan Y4D
Posts: 25
Joined: Thu Sep 27, 2018 11:27 pm
### Re: Finding the lowest energy structure
Elements are the most stable when their FC is 0. You want the atoms with the highest electronegativity to have the lowest formal charge possible. Once you reach that, you want the overall FC to be as close to zero as well. This give you the lowest energy structure of a substance.
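To make the bookkeeping concrete, here is a worked example (my own illustration, not from the thread) using $FC=V-(L+\frac{S}{2})$, where $V$ is the number of valence electrons, $L$ the lone-pair electrons and $S$ the shared electrons, for two candidate Lewis structures of CO$_2$:
$$\text{O=C=O}:\quad FC_C = 4-(0+\tfrac{8}{2})=0,\qquad FC_O = 6-(4+\tfrac{4}{2})=0$$
$$\text{O}\equiv\text{C}-\text{O}:\quad FC_{O(\equiv)} = 6-(2+\tfrac{6}{2})=+1,\qquad FC_C = 4-(0+\tfrac{8}{2})=0,\qquad FC_{O(-)} = 6-(6+\tfrac{2}{2})=-1$$
The first structure, in which every formal charge is zero, is the lower-energy (more stable) one.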
# Question
Formatted question description: https://leetcode.ca/all/202.html
202 Happy Number
Write an algorithm to determine if a number is "happy".
A happy number is a number defined by the following process:
Starting with any positive integer,
replace the number by the sum of the squares of its digits,
and repeat the process until the number equals 1 (where it will stay),
or it loops endlessly in a cycle which does not include 1.
Those numbers for which this process ends in 1 are happy numbers.
Example: 19 is a happy number
1^2 + 9^2 = 82
8^2 + 2^2 = 68
6^2 + 8^2 = 100
1^2 + 0^2 + 0^2 = 1
# Algorithm
The example 19 given in the title is a happy number, so let’s look at a situation that is not a happy number. For example, the number 11 has the following calculation process:
1^2 + 1^2 = 2 → 2^2 = 4 → 4^2 = 16 → 1^2 + 6^2 = 37 → 3^2 + 7^2 = 58 → 5^2 + 8^2 = 89 → 8^2 + 9^2 = 145 → 1^2 + 4^2 + 5^2 = 42 → 4^2 + 2^2 = 20 → 2^2 + 0^2 = 4
At the end of the calculation, the number 4 appears again, then the following numbers will repeat the previous order. This cycle does not contain 1, then the number 11 is not a happy number. After discovering the pattern, we can consider how to use code it.
Use HashSet to record all the numbers that have appeared, and then every time a new number appears, look for it in HashSet to see if it exists,
• if it does not exist, add it to the table,
• if it exists, jump out of the loop, and determine whether the number is 1,
• if it is 1. Return true,
• return false if not 1
Java |
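The code block that originally followed appears to have been lost. Here is a minimal Java sketch of the HashSet approach described above (the class name and the main demo are my own additions; isHappy(int n) follows the usual LeetCode signature):

```java
import java.util.HashSet;
import java.util.Set;

public class HappyNumber {

    public boolean isHappy(int n) {
        Set<Integer> seen = new HashSet<>();
        // Replace n by the sum of the squares of its digits until we either
        // reach 1 (happy) or revisit a number (a cycle, so not happy).
        while (n != 1 && !seen.contains(n)) {
            seen.add(n);
            int sum = 0;
            while (n > 0) {
                int digit = n % 10;
                sum += digit * digit;
                n /= 10;
            }
            n = sum;
        }
        return n == 1;
    }

    public static void main(String[] args) {
        HappyNumber h = new HappyNumber();
        System.out.println(h.isHappy(19)); // true
        System.out.println(h.isHappy(11)); // false
    }
}
```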
How can I make height-matched absolute value symbols without including packages? [closed]
I want to use the following formula:
or, in markup:
$$| \sum_i \vec{v}_i \Delta t_i |$$
I want to make nice height-matched absolute value delimiters, but I can't use any packages (I'm using MathJax). Is there a way to do this in pure math mode?
closed as off topic by Speravir, Kurt, Werner, Harish Kumar, Thorsten Feb 19 '13 at 6:47
MathJax questions are in almost all cases off-topic here, and your question for me belongs to them. You could ask instead in the MathJaX Help Forums. – Speravir Feb 19 '13 at 1:21
This meta post says they're a borderline case, but still allowable. I'll rephrase the question to make it less MathJax-specific. – Dan Feb 19 '13 at 1:28
Do \big|/\Big|/\bigg|/\Bigg| work? – Speravir Feb 19 '13 at 1:45
BTW on Stack Overflow it should not be off-topic. There is also a tag, see stackoverflow.com/questions/tagged/mathjax. – Speravir Feb 19 '13 at 2:11
@Speravir: Yes, but I was hoping for something more automatic, where I don't have to manually pick the size of the delimiters. – Dan Feb 19 '13 at 2:12
You can use \left|....\right| for automatic sizing, or as Speravir comments: choose the size explicitly as (for example) \bigl|....\bigr|
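For the formula in the question this looks as follows (a minimal sketch; both forms work in plain math mode, and in MathJax, without extra packages):
$$\left| \sum_i \vec{v}_i \Delta t_i \right| \qquad\text{or, sized by hand,}\qquad \Bigl| \sum_i \vec{v}_i \Delta t_i \Bigr|$$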
# Simple value of a Forward contract at an intermediate time question
I am taking "Financial Engineering and Risk Management Part I" from Columbia University on coursera and I got a seemingly simple question wrong on the first quiz. This is all based on the no-arbitrage arguments. Here is the question:
During the lesson we constructed a portfolio to try to get the value of a forward at an intermediate time. Here is what we got:
What was missing at this point was how to get F(t) and F(0). A few slides back we did:
Ok so now I have all of the ingredients for this forward soup. I got the forward price at time zero with the stock price at time zero divided by the discount for the whole period (two 6 month periods so its squared). Then I got the forward price at 6 months by taking the price at 6 months and dividing it by the discount for one six month period. I took the difference between the two and multiplied it by the discount factor for six months (between t and T). I ended up rounding off to 27. What am I doing wrong? Below is a picture of my calculations (I did not round any intermediate calculations). I'm more of a pencil and paper guy but if you want I can type it all up.
Also, Happy New Year to you all!
• What was the correct answer to the question? Based on the formula above 125*(1.05) the answer should be 131.25. But that does not seem to be the correct answer. – user17400 Sep 1 '15 at 16:06
Always take care that you got the compounding frequency right. I recommend you take a deeper look at http://breakingdownfinance.com/finance-topics/derivative-valuation/forward-contract/ . You can download an excel file here and take a deeper look at the formula. You can also give in the compounding frequency as input.
In case of doubt, or as a standard procedure, you could first start transform it to continuous compounding and use this to discount to avoid mistakes.
• Got it with the spreadsheet too. Thanks for the link. – GHStein Jan 2 '15 at 18:31
Your discount factors are not inverted properly. Intuitively, df(0,T) should be a number between 0 and 1. For example, if r=0 then there is no discount so df = 1. If r > 0, then discount is going to be less than 1.
Your formula for df will always be greater than 1. Check the formula given by the prof on the last slide titled "Term structure of interest rates" for the PDF on linear pricing in week 2.
If we compound semi-annually and we have half year to go, then the current forward price is $$F = S \left(1+\frac{r}{2}\right) = 125 \left(1+\frac{0.10}{2}\right)$$ Isn't it as simple as that?
• Yes it was a simple algebraic mistake. That's what I get for not taking breaks! Thanks. – GHStein Jan 2 '15 at 18:22
The correct answer to the question is 20.
$F_0 = 100\times(1+0.1/2)^2 = 110.25$ .... forward price at time 0 (the future value of the 100 stock at $T = 1$ yr)
$F_t = 125\times(1+0.1/2) = 131.25$ .... forward price at time $t$ (the future value of 125 over the remaining half year)
$d(t,T) = 1/(1+0.1/2) = 0.9523$
$f_t = (F_t - F_0)\,d(t,T) = 20$
# The equation of the plane for which the foot of the perpendicular drawn from the origin is $(1,2,1)$ is?
$\begin{array}{1 1} (a)\:\:x+2y+z+6=0\:\: & \:\:(b)\:\:x+2y+z-6=0\:\:\\ \:\:(c)\:\:x+2y+z+4=0\:\: &\:\: (d)\:\:x+2y+z-4=0 \end{array}$
Since $A(1,2,1)$ is foot of $\perp$ from origin to the required plane, $A$ lies on the plane.
Also $OA$ is $\perp$ to the plane $\therefore$ $\overrightarrow n=\overrightarrow {OA}=(1,2,1)$
Let the equation of the plane be $ax+by+cz+d=0$ where $(a,b,c)=\overrightarrow n=(1,2,1)$
$\therefore$ The eqn. becomes $x+2y+z+d=0$
Since it passes through $A(1,2,1)$, it satisfies the eqn.
$\therefore \:1+2+1+d=0$ $\Rightarrow\:d=-4$
$\Rightarrow\:$ The eqn. of the plane is $x+2y+z-4=0$ |
# Math Help - 3d problems
1. ## 3d problems
Can someone help me with this question?
Standing due south of a tower 50 m high, the angle of elevation of the top is 26◦.What is the angle of elevation after walking a distance 120 m due east?
The question has me confused so I can't draw a diagram.
2. Ok this problem has to be done in a few parts. A diagram will really help here.
Firstly find the horizontal length from the tower before moving eastward.
You get $\displaystyle \frac{50}{\tan 26^o}$ Do you know why?
Now using Pythagoras' theorem, find the horizontal distance from the tower with the 120 m and the value you just found. What do you get?
3. Hi pickslides.
I understand the first part:
So x will be the horizontal length you're talking about and C is the thing standing due south.
T is tower, x is the horizontal distance in the previous diagram and b will be the distance from the tower to the location after moving 120m due east.
After that we can make b the base of a right-angled triangle and T the height, so it will look like this:
So $\displaystyle x = \frac{50}{\tan 26^o}$
$x=102.515$
Then we can use pythag... $b= \sqrt {102.515^2+ 120^2}$
$b=157.827$
Now we know two sides of the last triangle...
$\tan\theta = \displaystyle\frac {50}{157.827}$
$\theta = \tan^{-1} (\displaystyle\frac{50}{157.827})$
$\theta = 17.58^o$
:S
4. Sounds good to me! |
## Chemistry 10th Edition
2.164 g of pentane produces that quantity of $CO_2$ molecules.
When $C_5H_{12}$ is burned in excess oxygen, it produces $CO_2$ and water: $C_5H_{12} + O_2 \rightarrow CO_2 + H_2O$
1. Balance this reaction:
- Balance the $C$: $C_5H_{12} + O_2 \rightarrow 5CO_2 + H_2O$
- Balance the $H$: $C_5H_{12} + O_2 \rightarrow 5CO_2 + 6H_2O$
- Finally, balance the $O$: the products contain $5\times 2 + 6\times 1 = 16$ oxygen atoms, so the coefficient of $O_2$ must be 8: $C_5H_{12} + 8O_2 \rightarrow 5CO_2 + 6H_2O$
2. Calculate the molar mass of pentane: molar mass ($C_5H_{12}$) $= 12.01\times 5 + 1.008\times 12 = 72.15$ g/mol
3. Use conversion factors to calculate the mass of pentane ($1$ mol $= 6.022 \times 10^{23}$ particles):
$9.033 \times 10^{22}\,(CO_2) \times \dfrac{1\,mol}{6.022 \times 10^{23}} \times \dfrac{1\,mol\,(C_5H_{12})}{5\,mol\,(CO_2)} \times \dfrac{72.15\,g\,(C_5H_{12})}{1\,mol\,(C_5H_{12})} = 2.164$ g $(C_5H_{12})$.
Tuesday, February 13, 2007
Lots of Fn. Topic: Algebra/S&S. Level: AIME.
Problem: Let $F_n$ be the Fibonacci sequence that begins with $F_1 = 1$, $F_2 = 1$, etc. Show that
$F_{2n} = F_{n+1}^2-F_{n-1}^2$.
Solution: We will use induction (as expected). First, rewrite the equality as
$F_{2n} = F_{n+1}^2-(F_{n+1}-F_n)^2 = F_n(2F_{n+1}-F_n)$.
The base case is easy, since we have $F_4 = 3 = 2^2-1^2 = F_3^2-F_1^2$. Now we will show that it is true for $n = k+1$ assuming it is true for $n \le k$. We have
$F_{2k+2} = F_{2k+1}+F_{2k} = 3F_{2k}-F_{2k-2}$.
Now we apply the inductive hypothesis to get
$F_{2k+2} = 3F_k(2F_{k+1}-F_k)-F_{k-1}(2F_k-F_{k-1})$,
which we can simplify with the substitution $F_{k-1} = F_{k+1}-F_k$ to get
$F_{2k+2} = 3F_k(2F_{k+1}-F_k)-(F_{k+1}-F_k)(3F_k-F_{k+1}) = 2F_kF_{k+1}+F_{k+1}^2$
and finally from $F_k = F_{k+2}-F_{k+1}$ we get
$F_{2k+2} = F_{k+1}(2F_{k+2}-F_{k+1})$
as desired. QED.
--------------------
Comment: Not a particularly insightful problem, but admittedly a pretty neat identity. Standard substitution/induction method, which you should all be familiar with right now since I use it all the time here. Maybe there's some cool combinatorics argument that I can't see...
--------------------
Practice Problem: Show $F_{2n} = F_n^2+F_{n-1}^2$ through a combinatorics argument, where $F_0 = 1$, $F_1 = 1$, etc. is the Fibonacci sequence.
[Hint: $F_n$ is the number of ways to tile a $2 \times n$ board with $1 \times 2$ tiles.]
1. Ooh, that's a really nice practice problem. Okay, so consider tiling a 2x2n board, and consider the middle two squares.
Case: A 1x2 tile is in the middle two squares. Then the top 2x(n-1) half can be tiled F_{n-1} ways and so can the bottom half, so F_{n-1}^2 tilings are possible.
Case: A 1x2 tile is not in the middle two squares. Then the top and bottom halves are just 2xn boards, so there are F_n^2 tilings.
2. Yup, very cool. |
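As a quick numerical sanity check (not part of the original post), the sketch below verifies both identities for small $n$. Note that the practice problem's shifted indexing ($F_0 = F_1 = 1$) corresponds to $F_{2n+1} = F_{n+1}^2 + F_n^2$ in the usual $F_1 = F_2 = 1$ indexing used here:

```java
public class FibonacciIdentityCheck {
    public static void main(String[] args) {
        // F[1] = F[2] = 1 indexing, as in the post.
        long[] F = new long[41];
        F[1] = 1; F[2] = 1;
        for (int i = 3; i <= 40; i++) F[i] = F[i - 1] + F[i - 2];

        // Main identity: F(2n) = F(n+1)^2 - F(n-1)^2.
        for (int n = 2; n <= 19; n++) {
            long lhs = F[2 * n];
            long rhs = F[n + 1] * F[n + 1] - F[n - 1] * F[n - 1];
            if (lhs != rhs) throw new AssertionError("main identity fails at n = " + n);
        }

        // Practice problem, rewritten for this indexing: F(2n+1) = F(n+1)^2 + F(n)^2.
        for (int n = 1; n <= 19; n++) {
            long lhs = F[2 * n + 1];
            long rhs = F[n + 1] * F[n + 1] + F[n] * F[n];
            if (lhs != rhs) throw new AssertionError("practice identity fails at n = " + n);
        }
        System.out.println("Both identities hold for all tested n.");
    }
}
```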
# How to Calculate Power Factor Correction
Chapter 4 - Sinusoidal Steady State Power
As you now know, reactive components can cause a circuit’s power factor to deviate from the ideal value of 1. However, a technique known as power factor correction allows us to modify a system such that it requires less reactive power yet continues to provide the necessary functionality.
A common form of power factor correction is the use of parallel capacitance to counteract the effect of inductive loads such as motors. In this page we’ll work through an example of this type of power factor correction.
Reactive power (denoted by Q) is an important concept in the context of power factor correction. In our example, the power factor will be equal to 1 if the magnitude of the reactive power due to the added capacitance (|QC|) is equal to the reactive power due to the circuit’s inductance (QL). This problem would be easier to solve if we knew the inductance, but let’s imagine that the reactive devices in the system don’t come with inductance specifications.
## Step 1: Create the Impedance Triangle
Our example system is equivalent to the following circuit:
Figure 1. The system before power factor correction
The source is a typical 60 Hz, 115 VRMS supply voltage (the corresponding peak value is 163 V). We know that the resistance in the circuit is 90 Ω. As mentioned above, we don’t know the value of the inductance. After measuring the voltage and current, we find that there is a phase difference of 30°.
Figure 2. Measured current and voltage for the circuit shown in Figure 1. The orange trace is the source voltage, and the blue trace is the current supplied by the source.
At this point we have the information that we need to calculate the circuit’s reactance. We’ll use the impedance triangle; the horizontal side of the triangle is the resistance, and the angle between the horizontal side and the hypotenuse is the same as the phase difference between voltage and current.
Figure 3. The impedance triangle for the circuit shown in Figure 1.
To calculate the reactance, we use the following equation:
$$X = R\tan(\theta) = 90\ \Omega \times \tan(30°) \approx 51.96\ \Omega$$
Thus, the reactance of the circuit is 51.96 Ω.
## Step 2: Calculate Inductance and Capacitance
Before we determine the amount of capacitance needed to perform power factor correction, let’s find the circuit’s inductance. This information will be useful later when we perform simulations to verify our results. We’ve learned that the impedance of an inductor is equal to jωL; reactance (denoted by X) is simply the imaginary part of impedance, so the reactance of an inductor is XL = ωL = 2πfL.
The impedance of a capacitor is –j/(ωC), and thus XC = –1/(2πfC). As mentioned above, to achieve power factor correction, the magnitude of the reactive power created by the parallel capacitor must be equal to the reactive power created by the inductance.
Our measurements indicated that the current supplied by the source, and hence the current through the inductor, has a peak value of approximately 1.56 A. Thus, the reactive power produced by the inductance is $Q_L = I_{RMS}^2 X_L$.
The voltage across the capacitor is equal to the source voltage, so we can find the required capacitive reactance from $|Q_C| = V_{RMS}^2/X_C = Q_L$.
Now we calculate the power-factor-correction capacitance from the capacitive reactance, $C = 1/(2\pi f X_C)$; the numbers are worked through in the sketch below.
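The equations referenced in this step appear to have been images that did not survive extraction. The sketch below reconstructs the calculation under the stated assumptions (115 V RMS, 60 Hz, 90 Ω, 30° phase shift); the numerical values are my own reconstruction, not figures quoted from the article:

```java
public class PowerFactorCorrection {
    public static void main(String[] args) {
        double vRms = 115.0;     // supply voltage, RMS
        double f = 60.0;         // supply frequency, Hz
        double r = 90.0;         // series resistance, ohms
        double phaseDeg = 30.0;  // measured phase shift between voltage and current

        // Impedance triangle: X = R * tan(phase)
        double xL = r * Math.tan(Math.toRadians(phaseDeg));  // about 51.96 ohm
        double inductance = xL / (2 * Math.PI * f);          // about 0.138 H

        // RMS current through the R-L branch
        double z = Math.hypot(r, xL);
        double iRms = vRms / z;                              // about 1.11 A (peak about 1.56 A)

        // Reactive power of the inductance, to be cancelled by the capacitor
        double qL = iRms * iRms * xL;                        // about 64 var

        // The capacitor sees the full supply voltage: |Qc| = Vrms^2 / Xc = QL
        double xC = vRms * vRms / qL;                        // about 208 ohm
        double c = 1.0 / (2 * Math.PI * f * xC);             // about 13 microfarads

        System.out.printf("XL = %.2f ohm, L = %.3f H%n", xL, inductance);
        System.out.printf("Irms = %.3f A, QL = %.1f var%n", iRms, qL);
        System.out.printf("XC = %.1f ohm, C = %.1f uF%n", xC, c * 1e6);
    }
}
```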
## Step 3: Verify the Results
Our power-factor-corrected system is equivalent to the following circuit:
Figure 4. We’ve added a power-factor-correction capacitor in parallel with the original circuit.
If we simulate this circuit, we see that the voltage and current are now in phase, which is exactly what we expect when a system has a power factor of 1.
Figure 5. The system’s voltage (the orange trace) and current (the blue trace) are now in phase.
## Review and Moving Forward
We’ve explored power factor correction by means of an example. In the next page, we’ll look at a fundamental and extremely common component in power systems, namely, the transformer. |
# Correlation and Causal Relation
A correlation is a measure or degree of relationship between two variables. A set of data can be positively correlated, negatively correlated or not correlated at all. As one set of values increases the other set tends to increase then it is called a positive correlation.
As one set of values increases the other set tends to decrease then it is called a negative correlation.
If the change in values of one set doesn't affect the values of the other, then the variables are said to have "no correlation" or "zero correlation."
A causal relation between two events exists if the occurrence of the first causes the other. The first event is called the cause and the second event is called the effect. A correlation between two variables does not imply causation. On the other hand, if there is a causal relationship between two variables, they must be correlated.
Example:
A study shows that there is a negative correlation between a student's anxiety before a test and the student's score on the test. But we cannot say that the anxiety causes a lower score on the test; there could be other reasons—the student may not have studied well, for example. So the correlation here does not imply causation.
However, consider the positive correlation between the number of hours you spend studying for a test and the grade you get on the test. Here, there is causation as well; if you spend more time studying, it results in a higher grade.
One of the most commonly used measures of correlation is Pearson Product Moment Correlation or Pearson's correlation coefficient. It is measured using the formula,
${r}_{xy}=\frac{n\sum xy-\sum x\sum y}{\sqrt{\left(n\sum {x}^{2}-{\left(\sum x\right)}^{2}\right)\left(n\sum {y}^{2}-{\left(\sum y\right)}^{2}\right)}}$
The value of Pearson's correlation coefficient varies from $-1$ to $+1$, where $-1$ indicates a strong negative correlation and $+1$ indicates a strong positive correlation.
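A direct transcription of the formula above into code (a minimal sketch; the sample data are invented for illustration):

```java
public class PearsonCorrelation {

    /** Pearson product-moment correlation coefficient of two equal-length samples. */
    public static double pearson(double[] x, double[] y) {
        int n = x.length;
        double sumX = 0, sumY = 0, sumXY = 0, sumX2 = 0, sumY2 = 0;
        for (int i = 0; i < n; i++) {
            sumX += x[i];
            sumY += y[i];
            sumXY += x[i] * y[i];
            sumX2 += x[i] * x[i];
            sumY2 += y[i] * y[i];
        }
        double numerator = n * sumXY - sumX * sumY;
        double denominator = Math.sqrt((n * sumX2 - sumX * sumX) * (n * sumY2 - sumY * sumY));
        return numerator / denominator;
    }

    public static void main(String[] args) {
        double[] hoursStudied = {1, 2, 3, 4, 5};
        double[] testScore = {52, 58, 65, 71, 80};
        // Close to +1, i.e. a strong positive correlation.
        System.out.println(pearson(hoursStudied, testScore));
    }
}
```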
distBetPairs {DisimForMixed} R Documentation
## Calculate Distance Between Attribute Values.
### Description
Takes in a data frame which contains only qualitative variables. Discretized quantitative variables , a mixture of qualitative variables and discretized quantitative variables are also accepted. Calculates distance between each pair of attribute values for a given attribute. This calculation is done according to the method proposed by Ahmad & Dey (2007).
### Usage
distBetPairs(myDataAll)
### Arguments
myDataAll A data frame which includes qualitative variables OR discretized quantitative variables OR a mixture of qualitative variables and discretized quantitative variables in columns.
### Details
distBetPairs is an implementation of the method proposed by Ahmad & Dey (2007) to find the distance between two categorical values corresponding to a qualitative variable. This distance measure considers the distribution of values in the data set. This function is also used to find the distance between discretized values corresponding to quantitative variables, which are used in calculating the significance of quantitative attributes. See Ahmad & Dey (2007) for more details.
### Value
A data frame with four columns J, A, B and C in columns where Distance(A, B) = C and J is the column number in the input data frame corresponding to the values in A.
### References
Ahmad, A., & Dey, L. (2007). A k-mean clustering algorithm for mixed numeric and categorical data. Data & Knowledge Engineering, 63(2), 503-527.
### Examples
QualiVars <- data.frame(Qlvar1 = c("A","B","A","C"), Qlvar2 = c("Q","Q","R","Q"))
library(dplyr)
distForQuali <- distBetPairs(QualiVars)
QuantVars <- data.frame(Qnvar1 = c(1.5,3.2,4.9,5), Qnvar2 = c(4.8,2,1.1,5.8))
Discretized <- discretizeQuant(QuantVars)
distForQuant <- distBetPairs(Discretized)
AllQualQuant <- data.frame(QualiVars, Discretized)
distForAll <- distBetPairs(AllQualQuant)
[Package DisimForMixed version 0.2 Index] |
# Boost :
From: Jonathan Turkanis (technews_at_[hidden])
Date: 2003-12-10 14:49:45
"Rani Sharoni" <rani_sharoni_at_[hidden]> wrote in message
news:br6qje\$jvf\$1_at_sea.gmane.org...
> Jonathan Turkanis wrote:
> > This is a small point, since the code in question is hidden with '#if
> > 0'; however, there seem to be several typos in the definition of
> > is_abstract.
> I posted possible implementation for this important trait (and bother
Jason
> Shirk to implement it in VC7.1) while ago and wondered where can I see the
> one that you are referring to?
>
I'm refering to one that Robert Ramey included in his serialization proposal
at
http://groups.yahoo.com/group/boost/files/serialization13.zip
in the directory "boost\serialization". He gives you credit in the
copyright.
A note says it is "believed to function on at least some EDG compilers." I
tried in briefy on como 4.3.3 and it seemed to work. on VC7.1 I got an ICE.
I'll try your new implementation on VC7.1.
Jonathan |
## Monday, April 26, 2010
### Tight bounds for Clock Synchronization [Lenzen, Locher, Wattenhofer PODC '09]
A distributed system consists of autonomous nodes (usually with an individual clock) communicating through a network. Such systems need a common notion of time to run most tasks. For instance, a multi-person video game on a network needs a way to order actions on different computers that is very close to their actual order. Similar requirements might be needed of, say, a global network of sensors or for a system time-sharing a common resource (like TDMA networks). While this may be easy if all the nodes of the system shared an identical copy of a clock that always kept time perfectly, real clocks are not perfect and tend to drift. If left alone uncorrected, the skew (difference) between clocks in these nodes can become too large and intolerable.
One way to improve this situation is to let the nodes have a logical clock on top of their hardware clocks. They could communicate their logical clock values and adjust them based on the values of other nodes in the system (according to some algorithm). The main practical challenges here are that the hardware clock drift rate and the latency of the communications between nodes is variable and unknown to these nodes. At best, the nodes may be aware of some upper bounds on the hardware clock drift rate and message latencies ($\tau$). Further, an algorithm that adjusts logical clock values can be expected to have certain reasonable properties:
1. The largest skew between any two logical clocks in the network (global skew) is small.
2. The largest skew between the logical clocks of any pair of adjacent nodes in the network (local skew) is small.
3. The maximum possible skew between any pair of nodes degrades smoothly as a function of the distance between the nodes (gradient property).
4. The rate of progress of logical clock is neither too small nor too large. This is to ensure that the spacing between logical times given to events at one node compares well with their real times.
Evidently, the skew depends on the size of network, and more specifically, on the diameter $D$ of the network. It has been shown that no algorithm can achieve a better bound on global skew than $D/2$ (Assume $\tau = 1$) [Biaz, Welch '01]. More surprisingly, [Fan, Lynch '04] showed that no algorithm could achieve a local skew better than $\Omega(\log D/ \log\log D)$. While algorithms that match the bound on global skews were known, matching the local skew bound has been elusive. The paper we will be discussing [LLW-PODC'09, J.ACM'10] solves this problem, and also presents an improved lower bound for local skew.
In the discussion, we will see a more formal statement of the model. We will discuss why certain simple algorithms like setting a node's logical clock to the average or maximum of its neighbors fail. We will then see how ideas from these failed algorithms can be put together to construct a simple algorithm (the proofs, however, are a bit long) that has optimal global and local skews and a good gradient property. We will also talk about lower bounds on global and (time permitting) local skews.
Note: In case you want to read the paper, [LLW-PODC'09] contains only the statements of the results. Also, I found the two associated tech reports difficult to follow. Their recent journal article in J.ACM-Jan'10 is much more readable and contains proofs in good detail. Locher and Wattenhofer's earlier paper in DISC'06 is a good background reading and helps you appreciate the algorithm better.
432 millimeters in decimeters
Conversion
432 millimeters is equivalent to 4.32 decimeters.[1]
Conversion formula How to convert 432 millimeters to decimeters?
We know (by definition) that: $1\mathrm{mm}=0.01\mathrm{dm}$
We can set up a proportion to solve for the number of decimeters.
$1 mm 432 mm = 0.01 dm x dm$
Now, we cross multiply to solve for our unknown $x$:
$x\mathrm{dm}=\frac{432\mathrm{mm}}{1\mathrm{mm}}*0.01\mathrm{dm}\to x\mathrm{dm}=4.32\mathrm{dm}$
Conclusion: $432 mm = 4.32 dm$
Conversion in the opposite direction
The inverse of the conversion factor is that 1 decimeter is equal to 0.231481481481482 times 432 millimeters.
It can also be expressed as: 432 millimeters is equal to $\frac{1}{\mathrm{0.231481481481482}}$ decimeters.
Approximation
An approximate numerical result would be: four hundred and thirty-two millimeters is about four point three two decimeters, or alternatively, a decimeter is about zero point two three times four hundred and thirty-two millimeters.
Footnotes
[1] The precision is 15 significant digits (fourteen digits to the right of the decimal point).
Results may contain small errors due to the use of floating point arithmetic. |
# chain reaction
## Subject - Inorganic Chemistry, Organic Chemistry, Macromolecular Chemistry
Chain reactions are processes in which particles required to initiate a reaction are constantly re-generated, thereby again triggering similar processes. Such processes take part in many chemical reactions. Examples are the chlorine-oxyhydrogen gas reaction, combustion processes and photodegradation by $OH$ radicals, polymerizations, and chain reactions in a nuclear reactor. |
# Can an electric motor run a generator to power itself?
I know this could sound stupid, maybe totally stupid, but I require some clarification....
I came across the electric motor principle, and then I came across the following link on DC generators: DC Generator.
Now my question is, can't we combine them both to provide DC current with a small DC supply on the motor side? By "combine," I mean using a DC generator coil at the other side of the coil of the motor, in the motor itself. The point is that the two coils are connected via some apparatus that causes the 2nd coil to move along with the 1st coil.
In this way we can use both the mechanical energy produced by the first coil and the electrical energy generated by the second coil, right?
This is a good concept if it works, right? Please throw some light on this.
Pranshu Malik is correct. You cannot have a perfectly efficient closed energy system. It would be like trying to move forward in a submarine that had a thrust propeller on the back and an intake "generator" propeller on the front, and expecting that once you get going the front will always power the back, and the back the front, and so on. You are, however, correct in assuming that there is energy that would normally be wasted that can be collected and reused. In some hybrid cars, part of the braking system involves engaging generators, reducing the "wasted energy" of negative acceleration. – Joseph G.
Can you please be specific about the coils? In case you are referring to the coil inside the motor and dividing it into two halves, then that won't be possible, since current needs a loop to flow, and you can't have a closed loop when there is only mechanical motion and no current in one loop. Further, there are always frictional and resistive losses, so the initial energy provided (in the form of rotating the coils at first) will eventually die out, and a driving source will be needed - a battery or rotating turbines. You can't create energy and (or) have 100% efficiency in a cyclic process; that violates the second law of thermodynamics as well as the law of conservation of energy. – Pranshu Malik
# Prove that (A-B)-C=(A-C)-(B-C)
1. Feb 23, 2012
### iHeartof12
Let A,B and C be sets. Prove that
(A-B)-C=(A-C)-(B-C).
Attempted solution:
Suppose $x \in (A-B)-C$. Since $x \in (A-B)-C$ this means that $x \in A$ but $x \notin B$ and $x \notin C$.
I'm not sure how to show that these two sets are equal.
2. Feb 23, 2012
### jbunniii
So far so good. You have established that x is in A, but not in B and not in C.
* Is x in A - C?
* Is x in B - C?
and see what you can conclude.
3. Feb 23, 2012
### fauboca
Ok, well, you said $x\in A$ and $x\notin C$. What does that mean?
4. Feb 24, 2012
### iHeartof12
Let A,B and C be sets. Prove that
(A-B)-C=(A-C)-(B-C).
Attempted solution:
i.
Suppose $x \in (A-B)-C$. Since $x \in (A-B)-C$ this means that $x \in A$ but $x \notin B$ and $x \notin C$.
ii.
Suppose $x \in (A-C)-(B-C)$. Since $x \in (A-C)-(B-C)$ it makes sense that $x \in A$ and $x \notin B$ and $x \notin C$.
Therefore these two sets are equal, and (A-B)-C=(A-C)-(B-C).
5. Feb 24, 2012
### HallsofIvy
You need to finish this! $x \notin B$ and $x \notin C$ means what about x being in (A - C) - (B - C)?
Why does that make sense? And what does that tell you about x being in (A - B) - C?
6. Feb 24, 2012
### Deveno
you want to show the two sets are subsets of each other; that is, that they have precisely the same elements.
if x is (A-B)-C, what does that mean?
first of all, it means x is in A-B, but x is not in C.
secondly, since x is in A-B, it means x is in A, but not in B.
putting these two statements together, we have: x is in A, x is not in B, x is not in C.
now if x is not in B, then it is not in B-C, since that is a subset of B.
(x is not only NOT in the part of B that lies outside of C, it's totally not in B anywhere).
but x IS in A, and x is NOT in C, so x IS in A-C.
so x IS in A-C and x is NOT in B-C, so x IS in (A-C)-(B-C).
that's "half" of the proof. the "other half" starts with assuming x is in (A-C)-(B-C). |
Platform structures are commonly utilized for various purposes, including offshore drilling, processing, and support of offshore operations. A jacket is a supporting structure for deck facilities, stabilized by piles driven through it to the seabed. In jacket design, operational and environmental loads are very important and must be investigated intensively to secure the stability of the structure during its service life as well as during the installation phase. The main purpose of this research is to evaluate the results of physical modeling of the launch of a jacket from a barge into the sea, the most hazardous stage in the installation of a platform, and to compare them with those of numerical modeling. Both physical and numerical modeling parameters are described, and they are examined on a prototype platform, the Balal oil field production and living-quarter platform: a 1700-tonne, eight-legged jacket located in the center of the Persian Gulf, some 100 km from the Iranian Lavan Island. It is found that both numerical and physical methods can describe the motion of the barge similarly well, but some differences appear in the motion of the jacket. These discrepancies turn out to be due to the Froude-type parameters applied for modeling purposes. One notable finding of this research is the necessity of choosing Reynolds–Froude similarity in the physical modeling of the launch, instead of Froude similarity alone. This is because, in addition to the importance of gravitational and inertial forces, viscosity affects the hydrodynamic drag force as well. It should be noted that viscosity, and consequently the drag coefficient, cannot be scaled correctly in Froude-type modeling, and this causes the difference observed between the results of physical and numerical modeling. Although many jacket launches have been designed and their physical models presumably tested, to the best of our knowledge no study on Reynolds–Froude physical modeling of the jacket launch phenomenon has been reported in the literature. If one is interested in carrying out Reynolds–Froude physical modeling, it could be done either in a centrifuge test, by using a fluid with lower viscosity chosen according to the scale of the model, or even by finding a fluid (with a new viscosity and density) and a new gravity such that the Froude and Reynolds similarity laws are satisfied simultaneously.
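To illustrate the scaling conflict described in the abstract, here is a minimal Python sketch with assumed, illustrative numbers (scale factor, characteristic length and velocity, water viscosity); it applies Froude similarity to set the model velocity and then shows how far the model Reynolds number falls below the prototype value, which is why viscous drag effects are distorted in a Froude-only model.

```python
import math

# Illustrative Froude-scaling sketch; all numbers below are assumptions,
# not data from the paper.
scale = 1 / 50            # geometric scale of the model (assumed)
L_proto = 50.0            # characteristic length of the prototype [m] (assumed)
V_proto = 5.0             # characteristic launch velocity of the prototype [m/s] (assumed)
nu_water = 1.0e-6         # kinematic viscosity of water [m^2/s]
g = 9.81

# Froude similarity: Fr = V / sqrt(g L) is kept equal, so V_model = V_proto * sqrt(scale).
L_model = L_proto * scale
V_model = V_proto * math.sqrt(scale)

Fr_proto = V_proto / math.sqrt(g * L_proto)
Fr_model = V_model / math.sqrt(g * L_model)
Re_proto = V_proto * L_proto / nu_water
Re_model = V_model * L_model / nu_water

print(f"Froude number, prototype: {Fr_proto:.3f}")
print(f"Froude number, model:     {Fr_model:.3f}   (matched by construction)")
print(f"Reynolds number, prototype: {Re_proto:.3e}")
print(f"Reynolds number, model:     {Re_model:.3e}")
print(f"Reynolds ratio model/prototype: {Re_model / Re_proto:.1e}")  # equals scale**1.5
```

Matching the Reynolds number as well would require a lower-viscosity fluid or an increased effective gravity (a centrifuge), which is exactly the option mentioned at the end of the abstract.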
09:00 to 09:25 Registration 09:25 to 09:30 Welcome INI 1 09:30 to 10:30 J Lutz (Iowa State University)Alan Turing in the twenty-first century: normal numbers, randomness, and finite automata We discuss ways in which Turing's then-unpublished A Note on Normal Numbers'' foreshadows and can still guide research in our time. This research was supported in part by NSF Grant 0652569. Part of the work was done while the author was on sabbatical at Caltech and the Isaac Newton Institute for Mathematical Sciences at the University of Cambridge. INI 1 10:30 to 11:00 Morning coffee 11:00 to 12:00 A Nies (University of Auckland)Demuth randomness and its variants A Demuth test is like a Martin-Löf test with the passing condition to be out of infinitely many components; the strength of the test is enhanced by the possibility to exchange the $n$-th component of the test a computably bounded number of times. Demuth introduced Demuth randomness of reals in 1982, and showed that the Denjoy alternative for Markov computable functions holds at any such real. In [1] we proved that Demuth's hypothesis is in fact too strong: difference randomness (i.e., ML-randomness together with incompleteness) of the real already suffices. However, Demuth randomness now plays an important role in the interaction of computability and randomness. The basic relevant fact here is that a Demuth random set can be below the halting problem. In [2] we characterized lowness for Demuth randomness by a property called BLR-traceability, in conjunction with being computably dominated. The low for Demuth random sets form a proper subclass of the computably traceable sets used by Terwijn and Zambella to characterize lowness for Schnorr randomness. The covering problem asks whether each K-trivial set $A$ is Turing below a difference random set $Y$. Combining work of Kucera and Nies [3] with results of Downey, Diamondstone, Greenberg and Turetsky gives an affirmative answer to an analogous question: a set $A$ is strongly jump traceable if and only if it is below a Demuth random set $Y$. In recent work, Bienvenu, Greenberg, Kucera, Nies, and Turetsky introduced a weakening of Demuth randomness called Oberwolfach randomness. They used it to build a smart'' K-trivial set $A$: it is difficult to cover in that any Martin-Löf random set $Y$ above $A$ must be LR-hard. [2] Bienvenu, Downey, Greenberg, Nies, and Turetsky. "Characterizing lowness for Demuth Randomness." Submitted. [1] Bienvenu, Hoelzl, Miller, and Nies. "The Denjoy alternative for computable functions." Proceedings of STACS 2012. [3] Kucera, A. and Nies, A. Demuth. "randomness and computational complexity." Annals of Pure and Applied Logic 162 (2011) 504–513. INI 1 12:00 to 12:30 S Sanders (Universiteit Gent)Nonstandard Analysis: a new way to compute Constructive Analysis was introduced by Errett Bishop to identify the computational meaning' of mathematics. Bishop redeveloped mathematics, in the spirit of intuitionistic mathematics, based on primitive notions like algorithm, explicit computation, and finite procedure. The exact meaning of these vague terms was left open, to ensure the compatibility of Constructive Analysis with several traditions in mathematics. Constructive Reverse Mathematics (CRM) is a spin-off from Harvey Friedman's famous Reverse Mathematics program, based on Constructive Analysis. In this talk, we introduce $\Omega$-invariance': a simple and elegant definition of finite procedure in (classical) Nonstandard Analysis. 
We show that $\Omega$-invariance captures Bishop's notion of algorithm quite well. In particular, using an intuitive interpretation based on $\Omega$-invariance, we obtain many results from CRM inside Nonstandard Analysis. Similar results for Computability (aka Recursion) Theory are also discussed. This research is made possible through the generous support of a grant from the John Templeton Foundation for the project Philosophical Frontiers in Reverse Mathematics. Please note that the opinions expressed in this publication are those of the author and do not necessarily reflect the views of the John Templeton Foundation. INI 1 12:30 to 13:30 Lunch at Wolfson Court. 14:00 to 15:00 A Shen (Université de Montpellier 2)Topological arguments in Kolmogorov complexity We show how topological arguments (simple facts about non-homotopic mappings) can be used to prove result about Kolmogorov complexity. In particular, we show that for every string x of complexity at least n +c log n one can find a string y such that both conditional complexities C(x|y) and C(y|x) are equal to n+O(1). See also: http://arxiv.org/abs/1206.4927 INI 1 15:00 to 15:30 K Miyabe (Kyoto University)Schnorr triviality is equivalent to being a basis for tt-Schnorr randomness We present some new characterizations of Schnorr triviality. Schnorr triviality is defined using complexity via a computable measure machine, with which Schnorr randomness has a characterization. Since we have a characterization of Schnorr randomness via decidable prefix-free machine, we also have a characterization of Schnorr triviality using complexity via a decidable prefix-free machine. It should be noted that numerous characterizations of Schnorr triviality have the following form: for any computable object, there exists another computable object such that the real is in some object. By defining a basis for Schnorr randomness in a similar manner, we can show the equivalence to Schnorr triviality while Franklin and Stephan (2010) showed that there exists a Schnorr trivial set that is not truth-table reducible to any Schnorr random set. INI 1 15:30 to 16:00 Afternoon tea 17:00 to 17:30 T Petrovic (University of Zagreb)Two betting strategies that predict all compressible sequences A new type of betting games that charaterize Martin-Löf randomness is introduced. These betting games can be compared to martingale processes of Hitchcock and Lutz as well as non-monotonic betting strategies. Sequence-set betting is defined as successive betting on prefix-free sets that contain a finite number of words. In each iteration we start with an initial prefix-free set $P$ and an initial capital $c$, then we divide $P$ into two prefix-free sets $P_{0}$ and $P_{1}$ of equal size and wager some amount of capital on one of the sets, let's say $P_{0}$. If the infinite sequence we are betting on has a prefix in $P_{0}$ then in the next iteration the initial set is $P_{0}$ and the wagered amount is doubled. If the infinite sequence we are betting on does not have a prefix in $P_{0}$ then in the next iteration the initial set is $P_{1}$ and the wagered amount is lost. In the first iteration the initial prefix-free set contains the empty string. The player succeeds on the infinite sequence if the series of bets increases capital unboundedly. 
Non-monotonic betting can be viewed as sequence-set betting with an additional requirement that the initial prefix-free set is divided into two prefix-free sets such that sequences in one set have at some position bit 0 and in the other have at that same position bit 1. On the other hand if the requirement that the initial prefix-free set $P$ is divided into two prefix-free sets of equal size is removed, and we allow that $P_{0}$ may have a different size from $P_{1}$ we have a betting game that is equivalent to martingale processes in the sense that for each martingale process there is a betting strategy that succeeds on the same sequences as martingale process and for each betting strategy a martingale process exists that succeeds on the the same sequences as the betting strategy. It is shown that, unlike martingale processes, for any computable sequence-set betting strategy there is an infinite sequence on which betting strategy doesn't succeed and which is not Martin-Löf random. Furthermore it is shown that there is an algorithm that constructs two sets of betting decisions for two sequence-set betting strategies such that for any sequence that is not Martin-Löf random at least one of them succeeds on that sequence. INI 1 17:30 to 18:00 J Rute (Carnegie Mellon University)Computable randomness and its properties Computable randomness at first does not seem as natural of a randomness notion as Schnorr and Martin-Löf randomness. However, recently Brattka, Miller, and Nies [1] have shown that computable randomness is closely linked to differentiability. Why is this so? What are the chances that, say, computable randomness will also be linked to the ergodic theorem? In this talk I will explain how computable randomness is similar to and how it is different from other notions of randomness. Unlike other notions of randomness, computable randomness is closely linked to the Borel sigma-algebra of a space. This has a number of interesting implications: Computable randomness can be extended to other computable probability spaces, but this extension is more complicated to describe [2]. Computable randomness is invariant under isomorphisms, but not morphisms (a.e.-computable measure-preserving maps) [2]. Computable randomness is connected more with differentiability than with the ergodic theorem. Dyadic martingales and martingales whose filtration converges to a "computable" sigma-algebra characterize computable randomness, while more general computable betting strategies do not. However, this line of research still leaves many open questions about the nature of computable randomness and the nature of randomness in general. I believe the tools used to explore computable randomness may have other applications to algorithmic randomness and computable analysis. [1] Vasco Brattka, Joseph S. Miller, and André Nies. "Randomness and differentiability." Submitted. [2] Jason Rute. "Computable randomness and betting for computable probability spaces." In preparation. INI 1 18:00 to 18:30 Welcome Drinks Reception
09:00 to 10:00 S Simpson (Pennsylvania State University)Propagation of partial randomness Let $X$ be an infinite sequence of $0$'s and $1$'s, i.e., $X\in\{0,1\}^\mathbb{N}$. Even if $X$ is not Martin-Löf random, we can nevertheless quantify the amount of partial randomness which is inherent in $X$. Many researchers including Calude, Hudelson, Kjos-Hanssen, Merkle, Miller, Reimann, Staiger, Tadaki, and Terwijn have studied partial randomness. We now present some new results due to Higuchi, Hudelson, Simpson and Yokoyama concerning propagation of partial randomness. Our results say that if $X$ has a specific amount of partial randomness, then $X$ has an equal amount of partial randomness relative to certain Turing oracles. More precisely, let $\mathrm{KA}$ denote a priori Kolmogorov complexity, i.e., $\mathrm{KA}(\sigma)=-\log_2m(\sigma)$ where $m$ is Levin's universal left-r.e. semimeasure. Note that $\mathrm{KA}$ is similar but not identical to the more familiar prefix-free Kolmogorov complexity. Given a computable function $f:\{0,1\}^*\to[0,\infty)$, we say that $X\in\{0,1\}^\mathbb{N}$ is strongly $f$-random if $\exists c\,\forall n\,(\mathrm{KA}(X{\upharpoonright}\{1,\ldots,n\})>f(X{\upharpoonright}\{1,\ldots,n\})-c)$. Two of our results read as follows. Theorem 1. Assume that $X$ is strongly $f$-random and Turing reducible to $Y$ where $Y$ is Martin-Löf random relative to $Z$. Then $X$ is strongly $f$-random relative to $Z$. Theorem 2. Assume that $\forall i\,(X_i$ is strongly $f_i$-random$)$. Then, we can find a $\mathrm{PA}$-oracle $Z$ such that $\forall i\,(X_i$ is strongly $f_i$-random relative to $Z)$. We also show that Theorems 1 and 2 fail badly with $\mathrm{KA}$ replaced by $\mathrm{KP}=$ prefix-free complexity. INI 1 10:00 to 10:30 P Cholak (University of Notre Dame)Computably enumerable partial orders We study the degree spectra and reverse-mathematical applications of computably enumerable and co-computably enumerable partial orders. We formulate versions of the chain/antichain principle and ascending/descending sequence principle for such orders, and show that the latter is strictly stronger than the latter. We then show that every $\emptyset'$-computable structure (or even just of c.e. degree) has the same degree spectrum as some computably enumerable (co-c.e.) partial order, and hence that there is a c.e. (co-c.e.) partial order with spectrum equal to the set of nonzero degrees. A copy of the submitted paper can be found at http://www.nd.edu/~cholak/papers/ceorderings.pdf INI 1 10:30 to 11:00 Morning coffee 11:00 to 12:00 V Brattka (University of Cape Town)On the computational content of the Baire Category Theorem We present results on the classification of the computational content of the Baire Category Theorem in the Weihrauch lattice. The Baire Category Theorem can be seen as a pigeonhole principle that states that a large (= complete) metric space cannot be decomposed into a countable number of small (= nowhere dense) pieces (= closed sets). The difficulty of the corresponding computational task depends on the logical form of the statement as well as on the information that is provided. In the first logical form the task is to find a point in the space that is left out by a given decomposition of the space that consists of small pieces. In the contrapositive form the task is to find a piece that is not small in a decomposition that exhausts the entire space. In both cases pieces can be given by descriptions in negative or positive form. 
We present a complete classification of the complexity of the Baire Category Theorem in all four cases and for certain types of spaces. The results are based on joint work with Guido Gherardi and Alberto Marcone, on the one hand, and Matthew Hendtlass and Alexander Kreuzer, on the other hand. One obtains a refinement of what is known in reverse mathematics in this way. [1] Vasco Brattka and Guido Gherardi. "Effective choice and boundedness principles in computable analysis." The Bulletin of Symbolic Logic, 17(1):73–117, 2011. [2] Vasco Brattka and Guido Gherardi. "Weihrauch degrees, omniscience principles and weak computability." The Journal of Symbolic Logic, 76(1):143–176, 2011. [3] Vasco Brattka, Guido Gherardi, and Alberto Marcone. "The Bolzano-Weierstrass theorem is the jump of weak KŐnig's lemma." Annals of Pure and Applied Logic, 163:623–655, 2012. [4] Vasco Brattka, Matthew Hendtlass, and Alexander P. Kreuzer. "On the Weihrauch complexity of computability theory." unpublished notes, 2012. [5] Vasco Brattka. "Computable versions of Baire's category theorem." In Jiří Sgall, Aleš Pultr, and Petr Kolman, editors, Mathematical Foundations of Computer Science 2001, volume 2136 of Lecture Notes in Computer Science, pages 224–235, Berlin, 2001. Springer. 26th International Symposium, MFCS 2001, Mariánské Lázně, Czech Republic, August 27-31, 2001. [6] Douglas K. Brown and Stephen G. Simpson. "The Baire category theorem in weak subsystems of second order arithmetic." The Journal of Symbolic Logic, 58:557–578, 1993. INI 1 12:00 to 12:30 C Porter (University of Notre Dame)Trivial measures are not so trivial Although algorithmic randomness with respect to various biased computable measures is well-studied, little attention has been paid to algorithmic randomness with respect to computable trivial measures, where a measure $\mu$ on $2^\omega$ is trivial if the support of $\mu$ consists of a countable collection of sequences. In this talk, I will show that there is much more structure to trivial measures than has been previously suspected. In particular, I will outline the construction of a trivial measure $\mu$ such that every sequence that is Martin-Löf random with respect to $\mu$ is an atom of $\mu$ (i.e., $\mu$ assigns positive probability to such a sequence), while there are sequences that are Schnorr random with respect to $\mu$ that are not atoms of $\mu$ (thus yielding a counterexample to a result claimed by Schnorr), and a trivial measure $\mu$ such that (a) the collection of sequences that are Martin-Löf random with respect to $\mu$ are not all atoms of $\mu$ and (b) every sequence that is Martin-Löf random with respect to $\mu$ and is not an atom of $\mu$ is also not weakly 2-random with respect to $\mu$. Lastly, I will show that, if we consider the class of $LR$-degrees associated with a trivial measure $\mu$ (generalizing the standard definition of the $LR$-degrees), then for every finite distributive lattice $\mathcal{L}=(L, \leq)$, there is a trivial measure $\mu$ such that the the collection of $LR$-degrees with respect to $\mu$ is a finite distributive lattice that is isomorphic to $\mathcal{L}$. 
INI 1 12:30 to 13:30 Lunch at Wolfson Court 14:00 to 15:00 M Hoyrup (INRIA Paris - Rocquencourt)On the inversion of computable functions Ergodic shift-invariant measures inherit many effective properties of the uniform measure: for instance, the frequency of $1$'s in a typical sequence converge effectively, hence it converges at every Schnorr random sequence; the convergence is robust to small violations of randomness [1]; every Martin-Löf random sequence has a tail in every effective closed set of positive measure [2]. These properties are generally not satisfied by a non-ergodic measure, unless its (unique) decomposition into a combination of ergodic measures is effective. V'yugin [3] constructed a computable non-ergodic measure whose decomposition is not effective. This measure is a countable combination of ergodic measures. What happens for finite combinations? Is there a finitely but non-effectively decomposable measure? We prove that the answer is positive: there exist two non-computable ergodic measures $P$ and $Q$ such that $P+Q$ is computable. Moreover, the set of pairs $(P,Q)$ such that neither $P$ nor $Q$ is computable from $P+Q$ is large in the sense of Baire category. This result can be generalized into a theorem about the inversion of computable functions, which gives sufficient conditions on a one-to-one computable function $f$ that entail the existence of a non-computable $x$ such that $f(x)$ is computable. We also establish a stronger result ensuring the existence of a sufficiently generic'' $x$ such that $f(x)$ is computable, in the spirit of Ingrassia's notion of $p$-genericity [4]. [1] Vladimir V. V'yugin. "Non-robustness property of the individual ergodic theorem." Problems of Information Transmission, 37(2):27–39, 2001. [2] Laurent Bienvenu, Adam Day, Ilya Mezhirov, and Alexander Shen. "Ergodic-type characterizations of algorithmic randomness." In Computability in Europe (CIE 2010), volume 6158 of LNCS, pages 49–58. Springer, 2010. [3] Vladimir V. V'yugin. "Effective convergence in probability and an ergodic theorem for individual random sequences." SIAM Theory of Probability and Its Applications, 42(1):39–50, 1997. [4] M.A. Ingrassia. P-genericity for Recursively Enumerable Sets. University of Illinois at Urbana-Champaign, 1981. INI 1 15:00 to 15:30 I Herbert (University of California, Berkeley)(Almost) Lowness for K and finite self-information A real $X$, is called low for K if there is a constant $c$ such that using $X$ as an oracle does not decrease the Kolmogorov complexity of any string by more than $c$. That is, the inequality $K(\sigma) \leq K^{X}(\sigma) +c$ holds for all $\sigma \in 2^{ INI 1 15:30 to 16:00 Afternoon Tea 17:00 to 17:30 B Bauwens (Universidade do Porto)Prefix and plain Kolmogorov complexity characterizations of 2-randomness: simple proofs Joseph Miller[1] and independently Andre Nies, Frank Stephan and Sebastian Terwijn[2] gave a complexity characterization of$2$-random sequences in terms of plain Kolmogorov complexity$C\,(\cdot)$: they are sequences that have infinitely many initial segments with$O(1)$-maximal plain complexity (among the strings of the same length). Later Miller[3] (see also [4]) showed that prefix complexity$K\,(\cdot)$can be also used in a similar way: a sequence is$2$-random if and only if it has infinitely many initial segments with$O(1)$-maximal prefix complexity (which is$n+K\,(n)$for strings of length~$n$). The known proofs of these results are quite involved; we provide simple direct proofs for both of them. 
In [1] Miller also gave a quantitative version of the first result: the$0'$-randomness deficiency of a sequence$\omega$equals$\liminf_n [n - C\,(\omega_1\dots\omega_n)] + O(1)$. (Our simplified proof also can be used to prove this quantitative version.) We show (and this seems to be a new result) that a similar quantitative result is true also for prefix complexity:$0'$-randomness deficiency$d^{0'}(\omega)$equals also$\liminf_n [n + K\,(n) - K\,(\omega_1\dots\omega_n)]+ O(1)$. This completes the picture: \begin{eqnarray*} d^{0'}(\omega) &=& \sup_n \, \left[ n - K\,^{0'}(\omega_1\dots\omega_n) \right] + O(1) \\ &=& \liminf_n \, \left[ n - C\,(\omega_1\dots\omega_n) \right] + O(1) \\ &=& \liminf_n \, \left[ n + K\,(n) - K\,(\omega_1\dots\omega_n) \right] + O(1) \,. \end{eqnarray*} [1] J.S. Miller. "Every 2-random real is Kolmogorov random." Journal of Symbolic Logic, 69(3):907–913, 2004. [2] A. Nies, F. Stephan, and S.A. Terwijn. "Randomness, relativization and turing degrees." The Journal of Symbolic Logic, 70(2), 2005. [3] J.S. Miller. "The K-degrees, low for K-degrees, and weakly low for K sets." Notre Dame Journal of Formal Logic, 50(4):381–391, 2009. [4] R.G. Downey and D.R. Hirschfeldt. "Algorithmic Randomness and Complexity." Theory and Applications of Computability. Springer, 2010. INI 1 20:00 to 22:15 Codebreaker movie INI 1 09:00 to 10:00 J M Hitchcock (University of Wyoming)Limitations of Efficient Reducibility to the Kolmogorov Random Strings INI 1 10:00 to 10:30 M Zimand (Towson University)Language compression for sets in P/poly If we consider a finite set$A$, it is desirable to represent every$x$in$A$by another shorter string compressed($x$) such that compressed($x$) describes unambiguously the initial$x$. Regarding the compression rate, ideally, one would like to achieve the information-theoretical bound |compressed($x$)|$\approx \log (|A|)$, for all$x$in$A$. This optimal rate is achievable for c.e. (and also co-c.e.) sets$A$, because for such a set C($x$)$\leq \log (|A^{=n}|) + O(\log n)$(C($x$) is the Kolmogorov complexity of string$x$and$n$is the length of$x$). In the time-bounded setting, we would like to have a polynomial-time type of unambiguous description. The relevant concept is the time-bounded distinguishing complexity, CD(), introduced by Sipser. The$t$-time bounded distinguishing complexity of$x$, denoted CD$^t(x)$is the length of the shortest program$p$that accepts$x$and only$x$within time$t(|x|)$. Recently we have shown that for sets in P, NP, PSPACE, optimal compression can be achieved, using some reasonable hardness assumptions. We show that this also holds for sets in P/poly, i.e., for sets computable by polynomial-size circuits. Furthermore, we sketch here a different proof method (even though some key elements are common) suggested by Vinodchandran, which needs a weaker hardness assumption. INI 1 10:30 to 11:00 Morning Coffee 11:00 to 12:00 M Koucky (Academy of Sciences of the Czech Republic)The story of superconcentrators – the missing link In 60's and 70's directed graphs with strong connectivity property were linked to proving lower bounds on complexity of solving various computational problems. Graphs with strongest such property were named superconcentrators by Valiant (1975). An n-superconcentrator is a directed acyclic graph with n designated input nodes and n designated output nodes such that for any subset X of input nodes and any equal-sized set Y of output nodes there are |X| vertex disjoint paths connecting the sets. 
Contrary to previous conjectures Valiant showed that there are n-superconcentrators with O(n) edges thus killing the hope of using them to prove lower bounds on computation. His n-superconcentrators have depth O(log n). Despite this setback, superconcentrators found their way into lower bounds in the setting of bounded-depth circuits. They provide asymptotically optimal bounds for computing many functions including integer addition, and most recently computing good error-correctin g codes. The talk will provide an exposition of this fascinating area. INI 1 12:00 to 12:30 D Nguyen (University at Buffalo)Autoreducibility for NEXP Autoreducibility was first introduced by Trakhtenbrot in a recursion theoretic setting. A set$A$is autoreducible if there is an oracle Turing machine$M$such that$A = L(M^A)$and M never queries$x$on input$x$. Ambos-Spies introduced the polynomial-time variant of autoreducibility, where we require the oracle Turing machine to run in polynomial time. The question of whether complete sets for various classes are polynomial-time autoreducible has been studied extensively. In some cases, it turns out that resolving autoreducibility can result in interesting complexity class separations. One question that remains open is whether all Turing complete sets for NEXP are Turing autoreducible. An important separation may result when solving the autoreducibility for NEXP; if there is one Turing complete set of NEXP that is not Turing autoreducible, then EXP is different from NEXP. We do not know whether proving all Turing complete sets of NEXP are Turing autoreducible yields any separation results. Buhrman et al. showed that all${\le_{\mathit{2\mbox{-}tt}}^{\mathit{p}}}$-complete sets for EXP are${\le_{\mathit{2\mbox{-}tt}}^{\mathit{p}}}$-autoreducible. This proof technique exploits the fact that EXP is closed under exponential-time reductions that only ask one query that is smaller in length. Difficulties arise when we want to prove that the above result holds for NEXP, because we do not know whether this property still holds for NEXP. To resolve that difficulty, we use a nondeterministic technique that applies to NEXP and obtain the similar result for NEXP; that is, all${\le_{\mathit{2\mbox{-}tt}}^{\mathit{p}}}$-complete sets for NEXP are${\le_{\mathit{2\mbox{-}tt}}^{\mathit{p}}}$-autoreducible. Using similar techniques, we can also show that every disjunctive and conjunctive truth-table complete set for NEXP is disjunctive and conjunctive truth-table autoreducible respectively. In addition to those positive results, negative ones are also given. Using different notions of reductions, we can show that there is a complete set for NEXP that is not autoreducible. Finally, in the relativized world, there is a${\le_{\mathit{2\mbox{-}T}}^{\mathit{p}}}$-complete set for NEXP that is not Turing autoreducible. INI 1 12:30 to 13:30 Lunch at Wolfson Court 13:30 to 17:00 Excursion 19:30 to 22:00 Conference Dinner at Emmanuel College 09:00 to 10:00 D Turetsky (Victoria University of Wellington)SJT-hardness and pseudo-jump inversion Tracing was introduced to computability theory by Terwijn and Zambella, who used it to characterize the degrees which are low for Schnorr randomness. Zambella observed that tracing also has a relationship to K-triviality, which was strengthened by Nies and then later others. A trace for a partial function f is a sequence of finite sets (T_z) with f(z) in T_z for all z in the domain of f. A trace (T_z) is c.e. if the T_z are uniformly c.e. sets. 
An order function is a total, nondecreasing, unbounded positive function. If h is an order, a trace (T_z) is an h-trace if h(z) bounds the size of T_z. A set is called jump-traceable (JT) if every partial function it computes has a c.e. h-trace for some computable order h. A set is called strongly jump-traceable (SJT) if every partial function it computes has a c.e. h-trace for every computable order h. Figuiera et al. constructed a non-computable c.e. set which is SJT. This gives a natural pseudo-jump operator. Pseudo-jump inverting this SJT-operator gives a set A such that the halting problem looks SJT relative to A. That is, for every partial function computable from the halting problem, and every computable order h, there is an h-trace which is uniformly c.e. relative to A. Such a set is called SJT-hard. Downey and Greenberg showed that there is a non-computable c.e. set E which is computable from every c.e. SJT-hard set. Thus the SJT-operator cannot be pseudo-jump inverted outside of the cone above E. We strengthen this result, showing that E can be made super high. INI 1 10:00 to 10:30 J Teutsch (Pennsylvania State University)Translating the Cantor set by a random I will discuss the constructive dimension of points in random translates of the Cantor set. The Cantor set cancels randomness'' in the sense that some of its members, when added to Martin-Löf random reals, identify a point with lower constructive dimension than the random itself. In particular, we find the Hausdorff dimension of the set of points in a Cantor set translate with a given constructive dimension. More specifically, let$\mathcal{C}$denote the standard middle third Cantor set, and for each real$\alpha$let$\mathcal{E}_{\alpha}$consist of all real numbers with constructive dimension$\alpha$. Our result is the following. If$1 -\log2/\log3 \leq \alpha \leq 1$and$r$is a Martin-Löf random real, then the Hausdorff dimension of the set$ (\mathcal{C}+r) \cap \mathcal{E}_{\alpha}$is$\alpha -(1 -\log 2/\log 3)$. From this theorem we obtain a simple relation between the effective and classical Hausdorff dimensions of this set; the difference is exactly$1$minus the dimension of the Cantor set. We conclude that many points in the Cantor set additively cancel randomness. On the surface, the theorem above describes a connection between algorithmic randomness and classical fractal geometry. Less obvious is its relationship to additive number theory. In 1954, G. G. Lorentz proved the following statement. There exists a constant$c$such that for any integer$k$, if$A\subseteq [0, k)$is a set of integers with${\left|A\right|} \geq \ell \geq 2$, then there exists a set of integers$B \subseteq (-k,k)$such that$A + B \supseteq [0, k)$with${\left|B\right|} \leq ck\frac{\log \ell}{\ell}$. Given a Martin-Löf random real$r$, I will show how Lorentz's Lemma can be used to identify a point$x\in\mathcal{C}$such that the constructive dimension of$x+r$is close to$1 - \log 2 / \log 3$, which is as small as it can possibly be. This talk is based on joint work with Randy Dougherty, Jack Lutz, and Dan Mauldin. INI 1 10:30 to 11:00 Morning Coffee 11:00 to 12:00 G Barmpalias (Chinese Academy of Sciences)Exact pairs for the ideal of the$K$-trivial sequences in the Turing degrees The$K$-trivial sets form an ideal in the Turing degrees, which is generated by its computably enumerable (c.e.) members and has an exact pair below the degree of the halting problem. The question of whether it has an exact pair in the c.e. 
degrees was first raised in a published list of questions in the Bulletin of Symbolic Logic in 2006 by Miller and Nies and later in Nies' book on computability and randomness. Moreover it was featured in several conferences in algorithmic randomness, since 2005. We give a negative answer to this question. In fact, we show the following stronger statement in the c.e. degrees. There exists a$K$-trivial degree$\mathbf{d}$such that for all degrees$\mathbf{a}, \mathbf{b}$which are not$K$-trivial and$\mathbf{a}>\mathbf{d}, \mathbf{b}>\mathbf{d}$there exists a degree$\mathbf{v}$which is not$K$-trivial and$\mathbf{a}>\mathbf{v}, \mathbf{b}>\mathbf{v}$. This work sheds light to the question of the definability of the$K$-trivial degrees in the c.e. degrees. INI 1 12:00 to 12:30 P A Heiber (Universidad de Buenos Aires)Normality and Differentiability By transferring to the world of functions computable by finite automata the classical theorem of numerical analysis establishing that every non-decreasing real valued function is almost everywhere differentiable, we obtain a characterization of the property of Borel normality. We consider functions mapping infinite sequences to infinite sequences and a notion of differentiability that, on the class of non-decreasing real valued functions, coincides with standard differentiability. We prove that the following are equivalent, for a real x in [0,1]: (1) x is normal to base b. (2) Every non-decreasing function computable by a finite automaton mapping infinite sequences to infinite sequences is differentiable at the expansion of x in base b. (3) Every non-decreasing function computable by a finite automaton in base b mapping real numbers to real numbers is differentiable at x. Joint work with Verónica Becher, Universidad de Buenos Aires. INI 1 12:30 to 13:30 Lunch at Wolfson Court 14:00 to 14:30 W Fouche (University of South Africa)Kolmogorov complexity and Fourier aspects of Brownian motion It is well-known that the notion of randomness, suitably refined, goes a long way in dealing with the tension between the incompatability of shortest descriptions and of effecting the most-economical algorithmical processing" Manin(2006). In this work, we continue to explore this interplay between short descriptions and randomness in the context of Brownian motion and its associated geometry. In this way one sees how random phenomena associated with the geometry of Brownian motion, are implicitly enfolded in each real number which is complex in the sense of Kolmogorov. These random phenomena range from fractal geometry, Fourier analysis and non-classical noises in quantum physics. In this talk we shall discuss countable dense random sets as the appear in the theory of Brownian motion in the context of algorithmic randomness. We shall also discuss applications to Fourier analysis. In particular, we also discuss the images of certain$\Pi_2^0$perfect sets of Hausdorff dimension zero under a complex oscillation (which is also known as an algorithmically random Brownian motion). This opens the way to relate certain non-classical noises to Kolmogorov complexity. For example, the work of the present work enables one to represent Warren's splitting noise directly in terms of infinite binary strings which are Kolmogorov-Chaitin-Martin-Löf random. 
INI 1 14:30 to 15:00 J Franklin (University of Connecticut)Ergodic theory and strong randomness notions There has been a great deal of interest recently in the connection between algorithmic randomness and ergodic theory, which naturally leads to the question of how much one can say if the transformations in question need not be ergodic. We have essentially reversed a result of V'yugin and shown that if an element of the Cantor space is not Martin-Löf random, then there is a computable function and a computable transformation for which this element is not typical with respect to the ergodic theorem. More recently, we have shown that every weakly 2-random element of the Cantor space is typical with respect to the ergodic theorem for every lower semicomputable function and computable transformation. I will explain the proof of the latter result and discuss the technical difficulties present in producing a full characterization. INI 1 15:00 to 15:30 Z Reznikova (Novosibirsk State University)Integration of ideas and methods of Kolmogorov Complexity and classical mathematical statistics A new approach is suggested which allows to combine the advantages of methods based on Kolmogorov complexity with classical methods of testing of statistical hypotheses. As distinct from other approaches to analysis of different sequences by means of Kolmogorov complexity, we stay within the framework of mathematical statistics. As examples, we consider behavioural sequences of animals (ethological texts'') testing the hypotheses whether the complexity of hunting behaviour in ants and rodents depends on the age of an individual. INI 1 15:30 to 16:00 Afternoon Tea 17:00 to 17:30 D Ryabko ([INRIA, Lille, France])Limit capacity of non-stochastic steganographic systems and Hausdorff dimension It was shown recently that the limit capacity of perfect steganography systems for i.i.d. and Markov sources equals the Shannon entropy of the cover'' process. Here we address the problem of limit capacity of general perfect steganographic systems. We show that this value asymptotically equals the Hausdorff dimension of the set of possible cover messages. INI 1 10:00 to 10:30 K Tadaki (Chuo University)Cryptography and Algorithmic Randomness In modern cryptography, the random oracle model is widely used as an imaginary framework in which the security of a cryptographic scheme is discussed. In the random oracle model, the cryptographic hash function used in a cryptographic scheme is formulated as a random variable uniformly distributed over all possibility of the function, called the random oracle, and the legitimate users and the adversary against the scheme are modeled so as to get the values of the hash function not by evaluating it in their own but by querying the random oracle. Since the random oracle is an imaginary object, even if the security of a cryptographic scheme is proved in the random oracle model, the random oracle has to be instantiated using a concrete cryptographic hash function such as the SHA hash functions if we want to use the scheme in the real world. However, it is not clear how much the instantiation can maintain the security originally proved in the random oracle model, nor is it clear w hether the random oracle can be instantiated somehow while keeping the original security. In the present talk we investigate this problem using concepts and methods of algorithmic randomness. 
Our results use the general form of definitions of security notions for cryptographic schemes, and depend neither on specific schemes nor on specific security notions. INI 1 10:30 to 11:00 Morning Coffee 11:00 to 12:00 A Lewis (University of Leeds)The typical Turing degree Since Turing degrees are tailsets, it follows from Kolmogorov's 0-1 law that for any property which may or may not be satisfied by any given Turing degree, the satisfying class will either be of Lebesgue measure 0 or 1, so long as it is measurable. So either the typical degree satisfies the property, or else the typical degree satisfies its negation. Further, there is then some level of randomness sufficient to ensure typicality in this regard. I shall describe a number of results in a largely new programme of research which aims to establish the (order theoretically) definable properties of the typical Turing degree, and the level of randomness required in order to guarantee typicality. A similar analysis can be made in terms of Baire category, where a standard form of genericity now plays the role that randomness plays in the context of measure. This case has been fairly extensively examined in the previous literature. I shall analyse how our new results for the measure theoretic case contrast with existing results for Baire category, and also provide some new results for the category theoretic analysis. This is joint work with George Barmpalias and Adam Day. INI 1 12:00 to 12:30 H Takahashi ([University of Electro-Communications, Tokyo])Algorithmic randomness and stochastic selection function For$x=x_1x_2\cdots,y=y_1y_2\cdots\in\{0,1\}^\infty$, let$x^n:=x_1\cdots x_n$and$x/y:=x_{\tau(1)}x_{\tau(2)}\cdots$where$\{i\mid y_i=1\}=\{\tau(1) The following two statements are equivalent. $x\in {\cal R}^u$, where $u$ is the uniform measure on $\{0,1\}^\infty$. $\exists \mbox{ computable }w\ \ x\in{\cal R}^w \mbox{ and }x/y_i(w, x)\in{\cal R}^u\mbox{ for }i=1,2,\ldots, 6, \mbox{ where }\\ \{y_1(w, x),\ldots, y_6(w, x )\} \mbox{ consists of non-trivial selection functions and depends on } w\mbox{ and }x.$ The author do not know whether we can drop the assumption that $x$ is random w.r.t. some computable probability in (ii), i.e., whether we can replace (ii) with $x/y\in R^u$ for $y\in Y^x$ where $Y^x$ consists of non-trivial selection functions and depends on $x$. We also have a similar algorithmic analogy for Steinhause theorem [1]. Let $w$ be a computable probability such that $\forall y\in{\cal R}^w,\ \lim_n K(y^n)/n=0$, (b) $\lim_n \sum_{1\leq i\leq n} y_i/n$ exists for $y\in{\cal R}^w$, and $\forall \epsilon >0 \exists y\in{\cal R}^w \lim_n \sum_{1\leq i\leq n} y_i/n>1-\epsilon$. Then the following two statements are equivalent. $\lim_{n\to\infty} \frac{1}{n}K(x^n)=1$. $\lim_{n\to\infty} \frac{1}{| x^n/y^n|}K(x^n/y^n)=1$ for $y\in{\cal R}^w$, where $K$ is the prefix-complexity. For example, $w:=\int P_\rho d\rho$, where $P_\rho$ is a probability derived from irrational rotation with parameter $\rho$, satisfies the condition of Prop. 2, see [2]. There are similar algorithmic analogies for Kamae's theorem [4], see [3]. [1] H. Steinhaus. "Les probabilités dénombrables et leur rapport à la théorie de la meésure." Fund. Math., 4:286–310, 1922. [2] H. Takahashi and K. Aihara. "Algorithmic analysis of irrational rotations in a sigle neuron model." J. Complexity, 19:132–152, 2003. [3] H. Takahashi. "Algorithmic analogies to Kamae-Weiss theorem on normal numbers." In Solomonoff 85th memorial conference. To appear in LNAI. [4] T. 
Kamae. "Subsequences of normal numbers." Israel J. Math., 16:121–149, 1973. INI 1 12:30 to 13:30 Lunch at Wolfson Court 14:00 to 15:00 A Day (University of California, Berkeley)Cupping with random sets A set $X$ is ML-cuppable if there exists an incomplete Martin-Löf random $R$ that joins $X$ to zero jump. It is weakly ML-cuppable if there exists an incomplete Martin-Löf random $R$ that joins $X$ above zero jump. We prove that a set is K-trivial if and only if it is not weakly ML-cuppable. Further, we show that a set below zero jump is K-trivial if and only if it is not ML-cuppable. These results settle a question of Kučera, who introduced both cuppability notions. This is joint work with Joseph S. Miller. INI 1 15:30 to 16:00 Afternoon Tea 16:00 to 17:00 R Downey (Victoria University of Wellington)Resolute sets and initial segment complexity Notions of triviality have been remarkably productive in algorithmic randomness,with $K$-triviality being the most notable. Of course, ever since the original characterization of Martin-Löf randomness by initial segment complexity, there has been a longstanding interplay between initial segment complexity and calibrations of randomness, as witnessed by concepts such as autocomplexity, and the like. In this talk I wish to discuss recent work with George Barmpalias on a triviality notion we call resoluteness. Resoluteness is defined in terms of computable shifts by is intimately related to a notion we call weak resoluteness where $A$ is weakly resolute iff for all computable orders $h$, $K(A\uparrow n) \ge^+ K(A\uparrow h(n)),$ for all $n$. It is not difficult to see that $K$-trivials have this property but it turns out that there are uncountablly many degrees which are completely $K$-resolute, and there are c.e. degrees also with this property. These degrees seem related to Lathrop-Lutz ultracompressible degrees. Our investigations are only just beginning and I will report on our progress. Joint work with George Barmpalias. INI 1 |
# Some branching formulas for Kac--Moody Lie algebras

Kyu-Hwan Lee, Jerzy Weyman (University of Connecticut)
Commun. Korean Math. Soc. 2019, Vol. 34, No. 4, 1079-1098. https://doi.org/10.4134/CKMS.c180373. Published online October 31, 2019.

Abstract: In this paper we give some branching rules for the fundamental representations of Kac--Moody Lie algebras associated to $T$-shaped graphs. These formulas are useful to describe generators of the generic rings for free resolutions of length three described in \cite{JWm18}. We also make some conjectures about the generic rings.

Keywords: Kac--Moody algebras, branching formulas. MSC numbers: 13C99, 13H10.
# Multiple alignments within align environment
My minimum working example:
\documentclass{report}
\usepackage{amsmath}
\begin{document}
\begin{align}
\dot{x}_2 = \dot{x}_{2,0}&+\frac{\partial f_2(x_1)}{\partial x_1} \bigg|_{x_1=x_{1,0}} (x_1-x_{1,0})+\\
&+ \frac{1}{2}\frac{\partial {f_2}^2(x_1)}{\partial {x_1}^2} \bigg|_{x_1=x_{1,0}} (x_1-x_{1,0})^2+ \\
&+ \frac{1}{6}\frac{\partial {f_2}^3(x_1)}{\partial {x_1}^3} \bigg|_{x_1=x_{1,0}} (x_1-x_{1,0})^3+ \\
&+ g_2 (u-u_0)
\end{align}
\end{document}
The result:
Now I would like to obtain the following result instead:
Any ideas?
You could use \phantom{\frac{1}{2}} in the first line. By the way, do you really need a “+” at the end of each line? – Manuel Apr 30 at 15:39
Is this some hideous homework exercise where you need to get everything lined up perfectly to get the marks? I had one of those as an undergrad - you got docked a point if you didn't notice a capital A was suddenly in 10pt font when the rest was 12pt... – FionaSmith Apr 30 at 15:50
I think you mean \frac{\partial^k {f_2}(x_1)}{\partial {x_1}^k} and not \frac{\partial {f_2}^k(x_1)}{\partial {x_1}^k}. Also, do you really want each line numbered? Most of the time an equation gets only one number at most. You can use alignat* for no numbers and aligned for one number. – Matthew Leingang Apr 30 at 23:51
Yep! alignat
\documentclass{report}
\usepackage{amsmath}
\begin{document}
\begin{alignat}{3}
\dot{x}_2 = \dot{x}_{2,0} \,& + \,& \,\frac{\partial f_2(x_1)}{\partial x_1} \bigg|_{x_1=x_{1,0}} & (x_1-x_{1,0}) &+\\
&+& \,\frac{1}{2}\frac{\partial {f_2}^2(x_1)}{\partial {x_1}^2} \bigg|_{x_1=x_{1,0}} & (x_1-x_{1,0})^2 &+ \\
&+& \,\frac{1}{6}\frac{\partial {f_2}^3(x_1)}{\partial {x_1}^3} \bigg|_{x_1=x_{1,0}} & (x_1-x_{1,0})^3 &+ \\
&+ \protect\makebox[0pt][l]{\,\,$g_2 (u-u_0)$}\,
\end{alignat}
\end{document}
Note: you should take care with the number in curly brackets - it is the number of rl alignment pairs, which works out to (the maximum number of ampersands in a row + 1) divided by 2; mine is more generous than it needs to be because I had just copied and pasted from a bit of my own code. Within each pair the ampersand marks the alignment point between a right-aligned column and a left-aligned one, and the following ampersand starts the next pair, so you can put && in if you want to immediately left align.
Note I put a few \, spacing commands in to line stuff up with slightly bigger gaps to match what you were looking for.
Now tell me, I'm confused: since you have uploaded exactly what you wanted - how did you generate that?
Thanks, that looks good. The only thing is that the term $g_2(u-u_0)$ is not in the correct location. By the way, I used Paint to shift some terms. – Pietair Apr 30 at 15:36
oh yeah didn't notice that! I will shamelessly copy in that part of Stephen's answer since I would have had no idea how to get that part to left align. Anyway, now you have two options! – FionaSmith Apr 30 at 15:43
Use alignat and \mathrlap from the mathtools package for the last line.
The alignat environment has an implicit {rlrl...rl} column specification. The argument is the number of rl pairs (equivalently, the number of r columns).
The command \mathrlap puts its argument in a zero width box and aligns it left.
\documentclass{report}
\usepackage{mathtools}% for \mathrlap; mathtools also loads amsmath, needed for alignat
\begin{document}
\begin{alignat}{3}
\dot{x}_2 = \dot{x}_{2,0}&+{}&\frac{\partial f_2(x_1)}{\partial x_1} \bigg|_{x_1=x_{1,0}}
&(x_1-x_{1,0})&{}+{}\\
&+{}& \frac{1}{2}\frac{\partial {f_2}^2(x_1)}{\partial {x_1}^2} \bigg|_{x_1=x_{1,0}} &(x_1-x_{1,0})^2&{}+{} \\
&+{}& \frac{1}{6}\frac{\partial {f_2}^3(x_1)}{\partial {x_1}^3} \bigg|_{x_1=x_{1,0}} &(x_1-x_{1,0})^3&{}+{} \\
&+\mathrlap{g_2(u-u_0)}
\end{alignat}
\end{document}
I have used &+{}& and &{}+{} to get the right spacing around the +.
not come across mathrlap - I'm off to investigate. Note you need to align the plus signs at the ends of the lines too for a complete solution :) – FionaSmith Apr 30 at 15:47
The spacings of the +'s at the end are also wrong ;-) – daleif Apr 30 at 15:54
I have aligned the + at the end and changed the spacings around of them. – esdd Apr 30 at 16:16
Here, I did it with a tabular stack. Note, however, that it will only allow a single equation number to be applied to the result. However, since it is a single equation, I thought that might be okay (if you want the vertical position of the number to be adjusted, that can be done).
The only real quirk to the solution was getting the g_2 term left aligned in a right-aligned column. For that, I just grafted it to the right side of the preceding + sign.
\documentclass{report}
\usepackage{tabstackengine}
\stackMath
\usepackage{amsmath}
\begin{document}
$$\setstackgap{S}{6pt}
\TABbinary
\setstacktabulargap{0pt}
\tabularShortstack{rcrlc}{
\dot{x}_2 = \dot{x}_{2,0}&+&\dfrac{\partial f_2(x_1)}{\partial x_1} \bigg|_{x_1=x_{1,0}}& (x_1-x_{1,0})&+\\
&+& \dfrac{1}{2}\dfrac{\partial {f_2}^2(x_1)}{\partial {x_1}^2} \bigg|_{x_1=x_{1,0}}& (x_1-x_{1,0})^2&+ \\
&+& \dfrac{1}{6}\dfrac{\partial {f_2}^3(x_1)}{\partial {x_1}^3} \bigg|_{x_1=x_{1,0}}& (x_1-x_{1,0})^3&+ \\
&+ \protect\makebox[0pt][l]{$g_2 (u-u_0)$}& &&& }$$
\end{document}
I just stole your answer for the last line to correct my alignat solution but I don't know how to tag you without commenting on your own post! – FionaSmith Apr 30 at 15:48
@FionaSmith No problem. To paraphrase the humorous lyrics of Tom Lehrer (youtube.com/watch?v=HfjFnjWjDOI), "plagiarize, only be sure always to call it please, 'research'." 8^b [My retort is just humor. What you did is perfectly acceptable] – Steven B. Segletes Apr 30 at 16:06
You can keep the alignment points in the align environment as they are now, and just insert a couple of \phantom instructions in the first row to achieve the needed spacing adjustments.
By the way, I would recommend you omit the + symbols at the ends of rows 1, 2, and 3 since they're made redundant by the + symbols at the start of rows 2, 3, and 4.
\documentclass{report}
\usepackage{amsmath}
\begin{document}
\begin{align}
\dot{x}_2 = \dot{x}_{2,0}&+\phantom{\frac{1}{2}}\frac{\partial {f_2}\phantom{^1}(x_1)}{\partial x_1} \bigg|_{x_1=x_{1,0}} (x_1-x_{1,0})\\
&+ \frac{1}{2}\frac{\partial {f_2}^2(x_1)}{\partial {x_1}^2} \bigg|_{x_1=x_{1,0}} (x_1-x_{1,0})^2\\
&+ \frac{1}{6}\frac{\partial {f_2}^3(x_1)}{\partial {x_1}^3} \bigg|_{x_1=x_{1,0}} (x_1-x_{1,0})^3\\
&+ g_2 (u-u_0)
\end{align}
\end{document}
However, if these + symbols must be there, I would recommend you (a) write +{} instead of just + for these symbols to get proper spacing and (b) append \phantom{^1} to (x_1-x_{1,0}) in the first row to further fine-tune the spacing.
Finally, if you wanted the vertical space between the third and fourth row to be the same as between the other rows, you could add the instruction \phantom{\bigg|} after g_2 (u-u_0). With this adjustment made, and the + symbols at the ends of the first three rows not deleted, your equations would look like this:
\documentclass{report}
\usepackage{amsmath}
\begin{document}
\begin{align}
\dot{x}_2 = \dot{x}_{2,0}&+\phantom{\frac{1}{2}}\frac{\partial {f_2}\phantom{^1}(x_1)}{\partial x_1} \bigg|_{x_1=x_{1,0}} (x_1-x_{1,0})\phantom{^1} +{}\\
&+ \frac{1}{2}\frac{\partial {f_2}^2(x_1)}{\partial {x_1}^2} \bigg|_{x_1=x_{1,0}} (x_1-x_{1,0})^2 +{}\\
&+ \frac{1}{6}\frac{\partial {f_2}^3(x_1)}{\partial {x_1}^3} \bigg|_{x_1=x_{1,0}} (x_1-x_{1,0})^3 +{}\\
&+ g_2 (u-u_0)\phantom{\bigg|}
\end{align}
\end{document}
Why not ^{\phantom 3} instead of \phantom{^1}? Might not the latter be thinner than the 3 superscript? – Matthew Leingang Apr 30 at 23:54
@MatthewLeingang - the numerals of the font in use (Computer Modern) are tabular-style (as well as lining style). Hence, the 1 and the 3 take up the same horizontal space. The objects \phantom{^1} and \phantom{^3} also take up the same amount of space. – Mico May 1 at 0:11 |
# Permutation matrices: proving that $P^N = I$ for some $N > 0$
A permutation matrix is a square matrix that has exactly one 1 in every row and every column and 0's elsewhere; multiplying a matrix by an $n\times n$ permutation matrix permutes (changes the order of) its rows. One way to construct permutation matrices is to permute the rows (or columns) of the identity matrix; the simplest permutation matrix is $I$ itself. The trace of a permutation matrix is the number of fixed points of the corresponding permutation, and writing a permutation matrix as a matrix of unit column-vectors shows that it is orthogonal, hence full-rank. (A related notion: a generalized permutation matrix, or monomial matrix, has the same nonzero pattern as a permutation matrix but its nonzero entries may be any nonzero values; an invertible matrix $A$ is a generalized permutation matrix if and only if it can be written as $A = DP$ with $D$ an invertible diagonal matrix and $P$ a permutation matrix.)
Question: given a permutation matrix $P$, I want to prove that there exists some $N > 0$ such that $P^N = I$.
Sketch of the argument: the product of two permutation matrices is again a permutation matrix, and there are only finitely many $n\times n$ permutation matrices, so among the powers $P, P^2, P^3, \ldots$ two must coincide. By the pigeonhole principle there exist integers $m < n$ such that $P^m = P^n$. Now use the fact that $P$ is invertible (it is orthogonal, see above): multiplying by $(P^m)^{-1}$ gives $P^{n-m} = I$, so $N = n - m$ works. In particular every eigenvalue $\lambda$ of $P$ satisfies $\lambda^N = 1$, i.e. the eigenvalues of a permutation matrix are roots of unity. An alternative approach is to construct a group homomorphism $S_n \rightarrow GL_n$ whose image is the set of permutation matrices and appeal to the fact that every element of the finite group $S_n$ has finite order.
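Not part of the original question, but here is a quick numerical illustration of the pigeonhole argument, assuming NumPy is available (the variable names and the chosen permutation are mine): it builds a random permutation matrix and multiplies it by itself until the identity reappears, which the argument above guarantees must happen.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
perm = rng.permutation(n)            # a random permutation of {0, ..., n-1}
P = np.eye(n, dtype=int)[perm]       # its permutation matrix: row i is e_{perm[i]}

# Keep multiplying by P; since all powers of P are n x n permutation matrices
# and there are only finitely many of those, we must eventually hit I.
Q = P.copy()
N = 1
while not np.array_equal(Q, np.eye(n, dtype=int)):
    Q = Q @ P
    N += 1
print("smallest N with P^N = I:", N)
```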
# Why, historically, do we multiply matrices as we do?
Multiplication of matrices — taking the dot product of the $i$th row of the first matrix and the $j$th column of the second to yield the $ij$th entry of the product — is not a very intuitive operation: if you were to ask someone how to mutliply two matrices, he probably would not think of that method. Of course, it turns out to be very useful: matrix multiplication is precisely the operation that represents composition of transformations. But it's not intuitive. So my question is where it came from. Who thought of multiplying matrices in that way, and why? (Was it perhaps multiplication of a matrix and a vector first? If so, who thought of multiplying them in that way, and why?) My question is intact no matter whether matrix multiplication was done this way only after it was used as representation of composition of transformations, or whether, on the contrary, matrix multiplication came first. (Again, I'm not asking about the utility of multiplying matrices as we do: this is clear to me. I'm asking a question about history.)
-
As to the last, matrix multiplication definitely came first (centuries first), and I'm reasonably certain from a compact representation of systems of linear equations. Leibniz already had a determinant formula. As I have no historic sources for first use, this doesn't answer your question though. – gnometorule Jan 7 at 4:19 Matrices are linear operators and have meaning only when it acts on vectors. Given matrices $A$ and $B$, what would we want the operator/matrix $BA$ to mean? Ideally, we would want $BA$ to mean the following. For all vectors $x$, we want $(BA)x = B(Ax)$ Once we have this i.e. $(BA)x = B(Ax)$ for all $x$, then we are forced to live with the way we currently multiply matrices. And as to why matrix-vector product is defined in the way it is, the primary reason for introducing matrices was to handle linear transformation in a notationally convenient way. – user17762 Jan 7 at 4:21 @HarryStern, explicitly, I thought. Anyway, yes, that's what I want. – msh210 Jan 7 at 4:44 I actually missed your explicit question in the middle of the body. Perhaps you should change that to the title. – Harry Stern Jan 7 at 4:46 There is another question, of course, which is not so much why matrix multiplication was defined like this, but why it stuck - why this apparently curious definition took off, and didn't die the death of so many putative definitions. And that was because it proved mathematically fruitful. – Mark Bennet Jan 7 at 8:47
Matrix multiplication is a symbolic way of substituting one linear change of variables into another one. If $x' = ax + by$ and $y' = cx+dy$, and $x'' = a'x' + b'y'$ and $y'' = c'x' + d'y'$ then we can plug the first pair of formulas into the second to express $x''$ and $y''$ in terms of $x$ and $y$: $$x'' = a'x' + b'y' = a'(ax + by) + b'(cx+dy) = (a'a + b'c)x + (a'b + b'd)y$$ and $$y'' = c'x' + d'y' = c'(ax+by) + d'(cx+dy) = (c'a+d'c)x + (c'b+d'd)y.$$ It can be tedious to keep writing the variables, so we use arrays to track the coefficients, with the formulas for $x'$ and $x''$ on the first row and for $y'$ and $y''$ on the second row. The above two linear substitutions coincide with the matrix product $$\left( \begin{array}{cc} a'&b'\\c'&d' \end{array} \right) \left( \begin{array}{cc} a&b\\c&d \end{array} \right) = \left( \begin{array}{cc} a'a+b'c&a'b+b'd\\c'a+d'c&c'b+d'd \end{array} \right).$$ So matrix multiplication is just a bookkeeping device for systems of linear substitutions plugged into one another (order matters). The formulas are not intuitive, but it's nothing other than the simple idea of combining two linear changes of variables in succession.
Matrix multiplication was first defined explicitly in print by Cayley in 1858, in order to reflect the effect of composition of linear transformations. See paragraph 3 at http://darkwing.uoregon.edu/~vitulli/441.sp04/LinAlgHistory.html. However, the idea of tracking what happens to coefficients when one linear change of variables is substituted into another (which we view as matrix multiplication) goes back further. For instance, the work of number theorists in the early 19th century on binary quadratic forms $ax^2 + bxy + cy^2$ was full of linear changes of variables plugged into each other (especially linear changes of variable that we would recognize as coming from ${\rm SL}_2({\mathbf Z})$). For more on the background, see the paper by Thomas Hawkins on matrix theory in the 1974 ICM. Google "ICM 1974 Thomas Hawkins" and you'll find his paper among the top 3 hits.
This is interesting, but does not really answer the question, which is mainly historical. – MJD Jan 7 at 4:33
Huh? The question is, historically, why do we multiply matrices the way we do. I answered that question: it is a shorthand for substituting one linear change of variables into another. – KCd Jan 7 at 4:36
Who thought of multiplying matrices in that way? When? How do you know that this was the first reason found for defining matrix multiplication in this way? – MJD Jan 7 at 4:38
What @MJD said. – msh210 Jan 7 at 4:45
@MJD: Cayley thought of multiplying matrices that way when he defined matrix multiplication. I won't say that it was the reason for defining matrix multiplication, but just that it was a way to think about the total process. If you read about the way linear algebra developed, you'll find that the subject came from several directions; adding one system of linear equations (with the same variables) led to addition of matrices and the algebraic calculations that arise when dealing with substitutions of linear changes of variables led to matrix multiplication. – KCd Jan 7 at 5:01
\begin{align} u & = 3x + 7y \\ v & = -2x + 11y \\ \\ p & = 13u - 20v \\ q & = 2u + 6v \end{align}
Given $x$ and $y$, how do you find $p$ and $q$? How do you write
\begin{align} p & = \bullet x + \bullet y \\ q & = \bullet x + \bullet y\quad\text{?} \end{align}
What numbers go where the four $\bullet$s are?
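For what it's worth, here is a small NumPy check of that exercise (the arrays and the printed numbers are my own working, not part of the answer): substituting the first pair of formulas into the second is exactly the matrix product $BA$.

```python
import numpy as np

# u = 3x + 7y, v = -2x + 11y:  A sends (x, y) to (u, v)
A = np.array([[ 3,  7],
              [-2, 11]])
# p = 13u - 20v, q = 2u + 6v:  B sends (u, v) to (p, q)
B = np.array([[13, -20],
              [ 2,   6]])

# Composing the substitutions corresponds to the product B @ A,
# i.e. p = 79x - 129y and q = -6x + 80y.
print(B @ A)   # [[  79 -129]
               #  [  -6   80]]
```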
# What is the aldol condensation mechanism?
Aldol condensation is the reaction of a ketone with an aldehyde in strong base at high heat.
The mechanism goes like this:
1. First, the strong base (${\text{OH}}^{-}$) acquires a proton from the $\alpha$-carbon. This generates the enolate to some extent, and sets the substrate up for the base-catalyzed aldol reaction.
2. The enolate is then used as a nucleophile to attack the aldehyde in an ${\text{S}}_{N} 2$ fashion. Remember that a new $\text{C}-\text{C}$ bond is created here, so you must count your carbons and make sure of that.
3. A proton transfer occurs from the $\text{H}_2\text{O}$ that formed in step 1 to protonate the oxide.
This forms the $\beta$-hydroxyketone intermediate (or $\beta$-hydroxyaldehyde if ${R}_{1}$ is $\text{H}$).
4. Another ${\text{OH}}^{-}$ acquires an $\boldsymbol{\alpha}$-proton, just as in step 1, generating an enolate. This is upon additional heating, and starts the dehydration portion of the aldol condensation, which is basically the second half of the reaction.
The electrons didn't flow to the right because the $\text{OH}$ is electron-donating, which decreases the electropositivity of the carbon adjacent to ${R}_{2}$. Instead, some "momentum" from forming the enolate is necessary to proceed.
5. The $\text{OH}$ is eliminated, forming the final product: an $\boldsymbol{\alpha,\beta}$-unsaturated ketone (or aldehyde, if ${R}_{1} = \text{H}$).
ACTA MATHEMATICA UNIVERSITATIS COMENIANAE
Vol. 65, 1 (1996)
pp. 111-123
MINIMAL AND MAXIMAL SETS OF BELL-TYPE INEQUALITIES HOLDING IN A LOGIC
H. LANGER and M. MACZYNSKI
Abstract. It is shown that for every integer $n>1$ the poset $\bigl(\{\,f\colon 2^{\{1,\ldots,n\}}\to\mathbb{Z}\mid\sum_{I\subseteq\{1,\ldots,n\}}f(I)\,p\bigl(\bigwedge_{i\in I}a_i\bigr)\in[0,1]$ for all states $p$ on $L$ and all $a_1,\ldots,a_n\in L$, for every ortholattice (logic) $L\,\},\subseteq\bigr)$ possesses a smallest and a greatest element. The functions in this poset are interpreted as Bell-type inequalities holding in $L$.
AMS subject classification. 06C15; Secondary 03G12, 81P10
Keywords. Ortholattice, logic, state, Bell-type inequality |
# Loan Products
Constructor for loan products including interest rates, maturities, amounts, etc...
## Create a new Loan Product
To create a new Loan product, click the Create button in the upper right-hand corner and fill in the following fields:
• Name - specify the name of a new loan product
• Code - specify a short unique code to designate this new product. For example, it can be the capital letters of the product name.
• Availability - set the availability of the product by checking the boxes you need: Person, Company, Group
• Schedule type - select the schedule type you need from the drop-down list
• Currency - select the currency code from the drop-down list
• Interest rate - set minimum and maximum value for the yearly interest rate of loan product
• Amount - set minimum and the maximum amounts allowed for the loan product
• Maturity - set minimum and maximum tenor, in periods, for the loan product
• Grace period - set minimum and the maximum number of periods during which only interest payments will be collected, but not principal repayments. This function can also be used for setting up products with end-of-term (bullet) repayment. When the grace period is over, the principal of the loan will start to be collected.
• Has payees - check the box has payees if required. Payees are individuals or entities receiving the loan on behalf of the client.
• Entry fees - select required entry fee from the drop-down list
• Top up allow - top-up allows increasing the amount of the initially approved loan up to a certain value. Check the box top up allow to set the top up limit and top up olb
• Top up max limit - set the maximum limit for the top up (based on disbursed amount)
• Top up max olb - set maximum olb amount for the top up (based on the outstanding balance)
Select Accounts from the drop-down lists:
• Principal - set account used for the Principal for the loan product
• Interest accrual - set Interest accrual account for the loan product
• Interest income - set Interest income account for the loan product
• Penalty accrual - set Penalty accrual account for the loan product
• Penalty income - set Penalty income account for the loan product
• Write off portfolio - set Write off portfolio account for the loan product
• Write off interest - set Write off interest account for the loan product
• Write off penalty - set Write off penalty account for the loan product
Please note that every field is required. Make sure that you select the right accounts, as otherwise accounting transactions related to this product will not be correct.
After that, click the Save button at the upper right-hand corner to save the new entry or Cancel to discard your changes.
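To make the field list above concrete, here is a purely hypothetical example of the values such a product definition might hold, written as a Python dictionary. The keys, values, and account numbers are illustrative assumptions of mine and do not reflect the system's actual data model or API.

```python
# Hypothetical loan product definition; every key and value below is
# illustrative only and does not mirror the system's real schema.
loan_product = {
    "name": "Small Business Loan",
    "code": "SBL",
    "availability": ["Person", "Company"],        # who can take this product
    "schedule_type": "ANNUITY_MONTHLY",
    "currency": "USD",
    "interest_rate": {"min": 10.0, "max": 14.0},  # yearly interest rate, percent
    "amount": {"min": 1_000, "max": 200_000},
    "maturity": {"min": 6, "max": 36},            # in periods
    "grace_period": {"min": 0, "max": 3},         # interest-only periods
    "has_payees": False,
    "top_up": {"allow": True, "max_limit": 150_000, "max_olb": 100_000},
    "accounts": {                                 # accounting mapping
        "principal": "1310",
        "interest_accrual": "1320",
        "interest_income": "6110",
        "penalty_accrual": "1330",
        "penalty_income": "6120",
        "write_off_portfolio": "5910",
        "write_off_interest": "5920",
        "write_off_penalty": "5930",
    },
}
```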
## Edit Loan Product
Click on the entry you want to edit to go to the entry details, where you will find the Edit button.
Once the changes are made, click Save to save them or Cancel to discard.
## Schedule types
### Annuity schedule
An annuity is a series of payments made at equal intervals. There are seven annuity schedules available:
1. ANNUITY_BIWEEKLY - the period is two weeks
2. ANNUITY_MONTHLY - the period is one month, 360 days in the year
3. ANNUITY_MONTHLY_FACT - the period is one month, the actual number of days in the year (365 or 366)
4. ANNUITY_QUARTERLY - the period is three months, 360 days in the year
5. ANNUITY_QUARTERLY_FACT - the period is three months, the actual number of days in the year (365 or 366)
6. ANNUITY_SEMIANNUALLY - the period is six months, 360 days in the year
7. ANNUITY_SEMIANNUALLY_FACT - the period is six months, the actual number of days in the year (365 or 366)
All of them follow the same interest-accrual rule, so let's look at ANNUITY_MONTHLY as an example:
Example:
Let's say we have a loan amount of 100 000 USD with a 12% interest rate for 12 months. Here is how the stream of payments is calculated:
We have to find the annuity coefficient which is calculated by the following formula $P = \frac{S*I_p}{1-(1 + I_p)^{-T}}$
where $S$ is the loan amount, $I_p$ is the interest rate for the period (the yearly interest rate divided by the number of periods in the year), and $T$ is the number of periods in the loan maturity.
In our case, period interest rate is 12% / 12 = 1% and T = 12 months. So the annuity coefficient is $\frac{100000*0.01}{1-(1+0.01)^{-12}}=8884.88$
So we have found the total payment amount; it remains the same for all following payments, but the split between principal and interest changes.
Each subsequent payment is calculated in exactly the same way, but on a smaller outstanding loan balance (OLB), since the principal was reduced by the previous payment.
Other types of annuity schedule differ only in period interest rate because there will be more or less than 12 payment periods in the year.
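As a cross-check of the figures above, here is a short Python sketch (my own, not part of the product documentation) that computes the annuity payment from the formula and prints the resulting schedule for the example loan.

```python
# Annuity schedule for the example: 100,000 USD at 12% yearly over 12 monthly periods.
S, yearly_rate, T = 100_000.0, 0.12, 12
i_p = yearly_rate / 12                        # period (monthly) interest rate: 1%
payment = S * i_p / (1 - (1 + i_p) ** -T)     # annuity payment, about 8884.88

olb = S                                       # outstanding loan balance
for period in range(1, T + 1):
    interest = olb * i_p                      # interest accrued on the current OLB
    principal = payment - interest            # the rest of the payment repays principal
    olb -= principal
    print(f"{period:2d}  payment={payment:8.2f}  interest={interest:7.2f}  "
          f"principal={principal:8.2f}  olb={olb:9.2f}")
```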
### Differential types
The principal is divided by the number of periods in the year and paid only at the end of each period (every 3 or 6 months), while interest is paid every month. Interest is calculated as the period interest rate (e.g. the yearly rate divided by 12) multiplied by the OLB.
1. DIFFERENTIAL_QUARTERLY - principal paid once in three months. For example, if the loan amount is 100 000 USD and the maturity is 12 months, then every three months you will pay 25 000 USD plus interest.
2. DIFFERENTIAL_SEMI_ANNUAL - principal paid once in six months
### Fixed types
The principal is divided by the number of periods in the year and paid at the end of each period, together with the interest. Interest is calculated as the period interest rate (e.g. the yearly rate divided by 12) multiplied by the OLB.
1. FIXED_PRINCIPAL_BIWEEKLY
2. FIXED_PRINCIPAL_MONTHLY
3. FIXED_PRINCIPAL_QUARTERLY
4. FIXED_PRINCIPAL_SEMIANNUAL_QUARTERLY
5. FIXED_PRINCIPAL_BY_SIX_MONTH
### Flat types
The principal is divided by the number of periods in the year and paid at the end of each period, together with the interest. Interest is calculated as the period interest rate (e.g. the yearly rate divided by 12) multiplied by the initial loan amount; a small sketch contrasting fixed and flat interest follows the list below.
1. FLAT_BIWEEKLY
2. FLAT_MONTHLY
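The key difference between the fixed and flat families is only the base on which interest is charged. The following Python sketch (again mine, not from the documentation) contrasts FIXED_PRINCIPAL_MONTHLY and FLAT_MONTHLY for the same example loan as in the annuity section.

```python
# Fixed vs. flat monthly interest for 100,000 USD at 12% yearly over 12 months.
S, yearly_rate, T = 100_000.0, 0.12, 12
i_p = yearly_rate / 12
principal_part = S / T                 # equal principal instalment in both schedules

olb = S
for period in range(1, T + 1):
    fixed_interest = olb * i_p         # FIXED_*: interest on the outstanding balance
    flat_interest = S * i_p            # FLAT_*: interest on the initial loan amount
    print(f"{period:2d}  principal={principal_part:8.2f}  "
          f"fixed interest={fixed_interest:7.2f}  flat interest={flat_interest:7.2f}")
    olb -= principal_part
```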
# How do you solve log_3 x=2?
$x = 9$
#### Explanation:
We can write log problems two different ways:
${\log}_{a} b = c$
${a}^{c} = b$
So we can take the original question:
${\log}_{3} x = 2$
and rewrite it to:
${3}^{2} = x = 9$ |
## Evgeny Khukhro’s talk in Brasilia
On 27 November 2015 Evgeny Khukhro gave a talk at the Algebra Seminar of the University of Brasilia: "Finite groups with a Frobenius group of automorphisms whose kernel is generated by a splitting automorphism of prime order".
Abstract: It is proved that if a finite group $G$ admits a Frobenius group of automorphisms $FH$ with complement $H$ whose kernel $F=\langle\varphi\rangle$ is generated by a splitting automorphism $\varphi$ of prime order $p$ (that is, such that $xx^{\varphi}\cdots x^{\varphi^{p-1}}=1$ for all $x\in G$), then $G$ is nilpotent of class bounded in terms of $p$ and the derived length of $C_G(H)$. The proof is based on the author’s original method of elimination of operators by nilpotency and a joint result with P. Shumyatsky about groups of prime exponent corresponding to the case $\varphi =1$. |