hexsha (stringlengths 40-40) | size (int64 6-14.9M) | ext (stringclasses 1 value) | lang (stringclasses 1 value) | max_stars_repo_path (stringlengths 6-260) | max_stars_repo_name (stringlengths 6-119) | max_stars_repo_head_hexsha (stringlengths 40-41) | max_stars_repo_licenses (sequence) | max_stars_count (int64 1-191k ⌀) | max_stars_repo_stars_event_min_datetime (stringlengths 24-24 ⌀) | max_stars_repo_stars_event_max_datetime (stringlengths 24-24 ⌀) | max_issues_repo_path (stringlengths 6-260) | max_issues_repo_name (stringlengths 6-119) | max_issues_repo_head_hexsha (stringlengths 40-41) | max_issues_repo_licenses (sequence) | max_issues_count (int64 1-67k ⌀) | max_issues_repo_issues_event_min_datetime (stringlengths 24-24 ⌀) | max_issues_repo_issues_event_max_datetime (stringlengths 24-24 ⌀) | max_forks_repo_path (stringlengths 6-260) | max_forks_repo_name (stringlengths 6-119) | max_forks_repo_head_hexsha (stringlengths 40-41) | max_forks_repo_licenses (sequence) | max_forks_count (int64 1-105k ⌀) | max_forks_repo_forks_event_min_datetime (stringlengths 24-24 ⌀) | max_forks_repo_forks_event_max_datetime (stringlengths 24-24 ⌀) | avg_line_length (float64 2-1.04M) | max_line_length (int64 2-11.2M) | alphanum_fraction (float64 0-1) | cells (sequence) | cell_types (sequence) | cell_type_groups (sequence) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
e79dba57ebaed58af4b37b50715bb68cc00f6586 | 7,647 | ipynb | Jupyter Notebook | examples/denoising2D_probabilistic/1_training.ipynb | uschmidt83/CSBDeep | 72fb3bc7c87f80018f15fe97203dc1f7ff20c5c5 | [
"BSD-3-Clause"
] | 205 | 2018-02-27T09:54:56.000Z | 2022-03-24T02:20:02.000Z | examples/denoising2D_probabilistic/1_training.ipynb | lucas-sancere/CSBDeep | 2c374d5641f1eba8aafbb8d2b858814bc281701a | [
"BSD-3-Clause"
] | 59 | 2018-02-07T07:56:59.000Z | 2022-02-03T14:05:23.000Z | examples/denoising2D_probabilistic/1_training.ipynb | lucas-sancere/CSBDeep | 2c374d5641f1eba8aafbb8d2b858814bc281701a | [
"BSD-3-Clause"
] | 81 | 2018-06-02T15:03:19.000Z | 2022-03-12T08:43:17.000Z | 28.856604 | 292 | 0.591474 | [
[
[
"<hr style=\"height:2px;\">\n\n# Demo: Probabilistic neural network training for denoising of synthetic 2D data\n\nThis notebook demonstrates training a probabilistic CARE model for a 2D denoising task, using provided synthetic training data. \nNote that training a neural network for actual use should be done on more (representative) data and with more training time.\n\nMore documentation is available at http://csbdeep.bioimagecomputing.com/doc/.",
"_____no_output_____"
]
],
[
[
"from __future__ import print_function, unicode_literals, absolute_import, division\nimport numpy as np\nimport matplotlib.pyplot as plt\n%matplotlib inline\n%config InlineBackend.figure_format = 'retina'\n\nfrom tifffile import imread\nfrom csbdeep.utils import download_and_extract_zip_file, axes_dict, plot_some, plot_history\nfrom csbdeep.utils.tf import limit_gpu_memory\nfrom csbdeep.io import load_training_data\nfrom csbdeep.models import Config, CARE",
"_____no_output_____"
]
],
[
[
"The TensorFlow backend uses all available GPU memory by default, hence it can be useful to limit it:",
"_____no_output_____"
]
],
[
[
"# limit_gpu_memory(fraction=1/2)",
"_____no_output_____"
]
],
[
[
"<hr style=\"height:2px;\">\n\n# Training data\n\nDownload and read provided training data, use 10% as validation data.",
"_____no_output_____"
]
],
[
[
"download_and_extract_zip_file (\n url = 'http://csbdeep.bioimagecomputing.com/example_data/synthetic_disks.zip',\n targetdir = 'data',\n)",
"_____no_output_____"
],
[
"(X,Y), (X_val,Y_val), axes = load_training_data('data/synthetic_disks/data.npz', validation_split=0.1, verbose=True)\n\nc = axes_dict(axes)['C']\nn_channel_in, n_channel_out = X.shape[c], Y.shape[c]",
"_____no_output_____"
],
[
"plt.figure(figsize=(12,5))\nplot_some(X_val[:5],Y_val[:5])\nplt.suptitle('5 example validation patches (top row: source, bottom row: target)');",
"_____no_output_____"
]
],
[
[
"<hr style=\"height:2px;\">\n\n# CARE model\n\nBefore we construct the actual CARE model, we have to define its configuration via a `Config` object, which includes \n* parameters of the underlying neural network,\n* the learning rate,\n* the number of parameter updates per epoch,\n* the loss function, and\n* whether the model is probabilistic or not.\n\nThe defaults should be sensible in many cases, so a change should only be necessary if the training process fails. \n\nFor a probabilistic model, we have to explicitly set `probabilistic=True`.\n\n---\n\n<span style=\"color:red;font-weight:bold;\">Important</span>: Note that for this notebook we use a very small number of update steps per epoch for immediate feedback, whereas this number should be increased considerably (e.g. `train_steps_per_epoch=400`) to obtain a well-trained model.",
"_____no_output_____"
]
],
[
[
"config = Config(axes, n_channel_in, n_channel_out, probabilistic=True, train_steps_per_epoch=30)\nprint(config)\nvars(config)",
"_____no_output_____"
]
],
[
[
"We now create a CARE model with the chosen configuration:",
"_____no_output_____"
]
],
[
[
"model = CARE(config, 'my_model', basedir='models')",
"_____no_output_____"
]
],
[
[
"<hr style=\"height:2px;\">\n\n# Training\n\nTraining the model will likely take some time. We recommend to monitor the progress with [TensorBoard](https://www.tensorflow.org/programmers_guide/summaries_and_tensorboard) (example below), which allows you to inspect the losses during training.\nFurthermore, you can look at the predictions for some of the validation images, which can be helpful to recognize problems early on.\n\nYou can start TensorBoard from the current working directory with `tensorboard --logdir=.`\nThen connect to [http://localhost:6006/](http://localhost:6006/) with your browser.\n\n",
"_____no_output_____"
]
],
[
[
"history = model.train(X,Y, validation_data=(X_val,Y_val))",
"_____no_output_____"
]
],
[
[
"Plot final training history (available in TensorBoard during training):",
"_____no_output_____"
]
],
[
[
"print(sorted(list(history.history.keys())))\nplt.figure(figsize=(16,5))\nplot_history(history,['loss','val_loss'],['mse','val_mse','mae','val_mae']);",
"_____no_output_____"
]
],
[
[
"<hr style=\"height:2px;\">\n\n# Evaluation\n\nExample results for validation images.",
"_____no_output_____"
]
],
[
[
"plt.figure(figsize=(12,10))\n_P = model.keras_model.predict(X_val[:5])\n_P_mean = _P[...,:(_P.shape[-1]//2)]\n_P_scale = _P[...,(_P.shape[-1]//2):]\nplot_some(X_val[:5],Y_val[:5],_P_mean,_P_scale,pmax=99.5)\nplt.suptitle('5 example validation patches\\n' \n 'first row: input (source), ' \n 'second row: target (ground truth), '\n 'third row: predicted Laplace mean, '\n 'forth row: predicted Laplace scale');",
"_____no_output_____"
]
],
[
[
"<hr style=\"height:2px;\">\n\n# Export model to be used with CSBDeep **Fiji** plugins and **KNIME** workflows\n\nSee https://github.com/CSBDeep/CSBDeep_website/wiki/Your-Model-in-Fiji for details.",
"_____no_output_____"
]
],
[
[
"model.export_TF()",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79dbe4f6493f2dd2d2488a8365ea16d42509345 | 1,265 | ipynb | Jupyter Notebook | ML1/lectures/06_week_lab.ipynb | Mosuswalks/AI-with-Python | af1b45368b792bdefb9ac96cfd964791ea155baf | [
"Apache-2.0"
] | null | null | null | ML1/lectures/06_week_lab.ipynb | Mosuswalks/AI-with-Python | af1b45368b792bdefb9ac96cfd964791ea155baf | [
"Apache-2.0"
] | null | null | null | ML1/lectures/06_week_lab.ipynb | Mosuswalks/AI-with-Python | af1b45368b792bdefb9ac96cfd964791ea155baf | [
"Apache-2.0"
] | null | null | null | 19.461538 | 64 | 0.535178 | [
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport seaborn as sns\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"## COMP 3122 - Artificial Intelligence with Python\n__Week 6 lab__\n\n### [github.com/kamrik/ML1](https://github.com/kamrik/ML1)",
"_____no_output_____"
],
[
"## Plan for today\n - Demo of subplots and imshow\n - Lab exercise on GitHub in `exercises/week6_lab.ipynb`",
"_____no_output_____"
]
]
] | [
"code",
"markdown"
] | [
[
"code"
],
[
"markdown",
"markdown"
]
] |
e79dc90ccb5877a035747de8e3e876b8850996bc | 258,019 | ipynb | Jupyter Notebook | chap4.ipynb | mohantyk/dsp_in_comm | d0d25c02704b005eb526f8615d46e6a13c5f7c31 | [
"MIT"
] | 2 | 2020-09-19T15:26:31.000Z | 2022-01-02T16:05:54.000Z | chap4.ipynb | mohantyk/dsp_in_comm | d0d25c02704b005eb526f8615d46e6a13c5f7c31 | [
"MIT"
] | 1 | 2021-07-24T00:48:58.000Z | 2021-11-03T18:06:51.000Z | chap4.ipynb | mohantyk/dsp_in_comm | d0d25c02704b005eb526f8615d46e6a13c5f7c31 | [
"MIT"
] | 2 | 2020-02-16T22:46:39.000Z | 2020-02-28T18:24:31.000Z | 248.333975 | 77,312 | 0.921215 | [
[
[
"## Imports",
"_____no_output_____"
]
],
[
[
"import numpy as np\nfrom numpy import pi, cos, sin, array",
"_____no_output_____"
],
[
"from scipy import signal\nfrom scipy.linalg import toeplitz, inv",
"_____no_output_____"
],
[
"import matplotlib.pyplot as plt\nplt.style.use('dark_background')\nnp.set_printoptions(precision=3, suppress=True)",
"_____no_output_____"
],
[
"from warnings import filterwarnings\nfilterwarnings('ignore', category=UserWarning)",
"_____no_output_____"
]
],
[
[
"### Useful functions",
"_____no_output_____"
]
],
[
[
"def calculate_error_rms(x, x_est):\n error = x[:len(x_est)] - x_est\n error_rms = np.linalg.norm(error)/len(error)\n return error_rms.round(5)",
"_____no_output_____"
]
],
[
[
"## Zero Forcing Equalizer",
"_____no_output_____"
]
],
[
[
"x = np.random.choice([-1, 1], 40)\nchannel = array([.1, -.1, 0.05, 1, .05])\n\ny = np.convolve(x, channel)",
"_____no_output_____"
],
[
"_, ax = plt.subplots(1, 2, figsize=(10, 4))\nax[0].stem(x)\nax[1].stem(y);",
"_____no_output_____"
],
[
"# 6 tap equalizer\ntaps = 6\nY = toeplitz(y[taps-1:taps-1+taps], np.flip(y[:taps]))\nzerof = inv(Y)@x[:taps]\nzerof",
"_____no_output_____"
],
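[
"# Added sanity check (a sketch): over the designed window, the combined\n# channel + zero-forcing equalizer response should be close to a unit impulse.\nnp.convolve(channel, zerof).round(3)",
"_____no_output_____"
],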
[
"x_zerof = np.convolve(y, zerof, 'valid') # x_est is shorter than x because of 'valid'\nplt.stem(x_zerof);",
"_____no_output_____"
],
[
"calculate_error_rms(x, x_zerof)",
"_____no_output_____"
]
],
[
[
"## Least Squares Error Equalizer",
"_____no_output_____"
]
],
[
[
"taps = 6 \nL = 30 # number of input samples used\nY = toeplitz(y[taps-1: taps-1+L ] ,np.flip(y[:taps]))\nlse = inv(Y.T @ Y)@Y.T @ x[:L] #.T is fine because Y is a real matrix, else use Y.conj().T\nlse",
"_____no_output_____"
],
[
"x_lse = np.convolve(y, lse, 'valid')\nplt.stem(x_lse);",
"_____no_output_____"
],
[
"calculate_error_rms(x, x_lse)",
"_____no_output_____"
]
],
[
[
"### Channel Estimation",
"_____no_output_____"
]
],
[
[
"L = 30 \ntaps = 5\nX = toeplitz( x[:L], np.zeros(6) )\n\nh_est = inv(X.T@X)@X.T@y[:L]\nh_est",
"_____no_output_____"
]
],
[
[
"## MMSE Equalizer",
"_____no_output_____"
]
],
[
[
"x_var = 1\nnoise_var = 1e-4 # SNR = 40 dB\nchannel = array([.1, -.1, .05, .9+0.1j, .05], complex) # L = 5\nN = 5 # Taps in equalizer\nL = len(channel)\nD = N + L - 4 ",
"_____no_output_____"
],
[
"# Calculate H \ncol = np.zeros(N, complex)\ncol[0] = channel[0]\n\nrow = np.zeros(N+L-1, complex)\nrow[:L] = channel\n\nH = toeplitz(col, row)\nH.shape",
"_____no_output_____"
],
[
"# Calculate R_y\nR_x = x_var * np.eye(N+L-1)\nR_v = noise_var * np.eye(N)\nR_y = H @ R_x @ H.conj().T + R_v",
"_____no_output_____"
],
[
"xD_x = np.zeros(N+L-1)\nxD_x[D] = x_var\n\nC = xD_x @ H.conj().T @ inv(R_y)\nC",
"_____no_output_____"
],
[
"corrected_channel = np.convolve(channel , C)\ncorrected_channel",
"_____no_output_____"
],
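[
"# Added check (a sketch): the equalized response should peak at the design\n# delay D, with magnitude close to 1 there.\nprint(np.argmax(abs(corrected_channel)) == D, abs(corrected_channel[D]).round(3))",
"_____no_output_____"
],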
[
"plt.stem(channel.real, markerfmt='o', label='real')\nplt.stem(channel.imag, markerfmt='x', label='imag')\nplt.legend();",
"_____no_output_____"
],
[
"symbols = 2000\nx = (np.random.choice([-1, 1], symbols) + 1j*np.random.choice([-1, 1], symbols))*np.sqrt(x_var/2)\nnoise = (np.random.choice([-1, 1], symbols) + 1j*np.random.choice([-1, 1], symbols))*np.sqrt(noise_var/2)\n\ny = np.convolve(x, channel, 'full') # This is not the same as signal.lfilter\ny = y[:len(x)]\nobservations = y + noise",
"_____no_output_____"
]
],
[
[
"A quick note about how `np.convolve()` and `signal.lfilter()` are related.\n1. \n```\ny_full = np.convolve(x, channel, 'full') # size: 2004\n```\n2. \n```\ny_same = np.convolve(x, channel, 'same') # size: 2000, \n```\nThis is same as `y_full[2:-2]` (drops 2 samples in the beginning and end)\n3. \n```\ny_filt = signal.lfilter(channel, 1, x) # size: 2000\n```\nThis is same as `y_full[:-4]` (drops 4 samples at end)\n",
"_____no_output_____"
]
],
[
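[
"# Added sketch: numerically verify the convolve/lfilter relations stated above,\n# reusing x and channel from the cells before.\ny_full = np.convolve(x, channel, 'full')\nprint(np.allclose(np.convolve(x, channel, 'same'), y_full[2:-2]))\nprint(np.allclose(signal.lfilter(channel, 1, x), y_full[:-4]))",
"_____no_output_____"
],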
[
"estimation = signal.lfilter(C, 1, observations)\n\nerror = estimation[D:] - x[:-D]\nevm_db = 10*np.log10(np.var(error)/x_var)\nsnr_db = 10*np.log10(np.var(y)/noise_var)\nevm_rms = 100*np.sqrt( np.var(error)/x_var )\n\nprint(f'EVM: {evm_db:.2f} dB , {evm_rms:.4f} (% rms) \\nSNR: {snr_db:.1f} dB')",
"EVM: -27.36 dB , 4.2832 (% rms) \nSNR: 39.3 dB\n"
],
[
"fig, ((ax1, ax2), (ax3, ax4)) = plt.subplots(2, 2, figsize=(10,8))\nax1.scatter(observations.real, observations.imag)\nax1.grid(linestyle='dashed')\nax1.set_title('Observations Y[n]')\n\nax2.scatter(estimation[D:].real, estimation[D:].imag);\nax2.grid(linestyle='dashed');\nax2.set_title('Estimate of x with EVM={:.2f}dB'.format(evm_db));\n\nax3.stem(corrected_channel.real, markerfmt='o', label='real')\nax3.stem(corrected_channel.imag, markerfmt='x', label='imag')\nax3.legend()\nax3.set_title('Corrected channel')\n\nw, h = signal.freqz(channel, 1)\nax4.plot(w/(2*pi), 20*np.log10(abs(h)), label='Original channel')\nw, h = signal.freqz(corrected_channel, 1)\nax4.plot(w/(2*pi), 20*np.log10(abs(h)), label='Equalized channel');\nax4.set_title('Magnitude Responses')\nax4.legend()\nax4.grid(linestyle='dashed');",
"_____no_output_____"
]
],
[
[
"## Least Mean Squares (LMS) ",
"_____no_output_____"
]
],
[
[
"x_len = 1500\nx = np.random.choice([-1, 1], x_len) + 1j*np.random.choice([-1, 1],x_len)\nchannel = array([-.19j, .14, 1+.1j, -.16, .11j+.03])\ny = signal.lfilter(channel, 1, x)",
"_____no_output_____"
],
[
"_, (ax1, ax2) = plt.subplots(2, 1)\nax1.stem(x[:80].imag)\nax2.stem(y[:80].imag);",
"_____no_output_____"
],
[
"taps = 13\nequalizer = np.zeros(taps, complex)\nequalizer[taps//2] = 1 # All pass filter, just delays the input\n\nchannel_delay = np.argmax(channel)\nequalizer_delay = np.argmax(equalizer)\napprox_delay = channel_delay + equalizer_delay\nprint(f'Initial approximate delay : {approx_delay} ({channel_delay} + {equalizer_delay})')\n\nu = 0.005 # Learning rate\nY = np.zeros_like(equalizer, complex)\nestimate = np.zeros_like(y, complex)\nerror = np.zeros_like(y, complex)\n\n# Adaptation loop\nfor n, sample in enumerate(y):\n Y = np.roll(Y, 1) \n Y[0] = sample\n estimate[n] = np.dot(Y, equalizer)\n \n if n >= approx_delay+5:\n training_symbol = x[n-approx_delay]\n error[n] = training_symbol - estimate[n]\n equalizer += 2*u*error[n]*Y.conjugate()\n #print('{}: {} \\t {:.2f} {}'.format(n, training_symbol, estimate[n], equalizer[:2]))\n \n \nplt.plot(abs(error))\nplt.xlabel('LMS iterations')\nplt.title('Absolute value of error')\nplt.grid();",
"Initial approximate delay : 8 (2 + 6)\n"
],
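[
"# Added check (a sketch): once the LMS loop has converged, the residual error\n# should be small; report the mean absolute error over the last 200 symbols.\nprint(abs(error[-200:]).mean())",
"_____no_output_____"
],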
[
"fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10,4))\nax1.scatter(y[10:].real, y[10:].imag)\nax1.grid(linestyle='dashed')\nax2.scatter(estimate[200:].real, estimate[200:].imag)\nax2.grid(linestyle='dashed');",
"_____no_output_____"
]
],
[
[
"## Decision feedback equalizers",
"_____no_output_____"
]
],
[
[
"# QPSK signal\nx_len = 200\nqpsk = np.random.choice([-1, 1], x_len) + 1j*np.random.choice([-1, 1], x_len)\nchannel = array([0.05, 0.1j, 1.0, 0.20, -0.15j, 0.1])\ny = signal.lfilter(channel, 1, qpsk)",
"_____no_output_____"
],
[
"# Feed forward filter\n# feed_forward_filter = array([0, 0, 0, 1, 0]) # Pass through\nfeed_forward_filter = array([ 0.004, 0.011j, -0.06, -0.1j, 1]) # From MMSE, see next section\n\nfeed_forward = signal.lfilter(feed_forward_filter, 1, y)",
"_____no_output_____"
],
[
"# Decision feedback loop\npost_cursors = 3\nfeedback_coeffs = -channel[-post_cursors:] # Negative of post-cursors\nfb_pipe = np.zeros_like(feedback_coeffs, complex)\n\nestimate = np.zeros_like(y, complex)\nlast_estimate = 0\n\nfor n, ff_out in enumerate(feed_forward):\n decision = np.sign(last_estimate.real) + 1j*np.sign(last_estimate.imag)\n fb_pipe = np.roll(fb_pipe, 1)\n fb_pipe[0] = decision\n \n last_estimate = ff_out + np.dot(fb_pipe, feedback_coeffs)\n estimate[n] = last_estimate\n ",
"_____no_output_____"
],
[
"# EVM \nideal = np.sign(y[5:].real) + 1j*np.sign(y[5:].imag)\nerror = y[5:] - ideal\nevm_observation = 10*np.log10(error.var()/ideal.var())\n\nprint('EVM of observation: {:.2f} dB'.format(evm_observation))",
"EVM of observation: -10.67 dB\n"
],
[
"ideal = np.sign( estimate[5:].real) + 1j*np.sign(estimate[5:].imag)\nerror = estimate[5:] - ideal\nevm_estimate = 10*np.log10(error.var()/ideal.var())\n\nprint('EVM of estimate: {:.2f} dB'.format(evm_estimate))",
"EVM of estimate: -23.84 dB\n"
],
[
"_, (ax1, ax2) = plt.subplots(1, 2, figsize=(10,4))\nax1.scatter(y[5:].real, y[5:].imag)\nax1.set_title('Observation y[n]')\nax1.grid()\n\nax2.scatter(estimate[10:].real, estimate[10:].imag)\nax2.set_title('Estimate of x[n]')\nax2.grid();",
"_____no_output_____"
]
],
[
[
"### MMSE Optimal Coefficient Calculation",
"_____no_output_____"
]
],
[
[
"x_var = 1\nnoise_var = 0 # Noise-free\nchannel = array([0.05, 0.1j, 1]) # After removing post-cursors (using feedback loop)\nL = len(channel) # number of taps in channel\n\nN = 5 # taps in feed forward FIR\nD = N + L - 2",
"_____no_output_____"
],
[
"# Calculate H \ncol = np.zeros(N, complex)\ncol[0] = channel[0]\n\nrow = np.zeros(N+L-1, complex)\nrow[:L] = channel\n\nH = toeplitz(col, row)\nH.shape",
"_____no_output_____"
],
[
"# Calculate covariance matrices\nR_x = x_var*np.eye(N+L-1)\nR_v = noise_var*np.eye(N)\nR_y = H @ R_x @ H.conj().T + R_v",
"_____no_output_____"
],
[
"xD_x = np.zeros(N+L-1)\nxD_x[D] = x_var\n\nequalizer = xD_x @ H.conj().T @ inv(R_y)\nequalizer",
"_____no_output_____"
]
],
[
[
"### Combine LMS and Decision Feedback",
"_____no_output_____"
]
],
[
[
"x_len = 1000\nqpsk = np.random.choice([1, -1], x_len) + 1j*np.random.choice([1, -1], x_len)\nchannel = array([0.05, 0.1j, 1.0, 0.2, -0.15j, 0.1])\ny = signal.lfilter(channel, 1, qpsk)",
"_____no_output_____"
],
[
"# Initialize feed forward equalizer\nequalizer = array([ 0.004, 0.0109j, -0.06, -0.1j, 1.0]) # MMSE (see prev section)\nN = len(equalizer)\nchannel_delay = np.argmax(channel)\nequalizer_delay = np.argmax(equalizer)\ntotal_delay = channel_delay + equalizer_delay\n\nY = np.zeros_like(equalizer, complex)\nequalizer_output = np.zeros_like(y, complex)",
"_____no_output_____"
],
[
"# Adaptive loop\npost_cursors = 3\nfb_coeffs = -channel[-post_cursors:]\nfb_pipe = np.zeros_like(fb_coeffs, complex)\n\nerror = np.zeros_like(y, complex)\nestimate = np.zeros_like(y, complex)\n\nu = 0.005 # LMS learning rate\nlast_estimate = 0\nfor n, obs in enumerate(y):\n Y = np.roll(Y, 1)\n Y[0] = obs\n equalizer_output[n] = np.dot(Y, equalizer)\n \n decision = np.sign(last_estimate.real) + 1j*np.sign(last_estimate.imag)\n fb_pipe = np.roll(fb_pipe, 1)\n fb_pipe[0] = decision\n estimate[n] = equalizer_output[n] + np.dot(fb_coeffs, fb_pipe)\n last_estimate = estimate[n]\n \n if n > total_delay : \n training_symbol = qpsk[n-total_delay]\n error[n] = training_symbol - estimate[n]\n \n equalizer += 2*u*error[n]*Y.conj()\n fb_coeffs += 2*u*error[n]*fb_pipe.conj()\n \nplt.plot(abs(error));",
"_____no_output_____"
],
[
"# EVM \nideal = np.sign(y[5:].real) + 1j*np.sign(y[5:].imag)\ny_error = y[5:] - ideal\nevm_observation = 10*np.log10(y_error.var()/ideal.var())\nprint('EVM of observation: {:.2f} dB'.format(evm_observation))\n\n# Check EVM after error has gone down\nideal = np.sign( estimate[600:].real) + 1j*np.sign(estimate[600:].imag)\nestimate_error = estimate[600:] - ideal\nevm_estimate = 10*np.log10(estimate_error.var()/ideal.var())\nprint('EVM of estimate: {:.2f} dB'.format(evm_estimate))",
"EVM of observation: -10.78 dB\nEVM of estimate: -59.91 dB\n"
],
[
"_, (ax1, ax2) = plt.subplots(1, 2, figsize=(10,4))\nax1.scatter(y[5:].real, y[5:].imag)\nax1.set_title('Observation y[n]')\nax1.grid()\n\nax2.scatter(estimate[600:].real, estimate[600:].imag)\nax2.set_title('Estimate of x[n]')\nax2.grid();",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
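"code",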
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
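"code",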
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
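"code",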
"code",
"code"
],
[
"markdown"
],
[
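"code",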
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
] |
e79dd24c7f6a14165dfe36becbb97e0427e62f60 | 516,526 | ipynb | Jupyter Notebook | code/analysis_01.ipynb | jsheunis/COSN-talk | 32f93d19172d502151cbff87be73030e77e08ad8 | [
"MIT"
] | 6 | 2021-06-09T13:32:54.000Z | 2021-11-05T11:32:40.000Z | code/analysis_01.ipynb | jsheunis/COSN-talk | 32f93d19172d502151cbff87be73030e77e08ad8 | [
"MIT"
] | null | null | null | code/analysis_01.ipynb | jsheunis/COSN-talk | 32f93d19172d502151cbff87be73030e77e08ad8 | [
"MIT"
] | 1 | 2021-06-18T19:10:07.000Z | 2021-06-18T19:10:07.000Z | 1,693.527869 | 167,740 | 0.955528 | [
[
[
"import os, glob\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport pandas as pd\nimport nibabel as nb\nimport nibabel.freesurfer.mghformat as mgh\nfrom nibabel.freesurfer.io import read_geometry\nfrom nilearn.plotting.surf_plotting import load_surf_data\nimport myvis\nfrom brainstat.stats.terms import FixedEffect\nfrom brainstat.stats.SLM import SLM\nimport zipfile",
"_____no_output_____"
]
],
[
[
"### 0. get data demographics with pandas",
"_____no_output_____"
]
],
[
[
"datadir = '../data'\n\nzipped_data = os.path.join(datadir, 'thickness.zip')\nwith zipfile.ZipFile(zipped_data, 'r') as zip_ref:\n zip_ref.extractall(datadir)\n\ndfname = os.path.join(datadir, 'thickness.csv')\ndf = pd.read_csv(dfname)\n\nidx = np.array(df[\"ID2\"])\nage = np.array(df[\"AGE\"])\niq = np.array(df[\"IQ\"])\n\n",
"_____no_output_____"
]
],
[
[
"### 1. load thickness data ",
"_____no_output_____"
]
],
[
[
"j = 0\nfor Sidx in idx:\n \n # load mgh files for left hem \n for file_L in glob.glob('../data/thickness/%s*_lh2fsaverage5_20.mgh' \n % (Sidx)):\n \n S_Lmgh = mgh.load(file_L).get_fdata()\n S_Larr = np.array(S_Lmgh)[:,:,0]\n \n # load mgh files for right hem \n for file_R in glob.glob('../data/thickness/%s*_rh2fsaverage5_20.mgh' \n % (Sidx)):\n \n S_Rmgh = mgh.load(file_R).get_fdata()\n S_Rarr = np.array(S_Rmgh)[:,:,0]\n\n # concatenate left & right thickness for each subject \n Sidx_thickness = np.concatenate((S_Larr, S_Rarr)).T\n\n if j == 0:\n thickness = Sidx_thickness\n else:\n thickness = np.concatenate((thickness, Sidx_thickness), axis=0)\n j += 1 ",
"_____no_output_____"
],
[
"Mean_thickness = thickness.mean(axis=0)\nprint(\"all thicknes data loaded shape: \", thickness.shape)\nprint(\"Mean thickness shape: \", Mean_thickness.shape)\n",
"all thicknes data loaded shape: (259, 20484)\nMean thickness shape: (20484,)\n"
],
[
"fig01 = plt.imshow(thickness, extent=[0,20484,0,259], aspect='auto')\n",
"_____no_output_____"
]
],
[
[
"### 2. plot mean thickness along the cortex",
"_____no_output_____"
]
],
[
[
"Fs_Mesh_L = read_geometry(os.path.join(datadir, 'fsaverage5/lh.pial'))\nFs_Mesh_R = read_geometry(os.path.join(datadir, 'fsaverage5/rh.pial'))\n\nFs_Bg_Map_L = load_surf_data(os.path.join(datadir, 'fsaverage5/lh.sulc'))\nFs_Bg_Map_R = load_surf_data(os.path.join(datadir, 'fsaverage5/rh.sulc'))\n\nMask_Left = nb.freesurfer.io.read_label((os.path.join(\n datadir,'fsaverage5/lh.cortex.label')))\nMask_Right = nb.freesurfer.io.read_label((os.path.join(\n datadir,'fsaverage5/rh.cortex.label')))\n\nsurf_mesh = {}\nsurf_mesh['coords'] = np.concatenate((Fs_Mesh_L[0], Fs_Mesh_R[0]))\nsurf_mesh['tri'] = np.concatenate((Fs_Mesh_L[1], Fs_Mesh_R[1]))\nbg_map = np.concatenate((Fs_Bg_Map_L, Fs_Bg_Map_R))\nmedial_wall = np.concatenate((Mask_Left, 10242 + Mask_Right))\n",
"_____no_output_____"
],
[
"fig02 = myvis.plot_surfstat(surf_mesh, bg_map, Mean_thickness, \n mask = medial_wall,\n cmap = 'viridis', vmin = 1.5, vmax = 4)",
"_____no_output_____"
]
],
[
[
"### 3. build the stats model",
"_____no_output_____"
]
],
[
[
"term_intercept = FixedEffect(1, names=\"intercept\")\nterm_age = FixedEffect(age, \"age\")\nterm_iq = FixedEffect(iq, \"iq\")\nmodel = term_intercept + term_age \n\nslm = SLM(model, -age, surf=surf_mesh)\nslm.fit(thickness)\ntvals = slm.t.flatten()\npvals = slm.fdr()\n\nprint(\"t-values: \", tvals) # These are the t-values of the model.\nprint(\"p-values: \", pvals) # These are the p-values of the model.\n",
"t-values: [ 3.70352445 9.00643325 11.51625374 ... 2.42269373 2.80816754\n 3.23542035]\np-values: [1.70069476e-04 8.81888145e-17 5.71164388e-24 ... 9.66039843e-03\n 3.29548858e-03 8.66342267e-04]\n"
],
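[
"# Added summary (a sketch; assumes slm.fdr() returns FDR-adjusted values):\n# count vertices that survive a 0.05 threshold.\nprint((pvals < 0.05).sum(), \"of\", pvals.size, \"vertices below 0.05\")",
"_____no_output_____"
],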
[
"fig03 = myvis.plot_surfstat(surf_mesh, bg_map, tvals,\n mask = medial_wall, cmap = 'gnuplot',\n vmin = tvals.min(), vmax = tvals.max())\n\n",
"_____no_output_____"
],
[
"fig04 = myvis.plot_surfstat(surf_mesh, bg_map, pvals,\n mask = medial_wall, cmap = 'YlOrRd',\n vmin = 0, vmax = 0.05)\n",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
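"code",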
"code",
"code",
"code"
]
] |
e79ddf7c51cfe0768b68a9d2bc181d67a49bb7b1 | 84,989 | ipynb | Jupyter Notebook | Simple_Spacecraft/first_attempt.ipynb | dev10110/deep_learnt_controls | b353e8654b65a5b0dc92b6a2d5799d2168c55d87 | [
"MIT"
] | 3 | 2019-10-22T13:45:38.000Z | 2022-03-05T07:29:00.000Z | Simple_Spacecraft/first_attempt.ipynb | dev10110/deep_learnt_controls | b353e8654b65a5b0dc92b6a2d5799d2168c55d87 | [
"MIT"
] | null | null | null | Simple_Spacecraft/first_attempt.ipynb | dev10110/deep_learnt_controls | b353e8654b65a5b0dc92b6a2d5799d2168c55d87 | [
"MIT"
] | null | null | null | 127.611111 | 14,856 | 0.87316 | [
[
[
"\n#x range: +- 200\n#z range: 500, 2000\n#vx range: +- 10\n#vz range: -30, 10\n\n#m range: 8000, 12000\nimport pandas as pd\nimport numpy as np\nimport scipy as sp\nfrom scipy import integrate as spint\nimport matplotlib.pyplot as plt\n\nfrom math import sin, cos, asin\n\nfrom rocket import Rocket",
"_____no_output_____"
],
[
"r = Rocket(*[-30., 1000., 10., 20., 10000.])",
"_____no_output_____"
],
[
"r",
"_____no_output_____"
],
[
"r.s_0",
"_____no_output_____"
],
[
"r.propagate(r.s_0, (0,0))",
"_____no_output_____"
],
[
"\nt = 0.\ns = r.s_0\ndt = 0.1\nhistory = [(t, *s)]\n\nwhile t < r.t_char:\n t = t+dt\n s = r.propagate(s, (0.,0.), dt)\n history.append((t, *s))",
"_____no_output_____"
],
[
"t",
"_____no_output_____"
],
[
"df = pd.DataFrame.from_records(history, columns=['t','x','z','vx','vz','m'])",
"_____no_output_____"
],
[
"df.head()",
"_____no_output_____"
],
[
"plt.plot(df.t, df.z)\nplt.grid()",
"_____no_output_____"
],
[
"plt.plot(df.t[df.z>0], df.z[df.z>0])",
"_____no_output_____"
],
[
"plt.plot(df.x[df.z>0], df.z[df.z>0])\nax = plt.gca()\nax.set_aspect(1)\nplt.grid()",
"_____no_output_____"
],
[
"plt.plot(((df.vx**2+df.vz**2)**0.5)[df.z>0], df.z[df.z>0])\nplt.grid()\nplt.xlabel(\"speed (m/s)\")\nplt.ylabel(\"z (m)\")",
"_____no_output_____"
],
[
"plt.plot(df.vx[df.z>0], df.z[df.z>0])\nplt.grid()\nplt.xlabel(\"vx (m/s)\")\nplt.ylabel(\"z (m)\")\nplt.show()\n\nplt.plot(df.vz[df.z>0], df.z[df.z>0])\nplt.grid()\nplt.xlabel(\"vz (m/s)\")\nplt.ylabel(\"z (m)\")\nplt.show()",
"_____no_output_____"
],
[
"#attempt to solve the mass optimal control problem\ndef bvp_dynamics(t, s, p):\n #t (time) #not needed\n #s (state) #(x,z,vx,vz,m,lambdam)\n #p (parameters) #(lx0, lz0, lvx0, lvz0)\n \n #implicitly assumes alpha = 1\n \n \n (x, z, vx, vz, m, lm) = s\n (lx0, lz0, lvx0, lvz0) = p\n \n lv_mag = (lvx0**2+lvz0**2)**0.5\n \n \n #determine u1\n S1 = 1 - (lv_mag*r.c2)/m - lm\n if len(S1) == 1:\n u1 = (lambda s: 1 if s<0 else 0)(S1)\n else:\n u1 = np.array([(lambda s: 1 if s<0 else 0)(s1) for s1 in S1])\n \n #determine u2, ith\n ith1 = -(1/lv_mag)*(lvx0)\n ith2 = -(1/lv_mag)*(lvz0)\n \n #construct dynamics\n ds = [None]*6\n \n ds[0] = vx\n ds[1] = vz\n \n ds[2] = (u1*(ith1/m))*r.c1\n ds[3] = (u1*(ith2/m))*r.c1 - r.g\n \n ds[4] = u1*(-r.c1/r.c2)\n \n \n ds[5] = ((u1/m**2)*r.c1)*((lvx0-lx0*t)*ith1 + (lvz0-lz0*t)*ith2)\n \n return ds\n \ndef bvp_boundary(sa, sb, p):\n \n res = []\n \n #initial conditions:\n res.append(sa[0] - r.x_0 )\n res.append(sa[1] - r.z_0 )\n res.append(sa[2] - r.vx_0)\n res.append(sa[3] - r.vz_0)\n res.append(sa[4] - r.m_0 )\n \n #final conditions\n res.append(sb[0] - 0)\n res.append(sb[1] - 0)\n res.append(sb[2] - 0)\n res.append(sb[3] - 0)\n res.append(sb[5] - 0) #lambda m = 0 in final time\n \n return np.array(res)\n ",
"_____no_output_____"
],
[
"np.diff(x_g)/(t_guess[1]-t_guess[0])",
"_____no_output_____"
],
[
"t_guess = np.linspace(0, (2*r.z_0/r.g)/2)\n\nx_g = np.linspace(r.x_0, 0)\nz_g = np.linspace(r.z_0, 0)\nvx_g = np.linspace(r.vx_0, 0)\nvz_g = np.linspace(r.vz_0, 0)\nm_g = np.linspace(r.m_0, 0.7*r.m_0)\nlm_g = np.linspace(10,1)\n\n\n\nstate_guess = np.array([x_g, z_g, vx_g, vz_g, m_g, lm_g])",
"_____no_output_____"
],
[
"r.vx_0",
"_____no_output_____"
],
[
"state_guess.shape",
"_____no_output_____"
],
[
"p_guess = np.array([-10,-10,10,10])",
"_____no_output_____"
],
[
"bvp_dynamics(0, state_guess, p_guess)",
"_____no_output_____"
],
[
"bvp_boundary(r.s_0, [0,0,0,0,7500,0], [1,1,1,1])",
"_____no_output_____"
],
[
"sol = spint.solve_bvp(bvp_dynamics, bvp_boundary, x = t_guess, y = state_guess, p=p_guess, verbose=2)",
"_____no_output_____"
],
[
"sol",
"_____no_output_____"
],
[
"t_sol = sol.x",
"_____no_output_____"
],
[
"(x_sol, z_sol, vx_sol, vz_sol, m_sol, lm_sol) = sol.y",
"_____no_output_____"
],
[
"plt.plot(t_sol, vz_sol)\nplt.grid()",
"_____no_output_____"
],
[
"plt.plot(t_guess, m_g)\nplt.grid()",
"_____no_output_____"
],
[
"vz_sol-vz_g",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79def8eff74061dd3c5f550f4d5d1eea398e67f | 2,411 | ipynb | Jupyter Notebook | HelloJupyter.ipynb | fsslh007/artificial-intelligence-fundamentals-with-python-2021-class | dd1e81c00b3a4d28bd7f150c07c87214f02d4858 | [
"MIT"
] | null | null | null | HelloJupyter.ipynb | fsslh007/artificial-intelligence-fundamentals-with-python-2021-class | dd1e81c00b3a4d28bd7f150c07c87214f02d4858 | [
"MIT"
] | null | null | null | HelloJupyter.ipynb | fsslh007/artificial-intelligence-fundamentals-with-python-2021-class | dd1e81c00b3a4d28bd7f150c07c87214f02d4858 | [
"MIT"
] | null | null | null | 16.86014 | 64 | 0.467856 | [
[
[
"print(\"hello World\")",
"hello World\n"
],
[
"x=5",
"_____no_output_____"
],
[
"print(x)",
"5\n"
],
[
"print(\"shift+enter = run\")",
"shift+enter = run\n"
],
[
"print(\"ctrl + enter = run the same cell\")",
"ctrl + enter = run the same cell\n"
]
],
[
[
"# This is a Level 1 Heading\nThis is some paragraph text that describes the code below:",
"_____no_output_____"
]
],
[
[
"print(\"this is the code the was describes above\")",
"this is the code the was describes above\n"
]
]
] | [
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79df3ff3d18657aed79dcac2e093580b4127d62 | 729,244 | ipynb | Jupyter Notebook | examples/filters/tutorials/snow_partitioning_parallel.ipynb | jamesobutler/porespy | ec9791e63db6e6a1281e364f4d2ea5d3796f70c9 | [
"MIT"
] | 165 | 2016-01-07T19:15:58.000Z | 2022-03-24T16:24:54.000Z | examples/filters/tutorials/snow_partitioning_parallel.ipynb | jamesobutler/porespy | ec9791e63db6e6a1281e364f4d2ea5d3796f70c9 | [
"MIT"
] | 550 | 2016-02-28T22:49:06.000Z | 2022-03-30T13:33:17.000Z | examples/filters/tutorials/snow_partitioning_parallel.ipynb | jamesobutler/porespy | ec9791e63db6e6a1281e364f4d2ea5d3796f70c9 | [
"MIT"
] | 81 | 2015-08-20T05:14:25.000Z | 2022-03-20T09:09:58.000Z | 351.781959 | 347,142 | 0.906919 | [
[
[
"# SNOW partitioning parallel\nThe filter is used to perform SNOW algorithm in parallel and serial mode to save computational time and memory requirement respectively. [SNOW](https://journals.aps.org/pre/abstract/10.1103/PhysRevE.96.023307) algorithm converts a binary image in to partitioned regions while avoiding oversegmentation. SNOW_partitioning_parallel speeds up this process by decomposing the domain into several subdomains and either process them in different cores in parallel to save time or one by one in single core to save memory requirements. ",
"_____no_output_____"
],
[
"#### Import Modules",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport porespy as ps\nfrom porespy.tools import randomize_colors\nimport matplotlib.pyplot as plt\nimport matplotlib.gridspec as gridspec\ngs = gridspec.GridSpec(2, 4)\ngs.update(wspace=0.5)\nnp.random.seed(10)\nps.visualization.set_mpl_style()",
"_____no_output_____"
]
],
[
[
"#### Create a random image of overlapping spheres",
"_____no_output_____"
]
],
[
[
"im = ps.generators.overlapping_spheres([1000, 1000], r=10, porosity=0.5)\nfig, ax = plt.subplots()\nax.imshow(im, origin='lower');",
"_____no_output_____"
]
],
[
[
"#### Apply SNOW_partitioning_parallel on the binary image ",
"_____no_output_____"
]
],
[
[
"snow_out = ps.filters.snow_partitioning_parallel(\n im=im, divs=2, r_max=5, sigma=0.4)",
"_____no_output_____"
]
],
[
[
"#### Plot output results ",
"_____no_output_____"
]
],
[
[
"fig, ax = plt.subplots(1, 3, figsize=[9, 3])\nax[0].imshow(snow_out.im);\nax[1].imshow(snow_out.dt);\nax[2].imshow(randomize_colors(snow_out.regions));\nax[0].set_title('Binary Image');\nax[1].set_title('Euclidean Distance Transform')\nax[2].set_title('Segmented Image');",
"_____no_output_____"
],
[
"print(f\"Number of regions: {snow_out.regions.max()}\")",
"Number of regions: 1006\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
e79dfd1455c1ca15f19b42fbcac1497e22e25a3c | 1,016,657 | ipynb | Jupyter Notebook | notebooks/blog4.ipynb | zhijianli9999/zhijianli9999.github.io | 5b1c6ad940fbbf2c21581a0b944e66ae95c02cc4 | [
"MIT"
] | null | null | null | notebooks/blog4.ipynb | zhijianli9999/zhijianli9999.github.io | 5b1c6ad940fbbf2c21581a0b944e66ae95c02cc4 | [
"MIT"
] | null | null | null | notebooks/blog4.ipynb | zhijianli9999/zhijianli9999.github.io | 5b1c6ad940fbbf2c21581a0b944e66ae95c02cc4 | [
"MIT"
] | null | null | null | 1,071.29294 | 496,648 | 0.953646 | [
[
[
"# Blog 4: Spectral Clustering\n\nIn this blog post, we'll explore several algorithms to cluster data points. \n\nSome notation:\n- Boldface capital letters like $\\mathbf{A}$ are matrices.\n- Boldface lowercase letters like $\\mathbf{v}$ are vectors.\n\n\n## The clustering problem\nWe're aiming to solve the problem of assigning labels to observations based on the distribution of their features. In this exercise, the features are represented as the matrix $\\mathbf{X}$, with each row representing a data point and each data point being in $\\mathbb{R}^2$. We are also dealing with the simple case of 2 clusters for now. What we do generalize is the shape of the distribution in $\\mathbb{R}^2$. But first, let's look at a simple 2-cluster distribution. ",
"_____no_output_____"
]
],
[
[
"import numpy as np\nfrom sklearn import datasets\nfrom matplotlib import pyplot as plt",
"_____no_output_____"
],
[
"n = 200\nnp.random.seed(1111)\nX, y = datasets.make_blobs(n_samples=n, shuffle=True, random_state=None, centers = 2, cluster_std = 2.0)\nplt.scatter(X[:,0], X[:,1])",
"_____no_output_____"
]
],
[
[
"For this simple distribution, we can use k-means clustering. Intuitively, this minimizes the distances between each cluster's members and its \"center of gravity\", and works well for roughly circular blobs. ",
"_____no_output_____"
]
],
[
[
"from sklearn.cluster import KMeans\nkm = KMeans(n_clusters = 2)\nkm.fit(X)\n\nplt.scatter(X[:,0], X[:,1], c = km.predict(X))",
"_____no_output_____"
]
],
[
[
"### Generalizing the distribution\nNow let's look at this distribution of data points. ",
"_____no_output_____"
]
],
[
[
"np.random.seed(1234)\nn = 200\nX, y = datasets.make_moons(n_samples=n, shuffle=True, noise=0.05, random_state=None)\nplt.scatter(X[:,0], X[:,1])",
"_____no_output_____"
]
],
[
[
"The two clusters are still obvious by sight, but k-means clustering fails. ",
"_____no_output_____"
]
],
[
[
"km = KMeans(n_clusters = 2)\nkm.fit(X)\nplt.scatter(X[:,0], X[:,1], c = km.predict(X))",
"_____no_output_____"
]
],
[
[
"## Constructing a similarity matrix A\n\nAn important ingredient in all of the clustering algorithms in this post is the similarity matrix *similarity matrix* $\\mathbf{A}$. A is a symmetric square matrix with n rows and columns. `A[i,j]` should be equal to `1` if and only if `X[i]` is close to `X[j]`. To quantify closeness, we will have a threshold distance `epsilon`, set to 0.4 for now. The diagonal entries `A[i,i]` should all be equal to zero.",
"_____no_output_____"
]
],
[
[
"from sklearn.metrics import pairwise_distances",
"_____no_output_____"
],
[
"epsilon = 0.4\n\ndist = pairwise_distances(X)\nA = np.array(dist < epsilon).astype('int')\nnp.fill_diagonal(A,0)",
"_____no_output_____"
],
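[
"# Added check (a sketch): A should be symmetric with a zero diagonal.\nprint(np.all(A == A.T), A.diagonal().sum() == 0)",
"_____no_output_____"
],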
[
"A",
"_____no_output_____"
]
],
[
[
"## Norm cut objective function\n\nNow that we have encoded the pairwise distances of the data points in $\\mathbf{A}$, we can define the clustering problem as a minimization problem. Intuitively, the parameter space of this minimization problem should be the possible cluster assignments (a vector of n 0s and 1s), and the objective function should decrease when our assignment yields more proximity within each cluster, and increase when our assignment yields more proximity between different clusters. \n\nWe thus define the *binary norm cut objective* of the similarity matrix $\\mathbf{A}$ :\n\n$$N_{\\mathbf{A}}(C_0, C_1)\\equiv \\mathbf{cut}(C_0, C_1)\\left(\\frac{1}{\\mathbf{vol}(C_0)} + \\frac{1}{\\mathbf{vol}(C_1)}\\right)\\;.$$\n\nIn this expression, \n- $C_0$ and $C_1$ are the clusters assigned.\n- $\\mathbf{cut}(C_0, C_1) \\equiv \\sum_{i \\in C_0, j \\in C_1} a_{ij}$ (the \"*cut*\") is the number of ones in $\\mathbf{A}$ connecting $C_0$ to cluster $C_1$.\n- $\\mathbf{vol}(C_0) \\equiv \\sum_{i \\in C_0}d_i$, where $d_i = \\sum_{j = 1}^n a_{ij}$ is the *degree* of row $i$. This volume term measures the size and connectedness within the closter $C_0$.\n\n#### The cut term\nThe function `cut(A,y)` below computes the cut term. We first construct a `diff` matrix ($n$ by $n$), whose $i,j$-th term is an indicator for points i and j being in different clusters under classification `y`. We then elementwise-multiply it with the similarity matrix.",
"_____no_output_____"
]
],
[
[
"def cut(A,y):\n # diff[i,j] is 1 iff y[i] is in a different group than y[j]\n diff = np.array([y != i for i in y], dtype='int')\n # return sum of entries in A where diff is 1, divide by 2 due to double counting\n return np.sum(np.multiply(diff,A))/2\n",
"_____no_output_____"
]
],
[
[
"We first test it with the true labels, `y`.",
"_____no_output_____"
]
],
[
[
"cut(A,y)",
"_____no_output_____"
]
],
[
[
"...and then with randomly generated labels",
"_____no_output_____"
]
],
[
[
"randn = np.random.randint(0,2,n)\ncut(A,randn)",
"_____no_output_____"
]
],
[
[
"As expected, the true labels yield a lower cut term than the fake labels. ",
"_____no_output_____"
],
[
"#### The Volume Term \n\nNow we compute the *volume* of each cluster. The volume of the cluster is just the sum of degrees of the cluster's elements. Remember we want to minimize $\\frac{1}{\\mathbf{vol}(C_0)}$ so we want to maximize the volume. ",
"_____no_output_____"
]
],
[
[
"def vols(A,y):\n # sum the rows of A where the row belongs to the cluster\n v0 = np.sum(A[y==0,])\n v1 = np.sum(A[y==1,])\n return v0, v1",
"_____no_output_____"
]
],
[
[
"Now we have all the ingredients for the norm cut objective.",
"_____no_output_____"
]
],
[
[
"def normcut(A,y):\n v0, v1 = vols(A,y)\n return cut(A,y) * (1/v0 + 1/v1)",
"_____no_output_____"
],
[
"print(vols(A,y))\nprint(normcut(A,y))",
"(2299, 2217)\n0.011518412331615225\n"
]
],
[
[
"Again, the true labels `y` yield a much lower minimizing value than the random labels.",
"_____no_output_____"
]
],
[
[
"normcut(A,randn)",
"_____no_output_____"
]
],
[
[
"## Part C\n\nUnfortunately, with what we have, the parameter space (the possible set of labels) is too large for a computationally efficient algorithm. This is why we need some linear algebra magic to give us another formula for the norm cut objective:\n\n$$\\mathbf{N}_{\\mathbf{A}}(C_0, C_1) = \\frac{\\mathbf{z}^T (\\mathbf{D} - \\mathbf{A})\\mathbf{z}}{\\mathbf{z}^T\\mathbf{D}\\mathbf{z}}\\;,$$\n\nwhere\n- $\\mathbf{D}$ is the diagonal matrix with nonzero entries $d_{ii} = d_i$, and where $d_i = \\sum_{j = 1}^n a_i$ is the degree.\n- and $\\mathbf{z}$ is a vector such that\n$$\nz_i = \n\\begin{cases}\n \\frac{1}{\\mathbf{vol}(C_0)} &\\quad \\text{if } y_i = 0 \\\\ \n -\\frac{1}{\\mathbf{vol}(C_1)} &\\quad \\text{if } y_i = 1 \\\\ \n\\end{cases}\n$$\n\n\nSince the volume term is just a function of `A` and `y`, $\\mathbf{z}$ encodes all the information in `A` and `y` through the volume term and the sign. We define the function `transform(A,y)` to compute the appropriate $\\mathbf{z}$ vector. \n",
"_____no_output_____"
]
],
[
[
"def transform(A,y):\n # compute volumes\n v0, v1 = vols(A,y)\n # initialize z to be array of same shape as y, then fill depending on y\n z = np.where(y==0, 1/v0, -1/v1)\n return z",
"_____no_output_____"
],
[
"z = transform(A,y)\n# degree matrix: row sums placed on diagonal\n# the \"at\" sign is the matrix product\nD = np.diag([email protected](n)) \nnormcut_formula = (z@(D-A)@z)/(z@D@z)\nnormcut_formula",
"_____no_output_____"
]
],
[
[
"We check that the value of the norm cut function is numerically close by either method.",
"_____no_output_____"
]
],
[
[
"np.isclose(normcut(A,y), normcut_formula)",
"_____no_output_____"
]
],
[
[
"We can also check the identity $\\mathbf{z}^T\\mathbf{D}\\mathbb{1} = 0$, where $\\mathbb{1}$ is the vector of `n` ones. This identity effectively says that $\\mathbf{z}$ should contain roughly as many positive as negative entries, i.e. as many labels in each cluster.\n",
"_____no_output_____"
]
],
[
[
"D = np.diag([email protected](n))\nnp.isclose((z@[email protected](n)),0)",
"_____no_output_____"
]
],
[
[
"We denote the objective function\n\n$$ R_\\mathbf{A}(\\mathbf{z})\\equiv \\frac{\\mathbf{z}^T (\\mathbf{D} - \\mathbf{A})\\mathbf{z}}{\\mathbf{z}^T\\mathbf{D}\\mathbf{z}} $$\n\nWe can minimize this function subject to the condition $\\mathbf{z}^T\\mathbf{D}\\mathbb{1} = 0$, which says that the clusters ar equally sized. We can guarantee the condition holds if, instead of minimizing over $\\mathbf{z}$, we minimize the orthogonal complement of $\\mathbf{z}$ relative to $\\mathbf{D}\\mathbb{1}$. The `orth_obj` function computes this. Then we use the `minimize` function from `scipy.optimize` to minimize the function `orth_obj` with respect to $\\mathbf{z}$. ",
"_____no_output_____"
]
],
[
[
"def orth(u, v):\n return (u @ v) / (v @ v) * v\n\ne = np.ones(n) \n\nd = D @ e\n\ndef orth_obj(z):\n z_o = z - orth(z, d)\n return (z_o @ (D - A) @ z_o)/(z_o @ D @ z_o)",
"_____no_output_____"
],
[
"from scipy.optimize import minimize",
"_____no_output_____"
],
[
"z_min = minimize(fun=orth_obj, x0=np.ones(n)).x",
"_____no_output_____"
]
],
[
[
"By construction, the sign of `z_min[i]` corresponds to the cluster label of data point `i`. We plot the points below, coloring it by the sign of `z_min`.",
"_____no_output_____"
]
],
[
[
"plt.scatter(X[:,0], X[:,1], c = (z_min >= 0))",
"_____no_output_____"
]
],
[
[
"## Part F\n\nExplicitly minimizing the orthogonal objective is extremely slow, but thankfully find a solution using eigenvalues and eigenvectors.\n\nThe Rayleigh-Ritz Theorem implies that the minimizing $\\mathbf{z}$ is a solution to the eigenvalue problem \n\n$$ \\mathbf{D}^{-1}(\\mathbf{D} - \\mathbf{A}) \\mathbf{z} = \\lambda \\mathbf{z}\\;, \\quad \\mathbf{z}^T\\mathbb{1} = 0\\;.$$\n\nSince $\\mathbb{1}$ is the eigenvector with smallest eigenvalue, the vector $\\mathbf{z}$ that we want must be the eigenvector with the second-smallest eigenvalue. \n\nWe construct the *Laplacian* matrix of $\\mathbf{A}$, $\\mathbf{L} = \\mathbf{D}^{-1}(\\mathbf{D} - \\mathbf{A})$, and find the eigenvector corresponding to its second-smallest eigenvalue, `z_eig`. ",
"_____no_output_____"
]
],
[
[
"L = np.linalg.inv(D)@(D-A)",
"_____no_output_____"
],
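[
"# Added check (a sketch): the all-ones vector should be an eigenvector of L\n# with eigenvalue 0, as claimed above.\nnp.allclose(L@np.ones(n), 0)",
"_____no_output_____"
],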
[
"Lam, U = np.linalg.eig(L)\nz_eig = U[:,1]",
"_____no_output_____"
]
],
[
[
"Now we color the point according to the sign of `z_eig`. Looks pretty good. ",
"_____no_output_____"
]
],
[
[
"plt.scatter(X[:,0], X[:,1], c = z_eig<0)",
"_____no_output_____"
]
],
[
[
"Finally, we can define `spectral_clustering(X, epsilon)` which takes in the input data `X` and the distance threshold `epsilon`, performs spectral clustering, and returns an array of labels indicating whether data point `i` is in group `0` or group `1`. \n",
"_____no_output_____"
]
],
[
[
"def spectral_clustering(X, epsilon):\n '''\n Given input X (n by 2 array) and distance threshold epsilon, \n performs spectral clustering, and\n returns n by 1 labels of cluster classification. \n '''\n A = np.array(pairwise_distances(X) < epsilon).astype('int')\n np.fill_diagonal(A,0)\n D = np.diag([email protected](X.shape[0]))\n L = np.linalg.inv(D)@(D-A)\n Lam, U = np.linalg.eig(L)\n z_eig = U[:,1]\n labels = np.array(z_eig<0, dtype='int')\n return labels",
"_____no_output_____"
],
[
"spectral_clustering(X,0.4)",
"_____no_output_____"
]
],
[
[
"Now we can run som experiments, making the problem harder by increasing the noise parameter, and increase the computation by increasing `n`.",
"_____no_output_____"
]
],
[
[
"np.random.seed(123)\nfig, axs = plt.subplots(3, figsize = (8,20))\nnoises = [0.05, 0.1, 0.2]\nfor i in range(3):\n X, y = datasets.make_moons(n_samples=1000, shuffle=True, noise=noises[i], random_state=None)\n axs[i].scatter(X[:,0], X[:,1], c = spectral_clustering(X,0.4))\n axs[i].set_title(label = \"noise = \" + str(noises[i]))\n",
"_____no_output_____"
]
],
[
[
"How does it perform on a different distribution?",
"_____no_output_____"
]
],
[
[
"n = 1000\nX, y = datasets.make_circles(n_samples=n, shuffle=True, noise=0.05, random_state=None, factor = 0.4)\nplt.scatter(X[:,0], X[:,1])",
"_____no_output_____"
],
[
"# k-means fails\nkm = KMeans(n_clusters = 2)\nkm.fit(X)\nplt.scatter(X[:,0], X[:,1], c = km.predict(X))",
"_____no_output_____"
]
],
[
[
"By adjusting the distance parameter `epsilon`, we can find a way to cluster the two circular blobs. We do run into singularity issues for some values of `epsilon`, but otherwise the results are plotted below. ",
"_____no_output_____"
]
],
[
[
"fig, axs = plt.subplots(11, figsize=(8,50))\nfor i in range(3,11):\n epsilon = i/10\n try:\n axs[i].scatter(X[:,0], X[:,1], c = spectral_clustering(X,epsilon))\n axs[i].set_title(label = \"epsilon = \" + str(epsilon))\n except:\n print(\"Error when epsilon = \", epsilon)",
"_____no_output_____"
]
],
[
[
"`epsilon` = 0.4 and 0.5 both work for this particular dataset.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
e79dfe610d37553675267bf9343dad0c4b0e0c58 | 2,141 | ipynb | Jupyter Notebook | note/deploy/git.ipynb | MattiasDC/mercs | 466962e254c4f56f4a16a31b1a3d7bd893c8e23e | [
"MIT"
] | 11 | 2020-01-28T16:15:53.000Z | 2021-05-20T08:05:42.000Z | note/deploy/git.ipynb | MattiasDC/mercs | 466962e254c4f56f4a16a31b1a3d7bd893c8e23e | [
"MIT"
] | null | null | null | note/deploy/git.ipynb | MattiasDC/mercs | 466962e254c4f56f4a16a31b1a3d7bd893c8e23e | [
"MIT"
] | 4 | 2020-02-06T09:02:28.000Z | 2022-02-14T09:42:04.000Z | 20.198113 | 312 | 0.546007 | [
[
[
"# GIT\n\nDocumented shell script which makes explicit all the hooks I desire before doing a commit.",
"_____no_output_____"
],
[
"# HOOK: Documentation\n\nCrucial for reproducibility and therefore our first hook.",
"_____no_output_____"
],
[
"# HOOK: Tests\n\nCI happens on github. I do not like to spend time on writing tests (nobody does), but I do most of my dev-work in notebooks anyway. I preserve those notebooks, which at the end of a development session are supposed to work, as tests. In this way the work done lives on as a test of the future codebase. \n\nWhen a notebook becomes obsolete, I am forced to delete or adapt, which I think is a good thing.",
"_____no_output_____"
],
[
"# Actual Git actions",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
e79e341b14a5cefcafe39deb02ce22724ca12d64 | 3,635 | ipynb | Jupyter Notebook | 1b_step_functions_sagemaker/sagemaker-custom/0_custom_train/lab/wip/.ipynb_checkpoints/1_training-container-add-tests-checkpoint.ipynb | marcelokscunha/amazon-sagemaker-mlops-workshop | 855ee8d2446b0c3b061619f7677a62273ae29843 | [
"MIT-0"
] | 10 | 2020-05-05T12:44:34.000Z | 2021-04-14T15:17:43.000Z | 1b_step_functions_sagemaker/sagemaker-custom/0_custom_train/lab/wip/.ipynb_checkpoints/1_training-container-add-tests-checkpoint.ipynb | marcelokscunha/amazon-sagemaker-mlops-workshop | 855ee8d2446b0c3b061619f7677a62273ae29843 | [
"MIT-0"
] | null | null | null | 1b_step_functions_sagemaker/sagemaker-custom/0_custom_train/lab/wip/.ipynb_checkpoints/1_training-container-add-tests-checkpoint.ipynb | marcelokscunha/amazon-sagemaker-mlops-workshop | 855ee8d2446b0c3b061619f7677a62273ae29843 | [
"MIT-0"
] | 3 | 2020-04-24T17:01:51.000Z | 2020-05-08T14:52:55.000Z | 34.951923 | 1,188 | 0.571939 | [
[
[
"<div id=\"container_build\">\n<h2>3. Build and push the container</h2>\n</div>\n (. . .)",
"_____no_output_____"
],
[
"[Go to ECR in the AWS console](https://console.aws.amazon.com/ecr/home?region=us-east-1) and check if our new repository called was created and the image was pushed to it.\n\n",
"_____no_output_____"
],
[
"----\n**TODO Add local test**\n\n```\nSM_TRAINING_ENV={\"additional_framework_parameters\":{},\"channel_input_dirs\":{\"train\":\"/opt/ml/input/data/train\",\"validation\":\"/opt/ml/input/data/validation\"},\"current_host\":\"algo-1-5onks\",\"framework_module\":\"custom_lightgbm_framework.training:main\",\"hosts\":[\"algo-1-5onks\"],\"hyperparameters\":{},\"input_config_dir\":\"/opt/ml/input/config\",\"input_data_config\":{\"train\":{\"ContentType\":\"text/csv\",\"TrainingInputMode\":\"File\"},\"validation\":{\"ContentType\":\"text/csv\",\"TrainingInputMode\":\"File\"}},\"input_dir\":\"/opt/ml/input\",\"is_master\":true,\"job_name\":\"sagemaker-custom-2020-08-02-03-57-03-742\",\"log_level\":20,\"master_hostname\":\"algo-1-5onks\",\"model_dir\":\"/opt/ml/model\",\"module_dir\":\"s3://sagemaker-us-east-1-725879053979/sagemaker-custom/code/sourcedir.tar.gz\",\"module_name\":\"train\",\"network_interface_name\":\"eth0\",\"num_cpus\":4,\"num_gpus\":0,\"output_data_dir\":\"/opt/ml/output/data\",\"output_dir\":\"/opt/ml/output\",\"output_intermediate_dir\":\"/opt/ml/output/intermediate\",\"resource_config\":{\"current_host\":\"algo-1-5onks\",\"hosts\":[\"algo-1-5onks\"]},\"user_entry_point\":\"train.py\"}\n```\n\nLook:\n\nhttps://github.com/aws/sagemaker-training-toolkit/blob/74722fab9c9a9138b350df2cf54a204e2ad790c4/src/sagemaker_training/environment.py#L311\n\n```\n!sudo rm -rf train_tests && mkdir -p train_tests\nwith open(\"train_tests/vars.env\", \"w\") as f:\n f.write(\"AWS_ACCOUNT_ID=%s\\n\" % account_id)\n f.write(\"IMAGE_TAG=%s\\n\" % image_tag)\n f.write(\"AWS_DEFAULT_REGION=%s\\n\" % region)\n \n (...)\n \n f.close()\n\n!cat tests/vars.env\n\n```\n\nPass env vars to docker:\n\nhttps://docs.docker.com/engine/reference/commandline/run/#set-environment-variables--e---env---env-file\n\n!docker run --env-file vars.env \\<IMG> train\n\n!docker run --env-file vars.env sagemaker-training-containers/framework-container:latest train ",
"_____no_output_____"
],
[
"<div id=\"testing\">\n<h2>4. Training with Amazon SageMaker</h2>\n</div>\n\n(. . .)",
"_____no_output_____"
],
[
"**TODO trigger Docker image build in CodePipeline**\n\nUse CodeBuild local",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
e79e3d6477fce74a0396c0b12a2eb29a9aaa341a | 24,638 | ipynb | Jupyter Notebook | sklearn-RandomForestClassifier.ipynb | samirma/YaStockRnn | 89145b0a33de1161dc963f68c45e44c298c9dcd4 | [
"MIT"
] | null | null | null | sklearn-RandomForestClassifier.ipynb | samirma/YaStockRnn | 89145b0a33de1161dc963f68c45e44c298c9dcd4 | [
"MIT"
] | null | null | null | sklearn-RandomForestClassifier.ipynb | samirma/YaStockRnn | 89145b0a33de1161dc963f68c45e44c298c9dcd4 | [
"MIT"
] | null | null | null | 56.509174 | 13,400 | 0.759802 | [
[
[
"import pandas as pd\nimport data_util\nfrom tqdm.notebook import tqdm\n#from tqdm import tqdm_notebook as tqdm\nfrom data_generator import DataGenerator\nfrom state_util import StateUtil\nfrom tec_an import TecAn\nimport numpy as np\nfrom data_util import *\nfrom sklearn_model_hyper import *\n\n\nimport numpy as np\nimport math\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport scipy.stats as scs\nimport scikitplot as skplt\n\nfrom tensorflow.keras.layers import InputLayer, BatchNormalization, GlobalMaxPool1D, Bidirectional, Dense, Flatten, Conv2D, LeakyReLU, Dropout, LSTM, GRU, Input\nfrom tensorflow.keras import Model, Sequential\nfrom tensorflow.keras import datasets, layers, models\nfrom tensorflow.keras import regularizers\nfrom keras.wrappers.scikit_learn import KerasClassifier\nfrom imblearn.over_sampling import RandomOverSampler\nimport tensorflow as tf\nfrom keras.models import Sequential\nfrom keras.layers import Dense\nimport tensorflow.keras as keras\nimport random\nfrom catboost import CatBoost\nfrom sklearn.ensemble import RandomForestClassifier\n\n\n\nfrom sklearn.metrics import confusion_matrix\nfrom sklearn.model_selection import GridSearchCV\nfrom sklearn.metrics import accuracy_score, f1_score, precision_score\n",
"_____no_output_____"
],
[
"import pandas as pd\nimport data_util\nfrom tqdm import tqdm_notebook as tqdm\nfrom data_generator import DataGenerator\nfrom state_util import StateUtil\nfrom tec_an import TecAn\nimport numpy as np\n\n",
"_____no_output_____"
],
[
"path = \"./data/\"\ntrainX_raw, trainY_raw = load_data(\"simple_full_\", \"train\", path)\nvalX_raw, valY_raw = load_data(\"backtest\", \"train\", path)\n\ntrainX_balanced, trainY_balanced = get_balanced_set(trainX_raw, trainY_raw)\n\nX_train, Y_train = trainX_balanced, trainY_balanced\n\nvalX, valY = valX_raw, valY_raw\n\nfeatures = trainX_raw.shape[-1]\n\nprint(\"{}\".format(trainX_raw.shape))\n",
"(4586, 16)\n"
],
[
"%%time\nfrom sklearn.pipeline import make_pipeline\nfrom sklearn.preprocessing import StandardScaler\n\nparams = {\n 'random_state' : [42],\n 'n_estimators': [10, 30, 500, 1000, 2500],\n #'max_features': [1,4,5,6,7,8, 10, 30, 500],\n 'max_depth' : [4,5,6,7,8, 10, 50, 100],\n 'criterion' :['gini', 'entropy']\n}\n\n\nclfR = RandomForestClassifier()\n\nclf_grid = gridSearch(X_train, Y_train, clfR, params, make_scorer(accuracy_score))\n\n#5501154734411087",
"{'criterion': 'entropy', 'max_depth': 100, 'n_estimators': 30, 'random_state': 42}\n0.5782909930715935\nCPU times: user 58min 15s, sys: 20.6 s, total: 58min 35s\nWall time: 58min 38s\n"
],
[
"print(clf_grid.best_params_)",
"{'criterion': 'entropy', 'max_depth': 100, 'n_estimators': 30, 'random_state': 42}\n"
],
[
"\nmodel = make_pipeline(StandardScaler(), RandomForestClassifier(**clf_grid.best_params_))\n\nmodel.fit(X_train, Y_train)\n\neval_data(model, X_train, Y_train)",
" precision recall f1-score support\n\n class 0 1.00 1.00 1.00 2165\n class 1 1.00 1.00 1.00 2165\n\n accuracy 1.00 4330\n macro avg 1.00 1.00 1.00 4330\nweighted avg 1.00 1.00 1.00 4330\n\n"
],
[
"from sklearn.pipeline import Pipeline\nfrom sklearn.feature_selection import SelectFromModel\nfrom sklearn.svm import LinearSVC\n\n#model = clf_grid.best_estimator_\n\nclf = Pipeline([\n ('feature_selection', SelectFromModel(\n LinearSVC()\n )\n ),\n ('classification', RandomForestClassifier())\n])\n\nclf.fit(X_train, Y_train)\n\neval_data(clf, X_train, Y_train)",
" precision recall f1-score support\n\n class 0 1.00 1.00 1.00 2165\n class 1 1.00 1.00 1.00 2165\n\n accuracy 1.00 4330\n macro avg 1.00 1.00 1.00 4330\nweighted avg 1.00 1.00 1.00 4330\n\n"
],
[
"#val_x_norm = normalizer(valX).numpy()\n\neval_data(model, valX, valY)\n\n#0.4957874270900843",
" precision recall f1-score support\n\n class 0 0.50 0.62 0.55 924\n class 1 0.48 0.36 0.41 901\n\n accuracy 0.49 1825\n macro avg 0.49 0.49 0.48 1825\nweighted avg 0.49 0.49 0.48 1825\n\n"
],
[
"from joblib import dump, load\ndump(model, 'model/RandomForestClassifier_accuracy_score') ",
"_____no_output_____"
],
[
"valX_raw, valY_raw = load_data(\"backtest\", \"train\", path)\nvalX_raw, valY_raw = load_data(\"\", \"val\", path)\n\nprint(\"{}\".format(valX_raw.shape))\n\neval_data(model, valX_raw, valY_raw)\n",
"(459, 16)\n precision recall f1-score support\n\n class 0 0.50 0.58 0.54 219\n class 1 0.55 0.47 0.50 240\n\n accuracy 0.52 459\n macro avg 0.52 0.52 0.52 459\nweighted avg 0.52 0.52 0.52 459\n\n"
],
[
"import time\nimport numpy as np\n\nforest = clf_grid.best_estimator_\n\nstart_time = time.time()\nimportances = forest.feature_importances_\nstd = np.std([\n tree.feature_importances_ for tree in forest.estimators_], axis=0)\nelapsed_time = time.time() - start_time\n\nprint(f\"Elapsed time to compute the importances: \"\n f\"{elapsed_time:.3f} seconds\")",
"Elapsed time to compute the importances: 0.008 seconds\n"
],
[
"import pandas as pd\n\nfeature_names = ['F{}'.format(i) for i in range(features)]\n\nforest_importances = pd.Series(importances, index=feature_names)\n\nfig, ax = plt.subplots()\nforest_importances.plot.bar(yerr=std, ax=ax)\nax.set_title(\"Feature importances using MDI\")\nax.set_ylabel(\"Mean decrease in impurity\")\nfig.tight_layout()",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79e4875903a97f29525a1d7f3cd35b80ccabb05 | 246,068 | ipynb | Jupyter Notebook | weekly_challenges/challenge_set_1_heintz.ipynb | athena15/metis | bc7e6a1c1d1ebc463dff4ae79fae310b4ed527de | [
"MIT"
] | null | null | null | weekly_challenges/challenge_set_1_heintz.ipynb | athena15/metis | bc7e6a1c1d1ebc463dff4ae79fae310b4ed527de | [
"MIT"
] | null | null | null | weekly_challenges/challenge_set_1_heintz.ipynb | athena15/metis | bc7e6a1c1d1ebc463dff4ae79fae310b4ed527de | [
"MIT"
] | null | null | null | 40.076221 | 208 | 0.444674 | [
[
[
"Topic: Challenge Set 1 (MTA Subway Turnstile Data - \n\nSubject: Explore MTA turnstile data\n\nDate: 09/29/2018\n\nName: Brenner Heintz\n",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport numpy as np\nimport random\nimport itertools\nimport calendar\nimport datetime as dt\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nimport matplotlib.dates as mdates\n%matplotlib inline",
"_____no_output_____"
],
[
"%xmode",
"Exception reporting mode: Verbose\n"
],
[
"import matplotlib.style as style\nstyle.use('fivethirtyeight')",
"_____no_output_____"
],
[
"sns.set_context('notebook', font_scale=1.2)",
"_____no_output_____"
],
[
"%config InlineBackend.figure_format = 'svg'",
"_____no_output_____"
]
],
[
[
"Downloaded data from: http://web.mta.info/developers/turnstile.html\n\nDownloaded 3 weeks of data:\nSaturday, September 22, 2018\nSaturday, September 15, 2018\nSaturday, September 08, 2018\n\nDocumentation at: http://web.mta.info/developers/resources/nyct/turnstile/ts_Field_Description.txt\n\nMap of the MTA system: http://web.mta.info/maps/submap.html",
"_____no_output_____"
],
[
"**Challenge 1**\n\nOpen up a new IPython notebook\n\nDownload a few MTA turnstile data files \n\nOpen up a file, use csv reader to read it and ensure there is a column for each feature (C/A, UNIT, SCP, STATION). These are the first four columns. ",
"_____no_output_____"
]
],
[
[
"df1 = pd.read_csv('http://web.mta.info/developers/data/nyct/turnstile/turnstile_180908.txt')",
"_____no_output_____"
],
[
"df2 = pd.read_csv('http://web.mta.info/developers/data/nyct/turnstile/turnstile_180915.txt')",
"_____no_output_____"
],
[
"df3 = pd.read_csv('http://web.mta.info/developers/data/nyct/turnstile/turnstile_180922.txt')",
"_____no_output_____"
],
[
"frames = [df1, df2, df3]\ndf = pd.concat(frames)",
"_____no_output_____"
],
[
"df.info(2)",
"<class 'pandas.core.frame.DataFrame'>\nInt64Index: 592752 entries, 0 to 199101\nData columns (total 11 columns):\nC/A 592752 non-null object\nUNIT 592752 non-null object\nSCP 592752 non-null object\nSTATION 592752 non-null object\nLINENAME 592752 non-null object\nDIVISION 592752 non-null object\nDATE 592752 non-null object\nTIME 592752 non-null object\nDESC 592752 non-null object\nENTRIES 592752 non-null int64\nEXITS 592752 non-null int64\ndtypes: int64(2), object(9)\nmemory usage: 54.3+ MB\n"
],
[
"df.head(2)",
"_____no_output_____"
]
],
[
[
"**Challenge 2**\n\n\"Let's turn this into a time series. Create a new column that specifies the date and time of each entry.\"",
"_____no_output_____"
]
],
[
[
"df['DATE'] = pd.to_datetime(df['DATE'], format='%m/%d/%Y')\n# df['DATETIME'] = pd.to_datetime(df.DATE + ' ' + df.TIME, format='%m/%d/%Y')",
"_____no_output_____"
]
],
[
[
"**Challenge 3**\n\nThese counts are for every n hours. (What is n?) We want total daily entries.",
"_____no_output_____"
]
],
[
[
"df['STATION_KEY'] = df['C/A'] + ' ' + df['UNIT'] + ' ' + df['STATION']",
"_____no_output_____"
],
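[
"# Quick check (added sketch) for the 'what is n?' question: count readings per\n# turnstile per day -- MTA turnstile audits are typically logged every 4 hours,\n# i.e. around 6 readings per day.\ndf.groupby(['STATION_KEY', 'SCP', 'DATE'])['TIME'].count().value_counts().head()",
"_____no_output_____"
],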
[
"df['EXITS'] = df['EXITS ']\ndf.drop('EXITS ', axis=1, inplace=True)",
"_____no_output_____"
],
[
"# Reset index because index was duplicated on all 3 original dataframes\ndf.reset_index(inplace=True)",
"_____no_output_____"
],
[
"df['ENTRY_DIFFS'] = df.groupby(['STATION_KEY','SCP'])['ENTRIES'].diff(periods=-1)*-1",
"_____no_output_____"
],
[
"df['EXIT_DIFFS'] = df.groupby(['STATION_KEY','SCP'])['EXITS'].diff(periods=-1)*-1",
"_____no_output_____"
],
[
"df['TOTAL'] = df['ENTRY_DIFFS'] + df['EXIT_DIFFS']",
"_____no_output_____"
],
[
"df = df[(df['ENTRY_DIFFS'] < 2E5) \n & (df['ENTRY_DIFFS'] > 0) \n & (df['EXIT_DIFFS'] < 2E5)\n & (df['EXIT_DIFFS'] > 0)]",
"_____no_output_____"
],
[
"df.head(1)",
"_____no_output_____"
],
[
"df.groupby(['STATION_KEY', 'SCP','DATE'])['ENTRY_DIFFS'].sum()",
"_____no_output_____"
]
],
[
[
"**Challenge 4**\n\nNow plot the daily time series for a turnstile.",
"_____no_output_____"
]
],
[
[
"x = df.groupby(['STATION_KEY', 'SCP','DATE'])['ENTRY_DIFFS'].sum()",
"_____no_output_____"
],
[
"x = pd.DataFrame(x)",
"_____no_output_____"
],
[
"x.reset_index(inplace=True)",
"_____no_output_____"
],
[
"x.head(2)",
"_____no_output_____"
],
[
"x_values = x[(x['SCP']=='02-00-00') & (x['STATION_KEY']=='A002 R051 59 ST')]['DATE']",
"_____no_output_____"
],
[
"y_values = x[(x['SCP']=='02-00-00') & (x['STATION_KEY']=='A002 R051 59 ST')]['ENTRY_DIFFS']",
"_____no_output_____"
],
[
"y_values = y_values.astype(int)",
"_____no_output_____"
],
[
"fig, ax = plt.subplots()\n\nfig.set_size_inches(8,4)\n\nfig.autofmt_xdate()\nax.xaxis.set_major_locator(mdates.WeekdayLocator())\nax.xaxis.set_major_formatter(mdates.DateFormatter('%b. %d'))\n\nax.set_xlabel('Date')\nax.set_ylabel('Daily Traffic')\nax.set_title('Daily Turnstile Entries')\n\nplt.tight_layout()\n\nplt.plot(x_values,y_values, linewidth=1.5, color='r')",
"_____no_output_____"
]
],
[
[
"**Challenge 5**\n\n\nWe want to combine the numbers together -- for each ControlArea/UNIT/STATION combo, for each day, add the counts from each turnstile belonging to that combo.",
"_____no_output_____"
]
],
[
[
"df.head(3)",
"_____no_output_____"
],
[
"df['UNIT'].nunique()",
"_____no_output_____"
],
[
"df.groupby(['C/A', 'UNIT', 'SCP', 'DATE']).sum()",
"_____no_output_____"
]
],
[
[
"**Challenge 6**\n\nSimilarly, combine everything in each station, and come up with a time series of [(date1, count1),(date2,count2),...] type of time series for each STATION, by adding up all the turnstiles in a station.",
"_____no_output_____"
]
],
[
[
"station_df = df.groupby(['STATION', 'DATE']).sum()",
"_____no_output_____"
],
[
"station_df.reset_index(inplace=True)",
"_____no_output_____"
]
],
[
[
"**Challenge 7**\n\nPlot the time series (either daily or your preferred level of granularity) for a station.",
"_____no_output_____"
]
],
[
[
"x_values = station_df[station_df['STATION'] == '1 AV']['DATE']",
"_____no_output_____"
],
[
"y_values = station_df[station_df['STATION'] == '1 AV']['TOTAL']\ny_values = y_values.astype(int)",
"_____no_output_____"
],
[
"fig, ax = plt.subplots()\n\nfig.set_size_inches(8,4)\n\nfig.autofmt_xdate()\nax.xaxis.set_major_locator(mdates.WeekdayLocator())\nax.xaxis.set_major_formatter(mdates.DateFormatter('%b. %d'))\n\nax.set_xlabel('Date')\nax.set_ylabel('Daily Traffic')\nax.set_title('Daily Turnstile Entries (1st Ave Station)')\n\n# plt.tight_layout()\n\ndates = mdates.date2num(x_values)\nplt.plot_date(dates, y_values, fmt='-', color='purple', linewidth=2);",
"_____no_output_____"
]
],
[
[
"**Challenge 8**\n\nSelect a station and find the total daily counts for this station. Then plot those daily counts for each week separately.\n\nTo clarify: if I have 10 weeks of data on the 28th st 6 station, I will add 10 lines to the same figure (e.g. running plt.plot(week_count_list) once for each week). Each plot will have 7 points of data.",
"_____no_output_____"
]
],
[
[
"fig, ax = plt.subplots()\n\nfig.set_size_inches(8,4)\n\nfig.autofmt_xdate()\nax.xaxis.set_major_locator(mdates.WeekdayLocator())\nax.xaxis.set_major_formatter(mdates.DateFormatter('%b. %d'))\n\nax.set_xlabel('Date')\nax.set_ylabel('Daily Entries')\nax.set_title('Daily Turnstile Entries (First Ave Station)')\n\nplt.plot(x_values[:8],y_values[:8], linewidth=1.5, color='r')\nplt.plot(x_values[8:16],y_values[8:16], linewidth=1.5, color='g')\nplt.plot(x_values[16:24],y_values[16:24], linewidth=1.5, color='b')\nplt.legend(labels=['Week 1', 'Week 2', 'Week 3'], loc='best')",
"_____no_output_____"
]
],
[
[
"**Challenge 9**\n\n\nOver multiple weeks, sum total ridership for each station and sort them, so you can find out the stations with the highest traffic during the time you investigate\n\n",
"_____no_output_____"
]
],
[
[
"total_ridership_counts = df.groupby('STATION').sum()",
"_____no_output_____"
],
[
"total_ridership_counts.reset_index(inplace=True)",
"_____no_output_____"
],
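[
"# Added sketch: Challenge 9 asks for the *sorted* totals -- sort descending on TOTAL\n# to surface the highest-traffic stations.\ntotal_ridership_counts.sort_values('TOTAL', ascending=False).head(10)",
"_____no_output_____"
],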
[
"total_ridership_counts.head(3)",
"_____no_output_____"
]
],
[
[
"**Challenge 10**\n\nMake a single list of these total ridership values and plot it with\n\nplt.hist(total_ridership_counts)",
"_____no_output_____"
]
],
[
[
"y_vals = total_ridership_counts['TOTAL']",
"_____no_output_____"
],
[
"fig, ax = plt.subplots()\n\nfig.set_size_inches(8,4)\n\nax.set_xlabel('Entries')\nax.set_ylabel('Number of Stations')\nax.set_title('Histogram of Total Entries')\n\nax.set_xlim(0,3000000)\n\nplt.ticklabel_format(style='plain', axis='x')\nplt.hist(y_vals, bins=30);",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
e79e55dd68e70cc90876b55f9e7c3b4a8d099471 | 64,079 | ipynb | Jupyter Notebook | notebooks/Genome comparison/55. Execute Sequence Comparison to Generate Homology Matrix -new.ipynb | biosustain/p-thermo | d60550cbc4897e1a3f9caaabd45b6a808244ebd1 | [
"Apache-2.0"
] | null | null | null | notebooks/Genome comparison/55. Execute Sequence Comparison to Generate Homology Matrix -new.ipynb | biosustain/p-thermo | d60550cbc4897e1a3f9caaabd45b6a808244ebd1 | [
"Apache-2.0"
] | null | null | null | notebooks/Genome comparison/55. Execute Sequence Comparison to Generate Homology Matrix -new.ipynb | biosustain/p-thermo | d60550cbc4897e1a3f9caaabd45b6a808244ebd1 | [
"Apache-2.0"
] | null | null | null | 29.407526 | 503 | 0.530845 | [
[
[
"# Notebook 1: Homology matrix generation from genome sequences\n\nIn this notebook, I will be applying the notebooks accompanying the paper by Norsigian et al., 2020. (doi:10.1038/s41596-019-0254-3.) I will apply this for the P. thermo model we've been working on and the M10EXG strain that we used to validate our model with.\n\nThis is the the first notebook in the tutorial to create homology matrix from genome sequences.There are four major steps in this notebook\n1. Download the genome annotation (GenBank files) from NCBI, and generate fasta files (protein &nucleotide) from them\n2. Perform BLASTp to find homologous proteins in strains of interest\n3. Use best bidirectional hits to create gene presence/absence matrix\n4. Supplementary for best practice: use BLASTn to check if we have missed any unannotated open reading frames and retain these genes in orthology matrix as well as guide future manual curation",
"_____no_output_____"
]
],
[
[
"#import packages needed\nimport pandas as pd\nfrom glob import glob",
"_____no_output_____"
],
[
"from Bio import Entrez, SeqIO",
"_____no_output_____"
],
[
"import sys",
"_____no_output_____"
],
[
"import cobra",
"_____no_output_____"
],
[
"import decimal",
"_____no_output_____"
]
],
[
[
"__NOTE__ to be able to import the Entrez and SeqIO, I need to change the folder name from 'bio' to 'Bio' and then it'll work. \n\nC:\\Users\\vivmol\\AppData\\Local\\Continuum\\anaconda3\\envs\\g-thermo\\Lib\\site-packages\n\nSo be careful whenever i install Biopython again that this needs to be fixed.",
"_____no_output_____"
],
[
"Here I will be working with strains in the faculative anaerobic clade of the genus. I will also add genomes that are obligate aerobes to see if that could highlight to us what changed between these species that made them become obligate aerobes. ",
"_____no_output_____"
]
],
[
[
"# Load the information on the five strains we will be working with in this tutorial\nStrainsOfInterest=pd.read_excel('Strain Information.xlsx')\nStrainsOfInterest",
"_____no_output_____"
],
[
"#The Reference Genome is as Described in the Base Reconstruction; here the reference is \nreferenceStrainID='NCIMB11955'\ntargetStrainIDs=list(StrainsOfInterest['NCBI ID'])",
"_____no_output_____"
]
],
[
[
"## 1. Download genome annotations (GenBank files) to generate fasta files ",
"_____no_output_____"
],
[
"### Dowload genomes from NCBI\nDownload the genome annotations (GenBank files) from NCBI for strains of interest. ",
"_____no_output_____"
]
],
[
[
"# define a function to download the annotated genebank files from NCBI\ndef dl_genome(id, folder='genomes'): # be sure get CORRECT ID\n files=glob('%s/*.gb'%folder)\n out_file = '%s/%s.gb'%(folder, id)\n\n if out_file in files:\n print (out_file, 'already downloaded')\n return\n else:\n print ('downloading %s from NCBI'%id)\n \n from Bio import Entrez\n Entrez.email = \"[email protected]\" #Insert email here for NCBI\n handle = Entrez.efetch(db=\"nucleotide\", id=id, rettype=\"gb\", retmode=\"text\")\n fout = open(out_file,'w')\n fout.write(handle.read())\n fout.close()",
"_____no_output_____"
],
[
"# execute the above function, and download the GenBank files for 8 P. thermo strains\nfor strain in targetStrainIDs:\n dl_genome(strain, folder='genomes')",
"downloading CP016552.1 from NCBI\ndownloading CP020030.1 from NCBI\ndownloading CP012712.1 from NCBI\ndownloading CP002835.1 from NCBI\ndownloading CP001638 from NCBI\ndownloading CP008903.1 from NCBI\n"
],
[
"#also download the reference strain info\ndl_genome(referenceStrainID, folder='genomes')",
"downloading CP016622.1 from NCBI\n"
]
],
[
[
"### Examine the Downloaded Strains",
"_____no_output_____"
]
],
[
[
"# define a function to gather information of the downloaded strains from the GenBank files\ndef get_strain_info(folder='genomes'):\n files = glob('%s/*.gb'%folder)\n strain_info = []\n \n for file in files:\n handle = open(file)\n record = SeqIO.read(handle, \"genbank\")\n \n for f in record.features:\n if f.type=='source':\n info = {}\n info['file'] = file\n info['id'] = file.split('\\\\')[-1].split('.')[0]\n for q in f.qualifiers.keys():\n info[q] = '|'.join(f.qualifiers[q])\n strain_info.append(info)\n return pd.DataFrame(strain_info)",
"_____no_output_____"
],
[
"# information on the downloaded strain\nget_strain_info(folder='genomes')",
"_____no_output_____"
]
],
[
[
"### Generate FASTA files for both Protein and Nucleotide Pipelines\nFrom the GenBank file, we can extract sequence and annoation information to generate fasta files for the protein and nucleotide analyses. The resulting fasta files will then be used in step 2 as input for BLAST ",
"_____no_output_____"
]
],
[
[
"# define a function to parse the Genbank file to generate fasta files for both protein and nucleotide sequences\ndef parse_genome(id, type='prot', in_folder='genomes', out_folder='prots', overwrite=1):\n\n in_file = '%s/%s.gb'%(in_folder, id)\n out_file='%s/%s.fa'%(out_folder, id)\n files =glob('%s/*.fa'%out_folder)\n \n if out_file in files and overwrite==0:\n print (out_file, 'already parsed')\n return\n else:\n print ('parsing %s'%id)\n \n handle = open(in_file)\n \n fout = open(out_file,'w')\n x = 0\n \n records = SeqIO.parse(handle, \"genbank\")\n for record in records:\n for f in record.features:\n if f.type=='CDS':\n seq=f.extract(record.seq)\n \n if type=='nucl':\n seq=str(seq)\n else:\n seq=str(seq.translate())\n \n if 'locus_tag' in f.qualifiers.keys():\n locus = f.qualifiers['locus_tag'][0]\n elif 'gene' in f.qualifiers.keys():\n locus = f.qualifiers['gene'][0]\n else:\n locus = 'gene_%i'%x\n x+=1\n fout.write('>%s\\n%s\\n'%(locus, seq))\n fout.close()",
"_____no_output_____"
],
[
"# Generate fasta files for 5 strains of interest\nfor strain in targetStrainIDs:\n parse_genome(strain, type='prot', in_folder='genomes', out_folder='prots')\n parse_genome(strain, type='nucl', in_folder='genomes', out_folder='nucl')\n",
"parsing 2501416905\nparsing 2501416905\n"
],
[
"#Also generate fasta files for the reference strain\nparse_genome(referenceStrainID, type='nucl', in_folder='genomes', out_folder='nucl')\nparse_genome(referenceStrainID, type='prots', in_folder='genomes', out_folder='prots')",
"parsing NCIMB11955\nparsing NCIMB11955\n"
]
],
[
[
"## 2. Perform BLAST to find homologous proteins in strains of interest",
"_____no_output_____"
],
[
"### Make BLAST DB for each of the target strains for both Protein and Nucleotide Pipelines\n\nIn this tutorial, we will run both BLASTp for proteins and BLSATn for nucleotides. BLASTp will be used as the main approach to identify homologous proteins in reference strain and other strains of interest, while BLASTn will be used as a supplementary method to check for any unannotated genes",
"_____no_output_____"
]
],
[
[
"# Define a function to make blast database for either protein of nucleotide\ndef make_blast_db(id,folder='prots',db_type='prot'):\n import os\n \n out_file ='%s/%s.fa.pin'%(folder, id)\n files =glob('%s/*.fa.pin'%folder)\n \n if out_file in files:\n print (id, 'already has a blast db')\n return\n if db_type=='nucl':\n ext='fna'\n else:\n ext='fa'\n\n cmd_line='makeblastdb -in %s/%s.%s -dbtype %s' %(folder, id, ext, db_type)\n \n print ('making blast db with following command line...')\n print (cmd_line)\n os.system(cmd_line)",
"_____no_output_____"
],
[
"sys.path.append('..\\\\..\\\\..\\\\..\\\\..\\\\..\\\\Program Files\\\\NCBI\\\\blast-2.10.1+\\\\bin')",
"_____no_output_____"
],
[
"# make protein sequence databases \n# Because we are performing bi-directional blast, we make databases from both reference strain and strains of interest\nfor strain in targetStrainIDs:\n make_blast_db(strain,folder='prots',db_type='prot')\nmake_blast_db(referenceStrainID,folder='prots',db_type='prot')",
"making blast db with following command line...\nmakeblastdb -in prots/2501416905.fa -dbtype prot\nmaking blast db with following command line...\nmakeblastdb -in prots/NCIMB11955.fa -dbtype prot\n"
]
],
[
[
"### Define functions to run protein BLAST and get sequence lengths\n- BLASTp will be the main approach used here to identify homologous proteins between strains \n- Aside from sequence similarity, we also want to ensure the coverage of sequence mapping is sufficient. Therefore, we need to identiy the sequence length for each protein and compare it with the alignment length.",
"_____no_output_____"
]
],
[
[
"# define a function to run BLASTp\ndef run_blastp(seq,db,in_folder='prots', out_folder='bbh', out=None,outfmt=6,evalue=0.001,threads=1):\n import os\n if out==None:\n out='%s/%s_vs_%s.txt'%(out_folder, seq, db)\n print(out)\n \n files =glob('%s/*.txt'%out_folder)\n if out in files:\n print (seq, 'already blasted')\n return\n \n print ('blasting %s vs %s'%(seq, db))\n \n db = '%s/%s.fa'%(in_folder, db)\n seq = '%s/%s.fa'%(in_folder, seq)\n cmd_line='blastp -db %s -query %s -out %s -evalue %s -outfmt %s -num_threads %i' \\\n %(db, seq, out, evalue, outfmt, threads)\n \n print ('running blastp with following command line...')\n print (cmd_line)\n os.system(cmd_line)\n return out",
"_____no_output_____"
],
[
"# define a function to get sequence length \n\ndef get_gene_lens(query, in_folder='prots'):\n\n file = '%s/%s.fa'%(in_folder, query)\n handle = open(file)\n records = SeqIO.parse(handle, \"fasta\")\n out = []\n \n for record in records:\n out.append({'gene':record.name, 'gene_length':len(record.seq)})\n \n out = pd.DataFrame(out)\n return out",
"_____no_output_____"
]
],
[
[
"## 3. Use Bi-Directional BLASTp Best Hits to create gene presence/absence matrix",
"_____no_output_____"
],
[
"### Obtain Bi-Directional BLASTp Best Hits\n\nFrom the above BLASTp results, we can obtain Bi-Directional BLASTp Best Hits to identify homologous proteins. Note beside gene similarity score, the coverage of alignment is also used to filter mapping results. ",
"_____no_output_____"
]
],
[
[
"# define a function to get Bi-Directional BLASTp Best Hits\ndef get_bbh(query, subject, in_folder='bbh'): \n \n #Utilize the defined protein BLAST function\n run_blastp(query, subject)\n run_blastp(subject, query)\n \n query_lengths = get_gene_lens(query, in_folder='prots')\n subject_lengths = get_gene_lens(subject, in_folder='prots')\n \n #Define the output file of this BLAST\n out_file = '%s/%s_vs_%s_parsed.csv'%(in_folder,query, subject)\n files=glob('%s/*_parsed.csv'%in_folder)\n \n #Combine the results of the protein BLAST into a dataframe\n print ('parsing BBHs for', query, subject)\n cols = ['gene', 'subject', 'PID', 'alnLength', 'mismatchCount', 'gapOpenCount', 'queryStart', 'queryEnd', 'subjectStart', 'subjectEnd', 'eVal', 'bitScore']\n bbh=pd.read_csv('%s/%s_vs_%s.txt'%(in_folder,query, subject), sep='\\t', names=cols)\n bbh = pd.merge(bbh, query_lengths) \n bbh['COV'] = bbh['alnLength']/bbh['gene_length']\n \n bbh2=pd.read_csv('%s/%s_vs_%s.txt'%(in_folder,subject, query), sep='\\t', names=cols)\n bbh2 = pd.merge(bbh2, subject_lengths) \n bbh2['COV'] = bbh2['alnLength']/bbh2['gene_length']\n out = pd.DataFrame()\n \n # Filter the genes based on coverage\n bbh = bbh[bbh.COV>=0.25]\n bbh2 = bbh2[bbh2.COV>=0.25]\n \n #Delineate the best hits from the BLAST\n for g in bbh.gene.unique():\n res = bbh[bbh.gene==g]\n if len(res)==0:\n continue\n best_hit = res.loc[res.PID.idxmax()]\n best_gene = best_hit.subject\n res2 = bbh2[bbh2.gene==best_gene]\n if len(res2)==0:\n continue\n best_hit2 = res2.loc[res2.PID.idxmax()]\n best_gene2 = best_hit2.subject\n if g==best_gene2:\n best_hit['BBH'] = '<=>'\n else:\n best_hit['BBH'] = '->'\n out=pd.concat([out, pd.DataFrame(best_hit).transpose()])\n \n #Save the final file to a designated CSV file \n out.to_csv(out_file)",
"_____no_output_____"
],
[
"# Execute the BLAST for each target strain against the reference strain, save results to 'bbh' i.e. \"bidirectional best\n# hits\" folder to create\n# homology matrix\n\nfor strain in targetStrainIDs:\n get_bbh(referenceStrainID,strain, in_folder='bbh')",
"bbh/NCIMB11955_vs_2501416905.txt\nblasting NCIMB11955 vs 2501416905\nrunning blastp with following command line...\nblastp -db prots/2501416905.fa -query prots/NCIMB11955.fa -out bbh/NCIMB11955_vs_2501416905.txt -evalue 0.001 -outfmt 6 -num_threads 1\nbbh/2501416905_vs_NCIMB11955.txt\nblasting 2501416905 vs NCIMB11955\nrunning blastp with following command line...\nblastp -db prots/NCIMB11955.fa -query prots/2501416905.fa -out bbh/2501416905_vs_NCIMB11955.txt -evalue 0.001 -outfmt 6 -num_threads 1\nparsing BBHs for NCIMB11955 2501416905\n"
]
],
[
[
"### Parse the BLAST Results into one Homology Matrix of the Reconstruction Genes\n\nFor the homology matrix, want to find, for each gene in the reference annotation, is there one in the other strains. And then later filter this down to metabolic genes. ",
"_____no_output_____"
]
],
[
[
"#Load all the BLAST files between the reference strain and target strains\n\nblast_files=glob('%s/*_parsed.csv'%'bbh')\n\nfor blast in blast_files:\n bbh=pd.read_csv(blast)\n print (blast,bbh.shape) ",
"bbh\\NCIMB11955_vs_2501416905_parsed.csv (3520, 16)\n"
]
],
[
[
"In this section of the notebook, I will deviate from the published tutorial. In the tutorial, they map the orthologous genes onto the curated model of the reference genome. In reality, we are curious as to how homologous the genomes are to one another, and how many metabolic genes the different strains have in common. So we need to compare the orthologues to the reference genome and not the reference model. I'll try to adapt the scripts so that we can get a homology matrix that captures this.",
"_____no_output_____"
],
[
"Make a single dataframe where one column is all the genes in the reference organism, and the other column is a different strain and contains the PID.\n\nThen you can count for each column, the number of genes which have a PID above 80% (selected threshold) and compare it to the total number of genes.",
"_____no_output_____"
]
],
[
[
"#import all the csv files\ncompare = pd.read_csv('bbh/NCIMB11955_vs_2501416905_parsed.csv')",
"_____no_output_____"
],
[
"#filter out all other columns that i won't use later\ncompare = compare[['gene', 'PID']]",
"_____no_output_____"
],
[
"#list of all ORFs found in the reference genome\nwith open('prots/NCIMB11955.fa') as fasta_file: # Will close handle cleanly\n NCIMB_ids = []\n for seq_record in SeqIO.parse(fasta_file, 'fasta'): # (generator)\n NCIMB_ids.append(seq_record.id)",
"_____no_output_____"
],
[
"comparison = pd.DataFrame({'gene': NCIMB_ids})",
"_____no_output_____"
],
[
"comparison = pd.merge(comparison, compare, on='gene',how=\"outer\")",
"_____no_output_____"
],
[
"strains = ['NCIMB11955', 'M10EXG']",
"_____no_output_____"
],
[
"comparison.columns = strains",
"_____no_output_____"
]
],
[
[
"Now that we have a dataframe that compares the PID scores for each gene in the reference genome to the other strains, we can start to 'sum up' what percentage of the genes in the reference have a matched gene in the strain. \n\nFor this, we will set a threshold of 80% sequence identity between genes to be counted as a true homologue. One can decide to change this arbitrarily set value if they like. \n\nThe reference genome has 3708 ORFs annotated to it. ",
"_____no_output_____"
]
],
[
[
"columns = list(comparison) \n \nfor i in columns: #iterate through the columns\n if i in 'NCIMB11955': # skip reference column\n continue\n else:\n #now go through each row in this column\n common_genes = []\n for index,row in comparison.iterrows():\n value = row[i]\n if value > 90: #selected threshold level\n common_genes.append(1)\n elif value < 90:\n common_genes.append(0)\n homology = sum(common_genes)\n fraction = 100*homology/3708\n print(i,': ', fraction) ",
"M10EXG : 88.9967637540453\n"
]
],
[
[
"Try the same as above, but use the E-value as the threshold instead.",
"_____no_output_____"
]
],
[
[
"#import all the csv files\ncompare = pd.read_csv('bbh/NCIMB11955_vs_2501416905_parsed.csv')",
"_____no_output_____"
],
[
"#filter out all other columns that i won't use later\ncompare_eval = compare[['gene', 'eVal']]",
"_____no_output_____"
],
[
"#make dataframe for first comparison, and then add on the rest\ncomparison_eval = pd.DataFrame({'gene': NCIMB_ids})",
"_____no_output_____"
],
[
"comparison_eval = pd.merge(comparison_eval, compare_eval, on='gene',how=\"outer\")",
"_____no_output_____"
],
[
"strains = ['NCIMB11955', 'M10EXG']",
"_____no_output_____"
],
[
"comparison_eval.columns = strains",
"_____no_output_____"
],
[
"columns = list(comparison_eval) \n \nfor i in columns: #iterate through the columns\n if i in 'NCIMB11955': # skip reference column\n continue\n else:\n #now go through each row in this column\n common_genes = []\n for index,row in comparison_eval.iterrows():\n value = row[i]\n if value < 5E-5 : #selected threshold level\n common_genes.append(1)\n elif value >5E-5:\n continue\n homology = sum(common_genes)\n fraction = (homology/3708)*100\n print(i,': ', fraction)\n ",
"M10EXG : 94.47141316073355\n"
]
],
[
[
"Now that we have done the above, we can start to look at the overlap between the strains, so that we can create a venn diagram of total ORFs that are unique to each strain but also ovelap.\n",
"_____no_output_____"
],
[
"__Approach__\n- Import the fasta files in the prots folder: this is a list of all the ORFs in each strain. \n\n- make a list, per strain, of all genes for that strain that fit the comparison threshold (for this use Eval < 1E-10 for now) i.e. all the genes that these two strains have in common\n\n- then make an overlap of the all the lists: should give for each strain which are unique and which overlap with one another.\n",
"_____no_output_____"
]
],
[
[
"#import total strain lists, for each strain\nwith open('prots/2501416905.fa') as fasta_file: # Will close handle cleanly\n M10EXG_ids = []\n for seq_record in SeqIO.parse(fasta_file, 'fasta'): # (generator)\n M10EXG_ids.append(seq_record.id)",
"_____no_output_____"
]
],
[
[
"Next is to make a dataframe for the comparison of the reference (here NCIMB11955) to the other strain. The list should then contain the gene names of the reference organism, as well as the strain being mapped against. ",
"_____no_output_____"
]
],
[
[
"compare_eval",
"_____no_output_____"
],
[
"ol_reference = [] #the overlapping genes, with the reference strain ID\nol_strain =[] #the overlapping genes, with the reference strain ID\nfor index,row in compare.iterrows():\n value = row['eVal']\n if value > 1E-5: #selected threshold level\n continue #we don't want to save these anywhere \n elif value < 1E-5:\n ol_reference.append(row['gene'])\n ol_strain.append(row['subject'])\nNCIMB_M10EXG_OL = pd.DataFrame({'NCIMB11955': ol_reference, 'M10EXG': ol_strain})",
"_____no_output_____"
],
[
"# so NCIMB_M10EXG_OL is now a dataframe that maps each gene in the NCIMB to a gene in M10EXG",
"_____no_output_____"
],
[
"len(NCIMB_M10EXG_OL)",
"_____no_output_____"
]
],
[
[
"In the example above, you then see which 3498 genes of the target strain match the reference strain.\nWe can then see how many genes each strain has by themselves and from that make the venndiagram.",
"_____no_output_____"
]
],
[
[
"len(NCIMB_ids)",
"_____no_output_____"
]
],
[
[
"So NCIMB has 259 unique ORFs. ",
"_____no_output_____"
],
[
"In the above, we had NCIMB as the reference strain. TO find how many true unique genes there are in M10EXG, we would need to repeat the above analysis but now with M10EXG as reference strain. That will be done below.",
"_____no_output_____"
]
],
[
[
"StrainsOfInterest=pd.read_excel('Strain InformationB.xlsx')\nStrainsOfInterest",
"_____no_output_____"
],
[
"#switch reference and target here\nreferenceStrainID='2501416905'\ntargetStrainIDs=list(StrainsOfInterest['NCBI ID'])",
"_____no_output_____"
],
[
"for strain in targetStrainIDs:\n get_bbh(referenceStrainID,strain, in_folder='bbh')",
"bbh/2501416905_vs_NCIMB11955.txt\nblasting 2501416905 vs NCIMB11955\nrunning blastp with following command line...\nblastp -db prots/NCIMB11955.fa -query prots/2501416905.fa -out bbh/2501416905_vs_NCIMB11955.txt -evalue 0.001 -outfmt 6 -num_threads 1\nbbh/NCIMB11955_vs_2501416905.txt\nblasting NCIMB11955 vs 2501416905\nrunning blastp with following command line...\nblastp -db prots/2501416905.fa -query prots/NCIMB11955.fa -out bbh/NCIMB11955_vs_2501416905.txt -evalue 0.001 -outfmt 6 -num_threads 1\nparsing BBHs for 2501416905 NCIMB11955\n"
]
],
[
[
"Now import this file and make it into a dataframe so we can find the unique genes.",
"_____no_output_____"
]
],
[
[
"#import all the csv files\ncompare = pd.read_csv('bbh/2501416905_vs_NCIMB11955_parsed.csv')",
"_____no_output_____"
],
[
"#filter out all other columns that i won't use later\ncompare_eval = compare[['gene', 'eVal', 'subject']]",
"_____no_output_____"
],
[
"strains = ['M10EXG', 'eVal', 'NCIMB11955']",
"_____no_output_____"
],
[
"compare_eval.columns = strains",
"_____no_output_____"
]
],
[
[
"Now that we have done the above, we can slook at the overlap from M10EXg to the NCIMB strain. \n",
"_____no_output_____"
],
[
"__Approach__\n- make a list, per strain, of all genes for that strain that fit the comparison threshold (for this use Eval < 1E-5 for now) i.e. all the genes that these two strains have in common\n",
"_____no_output_____"
]
],
[
[
"ol_reference = [] #the overlapping genes, with the reference strain ID\nol_strain =[] #the overlapping genes, with the reference strain ID\nfor index,row in compare_eval.iterrows():\n value = row['eVal']\n if value > 1E-5: #selected threshold level\n continue #we don't want to save these anywhere \n elif value < 1E-5:\n ol_reference.append(row['M10EXG'])\n ol_strain.append(row['NCIMB11955'])\nM10EXG_NCIMB_OL = pd.DataFrame({'M10EXG': ol_reference, 'NCIMB': ol_strain})",
"_____no_output_____"
],
[
"# so M10EXG_NCIMB_OL is now a dataframe that maps each gene in the M10EXG to a gene in NCIMB11955",
"_____no_output_____"
],
[
"len(M10EXG_NCIMB_OL)",
"_____no_output_____"
]
],
[
[
"There are 5 genes less mapped as overlap, but this is within the error range.",
"_____no_output_____"
]
],
[
[
"len(M10EXG_ids)",
"_____no_output_____"
]
],
[
[
"So we would have 234 unique genes in M10EXG. ",
"_____no_output_____"
],
[
"So overall, we have 3498 genes that overlap, 259 unique in NCIMB and 234 unique in M10EXG. This very closely matches the results of the KBASE pipeline we did, which is good.",
"_____no_output_____"
],
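[
"A quick sketch (added) of the Venn diagram these counts describe; it assumes the optional `matplotlib_venn` package is installed (`pip install matplotlib-venn`):\n\n```\nfrom matplotlib_venn import venn2\nvenn2(subsets=(259, 234, 3498), set_labels=('NCIMB11955', 'M10EXG'))\nplt.title('Shared vs. unique ORFs')\n```",
"_____no_output_____"
],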
[
"## Filter for metabolic genes\nNow that we know the overlap in the total protein, we want to find out what percentage of the metabolic genes overlap between the strains. So first I will filter all the genes from the target organism into the ones that are metabolic genes. And then apply this list to the list of bi-directional hits, so see how many of the metabolic genes overlap. ",
"_____no_output_____"
],
[
"I can't get the definition to work, So I will just write a script that will make a dataframe linking each gene to a possible EC code. Then we can filter the bbh-file made previously for these metabolic genes and get the answer that way.",
"_____no_output_____"
]
],
[
[
"genes = []\nEC = []\nx = 0\nfor seq_record in SeqIO.parse(\"genomes/NCIMB11955.gb\", \"genbank\"):\n for f in seq_record.features:\n if f.type=='CDS':\n if 'locus_tag' in f.qualifiers.keys():\n locus = f.qualifiers['locus_tag'][0]\n elif 'gene' in f.qualifiers.keys():\n locus = f.qualifiers['gene'][0]\n else:\n locus = 'gene_%i'%x\n x+=1\n try:\n synonyms = f.qualifiers['gene_synonym']\n #here it will check that it has one of the gene_synonyms as ec code, i.e. is metabolic\n check = []\n for a in synonyms:\n if a[0].isdigit():\n ec = a\n check.append(1)\n else:\n continue\n if sum(check) > 0:\n \n genes.append(locus)\n EC.append(a)\n except KeyError:\n continue\n",
"_____no_output_____"
],
[
"NCIMB_met = pd.DataFrame({'Gene': genes, 'EC': EC})",
"_____no_output_____"
]
],
[
[
"Now I'll do the same for the M10EXG genome/annotation.",
"_____no_output_____"
]
],
[
[
"genes = []\nEC = []\nx = 0\nfor seq_record in SeqIO.parse(\"genomes/2501416905.gb\", \"genbank\"):\n for f in seq_record.features:\n if f.type=='CDS':\n if 'locus_tag' in f.qualifiers.keys():\n locus = f.qualifiers['locus_tag'][0]\n elif 'gene' in f.qualifiers.keys():\n locus = f.qualifiers['gene'][0]\n else:\n locus = 'gene_%i'%x\n x+=1\n try:\n synonyms = f.qualifiers['gene_synonym']\n #here it will check that it has one of the gene_synonyms as ec code, i.e. is metabolic\n check = []\n for a in synonyms:\n if a[0].isdigit():\n ec = a\n check.append(1)\n else:\n continue\n if sum(check) > 0:\n \n genes.append(locus)\n EC.append(a)\n except KeyError:\n continue\n",
"_____no_output_____"
],
[
"M10EXG_met = pd.DataFrame({'Gene': genes, 'EC': EC})",
"_____no_output_____"
],
[
"len(NCIMB_met)",
"_____no_output_____"
],
[
"len(M10EXG_met)",
"_____no_output_____"
]
],
[
[
"So, this shows we have 1424 metabolic genes in the reference strain and 1417 in the M10EXG strain. Note, there are ofcourse many hypothetical proteins annotated in the genome. Those are ignored. \nNow, I will have to filter the list of BBH based on just the genes that are metabolic. \n\nApproach:\n- NCIMB_M10EXG_OL has all the CDS that are considered hits here (with the threshold of 1E-5 set). I will iterate through each row, and make a new dataframe which contains the filtered set of genes for comparing.\n\nNOTE: at the end of comparing with NCIMC11955 as reference, I will do the same with the M10EXG as reference, to find which genes are unique there as well.",
"_____no_output_____"
]
],
[
[
"NCIMB = []\nM10EXG = []\nfor row, index in NCIMB_M10EXG_OL.iterrows():\n ref_gene = index['NCIMB11955']\n try:\n NCIMB_met.loc[NCIMB_met[\"Gene\"] == ref_gene,'EC'].values[0] #if it is a metaoblic gene this will give a hit\n NCIMB.append(index['NCIMB11955'])\n M10EXG.append(index['M10EXG'])\n except IndexError: #if the hit isn't metabolic, we can ignore it\n continue",
"_____no_output_____"
],
[
"#make data frame\nOL_metabolic = pd.DataFrame({'NCIMB11955':NCIMB, 'M10EXG':M10EXG})",
"_____no_output_____"
],
[
"len(OL_metabolic)",
"_____no_output_____"
]
],
[
[
"So we have 1384 metabolic genes that overlap. That means there are 40 genes unique to NCIMB, and for M10EXG we need to check how many are still unmatched.",
"_____no_output_____"
],
[
"Finally, I will make a list of the unique metabolic genes, that can be supplied in supplementary, and given a short look through if anything unexpected appears.\n\nApproach:\n- NCIMB_met and M10EXG_met has all the metabolic genes with their corresponding EC code. I will filter out the significant hits here for the NCIMB strain first. I need to repeat the above analysis for M10EXG to be able to see how many unique it has properly. (will be done further in this notebook)\n- then try to find some more information about each E.C. code, e.g. from KEGG database I've Used before? I can look into the type of reactions (i.e. category) and/or the extact reaction name.",
"_____no_output_____"
]
],
[
[
"genes = []\nECs =[]\nfor row, index in NCIMB_met.iterrows():\n gene = index['Gene']\n try:\n OL_metabolic.loc[OL_metabolic[\"NCIMB11955\"] == gene,'M10EXG'].values[0] #If the gene is in the overlap dataframe it will give an output\n continue\n except IndexError: #i.e. if the gene doesn't have an overlap in M10EXG\n genes.append(gene)\n ECs.append(index['EC'])\n \n#make dataframe\nNCIMB_unique = pd.DataFrame({'Unique gene':genes, 'EC':ECs}) ",
"_____no_output_____"
],
[
"len(NCIMB_unique)",
"_____no_output_____"
]
],
[
[
"Now we know that for NCIMB, there are 40 unique metabolic genes and 1384 overlapping genes. I will do the same as above but for the M10EXG strain to find how many are unique in that strain when that is used as reference sequence.",
"_____no_output_____"
]
],
[
[
"M10EXG = []\nNCIMB = []\nfor row, index in M10EXG_NCIMB_OL.iterrows():\n ref_gene = index['M10EXG']\n try:\n M10EXG_met.loc[M10EXG_met[\"Gene\"] == ref_gene,'EC'].values[0] #if it is a metaoblic gene this will give a hit\n M10EXG.append(index['M10EXG'])\n NCIMB.append(index['NCIMB'])\n except IndexError: #if the hit isn't metabolic, we can ignore it\n continue",
"_____no_output_____"
],
[
"#make data frame\nOL_metabolic_M10EXG = pd.DataFrame({'M10EXG':M10EXG, 'NCIMB11955':NCIMB})",
"_____no_output_____"
],
[
"len(OL_metabolic_M10EXG)",
"_____no_output_____"
]
],
[
[
"Again here there is a difference of a few genes, but falls within an error range if you consider the size of the total metabolic gene set. So this is fine. Next, we will make a dataframe of the metabolic genes that are unique to M10EXG.",
"_____no_output_____"
]
],
[
[
"genes = []\nECs =[]\nfor row, index in M10EXG_met.iterrows():\n gene = index['Gene']\n try:\n OL_metabolic_M10EXG.loc[OL_metabolic_M10EXG[\"M10EXG\"] == gene,'NCIMB11955'].values[0] #If the gene is in the overlap dataframe it will give an output\n continue\n except IndexError: #i.e. if the gene doesn't have an overlap in M10EXG\n genes.append(gene)\n ECs.append(index['EC'])\n \n#make dataframe\nM10EXG_unique = pd.DataFrame({'Unique gene':genes, 'EC':ECs})",
"_____no_output_____"
],
[
"len(M10EXG_unique)",
"_____no_output_____"
]
],
[
[
"So we have 29 unique genes in M10EXG, and 40 in our strain. ",
"_____no_output_____"
],
[
"Now we have two dataframes, one for each strain, with the unique metabolic genes, we can try to find a bit more information about them. I will try to get a name for the reaction, as well as what pathway they are a part of. \n\nFirst I will prepare a dataframe from the KEGG Site that contains the information about which pathway the EC code belongs to.",
"_____no_output_____"
]
],
[
[
"df = pd.read_csv('http://rest.kegg.jp/link/ec/pathway', header=None, sep = '\\t')",
"_____no_output_____"
],
[
"df.columns = ['Pathway', 'EC'] #rename the columns",
"_____no_output_____"
],
[
"#remove all 'path:' and 'rn:'\ndf['Pathway'] = df['Pathway'].str.replace(r'path:ec', '')",
"_____no_output_____"
],
[
"df['EC'] = df['EC'].str.replace(r'ec:', '')",
"_____no_output_____"
],
[
"#remove the rows with 'path_map' to prevent duplication\ndf = df[~df['Pathway'].str.contains(\"map\")]",
"_____no_output_____"
],
[
"#now link the pathway code to the name of the pathway it is involved in ",
"_____no_output_____"
],
[
"df_groups = pd.read_csv('http://rest.kegg.jp/list/pathway', header=None, sep = '\\t')",
"_____no_output_____"
],
[
"df_groups.columns = ['Pathway', 'Name']",
"_____no_output_____"
],
[
"df_groups['Pathway'] = df_groups['Pathway'].str.replace(r'path:map', '')",
"_____no_output_____"
],
[
"#now filter out the IDs I dont want to include\n#i want to remove all rows below number 153\ndf_groups = df_groups[0:154]",
"_____no_output_____"
],
[
"#then merge the df and df_groups together\ndf = pd.merge(df_groups,df,on='Pathway',how='left')",
"_____no_output_____"
]
],
[
[
"Now I can map the EC code from the unique metabolic reaction lists to these pathway classes.",
"_____no_output_____"
]
],
[
[
"pathway =[]\nfor index, row in NCIMB_unique.iterrows():\n ec = row['EC']\n types =[]\n found = df.loc[df[\"EC\"] == ec]\n for indexa, rowa in found.iterrows() :\n types.append(rowa['Name']) \n pathway.append(types)\nNCIMB_unique['Pathway'] = pathway",
"_____no_output_____"
],
[
"pathway =[]\nfor index, row in M10EXG_unique.iterrows():\n ec = row['EC']\n types =[]\n found = df.loc[df[\"EC\"] == ec]\n for indexa, rowa in found.iterrows() :\n types.append(rowa['Name']) \n pathway.append(types)\nM10EXG_unique['Pathway'] = pathway",
"_____no_output_____"
]
],
[
[
"Now I'll also add a column which looks at the KO of the ec codes recognized so that we get a bit more information about which reactions are unique in each strain.\n\nFirst I will prepare a dataframe that contains the EC codes linked to the KO ontology terms for these annotations. That we can use to map the unique reactions to.\nThere will probably still be some that are not mapped, those we will need to check by hand.",
"_____no_output_____"
]
],
[
[
"df = pd.read_csv('http://rest.kegg.jp/link/ec/ko', header=None, sep = '\\t')",
"_____no_output_____"
],
[
"df.columns = ['KO', 'EC'] #rename the columns",
"_____no_output_____"
],
[
"#remove all 'ko:' and 'ec:'\ndf['KO'] = df['KO'].str.replace(r'ko:', '')",
"_____no_output_____"
],
[
"df['EC'] = df['EC'].str.replace(r'ec:', '')",
"_____no_output_____"
],
[
"#now import the list of KO terms with more meaningful description\nko = pd.read_csv('http://rest.kegg.jp/list/ko', header=None, sep = '\\t')",
"_____no_output_____"
],
[
"ko.columns = ['KO', 'Name']",
"_____no_output_____"
],
[
"ko['KO'] = ko['KO'].str.replace(r'ko:', '')",
"_____no_output_____"
],
[
"#link the two dataframes together\ndf = pd.merge(df,ko,on='KO',how='left')",
"_____no_output_____"
]
],
[
[
"Now we can map the EC code to a KO term.",
"_____no_output_____"
]
],
[
[
"ko_term =[]\nfor index, row in NCIMB_unique.iterrows():\n ec = row['EC']\n types =[]\n found = df.loc[df[\"EC\"] == ec]\n for indexa, rowa in found.iterrows() :\n types.append(rowa['Name']) \n ko_term.append(types)\nNCIMB_unique['KO'] = ko_term",
"_____no_output_____"
],
[
"ko_term =[]\nfor index, row in M10EXG_unique.iterrows():\n ec = row['EC']\n types =[]\n found = df.loc[df[\"EC\"] == ec]\n for indexa, rowa in found.iterrows() :\n types.append(rowa['Name']) \n ko_term.append(types)\nM10EXG_unique['KO'] = ko_term",
"_____no_output_____"
]
],
[
[
"Now I will export these two tables and need to some some manual inspection of them.",
"_____no_output_____"
]
],
[
[
"M10EXG_unique.to_csv('M10EXG_unique.csv')",
"_____no_output_____"
],
[
"NCIMB_unique.to_csv('NCIMB_unique.csv')",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
e79e5a5070ae7c72881bfb8a67f8110fade1a248 | 13,561 | ipynb | Jupyter Notebook | notebooks/feature_engineering/raw/ex2.ipynb | dansbecker/learntools | 7ca35bfe2e2582bf93968ad75a3b0bfc888a04fa | [
"Apache-2.0"
] | 1 | 2019-10-24T05:43:20.000Z | 2019-10-24T05:43:20.000Z | notebooks/feature_engineering/raw/ex2.ipynb | dansbecker/learntools | 7ca35bfe2e2582bf93968ad75a3b0bfc888a04fa | [
"Apache-2.0"
] | null | null | null | notebooks/feature_engineering/raw/ex2.ipynb | dansbecker/learntools | 7ca35bfe2e2582bf93968ad75a3b0bfc888a04fa | [
"Apache-2.0"
] | null | null | null | 30.002212 | 394 | 0.586609 | [
[
[
"# Introduction\n\nIn this exercise you'll apply more advanced encodings to encode the categorical variables ito improve your classifier model. The encodings you will implement are:\n\n- Count Encoding\n- Target Encoding\n- Leave-one-out Encoding\n- CatBoost Encoding\n- Feature embedding with SVD \n\nYou'll refit the classifier after each encoding to check its performance on hold-out data. First, run the next cell to repeat the work you did in the last exercise.",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pandas as pd\nfrom sklearn import preprocessing, metrics\nimport lightgbm as lgb\n\n# Set up code checking\n# This can take a few seconds, thanks for your patience\nfrom learntools.core import binder\nbinder.bind(globals())\nfrom learntools.feature_engineering.ex2 import *\n\nclicks = pd.read_parquet('../input/feature-engineering-data/baseline_data.pqt')",
"_____no_output_____"
]
],
[
[
"Here I'll define a couple functions to help test the new encodings.",
"_____no_output_____"
]
],
[
[
"def get_data_splits(dataframe, valid_fraction=0.1):\n \"\"\" Splits a dataframe into train, validation, and test sets. First, orders by \n the column 'click_time'. Set the size of the validation and test sets with\n the valid_fraction keyword argument.\n \"\"\"\n\n dataframe = dataframe.sort_values('click_time')\n valid_rows = int(len(dataframe) * valid_fraction)\n train = dataframe[:-valid_rows * 2]\n # valid size == test size, last two sections of the data\n valid = dataframe[-valid_rows * 2:-valid_rows]\n test = dataframe[-valid_rows:]\n \n return train, valid, test\n\ndef train_model(train, valid, test=None, feature_cols=None):\n if feature_cols is None:\n feature_cols = train.columns.drop(['click_time', 'attributed_time',\n 'is_attributed'])\n dtrain = lgb.Dataset(train[feature_cols], label=train['is_attributed'])\n dvalid = lgb.Dataset(valid[feature_cols], label=valid['is_attributed'])\n \n param = {'num_leaves': 64, 'objective': 'binary', \n 'metric': 'auc', 'seed': 7}\n num_round = 1000\n print(\"Training model!\")\n bst = lgb.train(param, dtrain, num_round, valid_sets=[dvalid], \n early_stopping_rounds=20, verbose_eval=False)\n \n valid_pred = bst.predict(valid[feature_cols])\n valid_score = metrics.roc_auc_score(valid['is_attributed'], valid_pred)\n print(f\"Validation AUC score: {valid_score}\")\n \n if test is not None: \n test_pred = bst.predict(test[feature_cols])\n test_score = metrics.roc_auc_score(test['is_attributed'], test_pred)\n return bst, valid_score, test_score\n else:\n return bst, valid_score",
"_____no_output_____"
]
],
[
[
"Run this cell to get a baseline score. If your encodings do better than this, you can keep them.",
"_____no_output_____"
]
],
[
[
"print(\"Baseline model\")\ntrain, valid, test = get_data_splits(clicks)\n_ = train_model(train, valid)",
"_____no_output_____"
]
],
[
[
"### 1) Categorical encodings and leakage\n\nThese encodings are all based on statistics calculated from the dataset like counts and means. Considering this, what data should you be using to calculate the encodings?\n\nUncomment the following line after you've decided your answer.",
"_____no_output_____"
]
],
[
[
"q_1.solution()",
"_____no_output_____"
]
],
[
[
"### 2) Count encodings\n\nHere, encode the categorical features `['ip', 'app', 'device', 'os', 'channel']` using the count of each value in the data set. Using `CountEncoder` from the `category_encoders` library, fit the encoding using the categorical feature columns defined in `cat_features`. Then apply the encodings to the train and validation sets, adding them as new columns with names suffixed `\"_count\"`.",
"_____no_output_____"
]
],
[
[
"import category_encoders as ce\n\ncat_features = ['ip', 'app', 'device', 'os', 'channel']\ntrain, valid, test = get_data_splits(clicks)\n\n# Create the count encoder\ncount_enc = ____\n\n# Learn encoding from the training set\n____\n\n# Apply encoding to the train and validation sets as new columns\n# Make sure to add `_count` as a suffix to the new columns\ntrain_encoded = ____\nvalid_encoded = ____\n\nq_2.check()",
"_____no_output_____"
],
[
"# Uncomment if you need some guidance\nq_2.hint()\nq_2.solution()",
"_____no_output_____"
],
[
"#%%RM_IF(PROD)%%\ncat_features = ['ip', 'app', 'device', 'os', 'channel']\ntrain, valid, test = get_data_splits(clicks)\n\n# Create the count encoder\ncount_enc = ce.CountEncoder(cols=cat_features)\n\n# Learn encoding from the training set\ncount_enc.fit(train[cat_features])\n\n# Apply encoding to the train and validation sets\ntrain_encoded = train.join(count_enc.transform(train[cat_features]).add_suffix('_count'))\nvalid_encoded = valid.join(count_enc.transform(valid[cat_features]).add_suffix('_count'))\n\nq_2.assert_check_passed()",
"_____no_output_____"
],
[
"# Train the model on the encoded datasets\n# This can take around 30 seconds to complete\n_ = train_model(train_encoded, valid_encoded)",
"_____no_output_____"
]
],
[
[
"Count encoding improved our model's score!",
"_____no_output_____"
],
[
"### 3) Why is count encoding effective?\nAt first glance, it could be surprising that Count Encoding helps make accurate models. \nWhy do you think is count encoding is a good idea, or how does it improve the model score?\n\nUncomment the following line after you've decided your answer.",
"_____no_output_____"
]
],
[
[
"q_3.solution()",
"_____no_output_____"
]
],
[
[
"### 4) Target encoding\n\nHere you'll try some supervised encodings that use the labels (the targets) to transform categorical features. The first one is target encoding. Create the target encoder from the `category_encoders` library. Then, learn the encodings from the training dataset, apply the encodings to all the datasets and retrain the model.",
"_____no_output_____"
]
],
[
[
"cat_features = ['ip', 'app', 'device', 'os', 'channel']\ntrain, valid, test = get_data_splits(clicks)\n\n# Create the target encoder. You can find this easily by using tab completion.\n# Start typing ce. the press Tab to bring up a list of classes and functions.\ntarget_enc = ____\n\n# Learn encoding from the training set. Use the 'is_attributed' column as the target.\n____\n\n# Apply encoding to the train and validation sets as new columns\n# Make sure to add `_target` as a suffix to the new columns\ntrain_encoded = ____\nvalid_encoded = ____\n\nq_4.check()",
"_____no_output_____"
],
[
"# Uncomment these if you need some guidance\n#q_4.hint()\n#q_4.solution()",
"_____no_output_____"
],
[
"#%%RM_IF(PROD)%%\ncat_features = ['ip', 'app', 'device', 'os', 'channel']\ntarget_enc = ce.TargetEncoder(cols=cat_features)\n\ntrain, valid, test = get_data_splits(clicks)\ntarget_enc.fit(train[cat_features], train['is_attributed'])\n\ntrain_encoded = train.join(target_enc.transform(train[cat_features]).add_suffix('_target'))\nvalid_encoded = valid.join(target_enc.transform(valid[cat_features]).add_suffix('_target'))\n\nq_4.assert_check_passed()",
"_____no_output_____"
],
[
"_ = train_model(train_encoded, valid_encoded)",
"_____no_output_____"
]
],
[
[
"### 5) Try removing IP encoding\n\nTry leaving `ip` out of the encoded features and retrain the model with target encoding again. You should find that the score increases and is above the baseline score! Why do you think the score is below baseline when we encode the IP address but above baseline when we don't?\n\nUncomment the following line after you've decided your answer.",
"_____no_output_____"
]
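,
[
"One way to probe this before revealing the answer (a hypothetical toy example, not the click data): when a feature is almost unique per row, its per-category target mean simply memorizes the label, which looks great on the training split but transfers nothing to unseen values.\n\n```python\nimport pandas as pd\n\ntoy = pd.DataFrame({'ip': [1, 2, 3, 4], 'y': [0, 1, 0, 1]})\n# every 'category' occurs once, so the encoding just copies the label\nprint(toy.groupby('ip')['y'].mean())\n```",
"_____no_output_____"
]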
],
[
[
"# q_5.solution()",
"_____no_output_____"
]
],
[
[
"### 6) CatBoost Encoding\n\nThe CatBoost encoder is supposed to working well with the LightGBM model. Encode the categorical features with `CatBoostEncoder` and train the model on the encoded data again.",
"_____no_output_____"
]
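,
[
"What makes `CatBoostEncoder` different is that each row is encoded using target statistics from the rows *before* it only, which reduces target leakage. A minimal sketch (toy values; the exact outputs depend on the encoder's prior and on row order):\n\n```python\nimport pandas as pd\nimport category_encoders as ce\n\ntoy = pd.DataFrame({'app': ['a', 'a', 'a', 'b']})\ny = pd.Series([1, 0, 1, 1])\n\ncb = ce.CatBoostEncoder(cols=['app'])\n# each 'a' row is encoded from the targets of the earlier 'a' rows only\nprint(cb.fit_transform(toy, y))\n```",
"_____no_output_____"
]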
],
[
[
"train, valid, test = get_data_splits(clicks)\n\n# Create the CatBoost encoder\ncb_enc = ____\n\n# Learn encoding from the training set\n____\n\n# Apply encoding to the train and validation sets as new columns\n# Make sure to add `_cb` as a suffix to the new columns\ntrain_encoded = ____\nvalid_encoded = ____\nq_6.check()",
"_____no_output_____"
],
[
"# Uncomment these if you need some guidance\n#q_6.hint()\n#q_6.solution()",
"_____no_output_____"
],
[
"#%%RM_IF(PROD)%%\ncat_features = ['app', 'device', 'os', 'channel']\ntrain, valid, _ = get_data_splits(clicks)\n\ncb_enc = ce.CatBoostEncoder(cols=cat_features, random_state=7)\n\n# Learn encodings on the train set\ncb_enc.fit(train[cat_features], train['is_attributed'])\n\n# Apply encodings to each set\ntrain_encoded = train.join(cb_enc.transform(train[cat_features]).add_suffix('_cb'))\nvalid_encoded = valid.join(cb_enc.transform(valid[cat_features]).add_suffix('_cb'))\n\nq_6.assert_check_passed()",
"_____no_output_____"
],
[
"_ = train_model(train, valid)",
"_____no_output_____"
]
],
[
[
"The CatBoost encodings work the best, so we'll keep those.",
"_____no_output_____"
]
],
[
[
"encoded = cb_enc.transform(clicks[cat_features])\nfor col in encoded:\n clicks.insert(len(clicks.columns), col + '_cb', encoded[col])",
"_____no_output_____"
]
],
[
[
"# Keep Going\n\nNow you are ready to **[generating completely new features](#$NEXT_NOTEBOOK_URL$)** from the data itself.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
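"markdown",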
"markdown",
"markdown"
],
[
"code"
],
[
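"markdown",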
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
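"markdown",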
"markdown"
],
[
"code"
],
[
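"markdown",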
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
e79e6ac586656ca390f75e959a6355d0bfbbb2e6 | 16,169 | ipynb | Jupyter Notebook | site/en/r1/tutorials/eager/eager_basics.ipynb | atharva1503/docs | d810ab4728a4810615c83e9b17861ed6d9d76f1f | [
"Apache-2.0"
] | 2 | 2020-06-20T14:10:44.000Z | 2020-10-12T07:10:34.000Z | site/en/r1/tutorials/eager/eager_basics.ipynb | atharva1503/docs | d810ab4728a4810615c83e9b17861ed6d9d76f1f | [
"Apache-2.0"
] | null | null | null | site/en/r1/tutorials/eager/eager_basics.ipynb | atharva1503/docs | d810ab4728a4810615c83e9b17861ed6d9d76f1f | [
"Apache-2.0"
] | null | null | null | 35.380744 | 621 | 0.547777 | [
[
[
"##### Copyright 2018 The TensorFlow Authors.",
"_____no_output_____"
]
],
[
[
"#@title Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.",
"_____no_output_____"
]
],
[
[
"# Eager execution basics",
"_____no_output_____"
],
[
"<table class=\"tfo-notebook-buttons\" align=\"left\">\n <td>\n <a target=\"_blank\" href=\"https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/r1/tutorials/eager/eager_basics.ipynb\"><img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" />Run in Google Colab</a>\n </td>\n <td>\n <a target=\"_blank\" href=\"https://github.com/tensorflow/docs/blob/master/site/en/r1/tutorials/eager/eager_basics.ipynb\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" />View source on GitHub</a>\n </td>\n</table>",
"_____no_output_____"
],
[
"This is an introductory tutorial for using TensorFlow. It will cover:\n\n* Importing required packages\n* Creating and using Tensors\n* Using GPU acceleration\n* Datasets",
"_____no_output_____"
],
[
"## Import TensorFlow\n\nTo get started, import the `tensorflow` module and enable eager execution.\nEager execution enables a more interactive frontend to TensorFlow, the details of which we will discuss much later.",
"_____no_output_____"
]
],
[
[
"from __future__ import absolute_import, division, print_function, unicode_literals\n\ntry:\n # %tensorflow_version only exists in Colab.\n %tensorflow_version 2.x\nexcept Exception:\n pass\nimport tensorflow.compat.v1 as tf\n",
"_____no_output_____"
]
],
[
[
"## Tensors\n\nA Tensor is a multi-dimensional array. Similar to NumPy `ndarray` objects, `Tensor` objects have a data type and a shape. Additionally, Tensors can reside in accelerator (like GPU) memory. TensorFlow offers a rich library of operations ([tf.add](https://www.tensorflow.org/api_docs/python/tf/add), [tf.matmul](https://www.tensorflow.org/api_docs/python/tf/matmul), [tf.linalg.inv](https://www.tensorflow.org/api_docs/python/tf/linalg/inv) etc.) that consume and produce Tensors. These operations automatically convert native Python types. For example:\n",
"_____no_output_____"
]
],
[
[
"print(tf.add(1, 2))\nprint(tf.add([1, 2], [3, 4]))\nprint(tf.square(5))\nprint(tf.reduce_sum([1, 2, 3]))\nprint(tf.encode_base64(\"hello world\"))\n\n# Operator overloading is also supported\nprint(tf.square(2) + tf.square(3))",
"_____no_output_____"
]
],
[
[
"Each Tensor has a shape and a datatype",
"_____no_output_____"
]
],
[
[
"x = tf.matmul([[1]], [[2, 3]])\nprint(x.shape)\nprint(x.dtype)",
"_____no_output_____"
]
],
[
[
"The most obvious differences between NumPy arrays and TensorFlow Tensors are:\n\n1. Tensors can be backed by accelerator memory (like GPU, TPU).\n2. Tensors are immutable.",
"_____no_output_____"
],
[
"### NumPy Compatibility\n\nConversion between TensorFlow Tensors and NumPy ndarrays is quite simple as:\n\n* TensorFlow operations automatically convert NumPy ndarrays to Tensors.\n* NumPy operations automatically convert Tensors to NumPy ndarrays.\n\nTensors can be explicitly converted to NumPy ndarrays by invoking the `.numpy()` method on them.\nThese conversions are typically cheap as the array and Tensor share the underlying memory representation if possible. However, sharing the underlying representation isn't always possible since the Tensor may be hosted in GPU memory while NumPy arrays are always backed by host memory, and the conversion will thus involve a copy from GPU to host memory.",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\nndarray = np.ones([3, 3])\n\nprint(\"TensorFlow operations convert numpy arrays to Tensors automatically\")\ntensor = tf.multiply(ndarray, 42)\nprint(tensor)\n\n\nprint(\"And NumPy operations convert Tensors to numpy arrays automatically\")\nprint(np.add(tensor, 1))\n\nprint(\"The .numpy() method explicitly converts a Tensor to a numpy array\")\nprint(tensor.numpy())",
"_____no_output_____"
]
],
[
[
"## GPU acceleration\n\nMany TensorFlow operations can be accelerated by using the GPU for computation. Without any annotations, TensorFlow automatically decides whether to use the GPU or CPU for an operation (and copies the tensor between CPU and GPU memory if necessary). Tensors produced by an operation are typically backed by the memory of the device on which the operation executed. For example:",
"_____no_output_____"
]
],
[
[
"x = tf.random.uniform([3, 3])\n\nprint(\"Is there a GPU available: \"),\nprint(tf.test.is_gpu_available())\n\nprint(\"Is the Tensor on GPU #0: \"),\nprint(x.device.endswith('GPU:0'))",
"_____no_output_____"
]
],
[
[
"### Device Names\n\nThe `Tensor.device` property provides a fully qualified string name of the device hosting the contents of the tensor. This name encodes many details, such as an identifier of the network address of the host on which this program is executing and the device within that host. This is required for distributed execution of a TensorFlow program. The string ends with `GPU:<N>` if the tensor is placed on the `N`-th GPU on the host.",
"_____no_output_____"
],
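[
"For example, printing the property shows the full device string (a quick sketch; the exact value depends on your machine and on where TensorFlow placed the tensor):\n\n```python\nx = tf.constant([1.0, 2.0])\nprint(x.device) # e.g. /job:localhost/replica:0/task:0/device:CPU:0\n```",
"_____no_output_____"
],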
[
"\n\n### Explicit Device Placement\n\nThe term \"placement\" in TensorFlow refers to how individual operations are assigned (placed on) a device for execution. As mentioned above, when there is no explicit guidance provided, TensorFlow automatically decides which device to execute an operation, and copies Tensors to that device if needed. However, TensorFlow operations can be explicitly placed on specific devices using the `tf.device` context manager. For example:",
"_____no_output_____"
]
],
[
[
"import time\n\ndef time_matmul(x):\n start = time.time()\n for loop in range(10):\n tf.matmul(x, x)\n\n result = time.time()-start\n\n print(\"10 loops: {:0.2f}ms\".format(1000*result))\n\n\n# Force execution on CPU\nprint(\"On CPU:\")\nwith tf.device(\"CPU:0\"):\n x = tf.random_uniform([1000, 1000])\n assert x.device.endswith(\"CPU:0\")\n time_matmul(x)\n\n# Force execution on GPU #0 if available\nif tf.test.is_gpu_available():\n with tf.device(\"GPU:0\"): # Or GPU:1 for the 2nd GPU, GPU:2 for the 3rd etc.\n x = tf.random_uniform([1000, 1000])\n assert x.device.endswith(\"GPU:0\")\n time_matmul(x)",
"_____no_output_____"
]
],
[
[
"## Datasets\n\nThis section demonstrates the use of the [`tf.data.Dataset` API](https://www.tensorflow.org/r1/guide/datasets) to build pipelines to feed data to your model. It covers:\n\n* Creating a `Dataset`.\n* Iteration over a `Dataset` with eager execution enabled.\n\nWe recommend using the `Dataset`s API for building performant, complex input pipelines from simple, re-usable pieces that will feed your model's training or evaluation loops.\n\nIf you're familiar with TensorFlow graphs, the API for constructing the `Dataset` object remains exactly the same when eager execution is enabled, but the process of iterating over elements of the dataset is slightly simpler.\nYou can use Python iteration over the `tf.data.Dataset` object and do not need to explicitly create an `tf.data.Iterator` object.\nAs a result, the discussion on iterators in the [TensorFlow Guide](https://www.tensorflow.org/r1/guide/datasets) is not relevant when eager execution is enabled.",
"_____no_output_____"
],
[
"### Create a source `Dataset`\n\nCreate a _source_ dataset using one of the factory functions like [`Dataset.from_tensors`](https://www.tensorflow.org/api_docs/python/tf/data/Dataset#from_tensors), [`Dataset.from_tensor_slices`](https://www.tensorflow.org/api_docs/python/tf/data/Dataset#from_tensor_slices) or using objects that read from files like [`TextLineDataset`](https://www.tensorflow.org/api_docs/python/tf/data/TextLineDataset) or [`TFRecordDataset`](https://www.tensorflow.org/api_docs/python/tf/data/TFRecordDataset). See the [TensorFlow Guide](https://www.tensorflow.org/r1/guide/datasets#reading_input_data) for more information.",
"_____no_output_____"
]
],
[
[
"ds_tensors = tf.data.Dataset.from_tensor_slices([1, 2, 3, 4, 5, 6])\n\n# Create a CSV file\nimport tempfile\n_, filename = tempfile.mkstemp()\n\nwith open(filename, 'w') as f:\n f.write(\"\"\"Line 1\nLine 2\nLine 3\n \"\"\")\n\nds_file = tf.data.TextLineDataset(filename)",
"_____no_output_____"
]
],
[
[
"### Apply transformations\n\nUse the transformations functions like [`map`](https://www.tensorflow.org/api_docs/python/tf/data/Dataset#map), [`batch`](https://www.tensorflow.org/api_docs/python/tf/data/Dataset#batch), [`shuffle`](https://www.tensorflow.org/api_docs/python/tf/data/Dataset#shuffle) etc. to apply transformations to the records of the dataset. See the [API documentation for `tf.data.Dataset`](https://www.tensorflow.org/api_docs/python/tf/data/Dataset) for details.",
"_____no_output_____"
]
],
[
[
"ds_tensors = ds_tensors.map(tf.square).shuffle(2).batch(2)\n\nds_file = ds_file.batch(2)",
"_____no_output_____"
]
],
[
[
"### Iterate\n\nWhen eager execution is enabled `Dataset` objects support iteration.\nIf you're familiar with the use of `Dataset`s in TensorFlow graphs, note that there is no need for calls to `Dataset.make_one_shot_iterator()` or `get_next()` calls.",
"_____no_output_____"
]
],
[
[
"print('Elements of ds_tensors:')\nfor x in ds_tensors:\n print(x)\n\nprint('\\nElements in ds_file:')\nfor x in ds_file:\n print(x)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
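"markdown",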
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79e6dc2f01d200ebfa7a5bef54a1ae1e29b6911 | 281,642 | ipynb | Jupyter Notebook | notebooks/knn.ipynb | Bhaskers-Blu-Org2/metric-transfer.pytorch | b0ae8ed6e6f62357100d799defbb61a78c831a87 | [
"MIT"
] | 51 | 2019-07-23T23:47:12.000Z | 2022-03-04T13:03:25.000Z | notebooks/knn.ipynb | Bhaskers-Blu-Org2/metric-transfer.pytorch | b0ae8ed6e6f62357100d799defbb61a78c831a87 | [
"MIT"
] | 2 | 2021-01-25T08:08:17.000Z | 2021-01-28T03:36:01.000Z | notebooks/knn.ipynb | chingisooinar/metric-transfer.pytorch | b0ae8ed6e6f62357100d799defbb61a78c831a87 | [
"MIT"
] | 19 | 2019-07-25T02:46:26.000Z | 2021-03-07T17:35:37.000Z | 597.96603 | 256,252 | 0.926375 | [
[
[
"%matplotlib inline\nimport sys\nimport os\nimport argparse\nimport time\nimport numpy as np\nsys.path.append('../')\n\nimport torch\nimport torch.nn as nn\nimport torch.optim as optim\nimport torch.nn.functional as F\nimport torch.backends.cudnn as cudnn\nimport torch.optim.lr_scheduler as lr_scheduler\nimport torchvision\nimport torchvision.transforms as transforms\nimport matplotlib.pyplot as plt\nimport easydict as edict\n\nfrom lib import models, datasets\nimport math",
"_____no_output_____"
],
[
"args = edict\nargs.resume = '../checkpoint/pretrain_models/ckpt_instance_cifar10_wrn-28-2_82.12.pth.tar'\nargs.cache = '../checkpoint/train_features_labels_cache/instance_wrn-28-2.pth.tar'\nargs.save_path = '../checkpoint/pseudos/instance_knn_wrn-28-2'\nos.makedirs(args.save_path, exist_ok=True)\n\nargs.num_class = 10\nargs.rng_seed = 0",
"_____no_output_____"
],
[
"ckpt = torch.load(args.cache)",
"_____no_output_____"
],
[
"ckpt = torch.load(args.cache)\ntrain_features = ckpt['train_features']\ntrain_labels = ckpt['train_labels']\n\nprint(train_features.dtype, train_labels.dtype)\nprint(train_features.shape, train_labels.shape)",
"torch.float32 torch.int64\ntorch.Size([50000, 128]) torch.Size([50000])\n"
]
],
[
[
"# use cpu because the following computation need a lot of memory",
"_____no_output_____"
]
],
[
[
"device = 'cpu'\ntrain_features, train_labels = train_features.to(device), train_labels.to(device)",
"_____no_output_____"
],
[
"num_train_data = train_labels.shape[0]\nnum_class = torch.max(train_labels) + 1\n\ntorch.manual_seed(args.rng_seed)\ntorch.cuda.manual_seed_all(args.rng_seed)\nperm = torch.randperm(num_train_data).to(device)\nprint(perm)",
"tensor([36044, 49165, 37807, ..., 42128, 15898, 31476])\n"
]
],
[
[
"# soft label",
"_____no_output_____"
]
],
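,
[
"The next cell scores every unlabeled example with a temperature-weighted KNN vote over the labeled set. Conceptually, for a single query it boils down to this sketch (made-up similarity values; `temperature = 0.1` matches the cell below):\n\n```python\nimport torch\n\nsims = torch.tensor([0.9, 0.8, 0.2]) # cosine similarity to 3 labeled neighbors\nlabels = torch.tensor([0, 0, 1]) # their class labels\nw = (sims / 0.1).exp() # sharpen similarities into vote weights\nprobs = torch.zeros(2).scatter_add_(0, labels, w)\nprobs /= probs.sum() # normalized soft label over the 2 classes\nprint(probs)\n```\n\nThe lower the temperature, the more the vote is dominated by the closest neighbors.",
"_____no_output_____"
]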
[
[
"fig = plt.figure(dpi=200)\nfor num_labeled_data in [50, 100, 250, 500, 1000, 2000, 4000, 8000]:\n index_labeled = []\n index_unlabeled = []\n data_per_class = num_labeled_data // args.num_class\n for c in range(10):\n indexes_c = perm[train_labels[perm] == c]\n index_labeled.append(indexes_c[:data_per_class])\n index_unlabeled.append(indexes_c[data_per_class:])\n index_labeled = torch.cat(index_labeled)\n index_unlabeled = torch.cat(index_unlabeled)\n \n# index_labeled = perm[:num_labeled_data]\n# index_unlabeled = perm[num_labeled_data:]\n\n # calculate similarity matrix\n dist = torch.mm(train_features, train_features[index_labeled].t())\n dist[index_labeled, torch.arange(num_labeled_data)] = 0\n\n K = min(num_labeled_data, 200)\n yd, yi = dist.topk(K, dim=1, largest=True, sorted=True)\n candidates = train_labels.view(1,-1).expand(num_train_data, -1)\n retrieval = torch.gather(candidates, 1, index_labeled[yi])\n\n retrieval_one_hot = torch.zeros(num_train_data * K, num_class).to(device)\n retrieval_one_hot.scatter_(1, retrieval.view(-1, 1), 1)\n\n temperature = 0.1\n\n yd_transform = (yd / temperature).exp_()\n probs = torch.sum(torch.mul(retrieval_one_hot.view(num_train_data, -1 , num_class), yd_transform.view(num_train_data, -1, 1)), 1)\n probs.div_(probs.sum(dim=1, keepdim=True))\n probs_sorted, predictions = probs.sort(1, True)\n correct = predictions.eq(train_labels.data.view(-1,1))\n\n confidence = probs_sorted[:, 0] # - probs_sorted[:, 1]\n \n correct = correct[index_unlabeled, :]\n confidence = confidence[index_unlabeled]\n \n n = confidence.shape[0]\n arange = 1 + np.arange(n)\n idx = confidence.sort(descending=True)[1]\n correct_sorted = correct[idx, 0].numpy()\n accuracies = np.cumsum(correct_sorted) / arange\n xs = arange / n\n\n plt.plot(xs, accuracies, label='num_labeled_data={}'.format(num_labeled_data))\n \n # save pseudo labels\n unlabeled_probs_top1, unlabeled_indexes_top1 = probs_sorted[:, 0][index_unlabeled].sort(0, True)\n pseudo_indexes = index_unlabeled[unlabeled_indexes_top1]\n pseudo_labels = predictions[index_unlabeled, 0][unlabeled_indexes_top1]\n pseudo_probs = probs[index_unlabeled][unlabeled_indexes_top1]\n assert torch.all(pseudo_labels == pseudo_probs.max(1)[1])\n \n save_dict = {\n 'pseudo_indexes': pseudo_indexes,\n 'pseudo_labels': pseudo_labels,\n 'pseudo_probs': pseudo_probs,\n 'labeled_indexes': index_labeled,\n 'unlabeled_indexes': index_unlabeled,\n }\n torch.save(save_dict, os.path.join(args.save_path, '{}.pth.tar'.format(num_labeled_data)))\n \n acc = (pseudo_labels == train_labels[pseudo_indexes]).float().mean().item()\n print('num_labeled={:4}, acc={:2.2f}, AUC={:2.2f}'.format(num_labeled_data, acc*100, accuracies.mean() * 100))\n \nplt.xlabel('ratio of data')\nplt.ylabel('top1 accuracy')\n# plt.xticks(np.arange(0, 1.05, 0.1))\n# plt.yticks(np.arange(0.36, 1.01, 0.05))\nplt.grid()\nlegend = plt.legend(loc='upper left', bbox_to_anchor=(1, 1))\n\nplt.show()",
"tensor([24681, 42151, 48978, 41040, 36909, 8628, 24936, 35926, 15934, 8801,\n 36293, 28026, 3814, 34981, 21135, 16904, 20152, 3486, 11894, 29780,\n 23932, 33744, 41766, 42979, 49518, 11341, 6091, 48161, 36335, 29858,\n 36044, 25569, 46340, 8832, 38677, 37807, 4480, 18517, 8409, 15769,\n 49165, 43846, 13449, 34615, 38862, 17849, 49031, 4115, 29909, 34515])\nnum_labeled= 50, acc=38.82, AUC=54.82\ntensor([24681, 42151, 48978, 41040, 36909, 10197, 38770, 40649, 43279, 27934,\n 8628, 24936, 35926, 15934, 8801, 27814, 37390, 12841, 22270, 10107,\n 36293, 28026, 3814, 34981, 21135, 46451, 48404, 8366, 32477, 48870,\n 16904, 20152, 3486, 11894, 29780, 47462, 34297, 1271, 28738, 12147,\n 23932, 33744, 41766, 42979, 49518, 18912, 32018, 17323, 1974, 27259,\n 11341, 6091, 48161, 36335, 29858, 17315, 22879, 43307, 23185, 10318,\n 36044, 25569, 46340, 8832, 38677, 36881, 11751, 9278, 1836, 8557,\n 37807, 4480, 18517, 8409, 15769, 440, 45506, 22942, 37635, 49162,\n 49165, 43846, 13449, 34615, 38862, 11524, 21361, 18397, 10903, 33335,\n 17849, 49031, 4115, 29909, 34515, 1456, 32156, 21390, 25936, 46645])\nnum_labeled= 100, acc=45.43, AUC=62.99\ntensor([24681, 42151, 48978, 41040, 36909, 10197, 38770, 40649, 43279, 27934,\n 48148, 38536, 37693, 33617, 27768, 36763, 31048, 2345, 15097, 49656,\n 20265, 10365, 49091, 36836, 35285, 8628, 24936, 35926, 15934, 8801,\n 27814, 37390, 12841, 22270, 10107, 43669, 36935, 9806, 2389, 17223,\n 40808, 45946, 31854, 49487, 578, 31363, 2318, 2765, 29571, 15248,\n 36293, 28026, 3814, 34981, 21135, 46451, 48404, 8366, 32477, 48870,\n 38750, 45437, 11815, 34305, 22797, 27326, 38971, 30453, 9585, 21701,\n 47668, 41469, 300, 44482, 17954, 16904, 20152, 3486, 11894, 29780,\n 47462, 34297, 1271, 28738, 12147, 40307, 44267, 11855, 22489, 31199,\n 45646, 6936, 35292, 6714, 14575, 47132, 14125, 5086, 9343, 16013,\n 23932, 33744, 41766, 42979, 49518, 18912, 32018, 17323, 1974, 27259,\n 10654, 20054, 15256, 46803, 27812, 39173, 11810, 27935, 6983, 15578,\n 25954, 17146, 38810, 32496, 46321, 11341, 6091, 48161, 36335, 29858,\n 17315, 22879, 43307, 23185, 10318, 16247, 35949, 31833, 26102, 6222,\n 28877, 38784, 29549, 32795, 31568, 20322, 42243, 39846, 34133, 26785,\n 36044, 25569, 46340, 8832, 38677, 36881, 11751, 9278, 1836, 8557,\n 39511, 31396, 35620, 9481, 37289, 24394, 16822, 37564, 47653, 28188,\n 45944, 35444, 20432, 21644, 24946, 37807, 4480, 18517, 8409, 15769,\n 440, 45506, 22942, 37635, 49162, 10081, 37293, 29962, 40230, 26420,\n 42284, 47996, 30720, 3054, 21691, 10422, 43896, 33469, 45230, 22687,\n 49165, 43846, 13449, 34615, 38862, 11524, 21361, 18397, 10903, 33335,\n 17560, 11906, 23965, 8483, 41980, 891, 34910, 26038, 28874, 17625,\n 22948, 8489, 11265, 36893, 6346, 17849, 49031, 4115, 29909, 34515,\n 1456, 32156, 21390, 25936, 46645, 3489, 45235, 3126, 13128, 40168,\n 18175, 40819, 28019, 39620, 29182, 14458, 46401, 16051, 16394, 3737])\nnum_labeled= 250, acc=58.77, AUC=77.08\ntensor([24681, 42151, 48978, 41040, 36909, 10197, 38770, 40649, 43279, 27934,\n 48148, 38536, 37693, 33617, 27768, 36763, 31048, 2345, 15097, 49656,\n 20265, 10365, 49091, 36836, 35285, 42465, 44296, 15003, 8209, 38023,\n 31880, 32206, 29039, 14945, 31888, 29837, 1950, 7251, 16637, 10343,\n 32460, 44820, 3295, 7168, 3092, 46523, 19682, 49635, 12477, 43592,\n 8628, 24936, 35926, 15934, 8801, 27814, 37390, 12841, 22270, 10107,\n 43669, 36935, 9806, 2389, 17223, 40808, 45946, 31854, 49487, 578,\n 31363, 2318, 2765, 29571, 15248, 27576, 7540, 1985, 2762, 48358,\n 19594, 47745, 37081, 47495, 31208, 
11301, 49554, 11418, 19382, 47839,\n 32147, 9476, 30495, 40431, 39933, 30439, 35339, 43609, 13277, 30945,\n 36293, 28026, 3814, 34981, 21135, 46451, 48404, 8366, 32477, 48870,\n 38750, 45437, 11815, 34305, 22797, 27326, 38971, 30453, 9585, 21701,\n 47668, 41469, 300, 44482, 17954, 38329, 45218, 39478, 3196, 39983,\n 39687, 41411, 22897, 32181, 37797, 28057, 35824, 34053, 20645, 35161,\n 36011, 21725, 48101, 22322, 34836, 16783, 9386, 28856, 33987, 21163,\n 16904, 20152, 3486, 11894, 29780, 47462, 34297, 1271, 28738, 12147,\n 40307, 44267, 11855, 22489, 31199, 45646, 6936, 35292, 6714, 14575,\n 47132, 14125, 5086, 9343, 16013, 30947, 48088, 48762, 11743, 34946,\n 3320, 49791, 6930, 47963, 45082, 33427, 26336, 23584, 9865, 16808,\n 47306, 9073, 44093, 45471, 17137, 27683, 49609, 9591, 2770, 16432,\n 23932, 33744, 41766, 42979, 49518, 18912, 32018, 17323, 1974, 27259,\n 10654, 20054, 15256, 46803, 27812, 39173, 11810, 27935, 6983, 15578,\n 25954, 17146, 38810, 32496, 46321, 712, 17420, 37731, 38015, 48513,\n 20564, 3460, 3360, 381, 14656, 30589, 11499, 16383, 86, 41003,\n 47345, 17877, 1814, 25666, 6875, 9141, 23255, 6303, 33823, 11510,\n 11341, 6091, 48161, 36335, 29858, 17315, 22879, 43307, 23185, 10318,\n 16247, 35949, 31833, 26102, 6222, 28877, 38784, 29549, 32795, 31568,\n 20322, 42243, 39846, 34133, 26785, 32333, 751, 3167, 6253, 49920,\n 16935, 49866, 48991, 39167, 35254, 47913, 30735, 29392, 32516, 36016,\n 19177, 38632, 20860, 28831, 26382, 26072, 16212, 40662, 48181, 34411,\n 36044, 25569, 46340, 8832, 38677, 36881, 11751, 9278, 1836, 8557,\n 39511, 31396, 35620, 9481, 37289, 24394, 16822, 37564, 47653, 28188,\n 45944, 35444, 20432, 21644, 24946, 23479, 15385, 40549, 31140, 28169,\n 11124, 29296, 36749, 7339, 33690, 5182, 14694, 4644, 13500, 34013,\n 36586, 23708, 25417, 42099, 9814, 26089, 5596, 17351, 45518, 48895,\n 37807, 4480, 18517, 8409, 15769, 440, 45506, 22942, 37635, 49162,\n 10081, 37293, 29962, 40230, 26420, 42284, 47996, 30720, 3054, 21691,\n 10422, 43896, 33469, 45230, 22687, 7833, 19956, 5778, 21601, 37273,\n 14301, 40517, 20072, 19779, 34904, 589, 10873, 23612, 21108, 7831,\n 18874, 7836, 9082, 3343, 8980, 37573, 41000, 7849, 27054, 47189,\n 49165, 43846, 13449, 34615, 38862, 11524, 21361, 18397, 10903, 33335,\n 17560, 11906, 23965, 8483, 41980, 891, 34910, 26038, 28874, 17625,\n 22948, 8489, 11265, 36893, 6346, 11842, 21202, 16315, 24760, 27243,\n 4742, 27976, 31390, 34977, 17335, 45620, 8968, 26923, 18444, 44681,\n 11091, 12126, 3094, 23215, 13893, 47978, 29766, 16336, 36405, 12646,\n 17849, 49031, 4115, 29909, 34515, 1456, 32156, 21390, 25936, 46645,\n 3489, 45235, 3126, 13128, 40168, 18175, 40819, 28019, 39620, 29182,\n 14458, 46401, 16051, 16394, 3737, 26511, 26805, 18120, 14610, 30722,\n 35974, 24086, 44058, 146, 45840, 18030, 47269, 5138, 48196, 44182,\n 15244, 10930, 10793, 23551, 41181, 5605, 28932, 3276, 19831, 14695])\nnum_labeled= 500, acc=67.35, AUC=84.90\ntensor([24681, 42151, 48978, 41040, 36909, 10197, 38770, 40649, 43279, 27934,\n 48148, 38536, 37693, 33617, 27768, 36763, 31048, 2345, 15097, 49656,\n 20265, 10365, 49091, 36836, 35285, 42465, 44296, 15003, 8209, 38023,\n 31880, 32206, 29039, 14945, 31888, 29837, 1950, 7251, 16637, 10343,\n 32460, 44820, 3295, 7168, 3092, 46523, 19682, 49635, 12477, 43592,\n 30240, 46458, 49169, 15546, 5775, 24420, 47520, 6952, 42010, 10115,\n 14778, 47740, 44771, 35626, 15594, 46750, 20568, 34313, 3909, 5346,\n 38312, 4239, 37191, 34578, 12767, 44082, 14052, 38326, 36358, 14307,\n 9373, 32378, 47242, 27961, 25097, 
3515, 35665, 30114, 39959, 8096,\n 17101, 30400, 31991, 10811, 2413, 28866, 41222, 43266, 40928, 49556,\n 8628, 24936, 35926, 15934, 8801, 27814, 37390, 12841, 22270, 10107,\n 43669, 36935, 9806, 2389, 17223, 40808, 45946, 31854, 49487, 578,\n 31363, 2318, 2765, 29571, 15248, 27576, 7540, 1985, 2762, 48358,\n 19594, 47745, 37081, 47495, 31208, 11301, 49554, 11418, 19382, 47839,\n 32147, 9476, 30495, 40431, 39933, 30439, 35339, 43609, 13277, 30945,\n 8690, 23921, 1502, 35694, 29808, 2180, 12719, 16765, 42443, 25534,\n 15779, 27037, 1946, 17861, 21238, 14035, 45675, 11831, 45001, 22067,\n 11850, 1170, 25915, 25071, 43369, 28326, 14805, 31445, 12620, 41075,\n 9266, 9809, 45475, 31209, 34179, 13843, 16410, 26315, 42237, 22553,\n 39769, 44954, 25842, 7358, 1006, 12123, 184, 17294, 22063, 590,\n 36293, 28026, 3814, 34981, 21135, 46451, 48404, 8366, 32477, 48870,\n 38750, 45437, 11815, 34305, 22797, 27326, 38971, 30453, 9585, 21701,\n 47668, 41469, 300, 44482, 17954, 38329, 45218, 39478, 3196, 39983,\n 39687, 41411, 22897, 32181, 37797, 28057, 35824, 34053, 20645, 35161,\n 36011, 21725, 48101, 22322, 34836, 16783, 9386, 28856, 33987, 21163,\n 28158, 45428, 17217, 22012, 26219, 9002, 48288, 30836, 13557, 14404,\n 10160, 23930, 28933, 45662, 11779, 3368, 8438, 26266, 10155, 30454,\n 47710, 49101, 17745, 13195, 46053, 830, 15160, 6520, 4257, 40865,\n 4598, 13985, 21168, 6648, 48427, 7555, 24978, 31495, 2191, 35482,\n 25859, 34651, 35107, 49248, 46575, 19964, 36242, 19874, 46983, 18464,\n 16904, 20152, 3486, 11894, 29780, 47462, 34297, 1271, 28738, 12147,\n 40307, 44267, 11855, 22489, 31199, 45646, 6936, 35292, 6714, 14575,\n 47132, 14125, 5086, 9343, 16013, 30947, 48088, 48762, 11743, 34946,\n 3320, 49791, 6930, 47963, 45082, 33427, 26336, 23584, 9865, 16808,\n 47306, 9073, 44093, 45471, 17137, 27683, 49609, 9591, 2770, 16432,\n 4266, 19727, 5918, 17897, 36269, 10753, 23197, 7603, 38614, 23995,\n 42134, 36588, 21049, 14030, 19049, 43978, 2412, 24154, 20144, 28055,\n 34016, 43850, 8088, 36767, 19617, 48390, 14942, 6585, 34768, 32160,\n 9024, 27938, 38887, 16586, 6981, 31913, 7132, 14107, 43633, 47683,\n 6422, 6466, 49325, 16035, 37410, 15967, 8738, 33980, 1120, 2986,\n 23932, 33744, 41766, 42979, 49518, 18912, 32018, 17323, 1974, 27259,\n 10654, 20054, 15256, 46803, 27812, 39173, 11810, 27935, 6983, 15578,\n 25954, 17146, 38810, 32496, 46321, 712, 17420, 37731, 38015, 48513,\n 20564, 3460, 3360, 381, 14656, 30589, 11499, 16383, 86, 41003,\n 47345, 17877, 1814, 25666, 6875, 9141, 23255, 6303, 33823, 11510,\n 9359, 45076, 49252, 28315, 41554, 23603, 36156, 6009, 47921, 45286,\n 18264, 31515, 47482, 34808, 41216, 34316, 43524, 43239, 40550, 8257,\n 22386, 1158, 19358, 21323, 24900, 49812, 27500, 30094, 19006, 543,\n 31998, 32997, 44519, 9963, 25435, 32757, 38375, 15864, 14073, 11794,\n 30802, 39803, 19711, 4137, 41286, 20870, 11190, 42575, 38598, 12160,\n 11341, 6091, 48161, 36335, 29858, 17315, 22879, 43307, 23185, 10318,\n 16247, 35949, 31833, 26102, 6222, 28877, 38784, 29549, 32795, 31568,\n 20322, 42243, 39846, 34133, 26785, 32333, 751, 3167, 6253, 49920,\n 16935, 49866, 48991, 39167, 35254, 47913, 30735, 29392, 32516, 36016,\n 19177, 38632, 20860, 28831, 26382, 26072, 16212, 40662, 48181, 34411,\n 20043, 48422, 27950, 12839, 43425, 45828, 6519, 23067, 38680, 18074,\n 39029, 8730, 42518, 32910, 8198, 17780, 5100, 25287, 34739, 28079,\n 38692, 45501, 27377, 44838, 21767, 38099, 25272, 10072, 13100, 42950,\n 48220, 11928, 12567, 22473, 39718, 43446, 30279, 44268, 33138, 26365,\n 41130, 34879, 3580, 
43851, 49334, 23046, 42092, 44112, 1618, 13598,\n 36044, 25569, 46340, 8832, 38677, 36881, 11751, 9278, 1836, 8557,\n 39511, 31396, 35620, 9481, 37289, 24394, 16822, 37564, 47653, 28188,\n 45944, 35444, 20432, 21644, 24946, 23479, 15385, 40549, 31140, 28169,\n 11124, 29296, 36749, 7339, 33690, 5182, 14694, 4644, 13500, 34013,\n 36586, 23708, 25417, 42099, 9814, 26089, 5596, 17351, 45518, 48895,\n 42836, 2978, 931, 37134, 9784, 43269, 35098, 2605, 37956, 34564,\n 8564, 7348, 45280, 48627, 8991, 39657, 72, 25294, 10815, 1584,\n 33787, 48401, 34388, 47873, 35614, 29292, 9742, 863, 6273, 10837,\n 18126, 21636, 33201, 33548, 40748, 17480, 45208, 16442, 10786, 14513,\n 4062, 16546, 2777, 24266, 17172, 40466, 12765, 7538, 39823, 24808,\n 37807, 4480, 18517, 8409, 15769, 440, 45506, 22942, 37635, 49162,\n 10081, 37293, 29962, 40230, 26420, 42284, 47996, 30720, 3054, 21691,\n 10422, 43896, 33469, 45230, 22687, 7833, 19956, 5778, 21601, 37273,\n 14301, 40517, 20072, 19779, 34904, 589, 10873, 23612, 21108, 7831,\n 18874, 7836, 9082, 3343, 8980, 37573, 41000, 7849, 27054, 47189,\n 15364, 31967, 10077, 17616, 1273, 35056, 18601, 3358, 20186, 31777,\n 20202, 45480, 42290, 41877, 48551, 6776, 5254, 37474, 33648, 34733,\n 18405, 20436, 35238, 30991, 5989, 42814, 29245, 39432, 48649, 42423,\n 34428, 26208, 7606, 9271, 45699, 44986, 36495, 27565, 11278, 16348,\n 47572, 30592, 13394, 29899, 40198, 43355, 6372, 11, 33773, 29185,\n 49165, 43846, 13449, 34615, 38862, 11524, 21361, 18397, 10903, 33335,\n 17560, 11906, 23965, 8483, 41980, 891, 34910, 26038, 28874, 17625,\n 22948, 8489, 11265, 36893, 6346, 11842, 21202, 16315, 24760, 27243,\n 4742, 27976, 31390, 34977, 17335, 45620, 8968, 26923, 18444, 44681,\n 11091, 12126, 3094, 23215, 13893, 47978, 29766, 16336, 36405, 12646,\n 9474, 17180, 34582, 20650, 9188, 21840, 3333, 24632, 10682, 31424,\n 48164, 4225, 40571, 627, 44534, 14402, 33713, 1782, 35352, 12610,\n 1506, 15490, 22172, 46668, 5273, 4995, 8241, 3425, 20239, 31930,\n 2160, 11667, 44384, 9514, 40789, 259, 46363, 30509, 19028, 22477,\n 485, 42502, 15951, 39365, 12041, 5600, 13616, 18221, 31846, 34120,\n 17849, 49031, 4115, 29909, 34515, 1456, 32156, 21390, 25936, 46645,\n 3489, 45235, 3126, 13128, 40168, 18175, 40819, 28019, 39620, 29182,\n 14458, 46401, 16051, 16394, 3737, 26511, 26805, 18120, 14610, 30722,\n 35974, 24086, 44058, 146, 45840, 18030, 47269, 5138, 48196, 44182,\n 15244, 10930, 10793, 23551, 41181, 5605, 28932, 3276, 19831, 14695,\n 26136, 30056, 29384, 47028, 46830, 48340, 39876, 24089, 49534, 28120,\n 19199, 24477, 43137, 27365, 26681, 39585, 7418, 6438, 43619, 39522,\n 11594, 40742, 13155, 34549, 24712, 29628, 39427, 14265, 4638, 17888,\n 34863, 22935, 5499, 53, 1390, 8479, 38402, 45813, 31216, 40083,\n 27439, 11700, 26659, 24188, 3363, 39361, 23975, 11210, 23166, 703])\n"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
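"markdown",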
"markdown"
],
[
"code"
]
] |
e79e6f5452945dde48d9093c3caa96bb2c12fbbc | 46,051 | ipynb | Jupyter Notebook | tweets-collection.ipynb | dalalbinhumaid/expo-2020-sentiment-analysis | a11ac8eed495515bf3ebd2876ab0e61f50f7eb12 | [
"MIT"
] | null | null | null | tweets-collection.ipynb | dalalbinhumaid/expo-2020-sentiment-analysis | a11ac8eed495515bf3ebd2876ab0e61f50f7eb12 | [
"MIT"
] | null | null | null | tweets-collection.ipynb | dalalbinhumaid/expo-2020-sentiment-analysis | a11ac8eed495515bf3ebd2876ab0e61f50f7eb12 | [
"MIT"
] | 1 | 2022-03-07T17:25:53.000Z | 2022-03-07T17:25:53.000Z | 83.274864 | 17,418 | 0.757421 | [
[
[
"# Introduction\nThe aim of this project is to conduct a sentiment analysis using python and twitter's API. The topic in my case is Expo 2020.\n\n**What is Expo 2020?**\n> Initiated in London 1851. It is a global gathering aimed to find solutions to challenges imposed by the current times. Aims to create enriching and immersive experience. World expo traverses through different cities each time. It also revolves around certain themes. Currently, World expo is taking place in Dubai, UAE. Between Oct, 2021 and Mar, 2022. For more information please visit [expo2020dubai](https://www.expo2020dubai.com/)\n\n**The goal of this project**\n> As discussed above the main goal is to conduct a sentiment analysis and learn the basics of data science and big data projects. Since expo 2020 is considered an educational exhibition that revolves around modern day problems and is hosted currently in the middle east. The aim is to measure the awareness of the arab society - *By arab society we mean anyone who posts their opinions in arabic*\n\n\n\nThis part is about discovering and collecting as much data as possible that is relating to the topic at hand. I will be needing tweets that are written in Arabic. \nEach step is further illustrated in its own markdown.\n",
"_____no_output_____"
]
],
[
[
"import tweepy\nimport pandas as pd\nimport numpy as np\nimport configparser\nimport matplotlib.pyplot as plt\nimport random",
"_____no_output_____"
]
],
[
[
"### 1. Configuration and Authentication \n---\nThis is the setup part and API authentication. Prior to using the API it is necessary to create a developer account, the account grants you two levels of access. A user level and an application/project level. I will be using **configparser** to ensure my API keys are not visible. I suggest you do the same. The following is how to set up the configuration process.\n\n\n1. create a project from the developer's portal\n2. generate your API and access keys\n3. save them in a 'config.ini' file in the following format:\n \n ``` ini\n [twitter]\n CONSUMER_KEY = 'YOUR CONSUMER KEY'\n CONSUMER_SECRET = 'YOUR CONSUMER SECRET'\n ACCESS_TOKEN = 'YOUR ACCESS TOKEN'\n ACCESS_TOKEN_SECRET = 'YOUR ACCESS TOKEN SECRET' \n ```\n \n4. install configparser by running `pip install configparser`\n\n> **Note:** If you don't plan on using the config parser make sure you remove the import and change the next cell accordingly. To eliminate any errors. Make sure you adhere to the same variable names!",
"_____no_output_____"
]
],
[
[
"# read the file from 'config.ini' \nconfig = configparser.ConfigParser()\nconfig.read('config.ini')\n\n# API Variables\nCONSUMER_KEY = config['twitter']['CONSUMER_KEY']\nCONSUMER_SECRET = config['twitter']['CONSUMER_SECRET']\nACCESS_TOKEN = config['twitter']['ACCESS_TOKEN']\nACCESS_TOKEN_SECRET = config['twitter']['ACCESS_TOKEN_SECRET']",
"_____no_output_____"
],
[
"# authenticate using tweepy\ndef twitter_setup():\n auth = tweepy.OAuth1UserHandler(CONSUMER_KEY, CONSUMER_SECRET) # project access\n auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET) # user access\n\n api = tweepy.API(auth = auth)\n return api\n\nextractor = twitter_setup() ",
"_____no_output_____"
]
],
[
[
"### 2. Data Collection\n---\nAfter setting up the credentials and authenticating the project. I can start extracting data using **tweepy's** API. The aim is to search different terms and different hashtags in order to collect as much entries as the API allows for. There are many limitations since I have the `Elevated Access`. The main ones is that the search API only allows search to go back 7 days. One way this was managed was starting the project early. However, the ideal goal is to be able to search the entire archive. Nonetheless, the project aims to measure the opinion at the current time. Therefore, the regular search will suffice!\n\nI have created a function that when called extracts tweets depending on a local list. This list can have as many search queires as anyone would like. Also, The function parses the needed information and stores it in a data frame. Upon each iteration it appends to the previous data frame. This is beneficial when storing the data in .csv files.",
"_____no_output_____"
]
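,
[
"For reference, the full-archive alternative would look roughly like the sketch below. This is hypothetical for this project: it requires Academic Research access, the v2 `tweepy.Client`, and a bearer token (`BEARER_TOKEN` is a placeholder):\n\n```python\nclient = tweepy.Client(bearer_token=BEARER_TOKEN)\n\n# v2 query syntax; Paginator pages through the complete archive\npages = tweepy.Paginator(client.search_all_tweets,\n query='expo dubai -is:retweet lang:ar',\n max_results=500)\n```",
"_____no_output_____"
]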
],
[
[
"def extract_tweets():\n tweets = [] # main data frame\n data = [] # temporary data frame\n columns_header = ['ID', 'Tweet', 'Timestamp', 'Likes', 'Retweets', 'Length']\n search_terms = ['@expo2020dubai -filter:retweets',\n '#expo2020 -filter:retweets',\n '#اكسبو -filter:retweets',\n 'اكسبو دبي -filter:retweets'] # search terms\n\n # fetch the tweets once prior to the iteration to append things correctly\n collected_tweets = tweepy.Cursor(extractor.search_tweets, q='expo dubai -filter:retweets', lang='ar', tweet_mode='extended').items(600)\n\n for tweet in collected_tweets:\n data.append([tweet.id, tweet.full_text, tweet.created_at,tweet.favorite_count, tweet.retweet_count, len(tweet.full_text)])\n\n tweets = pd.DataFrame(data=data, columns=columns_header) # store in original data frame\n\n for term in search_terms:\n data = []\n collected_tweets = tweepy.Cursor(extractor.search_tweets, q=term, lang='ar', tweet_mode='extended').items(600)\n\n for tweet in collected_tweets:\n data.append([tweet.id, tweet.full_text, tweet.created_at, tweet.favorite_count, tweet.retweet_count, len(tweet.full_text)])\n\n df = pd.DataFrame(data=data, columns=columns_header)\n frames = [tweets, df] \n tweets = pd.concat(frames) # append the data frame to the previous one\n\n # since we are appending data frames the index value changes each time\n # here the goal is to create a new index that is incremented by one \n tweets.insert(0, 'index', range(0, len(tweets))) \n tweets = tweets.set_index('index')\n\n # random number to ensure files don't get overwritten\n tweets.to_csv('Tweets\\\\tweets155.csv')\n \n return tweets",
"_____no_output_____"
],
[
"tweets = extract_tweets()",
"_____no_output_____"
]
],
[
[
"### 3. Preliminary Data Exploration",
"_____no_output_____"
]
],
[
[
"display(tweets.head())\ndisplay(tweets.tail())\nprint('total of collected tweets is ', len(tweets))",
"_____no_output_____"
],
[
"\ntweets.info()\n",
"<class 'pandas.core.frame.DataFrame'>\nInt64Index: 706 entries, 0 to 705\nData columns (total 6 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 ID 706 non-null int64 \n 1 Tweet 706 non-null object \n 2 Timestamp 706 non-null datetime64[ns, UTC]\n 3 Likes 706 non-null int64 \n 4 Retweets 706 non-null int64 \n 5 Length 706 non-null int64 \ndtypes: datetime64[ns, UTC](1), int64(4), object(1)\nmemory usage: 38.6+ KB\n"
]
],
[
[
"### 4. Data Visualization",
"_____no_output_____"
]
],
[
[
"plt.plot(tweets['Timestamp'], tweets['Likes'])\nplt.gcf().autofmt_xdate()\nplt.show()",
"_____no_output_____"
],
[
"plt.scatter(tweets['Length'], tweets['Likes'], color='pink')\nplt.show()\n",
"_____no_output_____"
],
[
"# tweets1 = pd.read_csv('tweets1.csv')\n# tweets2 = pd.read_csv('tweets3.csv')\n# tweets3 = pd.read_csv('tweets327.csv')\n# tweets4 = pd.read_csv('tweets1439.csv')\n# tweets5 = pd.read_csv('tweets526.csv')\n# tweets6 = pd.read_csv('tweets546.csv')\n# tweets7 = pd.read_csv('tweets129.csv')\n# tweets8 = pd.read_csv('tweets1700.csv')\n\n# tweets1 = tweets1.drop(['Unnamed: 0'], axis=1)\n# tweets2 = tweets2.drop(['Unnamed: 0'], axis=1)\n# tweets3 = tweets3.drop(['index'], axis=1)\n# tweets4 = tweets4.drop(['index'], axis=1)\n# tweets5 = tweets5.drop(['index'], axis=1)\n# tweets6 = tweets6.drop(['index'], axis=1)\n# tweets7 = tweets7.drop(['index'], axis=1)\n# tweets8 = tweets8.drop(['index'], axis=1)\n\n\n\n# frames = [tweets1, tweets2, tweets3, tweets4, tweets5, tweets6, tweets7, tweets8]\n# final_tweets = pd.concat(frames) # append the data frame to the previous one\n# final_tweets.insert(0, 'index', range(0, len(final_tweets)))\n# final_tweets = final_tweets.set_index('index')\n\n# display(final_tweets.head())\n# display(final_tweets.tail())\n\n# final_tweets.to_csv('dirty_tweets_updated.csv')\n",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
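"markdown",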
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
e79e722b0a50d3e38ef245cef866e222b0209e35 | 93,238 | ipynb | Jupyter Notebook | notebooks/EDA.ipynb | simran-grewal/COVID-19-Data-Analysis | 8751aa75c451e956d5b1d1e2d6f2ffbec8dc673a | [
"FTL"
] | null | null | null | notebooks/EDA.ipynb | simran-grewal/COVID-19-Data-Analysis | 8751aa75c451e956d5b1d1e2d6f2ffbec8dc673a | [
"FTL"
] | null | null | null | notebooks/EDA.ipynb | simran-grewal/COVID-19-Data-Analysis | 8751aa75c451e956d5b1d1e2d6f2ffbec8dc673a | [
"FTL"
] | null | null | null | 246.010554 | 83,792 | 0.906723 | [
[
[
"import pandas as pd\nimport numpy as np\n\nfrom datetime import datetime\n\n%matplotlib inline\n\nimport matplotlib as mpl\nimport matplotlib.pyplot as plt\n\nimport seaborn as sns",
"_____no_output_____"
],
[
"mpl.rcParams['figure.figsize'] = (16, 9)\npd.set_option('display.max_rows', 500)\nsns.set(style = \"darkgrid\")",
"_____no_output_____"
],
[
"df_plot = pd.read_csv('../data/processed/Covid_flat_table.csv', sep = ';')",
"_____no_output_____"
],
[
"df_plot.head()",
"_____no_output_____"
],
[
"plt.figure()\nax = df_plot.set_index('date').plot()\nax.set_yscale('log')",
"_____no_output_____"
]
],
[
[
"## Plotly",
"_____no_output_____"
]
],
[
[
"import plotly.graph_objects as go\nimport dash\nimport dash_core_components as dcc\nimport dash_html_components as html",
"_____no_output_____"
],
[
"country_list = df_plot[1:].columns",
"_____no_output_____"
],
[
"fig = go.Figure()\nfor each in country_list:\n \n fig.add_trace(go.Scatter(\n x = df_plot.date, \n y = df_plot[each], \n mode = 'markers+lines', \n name = each,\n line_width = 2,\n marker_size = 4,\n opacity = 0.9\n ))\n \n\nfig.update_layout(\n xaxis_title = 'Time',\n yaxis_title = \"Confirmed infected people (source johns hopkins, log-scale)\"\n)\n\nfig.update_yaxes(type = 'log')\nfig.update_layout(xaxis_rangeslider_visible = True)\nfig.show(renderer = 'chrome')",
"_____no_output_____"
],
[
"option_list = []\nfor each in country_list:\n label_dict = {}\n label_dict['label'] = each\n label_dict['value'] = each\n option_list.append(label_dict)",
"_____no_output_____"
],
[
"app = dash.Dash()\napp.layout = html.Div([\n html.Label('Multi-Select Country'),\n dcc.Dropdown(\n id = 'country_drop_down',\n options = option_list,\n value = ['Canada', 'India'],\n multi = True\n ),\n \n dcc.Graph(figure = fig, id = 'main_window_slope')\n])",
"_____no_output_____"
],
[
"from dash.dependencies import Input, Output\n\[email protected](\n Output('main_window_slope', 'figure'),\n [Input('country_drop_down', 'value')]\n)\n\ndef update_figure(country_list):\n traces = []\n for each in country_list:\n traces.append(dict(\n x = df_plot.date, \n y = df_plot[each], \n mode = 'markers+lines', \n name = each,\n line_width = 2,\n marker_size = 4,\n opacity = 0.9\n ))\n return {\n 'data': traces,\n 'layout': dict(\n width = 1280,\n height = 720,\n xaxis_title = \"Time\",\n yaxis_title = \"Confirmed infected people (source johns hopkins, log-scale)\", \n )\n }\n ",
"_____no_output_____"
],
[
"app.run_server(debug = True, use_reloader = False)",
"Dash is running on http://127.0.0.1:8050/\n\nDash is running on http://127.0.0.1:8050/\n\nDash is running on http://127.0.0.1:8050/\n\nDash is running on http://127.0.0.1:8050/\n\nDash is running on http://127.0.0.1:8050/\n\nDash is running on http://127.0.0.1:8050/\n\nDash is running on http://127.0.0.1:8050/\n\nDash is running on http://127.0.0.1:8050/\n\nDash is running on http://127.0.0.1:8050/\n\n * Serving Flask app \"__main__\" (lazy loading)\n * Environment: production\n WARNING: This is a development server. Do not use it in a production deployment.\n Use a production WSGI server instead.\n * Debug mode: on\n"
]
]
] | [
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79e8da660abaafe06f155118da8315274e1cefe | 22,718 | ipynb | Jupyter Notebook | Copy_of_get_started.ipynb | dlminvestments/cloudml-template | 40315bf31f2c0b0ff929c7afdbbe4e1049bef371 | [
"Apache-2.0"
] | null | null | null | Copy_of_get_started.ipynb | dlminvestments/cloudml-template | 40315bf31f2c0b0ff929c7afdbbe4e1049bef371 | [
"Apache-2.0"
] | 53 | 2020-11-17T04:53:22.000Z | 2022-02-10T02:57:21.000Z | Copy_of_get_started.ipynb | dlminvestments/cloudml-template | 40315bf31f2c0b0ff929c7afdbbe4e1049bef371 | [
"Apache-2.0"
] | null | null | null | 37.737542 | 404 | 0.553086 | [
[
[
"<a href=\"https://colab.research.google.com/github/dlminvestments/cloudml-template/blob/master/Copy_of_get_started.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"##### Copyright 2019 The TensorFlow Authors.",
"_____no_output_____"
]
],
[
[
"#@title Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.",
"_____no_output_____"
]
],
[
[
"# Get started with TensorBoard\n\n<table class=\"tfo-notebook-buttons\" align=\"left\">\n <td>\n <a target=\"_blank\" href=\"https://www.tensorflow.org/tensorboard/get_started\"><img src=\"https://www.tensorflow.org/images/tf_logo_32px.png\" />View on TensorFlow.org</a>\n </td>\n <td>\n <a target=\"_blank\" href=\"https://colab.research.google.com/github/tensorflow/tensorboard/blob/master/docs/get_started.ipynb\"><img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" />Run in Google Colab</a>\n </td>\n <td>\n <a target=\"_blank\" href=\"https://github.com/tensorflow/tensorboard/blob/master/docs/get_started.ipynb\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" />View source on GitHub</a>\n </td>\n</table>",
"_____no_output_____"
],
[
"In machine learning, to improve something you often need to be able to measure it. TensorBoard is a tool for providing the measurements and visualizations needed during the machine learning workflow. It enables tracking experiment metrics like loss and accuracy, visualizing the model graph, projecting embeddings to a lower dimensional space, and much more.\n\nThis quickstart will show how to quickly get started with TensorBoard. The remaining guides in this website provide more details on specific capabilities, many of which are not included here. ",
"_____no_output_____"
]
],
[
[
"# Load the TensorBoard notebook extension\n%load_ext tensorboard",
"_____no_output_____"
],
[
"import tensorflow as tf\nimport datetime",
"_____no_output_____"
],
[
"# Clear any logs from previous runs\n!rm -rf ./logs/ ",
"_____no_output_____"
]
],
[
[
"Using the [MNIST](https://en.wikipedia.org/wiki/MNIST_database) dataset as the example, normalize the data and write a function that creates a simple Keras model for classifying the images into 10 classes.",
"_____no_output_____"
]
],
[
[
"mnist = tf.keras.datasets.mnist\n\n(x_train, y_train),(x_test, y_test) = mnist.load_data()\nx_train, x_test = x_train / 255.0, x_test / 255.0\n\ndef create_model():\n return tf.keras.models.Sequential([\n tf.keras.layers.Flatten(input_shape=(28, 28)),\n tf.keras.layers.Dense(512, activation='relu'),\n tf.keras.layers.Dropout(0.2),\n tf.keras.layers.Dense(10, activation='softmax')\n ])",
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz\n11493376/11490434 [==============================] - 0s 0us/step\n"
]
],
[
[
"## Using TensorBoard with Keras Model.fit()",
"_____no_output_____"
],
[
"When training with Keras's [Model.fit()](https://www.tensorflow.org/api_docs/python/tf/keras/models/Model#fit), adding the `tf.keras.callbacks.TensorBoard` callback ensures that logs are created and stored. Additionally, enable histogram computation every epoch with `histogram_freq=1` (this is off by default)\n\nPlace the logs in a timestamped subdirectory to allow easy selection of different training runs.",
"_____no_output_____"
]
],
[
[
"model = create_model()\nmodel.compile(optimizer='adam',\n loss='sparse_categorical_crossentropy',\n metrics=['accuracy'])\n\nlog_dir = \"logs/fit/\" + datetime.datetime.now().strftime(\"%Y%m%d-%H%M%S\")\ntensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)\n\nmodel.fit(x=x_train, \n y=y_train, \n epochs=5, \n validation_data=(x_test, y_test), \n callbacks=[tensorboard_callback])",
"Train on 60000 samples, validate on 10000 samples\nEpoch 1/5\n60000/60000 [==============================] - 15s 246us/sample - loss: 0.2217 - accuracy: 0.9343 - val_loss: 0.1019 - val_accuracy: 0.9685\nEpoch 2/5\n60000/60000 [==============================] - 14s 229us/sample - loss: 0.0975 - accuracy: 0.9698 - val_loss: 0.0787 - val_accuracy: 0.9758\nEpoch 3/5\n60000/60000 [==============================] - 14s 231us/sample - loss: 0.0718 - accuracy: 0.9771 - val_loss: 0.0698 - val_accuracy: 0.9781\nEpoch 4/5\n60000/60000 [==============================] - 14s 227us/sample - loss: 0.0540 - accuracy: 0.9820 - val_loss: 0.0685 - val_accuracy: 0.9795\nEpoch 5/5\n60000/60000 [==============================] - 14s 228us/sample - loss: 0.0433 - accuracy: 0.9862 - val_loss: 0.0623 - val_accuracy: 0.9823\n"
]
],
[
[
"Start TensorBoard through the command line or within a notebook experience. The two interfaces are generally the same. In notebooks, use the `%tensorboard` line magic. On the command line, run the same command without \"%\".",
"_____no_output_____"
]
],
[
[
"%tensorboard --logdir logs/fit",
"_____no_output_____"
]
],
[
[
"<img class=\"tfo-display-only-on-site\" src=\"https://github.com/tensorflow/tensorboard/blob/master/docs/images/quickstart_model_fit.png?raw=1\"/>",
"_____no_output_____"
],
[
"A brief overview of the dashboards shown (tabs in top navigation bar):\n\n* The **Scalars** dashboard shows how the loss and metrics change with every epoch. You can use it to also track training speed, learning rate, and other scalar values.\n* The **Graphs** dashboard helps you visualize your model. In this case, the Keras graph of layers is shown which can help you ensure it is built correctly. \n* The **Distributions** and **Histograms** dashboards show the distribution of a Tensor over time. This can be useful to visualize weights and biases and verify that they are changing in an expected way.\n\nAdditional TensorBoard plugins are automatically enabled when you log other types of data. For example, the Keras TensorBoard callback lets you log images and embeddings as well. You can see what other plugins are available in TensorBoard by clicking on the \"inactive\" dropdown towards the top right.",
"_____no_output_____"
],
[
"## Using TensorBoard with other methods\n",
"_____no_output_____"
],
[
"When training with methods such as [`tf.GradientTape()`](https://www.tensorflow.org/api_docs/python/tf/GradientTape), use `tf.summary` to log the required information.\n\nUse the same dataset as above, but convert it to `tf.data.Dataset` to take advantage of batching capabilities:",
"_____no_output_____"
]
],
[
[
"train_dataset = tf.data.Dataset.from_tensor_slices((x_train, y_train))\ntest_dataset = tf.data.Dataset.from_tensor_slices((x_test, y_test))\n\ntrain_dataset = train_dataset.shuffle(60000).batch(64)\ntest_dataset = test_dataset.batch(64)",
"_____no_output_____"
]
],
[
[
"The training code follows the [advanced quickstart](https://www.tensorflow.org/tutorials/quickstart/advanced) tutorial, but shows how to log metrics to TensorBoard. Choose loss and optimizer:",
"_____no_output_____"
]
],
[
[
"loss_object = tf.keras.losses.SparseCategoricalCrossentropy()\noptimizer = tf.keras.optimizers.Adam()",
"_____no_output_____"
]
],
[
[
"Create stateful metrics that can be used to accumulate values during training and logged at any point:",
"_____no_output_____"
]
],
[
[
"# Define our metrics\ntrain_loss = tf.keras.metrics.Mean('train_loss', dtype=tf.float32)\ntrain_accuracy = tf.keras.metrics.SparseCategoricalAccuracy('train_accuracy')\ntest_loss = tf.keras.metrics.Mean('test_loss', dtype=tf.float32)\ntest_accuracy = tf.keras.metrics.SparseCategoricalAccuracy('test_accuracy')",
"_____no_output_____"
]
],
[
[
"Define the training and test functions:",
"_____no_output_____"
]
],
[
[
"def train_step(model, optimizer, x_train, y_train):\n with tf.GradientTape() as tape:\n predictions = model(x_train, training=True)\n loss = loss_object(y_train, predictions)\n grads = tape.gradient(loss, model.trainable_variables)\n optimizer.apply_gradients(zip(grads, model.trainable_variables))\n\n train_loss(loss)\n train_accuracy(y_train, predictions)\n\ndef test_step(model, x_test, y_test):\n predictions = model(x_test)\n loss = loss_object(y_test, predictions)\n\n test_loss(loss)\n test_accuracy(y_test, predictions)",
"_____no_output_____"
]
],
[
[
"Set up summary writers to write the summaries to disk in a different logs directory:",
"_____no_output_____"
]
],
[
[
"current_time = datetime.datetime.now().strftime(\"%Y%m%d-%H%M%S\")\ntrain_log_dir = 'logs/gradient_tape/' + current_time + '/train'\ntest_log_dir = 'logs/gradient_tape/' + current_time + '/test'\ntrain_summary_writer = tf.summary.create_file_writer(train_log_dir)\ntest_summary_writer = tf.summary.create_file_writer(test_log_dir)",
"_____no_output_____"
]
],
[
[
"Start training. Use `tf.summary.scalar()` to log metrics (loss and accuracy) during training/testing within the scope of the summary writers to write the summaries to disk. You have control over which metrics to log and how often to do it. Other `tf.summary` functions enable logging other types of data.",
"_____no_output_____"
]
],
[
[
"model = create_model() # reset our model\n\nEPOCHS = 5\n\nfor epoch in range(EPOCHS):\n for (x_train, y_train) in train_dataset:\n train_step(model, optimizer, x_train, y_train)\n with train_summary_writer.as_default():\n tf.summary.scalar('loss', train_loss.result(), step=epoch)\n tf.summary.scalar('accuracy', train_accuracy.result(), step=epoch)\n\n for (x_test, y_test) in test_dataset:\n test_step(model, x_test, y_test)\n with test_summary_writer.as_default():\n tf.summary.scalar('loss', test_loss.result(), step=epoch)\n tf.summary.scalar('accuracy', test_accuracy.result(), step=epoch)\n \n template = 'Epoch {}, Loss: {}, Accuracy: {}, Test Loss: {}, Test Accuracy: {}'\n print (template.format(epoch+1,\n train_loss.result(), \n train_accuracy.result()*100,\n test_loss.result(), \n test_accuracy.result()*100))\n\n # Reset metrics every epoch\n train_loss.reset_states()\n test_loss.reset_states()\n train_accuracy.reset_states()\n test_accuracy.reset_states()",
"Epoch 1, Loss: 0.24321186542510986, Accuracy: 92.84333801269531, Test Loss: 0.13006582856178284, Test Accuracy: 95.9000015258789\nEpoch 2, Loss: 0.10446818172931671, Accuracy: 96.84833526611328, Test Loss: 0.08867532759904861, Test Accuracy: 97.1199951171875\nEpoch 3, Loss: 0.07096975296735764, Accuracy: 97.80166625976562, Test Loss: 0.07875105738639832, Test Accuracy: 97.48999786376953\nEpoch 4, Loss: 0.05380449816584587, Accuracy: 98.34166717529297, Test Loss: 0.07712937891483307, Test Accuracy: 97.56999969482422\nEpoch 5, Loss: 0.041443776339292526, Accuracy: 98.71833038330078, Test Loss: 0.07514958828687668, Test Accuracy: 97.5\n"
]
],
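[
[
"The same pattern extends beyond scalars. As a rough sketch (the tensor name, the weight variable picked, and the reshape are illustrative assumptions, not part of this tutorial's code), other `tf.summary` functions can log histograms and images within the same writer scope:\n\n```python\n# Sketch: inside the epoch loop, log extra data types alongside the scalars.\nwith train_summary_writer.as_default():\n    # distribution of the first layer's weights\n    tf.summary.histogram('dense_weights', model.trainable_variables[0], step=epoch)\n    # a few input images, reshaped to the NHWC layout tf.summary.image expects\n    tf.summary.image('training_examples', tf.reshape(x_train[:4], (-1, 28, 28, 1)), step=epoch)\n```",
"_____no_output_____"
]
],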
[
[
"Open TensorBoard again, this time pointing it at the new log directory. We could have also started TensorBoard to monitor training while it progresses.",
"_____no_output_____"
]
],
[
[
"%tensorboard --logdir logs/gradient_tape",
"_____no_output_____"
]
],
[
[
"<img class=\"tfo-display-only-on-site\" src=\"https://github.com/tensorflow/tensorboard/blob/master/docs/images/quickstart_gradient_tape.png?raw=1\"/>",
"_____no_output_____"
],
[
"That's it! You have now seen how to use TensorBoard both through the Keras callback and through `tf.summary` for more custom scenarios. ",
"_____no_output_____"
],
[
"## TensorBoard.dev: Host and share your ML experiment results\n\n[TensorBoard.dev](https://tensorboard.dev) is a free public service that enables you to upload your TensorBoard logs and get a permalink that can be shared with everyone in academic papers, blog posts, social media, etc. This can enable better reproducibility and collaboration.\n\nTo use TensorBoard.dev, run the following command:\n",
"_____no_output_____"
]
],
[
[
"!tensorboard dev upload \\\n --logdir logs/fit \\\n --name \"(optional) My latest experiment\" \\\n --description \"(optional) Simple comparison of several hyperparameters\" \\\n --one_shot",
"2020-12-17 00:52:27.342972: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.1\n\n***** TensorBoard Uploader *****\n\nThis will upload your TensorBoard logs to https://tensorboard.dev/ from\nthe following directory:\n\nlogs/fit\n\nThis TensorBoard will be visible to everyone. Do not upload sensitive\ndata.\n\nYour use of this service is subject to Google's Terms of Service\n<https://policies.google.com/terms> and Privacy Policy\n<https://policies.google.com/privacy>, and TensorBoard.dev's Terms of Service\n<https://tensorboard.dev/policy/terms/>.\n\nThis notice will not be shown again while you are logged into the uploader.\nTo log out, run `tensorboard dev auth revoke`.\n\nContinue? (yes/NO) Yes\n\nPlease visit this URL to authorize this application: https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=373649185512-8v619h5kft38l4456nm2dj4ubeqsrvh6.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&scope=openid+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email&state=IdEQglmQZEYpNFRPpVZ7G44Y0MLdJI&prompt=consent&access_type=offline\nEnter the authorization code: "
]
],
[
[
"Note that this invocation uses the exclamation prefix (`!`) to invoke the shell\nrather than the percent prefix (`%`) to invoke the colab magic. When invoking this command from the command line there is no need for either prefix.\n\nView an example [here](https://tensorboard.dev/experiment/EDZb7XgKSBKo6Gznh3i8hg/#scalars).\n\nFor more details on how to use TensorBoard.dev, see https://tensorboard.dev/#get-started",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
e79e93292ef7aeff32418974eee6ac2a959c42d5 | 55,401 | ipynb | Jupyter Notebook | ml_repo/2. Working with Libraries/Some Common Datasets (Updated).ipynb | sachinpr0001/data_science | d028233ff7bbcbbb6b26f01806d1c5ccf788df9a | [
"MIT"
] | null | null | null | ml_repo/2. Working with Libraries/Some Common Datasets (Updated).ipynb | sachinpr0001/data_science | d028233ff7bbcbbb6b26f01806d1c5ccf788df9a | [
"MIT"
] | null | null | null | ml_repo/2. Working with Libraries/Some Common Datasets (Updated).ipynb | sachinpr0001/data_science | d028233ff7bbcbbb6b26f01806d1c5ccf788df9a | [
"MIT"
] | null | null | null | 77.701262 | 31,776 | 0.810978 | [
[
[
"# Commonly Available Datasets",
"_____no_output_____"
],
[
"### MNIST Dataset\n\nDataset of 70,000 Handwritten Digits (28X28 Images)\n\nOriginal Dataset : http://yann.lecun.com/exdb/mnist/",
"_____no_output_____"
]
],
[
[
"from sklearn.datasets import load_digits",
"_____no_output_____"
],
[
"import matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"mnist = load_digits()",
"_____no_output_____"
],
[
"X = mnist.data\nY = mnist.target",
"_____no_output_____"
],
[
"print(X.shape)\nprint(Y.shape)",
"(1797, 64)\n(1797,)\n"
],
[
"example = X[42]\nprint(Y[42])",
"1\n"
],
[
"img = example.reshape((8,8))",
"_____no_output_____"
],
[
"print(img)",
"[[ 0. 0. 0. 0. 12. 5. 0. 0.]\n [ 0. 0. 0. 2. 16. 12. 0. 0.]\n [ 0. 0. 1. 12. 16. 11. 0. 0.]\n [ 0. 2. 12. 16. 16. 10. 0. 0.]\n [ 0. 6. 11. 5. 15. 6. 0. 0.]\n [ 0. 0. 0. 1. 16. 9. 0. 0.]\n [ 0. 0. 0. 2. 16. 11. 0. 0.]\n [ 0. 0. 0. 3. 16. 8. 0. 0.]]\n"
],
[
"plt.imshow(img,cmap=\"gray\")\nplt.show()",
"_____no_output_____"
]
],
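[
[
"Note that scikit-learn's `load_digits` used above is the small 8x8 digits dataset (1,797 images), not the full MNIST referenced in the heading. A sketch of one way to fetch the full 70,000-image MNIST (requires internet access and a reasonably recent scikit-learn for the `as_frame` parameter):\n\n```python\nfrom sklearn.datasets import fetch_openml\n\n# Full MNIST: 70,000 images of 28x28 = 784 pixels (downloads on first use)\nmnist_full = fetch_openml('mnist_784', version=1, as_frame=False)\nprint(mnist_full.data.shape)  # (70000, 784)\n```",
"_____no_output_____"
]
],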
[
[
"### Boston Dataset\nHousing Prices Dataset",
"_____no_output_____"
]
],
[
[
"from sklearn.datasets import load_boston\nboston = load_boston()",
"_____no_output_____"
],
[
"X = boston.data\nY = boston.target",
"_____no_output_____"
],
[
"print(X.shape)",
"(506, 13)\n"
],
[
"print(Y.shape)",
"(506,)\n"
]
],
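[
[
"Note: `load_boston` was deprecated in scikit-learn 1.0 and removed in 1.2, so the cells above fail on recent versions. A sketch of a drop-in regression dataset for newer installs (it downloads data on first use) is the California housing dataset:\n\n```python\nfrom sklearn.datasets import fetch_california_housing\n\n# 20,640 samples with 8 numeric features; target is the median house value\nhousing = fetch_california_housing()\nX, Y = housing.data, housing.target\nprint(X.shape, Y.shape)  # (20640, 8) (20640,)\n```",
"_____no_output_____"
]
],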
[
[
"#### Loading & Visualising MNIST Dataset using Pandas & Matplotlib",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"df = pd.read_csv(\"../Datasets/MNIST-2/mnist_train.csv\")",
"_____no_output_____"
],
[
"df.shape",
"_____no_output_____"
],
[
"df.head(n=3)",
"_____no_output_____"
],
[
"print(type(df))",
"<class 'pandas.core.frame.DataFrame'>\n"
],
[
"data = df.values\nnp.random.shuffle(data)",
"_____no_output_____"
],
[
"print(type(data))",
"<class 'numpy.ndarray'>\n"
],
[
"print(data.shape)\n",
"(42000, 785)\n"
],
[
"X = data[ : ,1: ]\nY = data[ : ,0]",
"_____no_output_____"
],
[
"print(X.shape,Y.shape)",
"(42000, 784) (42000,)\n"
],
[
"## Try to visualise one image",
"_____no_output_____"
],
[
"def drawImg(X,Y,i):\n plt.imshow(X[i].reshape(28,28),cmap='gray')\n plt.title(\"Label \"+ str(Y[i]))\n plt.show()\n \n \nfor i in range(1):\n drawImg(X,Y,i)",
"_____no_output_____"
],
[
"## Split this dataset => \n\nsplit = int(0.80*X.shape[0])\nprint(split)",
"33600\n"
],
[
"X_train,Y_train = X[ :split, :], Y[:split]\nX_test,Y_test = X[split: , :], Y[split: ]\n\nprint(X_train.shape,Y_train.shape)\n\nprint(X_test.shape,Y_test.shape)",
"(33600, 784) (33600,)\n(8400, 784) (8400,)\n"
],
[
"# Randomization\nimport numpy as np\na = np.array([1,2,3,4,5])\nnp.random.shuffle(a)\n\nprint(a)",
"[5 2 3 4 1]\n"
],
[
"# Randomly shuffle a 2 D array\na = np.array([[1,2,3],\n [4,5,6],\n [7,8,9]])\n\nnp.random.shuffle(a)\nprint(a)",
"[[1 2 3]\n [7 8 9]\n [4 5 6]]\n"
],
[
"# Try to plot a visualisation (Grid of first 25 images 5 X 5)\n\nplt.figure(figsize=(10,10))\nfor i in range(25):\n plt.subplot(5,5,i+1)\n plt.imshow(X_train[i].reshape(28,28),cmap='gray')\n plt.title(Y_train[i])\n plt.axis(\"off\")\n ",
"_____no_output_____"
],
[
"# last thing \nfrom sklearn.model_selection import train_test_split\n\nXT,Xt,YT,Yt = train_test_split(X,Y,test_size=0.2,random_state=5)\nprint(XT.shape,YT.shape)\nprint(Xt.shape,Yt.shape)",
"(33600, 784) (33600,)\n(8400, 784) (8400,)\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79e935b8942e5438c2a48881f704db0a326b21f | 14,614 | ipynb | Jupyter Notebook | completed/00. Python Fundamentals (Part 1).ipynb | cjwinchester/cfj-2017 | db14686b0269303eb1db5942dd30b3a28775fb0b | [
"MIT"
] | 4 | 2017-10-20T02:56:21.000Z | 2019-04-10T14:59:31.000Z | completed/00. Python Fundamentals (Part 1).ipynb | cjwinchester/cfj-2017 | db14686b0269303eb1db5942dd30b3a28775fb0b | [
"MIT"
] | 5 | 2020-03-24T15:29:43.000Z | 2021-06-01T21:50:07.000Z | completed/00. Python Fundamentals (Part 1).ipynb | cjwinchester/cfj-2017 | db14686b0269303eb1db5942dd30b3a28775fb0b | [
"MIT"
] | 2 | 2020-08-18T19:21:49.000Z | 2020-12-15T04:28:34.000Z | 26.474638 | 347 | 0.548994 | [
[
[
"# Python fundamentals\n\nA quick introduction to the [Python programming language](https://www.python.org/) and [Jupyter notebooks](https://jupyter.org/). ([We're using Python 3, not Python 2](https://pythonclock.org/).)",
"_____no_output_____"
],
[
"### Basic data types and the print() function",
"_____no_output_____"
]
],
[
[
"# variable assignment\n# https://www.digitalocean.com/community/tutorials/how-to-use-variables-in-python-3\n\n# strings -- enclose in single or double quotes, just make sure they match\nmy_name = 'Cody'\n\n# numbers\nint_num = 6\nfloat_num = 6.4\n\n# the print function\nprint(8)\nprint('Hello!')\nprint(my_name)\nprint(int_num)\nprint(float_num)\n\n# booleans\nprint(True)\nprint(False)\nprint(4 > 6)\nprint(6 == 6)\nprint('ell' in 'Hello')",
"_____no_output_____"
]
],
[
[
"### Basic math\n\nYou can do [basic math](https://www.digitalocean.com/community/tutorials/how-to-do-math-in-python-3-with-operators) with Python. (You can also do [more advanced math](https://docs.python.org/3/library/math.html).)",
"_____no_output_____"
]
],
[
[
"# addition\nadd_eq = 4 + 2\n\n# subtraction\nsub_eq = 4 - 2\n\n# multiplication\nmult_eq = 4 * 2\n\n# division\ndiv_eq = 4 / 2\n\n# etc.",
"_____no_output_____"
]
],
[
[
"### Lists\n\nA comma-separated collection of items between square brackets: `[]`. Python keeps track of the order of things inside a list.",
"_____no_output_____"
]
],
[
[
"# create a list: name, hometown, age\n# an item's position in the list is the key thing\ncody = ['Cody', 'Midvale, WY', 32]\n\n# create another list of mixed data\nmy_list = [1, 2, 3, 'hello', True, ['a', 'b', 'c']]\n\n# use len() to get the number of items in the list\nmy_list_count = len(my_list)\n\nprint('There are', my_list_count, 'items in my list.')\n\n# use square brackets [] to access items in a list\n# (counting starts at zero in Python)\n\n# get the first item\nfirst_item = my_list[0]\nprint(first_item)\n\n# you can do negative indexing to get items from the end of your list\n\n# get the last item\nlast_item = my_list[-1]\nprint(last_item)\n\n# Use colons to get a range of items in a list\n\n# get the first two items\n# the last number in a list slice is the first list item that's ~not~ included in the result\nmy_range = my_list[0:2]\nprint(my_range)\n\n# if you leave the last number off, it takes the item at the first number's index and everything afterward\n# get everything from the third item onward\nmy_open_range = my_list[2:]\nprint(my_open_range)\n\n# Use append() to add things to a list\nmy_list.append(5)\nprint(my_list)\n\n# Use pop() to remove items from the end of a list\nmy_list.pop()\nprint(my_list)\n\n# use join() to join items from a list into a string with a delimiter of your choosing\nletter_list = ['a', 'b', 'c']\njoined_list = '-'.join(letter_list)\nprint(joined_list)",
"_____no_output_____"
]
],
[
[
"### Dictionaries\n\nA data structure that maps _keys_ to _values_ inside curly brackets: `{}`. Items in the dictionary are separated by commas. Python does not keep track of the order of items in a dictionary; if you need to keep track of insertion order, use an [OrderedDict](https://docs.python.org/3/library/collections.html#collections.OrderedDict) instead.",
"_____no_output_____"
]
],
[
[
"my_dict = {'name': 'Cody', 'title': 'Training director', 'organization': 'IRE'}\n\n# Access items in a dictionary using square brackets and the key (typically a string)\nmy_name = my_dict['name']\nprint(my_name)\n\n# You can also use the `get()` method to retrieve values\n# you can optionally provide a second argument as the default value\n# if the key doesn't exist (otherwise defaults to `None`)\nmy_name = my_dict.get('name', 'Jefferson Humperdink')\nprint(my_name)\n\n# Use the .keys() method to get the keys of a dictionary\nprint(my_dict.keys())\n\n# Use the .values() method to get the values\nprint(my_dict.values())\n\n# add items to a dictionary using square brackets, the name of the key (typically a string)\n# and set the value like you'd set a variable, with =\nmy_dict['my_age'] = 32\nprint(my_dict)\n\n# delete an item from a dictionary with `del`\ndel my_dict['my_age']\nprint(my_dict)",
"_____no_output_____"
]
],
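[
[
"A minimal sketch of the `OrderedDict` mentioned above, for code that must preserve insertion order on Python versions older than 3.7:\n\n```python\nfrom collections import OrderedDict\n\nordered = OrderedDict()\nordered['name'] = 'Cody'\nordered['title'] = 'Training director'\nordered['organization'] = 'IRE'\n\n# keys come back in the order they were inserted\nprint(list(ordered.keys()))\n```",
"_____no_output_____"
]
],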
[
[
"### Commenting your code\n\nPython skips lines that begin with a hashtag # -- these lines are used to write comments to help explain the code to others (and to your future self).\n\nMulti-line comments are enclosed between triple quotes: \"\"\" \"\"\"",
"_____no_output_____"
]
],
[
[
"# this is a one-line comment\n\n\"\"\"\nThis is a \nmulti-line comment\n\n~~~\n\n\"\"\"",
"_____no_output_____"
]
],
[
[
"### Comparison operators\n\nWhen you want to [compare values](https://docs.python.org/3/reference/expressions.html#value-comparisons), you can use these symbols:\n\n- `<` means less than\n- `>` means greater than\n- `==` means equal\n- `>=` means greater than or equal\n- `<=` means less than or equal\n- `!=` means not equal",
"_____no_output_____"
]
],
[
[
"4 > 6\n\n'Hello!' == 'Hello!'\n\n(2 + 2) != (4 * 2)\n\n100.2 >= 100",
"_____no_output_____"
]
],
[
[
"### String functions\n\nPython has a number of built-in methods to work with strings. They're useful if, say, you're using Python to clean data. Here are a few of them:",
"_____no_output_____"
],
[
"#### _strip()_\n\nCall `strip()` on a string to remove whitespace from either side. It's like using the `=TRIM()` function in Excel.",
"_____no_output_____"
]
],
[
[
"whitespace_str = ' hello! '\nprint(whitespace_str.strip())",
"_____no_output_____"
]
],
[
[
"#### _upper()_ and _lower()_\n\nCall `.upper()` on a string to make the characters uppercase. Call `.lower()` on a string to make the characters lowercase. This can be useful when testing strings for equality.",
"_____no_output_____"
]
],
[
[
"my_name = 'Cody'\n\nmy_name_upper = my_name.upper()\nprint(my_name_upper)\n\nmy_name_lower = my_name.lower()\nprint(my_name_lower)",
"_____no_output_____"
]
],
[
[
"#### _replace()_\n\nUse `.replace()` to substitute bits of text.",
"_____no_output_____"
]
],
[
[
"company = 'Bausch & Lomb'\n\ncompany_no_ampersand = company.replace('&', 'and')\n\nprint(company_no_ampersand)",
"_____no_output_____"
]
],
[
[
"#### _split()_\n\nUse `.split()` to split a string on some delimiter. If you don't specify a delimiter, it uses a single space as the default.",
"_____no_output_____"
]
],
[
[
"date = '6/4/2011'\n\ndate_split = date.split('/')\n\nprint(date_split)",
"_____no_output_____"
]
],
[
[
"#### _zfill()_\n\nAmong other things, you can use `.zfill()` to add zero padding -- for instance, if you're working with ZIP code data that was saved as a number somewhere and you've lost the leading zeroes for that handful of ZIP codes that begin with 0.\n\n_Note: `.zfill()` is a string method, so if you want to apply it to a number, you'll need to first coerce it to a string with `str()`._",
"_____no_output_____"
]
],
[
[
"mangled_zip = '2301'\nfixed_zip = mangled_zip.zfill(5)\nprint(fixed_zip)\n\nnum_zip = 2301\nfixed_num_zip = str(num_zip).zfill(5)\nprint(fixed_num_zip)",
"_____no_output_____"
]
],
[
[
"#### _slicing_\n\nLike lists, strings are _iterables_, so you can use slicing to grab chunks.",
"_____no_output_____"
]
],
[
[
"my_string = 'supercalifragilisticexpialidocious'\n\nchunk = my_string[9:20]\n\nprint(chunk)",
"_____no_output_____"
]
],
[
[
"#### _startswith()_, _endswith()_ and _in_\n\nIf you need to test whether a string starts with a series of characters, use `.startswith()`. If you need to test whether a string ends with a series of characters, use `.endswith()`. If you need to test whether a string is part of another string -- or a list of strings -- use `.in()`.\n\nThese are case sensitive, so you'd typically `.upper()` or `.lower()` the strings you're comparing to ensure an apples-to-apples comparison.",
"_____no_output_____"
]
],
[
[
"str_to_test = 'hello'\n\nprint(str_to_test.startswith('hel'))\nprint(str_to_test.endswith('lo'))\nprint('el' in str_to_test)\nprint(str_to_test in ['hi', 'whatsup', 'salutations', 'hello'])",
"_____no_output_____"
]
],
[
[
"### String formatting\n\nUsing curly brackets with the [various options](https://pyformat.info/) available to the `.format()` method, you can create string templates for your data. Some examples:",
"_____no_output_____"
]
],
[
[
"# date in m/d/yyyy format\nin_date = '8/17/1982'\n\n# split out individual pieces of the date\n# using a shortcut method to assign variables to the resulting list\nmonth, day, year = in_date.split('/')\n\n# reshuffle as yyyy-mm-dd using .format()\n# use a formatting option (:0>2) to left-pad month/day numbers with a zero\nout_date = '{}-{:0>2}-{:0>2}'.format(year, month, day)\n\nprint(out_date)",
"_____no_output_____"
],
[
"# construct a greeting template\ngreeting = 'Hello, {}! My name is {}.'\nyour_name = 'Pat'\nmy_name = 'Cody'\n\nprint(greeting.format(your_name, my_name))",
"_____no_output_____"
]
],
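[
[
"On Python 3.6 and newer, f-strings offer a more compact alternative to `.format()`, and the same formatting options work after the colon. A sketch using the date example above:\n\n```python\nmonth, day, year = '8/17/1982'.split('/')\n\n# same yyyy-mm-dd reshuffle, written as an f-string\nout_date = f'{year}-{month:0>2}-{day:0>2}'\nprint(out_date)\n```",
"_____no_output_____"
]
],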
[
[
"### Type coercion\n\nConsider:\n\n```python\n# this is a number, can't do string-y things to it\nage = 32\n\n# this is a string, can't do number-y things to it\nage = '32'\n```\nThere are several functions you can use to _coerce_ a value of one type to a value of another type. Here are a couple of them:\n\n- `int()` tries to convert to an integer\n- `str()` tries to convert to a string\n- `float()` tries to convert to a float",
"_____no_output_____"
]
],
[
[
"# two strings of numbers\nnum_1 = '100'\nnum_2 = '200'\n\n# what happens when you add them without coercing?\nconcat = num_1 + num_2\nprint(concat)\n\n# coerce to integer, then add them\nadded = int(num_1) + int(num_2)\nprint(added)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79e9a96e8cae562b0f63cf48caa868537fd8cf3 | 70,690 | ipynb | Jupyter Notebook | notebooks/evaluation_drm_ec.ipynb | DRyanMiller/Echo_Chamber | edeac4289e739b5a9f53e522332b8408e099ffb8 | [
"ADSL"
] | 1 | 2019-07-17T18:28:51.000Z | 2019-07-17T18:28:51.000Z | notebooks/evaluation_drm_ec.ipynb | DRyanMiller/Echo_Chamber | edeac4289e739b5a9f53e522332b8408e099ffb8 | [
"ADSL"
] | null | null | null | notebooks/evaluation_drm_ec.ipynb | DRyanMiller/Echo_Chamber | edeac4289e739b5a9f53e522332b8408e099ffb8 | [
"ADSL"
] | null | null | null | 59.204355 | 17,456 | 0.769529 | [
[
[
"# Evaluation - Echo Chamber",
"_____no_output_____"
],
[
"The primary goal of this project is to provide users with recommendations that are different from those produced by an ALS recommendation system, but not too different. To evaluate the performance of the augmented, four metrics were developed to evaluate how different the movies recommended by the augmented model are from those recommened by the ALS model.\n\nThe <a href='#metric1'>first metric</a> looks a user's Top 100 ALS recommendations (based on predicted user ratings) and the Top 100 recommendations from the user's cluster (based on the cluster centroid's ratings). The proportion of the ALS recommended movies also found in the cluster recommendations should be high. A high proportion indicates that the user's cluster is representative of the user's movie preferences. \n\nThe <a href='#metric2'>second metric</a> looks a user's Top 100 ALS recommendations (based on predicted user ratings) and the top recommendations from the augmented model (based on the cluster centroid's ratings for the two clusters nearest to the user's cluster). The proportion of the ALS recommended movies also found in the augmented model recommendations should be low. A low proportion indicates that the recommendations from the augmented model differ from those produced by the ALS model.\n\nThe <a href='#metric3'>third metric</a> utilizes the distance between movies (based on the ALS item factors) to evaluate the extent to which the movies from the augmented model are qualitatively different from the movies from the ALS model. For this metric, the mean squared distance between the Top 100 movies from the ALS model (excluding the distance between a movie and itself) is calculated for each user in the sample. Likewise, the mean squared distance between each of the Top 100 ALS movies and each of the top movies from augmented model is calculated for each user in the sample. The difference between these mean squared distances for the sample are tested using a t-test. A negative and statistically significant t-statistic indicates the mean difference between the two sets of recommendations is greater than the mean difference within the ALS recommendations and, thus, the movies from the augmented model are qualitatively different from the movies from the ALS model.\n\nThe <a href='#metric4'>final metric</a> evaluates whether the movies recommended by the augmented model are too qualitatively different from the ALS recommendations. The metric tests the difference between two differences: the difference between ALS and the augmented model and the difference between ALS and recommendations from the two clusters furthest away from the user's cluster. These differences are calculated in the same way as described above for third metric. The difference in differences is tested using a t-test. A positive and statistically significant t-statistic indicates the difference between the ALS recommendations and those from the furthest clusters is greater than the distance between the ALS recommendations and the augmented model's recommendations. A greater distance is evidence that the augmented model recommendations are not too different from the ALS recommendations since the movies recommended by the furthest clusters are more different. \n\nThe performance of the augmented model is evaluated in the cells below by applying the above metrics to a sample of 1000 users from the MovieLens dataset. The links above can be used to skip to a particular metric in Sections 5 - 8. 
Sections 2 - 4 demonstrate the process for importing, sampling, filtering, and engineering the data for evaluation. ",
"_____no_output_____"
],
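[
"As a toy illustration of the overlap computation behind the first two metrics (the movie IDs here are made up for the example), the proportion reduces to a set intersection:\n\n```python\n# Hypothetical Top-5 lists for one user\nals_top = {101, 102, 103, 104, 105}\ncluster_top = {103, 104, 105, 106, 107}\n\nproportion_in_common = len(als_top & cluster_top) / len(als_top)\nprint(proportion_in_common)  # 0.6\n```",
"_____no_output_____"
],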
[
"## Local Code Imports",
"_____no_output_____"
]
],
[
[
"from src import model as mdl\nfrom src import custom as cm",
"_____no_output_____"
]
],
[
[
"## Code Imports",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport numpy as np\nfrom joblib import load\nfrom scipy import stats\nimport matplotlib.pyplot as plt\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"# Data",
"_____no_output_____"
],
[
"## Import Data Files",
"_____no_output_____"
]
],
[
[
"user_fac = pd.read_csv('../data/processed/user_factors.csv', index_col='id')",
"_____no_output_____"
],
[
"item_fac = pd.read_csv(\n '../data/processed/item_factors_unstacked.csv', index_col='id')",
"_____no_output_____"
]
],
[
[
"## Build Rankings Matrix",
"_____no_output_____"
],
[
"The following cell multiplies the user factors and item factors calculated from the ALS model implementation to get the ALS movie ratings for each user (see the SparkALS.py file in the model folder for the ALS implementation code).",
"_____no_output_____"
]
],
[
[
"ALS_rankings_matrix = user_fac.to_numpy().dot(item_fac.T.to_numpy())",
"_____no_output_____"
],
[
"ALS_rankings_matrix.shape",
"_____no_output_____"
]
],
[
[
"## Sample Users",
"_____no_output_____"
],
[
"To sample users, an index of random values is generated. The index is then used to filter rows from the user factors object (user_fac) and the ALS rankings matrix. The sample is transformed from a numpy array to a dataframe, the columns set to be the movie IDs (take from the item_fac index), and then transposed so that each column represents a user and each row a movie. Lastly, the index is reset making the movieId a column in the dataframe. ",
"_____no_output_____"
]
],
[
[
"idx = np.random.randint(0, 243658, size=1000)",
"_____no_output_____"
],
[
"sample_user_facs = user_fac.to_numpy()[idx, :]",
"_____no_output_____"
],
[
"sample = ALS_rankings_matrix[idx, :]",
"_____no_output_____"
],
[
"sample_df = pd.DataFrame(sample)",
"_____no_output_____"
],
[
"sample_df.columns = item_fac.index",
"_____no_output_____"
],
[
"sample_T = sample_df.T",
"_____no_output_____"
],
[
"sample_T.reset_index(inplace=True)",
"_____no_output_____"
]
],
[
[
"## Filter for Most Rated Movies",
"_____no_output_____"
],
[
"The ALS and augmented models used in this project only recommend movies that have been rated by more than 50 users. The file most_rated.csv contains the movie ID, title, and genre for these often rated movies. Using the most rated movies, the sample is filtered to include only those movies with more than 50 user reviews. This is done so that the evaluation below is based on the same set of possible movie recommendations as used in the recommendation system. ",
"_____no_output_____"
]
],
[
[
"most_rated = pd.read_csv(\n '../data/processed/most_rated.csv', index_col='Unnamed: 0')",
"_____no_output_____"
],
[
"sample_redx = pd.merge(sample_T, most_rated, how='inner',\n left_on='id', right_on='movieId')",
"_____no_output_____"
],
[
"sample_redx.set_index('id', inplace=True)",
"_____no_output_____"
],
[
"sample_redx.drop(['movieId', 'title', 'genres'], axis=1, inplace=True)",
"_____no_output_____"
]
],
[
[
"# Get ALS Top 100 for Sample",
"_____no_output_____"
],
[
"The following cell creates a list of the Top 100 rated movies for each user based on the ALS model. ",
"_____no_output_____"
]
],
[
[
"als_top_100s = []\nfor idx, col in enumerate(sample_redx):\n top_100 = sample_redx[col].sort_values(ascending=False).head(100)\n top_100_df = pd.DataFrame(top_100)\n top_100_df.reset_index(inplace=True)\n als_top_100s.append(top_100_df)",
"_____no_output_____"
]
],
[
[
"# Get Cluster Top 100 for Sample",
"_____no_output_____"
],
[
"The augmented model gets recommendations from the two clusters nearest to the user's cluster. The following cells predict the cluster of each user in the sample, gets creates a dataframe with the movie ratings for each cluster, and generates a list of the Top 100 movies recommendations for each user in the sample from the user's predicted cluster. ",
"_____no_output_____"
],
[
"## Predict Users' Clusters",
"_____no_output_____"
]
],
[
[
"gbc = load('../models/fifp_classification.joblib')",
"_____no_output_____"
],
[
"preds = gbc.predict(sample_user_facs)",
"_____no_output_____"
]
],
[
[
"## Get Cluster Centroid Ratings",
"_____no_output_____"
]
],
[
[
"centroids = pd.read_csv(\n '../data/processed/centroids.csv', index_col='Unnamed: 0')\ncentroid_ratings_T_df = cm.get_centroid_ratings(centroids, item_fac)",
"_____no_output_____"
],
[
"centroid_ratings_T_df.reset_index(inplace=True)",
"_____no_output_____"
],
[
"centroid_ratings_redx = pd.merge(\n centroid_ratings_T_df, most_rated, how='inner', left_on='id', right_on='movieId')",
"_____no_output_____"
],
[
"centroid_ratings_redx.set_index('id', inplace=True)",
"_____no_output_____"
],
[
"centroid_ratings_redx.drop(\n ['movieId', 'title', 'genres'], axis=1, inplace=True)",
"_____no_output_____"
]
],
[
[
"## Get Cluster Top 100",
"_____no_output_____"
]
],
[
[
"cluster_top_100s = []\nfor cluster in preds:\n top_100 = centroid_ratings_redx[cluster].sort_values(\n ascending=False).head(100)\n top_100_df = pd.DataFrame(top_100)\n top_100_df.reset_index(inplace=True)\n cluster_top_100s.append(top_100_df)",
"_____no_output_____"
]
],
[
[
"<a id='metric1'></a>",
"_____no_output_____"
],
[
"# Evaluation Metric 1: Proportion of Top 100 Movies Shared between the ALS recommendations and the User's Cluster Centroid",
"_____no_output_____"
],
[
"The first metric evaluates the similarity between the top 100 ALS recommendations (based on predicted user ratings) and those generated by the user's cluster (based on the cluster centroid's ratings). The proportion of the ALS recommended movies also found in the cluster recommendations should be high indicating the user's cluster is representative of the user's movie preferences.",
"_____no_output_____"
]
],
[
[
"proportions1 = []\nfor i in range(len(cluster_top_100s)):\n als_set = set(als_top_100s[i].iloc[:, 0])\n cluster_set = set(cluster_top_100s[i].iloc[:, 0])\n intersection = als_set.intersection(cluster_set)\n n_in_common = len(intersection)\n proportion_in_common = (n_in_common/100)\n proportions1.append(proportion_in_common)",
"_____no_output_____"
],
[
"plt.figure(figsize=(12, 6))\nplt.title('Proportion of Movies Common to a User\\'s ALS Recommendations and a User\\'s Own Cluster\\'s Recommendations')\nplt.ylabel('Frequencies')\nplt.xlabel('Proportion of Movies in Common')\nplt.hist(proportions1, bins=20)\nplt.show;\nplt.savefig('../reports/figures/Metric1.png')",
"_____no_output_____"
]
],
[
[
"<a id='metric2'></a>",
"_____no_output_____"
],
[
"# Evaluation Metric 2: Proportion of Top 100 Movies Shared between the ALS and Augmented Recommendations",
"_____no_output_____"
],
[
"The second metric evaluates the similarity between the top 100 ALS recommendations (based on predicted user ratings) and the top recommendations from the augmented model. The proportion of the ALS recommended movies also found in the augmented model recommendations should be low indicating the recommendations from the augmented model differ from those produced by the ALS model.",
"_____no_output_____"
]
],
[
[
"cluster_distances = pd.read_csv(\n '../data/processed/cluster_distances_df.csv', index_col='Unnamed: 0')",
"_____no_output_____"
],
[
"cluster_distances.head()",
"_____no_output_____"
],
[
"cm.get_nearest_clusters(cluster_distances, '8')",
"_____no_output_____"
],
[
"proportions2 = []\nfor i in range(len(cluster_top_100s)):\n j = preds[i]\n nearest_clusters = cm.get_nearest_clusters(\n cluster_distances, '{}'.format(j))\n cluster1 = nearest_clusters[0]\n cluster2 = nearest_clusters[1]\n als_set = set(als_top_100s[i].iloc[:, 0])\n cluster1_set = set(cluster_top_100s[cluster1].iloc[:, 0])\n cluster2_set = set(cluster_top_100s[cluster2].iloc[:, 0])\n cluster_full = cluster1_set.union(cluster2_set)\n intersection = als_set.intersection(cluster_full)\n n_in_common = len(intersection)\n proportion_in_common = (n_in_common/100)\n proportions2.append(proportion_in_common)",
"_____no_output_____"
],
[
"plt.figure(figsize=(12, 6))\nplt.title('Proportion of Movies Common to a User\\'s ALS and Augmented Recommendations')\nplt.ylabel('Frequencies')\nplt.xlabel('Proportion of Movies in Common')\nplt.hist(proportions2, bins=20)\nplt.show;\nplt.savefig('../reports/figures/Metric2.png')",
"_____no_output_____"
]
],
[
[
"<a id='metric3'></a>",
"_____no_output_____"
],
[
"# Evaluation Metric 3: Distances Between ALS Recommended Movies and Nearest Cluster Recommended Movies",
"_____no_output_____"
],
[
"In addition to recommending a different set of movies than the ALS model, the goal of the augmented model is to recommend movies that are qualitatively more diverse. To test for differences in the qualities of movie recommended by each model, the third evaluation metric assesses the distances between movies. For each user, the mean squared distance between movies recommended by the ALS model is calculated. Then, the mean squared distance between movies recommended by ALS model and movies recommended by the augmented model is calculated for each user. For the sample, the difference in the two mean squared distance calculations is tested using a t-test. A negative and statistically significant t-statistic indicates the mean difference between the two sets of recommendations is greater than the mean difference within the ALS recommendations and, thus, the movies from the augmented model are qualitatively different from the movies from the ALS model.",
"_____no_output_____"
]
],
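[
[
"The distance matrix used below comes from the project's `cm.get_cluster_distances` helper applied to the ALS item factors. Assuming that helper computes pairwise Euclidean distances (an assumption based on how it is used here), an equivalent sketch with SciPy would be:\n\n```python\nfrom scipy.spatial.distance import cdist\n\n# Pairwise Euclidean distances between all movies in item-factor space;\n# entry [j, k] is the distance between movie j and movie k.\npairwise = cdist(item_fac.to_numpy(), item_fac.to_numpy(), metric='euclidean')\nprint(pairwise.shape)\n```",
"_____no_output_____"
]
],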
[
[
"movie_distances = cm.get_cluster_distances(item_fac)",
"_____no_output_____"
],
[
"movie_distances.columns = item_fac.index",
"_____no_output_____"
],
[
"movie_distances.index = item_fac.index",
"_____no_output_____"
],
[
"nearest_clusters_top = []\nfor i in range(len(cluster_top_100s)):\n j = preds[i]\n nearest_clusters = cm.get_nearest_clusters(\n cluster_distances, '{}'.format(j))\n cluster1 = nearest_clusters[0]\n cluster2 = nearest_clusters[1]\n cluster1_set = set(cluster_top_100s[cluster1].iloc[:, 0])\n cluster2_set = set(cluster_top_100s[cluster2].iloc[:, 0])\n cluster_full_set = cluster1_set.union(cluster2_set)\n cluster_full_list = list(cluster_full_set)\n nearest_clusters_top.append(cluster_full_list)",
"_____no_output_____"
],
[
"MSS_distances_within = []\nfor i in range(len(als_top_100s)):\n within_distances = []\n for j in als_top_100s[i].iloc[:, 0]:\n for k in als_top_100s[i].iloc[:, 0]:\n if j == k:\n pass\n else:\n within_distances.append(movie_distances[j][k])\n sq_within_distances = [x**2 for x in within_distances]\n MSS_distances = np.mean(sq_within_distances)\n MSS_distances_within.append(MSS_distances)",
"_____no_output_____"
],
[
"MSS_distances_between = []\nfor i in range(len(als_top_100s)):\n between_distances = []\n for j in als_top_100s[i].iloc[:, 0]:\n for k in nearest_clusters_top[i]:\n between_distances.append(movie_distances[j][k])\n sq_between_distances = [x**2 for x in between_distances]\n MSS_distances = np.mean(sq_between_distances)\n MSS_distances_between.append(MSS_distances)",
"_____no_output_____"
],
[
"stats.ttest_ind(MSS_distances_within, MSS_distances_between, equal_var=False)",
"_____no_output_____"
]
],
[
[
"<a id='metric4'></a>",
"_____no_output_____"
],
[
"# Evaluation: Difference in Distances Between ALS Recommended Movies and Nearest Cluster Recommended Movies and the Distances Between ALS Recommended Movies and Furthest Cluster Recommended Movies",
"_____no_output_____"
],
[
"The goal of the augmented model is to recommend movies that are qualitatively different, but not too different, from those provided by the ALS model. Recommendations that are too different will be ignored or, worse, could result in the user no longer using the recommendation service. \n\nTo test that the recommendations provided by the augmented model are not too different for the ALS recommendation, metric 4 compares the differences between the ALS and augmented recommendations to the difference between the ALS recommendations and those provided by the two clusters furthest from the user's cluster. Recommendations from the furthest clusters should be more different than recommendations from the nearest clusters (i.e., the augmented model). For the sample, the difference in differences is tested using a t-test. A positive and statistically significant t-statistic indicates the difference between the ALS recommendations and the furthest cluster recommendations is greater than the difference between the ALS recommendations and the augmented recommendations. If the difference is positive and statitically significant, then the augmented recommendations are not as different as the recommendations from the furthest clusters and therefore are not too different from the ALS recommendations. ",
"_____no_output_____"
]
],
[
[
"furthest_clusters_top = []\nfor i in range(len(cluster_top_100s)):\n j = preds[i]\n nearest_clusters = cm.get_furthest_clusters(\n cluster_distances, '{}'.format(j))\n cluster1 = nearest_clusters[0]\n cluster2 = nearest_clusters[1]\n cluster1_set = set(cluster_top_100s[cluster1].iloc[:, 0])\n cluster2_set = set(cluster_top_100s[cluster2].iloc[:, 0])\n cluster_full_set = cluster1_set.union(cluster2_set)\n cluster_full_list = list(cluster_full_set)\n furthest_clusters_top.append(cluster_full_list)",
"_____no_output_____"
],
[
"MSS_distances_between_furthest = []\nfor i in range(len(als_top_100s)):\n within_distances = []\n for j in als_top_100s[i].iloc[:, 0]:\n for k in furthest_clusters_top[i]:\n within_distances.append(movie_distances[j][k])\n sq_within_distances = [x**2 for x in within_distances]\n MSS_distances = np.mean(sq_within_distances)\n MSS_distances_between_furthest.append(MSS_distances)",
"_____no_output_____"
],
[
"within_between_differences = np.subtract(\n MSS_distances_between, MSS_distances_within)\nwithin_furthest_differences = np.subtract(\n MSS_distances_between_furthest, MSS_distances_within)",
"_____no_output_____"
],
[
"stats.ttest_ind(within_furthest_differences,\n within_between_differences, equal_var=False)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
e79e9dafb13590032f010d0f321ec988e37f3130 | 12,053 | ipynb | Jupyter Notebook | doc/7-Working-with-Large-Images.ipynb | hinerm/pyimagej | 61dda04a7cd15da8329629029bd7d7b38d331365 | [
"Apache-2.0"
] | null | null | null | doc/7-Working-with-Large-Images.ipynb | hinerm/pyimagej | 61dda04a7cd15da8329629029bd7d7b38d331365 | [
"Apache-2.0"
] | null | null | null | doc/7-Working-with-Large-Images.ipynb | hinerm/pyimagej | 61dda04a7cd15da8329629029bd7d7b38d331365 | [
"Apache-2.0"
] | null | null | null | 41.136519 | 5,128 | 0.719406 | [
[
[
"PyImageJ Tutorial\n===\n\nThis notebook covers how to use ImageJ as a library from Python. A major advantage of this approach is the ability to combine ImageJ with other tools available from the Python software ecosystem, including NumPy, SciPy, scikit-image, CellProfiler, OpenCV, ITK and more.\n\nThis notebook assumes familiarity with the ImageJ API. Detailed tutorials in that regard can be found in the other notebooks.",
"_____no_output_____"
],
[
"## 7 Visualizing large images\n\nBefore we begin: how much memory is Java using right now?",
"_____no_output_____"
]
],
[
[
"from scyjava import jimport\nRuntime = jimport('java.lang.Runtime')\ndef java_mem():\n rt = Runtime.getRuntime()\n mem_max = rt.maxMemory()\n mem_used = rt.totalMemory() - rt.freeMemory()\n return '{} of {} MB ({}%)'.format(int(mem_used)/2**20, int(mem_max/2**20), int(100*mem_used/mem_max))\n\njava_mem()",
"_____no_output_____"
]
],
[
[
"Now let's open an obnoxiously huge synthetic dataset:",
"_____no_output_____"
]
],
[
[
"big_data = ij.scifio().datasetIO().open('lotsofplanes&lengths=512,512,16,1000,10000&axes=X,Y,Channel,Z,Time.fake')",
"_____no_output_____"
]
],
[
[
"How many total samples does this image have?",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\ndims = [big_data.dimension(d) for d in range(big_data.numDimensions())]\npix = np.prod(dims)\nstr(pix/2**40) + \" terapixels\"",
"_____no_output_____"
]
],
[
[
"And how much did memory usage in Java increase?",
"_____no_output_____"
]
],
[
[
"java_mem()",
"_____no_output_____"
]
],
[
[
"Let's visualize this beast. First, we define a function for slicing out a single plane:",
"_____no_output_____"
]
],
[
[
"def plane(image, pos):\n while image.numDimensions() > 2:\n image = ij.op().transform().hyperSliceView(image, image.numDimensions() - 1, pos[-1])\n pos.pop()\n return ij.py.from_java(image)\n\nij.py.show(plane(big_data, [0, 0, 0]))",
"_____no_output_____"
]
],
[
[
"But we can do better. Let's provide some interaction. First, a function to extract the _non-planar_ axes as a dict:",
"_____no_output_____"
]
],
[
[
"def axes(dataset):\n axes = {}\n for d in range(2, dataset.numDimensions()):\n axis = dataset.axis(d)\n label = axis.type().getLabel()\n length = dataset.dimension(d)\n axes[label] = length\n return axes\n\naxes(big_data)",
"_____no_output_____"
],
[
"import ipywidgets, matplotlib\n\nwidgets = {}\nfor label, length in axes(big_data).items():\n label = str(label) # HINT: Convert Java string to a python string to use with ipywidgets.\n widgets[label] = ipywidgets.IntSlider(description=label, max=length-1)\n\nwidgets",
"_____no_output_____"
],
[
"def f(**kwargs):\n matplotlib.pyplot.imshow(plane(big_data, list(kwargs.values())), cmap='gray')\nipywidgets.interact(f, **widgets);",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
e79ea036c80f2daaea91268f8a2277b5e3c74162 | 101,202 | ipynb | Jupyter Notebook | python/jupyterNotebooks/Example 2_1 Fourier Series.ipynb | oiseth/TKT4108StructuralDynamics2 | 826929e055d9b1457fa0b4c1c8537afa06dc98dd | [
"MIT"
] | 1 | 2021-09-30T09:52:10.000Z | 2021-09-30T09:52:10.000Z | python/jupyterNotebooks/Example 2_1 Fourier Series.ipynb | oiseth/TKT4108StructuralDynamics2 | 826929e055d9b1457fa0b4c1c8537afa06dc98dd | [
"MIT"
] | null | null | null | python/jupyterNotebooks/Example 2_1 Fourier Series.ipynb | oiseth/TKT4108StructuralDynamics2 | 826929e055d9b1457fa0b4c1c8537afa06dc98dd | [
"MIT"
] | 4 | 2021-09-01T09:46:43.000Z | 2021-10-09T18:58:52.000Z | 261.503876 | 20,716 | 0.918124 | [
[
[
"# Exampe 2.1: Fourier series",
"_____no_output_____"
]
],
[
[
"import numpy as np # Import numpy\nfrom matplotlib import pyplot as plt # pyplot module for plotting",
"_____no_output_____"
]
],
[
[
"## Define waveform\nWe start by defining a square waveform. We will integrate the waveform over one period. We have, however, plotted several periods to illustrate the behaviour of the Fourier series. The blue line in the figure shows four periods of the time series that we will approximate, while the brown line shows the line that we will integrate (one period)",
"_____no_output_____"
]
],
[
[
"dt = 0.01 # Time step\nt = np.arange(0,10.01,dt) # time vector\nx = np.zeros(t.shape) # Initialize the x array\nx[t<5] = 1.0 # Set the value of x to one for t<5\n# Plot waveform\nplt.figure()\nplt.plot(np.hstack((t-20.0,t-10.0, t, t+10.0)),np.hstack((x,x,x,x))); # Plot four periods\nplt.plot(t,x); #Plot one period\nplt.ylim(-2, 2);\nplt.xlim(-20,20);\nplt.grid();\nplt.xlabel('$t$');\nplt.ylabel('$X(t)$');\n\n",
"_____no_output_____"
]
],
[
[
"## Alternative 1: Obtaining Fourier coefficients expressed by sin() and cos() by the trapezoidal rule\nFourier series expressed in terms of sinus and cosine functions are defined by\n\n$$ X(t) = a_{0} + \\sum_{k=1}^{\\infty} \\left( a_k cos\\left(\\frac{2\\pi k}{T}t \\right) + b_k sin\\left(\\frac{2\\pi k}{T}t \\right)\\right) $$\n\nHere $a_0$, $a_k$ and $b_k$ are Fourier coefficients given by\n\n$$a_0 = \\frac{1}{T} \\int_{0}^{T}X(t)dt$$\n\n$$a_k = \\frac{1}{T} \\int_{0}^{T}X(t)cos\\left(\\frac{2\\pi k}{T}t \\right)dt$$\n\n$$b_k = \\frac{1}{T} \\int_{0}^{T}X(t)sin\\left(\\frac{2\\pi k}{T}t \\right)dt$$\n\nThe integrals above can be solved analytically and by numerical integration. In this example, we will consider different methods for numerical integration to obtain the coefficients, and we use the trapezoidal rule in this section.",
"_____no_output_____"
]
],
[
[
"nterms = 50 # Number of Fourier coefficeints\nT = np.max(t) # The period of the waveform\na0 = 1/T*np.trapz(x,t) # Mean value\nak = np.zeros((nterms)) \nbk = np.zeros((nterms))\nfor k in range(nterms): # Integrate for all terms\n ak[k] = 1/T*np.trapz(x*np.cos(2.0*np.pi*(k+1.0)*t/T),t)\n bk[k] = 1/T*np.trapz(x*np.sin(2.0*np.pi*(k+1.0)*t/T),t)\n\n# Plot Fourier coeffecients\nfig, axs = plt.subplots(nrows=1, ncols=2, constrained_layout=True)\n\nax1 = axs[0]\nax1.plot(np.arange(1,nterms+1),ak)\nax1.set_ylim(-1, 1)\nax1.grid()\nax1.set_ylabel('$a_k$');\nax1.set_xlabel('$k$');\n\nax2 = axs[1]\nax2.plot(np.arange(1,nterms+1),bk)\nax2.set_ylim(-1, 1)\nax2.grid()\nax2.set_ylabel('$b_k$');\nax2.set_xlabel('$k$');\n\n",
"_____no_output_____"
]
],
[
[
"The mean value is 0,5, while the figures above show that the $a_k$ coefficients are all zero and that every second $b_k$ coefficient is zero and that the nonzero terms become smaller as $k$ increases.It is interesting to plot the Fourier approximation and see how its accuracy depends on the number of terms used in the approximation.",
"_____no_output_____"
]
],
[
[
"# Plot Fourier series approximation\ntp = np.linspace(-20,20,1000)\nX_Fourier = np.ones(tp.shape[0])*a0\nfor k in range(nterms):\n X_Fourier = X_Fourier + 2.0*(ak[k]*np.cos(2.0*np.pi*(k+1.0)*tp/T) + bk[k]*np.sin(2.0*np.pi*(k+1.0)*tp/T))\n\nplt.figure(figsize=(8,4))\nplt.plot(np.hstack((t-20.0,t-10.0, t, t+10.0)),np.hstack((x,x,x,x))); # Plot four periods\nplt.plot(tp,X_Fourier, label=('Fourier approximation Nterms='+str(nterms)));\nplt.ylim(-2, 2)\nplt.xlim(-20,20)\nplt.grid()\nplt.xlabel('$t$')\nplt.ylabel('$X(t)$')\nplt.legend();\n ",
"_____no_output_____"
]
],
[
[
"The figure above shows that the Fourier approximation fits the waveform reasonably well and that the approximation gets better as more terms are added to the approximation. Try to change the number of terms yourself and observe how the approximation improves. Also, note the high-frequency oscillations observed in the corners of the Fourier approximation. This is called the Gibbs phenomenon. Note that it is impossible to get rid of these oscillations and that a Fourier series will always slightly overshoot when approximating step changes.",
"_____no_output_____"
],
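[
"The overshoot can also be measured directly. A quick check using the arrays computed above (the theoretical Gibbs overshoot is roughly 9% of the jump height, so the peak stays near 1.09 here no matter how many terms are added):\n\n```python\n# maximum overshoot of the approximation above the true plateau value of 1.0\nprint(np.max(X_Fourier) - 1.0)\n```",
"_____no_output_____"
],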
[
"## Alternative 2: Trapezoidal rule, complex Fourier series\nThis approach is the same as the example above. Still, now we use the complex format of the Fourier series, which is essentially only rewriting of the formula introducing Euler's formula $e^{i \\omega t} = cos(\\omega t) + i sin(\\omega t)$. We define \n\n$$x_k = a_k -ib_k $$\n\n$$ e^{-i\\left(\\frac{2\\pi kt}{T} \\right)} =cos\\left(\\frac{2\\pi k}{T}t \\right) - i sin\\left(\\frac{2\\pi k}{T}t \\right) $$\n\n\n$$x_k = \\frac{1}{T} \\int_{0}^{T}X(t) \\left( cos\\left(\\frac{2\\pi k}{T}t \\right) - i sin\\left(\\frac{2\\pi k}{T}t \\right) \\right)dt$$\n\n$$x_k = \\frac{1}{T} \\int_{0}^{T} X(t) e^{-i\\left(\\frac{2\\pi kt}{T} \\right)}dt$$\n\nNote that the mean value is obtained when $k=0$ and that both negative and positive $k$ values are necessary to cancel the imaginary part of the approximation. We will now calculate the Fourier coefficients using the trapezoidal rule.",
"_____no_output_____"
]
],
[
[
"nterms = t.shape[0] # The numer of terms in the Fourier series\nXk = np.zeros(nterms,dtype=complex)\nfor k in range(nterms):\n Xk[k] = 1/T*np.trapz(x*np.exp(-1j*2*np.pi/T*k*t),t)\n\n# Plot Fourier coeffecients\nfig, axs = plt.subplots(nrows=1, ncols=2, constrained_layout=True)\n\nax1 = axs[0]\nax1.plot(np.arange(0,nterms),np.real(Xk))\nax1.set_ylim(-1, 1)\nax1.set_xlim(0, 10)\nax1.grid()\nax1.set_ylabel('$Re(X_k)$');\nax1.set_xlabel('$k$');\n\nax2 = axs[1]\nax2.plot(np.arange(0,nterms),np.imag(Xk))\nax2.set_ylim(-1, 1)\nax2.set_xlim(0, 10)\nax2.grid()\nax2.set_ylabel('$Imag(X_k)$');\nax2.set_xlabel('$k$');\n \n \n ",
"_____no_output_____"
]
],
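[
[
"As a quick sanity check of the stated relation $x_k = a_k - ib_k$, the coefficients from Alternative 1 (computed above for $k = 1, \\ldots, 50$) should match the real and imaginary parts of these complex coefficients:\n\n```python\n# x_k = a_k - i*b_k, so Re(Xk) should equal ak and -Im(Xk) should equal bk\nprint(np.allclose(np.real(Xk[1:51]), ak))\nprint(np.allclose(-np.imag(Xk[1:51]), bk))\n```",
"_____no_output_____"
]
],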
[
[
"The figures above and the defined relation $x_k=a_k-ib_k$ show that we get the same Fourier coefficients as the case above and that this is only a rewriting of the Fourier approximation.",
"_____no_output_____"
],
[
"## Alternative 3: Left rectangular rule, complex Fourier series\n The function is assumed constant and equal to the left side of the rectangle when integrating by the left rectangular rule. This is what is typically implemented in softwares as the descrete Fourier transform. The integral using the left rectangular rule can be expressed as\n \n $$x_k = \\frac{1}{T} \\int_{0}^{T} X(t) e^{-i\\left(\\frac{2\\pi kt}{T} \\right)}dt$$\n \n $$x_k = \\frac{1}{T} \\sum_{r=0}^{N-1} X_r e^{-i\\left(\\frac{2\\pi k r\\Delta t}{T} \\right)} \\Delta t$$\n \n $$x_k = \\frac{1}{N} \\sum_{r=0}^{N-1} X_r e^{-i\\left(\\frac{2\\pi k r}{N} \\right)}$$\n ",
"_____no_output_____"
]
],
[
[
"N = t.shape[0] # The numer of terms in the Fourier series\nXk = np.zeros(nterms,dtype=complex)\nfor k in range(N):\n Xk[k] = 1/N*np.matmul(x,np.exp(-1j*2*np.pi/N*k*np.arange(N)))\n\n# Plot Fourier coeffecients\nfig, axs = plt.subplots(nrows=1, ncols=2, constrained_layout=True)\n\nax1 = axs[0]\nax1.plot(np.arange(0,nterms),np.real(Xk))\nax1.set_ylim(-1, 1)\nax1.set_xlim(0, 10)\nax1.grid()\nax1.set_ylabel('$Re(X_k)$');\nax1.set_xlabel('$k$');\n\nax2 = axs[1]\nax2.plot(np.arange(0,nterms),np.imag(Xk))\nax2.set_ylim(-1, 1)\nax2.set_xlim(0, 10)\nax2.grid()\nax2.set_ylabel('$Imag(X_k)$');\nax2.set_xlabel('$k$');",
"_____no_output_____"
]
],
[
[
"## Alternative 4: The fast Fourier transform\nThe discrete Fourier transform can be implemented in a clever time-saving way. This implementation is called the fast Fourier transform and is implemented in many software and is available in the NumPy package. The Fourier coefficients can be obtained using the FFT as shown below.\n\n",
"_____no_output_____"
]
],
[
[
"N = t.shape[0] # The numer of terms in the Fourier series\nXk = np.fft.fft(x)/N\n\n# Plot Fourier coeffecients\nfig, axs = plt.subplots(nrows=1, ncols=2, constrained_layout=True)\n\nax1 = axs[0]\nax1.plot(np.arange(0,nterms),np.real(Xk))\nax1.set_ylim(-1, 1)\nax1.set_xlim(0, 10)\nax1.grid()\nax1.set_ylabel('$Re(X_k)$');\nax1.set_xlabel('$k$');\n\nax2 = axs[1]\nax2.plot(np.arange(0,nterms),np.imag(Xk))\nax2.set_ylim(-1, 1)\nax2.set_xlim(0, 10)\nax2.grid()\nax2.set_ylabel('$Imag(X_k)$');\nax2.set_xlabel('$k$');",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79eac1c1a9efb91e0d8b6813e7c6730142d7d92 | 108,542 | ipynb | Jupyter Notebook | research/tutorials/transfer_learning/transfer_learning_tutorial.ipynb | Nhat-Minh-Hoang-Tran-BSC2021/Intel-Convolutional_Neural_Networks | bbc2c1adb38bfd7c066a4fe6b5a0e0dab6df21cb | [
"MIT"
] | 2 | 2019-04-02T02:36:41.000Z | 2021-01-18T15:20:39.000Z | research/tutorials/transfer_learning/transfer_learning_tutorial.ipynb | Nhat-Minh-Hoang-Tran-BSC2021/Intel-Convolutional_Neural_Networks | bbc2c1adb38bfd7c066a4fe6b5a0e0dab6df21cb | [
"MIT"
] | 1 | 2020-05-20T08:40:00.000Z | 2020-05-20T08:40:00.000Z | research/tutorials/transfer_learning/transfer_learning_tutorial.ipynb | Nhat-Minh-Hoang-Tran-BSC2021/Intel-Convolutional_Neural_Networks | bbc2c1adb38bfd7c066a4fe6b5a0e0dab6df21cb | [
"MIT"
] | 2 | 2019-04-22T09:19:13.000Z | 2020-02-12T18:01:32.000Z | 215.789264 | 92,924 | 0.903567 | [
[
[
"%matplotlib inline",
"_____no_output_____"
]
],
[
[
"\n# Transfer Learning Tutorial\n\n**Author**: `Sasank Chilamkurthy <https://chsasank.github.io>`_\n\nIn this tutorial, you will learn how to train your network using\ntransfer learning. You can read more about the transfer learning at `cs231n\nnotes <https://cs231n.github.io/transfer-learning/>`__\n\nQuoting these notes,\n\n In practice, very few people train an entire Convolutional Network\n from scratch (with random initialization), because it is relatively\n rare to have a dataset of sufficient size. Instead, it is common to\n pretrain a ConvNet on a very large dataset (e.g. ImageNet, which\n contains 1.2 million images with 1000 categories), and then use the\n ConvNet either as an initialization or a fixed feature extractor for\n the task of interest.\n\nThese two major transfer learning scenarios look as follows:\n\n- **Finetuning the convnet**: Instead of random initializaion, we\n initialize the network with a pretrained network, like the one that is\n trained on imagenet 1000 dataset. Rest of the training looks as\n usual.\n- **ConvNet as fixed feature extractor**: Here, we will freeze the weights\n for all of the network except that of the final fully connected\n layer. This last fully connected layer is replaced with a new one\n with random weights and only this layer is trained.\n",
"_____no_output_____"
]
],
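[
[
"As a preview, here are minimal sketches of the two scenarios (illustrative only; the rest of the tutorial builds them out with proper training code):\n\n```python\nimport torch.nn as nn\nimport torchvision.models as models\n\n# Finetuning: start from ImageNet weights, swap the final fully connected\n# layer for our 2-class problem, and train the whole network.\nmodel_ft = models.resnet18(pretrained=True)\nmodel_ft.fc = nn.Linear(model_ft.fc.in_features, 2)\n\n# Fixed feature extractor: freeze every pretrained weight, then replace the\n# final layer (parameters of new modules have requires_grad=True by default).\nmodel_conv = models.resnet18(pretrained=True)\nfor param in model_conv.parameters():\n    param.requires_grad = False\nmodel_conv.fc = nn.Linear(model_conv.fc.in_features, 2)\n```",
"_____no_output_____"
]
],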
[
[
"import os\nimport time\nimport torch\nimport torch.nn as nn\nimport torch.optim as optim\nfrom torch.optim import lr_scheduler\n\nimport torchvision\nimport torchvision.models as models\nimport torchvision.datasets as datasets\nimport torchvision.transforms as transforms\nimport matplotlib.pyplot as plt\n\nimport numpy as np\nimport copy\n\nplt.ion() # interactive mode",
"_____no_output_____"
]
],
[
[
"## Load Data\n\nWe will use `torchvision` and `torch.utils.data` packages for loading the\ndata.\n\nThe problem we're going to solve today is to train a model to classify\n**ants** and **bees**. We have about 120 training images each for ants and bees.\nThere are 75 validation images for each class. Usually, this is a very\nsmall dataset to generalize upon, if trained from scratch. Since we\nare using transfer learning, we should be able to generalize reasonably\nwell.\n\nThis dataset is a very small subset of imagenet.\n\n.. Note ::\n Download the data from\n `here <https://download.pytorch.org/tutorial/hymenoptera_data.zip>`_\n and extract it to the current directory.\n",
"_____no_output_____"
]
],
[
[
"# Data augmentation and normalization for training\n# Just normalization for validation\ndata_transforms = {\n 'train': transforms.Compose([\n transforms.RandomResizedCrop(224),\n transforms.RandomHorizontalFlip(),\n transforms.ToTensor(),\n transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])\n ]),\n 'val': transforms.Compose([\n transforms.Resize(256),\n transforms.CenterCrop(224),\n transforms.ToTensor(),\n transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])\n ]),\n}\n\n\ndata_dir = '/media/storage/datasets/cat_vs_dog'\nimage_datasets = {x: datasets.ImageFolder(os.path.join(data_dir, x),\n data_transforms[x])\n for x in ['train', 'val']}\n\ndataloaders = {x: torch.utils.data.DataLoader(image_datasets[x], batch_size=4,\n shuffle=True, num_workers=4)\n for x in ['train', 'val']}\n\ndataset_sizes = {x: len(image_datasets[x]) for x in ['train', 'val']}\nclass_names = image_datasets['train'].classes\n\ndevice = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")",
"_____no_output_____"
]
],
[
[
"### Visualize a few images\n\nLet's visualize a few training images so as to understand the data augmentations.\n",
"_____no_output_____"
]
],
[
[
"def imshow(inp, title=None):\n \"\"\"Imshow for Tensor.\"\"\"\n inp = inp.numpy().transpose((1, 2, 0)) # Convert [channel, height, width] --> [height, width, channel]\n mean = np.array([0.485, 0.456, 0.406])\n std = np.array([0.229, 0.224, 0.225])\n inp = std * inp + mean\n inp = np.clip(inp, 0, 1)\n plt.imshow(inp)\n if title is not None:\n plt.title(title)\n plt.pause(0.001) # pause a bit so that plots are updated\n\n\n# Get a batch of training data\ninputs, classes = next(iter(dataloaders['train']))\n\n# Make a grid from batch\nout = torchvision.utils.make_grid(inputs)\n\nimshow(out, title=[class_names[x] for x in classes])",
"_____no_output_____"
]
],
[
[
"Training the model\n------------------\n\nNow, let's write a general function to train a model. Here, we will\nillustrate:\n\n- Scheduling the learning rate\n- Saving the best model\n\nIn the following, parameter ``scheduler`` is an LR scheduler object from\n``torch.optim.lr_scheduler``.\n",
"_____no_output_____"
]
],
[
[
"def train_model(model, criterion, optimizer, scheduler, num_epochs=25):\n since = time.time()\n\n best_model_wts = copy.deepcopy(model.state_dict())\n best_acc = 0.0\n\n for epoch in range(num_epochs):\n print(f'Epoch {epoch}/{num_epochs - 1}')\n print('-' * 10)\n\n # Each epoch has a training and validation phase\n for phase in ['train', 'val']:\n if phase == 'train':\n scheduler.step()\n model.train() # Set model to training mode\n else:\n model.eval() # Set model to evaluate mode\n\n running_loss = 0.0\n running_corrects = 0\n\n # Iterate over data.\n for inputs, labels in dataloaders[phase]:\n inputs = inputs.to(device)\n labels = labels.to(device)\n\n # zero the parameter gradients\n optimizer.zero_grad()\n\n # forward\n # track history if only in train\n with torch.set_grad_enabled(phase == 'train'):\n outputs = model(inputs)\n _, preds = torch.max(outputs, 1)\n loss = criterion(outputs, labels)\n\n # backward + optimize only if in training phase\n if phase == 'train':\n loss.backward()\n optimizer.step()\n\n # statistics\n running_loss += loss.item() * inputs.size(0)\n running_corrects += torch.sum(preds == labels.data)\n\n epoch_loss = running_loss / dataset_sizes[phase]\n epoch_acc = running_corrects.double() / dataset_sizes[phase]\n\n print(f'{phase} Loss: {epoch_loss:.4f} Acc: {epoch_acc:.4f}')\n\n # deep copy the model\n if phase == 'val' and epoch_acc > best_acc:\n best_acc = epoch_acc\n best_model_wts = copy.deepcopy(model.state_dict())\n\n print()\n\n time_elapsed = time.time() - since\n print(f'Training complete in {time_elapsed // 60:.0f}m {time_elapsed % 60:.0f}s')\n print(f'Best val Acc: {best_acc:4f}')\n\n # load best model weights\n model.load_state_dict(best_model_wts)\n return model",
"_____no_output_____"
]
],
[
[
"### Visualizing the model predictions\n\nGeneric function to display predictions for a few images\n",
"_____no_output_____"
]
],
[
[
"def visualize_model(model, num_images=6):\n was_training = model.training\n model.eval()\n images_so_far = 0\n fig = plt.figure()\n\n with torch.no_grad():\n for i, (inputs, labels) in enumerate(dataloaders['val']):\n inputs = inputs.to(device)\n labels = labels.to(device)\n\n outputs = model(inputs)\n _, preds = torch.max(outputs, 1)\n\n for j in range(inputs.size()[0]):\n images_so_far += 1\n ax = plt.subplot(num_images//2, 2, images_so_far)\n ax.axis('off')\n ax.set_title('predicted: {}'.format(class_names[preds[j]]))\n imshow(inputs.cpu().data[j])\n\n if images_so_far == num_images:\n model.train(mode=was_training)\n return\n model.train(mode=was_training)",
"_____no_output_____"
]
],
[
[
"Finetuning the convnet\n----------------------\n\nLoad a pretrained model and reset final fully connected layer.\n\n\n",
"_____no_output_____"
]
],
[
[
"model_ft = models.resnet18(pretrained=True)\nnum_ftrs = model_ft.fc.in_features\nmodel_ft.fc = nn.Linear(num_ftrs, 2)\n\nmodel_ft = model_ft.to(device)\n\ncriterion = nn.CrossEntropyLoss()\n\n# Observe that all parameters are being optimized\noptimizer_ft = optim.SGD(model_ft.parameters(), lr=0.001, momentum=0.9)\n\n# Decay LR by a factor of 0.1 every 7 epochs\nexp_lr_scheduler = lr_scheduler.StepLR(optimizer_ft, step_size=7, gamma=0.1)",
"_____no_output_____"
]
],
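[
[
"Before training, here is a minimal sketch of what the `StepLR` schedule above actually does. `toy_opt` and `toy_sched` are hypothetical throwaway objects used only for illustration: the learning rate is multiplied by `gamma=0.1` every `step_size=7` scheduler steps.",
"_____no_output_____"
]
],
[
[
"toy_opt = optim.SGD([torch.zeros(1, requires_grad=True)], lr=0.001)\ntoy_sched = lr_scheduler.StepLR(toy_opt, step_size=7, gamma=0.1)\nfor epoch in range(21):\n    toy_opt.step()    # in real training the optimizer steps many times per epoch\n    toy_sched.step()  # one scheduler step per epoch\n    print(epoch, toy_opt.param_groups[0]['lr'])",
"_____no_output_____"
]
],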
[
[
"### Train and evaluate\n\nIt should take around 15-25 min on CPU. On GPU though, it takes less than a\nminute.\n",
"_____no_output_____"
]
],
[
[
"model_ft = train_model(model_ft, criterion, optimizer_ft, exp_lr_scheduler,\n num_epochs=25)",
"_____no_output_____"
],
[
"visualize_model(model_ft)",
"_____no_output_____"
]
],
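[
[
"`train_model` only keeps the best weights in memory; persisting them to disk is a common next step. A minimal sketch (the filename `model_ft_best.pt` is made up for illustration):",
"_____no_output_____"
]
],
[
[
"# Save the fine-tuned weights and reload them to verify the round trip\ntorch.save(model_ft.state_dict(), 'model_ft_best.pt')\nmodel_ft.load_state_dict(torch.load('model_ft_best.pt'))",
"_____no_output_____"
]
],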
[
[
"ConvNet as fixed feature extractor\n----------------------------------\n\nHere, we need to freeze all the network except the final layer. We need\nto set ``requires_grad == False`` to freeze the parameters so that the\ngradients are not computed in ``backward()``.\n\nYou can read more about this in the documentation\n`here <https://pytorch.org/docs/notes/autograd.html#excluding-subgraphs-from-backward>`__.\n\n\n",
"_____no_output_____"
]
],
[
[
"model_conv = torchvision.models.resnet18(pretrained=True)\nfor param in model_conv.parameters():\n param.requires_grad = False\n\n# Parameters of newly constructed modules have requires_grad=True by default\nnum_ftrs = model_conv.fc.in_features\nmodel_conv.fc = nn.Linear(num_ftrs, 2)\n\nmodel_conv = model_conv.to(device)\n\ncriterion = nn.CrossEntropyLoss()\n\n# Observe that only parameters of final layer are being optimized as\n# opposed to before.\noptimizer_conv = optim.SGD(model_conv.fc.parameters(), lr=0.001, momentum=0.9)\n\n# Decay LR by a factor of 0.1 every 7 epochs\nexp_lr_scheduler = lr_scheduler.StepLR(optimizer_conv, step_size=7, gamma=0.1)",
"_____no_output_____"
]
],
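[
[
"A quick check (added sketch) that the freezing above worked: only the parameters of the newly constructed final layer should still require gradients.",
"_____no_output_____"
]
],
[
[
"trainable = [name for name, p in model_conv.named_parameters() if p.requires_grad]\nprint(trainable)  # expected: ['fc.weight', 'fc.bias']",
"_____no_output_____"
]
],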
[
[
"### Train and evaluate\n\nOn CPU this will take about half the time compared to previous scenario.\nThis is expected as gradients don't need to be computed for most of the\nnetwork. However, forward does need to be computed.\n\n\n",
"_____no_output_____"
]
],
[
[
"model_conv = train_model(model_conv, criterion, optimizer_conv,\n exp_lr_scheduler, num_epochs=25)",
"_____no_output_____"
],
[
"visualize_model(model_conv)\n\nplt.ioff()\nplt.show()",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
e79eb02f813fdf8f58faa45da078edead5395c1d | 86,576 | ipynb | Jupyter Notebook | analyses/2019-03-19-plot-estimated-and-observed-clade-frequencies.ipynb | blab/flu-forecasting | 723c515ba2e8813f081ae48b23d63871e9e3db4e | [
"MIT"
] | 2 | 2020-08-19T04:09:28.000Z | 2021-07-05T02:32:04.000Z | analyses/2019-03-19-plot-estimated-and-observed-clade-frequencies.ipynb | elifesciences-publications/flu-forecasting | 1fee7ab1f755ad8ae5be28542045b5b609e4774b | [
"MIT"
] | null | null | null | analyses/2019-03-19-plot-estimated-and-observed-clade-frequencies.ipynb | elifesciences-publications/flu-forecasting | 1fee7ab1f755ad8ae5be28542045b5b609e4774b | [
"MIT"
] | 1 | 2020-09-01T11:45:41.000Z | 2020-09-01T11:45:41.000Z | 49.642202 | 28,464 | 0.562223 | [
[
[
"# Plot estimated and observed clade frequencies\n\nPlot the overall observed clade frequencies compared to the estimated frequencies at each timepoint. The differences between these frequencies tells us something about the error in frequency estimation due to missing data from the near future.",
"_____no_output_____"
]
],
[
[
"from collections import defaultdict\nimport json\nimport matplotlib.pyplot as plt\nimport matplotlib.gridspec as gridspec\nimport networkx as nx\nimport numpy as np\nimport pandas as pd\nimport seaborn as sns\n\n%matplotlib inline\n\nplt.style.use(\"huddlej\")",
"_____no_output_____"
],
[
"!pwd",
"/Users/jlhudd/projects/nextstrain/flu-forecasting/analyses\n"
],
[
"with open(\"../results/builds/h3n2/5_viruses_per_month/sample_0/2005-10-01--2015-10-01/timepoints/2015-10-01/segments/ha/frequencies.json\", \"r\") as fh:\n frequencies = json.load(fh)",
"_____no_output_____"
],
[
"with open(\"../results/builds/h3n2/5_viruses_per_month/sample_0/2005-10-01--2015-10-01/timepoints/2015-10-01/segments/ha/clades.json\", \"r\") as fh:\n clades = json.load(fh)",
"_____no_output_____"
],
[
"tips = pd.read_csv(\"../results/builds/h3n2/5_viruses_per_month/sample_0/2005-10-01--2015-10-01/standardized_tip_attributes.tsv\",\n sep=\"\\t\", parse_dates=[\"timepoint\"])\ntips = tips.loc[\n tips[\"segment\"] == \"ha\",\n [\"strain\", \"clade_membership\", \"timepoint\", \"frequency\"]\n].copy()",
"_____no_output_____"
],
[
"data_path = \"../results/builds/h3n2/5_viruses_per_month/sample_0/2005-10-01--2015-10-01/tips_to_clades.tsv\"",
"_____no_output_____"
],
[
"df = pd.read_csv(data_path, sep=\"\\t\", parse_dates=[\"timepoint\"])",
"_____no_output_____"
],
[
"# successful clade\nclade_name = \"d4aa5d5\"\n\n# unsuccessful clade\n#clade_name = \"5f0cf16\"",
"_____no_output_____"
],
[
"clade_tips = df[df[\"clade_membership\"] == clade_name][\"tip\"].unique()",
"_____no_output_____"
],
[
"clade_tips.shape",
"_____no_output_____"
],
[
"df[\"tip\"].unique().shape",
"_____no_output_____"
],
[
"clade_tips.shape",
"_____no_output_____"
],
[
"estimated_clade_frequencies = tips[tips[\"strain\"].isin(clade_tips)].groupby(\"timepoint\")[\"frequency\"].sum().reset_index()",
"_____no_output_____"
],
[
"estimated_clade_frequencies[\"timepoint_float\"] = estimated_clade_frequencies[\"timepoint\"].dt.year + (estimated_clade_frequencies[\"timepoint\"].dt.month - 1) / 12.0",
"_____no_output_____"
],
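[
"# (Added sketch) The decimal-year conversion above maps e.g. October 2015\n# (month 10) to 2015 + 9/12 = 2015.75, matching the float pivots used below.\nexample = pd.Timestamp(\"2015-10-01\")\nexample.year + (example.month - 1) / 12.0",
"_____no_output_____"
],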
[
"estimated_clade_frequencies",
"_____no_output_____"
],
[
"clade_frequencies = np.zeros_like(frequencies[\"data\"][\"pivots\"])",
"_____no_output_____"
],
[
"for tip in clade_tips:\n clade_frequencies += frequencies[\"data\"][\"frequencies\"][tip]",
"_____no_output_____"
],
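[
"# (Added sketch) A vectorized equivalent of the accumulation loop above,\n# assuming every tip in clade_tips has an entry in the frequencies dict:\nclade_frequencies_alt = np.sum(\n    [frequencies[\"data\"][\"frequencies\"][tip] for tip in clade_tips], axis=0)\nnp.allclose(clade_frequencies, clade_frequencies_alt)",
"_____no_output_____"
],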
[
"clade_frequencies",
"_____no_output_____"
],
[
"fig, ax = plt.subplots(1, 1, figsize=(8, 6))\nax.plot(frequencies[\"data\"][\"pivots\"], clade_frequencies, \"o-\")\nax.plot(estimated_clade_frequencies[\"timepoint_float\"], estimated_clade_frequencies[\"frequency\"], \"o\")\nax.set_xlabel(\"Date\")\nax.set_ylabel(\"Frequency\")\n#ax.set_ylim(0, 1)",
"_____no_output_____"
],
[
"tips[tips[\"strain\"].isin(clade_tips)]",
"_____no_output_____"
],
[
"found_clade_tips = tips[tips[\"strain\"].isin(clade_tips)][\"strain\"].unique()",
"_____no_output_____"
],
[
"set(clade_tips) - set(found_clade_tips)",
"_____no_output_____"
],
[
"tips[tips[\"strain\"] == \"A/Kenya/230/2012\"]",
"_____no_output_____"
],
[
"df[df[\"clade_membership\"] == \"5f0cf16\"]",
"_____no_output_____"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79eb4b750be9d46159f08aea2071f840012fdca | 48,367 | ipynb | Jupyter Notebook | tests/project/ipynb/python_essentials.ipynb | QuantEcon/sphinx-tojupyter | ba15167e433ef67b92d88484465c40cb9153bc3c | [
"BSD-3-Clause"
] | 1 | 2021-02-08T11:43:38.000Z | 2021-02-08T11:43:38.000Z | tests/project/ipynb/python_essentials.ipynb | QuantEcon/sphinx-tojupyter | ba15167e433ef67b92d88484465c40cb9153bc3c | [
"BSD-3-Clause"
] | 23 | 2020-11-10T04:42:58.000Z | 2022-02-16T07:48:53.000Z | tests/project/ipynb/python_essentials.ipynb | QuantEcon/sphinx-tojupyter | ba15167e433ef67b92d88484465c40cb9153bc3c | [
"BSD-3-Clause"
] | null | null | null | 22.207071 | 230 | 0.514897 | [
[
[
"\n<a id='python-done-right'></a>\n<div id=\"qe-notebook-header\" align=\"right\" style=\"text-align:right;\">\n <a href=\"https://quantecon.org/\" title=\"quantecon.org\">\n <img style=\"width:250px;display:inline;\" width=\"250px\" src=\"https://assets.quantecon.org/img/qe-menubar-logo.svg\" alt=\"QuantEcon\">\n </a>\n</div>",
"_____no_output_____"
],
[
"# Python Essentials",
"_____no_output_____"
],
[
"## Contents\n\n- [Python Essentials](#Python-Essentials) \n - [Overview](#Overview) \n - [Data Types](#Data-Types) \n - [Input and Output](#Input-and-Output) \n - [Iterating](#Iterating) \n - [Comparisons and Logical Operators](#Comparisons-and-Logical-Operators) \n - [More Functions](#More-Functions) \n - [Coding Style and PEP8](#Coding-Style-and-PEP8) \n - [Exercises](#Exercises) \n - [Solutions](#Solutions) ",
"_____no_output_____"
],
[
"## Overview\n\nWe have covered a lot of material quite quickly, with a focus on examples.\n\nNow let’s cover some core features of Python in a more systematic way.\n\nThis approach is less exciting but helps clear up some details.",
"_____no_output_____"
],
[
"## Data Types\n\n\n<a id='index-0'></a>\nComputer programs typically keep track of a range of data types.\n\nFor example, `1.5` is a floating point number, while `1` is an integer.\n\nPrograms need to distinguish between these two types for various reasons.\n\nOne is that they are stored in memory differently.\n\nAnother is that arithmetic operations are different\n\n- For example, floating point arithmetic is implemented on most machines by a\n specialized Floating Point Unit (FPU). \n\n\nIn general, floats are more informative but arithmetic operations on integers\nare faster and more accurate.\n\nPython provides numerous other built-in Python data types, some of which we’ve already met\n\n- strings, lists, etc. \n\n\nLet’s learn a bit more about them.",
"_____no_output_____"
],
[
"### Primitive Data Types\n\nOne simple data type is **Boolean values**, which can be either `True` or `False`",
"_____no_output_____"
]
],
[
[
"x = True\nx",
"_____no_output_____"
]
],
[
[
"We can check the type of any object in memory using the `type()` function.",
"_____no_output_____"
]
],
[
[
"type(x)",
"_____no_output_____"
]
],
[
[
"In the next line of code, the interpreter evaluates the expression on the right of = and binds y to this value",
"_____no_output_____"
]
],
[
[
"y = 100 < 10\ny",
"_____no_output_____"
],
[
"type(y)",
"_____no_output_____"
]
],
[
[
"In arithmetic expressions, `True` is converted to `1` and `False` is converted `0`.\n\nThis is called **Boolean arithmetic** and is often useful in programming.\n\nHere are some examples",
"_____no_output_____"
]
],
[
[
"x + y",
"_____no_output_____"
],
[
"x * y",
"_____no_output_____"
],
[
"True + True",
"_____no_output_____"
],
[
"bools = [True, True, False, True] # List of Boolean values\n\nsum(bools)",
"_____no_output_____"
]
],
[
[
"Complex numbers are another primitive data type in Python",
"_____no_output_____"
]
],
[
[
"x = complex(1, 2)\ny = complex(2, 1)\nprint(x * y)\n\ntype(x)",
"_____no_output_____"
]
],
[
[
"### Containers\n\nPython has several basic types for storing collections of (possibly heterogeneous) data.\n\nWe’ve [already discussed lists](https://python-programming.quantecon.org/python_by_example.html#lists-ref).\n\n\n<a id='index-1'></a>\nA related data type is **tuples**, which are “immutable” lists",
"_____no_output_____"
]
],
[
[
"x = ('a', 'b') # Parentheses instead of the square brackets\nx = 'a', 'b' # Or no brackets --- the meaning is identical\nx",
"_____no_output_____"
],
[
"type(x)",
"_____no_output_____"
]
],
[
[
"In Python, an object is called **immutable** if, once created, the object cannot be changed.\n\nConversely, an object is **mutable** if it can still be altered after creation.\n\nPython lists are mutable",
"_____no_output_____"
]
],
[
[
"x = [1, 2]\nx[0] = 10\nx",
"_____no_output_____"
]
],
[
[
"But tuples are not",
"_____no_output_____"
]
],
[
[
"x = (1, 2)\nx[0] = 10",
"_____no_output_____"
]
],
[
[
"We’ll say more about the role of mutable and immutable data a bit later.\n\nTuples (and lists) can be “unpacked” as follows",
"_____no_output_____"
]
],
[
[
"integers = (10, 20, 30)\nx, y, z = integers\nx",
"_____no_output_____"
],
[
"y",
"_____no_output_____"
]
],
[
[
"You’ve actually [seen an example of this](https://python-programming.quantecon.org/about_py.html#tuple-unpacking-example) already.\n\nTuple unpacking is convenient and we’ll use it often.",
"_____no_output_____"
],
[
"#### Slice Notation\n\n\n<a id='index-2'></a>\nTo access multiple elements of a list or tuple, you can use Python’s slice\nnotation.\n\nFor example,",
"_____no_output_____"
]
],
[
[
"a = [2, 4, 6, 8]\na[1:]",
"_____no_output_____"
],
[
"a[1:3]",
"_____no_output_____"
]
],
[
[
"The general rule is that `a[m:n]` returns `n - m` elements, starting at `a[m]`.\n\nNegative numbers are also permissible",
"_____no_output_____"
]
],
[
[
"a[-2:] # Last two elements of the list",
"_____no_output_____"
]
],
[
[
"The same slice notation works on tuples and strings",
"_____no_output_____"
]
],
[
[
"s = 'foobar'\ns[-3:] # Select the last three elements",
"_____no_output_____"
]
],
[
[
"#### Sets and Dictionaries\n\n\n<a id='index-4'></a>\nTwo other container types we should mention before moving on are [sets](https://docs.python.org/3/tutorial/datastructures.html#sets) and [dictionaries](https://docs.python.org/3/tutorial/datastructures.html#dictionaries).\n\nDictionaries are much like lists, except that the items are named instead of\nnumbered",
"_____no_output_____"
]
],
[
[
"d = {'name': 'Frodo', 'age': 33}\ntype(d)",
"_____no_output_____"
],
[
"d['age']",
"_____no_output_____"
]
],
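[
[
"Dictionaries are mutable, so entries can be added or updated after creation (a small extra example):",
"_____no_output_____"
]
],
[
[
"d['location'] = 'the Shire'  # add a new key-value pair\nd['age'] = 34                # update an existing value\nd",
"_____no_output_____"
]
],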
[
[
"The names `'name'` and `'age'` are called the *keys*.\n\nThe objects that the keys are mapped to (`'Frodo'` and `33`) are called the `values`.\n\nSets are unordered collections without duplicates, and set methods provide the\nusual set-theoretic operations",
"_____no_output_____"
]
],
[
[
"s1 = {'a', 'b'}\ntype(s1)",
"_____no_output_____"
],
[
"s2 = {'b', 'c'}\ns1.issubset(s2)",
"_____no_output_____"
],
[
"s1.intersection(s2)",
"_____no_output_____"
]
],
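[
[
"Other standard operations, such as union and difference, work the same way:",
"_____no_output_____"
]
],
[
[
"s1.union(s2)",
"_____no_output_____"
],
[
"s1.difference(s2)",
"_____no_output_____"
]
],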
[
[
"The `set()` function creates sets from sequences",
"_____no_output_____"
]
],
[
[
"s3 = set(('foo', 'bar', 'foo'))\ns3",
"_____no_output_____"
]
],
[
[
"## Input and Output\n\n\n<a id='index-5'></a>\nLet’s briefly review reading and writing to text files, starting with writing",
"_____no_output_____"
]
],
[
[
"f = open('newfile.txt', 'w') # Open 'newfile.txt' for writing\nf.write('Testing\\n') # Here '\\n' means new line\nf.write('Testing again')\nf.close()",
"_____no_output_____"
]
],
[
[
"Here\n\n- The built-in function `open()` creates a file object for writing to. \n- Both `write()` and `close()` are methods of file objects. \n\n\nWhere is this file that we’ve created?\n\nRecall that Python maintains a concept of the present working directory (pwd) that can be located from with Jupyter or IPython via",
"_____no_output_____"
]
],
[
[
"%pwd",
"_____no_output_____"
]
],
[
[
"If a path is not specified, then this is where Python writes to.\n\nWe can also use Python to read the contents of `newline.txt` as follows",
"_____no_output_____"
]
],
[
[
"f = open('newfile.txt', 'r')\nout = f.read()\nout",
"_____no_output_____"
],
[
"print(out)",
"_____no_output_____"
]
],
[
[
"### Paths\n\n\n<a id='index-6'></a>\nNote that if `newfile.txt` is not in the present working directory then this call to `open()` fails.\n\nIn this case, you can shift the file to the pwd or specify the [full path](https://en.wikipedia.org/wiki/Path_%28computing%29) to the file",
"_____no_output_____"
],
[
"```python3\nf = open('insert_full_path_to_file/newfile.txt', 'r')\n```\n",
"_____no_output_____"
],
[
"\n<a id='iterating-version-1'></a>",
"_____no_output_____"
],
[
"## Iterating\n\n\n<a id='index-7'></a>\nOne of the most important tasks in computing is stepping through a\nsequence of data and performing a given action.\n\nOne of Python’s strengths is its simple, flexible interface to this kind of iteration via\nthe `for` loop.",
"_____no_output_____"
],
[
"### Looping over Different Objects\n\nMany Python objects are “iterable”, in the sense that they can be looped over.\n\nTo give an example, let’s write the file us_cities.txt, which lists US cities and their population, to the present working directory.\n\n\n<a id='us-cities-data'></a>",
"_____no_output_____"
]
],
[
[
"%%file us_cities.txt\nnew york: 8244910\nlos angeles: 3819702\nchicago: 2707120\nhouston: 2145146\nphiladelphia: 1536471\nphoenix: 1469471\nsan antonio: 1359758\nsan diego: 1326179\ndallas: 1223229",
"_____no_output_____"
]
],
[
[
"Here %%file is an [IPython cell magic](https://ipython.readthedocs.io/en/stable/interactive/magics.html#cell-magics).\n\nSuppose that we want to make the information more readable, by capitalizing names and adding commas to mark thousands.\n\nThe program below reads the data in and makes the conversion:",
"_____no_output_____"
]
],
[
[
"data_file = open('us_cities.txt', 'r')\nfor line in data_file:\n city, population = line.split(':') # Tuple unpacking\n city = city.title() # Capitalize city names\n population = f'{int(population):,}' # Add commas to numbers\n print(city.ljust(15) + population)\ndata_file.close()",
"_____no_output_____"
]
],
[
[
"Here `format()` is a string method [used for inserting variables into strings](https://docs.python.org/3/library/string.html#formatspec).\n\nThe reformatting of each line is the result of three different string methods,\nthe details of which can be left till later.\n\nThe interesting part of this program for us is line 2, which shows that\n\n1. The file object `data_file` is iterable, in the sense that it can be placed to the right of `in` within a `for` loop. \n1. Iteration steps through each line in the file. \n\n\nThis leads to the clean, convenient syntax shown in our program.\n\nMany other kinds of objects are iterable, and we’ll discuss some of them later on.",
"_____no_output_____"
],
[
"### Looping without Indices\n\nOne thing you might have noticed is that Python tends to favor looping without explicit indexing.\n\nFor example,",
"_____no_output_____"
]
],
[
[
"x_values = [1, 2, 3] # Some iterable x\nfor x in x_values:\n print(x * x)",
"_____no_output_____"
]
],
[
[
"is preferred to",
"_____no_output_____"
]
],
[
[
"for i in range(len(x_values)):\n print(x_values[i] * x_values[i])",
"_____no_output_____"
]
],
[
[
"When you compare these two alternatives, you can see why the first one is preferred.\n\nPython provides some facilities to simplify looping without indices.\n\nOne is `zip()`, which is used for stepping through pairs from two sequences.\n\nFor example, try running the following code",
"_____no_output_____"
]
],
[
[
"countries = ('Japan', 'Korea', 'China')\ncities = ('Tokyo', 'Seoul', 'Beijing')\nfor country, city in zip(countries, cities):\n print(f'The capital of {country} is {city}')",
"_____no_output_____"
]
],
[
[
"The `zip()` function is also useful for creating dictionaries — for\nexample",
"_____no_output_____"
]
],
[
[
"names = ['Tom', 'John']\nmarks = ['E', 'F']\ndict(zip(names, marks))",
"_____no_output_____"
]
],
[
[
"If we actually need the index from a list, one option is to use `enumerate()`.\n\nTo understand what `enumerate()` does, consider the following example",
"_____no_output_____"
]
],
[
[
"letter_list = ['a', 'b', 'c']\nfor index, letter in enumerate(letter_list):\n print(f\"letter_list[{index}] = '{letter}'\")",
"_____no_output_____"
]
],
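[
[
"`enumerate()` also accepts an optional `start` value, which is handy when counting from 1:",
"_____no_output_____"
]
],
[
[
"for number, letter in enumerate(letter_list, start=1):\n    print(f\"item {number} is '{letter}'\")",
"_____no_output_____"
]
],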
[
[
"### List Comprehensions\n\n\n<a id='index-8'></a>\nWe can also simplify the code for generating the list of random draws considerably by using something called a *list comprehension*.\n\n[List comprehensions](https://en.wikipedia.org/wiki/List_comprehension) are an elegant Python tool for creating lists.\n\nConsider the following example, where the list comprehension is on the\nright-hand side of the second line",
"_____no_output_____"
]
],
[
[
"animals = ['dog', 'cat', 'bird']\nplurals = [animal + 's' for animal in animals]\nplurals",
"_____no_output_____"
]
],
[
[
"Here’s another example",
"_____no_output_____"
]
],
[
[
"range(8)",
"_____no_output_____"
],
[
"doubles = [2 * x for x in range(8)]\ndoubles",
"_____no_output_____"
]
],
[
[
"## Comparisons and Logical Operators",
"_____no_output_____"
],
[
"### Comparisons\n\n\n<a id='index-9'></a>\nMany different kinds of expressions evaluate to one of the Boolean values (i.e., `True` or `False`).\n\nA common type is comparisons, such as",
"_____no_output_____"
]
],
[
[
"x, y = 1, 2\nx < y",
"_____no_output_____"
],
[
"x > y",
"_____no_output_____"
]
],
[
[
"One of the nice features of Python is that we can *chain* inequalities",
"_____no_output_____"
]
],
[
[
"1 < 2 < 3",
"_____no_output_____"
],
[
"1 <= 2 <= 3",
"_____no_output_____"
]
],
[
[
"As we saw earlier, when testing for equality we use `==`",
"_____no_output_____"
]
],
[
[
"x = 1 # Assignment\nx == 2 # Comparison",
"_____no_output_____"
]
],
[
[
"For “not equal” use `!=`",
"_____no_output_____"
]
],
[
[
"1 != 2",
"_____no_output_____"
]
],
[
[
"Note that when testing conditions, we can use **any** valid Python expression",
"_____no_output_____"
]
],
[
[
"x = 'yes' if 42 else 'no'\nx",
"_____no_output_____"
],
[
"x = 'yes' if [] else 'no'\nx",
"_____no_output_____"
]
],
[
[
"What’s going on here?\n\nThe rule is:\n\n- Expressions that evaluate to zero, empty sequences or containers (strings, lists, etc.) and `None` are all equivalent to `False`. \n - for example, `[]` and `()` are equivalent to `False` in an `if` clause \n- All other values are equivalent to `True`. \n - for example, `42` is equivalent to `True` in an `if` clause ",
"_____no_output_____"
],
[
"### Combining Expressions\n\n\n<a id='index-10'></a>\nWe can combine expressions using `and`, `or` and `not`.\n\nThese are the standard logical connectives (conjunction, disjunction and denial)",
"_____no_output_____"
]
],
[
[
"1 < 2 and 'f' in 'foo'",
"_____no_output_____"
],
[
"1 < 2 and 'g' in 'foo'",
"_____no_output_____"
],
[
"1 < 2 or 'g' in 'foo'",
"_____no_output_____"
],
[
"not True",
"_____no_output_____"
],
[
"not not True",
"_____no_output_____"
]
],
[
[
"Remember\n\n- `P and Q` is `True` if both are `True`, else `False` \n- `P or Q` is `False` if both are `False`, else `True` ",
"_____no_output_____"
],
[
"## More Functions\n\n\n<a id='index-11'></a>\nLet’s talk a bit more about functions, which are all important for good programming style.",
"_____no_output_____"
],
[
"### The Flexibility of Python Functions\n\nAs we discussed in the [previous lecture](https://python-programming.quantecon.org/python_by_example.html#python-by-example), Python functions are very flexible.\n\nIn particular\n\n- Any number of functions can be defined in a given file. \n- Functions can be (and often are) defined inside other functions. \n- Any object can be passed to a function as an argument, including other functions. \n- A function can return any kind of object, including functions. \n\n\nWe already [gave an example](https://python-programming.quantecon.org/functions.html#test-program-6) of how straightforward it is to pass a function to\na function.\n\nNote that a function can have arbitrarily many `return` statements (including zero).\n\nExecution of the function terminates when the first return is hit, allowing\ncode like the following example",
"_____no_output_____"
]
],
[
[
"def f(x):\n if x < 0:\n return 'negative'\n return 'nonnegative'",
"_____no_output_____"
]
],
[
[
"Functions without a return statement automatically return the special Python object `None`.",
"_____no_output_____"
],
[
"### Docstrings\n\n\n<a id='index-12'></a>\nPython has a system for adding comments to functions, modules, etc. called *docstrings*.\n\nThe nice thing about docstrings is that they are available at run-time.\n\nTry running this",
"_____no_output_____"
]
],
[
[
"def f(x):\n \"\"\"\n This function squares its argument\n \"\"\"\n return x**2",
"_____no_output_____"
]
],
[
[
"After running this code, the docstring is available",
"_____no_output_____"
]
],
[
[
"f?",
"_____no_output_____"
]
],
[
[
"```ipython\nType: function\nString Form:<function f at 0x2223320>\nFile: /home/john/temp/temp.py\nDefinition: f(x)\nDocstring: This function squares its argument\n```\n",
"_____no_output_____"
]
],
[
[
"f??",
"_____no_output_____"
]
],
[
[
"```ipython\nType: function\nString Form:<function f at 0x2223320>\nFile: /home/john/temp/temp.py\nDefinition: f(x)\nSource:\ndef f(x):\n \"\"\"\n This function squares its argument\n \"\"\"\n return x**2\n```\n",
"_____no_output_____"
],
[
"With one question mark we bring up the docstring, and with two we get the source code as well.",
"_____no_output_____"
],
[
"### One-Line Functions: `lambda`\n\n\n<a id='index-13'></a>\nThe `lambda` keyword is used to create simple functions on one line.\n\nFor example, the definitions",
"_____no_output_____"
]
],
[
[
"def f(x):\n return x**3",
"_____no_output_____"
]
],
[
[
"and",
"_____no_output_____"
]
],
[
[
"f = lambda x: x**3",
"_____no_output_____"
]
],
[
[
"are entirely equivalent.\n\nTo see why `lambda` is useful, suppose that we want to calculate $ \\int_0^2 x^3 dx $ (and have forgotten our high-school calculus).\n\nThe SciPy library has a function called `quad` that will do this calculation for us.\n\nThe syntax of the `quad` function is `quad(f, a, b)` where `f` is a function and `a` and `b` are numbers.\n\nTo create the function $ f(x) = x^3 $ we can use `lambda` as follows",
"_____no_output_____"
]
],
[
[
"from scipy.integrate import quad\n\nquad(lambda x: x**3, 0, 2)",
"_____no_output_____"
]
],
[
[
"Here the function created by `lambda` is said to be *anonymous* because it was never given a name.",
"_____no_output_____"
],
[
"### Keyword Arguments\n\n\n<a id='index-14'></a>\nIn a [previous lecture](https://python-programming.quantecon.org/python_by_example.html#python-by-example), you came across the statement",
"_____no_output_____"
],
[
"```python3\nplt.plot(x, 'b-', label=\"white noise\")\n```\n",
"_____no_output_____"
],
[
"In this call to Matplotlib’s `plot` function, notice that the last argument is passed in `name=argument` syntax.\n\nThis is called a *keyword argument*, with `label` being the keyword.\n\nNon-keyword arguments are called *positional arguments*, since their meaning\nis determined by order\n\n- `plot(x, 'b-', label=\"white noise\")` is different from `plot('b-', x, label=\"white noise\")` \n\n\nKeyword arguments are particularly useful when a function has a lot of arguments, in which case it’s hard to remember the right order.\n\nYou can adopt keyword arguments in user-defined functions with no difficulty.\n\nThe next example illustrates the syntax",
"_____no_output_____"
]
],
[
[
"def f(x, a=1, b=1):\n return a + b * x",
"_____no_output_____"
]
],
[
[
"The keyword argument values we supplied in the definition of `f` become the default values",
"_____no_output_____"
]
],
[
[
"f(2)",
"_____no_output_____"
]
],
[
[
"They can be modified as follows",
"_____no_output_____"
]
],
[
[
"f(2, a=4, b=5)",
"_____no_output_____"
]
],
[
[
"## Coding Style and PEP8\n\n\n<a id='index-15'></a>\nTo learn more about the Python programming philosophy type `import this` at the prompt.\n\nAmong other things, Python strongly favors consistency in programming style.\n\nWe’ve all heard the saying about consistency and little minds.\n\nIn programming, as in mathematics, the opposite is true\n\n- A mathematical paper where the symbols $ \\cup $ and $ \\cap $ were\n reversed would be very hard to read, even if the author told you so on the\n first page. \n\n\nIn Python, the standard style is set out in [PEP8](https://www.python.org/dev/peps/pep-0008/).\n\n(Occasionally we’ll deviate from PEP8 in these lectures to better match mathematical notation)",
"_____no_output_____"
],
[
"## Exercises\n\nSolve the following exercises.\n\n(For some, the built-in function `sum()` comes in handy).\n\n\n<a id='pyess-ex1'></a>",
"_____no_output_____"
],
[
"### Exercise 1\n\nPart 1: Given two numeric lists or tuples `x_vals` and `y_vals` of equal length, compute\ntheir inner product using `zip()`.\n\nPart 2: In one line, count the number of even numbers in 0,…,99.\n\n- Hint: `x % 2` returns 0 if `x` is even, 1 otherwise. \n\n\nPart 3: Given `pairs = ((2, 5), (4, 2), (9, 8), (12, 10))`, count the number of pairs `(a, b)`\nsuch that both `a` and `b` are even.\n\n\n<a id='pyess-ex2'></a>",
"_____no_output_____"
],
[
"### Exercise 2\n\nConsider the polynomial\n\n\n<a id='equation-polynom0'></a>\n$$\np(x)\n= a_0 + a_1 x + a_2 x^2 + \\cdots a_n x^n\n= \\sum_{i=0}^n a_i x^i \\tag{5.1}\n$$\n\nWrite a function `p` such that `p(x, coeff)` that computes the value in [(5.1)](#equation-polynom0) given a point `x` and a list of coefficients `coeff`.\n\nTry to use `enumerate()` in your loop.\n\n\n<a id='pyess-ex3'></a>",
"_____no_output_____"
],
[
"### Exercise 3\n\nWrite a function that takes a string as an argument and returns the number of capital letters in the string.\n\nHint: `'foo'.upper()` returns `'FOO'`.\n\n\n<a id='pyess-ex4'></a>",
"_____no_output_____"
],
[
"### Exercise 4\n\nWrite a function that takes two sequences `seq_a` and `seq_b` as arguments and\nreturns `True` if every element in `seq_a` is also an element of `seq_b`, else\n`False`.\n\n- By “sequence” we mean a list, a tuple or a string. \n- Do the exercise without using [sets](https://docs.python.org/3/tutorial/datastructures.html#sets) and set methods. \n\n\n\n<a id='pyess-ex5'></a>",
"_____no_output_____"
],
[
"### Exercise 5\n\nWhen we cover the numerical libraries, we will see they include many\nalternatives for interpolation and function approximation.\n\nNevertheless, let’s write our own function approximation routine as an exercise.\n\nIn particular, without using any imports, write a function `linapprox` that takes as arguments\n\n- A function `f` mapping some interval $ [a, b] $ into $ \\mathbb R $. \n- Two scalars `a` and `b` providing the limits of this interval. \n- An integer `n` determining the number of grid points. \n- A number `x` satisfying `a <= x <= b`. \n\n\nand returns the [piecewise linear interpolation](https://en.wikipedia.org/wiki/Linear_interpolation) of `f` at `x`, based on `n` evenly spaced grid points `a = point[0] < point[1] < ... < point[n-1] = b`.\n\nAim for clarity, not efficiency.",
"_____no_output_____"
],
[
"### Exercise 6\n\nUsing list comprehension syntax, we can simplify the loop in the following\ncode.",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\nn = 100\nϵ_values = []\nfor i in range(n):\n e = np.random.randn()\n ϵ_values.append(e)",
"_____no_output_____"
]
],
[
[
"## Solutions",
"_____no_output_____"
],
[
"### Exercise 1",
"_____no_output_____"
],
[
"#### Part 1 Solution:\n\nHere’s one possible solution",
"_____no_output_____"
]
],
[
[
"x_vals = [1, 2, 3]\ny_vals = [1, 1, 1]\nsum([x * y for x, y in zip(x_vals, y_vals)])",
"_____no_output_____"
]
],
[
[
"This also works",
"_____no_output_____"
]
],
[
[
"sum(x * y for x, y in zip(x_vals, y_vals))",
"_____no_output_____"
]
],
[
[
"#### Part 2 Solution:\n\nOne solution is",
"_____no_output_____"
]
],
[
[
"sum([x % 2 == 0 for x in range(100)])",
"_____no_output_____"
]
],
[
[
"This also works:",
"_____no_output_____"
]
],
[
[
"sum(x % 2 == 0 for x in range(100))",
"_____no_output_____"
]
],
[
[
"Some less natural alternatives that nonetheless help to illustrate the\nflexibility of list comprehensions are",
"_____no_output_____"
]
],
[
[
"len([x for x in range(100) if x % 2 == 0])",
"_____no_output_____"
]
],
[
[
"and",
"_____no_output_____"
]
],
[
[
"sum([1 for x in range(100) if x % 2 == 0])",
"_____no_output_____"
]
],
[
[
"#### Part 3 Solution\n\nHere’s one possibility",
"_____no_output_____"
]
],
[
[
"pairs = ((2, 5), (4, 2), (9, 8), (12, 10))\nsum([x % 2 == 0 and y % 2 == 0 for x, y in pairs])",
"_____no_output_____"
]
],
[
[
"### Exercise 2",
"_____no_output_____"
]
],
[
[
"def p(x, coeff):\n return sum(a * x**i for i, a in enumerate(coeff))",
"_____no_output_____"
],
[
"p(1, (2, 4))",
"_____no_output_____"
]
],
[
[
"### Exercise 3\n\nHere’s one solution:",
"_____no_output_____"
]
],
[
[
"def f(string):\n count = 0\n for letter in string:\n if letter == letter.upper() and letter.isalpha():\n count += 1\n return count\n\nf('The Rain in Spain')",
"_____no_output_____"
]
],
[
[
"An alternative, more pythonic solution:",
"_____no_output_____"
]
],
[
[
"def count_uppercase_chars(s):\n return sum([c.isupper() for c in s])\n\ncount_uppercase_chars('The Rain in Spain')",
"_____no_output_____"
]
],
[
[
"### Exercise 4\n\nHere’s a solution:",
"_____no_output_____"
]
],
[
[
"def f(seq_a, seq_b):\n is_subset = True\n for a in seq_a:\n if a not in seq_b:\n is_subset = False\n return is_subset\n\n# == test == #\n\nprint(f([1, 2], [1, 2, 3]))\nprint(f([1, 2, 3], [1, 2]))",
"_____no_output_____"
]
],
[
[
"Of course, if we use the `sets` data type then the solution is easier",
"_____no_output_____"
]
],
[
[
"def f(seq_a, seq_b):\n return set(seq_a).issubset(set(seq_b))",
"_____no_output_____"
]
],
[
[
"### Exercise 5",
"_____no_output_____"
]
],
[
[
"def linapprox(f, a, b, n, x):\n \"\"\"\n Evaluates the piecewise linear interpolant of f at x on the interval\n [a, b], with n evenly spaced grid points.\n\n Parameters\n ==========\n f : function\n The function to approximate\n\n x, a, b : scalars (floats or integers)\n Evaluation point and endpoints, with a <= x <= b\n\n n : integer\n Number of grid points\n\n Returns\n =======\n A float. The interpolant evaluated at x\n\n \"\"\"\n length_of_interval = b - a\n num_subintervals = n - 1\n step = length_of_interval / num_subintervals\n\n # === find first grid point larger than x === #\n point = a\n while point <= x:\n point += step\n\n # === x must lie between the gridpoints (point - step) and point === #\n u, v = point - step, point\n\n return f(u) + (x - u) * (f(v) - f(u)) / (v - u)",
"_____no_output_____"
]
],
[
[
"### Exercise 6\n\nHere’s one solution.",
"_____no_output_____"
]
],
[
[
"n = 100\nϵ_values = [np.random.randn() for i in range(n)]",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79eb5f3f553973c5668094a18ffbac4382a943a | 8,440 | ipynb | Jupyter Notebook | notebooks/NumPy Compat.ipynb | costrouc/uarray | c3c42147181a88265942ad5f9cf439467f746782 | [
"BSD-3-Clause"
] | null | null | null | notebooks/NumPy Compat.ipynb | costrouc/uarray | c3c42147181a88265942ad5f9cf439467f746782 | [
"BSD-3-Clause"
] | null | null | null | notebooks/NumPy Compat.ipynb | costrouc/uarray | c3c42147181a88265942ad5f9cf439467f746782 | [
"BSD-3-Clause"
] | null | null | null | 22.387268 | 326 | 0.484005 | [
[
[
"# `uarray` NumPy Compatability",
"_____no_output_____"
]
],
[
[
"from uarray import *\nimport numpy as np\nfrom numba import njit",
"_____no_output_____"
]
],
[
[
"## Original Expression",
"_____no_output_____"
],
[
"Let's look at this simple NumPy expression of calling the outer production of two values and then indexing it:",
"_____no_output_____"
]
],
[
[
"def some_fn(a, b):\n return np.multiply.outer(a, b)[5]",
"_____no_output_____"
]
],
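[
[
"Indexing row 5 of the outer product is equivalent to multiplying the scalar `a[5]` by the whole of `b`, which hints at how much of the computation is wasted. A small added check:",
"_____no_output_____"
]
],
[
[
"a, b = np.arange(1000), np.arange(10)\nnp.array_equal(some_fn(a, b), a[5] * b)",
"_____no_output_____"
]
],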
[
[
"We can see that this does a lot of extra work, since we discard most of the results of the outer product after indexing. We can look at the time:",
"_____no_output_____"
]
],
[
[
"args = [np.arange(1000), np.arange(10)]",
"_____no_output_____"
],
[
"# NBVAL_IGNORE_OUTPUT\n%timeit some_fn(*args)",
"27.5 µs ± 214 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)\n"
]
],
[
[
"## Uarray reduced",
"_____no_output_____"
],
[
"Now let's use uarray's `optimize` decorator to create an updated function that specifes the dimensionality of the arrays to produced an optimized form:",
"_____no_output_____"
]
],
[
[
"# enable_logging()",
"_____no_output_____"
],
[
"optimized_some_fn = optimize(args[0].shape, args[1].shape)(some_fn)",
"_____no_output_____"
]
],
[
[
"Now let's try our function out to see if it's faster:",
"_____no_output_____"
]
],
[
[
"# NBVAL_IGNORE_OUTPUT\n%timeit optimized_some_fn(*args)",
"5.47 µs ± 48.5 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)\n"
]
],
[
[
"Yep about 10x as fast. Let's look at how this is done! First, we create an abstract representation of the array operations:",
"_____no_output_____"
]
],
[
[
"optimized_some_fn.__optimize_steps__['resulting_expr']",
"_____no_output_____"
]
],
[
[
"Then, we compile that to Python AST:",
"_____no_output_____"
]
],
[
[
"print(optimized_some_fn.__optimize_steps__['ast_as_source'])",
"\n\ndef fn(a, b):\n i_5 = ()\n i_6 = 10\n i_1 = ((i_6,) + i_5)\n i_0 = np.empty(i_1)\n i_2 = 10\n for i_3 in range(i_2):\n i_4 = i_0[i_3]\n i_9 = 5\n i_10 = a\n i_13 = i_10[i_9]\n i_11 = i_3\n i_12 = b\n i_14 = i_12[i_11]\n i_4 = (i_13 * i_14)\n i_0[i_3] = i_4\n return i_0\n\n"
]
],
[
[
"## Numba optimized",
"_____no_output_____"
],
[
"To give this an extra speed boost, we can compile the returned expression with Numba:",
"_____no_output_____"
]
],
[
[
"numba_optimized = njit(optimized_some_fn)",
"_____no_output_____"
],
[
"# NBVAL_IGNORE_OUTPUT\n# run once first to compile\nnumba_optimized(*args)\n \n%timeit numba_optimized(*args)",
"876 ns ± 16.7 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)\n"
]
],
[
[
"Great, another speedup!",
"_____no_output_____"
],
[
"## Unkown dimensionality?",
"_____no_output_____"
],
[
"What if we want to produce a version of the function that works on any dimensional input? Or if we just want to actually defer to NumPy's implementation and not replace `outer`? We simply omit the `with_dim` methods and we get back an abstract representation that is compiled without any knowledge of the dimensionality:",
"_____no_output_____"
]
],
[
[
"dims_not_known = optimize(some_fn)",
"_____no_output_____"
],
[
"dims_not_known.__optimize_steps__['resulting_expr']",
"_____no_output_____"
],
[
"print(dims_not_known.__optimize_steps__['ast_as_source'])",
"\n\ndef fn(a, b):\n i_18 = 5\n i_16 = a\n i_17 = b\n i_19 = np.multiply.outer(i_16, i_17)\n i_15 = i_19[i_18]\n return i_15\n\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
]
] |
e79ecd5eb3a572759793be812bce89d108a46a0a | 427,031 | ipynb | Jupyter Notebook | twitter-crawler.ipynb | phdabel/twitter-crawler | 30ba2935ef2706ea56f0ca8ee737e7c21004bf84 | [
"Apache-2.0"
] | null | null | null | twitter-crawler.ipynb | phdabel/twitter-crawler | 30ba2935ef2706ea56f0ca8ee737e7c21004bf84 | [
"Apache-2.0"
] | null | null | null | twitter-crawler.ipynb | phdabel/twitter-crawler | 30ba2935ef2706ea56f0ca8ee737e7c21004bf84 | [
"Apache-2.0"
] | null | null | null | 40.739458 | 293 | 0.537397 | [
[
[
"import os\nimport tweepy\nimport pandas as pd",
"_____no_output_____"
]
],
[
[
"## Tweepy\n\nPara acessar a API do twitter, deve-se acessar a área de desenvolvedor, criar uma aplicação e solicitar as credenciais de acesso. As células abaixo demonstram como fazer a conexão na API do twitter usando a biblioteca python Tweepy e retornar a lista de tweets na timeline do usuário.\n\n[Documentação do tweepy](https://tweepy.readthedocs.io/)",
"_____no_output_____"
]
],
[
[
"consumer_key = os.environ['CONSUMER_API_KEY']\nconsumer_secret = os.environ['CONSUMER_API_SECRET']\naccess_token = os.environ['ACCESS_TOKEN']\naccess_token_secret = os.environ['ACCESS_TOKEN_SECRET']",
"_____no_output_____"
],
[
"auth = tweepy.OAuthHandler(consumer_key, consumer_secret)\nauth.set_access_token(access_token, access_token_secret)\n\napi = tweepy.API(auth)",
"_____no_output_____"
]
],
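[
[
"It can help to confirm the credentials before going on. A minimal check, assuming the environment variables above are set and that this Tweepy version exposes `verify_credentials()`:",
"_____no_output_____"
]
],
[
[
"# Raises an error if the credentials are invalid\nme = api.verify_credentials()\nprint(me.screen_name)",
"_____no_output_____"
]
],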
[
[
"Recuperar tweets da timeline",
"_____no_output_____"
]
],
[
[
"public_tweets = api.home_timeline()",
"_____no_output_____"
]
],
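[
[
"A quick look at what came back (added example):",
"_____no_output_____"
]
],
[
[
"for tweet in public_tweets[:3]:\n    print(tweet.text)\n    print('---')",
"_____no_output_____"
]
],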
[
[
"Criar um dicionário vazio para incluir os tweets.\nPasso necessário para criar um dataframe.",
"_____no_output_____"
]
],
[
[
"tweets = {'id':[],'author':[], 'screen_name':[], 'tweet':[],'created_at':[],'language':[],'n_retweets':[],'n_likes':[]}",
"_____no_output_____"
],
[
"ct = 0\nnum_results = 1000\nresult_count = 0\nlast_id = None\nwhile result_count < num_results:\n gremio_tweets = api.search(q='gremio',lang='pt',since_id=last_id)\n for tweet in gremio_tweets:\n print(\"id: \"+tweet.id_str)\n print(\"Screen name: \"+tweet.author.screen_name)\n print(\"Autor: \"+tweet.author.name)\n print(\"Tweet: \"+tweet.text)\n print(\"Data de criação: \"+tweet.created_at.strftime(\"%d/%m/%Y, %H:%M:%S\"))\n print(\"Idioma: \"+tweet.lang)\n print(\"Retweets: \"+str(tweet.retweet_count))\n print(\"Curtidas: \"+str(tweet.favorite_count))\n tweets['id'].append(tweet.id)\n tweets['screen_name'].append(tweet.author.screen_name)\n tweets['author'].append(tweet.author.name)\n tweets['tweet'].append(tweet.text)\n tweets['created_at'].append(tweet.created_at)\n tweets['language'].append(tweet.lang)\n tweets['n_retweets'].append(tweet.retweet_count)\n tweets['n_likes'].append(tweet.favorite_count)\n print(\"==========================\")\n result_count += 1",
"id: 1201655788508471298\nScreen name: RauenPaian\nAutor: IMORTAL 🖤💙🖤💙\nTweet: RT @SoccerGremio: A informação que nos chega, é que são apenas detalhes a serem finalizados entre Grêmio e Palmeiras por Raphael Veiga. Os…\nData de criação: 03/12/2019, 00:14:00\nIdioma: pt\nRetweets: 54\nCurtidas: 0\n==========================\nid: 1201655782686760960\nScreen name: regisschuch\nAutor: Régis\nTweet: Esperamos pela parte do @Gremio Que não contrate esse Egídio. JOGADOR muito ruim.\nData de criação: 03/12/2019, 00:13:58\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655773706833922\nScreen name: magrin00\nAutor: 🄼🄰🄶🅁🄸🄽ᶜʳᶠ ⚫🔴\nTweet: RT @TozzaFla: Fifa reconhece títulos mundiais de Flamengo, Grêmio, Santos e São Paulo | futebol internacional | Globoesporte https://t.co/A…\nData de criação: 03/12/2019, 00:13:56\nIdioma: pt\nRetweets: 232\nCurtidas: 0\n==========================\nid: 1201655770200317958\nScreen name: BaseFuracao\nAutor: Base Furacão\nTweet: Em atuação coletiva, 3-0 vs Boca.\nEm emoção e vibração, Remontada vs Grêmio. https://t.co/ebQwsoBUmo\nData de criação: 03/12/2019, 00:13:55\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655768107356164\nScreen name: Proerrd\nAutor: Vivi pereiraⓟ\nTweet: @di_dinelli @isastrevisan comissão rainha grêmio nadinha\nData de criação: 03/12/2019, 00:13:55\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655765859262465\nScreen name: jean_wosch\nAutor: Jean Wosch\nTweet: @omenguista @LibertadoresBR @AthleticoPR @Flamengo @Gremio @Palmeiras @SantosFC @SaoPauloFC Não reclama, se não pio… https://t.co/n60v1B2Mby\nData de criação: 03/12/2019, 00:13:54\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655763921453056\nScreen name: avalyzinho\nAutor: nx.avaly 15/11\nTweet: RT @Jhorobert11: Obrigado senhor 🙏🏽🙏🏽\nFeliz por mais uma vitória e pelos 2 gols ⚽️⚽️ Isso é Grêmio 🇪🇪🇪🇪\nJR11 https://t.co/AEzxnn3GLt\nData de criação: 03/12/2019, 00:13:54\nIdioma: pt\nRetweets: 20\nCurtidas: 0\n==========================\nid: 1201655762931666948\nScreen name: DanielZ80970238\nAutor: Daniel Zurita\nTweet: RT @TozzaFla: Fifa reconhece títulos mundiais de Flamengo, Grêmio, Santos e São Paulo | futebol internacional | Globoesporte https://t.co/A…\nData de criação: 03/12/2019, 00:13:54\nIdioma: pt\nRetweets: 232\nCurtidas: 0\n==========================\nid: 1201655755826442241\nScreen name: BianoRL1\nAutor: @BianoRL\nTweet: RT @ejramorim: Gostaria de agradecer as atletas, comissão e direção por essa temporada, é um enorme prazer poder conviver e aprender diaria…\nData de criação: 03/12/2019, 00:13:52\nIdioma: pt\nRetweets: 3\nCurtidas: 0\n==========================\nid: 1201655731537268736\nScreen name: LuizCar74542646\nAutor: Luiz Carlos\nTweet: Tu tens razão, qual era a meia cancha do grêmio em 2017, campeão da libertadores, quem entrou e nunca mais saiu Art… https://t.co/VNcZjOWna6\nData de criação: 03/12/2019, 00:13:46\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655713065512960\nScreen name: ceciliaraisa\nAutor: O Chelsea kerrLUTE 😒💙😒💦\nTweet: @manamaiara @leilinha_1910 @tathiane_vidal pra não dar briga é melhor ele ir pro Grêmio\nData de criação: 03/12/2019, 00:13:42\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655702562967552\nScreen name: cxcarloos\nAutor: carrlos\nTweet: RT @SoccerGremio: A informação que nos chega, é que são apenas detalhes a serem finalizados entre Grêmio e Palmeiras por Raphael 
Veiga. Os…\nData de criação: 03/12/2019, 00:13:39\nIdioma: pt\nRetweets: 54\nCurtidas: 0\n==========================\nid: 1201655693784338433\nScreen name: willianselong_\nAutor: ₩illian\nTweet: @cesarspo @sandra_kunst @Gremio Vo passar por burro nada o comentário é meu e quem decide sou eu, evito ladainha de pessoas como fosse.\nData de criação: 03/12/2019, 00:13:37\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655691825553411\nScreen name: pmsilva37\nAutor: Coalito 🐨\nTweet: @carlydamasceno2 Com empate também, caso o Grêmio vença o Cruzeiro.\nFicaria 3 pontos na frente e pelo número de vit… https://t.co/hA7TYBW0Jk\nData de criação: 03/12/2019, 00:13:37\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655682900078592\nScreen name: VNJS18\nAutor: Vinicin\nTweet: RT @RDTRubroNegro: Flamengo pré Jorge Jesus não vencia:\n\n- A Liberta há 38 anos.\n\n- O Brasileiro há 10 anos.\n\n- Na Arena da Baixada há 45 a…\nData de criação: 03/12/2019, 00:13:35\nIdioma: pt\nRetweets: 597\nCurtidas: 0\n==========================\nid: 1201655788508471298\nScreen name: RauenPaian\nAutor: IMORTAL 🖤💙🖤💙\nTweet: RT @SoccerGremio: A informação que nos chega, é que são apenas detalhes a serem finalizados entre Grêmio e Palmeiras por Raphael Veiga. Os…\nData de criação: 03/12/2019, 00:14:00\nIdioma: pt\nRetweets: 54\nCurtidas: 0\n==========================\nid: 1201655782686760960\nScreen name: regisschuch\nAutor: Régis\nTweet: Esperamos pela parte do @Gremio Que não contrate esse Egídio. JOGADOR muito ruim.\nData de criação: 03/12/2019, 00:13:58\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655773706833922\nScreen name: magrin00\nAutor: 🄼🄰🄶🅁🄸🄽ᶜʳᶠ ⚫🔴\nTweet: RT @TozzaFla: Fifa reconhece títulos mundiais de Flamengo, Grêmio, Santos e São Paulo | futebol internacional | Globoesporte https://t.co/A…\nData de criação: 03/12/2019, 00:13:56\nIdioma: pt\nRetweets: 232\nCurtidas: 0\n==========================\nid: 1201655770200317958\nScreen name: BaseFuracao\nAutor: Base Furacão\nTweet: Em atuação coletiva, 3-0 vs Boca.\nEm emoção e vibração, Remontada vs Grêmio. 
https://t.co/ebQwsoBUmo\nData de criação: 03/12/2019, 00:13:55\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655768107356164\nScreen name: Proerrd\nAutor: Vivi pereiraⓟ\nTweet: @di_dinelli @isastrevisan comissão rainha grêmio nadinha\nData de criação: 03/12/2019, 00:13:55\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655765859262465\nScreen name: jean_wosch\nAutor: Jean Wosch\nTweet: @omenguista @LibertadoresBR @AthleticoPR @Flamengo @Gremio @Palmeiras @SantosFC @SaoPauloFC Não reclama, se não pio… https://t.co/n60v1B2Mby\nData de criação: 03/12/2019, 00:13:54\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655763921453056\nScreen name: avalyzinho\nAutor: nx.avaly 15/11\nTweet: RT @Jhorobert11: Obrigado senhor 🙏🏽🙏🏽\nFeliz por mais uma vitória e pelos 2 gols ⚽️⚽️ Isso é Grêmio 🇪🇪🇪🇪\nJR11 https://t.co/AEzxnn3GLt\nData de criação: 03/12/2019, 00:13:54\nIdioma: pt\nRetweets: 20\nCurtidas: 0\n==========================\nid: 1201655762931666948\nScreen name: DanielZ80970238\nAutor: Daniel Zurita\nTweet: RT @TozzaFla: Fifa reconhece títulos mundiais de Flamengo, Grêmio, Santos e São Paulo | futebol internacional | Globoesporte https://t.co/A…\nData de criação: 03/12/2019, 00:13:54\nIdioma: pt\nRetweets: 232\nCurtidas: 0\n==========================\nid: 1201655755826442241\nScreen name: BianoRL1\nAutor: @BianoRL\nTweet: RT @ejramorim: Gostaria de agradecer as atletas, comissão e direção por essa temporada, é um enorme prazer poder conviver e aprender diaria…\nData de criação: 03/12/2019, 00:13:52\nIdioma: pt\nRetweets: 3\nCurtidas: 0\n==========================\nid: 1201655731537268736\nScreen name: LuizCar74542646\nAutor: Luiz Carlos\nTweet: Tu tens razão, qual era a meia cancha do grêmio em 2017, campeão da libertadores, quem entrou e nunca mais saiu Art… https://t.co/VNcZjOWna6\nData de criação: 03/12/2019, 00:13:46\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655713065512960\nScreen name: ceciliaraisa\nAutor: O Chelsea kerrLUTE 😒💙😒💦\nTweet: @manamaiara @leilinha_1910 @tathiane_vidal pra não dar briga é melhor ele ir pro Grêmio\nData de criação: 03/12/2019, 00:13:42\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655702562967552\nScreen name: cxcarloos\nAutor: carrlos\nTweet: RT @SoccerGremio: A informação que nos chega, é que são apenas detalhes a serem finalizados entre Grêmio e Palmeiras por Raphael Veiga. 
Os…\nData de criação: 03/12/2019, 00:13:39\nIdioma: pt\nRetweets: 54\nCurtidas: 0\n==========================\nid: 1201655693784338433\nScreen name: willianselong_\nAutor: ₩illian\nTweet: @cesarspo @sandra_kunst @Gremio Vo passar por burro nada o comentário é meu e quem decide sou eu, evito ladainha de pessoas como fosse.\nData de criação: 03/12/2019, 00:13:37\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655691825553411\nScreen name: pmsilva37\nAutor: Coalito 🐨\nTweet: @carlydamasceno2 Com empate também, caso o Grêmio vença o Cruzeiro.\nFicaria 3 pontos na frente e pelo número de vit… https://t.co/hA7TYBW0Jk\nData de criação: 03/12/2019, 00:13:37\nIdioma: pt\nRetweets: 0\nCurtidas: 0\n==========================\nid: 1201655682900078592\nScreen name: VNJS18\nAutor: Vinicin\nTweet: RT @RDTRubroNegro: Flamengo pré Jorge Jesus não vencia:\n\n- A Liberta há 38 anos.\n\n- O Brasileiro há 10 anos.\n\n- Na Arena da Baixada há 45 a…\nData de criação: 03/12/2019, 00:13:35\nIdioma: pt\nRetweets: 597\nCurtidas: 0\n==========================\n"
]
],
[
[
"Criação do dataframe",
"_____no_output_____"
]
],
[
[
"df = pd.DataFrame.from_dict(tweets)",
"_____no_output_____"
],
[
"df",
"_____no_output_____"
]
],
[
[
"Salvar o dataframe em csv para uso posterior.",
"_____no_output_____"
]
],
[
[
"df.to_csv('dataframe.csv')",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79ed749964f1155e0574401dfc398248121a83d | 272,245 | ipynb | Jupyter Notebook | Applied Data Science with Python Specialzation/Applied Plotting Charting and Data Representation in Python/Assignment2/Assignment2.ipynb | lynnxlmiao/Coursera | 8dc4073e29429dac14998689814388ee84435824 | [
"MIT"
] | null | null | null | Applied Data Science with Python Specialzation/Applied Plotting Charting and Data Representation in Python/Assignment2/Assignment2.ipynb | lynnxlmiao/Coursera | 8dc4073e29429dac14998689814388ee84435824 | [
"MIT"
] | null | null | null | Applied Data Science with Python Specialzation/Applied Plotting Charting and Data Representation in Python/Assignment2/Assignment2.ipynb | lynnxlmiao/Coursera | 8dc4073e29429dac14998689814388ee84435824 | [
"MIT"
] | null | null | null | 239.021071 | 219,747 | 0.889239 | [
[
[
"# Assignment 2\n\nBefore working on this assignment please read these instructions fully. In the submission area, you will notice that you can click the link to **Preview the Grading** for each step of the assignment. This is the criteria that will be used for peer grading. Please familiarize yourself with the criteria before beginning the assignment.\n\nAn NOAA dataset has been stored in the file `data/C2A2_data/BinnedCsvs_d100/4e86d2106d0566c6ad9843d882e72791333b08be3d647dcae4f4b110.csv`. The data for this assignment comes from a subset of The National Centers for Environmental Information (NCEI) [Daily Global Historical Climatology Network](https://www1.ncdc.noaa.gov/pub/data/ghcn/daily/readme.txt) (GHCN-Daily). The GHCN-Daily is comprised of daily climate records from thousands of land surface stations across the globe.\n\nEach row in the assignment datafile corresponds to a single observation.\n\nThe following variables are provided to you:\n\n* **id** : station identification code\n* **date** : date in YYYY-MM-DD format (e.g. 2012-01-24 = January 24, 2012)\n* **element** : indicator of element type\n * TMAX : Maximum temperature (tenths of degrees C)\n * TMIN : Minimum temperature (tenths of degrees C)\n* **value** : data value for element (tenths of degrees C)\n\nFor this assignment, you must:\n\n1. Read the documentation and familiarize yourself with the dataset, then write some python code which returns a line graph of the record high and record low temperatures by day of the year over the period 2005-2014. The area between the record high and record low temperatures for each day should be shaded.\n2. Overlay a scatter of the 2015 data for any points (highs and lows) for which the ten year record (2005-2014) record high or record low was broken in 2015.\n3. Watch out for leap days (i.e. February 29th), it is reasonable to remove these points from the dataset for the purpose of this visualization.\n4. Make the visual nice! Leverage principles from the first module in this course when developing your solution. Consider issues such as legends, labels, and chart junk.\n\nThe data you have been given is near **Singapore, Central Singapore Community Development Council, Singapore**, and the stations the data comes from are shown on the map below.",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\nimport mplleaflet\nimport pandas as pd\n\ndef leaflet_plot_stations(binsize, hashid):\n\n df = pd.read_csv('data/C2A2_data/BinSize_d{}.csv'.format(binsize))\n\n station_locations_by_hash = df[df['hash'] == hashid]\n\n lons = station_locations_by_hash['LONGITUDE'].tolist()\n lats = station_locations_by_hash['LATITUDE'].tolist()\n\n plt.figure(figsize=(8,8))\n\n plt.scatter(lons, lats, c='r', alpha=0.7, s=200)\n\n return mplleaflet.display()\n\nleaflet_plot_stations(100,'4e86d2106d0566c6ad9843d882e72791333b08be3d647dcae4f4b110')",
"_____no_output_____"
],
[
"# Import useful libraries\nimport matplotlib.pyplot as plt\nimport matplotlib.dates as dates\nimport matplotlib.ticker as ticker\nimport pandas as pd\nimport numpy as np\n\n%matplotlib notebook\n\n# Read the dataframe\ndf1 = pd.read_csv('data/C2A2_data/BinnedCsvs_d100/4e86d2106d0566c6ad9843d882e72791333b08be3d647dcae4f4b110.csv')\ndf1.head()",
"_____no_output_____"
],
[
"#How many records?\nlen(df1)",
"_____no_output_____"
],
[
"minimum = []\nmaximum = []\nmonth = []\n\n#remove February 29\ndf1 = df1[~(df1['Date'].str.endswith(r'02-29'))]\ntimes1 = pd.DatetimeIndex(df1['Date'])",
"_____no_output_____"
],
[
"#after removing Feb 29, how many records remaining\nlen(df1)",
"_____no_output_____"
],
[
"#Data for 2005-2014\ndf = df1[times1.year != 2015]\ntimes = pd.DatetimeIndex(df['Date'])\nfor j in df.groupby([times.month, times.day]):\n minimum.append(min(j[1]['Data_Value']))\n maximum.append(max(j[1]['Data_Value']))",
"_____no_output_____"
],
[
"#Data of 2015\ndf2015 = df1[times1.year == 2015]\ntimes2015 = pd.DatetimeIndex(df2015['Date'])\nminimum2015 = []\nmaximum2015 = []\nfor j in df2015.groupby([times2015.month, times2015.day]):\n minimum2015.append(min(j[1]['Data_Value']))\n maximum2015.append(max(j[1]['Data_Value']))",
"_____no_output_____"
],
[
"minaxis = []\nmaxaxis = []\nminvals = []\nmaxvals = []\nfor i in range(len(minimum)):\n if((minimum[i] - minimum2015[i]) > 0):\n minaxis.append(i)\n minvals.append(minimum2015[i])\n if((maximum[i] - maximum2015[i]) < 0):\n maxaxis.append(i)\n maxvals.append(maximum2015[i])",
"_____no_output_____"
],
[
"plt.figure()\ncolors = ['skyblue', 'lightcoral']\nplt.plot(minimum, c='skyblue', alpha = 0.5, label = 'Minimum Temperature (2005-14)')\nplt.plot(maximum, c ='lightcoral', alpha = 0.5, label = 'Maximum Temperature (2005-14)')\nplt.scatter(minaxis, minvals, s = 10, c = 'blue', label = 'Record Break Minimum (2015)')\nplt.scatter(maxaxis, maxvals, s = 10, c = 'red', label = 'Record Break Maximum (2015)')\nplt.gca().fill_between(range(len(minimum)), \n minimum, maximum, \n facecolor='lightgray', \n alpha=0.2)\n\nplt.ylim(0, 450)\nplt.legend(loc = 8, frameon=False, title='Temperature', fontsize=8)\nplt.xticks( np.linspace(15,15 + 30*11 , num = 12), (r'Jan', r'Feb', r'Mar', r'Apr', r'May', r'Jun', r'Jul', r'Aug', r'Sep', r'Oct', r'Nov', r'Dec') )\nplt.xlabel('Months')\nplt.ylabel('Temperature (tenths of degrees C)')\nplt.title(r'Temperature Summary Plot of Singapore (2005-2015)')\nplt.show()",
"_____no_output_____"
],
[
"plt.savefig('Temperature.png', transparent = True, bbox_inches = 'tight')",
"_____no_output_____"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79ede6103715450ffe8f734ed11f45c865e5a67 | 101,126 | ipynb | Jupyter Notebook | 2_classical_ml_approach.ipynb | funmilola09/Recurrent-Neural-Pipeline | 83ac34a33c6f82124bf9ebfc6ddc82f7d708f1a0 | [
"MIT"
] | 7 | 2020-01-04T04:54:55.000Z | 2021-01-06T14:44:09.000Z | 2_classical_ml_approach.ipynb | funmilola09/Recurrent-Neural-Pipeline | 83ac34a33c6f82124bf9ebfc6ddc82f7d708f1a0 | [
"MIT"
] | null | null | null | 2_classical_ml_approach.ipynb | funmilola09/Recurrent-Neural-Pipeline | 83ac34a33c6f82124bf9ebfc6ddc82f7d708f1a0 | [
"MIT"
] | 4 | 2020-01-04T04:55:27.000Z | 2021-01-06T14:44:22.000Z | 40.113447 | 407 | 0.420535 | [
[
[
"# Classical Machine Learning Approach\n\nIn this notebook we will be learning to\n 1. Create a Naive TF - IDF based Bag of Words representation of text.\n 2. Use classical ML models to solve text classification.\n 3. Use a One Vs Rest strategy to solve multi-label text classification.\n\n\n **HOT TIP** : *Save them as pickle for easy rendering for experiments*\n\n This Notebook uses code from https://github.com/susanli2016/Machine-Learning-with-Python/blob/master/Multi%20label%20text%20classification.ipynb\n",
"_____no_output_____"
]
],
[
[
"# Installing packages.\n!pip install contractions\n!pip install textsearch\n!pip install tqdm\n\n# Importing packages.\nimport nltk\nnltk.download('punkt')\nnltk.download('stopwords')\n%matplotlib inline\nimport re\nimport matplotlib\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport pandas as pd\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.feature_extraction.text import TfidfVectorizer\nfrom sklearn.naive_bayes import MultinomialNB\nfrom sklearn.metrics import accuracy_score\nfrom sklearn.multiclass import OneVsRestClassifier\nfrom nltk.corpus import stopwords\nstop_words = set(stopwords.words('english'))\nfrom sklearn.svm import LinearSVC\nfrom sklearn.linear_model import LogisticRegression\nfrom sklearn.pipeline import Pipeline\nimport seaborn as sns\nfrom sklearn.metrics import confusion_matrix, classification_report\nimport pickle\nimport ast\nfrom sklearn.externals import joblib\nfrom datetime import datetime\nfrom sklearn.preprocessing import MultiLabelBinarizer",
"Collecting contractions\n Downloading https://files.pythonhosted.org/packages/85/41/c3dfd5feb91a8d587ed1a59f553f07c05f95ad4e5d00ab78702fbf8fe48a/contractions-0.0.24-py2.py3-none-any.whl\nCollecting textsearch\n Downloading https://files.pythonhosted.org/packages/42/a8/03407021f9555043de5492a2bd7a35c56cc03c2510092b5ec018cae1bbf1/textsearch-0.0.17-py2.py3-none-any.whl\nCollecting Unidecode\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/d0/42/d9edfed04228bacea2d824904cae367ee9efd05e6cce7ceaaedd0b0ad964/Unidecode-1.1.1-py2.py3-none-any.whl (238kB)\n\u001b[K |████████████████████████████████| 245kB 4.0MB/s \n\u001b[?25hCollecting pyahocorasick\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/f4/9f/f0d8e8850e12829eea2e778f1c90e3c53a9a799b7f412082a5d21cd19ae1/pyahocorasick-1.4.0.tar.gz (312kB)\n\u001b[K |████████████████████████████████| 317kB 58.8MB/s \n\u001b[?25hBuilding wheels for collected packages: pyahocorasick\n Building wheel for pyahocorasick (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for pyahocorasick: filename=pyahocorasick-1.4.0-cp36-cp36m-linux_x86_64.whl size=81698 sha256=f163f074e78b5cd47a2b71398341f597c7b77d8aa2b7781783d0cc381dc4a4a6\n Stored in directory: /root/.cache/pip/wheels/0a/90/61/87a55f5b459792fbb2b7ba6b31721b06ff5cf6bde541b40994\nSuccessfully built pyahocorasick\nInstalling collected packages: Unidecode, pyahocorasick, textsearch, contractions\nSuccessfully installed Unidecode-1.1.1 contractions-0.0.24 pyahocorasick-1.4.0 textsearch-0.0.17\nRequirement already satisfied: textsearch in /usr/local/lib/python3.6/dist-packages (0.0.17)\nRequirement already satisfied: Unidecode in /usr/local/lib/python3.6/dist-packages (from textsearch) (1.1.1)\nRequirement already satisfied: pyahocorasick in /usr/local/lib/python3.6/dist-packages (from textsearch) (1.4.0)\nRequirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (4.28.1)\n[nltk_data] Downloading package punkt to /root/nltk_data...\n[nltk_data] Unzipping tokenizers/punkt.zip.\n[nltk_data] Downloading package stopwords to /root/nltk_data...\n[nltk_data] Unzipping corpora/stopwords.zip.\n"
],
[
"# Let's mount our G-Drive.\n\nfrom google.colab import drive\ndrive.mount('/content/drive', force_remount=True)",
"Mounted at /content/drive\n"
],
[
"# Data read and preparation.\n# Mentioning where is our data located on G-Drive. Make sure to rectify your path\npath = '/content/drive/My Drive/ICDMAI_Tutorial/notebook/'\ndata ='filtered_data/question_tag_text_mapping.pkl'\nml_model = path + 'ml_model/'",
"_____no_output_____"
],
[
"# Let us quickly load our question tag data\nquestion_tag = pd.read_pickle(path+data)\nquestion_tag.head(3)",
"_____no_output_____"
]
],
[
[
"### Creating one hot encoding from multilabelled tagged data",
"_____no_output_____"
]
],
[
[
"# In order to use one vs rest strategy we will need to one hot encoding each tag across all documents.\nmlb = MultiLabelBinarizer()\nquestion_tag['Tag_pop'] = question_tag['Tag']\nquestion_tag = question_tag.join(pd.DataFrame(mlb.fit_transform(question_tag.pop('Tag_pop')),\n columns=mlb.classes_,\n index=question_tag.index))\nquestion_tag.head(3)",
"_____no_output_____"
],
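[
"# Minimal self-contained sketch (the toy tag lists below are hypothetical) of what MultiLabelBinarizer does:\n# each row's tag list becomes one 0/1 indicator column per tag, which is exactly the join performed above.\nfrom sklearn.preprocessing import MultiLabelBinarizer\n\ntoy_tags = [['python', 'pandas'], ['java'], ['python']]\ntoy_mlb = MultiLabelBinarizer()\nprint(toy_mlb.fit_transform(toy_tags)) # 3 rows x 3 columns -> [[0 1 1], [1 0 0], [0 0 1]]\nprint(toy_mlb.classes_) # column order is sorted: ['java' 'pandas' 'python']",
"_____no_output_____"
],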
[
"# Creating a list of all existing 'Tags'\ndummy = question_tag.drop(['Id', 'OwnerUserId', 'CreationDate', 'ClosedDate', 'Score', 'Title','Body','Tag'], axis=1)\ncategories = list(dummy.columns.values)",
"_____no_output_____"
]
],
[
[
"### Text preprocessing",
"_____no_output_____"
]
],
[
[
"# Let us createa a very basic text preprocessor which we will use for cleaning text.\ndef clean_text(text):\n text = text.lower()\n text = re.sub(r\"what's\", \"what is \", text)\n text = re.sub(r\"\\'s\", \" \", text)\n text = re.sub(r\"\\'ve\", \" have \", text)\n text = re.sub(r\"can't\", \"can not \", text)\n text = re.sub(r\"n't\", \" not \", text)\n text = re.sub(r\"i'm\", \"i am \", text)\n text = re.sub(r\"\\'re\", \" are \", text)\n text = re.sub(r\"\\'d\", \" would \", text)\n text = re.sub(r\"\\'ll\", \" will \", text)\n text = re.sub(r\"\\'scuse\", \" excuse \", text)\n text = re.sub('\\W', ' ', text)\n text = re.sub('\\s+', ' ', text)\n text = text.strip(' ')\n return text\n\nquestion_tag['Body'] = question_tag['Body'].map(lambda com : clean_text(com))",
"_____no_output_____"
]
],
[
[
"### Creating a 70/30 Train-Test Split",
"_____no_output_____"
]
],
[
[
"train, test = train_test_split(question_tag, random_state=42, test_size=0.30, shuffle=True)\n\nX_train = train.Body\nX_test = test.Body\n\nprint(\"Train data shape : {}\".format(X_train.shape))\nprint(\"Test data shape : {}\".format(X_test.shape))",
"Train data shape : (736394,)\nTest data shape : (315598,)\n"
]
],
[
[
"# Creating Bag of Words representation using TF - IDF\n 1. Initializing the Vectorizer object\n 2. Create a corpus from training data.\n 3. Create a document term matrix",
"_____no_output_____"
]
],
[
[
"#Initializing the Vectorizer object\ntfidf = TfidfVectorizer(stop_words=stop_words)\n\n#Create a corpus from training data\n#Create a document term matrix of training data based on the corpus.\nX_train_dtm = tfidf.fit_transform(X_train)\n\n#Create a document term matrix of test data based on the corpus.\n#Note that the dimensions/columns of DTM of the test data will be based on the training data corpus only.\nX_test_dtm = tfidf.transform(X_test)",
"_____no_output_____"
],
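[
"# Minimal self-contained sketch (the toy corpus below is hypothetical) of why the test matrix shares the\n# training dimensions: the vocabulary is learned in fit_transform, and transform() scores only those words.\nfrom sklearn.feature_extraction.text import TfidfVectorizer\n\ntoy_train = ['java memory locking', 'python memory profiling']\ntoy_test = ['memory locking in rust'] # 'rust' is out-of-vocabulary and is simply ignored\ntoy_vec = TfidfVectorizer()\nprint(toy_vec.fit_transform(toy_train).shape) # (2, 5)\nprint(sorted(toy_vec.vocabulary_)) # ['java', 'locking', 'memory', 'profiling', 'python']\nprint(toy_vec.transform(toy_test).shape) # (1, 5) -- same columns as the training matrix",
"_____no_output_____"
]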
],
[
[
"## Pipeline\nscikit-learn provides a Pipeline utility to help automate machine learning workflows. Pipelines are very common in Machine Learning systems, since there is a lot of data to manipulate and many data transformations to apply. So we will utilize pipeline to train every classifier.\n\n## One Vs Rest Multilabel strategy\nThe Multi-label algorithm accepts a binary mask over multiple labels. The result for each prediction will be an array of 0s and 1s marking which class labels apply to each row input sample.\n\nOneVsRest strategy can be used for multilabel learning, where a classifier is used to predict multiple labels for instance. **Naive Bayes**, **SVM**, **Logistic Regression** supports multi-class, but we are in a multi-label scenario, therefore, we wrap them in the OneVsRestClassifier.\n\n### We create a Training Pipeline and a Scoring Pipeline",
"_____no_output_____"
]
],
[
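[
"# Before the full per-tag training loop below, a minimal self-contained sketch (the toy arrays are\n# hypothetical) of the one-vs-rest mechanics this notebook relies on: the wrapper fits one binary\n# LinearSVC per label column and predicts a 0/1 mask per row.\nimport numpy as np\nfrom sklearn.multiclass import OneVsRestClassifier\nfrom sklearn.svm import LinearSVC\n\ntoy_X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]])\ntoy_Y = np.array([[1, 0], [0, 1], [1, 1], [0, 0]]) # multi-label 0/1 mask, one column per label\ntoy_clf = OneVsRestClassifier(LinearSVC())\ntoy_clf.fit(toy_X, toy_Y)\nprint(len(toy_clf.estimators_)) # 2 -- one binary classifier per label\nprint(toy_clf.predict(toy_X)) # 0/1 mask with one column per label",
"_____no_output_____"
],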
[
"def tag_level_training_pipeline(X_train, train, X_test, test, classifier_pipeline, output_directory):\n \n #1. Create a classifier for each Tag\n for category in categories:\n print('... Processing {}'.format(category))\n \n # 1. train the model using X_dtm & y\n classifier_pipeline.fit(X_train, train[category])\n \n # 2. save the model to disk\n filename = ml_model + output_directory +str(category)+ '_model.pkl'\n joblib.dump(classifier_pipeline, filename, compress = 1)\n \n # 3. compute the testing accuracy\n prediction = classifier_pipeline.predict(X_test)\n print('Test accuracy is {}'.format(accuracy_score(test[category], prediction)))\n print(classification_report(test[category], prediction))\n\n",
"_____no_output_____"
],
[
"def tag_level_predict(X_train, train, X_test, test, model_directory):\n prediction_df = pd.DataFrame(columns=['dummy1'])\n \n #Score the document across classifier for each Tag\n for category in categories:\n \n # 1. load the model\n filename = ml_model + model_directory +str(category)+ '_model.pkl'\n classifier_pipeline = joblib.load(filename)\n \n # 2. predict on the test data.\n prediction = classifier_pipeline.predict(X_test)\n prediction_df[str(category)] = prediction\n\n # Remember We had encoded the labels. It time to bring them back to their original form.\n for category in categories:\n prediction_df.loc[prediction_df[str(category)] == 1, str(category)] = category\n prediction_df['predicted_labels'] = prediction_df[[str(i) for i in categories]].values.tolist()\n prediction_df['predicted_labels'] = prediction_df['predicted_labels'].apply(lambda x : list(set(x)))\n # prediction_df['predicted_labels'] = prediction_df['predicted_labels'].apply(lambda x: x.remove(0) if (0 in x) else x )\n \n # We create result having orignal labels and predicted labels for metrics Evaluation\n final_pred_df = pd.concat([test[['Id','Tag']].reset_index(), prediction_df[['predicted_labels']].reset_index()], axis=1)\n final_pred_df['original_labels'] = final_pred_df['Tag']\n # prediction_df[['Id']] = test[['Id']]\n final_pred_df_result = final_pred_df[['Id','original_labels','predicted_labels']]\n return final_pred_df_result",
"_____no_output_____"
],
[
"# importing os module \nimport os\ntry:\n os.rename('/content/drive/My Drive/ICDMAI_Tutorial/notebook/ml_model/SVM/_net_model.pkl', '/content/drive/My Drive/ICDMAI_Tutorial/notebook/ml_model/SVM/.net_model.pkl')\nexcept :\n print(\"Already in proper filename!\") ",
"_____no_output_____"
],
[
"## A Dummy example.\nX_test = [\"How to handle memory locking ?\", \"How to handle memory locking in java ?\", \"How to handle memory locking in java python ?\",\"This post is not about java\"]\nX_test_dtm = tfidf.transform(X_test)\nresult = tag_level_predict(X_train_dtm, train, X_test_dtm, test.head(1), 'SVM/')\n\nfor i in range(result.shape[0]):\n print(\"Input [\",X_test[i],\"] || Predicted classes: \",result.predicted_labels[i])",
"/usr/local/lib/python3.6/dist-packages/sklearn/utils/deprecation.py:144: FutureWarning: The sklearn.svm.classes module is deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.svm. Anything that cannot be imported from sklearn.svm is now part of the private API.\n warnings.warn(message, FutureWarning)\n/usr/local/lib/python3.6/dist-packages/sklearn/base.py:318: UserWarning: Trying to unpickle estimator LinearSVC from version 0.21.3 when using version 0.22.1. This might lead to breaking code or invalid results. Use at your own risk.\n UserWarning)\n/usr/local/lib/python3.6/dist-packages/sklearn/utils/deprecation.py:144: FutureWarning: The sklearn.preprocessing.label module is deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.preprocessing. Anything that cannot be imported from sklearn.preprocessing is now part of the private API.\n warnings.warn(message, FutureWarning)\n/usr/local/lib/python3.6/dist-packages/sklearn/base.py:318: UserWarning: Trying to unpickle estimator LabelBinarizer from version 0.21.3 when using version 0.22.1. This might lead to breaking code or invalid results. Use at your own risk.\n UserWarning)\n/usr/local/lib/python3.6/dist-packages/sklearn/base.py:318: UserWarning: Trying to unpickle estimator OneVsRestClassifier from version 0.21.3 when using version 0.22.1. This might lead to breaking code or invalid results. Use at your own risk.\n UserWarning)\n/usr/local/lib/python3.6/dist-packages/sklearn/base.py:318: UserWarning: Trying to unpickle estimator Pipeline from version 0.21.3 when using version 0.22.1. This might lead to breaking code or invalid results. Use at your own risk.\n UserWarning)\n"
]
],
[
[
"# Evaluating our results",
"_____no_output_____"
]
],
[
[
"# Here we define precision, recall, f1 measure at a single document level.\ndef document_evaluation_metrics(prd_grp,grp,metric=\"precision\"):\n pred_group = prd_grp\n if 0 in pred_group: pred_group.remove(0)\n group = grp\n\n set_pred_group = set(pred_group)\n set_group = set(group)\n intrsct = set_group.intersection(set_pred_group)\n accuracy = len(intrsct) / float(len(set_pred_group) if len(set_pred_group)>1 else 1)\n recall = len(intrsct) / float(len(set_group) if len(set_group)>1 else 1)\n if metric == \"precision\":\n return accuracy\n elif metric == \"recall\":\n return recall\n elif metric == \"f1_measure\":\n if accuracy == 0 or recall == 0:\n return 0\n elif accuracy > 0 and recall >0 :\n f1_measure = 2*accuracy*recall/(float(accuracy + recall))\n return f1_measure\n \n return -1\n\n# Provide overall average stats and populate document level metrics.\ndef model_evaluation_stats(final_pred_df, model_name=\"default\"):\n final_pred_df['doc_precision'] = final_pred_df.apply(lambda x: document_evaluation_metrics(x.predicted_labels, x.original_labels, \"precision\"), axis=1)\n final_pred_df['doc_recall'] = final_pred_df.apply(lambda x: document_evaluation_metrics(x.predicted_labels, x.original_labels, \"recall\"), axis=1)\n final_pred_df['doc_f1_measure'] = final_pred_df.apply(lambda x: document_evaluation_metrics(x.predicted_labels, x.original_labels, \"f1_measure\"), axis=1)\n \n print('Avearge precision across documents is {}'.format(final_pred_df['doc_precision'].mean()))\n print('Avearge recall across documents is {}'.format(final_pred_df['doc_recall'].mean()))\n print('Avearge f1 measure across documents is {}'.format(final_pred_df['doc_f1_measure'].mean()))\n pickle.dump(final_pred_df, open(ml_model + model_name + \".pkl\", 'wb'))\n # final_pred_df.to_csv(ml_model + 'SVM_Tag_predictions.txt',sep='\\t',index=False)",
"_____no_output_____"
],
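[
"# Quick hand check of the document-level metrics defined above (the label sets here are hypothetical):\n# predicted {a, b, c} against true {a, b, d} shares 2 labels, so precision = recall = f1 = 2/3.\nprint(document_evaluation_metrics(['a', 'b', 'c'], ['a', 'b', 'd'], 'precision'))\nprint(document_evaluation_metrics(['a', 'b', 'c'], ['a', 'b', 'd'], 'recall'))\nprint(document_evaluation_metrics(['a', 'b', 'c'], ['a', 'b', 'd'], 'f1_measure'))",
"_____no_output_____"
]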
],
[
[
"# Let us train, score and evaluate Naive Bayes",
"_____no_output_____"
]
],
[
[
"#Naive Bayes Classifier\nNB_pipeline = Pipeline([\n ('clf', OneVsRestClassifier(MultinomialNB(\n fit_prior=True, class_prior=None))),\n ])\n\ntag_level_training_pipeline(X_train_dtm, train, X_test_dtm, test, NB_pipeline, 'NaiveBayes/')\nresult = tag_level_predict(X_train_dtm, train, X_test_dtm, test, 'NaiveBayes/')\nmodel_evaluation_stats(result, \"NaiveBayes\")",
"_____no_output_____"
]
],
[
[
"# Let us train, score and evaluate Support Vector Machines",
"_____no_output_____"
]
],
[
[
"#SVM Classifier\nSVC_pipeline = Pipeline([\n ('clf', OneVsRestClassifier(LinearSVC(), n_jobs=1)),\n ])\n\ntag_level_training_pipeline(X_train_dtm, train, X_test_dtm, test, SVC_pipeline, 'SVM/')\nresult = tag_level_predict(X_train_dtm, train, X_test_dtm, test, 'SVM/')\nmodel_evaluation_stats(result, \"SVM\")",
"... Processing .net\nTest accuracy is 0.9771893358006071\n precision recall f1-score support\n\n 0 0.98 1.00 0.99 308362\n 1 0.51 0.09 0.15 7236\n\n accuracy 0.98 315598\n macro avg 0.75 0.54 0.57 315598\nweighted avg 0.97 0.98 0.97 315598\n\n... Processing agile\nTest accuracy is 0.9999429654180318\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 315573\n 1 0.89 0.32 0.47 25\n\n accuracy 1.00 315598\n macro avg 0.94 0.66 0.74 315598\nweighted avg 1.00 1.00 1.00 315598\n\n... Processing ajax\nTest accuracy is 0.9887356700612805\n precision recall f1-score support\n\n 0 0.99 1.00 0.99 310952\n 1 0.70 0.41 0.52 4646\n\n accuracy 0.99 315598\n macro avg 0.84 0.71 0.76 315598\nweighted avg 0.99 0.99 0.99 315598\n\n... Processing amazon-web-services\nTest accuracy is 0.9981780619649048\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 314643\n 1 0.79 0.55 0.65 955\n\n accuracy 1.00 315598\n macro avg 0.89 0.77 0.82 315598\nweighted avg 1.00 1.00 1.00 315598\n\n... Processing android\nTest accuracy is 0.9829815144582665\n precision recall f1-score support\n\n 0 0.99 1.00 0.99 288322\n 1 0.96 0.84 0.90 27276\n\n accuracy 0.98 315598\n macro avg 0.97 0.92 0.94 315598\nweighted avg 0.98 0.98 0.98 315598\n\n... Processing android-studio\nTest accuracy is 0.9972401599503166\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 314589\n 1 0.67 0.27 0.38 1009\n\n accuracy 1.00 315598\n macro avg 0.84 0.63 0.69 315598\nweighted avg 1.00 1.00 1.00 315598\n\n... Processing angular2\nTest accuracy is 0.9991064582158315\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 314898\n 1 0.94 0.64 0.76 700\n\n accuracy 1.00 315598\n macro avg 0.97 0.82 0.88 315598\nweighted avg 1.00 1.00 1.00 315598\n\n... Processing angularjs\nTest accuracy is 0.9949904625504598\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 309420\n 1 0.93 0.80 0.86 6178\n\n accuracy 0.99 315598\n macro avg 0.96 0.90 0.93 315598\nweighted avg 0.99 0.99 0.99 315598\n\n... Processing apache\nTest accuracy is 0.9952788040481879\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 313590\n 1 0.71 0.43 0.54 2008\n\n accuracy 1.00 315598\n macro avg 0.85 0.72 0.77 315598\nweighted avg 0.99 1.00 0.99 315598\n\n... Processing apache-spark\nTest accuracy is 0.999404305477221\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 314985\n 1 0.93 0.75 0.83 613\n\n accuracy 1.00 315598\n macro avg 0.97 0.87 0.91 315598\nweighted avg 1.00 1.00 1.00 315598\n\n... Processing api\nTest accuracy is 0.9951900835873485\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 314094\n 1 0.47 0.07 0.12 1504\n\n accuracy 1.00 315598\n macro avg 0.73 0.53 0.56 315598\nweighted avg 0.99 1.00 0.99 315598\n\n... Processing asp.net\nTest accuracy is 0.9815905043758199\n precision recall f1-score support\n\n 0 0.99 1.00 0.99 306664\n 1 0.79 0.48 0.60 8934\n\n accuracy 0.98 315598\n macro avg 0.89 0.74 0.79 315598\nweighted avg 0.98 0.98 0.98 315598\n\n... Processing asp.net-web-api\nTest accuracy is 0.9985804726265692\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 314999\n 1 0.72 0.41 0.52 599\n\n accuracy 1.00 315598\n macro avg 0.86 0.71 0.76 315598\nweighted avg 1.00 1.00 1.00 315598\n\n... Processing azure\nTest accuracy is 0.9987769250755708\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 314517\n 1 0.91 0.71 0.80 1081\n\n accuracy 1.00 315598\n macro avg 0.95 0.86 0.90 315598\nweighted avg 1.00 1.00 1.00 315598\n\n... 
Processing bash\nTest accuracy is 0.995047497132428\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 313311\n 1 0.76 0.46 0.58 2287\n\n accuracy 1.00 315598\n macro avg 0.88 0.73 0.79 315598\nweighted avg 0.99 1.00 0.99 315598\n\n... Processing c\nTest accuracy is 0.9872432651664459\n precision recall f1-score support\n\n 0 0.99 1.00 0.99 308691\n 1 0.81 0.55 0.65 6907\n\n accuracy 0.99 315598\n macro avg 0.90 0.77 0.82 315598\nweighted avg 0.99 0.99 0.99 315598\n\n... Processing c#\nTest accuracy is 0.9414159785549971\n precision recall f1-score support\n\n 0 0.95 0.98 0.97 285147\n 1 0.77 0.56 0.65 30451\n\n accuracy 0.94 315598\n macro avg 0.86 0.77 0.81 315598\nweighted avg 0.94 0.94 0.94 315598\n\n... Processing c++\nTest accuracy is 0.9785581657678439\n precision recall f1-score support\n\n 0 0.98 0.99 0.99 301367\n 1 0.85 0.63 0.73 14231\n\n accuracy 0.98 315598\n macro avg 0.92 0.81 0.86 315598\nweighted avg 0.98 0.98 0.98 315598\n\n... Processing cloud\nTest accuracy is 0.9995722406352385\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 315459\n 1 0.75 0.04 0.08 139\n\n accuracy 1.00 315598\n macro avg 0.87 0.52 0.54 315598\nweighted avg 1.00 1.00 1.00 315598\n\n... Processing codeigniter\nTest accuracy is 0.9979055634066122\n precision recall f1-score support\n\n 0 1.00 1.00 1.00 314150\n 1 0.90 0.61 0.73 1448\n\n accuracy 1.00 315598\n macro avg 0.95 0.80 0.86 315598\nweighted avg 1.00 1.00 1.00 315598\n\n... Processing css\nTest accuracy is 0.9785454914162954\n precision recall f1-score support\n\n 0 0.99 0.99 0.99 302936\n 1 0.78 0.64 0.71 12662\n\n accuracy 0.98 315598\n macro avg 0.88 0.82 0.85 315598\nweighted avg 0.98 0.98 0.98 315598\n\n... Processing devops\nTest accuracy is 0.9999524711816932\n"
]
],
[
[
"# Let us train, score and evaluate Logistic Regression",
"_____no_output_____"
]
],
[
[
"#Logistic Regression Classifier\nLogReg_pipeline = Pipeline([\n ('clf', OneVsRestClassifier(LogisticRegression(solver='sag'), n_jobs=1)),\n ])\n\ntag_level_training_pipeline(X_train_dtm, train, X_test_dtm, test, LogReg_pipeline, 'LogisticRegression/')\nresult = tag_level_predict(X_train_dtm, train, X_test_dtm, test, 'LogisticRegression/')\nmodel_evaluation_stats(result, \"LogisticRegression\")",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79edfa1f800222a66a565ec08d35aece2d41cdd | 68,115 | ipynb | Jupyter Notebook | randomization/thanks-recipient-study-2019/generate-wikipedia-thanks-recipient-randomizations-final-10.29.2019.R.ipynb | mitmedialab/CivilServant-Wikipedia-Analysis | 261513d66266b0a149bf79c165a1f0744d9829b8 | [
"MIT"
] | null | null | null | randomization/thanks-recipient-study-2019/generate-wikipedia-thanks-recipient-randomizations-final-10.29.2019.R.ipynb | mitmedialab/CivilServant-Wikipedia-Analysis | 261513d66266b0a149bf79c165a1f0744d9829b8 | [
"MIT"
] | 2 | 2021-02-02T22:48:32.000Z | 2021-03-20T04:11:33.000Z | randomization/thanks-recipient-study-2019/generate-wikipedia-thanks-recipient-randomizations-final-10.29.2019.R.ipynb | mitmedialab/CivilServant-Wikipedia-Analysis | 261513d66266b0a149bf79c165a1f0744d9829b8 | [
"MIT"
] | null | null | null | 41.057866 | 13,868 | 0.543038 | [
[
[
"# Wikipedia Thanks-Receiver Study Randomization \n[J. Nathan Matias](https://twitter.com/natematias)\nOctober 29, 2019\n\nThis code takes as input data described in the [randomization data format](https://docs.google.com/document/d/1plhoDbQryYQ32vZMXu8YmlLSp30QTdup43k6uTePOT4/edit?usp=drive_web&ouid=117701977297551627494) and produces randomizations for the Thanks Recipient study.\n\nNotes:\n* We use the 99% confidence interval cutoffs from our first sample rather than relative to each subsequent sample\n * Polish Experienced: 235.380736142341\n * Polish Newcomer: 72.2118047599678\n * Arabic Newcomer: 54.7365066602131\n * German Newcomer: 63.3678642498622\n* We will be drawing only 300 Polish accounts",
"_____no_output_____"
]
],
[
[
"options(\"scipen\"=9, \"digits\"=4)\nlibrary(ggplot2)\nlibrary(rlang)\nlibrary(tidyverse)\nlibrary(viridis)\nlibrary(blockTools)\nlibrary(blockrand)\nlibrary(gmodels) # contains CrossTable\nlibrary(DeclareDesign)\nlibrary(DescTools) # contains Freq\nlibrary(uuid)\noptions(repr.plot.width=7, repr.plot.height=3.5)\nsessionInfo()",
"_____no_output_____"
]
],
[
[
"# Load Input Dataframe",
"_____no_output_____"
]
],
[
[
"filename <- \"all-thankees-historical-20191029.csv\"\ndata.path <- \"/home/civilservant/Tresors/CivilServant/projects/wikipedia-integration/gratitude-study/Data Drills/thankee\"\nrecipient.df <- read.csv(file.path(data.path, \"historical_output\", filename))",
"_____no_output_____"
]
],
[
[
"### Load Participants in the Thanker Study",
"_____no_output_____"
]
],
[
[
"thanker.df <- read.csv(file.path(data.path, \"..\", \"thanker_hardlaunch\", \"randomization_output\",\n \"all-thanker-randomization-final-20190729.csv\"))\nusernames.to.exclude <- thanker.df$user_name",
"_____no_output_____"
]
],
[
[
"### Load Liaison Usernames",
"_____no_output_____"
]
],
[
[
"liaison.df <- read.csv(file.path(data.path, \"..\", \"thanker_hardlaunch\", \"randomization_output\",\n \"liason-thanker-randomization-datadrill-20190718.csv\"))",
"_____no_output_____"
],
[
"usernames.to.exclude <- append(as.character(usernames.to.exclude), as.character(liaison.df$user_name))\nprint(paste(length(usernames.to.exclude), \"usernames to exclude\"))",
"[1] \"462 usernames to exclude\"\n"
]
],
[
[
"### Adjust Column Names to Match Thankee Randomization Specification",
"_____no_output_____"
]
],
[
[
"recipient.df$prev_experience <- factor(as.integer(gsub(\"bin_\", \"\", recipient.df$prev_experience)))\nrecipient.df$anonymized_id <- sapply( seq_along(1:nrow(recipient.df)), UUIDgenerate )\nrecipient.df$newcomer <- recipient.df$prev_experience == 0 \nrecipient.df <- subset(recipient.df, lang!=\"en\")\n#recipient.df <- subset(recipient.df, user_editcount_quality >=4 )",
"_____no_output_____"
],
[
"hist(recipient.df$user_editcount_quality)",
"_____no_output_____"
]
],
[
[
"# Confirm the number of participants",
"_____no_output_____"
]
],
[
[
"print(\"Newcomer Participants to Randomize\")\nsummary(subset(recipient.df, newcomer == 1)$lang)",
"[1] \"Newcomer Participants to Randomize\"\n"
],
[
"## Polish Experienced Accounts\nprint(\"Experienced Participants to Randomize\")\nsummary(subset(recipient.df, newcomer == 0)$lang)",
"[1] \"Experienced Participants to Randomize\"\n"
]
],
[
[
"# Omit Participants",
"_____no_output_____"
],
[
"### Omit Participants in the Thanker Study",
"_____no_output_____"
]
],
[
[
"print(paste(nrow(recipient.df), \"participants before removing thankers\"))\nrecipient.df <- subset(recipient.df, (user_name %in% usernames.to.exclude)!=TRUE)\nprint(paste(nrow(recipient.df), \"participants after removing thankers\")) ",
"[1] \"3262 participants before removing thankers\"\n[1] \"3262 participants after removing thankers\"\n"
]
],
[
[
"### Subset values outside the 99% confidence intervals\nWe are using upper confidence intervals from the first randomization, found at [generate-wikipedia-thanks-recipient-randomizations-final-07.28.3019](generate-wikipedia-thanks-recipient-randomizations-final-07.28.3019.R.ipynb)\n * Polish Experienced: 235.380736142341\n * Polish Newcomer: 72.2118047599678\n * Arabic Newcomer: 54.7365066602131\n * German Newcomer: 63.3678642498622",
"_____no_output_____"
]
],
[
[
"upper.conf.ints <- data.frame(lang=c(\"pl\", \"pl\", \"de\", \"ar\"),\n newcomer=c(0,1,1,1),\n conf.int = c(\n 235.380736142341,\n 72.2118047599678,\n 54.7365066602131,\n 63.3678642498622\n ))",
"_____no_output_____"
],
[
"upper.conf.ints\n#subset(upper.conf.ints, lang==\"pl\" & newcomer ==1)$conf.int",
"_____no_output_____"
],
[
"## CREATE A PLACEHOLDER WITH ZERO ROWS\n## BEFORE ITERATING\nrecipient.trimmed.df <- recipient.df[0,]\n\nfor(l in c(\"ar\", \"de\", \"fa\", \"pl\")){\n print(paste(\"Language: \", l))\n for(n in c(0,1)){\n print(paste(\" newcomer:\", n == 1))\n lang.df <- subset(recipient.df, lang==l & newcomer == n)\n print(paste( \" \", nrow(lang.df), \"rows from original dataset\"))\n prev.conf.int <- subset(upper.conf.ints, lang==l & newcomer ==n)$conf.int\n\n print( \" 99% confidence intervals:\")\n print(paste(\" upper: \", prev.conf.int ,sep=\"\"))\n \n print(paste(\" Removing\", \n nrow(subset(lang.df,\n labor_hours_84_days_pre_sample > prev.conf.int)), \"outliers\",\n \"observations because labor_hours_84_days_pre_sample is an outlier.\"))\n lang.subset.df <- subset(lang.df, labor_hours_84_days_pre_sample <= prev.conf.int)\n print(paste( \" \", nrow(lang.subset.df), \"rows in trimmed dataset\"))\n recipient.trimmed.df <- rbind(recipient.trimmed.df, lang.subset.df)\n }\n}\n\nrecipient.df.penultimate <- recipient.trimmed.df",
"[1] \"Language: ar\"\n[1] \" newcomer: FALSE\"\n[1] \" 0 rows from original dataset\"\n[1] \" 99% confidence intervals:\"\n[1] \" upper: \"\n[1] \" Removing 0 outliers observations because labor_hours_84_days_pre_sample is an outlier.\"\n[1] \" 0 rows in trimmed dataset\"\n[1] \" newcomer: TRUE\"\n[1] \" 743 rows from original dataset\"\n[1] \" 99% confidence intervals:\"\n[1] \" upper: 63.3678642498622\"\n[1] \" Removing 8 outliers observations because labor_hours_84_days_pre_sample is an outlier.\"\n[1] \" 735 rows in trimmed dataset\"\n[1] \"Language: de\"\n[1] \" newcomer: FALSE\"\n[1] \" 0 rows from original dataset\"\n[1] \" 99% confidence intervals:\"\n[1] \" upper: \"\n[1] \" Removing 0 outliers observations because labor_hours_84_days_pre_sample is an outlier.\"\n[1] \" 0 rows in trimmed dataset\"\n[1] \" newcomer: TRUE\"\n[1] \" 1565 rows from original dataset\"\n[1] \" 99% confidence intervals:\"\n[1] \" upper: 54.7365066602131\"\n[1] \" Removing 37 outliers observations because labor_hours_84_days_pre_sample is an outlier.\"\n[1] \" 1528 rows in trimmed dataset\"\n[1] \"Language: fa\"\n[1] \" newcomer: FALSE\"\n[1] \" 0 rows from original dataset\"\n[1] \" 99% confidence intervals:\"\n[1] \" upper: \"\n[1] \" Removing 0 outliers observations because labor_hours_84_days_pre_sample is an outlier.\"\n[1] \" 0 rows in trimmed dataset\"\n[1] \" newcomer: TRUE\"\n[1] \" 0 rows from original dataset\"\n[1] \" 99% confidence intervals:\"\n[1] \" upper: \"\n[1] \" Removing 0 outliers observations because labor_hours_84_days_pre_sample is an outlier.\"\n[1] \" 0 rows in trimmed dataset\"\n[1] \"Language: pl\"\n[1] \" newcomer: FALSE\"\n[1] \" 512 rows from original dataset\"\n[1] \" 99% confidence intervals:\"\n[1] \" upper: 235.380736142341\"\n[1] \" Removing 1 outliers observations because labor_hours_84_days_pre_sample is an outlier.\"\n[1] \" 511 rows in trimmed dataset\"\n[1] \" newcomer: TRUE\"\n[1] \" 442 rows from original dataset\"\n[1] \" 99% confidence intervals:\"\n[1] \" upper: 72.2118047599678\"\n[1] \" Removing 7 outliers observations because labor_hours_84_days_pre_sample is an outlier.\"\n[1] \" 435 rows in trimmed dataset\"\n"
]
],
[
[
"# Review and Generate Variables",
"_____no_output_____"
]
],
[
[
"print(aggregate(recipient.df.penultimate[c(\"labor_hours_84_days_pre_sample\")],\n FUN=mean, by = list(recipient.df.penultimate$prev_experience)))",
" Group.1 labor_hours_84_days_pre_sample\n1 0 5.878\n2 90 4.519\n3 180 8.063\n4 365 5.797\n5 730 5.354\n6 1460 5.866\n7 2920 8.791\n"
],
[
"print(CrossTable(recipient.df.penultimate$has_email, recipient.df.penultimate$newcomer, \n prop.r = FALSE, prop.c=TRUE, prop.t = FALSE, prop.chisq = FALSE))",
"\n \n Cell Contents\n|-------------------------|\n| N |\n| N / Col Total |\n|-------------------------|\n\n \nTotal Observations in Table: 3209 \n\n \n | recipient.df.penultimate$newcomer \nrecipient.df.penultimate$has_email | FALSE | TRUE | Row Total | \n-----------------------------------|-----------|-----------|-----------|\n False | 20 | 18 | 38 | \n | 0.039 | 0.007 | | \n-----------------------------------|-----------|-----------|-----------|\n True | 491 | 2680 | 3171 | \n | 0.961 | 0.993 | | \n-----------------------------------|-----------|-----------|-----------|\n Column Total | 511 | 2698 | 3209 | \n | 0.159 | 0.841 | | \n-----------------------------------|-----------|-----------|-----------|\n\n \n$t\n y\nx FALSE TRUE\n False 20 18\n True 491 2680\n\n$prop.row\n y\nx FALSE TRUE\n False 0.5263 0.4737\n True 0.1548 0.8452\n\n$prop.col\n y\nx FALSE TRUE\n False 0.039139 0.006672\n True 0.960861 0.993328\n\n$prop.tbl\n y\nx FALSE TRUE\n False 0.006232 0.005609\n True 0.153007 0.835151\n\n"
],
[
"## Update the has_email field\n## recipient.df.penultimate$has_email <- recipient.df.penultimate$has_email == \"True\"\n\n## PREVIOUS EXPERIENCE\nprint(\"prev_experience\")\nprint(summary(factor(recipient.df.penultimate$prev_experience)))\ncat(\"\\n\")\n\n## SHOW LABOR HOURS BY EXPERIENCE GROUP:\nprint(\"Aggregate labor_hours_84_days_pre_sample\")\nprint(aggregate(recipient.df.penultimate[c(\"labor_hours_84_days_pre_sample\")],\n FUN=mean, by = list(recipient.df.penultimate$prev_experience)))\ncat(\"\\n\")\n\nprint(\"NEWCOMERS AND EMAILS\")\nprint(\"--------------------\")\nprint(CrossTable(recipient.df.penultimate$has_email, recipient.df.penultimate$newcomer, \n prop.r = FALSE, prop.c=TRUE, prop.t = FALSE, prop.chisq = FALSE))\n\n# VARIABLE: num_prev_thanks_pre_treatment\nprint(\"num_prev_thanks_pre_sample\")\nprint(summary(recipient.df.penultimate$num_prev_thanks_pre_sample))\ncat(\"\\n\")\n \n## SHOW PREVIOUS THANKS BY EXPERIENCE GROUP:\nprint(\"num_prev_thanks_pre_sample by prev_experience\")\nprint(aggregate(recipient.df.penultimate[c(\"num_prev_thanks_pre_sample\")],\n FUN=mean, by = list(recipient.df.penultimate$prev_experience)))\ncat(\"\\n\")",
"[1] \"prev_experience\"\n 0 90 180 365 730 1460 2920 \n2698 63 52 69 81 102 144 \n\n[1] \"Aggregate labor_hours_84_days_pre_sample\"\n Group.1 labor_hours_84_days_pre_sample\n1 0 5.878\n2 90 4.519\n3 180 8.063\n4 365 5.797\n5 730 5.354\n6 1460 5.866\n7 2920 8.791\n\n[1] \"NEWCOMERS AND EMAILS\"\n[1] \"--------------------\"\n\n \n Cell Contents\n|-------------------------|\n| N |\n| N / Col Total |\n|-------------------------|\n\n \nTotal Observations in Table: 3209 \n\n \n | recipient.df.penultimate$newcomer \nrecipient.df.penultimate$has_email | FALSE | TRUE | Row Total | \n-----------------------------------|-----------|-----------|-----------|\n False | 20 | 18 | 38 | \n | 0.039 | 0.007 | | \n-----------------------------------|-----------|-----------|-----------|\n True | 491 | 2680 | 3171 | \n | 0.961 | 0.993 | | \n-----------------------------------|-----------|-----------|-----------|\n Column Total | 511 | 2698 | 3209 | \n | 0.159 | 0.841 | | \n-----------------------------------|-----------|-----------|-----------|\n\n \n$t\n y\nx FALSE TRUE\n False 20 18\n True 491 2680\n\n$prop.row\n y\nx FALSE TRUE\n False 0.5263 0.4737\n True 0.1548 0.8452\n\n$prop.col\n y\nx FALSE TRUE\n False 0.039139 0.006672\n True 0.960861 0.993328\n\n$prop.tbl\n y\nx FALSE TRUE\n False 0.006232 0.005609\n True 0.153007 0.835151\n\n[1] \"num_prev_thanks_pre_sample\"\n Min. 1st Qu. Median Mean 3rd Qu. Max. \n 0.00 0.00 0.00 0.34 0.00 112.00 \n\n[1] \"num_prev_thanks_pre_sample by prev_experience\"\n Group.1 num_prev_thanks_pre_sample\n1 0 0.1542\n2 90 0.1111\n3 180 0.1731\n4 365 0.2899\n5 730 0.9630\n6 1460 0.6176\n7 2920 3.4097\n\n"
]
],
[
[
"# Subset Sample to Planned sample sizes\nSample sizes are reported in the experiment [Decisions Document](https://docs.google.com/document/d/1HryhsmWI6WthXQC7zv9Hz1a9DhpZ3FxVRLjTONuMg4I/edit)\n\n* Arabic newcomers (1750 goal) (hoping for as many as possible in first sample)\n * hoping for 1350 in the first sample and 400 later\n* German newcomers (3000 goal) (hoping for as many as possible in first sample)\n * hoping for 1600 in first sample and 1400 later\n* Persian Experienced (2400 goal)\n* Polish:\n * Newcomers: (800 goal)\n * Experienced: (2400 goal)",
"_____no_output_____"
]
],
[
[
"## Seed generated by Brooklyn Integers\n# https://www.brooklynintegers.com/int/1495265601/\nset.seed(1495265601)",
"_____no_output_____"
],
[
"print(\"Newcomers\")\nsummary(subset(recipient.df.penultimate, newcomer==1)$lang)",
"[1] \"Newcomers\"\n"
],
[
"print(\"Experienced\")\nsummary(subset(recipient.df.penultimate, newcomer==0)$lang)",
"[1] \"Experienced\"\n"
],
[
"## CREATE THE FINAL PARTICIPANT SAMPLE BEFORE RANDOMIZATION\nrecipient.df.final <- recipient.df.penultimate",
"_____no_output_____"
]
],
[
[
"# Generate Randomization Blocks",
"_____no_output_____"
]
],
[
[
"recipient.df.final$lang_prev_experience <- factor(paste(recipient.df.final$lang, recipient.df.final$prev_experience))\ncolnames(recipient.df.final)",
"_____no_output_____"
],
[
"## BLOCKING VARIABLES\nbv = c(\"labor_hours_84_days_pre_sample\", \"num_prev_thanks_pre_sample\")\n\nblock.size = 2\n\n## TODO: CHECK TO SEE IF I CAN DO BALANCED RANDOMIZATION\n## WITHIN BLOCKS LARGER THAN 2\nblockobj = block(data=recipient.df.final,\n n.tr = block.size,\n groups = \"lang_prev_experience\",\n id.vars=\"anonymized_id\",\n block.vars = bv,\n distance =\"mahalanobis\"\n )\n## CHECK DISTANCES\n#print(blockobj)\nrecipient.df.final$randomization_block_id <- createBlockIDs(blockobj,\n data=recipient.df.final,\n id.var = \"anonymized_id\")\nrecipient.df.final$randomization_block_size = block.size",
"_____no_output_____"
]
],
[
[
"### Identify Incomplete Blocks and Remove Participants in Incomplete Blocks From the Experiment",
"_____no_output_____"
]
],
[
[
"block.sizes <- aggregate(recipient.df.final$randomization_block_id, FUN=length, by=list(recipient.df.final$randomization_block_id))\nincomplete.blocks <- subset(block.sizes, x == 1)$Group.1\nincomplete.blocks",
"_____no_output_____"
],
[
"nrow(subset(recipient.df.final, randomization_block_id %in% incomplete.blocks))",
"_____no_output_____"
],
[
"removed.observations <- subset(recipient.df.final, (\n randomization_block_id %in% incomplete.blocks)==TRUE)\n\nrecipient.df.final <- \n subset(recipient.df.final, (\n randomization_block_id %in% incomplete.blocks)!=TRUE)\n\nprint(paste(\"Removed\", nrow(removed.observations), \"units placed in incomplete blocks.\"))",
"[1] \"Removed 5 units placed in incomplete blocks.\"\n"
]
],
[
[
"# Generate Randomizations",
"_____no_output_____"
]
],
[
[
"assignments <- block_ra(blocks=recipient.df.final$randomization_block_id, \n num_arms = 2, conditions = c(0,1))\nrecipient.df.final$randomization_arm <- assignments ",
"_____no_output_____"
]
],
[
[
"### Check Balance",
"_____no_output_____"
]
],
[
[
"print(\"Aggregating labor hours by treatment\")\nprint(aggregate(recipient.df.final[c(\"labor_hours_84_days_pre_sample\")],\n FUN=mean, by = list(recipient.df.final$randomization_arm)))\n\nprint(\"CrossTable of lang by treatment\")\nCrossTable(recipient.df.final$lang, recipient.df.final$randomization_arm, \n prop.r = TRUE, prop.c=FALSE, prop.t = FALSE, prop.chisq = FALSE)\n\nprint(\"CrossTable of lang_prev_experience by treatment\")\nCrossTable(recipient.df.final$lang_prev_experience, recipient.df.final$randomization_arm, \n prop.r = TRUE, prop.c=FALSE, prop.t = FALSE, prop.chisq = FALSE)\n",
"[1] \"Aggregating labor hours by treatment\"\n Group.1 labor_hours_84_days_pre_sample\n1 0 6.008\n2 1 5.944\n[1] \"CrossTable of lang by treatment\"\n\n \n Cell Contents\n|-------------------------|\n| N |\n| N / Row Total |\n|-------------------------|\n\n \nTotal Observations in Table: 3204 \n\n \n | recipient.df.final$randomization_arm \nrecipient.df.final$lang | 0 | 1 | Row Total | \n------------------------|-----------|-----------|-----------|\n ar | 367 | 367 | 734 | \n | 0.500 | 0.500 | 0.229 | \n------------------------|-----------|-----------|-----------|\n de | 764 | 764 | 1528 | \n | 0.500 | 0.500 | 0.477 | \n------------------------|-----------|-----------|-----------|\n pl | 471 | 471 | 942 | \n | 0.500 | 0.500 | 0.294 | \n------------------------|-----------|-----------|-----------|\n Column Total | 1602 | 1602 | 3204 | \n------------------------|-----------|-----------|-----------|\n\n \n[1] \"CrossTable of lang_prev_experience by treatment\"\n\n \n Cell Contents\n|-------------------------|\n| N |\n| N / Row Total |\n|-------------------------|\n\n \nTotal Observations in Table: 3204 \n\n \n | recipient.df.final$randomization_arm \nrecipient.df.final$lang_prev_experience | 0 | 1 | Row Total | \n----------------------------------------|-----------|-----------|-----------|\n ar 0 | 367 | 367 | 734 | \n | 0.500 | 0.500 | 0.229 | \n----------------------------------------|-----------|-----------|-----------|\n de 0 | 764 | 764 | 1528 | \n | 0.500 | 0.500 | 0.477 | \n----------------------------------------|-----------|-----------|-----------|\n pl 0 | 217 | 217 | 434 | \n | 0.500 | 0.500 | 0.135 | \n----------------------------------------|-----------|-----------|-----------|\n pl 1460 | 51 | 51 | 102 | \n | 0.500 | 0.500 | 0.032 | \n----------------------------------------|-----------|-----------|-----------|\n pl 180 | 26 | 26 | 52 | \n | 0.500 | 0.500 | 0.016 | \n----------------------------------------|-----------|-----------|-----------|\n pl 2920 | 72 | 72 | 144 | \n | 0.500 | 0.500 | 0.045 | \n----------------------------------------|-----------|-----------|-----------|\n pl 365 | 34 | 34 | 68 | \n | 0.500 | 0.500 | 0.021 | \n----------------------------------------|-----------|-----------|-----------|\n pl 730 | 40 | 40 | 80 | \n | 0.500 | 0.500 | 0.025 | \n----------------------------------------|-----------|-----------|-----------|\n pl 90 | 31 | 31 | 62 | \n | 0.500 | 0.500 | 0.019 | \n----------------------------------------|-----------|-----------|-----------|\n Column Total | 1602 | 1602 | 3204 | \n----------------------------------------|-----------|-----------|-----------|\n\n \n"
]
],
[
[
"# Subset Polish Experienced Accounts\nWithin Polish, identify 300 accounts (150 blocks) to include and drop all of the others.\n\nNote: since the previous randomization included a larger number of more experienced accounts, we're prioritizing accounts from experience groups 90, 180, 365, 730, and 1460 (all except 2920). ",
"_____no_output_____"
]
],
[
[
"## SHOW PREVIOUS THANKS BY EXPERIENCE GROUP:\nrecipient.df.final$count.var <- 1\nprint(\"Number of Accounts for each experience level among Polish Participants\")\nprint(aggregate(subset(recipient.df.final, lang=\"pl\")[c(\"count.var\")],\n FUN=sum, by = list(subset(recipient.df.final, lang=\"pl\")$prev_experience)))\ncat(\"\\n\")",
"[1] \"Number of Accounts for each experience level among Polish Participants\"\n Group.1 count.var\n1 0 2696\n2 90 62\n3 180 52\n4 365 68\n5 730 80\n6 1460 102\n7 2920 144\n\n"
],
[
"print(paste(\"Total number of rows: \", nrow(recipient.df.final), sep=\"\"))\n\nrecipient.df.final.a <- subset(recipient.df.final, !(lang==\"pl\" & prev_experience==2920))\nrecipient.df.final.a$initial.block.id <- recipient.df.final.a$randomization_block_id\nprint(paste(\"Total number of rows once we subset Polish:\", nrow(recipient.df.final.a)))",
"[1] \"Total number of rows: 3204\"\n[1] \"Total number of rows once we subset Polish: 3060\"\n"
]
],
[
[
"### Offset block IDs to be unique\nObserve the block IDs from the previous randomizations and ensure that these ones are unique and larger.",
"_____no_output_____"
]
],
[
[
"## LOAD PREVIOUS RANDOMIZATIONS\nprev_randomization_filename <- \"thanks-recipient-randomizations-20190729.csv\"\nprev.randomization.df <- read.csv(file.path(data.path, \"randomization_output\", prev_randomization_filename))\nprint(paste(\"Max Block ID: \", max(prev.randomization.df$randomization_block_id)))\nprev.max.block.id <- max(prev.randomization.df$randomization_block_id)\nprev.max.block.id <- 4221",
"[1] \"Max Block ID: 3707\"\n"
],
[
"recipient.df.final.a$randomization_block_id <- recipient.df.final.a$initial.block.id + prev.max.block.id\nsummary(recipient.df.final.a$randomization_block_id)",
"_____no_output_____"
]
],
[
[
"### Sort by block ID",
"_____no_output_____"
]
],
[
[
"recipient.df.final.a <- recipient.df.final.a[order(recipient.df.final.a$randomization_block_id),]",
"_____no_output_____"
],
[
"print(\"Newcomers\")\nsummary(subset(recipient.df.final.a, newcomer==1)$lang)\nprint(\"Experienced\")\nsummary(subset(recipient.df.final.a, newcomer==0)$lang)",
"[1] \"Newcomers\"\n"
]
],
[
[
"# Output and Archive Randomizations",
"_____no_output_____"
]
],
[
[
"randomization.filename <- paste(\"thanks-recipient-randomizations-\", format(Sys.Date(), format=\"%Y%m%d\"), \".csv\", sep=\"\") \nwrite.csv(recipient.df.final.a, file = file.path(data.path, \"randomization_output\", randomization.filename))",
"_____no_output_____"
],
[
"colnames(recipient.df.final.a)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
e79ee427e6472bee1a7447acf5d7c4d4db1c76aa | 16,956 | ipynb | Jupyter Notebook | tbio/marriages.ipynb | VincentCheng34/StudyOnPython | b3f905b2e77f6ccfdce675bb2596e6ac708859a7 | [
"MIT"
] | 1 | 2019-05-01T06:29:14.000Z | 2019-05-01T06:29:14.000Z | tbio/marriages.ipynb | VincentCheng34/StudyOnPython | b3f905b2e77f6ccfdce675bb2596e6ac708859a7 | [
"MIT"
] | null | null | null | tbio/marriages.ipynb | VincentCheng34/StudyOnPython | b3f905b2e77f6ccfdce675bb2596e6ac708859a7 | [
"MIT"
] | null | null | null | 25.926606 | 109 | 0.422564 | [
[
[
"# SELECT DISTINCT ?manVal ?wifeVal ?womanVal ?husbandVal WHERE {\n# {\n# ?man a tbio:Person .\n# ?man tbio:hasWife ?wife .\n# }\n# UNION\n# {\n# ?woman a tbio:Person .\n# ?woman tbio:hasHusband ?husband .\n# }\n# BIND(STR(?man) AS ?manStr) .\n# BIND(REPLACE(?manStr, \"http://tbio.orient.cas.cz#\", \"\") AS ?manVal) .\n# BIND(STR(?wife) AS ?wifeStr) .\n# BIND(REPLACE(?wifeStr, \"http://tbio.orient.cas.cz#\", \"\") AS ?wifeVal) .\n# BIND(STR(?woman) AS ?womanStr) .\n# BIND(REPLACE(?womanStr, \"http://tbio.orient.cas.cz#\", \"\") AS ?womanVal) .\n# BIND(STR(?husband) AS ?husbandStr) .\n# BIND(REPLACE(?husbandStr, \"http://tbio.orient.cas.cz#\", \"\") AS ?husbandVal) .\n# }",
"_____no_output_____"
],
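[
"# Hedged sketch of how the commented query above could be executed with SPARQLWrapper.\n# Assumptions: the endpoint URL below is hypothetical, and the query is trimmed to the hasWife half;\n# substitute the real TBIO endpoint before running.\nfrom SPARQLWrapper import SPARQLWrapper, JSON\n\nquery = '''\nPREFIX tbio: <http://tbio.orient.cas.cz#>\nSELECT DISTINCT ?man ?wife WHERE {\n    ?man a tbio:Person .\n    ?man tbio:hasWife ?wife .\n}\n'''\nsparql = SPARQLWrapper('http://example.org/sparql') # hypothetical endpoint\nsparql.setQuery(query)\nsparql.setReturnFormat(JSON)\n# results = sparql.query().convert() # uncomment once a real endpoint is configured",
"_____no_output_____"
],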
[
"import pandas as pd",
"_____no_output_____"
]
],
[
[
"# Get marriages",
"_____no_output_____"
]
],
[
[
"filepath = '/Volumes/backup_128G/z_repository/TBIO_data/RequestsFromTana/20190515'\n\nfilename = 'marriages.tsv'\n\nread_filename = '{0}/{1}'.format(filepath, filename)",
"_____no_output_____"
],
[
"marriageDf = pd.read_csv(read_filename, delimiter='\\t')\nmarriageDf.fillna('', inplace=True)\nmarriageDf.shape, marriageDf.head()",
"_____no_output_____"
]
],
[
[
"# To unique spouses",
"_____no_output_____"
]
],
[
[
"startId = 10000\npersonIds = {}\nmarriageDic = {}\nfor idx in range(0, len(marriageDf)):\n row = marriageDf.loc[idx]\n\n man = str(row['?manVal'])\n wife = str(row['?wifeVal'])\n woman = str(row['?womanVal'])\n husband = str(row['?husbandVal'])\n \n if man != '' and man not in personIds:\n personIds[man] = startId\n startId += 1\n if wife != '' and wife not in personIds:\n personIds[wife] = startId\n startId += 1\n if woman != '' and woman not in personIds:\n personIds[woman] = startId\n startId += 1\n if husband != '' and husband not in personIds:\n personIds[husband] = startId\n startId += 1\n \n if man not in marriageDic:\n marriageDic[man] = [wife]\n elif wife not in marriageDic[man]:\n marriageDic[man].append(wife)\n# else:\n# print(\"man WRONG:\", man, wife)\n\n if wife not in marriageDic:\n marriageDic[wife] = [man]\n elif man not in marriageDic[wife]:\n marriageDic[wife].append(man)\n# else:\n# print(\"wife WRONG:\", wife, man)\n \n if woman not in marriageDic:\n marriageDic[woman] = [husband]\n elif husband not in marriageDic[woman]:\n marriageDic[woman].append(husband)\n# else:\n# print(\"woman WRONG:\", woman, husband)\n \n if husband not in marriageDic:\n marriageDic[husband] = [woman]\n elif woman not in marriageDic[husband]:\n marriageDic[husband].append(woman)\n# else:\n# print(\"husband WRONG:\", husband, woman)\n \n# marriageDic",
"_____no_output_____"
],
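[
"# Design note with a tiny self-contained sketch (the toy couples are hypothetical): the loop above stores\n# each marriage under both spouses; an alternative is to key undirected pairs by frozenset, which\n# collapses A-B and B-A into a single edge automatically.\ntoy_pairs = [('Anna', 'Boris'), ('Boris', 'Anna'), ('Carl', 'Dana')]\nunique_marriages = {frozenset(p) for p in toy_pairs}\nprint(len(unique_marriages)) # 2 -- the duplicate direction collapses",
"_____no_output_____"
],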
[
"len(personIds)",
"_____no_output_____"
],
[
"personDf = pd.DataFrame(personIds, index=['ID']).T\npersonDf.head()",
"_____no_output_____"
],
[
"write_nodes_to = '{0}/{1}'.format(filepath, 'nodes_person_20190516_v2.xlsx')\npersonDf.to_excel(write_nodes_to)",
"_____no_output_____"
]
],
[
[
"# Read person-family map table",
"_____no_output_____"
]
],
[
[
"familymembers = 'Familymembers.xlsx'\n\nread_familymembers = '{0}/{1}'.format(filepath, familymembers)",
"_____no_output_____"
],
[
"fmDf = pd.read_excel(read_familymembers)\nfmDf.shape, fmDf.head()",
"_____no_output_____"
],
[
"startId = 20000\nfamilyIds = {}\nfmDic = {}\nfor idx in range(0, len(fmDf)):\n row = fmDf.loc[idx]\n\n person = str(row['personStr'])\n family = str(row['familyStr'])\n \n if family not in familyIds:\n familyIds[family] = startId\n startId += 1\n \n if person not in fmDic:\n fmDic[person] = family\n elif family != fmDic[person]:\n print(\"Dup:\", person, family, fmDic[person])\n\n# fmDic",
"_____no_output_____"
],
[
"len(familyIds)",
"_____no_output_____"
],
[
"familyDf = pd.DataFrame(familyIds, index=['ID']).T\nfamilyDf.head()",
"_____no_output_____"
],
[
"write_nodes_to = '{0}/{1}'.format(filepath, 'nodes_family_20190516_v2.xlsx')\nfamilyDf.to_excel(write_nodes_to)",
"_____no_output_____"
]
],
[
[
"# results",
"_____no_output_____"
],
[
"### `Source (Family)` | `Target(Family)`| `Type(Undirected)` | `Person/Source` | `Person/target`",
"_____no_output_____"
]
],
[
[
"def getFamilyName(INperson):\n if INperson not in fmDic:\n# print(INperson, \" Not found!\")\n return ''\n return fmDic[INperson]\n\ndef getPersonId(INperson):\n if INperson not in personIds:\n return 0\n return personIds[INperson]\n\ndef getFamilyId(INfamily):\n if INfamily not in familyIds:\n return 0\n return familyIds[INfamily]\n\nresList = []\nfor sPerson in marriageDic:\n spouses = marriageDic[sPerson]\n for tPerson in spouses:\n sFamily = getFamilyName(sPerson)\n tFamily = getFamilyName(tPerson)\n \n if sFamily == '' or tFamily == '':\n# print(fPerson, fFamily, sPerson, sFamily)\n continue\n\n sPersonId = getPersonId(sPerson)\n tPersonId = getPersonId(tPerson)\n sFamilyId = getFamilyId(sFamily)\n tFamilyId = getFamilyId(tFamily)\n resList.append([sFamilyId, tFamilyId, 'Undirected', sPersonId, tPersonId])\n\nprint(len(resList))",
"2573\n"
],
[
"resDf = pd.DataFrame(resList, columns=['SourceFamily', 'TargetFamily', 'Type', \n 'SourcePerson', 'TargetPerson'])\nresDf.drop_duplicates(keep='first', inplace=True)\nresDf.sort_values(by=['SourceFamily', 'TargetFamily', 'SourcePerson', 'TargetPerson'], inplace=True)\nresDf.head()",
"_____no_output_____"
],
[
"print(len(resDf))",
"2573\n"
],
[
"write_file_to = '{0}/{1}'.format(filepath, 'marriages_20190516_v2.xlsx')\nresDf.to_excel(write_file_to, index=False)",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
e79eecc282ba07fcd085c8125304fba081f76cc6 | 14,300 | ipynb | Jupyter Notebook | docs/example_single.ipynb | MaxGhenis/Tax-Cruncher | c06fd06ec4d17cc2795f97a6ff088fcd220d3498 | [
"MIT"
] | 1 | 2019-10-15T04:07:56.000Z | 2019-10-15T04:07:56.000Z | docs/example_single.ipynb | MaxGhenis/Tax-Cruncher | c06fd06ec4d17cc2795f97a6ff088fcd220d3498 | [
"MIT"
] | null | null | null | docs/example_single.ipynb | MaxGhenis/Tax-Cruncher | c06fd06ec4d17cc2795f97a6ff088fcd220d3498 | [
"MIT"
] | 1 | 2020-01-03T02:39:54.000Z | 2020-01-03T02:39:54.000Z | 29.065041 | 70 | 0.340839 | [
[
[
"from taxcrunch.cruncher import Cruncher",
"_____no_output_____"
],
[
"path = '../docs/example_adjustment.json'\nc = Cruncher(path)",
"CTC_c was redefined in release 1.0.0\n\n"
],
[
"# basic outputs\nc.basic_table()",
"_____no_output_____"
],
[
"# marginal tax rates\nc.mtr_table()",
"_____no_output_____"
],
[
"# detailed outputs with marginal tax rate analysis\nc.calc_table()",
"_____no_output_____"
],
[
"#detailed outputs with difference between reform and baseline\nc.calc_diff_table()",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79eee115af640f9714ca2d71bdf7b1db03c97b6 | 30,140 | ipynb | Jupyter Notebook | challenge.ipynb | aindrilachatterjee12345/Movies-ETL-Challenge8 | c0e400bdc8e1552cf637e1359702ddf54fd7e8f0 | [
"MIT"
] | null | null | null | challenge.ipynb | aindrilachatterjee12345/Movies-ETL-Challenge8 | c0e400bdc8e1552cf637e1359702ddf54fd7e8f0 | [
"MIT"
] | null | null | null | challenge.ipynb | aindrilachatterjee12345/Movies-ETL-Challenge8 | c0e400bdc8e1552cf637e1359702ddf54fd7e8f0 | [
"MIT"
] | null | null | null | 55.814815 | 192 | 0.426244 | [
[
[
"import json\nimport pandas as pd\nimport numpy as np\nimport time\n\nimport re\n\nfrom sqlalchemy import create_engine\nimport psycopg2\n\n\n",
"_____no_output_____"
],
[
" def clean_movie(movie):\n movie = dict(movie) #create a non-destructive copy\n alt_titles = {}\n #combine alternate title into one list\n for key in ['Also known as','Arabic','Cantonese','Chinese','French',\n 'Hangul','Hebrew','Hepburn','Japanese','Literally',\n 'Mandarin','McCune–Reischauer','Original title','Polish',\n 'Revised Romanization','Romanized','Russian',\n 'Simplified','Traditional','Yiddish']:\n if key in movie:\n alt_titles[key] = movie[key]\n movie.pop(key)\n if len(alt_titles) > 0:\n movie['alt_titles'] = alt_titles\n def change_column_name(old_name, new_name):\n if old_name in movie:\n movie[new_name] = movie.pop(old_name)\n \n change_column_name('Adaptation by', 'Writer(s)')\n change_column_name('Country of origin', 'Country')\n change_column_name('Directed by', 'Director')\n change_column_name('Distributed by', 'Distributor')\n change_column_name('Edited by', 'Editor(s)')\n change_column_name('Length', 'Running time')\n change_column_name('Original release', 'Release date')\n change_column_name('Music by', 'Composer(s)')\n change_column_name('Produced by', 'Producer(s)')\n change_column_name('Producer', 'Producer(s)')\n change_column_name('Productioncompanies ', 'Production company(s)')\n change_column_name('Productioncompany ', 'Production company(s)')\n change_column_name('Released', 'Release Date')\n change_column_name('Release Date', 'Release date')\n change_column_name('Screen story by', 'Writer(s)')\n change_column_name('Screenplay by', 'Writer(s)')\n change_column_name('Story by', 'Writer(s)')\n change_column_name('Theme music composer', 'Composer(s)')\n change_column_name('Written by', 'Writer(s)')\n \n return movie",
"_____no_output_____"
],
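A quick sanity check of `clean_movie()` on a toy record (the values below are made up, not from the Wikipedia dump) shows both behaviors at once: alternate-language titles are collected under `alt_titles`, and column aliases are renamed to their canonical names.

```python
# Toy record; 'Hepburn' is one of the alternate-title keys, the others are aliases.
movie = {"Directed by": "Jane Doe", "Hepburn": "Some Title", "Released": "1999"}
print(clean_movie(movie))
# 'Hepburn' moves into alt_titles; 'Directed by' -> 'Director';
# 'Released' -> 'Release Date' -> 'Release date' (the renames run in order)
```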
[
"def extract_transform_load(wiki_file, kaggle_file, ratings_file):\n \n with open(wiki_file, mode='r') as file:\n wiki_movies_raw = json.load(file)\n \n kaggle_metadata = pd.read_csv(kaggle_file)\n ratings = pd.read_csv(ratings_file)\n\n wiki_movies = [movie for movie in wiki_movies_raw\n if ('Director' in movie or 'Directed by' in movie)\n and 'imdb_link' in movie]\n \n clean_movies = [clean_movie(movie) for movie in wiki_movies]\n wiki_movies_df = pd.DataFrame(clean_movies)\n\n print(wiki_movies_df)\n print(kaggle_metadata)\n print(ratings)\n#Assuming wikipedia data still contains IMDB id\n try:\n wiki_movies_df['imdb_id'] = wiki_movies_df['imdb_link'].str.extract(r'(tt\\d{7})')\n wiki_movies_df.drop_duplicates(subset='imdb_id', inplace=True)\n except Exception as e:\n print(e)\n \n wiki_columns_to_keep = [column for column in wiki_movies_df.columns if wiki_movies_df[column].isnull().sum() < len(wiki_movies_df) * 0.9]\n wiki_movies_df = wiki_movies_df[wiki_columns_to_keep] \n box_office = wiki_movies_df['Box office'].dropna() \n box_office = box_office.apply(lambda x: ' '.join(x) if type(x) == list else x)\n \n form_one = r'\\$\\d+\\.?\\d*\\s*[mb]illion'\n form_two = r'\\$\\d{1,3}(?:,\\d{3})+'\n ",
"_____no_output_____"
],
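For reference, here is what the two box-office patterns defined inside the function are meant to capture; `form_one` and `form_two` are redefined standalone below purely for illustration.

```python
import re

form_one = r'\$\d+\.?\d*\s*[mb]illion'   # e.g. "$123.4 million" / "$1 billion"
form_two = r'\$\d{1,3}(?:,\d{3})+'       # e.g. "$123,456,789"

for s in ["$123.4 million", "$1 billion", "$123,456,789", "123 million"]:
    print(s, "->", bool(re.match(f'{form_one}|{form_two}', s, flags=re.IGNORECASE)))
```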
[
" def parse_dollars(s):\n # if s is not a string, return NaN\n if type(s) != str:\n return np.nan\n\n # if input is of the form $###.# million\n if re.match(r'\\$\\s*\\d+\\.?\\d*\\s*milli?on', s, flags=re.IGNORECASE):\n\n # remove dollar sign and \" million\"\n s = re.sub('\\$|\\s|[a-zA-Z]','', s)\n\n # convert to float and multiply by a million\n value = float(s) * 10**6\n\n # return value\n return value\n\n # if input is of the form $###.# billion\n elif re.match(r'\\$\\s*\\d+\\.?\\d*\\s*billi?on', s, flags=re.IGNORECASE):\n\n # remove dollar sign and \" billion\"\n s = re.sub('\\$|\\s|[a-zA-Z]','', s)\n\n # convert to float and multiply by a billion\n value = float(s) * 10**9\n\n # return value\n return value\n\n # if input is of the form $###,###,###\n elif re.match(r'\\$\\s*\\d{1,3}(?:[,\\.]\\d{3})+(?!\\s[mb]illion)', s, flags=re.IGNORECASE):\n\n # remove dollar sign and commas\n s = re.sub('\\$|,','', s)\n\n # convert to float\n value = float(s)\n\n # return value\n return value\n\n # otherwise, return NaN\n else:\n return np.nan\n wiki_movies_df['box_office'] = box_office.str.extract(f'({form_one}|{form_two})', flags=re.IGNORECASE)[0].apply(parse_dollars)\n wiki_movies_df.drop('Box office', axis=1, inplace=True)\n \n budget = wiki_movies_df['Budget'].dropna()\n budget = budget.map(lambda x: ' '.join(x) if type(x) == list else x)\n budget = budget.str.replace(r'\\$.*[-—–](?![a-z])', '$', regex=True)\n wiki_movies_df['budget'] = budget.str.extract(f'({form_one}|{form_two})', flags=re.IGNORECASE)[0].apply(parse_dollars)\n \n release_date = wiki_movies_df['Release date'].dropna().apply(lambda x: ' '.join(x) if type(x) == list else x)\n date_form_one = r'(?:January|February|March|April|May|June|July|August|September|October|November|December)\\s[123]\\d,\\s\\d{4}'\n date_form_two = r'\\d{4}.[01]\\d.[123]\\d'\n date_form_three = r'(?:January|February|March|April|May|June|July|August|September|October|November|December)\\s\\d{4}'\n date_form_four = r'\\d{4}'\n wiki_movies_df['release_date'] = pd.to_datetime(release_date.str.extract(f'({date_form_one}|{date_form_two}|{date_form_three}|{date_form_four})')[0], infer_datetime_format=True)\n \n running_time = wiki_movies_df['Running time'].dropna().apply(lambda x: ' '.join(x) if type(x) == list else x)\n running_time_extract = running_time.str.extract(r'(\\d+)\\s*ho?u?r?s?\\s*(\\d*)|(\\d+)\\s*m')\n running_time_extract = running_time_extract.apply(lambda col: pd.to_numeric(col, errors='coerce')).fillna(0)\n wiki_movies_df['running_time'] = running_time_extract.apply(lambda row: row[0]*60 + row[1] if row[2] == 0 else row[2], axis=1)\n wiki_movies_df.drop('Running time', axis=1, inplace=True)\n\n kaggle_metadata = kaggle_metadata[kaggle_metadata['adult'] == 'False'].drop('adult',axis='columns')\n kaggle_metadata['video'] = kaggle_metadata['video'] == 'True'\n kaggle_metadata['budget'] = kaggle_metadata['budget'].astype(int)\n kaggle_metadata['id'] = pd.to_numeric(kaggle_metadata['id'], errors='raise')\n kaggle_metadata['popularity'] = pd.to_numeric(kaggle_metadata['popularity'], errors='raise')\n kaggle_metadata['release_date'] = pd.to_datetime(kaggle_metadata['release_date'])\n\n movies_df = pd.merge(wiki_movies_df, kaggle_metadata, on='imdb_id', suffixes=['_wiki','_kaggle'])\n movies_df.drop(columns=['title_wiki','release_date_wiki','Language','Production company(s)'], inplace=True)\n\n def fill_missing_kaggle_data(df, kaggle_column, wiki_column):\n df[kaggle_column] = df.apply(\n lambda row: row[wiki_column] if row[kaggle_column] == 0 else 
row[kaggle_column]\n , axis=1)\n df.drop(columns=wiki_column, inplace=True)\n \n fill_missing_kaggle_data(movies_df, 'runtime', 'running_time')\n fill_missing_kaggle_data(movies_df, 'budget_kaggle', 'budget_wiki')\n fill_missing_kaggle_data(movies_df, 'revenue', 'box_office')\n \n movies_df = movies_df.loc[:, ['imdb_id','id','title_kaggle','original_title','tagline','belongs_to_collection','url','imdb_link',\n 'runtime','budget_kaggle','revenue','release_date_kaggle','popularity','vote_average','vote_count',\n 'genres','original_language','overview','spoken_languages','Country',\n 'production_companies','production_countries','Distributor',\n 'Producer(s)','Director','Starring','Cinematography','Editor(s)','Writer(s)','Composer(s)','Based on'\n ]]\n\n\n movies_df.rename({'id':'kaggle_id',\n 'title_kaggle':'title',\n 'url':'wikipedia_url',\n 'budget_kaggle':'budget',\n 'release_date_kaggle':'release_date',\n 'Country':'country',\n 'Distributor':'distributor',\n 'Producer(s)':'producers',\n 'Director':'director',\n 'Starring':'starring',\n 'Cinematography':'cinematography',\n 'Editor(s)':'editors',\n 'Writer(s)':'writers',\n 'Composer(s)':'composers',\n 'Based on':'based_on'\n }, axis='columns', inplace=True)\n\n rating_counts = ratings.groupby(['movieId','rating'], as_index=False).count().rename({'userId':'count'}, axis=1).pivot(index='movieId',columns='rating', values='count')\n rating_counts.columns = ['rating_' + str(col) for col in rating_counts.columns] \n movies_with_ratings_df = pd.merge(movies_df, rating_counts, left_on='kaggle_id', right_index=True, how='left')\n movies_with_ratings_df[rating_counts.columns] = movies_with_ratings_df[rating_counts.columns].fillna(0)\n \n\n db_string = f\"postgres://postgres:{db_password}@localhost:5432/movies_data\"\n engine = create_engine(db_string)\n movies_df.to_sql(name='movies', con=engine)\n\n rows_imported = 0\n # get the start_time from time.time()\n start_time = time.time()\n for data in pd.read_csv(f'{file_dir}/ratings.csv', chunksize=1000000):\n print(f'importing rows {rows_imported} to {rows_imported + len(data)}...', end='')\n data.to_sql(name='ratings', con=engine, if_exists='append')\n rows_imported += len(data)\n\n # add elapsed time to final print out\n print(f'Done. {time.time() - start_time} total seconds elapsed')\n\n\nfile_dir = \"./Resources\" \nwiki_file = f'{file_dir}/wikipedia.movies.json' \nkaggle_file = f'{file_dir}/movies_metadata.csv' \nratings_file = f'{file_dir}/ratings.csv' \n\nextract_transform_load(wiki_file,kaggle_file,ratings_file)\n",
"/opt/anaconda3/lib/python3.7/site-packages/IPython/core/interactiveshell.py:3254: DtypeWarning: Columns (10) have mixed types.Specify dtype option on import or set low_memory=False.\n if (await self.run_code(code, result, async_=asy)):\n"
],
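The `rating_counts` pivot buried in the cell above is easier to see on toy data; this sketch mirrors the same groupby/rename/pivot chain with made-up ratings.

```python
import pandas as pd

toy = pd.DataFrame({"movieId": [1, 1, 1, 2, 2],
                    "rating":  [4.0, 4.0, 5.0, 3.0, 3.0],
                    "userId":  [10, 11, 12, 13, 14]})

counts = (toy.groupby(["movieId", "rating"], as_index=False).count()
             .rename({"userId": "count"}, axis=1)
             .pivot(index="movieId", columns="rating", values="count"))
counts.columns = ["rating_" + str(c) for c in counts.columns]
print(counts.fillna(0))  # one row per movie, one column per rating value
```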
[
" ",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code"
]
] |
e79ef34940cd237c3004e002f145bc059ecd54ee | 19,251 | ipynb | Jupyter Notebook | LeNet-Lab-Solution.ipynb | mdeopujari/CarND-Traffic-Sign-Classifier-Project | 18e9ef04ec16bc36dd969ad03d5c0cc0a2d1e89f | [
"MIT"
] | null | null | null | LeNet-Lab-Solution.ipynb | mdeopujari/CarND-Traffic-Sign-Classifier-Project | 18e9ef04ec16bc36dd969ad03d5c0cc0a2d1e89f | [
"MIT"
] | null | null | null | LeNet-Lab-Solution.ipynb | mdeopujari/CarND-Traffic-Sign-Classifier-Project | 18e9ef04ec16bc36dd969ad03d5c0cc0a2d1e89f | [
"MIT"
] | null | null | null | 37.970414 | 2,016 | 0.621578 | [
[
[
"# LeNet Lab Solution\n\nSource: Yan LeCun",
"_____no_output_____"
],
[
"## Load Data\n\nLoad the MNIST data, which comes pre-loaded with TensorFlow.\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"from tensorflow.examples.tutorials.mnist import input_data\n\nmnist = input_data.read_data_sets(\"MNIST_data/\", reshape=False)\nX_train, y_train = mnist.train.images, mnist.train.labels\nX_validation, y_validation = mnist.validation.images, mnist.validation.labels\nX_test, y_test = mnist.test.images, mnist.test.labels\n\nassert(len(X_train) == len(y_train))\nassert(len(X_validation) == len(y_validation))\nassert(len(X_test) == len(y_test))\n\nprint()\nprint(\"Image Shape: {}\".format(X_train[0].shape))\nprint()\nprint(\"Training Set: {} samples\".format(len(X_train)))\nprint(\"Validation Set: {} samples\".format(len(X_validation)))\nprint(\"Test Set: {} samples\".format(len(X_test)))",
"G:\\ProgramData\\Anaconda3\\envs\\tensorflow\\lib\\site-packages\\h5py\\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n from ._conv import register_converters as _register_converters\n"
]
],
[
[
"The MNIST data that TensorFlow pre-loads comes as 28x28x1 images.\n\nHowever, the LeNet architecture only accepts 32x32xC images, where C is the number of color channels.\n\nIn order to reformat the MNIST data into a shape that LeNet will accept, we pad the data with two rows of zeros on the top and bottom, and two columns of zeros on the left and right (28+2+2 = 32).\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\n# Pad images with 0s\nX_train = np.pad(X_train, ((0,0),(2,2),(2,2),(0,0)), 'constant')\nX_validation = np.pad(X_validation, ((0,0),(2,2),(2,2),(0,0)), 'constant')\nX_test = np.pad(X_test, ((0,0),(2,2),(2,2),(0,0)), 'constant')\n \nprint(\"Updated Image Shape: {}\".format(X_train[0].shape))",
"Updated Image Shape: (36, 36, 1)\n"
]
],
[
[
"## Visualize Data\n\nView a sample from the dataset.\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"import random\nimport numpy as np\nimport matplotlib.pyplot as plt\n%matplotlib inline\n\nindex = random.randint(0, len(X_train))\nimage = X_train[index].squeeze()\n\nplt.figure(figsize=(1,1))\nplt.imshow(image, cmap=\"gray\")\nprint(y_train[index])",
"(32, 32)\n4\n"
]
],
[
[
"## Preprocess Data\n\nShuffle the training data.\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"from sklearn.utils import shuffle\n\nX_train, y_train = shuffle(X_train, y_train)",
"_____no_output_____"
]
],
[
[
"## Setup TensorFlow\nThe `EPOCH` and `BATCH_SIZE` values affect the training speed and model accuracy.\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"import tensorflow as tf\n\nEPOCHS = 10\nBATCH_SIZE = 128",
"_____no_output_____"
]
],
[
[
"## SOLUTION: Implement LeNet-5\nImplement the [LeNet-5](http://yann.lecun.com/exdb/lenet/) neural network architecture.\n\nThis is the only cell you need to edit.\n### Input\nThe LeNet architecture accepts a 32x32xC image as input, where C is the number of color channels. Since MNIST images are grayscale, C is 1 in this case.\n\n### Architecture\n**Layer 1: Convolutional.** The output shape should be 28x28x6.\n\n**Activation.** Your choice of activation function.\n\n**Pooling.** The output shape should be 14x14x6.\n\n**Layer 2: Convolutional.** The output shape should be 10x10x16.\n\n**Activation.** Your choice of activation function.\n\n**Pooling.** The output shape should be 5x5x16.\n\n**Flatten.** Flatten the output shape of the final pooling layer such that it's 1D instead of 3D. The easiest way to do is by using `tf.contrib.layers.flatten`, which is already imported for you.\n\n**Layer 3: Fully Connected.** This should have 120 outputs.\n\n**Activation.** Your choice of activation function.\n\n**Layer 4: Fully Connected.** This should have 84 outputs.\n\n**Activation.** Your choice of activation function.\n\n**Layer 5: Fully Connected (Logits).** This should have 10 outputs.\n\n### Output\nReturn the result of the 2nd fully connected layer.",
"_____no_output_____"
]
],
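Every shape listed above follows from the VALID-convolution size formula, out = (in - filter) / stride + 1. The helper below is only a sanity check of that arithmetic, not part of the lab code.

```python
# Check the LeNet layer sizes (VALID padding, so no zero-padding inside the layer).
def conv_out(size, filt, stride=1):
    return (size - filt) // stride + 1

s = conv_out(32, 5)      # conv1: 32x32 -> 28x28
print("conv1:", s)
s = conv_out(s, 2, 2)    # pool1: 28x28 -> 14x14
print("pool1:", s)
s = conv_out(s, 5)       # conv2: 14x14 -> 10x10
print("conv2:", s)
s = conv_out(s, 2, 2)    # pool2: 10x10 -> 5x5
print("pool2:", s, "-> flattened:", s * s * 16)  # 5*5*16 = 400
```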
[
[
"from tensorflow.contrib.layers import flatten\n\ndef LeNet(x): \n # Arguments used for tf.truncated_normal, randomly defines variables for the weights and biases for each layer\n mu = 0\n sigma = 0.1\n \n # SOLUTION: Layer 1: Convolutional. Input = 32x32x1. Output = 28x28x6.\n conv1_W = tf.Variable(tf.truncated_normal(shape=(5, 5, 1, 6), mean = mu, stddev = sigma))\n conv1_b = tf.Variable(tf.zeros(6))\n conv1 = tf.nn.conv2d(x, conv1_W, strides=[1, 1, 1, 1], padding='VALID') + conv1_b\n\n # SOLUTION: Activation.\n conv1 = tf.nn.relu(conv1)\n\n # SOLUTION: Pooling. Input = 28x28x6. Output = 14x14x6.\n conv1 = tf.nn.max_pool(conv1, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='VALID')\n\n # SOLUTION: Layer 2: Convolutional. Output = 10x10x16.\n conv2_W = tf.Variable(tf.truncated_normal(shape=(5, 5, 6, 16), mean = mu, stddev = sigma))\n conv2_b = tf.Variable(tf.zeros(16))\n conv2 = tf.nn.conv2d(conv1, conv2_W, strides=[1, 1, 1, 1], padding='VALID') + conv2_b\n \n # SOLUTION: Activation.\n conv2 = tf.nn.relu(conv2)\n\n # SOLUTION: Pooling. Input = 10x10x16. Output = 5x5x16.\n conv2 = tf.nn.max_pool(conv2, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='VALID')\n\n # SOLUTION: Flatten. Input = 5x5x16. Output = 400.\n fc0 = flatten(conv2)\n \n # SOLUTION: Layer 3: Fully Connected. Input = 400. Output = 120.\n fc1_W = tf.Variable(tf.truncated_normal(shape=(400, 120), mean = mu, stddev = sigma))\n fc1_b = tf.Variable(tf.zeros(120))\n fc1 = tf.matmul(fc0, fc1_W) + fc1_b\n \n # SOLUTION: Activation.\n fc1 = tf.nn.relu(fc1)\n\n # SOLUTION: Layer 4: Fully Connected. Input = 120. Output = 84.\n fc2_W = tf.Variable(tf.truncated_normal(shape=(120, 84), mean = mu, stddev = sigma))\n fc2_b = tf.Variable(tf.zeros(84))\n fc2 = tf.matmul(fc1, fc2_W) + fc2_b\n \n # SOLUTION: Activation.\n fc2 = tf.nn.relu(fc2)\n\n # SOLUTION: Layer 5: Fully Connected. Input = 84. Output = 10.\n fc3_W = tf.Variable(tf.truncated_normal(shape=(84, 10), mean = mu, stddev = sigma))\n fc3_b = tf.Variable(tf.zeros(10))\n logits = tf.matmul(fc2, fc3_W) + fc3_b\n \n return logits",
"_____no_output_____"
]
],
[
[
"## Features and Labels\nTrain LeNet to classify [MNIST](http://yann.lecun.com/exdb/mnist/) data.\n\n`x` is a placeholder for a batch of input images.\n`y` is a placeholder for a batch of output labels.\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"x = tf.placeholder(tf.float32, (None, 32, 32, 1))\ny = tf.placeholder(tf.int32, (None))\none_hot_y = tf.one_hot(y, 10)",
"_____no_output_____"
]
],
[
[
"## Training Pipeline\nCreate a training pipeline that uses the model to classify MNIST data.\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"rate = 0.001\n\nlogits = LeNet(x)\ncross_entropy = tf.nn.softmax_cross_entropy_with_logits(labels=one_hot_y, logits=logits)\nloss_operation = tf.reduce_mean(cross_entropy)\noptimizer = tf.train.AdamOptimizer(learning_rate = rate)\ntraining_operation = optimizer.minimize(loss_operation)",
"_____no_output_____"
]
],
[
[
"## Model Evaluation\nEvaluate how well the loss and accuracy of the model for a given dataset.\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"correct_prediction = tf.equal(tf.argmax(logits, 1), tf.argmax(one_hot_y, 1))\naccuracy_operation = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))\nsaver = tf.train.Saver()\n\ndef evaluate(X_data, y_data):\n num_examples = len(X_data)\n total_accuracy = 0\n sess = tf.get_default_session()\n for offset in range(0, num_examples, BATCH_SIZE):\n batch_x, batch_y = X_data[offset:offset+BATCH_SIZE], y_data[offset:offset+BATCH_SIZE]\n accuracy = sess.run(accuracy_operation, feed_dict={x: batch_x, y: batch_y})\n total_accuracy += (accuracy * len(batch_x))\n return total_accuracy / num_examples",
"_____no_output_____"
]
],
[
[
"## Train the Model\nRun the training data through the training pipeline to train the model.\n\nBefore each epoch, shuffle the training set.\n\nAfter each epoch, measure the loss and accuracy of the validation set.\n\nSave the model after training.\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"with tf.Session() as sess:\n sess.run(tf.global_variables_initializer())\n num_examples = len(X_train)\n \n print(\"Training...\")\n print()\n for i in range(EPOCHS):\n X_train, y_train = shuffle(X_train, y_train)\n for offset in range(0, num_examples, BATCH_SIZE):\n end = offset + BATCH_SIZE\n batch_x, batch_y = X_train[offset:end], y_train[offset:end]\n sess.run(training_operation, feed_dict={x: batch_x, y: batch_y})\n \n validation_accuracy = evaluate(X_validation, y_validation)\n print(\"EPOCH {} ...\".format(i+1))\n print(\"Validation Accuracy = {:.3f}\".format(validation_accuracy))\n print()\n \n saver.save(sess, './lenet')\n print(\"Model saved\")",
"_____no_output_____"
]
],
[
[
"## Evaluate the Model\nOnce you are completely satisfied with your model, evaluate the performance of the model on the test set.\n\nBe sure to only do this once!\n\nIf you were to measure the performance of your trained model on the test set, then improve your model, and then measure the performance of your model on the test set again, that would invalidate your test results. You wouldn't get a true measure of how well your model would perform against real data.\n\nYou do not need to modify this section.",
"_____no_output_____"
]
],
[
[
"with tf.Session() as sess:\n saver.restore(sess, tf.train.latest_checkpoint('.'))\n\n test_accuracy = evaluate(X_test, y_test)\n print(\"Test Accuracy = {:.3f}\".format(test_accuracy))",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79efe2b769b9cf7202ae0dd4507f0b50eb2d8b3 | 2,701 | ipynb | Jupyter Notebook | removeNaN.ipynb | adamamiller/supernova-spectrum-analysis | 1f7816bdc7dadb1a9a2ee3a97a1f77dd6f0c06dd | [
"MIT"
] | null | null | null | removeNaN.ipynb | adamamiller/supernova-spectrum-analysis | 1f7816bdc7dadb1a9a2ee3a97a1f77dd6f0c06dd | [
"MIT"
] | null | null | null | removeNaN.ipynb | adamamiller/supernova-spectrum-analysis | 1f7816bdc7dadb1a9a2ee3a97a1f77dd6f0c06dd | [
"MIT"
] | 2 | 2020-10-07T20:10:30.000Z | 2021-05-09T23:16:36.000Z | 25.72381 | 123 | 0.584228 | [
[
[
"import subprocess\nimport shlex\nimport pandas as pd\nimport numpy as np\nfrom astropy.table import Table\nfrom astropy.table import Column\nimport astropy\nimport os\nimport glob2\nimport matplotlib.pyplot as plt\nfrom matplotlib.backends.backend_pdf import PdfPages",
"_____no_output_____"
],
[
"sample_location = \"/home/xhall/Documents/NewZTF/spectra/\"\nnew_sample_location = \"/home/xhall/Documents/NewZTF/spectra_nonan/\"\nsource = \"/home/xhall/Documents/NewZTF/SNIDoutput/\"\nimage_output = \"/home/xhall/Documents/NewZTF/Images/\"\nsnid = \"/home/xhall/Documents/SNID/snid-5.0/snid\"",
"_____no_output_____"
],
[
"sample = Table.read(\"/home/xhall/Documents/NewZTF/ML_sample.ascii\", format = \"ascii\")",
"_____no_output_____"
],
[
"for i in np.unique(sample[\"col8\"]):\n try:\n spectra = Table.from_pandas(Table.read(sample_location + str(i), format = \"ascii\").to_pandas().dropna())\n astropy.io.ascii.write(spectra, new_sample_location + str(i), overwrite=True, format = \"no_header\")\n except:\n print(i)",
"ZTF18abcfdzu_20181229_NOT_v1.ascii\nZTF18acbwaxk_20190110_NOT_v1.ascii\nZTF18aceqrrs_20190227_NOT_v1.ascii\nZTF18acsjrbc_20190102_NOT_v1.ascii\nZTF19aafmyow_20190816_NOT_v1.ascii\nZTF19aapfmki_20190511_NOT_v1.ascii\nZTF19abdoior_20190815_NOT_v1.ascii\nZTF19abucwzt_20200127_NOT_v1.ascii\nZTF20aaelulu_20200114_NOT_v1.ascii\nZTF20aalrqbu_20200506_NOT_v1.ascii\n"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code"
]
] |
e79f042e42e31083557d67a434833f3475fbdfd0 | 273,582 | ipynb | Jupyter Notebook | data/DBSCAN.ipynb | tristanang/bilayer-clusters | 865d0cbafc543e78017d57a9bc6f99b9ac05792f | [
"MIT"
] | 1 | 2020-05-16T03:38:06.000Z | 2020-05-16T03:38:06.000Z | data/DBSCAN.ipynb | tristanang/bilayer-clusters | 865d0cbafc543e78017d57a9bc6f99b9ac05792f | [
"MIT"
] | null | null | null | data/DBSCAN.ipynb | tristanang/bilayer-clusters | 865d0cbafc543e78017d57a9bc6f99b9ac05792f | [
"MIT"
] | 1 | 2018-07-09T17:49:39.000Z | 2018-07-09T17:49:39.000Z | 579.622881 | 99,190 | 0.929893 | [
[
[
"import pickle\n\npickle_off = open(\"clusters.dict\",\"rb\")\nclusters = pickle.load(pickle_off)",
"_____no_output_____"
],
[
"X = clusters[0][28]['lipid'][0][1]\nprint(len(X))",
"400\n"
],
[
"print(__doc__)\n\nimport numpy as np\n\nfrom sklearn.cluster import DBSCAN\nfrom sklearn import metrics\n#from sklearn.datasets.samples_generator import make_blobs\n#from sklearn.preprocessing import StandardScaler\n\n\n# #############################################################################\n#Generate sample data\n#centers = [[1, 1], [-1, -1], [1, -1]]\n#X, labels_true = make_blobs(n_samples=750, centers=centers, cluster_std=0.4,random_state=0)\n\n#X = StandardScaler().fit_transform(X)\n\n# #############################################################################\n# Compute DBSCAN\ndb = DBSCAN(eps=1.1, min_samples=3).fit(X)\ncore_samples_mask = np.zeros_like(db.labels_, dtype=bool)\ncore_samples_mask[db.core_sample_indices_] = True\nlabels = db.labels_\n\n# Number of clusters in labels, ignoring noise if present.\nn_clusters_ = len(set(labels)) - (1 if -1 in labels else 0)\n\nprint('Estimated number of clusters: %d' % n_clusters_)\n#print(\"Homogeneity: %0.3f\" % metrics.homogeneity_score(labels_true, labels))\n#print(\"Completeness: %0.3f\" % metrics.completeness_score(labels_true, labels))\n#print(\"V-measure: %0.3f\" % metrics.v_measure_score(labels_true, labels))\n#print(\"Adjusted Rand Index: %0.3f\" % metrics.adjusted_rand_score(labels_true, labels))\n#print(\"Adjusted Mutual Information: %0.3f\" % metrics.adjusted_mutual_info_score(labels_true, labels))\n#print(\"Silhouette Coefficient: %0.3f\" % metrics.silhouette_score(X, labels))\n\n# #############################################################################\n# Plot result\nimport matplotlib.pyplot as plt\n\n# Black removed and is used for noise instead.\nunique_labels = set(labels)\ncolors = [plt.cm.Spectral(each)\n for each in np.linspace(0, 1, len(unique_labels))]\nfor k, col in zip(unique_labels, colors):\n if k == -1:\n # Black used for noise.\n col = [0, 0, 0, 1]\n\n class_member_mask = (labels == k)\n\n xy = X[class_member_mask & core_samples_mask]\n plt.plot(xy[:, 0], xy[:, 1], 'o', markerfacecolor=tuple(col),\n markeredgecolor='k', markersize=14)\n\n xy = X[class_member_mask & ~core_samples_mask]\n plt.plot(xy[:, 0], xy[:, 1], 'o', markerfacecolor=tuple(col),\n markeredgecolor='k', markersize=6)\n\nplt.title('Estimated number of clusters: %d' % n_clusters_)\nplt.show()",
"Automatically created module for IPython interactive environment\nEstimated number of clusters: 26\n"
],
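The `eps=1.1` above was picked by hand. A common heuristic for choosing it is a sorted k-distance plot: compute each point's distance to its k-th neighbor, sort those distances, and read `eps` off the elbow of the curve. A sketch, assuming `X` is the coordinate array loaded above:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neighbors import NearestNeighbors

k = 3  # match min_samples
nbrs = NearestNeighbors(n_neighbors=k).fit(X)
dists, _ = nbrs.kneighbors(X)  # column 0 is the point itself (distance 0)

kth = np.sort(dists[:, -1])
plt.plot(kth)
plt.xlabel('points (sorted)')
plt.ylabel('distance to neighbor %d' % (k - 1))
plt.show()  # eps near the bend of the curve is a reasonable starting value
```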
[
"X = clusters[0][37]['lipid'][0][1]\nprint(len(X))",
"369\n"
],
[
"print(__doc__)\n\nimport numpy as np\n\nfrom sklearn.cluster import DBSCAN\nfrom sklearn import metrics\n#from sklearn.datasets.samples_generator import make_blobs\n#from sklearn.preprocessing import StandardScaler\n\n\n# #############################################################################\n#Generate sample data\n#centers = [[1, 1], [-1, -1], [1, -1]]\n#X, labels_true = make_blobs(n_samples=750, centers=centers, cluster_std=0.4,random_state=0)\n\n#X = StandardScaler().fit_transform(X)\n\n# #############################################################################\n# Compute DBSCAN\ndb = DBSCAN(eps=1.1, min_samples=2\n ).fit(X)\ncore_samples_mask = np.zeros_like(db.labels_, dtype=bool)\ncore_samples_mask[db.core_sample_indices_] = True\nlabels = db.labels_\n\n# Number of clusters in labels, ignoring noise if present.\nn_clusters_ = len(set(labels)) - (1 if -1 in labels else 0)\n\nprint('Estimated number of clusters: %d' % n_clusters_)\n#print(\"Homogeneity: %0.3f\" % metrics.homogeneity_score(labels_true, labels))\n#print(\"Completeness: %0.3f\" % metrics.completeness_score(labels_true, labels))\n#print(\"V-measure: %0.3f\" % metrics.v_measure_score(labels_true, labels))\n#print(\"Adjusted Rand Index: %0.3f\" % metrics.adjusted_rand_score(labels_true, labels))\n#print(\"Adjusted Mutual Information: %0.3f\" % metrics.adjusted_mutual_info_score(labels_true, labels))\n#print(\"Silhouette Coefficient: %0.3f\" % metrics.silhouette_score(X, labels))\n\n# #############################################################################\n# Plot result\nimport matplotlib.pyplot as plt\n\n# Black removed and is used for noise instead.\nunique_labels = set(labels)\ncolors = [plt.cm.Spectral(each)\n for each in np.linspace(0, 1, len(unique_labels))]\nfor k, col in zip(unique_labels, colors):\n if k == -1:\n # Black used for noise.\n col = [0, 0, 0, 1]\n\n class_member_mask = (labels == k)\n\n xy = X[class_member_mask & core_samples_mask]\n plt.plot(xy[:, 0], xy[:, 1], 'o', markerfacecolor=tuple(col),\n markeredgecolor='k', markersize=14)\n\n xy = X[class_member_mask & ~core_samples_mask]\n plt.plot(xy[:, 0], xy[:, 1], 'o', markerfacecolor=tuple(col),\n markeredgecolor='k', markersize=6)\n\nplt.title('Estimated number of clusters: %d' % n_clusters_)\nplt.show()",
"Automatically created module for IPython interactive environment\nEstimated number of clusters: 63\n"
],
[
"X = clusters[0][37]['lipid'][0][2]\nprint(len(X))",
"239\n"
],
[
"print(__doc__)\n\nimport numpy as np\n\nfrom sklearn.cluster import DBSCAN\nfrom sklearn import metrics\n#from sklearn.datasets.samples_generator import make_blobs\n#from sklearn.preprocessing import StandardScaler\n\n\n# #############################################################################\n#Generate sample data\n#centers = [[1, 1], [-1, -1], [1, -1]]\n#X, labels_true = make_blobs(n_samples=750, centers=centers, cluster_std=0.4,random_state=0)\n\n#X = StandardScaler().fit_transform(X)\n\n# #############################################################################\n# Compute DBSCAN\ndb = DBSCAN(eps=1.1, min_samples=2\n ).fit(X)\ncore_samples_mask = np.zeros_like(db.labels_, dtype=bool)\ncore_samples_mask[db.core_sample_indices_] = True\nlabels = db.labels_\n\n# Number of clusters in labels, ignoring noise if present.\nn_clusters_ = len(set(labels)) - (1 if -1 in labels else 0)\n\nprint('Estimated number of clusters: %d' % n_clusters_)\n#print(\"Homogeneity: %0.3f\" % metrics.homogeneity_score(labels_true, labels))\n#print(\"Completeness: %0.3f\" % metrics.completeness_score(labels_true, labels))\n#print(\"V-measure: %0.3f\" % metrics.v_measure_score(labels_true, labels))\n#print(\"Adjusted Rand Index: %0.3f\" % metrics.adjusted_rand_score(labels_true, labels))\n#print(\"Adjusted Mutual Information: %0.3f\" % metrics.adjusted_mutual_info_score(labels_true, labels))\n#print(\"Silhouette Coefficient: %0.3f\" % metrics.silhouette_score(X, labels))\n\n# #############################################################################\n# Plot result\nimport matplotlib.pyplot as plt\n\n# Black removed and is used for noise instead.\nunique_labels = set(labels)\ncolors = [plt.cm.Spectral(each)\n for each in np.linspace(0, 1, len(unique_labels))]\nfor k, col in zip(unique_labels, colors):\n if k == -1:\n # Black used for noise.\n col = [0, 0, 0, 1]\n\n class_member_mask = (labels == k)\n\n xy = X[class_member_mask & core_samples_mask]\n plt.plot(xy[:, 0], xy[:, 1], 'o', markerfacecolor=tuple(col),\n markeredgecolor='k', markersize=14)\n\n xy = X[class_member_mask & ~core_samples_mask]\n plt.plot(xy[:, 0], xy[:, 1], 'o', markerfacecolor=tuple(col),\n markeredgecolor='k', markersize=6)\n\nplt.title('Estimated number of clusters: %d' % n_clusters_)\nplt.show()",
"Automatically created module for IPython interactive environment\nEstimated number of clusters: 43\n"
],
[
"X = clusters[0][37]['lipid'][0][3]\nprint(len(X))",
"78\n"
],
[
"print(__doc__)\n\nimport numpy as np\n\nfrom sklearn.cluster import DBSCAN\nfrom sklearn import metrics\n#from sklearn.datasets.samples_generator import make_blobs\n#from sklearn.preprocessing import StandardScaler\n\n\n# #############################################################################\n#Generate sample data\n#centers = [[1, 1], [-1, -1], [1, -1]]\n#X, labels_true = make_blobs(n_samples=750, centers=centers, cluster_std=0.4,random_state=0)\n\n#X = StandardScaler().fit_transform(X)\n\n# #############################################################################\n# Compute DBSCAN\ndb = DBSCAN(eps=1.1, min_samples=2\n ).fit(X)\ncore_samples_mask = np.zeros_like(db.labels_, dtype=bool)\ncore_samples_mask[db.core_sample_indices_] = True\nlabels = db.labels_\n\n# Number of clusters in labels, ignoring noise if present.\nn_clusters_ = len(set(labels)) - (1 if -1 in labels else 0)\n\nprint('Estimated number of clusters: %d' % n_clusters_)\n#print(\"Homogeneity: %0.3f\" % metrics.homogeneity_score(labels_true, labels))\n#print(\"Completeness: %0.3f\" % metrics.completeness_score(labels_true, labels))\n#print(\"V-measure: %0.3f\" % metrics.v_measure_score(labels_true, labels))\n#print(\"Adjusted Rand Index: %0.3f\" % metrics.adjusted_rand_score(labels_true, labels))\n#print(\"Adjusted Mutual Information: %0.3f\" % metrics.adjusted_mutual_info_score(labels_true, labels))\n#print(\"Silhouette Coefficient: %0.3f\" % metrics.silhouette_score(X, labels))\n\n# #############################################################################\n# Plot result\nimport matplotlib.pyplot as plt\n\n# Black removed and is used for noise instead.\nunique_labels = set(labels)\ncolors = [plt.cm.Spectral(each)\n for each in np.linspace(0, 1, len(unique_labels))]\nfor k, col in zip(unique_labels, colors):\n if k == -1:\n # Black used for noise.\n col = [0, 0, 0, 1]\n\n class_member_mask = (labels == k)\n\n xy = X[class_member_mask & core_samples_mask]\n plt.plot(xy[:, 0], xy[:, 1], 'o', markerfacecolor=tuple(col),\n markeredgecolor='k', markersize=14)\n\n xy = X[class_member_mask & ~core_samples_mask]\n plt.plot(xy[:, 0], xy[:, 1], 'o', markerfacecolor=tuple(col),\n markeredgecolor='k', markersize=6)\n\nplt.title('Estimated number of clusters: %d' % n_clusters_)\nplt.show()",
"Automatically created module for IPython interactive environment\nEstimated number of clusters: 12\n"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79f058a4dbd323ad97cdce8770deeee7af5dbc3 | 807,617 | ipynb | Jupyter Notebook | experiments/job.case_hepatocyte/s1_reprocess_tms_droplet_hep.ipynb | martinjzhang/scDRS | 69a9fb4e50dbfa6b1afe0dd222b0d349c5db00eb | [
"MIT"
] | 24 | 2021-09-30T12:31:58.000Z | 2022-03-28T01:14:39.000Z | experiments/job.case_hepatocyte/s1_reprocess_tms_droplet_hep.ipynb | martinjzhang/scDRS | 69a9fb4e50dbfa6b1afe0dd222b0d349c5db00eb | [
"MIT"
] | 5 | 2021-09-29T11:20:37.000Z | 2022-03-06T21:53:08.000Z | experiments/job.case_hepatocyte/s1_reprocess_tms_droplet_hep.ipynb | martinjzhang/scDRS | 69a9fb4e50dbfa6b1afe0dd222b0d349c5db00eb | [
"MIT"
] | null | null | null | 2,029.188442 | 449,728 | 0.959693 | [
[
[
"import scanpy as sc\nfrom anndata import read_h5ad\nimport pandas as pd\nimport numpy as np\nimport scipy as sp\nfrom statsmodels.stats.multitest import multipletests\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nimport os\nfrom os.path import join\nimport time\n\n# scTRS tools\nimport scTRS.util as util\nimport scTRS.data_loader as dl\nimport scTRS.method as md\n\n# autoreload\n%load_ext autoreload\n%autoreload 2",
"_____no_output_____"
],
[
"# File paths\nDATA_PATH = '/n/holystore01/LABS/price_lab/Users/mjzhang/scTRS_data'\nFIG_PATH = '/n/holystore01/LABS/price_lab/Users/mjzhang/scTRS_data/results/fig_hep'\nDS_LIST = ['droplet']\n\n# Score files\nDIC_SCORE_PATH = {'droplet': DATA_PATH+'/score_file/score.tms_droplet_with_cov.magma_10kb_1000'}\n\nDIC_TRAIT_LIST = {}\nDIC_TRAIT_LIST = {'droplet': ['UKB_460K.biochemistry_LDLdirect']}",
"_____no_output_____"
],
[
"# Load raw data \ndic_data_raw = {}\n# dic_data_raw['facs'] = dl.load_tms_ct(DATA_PATH, data_name='facs')\ndic_data_raw['droplet'] = dl.load_tms_ct(DATA_PATH, data_name='droplet')",
"_____no_output_____"
],
[
"# Load score \ndic_score = {x:pd.DataFrame() for x in DIC_SCORE_PATH}\nfor score in DIC_SCORE_PATH:\n for trait in DIC_TRAIT_LIST[score]:\n file_path = join(DIC_SCORE_PATH[score], '%s.score.gz'%trait)\n if os.path.exists(file_path):\n temp_df = pd.read_csv(file_path, sep='\\t', index_col=0)\n temp_df.columns = ['%s.%s'%(trait,x) for x in temp_df.columns]\n temp_df['%s.fdr'%trait] = multipletests(temp_df['%s.pval'%trait], method='fdr_bh')[1]\n dic_score[score] = pd.concat([dic_score[score], temp_df], axis=1)\n else:\n print('# missing: %s'%file_path) ",
"_____no_output_____"
]
],
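The per-trait `.fdr` column above comes from the Benjamini-Hochberg procedure in `multipletests`; on toy p-values the step looks like this.

```python
from statsmodels.stats.multitest import multipletests

pvals = [0.001, 0.01, 0.04, 0.2, 0.9]
reject, fdr, _, _ = multipletests(pvals, method='fdr_bh')
print(list(zip(pvals, fdr.round(3), reject)))  # corrected p-values and decisions
```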
[
[
"### Get data for only hepatocytes and rerun harmony+umap",
"_____no_output_____"
]
],
[
[
"# Reprocess t cell data\nfor ds in DS_LIST:\n print(ds)\n ind_select = dic_data_raw[ds].obs['cell_ontology_class']=='hepatocyte'\n adata = dic_data_raw[ds][ind_select,:].copy()\n sc.pp.filter_cells(adata, min_genes=250)\n sc.pp.filter_genes(adata, min_cells=50)\n adata.obs['batch_harmony'] = adata.obs['mouse.id']\n adata.obs['batch_harmony'] = adata.obs['batch_harmony'].astype('category')\n\n sc.pp.highly_variable_genes(adata, subset = False, min_disp=.5, \n min_mean=.0125, max_mean=10, n_bins=20, n_top_genes=None)\n sc.pp.scale(adata, max_value=10, zero_center=False)\n sc.pp.pca(adata, n_comps=50, use_highly_variable=True, svd_solver='arpack')\n sc.external.pp.harmony_integrate(adata, key='batch_harmony', max_iter_harmony=20)\n sc.pp.neighbors(adata, n_neighbors=50, n_pcs=20, use_rep=\"X_pca_harmony\")\n# sc.pp.neighbors(adata, n_neighbors=50, n_pcs=20, use_rep=\"X_pca\")\n sc.tl.leiden(adata, resolution=0.7) \n sc.tl.umap(adata)\n sc.pl.umap(adata, color='cell_ontology_class')\n sc.pl.umap(adata, color='leiden')\n sc.pl.umap(adata, color=['age', 'sex', 'mouse.id', 'n_genes', 'subtissue'])\n adata.write('/n/holystore01/LABS/price_lab/Users/mjzhang/scTRS_data/single_cell_data/tms_proc/'\n 'hep.%s.h5ad'%ds)",
"droplet\n"
],
[
"adata = read_h5ad('/n/holystore01/LABS/price_lab/Users/mjzhang/scTRS_data/single_cell_data/tms_proc/hep.droplet.h5ad')\n# dic_cluster = {'0':'4', '1':'3','2':'1','3':'5','4':'0','5':'2'}\n# adata.obs['leiden_old'] = adata.obs['leiden'].values\n# adata.obs['leiden'] = [dic_cluster[x] for x in adata.obs['leiden_old']]\nadata.obs = adata.obs.join(dic_score['droplet'])\nsc.pl.umap(adata, color='leiden')\nsc.pl.umap(adata, color='UKB_460K.biochemistry_LDLdirect.norm_score')\n# sc.pl.umap(adata, color=['Pecam1', 'Nrp1', 'Kdr', 'Oit3'])\n# sc.pl.umap(adata, color=['Clec4f', 'Cd68', 'Irf7'])\n# sc.pl.umap(adata, color=['Alb', 'Ttr', 'Apoa1', 'Serpina1c'])\n# adata.write('/n/holystore01/LABS/price_lab/Users/mjzhang/scTRS_data/single_cell_data/tms_proc/hep.facs_annot.h5ad')",
"_____no_output_____"
],
[
"temp_df = dic_data_raw['facs'][dic_data_raw['facs'].obs['cell_ontology_class']=='hepatocyte']\\\n .obs.groupby(['subtissue', 'mouse.id']).agg({'cell':len})\ntemp_df.loc[~temp_df['cell'].isna()]",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
e79f1c6d1cf11431036cefc765c5548953906354 | 38,586 | ipynb | Jupyter Notebook | HAR_DM/Practice Work/Summarizing with dplyr.ipynb | ashudva/HAR | e5fbda42e0278157a4ad1a132b6c308e6a04bb53 | [
"MIT"
] | 3 | 2020-04-28T06:35:57.000Z | 2021-12-23T07:46:30.000Z | HAR_DM/Practice Work/Summarizing with dplyr.ipynb | ashudva/HAR | e5fbda42e0278157a4ad1a132b6c308e6a04bb53 | [
"MIT"
] | null | null | null | HAR_DM/Practice Work/Summarizing with dplyr.ipynb | ashudva/HAR | e5fbda42e0278157a4ad1a132b6c308e6a04bb53 | [
"MIT"
] | 1 | 2020-04-29T11:55:49.000Z | 2020-04-29T11:55:49.000Z | 45.341951 | 197 | 0.380112 | [
[
[
"library(tidyverse)\nlibrary(dslabs)\ndata(heights)\n\ns <- heights %>% filter(sex == \"Male\") %>% summarize(mean = mean(height), sd = sd(height))\ns",
"_____no_output_____"
],
[
"class(s)",
"_____no_output_____"
],
[
"s1 <- heights %>% filter(sex == \"Male\") %>% summarize(mean = mean(height), sd = sd(height), median = median(height), max = max(height), min = min(height))\ns1",
"_____no_output_____"
],
[
"quantile(heights$height, c(0.0, 0.05, 1.0))",
"_____no_output_____"
],
[
"# This code generates error because summarize accept only those function which return single values\nheights %>% filter(sex == \"Male\") %>% summarize(range = quantile(height, c(0.0, 0.5, 1.0)))",
"_____no_output_____"
]
],
[
[
"# Dot Placeholder",
"_____no_output_____"
],
[
"### Pull() and dot behaves in the same way",
"_____no_output_____"
]
],
[
[
"library(tidyverse)\nlibrary(dslabs)\ndata(murders)\n\nmurders <- murders %>% mutate(murder_rate = (total/population) * 100000) \nsummarize(murders, mean = mean(murder_rate))\n\nus_murder_rate <- murders %>% summarize(rate = sum(total)/ sum(population) *100000)\nr <- us_murder_rate %>% .$rate\nclass(r)\nr\nclass(us_murder_rate$rate)",
"_____no_output_____"
],
[
"us_murder_rate <- murders %>%\n summarize(rate = sum(total) / sum(population) * 100000) %>%\n .$rate\nus_murder_rate",
"_____no_output_____"
],
[
"heights %>% group_by(sex) %>% summarize(mean = mean(height), sd = sd(height))\nmurders %>% mutate(murder_rate = total/population * 100000) %>% group_by(region) %>% summarize(mean = mean(murder_rate), sd = sd(murder_rate))",
"_____no_output_____"
],
[
"murders %>% mutate(murder_rate = total/population * 100000) %>% arrange(desc(murder_rate)) %>% top_n(10, murder_rate)",
"_____no_output_____"
],
[
"murders %>% arrange(population) %>% head()",
"_____no_output_____"
],
[
"murders %>% arrange(desc(population)) %>% head()",
"_____no_output_____"
],
[
"murders %>% arrange(region, population) %>% head()",
"_____no_output_____"
],
[
"murders %>% top_n(10, murder_rate)",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79f21c3936d93bde9d9bf523fc09fda92c9ac15 | 20,860 | ipynb | Jupyter Notebook | MixtureModel.ipynb | gtrichards/PHYS_T480 | 60d85f1d0f00059007cb5c5cbe9ff7050adde4b2 | [
"MIT"
] | 20 | 2016-09-20T16:26:32.000Z | 2019-01-21T06:33:21.000Z | MixtureModel.ipynb | gtrichards/PHYS_T480 | 60d85f1d0f00059007cb5c5cbe9ff7050adde4b2 | [
"MIT"
] | null | null | null | MixtureModel.ipynb | gtrichards/PHYS_T480 | 60d85f1d0f00059007cb5c5cbe9ff7050adde4b2 | [
"MIT"
] | 7 | 2016-11-17T17:36:48.000Z | 2019-11-20T13:00:33.000Z | 37.25 | 690 | 0.550096 | [
[
[
"## Gaussian Mixture Models (GMM)\n\nKDE centers each bin (or kernel rather) at each point. In a [**mixture model**](https://en.wikipedia.org/wiki/Mixture_model) we don't use a kernel for each data point, but rather we fit for the *locations of the kernels*--in addition to the width. So a mixture model is sort of a hybrid between an $N$-D histogram and KDE. Using lots of kernels (maybe even more than the BIC score suggests) may make sense if you just want to provide an accurate description of the data (as in density estimation). Using fewer kernels makes mixture models more like clustering, where the suggestion is still to use many kernels in order to divide the sample into real clusters and \"background\".",
"_____no_output_____"
],
[
"Gaussians are the most commonly used components for mixture models. So, the pdf is modeled by a sum of Gaussians:\n$$p(x) = \\sum_{k=1}^N \\alpha_k \\mathscr{N}(x|\\mu_k,\\Sigma_k),$$\nwhere $\\alpha_k$ is the \"mixing coefficient\" with $0\\le \\alpha_k \\le 1$ and $\\sum_{k=1}^N \\alpha_k = 1$.\n\nWe can solve for the parameters using maximum likelihood analyis as we have discussed previously.\nHowever, this can be complicated in multiple dimensions, requiring the use of [**Expectation Maximization (EM)**](https://en.wikipedia.org/wiki/Expectation%E2%80%93maximization_algorithm) methods.",
"_____no_output_____"
],
[
"### Expectation Maximization (ultra simplified version)\n\n(Note: all explanations of EM are far more complicated than seems necessary for our purposes, so here is my overly simplified explanation.)\n\nThis may make more sense in terms of our earlier Bayesian analyses if we write this as \n$$p(z=c) = \\alpha_k,$$\nand\n$$p(x|z=c) = \\mathscr{N}(x|\\mu_k,\\Sigma_k),$$\nwhere $z$ is a \"hidden\" variable related to which \"component\" each point is assigned to.\n\nIn the Expectation step, we hold $\\mu_k, \\Sigma_k$, and $\\alpha_k$ fixed and compute the probability that each $x_i$ belongs to component, $c$. \n\nIn the Maximization step, we hold the probability of the components fixed and maximize $\\mu_k, \\Sigma_k,$ and $\\alpha_k$.",
"_____no_output_____"
],
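To make the two steps concrete, here is a minimal numpy sketch of EM for a 1-D, two-component Gaussian mixture on synthetic data. It is deliberately bare-bones: fixed iteration count, no convergence check, and no safeguard against a component collapsing onto a single point.

```python
import numpy as np
from scipy.stats import norm

np.random.seed(42)
x = np.concatenate([np.random.normal(0, 1, 200),
                    np.random.normal(5, 1, 200)])

# deliberately poor starting guesses
mu = np.array([-1.0, 6.0])
sigma = np.array([1.0, 1.0])
alpha = np.array([0.5, 0.5])

for _ in range(20):
    # E step: responsibility of each component for each point, shape (N, 2)
    dens = alpha * norm.pdf(x[:, None], mu, sigma)
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M step: update the weights, means, and widths from the responsibilities
    Nk = resp.sum(axis=0)
    alpha = Nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk)

print(mu, sigma, alpha)  # should land near (0, 5), (1, 1), (0.5, 0.5)
```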
[
"Note that $\\alpha$ is the relative weight of each Gaussian component and not the probability of each point belonging to a specific component.",
"_____no_output_____"
],
[
"We can use the following animation to illustrate the process. \n\nWe start with a 2-component GMM, where the initial components can be randomly determined.\n\nThe points that are closest to the centroid of a component will be more probable under that distribution in the \"E\" step and will pull the centroid towards them in the \"M\" step. Iteration between the \"E\" and \"M\" step eventually leads to convergence.\n\nIn this particular example, 3 components better describes the data and similarly converges. Note that the process is not that sensitive to how the components are first initialized. We pretty much get the same result in the end.",
"_____no_output_____"
]
],
[
[
"from IPython.display import YouTubeVideo\nYouTubeVideo(\"B36fzChfyGU\")",
"_____no_output_____"
]
],
[
[
"A typical call to the [Gaussian Mixture Model](http://scikit-learn.org/stable/modules/mixture.html) algorithm looks like this:",
"_____no_output_____"
]
],
[
[
"# Execute this cell\nimport numpy as np\nfrom sklearn.mixture import GMM\n\nX = np.random.normal(size=(1000,2)) #1000 points in 2D\ngmm = GMM(3) #three components\ngmm.fit(X)\nlog_dens = gmm.score(X)\nBIC = gmm.bic(X)",
"_____no_output_____"
]
],
[
[
"Let's start with the 1-D example given in Ivezic, Figure 6.8, which compares a Mixture Model to KDE.\n[Note that the version at astroML.org has some bugs!]",
"_____no_output_____"
]
],
[
[
"# Execute this cell\n# Ivezic, Figure 6.8\n# Author: Jake VanderPlas\n# License: BSD\n# The figure produced by this code is published in the textbook\n# \"Statistics, Data Mining, and Machine Learning in Astronomy\" (2013)\n# For more information, see http://astroML.github.com\n# To report a bug or issue, use the following forum:\n# https://groups.google.com/forum/#!forum/astroml-general\n%matplotlib inline\nimport numpy as np\nfrom matplotlib import pyplot as plt\nfrom scipy import stats\n\nfrom astroML.plotting import hist\nfrom sklearn.mixture import GMM\n\nfrom sklearn.neighbors import KernelDensity\n\n#------------------------------------------------------------\n# Generate our data: a mix of several Cauchy distributions\n# this is the same data used in the Bayesian Blocks figure\nnp.random.seed(0)\nN = 10000\nmu_gamma_f = [(5, 1.0, 0.1),\n (7, 0.5, 0.5),\n (9, 0.1, 0.1),\n (12, 0.5, 0.2),\n (14, 1.0, 0.1)]\ntrue_pdf = lambda x: sum([f * stats.cauchy(mu, gamma).pdf(x)\n for (mu, gamma, f) in mu_gamma_f])\nx = np.concatenate([stats.cauchy(mu, gamma).rvs(int(f * N))\n for (mu, gamma, f) in mu_gamma_f])\nnp.random.shuffle(x)\nx = x[x > -10]\nx = x[x < 30]\n\n#------------------------------------------------------------\n# plot the results\nfig = plt.figure(figsize=(10, 10))\nfig.subplots_adjust(bottom=0.08, top=0.95, right=0.95, hspace=0.1)\nN_values = (500, 5000)\nsubplots = (211, 212)\nk_values = (10, 100)\n\nfor N, k, subplot in zip(N_values, k_values, subplots):\n ax = fig.add_subplot(subplot)\n xN = x[:N]\n t = np.linspace(-10, 30, 1000)\n\n kde = KernelDensity(0.1, kernel='gaussian')\n kde.fit(xN[:, None])\n dens_kde = np.exp(kde.score_samples(t[:, None]))\n\n # Compute density via Gaussian Mixtures\n # we'll try several numbers of clusters\n n_components = np.arange(3, 16)\n gmms = [GMM(n_components=n).fit(xN[:,None]) for n in n_components]\n BICs = [gmm.bic(xN[:,None]) for gmm in gmms]\n i_min = np.argmin(BICs)\n t = np.linspace(-10, 30, 1000)\n logprob, responsibilities = gmms[i_min].score_samples(t[:,None])\n\n # plot the results\n ax.plot(t, true_pdf(t), ':', color='black', zorder=3,\n label=\"Generating Distribution\")\n ax.plot(xN, -0.005 * np.ones(len(xN)), '|k', lw=1.5)\n ax.plot(t, np.exp(logprob), '-', color='gray',\n label=\"Mixture Model\\n(%i components)\" % n_components[i_min])\n ax.plot(t, dens_kde, '-', color='black', zorder=3,\n label=\"Kernel Density $(h=0.1)$\")\n\n # label the plot\n ax.text(0.02, 0.95, \"%i points\" % N, ha='left', va='top',\n transform=ax.transAxes)\n ax.set_ylabel('$p(x)$')\n ax.legend(loc='upper right')\n\n if subplot == 212:\n ax.set_xlabel('$x$')\n\n ax.set_xlim(0, 20)\n ax.set_ylim(-0.01, 0.4001)\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"Hmm, that doesn't look so great for the 5000 point distribution. Plot the BIC values and see if anything looks awry.",
"_____no_output_____"
],
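One possible answer, assuming `n_components` and `BICs` from the last (5000-point) fit are still in scope from the cell above:

```python
import numpy as np
import matplotlib.pyplot as plt

plt.plot(n_components, BICs, 'o-k')
plt.axvline(n_components[np.argmin(BICs)], ls=':', color='gray')
plt.xlabel('number of components')
plt.ylabel('BIC')
plt.show()  # a shallow or noisy minimum is a hint that the fit is struggling
```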
[
"What do the individual components look like? Make a plot of those. Careful with the shapes of the arrays!",
"_____no_output_____"
],
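A sketch for overplotting the individual weighted Gaussians of the best-fit model. It assumes `gmms`, `i_min`, and `t` survive from the fitting cell, and that the old `GMM` default of diagonal covariances applies, hence the `ravel()` calls to flatten the `(n, 1)`-shaped arrays.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

best = gmms[i_min]
for w, m, c in zip(best.weights_, best.means_.ravel(), best.covars_.ravel()):
    plt.plot(t, w * norm.pdf(t, m, np.sqrt(c)), '-', lw=1)
plt.xlim(0, 20)
plt.show()  # the sum of these curves is the mixture pdf plotted earlier
```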
[
"Can you figure out something that you can do to improve the results? ",
"_____no_output_____"
],
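There is no single right answer here, but one thing worth trying is simply giving the fit more freedom and letting BIC choose again; with heavy Cauchy tails, clipping the fit range can also help. A sketch, assuming `GMM`, `xN`, and numpy are still in scope:

```python
import numpy as np

# Allow more components (and plenty of iterations), then re-select with BIC.
n_components = np.arange(3, 21)
gmms = [GMM(n_components=n, n_iter=1000).fit(xN[:, None]) for n in n_components]
BICs = [g.bic(xN[:, None]) for g in gmms]
print('BIC-preferred number of components:', n_components[np.argmin(BICs)])
```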
[
"Ivezic, Figure 6.6 shows a 2-D example. In the first panel, we have the raw data. In the second panel we have a density plot (essentially a 2-D histogram). We then try to represent the data with a series of Gaussians. We allow up to 14 Gaussians and use the AIC/BIC to determine the best choice for this number. This is shown in the third panel. Finally, the fourth panel shows the chosen Gaussians with their centroids and 1-$\\sigma$ contours.\n\nIn this case 7 components are required for the best fit. While it looks like we could do a pretty good job with just 2 components, there does appear to be some \"background\" that is a high enough level to justify further components.",
"_____no_output_____"
]
],
[
[
"# Execute this cell\n# Ivezic, Figure 6.6\n# Author: Jake VanderPlas\n# License: BSD\n# The figure produced by this code is published in the textbook\n# \"Statistics, Data Mining, and Machine Learning in Astronomy\" (2013)\n# For more information, see http://astroML.github.com\n# To report a bug or issue, use the following forum:\n# https://groups.google.com/forum/#!forum/astroml-general\n%matplotlib inline\nimport numpy as np\nfrom matplotlib import pyplot as plt\nfrom scipy.stats import norm\n\nfrom sklearn.mixture import GMM\n\nfrom astroML.datasets import fetch_sdss_sspp\nfrom astroML.decorators import pickle_results\nfrom astroML.plotting.tools import draw_ellipse\n\n#------------------------------------------------------------\n# Get the Segue Stellar Parameters Pipeline data\ndata = fetch_sdss_sspp(cleaned=True)\n\n# Note how X was created from two columns of data\nX = np.vstack([data['FeH'], data['alphFe']]).T\n\n# truncate dataset for speed\nX = X[::5]\n\n#------------------------------------------------------------\n# Compute GMM models & AIC/BIC\nN = np.arange(1, 14)\n\n#@pickle_results(\"GMM_metallicity.pkl\")\ndef compute_GMM(N, covariance_type='full', n_iter=1000):\n models = [None for n in N]\n for i in range(len(N)):\n #print N[i]\n models[i] = GMM(n_components=N[i], n_iter=n_iter, covariance_type=covariance_type)\n models[i].fit(X)\n return models\n\nmodels = compute_GMM(N)\n\nAIC = [m.aic(X) for m in models]\nBIC = [m.bic(X) for m in models]\n\ni_best = np.argmin(BIC)\ngmm_best = models[i_best]\nprint \"best fit converged:\", gmm_best.converged_\nprint \"BIC: n_components = %i\" % N[i_best]\n\n#------------------------------------------------------------\n# compute 2D density\nFeH_bins = 51\nalphFe_bins = 51\nH, FeH_bins, alphFe_bins = np.histogram2d(data['FeH'], data['alphFe'], (FeH_bins, alphFe_bins))\n\nXgrid = np.array(map(np.ravel,\n np.meshgrid(0.5 * (FeH_bins[:-1]\n + FeH_bins[1:]),\n 0.5 * (alphFe_bins[:-1]\n + alphFe_bins[1:])))).T\nlog_dens = gmm_best.score(Xgrid).reshape((51, 51))\n\n#------------------------------------------------------------\n# Plot the results\nfig = plt.figure(figsize=(12, 5))\nfig.subplots_adjust(wspace=0.45, bottom=0.25, top=0.9, left=0.1, right=0.97)\n\n# plot data\nax = fig.add_subplot(141)\nax.scatter(data['FeH'][::10],data['alphFe'][::10],marker=\".\",color='k',edgecolors='None')\nax.set_xlabel(r'$\\rm [Fe/H]$')\nax.set_ylabel(r'$\\rm [\\alpha/Fe]$')\nax.xaxis.set_major_locator(plt.MultipleLocator(0.3))\nax.set_xlim(-1.101, 0.101)\nax.text(0.93, 0.93, \"Input\",\n va='top', ha='right', transform=ax.transAxes)\n\n# plot density\nax = fig.add_subplot(142)\nax.imshow(H.T, origin='lower', interpolation='nearest', aspect='auto',\n extent=[FeH_bins[0], FeH_bins[-1],\n alphFe_bins[0], alphFe_bins[-1]],\n cmap=plt.cm.binary)\nax.set_xlabel(r'$\\rm [Fe/H]$')\nax.set_ylabel(r'$\\rm [\\alpha/Fe]$')\nax.xaxis.set_major_locator(plt.MultipleLocator(0.3))\nax.set_xlim(-1.101, 0.101)\nax.text(0.93, 0.93, \"Density\",\n va='top', ha='right', transform=ax.transAxes)\n\n# plot AIC/BIC\nax = fig.add_subplot(143)\nax.plot(N, AIC, '-k', label='AIC')\nax.plot(N, BIC, ':k', label='BIC')\nax.legend(loc=1)\nax.set_xlabel('N components')\nplt.setp(ax.get_yticklabels(), fontsize=7)\n\n# plot best configurations for AIC and BIC\nax = fig.add_subplot(144)\nax.imshow(np.exp(log_dens),\n origin='lower', interpolation='nearest', aspect='auto',\n extent=[FeH_bins[0], FeH_bins[-1],\n alphFe_bins[0], alphFe_bins[-1]],\n 
cmap=plt.cm.binary)\n\nax.scatter(gmm_best.means_[:, 0], gmm_best.means_[:, 1], c='w')\nfor mu, C, w in zip(gmm_best.means_, gmm_best.covars_, gmm_best.weights_):\n draw_ellipse(mu, C, scales=[1], ax=ax, fc='none', ec='k')\n\nax.text(0.93, 0.93, \"Converged\",\n va='top', ha='right', transform=ax.transAxes)\n\nax.set_xlim(-1.101, 0.101)\nax.set_ylim(alphFe_bins[0], alphFe_bins[-1])\nax.xaxis.set_major_locator(plt.MultipleLocator(0.3))\nax.set_xlabel(r'$\\rm [Fe/H]$')\nax.set_ylabel(r'$\\rm [\\alpha/Fe]$')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"That said, I'd say that there are *too* many components here. So, I'd be inclined to explore this a bit further if it were my data.\n\nLastly, let's look at a 2-D case where we are using GMM more to characterize the data than to find clusters. ",
"_____no_output_____"
]
],
[
[
"# Execute this cell\n# Ivezic, Figure 6.7\n# Author: Jake VanderPlas\n# License: BSD\n# The figure produced by this code is published in the textbook\n# \"Statistics, Data Mining, and Machine Learning in Astronomy\" (2013)\n# For more information, see http://astroML.github.com\n# To report a bug or issue, use the following forum:\n# https://groups.google.com/forum/#!forum/astroml-general\nimport numpy as np\nfrom matplotlib import pyplot as plt\n\nfrom sklearn.mixture import GMM\nfrom astroML.datasets import fetch_great_wall\nfrom astroML.decorators import pickle_results\n\n#------------------------------------------------------------\n# load great wall data\nX = fetch_great_wall()\n\n#------------------------------------------------------------\n# Create a function which will save the results to a pickle file\n# for large number of clusters, computation will take a long time!\n#@pickle_results('great_wall_GMM.pkl')\ndef compute_GMM(n_clusters, n_iter=1000, min_covar=3, covariance_type='full'):\n clf = GMM(n_clusters, covariance_type=covariance_type,\n n_iter=n_iter, min_covar=min_covar)\n clf.fit(X)\n print \"converged:\", clf.converged_\n return clf\n\n#------------------------------------------------------------\n# Compute a grid on which to evaluate the result\nNx = 100\nNy = 250\nxmin, xmax = (-375, -175)\nymin, ymax = (-300, 200)\n\nXgrid = np.vstack(map(np.ravel, np.meshgrid(np.linspace(xmin, xmax, Nx),\n np.linspace(ymin, ymax, Ny)))).T\n\n#------------------------------------------------------------\n# Compute the results\n#\n# we'll use 100 clusters. In practice, one should cross-validate\n# with AIC and BIC to settle on the correct number of clusters.\nclf = compute_GMM(n_clusters=100)\nlog_dens = clf.score(Xgrid).reshape(Ny, Nx)\n\n#------------------------------------------------------------\n# Plot the results\nfig = plt.figure(figsize=(10, 5))\nfig.subplots_adjust(hspace=0, left=0.08, right=0.95, bottom=0.13, top=0.9)\n\nax = fig.add_subplot(211, aspect='equal')\nax.scatter(X[:, 1], X[:, 0], s=1, lw=0, c='k')\n\nax.set_xlim(ymin, ymax)\nax.set_ylim(xmin, xmax)\n\nax.xaxis.set_major_formatter(plt.NullFormatter())\nplt.ylabel(r'$x\\ {\\rm (Mpc)}$')\n\nax = fig.add_subplot(212, aspect='equal')\nax.imshow(np.exp(log_dens.T), origin='lower', cmap=plt.cm.binary,\n extent=[ymin, ymax, xmin, xmax])\nax.set_xlabel(r'$y\\ {\\rm (Mpc)}$')\nax.set_ylabel(r'$x\\ {\\rm (Mpc)}$')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"Note that this is very different than the non-parametric density estimates that we did last time in that the GMM isn't doing that great of a job of matching the distribution. However, the advantage is that we now have a *model*. This model can be stored very compactly with just a few numbers, unlike the KDE or KNN maps which require a floating point number for each grid point. \n\nOne thing that you might imagine doing with this is subtracting the model from the data and looking for interesting things among the residuals.",
"_____no_output_____"
]
]
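,
[
[
"# A minimal sketch of the residual idea above, assuming `X`, `log_dens`, and\n# the grid variables (Nx, Ny, xmin, xmax, ymin, ymax) from the previous cell\n# are still in scope. The model density is simply rescaled to the observed\n# total counts, so this is only a rough, bin-level comparison.\nH_gw, _, _ = np.histogram2d(X[:, 0], X[:, 1],\n                            (np.linspace(xmin, xmax, Nx + 1),\n                             np.linspace(ymin, ymax, Ny + 1)))\nmodel = np.exp(log_dens).T                 # shape (Nx, Ny), matching H_gw\nmodel = model * H_gw.sum() / model.sum()   # normalize to the observed counts\nresidual = H_gw - model\n\nfig = plt.figure(figsize=(10, 5))\nax = fig.add_subplot(111, aspect='equal')\nax.imshow(residual, origin='lower', cmap=plt.cm.RdBu,\n          extent=[ymin, ymax, xmin, xmax])\nax.set_xlabel(r'$y\\ {\\rm (Mpc)}$')\nax.set_ylabel(r'$x\\ {\\rm (Mpc)}$')\nplt.show()",
"_____no_output_____"
]
]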
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
e79f4091135a3c94d3607b3f662dcdb129f0ab83 | 743,906 | ipynb | Jupyter Notebook | notebooks/stephan_notebooks/10-Categorical.ipynb | raspstephan/nwp-downscale | dbb6d560126a8d13a2748bc088f382e68fa7c0b3 | [
"MIT"
] | 7 | 2020-12-14T02:57:39.000Z | 2022-03-29T08:32:36.000Z | notebooks/stephan_notebooks/10-Categorical.ipynb | raspstephan/nwp-downscale | dbb6d560126a8d13a2748bc088f382e68fa7c0b3 | [
"MIT"
] | 39 | 2020-12-03T11:39:11.000Z | 2021-09-28T09:57:33.000Z | notebooks/stephan_notebooks/10-Categorical.ipynb | raspstephan/nwp-downscale | dbb6d560126a8d13a2748bc088f382e68fa7c0b3 | [
"MIT"
] | 1 | 2021-09-21T23:57:36.000Z | 2021-09-21T23:57:36.000Z | 263.516118 | 156,940 | 0.921248 | [
[
[
"%load_ext autoreload\n%autoreload 2",
"_____no_output_____"
],
[
"import xarray as xr\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport torch.nn as nn\nimport torch\nimport pandas as pd\nif torch.cuda.is_available():\n device = torch.device(\"cuda\") \nelse:\n device = torch.device(\"cpu\")",
"_____no_output_____"
],
[
"from src.dataloader import *\nfrom src.models import *\nfrom src.trainer import *\nfrom src.utils import *",
"_____no_output_____"
],
[
"DATADRIVE = '/datadrive_ssd/'",
"_____no_output_____"
]
],
[
[
"## Initial changes",
"_____no_output_____"
]
],
[
[
"ds_train = TiggeMRMSDataset(\n tigge_dir=f'{DATADRIVE}/tigge/32km/',\n tigge_vars=['total_precipitation'],\n mrms_dir=f'{DATADRIVE}/mrms/4km/RadarOnly_QPE_06H/',\n rq_fn=f'{DATADRIVE}/mrms/4km/RadarQuality.nc',\n# const_fn='/datadrive/tigge/32km/constants.nc',\n# const_vars=['orog', 'lsm'],\n data_period=('2018-01', '2019-01'),\n first_days=5,\n scale=False\n# split='train'\n)",
"_____no_output_____"
],
[
"mean_precip = []\nfor idx in range(len(ds_train.idxs)):\n X, y = ds_train[idx]\n mean_precip.append(y.max())",
"_____no_output_____"
],
[
"mean_precip = np.array(mean_precip)",
"_____no_output_____"
],
[
"plt.hist(mean_precip, bins=100);\n# plt.yscale('log')",
"_____no_output_____"
],
[
"cat_bins = np.arange(0, 102, 2, dtype='float')\ncat_bins = np.append(np.insert(cat_bins, 1, 0.01), np.inf)\nlen(cat_bins)",
"_____no_output_____"
],
[
"cat_bins",
"_____no_output_____"
],
[
"X, y = ds_train[600]\nX.shape, y.shape",
"_____no_output_____"
],
[
"plt.imshow(y[0])\nplt.colorbar();",
"_____no_output_____"
],
[
"def to_categorical(y, num_classes=None, dtype='float32'):\n \"\"\"Copied from keras source code\n \"\"\"\n y = np.array(y, dtype='int')\n input_shape = y.shape\n if input_shape and input_shape[-1] == 1 and len(input_shape) > 1:\n input_shape = tuple(input_shape[:-1])\n y = y.ravel()\n if not num_classes:\n num_classes = np.max(y) + 1\n n = y.shape[0]\n categorical = np.zeros((n, num_classes), dtype=dtype)\n categorical[np.arange(n), y] = 1\n output_shape = input_shape + (num_classes,)\n categorical = np.reshape(categorical, output_shape)\n return categorical",
"_____no_output_____"
],
[
"a = pd.cut(y.reshape(-1), cat_bins, labels=False, include_lowest=True).reshape(y.shape)\na.shape",
"_____no_output_____"
],
[
"plt.imshow(a[0])\nplt.colorbar();",
"_____no_output_____"
],
[
"plt.hist(a.flat, bins=cat_bins);",
"_____no_output_____"
],
[
"a = to_categorical(a.squeeze(), num_classes=len(cat_bins))",
"_____no_output_____"
],
[
"a = np.rollaxis(a, 2)",
"_____no_output_____"
],
[
"a.shape",
"_____no_output_____"
]
],
[
[
"## Changes implemented",
"_____no_output_____"
]
],
[
[
"cat_bins = np.arange(0, 55, 5, dtype='float')\ncat_bins = np.append(np.insert(cat_bins, 1, 0.01), np.inf)\nlen(cat_bins)",
"_____no_output_____"
],
[
"plt.plot(cat_bins)",
"_____no_output_____"
],
[
"ds_train = TiggeMRMSDataset(\n tigge_dir=f'{DATADRIVE}/tigge/32km/',\n tigge_vars=['total_precipitation'],\n mrms_dir=f'{DATADRIVE}/mrms/4km/RadarOnly_QPE_06H/',\n rq_fn=f'{DATADRIVE}/mrms/4km/RadarQuality.nc',\n# const_fn='/datadrive/tigge/32km/constants.nc',\n# const_vars=['orog', 'lsm'],\n data_period=('2018-01', '2018-12'),\n first_days=5,\n scale=True,\n cat_bins=cat_bins\n# split='train'\n)",
"/anaconda/envs/nwp-downscale/lib/python3.8/site-packages/xarray/core/indexing.py:1369: PerformanceWarning: Slicing is producing a large chunk. To accept the large\nchunk and silence this warning, set the option\n >>> with dask.config.set(**{'array.slicing.split_large_chunks': False}):\n ... array[indexer]\n\nTo avoid creating the large chunks, set the option\n >>> with dask.config.set(**{'array.slicing.split_large_chunks': True}):\n ... array[indexer]\n return self.array[key]\n"
],
[
"ds_valid = TiggeMRMSDataset(\n tigge_dir=f'{DATADRIVE}/tigge/32km/',\n tigge_vars=['total_precipitation'],\n mrms_dir=f'{DATADRIVE}/mrms/4km/RadarOnly_QPE_06H/',\n rq_fn=f'{DATADRIVE}/mrms/4km/RadarQuality.nc',\n# const_fn='/datadrive/tigge/32km/constants.nc',\n# const_vars=['orog', 'lsm'],\n data_period=('2019-01', '2019-12'),\n first_days=2,\n scale=True,\n cat_bins=cat_bins,\n mins=ds_train.mins,\n maxs=ds_train.maxs\n# split='train'\n)",
"/anaconda/envs/nwp-downscale/lib/python3.8/site-packages/xarray/core/indexing.py:1369: PerformanceWarning: Slicing is producing a large chunk. To accept the large\nchunk and silence this warning, set the option\n >>> with dask.config.set(**{'array.slicing.split_large_chunks': False}):\n ... array[indexer]\n\nTo avoid creating the large chunks, set the option\n >>> with dask.config.set(**{'array.slicing.split_large_chunks': True}):\n ... array[indexer]\n return self.array[key]\n"
],
[
"X, y = ds_train[600]\nX.shape, y.shape",
"_____no_output_____"
],
[
"y",
"_____no_output_____"
],
[
"sampler_train = torch.utils.data.WeightedRandomSampler(ds_train.compute_weights(), len(ds_train), replacement=True)\nsampler_valid = torch.utils.data.WeightedRandomSampler(ds_valid.compute_weights(), len(ds_valid), replacement=True)",
"_____no_output_____"
],
[
"dl_train = torch.utils.data.DataLoader(ds_train, batch_size=32, sampler=sampler_train)\ndl_valid = torch.utils.data.DataLoader(ds_valid, batch_size=32, sampler=sampler_valid)",
"_____no_output_____"
],
[
"len(dl_train)",
"_____no_output_____"
],
[
"len(ds_train)",
"_____no_output_____"
],
[
"X, y = next(iter(dl_train))",
"_____no_output_____"
],
[
"fig, axs = plt.subplots(4, 8, figsize=(24, 12))\nfor x, ax in zip(X.numpy(), axs.flat):\n ax.imshow(x[0], cmap='gist_ncar_r', vmin=0, vmax=0.5)",
"_____no_output_____"
],
[
"X.shape, y.shape",
"_____no_output_____"
],
[
"y.type()",
"_____no_output_____"
],
[
"class UpsampleBlock(nn.Module):\n def __init__(self, nf, spectral_norm=False, method='PixelShuffle'):\n super().__init__()\n self.conv = nn.Conv2d(nf, nf * 4 if method=='PixelShuffle' else nf, kernel_size=3, stride=1, padding=1)\n if method == 'PixelShuffle':\n self.upsample = nn.PixelShuffle(2)\n elif method == 'bilinear':\n self.upsample = nn.Upsample(scale_factor=2, mode='bilinear')\n else:\n raise NotImplementedError\n self.activation = nn.LeakyReLU(0.2)\n if spectral_norm: \n self.conv = nn.utils.spectral_norm(self.conv)\n \n def forward(self, x):\n out = self.conv(x)\n out = self.upsample(out)\n out = self.activation(out)\n return out",
"_____no_output_____"
],
[
"class Generator(nn.Module):\n \"\"\"Generator with noise vector and spectral normalization \"\"\"\n def __init__(self, nres, nf_in, nf, relu_out=False, use_noise=True, spectral_norm=True,\n nout=1, softmax_out=False, upsample_method='PixelShuffle'):\n \"\"\" General Generator with different options to use. e.g noise, Spectral normalization (SN) \"\"\"\n super().__init__()\n self.relu_out = relu_out\n self.softmax_out = softmax_out\n self.use_noise = use_noise\n self.spectral_norm = spectral_norm\n\n # First convolution\n if use_noise: \n self.conv_in = nn.Conv2d(nf_in, nf-1, kernel_size=9, stride=1, padding=4)\n else: \n self.conv_in = nn.Conv2d(nf_in, nf, kernel_size=9, stride=1, padding=4)\n self.activation_in = nn.LeakyReLU(0.2)\n\n # Resblocks keeping shape\n self.resblocks = nn.Sequential(*[\n ResidualBlock(nf, spectral_norm=spectral_norm) for i in range(nres)\n ])\n # Resblocks with upscaling\n self.upblocks = nn.Sequential(*[\n UpsampleBlock(nf, spectral_norm=spectral_norm, method=upsample_method) for i in range(3)\n ])\n self.conv_out = nn.Conv2d(nf, nout, kernel_size=9, stride=1, padding=4)\n \n if spectral_norm: \n self.conv_in = nn.utils.spectral_norm(self.conv_in)\n self.conv_out = nn.utils.spectral_norm(self.conv_out)\n \n def forward(self, x):\n out = self.conv_in(x)\n out = self.activation_in(out)\n if self.use_noise: \n bs, _, h, w = x.shape\n z = torch.normal(0, 1, size=(bs, 1, h, w), device=device, requires_grad=True)\n out = torch.cat([out, z], dim=1)\n skip = out\n out = self.resblocks(out)\n out = out + skip\n out = self.upblocks(out)\n out = self.conv_out(out)\n if self.relu_out:\n out = nn.functional.relu(out)\n if self.softmax_out:\n out = nn.functional.softmax(out, dim=1)\n return out",
"_____no_output_____"
],
[
"gen = Generator(\n nres=3, nf_in=1, nf=64, relu_out=False, use_noise=False, spectral_norm=False,\n nout=len(cat_bins)-1, softmax_out=False, upsample_method='bilinear'\n).to(device)",
"_____no_output_____"
],
[
"count_parameters(gen)",
"_____no_output_____"
],
[
"criterion = nn.CrossEntropyLoss()\noptimizer = torch.optim.Adam(gen.parameters(), lr=1e-4)",
"_____no_output_____"
],
[
"trainer = Trainer(gen, optimizer, criterion, dl_train, dl_valid)",
"_____no_output_____"
],
[
"trainer.fit(10)",
"_____no_output_____"
],
[
"trainer.plot_losses()",
"_____no_output_____"
],
[
"preds = nn.functional.softmax(gen(X.to(device)), dim=1).cpu().detach().numpy()",
"_____no_output_____"
],
[
"preds.shape",
"_____no_output_____"
],
[
"target = y.cpu().detach().numpy()",
"_____no_output_____"
],
[
"target.shape",
"_____no_output_____"
],
[
"np.argmax(target.mean((1, 2)))",
"_____no_output_____"
],
[
"i=7\nplt.imshow(target[i])\nplt.colorbar()",
"_____no_output_____"
],
[
"target[i, 20, 20]",
"_____no_output_____"
],
[
"plt.imshow(X[i, 0] * ds_train.maxs.tp.values)\nplt.colorbar();",
"_____no_output_____"
],
[
"plt.imshow(np.argmax(preds[i], axis=0))\nplt.colorbar()",
"_____no_output_____"
],
[
"plt.plot(preds[i, :, 100, 100])\nplt.axvline(target[i, 100, 100])",
"_____no_output_____"
],
[
"cdf = np.cumsum(preds, axis=1)\ncdf.shape",
"_____no_output_____"
],
[
"len(cat_bins)",
"_____no_output_____"
],
[
"plt.plot(cat_bins[:-1], cdf[i, :, 20, 20])\nplt.plot(cat_bins[:-1], cdf[i, :, 100, 100])",
"_____no_output_____"
],
[
"from scipy.ndimage import gaussian_filter\ndef corr_random2D(size, sigma=5, inflation=0.5):\n r = np.random.uniform(size=(size, size))\n r = gaussian_filter(r, sigma)\n r = (r - 0.5) * (inflation / r.std()) + 0.5\n r = 1/(1 + np.exp(-r))\n return r",
"_____no_output_____"
],
[
"rand = corr_random2D(128, 3)\nplt.imshow(rand)\nplt.colorbar();",
"_____no_output_____"
],
[
"cat_bins",
"_____no_output_____"
],
[
"c = cdf[i, :, 100, 100]",
"_____no_output_____"
],
[
"c = np.insert(c, 0, 0)",
"_____no_output_____"
],
[
"c",
"_____no_output_____"
],
[
"len(cat_bins), len(c)",
"_____no_output_____"
],
[
"p = rand[100, 100]\np",
"_____no_output_____"
],
[
"b = np.digitize(p, c, right=True) - 1\nb",
"_____no_output_____"
],
[
"c[b], c[b+1]",
"_____no_output_____"
],
[
"w1 = (p - c[b]) / (c[b+1] - c[b])\nw2 = (c[b+1] - p) / (c[b+1] - c[b])\nw1, w2",
"_____no_output_____"
],
[
"v = cat_bins[b] * w1 + cat_bins[b+1] * w2\nv",
"_____no_output_____"
],
[
"def cat2real1D(pdf, q, cat_bins, interpolate=True):\n c = np.cumsum(pdf)\n c = np.insert(c, 0, 0)\n b = np.digitize(q, c, right=True) - 1\n if interpolate:\n w1 = (q - c[b]) / (c[b+1] - c[b])\n w2 = (c[b+1] - q) / (c[b+1] - c[b])\n else: \n w1, w2 = 0.5, 0.5\n assert w1 >0, 'Weights must be positive'\n assert w2 >0, 'Weights must be positive'\n v = cat_bins[b] * w2 + cat_bins[b+1] * w1\n return v",
"_____no_output_____"
],
[
"import pdb",
"_____no_output_____"
],
[
"def cat2real2D(pdf, q, cat_bins, interpolate=True):\n nbins, nx, ny = pdf.shape\n# pdb.set_trace()\n r = [cat2real1D(a, b, cat_bins, interpolate) for a, b in zip(pdf.reshape(nbins, -1).T, q.reshape(-1))]\n r = np.array(r).reshape(nx, ny)\n return r",
"_____no_output_____"
],
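[
"# A quick sanity check of cat2real1D: with equal mass (0.2) in each of the\n# first five bins, the 0.5 quantile falls halfway through the third bin,\n# i.e. midway between cat_bins[2] and cat_bins[3].\ntoy_pdf = np.zeros(len(cat_bins) - 1)\ntoy_pdf[:5] = 0.2\ncat2real1D(toy_pdf, 0.5, cat_bins)",
"_____no_output_____"
],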
[
"o = cat2real2D(pdf, rand, cat_bins)",
"_____no_output_____"
],
[
"plt.imshow(o)\nplt.colorbar();",
"_____no_output_____"
],
[
"weights = ds_valid.compute_weights()",
"_____no_output_____"
],
[
"np.argsort(weights)[::-1][:20]",
"_____no_output_____"
],
[
"X_sample, y_sample = ds_valid.__getitem__(501, no_cat=True)",
"_____no_output_____"
],
[
"X_sample.shape, y_sample.shape",
"_____no_output_____"
],
[
"vmin=0\nvmax=10",
"_____no_output_____"
],
[
"fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 5))\nimg = ax1.imshow(X_sample[0]*ds_valid.maxs.tp.values, cmap='gist_ncar_r', vmin=vmin, vmax=vmax)\nplt.colorbar(img, ax=ax1, shrink=0.7)\nimg = ax2.imshow(y_sample[0], cmap='gist_ncar_r', vmin=vmin, vmax=vmax)\nplt.colorbar(img, ax=ax2, shrink=0.7)",
"_____no_output_____"
],
[
"pdf = nn.functional.softmax(gen(torch.from_numpy(X_sample[None]).to(device))).cpu().detach().numpy()[0]",
"<ipython-input-501-b128a729a8e5>:1: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.\n pdf = nn.functional.softmax(gen(torch.from_numpy(X_sample[None]).to(device))).cpu().detach().numpy()[0]\n"
],
[
"pdf.shape",
"_____no_output_____"
],
[
"fig, axs = plt.subplots(1, 5, figsize=(20, 4))\nfor i in range(5):\n if i ==0:\n rand = np.ones((128, 128)) * 0.5\n else:\n rand = corr_random2D(128, 3, inflation=0.3)\n o = cat2real2D(pdf, rand, cat_bins)\n axs[i].imshow(o, cmap='gist_ncar_r', vmin=vmin, vmax=vmax)",
"_____no_output_____"
]
],
[
[
"## Overfitting test MSE",
"_____no_output_____"
]
],
[
[
"ds_train = TiggeMRMSDataset(\n tigge_dir=f'{DATADRIVE}/tigge/32km/',\n tigge_vars=['total_precipitation'],\n mrms_dir=f'{DATADRIVE}/mrms/4km/RadarOnly_QPE_06H/',\n rq_fn=f'{DATADRIVE}/mrms/4km/RadarQuality.nc',\n# const_fn='/datadrive/tigge/32km/constants.nc',\n# const_vars=['orog', 'lsm'],\n data_period=('2018-01', '2018-12'),\n first_days=5,\n scale=True,\n# cat_bins=cat_bins\n# split='train'\n)",
"/anaconda/envs/nwp-downscale/lib/python3.8/site-packages/xarray/core/indexing.py:1369: PerformanceWarning: Slicing is producing a large chunk. To accept the large\nchunk and silence this warning, set the option\n >>> with dask.config.set(**{'array.slicing.split_large_chunks': False}):\n ... array[indexer]\n\nTo avoid creating the large chunks, set the option\n >>> with dask.config.set(**{'array.slicing.split_large_chunks': True}):\n ... array[indexer]\n return self.array[key]\n"
],
[
"ds_valid = TiggeMRMSDataset(\n tigge_dir=f'{DATADRIVE}/tigge/32km/',\n tigge_vars=['total_precipitation'],\n mrms_dir=f'{DATADRIVE}/mrms/4km/RadarOnly_QPE_06H/',\n rq_fn=f'{DATADRIVE}/mrms/4km/RadarQuality.nc',\n# const_fn='/datadrive/tigge/32km/constants.nc',\n# const_vars=['orog', 'lsm'],\n data_period=('2019-01', '2019-12'),\n first_days=2,\n scale=True,\n# cat_bins=cat_bins,\n mins=ds_train.mins,\n maxs=ds_train.maxs\n# split='train'\n)",
"_____no_output_____"
],
[
"X, y = ds_train[600]\nX.shape, y.shape",
"_____no_output_____"
],
[
"sampler_train = torch.utils.data.WeightedRandomSampler(ds_train.compute_weights(), len(ds_train))\nsampler_valid = torch.utils.data.WeightedRandomSampler(ds_valid.compute_weights(), len(ds_valid))",
"_____no_output_____"
],
[
"dl_train = torch.utils.data.DataLoader(ds_train, batch_size=32, sampler=sampler_train)\ndl_valid = torch.utils.data.DataLoader(ds_valid, batch_size=32, sampler=sampler_valid)",
"_____no_output_____"
],
[
"len(dl_train)",
"_____no_output_____"
],
[
"len(ds_train)",
"_____no_output_____"
],
[
"X, y = next(iter(dl_train))",
"_____no_output_____"
],
[
"fig, axs = plt.subplots(4, 8, figsize=(24, 12))\nfor x, ax in zip(X.numpy(), axs.flat):\n ax.imshow(x[0], cmap='gist_ncar_r', vmin=0, vmax=0.5)",
"_____no_output_____"
],
[
"X.shape, y.shape",
"_____no_output_____"
],
[
"y.type()",
"_____no_output_____"
],
[
"class UpsampleBlock(nn.Module):\n def __init__(self, nf, spectral_norm=False, method='PixelShuffle'):\n super().__init__()\n self.conv = nn.Conv2d(nf, nf * 4 if method=='PixelShuffle' else nf, kernel_size=3, stride=1, padding=1)\n if method == 'PixelShuffle':\n self.upsample = nn.PixelShuffle(2)\n elif method == 'bilinear':\n self.upsample = nn.Upsample(scale_factor=2, mode='bilinear')\n else:\n raise NotImplementedError\n self.activation = nn.LeakyReLU(0.2)\n if spectral_norm: \n self.conv = nn.utils.spectral_norm(self.conv)\n \n def forward(self, x):\n out = self.conv(x)\n out = self.upsample(out)\n out = self.activation(out)\n return out",
"_____no_output_____"
],
[
"class Generator(nn.Module):\n \"\"\"Generator with noise vector and spectral normalization \"\"\"\n def __init__(self, nres, nf_in, nf, relu_out=False, use_noise=True, spectral_norm=True,\n nout=1, softmax_out=False, upsample_method='PixelShuffle'):\n \"\"\" General Generator with different options to use. e.g noise, Spectral normalization (SN) \"\"\"\n super().__init__()\n self.relu_out = relu_out\n self.softmax_out = softmax_out\n self.use_noise = use_noise\n self.spectral_norm = spectral_norm\n\n # First convolution\n if use_noise: \n self.conv_in = nn.Conv2d(nf_in, nf-1, kernel_size=9, stride=1, padding=4)\n else: \n self.conv_in = nn.Conv2d(nf_in, nf, kernel_size=9, stride=1, padding=4)\n self.activation_in = nn.LeakyReLU(0.2)\n\n # Resblocks keeping shape\n self.resblocks = nn.Sequential(*[\n ResidualBlock(nf, spectral_norm=spectral_norm) for i in range(nres)\n ])\n # Resblocks with upscaling\n self.upblocks = nn.Sequential(*[\n UpsampleBlock(nf, spectral_norm=spectral_norm, method=upsample_method) for i in range(3)\n ])\n self.conv_out = nn.Conv2d(nf, nout, kernel_size=9, stride=1, padding=4)\n \n if spectral_norm: \n self.conv_in = nn.utils.spectral_norm(self.conv_in)\n self.conv_out = nn.utils.spectral_norm(self.conv_out)\n \n def forward(self, x):\n out = self.conv_in(x)\n out = self.activation_in(out)\n if self.use_noise: \n bs, _, h, w = x.shape\n z = torch.normal(0, 1, size=(bs, 1, h, w), device=device, requires_grad=True)\n out = torch.cat([out, z], dim=1)\n skip = out\n out = self.resblocks(out)\n out = out + skip\n out = self.upblocks(out)\n out = self.conv_out(out)\n if self.relu_out:\n out = nn.functional.relu(out)\n if self.softmax_out:\n out = nn.functional.softmax(out, dim=1)\n return out",
"_____no_output_____"
],
[
"gen = Generator(\n nres=3, nf_in=1, nf=64, relu_out=False, use_noise=False, spectral_norm=False,\n nout=1, softmax_out=False, upsample_method='PixelShuffle'\n).to(device)",
"_____no_output_____"
],
[
"count_parameters(gen)",
"_____no_output_____"
],
[
"criterion = nn.MSELoss()\noptimizer = torch.optim.Adam(gen.parameters(), lr=1e-5)",
"_____no_output_____"
],
[
"trainer = Trainer(gen, optimizer, criterion, dl_train, dl_valid)",
"_____no_output_____"
],
[
"trainer.fit(10)",
"_____no_output_____"
],
[
"trainer.plot_losses()",
"_____no_output_____"
],
[
"plot_sample(X, y, gen, 13)",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79f4dda4e6f7a93fc2c4e4cba4020dbc3670ec6 | 3,722 | ipynb | Jupyter Notebook | src/dataset.ipynb | gitmenonsandu/Image-Denoising-with-Deep-CNN | ecbd1ab63c5237b6b68492c5421330fb2f1c3b3d | [
"MIT"
] | 6 | 2017-11-30T02:59:37.000Z | 2019-09-18T12:18:14.000Z | src/dataset.ipynb | gitmenonsandu/Image-Denoising-with-Deep-CNN | ecbd1ab63c5237b6b68492c5421330fb2f1c3b3d | [
"MIT"
] | 2 | 2017-11-29T08:39:56.000Z | 2018-08-02T21:32:59.000Z | src/dataset.ipynb | gitmenonsandu/Image-Denoising-with-Deep-CNN | ecbd1ab63c5237b6b68492c5421330fb2f1c3b3d | [
"MIT"
] | 6 | 2017-11-29T12:11:37.000Z | 2020-04-11T11:25:12.000Z | 20.910112 | 73 | 0.509404 | [
[
[
"import sys\nimport os\nimport numpy as np\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"HEIGHT = 96\nWIDTH = 96\nDEPTH = 3\n\nSIZE = HEIGHT*WIDTH*DEPTH",
"_____no_output_____"
],
[
"DATA_PATH = '../dataset/stl10_binary/train_X.bin'\nLABEL_PATH = '../dataset/stl10_binary/train_y.bin'",
"_____no_output_____"
],
[
"def read_labels(path_to_labels):\n with open(path_to_labels, 'rb') as f:\n labels = np.fromfile(f,dtype=np.uint8)\n return labels",
"_____no_output_____"
],
[
"def read_all_images(path_to_data):\n with open(path_to_data, 'rb') as f:\n all_data = np.fromfile(f,dtype=np.uint8)\n \n #Data resized to 3x64x64\n #-1 since size of the pictures depends on the input file\n images = np.reshape(all_data, (-1, 3, HEIGHT, WIDTH))\n \n #Transposing to a standard image format\n #Comment this line before training algorithms like CNNs\n images = np.transpose(images, (0,3,2,1))\n return images",
"_____no_output_____"
],
[
"def read_single_image(image_file):\n image = np.fromfile(image_file,dtype=np.uint8,count=SIZE)\n \n image = np.reshape(image,(3,HEIGHT,WIDTH))\n \n image = np.transpose(image, (2,1,0))\n return image",
"_____no_output_____"
],
[
"def plot_image(image):\n plt.imshow(image)\n plt.show()",
"_____no_output_____"
],
[
"def display_one_image(image):\n with open(DATA_PATH, 'rb') as f:\n image=read_single_image(f)\n plot_image(image)",
"_____no_output_____"
],
[
"def get_shape_of_dataset():\n images= read_all_images(DATA_PATH)\n return images.shape",
"_____no_output_____"
]
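,
[
"# A small usage sketch, assuming the STL-10 binaries have already been\n# downloaded to ../dataset/stl10_binary/ as the paths above expect.\nif os.path.exists(DATA_PATH) and os.path.exists(LABEL_PATH):\n    labels = read_labels(LABEL_PATH)\n    images = read_all_images(DATA_PATH)\n    print(images.shape, labels.shape)\n    plot_image(images[0])\nelse:\n    print('STL-10 binaries not found; download them first.')",
"_____no_output_____"
]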
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79f4e192ed8d9a73b670cfc83deba9d035f7d1f | 201,747 | ipynb | Jupyter Notebook | examples/savefig.ipynb | fadeawaylove/stock-trade-system | 133762e6459745fc6c818b43729c1ffff5b9c5ad | [
"Apache-2.0"
] | 3 | 2020-11-10T03:35:05.000Z | 2021-07-04T15:18:44.000Z | examples/savefig.ipynb | fadeawaylove/stock-trade-system | 133762e6459745fc6c818b43729c1ffff5b9c5ad | [
"Apache-2.0"
] | null | null | null | examples/savefig.ipynb | fadeawaylove/stock-trade-system | 133762e6459745fc6c818b43729c1ffff5b9c5ad | [
"Apache-2.0"
] | null | null | null | 543.792453 | 72,257 | 0.919716 | [
[
[
"# This allows multiple outputs from a single jupyter notebook cell:\nfrom IPython.core.interactiveshell import InteractiveShell\nInteractiveShell.ast_node_interactivity = \"all\"",
"_____no_output_____"
],
[
"%matplotlib inline\nimport pandas as pd\nimport mplfinance as mpf\nimport io\nprint('pandas version:',pd.__version__)\nprint('mplfinance version:',mpf.__version__)",
"pandas version: 1.0.3\nmplfinance version: 0.12.5a0\n"
]
],
[
[
"---\n\n# Saving your plot to a file (or to an io buffer):\n\n- `mplfinance.plot()` allows you to save your plot to a file, or io-buffer, using the `savefig` keyword.\n- The value of `savefig` may be a `str`, `dict`, or `io.ByteIO` object. \n - If the value is a `str` then it is assumed to be the file name to which to save the figure/plot.\n - If the value is an `io.ByteIO` object, then the figure will be saved to the io buffer object. This avoids interaction with disk, and can also be useful when mplfinance is behind a web server (so that requests for an image file can be serviced without going to disk).\n\nIf the file extension is one of those recognized by `matplotlib.pyplot.savefig()` then the file type will be inferred from the extension, for example: `.pdf`, `.svg`, `.png`, `.jpg` ...\n",
"_____no_output_____"
]
],
[
[
"df = pd.read_csv('data/SP500_NOV2019_Hist.csv',index_col=0,parse_dates=True)",
"_____no_output_____"
],
[
"%%capture \n## cell magic function `%%capture` blocks jupyter notebook output,\n## which is not needed here since the plot is saved to a file anyway:\nmpf.plot(df,type='candle',volume=True,savefig='testsave.png')",
"_____no_output_____"
]
],
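[
[
"%%capture \n## Since the file type is inferred from the extension, the same call can\n## write a vector format directly (for example svg):\nmpf.plot(df,type='candle',volume=True,savefig='testsave.svg')",
"_____no_output_____"
]
],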
[
[
"---\n\n### We can use IPython.display.Image to display the image file here in the notebook:",
"_____no_output_____"
]
],
[
[
"import IPython.display as IPydisplay",
"_____no_output_____"
],
[
"%ls -l testsave.png\nIPydisplay.Image(filename='testsave.png')",
"-rw-rw-rw- 1 dino dino 24877 Jun 7 17:52 \u001b[0m\u001b[01;35mtestsave.png\u001b[0m\r\n"
]
],
[
[
"---\n\n### We can use io to save the plot as a byte buffer:",
"_____no_output_____"
]
],
[
[
"%%capture \n## cell magic function `%%capture` blocks jupyter notebook output, \n## which is not needed here, since the plot is saved to the io-buffer anyway:\nbuf = io.BytesIO()\nmpf.plot(df,type='candle',volume=True,savefig=buf)\nbuf.seek(0)",
"_____no_output_____"
]
],
[
[
"### We can use Ipython.display.Image to display the image in the ioBytes buffer:",
"_____no_output_____"
]
],
[
[
"IPydisplay.Image(buf.read())",
"_____no_output_____"
]
],
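[
[
"## An illustrative (commented-out) sketch of the web-server use case mentioned\n## above: serving the buffer directly, so no image file ever touches disk.\n## Flask is only an example framework here and is not used elsewhere in this\n## notebook.\n#\n# from flask import Flask, send_file\n#\n# app = Flask(__name__)\n#\n# @app.route('/chart.png')\n# def chart():\n#     img = io.BytesIO()\n#     mpf.plot(df, type='candle', volume=True, savefig=img)\n#     img.seek(0)\n#     return send_file(img, mimetype='image/png')",
"_____no_output_____"
]
],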
[
[
"---\n\n# Specifying image attributes with `savefig`\n\nWe can control various attributes of the saved figure/plot by passing a `dict`ionary as the value for the `savefig` keyword.\n\nThe dictionary **must** contain the keyword `fname` for the file name to be saved, **and *may* contain any of the other keywords accepted by [`matplotlib.pyplot.savefig()`](https://matplotlib.org/3.1.1/api/_as_gen/matplotlib.pyplot.savefig.html)** (for example: dpi, facecolor, edgecolor, orientation, format, metadata, quality)\n\nWhen creating the `dict`, I recommend using the `dict()` constructor so that that `keyword=` syntax may be used and thereby more closely resemble calling:\n**[`matplotlib.pyplot.savefig()`](https://matplotlib.org/3.1.1/api/_as_gen/matplotlib.pyplot.savefig.html)**\n",
"_____no_output_____"
]
],
[
[
"%%capture\n## %%capture blocks jupyter notebook output; plots are saved to files anyway:\nsave = dict(fname='tsave30.jpg',dpi=30,pad_inches=0.25)\nmpf.plot(df,volume=True,savefig=save)\nmpf.plot(df,volume=True,savefig=dict(fname='tsave100.jpg',dpi=100,pad_inches=0.25))",
"_____no_output_____"
],
[
"%ls -l tsave30.jpg\n%ls -l tsave100.jpg\nIPydisplay.Image(filename='tsave30.jpg')\nIPydisplay.Image(filename='tsave100.jpg')",
"-rw-rw-rw- 1 dino dino 11016 Jun 7 17:52 \u001b[0m\u001b[01;35mtsave30.jpg\u001b[0m\n-rw-rw-rw- 1 dino dino 54172 Jun 7 17:52 \u001b[0m\u001b[01;35mtsave100.jpg\u001b[0m\n"
]
],
[
[
"## Specifying image attributes (via `savefig`) dict also works with an io.BytesIO buffer:\n- Just assign the io-buffer to the `fname` key in the savefig dict",
"_____no_output_____"
]
],
[
[
"%%capture\nbuf30dpi = io.BytesIO()\nbuf100dpi = io.BytesIO()\nmpf.plot(df,volume=True,savefig=dict(fname=buf30dpi ,dpi=30 ,pad_inches=0.25))\nmpf.plot(df,volume=True,savefig=dict(fname=buf100dpi,dpi=100,pad_inches=0.25))",
"_____no_output_____"
]
],
[
[
"### Use Ipython.display.Image to display the buffer contents:",
"_____no_output_____"
]
],
[
[
"_ = buf30dpi.seek(0)\nIPydisplay.Image(buf30dpi.read())",
"_____no_output_____"
],
[
"_ = buf100dpi.seek(0)\nIPydisplay.Image(buf100dpi.read())",
"_____no_output_____"
]
],
[
[
"---\n\n## A note about `jpeg` files:\n\n**[`matplotlib.pyplot.savefig()`](https://matplotlib.org/3.1.1/api/_as_gen/matplotlib.pyplot.savefig.html)** uses the Python Image Library (PIL or pillow) to save jpeg files. \nThus you must have pillow installed (`pip install pillow`) to save jpeg images. \n\n**Version 3.1.2 of matplotlib has an incompatibility with version 7.x.x of pillow** (which was released to PyPI on On January 2, 2020. This incompatibility was **[fixed here](https://github.com/matplotlib/matplotlib/pull/16086/commits)** however (as of this writting) the fixed version of matplotlib was not yet on PyPI (to be pip installable), *but it may be by the time you read this*.\n\nIf you encounter an exception when trying to save a jpeg file, that says \"format 'jpg' is not supported\" check that you have pillow installed. If you do have pillow 7.x.x installed, and you are encountering this exception, the permanent fix for the problem is to upgrade to the new version of matplotlib `pip install --upgrade matplotlib`. (This should work if the new version of matplotlib is on Pypi at the time).\n\nIf upgrading matplotlib doesn't work, you can immediately fix this problem in one of the following temporary solutions:\n\n- install the previous version of pillow: **`pip install Pillow==6.2.2`**\n- edit your installed version of `.../site-packages/matplotlib/backend_bases.py` and apply **[this one-line fix](https://github.com/matplotlib/matplotlib/pull/16086/files).**\n\n\n---",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
] |
e79f5a75a3c5578cd8e6b5bec00ef0e348a2e5a4 | 328,624 | ipynb | Jupyter Notebook | notebooks/data_sources.ipynb | braadbaart/covid19 | ffe22cd5ed7e8409c99ac27805e33eea941f031c | [
"MIT"
] | null | null | null | notebooks/data_sources.ipynb | braadbaart/covid19 | ffe22cd5ed7e8409c99ac27805e33eea941f031c | [
"MIT"
] | 4 | 2020-11-13T18:45:25.000Z | 2022-02-10T01:26:47.000Z | notebooks/data_sources.ipynb | braadbaart/covid19 | ffe22cd5ed7e8409c99ac27805e33eea941f031c | [
"MIT"
] | null | null | null | 511.079316 | 143,036 | 0.932491 | [
[
[
"import numpy as np\nimport pandas as pd\n\n%matplotlib inline\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"pd.options.display.max_colwidth = 200",
"_____no_output_____"
]
],
[
[
"### Oxford University COVID-19 forecasting project data\n\nThe data below was downloaded from the [Oxford University countermeasures database](https://www.notion.so/COVID-19-countermeasures-database-b532a58d6f944ef6982ab565627bdb08). See the website for a description of the data sources.",
"_____no_output_____"
]
],
[
[
"countermeasures_df = pd.read_csv(\"data/containment_measures_march23.csv\")",
"_____no_output_____"
],
[
"countermeasures_df.head(1)",
"_____no_output_____"
],
[
"countermeasures_df[[\"Country\", \"Date Start\", \"Description of measure implemented\"]].groupby(\"Country\").head(10)",
"_____no_output_____"
],
[
"print(countermeasures_df[\"Country\"].unique())",
"['Austria' 'Germany' 'United Kingdom' 'Vietnam' 'South Korea' 'Singapore'\n 'Israel' 'Japan' 'Sweden' 'San Marino' 'Hong Kong' 'Taiwan' 'Macau'\n 'China' 'United States' nan 'Thailand' 'Italy' 'Czechia' 'Australia'\n 'Trinidad and Tobago' 'Qatar' 'North Korea' 'New Zealand' 'Colombia'\n 'France' 'Romania' 'Portugal' 'Spain' 'Belgium' 'Luxembourg' 'Albania'\n 'Andorra' 'Azerbaijan' 'Belarus' 'Bosnia and Herzegovina' 'Bulgaria'\n 'Estonia' 'Denmark' 'Cyprus' 'Croatia' 'Finland' 'Georgia' 'Hungary'\n 'Latvia' 'Lithuania' 'Moldova' 'Greece' 'Malta' 'Monaco' 'Netherlands'\n 'Iceland' 'Guernsey' 'Macedonia' 'Ireland' 'Vatican City' 'Jersey'\n 'Kosovo' 'Kazakhstan' 'US: Florida' 'Ukraine' 'Turkey' 'Poland'\n 'Slovakia' 'Serbia' 'Slovenia' 'Switzerland' 'Montenegro' 'Norway' 'Iran'\n 'Liechtenstein' 'Russia' 'US:N Carolina' 'US: Oregon' 'US: Arizona'\n 'US:California' 'US:Idaho' 'US: Nevada' 'US:Utah' 'US:Washington'\n 'US: S Carolina' 'Mexico' 'Egypt' 'Malaysia' 'US:Georgia' 'US:Maryland'\n 'US: Indiana' 'US: Illinois' 'US:Delaware' 'US: Wisconsin' 'US: Virginia'\n 'US: Michigan']\n"
],
[
"print(countermeasures_df[\"Implementing City\"].unique())",
"[nan 'Danang' 'Seoul' 'Wuhan' 'Beijing, Shenzhen' 'Washington DC'\n 'Huanggang' 'Huanggong' 'Wenzhou' 'Beijing' 'Shenzhen' 'Bangkok' 'Tokyo'\n 'Jingmen' 'Nanjing, Suzhou' 'Xiaogan' 'Hangzhou'\n 'Yiwu International Trade City' 'Dalian, Qingdao, Shenyang, Weihai'\n 'Yantai' 'Madrid' 'La Gomera' 'Valencia' 'Sevilla' 'Barcelona' 'Paris'\n 'Vitoria-Gasteiz' 'Banja Luka'\n 'Banja Luka, Doboj, Mrkonjić Grad, Prnjavor, Čelinac' 'Sofia' 'Budapest'\n 'Miskolc' 'Budapest, Kaposvár, Miskolc, Szeged, Székesfehérvár'\n 'Vatican City' 'San Francisco' 'Moscow'\n 'Bertonico, Casalpusterlengo, Castelgerundo, Castiglione d’Adda, Codogno, Fombio, Maleo, San Fiorano, Somaglia, Terranova, Vò'\n 'Naples, Palermo' 'Taranto' 'Messina' 'Medicina' 'Ischgl' 'Bansko']\n"
],
[
"print(countermeasures_df[\"Implementing State/Province\"].unique())",
"[nan 'Quang Ninh' 'Gyeongbook Province' 'Daegu, Gyeongbook Province'\n 'Busan' 'Hubei' 'Hunan' 'Tianjin' 'Zheijang' 'Meituan' 'Chongqing'\n 'Zhejian' 'Shanghai' 'Sichuan' 'Jiangsu' 'Guangdong' 'Hangzhou'\n 'Shenzhen' 'Leishenshan' 'Guangzhou' 'Kanagawa prefecture' 'Guangxi'\n 'Guizhou' 'Huanggang' 'Henan, Shandong' 'Liaoning, Shandong' 'Shandong'\n 'Madrid' 'Tyrol' 'Basque Country' 'Galicia' 'Republika Srpska' 'Bosnia'\n 'Autonomous Region of Madeira' 'California' 'Leningrad Oblast'\n 'Moscow Oblast' 'St. Petersburg Oblast' 'Santa Clara County' 'Seattle'\n 'Berkley, Countra Costa Country, Santa Clara County, los Angles'\n 'Orange County' 'Placer County, San Mateo County, Sonoma County'\n 'San Benito County, Santa Clara County'\n 'Kershan County, Lancaster County' 'Bavaria' 'Colima' 'Mexico City'\n 'South Fulton' 'Atalanta, Brookhaven, Clarkston, Dunwoody' 'Atalanta'\n 'Albany, Athens-Clark county, Bosnia, Dougherty County'\n 'Gyeonggi-Province, Paju' 'Gyeonggi-Province, Seoul'\n 'Farifax county, Loudoun county, Prince William County, Stafford County']\n"
]
],
[
[
"### John Hopkins containment measures database\n\nThe data is made available as part of the John Hopkins [Containment Measures Database](http://epidemicforecasting.org/containment). See the website in the link for a description of the data sources.",
"_____no_output_____"
]
],
[
[
"containment_df = pd.read_csv(\"data/countermeasures_db_johnshopkins_2020_03_30.csv\")",
"_____no_output_____"
],
[
"containment_df.columns",
"_____no_output_____"
],
[
"print(containment_df[\"Country\"].unique())",
"['Austria' 'Germany' 'United Kingdom' 'Vietnam' 'South Korea' 'Singapore'\n 'Israel' 'Japan' 'Sweden' 'San Marino' 'Slovenia' 'Canada' 'Taiwan'\n 'Macau' 'Hong Kong' 'China' 'Thailand' 'Italy' 'Czechia' 'Australia'\n 'Trinidad and Tobago' 'Qatar' 'New Zealand' 'Colombia' 'Romania' 'France'\n 'Portugal' 'Spain' 'Belgium' 'Luxembourg' 'Albania' 'Andorra'\n 'Azerbaijan' 'Belarus' 'Bosnia and Herzegovina' 'Bulgaria' 'Denmark'\n 'Estonia' 'Cyprus' 'Croatia' 'Finland' 'Georgia' 'Hungary' 'Latvia'\n 'Lithuania' 'Greece' 'Moldova' 'Malta' 'Monaco' 'Netherlands' 'Iceland'\n 'Ireland' 'Kosovo' 'Kazakhstan' 'Poland' 'Turkey' 'Ukraine' 'Slovakia'\n 'Serbia' 'Switzerland' 'Norway' 'Montenegro' 'Iran' 'Liechtenstein'\n 'Russia' 'Mexico' 'Egypt' 'Malaysia' 'Nepal' 'Afghanistan' 'Iraq'\n 'Philippines' 'Kuwait' 'South Africa' 'Armenia' 'Pakistan' 'Brazil'\n 'Costa Rica' 'Panama' 'India' 'Bahrain' 'United Arab Emirates'\n 'Kyrgyzstan' 'Indonesia' 'Namibia' 'Uganda']\n"
],
[
"cases_df = containment_df[[\"Date\", \"Country\", \"Confirmed Cases\", \"Deaths\"]]\\\n.loc[containment_df[\"Confirmed Cases\"] > 3000]\\\n.pivot(index=\"Date\", columns=\"Country\", values=\"Confirmed Cases\")",
"_____no_output_____"
],
[
"cases_df.plot(figsize=(16,8), title=\"Per-country growth in confirmed cases after the first 3000\")\\\n.legend(bbox_to_anchor=(1,1))",
"_____no_output_____"
],
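[
"# A minimal sketch: re-index each country's series to 'days since the 3000th\n# case' so the growth curves above become directly comparable.\naligned = pd.DataFrame({c: s.dropna().reset_index(drop=True) for c, s in cases_df.items()})\naligned.plot(figsize=(16,8), logy=True, title=\"Confirmed cases by days since reaching 3000 cases\")\\\n.legend(bbox_to_anchor=(1,1))",
"_____no_output_____"
],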
[
"deaths_df = containment_df[[\"Date\", \"Country\", \"Confirmed Cases\", \"Deaths\"]]\\\n.loc[containment_df[\"Deaths\"] > 100]\\\n.pivot(index=\"Date\", columns=\"Country\", values=\"Deaths\")",
"_____no_output_____"
],
[
"deaths_df.plot(figsize=(16,8), title=\"Deaths per country after the first 100 deaths recorded\")\\\n.legend(bbox_to_anchor=(1,1))",
"_____no_output_____"
],
[
"other_cm_cols = ['Unnamed: 0', 'Resumption', 'Diagnostic criteria loosened', 'Testing criteria', 'Date', 'Country',\n 'Confirmed Cases', 'Deaths']",
"_____no_output_____"
],
[
"countermeasures = list(filter(lambda m: m not in other_cm_cols, containment_df.columns))",
"_____no_output_____"
],
[
"cm_df = containment_df[countermeasures + ['Date', 'Country']].fillna(0)",
"_____no_output_____"
],
[
"cm_df[countermeasures] = cm_df[countermeasures].mask(cm_df[countermeasures] > 0, 1)",
"_____no_output_____"
],
[
"cm_df.groupby(\"Date\").sum().plot(figsize=(16,8), title=\"Number of countries implementing measure by date\")\\\n.legend(bbox_to_anchor=(1,1))",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79f5cfb666900bb892b73518f8fabb24e87238b | 120,185 | ipynb | Jupyter Notebook | milestone-two/sarsa_lambda_agent_mountain_car_v0.ipynb | galleon/prototyping-self-driving-agents | fa91feba106d66878f72c54a0d6ede0f8f9a013c | [
"Unlicense"
] | null | null | null | milestone-two/sarsa_lambda_agent_mountain_car_v0.ipynb | galleon/prototyping-self-driving-agents | fa91feba106d66878f72c54a0d6ede0f8f9a013c | [
"Unlicense"
] | null | null | null | milestone-two/sarsa_lambda_agent_mountain_car_v0.ipynb | galleon/prototyping-self-driving-agents | fa91feba106d66878f72c54a0d6ede0f8f9a013c | [
"Unlicense"
] | null | null | null | 182.651976 | 40,255 | 0.868844 | [
[
[
"### **Rendering component declaration.**",
"_____no_output_____"
]
],
[
[
"# imports for setting up display for the colab server.\n!sudo apt-get update > /dev/null 2>&1\n!sudo apt-get install -y xvfb x11-utils > /dev/null 2>&1\n!pip install gym==0.17.* pyvirtualdisplay==0.2.* PyOpenGL==3.1.* PyOpenGL-accelerate==3.1.* > /dev/null 2>&1",
"_____no_output_____"
],
[
"# gym related import statements.\nimport gym\nfrom gym import logger as gymlogger\nfrom gym.wrappers import Monitor\ngymlogger.set_level(40) #error only\n# RL agent construction related imports.\nimport numpy as np\nnp.random.seed(0)\nimport matplotlib.pyplot as plt\nfrom scipy.special import softmax\n# virtual display related import statements.\nimport math\nimport glob\nimport io\nimport base64\nimport time\nfrom time import sleep\nfrom tqdm import tqdm\nfrom IPython.display import HTML\nfrom IPython import display as ipythondisplay",
"_____no_output_____"
],
[
"# This creates virtual display to send the frames for being rendered.\nfrom pyvirtualdisplay import Display\ndisplay = Display(visible=0, size=(1366, 768))\ndisplay.start()",
"_____no_output_____"
],
[
"def show_video():\n '''\n This function loads the data video inline into the colab notebook.\n By reading the video stored by the Monitor class.\n '''\n mp4list = glob.glob('video/*.mp4')\n if len(mp4list) > 0:\n mp4 = mp4list[0]\n video = io.open(mp4, 'r+b').read()\n encoded = base64.b64encode(video)\n ipythondisplay.display(HTML(data='''<video alt=\"test\" autoplay \n loop controls style=\"height: 400px;\">\n <source src=\"data:video/mp4;base64,{0}\" type=\"video/mp4\" />\n </video>'''.format(encoded.decode('ascii'))))\n else: \n print(\"Could not find video\")\n \n\ndef wrap_env(env):\n '''\n This monitoring tool records the outputs from the output and saves it a\n mp4 file in the stated directory. If we don't change the video directory\n the videos will get stored in 'content/' directory.\n '''\n env = Monitor(env, './video', force=True)\n return env",
"_____no_output_____"
]
],
[
[
"### **SARSA(Lambda) Algorithm Implementation for MountainCarV0 Environment**\n\n__The implementation consists of below stated sections:__ \n* __Agent class decleration and parsing environment.__\n* __SARSA(LAMBDA) Algorithm Implementation.__\n* __Plotting results, outputting results and downloading them.__",
"_____no_output_____"
],
[
"### **Agent Class Decleration and Parsing Environment**",
"_____no_output_____"
]
],
[
[
"# This environment has two degrees of freedom: position and velocity.\n# Our agent will learn to interact with environment having these two values.\nclass State:\n def __init__(self):\n self.pos = None\n self.vel = None\n\n# Agent class defined for storing all the agent related values.\n# and getting actions from the policy. Here, target policy is same as behavior policy.\nclass Agent:\n def __init__(self, env):\n self.velocity_lim = np.array([env.observation_space.low[1], env.observation_space.high[1]])\n self.position_lim = np.array([env.observation_space.low[0], env.observation_space.high[0]])\n\n self.velocity_step, self.position_step = 0.005, 0.1\n self.velocity_space = np.arange(self.velocity_lim[0], self.velocity_lim[1] \n + self.velocity_step, self.velocity_step)\n self.position_space = np.arange(self.position_lim[0], self.position_lim[1] \n + self.position_step, self.position_step)\n self.m, self.n, self.n_action = len(self.velocity_space), len(self.position_space), 3\n self.Q_sa = np.full(shape = (self.m, self.n, 3),\n fill_value = 0.0, dtype = np.float32)\n self.collective_record = []\n self.success = []\n \n def get_action_value_index(self, state):\n pos_offset = state[0] - self.position_lim[0]\n vel_offset = state[1] - self.velocity_lim[0]\n pos_ind = pos_offset // self.position_step\n vel_ind = vel_offset // self.velocity_step\n \n return np.array([vel_ind, pos_ind], dtype= np.int)\n \n def get_action(self, state):\n ind = self.get_action_value_index(state, 0)\n p = self.Policy[ind[0], ind[1], :]\n action = np.random.choice([0, 1, 2], size = 1, p = p)\n return action[0]",
"_____no_output_____"
],
[
"# Wraping the environment in the Monitor class.\nenv = wrap_env(gym.make('MountainCar-v0'))\n# Fixing the randomness in the environment.\nenv.seed(0)",
"_____no_output_____"
],
[
"# Parsing the environment onto Agent object.\nsarsa_agent = Agent(env)\n# For some information into Q(s,a) table generated.\n# It's dimension is equal to A(Q(s,a)) = n[D(1)]*n[D(2)]*...*n[D(f)]*n(D(actions))\nprint(\"Q Shape = \",sarsa_agent.Q_sa.shape)",
"Q Shape = (30, 20, 3)\n"
]
],
[
[
"### **SARSA(Lambda) algorithm implementation.**",
"_____no_output_____"
]
],
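[
[
"For reference, the next cell implements the accumulating-trace SARSA($\\lambda$) update\n\n$$\\delta_t = r_{t+1} + \\gamma\\, Q(s_{t+1}, a_{t+1}) - Q(s_t, a_t),$$\n\n$$e(s_t, a_t) \\leftarrow e(s_t, a_t) + 1,$$\n\n$$Q(s, a) \\leftarrow Q(s, a) + \\alpha\\, \\delta_t\\, e(s, a), \\qquad e(s, a) \\leftarrow \\gamma\\lambda\\, e(s, a) \\quad \\text{for all } (s, a),$$\n\nso $\\lambda = 0$ recovers one-step SARSA, while $\\lambda \\to 1$ approaches a Monte-Carlo-style update.",
"_____no_output_____"
]
],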
[
[
"# SARSA(lambda) algorithm takes part from TD(0) and TD(1) algorithm.\n\neps = 0.8 # greedy epsilon exploration-vs-exploitation variable.\nchanged_eps= []\nchanges_alpha= []\nalpha = 0.2 # learning rate value\nlambda_val = 0.8 # credit assignment variable to previous states.\nalpha_decay = 0.999\neps_decay = 0.995\nsarsa_agent.e = np.zeros(shape = (sarsa_agent.m, sarsa_agent.n, 3)) # eligibility of all states.\nfinish = False\nnum_iter = 2000\nfor i_eps in tqdm(range(1, num_iter + 1)):\n state = env.reset()\n sarsa_agent.e[:, :, :] = 0\n gamma = 1.0\n ind = sarsa_agent.get_action_value_index(state)\n # greedy exploration and exploitation step.\n if np.random.random() < 1 - eps:\n action = np.argmax(sarsa_agent.Q_sa[ind[0], ind[1], :]) \n else:\n action = np.random.randint(0, 3)\n # running episodes for 200 times for this environment.\n for t in range(201):\n ind = sarsa_agent.get_action_value_index(state)\n next_state, reward, done, info = env.step(action)\n next_ind = sarsa_agent.get_action_value_index(next_state)\n \n if np.random.random() < 1 - eps:\n next_action = np.argmax(sarsa_agent.Q_sa[next_ind[0], next_ind[1], :]) \n else: \n next_action = np.random.randint(0, 3)\n \n # forward view T(lambda) SARSA equation for making updates.\n delta = reward + gamma * sarsa_agent.Q_sa[next_ind[0],next_ind[1], next_action] - sarsa_agent.Q_sa[ind[0],ind[1],action]\n sarsa_agent.e[ind[0],ind[1],action] += 1\n sarsa_agent.Q_sa = np.add(sarsa_agent.Q_sa, np.multiply(alpha * delta, sarsa_agent.e))\n sarsa_agent.e = np.multiply(gamma * lambda_val, sarsa_agent.e)\n \n if done: \n if t < 199:\n sarsa_agent.success.append((i_eps, t))\n sarsa_agent.collective_record.append(-t)\n eps = max(0.0, eps * eps_decay)\n alpha = max(0.0, alpha * alpha_decay)\n break\n state = next_state\n action = next_action",
"100%|██████████| 2000/2000 [00:52<00:00, 37.97it/s] \n"
]
],
[
[
"### **Plotting reward fuction resuls and displaying output.**",
"_____no_output_____"
]
],
[
[
"# This graph shows the saturation of rewards to a minimum under 1000 episodes.\nfig, ax = plt.subplots(figsize = (9, 5))\nplt.plot(sarsa_agent.collective_record[:],'.')\nplt.yticks(range(-110, -200, -10))\nplt.ylabel(\"reward function values\")\nplt.xlabel(\"episode number count\")\nplt.grid()\nplt.show()",
"_____no_output_____"
],
[
"# Calculating the mean performance of the agent.\nfor i_eps in (range(1, 100)):\n state = env.reset()\n gamma = 1.0\n ind = sarsa_agent.get_action_value_index(state)\n action = np.argmax(sarsa_agent.Q_sa[ind[0], ind[1], :]) \n \n for t in range(201):\n env.render()\n ind = sarsa_agent.get_action_value_index(state)\n next_state, reward, done, info = env.step(action)\n next_ind = sarsa_agent.get_action_value_index(next_state)\n \n next_action = np.argmax(sarsa_agent.Q_sa[next_ind[0], next_ind[1], :])\n \n if done: \n if t < 199:\n sarsa_agent.success.append((i_eps, t))\n sarsa_agent.collective_record.append(-t)\n sleep(1)\n break\n state = next_state\n action = next_action\n ",
"_____no_output_____"
],
[
"# Plotting the mean performance of the agent.\nfig, ax = plt.subplots(figsize = (9, 5))\nplt.plot(sarsa_agent.collective_record[-100:], '-')\nplt.yticks(range(-110, -200, -10))\nplt.title(\"Test Results\")\nplt.ylabel(\"Mean reward function value\")\nplt.xlabel(\"Episode number count\")\nplt.grid()\nplt.show()",
"_____no_output_____"
],
[
"# Demonstrating the output of the agent's working.\nshow_video()",
"_____no_output_____"
],
[
"# zipping the video folder for the given SARSA agent.\n!zip -r /content/file.zip /content/video\n# downloading the file resource.\nfrom google.colab import files\nfiles.download(\"/content/file.zip\")",
" adding: content/video/ (stored 0%)\n adding: content/video/openaigym.video.0.1649.video000512.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video002000.mp4 (deflated 7%)\n adding: content/video/openaigym.video.0.1649.video000001.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video000729.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video001000.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video000512.mp4 (deflated 6%)\n adding: content/video/openaigym.video.0.1649.video001000.mp4 (deflated 7%)\n adding: content/video/openaigym.video.0.1649.video000125.mp4 (deflated 7%)\n adding: content/video/openaigym.video.0.1649.video000008.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video000064.mp4 (deflated 9%)\n adding: content/video/openaigym.video.0.1649.video000125.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video000000.mp4 (deflated 12%)\n adding: content/video/openaigym.video.0.1649.video000343.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video000027.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video000027.mp4 (deflated 9%)\n adding: content/video/openaigym.video.0.1649.video000343.mp4 (deflated 7%)\n adding: content/video/openaigym.video.0.1649.video000216.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video000000.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video000008.mp4 (deflated 9%)\n adding: content/video/openaigym.video.0.1649.video000216.mp4 (deflated 7%)\n adding: content/video/openaigym.video.0.1649.video000729.mp4 (deflated 7%)\n adding: content/video/openaigym.video.0.1649.video000001.mp4 (deflated 9%)\n adding: content/video/openaigym.video.0.1649.video002000.meta.json (deflated 60%)\n adding: content/video/openaigym.video.0.1649.video000064.meta.json (deflated 60%)\n"
],
[
"",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79f5ffa0da5cec3a57b2fe23a2c5932c0470e1e | 25,020 | ipynb | Jupyter Notebook | .ipynb_checkpoints/polynomial_feat-checkpoint.ipynb | borab96/misc-notebooks | 484520f5ea554340a0836aef5b30ea451d9a8545 | [
"MIT"
] | null | null | null | .ipynb_checkpoints/polynomial_feat-checkpoint.ipynb | borab96/misc-notebooks | 484520f5ea554340a0836aef5b30ea451d9a8545 | [
"MIT"
] | null | null | null | .ipynb_checkpoints/polynomial_feat-checkpoint.ipynb | borab96/misc-notebooks | 484520f5ea554340a0836aef5b30ea451d9a8545 | [
"MIT"
] | null | null | null | 25,020 | 25,020 | 0.780296 | [
[
[
"import numpy as np\r\nimport matplotlib.pyplot as plt\r\nimport pandas as pd\r\nfrom sklearn.preprocessing import PolynomialFeatures\r\nfrom sklearn.model_selection import train_test_split\r\nfrom sklearn.pipeline import Pipeline\r\nfrom sklearn.linear_model import LinearRegression\r\nfrom sklearn.metrics import mean_squared_error\r\n\r\ndef train_test_val_split(x, y, random_state=None, split=[0.6, 0.2, 0.2]):\r\n \"\"\"\r\n Runs the train test split twice to break data up into train, validation and test sets at ratios specified by\r\n split\r\n \"\"\"\r\n x_train, x_test_val, y_train, y_test_val = train_test_split(x, y, test_size=1-split[0], random_state=random_state)\r\n x_val, x_test, y_val, y_test = train_test_split(x_test_val, y_test_val, test_size=split[1]/(split[1]+split[2]), random_state=random_state)\r\n return x_train, x_val, x_test, y_train, y_val, y_test\r\n",
"_____no_output_____"
]
],
[
[
"# Data",
"_____no_output_____"
]
],
[
[
"df = pd.read_csv(\"HW5_data.csv\")\r\nx = df[[\"X\", \"Y\"]].values\r\ny = df.Z.values\r\ndf.head()",
"_____no_output_____"
],
[
"df.describe()",
"_____no_output_____"
]
],
[
[
"The data is split into training, validation and test sets with ratios $0.6:0.2:0.2$ ",
"_____no_output_____"
]
],
[
[
"x_train, x_val, x_test, y_train, y_val, y_test = train_test_val_split(x, y)\r\nprint(\"Training size\")\r\nprint(x_train.shape)\r\nprint(\"Validation size\")\r\nprint(x_val.shape)\r\nprint(\"Test size\")\r\nprint(x_test.shape)",
"Training size\n(600, 2)\nValidation size\n(200, 2)\nTest size\n(200, 2)\n"
]
],
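[
[
"A quick check of the arithmetic: `train_test_val_split` first holds out $1 - 0.6 = 0.4$ of the data, then passes `test_size = 0.2/(0.2+0.2) = 0.5` to the second `train_test_split`, so the held-out 40% is divided evenly into validation and test sets (20%/20% of the full data).",
"_____no_output_____"
]
],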
[
[
"# Model\r\n\r\nWe are told to apply a polynomial transformation on the features and use linear regression to obtain the optimal coefficients of the polynomial fit. We create a simple pipeline to achieve this. The only hyperparameter is the maximal degree of the polynomial feature map. The feature map ignores the bias but the linear regressor does not which correctly accounts for the data not being centered. \r\n\r\nIn order to tune the degree hyperparameter we sweep the parameter space and compute the MSE on the unseen validation set we had created. I am assuming the question does not require anything beyond this simple approach.",
"_____no_output_____"
]
],
[
[
"def poly_fit_pipeline(degree):\r\n polynomial_features = PolynomialFeatures(degree=degree, include_bias=False)\r\n pipeline = Pipeline([(\"polynomial_features\", polynomial_features), (\"linear_regression\", LinearRegression())])\r\n return pipeline\r\n\r\ndegrees = [2,3,4,5,6,7]\r\nmse = []\r\n\r\nfor d in degrees:\r\n model = poly_fit_pipeline(d)\r\n model.fit(x_train, y_train)\r\n y_pred = model.predict(x_val)\r\n mse.append(mean_squared_error(y_pred, y_val))",
"_____no_output_____"
],
[
"print(\"Lowest validation MSE: \"+str(round(np.min(mse),3)))\r\nprint(\"Optimal degree: \"+str(degrees[np.argmin(mse)]))\r\nplt.figure()\r\nplt.plot(degrees, np.log(mse))\r\nplt.ylabel(r\"$\\log MSE$\")\r\nplt.xlabel(\"D\")",
"Lowest validation MSE: 0.009\nOptimal degree: 6\n"
],
[
"model_optimal = poly_fit_pipeline(6)\r\nmodel_optimal.fit(x_train, y_train)\r\nmse_train = mean_squared_error(y_train, model.predict(x_train))\r\nprint(\"Training MSE: \"+str(mse_train))\r\nmse_val = mean_squared_error(y_val, model.predict(x_val))\r\nprint(\"Validation MSE: \"+str(mse_val))\r\nmse_test = mean_squared_error(y_test, model.predict(x_test))\r\nprint(\"Test MSE: \"+str(mse_test) )",
"Training MSE: 0.008621014128936854\nValidation MSE: 0.00895723100171667\nTest MSE: 0.009421933878894272\n"
]
],
[
[
"The optimization over the hyperparameter space indicates that a maximal degree of $6$ is optimal. Notice that the difference between a degree 5 fit and a degree 6 fit is miniscule so one could also go with the less complex model that offers very similar accuracy. We'll stick with $D=6$ though.\r\n\r\n## Details of the optimal model",
"_____no_output_____"
]
],
[
[
"print(\"Model parameters\")\r\nprint(model_optimal.get_params())\r\nprint(\"regression coefficients\")\r\nprint(model_optimal['linear_regression'].coef_)\r\nprint(\"regression intercept\")\r\nprint(model_optimal['linear_regression'].intercept_)\r\n",
"Model parameters\n{'memory': None, 'steps': [('polynomial_features', PolynomialFeatures(degree=6, include_bias=False, interaction_only=False,\n order='C')), ('linear_regression', LinearRegression(copy_X=True, fit_intercept=True, n_jobs=None, normalize=False))], 'verbose': False, 'polynomial_features': PolynomialFeatures(degree=6, include_bias=False, interaction_only=False,\n order='C'), 'linear_regression': LinearRegression(copy_X=True, fit_intercept=True, n_jobs=None, normalize=False), 'polynomial_features__degree': 6, 'polynomial_features__include_bias': False, 'polynomial_features__interaction_only': False, 'polynomial_features__order': 'C', 'linear_regression__copy_X': True, 'linear_regression__fit_intercept': True, 'linear_regression__n_jobs': None, 'linear_regression__normalize': False}\nregression coefficients\n[ 8.50234361e-01 9.62565178e-01 2.44544103e-01 -2.94295566e+00\n 5.06385541e-03 2.24830034e-03 4.74477640e-01 9.21766117e-04\n -1.47698485e-03 2.97942590e-01 2.64905776e-03 2.57614722e-03\n -1.42506563e-03 2.57835307e-04 2.44146426e-04 -6.50061514e-07\n -4.57242498e-04 2.01383539e-04 4.08693006e-06 1.20000278e+00\n -9.28090109e-06 -4.25226714e-06 9.58005035e-06 1.34097259e-05\n -2.02810867e-05 7.75568090e-06 -1.56872787e-06]\nregression intercept\n5.092513526440598\n"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
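"code",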
"code"
]
] |
e79f675c5e632bf599efda84a7fbf9fb0fd026d2 | 13,326 | ipynb | Jupyter Notebook | notebooks/supervised/detection/object-detection-efficientdet-d4.ipynb | lucasdavid/algorithms-in-tensorflow | 78525e70c5a688c3d81c75b4ed19cbbef6ef568e | [
"Apache-2.0"
] | 3 | 2021-03-19T01:15:15.000Z | 2021-12-12T20:55:45.000Z | notebooks/supervised/detection/object-detection-efficientdet-d4.ipynb | lucasdavid/algorithms-in-tensorflow | 78525e70c5a688c3d81c75b4ed19cbbef6ef568e | [
"Apache-2.0"
] | null | null | null | notebooks/supervised/detection/object-detection-efficientdet-d4.ipynb | lucasdavid/algorithms-in-tensorflow | 78525e70c5a688c3d81c75b4ed19cbbef6ef568e | [
"Apache-2.0"
] | null | null | null | 24.406593 | 108 | 0.547276 | [
[
[
"# Object Detection",
"_____no_output_____"
],
[
"## Setup",
"_____no_output_____"
]
],
[
[
"#@title\n\nimport os\n\n!pip install --quiet tensorflow_text\nos.environ[\"TFHUB_MODEL_LOAD_FORMAT\"] = \"COMPRESSED\"",
"_____no_output_____"
],
[
"#@title\n\nimport os\nimport math\n\nimport numpy as np\nimport requests\n\nimport tensorflow as tf\nimport tensorflow_hub as hub\nimport tensorflow_text as tf_text\nimport tensorflow_datasets as tfds\n\nimport matplotlib.pyplot as plt\nimport seaborn as sns",
"_____no_output_____"
],
[
"sns.set_style(\"whitegrid\", {'axes.grid' : False})",
"_____no_output_____"
],
[
"%load_ext tensorboard",
"_____no_output_____"
],
[
"import requests\n\ndef download_image(url, path):\n r = requests.get(url, allow_redirects=True)\n with open(path, 'wb') as f:\n f.write(r.content)\n return path\n\ndef plot(y, titles=None):\n for i, image in enumerate(y):\n if image is None:\n plt.subplot(1, len(y), i+1)\n plt.axis('off')\n continue\n \n t = titles[i] if titles else None\n plt.subplot(1, len(y), i+1, title=t)\n plt.imshow(image)\n plt.axis('off')\n plt.tight_layout()",
"_____no_output_____"
]
],
[
[
"## Model Definition",
"_____no_output_____"
]
],
[
[
"detector = hub.load(\"https://tfhub.dev/tensorflow/efficientdet/d4/1\")",
"_____no_output_____"
]
],
[
[
"## Application",
"_____no_output_____"
]
],
[
[
"INPUT_SHAPE = [299, 299, 3]\n\nDATA_DIR = 'images/'\nIMAGES = [\n 'https://raw.githubusercontent.com/keisen/tf-keras-vis/master/examples/images/goldfish.jpg',\n 'https://raw.githubusercontent.com/keisen/tf-keras-vis/master/examples/images/bear.jpg',\n 'https://raw.githubusercontent.com/keisen/tf-keras-vis/master/examples/images/soldiers.jpg',\n 'https://3.bp.blogspot.com/-W__wiaHUjwI/Vt3Grd8df0I/AAAAAAAAA78/7xqUNj8ujtY/s400/image02.png'\n]",
"_____no_output_____"
],
[
"#@title\n\n\nos.makedirs(os.path.join(DATA_DIR, 'unknown'), exist_ok=True)\n\nfor i in IMAGES:\n _, f = os.path.split(i)\n download_image(i, os.path.join(DATA_DIR, 'unknown', f))",
"_____no_output_____"
],
[
"images_set = (\n tf.keras.preprocessing.image_dataset_from_directory(\n DATA_DIR,\n image_size=INPUT_SHAPE[:2],\n batch_size=32,\n shuffle=False)\n .cache()\n .prefetch(buffer_size=tf.data.experimental.AUTOTUNE))",
"_____no_output_____"
],
[
"#@title\n\nplt.figure(figsize=(12, 4))\nfor images, _ in images_set.take(1):\n for i, image in enumerate(images):\n plt.subplot(math.ceil(len(images) / 4), 4, i+1)\n plt.imshow(image.numpy().astype('uint8'))\n plt.axis('off')\n\nplt.tight_layout()",
"_____no_output_____"
],
[
"inputs = tf.cast(images[3:4], tf.uint8)",
"_____no_output_____"
],
[
"y = detector(inputs)\nclass_ids = y[\"detection_classes\"]",
"_____no_output_____"
],
[
"print(*y.keys(), sep='\\n')",
"_____no_output_____"
],
[
"y['num_detections']",
"_____no_output_____"
],
[
"y['detection_classes']",
"_____no_output_____"
],
[
"y['detection_boxes']",
"_____no_output_____"
],
[
"import matplotlib.patches as patches\n\nfig, ax = plt.subplots(1)\n\nax.imshow(inputs[0].numpy())\n\n# for y['detection_boxes']:\nax.add_patch(patches.Rectangle((50,100),40,30,linewidth=1,edgecolor='r',facecolor='none'))\n\n;",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79f6dc6e1a9d70f42593c1155f6504019206930 | 2,955 | ipynb | Jupyter Notebook | ipynb/01c-laboratorio-computacional.ipynb | gcpeixoto/FMECD | 9bca72574c6630d1594396fffef31cfb8d58dec2 | [
"CC0-1.0"
] | null | null | null | ipynb/01c-laboratorio-computacional.ipynb | gcpeixoto/FMECD | 9bca72574c6630d1594396fffef31cfb8d58dec2 | [
"CC0-1.0"
] | null | null | null | ipynb/01c-laboratorio-computacional.ipynb | gcpeixoto/FMECD | 9bca72574c6630d1594396fffef31cfb8d58dec2 | [
"CC0-1.0"
] | null | null | null | 34.360465 | 277 | 0.57665 | [
[
[
"# Laboratório Computacional 1",
"_____no_output_____"
],
[
"No laboratório computacional, você praticará o que aprendeu. Resolva os problemas com o auxílio do Python pesquisando apenas as informações essenciais de que precisa. Não use respostas prontas.",
"_____no_output_____"
],
[
"**Problema:** Quantos segundos existem em um século? Escreva a resposta usando um objeto `str` que denote o número de acordo com nosso sistema decimal (separado por milhar, milhão, etc.) e imprima o resultado com `print`. A resposta deve ser algo do tipo\n\n```python\n'1 século possui x.xxx.xxx.xxx segundos'\n```",
"_____no_output_____"
],
[
"**Problema:** Escreva um código para calcular as raízes da equação $ax^2 + bx + c = 0$, para $a$, $b$ e $c$ conhecidos e do tipo `int`. Em que situação sua resposta seria um objeto `complex`? Mostre um exemplo. *Obs.:* use `1j` para construir a parte imaginária.",
"_____no_output_____"
],
[
"**Problema:** Observe a tabela a seguir, onde **DS (UA)** é a distância do referido planeta do até o Sol em unidades astronômicas (UA), **Tm (F)** sua temperatura superficial mínima em graus Farenheit e **TM (F)** sua temperatura superficial máxima em graus Farenheit.\n\n| | DS (UA) | Tm ($^{\\circ}$C) | TM ($^{\\circ}$C) | DS (km) | TmM ($^{\\circ}$C) |\n|--|--|--|--|--|--|\nMercúrio | 0.39 | -275 | 840 | ? | ? |\nVênus | 0.723 | 870 | 870 | ? | ? |\nTerra | 1.0 | -129 | 136 | ? | ? |\nMarte | 1.524 | -195 | 70 | ? | ? |\n\n\n\n- Escreva um código para converter a temperatura dos planetas de graus Farenheit ($^{\\circ}$F) para Celsius ($^{\\circ}$C).\n\n- Escreva um código para converter unidades astronômicas em quilômetros.\n\n- Imprima os valores que deveriam ser inseridos na coluna **DS (km)** horizontalmente usando `print`.\n\n- Repita o item anterior para a coluna **TmM ($^{\\circ}$C)**, que é a média aritmética entre **Tm** e **TM**.\n \n \n*Observação:* use notação científica (exemplo: $4.2 \\times 10^8$ pode ser escrito como `4.2e8` em Python).",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
e79f6ee5a0f3a825fd40c9ef2ed776f430b06651 | 363,195 | ipynb | Jupyter Notebook | C3D/.ipynb_checkpoints/C3D-checkpoint.ipynb | reetikaag/human-activity-recognition | 1e6760a88ca52fe9a8a8ca60d000cd3426851156 | [
"MIT"
] | null | null | null | C3D/.ipynb_checkpoints/C3D-checkpoint.ipynb | reetikaag/human-activity-recognition | 1e6760a88ca52fe9a8a8ca60d000cd3426851156 | [
"MIT"
] | null | null | null | C3D/.ipynb_checkpoints/C3D-checkpoint.ipynb | reetikaag/human-activity-recognition | 1e6760a88ca52fe9a8a8ca60d000cd3426851156 | [
"MIT"
] | null | null | null | 474.764706 | 334,088 | 0.838748 | [
[
[
"#!pip install -r requirements.txt",
"_____no_output_____"
],
[
"#!pip3 install ffmpeg",
"_____no_output_____"
],
[
"#!pip3 install --upgrade tensorflow keras numpy pandas sklearn pillow",
"_____no_output_____"
],
[
"from keras.callbacks import TensorBoard, ModelCheckpoint, EarlyStopping, CSVLogger\nfrom data import DataSet\nimport time\nimport os.path\nimport shutil\nimport csv\nimport glob\nimport os\nimport os.path\nfrom subprocess import call",
"_____no_output_____"
],
[
"def get_train_test_lists(version='01'):\n \"\"\"\n Using one of the train/test files (01, 02, or 03), get the filename\n breakdowns we'll later use to move everything.\n \"\"\"\n # Get our files based on version.\n test_file = os.path.join('/home/shared/workspace/rose_ntu_dataset/'+'ntutraintest_rgb', 'testlist' + version + '.txt')\n train_file = os.path.join('/home/shared/workspace/rose_ntu_dataset/'+'ntutraintest_rgb', 'trainlist' + version + '.txt')\n\n # Build the test list.\n with open(test_file) as fin:\n test_list = [row.strip() for row in list(fin)]\n\n # Build the train list. Extra step to remove the class index.\n with open(train_file) as fin:\n train_list = [row.strip() for row in list(fin)]\n train_list = [row.split(' ')[0] for row in train_list]\n\n # Set the groups in a dictionary.\n file_groups = {\n 'train': train_list,\n 'test': test_list\n }\n print(file_groups['train'],file_groups['test'])\n return file_groups",
"_____no_output_____"
],
[
"file_groups = get_train_test_lists()",
"['sneezeCough/S010C003P019R002A041_rgb', 'sneezeCough/S005C001P021R001A041_rgb', 'sneezeCough/S003C002P001R001A041_rgb', 'sneezeCough/S005C001P013R002A041_rgb', 'sneezeCough/S015C001P025R001A041_rgb', 'sneezeCough/S013C002P037R002A041_rgb', 'sneezeCough/S012C003P028R001A041_rgb', 'sneezeCough/S011C003P025R002A041_rgb', 'sneezeCough/S001C003P007R002A041_rgb', 'sneezeCough/S012C003P019R002A041_rgb', 'sneezeCough/S017C001P015R002A041_rgb', 'sneezeCough/S017C002P003R002A041_rgb', 'sneezeCough/S008C001P034R001A041_rgb', 'sneezeCough/S008C002P015R002A041_rgb', 'sneezeCough/S013C002P017R002A041_rgb', 'sneezeCough/S001C002P003R002A041_rgb', 'sneezeCough/S011C001P008R002A041_rgb', 'sneezeCough/S009C002P025R002A041_rgb', 'sneezeCough/S016C003P019R001A041_rgb', 'sneezeCough/S012C002P028R002A041_rgb', 'sneezeCough/S015C001P015R001A041_rgb', 'sneezeCough/S006C001P015R001A041_rgb', 'sneezeCough/S007C001P028R001A041_rgb', 'sneezeCough/S010C003P007R002A041_rgb', 'sneezeCough/S014C003P019R001A041_rgb', 'sneezeCough/S013C002P025R001A041_rgb', 'sneezeCough/S015C002P008R001A041_rgb', 'sneezeCough/S015C002P037R001A041_rgb', 'sneezeCough/S011C003P008R001A041_rgb', 'sneezeCough/S009C002P019R002A041_rgb', 'sneezeCough/S008C003P034R001A041_rgb', 'sneezeCough/S012C001P007R001A041_rgb', 'sneezeCough/S001C003P001R002A041_rgb', 'sneezeCough/S002C002P007R002A041_rgb', 'sneezeCough/S010C003P008R001A041_rgb', 'sneezeCough/S016C001P040R002A041_rgb', 'sneezeCough/S002C003P011R002A041_rgb', 'sneezeCough/S009C001P008R001A041_rgb', 'sneezeCough/S003C003P015R001A041_rgb', 'sneezeCough/S011C003P018R002A041_rgb', 'sneezeCough/S001C003P008R001A041_rgb', 'sneezeCough/S012C003P008R001A041_rgb', 'sneezeCough/S008C002P029R001A041_rgb', 'sneezeCough/S012C001P019R002A041_rgb', 'sneezeCough/S015C001P037R002A041_rgb', 'sneezeCough/S008C002P007R002A041_rgb', 'sneezeCough/S008C003P036R001A041_rgb', 'sneezeCough/S011C003P016R001A041_rgb', 'sneezeCough/S012C001P028R001A041_rgb', 'sneezeCough/S002C001P007R002A041_rgb', 'sneezeCough/S006C001P016R001A041_rgb', 'sneezeCough/S015C002P016R002A041_rgb', 'sneezeCough/S008C003P025R001A041_rgb', 'sneezeCough/S002C003P007R002A041_rgb', 'sneezeCough/S012C002P019R001A041_rgb', 'sneezeCough/S013C002P025R002A041_rgb', 'sneezeCough/S003C003P018R001A041_rgb', 'sneezeCough/S017C003P009R001A041_rgb', 'sneezeCough/S008C002P036R002A041_rgb', 'sneezeCough/S007C002P015R001A041_rgb', 'sneezeCough/S006C003P023R002A041_rgb', 'sneezeCough/S005C002P017R002A041_rgb', 'sneezeCough/S013C003P019R001A041_rgb', 'sneezeCough/S010C002P021R001A041_rgb', 'sneezeCough/S006C002P001R002A041_rgb', 'sneezeCough/S010C003P017R001A041_rgb', 'sneezeCough/S001C003P007R001A041_rgb', 'sneezeCough/S003C001P007R002A041_rgb', 'sneezeCough/S016C003P025R001A041_rgb', 'sneezeCough/S007C001P007R002A041_rgb', 'sneezeCough/S014C003P017R002A041_rgb', 'sneezeCough/S005C002P013R002A041_rgb', 'sneezeCough/S008C001P007R001A041_rgb', 'sneezeCough/S012C001P017R001A041_rgb', 'sneezeCough/S003C002P017R002A041_rgb', 'sneezeCough/S015C003P015R001A041_rgb', 'sneezeCough/S014C001P015R002A041_rgb', 'sneezeCough/S007C001P019R002A041_rgb', 'sneezeCough/S007C003P018R002A041_rgb', 'sneezeCough/S008C002P008R001A041_rgb', 'sneezeCough/S011C002P038R002A041_rgb', 'sneezeCough/S003C003P001R001A041_rgb', 'sneezeCough/S015C001P025R002A041_rgb', 'sneezeCough/S011C001P018R001A041_rgb', 'sneezeCough/S003C001P017R002A041_rgb', 'sneezeCough/S012C001P007R002A041_rgb', 'sneezeCough/S002C002P007R001A041_rgb', 'sneezeCough/S006C003P001R002A041_rgb', 
'sneezeCough/S005C003P004R001A041_rgb', 'sneezeCough/S009C001P017R002A041_rgb', 'sneezeCough/S010C002P016R001A041_rgb', 'sneezeCough/S006C002P024R001A041_rgb', 'sneezeCough/S003C002P002R002A041_rgb', 'sneezeCough/S006C002P017R001A041_rgb', 'sneezeCough/S015C002P016R001A041_rgb', 'sneezeCough/S011C003P028R001A041_rgb', 'sneezeCough/S014C003P027R001A041_rgb', 'sneezeCough/S012C001P016R002A041_rgb', 'sneezeCough/S011C003P027R002A041_rgb', 'sneezeCough/S008C001P033R002A041_rgb', 'sneezeCough/S008C003P033R002A041_rgb', 'sneezeCough/S010C002P025R001A041_rgb', 'sneezeCough/S002C003P010R002A041_rgb', 'sneezeCough/S011C001P028R001A041_rgb', 'sneezeCough/S012C002P037R002A041_rgb', 'sneezeCough/S006C001P024R002A041_rgb', 'sneezeCough/S002C002P009R002A041_rgb', 'sneezeCough/S010C002P008R002A041_rgb', 'sneezeCough/S007C001P025R002A041_rgb', 'sneezeCough/S001C001P002R002A041_rgb', 'sneezeCough/S001C002P004R001A041_rgb', 'sneezeCough/S016C003P008R001A041_rgb', 'sneezeCough/S010C002P015R001A041_rgb', 'sneezeCough/S015C001P007R002A041_rgb', 'sneezeCough/S007C002P007R001A041_rgb', 'sneezeCough/S004C002P007R001A041_rgb', 'sneezeCough/S011C002P025R001A041_rgb', 'sneezeCough/S008C002P019R002A041_rgb', 'sneezeCough/S013C003P025R001A041_rgb', 'sneezeCough/S007C001P016R001A041_rgb', 'sneezeCough/S010C002P013R002A041_rgb', 'sneezeCough/S002C003P008R002A041_rgb', 'sneezeCough/S010C001P021R001A041_rgb', 'sneezeCough/S001C001P008R001A041_rgb', 'sneezeCough/S001C003P001R001A041_rgb', 'sneezeCough/S011C002P017R002A041_rgb', 'sneezeCough/S016C002P007R001A041_rgb', 'sneezeCough/S016C003P008R002A041_rgb', 'sneezeCough/S012C003P037R001A041_rgb', 'sneezeCough/S013C002P007R001A041_rgb', 'sneezeCough/S008C002P025R002A041_rgb', 'sneezeCough/S007C002P028R001A041_rgb', 'sneezeCough/S011C001P038R002A041_rgb', 'sneezeCough/S005C002P010R001A041_rgb', 'sneezeCough/S015C003P025R002A041_rgb', 'sneezeCough/S009C002P017R002A041_rgb', 'sneezeCough/S010C001P018R002A041_rgb', 'sneezeCough/S012C003P017R001A041_rgb', 'sneezeCough/S014C001P039R002A041_rgb', 'sneezeCough/S010C001P007R001A041_rgb', 'sneezeCough/S012C003P019R001A041_rgb', 'sneezeCough/S007C001P017R002A041_rgb', 'sneezeCough/S010C002P008R001A041_rgb', 'sneezeCough/S006C001P017R001A041_rgb', 'sneezeCough/S010C001P019R001A041_rgb', 'sneezeCough/S013C003P015R002A041_rgb', 'sneezeCough/S013C002P018R001A041_rgb', 'sneezeCough/S008C003P032R001A041_rgb', 'sneezeCough/S009C002P008R002A041_rgb', 'sneezeCough/S007C001P028R002A041_rgb', 'sneezeCough/S002C001P003R001A041_rgb', 'sneezeCough/S005C003P017R002A041_rgb', 'sneezeCough/S001C001P006R002A041_rgb', 'sneezeCough/S017C001P020R001A041_rgb', 'sneezeCough/S007C003P026R002A041_rgb', 'sneezeCough/S015C002P007R002A041_rgb', 'sneezeCough/S013C002P037R001A041_rgb', 'sneezeCough/S003C003P007R001A041_rgb', 'sneezeCough/S010C002P013R001A041_rgb', 'sneezeCough/S011C001P016R001A041_rgb', 'sneezeCough/S011C002P028R001A041_rgb', 'sneezeCough/S012C001P027R001A041_rgb', 'sneezeCough/S007C003P015R002A041_rgb', 'sneezeCough/S017C002P020R002A041_rgb', 'sneezeCough/S013C002P008R002A041_rgb', 'sneezeCough/S008C001P019R001A041_rgb', 'sneezeCough/S001C001P007R001A041_rgb', 'sneezeCough/S006C002P008R002A041_rgb', 'sneezeCough/S013C001P028R002A041_rgb', 'sneezeCough/S017C002P015R001A041_rgb', 'sneezeCough/S008C003P007R002A041_rgb', 'sneezeCough/S010C003P015R002A041_rgb', 'sneezeCough/S006C001P007R001A041_rgb', 'sneezeCough/S005C002P015R002A041_rgb', 'sneezeCough/S011C003P016R002A041_rgb', 'sneezeCough/S014C001P039R001A041_rgb', 
'sneezeCough/S007C001P027R001A041_rgb', 'sneezeCough/S010C003P016R002A041_rgb', 'sneezeCough/S012C003P037R002A041_rgb', 'sneezeCough/S010C003P007R001A041_rgb', 'sneezeCough/S006C002P023R001A041_rgb', 'sneezeCough/S005C002P010R002A041_rgb', 'sneezeCough/S014C002P037R001A041_rgb', 'sneezeCough/S009C001P008R002A041_rgb', 'sneezeCough/S007C001P015R001A041_rgb', 'sneezeCough/S015C001P008R001A041_rgb', 'sneezeCough/S009C003P015R001A041_rgb', 'sneezeCough/S010C001P013R002A041_rgb', 'sneezeCough/S017C003P003R002A041_rgb', 'sneezeCough/S007C002P018R002A041_rgb', 'sneezeCough/S005C003P016R001A041_rgb', 'sneezeCough/S003C001P017R001A041_rgb', 'sneezeCough/S006C001P022R002A041_rgb', 'sneezeCough/S008C003P035R002A041_rgb', 'sneezeCough/S013C003P007R002A041_rgb', 'sneezeCough/S011C003P038R002A041_rgb', 'sneezeCough/S006C003P017R001A041_rgb', 'sneezeCough/S007C001P019R001A041_rgb', 'sneezeCough/S010C003P019R001A041_rgb', 'sneezeCough/S009C003P017R002A041_rgb', 'sneezeCough/S007C002P025R002A041_rgb', 'sneezeCough/S002C003P012R001A041_rgb', 'sneezeCough/S001C002P005R002A041_rgb', 'sneezeCough/S017C002P008R001A041_rgb', 'sneezeCough/S014C002P007R002A041_rgb', 'sneezeCough/S015C002P037R002A041_rgb', 'sneezeCough/S004C002P020R001A041_rgb', 'sneezeCough/S004C001P003R002A041_rgb', 'sneezeCough/S011C001P025R001A041_rgb', 'sneezeCough/S003C001P016R002A041_rgb', 'sneezeCough/S017C002P009R001A041_rgb', 'sneezeCough/S013C002P027R001A041_rgb', 'sneezeCough/S013C003P016R002A041_rgb', 'sneezeCough/S002C001P011R001A041_rgb', 'sneezeCough/S016C001P007R001A041_rgb', 'sneezeCough/S008C002P034R001A041_rgb', 'sneezeCough/S017C001P007R002A041_rgb', 'sneezeCough/S011C001P002R002A041_rgb', 'sneezeCough/S015C001P019R001A041_rgb', 'sneezeCough/S006C002P023R002A041_rgb', 'sneezeCough/S009C003P019R002A041_rgb', 'sneezeCough/S009C001P019R002A041_rgb', 'sneezeCough/S004C003P020R001A041_rgb', 'sneezeCough/S012C003P008R002A041_rgb', 'sneezeCough/S001C002P008R001A041_rgb', 'sneezeCough/S006C001P001R002A041_rgb', 'sneezeCough/S009C003P017R001A041_rgb', 'sneezeCough/S012C001P015R001A041_rgb', 'sneezeCough/S004C001P020R002A041_rgb', 'sneezeCough/S013C002P016R002A041_rgb', 'sneezeCough/S008C001P032R002A041_rgb', 'sneezeCough/S008C002P015R001A041_rgb', 'sneezeCough/S008C001P036R002A041_rgb', 'sneezeCough/S016C001P021R002A041_rgb', 'sneezeCough/S014C002P008R002A041_rgb', 'sneezeCough/S001C002P006R001A041_rgb', 'sneezeCough/S017C001P016R002A041_rgb', 'sneezeCough/S001C002P001R001A041_rgb', 'sneezeCough/S008C001P025R002A041_rgb', 'sneezeCough/S008C001P031R001A041_rgb', 'sneezeCough/S006C001P022R001A041_rgb', 'sneezeCough/S011C003P001R002A041_rgb', 'sneezeCough/S013C001P027R001A041_rgb', 'sneezeCough/S016C002P007R002A041_rgb', 'sneezeCough/S011C003P017R002A041_rgb', 'sneezeCough/S011C002P027R001A041_rgb', 'sneezeCough/S007C001P027R002A041_rgb', 'sneezeCough/S010C001P019R002A041_rgb', 'sneezeCough/S017C003P007R001A041_rgb', 'sneezeCough/S007C001P001R002A041_rgb', 'sneezeCough/S002C002P008R002A041_rgb', 'sneezeCough/S016C003P007R002A041_rgb', 'sneezeCough/S010C002P018R002A041_rgb', 'sneezeCough/S009C002P007R002A041_rgb', 'sneezeCough/S003C003P002R001A041_rgb', 'sneezeCough/S011C001P007R002A041_rgb', 'sneezeCough/S017C001P016R001A041_rgb', 'sneezeCough/S006C002P008R001A041_rgb', 'sneezeCough/S013C003P007R001A041_rgb', 'sneezeCough/S007C003P016R002A041_rgb', 'sneezeCough/S010C001P025R001A041_rgb', 'sneezeCough/S007C002P008R002A041_rgb', 'sneezeCough/S017C003P016R002A041_rgb', 'sneezeCough/S005C001P013R001A041_rgb', 
'sneezeCough/S017C003P020R001A041_rgb', 'sneezeCough/S005C003P015R002A041_rgb', 'sneezeCough/S013C002P027R002A041_rgb', 'sneezeCough/S003C001P001R001A041_rgb', 'sneezeCough/S012C002P028R001A041_rgb', 'sneezeCough/S013C001P008R002A041_rgb', 'sneezeCough/S007C002P008R001A041_rgb', 'sneezeCough/S008C001P032R001A041_rgb', 'sneezeCough/S013C001P037R002A041_rgb', 'sneezeCough/S013C001P037R001A041_rgb', 'sneezeCough/S007C001P018R001A041_rgb', 'sneezeCough/S002C003P003R002A041_rgb', 'sneezeCough/S002C001P010R001A041_rgb', 'sneezeCough/S017C002P009R002A041_rgb', 'sneezeCough/S008C003P034R002A041_rgb', 'sneezeCough/S008C003P031R002A041_rgb', 'sneezeCough/S007C002P016R001A041_rgb', 'sneezeCough/S007C003P026R001A041_rgb', 'sneezeCough/S003C003P017R002A041_rgb', 'sneezeCough/S015C001P007R001A041_rgb', 'sneezeCough/S006C003P019R001A041_rgb', 'sneezeCough/S009C002P017R001A041_rgb', 'sneezeCough/S006C003P022R001A041_rgb', 'sneezeCough/S011C001P001R001A041_rgb', 'sneezeCough/S007C003P025R002A041_rgb', 'sneezeCough/S007C001P026R001A041_rgb', 'sneezeCough/S011C003P017R001A041_rgb', 'sneezeCough/S006C003P024R001A041_rgb', 'sneezeCough/S010C003P013R002A041_rgb', 'sneezeCough/S009C003P016R002A041_rgb', 'sneezeCough/S007C001P016R002A041_rgb', 'sneezeCough/S015C002P007R001A041_rgb', 'sneezeCough/S014C003P037R002A041_rgb', 'sneezeCough/S009C002P016R002A041_rgb', 'sneezeCough/S005C001P017R002A041_rgb', 'sneezeCough/S011C001P025R002A041_rgb', 'sneezeCough/S011C001P028R002A041_rgb', 'sneezeCough/S008C001P036R001A041_rgb', 'sneezeCough/S001C001P001R002A041_rgb', 'sneezeCough/S015C001P015R002A041_rgb', 'sneezeCough/S014C001P027R001A041_rgb', 'sneezeCough/S007C003P028R001A041_rgb', 'sneezeCough/S005C001P010R002A041_rgb', 'sneezeCough/S012C002P007R002A041_rgb', 'sneezeCough/S016C003P019R002A041_rgb', 'sneezeCough/S010C003P025R002A041_rgb', 'sneezeCough/S014C001P019R002A041_rgb', 'sneezeCough/S015C003P025R001A041_rgb', 'sneezeCough/S003C001P019R002A041_rgb', 'sneezeCough/S008C003P035R001A041_rgb', 'sneezeCough/S003C003P008R002A041_rgb', 'sneezeCough/S017C001P008R001A041_rgb', 'sneezeCough/S011C001P007R001A041_rgb', 'sneezeCough/S002C001P011R002A041_rgb', 'sneezeCough/S004C001P007R002A041_rgb', 'sneezeCough/S010C002P025R002A041_rgb', 'sneezeCough/S014C003P008R002A041_rgb', 'sneezeCough/S015C003P007R001A041_rgb', 'sneezeCough/S001C002P002R002A041_rgb', 'sneezeCough/S008C001P035R002A041_rgb', 'sneezeCough/S006C001P024R001A041_rgb', 'sneezeCough/S003C002P018R001A041_rgb', 'sneezeCough/S001C003P004R001A041_rgb', 'sneezeCough/S017C003P020R002A041_rgb', 'sneezeCough/S013C001P017R002A041_rgb', 'sneezeCough/S012C003P007R001A041_rgb', 'sneezeCough/S010C001P008R002A041_rgb', 'sneezeCough/S014C003P008R001A041_rgb', 'sneezeCough/S013C003P037R002A041_rgb', 'sneezeCough/S014C001P015R001A041_rgb', 'sneezeCough/S015C002P017R002A041_rgb', 'sneezeCough/S011C001P001R002A041_rgb', 'sneezeCough/S007C002P026R001A041_rgb', 'sneezeCough/S006C002P024R002A041_rgb', 'sneezeCough/S016C001P025R002A041_rgb', 'sneezeCough/S011C002P007R001A041_rgb', 'sneezeCough/S008C003P029R002A041_rgb', 'sneezeCough/S008C003P031R001A041_rgb', 'sneezeCough/S015C002P017R001A041_rgb', 'sneezeCough/S005C002P018R001A041_rgb', 'sneezeCough/S005C002P021R002A041_rgb', 'sneezeCough/S014C002P037R002A041_rgb', 'sneezeCough/S015C001P016R001A041_rgb', 'sneezeCough/S014C003P025R001A041_rgb', 'sneezeCough/S013C001P017R001A041_rgb', 'sneezeCough/S003C001P015R001A041_rgb', 'sneezeCough/S013C001P015R001A041_rgb', 'sneezeCough/S006C002P019R001A041_rgb', 
'sneezeCough/S011C001P019R002A041_rgb', 'sneezeCough/S005C003P004R002A041_rgb', 'sneezeCough/S004C001P008R001A041_rgb', 'sneezeCough/S011C001P027R002A041_rgb', 'sneezeCough/S015C003P015R002A041_rgb', 'sneezeCough/S006C003P008R002A041_rgb', 'sneezeCough/S015C003P017R002A041_rgb', 'sneezeCough/S006C001P015R002A041_rgb', 'sneezeCough/S003C002P015R001A041_rgb', 'sneezeCough/S002C003P003R001A041_rgb', 'sneezeCough/S014C001P017R002A041_rgb', 'sneezeCough/S012C001P008R001A041_rgb', 'sneezeCough/S009C001P016R002A041_rgb', 'sneezeCough/S008C002P033R002A041_rgb', 'sneezeCough/S002C003P013R001A041_rgb', 'sneezeCough/S010C003P025R001A041_rgb', 'sneezeCough/S007C002P017R001A041_rgb', 'sneezeCough/S006C001P007R002A041_rgb', 'sneezeCough/S012C003P017R002A041_rgb', 'sneezeCough/S004C003P007R001A041_rgb', 'sneezeCough/S004C001P008R002A041_rgb', 'sneezeCough/S003C003P007R002A041_rgb', 'sneezeCough/S006C002P007R001A041_rgb', 'sneezeCough/S015C003P016R001A041_rgb', 'sneezeCough/S014C001P037R002A041_rgb', 'sneezeCough/S012C002P025R002A041_rgb', 'sneezeCough/S006C002P017R002A041_rgb', 'sneezeCough/S014C002P008R001A041_rgb', 'sneezeCough/S015C001P017R001A041_rgb', 'sneezeCough/S002C002P013R002A041_rgb', 'sneezeCough/S008C002P025R001A041_rgb', 'sneezeCough/S017C001P020R002A041_rgb', 'sneezeCough/S002C001P009R001A041_rgb', 'sneezeCough/S010C001P015R001A041_rgb', 'sneezeCough/S013C002P016R001A041_rgb', 'sneezeCough/S003C002P017R001A041_rgb', 'sneezeCough/S002C002P009R001A041_rgb', 'sneezeCough/S014C001P008R002A041_rgb', 'sneezeCough/S011C002P001R002A041_rgb', 'sneezeCough/S002C001P003R002A041_rgb', 'sneezeCough/S015C003P019R001A041_rgb', 'sneezeCough/S003C001P018R001A041_rgb', 'sneezeCough/S013C003P017R002A041_rgb', 'sneezeCough/S016C003P025R002A041_rgb', 'sneezeCough/S001C002P005R001A041_rgb', 'sneezeCough/S001C001P003R002A041_rgb', 'sneezeCough/S006C001P016R002A041_rgb', 'sneezeCough/S012C001P025R001A041_rgb', 'sneezeCough/S009C003P025R001A041_rgb', 'sneezeCough/S011C003P002R002A041_rgb', 'sneezeCough/S015C002P008R002A041_rgb', 'sneezeCough/S013C003P018R002A041_rgb', 'sneezeCough/S013C002P018R002A041_rgb', 'sneezeCough/S004C003P008R002A041_rgb', 'sneezeCough/S016C002P039R001A041_rgb', 'sneezeCough/S011C002P027R002A041_rgb', 'sneezeCough/S015C002P025R002A041_rgb', 'sneezeCough/S008C002P034R002A041_rgb', 'sneezeCough/S004C001P020R001A041_rgb', 'sneezeCough/S011C002P016R002A041_rgb', 'sneezeCough/S012C002P008R001A041_rgb', 'sneezeCough/S008C003P001R001A041_rgb', 'sneezeCough/S003C003P016R001A041_rgb', 'sneezeCough/S010C003P016R001A041_rgb', 'sneezeCough/S015C003P017R001A041_rgb', 'sneezeCough/S014C003P039R002A041_rgb', 'sneezeCough/S007C001P007R001A041_rgb', 'sneezeCough/S011C001P018R002A041_rgb', 'sneezeCough/S007C002P007R002A041_rgb', 'sneezeCough/S013C003P037R001A041_rgb', 'sneezeCough/S011C003P008R002A041_rgb', 'sneezeCough/S015C003P016R002A041_rgb', 'sneezeCough/S003C003P002R002A041_rgb', 'sneezeCough/S017C001P003R002A041_rgb', 'sneezeCough/S010C002P017R001A041_rgb', 'sneezeCough/S013C003P019R002A041_rgb', 'sneezeCough/S003C001P007R001A041_rgb', 'sneezeCough/S001C001P005R002A041_rgb', 'sneezeCough/S002C002P003R002A041_rgb', 'sneezeCough/S002C002P011R002A041_rgb', 'sneezeCough/S012C003P007R002A041_rgb', 'sneezeCough/S008C001P035R001A041_rgb', 'sneezeCough/S009C001P019R001A041_rgb', 'sneezeCough/S002C002P003R001A041_rgb', 'sneezeCough/S007C002P001R001A041_rgb', 'sneezeCough/S011C002P007R002A041_rgb', 'sneezeCough/S011C002P008R002A041_rgb', 'sneezeCough/S013C001P027R002A041_rgb', 
'sneezeCough/S012C003P016R001A041_rgb', 'sneezeCough/S001C001P006R001A041_rgb', 'sneezeCough/S014C002P027R001A041_rgb', 'sneezeCough/S013C003P015R001A041_rgb', 'sneezeCough/S012C003P018R002A041_rgb', 'sneezeCough/S005C003P018R001A041_rgb', 'sneezeCough/S013C002P028R002A041_rgb', 'sneezeCough/S003C003P019R002A041_rgb', 'sneezeCough/S012C001P028R002A041_rgb', 'sneezeCough/S007C001P026R002A041_rgb', 'sneezeCough/S005C001P021R002A041_rgb', 'sneezeCough/S001C002P008R002A041_rgb', 'sneezeCough/S003C003P008R001A041_rgb', 'sneezeCough/S012C003P027R002A041_rgb', 'sneezeCough/S012C001P037R002A041_rgb', 'sneezeCough/S016C001P021R001A041_rgb', 'sneezeCough/S010C001P016R001A041_rgb', 'sneezeCough/S012C003P015R002A041_rgb', 'sneezeCough/S008C003P032R002A041_rgb', 'sneezeCough/S012C001P037R001A041_rgb', 'sneezeCough/S013C003P008R001A041_rgb', 'sneezeCough/S016C003P040R001A041_rgb', 'sneezeCough/S011C002P017R001A041_rgb', 'sneezeCough/S011C002P028R002A041_rgb', 'sneezeCough/S013C002P019R001A041_rgb', 'sneezeCough/S008C003P019R001A041_rgb', 'sneezeCough/S006C001P019R002A041_rgb', 'sneezeCough/S008C003P001R002A041_rgb', 'sneezeCough/S003C003P017R001A041_rgb', 'sneezeCough/S002C003P010R001A041_rgb', 'sneezeCough/S009C002P015R002A041_rgb', 'sneezeCough/S015C001P008R002A041_rgb', 'sneezeCough/S005C002P021R001A041_rgb', 'sneezeCough/S009C001P016R001A041_rgb', 'sneezeCough/S013C001P019R002A041_rgb', 'sneezeCough/S010C001P021R002A041_rgb', 'sneezeCough/S005C003P017R001A041_rgb', 'sneezeCough/S004C001P003R001A041_rgb', 'sneezeCough/S013C002P019R002A041_rgb', 'sneezeCough/S007C002P018R001A041_rgb', 'sneezeCough/S007C003P007R001A041_rgb', 'sneezeCough/S003C002P016R002A041_rgb', 'sneezeCough/S017C001P015R001A041_rgb', 'sneezeCough/S012C002P008R002A041_rgb', 'sneezeCough/S011C002P002R002A041_rgb', 'sneezeCough/S013C001P007R001A041_rgb', 'sneezeCough/S001C003P006R002A041_rgb', 'sneezeCough/S008C001P019R002A041_rgb', 'sneezeCough/S015C002P015R001A041_rgb', 'sneezeCough/S002C002P014R002A041_rgb', 'sneezeCough/S004C003P003R002A041_rgb', 'sneezeCough/S005C003P021R002A041_rgb', 'sneezeCough/S014C003P019R002A041_rgb', 'sneezeCough/S004C003P007R002A041_rgb', 'sneezeCough/S012C001P027R002A041_rgb', 'sneezeCough/S006C003P023R001A041_rgb', 'sneezeCough/S006C001P008R002A041_rgb', 'sneezeCough/S017C001P009R002A041_rgb', 'sneezeCough/S006C003P015R002A041_rgb', 'sneezeCough/S005C003P010R002A041_rgb', 'sneezeCough/S016C002P021R002A041_rgb', 'sneezeCough/S007C001P025R001A041_rgb', 'sneezeCough/S010C003P018R001A041_rgb', 'sneezeCough/S004C001P007R001A041_rgb', 'sneezeCough/S016C003P039R001A041_rgb', 'sneezeCough/S007C001P017R001A041_rgb', 'sneezeCough/S010C003P021R002A041_rgb', 'sneezeCough/S012C002P015R002A041_rgb', 'sneezeCough/S008C003P019R002A041_rgb', 'sneezeCough/S014C002P039R001A041_rgb', 'sneezeCough/S004C002P020R002A041_rgb', 'sneezeCough/S015C003P008R001A041_rgb', 'sneezeCough/S010C002P016R002A041_rgb', 'sneezeCough/S001C001P005R001A041_rgb', 'sneezeCough/S003C002P007R002A041_rgb', 'sneezeCough/S002C003P008R001A041_rgb', 'sneezeCough/S014C002P025R002A041_rgb', 'sneezeCough/S011C003P027R001A041_rgb', 'sneezeCough/S014C003P025R002A041_rgb', 'sneezeCough/S007C003P017R001A041_rgb', 'sneezeCough/S006C001P001R001A041_rgb', 'sneezeCough/S011C001P015R002A041_rgb', 'sneezeCough/S001C002P007R002A041_rgb', 'sneezeCough/S002C003P007R001A041_rgb', 'sneezeCough/S007C003P027R002A041_rgb', 'sneezeCough/S002C001P009R002A041_rgb', 'sneezeCough/S014C001P008R001A041_rgb', 'sneezeCough/S014C003P039R001A041_rgb', 
'sneezeCough/S002C002P014R001A041_rgb', 'sneezeCough/S012C001P019R001A041_rgb', 'sneezeCough/S013C003P027R002A041_rgb', 'sneezeCough/S011C002P018R002A041_rgb', 'sneezeCough/S002C003P014R001A041_rgb', 'sneezeCough/S001C001P001R001A041_rgb', 'sneezeCough/S011C002P019R001A041_rgb', 'sneezeCough/S011C003P015R001A041_rgb', 'sneezeCough/S008C003P036R002A041_rgb', 'sneezeCough/S008C001P015R002A041_rgb', 'sneezeCough/S012C002P037R001A041_rgb', 'sneezeCough/S009C001P007R002A041_rgb', 'sneezeCough/S017C002P016R002A041_rgb', 'sneezeCough/S012C003P018R001A041_rgb', 'sneezeCough/S005C003P015R001A041_rgb', 'sneezeCough/S013C003P018R001A041_rgb', 'sneezeCough/S012C002P015R001A041_rgb', 'sneezeCough/S014C002P019R001A041_rgb', 'sneezeCough/S008C001P029R001A041_rgb', 'sneezeCough/S005C002P004R001A041_rgb', 'sneezeCough/S011C002P038R001A041_rgb', 'sneezeCough/S005C003P016R002A041_rgb', 'sneezeCough/S012C001P018R001A041_rgb', 'sneezeCough/S001C001P004R002A041_rgb', 'sneezeCough/S014C001P025R001A041_rgb', 'sneezeCough/S001C002P006R002A041_rgb', 'sneezeCough/S008C002P030R002A041_rgb', 'sneezeCough/S015C003P008R002A041_rgb', 'sneezeCough/S002C003P014R002A041_rgb', 'sneezeCough/S017C003P003R001A041_rgb', 'sneezeCough/S015C001P017R002A041_rgb', 'sneezeCough/S015C003P037R002A041_rgb', 'sneezeCough/S002C001P014R001A041_rgb', 'sneezeCough/S005C002P015R001A041_rgb', 'sneezeCough/S002C001P013R002A041_rgb', 'sneezeCough/S013C001P008R001A041_rgb', 'sneezeCough/S007C003P001R002A041_rgb', 'sneezeCough/S005C002P013R001A041_rgb', 'sneezeCough/S014C002P019R002A041_rgb', 'sneezeCough/S012C002P016R002A041_rgb', 'sneezeCough/S016C001P039R002A041_rgb', 'sneezeCough/S008C003P008R002A041_rgb', 'sneezeCough/S015C003P007R002A041_rgb', 'sneezeCough/S011C001P016R002A041_rgb', 'sneezeCough/S011C001P017R002A041_rgb', 'sneezeCough/S014C002P025R001A041_rgb', 'sneezeCough/S007C002P017R002A041_rgb', 'sneezeCough/S008C002P029R002A041_rgb', 'sneezeCough/S017C002P020R001A041_rgb', 'sneezeCough/S002C002P013R001A041_rgb', 'sneezeCough/S007C003P019R002A041_rgb', 'sneezeCough/S002C003P013R002A041_rgb', 'sneezeCough/S011C002P015R002A041_rgb', 'sneezeCough/S012C003P028R002A041_rgb', 'sneezeCough/S008C003P030R001A041_rgb', 'sneezeCough/S013C001P019R001A041_rgb', 'sneezeCough/S006C003P008R001A041_rgb', 'sneezeCough/S012C001P018R002A041_rgb', 'sneezeCough/S014C002P015R001A041_rgb', 'sneezeCough/S013C003P017R001A041_rgb', 'sneezeCough/S001C003P003R002A041_rgb', 'sneezeCough/S009C003P008R001A041_rgb', 'sneezeCough/S006C002P016R002A041_rgb', 'sneezeCough/S014C001P037R001A041_rgb', 'sneezeCough/S014C002P015R002A041_rgb', 'sneezeCough/S016C003P021R002A041_rgb', 'sneezeCough/S002C002P010R001A041_rgb', 'sneezeCough/S003C002P008R002A041_rgb', 'sneezeCough/S003C003P019R001A041_rgb', 'sneezeCough/S007C003P008R002A041_rgb', 'sneezeCough/S008C001P025R001A041_rgb', 'sneezeCough/S011C002P001R001A041_rgb', 'sneezeCough/S012C002P018R001A041_rgb', 'sneezeCough/S003C002P002R001A041_rgb', 'sneezeCough/S013C003P008R002A041_rgb', 'sneezeCough/S015C001P016R002A041_rgb', 'sneezeCough/S009C003P019R001A041_rgb', 'sneezeCough/S002C001P008R002A041_rgb', 'sneezeCough/S009C003P015R002A041_rgb', 'sneezeCough/S003C001P008R001A041_rgb', 'sneezeCough/S001C001P004R001A041_rgb', 'sneezeCough/S011C003P001R001A041_rgb', 'sneezeCough/S007C003P008R001A041_rgb', 'sneezeCough/S010C001P007R002A041_rgb', 'sneezeCough/S010C003P021R001A041_rgb', 'sneezeCough/S001C003P002R002A041_rgb', 'sneezeCough/S014C003P017R001A041_rgb', 'sneezeCough/S016C001P007R002A041_rgb', 
'sneezeCough/S015C003P019R002A041_rgb', 'sneezeCough/S014C003P007R001A041_rgb', 'sneezeCough/S009C003P008R002A041_rgb', 'sneezeCough/S003C001P016R001A041_rgb', 'sneezeCough/S004C002P008R001A041_rgb', 'sneezeCough/S006C002P015R001A041_rgb', 'sneezeCough/S013C002P015R001A041_rgb', 'sneezeCough/S005C003P018R002A041_rgb', 'sneezeCough/S016C001P025R001A041_rgb', 'sneezeCough/S013C001P028R001A041_rgb', 'sneezeCough/S011C003P007R002A041_rgb', 'sneezeCough/S002C001P007R001A041_rgb', 'sneezeCough/S003C003P001R002A041_rgb', 'sneezeCough/S013C001P025R001A041_rgb', 'sneezeCough/S008C001P007R002A041_rgb', 'sneezeCough/S016C002P008R002A041_rgb', 'sneezeCough/S002C001P012R001A041_rgb', 'sneezeCough/S011C002P018R001A041_rgb', 'sneezeCough/S010C001P025R002A041_rgb', 'sneezeCough/S007C002P001R002A041_rgb', 'sneezeCough/S009C001P025R001A041_rgb', 'sneezeCough/S014C001P017R001A041_rgb', 'sneezeCough/S009C003P016R001A041_rgb', 'sneezeCough/S010C002P017R002A041_rgb', 'sneezeCough/S013C001P025R002A041_rgb', 'sneezeCough/S008C001P030R001A041_rgb', 'sneezeCough/S001C002P007R001A041_rgb', 'sneezeCough/S011C003P018R001A041_rgb', 'sneezeCough/S008C003P007R001A041_rgb', 'sneezeCough/S014C003P015R002A041_rgb', 'sneezeCough/S007C002P026R002A041_rgb', 'sneezeCough/S011C003P038R001A041_rgb', 'sneezeCough/S009C002P008R001A041_rgb', 'sneezeCough/S014C002P027R002A041_rgb', 'sneezeCough/S001C002P001R002A041_rgb', 'sneezeCough/S015C003P037R001A041_rgb', 'sneezeCough/S014C001P027R002A041_rgb', 'sneezeCough/S005C001P016R002A041_rgb', 'sneezeCough/S005C003P013R001A041_rgb', 'sneezeCough/S003C003P015R002A041_rgb', 'sneezeCough/S009C002P016R001A041_rgb', 'sneezeCough/S007C002P027R002A041_rgb', 'sneezeCough/S017C002P007R002A041_rgb', 'sneezeCough/S011C001P019R001A041_rgb', 'sneezeCough/S016C003P039R002A041_rgb', 'sneezeCough/S002C001P014R002A041_rgb', 'sneezeCough/S006C003P019R002A041_rgb', 'sneezeCough/S013C003P028R002A041_rgb', 'sneezeCough/S001C003P005R001A041_rgb', 'sneezeCough/S002C001P008R001A041_rgb', 'sneezeCough/S006C002P007R002A041_rgb', 'sneezeCough/S004C002P008R002A041_rgb', 'sneezeCough/S004C003P003R001A041_rgb', 'sneezeCough/S009C003P007R001A041_rgb', 'sneezeCough/S005C001P004R001A041_rgb', 'sneezeCough/S017C003P007R002A041_rgb', 'sneezeCough/S013C002P028R001A041_rgb', 'sneezeCough/S009C002P025R001A041_rgb', 'sneezeCough/S017C003P017R002A041_rgb', 'sneezeCough/S007C002P028R002A041_rgb', 'sneezeCough/S008C002P001R002A041_rgb', 'sneezeCough/S011C003P019R001A041_rgb', 'sneezeCough/S002C003P011R001A041_rgb', 'sneezeCough/S008C001P015R001A041_rgb', 'sneezeCough/S012C002P019R002A041_rgb', 'sneezeCough/S011C001P027R001A041_rgb', 'sneezeCough/S002C001P012R002A041_rgb', 'sneezeCough/S006C002P016R001A041_rgb', 'sneezeCough/S005C003P010R001A041_rgb', 'sneezeCough/S014C003P007R002A041_rgb', 'sneezeCough/S008C001P033R001A041_rgb', 'sneezeCough/S007C001P008R001A041_rgb', 'sneezeCough/S012C001P008R002A041_rgb', 'sneezeCough/S002C002P010R002A041_rgb', 'sneezeCough/S004C002P003R001A041_rgb', 'sneezeCough/S007C003P025R001A041_rgb', 'sneezeCough/S002C003P009R002A041_rgb', 'sneezeCough/S010C001P017R001A041_rgb', 'sneezeCough/S010C002P007R001A041_rgb', 'sneezeCough/S001C001P008R002A041_rgb', 'sneezeCough/S010C001P018R001A041_rgb', 'sneezeCough/S005C003P013R002A041_rgb', 'sneezeCough/S010C003P015R001A041_rgb', 'sneezeCough/S013C003P027R001A041_rgb', 'sneezeCough/S010C003P017R002A041_rgb', 'sneezeCough/S010C002P015R002A041_rgb', 'sneezeCough/S003C001P001R002A041_rgb', 'sneezeCough/S017C002P003R001A041_rgb', 
'sneezeCough/S010C003P008R002A041_rgb', 'sneezeCough/S006C003P016R002A041_rgb', 'sneezeCough/S016C001P040R001A041_rgb', 'sneezeCough/S017C002P015R002A041_rgb', 'sneezeCough/S003C002P015R002A041_rgb', 'sneezeCough/S012C002P027R001A041_rgb', 'sneezeCough/S010C002P019R002A041_rgb', 'sneezeCough/S011C003P007R001A041_rgb', 'sneezeCough/S001C001P003R001A041_rgb', 'sneezeCough/S016C001P008R001A041_rgb', 'sneezeCough/S013C001P018R001A041_rgb', 'sneezeCough/S017C001P009R001A041_rgb', 'sneezeCough/S012C002P016R001A041_rgb', 'sneezeCough/S008C002P032R002A041_rgb', 'sneezeCough/S015C001P019R002A041_rgb', 'sneezeCough/S011C001P002R001A041_rgb', 'sneezeCough/S006C003P001R001A041_rgb', 'sneezeCough/S016C002P008R001A041_rgb', 'sneezeCough/S009C003P007R002A041_rgb', 'sneezeCough/S006C002P022R002A041_rgb', 'sneezeCough/S017C002P007R001A041_rgb', 'sneezeCough/S009C001P025R002A041_rgb', 'sneezeCough/S002C002P012R001A041_rgb', 'sneezeCough/S009C002P019R001A041_rgb', 'sneezeCough/S012C002P017R001A041_rgb', 'sneezeCough/S006C003P016R001A041_rgb', 'sneezeCough/S017C001P017R002A041_rgb', 'sneezeCough/S006C003P024R002A041_rgb', 'sneezeCough/S001C002P003R001A041_rgb', 'sneezeCough/S005C001P004R002A041_rgb', 'sneezeCough/S004C002P007R002A041_rgb', 'sneezeCough/S012C001P025R002A041_rgb', 'sneezeCough/S017C002P017R001A041_rgb', 'sneezeCough/S011C002P002R001A041_rgb', 'sneezeCough/S002C001P013R001A041_rgb', 'sneezeCough/S016C002P040R001A041_rgb', 'sneezeCough/S005C003P021R001A041_rgb', 'sneezeCough/S011C001P008R001A041_rgb', 'sneezeCough/S001C002P004R002A041_rgb', 'sneezeCough/S005C002P016R002A041_rgb', 'sneezeCough/S003C001P015R002A041_rgb', 'sneezeCough/S014C001P007R002A041_rgb', 'sneezeCough/S012C003P015R001A041_rgb', 'sneezeCough/S008C001P031R002A041_rgb', 'sneezeCough/S016C003P007R001A041_rgb', 'sneezeCough/S010C002P018R001A041_rgb', 'sneezeCough/S003C002P016R001A041_rgb', 'sneezeCough/S008C003P015R002A041_rgb', 'sneezeCough/S016C002P039R002A041_rgb', 'sneezeCough/S007C003P001R001A041_rgb', 'sneezeCough/S001C002P002R001A041_rgb', 'sneezeCough/S011C002P008R001A041_rgb', 'sneezeCough/S001C001P002R001A041_rgb', 'sneezeCough/S009C002P015R001A041_rgb', 'staggering/S003C002P007R001A042_rgb', 'staggering/S009C002P008R001A042_rgb', 'staggering/S006C003P022R001A042_rgb', 'staggering/S013C002P015R002A042_rgb', 'staggering/S011C001P008R001A042_rgb', 'staggering/S011C001P025R001A042_rgb', 'staggering/S010C002P013R001A042_rgb', 'staggering/S002C001P003R002A042_rgb', 'staggering/S011C002P008R001A042_rgb', 'staggering/S002C003P013R001A042_rgb', 'staggering/S008C002P031R001A042_rgb', 'staggering/S003C001P019R002A042_rgb', 'staggering/S011C002P016R002A042_rgb', 'staggering/S006C001P023R002A042_rgb', 'staggering/S009C001P016R002A042_rgb', 'staggering/S005C002P004R002A042_rgb', 'staggering/S015C003P015R001A042_rgb', 'staggering/S003C002P016R002A042_rgb', 'staggering/S014C003P039R001A042_rgb', 'staggering/S009C003P016R001A042_rgb', 'staggering/S017C002P003R001A042_rgb', 'staggering/S001C003P007R002A042_rgb', 'staggering/S003C003P007R002A042_rgb', 'staggering/S007C001P019R002A042_rgb', 'staggering/S001C002P005R002A042_rgb', 'staggering/S007C003P018R001A042_rgb', 'staggering/S012C002P025R002A042_rgb', 'staggering/S005C002P013R002A042_rgb', 'staggering/S005C003P021R002A042_rgb', 'staggering/S002C002P003R001A042_rgb', 'staggering/S009C001P007R001A042_rgb', 'staggering/S008C003P001R002A042_rgb', 'staggering/S015C001P025R002A042_rgb', 'staggering/S004C002P008R002A042_rgb', 'staggering/S012C003P016R001A042_rgb', 
'staggering/S017C002P009R001A042_rgb', 'staggering/S013C001P016R002A042_rgb', 'staggering/S011C002P025R002A042_rgb', 'staggering/S013C003P016R002A042_rgb', 'staggering/S017C002P009R002A042_rgb', 'staggering/S013C001P015R001A042_rgb', 'staggering/S013C001P019R002A042_rgb', 'staggering/S008C001P032R002A042_rgb', 'staggering/S016C002P040R001A042_rgb', 'staggering/S011C003P038R001A042_rgb', 'staggering/S005C002P017R001A042_rgb', 'staggering/S007C003P017R002A042_rgb', 'staggering/S006C001P017R001A042_rgb', 'staggering/S009C002P019R001A042_rgb', 'staggering/S010C001P019R002A042_rgb', 'staggering/S016C002P025R002A042_rgb', 'staggering/S015C002P008R002A042_rgb', 'staggering/S008C001P031R001A042_rgb', 'staggering/S007C002P025R002A042_rgb', 'staggering/S008C001P029R002A042_rgb', 'staggering/S010C003P016R001A042_rgb', 'staggering/S011C003P018R001A042_rgb', 'staggering/S003C001P018R002A042_rgb', 'staggering/S014C001P037R001A042_rgb', 'staggering/S005C003P018R002A042_rgb', 'staggering/S005C001P013R001A042_rgb', 'staggering/S005C003P015R001A042_rgb', 'staggering/S015C002P007R001A042_rgb', 'staggering/S008C001P033R002A042_rgb', 'staggering/S017C001P020R002A042_rgb', 'staggering/S010C003P017R001A042_rgb', 'staggering/S008C002P031R002A042_rgb', 'staggering/S011C002P027R002A042_rgb', 'staggering/S011C001P015R001A042_rgb', 'staggering/S001C002P006R001A042_rgb', 'staggering/S012C003P018R001A042_rgb', 'staggering/S001C003P007R001A042_rgb', 'staggering/S016C003P021R001A042_rgb', 'staggering/S002C001P009R002A042_rgb', 'staggering/S013C001P027R002A042_rgb', 'staggering/S010C001P021R001A042_rgb', 'staggering/S008C002P034R002A042_rgb', 'staggering/S011C003P017R002A042_rgb', 'staggering/S003C003P019R002A042_rgb', 'staggering/S010C002P021R001A042_rgb', 'staggering/S011C003P008R002A042_rgb', 'staggering/S007C003P027R001A042_rgb', 'staggering/S012C003P025R001A042_rgb', 'staggering/S012C001P019R001A042_rgb', 'staggering/S015C002P017R002A042_rgb', 'staggering/S012C002P016R002A042_rgb', 'staggering/S001C001P006R001A042_rgb', 'staggering/S010C001P021R002A042_rgb', 'staggering/S014C002P025R001A042_rgb', 'staggering/S008C001P036R001A042_rgb', 'staggering/S008C002P032R002A042_rgb', 'staggering/S010C001P013R001A042_rgb', 'staggering/S014C002P025R002A042_rgb', 'staggering/S015C001P019R001A042_rgb', 'staggering/S002C001P010R002A042_rgb', 'staggering/S014C003P019R002A042_rgb', 'staggering/S008C002P036R001A042_rgb', 'staggering/S006C002P001R002A042_rgb', 'staggering/S007C003P015R002A042_rgb', 'staggering/S013C001P015R002A042_rgb', 'staggering/S005C002P013R001A042_rgb', 'staggering/S008C001P007R002A042_rgb', 'staggering/S016C003P025R001A042_rgb', 'staggering/S009C003P025R001A042_rgb', 'staggering/S008C002P033R001A042_rgb', 'staggering/S009C001P017R002A042_rgb', 'staggering/S008C002P019R001A042_rgb', 'staggering/S003C001P018R001A042_rgb', 'staggering/S011C002P038R002A042_rgb', 'staggering/S010C003P013R001A042_rgb', 'staggering/S006C001P015R002A042_rgb', 'staggering/S006C002P007R001A042_rgb', 'staggering/S017C002P008R001A042_rgb', 'staggering/S015C001P007R001A042_rgb', 'staggering/S006C003P008R001A042_rgb', 'staggering/S002C001P003R001A042_rgb', 'staggering/S015C002P019R002A042_rgb', 'staggering/S003C003P015R002A042_rgb', 'staggering/S007C003P018R002A042_rgb', 'staggering/S006C001P022R001A042_rgb', 'staggering/S003C002P018R001A042_rgb', 'staggering/S004C003P020R001A042_rgb', 'staggering/S005C003P010R002A042_rgb', 'staggering/S005C001P004R002A042_rgb', 'staggering/S011C001P008R002A042_rgb', 'staggering/S008C003P025R002A042_rgb', 
'staggering/S009C003P025R002A042_rgb', 'staggering/S006C003P008R002A042_rgb', 'staggering/S013C003P037R001A042_rgb', 'staggering/S011C001P017R001A042_rgb', 'staggering/S010C003P015R001A042_rgb', 'staggering/S006C003P017R002A042_rgb', 'staggering/S008C001P008R002A042_rgb', 'staggering/S013C002P007R002A042_rgb', 'staggering/S008C002P008R002A042_rgb', 'staggering/S002C002P003R002A042_rgb', 'staggering/S006C003P023R001A042_rgb', 'staggering/S014C002P007R002A042_rgb', 'staggering/S003C002P016R001A042_rgb', 'staggering/S009C002P016R002A042_rgb', 'staggering/S002C001P014R002A042_rgb', 'staggering/S006C002P001R001A042_rgb', 'staggering/S002C002P013R002A042_rgb', 'staggering/S001C003P005R001A042_rgb', 'staggering/S008C003P029R001A042_rgb', 'staggering/S008C003P032R002A042_rgb', 'staggering/S012C001P008R002A042_rgb', 'staggering/S015C001P037R001A042_rgb', 'staggering/S008C002P029R001A042_rgb', 'staggering/S001C001P002R001A042_rgb', 'staggering/S011C002P001R001A042_rgb', 'staggering/S007C001P001R001A042_rgb', 'staggering/S016C001P025R002A042_rgb', 'staggering/S004C001P008R001A042_rgb', 'staggering/S016C003P040R001A042_rgb', 'staggering/S008C002P008R001A042_rgb', 'staggering/S002C001P008R002A042_rgb', 'staggering/S017C003P009R002A042_rgb', 'staggering/S006C001P001R002A042_rgb', 'staggering/S001C003P002R001A042_rgb', 'staggering/S010C002P007R002A042_rgb', 'staggering/S016C001P008R002A042_rgb', 'staggering/S015C003P008R001A042_rgb', 'staggering/S007C001P016R002A042_rgb', 'staggering/S014C002P027R002A042_rgb', 'staggering/S007C002P016R001A042_rgb', 'staggering/S008C001P033R001A042_rgb', 'staggering/S003C002P015R002A042_rgb', 'staggering/S016C002P019R002A042_rgb', 'staggering/S007C002P016R002A042_rgb', 'staggering/S014C003P015R001A042_rgb', 'staggering/S005C002P018R001A042_rgb', 'staggering/S008C001P025R002A042_rgb', 'staggering/S016C001P008R001A042_rgb', 'staggering/S008C003P036R002A042_rgb', 'staggering/S010C001P018R002A042_rgb', 'staggering/S005C002P010R001A042_rgb', 'staggering/S015C003P016R002A042_rgb', 'staggering/S016C001P021R002A042_rgb', 'staggering/S017C002P007R001A042_rgb', 'staggering/S007C003P007R002A042_rgb', 'staggering/S006C002P016R001A042_rgb', 'staggering/S016C003P007R002A042_rgb', 'staggering/S014C003P008R002A042_rgb', 'staggering/S016C001P039R002A042_rgb', 'staggering/S011C003P017R001A042_rgb', 'staggering/S002C003P014R002A042_rgb', 'staggering/S002C001P009R001A042_rgb', 'staggering/S015C002P015R002A042_rgb', 'staggering/S003C001P015R001A042_rgb', 'staggering/S007C002P026R001A042_rgb', 'staggering/S005C002P015R001A042_rgb', 'staggering/S011C003P016R001A042_rgb', 'staggering/S013C001P008R001A042_rgb', 'staggering/S003C003P018R002A042_rgb', 'staggering/S013C002P019R001A042_rgb', 'staggering/S010C001P013R002A042_rgb', 'staggering/S015C002P025R002A042_rgb', 'staggering/S003C001P001R002A042_rgb', 'staggering/S016C003P039R002A042_rgb', 'staggering/S006C001P019R001A042_rgb', 'staggering/S013C003P008R002A042_rgb', 'staggering/S017C003P007R002A042_rgb', 'staggering/S012C001P028R002A042_rgb', 'staggering/S017C001P016R001A042_rgb', 'staggering/S010C003P015R002A042_rgb', 'staggering/S009C003P008R002A042_rgb', 'staggering/S011C003P001R002A042_rgb', 'staggering/S007C002P025R001A042_rgb', 'staggering/S011C003P028R001A042_rgb', 'staggering/S017C001P007R002A042_rgb', 'staggering/S013C003P017R002A042_rgb', 'staggering/S009C003P015R002A042_rgb', 'staggering/S003C001P019R001A042_rgb', 'staggering/S004C001P003R002A042_rgb', 'staggering/S013C002P019R002A042_rgb', 'staggering/S004C001P020R002A042_rgb', 
'staggering/S013C001P018R002A042_rgb', 'staggering/S006C001P016R002A042_rgb', 'staggering/S017C001P015R002A042_rgb', 'staggering/S015C003P015R002A042_rgb', 'staggering/S015C001P015R001A042_rgb', 'staggering/S006C001P023R001A042_rgb', 'staggering/S003C002P008R001A042_rgb', 'staggering/S015C003P007R001A042_rgb', 'staggering/S005C003P013R001A042_rgb', 'staggering/S016C003P040R002A042_rgb', 'staggering/S017C003P016R002A042_rgb', 'staggering/S017C001P003R002A042_rgb', 'staggering/S004C001P007R001A042_rgb', 'staggering/S013C002P015R001A042_rgb', 'staggering/S002C001P010R001A042_rgb', 'staggering/S008C001P036R002A042_rgb', 'staggering/S016C002P039R001A042_rgb', 'staggering/S014C001P017R001A042_rgb', 'staggering/S009C001P015R002A042_rgb', 'staggering/S008C001P029R001A042_rgb', 'staggering/S003C003P019R001A042_rgb', 'staggering/S006C002P024R002A042_rgb', 'staggering/S004C002P003R002A042_rgb', 'staggering/S005C001P010R002A042_rgb', 'staggering/S004C001P003R001A042_rgb', 'staggering/S007C002P026R002A042_rgb', 'staggering/S007C002P008R002A042_rgb', 'staggering/S015C001P017R001A042_rgb', 'staggering/S005C002P018R002A042_rgb', 'staggering/S011C001P001R001A042_rgb', 'staggering/S012C003P007R001A042_rgb', 'staggering/S006C002P023R002A042_rgb', 'staggering/S013C001P017R002A042_rgb', 'staggering/S003C003P002R002A042_rgb', 'staggering/S014C001P025R001A042_rgb', 'staggering/S011C001P018R001A042_rgb', 'staggering/S012C002P008R002A042_rgb', 'staggering/S014C001P007R001A042_rgb', 'staggering/S002C003P003R002A042_rgb', 'staggering/S011C003P002R002A042_rgb', 'staggering/S007C003P026R001A042_rgb', 'staggering/S013C003P019R001A042_rgb', 'staggering/S014C001P017R002A042_rgb', 'staggering/S001C003P006R001A042_rgb', 'staggering/S015C001P016R001A042_rgb', 'staggering/S007C003P028R002A042_rgb', 'staggering/S008C002P015R001A042_rgb', 'staggering/S017C002P003R002A042_rgb', 'staggering/S009C002P015R001A042_rgb', 'staggering/S002C001P008R001A042_rgb', 'staggering/S002C003P003R001A042_rgb', 'staggering/S012C003P019R002A042_rgb', 'staggering/S010C001P025R002A042_rgb', 'staggering/S011C001P018R002A042_rgb', 'staggering/S011C003P007R002A042_rgb', 'staggering/S009C001P025R002A042_rgb', 'staggering/S005C001P010R001A042_rgb', 'staggering/S010C003P025R002A042_rgb', 'staggering/S011C003P025R001A042_rgb', 'staggering/S007C001P028R001A042_rgb', 'staggering/S008C002P007R001A042_rgb', 'staggering/S015C001P037R002A042_rgb', 'staggering/S005C002P016R001A042_rgb', 'staggering/S012C002P017R001A042_rgb', 'staggering/S006C003P019R001A042_rgb', 'staggering/S013C001P028R001A042_rgb', 'staggering/S013C003P028R002A042_rgb', 'staggering/S014C002P017R002A042_rgb', 'staggering/S007C002P018R002A042_rgb', 'staggering/S007C003P028R001A042_rgb', 'staggering/S017C003P017R002A042_rgb', 'staggering/S014C002P008R001A042_rgb', 'staggering/S005C001P017R002A042_rgb', 'staggering/S015C003P017R002A042_rgb', 'staggering/S014C003P027R001A042_rgb', 'staggering/S008C002P019R002A042_rgb', 'staggering/S012C001P007R001A042_rgb', 'staggering/S009C002P017R002A042_rgb', 'staggering/S011C001P027R001A042_rgb', 'staggering/S016C001P021R001A042_rgb', 'staggering/S002C002P014R002A042_rgb', 'staggering/S010C003P013R002A042_rgb', 'staggering/S006C002P023R001A042_rgb', 'staggering/S016C002P021R002A042_rgb', 'staggering/S014C002P015R001A042_rgb', 'staggering/S001C001P007R001A042_rgb', 'staggering/S007C003P016R002A042_rgb', 'staggering/S014C001P037R002A042_rgb', 'staggering/S008C001P019R002A042_rgb', 'staggering/S011C002P028R002A042_rgb', 'staggering/S005C001P016R002A042_rgb', 
'staggering/S012C001P017R001A042_rgb', 'staggering/S002C002P010R001A042_rgb', 'staggering/S011C001P016R002A042_rgb', 'staggering/S002C003P007R002A042_rgb', 'staggering/S007C003P015R001A042_rgb', 'staggering/S012C003P017R002A042_rgb', 'staggering/S002C001P007R002A042_rgb', 'staggering/S017C002P007R002A042_rgb', 'staggering/S017C001P007R001A042_rgb', 'staggering/S005C001P021R001A042_rgb', 'staggering/S005C002P017R002A042_rgb', 'staggering/S001C002P006R002A042_rgb', 'staggering/S001C002P004R001A042_rgb', 'staggering/S006C003P024R001A042_rgb', 'staggering/S008C002P025R002A042_rgb', 'staggering/S006C002P019R001A042_rgb', 'staggering/S005C003P017R002A042_rgb', 'staggering/S014C001P008R002A042_rgb', 'staggering/S004C003P007R001A042_rgb', 'staggering/S007C002P019R002A042_rgb', 'staggering/S012C001P037R001A042_rgb', 'staggering/S013C003P028R001A042_rgb', 'staggering/S014C002P017R001A042_rgb', 'staggering/S010C001P025R001A042_rgb', 'staggering/S015C002P016R002A042_rgb', 'staggering/S014C003P037R002A042_rgb', 'staggering/S006C003P001R002A042_rgb', 'staggering/S008C003P036R001A042_rgb', 'staggering/S001C001P001R002A042_rgb', 'staggering/S014C003P019R001A042_rgb', 'staggering/S001C002P007R001A042_rgb', 'staggering/S005C002P015R002A042_rgb', 'staggering/S002C002P012R002A042_rgb', 'staggering/S011C001P002R001A042_rgb', 'staggering/S012C003P028R001A042_rgb', 'staggering/S012C003P018R002A042_rgb', 'staggering/S003C003P018R001A042_rgb', 'staggering/S002C001P014R001A042_rgb', 'staggering/S001C003P003R001A042_rgb', 'staggering/S007C001P028R002A042_rgb', 'staggering/S001C002P003R001A042_rgb', 'staggering/S012C002P018R002A042_rgb', 'staggering/S010C002P019R002A042_rgb', 'staggering/S002C001P011R002A042_rgb', 'staggering/S017C001P008R002A042_rgb', 'staggering/S011C001P025R002A042_rgb', 'staggering/S002C003P012R002A042_rgb', 'staggering/S017C001P020R001A042_rgb', 'staggering/S007C002P027R002A042_rgb', 'staggering/S003C002P002R002A042_rgb', 'staggering/S009C001P008R001A042_rgb', 'staggering/S007C001P007R002A042_rgb', 'staggering/S016C001P025R001A042_rgb', 'staggering/S008C002P029R002A042_rgb', 'staggering/S015C001P007R002A042_rgb', 'staggering/S011C001P019R001A042_rgb', 'staggering/S010C003P017R002A042_rgb', 'staggering/S007C001P015R002A042_rgb', 'staggering/S009C001P017R001A042_rgb', 'staggering/S013C002P037R001A042_rgb', 'staggering/S001C003P004R002A042_rgb', 'staggering/S009C003P007R002A042_rgb', 'staggering/S007C003P026R002A042_rgb', 'staggering/S016C002P039R002A042_rgb', 'staggering/S002C002P012R001A042_rgb', 'staggering/S002C002P007R002A042_rgb', 'staggering/S003C001P001R001A042_rgb', 'staggering/S004C002P003R001A042_rgb', 'staggering/S006C002P016R002A042_rgb', 'staggering/S016C002P008R002A042_rgb', 'staggering/S001C003P008R001A042_rgb', 'staggering/S001C003P005R002A042_rgb', 'staggering/S015C003P025R002A042_rgb', 'staggering/S006C001P001R001A042_rgb', 'staggering/S012C001P037R002A042_rgb', 'staggering/S016C001P007R002A042_rgb', 'staggering/S012C002P007R001A042_rgb', 'staggering/S011C002P007R001A042_rgb', 'staggering/S017C001P009R002A042_rgb', 'staggering/S003C003P016R002A042_rgb', 'staggering/S016C001P040R002A042_rgb', 'staggering/S011C002P015R002A042_rgb', 'staggering/S006C001P015R001A042_rgb', 'staggering/S004C002P020R002A042_rgb', 'staggering/S013C001P028R002A042_rgb', 'staggering/S004C003P007R002A042_rgb', 'staggering/S007C003P025R001A042_rgb', 'staggering/S011C003P038R002A042_rgb', 'staggering/S003C001P007R001A042_rgb', 'staggering/S015C003P019R001A042_rgb', 'staggering/S011C002P027R001A042_rgb', 
'staggering/S002C003P010R001A042_rgb', 'staggering/S011C002P018R002A042_rgb', 'staggering/S007C003P001R002A042_rgb', 'staggering/S006C003P015R001A042_rgb', 'staggering/S010C003P008R002A042_rgb', 'staggering/S006C002P017R001A042_rgb', 'staggering/S008C002P035R001A042_rgb', 'staggering/S003C002P017R001A042_rgb', 'staggering/S008C003P033R001A042_rgb', 'staggering/S016C002P007R001A042_rgb', 'staggering/S007C002P017R001A042_rgb', 'staggering/S007C001P015R001A042_rgb', 'staggering/S015C002P007R002A042_rgb', 'staggering/S001C002P001R002A042_rgb', 'staggering/S017C002P015R001A042_rgb', 'staggering/S017C002P016R001A042_rgb', 'staggering/S012C003P008R001A042_rgb', 'staggering/S012C001P016R001A042_rgb', 'staggering/S004C003P003R001A042_rgb', 'staggering/S010C003P007R002A042_rgb', 'staggering/S016C002P021R001A042_rgb', 'staggering/S010C002P025R002A042_rgb', 'staggering/S005C002P021R001A042_rgb', 'staggering/S015C001P015R002A042_rgb', 'staggering/S011C003P008R001A042_rgb', 'staggering/S006C001P024R001A042_rgb', 'staggering/S003C001P008R001A042_rgb', 'staggering/S001C002P007R002A042_rgb', 'staggering/S009C002P019R002A042_rgb', 'staggering/S007C001P008R001A042_rgb', 'staggering/S004C003P020R002A042_rgb', 'staggering/S007C002P015R001A042_rgb', 'staggering/S012C001P025R001A042_rgb', 'staggering/S012C001P016R002A042_rgb', 'staggering/S016C003P019R002A042_rgb', 'staggering/S015C001P008R001A042_rgb', 'staggering/S011C001P027R002A042_rgb', 'staggering/S016C001P019R002A042_rgb', 'staggering/S002C003P007R001A042_rgb', 'staggering/S010C002P008R002A042_rgb', 'staggering/S010C001P017R001A042_rgb', 'staggering/S008C001P031R002A042_rgb', 'staggering/S012C001P017R002A042_rgb', 'staggering/S014C003P015R002A042_rgb', 'staggering/S007C001P027R001A042_rgb', 'staggering/S014C001P039R001A042_rgb', 'staggering/S017C003P007R001A042_rgb', 'staggering/S006C003P001R001A042_rgb', 'staggering/S011C001P019R002A042_rgb', 'staggering/S002C003P008R002A042_rgb', 'staggering/S003C002P002R001A042_rgb', 'staggering/S012C002P025R001A042_rgb', 'staggering/S007C003P007R001A042_rgb', 'staggering/S012C003P007R002A042_rgb', 'staggering/S007C003P027R002A042_rgb', 'staggering/S015C003P037R002A042_rgb', 'staggering/S010C002P016R001A042_rgb', 'staggering/S011C003P018R002A042_rgb', 'staggering/S013C002P027R002A042_rgb', 'staggering/S006C001P007R002A042_rgb', 'staggering/S013C002P025R001A042_rgb', 'staggering/S004C001P007R002A042_rgb', 'staggering/S010C003P008R001A042_rgb', 'staggering/S014C001P027R002A042_rgb', 'staggering/S010C003P018R002A042_rgb', 'staggering/S012C003P037R001A042_rgb', 'staggering/S004C001P008R002A042_rgb', 'staggering/S008C001P034R002A042_rgb', 'staggering/S013C003P007R001A042_rgb', 'staggering/S008C002P036R002A042_rgb', 'staggering/S004C001P020R001A042_rgb', 'staggering/S008C003P034R001A042_rgb', 'staggering/S010C002P018R001A042_rgb', 'staggering/S011C002P017R001A042_rgb', 'staggering/S015C001P019R002A042_rgb', 'staggering/S009C003P016R002A042_rgb', 'staggering/S002C001P013R002A042_rgb', 'staggering/S009C002P017R001A042_rgb', 'staggering/S013C003P037R002A042_rgb', 'staggering/S014C002P037R002A042_rgb', 'staggering/S014C002P008R002A042_rgb', 'staggering/S009C002P016R001A042_rgb', 'staggering/S009C002P007R001A042_rgb', 'staggering/S011C001P007R001A042_rgb', 'staggering/S003C003P015R001A042_rgb', 'staggering/S008C002P001R001A042_rgb', 'staggering/S003C002P001R002A042_rgb', 'staggering/S013C001P016R001A042_rgb', 'staggering/S011C002P017R002A042_rgb', 'staggering/S005C001P018R001A042_rgb', 'staggering/S008C001P025R001A042_rgb', 
'staggering/S005C003P004R002A042_rgb', 'staggering/S003C002P008R002A042_rgb', 'staggering/S005C001P017R001A042_rgb', 'staggering/S002C002P014R001A042_rgb', 'staggering/S012C001P015R002A042_rgb', 'staggering/S014C002P019R001A042_rgb', 'staggering/S006C002P017R002A042_rgb', 'staggering/S011C002P007R002A042_rgb', 'staggering/S008C001P030R001A042_rgb', 'staggering/S008C003P035R001A042_rgb', 'staggering/S001C001P003R002A042_rgb', 'staggering/S014C001P007R002A042_rgb', 'staggering/S016C003P019R001A042_rgb', 'staggering/S005C002P016R002A042_rgb', 'staggering/S013C003P015R001A042_rgb', 'staggering/S005C001P018R002A042_rgb', 'staggering/S008C002P032R001A042_rgb', 'staggering/S003C002P019R001A042_rgb', 'staggering/S010C001P008R002A042_rgb', 'staggering/S013C003P018R002A042_rgb', 'staggering/S011C002P015R001A042_rgb', 'staggering/S013C002P018R002A042_rgb', 'staggering/S003C003P002R001A042_rgb', 'staggering/S005C001P015R001A042_rgb', 'staggering/S014C003P025R002A042_rgb', 'staggering/S011C003P002R001A042_rgb', 'staggering/S011C001P017R002A042_rgb', 'staggering/S009C003P007R001A042_rgb', 'staggering/S017C003P020R002A042_rgb', 'staggering/S012C002P016R001A042_rgb', 'staggering/S002C002P013R001A042_rgb', 'staggering/S014C001P027R001A042_rgb', 'staggering/S007C003P008R002A042_rgb', 'staggering/S001C003P003R002A042_rgb', 'staggering/S016C002P040R002A042_rgb', 'staggering/S005C001P004R001A042_rgb', 'staggering/S010C001P016R002A042_rgb', 'staggering/S011C002P019R001A042_rgb', 'staggering/S010C001P008R001A042_rgb', 'staggering/S014C003P037R001A042_rgb', 'staggering/S011C002P001R002A042_rgb', 'staggering/S002C003P011R002A042_rgb', 'staggering/S010C002P008R001A042_rgb', 'staggering/S015C003P019R002A042_rgb', 'staggering/S008C002P030R002A042_rgb', 'staggering/S008C001P015R001A042_rgb', 'staggering/S011C001P001R002A042_rgb', 'staggering/S005C003P010R001A042_rgb', 'staggering/S008C001P032R001A042_rgb', 'staggering/S015C002P019R001A042_rgb', 'staggering/S010C003P021R002A042_rgb', 'staggering/S001C001P003R001A042_rgb', 'staggering/S017C002P016R002A042_rgb', 'staggering/S007C001P018R002A042_rgb', 'staggering/S001C002P004R002A042_rgb', 'staggering/S012C001P018R001A042_rgb', 'staggering/S017C003P008R001A042_rgb', 'staggering/S008C003P008R001A042_rgb', 'staggering/S003C002P019R002A042_rgb', 'staggering/S014C002P007R001A042_rgb', 'staggering/S007C002P018R001A042_rgb', 'staggering/S011C002P028R001A042_rgb', 'staggering/S017C003P015R002A042_rgb', 'staggering/S008C002P007R002A042_rgb', 'staggering/S011C003P001R001A042_rgb', 'staggering/S003C003P017R002A042_rgb', 'staggering/S011C003P025R002A042_rgb', 'staggering/S006C001P008R002A042_rgb', 'staggering/S016C003P021R002A042_rgb', 'staggering/S005C001P016R001A042_rgb', 'staggering/S001C002P002R001A042_rgb', 'staggering/S015C003P007R002A042_rgb', 'staggering/S017C002P008R002A042_rgb', 'staggering/S001C003P004R001A042_rgb', 'staggering/S009C001P015R001A042_rgb', 'staggering/S008C002P025R001A042_rgb', 'staggering/S001C003P006R002A042_rgb', 'staggering/S006C003P016R001A042_rgb', 'staggering/S008C001P008R001A042_rgb', 'staggering/S010C001P016R001A042_rgb', 'staggering/S003C002P018R002A042_rgb', 'staggering/S005C003P016R002A042_rgb', 'staggering/S008C002P035R002A042_rgb', 'staggering/S006C003P024R002A042_rgb', 'staggering/S016C002P008R001A042_rgb', 'staggering/S013C003P018R001A042_rgb', 'staggering/S005C003P016R001A042_rgb', 'staggering/S007C002P001R002A042_rgb', 'staggering/S008C002P001R002A042_rgb', 'staggering/S014C001P019R001A042_rgb', 'staggering/S015C003P037R001A042_rgb', 
'staggering/S015C001P017R002A042_rgb', 'staggering/S012C001P019R002A042_rgb', 'staggering/S011C002P038R001A042_rgb', 'staggering/S008C003P031R001A042_rgb', 'staggering/S008C003P025R001A042_rgb', 'staggering/S006C002P015R001A042_rgb', 'staggering/S016C001P007R001A042_rgb', 'staggering/S001C001P005R002A042_rgb', 'staggering/S002C001P007R001A042_rgb', 'staggering/S005C003P018R001A042_rgb', 'staggering/S011C001P002R002A042_rgb', 'staggering/S015C002P015R001A042_rgb', 'staggering/S006C001P016R001A042_rgb', 'staggering/S017C003P009R001A042_rgb', 'staggering/S016C003P008R002A042_rgb', 'staggering/S009C003P015R001A042_rgb', 'staggering/S008C003P034R002A042_rgb', 'staggering/S011C002P025R001A042_rgb', 'staggering/S003C003P017R001A042_rgb', 'staggering/S001C001P007R002A042_rgb', 'staggering/S012C002P027R001A042_rgb', 'staggering/S012C001P027R001A042_rgb', 'staggering/S009C001P025R001A042_rgb', 'staggering/S017C001P017R002A042_rgb', 'staggering/S007C003P019R002A042_rgb', 'staggering/S010C001P019R001A042_rgb', 'staggering/S006C002P008R002A042_rgb', 'staggering/S017C002P017R002A042_rgb', 'staggering/S001C001P002R002A042_rgb', 'staggering/S007C001P026R002A042_rgb', 'staggering/S010C001P007R001A042_rgb', 'staggering/S009C003P019R001A042_rgb', 'staggering/S009C003P017R001A042_rgb', 'staggering/S007C001P001R002A042_rgb', 'staggering/S015C001P025R001A042_rgb', 'staggering/S017C002P020R001A042_rgb', 'staggering/S008C001P007R001A042_rgb', 'staggering/S010C002P017R002A042_rgb', 'staggering/S003C002P001R001A042_rgb', 'staggering/S017C002P020R002A042_rgb', 'staggering/S012C002P015R001A042_rgb', 'staggering/S017C003P020R001A042_rgb', 'staggering/S014C001P015R001A042_rgb', 'staggering/S001C002P008R001A042_rgb', 'staggering/S011C003P007R001A042_rgb', 'staggering/S013C002P027R001A042_rgb', 'staggering/S010C001P015R002A042_rgb', 'staggering/S010C002P015R002A042_rgb', 'staggering/S011C002P019R002A042_rgb', 'staggering/S003C003P008R001A042_rgb', 'staggering/S009C002P007R002A042_rgb', 'staggering/S003C003P001R001A042_rgb', 'staggering/S008C001P019R001A042_rgb', 'staggering/S014C001P008R001A042_rgb', 'staggering/S012C002P019R002A042_rgb', 'staggering/S004C003P008R002A042_rgb', 'staggering/S017C003P016R001A042_rgb', 'staggering/S016C002P025R001A042_rgb', 'staggering/S008C001P015R002A042_rgb', 'staggering/S012C001P027R002A042_rgb', 'staggering/S005C001P015R002A042_rgb', 'staggering/S013C002P018R001A042_rgb', 'staggering/S016C001P039R001A042_rgb', 'staggering/S006C002P022R002A042_rgb', 'staggering/S013C003P025R002A042_rgb', 'staggering/S001C001P004R002A042_rgb', 'staggering/S006C001P024R002A042_rgb', 'staggering/S008C003P033R002A042_rgb', 'staggering/S006C003P017R001A042_rgb', 'staggering/S011C001P038R002A042_rgb', 'staggering/S008C003P030R002A042_rgb', 'staggering/S002C003P009R001A042_rgb', 'staggering/S008C001P030R002A042_rgb', 'staggering/S001C001P008R002A042_rgb', 'staggering/S014C002P019R002A042_rgb', 'staggering/S007C002P007R002A042_rgb', 'staggering/S001C002P005R001A042_rgb', 'staggering/S004C002P020R001A042_rgb', 'staggering/S014C003P017R002A042_rgb', 'staggering/S008C003P035R002A042_rgb', 'staggering/S005C003P013R002A042_rgb', 'staggering/S002C003P012R001A042_rgb', 'staggering/S003C003P016R001A042_rgb', 'staggering/S007C003P008R001A042_rgb', 'staggering/S017C002P017R001A042_rgb', 'staggering/S008C001P001R001A042_rgb', 'staggering/S015C001P008R002A042_rgb', 'staggering/S007C002P008R001A042_rgb', 'staggering/S015C002P008R001A042_rgb', 'staggering/S007C003P019R001A042_rgb', 'staggering/S014C002P039R002A042_rgb', 
'staggering/S010C002P007R001A042_rgb', 'staggering/S006C002P019R002A042_rgb', 'staggering/S004C002P008R001A042_rgb', 'staggering/S017C001P003R001A042_rgb', 'staggering/S013C002P008R001A042_rgb', 'staggering/S012C001P028R001A042_rgb', 'staggering/S013C001P037R001A042_rgb', 'staggering/S003C001P017R001A042_rgb', 'staggering/S012C002P037R001A042_rgb', 'staggering/S017C003P008R002A042_rgb', 'staggering/S013C003P016R001A042_rgb', 'staggering/S010C002P025R001A042_rgb', 'staggering/S016C003P008R001A042_rgb', 'staggering/S016C003P025R002A042_rgb', 'staggering/S011C003P019R002A042_rgb', 'staggering/S007C003P016R001A042_rgb', 'staggering/S010C003P019R002A042_rgb', 'staggering/S013C001P007R002A042_rgb', 'staggering/S012C002P028R002A042_rgb', 'staggering/S011C001P016R001A042_rgb', 'staggering/S013C001P007R001A042_rgb', 'staggering/S009C002P025R001A042_rgb', 'staggering/S013C003P015R002A042_rgb', 'staggering/S006C001P017R002A042_rgb', 'staggering/S014C003P017R001A042_rgb', 'staggering/S006C003P007R001A042_rgb', 'staggering/S013C001P018R001A042_rgb', 'staggering/S014C003P025R001A042_rgb', 'staggering/S015C003P025R001A042_rgb', 'staggering/S012C001P025R002A042_rgb', 'staggering/S012C003P015R002A042_rgb', 'staggering/S001C003P001R002A042_rgb', 'staggering/S008C001P035R002A042_rgb', 'staggering/S009C001P019R002A042_rgb', 'staggering/S014C002P039R001A042_rgb', 'staggering/S007C001P016R001A042_rgb', 'staggering/S007C001P017R001A042_rgb', 'staggering/S015C002P037R002A042_rgb', 'staggering/S002C003P011R001A042_rgb', 'staggering/S013C003P008R001A042_rgb', 'staggering/S001C003P002R002A042_rgb', 'staggering/S011C003P027R002A042_rgb', 'staggering/S007C001P018R001A042_rgb', 'staggering/S011C003P016R002A042_rgb', 'staggering/S008C003P008R002A042_rgb', 'staggering/S007C002P028R002A042_rgb', 'staggering/S013C001P037R002A042_rgb', 'staggering/S010C001P015R001A042_rgb', 'staggering/S011C003P019R001A042_rgb', 'staggering/S014C003P027R002A042_rgb', 'staggering/S005C003P004R001A042_rgb', 'staggering/S008C003P029R002A042_rgb', 'staggering/S008C001P034R001A042_rgb', 'staggering/S007C001P019R001A042_rgb', 'staggering/S009C002P008R002A042_rgb', 'staggering/S013C003P017R001A042_rgb', 'staggering/S006C003P016R002A042_rgb', 'staggering/S007C001P017R002A042_rgb', 'staggering/S011C001P007R002A042_rgb', 'staggering/S007C002P007R001A042_rgb', 'staggering/S010C003P021R001A042_rgb', 'staggering/S001C002P008R002A042_rgb', 'staggering/S013C002P028R001A042_rgb', 'staggering/S010C002P017R001A042_rgb', 'staggering/S014C001P039R002A042_rgb', 'staggering/S012C002P028R001A042_rgb', 'staggering/S012C002P019R001A042_rgb', 'staggering/S015C003P008R002A042_rgb', 'staggering/S006C002P015R002A042_rgb', 'staggering/S010C002P016R002A042_rgb', 'staggering/S010C002P021R002A042_rgb', 'staggering/S001C002P001R001A042_rgb', 'staggering/S007C001P007R001A042_rgb', 'staggering/S002C003P013R002A042_rgb', 'staggering/S009C003P008R001A042_rgb', 'staggering/S016C003P007R001A042_rgb', 'staggering/S010C001P018R001A042_rgb', 'staggering/S017C001P016R002A042_rgb', 'staggering/S002C001P012R001A042_rgb', 'staggering/S012C002P007R002A042_rgb', 'staggering/S002C002P008R002A042_rgb', 'staggering/S003C001P017R002A042_rgb', 'staggering/S011C003P015R001A042_rgb', 'staggering/S003C001P015R002A042_rgb', 'staggering/S008C003P019R001A042_rgb', 'staggering/S003C001P016R001A042_rgb', 'fallingDown/S015C003P016R001A043_rgb', 'fallingDown/S008C002P025R001A043_rgb', 'fallingDown/S003C001P016R002A043_rgb', 'fallingDown/S009C003P017R002A043_rgb', 'fallingDown/S015C003P019R001A043_rgb', 
'fallingDown/S010C001P015R001A043_rgb', 'fallingDown/S011C001P002R001A043_rgb', 'fallingDown/S008C002P033R002A043_rgb', 'fallingDown/S007C002P017R002A043_rgb', 'fallingDown/S004C002P003R001A043_rgb', 'fallingDown/S016C003P039R001A043_rgb', 'fallingDown/S002C002P011R001A043_rgb', 'fallingDown/S011C002P015R002A043_rgb', 'fallingDown/S010C003P018R001A043_rgb', 'fallingDown/S009C001P017R001A043_rgb', 'fallingDown/S009C002P025R002A043_rgb', 'fallingDown/S010C003P017R001A043_rgb', 'fallingDown/S016C003P019R001A043_rgb', 'fallingDown/S003C001P007R002A043_rgb', 'fallingDown/S014C001P017R001A043_rgb', 'fallingDown/S013C002P016R002A043_rgb', 'fallingDown/S010C001P017R002A043_rgb', 'fallingDown/S001C003P004R002A043_rgb', 'fallingDown/S008C003P029R001A043_rgb', 'fallingDown/S017C002P003R001A043_rgb', 'fallingDown/S017C003P009R001A043_rgb', 'fallingDown/S009C001P007R001A043_rgb', 'fallingDown/S002C002P010R002A043_rgb', 'fallingDown/S012C003P017R002A043_rgb', 'fallingDown/S008C002P036R002A043_rgb', 'fallingDown/S001C003P006R001A043_rgb', 'fallingDown/S015C001P015R001A043_rgb', 'fallingDown/S003C002P016R002A043_rgb', 'fallingDown/S012C003P028R002A043_rgb', 'fallingDown/S008C003P034R002A043_rgb', 'fallingDown/S005C002P017R001A043_rgb', 'fallingDown/S012C001P025R001A043_rgb', 'fallingDown/S014C003P015R002A043_rgb', 'fallingDown/S016C001P008R001A043_rgb', 'fallingDown/S001C001P001R002A043_rgb', 'fallingDown/S001C001P004R001A043_rgb', 'fallingDown/S010C002P018R001A043_rgb', 'fallingDown/S016C001P007R001A043_rgb', 'fallingDown/S013C001P015R001A043_rgb', 'fallingDown/S003C002P019R002A043_rgb', 'fallingDown/S006C003P022R001A043_rgb', 'fallingDown/S016C001P021R002A043_rgb', 'fallingDown/S008C001P025R001A043_rgb', 'fallingDown/S006C002P008R001A043_rgb', 'fallingDown/S002C003P003R001A043_rgb', 'fallingDown/S011C003P016R001A043_rgb', 'fallingDown/S010C003P013R002A043_rgb', 'fallingDown/S016C002P008R002A043_rgb', 'fallingDown/S007C001P017R002A043_rgb', 'fallingDown/S008C001P032R002A043_rgb', 'fallingDown/S011C001P019R001A043_rgb', 'fallingDown/S001C001P004R002A043_rgb', 'fallingDown/S008C001P019R001A043_rgb', 'fallingDown/S010C001P025R002A043_rgb', 'fallingDown/S012C001P019R002A043_rgb', 'fallingDown/S008C002P019R002A043_rgb', 'fallingDown/S011C001P028R002A043_rgb', 'fallingDown/S006C002P023R002A043_rgb', 'fallingDown/S011C003P017R001A043_rgb', 'fallingDown/S008C003P015R002A043_rgb', 'fallingDown/S013C003P027R002A043_rgb', 'fallingDown/S014C003P039R002A043_rgb', 'fallingDown/S011C003P015R001A043_rgb', 'fallingDown/S010C001P007R001A043_rgb', 'fallingDown/S006C002P001R002A043_rgb', 'fallingDown/S017C003P017R002A043_rgb', 'fallingDown/S008C001P015R001A043_rgb', 'fallingDown/S006C001P022R002A043_rgb', 'fallingDown/S002C002P008R001A043_rgb', 'fallingDown/S012C003P015R002A043_rgb', 'fallingDown/S017C001P009R001A043_rgb', 'fallingDown/S006C001P008R002A043_rgb', 'fallingDown/S012C001P028R001A043_rgb', 'fallingDown/S002C003P011R001A043_rgb', 'fallingDown/S007C003P026R001A043_rgb', 'fallingDown/S008C003P019R001A043_rgb', 'fallingDown/S017C001P003R001A043_rgb', 'fallingDown/S001C002P006R002A043_rgb', 'fallingDown/S014C001P019R001A043_rgb', 'fallingDown/S017C002P020R002A043_rgb', 'fallingDown/S008C002P032R002A043_rgb', 'fallingDown/S007C001P025R002A043_rgb', 'fallingDown/S001C003P008R001A043_rgb', 'fallingDown/S012C001P007R002A043_rgb', 'fallingDown/S001C002P003R002A043_rgb', 'fallingDown/S009C002P015R002A043_rgb', 'fallingDown/S017C003P009R002A043_rgb', 'fallingDown/S011C001P007R002A043_rgb', 
'fallingDown/S013C003P028R001A043_rgb', 'fallingDown/S011C001P017R002A043_rgb', 'fallingDown/S007C003P007R001A043_rgb', 'fallingDown/S008C001P001R002A043_rgb', 'fallingDown/S013C002P028R002A043_rgb', 'fallingDown/S007C002P017R001A043_rgb', 'fallingDown/S003C002P015R001A043_rgb', 'fallingDown/S016C001P025R001A043_rgb', 'fallingDown/S002C003P010R001A043_rgb', 'fallingDown/S002C001P012R001A043_rgb', 'fallingDown/S011C001P027R002A043_rgb', 'fallingDown/S008C001P007R002A043_rgb', 'fallingDown/S007C001P026R002A043_rgb', 'fallingDown/S011C003P025R002A043_rgb', 'fallingDown/S010C002P013R001A043_rgb', 'fallingDown/S001C003P001R001A043_rgb', 'fallingDown/S009C002P007R002A043_rgb', 'fallingDown/S010C002P007R001A043_rgb', 'fallingDown/S003C002P007R001A043_rgb', 'fallingDown/S008C003P029R002A043_rgb', 'fallingDown/S008C003P032R002A043_rgb', 'fallingDown/S003C002P008R002A043_rgb', 'fallingDown/S015C002P008R001A043_rgb', 'fallingDown/S015C001P016R001A043_rgb', 'fallingDown/S010C001P018R002A043_rgb', 'fallingDown/S013C001P017R001A043_rgb', 'fallingDown/S004C002P003R002A043_rgb', 'fallingDown/S002C002P011R002A043_rgb', 'fallingDown/S012C002P007R002A043_rgb', 'fallingDown/S014C001P015R001A043_rgb', 'fallingDown/S010C002P016R001A043_rgb', 'fallingDown/S008C001P019R002A043_rgb', 'fallingDown/S007C003P001R002A043_rgb', 'fallingDown/S014C001P007R002A043_rgb', 'fallingDown/S013C002P025R001A043_rgb', 'fallingDown/S017C001P016R001A043_rgb', 'fallingDown/S014C003P025R002A043_rgb', 'fallingDown/S013C003P008R002A043_rgb', 'fallingDown/S001C002P006R001A043_rgb', 'fallingDown/S003C003P007R002A043_rgb', 'fallingDown/S013C001P027R002A043_rgb', 'fallingDown/S008C001P033R002A043_rgb', 'fallingDown/S008C003P007R001A043_rgb', 'fallingDown/S002C003P014R002A043_rgb', 'fallingDown/S003C001P018R002A043_rgb', 'fallingDown/S007C002P008R002A043_rgb', 'fallingDown/S005C003P021R001A043_rgb', 'fallingDown/S013C002P007R002A043_rgb', 'fallingDown/S005C001P010R001A043_rgb', 'fallingDown/S013C001P025R001A043_rgb', 'fallingDown/S009C001P015R001A043_rgb', 'fallingDown/S005C002P010R002A043_rgb', 'fallingDown/S003C003P019R001A043_rgb', 'fallingDown/S013C002P015R002A043_rgb', 'fallingDown/S008C003P007R002A043_rgb', 'fallingDown/S013C003P025R002A043_rgb', 'fallingDown/S007C003P018R001A043_rgb', 'fallingDown/S012C002P028R002A043_rgb', 'fallingDown/S002C003P014R001A043_rgb', 'fallingDown/S007C002P015R001A043_rgb', 'fallingDown/S009C002P025R001A043_rgb', 'fallingDown/S006C003P024R002A043_rgb', 'fallingDown/S010C003P015R002A043_rgb', 'fallingDown/S008C002P031R001A043_rgb', 'fallingDown/S007C003P019R002A043_rgb', 'fallingDown/S011C001P018R002A043_rgb', 'fallingDown/S011C001P025R001A043_rgb', 'fallingDown/S007C002P027R002A043_rgb', 'fallingDown/S017C003P003R002A043_rgb', 'fallingDown/S014C002P027R001A043_rgb', 'fallingDown/S010C001P008R002A043_rgb', 'fallingDown/S012C001P017R002A043_rgb', 'fallingDown/S008C001P030R002A043_rgb', 'fallingDown/S008C002P008R002A043_rgb', 'fallingDown/S006C002P024R001A043_rgb', 'fallingDown/S013C002P037R002A043_rgb', 'fallingDown/S009C001P016R002A043_rgb', 'fallingDown/S002C003P003R002A043_rgb', 'fallingDown/S014C003P017R002A043_rgb', 'fallingDown/S001C001P006R001A043_rgb', 'fallingDown/S012C002P019R001A043_rgb', 'fallingDown/S017C002P017R001A043_rgb', 'fallingDown/S013C003P018R002A043_rgb', 'fallingDown/S010C003P018R002A043_rgb', 'fallingDown/S013C003P016R002A043_rgb', 'fallingDown/S017C001P017R002A043_rgb', 'fallingDown/S013C003P025R001A043_rgb', 'fallingDown/S007C003P016R001A043_rgb', 
'fallingDown/S013C002P008R001A043_rgb', 'fallingDown/S016C002P007R001A043_rgb', 'fallingDown/S005C001P017R001A043_rgb', 'fallingDown/S006C001P024R002A043_rgb', 'fallingDown/S014C002P025R002A043_rgb', 'fallingDown/S016C002P007R002A043_rgb', 'fallingDown/S011C002P001R001A043_rgb', 'fallingDown/S012C002P008R001A043_rgb', 'fallingDown/S006C002P016R002A043_rgb', 'fallingDown/S015C003P019R002A043_rgb', 'fallingDown/S009C003P019R001A043_rgb', 'fallingDown/S014C001P025R001A043_rgb', 'fallingDown/S004C003P020R001A043_rgb', 'fallingDown/S006C002P017R002A043_rgb', 'fallingDown/S008C001P034R002A043_rgb', 'fallingDown/S011C002P007R001A043_rgb', 'fallingDown/S012C001P008R002A043_rgb', 'fallingDown/S007C002P019R002A043_rgb', 'fallingDown/S015C003P017R002A043_rgb', 'fallingDown/S005C002P016R002A043_rgb', 'fallingDown/S010C001P019R001A043_rgb', 'fallingDown/S013C002P017R002A043_rgb', 'fallingDown/S002C003P012R001A043_rgb', 'fallingDown/S011C003P027R001A043_rgb', 'fallingDown/S008C002P035R002A043_rgb', 'fallingDown/S013C002P019R001A043_rgb', 'fallingDown/S015C001P016R002A043_rgb', 'fallingDown/S006C001P008R001A043_rgb', 'fallingDown/S016C003P007R001A043_rgb', 'fallingDown/S011C002P025R002A043_rgb', 'fallingDown/S013C003P017R002A043_rgb', 'fallingDown/S012C002P028R001A043_rgb', 'fallingDown/S011C003P016R002A043_rgb', 'fallingDown/S011C003P017R002A043_rgb', 'fallingDown/S008C001P032R001A043_rgb', 'fallingDown/S007C002P007R002A043_rgb', 'fallingDown/S012C001P015R001A043_rgb', 'fallingDown/S015C002P017R001A043_rgb', 'fallingDown/S017C001P003R002A043_rgb', 'fallingDown/S009C003P017R001A043_rgb', 'fallingDown/S012C002P025R001A043_rgb', 'fallingDown/S013C002P008R002A043_rgb', 'fallingDown/S005C003P013R001A043_rgb', 'fallingDown/S015C003P007R002A043_rgb', 'fallingDown/S017C002P009R001A043_rgb', 'fallingDown/S008C003P036R001A043_rgb', 'fallingDown/S012C001P037R001A043_rgb', 'fallingDown/S006C003P008R001A043_rgb', 'fallingDown/S010C001P018R001A043_rgb', 'fallingDown/S017C003P017R001A043_rgb', 'fallingDown/S015C001P019R001A043_rgb', 'fallingDown/S007C003P015R002A043_rgb', 'fallingDown/S011C001P018R001A043_rgb', 'fallingDown/S007C002P027R001A043_rgb', 'fallingDown/S017C001P015R002A043_rgb', 'fallingDown/S006C002P001R001A043_rgb', 'fallingDown/S017C003P020R001A043_rgb', 'fallingDown/S004C003P008R001A043_rgb', 'fallingDown/S011C003P007R001A043_rgb', 'fallingDown/S013C002P037R001A043_rgb', 'fallingDown/S015C003P015R002A043_rgb', 'fallingDown/S002C001P013R002A043_rgb', 'fallingDown/S012C002P015R001A043_rgb', 'fallingDown/S009C003P008R001A043_rgb', 'fallingDown/S008C002P031R002A043_rgb', 'fallingDown/S005C001P018R001A043_rgb', 'fallingDown/S014C002P019R001A043_rgb', 'fallingDown/S006C001P001R001A043_rgb', 'fallingDown/S009C003P015R001A043_rgb', 'fallingDown/S001C001P002R001A043_rgb', 'fallingDown/S012C001P037R002A043_rgb', 'fallingDown/S015C003P007R001A043_rgb', 'fallingDown/S010C002P021R002A043_rgb', 'fallingDown/S010C003P025R002A043_rgb', 'fallingDown/S004C001P020R002A043_rgb', 'fallingDown/S001C001P001R001A043_rgb', 'fallingDown/S008C003P031R001A043_rgb', 'fallingDown/S007C001P025R001A043_rgb', 'fallingDown/S012C003P027R002A043_rgb', 'fallingDown/S006C003P019R001A043_rgb', 'fallingDown/S005C003P021R002A043_rgb', 'fallingDown/S008C003P030R001A043_rgb', 'fallingDown/S003C002P018R002A043_rgb', 'fallingDown/S015C003P008R001A043_rgb', 'fallingDown/S004C002P020R001A043_rgb', 'fallingDown/S005C002P021R001A043_rgb', 'fallingDown/S017C003P015R001A043_rgb', 'fallingDown/S015C001P019R002A043_rgb', 
'fallingDown/S015C001P008R002A043_rgb', 'fallingDown/S007C002P016R001A043_rgb', 'fallingDown/S005C003P018R001A043_rgb', 'fallingDown/S007C001P001R001A043_rgb', 'fallingDown/S015C002P019R001A043_rgb', 'fallingDown/S008C003P034R001A043_rgb', 'fallingDown/S016C003P025R001A043_rgb', 'fallingDown/S010C001P013R002A043_rgb', 'fallingDown/S004C001P020R001A043_rgb', 'fallingDown/S015C001P007R001A043_rgb', 'fallingDown/S009C001P007R002A043_rgb', 'fallingDown/S016C002P019R002A043_rgb', 'fallingDown/S012C003P018R002A043_rgb', 'fallingDown/S014C001P027R002A043_rgb', 'fallingDown/S012C003P037R002A043_rgb', 'fallingDown/S017C002P008R002A043_rgb', 'fallingDown/S007C002P026R002A043_rgb', 'fallingDown/S016C001P021R001A043_rgb', 'fallingDown/S012C002P018R001A043_rgb', 'fallingDown/S010C002P021R001A043_rgb', 'fallingDown/S014C002P039R002A043_rgb', 'fallingDown/S011C001P016R001A043_rgb', 'fallingDown/S011C003P015R002A043_rgb', 'fallingDown/S004C003P007R002A043_rgb', 'fallingDown/S006C001P016R002A043_rgb', 'fallingDown/S011C002P016R002A043_rgb', 'fallingDown/S009C002P007R001A043_rgb', 'fallingDown/S013C003P019R001A043_rgb', 'fallingDown/S015C002P017R002A043_rgb', 'fallingDown/S001C003P006R002A043_rgb', 'fallingDown/S007C003P028R002A043_rgb', 'fallingDown/S013C001P019R002A043_rgb', 'fallingDown/S010C001P015R002A043_rgb', 'fallingDown/S016C002P021R002A043_rgb', 'fallingDown/S001C002P005R001A043_rgb', 'fallingDown/S007C002P025R002A043_rgb', 'fallingDown/S002C003P012R002A043_rgb', 'fallingDown/S016C003P021R001A043_rgb', 'fallingDown/S005C002P015R001A043_rgb', 'fallingDown/S008C003P008R001A043_rgb', 'fallingDown/S015C003P037R002A043_rgb', 'fallingDown/S008C002P001R002A043_rgb', 'fallingDown/S008C003P033R001A043_rgb', 'fallingDown/S002C001P007R002A043_rgb', 'fallingDown/S009C001P019R001A043_rgb', 'fallingDown/S007C003P028R001A043_rgb', 'fallingDown/S002C002P013R002A043_rgb', 'fallingDown/S001C002P001R002A043_rgb', 'fallingDown/S008C002P015R001A043_rgb', 'fallingDown/S012C002P015R002A043_rgb', 'fallingDown/S005C003P015R001A043_rgb', 'fallingDown/S008C001P034R001A043_rgb', 'fallingDown/S010C001P017R001A043_rgb', 'fallingDown/S008C003P001R001A043_rgb', 'fallingDown/S003C003P018R002A043_rgb', 'fallingDown/S017C003P003R001A043_rgb', 'fallingDown/S014C001P037R002A043_rgb', 'fallingDown/S009C001P025R001A043_rgb', 'fallingDown/S005C003P010R002A043_rgb', 'fallingDown/S008C002P033R001A043_rgb', 'fallingDown/S011C001P038R001A043_rgb', 'fallingDown/S006C001P001R002A043_rgb', 'fallingDown/S004C002P020R002A043_rgb', 'fallingDown/S006C002P016R001A043_rgb', 'fallingDown/S007C001P008R002A043_rgb', 'fallingDown/S002C002P007R001A043_rgb', 'fallingDown/S011C002P028R002A043_rgb', 'fallingDown/S008C001P035R002A043_rgb', 'fallingDown/S002C003P013R002A043_rgb', 'fallingDown/S016C003P025R002A043_rgb', 'fallingDown/S013C002P025R002A043_rgb', 'fallingDown/S007C003P008R002A043_rgb', 'fallingDown/S003C002P001R001A043_rgb', 'fallingDown/S005C002P017R002A043_rgb', 'fallingDown/S011C003P028R001A043_rgb', 'fallingDown/S006C002P019R001A043_rgb', 'fallingDown/S002C001P003R002A043_rgb', 'fallingDown/S010C001P025R001A043_rgb', 'fallingDown/S011C003P002R002A043_rgb', 'fallingDown/S001C002P002R001A043_rgb', 'fallingDown/S011C003P018R001A043_rgb', 'fallingDown/S007C001P008R001A043_rgb', 'fallingDown/S014C003P019R001A043_rgb', 'fallingDown/S010C001P019R002A043_rgb', 'fallingDown/S005C001P010R002A043_rgb', 'fallingDown/S001C002P004R001A043_rgb', 'fallingDown/S013C002P019R002A043_rgb', 'fallingDown/S008C001P029R002A043_rgb', 
'fallingDown/S002C003P009R001A043_rgb', 'fallingDown/S014C002P007R002A043_rgb', 'fallingDown/S006C002P008R002A043_rgb', 'fallingDown/S001C001P007R001A043_rgb', 'fallingDown/S014C001P039R002A043_rgb', 'fallingDown/S002C001P011R002A043_rgb', 'fallingDown/S010C002P015R002A043_rgb', 'fallingDown/S007C003P008R001A043_rgb', 'fallingDown/S009C003P025R001A043_rgb', 'fallingDown/S014C003P027R002A043_rgb', 'fallingDown/S007C001P027R001A043_rgb', 'fallingDown/S012C001P018R002A043_rgb', 'fallingDown/S002C002P007R002A043_rgb', 'fallingDown/S003C002P017R002A043_rgb', 'fallingDown/S010C003P019R002A043_rgb', 'fallingDown/S017C001P015R001A043_rgb', 'fallingDown/S012C003P017R001A043_rgb', 'fallingDown/S008C001P035R001A043_rgb', 'fallingDown/S017C002P020R001A043_rgb', 'fallingDown/S003C003P001R001A043_rgb', 'fallingDown/S005C002P010R001A043_rgb', 'fallingDown/S012C003P008R002A043_rgb', 'fallingDown/S011C003P001R001A043_rgb', 'fallingDown/S007C003P025R001A043_rgb', 'fallingDown/S012C003P037R001A043_rgb', 'fallingDown/S017C001P017R001A043_rgb', 'fallingDown/S002C001P013R001A043_rgb', 'fallingDown/S012C001P008R001A043_rgb', 'fallingDown/S015C001P037R001A043_rgb', 'fallingDown/S002C003P007R001A043_rgb', 'fallingDown/S008C001P031R001A043_rgb', 'fallingDown/S010C002P017R001A043_rgb', 'fallingDown/S013C003P008R001A043_rgb', 'fallingDown/S013C002P016R001A043_rgb', 'fallingDown/S015C001P008R001A043_rgb', 'fallingDown/S016C002P039R001A043_rgb', 'fallingDown/S005C003P017R001A043_rgb', 'fallingDown/S006C002P007R002A043_rgb', 'fallingDown/S009C003P007R002A043_rgb', 'fallingDown/S017C001P008R001A043_rgb', 'fallingDown/S008C002P030R001A043_rgb', 'fallingDown/S003C001P017R002A043_rgb', 'fallingDown/S012C003P008R001A043_rgb', 'fallingDown/S008C002P034R002A043_rgb', 'fallingDown/S002C001P014R001A043_rgb', 'fallingDown/S003C001P001R002A043_rgb', 'fallingDown/S011C003P008R002A043_rgb', 'fallingDown/S013C002P017R001A043_rgb', 'fallingDown/S015C003P015R001A043_rgb', 'fallingDown/S007C001P019R001A043_rgb', 'fallingDown/S013C001P008R001A043_rgb', 'fallingDown/S003C002P002R001A043_rgb', 'fallingDown/S003C001P001R001A043_rgb', 'fallingDown/S008C001P036R002A043_rgb', 'fallingDown/S001C003P002R001A043_rgb', 'fallingDown/S016C002P019R001A043_rgb', 'fallingDown/S015C002P037R001A043_rgb', 'fallingDown/S004C001P007R001A043_rgb', 'fallingDown/S012C003P027R001A043_rgb', 'fallingDown/S011C002P017R001A043_rgb', 'fallingDown/S014C001P039R001A043_rgb', 'fallingDown/S016C003P021R002A043_rgb', 'fallingDown/S003C003P016R001A043_rgb', 'fallingDown/S014C003P007R001A043_rgb', 'fallingDown/S002C002P009R001A043_rgb', 'fallingDown/S013C003P007R001A043_rgb', 'fallingDown/S013C003P016R001A043_rgb', 'fallingDown/S003C001P016R001A043_rgb', 'fallingDown/S010C002P018R002A043_rgb', 'fallingDown/S011C002P008R002A043_rgb', 'fallingDown/S007C002P028R002A043_rgb', 'fallingDown/S006C003P008R002A043_rgb', 'fallingDown/S008C002P029R001A043_rgb', 'fallingDown/S014C002P017R001A043_rgb', 'fallingDown/S008C001P007R001A043_rgb', 'fallingDown/S017C003P008R001A043_rgb', 'fallingDown/S002C002P003R002A043_rgb', 'fallingDown/S007C003P001R001A043_rgb', 'fallingDown/S011C001P007R001A043_rgb', 'fallingDown/S011C002P002R002A043_rgb', 'fallingDown/S006C001P024R001A043_rgb', 'fallingDown/S003C002P002R002A043_rgb', 'fallingDown/S007C002P016R002A043_rgb', 'fallingDown/S005C001P004R001A043_rgb', 'fallingDown/S010C003P021R001A043_rgb', 'fallingDown/S014C001P019R002A043_rgb', 'fallingDown/S006C001P019R002A043_rgb', 'fallingDown/S009C002P017R001A043_rgb', 
'fallingDown/S017C002P017R002A043_rgb', 'fallingDown/S008C003P035R002A043_rgb', 'fallingDown/S007C002P019R001A043_rgb', 'fallingDown/S005C001P015R002A043_rgb', 'fallingDown/S003C001P007R001A043_rgb', 'fallingDown/S007C002P015R002A043_rgb', 'fallingDown/S005C001P016R002A043_rgb', 'fallingDown/S005C001P013R001A043_rgb', 'fallingDown/S013C001P007R001A043_rgb', 'fallingDown/S017C002P015R002A043_rgb', 'fallingDown/S011C001P016R002A043_rgb', 'fallingDown/S001C001P005R001A043_rgb', 'fallingDown/S006C001P017R002A043_rgb', 'fallingDown/S008C001P031R002A043_rgb', 'fallingDown/S013C001P037R001A043_rgb', 'fallingDown/S005C002P013R002A043_rgb', 'fallingDown/S006C003P015R001A043_rgb', 'fallingDown/S016C001P007R002A043_rgb', 'fallingDown/S009C002P016R002A043_rgb', 'fallingDown/S001C002P002R002A043_rgb', 'fallingDown/S017C003P007R001A043_rgb', 'fallingDown/S015C003P037R001A043_rgb', 'fallingDown/S014C002P017R002A043_rgb', 'fallingDown/S012C001P028R002A043_rgb', 'fallingDown/S011C001P002R002A043_rgb', 'fallingDown/S008C002P019R001A043_rgb', 'fallingDown/S012C003P015R001A043_rgb', 'fallingDown/S016C001P019R001A043_rgb', 'fallingDown/S017C002P008R001A043_rgb', 'fallingDown/S004C002P007R001A043_rgb', 'fallingDown/S009C003P016R001A043_rgb', 'fallingDown/S011C003P028R002A043_rgb', 'fallingDown/S008C001P033R001A043_rgb', 'fallingDown/S004C001P003R001A043_rgb', 'fallingDown/S001C003P001R002A043_rgb', 'fallingDown/S008C003P031R002A043_rgb', 'fallingDown/S003C001P017R001A043_rgb', 'fallingDown/S007C002P018R002A043_rgb', 'fallingDown/S012C002P025R002A043_rgb', 'fallingDown/S008C002P007R001A043_rgb', 'fallingDown/S017C003P015R002A043_rgb', 'fallingDown/S015C002P025R001A043_rgb', 'fallingDown/S016C001P008R002A043_rgb', 'fallingDown/S011C002P017R002A043_rgb', 'fallingDown/S008C003P035R001A043_rgb', 'fallingDown/S014C003P037R001A043_rgb', 'fallingDown/S013C001P016R001A043_rgb', 'fallingDown/S001C003P004R001A043_rgb', 'fallingDown/S011C003P007R002A043_rgb', 'fallingDown/S002C003P008R002A043_rgb', 'fallingDown/S014C001P027R001A043_rgb', 'fallingDown/S010C002P007R002A043_rgb', 'fallingDown/S003C003P018R001A043_rgb', 'fallingDown/S011C003P002R001A043_rgb', 'fallingDown/S014C003P008R001A043_rgb', 'fallingDown/S007C003P027R002A043_rgb', 'fallingDown/S014C002P027R002A043_rgb', 'fallingDown/S014C003P037R002A043_rgb', 'fallingDown/S006C002P007R001A043_rgb', 'fallingDown/S010C001P007R002A043_rgb', 'fallingDown/S017C002P015R001A043_rgb', 'fallingDown/S010C003P008R002A043_rgb', 'fallingDown/S004C002P008R002A043_rgb', 'fallingDown/S016C001P039R002A043_rgb', 'fallingDown/S008C002P007R002A043_rgb', 'fallingDown/S016C002P040R002A043_rgb', 'fallingDown/S003C003P008R002A043_rgb', 'fallingDown/S001C001P008R001A043_rgb', 'fallingDown/S012C001P018R001A043_rgb', 'fallingDown/S012C001P016R001A043_rgb', 'fallingDown/S010C003P008R001A043_rgb', 'fallingDown/S013C003P007R002A043_rgb', 'fallingDown/S016C002P039R002A043_rgb', 'fallingDown/S012C002P019R002A043_rgb', 'fallingDown/S005C002P015R002A043_rgb', 'fallingDown/S007C001P016R001A043_rgb', 'fallingDown/S001C002P008R002A043_rgb', 'fallingDown/S013C001P007R002A043_rgb', 'fallingDown/S013C001P016R002A043_rgb', 'fallingDown/S017C002P003R002A043_rgb', 'fallingDown/S003C003P019R002A043_rgb', 'fallingDown/S016C003P040R001A043_rgb', 'fallingDown/S015C002P016R001A043_rgb', 'fallingDown/S011C001P015R002A043_rgb', 'fallingDown/S005C001P021R002A043_rgb', 'fallingDown/S014C003P019R002A043_rgb', 'fallingDown/S012C002P016R001A043_rgb', 'fallingDown/S003C002P017R001A043_rgb', 
'fallingDown/S013C001P028R001A043_rgb', 'fallingDown/S001C003P003R001A043_rgb', 'fallingDown/S012C002P027R001A043_rgb', 'fallingDown/S011C002P018R001A043_rgb', 'fallingDown/S005C001P015R001A043_rgb', 'fallingDown/S013C002P027R001A043_rgb', 'fallingDown/S002C002P014R002A043_rgb', 'fallingDown/S004C001P003R002A043_rgb', 'fallingDown/S003C001P008R001A043_rgb', 'fallingDown/S002C001P008R001A043_rgb', 'fallingDown/S012C001P016R002A043_rgb', 'fallingDown/S016C003P007R002A043_rgb', 'fallingDown/S016C003P008R001A043_rgb', 'fallingDown/S014C001P007R001A043_rgb', 'fallingDown/S001C002P001R001A043_rgb', 'fallingDown/S002C003P009R002A043_rgb', 'fallingDown/S011C002P019R001A043_rgb', 'fallingDown/S008C003P015R001A043_rgb', 'fallingDown/S007C001P016R002A043_rgb', 'fallingDown/S016C002P025R002A043_rgb', 'fallingDown/S015C001P037R002A043_rgb', 'fallingDown/S010C003P017R002A043_rgb', 'fallingDown/S012C003P016R001A043_rgb', 'fallingDown/S001C003P005R001A043_rgb', 'fallingDown/S003C003P002R002A043_rgb', 'fallingDown/S012C003P007R001A043_rgb', 'fallingDown/S014C001P008R002A043_rgb', 'fallingDown/S009C002P016R001A043_rgb', 'fallingDown/S006C001P007R001A043_rgb', 'fallingDown/S017C003P016R002A043_rgb', 'fallingDown/S002C003P013R001A043_rgb', 'fallingDown/S008C003P033R002A043_rgb', 'fallingDown/S011C002P038R002A043_rgb', 'fallingDown/S011C002P016R001A043_rgb', 'fallingDown/S014C003P007R002A043_rgb', 'fallingDown/S014C003P015R001A043_rgb', 'fallingDown/S007C003P018R002A043_rgb', 'fallingDown/S014C002P015R001A043_rgb', 'fallingDown/S012C001P027R002A043_rgb', 'fallingDown/S001C001P008R002A043_rgb', 'fallingDown/S005C001P017R002A043_rgb', 'fallingDown/S002C001P009R001A043_rgb', 'fallingDown/S016C002P021R001A043_rgb', 'fallingDown/S008C002P034R001A043_rgb', 'fallingDown/S008C003P030R002A043_rgb', 'fallingDown/S007C003P019R001A043_rgb', 'fallingDown/S007C002P007R001A043_rgb', 'fallingDown/S010C002P015R001A043_rgb', 'fallingDown/S007C002P008R001A043_rgb', 'fallingDown/S012C003P016R002A043_rgb', 'fallingDown/S013C003P018R001A043_rgb', 'fallingDown/S009C001P019R002A043_rgb', 'fallingDown/S017C001P007R002A043_rgb', 'fallingDown/S008C003P025R001A043_rgb', 'fallingDown/S008C001P008R001A043_rgb', 'fallingDown/S003C003P008R001A043_rgb', 'fallingDown/S013C003P015R001A043_rgb', 'fallingDown/S001C003P003R002A043_rgb', 'fallingDown/S017C001P008R002A043_rgb', 'fallingDown/S002C002P003R001A043_rgb', 'fallingDown/S007C003P016R002A043_rgb', 'fallingDown/S015C001P017R001A043_rgb', 'fallingDown/S002C002P008R002A043_rgb', 'fallingDown/S008C002P030R002A043_rgb', 'fallingDown/S016C002P025R001A043_rgb', 'fallingDown/S010C001P013R001A043_rgb', 'fallingDown/S001C001P003R002A043_rgb', 'fallingDown/S014C002P007R001A043_rgb', 'fallingDown/S001C002P007R001A043_rgb', 'fallingDown/S011C001P001R001A043_rgb', 'fallingDown/S007C003P027R001A043_rgb', 'fallingDown/S002C001P010R001A043_rgb', 'fallingDown/S015C003P008R002A043_rgb', 'fallingDown/S010C003P025R001A043_rgb', 'fallingDown/S015C002P025R002A043_rgb', 'fallingDown/S009C003P008R002A043_rgb', 'fallingDown/S016C003P019R002A043_rgb', 'fallingDown/S016C001P040R001A043_rgb', 'fallingDown/S016C001P039R001A043_rgb', 'fallingDown/S001C003P005R002A043_rgb', 'fallingDown/S007C001P028R002A043_rgb', 'fallingDown/S004C003P003R002A043_rgb', 'fallingDown/S001C002P008R001A043_rgb', 'fallingDown/S009C001P025R002A043_rgb', 'fallingDown/S007C002P001R002A043_rgb', 'fallingDown/S014C003P017R001A043_rgb', 'fallingDown/S007C002P028R001A043_rgb', 'fallingDown/S016C001P025R002A043_rgb', 
'fallingDown/S003C003P001R002A043_rgb', 'fallingDown/S005C001P021R001A043_rgb', 'fallingDown/S012C003P018R001A043_rgb', 'fallingDown/S012C001P025R002A043_rgb', 'fallingDown/S006C003P016R002A043_rgb', 'fallingDown/S011C002P015R001A043_rgb', 'fallingDown/S017C003P008R002A043_rgb', 'fallingDown/S005C003P004R002A043_rgb', 'fallingDown/S016C001P040R002A043_rgb', 'fallingDown/S002C002P009R002A043_rgb', 'fallingDown/S011C001P025R002A043_rgb', 'fallingDown/S010C002P019R001A043_rgb', 'fallingDown/S015C003P017R001A043_rgb', 'fallingDown/S007C001P001R002A043_rgb', 'fallingDown/S012C003P019R001A043_rgb', 'fallingDown/S011C002P018R002A043_rgb', 'fallingDown/S012C003P025R001A043_rgb', 'fallingDown/S001C001P003R001A043_rgb', 'fallingDown/S007C001P027R002A043_rgb', 'fallingDown/S009C001P015R002A043_rgb', 'fallingDown/S007C003P017R002A043_rgb', 'fallingDown/S006C003P007R001A043_rgb', 'fallingDown/S011C003P008R001A043_rgb', 'fallingDown/S001C003P007R002A043_rgb', 'fallingDown/S003C001P008R002A043_rgb', 'fallingDown/S004C001P008R001A043_rgb', 'fallingDown/S012C002P017R001A043_rgb', 'fallingDown/S001C001P006R002A043_rgb', 'fallingDown/S003C003P002R001A043_rgb', 'fallingDown/S009C003P007R001A043_rgb', 'fallingDown/S007C002P026R001A043_rgb', 'fallingDown/S010C003P013R001A043_rgb', 'fallingDown/S005C002P016R001A043_rgb', 'fallingDown/S011C001P015R001A043_rgb', 'fallingDown/S017C001P007R001A043_rgb', 'fallingDown/S013C001P008R002A043_rgb', 'fallingDown/S016C003P039R002A043_rgb', 'fallingDown/S006C003P022R002A043_rgb', 'fallingDown/S001C001P002R002A043_rgb', 'fallingDown/S008C003P001R002A043_rgb', 'fallingDown/S017C001P020R001A043_rgb', 'fallingDown/S006C001P016R001A043_rgb', 'fallingDown/S013C002P007R001A043_rgb', 'fallingDown/S004C002P007R002A043_rgb', 'fallingDown/S011C001P038R002A043_rgb', 'fallingDown/S012C001P017R001A043_rgb', 'fallingDown/S010C002P016R002A043_rgb', 'fallingDown/S017C002P007R001A043_rgb', 'fallingDown/S014C003P039R001A043_rgb', 'fallingDown/S009C001P017R002A043_rgb', 'fallingDown/S012C001P015R002A043_rgb', 'fallingDown/S011C001P001R002A043_rgb', 'fallingDown/S004C003P007R001A043_rgb', 'fallingDown/S015C001P025R002A043_rgb', 'fallingDown/S014C002P015R002A043_rgb', 'fallingDown/S008C003P036R002A043_rgb', 'fallingDown/S006C002P023R001A043_rgb', 'fallingDown/S008C003P032R001A043_rgb', 'fallingDown/S010C003P016R002A043_rgb', 'fallingDown/S001C001P007R002A043_rgb', 'fallingDown/S009C001P008R002A043_rgb', 'fallingDown/S013C002P015R001A043_rgb', 'fallingDown/S005C003P018R002A043_rgb', 'fallingDown/S015C002P015R002A043_rgb', 'fallingDown/S002C001P009R002A043_rgb', 'fallingDown/S014C001P017R002A043_rgb', 'fallingDown/S006C002P022R002A043_rgb', 'fallingDown/S012C001P007R001A043_rgb', 'fallingDown/S014C002P039R001A043_rgb', 'fallingDown/S003C001P019R001A043_rgb', 'fallingDown/S006C001P023R002A043_rgb', 'fallingDown/S014C001P015R002A043_rgb', 'fallingDown/S008C003P025R002A043_rgb', 'fallingDown/S013C001P018R001A043_rgb', 'fallingDown/S013C001P027R001A043_rgb', 'fallingDown/S007C002P018R001A043_rgb', 'fallingDown/S017C002P016R001A043_rgb', 'fallingDown/S006C001P019R001A043_rgb', 'fallingDown/S011C002P027R001A043_rgb', 'fallingDown/S011C001P008R001A043_rgb', 'fallingDown/S007C001P007R002A043_rgb', 'fallingDown/S008C002P008R001A043_rgb', 'fallingDown/S011C002P007R002A043_rgb', 'fallingDown/S013C001P037R002A043_rgb', 'fallingDown/S007C001P026R001A043_rgb', 'fallingDown/S017C002P016R002A043_rgb', 'fallingDown/S006C001P023R001A043_rgb', 'fallingDown/S008C002P001R001A043_rgb', 
'fallingDown/S011C002P019R002A043_rgb', 'fallingDown/S003C002P001R002A043_rgb', 'fallingDown/S003C002P008R001A043_rgb', 'fallingDown/S007C001P015R001A043_rgb', 'fallingDown/S003C002P007R002A043_rgb', 'fallingDown/S005C003P004R001A043_rgb', 'fallingDown/S015C002P015R001A043_rgb', 'fallingDown/S017C001P016R002A043_rgb', 'fallingDown/S009C003P025R002A043_rgb', 'fallingDown/S003C003P015R002A043_rgb', 'fallingDown/S017C003P020R002A043_rgb', 'fallingDown/S011C001P028R001A043_rgb', 'fallingDown/S007C001P028R001A043_rgb', 'fallingDown/S006C002P015R002A043_rgb', 'fallingDown/S011C002P008R001A043_rgb', 'fallingDown/S011C001P017R001A043_rgb', 'fallingDown/S013C003P017R001A043_rgb', 'fallingDown/S010C003P007R002A043_rgb', 'fallingDown/S008C002P029R002A043_rgb', 'fallingDown/S008C002P036R001A043_rgb', 'fallingDown/S015C002P016R002A043_rgb', 'fallingDown/S008C001P025R002A043_rgb', 'fallingDown/S010C001P016R002A043_rgb', 'fallingDown/S005C002P021R002A043_rgb', 'fallingDown/S011C002P025R001A043_rgb', 'fallingDown/S008C002P025R002A043_rgb', 'fallingDown/S006C003P001R002A043_rgb', 'fallingDown/S003C002P019R001A043_rgb', 'fallingDown/S014C002P037R002A043_rgb', 'fallingDown/S005C003P010R001A043_rgb', 'fallingDown/S003C001P019R002A043_rgb', 'fallingDown/S004C001P008R002A043_rgb', 'fallingDown/S011C003P025R001A043_rgb', 'fallingDown/S006C002P024R002A043_rgb', 'fallingDown/S014C001P025R002A043_rgb', 'fallingDown/S006C003P001R001A043_rgb', 'fallingDown/S015C002P008R002A043_rgb', 'fallingDown/S001C002P003R001A043_rgb', 'fallingDown/S012C003P019R002A043_rgb', 'fallingDown/S003C002P018R001A043_rgb', 'fallingDown/S006C003P007R002A043_rgb', 'fallingDown/S011C003P019R001A043_rgb', 'fallingDown/S012C002P008R002A043_rgb', 'fallingDown/S008C001P030R001A043_rgb', 'fallingDown/S002C001P012R002A043_rgb', 'fallingDown/S007C001P018R001A043_rgb', 'fallingDown/S006C001P022R001A043_rgb', 'fallingDown/S015C001P025R001A043_rgb', 'fallingDown/S005C003P015R002A043_rgb', 'fallingDown/S005C001P013R002A043_rgb', 'headache/S013C003P018R002A044_rgb', 'headache/S010C003P013R001A044_rgb', 'headache/S008C003P019R001A044_rgb', 'headache/S013C003P037R002A044_rgb', 'headache/S007C001P018R001A044_rgb', 'headache/S006C001P022R002A044_rgb', 'headache/S013C002P028R002A044_rgb', 'headache/S008C003P031R001A044_rgb', 'headache/S008C001P033R001A044_rgb', 'headache/S010C003P008R002A044_rgb', 'headache/S008C002P034R001A044_rgb', 'headache/S007C001P018R002A044_rgb', 'headache/S011C002P017R002A044_rgb', 'headache/S006C002P015R002A044_rgb', 'headache/S015C003P037R002A044_rgb', 'headache/S012C002P027R002A044_rgb', 'headache/S013C003P027R002A044_rgb', 'headache/S008C001P001R001A044_rgb', 'headache/S005C003P016R002A044_rgb', 'headache/S017C002P016R002A044_rgb', 'headache/S012C001P017R002A044_rgb', 'headache/S008C001P029R001A044_rgb', 'headache/S011C001P017R002A044_rgb', 'headache/S007C001P026R001A044_rgb', 'headache/S015C002P025R001A044_rgb', 'headache/S007C002P018R002A044_rgb', 'headache/S002C002P003R002A044_rgb', 'headache/S007C003P018R002A044_rgb', 'headache/S007C003P025R001A044_rgb', 'headache/S007C003P015R002A044_rgb', 'headache/S012C001P008R002A044_rgb', 'headache/S003C002P002R002A044_rgb', 'headache/S001C003P006R002A044_rgb', 'headache/S012C003P027R001A044_rgb', 'headache/S008C001P030R002A044_rgb', 'headache/S016C003P021R002A044_rgb', 'headache/S008C003P033R001A044_rgb', 'headache/S008C001P019R002A044_rgb', 'headache/S006C003P015R001A044_rgb', 'headache/S011C003P025R001A044_rgb', 'headache/S017C001P003R001A044_rgb', 'headache/S017C001P003R002A044_rgb', 
'headache/S001C002P006R001A044_rgb', 'headache/S001C003P007R002A044_rgb', 'headache/S006C003P016R002A044_rgb', 'headache/S013C001P015R001A044_rgb', 'headache/S007C001P017R002A044_rgb', 'headache/S003C003P015R001A044_rgb', 'headache/S001C001P007R002A044_rgb', 'headache/S007C002P007R001A044_rgb', 'headache/S014C003P019R002A044_rgb', 'headache/S008C003P001R002A044_rgb', 'headache/S006C003P023R002A044_rgb', 'headache/S011C002P001R001A044_rgb', 'headache/S003C003P001R001A044_rgb', 'headache/S003C001P018R002A044_rgb', 'headache/S005C002P013R002A044_rgb', 'headache/S003C001P007R002A044_rgb', 'headache/S017C002P009R002A044_rgb', 'headache/S014C002P015R001A044_rgb', 'headache/S001C002P005R001A044_rgb', 'headache/S005C002P010R002A044_rgb', 'headache/S009C001P015R001A044_rgb', 'headache/S012C002P007R001A044_rgb', 'headache/S011C002P018R001A044_rgb', 'headache/S016C002P019R002A044_rgb', 'headache/S013C002P008R002A044_rgb', 'headache/S017C001P020R002A044_rgb', 'headache/S016C001P019R001A044_rgb', 'headache/S013C001P019R001A044_rgb', 'headache/S005C001P021R002A044_rgb', 'headache/S008C002P035R002A044_rgb', 'headache/S011C002P017R001A044_rgb', 'headache/S011C002P008R002A044_rgb', 'headache/S013C002P008R001A044_rgb', 'headache/S005C002P017R001A044_rgb', 'headache/S016C003P019R002A044_rgb', 'headache/S005C003P004R002A044_rgb', 'headache/S007C003P026R002A044_rgb', 'headache/S005C001P021R001A044_rgb', 'headache/S006C001P007R002A044_rgb', 'headache/S013C002P028R001A044_rgb', 'headache/S002C003P009R002A044_rgb', 'headache/S011C001P002R002A044_rgb', 'headache/S002C002P011R002A044_rgb', 'headache/S008C001P033R002A044_rgb', 'headache/S002C001P012R002A044_rgb', 'headache/S008C002P030R001A044_rgb', 'headache/S003C001P002R002A044_rgb', 'headache/S007C002P025R001A044_rgb', 'headache/S013C001P025R002A044_rgb', 'headache/S017C003P007R002A044_rgb', 'headache/S017C003P020R002A044_rgb', 'headache/S004C002P003R002A044_rgb', 'headache/S006C001P015R002A044_rgb', 'headache/S008C001P007R002A044_rgb', 'headache/S003C001P017R002A044_rgb', 'headache/S006C001P008R001A044_rgb', 'headache/S013C002P007R001A044_rgb', 'headache/S009C003P016R001A044_rgb', 'headache/S013C003P016R002A044_rgb', 'headache/S012C001P027R001A044_rgb', 'headache/S003C001P018R001A044_rgb', 'headache/S001C001P004R002A044_rgb', 'headache/S003C002P018R001A044_rgb', 'headache/S006C001P016R001A044_rgb', 'headache/S007C003P017R002A044_rgb', 'headache/S006C003P007R001A044_rgb', 'headache/S001C002P001R001A044_rgb', 'headache/S012C001P019R002A044_rgb', 'headache/S008C001P032R001A044_rgb', 'headache/S010C002P016R002A044_rgb', 'headache/S012C003P028R001A044_rgb', 'headache/S013C003P007R002A044_rgb', 'headache/S001C002P003R001A044_rgb', 'headache/S009C002P025R001A044_rgb', 'headache/S008C003P035R002A044_rgb', 'headache/S003C003P018R002A044_rgb', 'headache/S014C003P037R002A044_rgb', 'headache/S003C003P017R002A044_rgb', 'headache/S017C003P008R002A044_rgb', 'headache/S011C003P017R001A044_rgb', 'headache/S011C003P016R002A044_rgb', 'headache/S007C003P019R002A044_rgb', 'headache/S012C001P008R001A044_rgb', 'headache/S012C001P018R002A044_rgb', 'headache/S012C002P028R002A044_rgb', 'headache/S008C003P019R002A044_rgb', 'headache/S017C002P020R002A044_rgb', 'headache/S007C003P007R002A044_rgb', 'headache/S011C001P001R002A044_rgb', 'headache/S004C001P007R001A044_rgb', 'headache/S011C001P019R001A044_rgb', 'headache/S014C002P039R001A044_rgb', 'headache/S013C001P027R002A044_rgb', 'headache/S014C001P007R001A044_rgb', 'headache/S001C001P005R001A044_rgb', 'headache/S004C002P020R001A044_rgb', 
'headache/S003C002P008R001A044_rgb', 'headache/S008C002P015R001A044_rgb', 'headache/S017C002P015R002A044_rgb', 'headache/S003C002P019R001A044_rgb', 'headache/S015C002P017R002A044_rgb', 'headache/S007C003P007R001A044_rgb', 'headache/S005C003P015R001A044_rgb', 'headache/S004C003P007R002A044_rgb', 'headache/S005C003P004R001A044_rgb', 'headache/S006C001P023R001A044_rgb', 'headache/S013C001P017R001A044_rgb', 'headache/S006C003P017R002A044_rgb', 'headache/S008C003P015R001A044_rgb', 'headache/S003C003P008R002A044_rgb', 'headache/S015C001P015R002A044_rgb', 'headache/S005C003P018R001A044_rgb', 'headache/S016C001P025R002A044_rgb', 'headache/S011C002P027R001A044_rgb', 'headache/S013C003P015R002A044_rgb', 'headache/S005C003P015R002A044_rgb', 'headache/S011C002P028R002A044_rgb', 'headache/S002C001P014R001A044_rgb', 'headache/S009C002P015R002A044_rgb', 'headache/S011C002P038R001A044_rgb', 'headache/S003C002P017R001A044_rgb', 'headache/S005C002P016R001A044_rgb', 'headache/S006C002P017R001A044_rgb', 'headache/S012C002P016R002A044_rgb', 'headache/S013C002P018R002A044_rgb', 'headache/S011C002P008R001A044_rgb', 'headache/S017C002P003R002A044_rgb', 'headache/S007C001P027R002A044_rgb', 'headache/S006C003P023R001A044_rgb', 'headache/S011C003P002R001A044_rgb', 'headache/S014C002P008R002A044_rgb', 'headache/S007C002P027R002A044_rgb', 'headache/S013C002P037R001A044_rgb', 'headache/S017C001P007R001A044_rgb', 'headache/S015C002P037R001A044_rgb', 'headache/S012C003P018R002A044_rgb', 'headache/S001C003P003R002A044_rgb', 'headache/S014C002P019R002A044_rgb', 'headache/S010C002P025R002A044_rgb', 'headache/S013C003P008R001A044_rgb', 'headache/S008C001P035R001A044_rgb', 'headache/S016C002P040R001A044_rgb', 'headache/S012C002P037R002A044_rgb', 'headache/S005C003P017R002A044_rgb', 'headache/S003C002P001R001A044_rgb', 'headache/S014C003P017R001A044_rgb', 'headache/S015C002P017R001A044_rgb', 'headache/S012C003P025R001A044_rgb', 'headache/S016C002P039R002A044_rgb', 'headache/S011C001P007R001A044_rgb', 'headache/S010C002P019R001A044_rgb', 'headache/S008C003P029R002A044_rgb', 'headache/S014C003P025R002A044_rgb', 'headache/S007C002P026R001A044_rgb', 'headache/S005C003P021R002A044_rgb', 'headache/S008C002P030R002A044_rgb', 'headache/S008C002P035R001A044_rgb', 'headache/S016C001P021R002A044_rgb', 'headache/S005C001P015R001A044_rgb', 'headache/S006C002P019R001A044_rgb', 'headache/S004C001P020R002A044_rgb', 'headache/S006C003P008R002A044_rgb', 'headache/S011C001P018R001A044_rgb', 'headache/S008C003P036R001A044_rgb', 'headache/S006C002P007R002A044_rgb', 'headache/S012C003P008R001A044_rgb', 'headache/S002C001P008R001A044_rgb', 'headache/S015C001P007R002A044_rgb', 'headache/S002C003P010R002A044_rgb', 'headache/S010C001P018R001A044_rgb', 'headache/S001C001P001R001A044_rgb', 'headache/S008C002P019R001A044_rgb', 'headache/S010C001P013R001A044_rgb', 'headache/S013C001P017R002A044_rgb', 'headache/S013C003P027R001A044_rgb', 'headache/S010C001P025R002A044_rgb', 'headache/S009C002P016R001A044_rgb', 'headache/S002C003P007R001A044_rgb', 'headache/S011C002P015R002A044_rgb', 'headache/S016C002P019R001A044_rgb', 'headache/S014C003P027R001A044_rgb', 'headache/S012C003P019R002A044_rgb', 'headache/S008C001P008R001A044_rgb', 'headache/S001C002P007R002A044_rgb', 'headache/S005C001P013R002A044_rgb', 'headache/S017C002P007R001A044_rgb', 'headache/S008C001P031R001A044_rgb', 'headache/S016C001P007R001A044_rgb', 'headache/S011C003P018R001A044_rgb', 'headache/S009C002P008R002A044_rgb', 'headache/S016C003P039R001A044_rgb', 'headache/S002C001P011R002A044_rgb', 
'headache/S002C001P007R001A044_rgb', 'headache/S015C003P025R001A044_rgb', 'headache/S013C002P019R002A044_rgb', 'headache/S009C002P017R001A044_rgb', 'headache/S002C001P003R002A044_rgb', 'headache/S003C002P016R001A044_rgb', 'headache/S004C001P007R002A044_rgb', 'headache/S001C002P006R002A044_rgb', 'headache/S005C002P015R002A044_rgb', 'headache/S004C003P020R002A044_rgb', 'headache/S009C003P019R001A044_rgb', 'headache/S013C003P028R001A044_rgb', 'headache/S003C001P015R002A044_rgb', 'headache/S006C001P019R001A044_rgb', 'headache/S015C001P007R001A044_rgb', 'headache/S011C001P017R001A044_rgb', 'headache/S003C002P016R002A044_rgb', 'headache/S007C002P018R001A044_rgb', 'headache/S007C001P007R002A044_rgb', 'headache/S006C002P008R002A044_rgb', 'headache/S003C002P019R002A044_rgb', 'headache/S015C002P016R002A044_rgb', 'headache/S012C003P015R001A044_rgb', 'headache/S016C001P007R002A044_rgb', 'headache/S006C002P017R002A044_rgb', 'headache/S012C001P027R002A044_rgb', 'headache/S003C001P015R001A044_rgb', 'headache/S009C003P008R001A044_rgb', 'headache/S011C003P015R001A044_rgb', 'headache/S006C001P017R001A044_rgb', 'headache/S012C001P019R001A044_rgb', 'headache/S017C001P016R002A044_rgb', 'headache/S008C003P029R001A044_rgb', 'headache/S001C002P001R002A044_rgb', 'headache/S001C001P002R001A044_rgb', 'headache/S011C003P015R002A044_rgb', 'headache/S008C002P029R001A044_rgb', 'headache/S006C001P015R001A044_rgb', 'headache/S006C001P023R002A044_rgb', 'headache/S004C003P003R002A044_rgb', 'headache/S016C002P021R002A044_rgb', 'headache/S012C003P037R001A044_rgb', 'headache/S013C003P017R002A044_rgb', 'headache/S016C003P025R002A044_rgb', 'headache/S012C001P025R001A044_rgb', 'headache/S009C003P007R001A044_rgb', 'headache/S006C003P022R001A044_rgb', 'headache/S012C003P018R001A044_rgb', 'headache/S006C003P001R002A044_rgb', 'headache/S006C001P022R001A044_rgb', 'headache/S010C001P019R001A044_rgb', 'headache/S011C002P019R001A044_rgb', 'headache/S005C003P013R002A044_rgb', 'headache/S016C002P040R002A044_rgb', 'headache/S008C002P001R002A044_rgb', 'headache/S011C002P007R002A044_rgb', 'headache/S015C001P016R001A044_rgb', 'headache/S014C002P008R001A044_rgb', 'headache/S010C003P017R002A044_rgb', 'headache/S003C001P001R002A044_rgb', 'headache/S011C002P007R001A044_rgb', 'headache/S017C002P008R001A044_rgb', 'headache/S017C001P020R001A044_rgb', 'headache/S011C002P027R002A044_rgb', 'headache/S016C001P025R001A044_rgb', 'headache/S002C003P011R001A044_rgb', 'headache/S011C003P007R002A044_rgb', 'headache/S014C001P007R002A044_rgb', 'headache/S011C001P007R002A044_rgb', 'headache/S013C002P016R001A044_rgb', 'headache/S003C003P018R001A044_rgb', 'headache/S005C001P016R002A044_rgb', 'headache/S007C002P008R002A044_rgb', 'headache/S009C003P017R001A044_rgb', 'headache/S015C002P008R001A044_rgb', 'headache/S012C003P017R001A044_rgb', 'headache/S012C002P019R001A044_rgb', 'headache/S015C003P015R001A044_rgb', 'headache/S010C003P021R001A044_rgb', 'headache/S007C002P001R002A044_rgb', 'headache/S009C001P007R002A044_rgb', 'headache/S012C001P028R002A044_rgb', 'headache/S008C003P032R002A044_rgb', 'headache/S007C002P027R001A044_rgb', 'headache/S014C001P015R001A044_rgb', 'headache/S007C001P001R002A044_rgb', 'headache/S015C003P025R002A044_rgb', 'headache/S011C002P002R002A044_rgb', 'headache/S015C003P016R001A044_rgb', 'headache/S008C002P008R002A044_rgb', 'headache/S011C003P027R001A044_rgb', 'headache/S003C003P017R001A044_rgb', 'headache/S006C003P017R001A044_rgb', 'headache/S009C002P007R002A044_rgb', 'headache/S016C001P040R002A044_rgb', 'headache/S008C003P036R002A044_rgb', 
'headache/S002C001P012R001A044_rgb', 'headache/S008C001P034R002A044_rgb', 'headache/S001C001P008R001A044_rgb', 'headache/S008C002P015R002A044_rgb', 'headache/S011C001P028R001A044_rgb', 'headache/S009C002P019R001A044_rgb', 'headache/S007C003P015R001A044_rgb', 'headache/S011C003P007R001A044_rgb', 'headache/S013C003P017R001A044_rgb', 'headache/S005C001P013R001A044_rgb', 'headache/S013C002P016R002A044_rgb', 'headache/S013C001P007R002A044_rgb', 'headache/S011C003P027R002A044_rgb', 'headache/S012C001P016R002A044_rgb', 'headache/S007C001P001R001A044_rgb', 'headache/S008C002P007R002A044_rgb', 'headache/S012C001P015R002A044_rgb', 'headache/S007C002P028R002A044_rgb', 'headache/S012C002P007R002A044_rgb', 'headache/S002C003P003R001A044_rgb', 'headache/S011C003P008R001A044_rgb', 'headache/S007C002P019R002A044_rgb', 'headache/S013C002P019R001A044_rgb', 'headache/S007C003P027R001A044_rgb', 'headache/S008C002P034R002A044_rgb', 'headache/S014C001P039R001A044_rgb', 'headache/S003C003P019R002A044_rgb', 'headache/S008C002P025R002A044_rgb', 'headache/S010C001P015R001A044_rgb', 'headache/S005C002P013R001A044_rgb', 'headache/S002C003P014R002A044_rgb', 'headache/S003C002P001R002A044_rgb', 'headache/S007C003P017R001A044_rgb', 'headache/S007C003P025R002A044_rgb', 'headache/S011C001P008R002A044_rgb', 'headache/S002C002P008R001A044_rgb', 'headache/S006C003P019R001A044_rgb', 'headache/S016C003P008R001A044_rgb', 'headache/S017C001P008R002A044_rgb', 'headache/S014C002P007R001A044_rgb', 'headache/S010C003P016R002A044_rgb', 'headache/S009C002P008R001A044_rgb', 'headache/S009C001P015R002A044_rgb', 'headache/S014C002P025R002A044_rgb', 'headache/S011C001P008R001A044_rgb', 'headache/S003C003P007R002A044_rgb', 'headache/S008C001P036R002A044_rgb', 'headache/S008C003P032R001A044_rgb', 'headache/S013C002P025R002A044_rgb', 'headache/S006C001P001R002A044_rgb', 'headache/S014C003P027R002A044_rgb', 'headache/S012C001P025R002A044_rgb', 'headache/S002C001P013R002A044_rgb', 'headache/S009C001P019R002A044_rgb', 'headache/S010C001P018R002A044_rgb', 'headache/S011C003P008R002A044_rgb', 'headache/S016C003P019R001A044_rgb', 'headache/S009C003P017R002A044_rgb', 'headache/S005C003P017R001A044_rgb', 'headache/S013C001P018R001A044_rgb', 'headache/S016C002P025R002A044_rgb', 'headache/S013C003P037R001A044_rgb', 'headache/S014C003P007R001A044_rgb', 'headache/S017C002P003R001A044_rgb', 'headache/S011C001P018R002A044_rgb', 'headache/S006C003P007R002A044_rgb', 'headache/S017C003P017R001A044_rgb', 'headache/S014C003P017R002A044_rgb', 'headache/S008C003P034R002A044_rgb', 'headache/S011C003P028R001A044_rgb', 'headache/S001C003P004R001A044_rgb', 'headache/S006C003P024R001A044_rgb', 'headache/S013C002P027R001A044_rgb', 'headache/S001C003P006R001A044_rgb', 'headache/S007C003P008R001A044_rgb', 'headache/S015C003P019R002A044_rgb', 'headache/S008C002P019R002A044_rgb', 'headache/S004C001P003R001A044_rgb', 'headache/S001C001P006R001A044_rgb', 'headache/S007C003P027R002A044_rgb', 'headache/S017C002P017R001A044_rgb', 'headache/S012C002P017R002A044_rgb', 'headache/S005C002P004R002A044_rgb', 'headache/S011C001P016R001A044_rgb', 'headache/S015C001P015R001A044_rgb', 'headache/S011C001P027R001A044_rgb', 'headache/S005C001P018R002A044_rgb', 'headache/S010C002P007R002A044_rgb', 'headache/S015C002P007R002A044_rgb', 'headache/S007C001P028R001A044_rgb', 'headache/S016C003P021R001A044_rgb', 'headache/S002C002P014R001A044_rgb', 'headache/S010C001P008R002A044_rgb', 'headache/S010C002P013R001A044_rgb', 'headache/S005C001P017R001A044_rgb', 'headache/S003C003P016R001A044_rgb', 
'headache/S015C001P017R001A044_rgb', 'headache/S002C002P003R001A044_rgb', 'headache/S005C001P017R002A044_rgb', 'headache/S006C002P001R002A044_rgb', 'headache/S010C001P013R002A044_rgb', 'headache/S003C002P018R002A044_rgb', 'headache/S017C001P015R001A044_rgb', 'headache/S003C002P015R002A044_rgb', 'headache/S016C001P040R001A044_rgb', 'headache/S014C003P039R002A044_rgb', 'headache/S011C001P028R002A044_rgb', 'headache/S002C003P008R001A044_rgb', 'headache/S008C001P029R002A044_rgb', 'headache/S010C002P013R002A044_rgb', 'headache/S002C001P008R002A044_rgb', 'headache/S011C003P028R002A044_rgb', 'headache/S012C002P017R001A044_rgb', 'headache/S007C001P008R002A044_rgb', 'headache/S007C002P015R002A044_rgb', 'headache/S014C003P037R001A044_rgb', 'headache/S001C001P004R001A044_rgb', 'headache/S011C003P038R001A044_rgb', 'headache/S010C003P025R002A044_rgb', 'headache/S005C002P018R001A044_rgb', 'headache/S006C002P022R001A044_rgb', 'headache/S008C001P025R002A044_rgb', 'headache/S013C001P008R002A044_rgb', 'headache/S008C002P025R001A044_rgb', 'headache/S015C003P017R002A044_rgb', 'headache/S008C002P036R001A044_rgb', 'headache/S014C002P015R002A044_rgb', 'headache/S012C002P025R002A044_rgb', 'headache/S007C002P019R001A044_rgb', 'headache/S017C003P015R002A044_rgb', 'headache/S012C001P016R001A044_rgb', 'headache/S001C003P008R002A044_rgb', 'headache/S010C003P021R002A044_rgb', 'headache/S007C002P016R001A044_rgb', 'headache/S014C001P037R001A044_rgb', 'headache/S004C003P007R001A044_rgb', 'headache/S010C002P015R002A044_rgb', 'headache/S014C002P037R001A044_rgb', 'headache/S005C001P010R001A044_rgb', 'headache/S013C002P017R002A044_rgb', 'headache/S003C002P007R002A044_rgb', 'headache/S014C001P019R001A044_rgb', 'headache/S011C001P001R001A044_rgb', 'headache/S002C002P011R001A044_rgb', 'headache/S007C003P018R001A044_rgb', 'headache/S001C001P007R001A044_rgb', 'headache/S003C003P019R001A044_rgb', 'headache/S006C001P008R002A044_rgb', 'headache/S009C001P016R001A044_rgb', 'headache/S013C003P016R001A044_rgb', 'headache/S012C003P016R001A044_rgb', 'headache/S007C001P019R001A044_rgb', 'headache/S015C002P025R002A044_rgb', 'headache/S004C002P007R001A044_rgb', 'headache/S003C002P015R001A044_rgb', 'headache/S016C002P007R001A044_rgb', 'headache/S015C001P008R001A044_rgb', 'headache/S017C001P009R002A044_rgb', 'headache/S016C002P039R001A044_rgb', 'headache/S008C001P008R002A044_rgb', 'headache/S006C003P016R001A044_rgb', 'headache/S006C001P024R002A044_rgb', 'headache/S005C003P013R001A044_rgb', 'headache/S011C002P002R001A044_rgb', 'headache/S017C002P015R001A044_rgb', 'headache/S016C003P008R002A044_rgb', 'headache/S016C003P007R002A044_rgb', 'headache/S010C003P015R001A044_rgb', 'headache/S001C001P003R002A044_rgb', 'headache/S012C002P008R001A044_rgb', 'headache/S002C003P012R001A044_rgb', 'headache/S002C002P010R001A044_rgb', 'headache/S016C001P021R001A044_rgb', 'headache/S004C001P003R002A044_rgb', 'headache/S009C001P007R001A044_rgb', 'headache/S002C003P011R002A044_rgb', 'headache/S010C003P013R002A044_rgb', 'headache/S011C003P017R002A044_rgb', 'headache/S007C002P028R001A044_rgb', 'headache/S015C001P025R002A044_rgb', 'headache/S006C003P008R001A044_rgb', 'headache/S005C003P021R001A044_rgb', 'headache/S004C002P003R001A044_rgb', 'headache/S011C002P016R001A044_rgb', 'headache/S001C002P005R002A044_rgb', 'headache/S017C001P009R001A044_rgb', 'headache/S010C002P016R001A044_rgb', 'headache/S014C003P008R001A044_rgb', 'headache/S014C003P015R001A044_rgb', 'headache/S001C003P004R002A044_rgb', 'headache/S001C002P002R002A044_rgb', 'headache/S007C001P016R001A044_rgb', 
'headache/S017C002P017R002A044_rgb', 'headache/S011C002P038R002A044_rgb', 'headache/S009C002P019R002A044_rgb', 'headache/S011C001P016R002A044_rgb', 'headache/S007C001P015R001A044_rgb', 'headache/S009C001P008R001A044_rgb', 'headache/S002C002P013R002A044_rgb', 'headache/S009C002P025R002A044_rgb', 'headache/S002C002P008R002A044_rgb', 'headache/S007C001P007R001A044_rgb', 'headache/S008C002P036R002A044_rgb', 'headache/S006C001P007R001A044_rgb', 'headache/S013C002P015R001A044_rgb', 'headache/S013C002P017R001A044_rgb', 'headache/S016C001P039R002A044_rgb', 'headache/S008C001P031R002A044_rgb', 'headache/S015C003P007R001A044_rgb', 'headache/S012C003P017R002A044_rgb', 'headache/S010C002P017R002A044_rgb', 'headache/S005C002P004R001A044_rgb', 'headache/S017C003P009R002A044_rgb', 'headache/S004C003P008R002A044_rgb', 'headache/S002C002P014R002A044_rgb', 'headache/S003C001P019R001A044_rgb', 'headache/S006C001P001R001A044_rgb', 'headache/S003C001P019R002A044_rgb', 'headache/S005C003P016R001A044_rgb', 'headache/S009C003P008R002A044_rgb', 'headache/S007C001P015R002A044_rgb', 'headache/S009C003P025R001A044_rgb', 'headache/S005C002P016R002A044_rgb', 'headache/S010C003P007R001A044_rgb', 'headache/S017C002P020R001A044_rgb', 'headache/S011C003P019R001A044_rgb', 'headache/S007C001P019R002A044_rgb', 'headache/S010C002P017R001A044_rgb', 'headache/S008C002P032R001A044_rgb', 'headache/S006C001P024R001A044_rgb', 'headache/S001C003P005R002A044_rgb', 'headache/S009C003P019R002A044_rgb', 'headache/S001C002P008R002A044_rgb', 'headache/S004C002P007R002A044_rgb', 'headache/S017C003P003R002A044_rgb', 'headache/S004C003P003R001A044_rgb', 'headache/S001C003P003R001A044_rgb', 'headache/S012C003P028R002A044_rgb', 'headache/S011C002P028R001A044_rgb', 'headache/S017C001P017R002A044_rgb', 'headache/S006C002P024R001A044_rgb', 'headache/S012C002P025R001A044_rgb', 'headache/S017C003P003R001A044_rgb', 'headache/S004C001P008R002A044_rgb', 'headache/S014C002P025R001A044_rgb', 'headache/S012C001P018R001A044_rgb', 'headache/S009C002P007R001A044_rgb', 'headache/S011C001P038R001A044_rgb', 'headache/S015C003P015R002A044_rgb', 'headache/S013C003P025R002A044_rgb', 'headache/S015C002P016R001A044_rgb', 'headache/S001C001P008R002A044_rgb', 'headache/S002C001P010R002A044_rgb', 'headache/S012C003P007R001A044_rgb', 'headache/S003C003P002R002A044_rgb', 'headache/S010C001P007R002A044_rgb', 'headache/S014C001P019R002A044_rgb', 'headache/S011C002P025R002A044_rgb', 'headache/S008C001P001R002A044_rgb', 'headache/S015C002P019R002A044_rgb', 'headache/S008C003P030R001A044_rgb', 'headache/S010C001P021R001A044_rgb', 'headache/S015C002P037R002A044_rgb', 'headache/S009C001P025R002A044_rgb', 'headache/S015C001P025R001A044_rgb', 'headache/S010C002P015R001A044_rgb', 'headache/S014C003P039R001A044_rgb', 'headache/S007C003P026R001A044_rgb', 'headache/S008C003P001R001A044_rgb', 'headache/S017C001P007R002A044_rgb', 'headache/S008C001P032R002A044_rgb', 'headache/S003C002P002R001A044_rgb', 'headache/S008C001P030R001A044_rgb', 'headache/S012C003P015R002A044_rgb', 'headache/S011C003P002R002A044_rgb', 'headache/S010C001P016R001A044_rgb', 'headache/S013C002P015R002A044_rgb', 'headache/S010C002P018R001A044_rgb', 'headache/S013C003P025R001A044_rgb', 'headache/S008C003P031R002A044_rgb', 'headache/S010C003P016R001A044_rgb', 'headache/S017C002P008R002A044_rgb', 'headache/S006C001P017R002A044_rgb', 'headache/S006C003P015R002A044_rgb', 'headache/S007C003P008R002A044_rgb', 'headache/S011C001P015R001A044_rgb', 'headache/S005C001P004R002A044_rgb', 'headache/S010C002P021R001A044_rgb', 
'headache/S014C001P008R002A044_rgb', 'headache/S010C003P008R001A044_rgb', 'headache/S015C001P019R001A044_rgb', 'headache/S014C001P017R002A044_rgb', 'headache/S012C003P027R002A044_rgb', 'headache/S004C002P020R002A044_rgb', 'headache/S002C001P011R001A044_rgb', 'headache/S012C002P015R001A044_rgb', 'headache/S004C001P008R001A044_rgb', 'headache/S015C001P016R002A044_rgb', 'headache/S001C001P002R002A044_rgb', 'headache/S007C002P017R001A044_rgb', 'headache/S015C001P019R002A044_rgb', 'headache/S016C002P008R002A044_rgb', 'headache/S011C002P016R002A044_rgb', 'headache/S014C003P019R001A044_rgb', 'headache/S015C003P008R001A044_rgb', 'headache/S014C002P017R002A044_rgb', 'headache/S007C002P016R002A044_rgb', 'headache/S008C003P007R001A044_rgb', 'headache/S010C002P008R001A044_rgb', 'headache/S008C002P033R002A044_rgb', 'headache/S012C001P028R001A044_rgb', 'headache/S012C002P037R001A044_rgb', 'headache/S002C001P013R001A044_rgb', 'headache/S002C001P003R001A044_rgb', 'headache/S008C001P007R001A044_rgb', 'headache/S005C001P004R001A044_rgb', 'headache/S017C001P016R001A044_rgb', 'headache/S017C003P020R001A044_rgb', 'headache/S015C003P007R002A044_rgb', 'headache/S013C003P007R001A044_rgb', 'headache/S008C003P008R001A044_rgb', 'headache/S017C001P015R002A044_rgb', 'headache/S008C003P025R001A044_rgb', 'headache/S016C003P007R001A044_rgb', 'headache/S014C001P037R002A044_rgb', 'headache/S003C001P008R001A044_rgb', 'headache/S007C001P008R001A044_rgb', 'headache/S006C001P019R002A044_rgb', 'headache/S008C001P015R001A044_rgb', 'headache/S002C001P009R002A044_rgb', 'headache/S009C001P019R001A044_rgb', 'headache/S014C001P027R002A044_rgb', 'headache/S002C003P003R002A044_rgb', 'headache/S006C002P024R002A044_rgb', 'headache/S002C003P008R002A044_rgb', 'headache/S015C003P037R001A044_rgb', 'headache/S009C003P025R002A044_rgb', 'headache/S007C002P001R001A044_rgb', 'headache/S011C001P025R002A044_rgb', 'headache/S003C002P017R002A044_rgb', 'headache/S003C003P008R001A044_rgb', 'headache/S014C002P007R002A044_rgb', 'headache/S008C003P007R002A044_rgb', 'headache/S002C002P007R002A044_rgb', 'headache/S001C003P002R001A044_rgb', 'headache/S011C003P001R002A044_rgb', 'headache/S012C002P015R002A044_rgb', 'headache/S013C001P037R002A044_rgb', 'headache/S001C001P001R002A044_rgb', 'headache/S008C002P031R002A044_rgb', 'headache/S013C001P015R002A044_rgb', 'headache/S007C003P028R001A044_rgb', 'headache/S011C002P025R001A044_rgb', 'headache/S006C003P022R002A044_rgb', 'headache/S014C001P039R002A044_rgb', 'headache/S017C002P009R001A044_rgb', 'headache/S009C002P017R002A044_rgb', 'headache/S013C003P028R002A044_rgb', 'headache/S012C003P016R002A044_rgb', 'headache/S012C002P028R001A044_rgb', 'headache/S011C003P001R001A044_rgb', 'headache/S013C002P018R001A044_rgb', 'headache/S008C003P033R002A044_rgb', 'headache/S008C003P035R001A044_rgb', 'headache/S012C001P015R001A044_rgb', 'headache/S008C002P032R002A044_rgb', 'headache/S017C002P016R001A044_rgb', 'headache/S003C001P016R002A044_rgb', 'headache/S014C003P025R001A044_rgb', 'headache/S017C003P009R001A044_rgb', 'headache/S001C003P002R002A044_rgb', 'headache/S015C001P037R001A044_rgb', 'headache/S006C003P001R001A044_rgb', 'headache/S003C002P008R002A044_rgb', 'headache/S005C003P010R001A044_rgb', 'headache/S001C003P001R002A044_rgb', 'headache/S010C001P016R002A044_rgb', 'headache/S005C002P021R002A044_rgb', 'headache/S016C001P008R002A044_rgb', 'headache/S013C001P007R001A044_rgb', 'headache/S003C001P002R001A044_rgb', 'headache/S016C003P039R002A044_rgb', 'headache/S012C003P008R002A044_rgb', 'headache/S013C003P019R002A044_rgb', 
'headache/S005C001P016R001A044_rgb', 'headache/S002C002P009R001A044_rgb', 'headache/S004C001P020R001A044_rgb', 'headache/S002C001P007R002A044_rgb', 'headache/S008C003P030R002A044_rgb', 'headache/S010C002P008R002A044_rgb', 'headache/S006C002P016R001A044_rgb', 'headache/S013C001P037R001A044_rgb', 'headache/S007C001P017R001A044_rgb', 'headache/S006C002P022R002A044_rgb', 'headache/S010C001P019R002A044_rgb', 'headache/S004C002P008R002A044_rgb', 'headache/S010C003P018R002A044_rgb', 'headache/S014C003P008R002A044_rgb', 'headache/S007C003P019R001A044_rgb', 'headache/S014C001P025R001A044_rgb', 'headache/S012C001P007R001A044_rgb', 'headache/S014C002P039R002A044_rgb', 'headache/S012C002P018R002A044_rgb', 'headache/S013C002P025R001A044_rgb', 'headache/S011C001P019R002A044_rgb', 'headache/S005C002P018R002A044_rgb', 'headache/S010C002P018R002A044_rgb', 'headache/S013C003P015R001A044_rgb', 'headache/S003C001P008R002A044_rgb', 'headache/S009C003P015R002A044_rgb', 'headache/S007C003P028R002A044_rgb', 'headache/S007C002P017R002A044_rgb', 'headache/S010C001P017R002A044_rgb', 'headache/S014C002P027R001A044_rgb', 'headache/S015C002P019R001A044_rgb', 'headache/S010C003P019R001A044_rgb', 'headache/S001C001P003R001A044_rgb', 'headache/S010C003P025R001A044_rgb', 'headache/S016C001P019R002A044_rgb', 'headache/S005C002P017R002A044_rgb', 'headache/S004C003P020R001A044_rgb', 'headache/S014C001P015R002A044_rgb', 'headache/S010C003P017R001A044_rgb', 'headache/S007C001P028R002A044_rgb', 'headache/S005C001P018R001A044_rgb', 'headache/S003C001P001R001A044_rgb', 'headache/S010C003P015R002A044_rgb', 'headache/S001C002P002R001A044_rgb', 'headache/S011C001P038R002A044_rgb', 'chestPain/S008C002P015R002A045_rgb', 'chestPain/S017C003P003R001A045_rgb', 'chestPain/S001C001P007R002A045_rgb', 'chestPain/S010C002P025R001A045_rgb', 'chestPain/S016C001P025R001A045_rgb', 'chestPain/S010C001P021R001A045_rgb', 'chestPain/S014C002P008R001A045_rgb', 'chestPain/S017C003P007R001A045_rgb', 'chestPain/S005C001P017R002A045_rgb', 'chestPain/S013C002P016R002A045_rgb', 'chestPain/S002C002P010R002A045_rgb', 'chestPain/S011C003P002R001A045_rgb', 'chestPain/S008C003P034R002A045_rgb', 'chestPain/S007C002P027R002A045_rgb', 'chestPain/S011C002P018R001A045_rgb', 'chestPain/S011C003P019R002A045_rgb', 'chestPain/S003C002P019R001A045_rgb', 'chestPain/S008C003P031R001A045_rgb', 'chestPain/S007C001P017R001A045_rgb', 'chestPain/S008C003P032R002A045_rgb', 'chestPain/S006C003P024R001A045_rgb', 'chestPain/S004C003P007R001A045_rgb', 'chestPain/S016C003P040R001A045_rgb', 'chestPain/S007C001P027R002A045_rgb', 'chestPain/S005C001P013R001A045_rgb', 'chestPain/S003C002P007R001A045_rgb', 'chestPain/S011C001P027R002A045_rgb', 'chestPain/S011C002P025R002A045_rgb', 'chestPain/S001C002P007R002A045_rgb', 'chestPain/S007C002P017R002A045_rgb', 'chestPain/S012C002P027R001A045_rgb', 'chestPain/S002C002P008R002A045_rgb', 'chestPain/S015C002P007R001A045_rgb', 'chestPain/S009C001P008R001A045_rgb', 'chestPain/S011C003P025R001A045_rgb', 'chestPain/S014C001P039R001A045_rgb', 'chestPain/S016C003P019R001A045_rgb', 'chestPain/S010C001P018R001A045_rgb', 'chestPain/S009C002P007R001A045_rgb', 'chestPain/S002C002P003R001A045_rgb', 'chestPain/S010C003P017R002A045_rgb', 'chestPain/S008C001P029R002A045_rgb', 'chestPain/S012C001P015R001A045_rgb', 'chestPain/S007C003P019R002A045_rgb', 'chestPain/S014C002P017R001A045_rgb', 'chestPain/S010C002P013R001A045_rgb', 'chestPain/S005C002P018R001A045_rgb', 'chestPain/S009C001P007R002A045_rgb', 'chestPain/S002C001P003R001A045_rgb', 
'chestPain/S003C003P002R001A045_rgb', 'chestPain/S002C002P008R001A045_rgb', 'chestPain/S003C003P001R001A045_rgb', 'chestPain/S008C001P015R002A045_rgb', 'chestPain/S007C001P016R002A045_rgb', 'chestPain/S010C003P016R001A045_rgb', 'chestPain/S007C001P007R001A045_rgb', 'chestPain/S003C001P017R001A045_rgb', 'chestPain/S017C003P009R001A045_rgb', 'chestPain/S006C002P016R001A045_rgb', 'chestPain/S011C003P008R001A045_rgb', 'chestPain/S014C002P025R001A045_rgb', 'chestPain/S003C002P002R002A045_rgb', 'chestPain/S017C001P017R001A045_rgb', 'chestPain/S013C002P028R002A045_rgb', 'chestPain/S002C002P013R002A045_rgb', 'chestPain/S004C003P008R001A045_rgb', 'chestPain/S015C001P015R001A045_rgb', 'chestPain/S014C003P037R001A045_rgb', 'chestPain/S013C002P037R002A045_rgb', 'chestPain/S001C001P006R001A045_rgb', 'chestPain/S006C001P023R002A045_rgb', 'chestPain/S008C003P035R002A045_rgb', 'chestPain/S015C003P016R001A045_rgb', 'chestPain/S008C002P035R002A045_rgb', 'chestPain/S005C002P021R002A045_rgb', 'chestPain/S009C001P017R002A045_rgb', 'chestPain/S008C003P029R001A045_rgb', 'chestPain/S005C001P016R001A045_rgb', 'chestPain/S016C002P008R002A045_rgb', 'chestPain/S017C002P003R001A045_rgb', 'chestPain/S010C001P015R001A045_rgb', 'chestPain/S012C002P017R001A045_rgb', 'chestPain/S008C001P030R001A045_rgb', 'chestPain/S007C001P027R001A045_rgb', 'chestPain/S016C002P021R001A045_rgb', 'chestPain/S011C003P015R001A045_rgb', 'chestPain/S003C001P016R001A045_rgb', 'chestPain/S005C002P015R002A045_rgb', 'chestPain/S011C001P019R002A045_rgb', 'chestPain/S003C002P017R002A045_rgb', 'chestPain/S014C002P015R002A045_rgb', 'chestPain/S011C001P028R002A045_rgb', 'chestPain/S002C001P012R001A045_rgb', 'chestPain/S013C003P025R001A045_rgb', 'chestPain/S011C002P025R001A045_rgb', 'chestPain/S007C001P017R002A045_rgb', 'chestPain/S005C001P015R002A045_rgb', 'chestPain/S006C001P007R001A045_rgb', 'chestPain/S008C002P001R002A045_rgb', 'chestPain/S011C001P001R002A045_rgb', 'chestPain/S001C003P002R002A045_rgb', 'chestPain/S011C003P018R002A045_rgb', 'chestPain/S012C003P008R001A045_rgb', 'chestPain/S015C002P025R001A045_rgb', 'chestPain/S002C002P011R001A045_rgb', 'chestPain/S012C003P028R001A045_rgb', 'chestPain/S005C001P018R002A045_rgb', 'chestPain/S014C003P039R001A045_rgb', 'chestPain/S006C003P007R001A045_rgb', 'chestPain/S007C001P019R001A045_rgb', 'chestPain/S014C001P039R002A045_rgb', 'chestPain/S014C001P037R001A045_rgb', 'chestPain/S011C001P038R002A045_rgb', 'chestPain/S013C003P017R001A045_rgb', 'chestPain/S008C003P029R002A045_rgb', 'chestPain/S013C003P019R002A045_rgb', 'chestPain/S009C003P017R001A045_rgb', 'chestPain/S010C002P013R002A045_rgb', 'chestPain/S001C003P006R001A045_rgb', 'chestPain/S012C002P018R001A045_rgb', 'chestPain/S002C002P007R002A045_rgb', 'chestPain/S011C002P008R002A045_rgb', 'chestPain/S003C003P019R002A045_rgb', 'chestPain/S017C002P015R001A045_rgb', 'chestPain/S017C002P016R001A045_rgb', 'chestPain/S006C001P016R002A045_rgb', 'chestPain/S008C001P019R002A045_rgb', 'chestPain/S007C002P026R001A045_rgb', 'chestPain/S003C002P015R001A045_rgb', 'chestPain/S009C002P019R002A045_rgb', 'chestPain/S010C001P007R002A045_rgb', 'chestPain/S017C003P008R002A045_rgb', 'chestPain/S005C001P004R002A045_rgb', 'chestPain/S004C001P007R002A045_rgb', 'chestPain/S007C001P008R002A045_rgb', 'chestPain/S004C001P008R001A045_rgb', 'chestPain/S008C001P029R001A045_rgb', 'chestPain/S001C002P004R001A045_rgb', 'chestPain/S003C001P002R002A045_rgb', 'chestPain/S015C003P025R002A045_rgb', 'chestPain/S007C002P007R002A045_rgb', 'chestPain/S014C003P015R002A045_rgb', 
'chestPain/S005C003P017R002A045_rgb', 'chestPain/S007C002P015R002A045_rgb', 'chestPain/S002C002P012R002A045_rgb', 'chestPain/S007C003P018R001A045_rgb', 'chestPain/S010C003P008R001A045_rgb', 'chestPain/S007C002P018R001A045_rgb', 'chestPain/S015C003P016R002A045_rgb', 'chestPain/S013C001P037R002A045_rgb', 'chestPain/S001C001P001R001A045_rgb', 'chestPain/S006C002P008R002A045_rgb', 'chestPain/S002C001P008R001A045_rgb', 'chestPain/S014C003P008R001A045_rgb', 'chestPain/S007C002P008R002A045_rgb', 'chestPain/S014C002P008R002A045_rgb', 'chestPain/S007C001P007R002A045_rgb', 'chestPain/S008C001P034R001A045_rgb', 'chestPain/S010C003P021R001A045_rgb', 'chestPain/S008C003P015R001A045_rgb', 'chestPain/S012C001P027R001A045_rgb', 'chestPain/S012C001P027R002A045_rgb', 'chestPain/S012C001P007R001A045_rgb', 'chestPain/S006C001P022R001A045_rgb', 'chestPain/S013C001P028R001A045_rgb', 'chestPain/S013C003P008R002A045_rgb', 'chestPain/S003C002P001R002A045_rgb', 'chestPain/S006C003P016R001A045_rgb', 'chestPain/S010C002P021R001A045_rgb', 'chestPain/S009C003P015R001A045_rgb', 'chestPain/S008C003P030R002A045_rgb', 'chestPain/S012C003P017R001A045_rgb', 'chestPain/S011C003P015R002A045_rgb', 'chestPain/S005C003P013R002A045_rgb', 'chestPain/S001C002P008R001A045_rgb', 'chestPain/S017C002P017R001A045_rgb', 'chestPain/S015C003P019R001A045_rgb', 'chestPain/S004C001P020R002A045_rgb', 'chestPain/S008C003P036R002A045_rgb', 'chestPain/S015C003P025R001A045_rgb', 'chestPain/S010C002P008R001A045_rgb', 'chestPain/S017C001P007R002A045_rgb', 'chestPain/S014C001P025R002A045_rgb', 'chestPain/S017C002P003R002A045_rgb', 'chestPain/S008C001P035R002A045_rgb', 'chestPain/S001C003P003R002A045_rgb', 'chestPain/S008C002P030R002A045_rgb', 'chestPain/S011C002P015R001A045_rgb', 'chestPain/S013C003P015R002A045_rgb', 'chestPain/S001C003P002R001A045_rgb', 'chestPain/S008C002P025R001A045_rgb', 'chestPain/S002C001P003R002A045_rgb', 'chestPain/S017C001P008R002A045_rgb', 'chestPain/S006C001P024R001A045_rgb', 'chestPain/S007C002P015R001A045_rgb', 'chestPain/S001C001P005R002A045_rgb', 'chestPain/S006C001P007R002A045_rgb', 'chestPain/S001C002P003R002A045_rgb', 'chestPain/S013C003P025R002A045_rgb', 'chestPain/S013C003P007R001A045_rgb', 'chestPain/S013C001P015R002A045_rgb', 'chestPain/S006C001P022R002A045_rgb', 'chestPain/S001C001P008R001A045_rgb', 'chestPain/S004C003P007R002A045_rgb', 'chestPain/S002C002P009R001A045_rgb', 'chestPain/S008C002P030R001A045_rgb', 'chestPain/S014C001P007R002A045_rgb', 'chestPain/S003C003P015R002A045_rgb', 'chestPain/S006C003P017R002A045_rgb', 'chestPain/S012C003P018R002A045_rgb', 'chestPain/S002C003P012R002A045_rgb', 'chestPain/S009C003P015R002A045_rgb', 'chestPain/S006C003P019R001A045_rgb', 'chestPain/S013C001P019R002A045_rgb', 'chestPain/S001C001P005R001A045_rgb', 'chestPain/S008C002P034R002A045_rgb', 'chestPain/S003C002P016R001A045_rgb', 'chestPain/S015C002P019R001A045_rgb', 'chestPain/S002C002P010R001A045_rgb', 'chestPain/S010C003P008R002A045_rgb', 'chestPain/S016C001P039R001A045_rgb', 'chestPain/S002C003P011R001A045_rgb', 'chestPain/S005C002P010R001A045_rgb', 'chestPain/S008C001P007R002A045_rgb', 'chestPain/S013C002P017R002A045_rgb', 'chestPain/S008C002P033R002A045_rgb', 'chestPain/S005C001P010R001A045_rgb', 'chestPain/S010C002P017R002A045_rgb', 'chestPain/S003C001P019R002A045_rgb', 'chestPain/S012C003P027R002A045_rgb', 'chestPain/S011C002P019R002A045_rgb', 'chestPain/S001C001P006R002A045_rgb', 'chestPain/S016C003P008R001A045_rgb', 'chestPain/S006C002P016R002A045_rgb', 'chestPain/S006C001P017R002A045_rgb', 
'chestPain/S011C003P017R001A045_rgb', 'chestPain/S014C003P015R001A045_rgb', 'chestPain/S009C002P016R001A045_rgb', 'chestPain/S005C003P018R002A045_rgb', 'chestPain/S017C001P003R001A045_rgb', 'chestPain/S007C002P019R001A045_rgb', 'chestPain/S016C003P021R002A045_rgb', 'chestPain/S007C003P019R001A045_rgb', 'chestPain/S007C003P025R001A045_rgb', 'chestPain/S011C003P027R002A045_rgb', 'chestPain/S012C001P016R001A045_rgb', 'chestPain/S002C001P011R002A045_rgb', 'chestPain/S013C001P017R001A045_rgb', 'chestPain/S006C001P001R002A045_rgb', 'chestPain/S005C003P015R002A045_rgb', 'chestPain/S014C002P007R002A045_rgb', 'chestPain/S005C001P018R001A045_rgb', 'chestPain/S008C001P025R002A045_rgb', 'chestPain/S008C003P015R002A045_rgb', 'chestPain/S010C002P018R001A045_rgb', 'chestPain/S014C002P025R002A045_rgb', 'chestPain/S006C002P023R001A045_rgb', 'chestPain/S010C001P019R001A045_rgb', 'chestPain/S008C001P001R002A045_rgb', 'chestPain/S017C003P016R002A045_rgb', 'chestPain/S004C003P003R002A045_rgb', 'chestPain/S013C003P028R001A045_rgb', 'chestPain/S013C001P016R002A045_rgb', 'chestPain/S010C001P013R002A045_rgb', 'chestPain/S015C002P017R002A045_rgb', 'chestPain/S003C001P008R001A045_rgb', 'chestPain/S001C002P006R002A045_rgb', 'chestPain/S006C001P008R002A045_rgb', 'chestPain/S010C002P017R001A045_rgb', 'chestPain/S001C001P004R002A045_rgb', 'chestPain/S012C002P019R002A045_rgb', 'chestPain/S002C003P009R001A045_rgb', 'chestPain/S003C001P018R002A045_rgb', 'chestPain/S013C001P007R002A045_rgb', 'chestPain/S001C001P004R001A045_rgb', 'chestPain/S007C003P025R002A045_rgb', 'chestPain/S010C001P008R002A045_rgb', 'chestPain/S008C003P033R002A045_rgb', 'chestPain/S017C002P007R001A045_rgb', 'chestPain/S004C003P003R001A045_rgb', 'chestPain/S009C003P025R002A045_rgb', 'chestPain/S008C002P029R001A045_rgb', 'chestPain/S004C001P020R001A045_rgb', 'chestPain/S012C003P007R002A045_rgb', 'chestPain/S001C002P002R001A045_rgb', 'chestPain/S010C002P015R002A045_rgb', 'chestPain/S015C003P015R001A045_rgb', 'chestPain/S011C003P028R002A045_rgb', 'chestPain/S016C002P039R002A045_rgb', 'chestPain/S006C002P017R001A045_rgb', 'chestPain/S006C002P015R001A045_rgb', 'chestPain/S010C002P007R001A045_rgb', 'chestPain/S002C003P014R001A045_rgb', 'chestPain/S017C002P020R002A045_rgb', 'chestPain/S014C003P017R001A045_rgb', 'chestPain/S014C001P037R002A045_rgb', 'chestPain/S006C002P007R001A045_rgb', 'chestPain/S002C002P014R001A045_rgb', 'chestPain/S011C003P002R002A045_rgb', 'chestPain/S007C003P018R002A045_rgb', 'chestPain/S010C002P007R002A045_rgb', 'chestPain/S014C001P015R001A045_rgb', 'chestPain/S011C003P016R001A045_rgb', 'chestPain/S011C002P008R001A045_rgb', 'chestPain/S014C003P017R002A045_rgb', 'chestPain/S008C002P036R001A045_rgb', 'chestPain/S001C003P008R002A045_rgb', 'chestPain/S017C002P009R002A045_rgb', 'chestPain/S006C002P022R002A045_rgb', 'chestPain/S005C002P004R002A045_rgb', 'chestPain/S006C002P019R001A045_rgb', 'chestPain/S013C001P027R002A045_rgb', 'chestPain/S014C003P025R001A045_rgb', 'chestPain/S014C001P008R002A045_rgb', 'chestPain/S003C003P018R002A045_rgb', 'chestPain/S017C002P008R001A045_rgb', 'chestPain/S007C002P019R002A045_rgb', 'chestPain/S006C002P024R002A045_rgb', 'chestPain/S013C002P015R001A045_rgb', 'chestPain/S005C002P017R002A045_rgb', 'chestPain/S007C001P016R001A045_rgb', 'chestPain/S007C003P028R001A045_rgb', 'chestPain/S011C002P027R002A045_rgb', 'chestPain/S008C003P033R001A045_rgb', 'chestPain/S012C001P016R002A045_rgb', 'chestPain/S010C003P018R001A045_rgb', 'chestPain/S005C002P013R002A045_rgb', 'chestPain/S009C001P025R002A045_rgb', 
'chestPain/S016C001P040R002A045_rgb', 'chestPain/S012C003P037R002A045_rgb', 'chestPain/S013C001P027R001A045_rgb', 'chestPain/S005C001P021R001A045_rgb', 'chestPain/S012C001P018R001A045_rgb', 'chestPain/S016C001P040R001A045_rgb', 'chestPain/S016C001P021R001A045_rgb', 'chestPain/S012C002P016R001A045_rgb', 'chestPain/S005C001P013R002A045_rgb', 'chestPain/S001C002P001R002A045_rgb', 'chestPain/S008C001P032R002A045_rgb', 'chestPain/S007C003P016R002A045_rgb', 'chestPain/S015C001P019R001A045_rgb', 'chestPain/S004C002P008R001A045_rgb', 'chestPain/S017C001P020R001A045_rgb', 'chestPain/S001C003P006R002A045_rgb', 'chestPain/S008C002P015R001A045_rgb', 'chestPain/S001C002P006R001A045_rgb', 'chestPain/S014C003P019R001A045_rgb', 'chestPain/S001C002P004R002A045_rgb', 'chestPain/S013C003P017R002A045_rgb', 'chestPain/S011C001P007R001A045_rgb', 'chestPain/S014C002P037R001A045_rgb', 'chestPain/S006C003P023R001A045_rgb', 'chestPain/S012C003P015R001A045_rgb', 'chestPain/S003C001P017R002A045_rgb', 'chestPain/S008C001P015R001A045_rgb', 'chestPain/S012C003P028R002A045_rgb', 'chestPain/S009C001P017R001A045_rgb', 'chestPain/S016C003P039R002A045_rgb', 'chestPain/S016C003P039R001A045_rgb', 'chestPain/S015C002P008R002A045_rgb', 'chestPain/S012C001P025R002A045_rgb', 'chestPain/S013C002P028R001A045_rgb', 'chestPain/S017C001P003R002A045_rgb', 'chestPain/S011C002P007R002A045_rgb', 'chestPain/S005C003P004R001A045_rgb', 'chestPain/S002C003P008R001A045_rgb', 'chestPain/S016C001P021R002A045_rgb', 'chestPain/S008C001P030R002A045_rgb', 'chestPain/S007C002P016R002A045_rgb', 'chestPain/S011C002P038R002A045_rgb', 'chestPain/S012C002P007R001A045_rgb', 'chestPain/S002C002P007R001A045_rgb', 'chestPain/S008C001P031R002A045_rgb', 'chestPain/S006C002P017R002A045_rgb', 'chestPain/S004C002P020R001A045_rgb', 'chestPain/S009C002P015R002A045_rgb', 'chestPain/S007C002P025R002A045_rgb', 'chestPain/S002C001P009R002A045_rgb', 'chestPain/S007C003P001R002A045_rgb', 'chestPain/S016C003P007R002A045_rgb', 'chestPain/S010C001P016R001A045_rgb', 'chestPain/S013C002P025R001A045_rgb', 'chestPain/S008C002P008R001A045_rgb', 'chestPain/S006C003P022R001A045_rgb', 'chestPain/S004C002P008R002A045_rgb', 'chestPain/S013C001P018R001A045_rgb', 'chestPain/S009C003P008R001A045_rgb', 'chestPain/S009C002P007R002A045_rgb', 'chestPain/S007C003P016R001A045_rgb', 'chestPain/S005C002P021R001A045_rgb', 'chestPain/S014C002P027R001A045_rgb', 'chestPain/S016C003P025R002A045_rgb', 'chestPain/S002C003P009R002A045_rgb', 'chestPain/S008C003P008R001A045_rgb', 'chestPain/S013C001P008R002A045_rgb', 'chestPain/S006C001P024R002A045_rgb', 'chestPain/S010C003P019R002A045_rgb', 'chestPain/S009C003P008R002A045_rgb', 'chestPain/S008C002P019R001A045_rgb', 'chestPain/S016C001P007R001A045_rgb', 'chestPain/S010C001P025R002A045_rgb', 'chestPain/S012C002P017R002A045_rgb', 'chestPain/S006C003P017R001A045_rgb', 'chestPain/S013C001P019R001A045_rgb', 'chestPain/S010C003P018R002A045_rgb', 'chestPain/S008C003P030R001A045_rgb', 'chestPain/S015C001P007R002A045_rgb', 'chestPain/S015C001P015R002A045_rgb', 'chestPain/S011C003P028R001A045_rgb', 'chestPain/S007C002P028R002A045_rgb', 'chestPain/S009C002P016R002A045_rgb', 'chestPain/S010C002P015R001A045_rgb', 'chestPain/S011C002P016R001A045_rgb', 'chestPain/S003C003P016R002A045_rgb', 'chestPain/S011C003P001R001A045_rgb', 'chestPain/S015C001P025R002A045_rgb', 'chestPain/S010C003P025R001A045_rgb', 'chestPain/S013C003P018R001A045_rgb', 'chestPain/S007C001P018R002A045_rgb', 'chestPain/S012C001P008R002A045_rgb', 'chestPain/S017C003P003R002A045_rgb', 
'chestPain/S017C001P017R002A045_rgb', 'chestPain/S007C001P001R002A045_rgb', 'chestPain/S006C003P022R002A045_rgb', 'chestPain/S014C003P039R002A045_rgb', 'chestPain/S015C001P016R002A045_rgb', 'chestPain/S008C002P031R001A045_rgb', 'chestPain/S012C002P015R002A045_rgb', 'chestPain/S006C003P001R002A045_rgb', 'chestPain/S001C001P003R002A045_rgb', 'chestPain/S016C002P007R002A045_rgb', 'chestPain/S016C002P025R002A045_rgb', 'chestPain/S010C001P007R001A045_rgb', 'chestPain/S016C002P039R001A045_rgb', 'chestPain/S011C001P008R002A045_rgb', 'chestPain/S003C002P018R002A045_rgb', 'chestPain/S001C003P007R002A045_rgb', 'chestPain/S011C001P016R002A045_rgb', 'chestPain/S003C002P002R001A045_rgb', 'chestPain/S010C001P016R002A045_rgb', 'chestPain/S011C002P001R001A045_rgb', 'chestPain/S008C002P001R001A045_rgb', 'chestPain/S005C003P010R001A045_rgb', 'chestPain/S016C001P019R002A045_rgb', 'chestPain/S013C002P037R001A045_rgb', 'chestPain/S003C002P015R002A045_rgb', 'chestPain/S014C002P015R001A045_rgb', 'chestPain/S005C003P004R002A045_rgb', 'chestPain/S007C003P007R002A045_rgb', 'chestPain/S014C003P007R001A045_rgb', 'chestPain/S015C003P037R002A045_rgb', 'chestPain/S012C001P015R002A045_rgb', 'chestPain/S007C001P025R002A045_rgb', 'chestPain/S008C002P034R001A045_rgb', 'chestPain/S001C001P008R002A045_rgb', 'chestPain/S017C001P009R001A045_rgb', 'chestPain/S010C003P007R001A045_rgb', 'chestPain/S007C001P018R001A045_rgb', 'chestPain/S013C002P019R001A045_rgb', 'chestPain/S001C003P008R001A045_rgb', 'chestPain/S017C002P017R002A045_rgb', 'chestPain/S008C001P001R001A045_rgb', 'chestPain/S017C003P015R001A045_rgb', 'chestPain/S013C001P018R002A045_rgb', 'chestPain/S011C003P001R002A045_rgb', 'chestPain/S003C003P001R002A045_rgb', 'chestPain/S005C003P015R001A045_rgb', 'chestPain/S010C002P019R001A045_rgb', 'chestPain/S010C003P017R001A045_rgb', 'chestPain/S014C001P008R001A045_rgb', 'chestPain/S011C001P015R001A045_rgb', 'chestPain/S014C002P019R002A045_rgb', 'chestPain/S013C002P016R001A045_rgb', 'chestPain/S013C003P018R002A045_rgb', 'chestPain/S015C003P008R001A045_rgb', 'chestPain/S006C002P007R002A045_rgb', 'chestPain/S017C003P009R002A045_rgb', 'chestPain/S015C002P015R002A045_rgb', 'chestPain/S007C001P019R002A045_rgb', 'chestPain/S001C003P005R001A045_rgb', 'chestPain/S011C003P007R001A045_rgb', 'chestPain/S006C001P017R001A045_rgb', 'chestPain/S013C003P027R001A045_rgb', 'chestPain/S012C001P028R002A045_rgb', 'chestPain/S011C002P027R001A045_rgb', 'chestPain/S009C003P019R002A045_rgb', 'chestPain/S009C002P008R002A045_rgb', 'chestPain/S007C001P028R002A045_rgb', 'chestPain/S006C003P016R002A045_rgb', 'chestPain/S008C002P008R002A045_rgb', 'chestPain/S008C002P025R002A045_rgb', 'chestPain/S004C003P020R002A045_rgb', 'chestPain/S015C001P008R002A045_rgb', 'chestPain/S004C003P008R002A045_rgb', 'chestPain/S006C002P001R002A045_rgb', 'chestPain/S011C002P015R002A045_rgb', 'chestPain/S011C002P007R001A045_rgb', 'chestPain/S009C001P025R001A045_rgb', 'chestPain/S016C003P008R002A045_rgb', 'chestPain/S005C003P017R001A045_rgb', 'chestPain/S003C001P018R001A045_rgb', 'chestPain/S010C003P025R002A045_rgb', 'chestPain/S006C001P015R002A045_rgb', 'chestPain/S008C001P036R002A045_rgb', 'chestPain/S001C003P003R001A045_rgb', 'chestPain/S007C003P028R002A045_rgb', 'chestPain/S009C001P015R002A045_rgb', 'chestPain/S014C001P007R001A045_rgb', 'chestPain/S002C002P014R002A045_rgb', 'chestPain/S007C003P026R001A045_rgb', 'chestPain/S015C002P037R002A045_rgb', 'chestPain/S003C003P016R001A045_rgb', 'chestPain/S012C003P015R002A045_rgb', 'chestPain/S002C001P009R001A045_rgb', 
'chestPain/S008C001P007R001A045_rgb', 'chestPain/S011C002P002R001A045_rgb', 'chestPain/S009C001P016R001A045_rgb', 'chestPain/S002C002P003R002A045_rgb', 'chestPain/S012C003P037R001A045_rgb', 'chestPain/S007C002P025R001A045_rgb', 'chestPain/S007C003P027R002A045_rgb', 'chestPain/S006C003P015R001A045_rgb', 'chestPain/S004C001P003R002A045_rgb', 'chestPain/S014C003P008R002A045_rgb', 'chestPain/S001C001P007R001A045_rgb', 'chestPain/S010C001P018R002A045_rgb', 'chestPain/S008C002P007R001A045_rgb', 'chestPain/S003C003P007R001A045_rgb', 'chestPain/S010C002P019R002A045_rgb', 'chestPain/S013C002P015R002A045_rgb', 'chestPain/S006C002P008R001A045_rgb', 'chestPain/S002C001P010R001A045_rgb', 'chestPain/S012C003P008R002A045_rgb', 'chestPain/S011C003P018R001A045_rgb', 'chestPain/S012C001P019R001A045_rgb', 'chestPain/S011C002P028R001A045_rgb', 'chestPain/S014C002P007R001A045_rgb', 'chestPain/S013C002P025R002A045_rgb', 'chestPain/S007C002P016R001A045_rgb', 'chestPain/S008C001P035R001A045_rgb', 'chestPain/S004C001P003R001A045_rgb', 'chestPain/S013C003P027R002A045_rgb', 'chestPain/S015C001P025R001A045_rgb', 'chestPain/S008C001P032R001A045_rgb', 'chestPain/S008C003P032R001A045_rgb', 'chestPain/S005C001P016R002A045_rgb', 'chestPain/S009C003P016R001A045_rgb', 'chestPain/S016C001P019R001A045_rgb', 'chestPain/S010C002P008R002A045_rgb', 'chestPain/S017C002P007R002A045_rgb', 'chestPain/S012C001P017R002A045_rgb', 'chestPain/S008C002P036R002A045_rgb', 'chestPain/S011C001P008R001A045_rgb', 'chestPain/S006C003P001R001A045_rgb', 'chestPain/S012C003P007R001A045_rgb', 'chestPain/S002C002P009R002A045_rgb', 'chestPain/S005C003P013R001A045_rgb', 'chestPain/S012C002P016R002A045_rgb', 'chestPain/S012C002P037R001A045_rgb', 'chestPain/S011C002P017R001A045_rgb', 'chestPain/S011C001P028R001A045_rgb', 'chestPain/S002C003P007R001A045_rgb', 'chestPain/S012C003P025R002A045_rgb', 'chestPain/S001C003P005R002A045_rgb', 'chestPain/S013C003P037R001A045_rgb', 'chestPain/S006C003P019R002A045_rgb', 'chestPain/S014C001P019R001A045_rgb', 'chestPain/S003C003P008R002A045_rgb', 'chestPain/S015C001P017R002A045_rgb', 'chestPain/S015C003P007R002A045_rgb', 'chestPain/S016C002P019R002A045_rgb', 'chestPain/S010C003P021R002A045_rgb', 'chestPain/S016C003P019R002A045_rgb', 'chestPain/S003C001P001R001A045_rgb', 'chestPain/S008C003P031R002A045_rgb', 'chestPain/S004C002P003R001A045_rgb', 'chestPain/S006C002P024R001A045_rgb', 'chestPain/S006C001P016R001A045_rgb', 'chestPain/S010C002P021R002A045_rgb', 'chestPain/S017C002P015R002A045_rgb', 'chestPain/S008C003P001R001A045_rgb', 'chestPain/S011C002P018R002A045_rgb', 'chestPain/S007C002P007R001A045_rgb', 'chestPain/S005C002P018R002A045_rgb', 'chestPain/S014C001P015R002A045_rgb', 'chestPain/S003C002P001R001A045_rgb', 'chestPain/S012C003P016R002A045_rgb', 'chestPain/S010C003P013R001A045_rgb', 'chestPain/S011C001P025R002A045_rgb', 'chestPain/S012C002P028R002A045_rgb', 'chestPain/S010C001P008R001A045_rgb', 'chestPain/S005C002P016R002A045_rgb', 'chestPain/S007C001P026R002A045_rgb', 'chestPain/S007C003P001R001A045_rgb', 'chestPain/S017C003P017R001A045_rgb', 'chestPain/S015C003P037R001A045_rgb', 'chestPain/S012C001P025R001A045_rgb', 'chestPain/S017C002P009R001A045_rgb', 'chestPain/S015C002P008R001A045_rgb', 'chestPain/S015C003P007R001A045_rgb', 'chestPain/S003C003P015R001A045_rgb', 'chestPain/S007C001P015R001A045_rgb', 'chestPain/S012C002P018R002A045_rgb', 'chestPain/S012C002P008R001A045_rgb', 'chestPain/S013C003P037R002A045_rgb', 'chestPain/S003C003P018R001A045_rgb', 'chestPain/S007C001P028R001A045_rgb', 
'chestPain/S004C001P007R001A045_rgb', 'chestPain/S002C003P003R001A045_rgb', 'chestPain/S017C001P015R002A045_rgb', 'chestPain/S007C002P027R001A045_rgb', 'chestPain/S011C001P018R001A045_rgb', 'chestPain/S002C001P013R002A045_rgb', 'chestPain/S012C001P008R001A045_rgb', 'chestPain/S005C003P016R002A045_rgb', 'chestPain/S011C002P019R001A045_rgb', 'chestPain/S012C001P018R002A045_rgb', 'chestPain/S014C003P019R002A045_rgb', 'chestPain/S008C001P033R002A045_rgb', 'chestPain/S001C002P003R001A045_rgb', 'chestPain/S008C003P007R002A045_rgb', 'chestPain/S013C001P025R002A045_rgb', 'chestPain/S006C001P019R002A045_rgb', 'chestPain/S017C003P020R002A045_rgb', 'chestPain/S013C002P017R001A045_rgb', 'chestPain/S003C001P002R001A045_rgb', 'chestPain/S005C003P016R001A045_rgb', 'chestPain/S010C003P007R002A045_rgb', 'chestPain/S011C002P017R002A045_rgb', 'chestPain/S006C001P019R001A045_rgb', 'chestPain/S016C002P008R001A045_rgb', 'chestPain/S001C002P002R002A045_rgb', 'chestPain/S002C001P014R001A045_rgb', 'chestPain/S010C003P015R002A045_rgb', 'chestPain/S002C001P012R002A045_rgb', 'chestPain/S016C002P025R001A045_rgb', 'chestPain/S014C001P027R002A045_rgb', 'chestPain/S003C002P016R002A045_rgb', 'chestPain/S011C003P019R001A045_rgb', 'chestPain/S002C003P013R001A045_rgb', 'chestPain/S011C002P002R002A045_rgb', 'chestPain/S003C002P008R001A045_rgb', 'chestPain/S008C001P031R001A045_rgb', 'chestPain/S008C003P019R001A045_rgb', 'chestPain/S013C001P016R001A045_rgb', 'chestPain/S017C001P007R001A045_rgb', 'chestPain/S012C002P025R001A045_rgb', 'chestPain/S015C003P015R002A045_rgb', 'chestPain/S008C002P031R002A045_rgb', 'chestPain/S014C001P027R001A045_rgb', 'chestPain/S006C003P008R002A045_rgb', 'chestPain/S012C003P017R002A045_rgb', 'chestPain/S017C001P020R002A045_rgb', 'chestPain/S013C002P007R002A045_rgb', 'chestPain/S006C001P001R001A045_rgb', 'chestPain/S003C002P017R001A045_rgb', 'chestPain/S005C002P016R001A045_rgb', 'chestPain/S009C002P017R002A045_rgb', 'chestPain/S016C003P025R001A045_rgb', 'chestPain/S009C003P007R001A045_rgb', 'chestPain/S002C003P008R002A045_rgb', 'chestPain/S008C003P025R001A045_rgb', 'chestPain/S008C002P032R001A045_rgb', 'chestPain/S013C002P027R002A045_rgb', 'chestPain/S005C002P010R002A045_rgb', 'chestPain/S013C002P008R002A045_rgb', 'chestPain/S011C001P027R001A045_rgb', 'chestPain/S008C001P034R002A045_rgb', 'chestPain/S001C003P001R002A045_rgb', 'chestPain/S005C001P021R002A045_rgb', 'chestPain/S002C003P012R001A045_rgb', 'chestPain/S007C003P027R001A045_rgb', 'chestPain/S014C001P025R001A045_rgb', 'chestPain/S009C003P017R002A045_rgb', 'chestPain/S003C001P007R001A045_rgb', 'chestPain/S017C003P008R001A045_rgb', 'chestPain/S011C001P017R001A045_rgb', 'chestPain/S016C001P025R002A045_rgb', 'chestPain/S008C001P019R001A045_rgb', 'chestPain/S009C002P019R001A045_rgb', 'chestPain/S017C003P017R002A045_rgb', 'chestPain/S012C002P015R001A045_rgb', 'chestPain/S011C003P007R002A045_rgb', 'chestPain/S006C003P023R002A045_rgb', 'chestPain/S003C002P008R002A045_rgb', 'chestPain/S005C001P017R001A045_rgb', 'chestPain/S016C001P039R002A045_rgb', 'chestPain/S013C002P008R001A045_rgb', 'chestPain/S016C001P007R002A045_rgb', 'chestPain/S001C002P001R001A045_rgb', 'chestPain/S011C001P019R001A045_rgb', 'chestPain/S001C001P002R001A045_rgb', 'chestPain/S002C001P014R002A045_rgb', 'chestPain/S002C001P007R001A045_rgb', 'chestPain/S016C003P021R001A045_rgb', 'chestPain/S003C002P007R002A045_rgb', 'chestPain/S013C003P015R001A045_rgb', 'chestPain/S013C001P007R001A045_rgb', 'chestPain/S015C003P017R001A045_rgb', 'chestPain/S006C003P008R001A045_rgb', 
'chestPain/S009C003P007R002A045_rgb', 'chestPain/S002C003P011R002A045_rgb', 'chestPain/S001C002P008R002A045_rgb', 'chestPain/S007C002P018R002A045_rgb', 'chestPain/S017C001P008R001A045_rgb', 'chestPain/S012C001P037R002A045_rgb', 'chestPain/S007C003P017R001A045_rgb', 'chestPain/S010C002P018R002A045_rgb', 'chestPain/S002C003P003R002A045_rgb', 'chestPain/S004C002P007R002A045_rgb', 'chestPain/S008C002P019R002A045_rgb', 'chestPain/S007C003P015R002A045_rgb', 'chestPain/S005C002P013R001A045_rgb', 'chestPain/S002C003P013R002A045_rgb', 'chestPain/S009C002P025R001A045_rgb', 'chestPain/S009C002P025R002A045_rgb', 'chestPain/S012C001P037R001A045_rgb', 'chestPain/S011C001P038R001A045_rgb', 'chestPain/S002C002P012R001A045_rgb', 'chestPain/S013C001P008R001A045_rgb', 'chestPain/S013C003P016R002A045_rgb', 'chestPain/S007C002P008R001A045_rgb', 'chestPain/S007C003P017R002A045_rgb', 'chestPain/S007C003P015R001A045_rgb', 'chestPain/S001C003P004R002A045_rgb', 'chestPain/S015C003P019R002A045_rgb', 'chestPain/S001C002P005R001A045_rgb', 'chestPain/S002C002P013R001A045_rgb', 'chestPain/S015C002P037R001A045_rgb', 'chestPain/S008C002P029R002A045_rgb', 'chestPain/S003C003P017R001A045_rgb', 'chestPain/S010C001P021R002A045_rgb', 'chestPain/S017C003P007R002A045_rgb', 'chestPain/S001C002P005R002A045_rgb', 'chestPain/S016C003P040R002A045_rgb', 'chestPain/S013C001P017R002A045_rgb', 'chestPain/S007C003P026R002A045_rgb', 'chestPain/S012C003P025R001A045_rgb', 'chestPain/S013C002P018R002A045_rgb', 'chestPain/S017C001P015R001A045_rgb', 'chestPain/S017C001P016R002A045_rgb', 'chestPain/S003C002P019R002A045_rgb', 'chestPain/S013C003P008R001A045_rgb', 'chestPain/S012C003P018R001A045_rgb', 'chestPain/S011C001P025R001A045_rgb', 'chestPain/S002C001P010R002A045_rgb', 'chestPain/S005C001P004R001A045_rgb', 'chestPain/S005C002P015R001A045_rgb', 'chestPain/S008C003P034R001A045_rgb', 'chestPain/S010C003P019R001A045_rgb', 'chestPain/S013C001P037R001A045_rgb', 'chestPain/S013C003P016R001A045_rgb', 'chestPain/S005C003P010R002A045_rgb', 'chestPain/S011C003P025R002A045_rgb', 'chestPain/S010C003P013R002A045_rgb', 'chestPain/S014C001P017R001A045_rgb', 'chestPain/S003C003P007R002A045_rgb', 'chestPain/S009C001P016R002A045_rgb', 'backPain/S007C003P001R001A046_rgb', 'backPain/S002C003P012R001A046_rgb', 'backPain/S007C001P026R001A046_rgb', 'backPain/S004C001P003R002A046_rgb', 'backPain/S014C001P007R001A046_rgb', 'backPain/S009C002P007R002A046_rgb', 'backPain/S001C002P006R002A046_rgb', 'backPain/S007C003P019R002A046_rgb', 'backPain/S003C003P008R002A046_rgb', 'backPain/S011C002P025R001A046_rgb', 'backPain/S016C003P008R002A046_rgb', 'backPain/S016C003P040R002A046_rgb', 'backPain/S005C001P013R001A046_rgb', 'backPain/S003C003P002R001A046_rgb', 'backPain/S011C002P017R001A046_rgb', 'backPain/S010C003P018R002A046_rgb', 'backPain/S013C001P019R001A046_rgb', 'backPain/S006C002P015R002A046_rgb', 'backPain/S004C001P007R001A046_rgb', 'backPain/S008C002P019R002A046_rgb', 'backPain/S015C002P016R002A046_rgb', 'backPain/S016C001P021R001A046_rgb', 'backPain/S010C001P007R001A046_rgb', 'backPain/S010C003P025R001A046_rgb', 'backPain/S003C001P019R001A046_rgb', 'backPain/S008C003P033R001A046_rgb', 'backPain/S008C003P001R001A046_rgb', 'backPain/S002C001P003R002A046_rgb', 'backPain/S006C002P007R002A046_rgb', 'backPain/S002C002P013R001A046_rgb', 'backPain/S012C001P007R001A046_rgb', 'backPain/S008C001P008R001A046_rgb', 'backPain/S017C003P017R002A046_rgb', 'backPain/S013C002P016R001A046_rgb', 'backPain/S003C002P016R002A046_rgb', 'backPain/S011C002P007R002A046_rgb', 
'backPain/S013C001P008R001A046_rgb', 'backPain/S011C003P001R001A046_rgb', 'backPain/S003C003P007R001A046_rgb', 'backPain/S006C001P024R001A046_rgb', 'backPain/S006C002P008R001A046_rgb', 'backPain/S002C002P014R002A046_rgb', 'backPain/S003C002P002R001A046_rgb', 'backPain/S009C003P017R002A046_rgb', 'backPain/S003C002P018R002A046_rgb', 'backPain/S011C003P019R002A046_rgb', 'backPain/S012C001P016R002A046_rgb', 'backPain/S001C003P003R001A046_rgb', 'backPain/S001C002P001R001A046_rgb', 'backPain/S006C002P019R002A046_rgb', 'backPain/S008C001P007R002A046_rgb', 'backPain/S001C002P004R001A046_rgb', 'backPain/S008C003P031R001A046_rgb', 'backPain/S013C002P028R001A046_rgb', 'backPain/S014C002P017R002A046_rgb', 'backPain/S013C003P025R001A046_rgb', 'backPain/S002C003P008R001A046_rgb', 'backPain/S008C001P015R002A046_rgb', 'backPain/S004C003P003R001A046_rgb', 'backPain/S005C001P015R002A046_rgb', 'backPain/S017C001P017R001A046_rgb', 'backPain/S008C003P035R002A046_rgb', 'backPain/S013C001P008R002A046_rgb', 'backPain/S012C001P015R002A046_rgb', 'backPain/S011C003P038R002A046_rgb', 'backPain/S007C002P017R001A046_rgb', 'backPain/S013C001P027R002A046_rgb', 'backPain/S005C003P017R001A046_rgb', 'backPain/S001C003P008R002A046_rgb', 'backPain/S001C003P005R001A046_rgb', 'backPain/S012C003P027R001A046_rgb', 'backPain/S011C003P015R001A046_rgb', 'backPain/S016C001P008R001A046_rgb', 'backPain/S013C003P007R001A046_rgb', 'backPain/S005C001P015R001A046_rgb', 'backPain/S007C003P026R001A046_rgb', 'backPain/S011C002P016R002A046_rgb', 'backPain/S017C002P015R002A046_rgb', 'backPain/S017C001P009R001A046_rgb', 'backPain/S003C003P017R002A046_rgb', 'backPain/S014C001P015R001A046_rgb', 'backPain/S016C003P019R002A046_rgb', 'backPain/S012C001P028R001A046_rgb', 'backPain/S010C001P008R001A046_rgb', 'backPain/S005C003P016R001A046_rgb', 'backPain/S017C001P003R001A046_rgb', 'backPain/S005C003P018R001A046_rgb', 'backPain/S016C001P040R001A046_rgb', 'backPain/S009C001P007R002A046_rgb', 'backPain/S008C001P001R001A046_rgb', 'backPain/S010C001P019R001A046_rgb', 'backPain/S008C003P036R002A046_rgb', 'backPain/S012C002P017R002A046_rgb', 'backPain/S011C003P008R002A046_rgb', 'backPain/S008C002P032R002A046_rgb', 'backPain/S012C001P019R001A046_rgb', 'backPain/S014C002P019R001A046_rgb', 'backPain/S009C002P008R002A046_rgb', 'backPain/S006C001P016R002A046_rgb', 'backPain/S008C003P032R002A046_rgb', 'backPain/S003C001P015R002A046_rgb', 'backPain/S005C001P004R002A046_rgb', 'backPain/S002C002P014R001A046_rgb', 'backPain/S003C001P002R002A046_rgb', 'backPain/S002C002P012R002A046_rgb', 'backPain/S003C001P016R002A046_rgb', 'backPain/S012C002P025R001A046_rgb', 'backPain/S004C001P007R002A046_rgb', 'backPain/S006C002P007R001A046_rgb', 'backPain/S006C002P017R002A046_rgb', 'backPain/S011C002P038R001A046_rgb', 'backPain/S008C003P032R001A046_rgb', 'backPain/S007C003P018R001A046_rgb', 'backPain/S008C003P008R001A046_rgb', 'backPain/S008C001P034R001A046_rgb', 'backPain/S008C002P035R001A046_rgb', 'backPain/S012C002P007R001A046_rgb', 'backPain/S012C001P008R002A046_rgb', 'backPain/S014C001P007R002A046_rgb', 'backPain/S008C003P001R002A046_rgb', 'backPain/S010C003P017R002A046_rgb', 'backPain/S017C001P009R002A046_rgb', 'backPain/S006C002P023R002A046_rgb', 'backPain/S016C002P008R001A046_rgb', 'backPain/S011C001P038R002A046_rgb', 'backPain/S006C001P022R002A046_rgb', 'backPain/S008C003P029R002A046_rgb', 'backPain/S009C001P016R001A046_rgb', 'backPain/S012C003P037R002A046_rgb', 'backPain/S012C003P015R002A046_rgb', 'backPain/S007C001P017R002A046_rgb', 'backPain/S005C001P010R001A046_rgb', 
'backPain/S015C001P025R001A046_rgb', 'backPain/S013C001P025R001A046_rgb', 'backPain/S011C003P015R002A046_rgb', 'backPain/S016C002P025R001A046_rgb', 'backPain/S001C003P002R001A046_rgb', 'backPain/S006C003P007R001A046_rgb', 'backPain/S007C003P015R002A046_rgb', 'backPain/S002C002P012R001A046_rgb', 'backPain/S002C002P011R002A046_rgb', 'backPain/S014C001P037R001A046_rgb', 'backPain/S006C001P023R002A046_rgb', 'backPain/S009C001P017R001A046_rgb', 'backPain/S016C002P007R001A046_rgb', 'backPain/S013C001P028R002A046_rgb', 'backPain/S014C002P008R001A046_rgb', 'backPain/S012C003P017R001A046_rgb', 'backPain/S012C002P018R002A046_rgb', 'backPain/S007C002P025R002A046_rgb', 'backPain/S003C002P018R001A046_rgb', 'backPain/S014C003P019R001A046_rgb', 'backPain/S002C002P007R002A046_rgb', 'backPain/S006C002P016R002A046_rgb', 'backPain/S004C002P007R001A046_rgb', 'backPain/S013C003P018R002A046_rgb', 'backPain/S007C002P001R002A046_rgb', 'backPain/S015C001P015R002A046_rgb', 'backPain/S011C003P027R002A046_rgb', 'backPain/S008C002P001R001A046_rgb', 'backPain/S003C002P019R002A046_rgb', 'backPain/S012C003P027R002A046_rgb', 'backPain/S015C001P037R001A046_rgb', 'backPain/S004C001P020R002A046_rgb', 'backPain/S008C001P033R002A046_rgb', 'backPain/S008C001P034R002A046_rgb', 'backPain/S012C002P017R001A046_rgb', 'backPain/S001C002P005R001A046_rgb', 'backPain/S017C001P007R002A046_rgb', 'backPain/S015C002P007R002A046_rgb', 'backPain/S017C003P020R001A046_rgb', 'backPain/S006C003P015R001A046_rgb', 'backPain/S013C001P028R001A046_rgb', 'backPain/S016C001P025R001A046_rgb', 'backPain/S013C002P018R001A046_rgb', 'backPain/S009C002P015R001A046_rgb', 'backPain/S001C003P008R001A046_rgb', 'backPain/S008C001P031R002A046_rgb', 'backPain/S009C002P008R001A046_rgb', 'backPain/S003C001P015R001A046_rgb', 'backPain/S009C002P017R001A046_rgb', 'backPain/S001C001P001R001A046_rgb', 'backPain/S007C001P019R001A046_rgb', 'backPain/S013C002P027R001A046_rgb', 'backPain/S010C002P015R001A046_rgb', 'backPain/S002C003P013R001A046_rgb', 'backPain/S002C003P003R001A046_rgb', 'backPain/S017C002P009R001A046_rgb', 'backPain/S016C003P039R001A046_rgb', 'backPain/S004C002P007R002A046_rgb', 'backPain/S008C003P031R002A046_rgb', 'backPain/S001C003P002R002A046_rgb', 'backPain/S011C001P018R001A046_rgb', 'backPain/S005C002P013R001A046_rgb', 'backPain/S016C001P007R001A046_rgb', 'backPain/S014C001P037R002A046_rgb', 'backPain/S014C001P027R001A046_rgb', 'backPain/S009C003P019R001A046_rgb', 'backPain/S016C003P040R001A046_rgb', 'backPain/S016C002P040R001A046_rgb', 'backPain/S016C001P019R002A046_rgb', 'backPain/S015C001P016R001A046_rgb', 'backPain/S006C003P016R002A046_rgb', 'backPain/S011C001P017R002A046_rgb', 'backPain/S007C002P001R001A046_rgb', 'backPain/S011C002P002R002A046_rgb', 'backPain/S012C001P018R001A046_rgb', 'backPain/S003C002P019R001A046_rgb', 'backPain/S017C002P020R002A046_rgb', 'backPain/S009C002P016R001A046_rgb', 'backPain/S014C002P007R002A046_rgb', 'backPain/S015C001P008R001A046_rgb', 'backPain/S007C003P017R002A046_rgb', 'backPain/S016C001P040R002A046_rgb', 'backPain/S003C001P017R002A046_rgb', 'backPain/S010C002P017R002A046_rgb', 'backPain/S003C002P007R001A046_rgb', 'backPain/S017C002P017R002A046_rgb', 'backPain/S014C002P025R002A046_rgb', 'backPain/S008C003P019R002A046_rgb', 'backPain/S007C001P008R002A046_rgb', 'backPain/S007C001P017R001A046_rgb', 'backPain/S001C001P003R001A046_rgb', 'backPain/S010C003P007R002A046_rgb', 'backPain/S011C001P028R001A046_rgb', 'backPain/S015C002P017R002A046_rgb', 'backPain/S006C003P022R002A046_rgb', 'backPain/S001C001P007R001A046_rgb', 
'backPain/S011C002P002R001A046_rgb', 'backPain/S013C002P028R002A046_rgb', 'backPain/S016C003P007R001A046_rgb', 'backPain/S014C003P039R002A046_rgb', 'backPain/S014C003P015R002A046_rgb', 'backPain/S005C001P018R002A046_rgb', 'backPain/S013C002P008R002A046_rgb', 'backPain/S004C003P003R002A046_rgb', 'backPain/S008C001P025R001A046_rgb', 'backPain/S010C001P013R002A046_rgb', 'backPain/S006C003P022R001A046_rgb', 'backPain/S015C001P017R001A046_rgb', 'backPain/S011C003P008R001A046_rgb', 'backPain/S008C001P025R002A046_rgb', 'backPain/S003C003P017R001A046_rgb', 'backPain/S013C003P016R001A046_rgb', 'backPain/S013C002P018R002A046_rgb', 'backPain/S012C001P025R001A046_rgb', 'backPain/S007C003P016R002A046_rgb', 'backPain/S001C001P005R001A046_rgb', 'backPain/S001C002P003R001A046_rgb', 'backPain/S017C002P009R002A046_rgb', 'backPain/S002C002P007R001A046_rgb', 'backPain/S011C001P027R002A046_rgb', 'backPain/S015C002P015R001A046_rgb', 'backPain/S011C003P025R001A046_rgb', 'backPain/S003C001P001R001A046_rgb', 'backPain/S017C003P003R002A046_rgb', 'backPain/S011C001P038R001A046_rgb', 'backPain/S009C003P008R002A046_rgb', 'backPain/S009C001P025R001A046_rgb', 'backPain/S005C001P017R002A046_rgb', 'backPain/S017C003P016R001A046_rgb', 'backPain/S014C003P007R002A046_rgb', 'backPain/S001C001P006R001A046_rgb', 'backPain/S008C003P029R001A046_rgb', 'backPain/S014C002P037R002A046_rgb', 'backPain/S011C003P018R001A046_rgb', 'backPain/S008C001P033R001A046_rgb', 'backPain/S007C001P016R002A046_rgb', 'backPain/S003C002P008R001A046_rgb', 'backPain/S007C002P016R001A046_rgb', 'backPain/S007C002P019R002A046_rgb', 'backPain/S009C001P008R001A046_rgb', 'backPain/S002C003P009R001A046_rgb', 'backPain/S009C003P008R001A046_rgb', 'backPain/S005C002P010R001A046_rgb', 'backPain/S011C001P007R002A046_rgb', 'backPain/S015C001P017R002A046_rgb', 'backPain/S007C003P015R001A046_rgb', 'backPain/S012C001P028R002A046_rgb', 'backPain/S013C001P016R001A046_rgb', 'backPain/S008C002P036R001A046_rgb', 'backPain/S005C001P004R001A046_rgb', 'backPain/S006C002P022R002A046_rgb', 'backPain/S011C003P018R002A046_rgb', 'backPain/S006C001P022R001A046_rgb', 'backPain/S002C001P013R002A046_rgb', 'backPain/S013C003P018R001A046_rgb', 'backPain/S002C003P010R001A046_rgb', 'backPain/S008C002P029R001A046_rgb', 'backPain/S007C001P007R002A046_rgb', 'backPain/S006C003P007R002A046_rgb', 'backPain/S011C002P028R002A046_rgb', 'backPain/S014C003P015R001A046_rgb', 'backPain/S017C003P007R001A046_rgb', 'backPain/S002C001P012R001A046_rgb', 'backPain/S010C002P017R001A046_rgb', 'backPain/S014C002P015R002A046_rgb', 'backPain/S006C001P008R001A046_rgb', 'backPain/S003C002P016R001A046_rgb', 'backPain/S013C002P016R002A046_rgb', 'backPain/S008C003P015R001A046_rgb', 'backPain/S014C001P025R002A046_rgb', 'backPain/S008C003P035R001A046_rgb', 'backPain/S002C001P011R001A046_rgb', 'backPain/S014C002P027R001A046_rgb', 'backPain/S004C001P003R001A046_rgb', 'backPain/S002C002P011R001A046_rgb', 'backPain/S015C002P016R001A046_rgb', 'backPain/S014C003P025R002A046_rgb', 'backPain/S008C002P007R002A046_rgb', 'backPain/S010C003P018R001A046_rgb', 'backPain/S015C002P015R002A046_rgb', 'backPain/S008C001P036R002A046_rgb', 'backPain/S006C001P001R001A046_rgb', 'backPain/S007C003P018R002A046_rgb', 'backPain/S008C002P035R002A046_rgb', 'backPain/S007C002P016R002A046_rgb', 'backPain/S002C003P011R002A046_rgb', 'backPain/S005C002P021R001A046_rgb', 'backPain/S013C003P027R002A046_rgb', 'backPain/S012C001P007R002A046_rgb', 'backPain/S011C001P018R002A046_rgb', 'backPain/S010C003P008R001A046_rgb', 'backPain/S007C003P019R001A046_rgb', 
'backPain/S007C003P028R001A046_rgb', 'backPain/S002C003P014R002A046_rgb', 'backPain/S012C003P018R002A046_rgb', 'backPain/S007C001P018R002A046_rgb', 'backPain/S001C002P006R001A046_rgb', 'backPain/S002C003P008R002A046_rgb', 'backPain/S011C001P028R002A046_rgb', 'backPain/S011C001P007R001A046_rgb', 'backPain/S007C001P025R001A046_rgb', 'backPain/S016C001P025R002A046_rgb', 'backPain/S016C002P021R002A046_rgb', 'backPain/S011C003P002R001A046_rgb', 'backPain/S008C003P015R002A046_rgb', 'backPain/S007C002P017R002A046_rgb', 'backPain/S007C001P001R002A046_rgb', 'backPain/S002C001P010R001A046_rgb', 'backPain/S006C003P023R002A046_rgb', 'backPain/S007C002P007R002A046_rgb', 'backPain/S002C001P007R002A046_rgb', 'backPain/S015C001P025R002A046_rgb', 'backPain/S012C003P007R001A046_rgb', 'backPain/S003C001P001R002A046_rgb', 'backPain/S014C001P008R001A046_rgb', 'backPain/S006C001P007R002A046_rgb', 'backPain/S010C001P013R001A046_rgb', 'backPain/S013C003P008R001A046_rgb', 'backPain/S003C002P002R002A046_rgb', 'backPain/S015C001P019R001A046_rgb', 'backPain/S013C001P018R002A046_rgb', 'backPain/S011C001P016R001A046_rgb', 'backPain/S010C001P016R002A046_rgb', 'backPain/S017C003P017R001A046_rgb', 'backPain/S010C002P019R001A046_rgb', 'backPain/S014C003P008R001A046_rgb', 'backPain/S012C002P015R002A046_rgb', 'backPain/S014C001P027R002A046_rgb', 'backPain/S017C001P007R001A046_rgb', 'backPain/S008C002P030R001A046_rgb', 'backPain/S009C001P016R002A046_rgb', 'backPain/S014C001P025R001A046_rgb', 'backPain/S010C001P025R001A046_rgb', 'backPain/S009C001P017R002A046_rgb', 'backPain/S001C003P004R002A046_rgb', 'backPain/S013C001P007R002A046_rgb', 'backPain/S011C002P018R002A046_rgb', 'backPain/S015C003P007R002A046_rgb', 'backPain/S010C002P007R002A046_rgb', 'backPain/S011C001P027R001A046_rgb', 'backPain/S017C003P008R002A046_rgb', 'backPain/S008C002P001R002A046_rgb', 'backPain/S015C002P019R002A046_rgb', 'backPain/S007C001P001R001A046_rgb', 'backPain/S010C003P017R001A046_rgb', 'backPain/S006C002P019R001A046_rgb', 'backPain/S005C003P021R001A046_rgb', 'backPain/S011C001P002R001A046_rgb', 'backPain/S015C003P015R002A046_rgb', 'backPain/S011C002P019R002A046_rgb', 'backPain/S008C002P031R002A046_rgb', 'backPain/S006C003P017R001A046_rgb', 'backPain/S011C001P015R002A046_rgb', 'backPain/S006C001P008R002A046_rgb', 'backPain/S010C001P021R002A046_rgb', 'backPain/S017C003P015R001A046_rgb', 'backPain/S009C001P015R001A046_rgb', 'backPain/S005C002P004R002A046_rgb', 'backPain/S006C003P008R001A046_rgb', 'backPain/S002C002P010R001A046_rgb', 'backPain/S001C001P001R002A046_rgb', 'backPain/S017C001P008R002A046_rgb', 'backPain/S013C002P015R001A046_rgb', 'backPain/S001C003P007R001A046_rgb', 'backPain/S007C001P028R002A046_rgb', 'backPain/S017C001P016R002A046_rgb', 'backPain/S009C003P016R002A046_rgb', 'backPain/S014C001P019R002A046_rgb', 'backPain/S003C003P018R002A046_rgb', 'backPain/S001C001P007R002A046_rgb', 'backPain/S013C001P037R002A046_rgb', 'backPain/S014C003P007R001A046_rgb', 'backPain/S003C001P018R001A046_rgb', 'backPain/S008C001P035R001A046_rgb', 'backPain/S017C003P008R001A046_rgb', 'backPain/S001C002P004R002A046_rgb', 'backPain/S010C003P021R002A046_rgb', 'backPain/S005C002P015R002A046_rgb', 'backPain/S011C001P008R001A046_rgb', 'backPain/S013C001P015R001A046_rgb', 'backPain/S008C002P033R001A046_rgb', 'backPain/S008C001P015R001A046_rgb', 'backPain/S010C001P016R001A046_rgb', 'backPain/S008C003P036R001A046_rgb', 'backPain/S007C001P018R001A046_rgb', 'backPain/S008C003P034R002A046_rgb', 'backPain/S015C003P008R002A046_rgb', 'backPain/S014C003P039R001A046_rgb', 
'backPain/S013C001P015R002A046_rgb', 'backPain/S001C002P008R002A046_rgb', 'backPain/S014C001P039R002A046_rgb', 'backPain/S007C003P007R002A046_rgb', 'backPain/S013C001P017R002A046_rgb', 'backPain/S009C002P007R001A046_rgb', 'backPain/S003C001P007R001A046_rgb', 'backPain/S005C001P016R002A046_rgb', 'backPain/S015C003P025R002A046_rgb', 'backPain/S005C002P018R001A046_rgb', 'backPain/S005C001P010R002A046_rgb', 'backPain/S001C002P008R001A046_rgb', 'backPain/S010C001P019R002A046_rgb', 'backPain/S013C002P037R001A046_rgb', 'backPain/S008C002P032R001A046_rgb', 'backPain/S013C003P028R001A046_rgb', 'backPain/S003C003P019R002A046_rgb', 'backPain/S003C001P018R002A046_rgb', 'backPain/S015C002P037R002A046_rgb', 'backPain/S003C003P008R001A046_rgb', 'backPain/S016C003P025R001A046_rgb', 'backPain/S006C001P001R002A046_rgb', 'backPain/S008C002P031R001A046_rgb', 'backPain/S009C001P019R002A046_rgb', 'backPain/S010C002P013R001A046_rgb', 'backPain/S012C003P019R002A046_rgb', 'backPain/S006C001P015R002A046_rgb', 'backPain/S009C003P007R001A046_rgb', 'backPain/S012C002P027R001A046_rgb', 'backPain/S016C003P021R002A046_rgb', 'backPain/S006C003P015R002A046_rgb', 'backPain/S007C003P008R002A046_rgb', 'backPain/S007C003P001R002A046_rgb', 'backPain/S014C001P017R002A046_rgb', 'backPain/S016C002P025R002A046_rgb', 'backPain/S006C003P023R001A046_rgb', 'backPain/S003C002P008R002A046_rgb', 'backPain/S012C002P025R002A046_rgb', 'backPain/S001C002P002R001A046_rgb', 'backPain/S012C003P016R001A046_rgb', 'backPain/S017C003P015R002A046_rgb', 'backPain/S003C001P008R001A046_rgb', 'backPain/S001C001P004R001A046_rgb', 'backPain/S010C002P013R002A046_rgb', 'backPain/S017C002P007R001A046_rgb', 'backPain/S014C002P015R001A046_rgb', 'backPain/S016C002P021R001A046_rgb', 'backPain/S007C001P008R001A046_rgb', 'backPain/S002C002P003R002A046_rgb', 'backPain/S012C001P019R002A046_rgb', 'backPain/S017C001P016R001A046_rgb', 'backPain/S002C003P007R001A046_rgb', 'backPain/S002C001P007R001A046_rgb', 'backPain/S010C003P008R002A046_rgb', 'backPain/S013C003P007R002A046_rgb', 'backPain/S012C001P025R002A046_rgb', 'backPain/S010C002P025R002A046_rgb', 'backPain/S013C003P019R001A046_rgb', 'backPain/S003C001P017R001A046_rgb', 'backPain/S010C001P021R001A046_rgb', 'backPain/S013C002P017R002A046_rgb', 'backPain/S010C001P007R002A046_rgb', 'backPain/S013C003P017R001A046_rgb', 'backPain/S012C003P015R001A046_rgb', 'backPain/S005C001P017R001A046_rgb', 'backPain/S011C001P019R002A046_rgb', 'backPain/S011C003P028R002A046_rgb', 'backPain/S008C003P033R002A046_rgb', 'backPain/S012C002P027R002A046_rgb', 'backPain/S015C003P015R001A046_rgb', 'backPain/S016C003P021R001A046_rgb', 'backPain/S012C003P017R002A046_rgb', 'backPain/S003C003P001R001A046_rgb', 'backPain/S002C001P013R001A046_rgb', 'backPain/S012C003P018R001A046_rgb', 'backPain/S017C003P016R002A046_rgb', 'backPain/S015C002P025R002A046_rgb', 'backPain/S010C001P025R002A046_rgb', 'backPain/S001C003P001R002A046_rgb', 'backPain/S005C003P013R001A046_rgb', 'backPain/S007C002P007R001A046_rgb', 'backPain/S004C001P008R001A046_rgb', 'backPain/S010C003P019R002A046_rgb', 'backPain/S008C002P034R002A046_rgb', 'backPain/S007C002P025R001A046_rgb', 'backPain/S011C001P016R002A046_rgb', 'backPain/S008C003P030R002A046_rgb', 'backPain/S008C003P007R002A046_rgb', 'backPain/S008C001P029R001A046_rgb', 'backPain/S006C003P017R002A046_rgb', 'backPain/S008C002P025R002A046_rgb', 'backPain/S006C002P024R001A046_rgb', 'backPain/S010C002P007R001A046_rgb', 'backPain/S008C003P008R002A046_rgb', 'backPain/S004C002P008R001A046_rgb', 'backPain/S014C003P008R002A046_rgb', 
'backPain/S017C002P003R002A046_rgb', 'backPain/S012C003P016R002A046_rgb', 'backPain/S008C002P033R002A046_rgb', 'backPain/S017C001P017R002A046_rgb', 'backPain/S006C001P015R001A046_rgb', 'backPain/S017C001P020R002A046_rgb', 'backPain/S014C001P015R002A046_rgb', 'backPain/S004C001P020R001A046_rgb', 'backPain/S003C001P002R001A046_rgb', 'backPain/S015C002P008R001A046_rgb', 'backPain/S002C002P003R001A046_rgb', 'backPain/S009C001P025R002A046_rgb', 'backPain/S002C002P008R002A046_rgb', 'backPain/S010C002P016R001A046_rgb', 'backPain/S016C001P039R002A046_rgb', 'backPain/S004C003P008R001A046_rgb', 'backPain/S015C003P016R001A046_rgb', 'backPain/S007C002P018R001A046_rgb', 'backPain/S011C002P019R001A046_rgb', 'backPain/S016C002P019R001A046_rgb', 'backPain/S013C001P027R001A046_rgb', 'backPain/S014C002P039R001A046_rgb', 'backPain/S003C002P001R002A046_rgb', 'backPain/S012C002P037R002A046_rgb', 'backPain/S017C002P003R001A046_rgb', 'backPain/S009C003P015R001A046_rgb', 'backPain/S013C003P016R002A046_rgb', 'backPain/S004C001P008R002A046_rgb', 'backPain/S010C002P019R002A046_rgb', 'backPain/S007C002P008R001A046_rgb', 'backPain/S015C003P016R002A046_rgb', 'backPain/S007C002P015R002A046_rgb', 'backPain/S008C001P035R002A046_rgb', 'backPain/S001C002P005R002A046_rgb', 'backPain/S001C002P001R002A046_rgb', 'backPain/S011C002P028R001A046_rgb', 'backPain/S009C002P025R002A046_rgb', 'backPain/S007C002P028R001A046_rgb', 'backPain/S009C002P025R001A046_rgb', 'backPain/S011C001P001R001A046_rgb', 'backPain/S012C001P027R001A046_rgb', 'backPain/S017C001P008R001A046_rgb', 'backPain/S002C001P011R002A046_rgb', 'backPain/S010C003P015R002A046_rgb', 'backPain/S014C003P037R001A046_rgb', 'backPain/S008C003P034R001A046_rgb', 'backPain/S002C003P007R002A046_rgb', 'backPain/S011C002P038R002A046_rgb', 'backPain/S008C002P015R001A046_rgb', 'backPain/S013C001P018R001A046_rgb', 'backPain/S012C002P028R001A046_rgb', 'backPain/S016C002P007R002A046_rgb', 'backPain/S017C003P009R002A046_rgb', 'backPain/S012C002P008R001A046_rgb', 'backPain/S013C002P027R002A046_rgb', 'backPain/S015C002P025R001A046_rgb', 'backPain/S006C001P007R001A046_rgb', 'backPain/S004C003P020R002A046_rgb', 'backPain/S005C003P004R002A046_rgb', 'backPain/S002C002P009R001A046_rgb', 'backPain/S003C002P017R001A046_rgb', 'backPain/S015C001P007R001A046_rgb', 'backPain/S015C003P017R002A046_rgb', 'backPain/S012C002P007R002A046_rgb', 'backPain/S008C002P019R001A046_rgb', 'backPain/S005C003P010R002A046_rgb', 'backPain/S001C001P002R001A046_rgb', 'backPain/S010C003P016R001A046_rgb', 'backPain/S011C001P001R002A046_rgb', 'backPain/S009C001P015R002A046_rgb', 'backPain/S010C001P018R001A046_rgb', 'backPain/S011C001P025R002A046_rgb', 'backPain/S003C001P008R002A046_rgb', 'backPain/S008C001P008R002A046_rgb', 'backPain/S010C002P025R001A046_rgb', 'backPain/S013C002P007R001A046_rgb', 'backPain/S009C003P019R002A046_rgb', 'backPain/S010C002P018R002A046_rgb', 'backPain/S006C002P008R002A046_rgb', 'backPain/S006C001P019R002A046_rgb', 'backPain/S010C001P017R001A046_rgb', 'backPain/S011C002P017R002A046_rgb', 'backPain/S009C003P015R002A046_rgb', 'backPain/S015C001P007R002A046_rgb', 'backPain/S011C001P015R001A046_rgb', 'backPain/S017C001P003R002A046_rgb', 'backPain/S011C003P025R002A046_rgb', 'backPain/S016C001P019R001A046_rgb', 'backPain/S005C003P015R001A046_rgb', 'backPain/S002C003P013R002A046_rgb', 'backPain/S011C002P016R001A046_rgb', 'backPain/S005C001P021R002A046_rgb', 'backPain/S006C001P017R001A046_rgb', 'backPain/S003C003P019R001A046_rgb', 'backPain/S013C002P025R002A046_rgb', 'backPain/S011C001P025R001A046_rgb', 
'backPain/S002C001P009R001A046_rgb', 'backPain/S006C001P023R001A046_rgb', 'backPain/S008C001P031R001A046_rgb', 'backPain/S012C001P018R002A046_rgb', 'backPain/S016C002P019R002A046_rgb', 'backPain/S001C001P005R002A046_rgb', 'backPain/S012C003P025R002A046_rgb', 'backPain/S005C002P016R001A046_rgb', 'backPain/S005C002P017R001A046_rgb', 'backPain/S011C003P017R001A046_rgb', 'backPain/S008C001P036R001A046_rgb', 'backPain/S008C002P015R002A046_rgb', 'backPain/S010C003P025R002A046_rgb', 'backPain/S014C003P027R001A046_rgb', 'backPain/S011C003P007R002A046_rgb', 'backPain/S007C001P016R001A046_rgb', 'backPain/S017C003P003R001A046_rgb', 'backPain/S015C003P019R001A046_rgb', 'backPain/S008C002P036R002A046_rgb', 'backPain/S007C002P028R002A046_rgb', 'backPain/S014C002P007R001A046_rgb', 'backPain/S005C002P018R002A046_rgb', 'backPain/S015C001P037R002A046_rgb', 'backPain/S016C001P021R002A046_rgb', 'backPain/S001C003P003R002A046_rgb', 'backPain/S011C003P028R001A046_rgb', 'backPain/S010C001P017R002A046_rgb', 'backPain/S012C002P019R002A046_rgb', 'backPain/S016C001P039R001A046_rgb', 'backPain/S006C003P024R001A046_rgb', 'backPain/S006C001P017R002A046_rgb', 'backPain/S007C001P027R001A046_rgb', 'backPain/S008C003P019R001A046_rgb', 'backPain/S017C002P007R002A046_rgb', 'backPain/S013C002P019R002A046_rgb', 'backPain/S013C002P017R001A046_rgb', 'backPain/S002C001P014R002A046_rgb', 'backPain/S015C003P007R001A046_rgb', 'backPain/S005C001P018R001A046_rgb', 'backPain/S006C003P024R002A046_rgb', 'backPain/S006C001P016R001A046_rgb', 'backPain/S012C001P037R002A046_rgb', 'backPain/S014C003P017R002A046_rgb', 'backPain/S008C002P007R001A046_rgb', 'backPain/S011C002P027R002A046_rgb', 'backPain/S013C002P008R001A046_rgb', 'backPain/S013C003P015R002A046_rgb', 'backPain/S002C001P014R001A046_rgb', 'backPain/S007C001P015R001A046_rgb', 'backPain/S010C001P015R002A046_rgb', 'backPain/S001C003P007R002A046_rgb', 'backPain/S004C002P008R002A046_rgb', 'backPain/S011C001P017R001A046_rgb', 'backPain/S006C002P001R001A046_rgb', 'backPain/S003C002P015R001A046_rgb', 'backPain/S002C002P013R002A046_rgb', 'backPain/S005C001P016R001A046_rgb', 'backPain/S008C001P030R001A046_rgb', 'backPain/S017C003P020R002A046_rgb', 'backPain/S009C002P019R001A046_rgb', 'backPain/S002C003P011R001A046_rgb', 'backPain/S007C001P028R001A046_rgb', 'backPain/S002C002P008R001A046_rgb', 'backPain/S004C002P020R002A046_rgb', 'backPain/S009C001P019R001A046_rgb', 'backPain/S015C002P037R001A046_rgb', 'backPain/S012C003P037R001A046_rgb', 'backPain/S005C001P021R001A046_rgb', 'backPain/S010C002P008R001A046_rgb', 'backPain/S004C002P003R001A046_rgb', 'backPain/S009C003P007R002A046_rgb', 'backPain/S007C002P019R001A046_rgb', 'backPain/S010C001P015R001A046_rgb', 'backPain/S013C002P037R002A046_rgb', 'backPain/S013C002P007R002A046_rgb', 'backPain/S004C002P020R001A046_rgb', 'backPain/S007C003P027R002A046_rgb', 'backPain/S006C002P001R002A046_rgb', 'backPain/S015C001P016R002A046_rgb', 'backPain/S012C002P028R002A046_rgb', 'backPain/S014C003P027R002A046_rgb', 'backPain/S010C003P007R001A046_rgb', 'backPain/S013C001P037R001A046_rgb', 'backPain/S017C002P008R001A046_rgb', 'backPain/S003C003P018R001A046_rgb', 'backPain/S012C002P019R001A046_rgb', 'backPain/S010C002P021R001A046_rgb', 'backPain/S005C002P015R001A046_rgb', 'backPain/S012C003P025R001A046_rgb', 'backPain/S013C001P025R002A046_rgb', 'backPain/S003C003P001R002A046_rgb', 'backPain/S004C003P020R001A046_rgb', 'backPain/S005C003P015R002A046_rgb', 'backPain/S016C002P040R002A046_rgb', 'backPain/S008C002P025R001A046_rgb', 'backPain/S011C002P015R001A046_rgb', 
'backPain/S011C003P027R001A046_rgb', 'backPain/S009C002P016R002A046_rgb', 'backPain/S010C002P015R002A046_rgb', 'backPain/S017C003P009R001A046_rgb', 'backPain/S016C003P025R002A046_rgb', 'backPain/S014C002P017R001A046_rgb', 'backPain/S013C003P037R002A046_rgb', 'backPain/S007C003P008R001A046_rgb', 'backPain/S002C002P010R002A046_rgb', 'backPain/S002C003P010R002A046_rgb', 'backPain/S013C003P017R002A046_rgb', 'backPain/S012C001P016R001A046_rgb', 'backPain/S014C002P025R001A046_rgb', 'backPain/S016C003P008R001A046_rgb', 'backPain/S013C003P028R002A046_rgb', 'backPain/S016C003P007R002A046_rgb', 'backPain/S003C003P007R002A046_rgb', 'backPain/S007C002P008R002A046_rgb', 'backPain/S008C002P034R001A046_rgb', 'backPain/S011C003P001R002A046_rgb', 'backPain/S015C002P007R001A046_rgb', 'backPain/S011C003P019R001A046_rgb', 'backPain/S014C002P008R002A046_rgb', 'backPain/S001C002P007R002A046_rgb', 'backPain/S007C002P026R002A046_rgb', 'backPain/S013C003P015R001A046_rgb', 'backPain/S003C002P001R001A046_rgb', 'backPain/S012C003P028R001A046_rgb', 'backPain/S009C002P017R002A046_rgb', 'backPain/S010C002P021R002A046_rgb', 'backPain/S003C003P016R001A046_rgb', 'backPain/S007C002P027R002A046_rgb', 'backPain/S014C002P039R002A046_rgb', 'backPain/S017C002P016R001A046_rgb', 'backPain/S015C003P019R002A046_rgb', 'backPain/S010C003P021R001A046_rgb', 'backPain/S005C002P016R002A046_rgb', 'backPain/S005C002P004R001A046_rgb', 'backPain/S014C003P017R001A046_rgb', 'backPain/S006C001P019R001A046_rgb', 'backPain/S017C001P015R002A046_rgb', 'backPain/S015C001P019R002A046_rgb', 'backPain/S011C001P019R001A046_rgb', 'backPain/S003C002P007R002A046_rgb', 'backPain/S014C002P019R002A046_rgb', 'backPain/S009C003P016R001A046_rgb', 'backPain/S001C001P002R002A046_rgb', 'backPain/S001C002P003R002A046_rgb', 'backPain/S011C003P002R002A046_rgb', 'backPain/S010C003P015R001A046_rgb', 'neckPain/S010C003P008R001A047_rgb', 'neckPain/S011C001P008R002A047_rgb', 'neckPain/S012C001P016R002A047_rgb', 'neckPain/S002C001P003R001A047_rgb', 'neckPain/S002C003P013R002A047_rgb', 'neckPain/S005C002P018R001A047_rgb', 'neckPain/S007C001P017R001A047_rgb', 'neckPain/S011C002P038R001A047_rgb', 'neckPain/S011C002P007R001A047_rgb', 'neckPain/S005C001P010R002A047_rgb', 'neckPain/S012C003P007R002A047_rgb', 'neckPain/S007C002P016R001A047_rgb', 'neckPain/S011C002P002R001A047_rgb', 'neckPain/S013C002P025R002A047_rgb', 'neckPain/S013C002P018R001A047_rgb', 'neckPain/S016C001P040R002A047_rgb', 'neckPain/S009C001P008R001A047_rgb', 'neckPain/S005C003P021R002A047_rgb', 'neckPain/S006C003P019R001A047_rgb', 'neckPain/S007C003P027R001A047_rgb', 'neckPain/S012C003P016R002A047_rgb', 'neckPain/S006C002P023R002A047_rgb', 'neckPain/S008C003P025R002A047_rgb', 'neckPain/S007C001P007R002A047_rgb', 'neckPain/S003C002P015R002A047_rgb', 'neckPain/S010C001P008R002A047_rgb', 'neckPain/S013C002P028R002A047_rgb', 'neckPain/S004C002P020R002A047_rgb', 'neckPain/S002C003P008R002A047_rgb', 'neckPain/S015C002P025R001A047_rgb', 'neckPain/S014C002P008R002A047_rgb', 'neckPain/S002C001P003R002A047_rgb', 'neckPain/S009C001P017R001A047_rgb', 'neckPain/S003C001P016R002A047_rgb', 'neckPain/S002C001P014R001A047_rgb', 'neckPain/S007C001P028R002A047_rgb', 'neckPain/S007C003P025R001A047_rgb', 'neckPain/S015C002P019R002A047_rgb', 'neckPain/S017C003P020R002A047_rgb', 'neckPain/S005C002P016R002A047_rgb', 'neckPain/S003C002P007R002A047_rgb', 'neckPain/S007C003P025R002A047_rgb', 'neckPain/S004C001P003R001A047_rgb', 'neckPain/S007C003P028R001A047_rgb', 'neckPain/S010C003P021R001A047_rgb', 'neckPain/S016C002P008R001A047_rgb', 
'neckPain/S011C002P038R002A047_rgb', 'neckPain/S015C002P015R001A047_rgb', 'neckPain/S014C001P025R001A047_rgb', 'neckPain/S015C001P007R002A047_rgb', 'neckPain/S012C001P037R001A047_rgb', 'neckPain/S015C003P008R001A047_rgb', 'neckPain/S001C001P007R001A047_rgb', 'neckPain/S012C001P017R001A047_rgb', 'neckPain/S003C002P017R001A047_rgb', 'neckPain/S005C003P013R002A047_rgb', 'neckPain/S016C003P025R002A047_rgb', 'neckPain/S008C001P031R001A047_rgb', 'neckPain/S011C003P016R002A047_rgb', 'neckPain/S014C001P017R001A047_rgb', 'neckPain/S008C001P015R001A047_rgb', 'neckPain/S015C002P016R001A047_rgb', 'neckPain/S006C002P007R002A047_rgb', 'neckPain/S006C001P008R002A047_rgb', 'neckPain/S016C001P025R002A047_rgb', 'neckPain/S014C003P015R002A047_rgb', 'neckPain/S013C001P016R001A047_rgb', 'neckPain/S013C002P007R001A047_rgb', 'neckPain/S006C003P015R002A047_rgb', 'neckPain/S007C001P008R002A047_rgb', 'neckPain/S005C001P017R001A047_rgb', 'neckPain/S008C003P033R001A047_rgb', 'neckPain/S012C003P016R001A047_rgb', 'neckPain/S013C001P008R002A047_rgb', 'neckPain/S003C001P015R001A047_rgb', 'neckPain/S012C001P025R001A047_rgb', 'neckPain/S007C001P001R002A047_rgb', 'neckPain/S013C001P017R002A047_rgb', 'neckPain/S001C002P004R002A047_rgb', 'neckPain/S017C002P007R002A047_rgb', 'neckPain/S010C001P015R001A047_rgb', 'neckPain/S014C002P037R002A047_rgb', 'neckPain/S004C001P020R002A047_rgb', 'neckPain/S008C003P029R001A047_rgb', 'neckPain/S002C003P012R002A047_rgb', 'neckPain/S012C002P037R001A047_rgb', 'neckPain/S012C002P007R001A047_rgb', 'neckPain/S001C001P002R002A047_rgb', 'neckPain/S017C002P003R001A047_rgb', 'neckPain/S005C001P018R002A047_rgb', 'neckPain/S003C001P015R002A047_rgb', 'neckPain/S016C002P007R002A047_rgb', 'neckPain/S011C002P019R002A047_rgb', 'neckPain/S009C002P007R001A047_rgb', 'neckPain/S017C003P016R001A047_rgb', 'neckPain/S017C002P015R002A047_rgb', 'neckPain/S002C002P010R001A047_rgb', 'neckPain/S002C003P011R001A047_rgb', 'neckPain/S007C003P017R002A047_rgb', 'neckPain/S009C001P007R001A047_rgb', 'neckPain/S012C002P016R001A047_rgb', 'neckPain/S006C001P008R001A047_rgb', 'neckPain/S010C003P007R001A047_rgb', 'neckPain/S012C002P017R001A047_rgb', 'neckPain/S008C003P033R002A047_rgb', 'neckPain/S016C001P008R002A047_rgb', 'neckPain/S014C002P039R002A047_rgb', 'neckPain/S012C002P028R002A047_rgb', 'neckPain/S008C001P036R001A047_rgb', 'neckPain/S008C002P015R002A047_rgb', 'neckPain/S005C002P016R001A047_rgb', 'neckPain/S014C002P008R001A047_rgb', 'neckPain/S014C003P027R001A047_rgb', 'neckPain/S011C002P008R002A047_rgb', 'neckPain/S006C001P017R001A047_rgb', 'neckPain/S001C002P004R001A047_rgb', 'neckPain/S016C001P039R001A047_rgb', 'neckPain/S005C002P018R002A047_rgb', 'neckPain/S014C001P037R001A047_rgb', 'neckPain/S007C002P007R002A047_rgb', 'neckPain/S010C001P018R001A047_rgb', 'neckPain/S010C002P015R002A047_rgb', 'neckPain/S007C001P007R001A047_rgb', 'neckPain/S008C003P015R002A047_rgb', 'neckPain/S015C001P025R001A047_rgb', 'neckPain/S003C002P008R002A047_rgb', 'neckPain/S006C001P015R002A047_rgb', 'neckPain/S003C001P018R002A047_rgb', 'neckPain/S017C002P020R001A047_rgb', 'neckPain/S016C001P008R001A047_rgb', 'neckPain/S002C001P011R002A047_rgb', 'neckPain/S001C002P008R001A047_rgb', 'neckPain/S008C001P029R001A047_rgb', 'neckPain/S007C003P001R002A047_rgb', 'neckPain/S001C001P006R001A047_rgb', 'neckPain/S003C001P002R002A047_rgb', 'neckPain/S006C003P017R002A047_rgb', 'neckPain/S012C001P018R001A047_rgb', 'neckPain/S011C001P016R001A047_rgb', 'neckPain/S006C002P017R002A047_rgb', 'neckPain/S011C002P017R001A047_rgb', 'neckPain/S006C001P019R001A047_rgb', 
'neckPain/S012C003P037R002A047_rgb', 'neckPain/S016C002P040R002A047_rgb', 'neckPain/S010C001P025R001A047_rgb', 'neckPain/S015C003P008R002A047_rgb', 'neckPain/S005C001P021R001A047_rgb', 'neckPain/S007C002P027R002A047_rgb', 'neckPain/S003C002P016R001A047_rgb', 'neckPain/S010C003P017R002A047_rgb', 'neckPain/S009C001P007R002A047_rgb', 'neckPain/S006C003P022R002A047_rgb', 'neckPain/S008C002P031R001A047_rgb', 'neckPain/S003C001P017R002A047_rgb', 'neckPain/S003C001P018R001A047_rgb', 'neckPain/S015C002P037R001A047_rgb', 'neckPain/S012C002P027R002A047_rgb', 'neckPain/S004C001P003R002A047_rgb', 'neckPain/S011C003P018R002A047_rgb', 'neckPain/S008C002P034R002A047_rgb', 'neckPain/S012C001P025R002A047_rgb', 'neckPain/S003C003P019R002A047_rgb', 'neckPain/S017C002P015R001A047_rgb', 'neckPain/S017C002P016R001A047_rgb', 'neckPain/S005C001P004R001A047_rgb', 'neckPain/S008C001P033R002A047_rgb', 'neckPain/S005C003P018R001A047_rgb', 'neckPain/S002C003P003R001A047_rgb', 'neckPain/S008C002P001R002A047_rgb', 'neckPain/S012C003P027R001A047_rgb', 'neckPain/S016C003P039R001A047_rgb', 'neckPain/S008C001P019R002A047_rgb', 'neckPain/S007C001P025R002A047_rgb', 'neckPain/S008C002P019R002A047_rgb', 'neckPain/S005C002P010R001A047_rgb', 'neckPain/S003C001P016R001A047_rgb', 'neckPain/S007C003P007R001A047_rgb', 'neckPain/S005C001P021R002A047_rgb', 'neckPain/S015C002P007R002A047_rgb', 'neckPain/S017C001P016R002A047_rgb', 'neckPain/S010C002P013R001A047_rgb', 'neckPain/S013C001P007R001A047_rgb', 'neckPain/S007C001P015R001A047_rgb', 'neckPain/S010C001P017R001A047_rgb', 'neckPain/S002C001P013R002A047_rgb', 'neckPain/S005C001P013R002A047_rgb', 'neckPain/S001C002P001R001A047_rgb', 'neckPain/S011C002P008R001A047_rgb', 'neckPain/S011C003P025R001A047_rgb', 'neckPain/S016C002P039R002A047_rgb', 'neckPain/S013C001P027R001A047_rgb', 'neckPain/S008C002P035R002A047_rgb', 'neckPain/S014C001P007R002A047_rgb', 'neckPain/S006C001P007R002A047_rgb', 'neckPain/S014C003P025R002A047_rgb', 'neckPain/S012C002P025R001A047_rgb', 'neckPain/S001C003P003R001A047_rgb', 'neckPain/S002C003P003R002A047_rgb', 'neckPain/S006C002P008R002A047_rgb', 'neckPain/S002C003P008R001A047_rgb', 'neckPain/S005C002P021R002A047_rgb', 'neckPain/S008C001P029R002A047_rgb', 'neckPain/S009C001P025R002A047_rgb', 'neckPain/S008C003P007R001A047_rgb', 'neckPain/S006C002P023R001A047_rgb', 'neckPain/S017C003P020R001A047_rgb', 'neckPain/S001C003P006R001A047_rgb', 'neckPain/S012C002P027R001A047_rgb', 'neckPain/S017C002P009R002A047_rgb', 'neckPain/S015C002P025R002A047_rgb', 'neckPain/S009C003P015R001A047_rgb', 'neckPain/S001C002P001R002A047_rgb', 'neckPain/S005C003P017R001A047_rgb', 'neckPain/S011C001P015R002A047_rgb', 'neckPain/S013C002P017R002A047_rgb', 'neckPain/S002C002P011R002A047_rgb', 'neckPain/S005C001P010R001A047_rgb', 'neckPain/S014C002P025R001A047_rgb', 'neckPain/S004C002P020R001A047_rgb', 'neckPain/S009C001P015R002A047_rgb', 'neckPain/S008C003P019R001A047_rgb', 'neckPain/S015C001P016R002A047_rgb', 'neckPain/S007C002P028R002A047_rgb', 'neckPain/S011C001P028R001A047_rgb', 'neckPain/S010C001P021R002A047_rgb', 'neckPain/S003C002P001R001A047_rgb', 'neckPain/S011C001P028R002A047_rgb', 'neckPain/S012C002P019R001A047_rgb', 'neckPain/S015C003P016R002A047_rgb', 'neckPain/S006C002P001R002A047_rgb', 'neckPain/S016C003P008R002A047_rgb', 'neckPain/S016C003P019R002A047_rgb', 'neckPain/S005C003P004R002A047_rgb', 'neckPain/S011C001P016R002A047_rgb', 'neckPain/S003C001P007R002A047_rgb', 'neckPain/S012C001P016R001A047_rgb', 'neckPain/S008C002P029R001A047_rgb', 'neckPain/S013C002P016R002A047_rgb', 
'neckPain/S012C003P007R001A047_rgb', 'neckPain/S006C002P019R001A047_rgb', 'neckPain/S012C001P028R001A047_rgb', 'neckPain/S008C003P036R001A047_rgb', 'neckPain/S003C002P015R001A047_rgb', 'neckPain/S015C002P007R001A047_rgb', 'neckPain/S012C002P018R002A047_rgb', 'neckPain/S017C001P009R001A047_rgb', 'neckPain/S002C002P008R001A047_rgb', 'neckPain/S014C003P017R001A047_rgb', 'neckPain/S005C003P004R001A047_rgb', 'neckPain/S006C002P015R001A047_rgb', 'neckPain/S008C002P034R001A047_rgb', 'neckPain/S009C002P015R001A047_rgb', 'neckPain/S014C003P019R001A047_rgb', 'neckPain/S003C003P017R001A047_rgb', 'neckPain/S003C001P008R002A047_rgb', 'neckPain/S009C002P007R002A047_rgb', 'neckPain/S006C001P022R001A047_rgb', 'neckPain/S016C003P019R001A047_rgb', 'neckPain/S012C003P019R002A047_rgb', 'neckPain/S010C003P015R002A047_rgb', 'neckPain/S013C002P018R002A047_rgb', 'neckPain/S004C002P003R002A047_rgb', 'neckPain/S007C002P008R002A047_rgb', 'neckPain/S005C002P004R001A047_rgb', 'neckPain/S006C003P017R001A047_rgb', 'neckPain/S005C002P013R001A047_rgb', 'neckPain/S008C001P030R002A047_rgb', 'neckPain/S013C003P017R001A047_rgb', 'neckPain/S006C002P015R002A047_rgb', 'neckPain/S012C003P017R002A047_rgb', 'neckPain/S011C001P002R001A047_rgb', 'neckPain/S007C002P017R002A047_rgb', 'neckPain/S011C003P018R001A047_rgb', 'neckPain/S010C003P019R001A047_rgb', 'neckPain/S008C002P030R001A047_rgb', 'neckPain/S017C001P020R001A047_rgb', 'neckPain/S010C001P021R001A047_rgb', 'neckPain/S008C002P036R001A047_rgb', 'neckPain/S002C001P012R001A047_rgb', 'neckPain/S007C002P018R002A047_rgb', 'neckPain/S007C003P018R002A047_rgb', 'neckPain/S011C003P038R002A047_rgb', 'neckPain/S013C001P028R001A047_rgb', 'neckPain/S013C001P018R001A047_rgb', 'neckPain/S006C002P016R001A047_rgb', 'neckPain/S012C003P017R001A047_rgb', 'neckPain/S002C002P009R001A047_rgb', 'neckPain/S017C003P003R001A047_rgb', 'neckPain/S017C001P015R002A047_rgb', 'neckPain/S013C001P037R001A047_rgb', 'neckPain/S011C001P025R001A047_rgb', 'neckPain/S014C003P007R001A047_rgb', 'neckPain/S007C003P016R001A047_rgb', 'neckPain/S007C001P015R002A047_rgb', 'neckPain/S008C003P001R001A047_rgb', 'neckPain/S004C003P008R001A047_rgb', 'neckPain/S014C002P015R001A047_rgb', 'neckPain/S011C003P017R001A047_rgb', 'neckPain/S015C001P037R002A047_rgb', 'neckPain/S005C001P015R001A047_rgb', 'neckPain/S010C002P007R001A047_rgb', 'neckPain/S010C002P018R001A047_rgb', 'neckPain/S008C002P019R001A047_rgb', 'neckPain/S005C002P013R002A047_rgb', 'neckPain/S008C001P025R002A047_rgb', 'neckPain/S008C001P034R002A047_rgb', 'neckPain/S010C003P013R001A047_rgb', 'neckPain/S005C003P016R002A047_rgb', 'neckPain/S015C002P037R002A047_rgb', 'neckPain/S017C002P009R001A047_rgb', 'neckPain/S013C002P025R001A047_rgb', 'neckPain/S001C002P007R002A047_rgb', 'neckPain/S009C001P025R001A047_rgb', 'neckPain/S011C001P019R002A047_rgb', 'neckPain/S014C001P015R002A047_rgb', 'neckPain/S006C002P001R001A047_rgb', 'neckPain/S011C003P001R002A047_rgb', 'neckPain/S001C001P003R002A047_rgb', 'neckPain/S003C003P001R002A047_rgb', 'neckPain/S017C003P017R001A047_rgb', 'neckPain/S013C003P016R002A047_rgb', 'neckPain/S011C002P027R001A047_rgb', 'neckPain/S013C001P015R002A047_rgb', 'neckPain/S009C002P019R002A047_rgb', 'neckPain/S015C003P019R001A047_rgb', 'neckPain/S003C002P002R002A047_rgb', 'neckPain/S011C002P025R001A047_rgb', 'neckPain/S008C003P019R002A047_rgb', 'neckPain/S001C001P004R002A047_rgb', 'neckPain/S014C001P015R001A047_rgb', 'neckPain/S010C002P021R001A047_rgb', 'neckPain/S017C002P008R002A047_rgb', 'neckPain/S016C003P039R002A047_rgb', 'neckPain/S001C003P001R002A047_rgb', 
'neckPain/S001C003P006R002A047_rgb', 'neckPain/S010C001P018R002A047_rgb', 'neckPain/S007C003P015R001A047_rgb', 'neckPain/S016C003P040R001A047_rgb', 'neckPain/S001C003P005R001A047_rgb', 'neckPain/S005C001P017R002A047_rgb', 'neckPain/S008C003P015R001A047_rgb', 'neckPain/S002C002P008R002A047_rgb', 'neckPain/S002C003P014R001A047_rgb', 'neckPain/S002C002P013R001A047_rgb', 'neckPain/S011C002P015R001A047_rgb', 'neckPain/S015C002P017R002A047_rgb', 'neckPain/S002C002P014R001A047_rgb', 'neckPain/S008C001P008R001A047_rgb', 'neckPain/S007C001P016R001A047_rgb', 'neckPain/S017C002P017R002A047_rgb', 'neckPain/S001C003P007R001A047_rgb', 'neckPain/S013C001P019R002A047_rgb', 'neckPain/S010C001P016R001A047_rgb', 'neckPain/S001C001P006R002A047_rgb', 'neckPain/S008C003P007R002A047_rgb', 'neckPain/S008C001P035R001A047_rgb', 'neckPain/S002C003P009R002A047_rgb', 'neckPain/S007C002P019R002A047_rgb', 'neckPain/S010C002P013R002A047_rgb', 'neckPain/S014C001P039R001A047_rgb', 'neckPain/S011C001P002R002A047_rgb', 'neckPain/S008C003P034R002A047_rgb', 'neckPain/S002C003P010R001A047_rgb', 'neckPain/S017C002P017R001A047_rgb', 'neckPain/S011C001P001R002A047_rgb', 'neckPain/S003C003P008R001A047_rgb', 'neckPain/S007C002P025R002A047_rgb', 'neckPain/S010C001P016R002A047_rgb', 'neckPain/S008C002P001R001A047_rgb', 'neckPain/S012C003P015R001A047_rgb', 'neckPain/S001C002P007R001A047_rgb', 'neckPain/S015C001P008R001A047_rgb', 'neckPain/S006C003P001R002A047_rgb', 'neckPain/S014C003P037R001A047_rgb', 'neckPain/S008C002P033R002A047_rgb', 'neckPain/S013C002P037R001A047_rgb', 'neckPain/S017C002P007R001A047_rgb', 'neckPain/S003C003P015R002A047_rgb', 'neckPain/S006C003P024R001A047_rgb', 'neckPain/S006C001P001R001A047_rgb', 'neckPain/S010C002P016R001A047_rgb', 'neckPain/S012C001P027R002A047_rgb', 'neckPain/S013C002P008R002A047_rgb', 'neckPain/S016C003P021R002A047_rgb', 'neckPain/S005C003P018R002A047_rgb', 'neckPain/S012C002P025R002A047_rgb', 'neckPain/S011C002P017R002A047_rgb', 'neckPain/S017C001P017R002A047_rgb', 'neckPain/S012C002P028R001A047_rgb', 'neckPain/S006C001P024R002A047_rgb', 'neckPain/S007C002P001R001A047_rgb', 'neckPain/S013C001P007R002A047_rgb', 'neckPain/S007C003P015R002A047_rgb', 'neckPain/S007C001P018R001A047_rgb', 'neckPain/S017C002P008R001A047_rgb', 'neckPain/S006C002P024R001A047_rgb', 'neckPain/S001C003P007R002A047_rgb', 'neckPain/S004C001P008R002A047_rgb', 'neckPain/S001C002P003R001A047_rgb', 'neckPain/S003C002P008R001A047_rgb', 'neckPain/S012C002P017R002A047_rgb', 'neckPain/S002C002P012R002A047_rgb', 'neckPain/S011C001P008R001A047_rgb', 'neckPain/S005C003P017R002A047_rgb', 'neckPain/S016C001P021R001A047_rgb', 'neckPain/S010C002P008R001A047_rgb', 'neckPain/S005C003P010R001A047_rgb', 'neckPain/S014C003P039R001A047_rgb', 'neckPain/S002C003P007R001A047_rgb', 'neckPain/S010C003P016R001A047_rgb', 'neckPain/S007C002P008R001A047_rgb', 'neckPain/S008C002P029R002A047_rgb', 'neckPain/S007C003P027R002A047_rgb', 'neckPain/S006C001P015R001A047_rgb', 'neckPain/S013C001P008R001A047_rgb', 'neckPain/S003C003P018R002A047_rgb', 'neckPain/S011C001P007R002A047_rgb', 'neckPain/S010C003P007R002A047_rgb', 'neckPain/S002C002P014R002A047_rgb', 'neckPain/S016C001P021R002A047_rgb', 'neckPain/S006C003P015R001A047_rgb', 'neckPain/S010C003P018R002A047_rgb', 'neckPain/S013C002P007R002A047_rgb', 'neckPain/S006C001P001R002A047_rgb', 'neckPain/S013C003P028R001A047_rgb', 'neckPain/S013C003P025R001A047_rgb', 'neckPain/S008C002P032R002A047_rgb', 'neckPain/S002C003P014R002A047_rgb', 'neckPain/S007C002P018R001A047_rgb', 'neckPain/S002C003P012R001A047_rgb', 
'neckPain/S004C003P003R001A047_rgb', 'neckPain/S006C003P008R002A047_rgb', 'neckPain/S007C001P027R002A047_rgb', 'neckPain/S013C002P016R001A047_rgb', 'neckPain/S016C003P025R001A047_rgb', 'neckPain/S011C003P038R001A047_rgb', 'neckPain/S016C001P040R001A047_rgb', 'neckPain/S012C001P017R002A047_rgb', 'neckPain/S006C002P007R001A047_rgb', 'neckPain/S015C001P019R001A047_rgb', 'neckPain/S001C001P004R001A047_rgb', 'neckPain/S014C003P008R002A047_rgb', 'neckPain/S011C003P015R002A047_rgb', 'neckPain/S014C003P039R002A047_rgb', 'neckPain/S008C001P025R001A047_rgb', 'neckPain/S012C003P037R001A047_rgb', 'neckPain/S005C002P017R002A047_rgb', 'neckPain/S013C002P027R001A047_rgb', 'neckPain/S007C003P017R001A047_rgb', 'neckPain/S014C001P008R002A047_rgb', 'neckPain/S002C001P009R001A047_rgb', 'neckPain/S017C003P016R002A047_rgb', 'neckPain/S014C003P007R002A047_rgb', 'neckPain/S002C002P003R002A047_rgb', 'neckPain/S010C002P017R001A047_rgb', 'neckPain/S007C003P018R001A047_rgb', 'neckPain/S015C001P037R001A047_rgb', 'neckPain/S011C002P007R002A047_rgb', 'neckPain/S014C001P027R001A047_rgb', 'neckPain/S010C003P016R002A047_rgb', 'neckPain/S001C001P001R002A047_rgb', 'neckPain/S001C003P002R001A047_rgb', 'neckPain/S013C003P037R002A047_rgb', 'neckPain/S017C001P015R001A047_rgb', 'neckPain/S004C003P007R001A047_rgb', 'neckPain/S001C001P007R002A047_rgb', 'neckPain/S013C001P028R002A047_rgb', 'neckPain/S009C001P017R002A047_rgb', 'neckPain/S008C001P007R001A047_rgb', 'neckPain/S010C001P019R002A047_rgb', 'neckPain/S006C002P016R002A047_rgb', 'neckPain/S005C003P021R001A047_rgb', 'neckPain/S008C003P035R001A047_rgb', 'neckPain/S008C003P008R002A047_rgb', 'neckPain/S009C003P019R001A047_rgb', 'neckPain/S013C002P037R002A047_rgb', 'neckPain/S016C002P025R001A047_rgb', 'neckPain/S004C003P020R001A047_rgb', 'neckPain/S008C003P025R001A047_rgb', 'neckPain/S009C003P019R002A047_rgb', 'neckPain/S012C001P015R001A047_rgb', 'neckPain/S016C001P007R001A047_rgb', 'neckPain/S001C001P005R001A047_rgb', 'neckPain/S005C001P016R001A047_rgb', 'neckPain/S014C001P008R001A047_rgb', 'neckPain/S003C001P017R001A047_rgb', 'neckPain/S007C001P026R002A047_rgb', 'neckPain/S006C003P008R001A047_rgb', 'neckPain/S007C002P017R001A047_rgb', 'neckPain/S016C002P040R001A047_rgb', 'neckPain/S014C001P017R002A047_rgb', 'neckPain/S002C002P007R001A047_rgb', 'neckPain/S015C003P007R002A047_rgb', 'neckPain/S005C002P015R002A047_rgb', 'neckPain/S012C002P008R001A047_rgb', 'neckPain/S011C002P001R001A047_rgb', 'neckPain/S008C001P001R001A047_rgb', 'neckPain/S007C003P001R001A047_rgb', 'neckPain/S008C002P035R001A047_rgb', 'neckPain/S001C002P006R002A047_rgb', 'neckPain/S002C002P013R002A047_rgb', 'neckPain/S013C001P017R001A047_rgb', 'neckPain/S013C003P018R001A047_rgb', 'neckPain/S011C003P025R002A047_rgb', 'neckPain/S017C001P009R002A047_rgb', 'neckPain/S003C002P001R002A047_rgb', 'neckPain/S014C002P019R001A047_rgb', 'neckPain/S003C001P002R001A047_rgb', 'neckPain/S009C001P015R001A047_rgb', 'neckPain/S011C003P002R002A047_rgb', 'neckPain/S008C001P030R001A047_rgb', 'neckPain/S012C003P027R002A047_rgb', 'neckPain/S003C003P002R001A047_rgb', 'neckPain/S008C001P001R002A047_rgb', 'neckPain/S013C003P037R001A047_rgb', 'neckPain/S007C001P026R001A047_rgb', 'neckPain/S013C003P016R001A047_rgb', 'neckPain/S015C003P037R002A047_rgb', 'neckPain/S016C001P039R002A047_rgb', 'neckPain/S014C002P007R002A047_rgb', 'neckPain/S014C001P039R002A047_rgb', 'neckPain/S009C003P007R002A047_rgb', 'neckPain/S010C001P025R002A047_rgb', 'neckPain/S002C002P012R001A047_rgb', 'neckPain/S014C001P037R002A047_rgb', 'neckPain/S006C001P022R002A047_rgb', 
'neckPain/S011C001P027R002A047_rgb', 'neckPain/S008C003P030R001A047_rgb', 'neckPain/S008C001P035R002A047_rgb', 'neckPain/S016C002P021R001A047_rgb', 'neckPain/S008C003P032R002A047_rgb', 'neckPain/S011C003P027R002A047_rgb', 'neckPain/S006C003P024R002A047_rgb', 'neckPain/S010C001P015R002A047_rgb', 'neckPain/S010C002P008R002A047_rgb', 'neckPain/S001C003P002R002A047_rgb', 'neckPain/S013C002P019R002A047_rgb', 'neckPain/S014C002P017R002A047_rgb', 'neckPain/S007C002P015R002A047_rgb', 'neckPain/S002C001P008R002A047_rgb', 'neckPain/S004C002P007R001A047_rgb', 'neckPain/S007C003P019R001A047_rgb', 'neckPain/S007C003P028R002A047_rgb', 'neckPain/S003C003P015R001A047_rgb', 'neckPain/S004C002P007R002A047_rgb', 'neckPain/S011C002P028R002A047_rgb', 'neckPain/S011C002P002R002A047_rgb', 'neckPain/S017C003P017R002A047_rgb', 'neckPain/S013C002P028R001A047_rgb', 'neckPain/S006C002P017R001A047_rgb', 'neckPain/S009C003P016R002A047_rgb', 'neckPain/S010C001P013R001A047_rgb', 'neckPain/S010C002P019R001A047_rgb', 'neckPain/S014C001P007R001A047_rgb', 'neckPain/S014C001P019R001A047_rgb', 'neckPain/S007C001P027R001A047_rgb', 'neckPain/S008C002P025R001A047_rgb', 'neckPain/S003C003P017R002A047_rgb', 'neckPain/S006C003P023R001A047_rgb', 'neckPain/S010C002P017R002A047_rgb', 'neckPain/S004C002P008R001A047_rgb', 'neckPain/S015C001P015R002A047_rgb', 'neckPain/S008C001P033R001A047_rgb', 'neckPain/S017C003P008R001A047_rgb', 'neckPain/S013C002P017R001A047_rgb', 'neckPain/S009C002P017R001A047_rgb', 'neckPain/S011C001P007R001A047_rgb', 'neckPain/S012C001P019R001A047_rgb', 'neckPain/S015C003P019R002A047_rgb', 'neckPain/S003C001P019R002A047_rgb', 'neckPain/S002C001P009R002A047_rgb', 'neckPain/S014C002P027R001A047_rgb', 'neckPain/S010C001P007R002A047_rgb', 'neckPain/S005C001P016R002A047_rgb', 'neckPain/S017C001P008R001A047_rgb', 'neckPain/S011C003P001R001A047_rgb', 'neckPain/S015C002P019R001A047_rgb', 'neckPain/S014C002P027R002A047_rgb', 'neckPain/S011C002P025R002A047_rgb', 'neckPain/S013C001P018R002A047_rgb', 'neckPain/S001C003P003R002A047_rgb', 'neckPain/S007C001P017R002A047_rgb', 'neckPain/S009C001P008R002A047_rgb', 'neckPain/S013C003P015R001A047_rgb', 'neckPain/S011C001P017R002A047_rgb', 'neckPain/S005C003P010R002A047_rgb', 'neckPain/S017C001P007R002A047_rgb', 'neckPain/S015C002P017R001A047_rgb', 'neckPain/S013C003P019R001A047_rgb', 'neckPain/S016C003P040R002A047_rgb', 'neckPain/S006C001P017R002A047_rgb', 'neckPain/S011C003P028R002A047_rgb', 'neckPain/S012C002P015R001A047_rgb', 'neckPain/S014C003P027R002A047_rgb', 'neckPain/S006C003P016R002A047_rgb', 'neckPain/S006C002P008R001A047_rgb', 'neckPain/S010C002P015R001A047_rgb', 'neckPain/S006C001P023R002A047_rgb', 'neckPain/S014C002P007R001A047_rgb', 'neckPain/S005C002P015R001A047_rgb', 'neckPain/S006C001P023R001A047_rgb', 'neckPain/S017C001P020R002A047_rgb', 'neckPain/S012C002P019R002A047_rgb', 'neckPain/S014C003P015R001A047_rgb', 'neckPain/S015C003P016R001A047_rgb', 'neckPain/S013C003P008R002A047_rgb', 'neckPain/S011C001P038R001A047_rgb', 'neckPain/S009C003P016R001A047_rgb', 'neckPain/S012C003P008R001A047_rgb', 'neckPain/S004C003P007R002A047_rgb', 'neckPain/S012C001P008R001A047_rgb', 'neckPain/S015C001P017R001A047_rgb', 'neckPain/S002C003P011R002A047_rgb', 'neckPain/S011C003P016R001A047_rgb', 'neckPain/S016C001P025R001A047_rgb', 'neckPain/S004C003P008R002A047_rgb', 'neckPain/S002C003P013R001A047_rgb', 'neckPain/S007C002P001R002A047_rgb', 'neckPain/S009C002P008R002A047_rgb', 'neckPain/S005C003P015R001A047_rgb', 'neckPain/S015C003P017R001A047_rgb', 'neckPain/S012C003P025R001A047_rgb', 
'neckPain/S009C002P019R001A047_rgb', 'neckPain/S015C003P015R001A047_rgb', 'neckPain/S009C002P025R001A047_rgb', 'neckPain/S008C002P032R001A047_rgb', 'neckPain/S016C002P007R001A047_rgb', 'neckPain/S012C002P016R002A047_rgb', 'neckPain/S014C003P017R002A047_rgb', 'neckPain/S007C001P025R001A047_rgb', 'neckPain/S002C001P011R001A047_rgb', 'neckPain/S009C002P016R001A047_rgb', 'neckPain/S015C001P017R002A047_rgb', 'neckPain/S002C002P010R002A047_rgb', 'neckPain/S013C002P019R001A047_rgb', 'neckPain/S007C003P026R002A047_rgb', 'neckPain/S011C002P028R001A047_rgb', 'neckPain/S008C002P015R001A047_rgb', 'neckPain/S017C003P007R001A047_rgb', 'neckPain/S001C003P008R001A047_rgb', 'neckPain/S008C001P032R002A047_rgb', 'neckPain/S011C003P017R002A047_rgb', 'neckPain/S003C002P016R002A047_rgb', 'neckPain/S011C002P018R002A047_rgb', 'neckPain/S003C002P019R002A047_rgb', 'neckPain/S011C003P027R001A047_rgb', 'neckPain/S002C001P007R002A047_rgb', 'neckPain/S012C003P018R001A047_rgb', 'neckPain/S007C002P007R001A047_rgb', 'neckPain/S014C002P015R002A047_rgb', 'neckPain/S006C003P023R002A047_rgb', 'neckPain/S007C003P026R001A047_rgb', 'neckPain/S016C001P019R001A047_rgb', 'neckPain/S011C001P025R002A047_rgb', 'neckPain/S006C001P019R002A047_rgb', 'neckPain/S003C003P016R001A047_rgb', 'neckPain/S011C002P018R001A047_rgb', 'neckPain/S012C003P025R002A047_rgb', 'neckPain/S009C003P017R002A047_rgb', 'neckPain/S015C003P025R002A047_rgb', 'neckPain/S010C003P025R001A047_rgb', 'neckPain/S011C003P019R002A047_rgb', 'neckPain/S007C002P026R002A047_rgb', 'neckPain/S012C002P015R002A047_rgb', 'neckPain/S017C003P015R001A047_rgb', 'neckPain/S003C002P017R002A047_rgb', 'neckPain/S009C001P019R001A047_rgb', 'neckPain/S017C003P009R002A047_rgb', 'neckPain/S013C003P027R001A047_rgb', 'neckPain/S006C003P001R001A047_rgb', 'neckPain/S004C001P008R001A047_rgb', 'neckPain/S005C001P015R002A047_rgb', 'neckPain/S014C002P037R001A047_rgb', 'neckPain/S010C001P008R001A047_rgb', 'neckPain/S017C001P003R001A047_rgb', 'neckPain/S015C001P025R002A047_rgb', 'neckPain/S016C003P007R001A047_rgb', 'neckPain/S004C003P003R002A047_rgb', 'neckPain/S016C002P039R001A047_rgb', 'neckPain/S012C003P028R001A047_rgb', 'neckPain/S016C002P019R002A047_rgb', 'neckPain/S013C001P015R001A047_rgb', 'neckPain/S013C001P025R002A047_rgb', 'neckPain/S010C002P016R002A047_rgb', 'neckPain/S001C003P004R001A047_rgb', 'neckPain/S004C001P020R001A047_rgb', 'neckPain/S008C003P029R002A047_rgb', 'neckPain/S013C003P007R002A047_rgb', 'neckPain/S012C001P008R002A047_rgb', 'neckPain/S002C001P008R001A047_rgb', 'neckPain/S013C003P017R002A047_rgb', 'neckPain/S008C002P007R002A047_rgb', 'neckPain/S007C001P001R001A047_rgb', 'neckPain/S002C001P013R001A047_rgb', 'neckPain/S010C003P008R002A047_rgb', 'neckPain/S010C002P007R002A047_rgb', 'neckPain/S011C002P027R002A047_rgb', 'neckPain/S006C003P007R002A047_rgb', 'neckPain/S015C001P019R002A047_rgb', 'neckPain/S007C002P027R001A047_rgb', 'neckPain/S014C002P039R001A047_rgb', 'neckPain/S017C001P017R001A047_rgb', 'neckPain/S014C002P019R002A047_rgb', 'neckPain/S013C001P019R001A047_rgb', 'neckPain/S007C002P025R001A047_rgb', 'neckPain/S009C002P025R002A047_rgb', 'neckPain/S002C003P010R002A047_rgb', 'neckPain/S013C002P008R001A047_rgb', 'neckPain/S005C003P015R002A047_rgb', 'neckPain/S008C003P034R001A047_rgb', 'neckPain/S008C001P019R001A047_rgb', 'neckPain/S013C001P016R002A047_rgb', 'neckPain/S011C003P007R001A047_rgb', 'neckPain/S010C002P021R002A047_rgb', 'neckPain/S017C002P003R002A047_rgb', 'neckPain/S014C003P008R001A047_rgb', 'neckPain/S001C002P008R002A047_rgb', 'neckPain/S014C001P027R002A047_rgb', 
'neckPain/S017C003P009R001A047_rgb', 'neckPain/S007C003P019R002A047_rgb', 'neckPain/S004C001P007R002A047_rgb', 'neckPain/S015C003P007R001A047_rgb', 'neckPain/S001C002P003R002A047_rgb', 'neckPain/S008C002P036R002A047_rgb', 'neckPain/S002C001P014R002A047_rgb', 'neckPain/S013C003P025R002A047_rgb', 'neckPain/S011C002P019R001A047_rgb', 'neckPain/S005C003P013R001A047_rgb', 'neckPain/S012C001P027R001A047_rgb', 'neckPain/S005C002P010R002A047_rgb', 'neckPain/S006C001P024R001A047_rgb', 'neckPain/S013C003P019R002A047_rgb', 'neckPain/S008C002P008R001A047_rgb', 'neckPain/S008C003P031R002A047_rgb', 'neckPain/S008C003P032R001A047_rgb', 'neckPain/S017C001P003R002A047_rgb', 'neckPain/S010C001P017R002A047_rgb', 'neckPain/S013C002P015R002A047_rgb', 'neckPain/S012C001P018R002A047_rgb', 'neckPain/S008C003P031R001A047_rgb', 'neckPain/S004C003P020R002A047_rgb', 'neckPain/S001C001P008R002A047_rgb', 'neckPain/S017C001P007R001A047_rgb', 'neckPain/S007C001P028R001A047_rgb', 'neckPain/S007C002P016R002A047_rgb', 'neckPain/S011C003P015R001A047_rgb', 'neckPain/S007C002P015R001A047_rgb', 'neckPain/S017C003P007R002A047_rgb', 'neckPain/S003C002P019R001A047_rgb', 'neckPain/S011C001P018R002A047_rgb', 'neckPain/S007C003P008R001A047_rgb', 'neckPain/S011C001P019R001A047_rgb', 'neckPain/S001C002P005R002A047_rgb', 'neckPain/S012C002P008R002A047_rgb', 'neckPain/S005C001P018R001A047_rgb', 'neckPain/S001C002P002R002A047_rgb', 'neckPain/S015C002P015R002A047_rgb', 'neckPain/S006C001P007R001A047_rgb', 'nauseaVomiting/S001C003P003R001A048_rgb', 'nauseaVomiting/S016C003P040R001A048_rgb', 'nauseaVomiting/S014C001P039R002A048_rgb', 'nauseaVomiting/S009C001P007R001A048_rgb', 'nauseaVomiting/S004C003P003R001A048_rgb', 'nauseaVomiting/S017C003P015R002A048_rgb', 'nauseaVomiting/S007C001P019R002A048_rgb', 'nauseaVomiting/S011C001P001R002A048_rgb', 'nauseaVomiting/S001C003P001R001A048_rgb', 'nauseaVomiting/S010C001P008R002A048_rgb', 'nauseaVomiting/S010C003P025R001A048_rgb', 'nauseaVomiting/S008C003P008R001A048_rgb', 'nauseaVomiting/S016C001P025R001A048_rgb', 'nauseaVomiting/S014C001P007R001A048_rgb', 'nauseaVomiting/S015C003P007R001A048_rgb', 'nauseaVomiting/S010C001P008R001A048_rgb', 'nauseaVomiting/S011C001P007R001A048_rgb', 'nauseaVomiting/S012C003P037R001A048_rgb', 'nauseaVomiting/S001C003P003R002A048_rgb', 'nauseaVomiting/S013C003P019R002A048_rgb', 'nauseaVomiting/S007C002P017R001A048_rgb', 'nauseaVomiting/S011C003P015R002A048_rgb', 'nauseaVomiting/S005C002P013R002A048_rgb', 'nauseaVomiting/S012C001P037R002A048_rgb', 'nauseaVomiting/S017C002P020R002A048_rgb', 'nauseaVomiting/S013C002P008R001A048_rgb', 'nauseaVomiting/S012C002P016R001A048_rgb', 'nauseaVomiting/S006C002P023R002A048_rgb', 'nauseaVomiting/S001C002P004R001A048_rgb', 'nauseaVomiting/S015C002P016R001A048_rgb', 'nauseaVomiting/S011C002P027R001A048_rgb', 'nauseaVomiting/S011C001P018R002A048_rgb', 'nauseaVomiting/S002C002P011R001A048_rgb', 'nauseaVomiting/S005C001P015R001A048_rgb', 'nauseaVomiting/S005C001P016R002A048_rgb', 'nauseaVomiting/S013C002P015R002A048_rgb', 'nauseaVomiting/S009C001P017R002A048_rgb', 'nauseaVomiting/S002C001P013R001A048_rgb', 'nauseaVomiting/S012C002P018R002A048_rgb', 'nauseaVomiting/S013C001P007R001A048_rgb', 'nauseaVomiting/S009C001P008R001A048_rgb', 'nauseaVomiting/S009C002P016R002A048_rgb', 'nauseaVomiting/S009C003P025R002A048_rgb', 'nauseaVomiting/S015C001P019R002A048_rgb', 'nauseaVomiting/S004C001P008R002A048_rgb', 'nauseaVomiting/S008C001P007R001A048_rgb', 'nauseaVomiting/S009C002P019R001A048_rgb', 'nauseaVomiting/S006C002P019R001A048_rgb', 
'nauseaVomiting/S002C003P013R002A048_rgb', 'nauseaVomiting/S008C002P001R001A048_rgb', 'nauseaVomiting/S007C003P028R001A048_rgb', 'nauseaVomiting/S011C002P008R001A048_rgb', 'nauseaVomiting/S007C002P016R001A048_rgb', 'nauseaVomiting/S003C002P007R001A048_rgb', 'nauseaVomiting/S016C001P040R001A048_rgb', 'nauseaVomiting/S006C001P007R002A048_rgb', 'nauseaVomiting/S009C001P019R001A048_rgb', 'nauseaVomiting/S015C001P025R002A048_rgb', 'nauseaVomiting/S007C001P018R001A048_rgb', 'nauseaVomiting/S013C001P008R001A048_rgb', 'nauseaVomiting/S003C003P018R002A048_rgb', 'nauseaVomiting/S004C001P007R001A048_rgb', 'nauseaVomiting/S011C003P025R001A048_rgb', 'nauseaVomiting/S007C002P027R001A048_rgb', 'nauseaVomiting/S014C001P027R002A048_rgb', 'nauseaVomiting/S003C002P017R002A048_rgb', 'nauseaVomiting/S003C003P016R001A048_rgb', 'nauseaVomiting/S002C003P014R002A048_rgb', 'nauseaVomiting/S005C003P004R001A048_rgb', 'nauseaVomiting/S012C003P025R002A048_rgb', 'nauseaVomiting/S005C003P017R001A048_rgb', 'nauseaVomiting/S010C003P017R002A048_rgb', 'nauseaVomiting/S001C001P005R002A048_rgb', 'nauseaVomiting/S017C003P015R001A048_rgb', 'nauseaVomiting/S011C003P016R001A048_rgb', 'nauseaVomiting/S015C002P025R002A048_rgb', 'nauseaVomiting/S013C003P019R001A048_rgb', 'nauseaVomiting/S011C002P001R001A048_rgb', 'nauseaVomiting/S013C001P028R001A048_rgb', 'nauseaVomiting/S011C002P028R001A048_rgb', 'nauseaVomiting/S014C003P025R002A048_rgb', 'nauseaVomiting/S008C001P015R001A048_rgb', 'nauseaVomiting/S003C001P018R001A048_rgb', 'nauseaVomiting/S007C001P017R001A048_rgb', 'nauseaVomiting/S008C001P015R002A048_rgb', 'nauseaVomiting/S015C001P015R002A048_rgb', 'nauseaVomiting/S002C001P011R002A048_rgb', 'nauseaVomiting/S012C002P027R001A048_rgb', 'nauseaVomiting/S005C003P010R001A048_rgb', 'nauseaVomiting/S005C001P017R002A048_rgb', 'nauseaVomiting/S002C003P014R001A048_rgb', 'nauseaVomiting/S015C001P037R001A048_rgb', 'nauseaVomiting/S006C003P016R002A048_rgb', 'nauseaVomiting/S016C001P025R002A048_rgb', 'nauseaVomiting/S009C002P008R002A048_rgb', 'nauseaVomiting/S011C001P019R002A048_rgb', 'nauseaVomiting/S001C002P003R001A048_rgb', 'nauseaVomiting/S008C001P008R002A048_rgb', 'nauseaVomiting/S016C003P025R002A048_rgb', 'nauseaVomiting/S016C002P021R001A048_rgb', 'nauseaVomiting/S012C003P018R001A048_rgb', 'nauseaVomiting/S010C001P025R002A048_rgb', 'nauseaVomiting/S003C003P017R002A048_rgb', 'nauseaVomiting/S012C001P019R001A048_rgb', 'nauseaVomiting/S002C003P011R001A048_rgb', 'nauseaVomiting/S011C001P025R001A048_rgb', 'nauseaVomiting/S006C001P023R002A048_rgb', 'nauseaVomiting/S012C002P017R002A048_rgb', 'nauseaVomiting/S007C003P019R001A048_rgb', 'nauseaVomiting/S013C001P027R001A048_rgb', 'nauseaVomiting/S017C003P009R002A048_rgb', 'nauseaVomiting/S002C003P012R002A048_rgb', 'nauseaVomiting/S012C002P018R001A048_rgb', 'nauseaVomiting/S007C003P008R001A048_rgb', 'nauseaVomiting/S007C003P017R002A048_rgb', 'nauseaVomiting/S009C003P016R001A048_rgb', 'nauseaVomiting/S006C001P016R002A048_rgb', 'nauseaVomiting/S015C003P017R002A048_rgb', 'nauseaVomiting/S003C003P002R001A048_rgb', 'nauseaVomiting/S006C003P015R002A048_rgb', 'nauseaVomiting/S008C001P030R002A048_rgb', 'nauseaVomiting/S007C002P001R002A048_rgb', 'nauseaVomiting/S013C003P017R002A048_rgb', 'nauseaVomiting/S017C002P009R002A048_rgb', 'nauseaVomiting/S010C002P021R001A048_rgb', 'nauseaVomiting/S016C003P019R001A048_rgb', 'nauseaVomiting/S016C002P007R001A048_rgb', 'nauseaVomiting/S017C001P003R001A048_rgb', 'nauseaVomiting/S013C003P025R001A048_rgb', 'nauseaVomiting/S003C002P002R001A048_rgb', 
'nauseaVomiting/S001C002P006R001A048_rgb', 'nauseaVomiting/S006C002P016R001A048_rgb', 'nauseaVomiting/S011C001P027R001A048_rgb', 'nauseaVomiting/S010C003P007R001A048_rgb', 'nauseaVomiting/S003C002P016R002A048_rgb', 'nauseaVomiting/S012C001P017R002A048_rgb', 'nauseaVomiting/S008C003P036R001A048_rgb', 'nauseaVomiting/S005C002P004R001A048_rgb', 'nauseaVomiting/S017C003P008R002A048_rgb', 'nauseaVomiting/S009C001P017R001A048_rgb', 'nauseaVomiting/S014C001P019R002A048_rgb', 'nauseaVomiting/S013C001P037R002A048_rgb', 'nauseaVomiting/S007C002P015R001A048_rgb', 'nauseaVomiting/S012C002P028R002A048_rgb', 'nauseaVomiting/S006C001P007R001A048_rgb', 'nauseaVomiting/S007C002P026R001A048_rgb', 'nauseaVomiting/S002C001P003R002A048_rgb', 'nauseaVomiting/S016C001P019R001A048_rgb', 'nauseaVomiting/S003C003P007R002A048_rgb', 'nauseaVomiting/S001C001P003R001A048_rgb', 'nauseaVomiting/S007C001P018R002A048_rgb', 'nauseaVomiting/S012C001P027R002A048_rgb', 'nauseaVomiting/S015C001P037R002A048_rgb', 'nauseaVomiting/S007C003P015R002A048_rgb', 'nauseaVomiting/S010C003P019R001A048_rgb', 'nauseaVomiting/S015C003P016R001A048_rgb', 'nauseaVomiting/S010C002P008R002A048_rgb', 'nauseaVomiting/S017C001P008R002A048_rgb', 'nauseaVomiting/S013C001P025R001A048_rgb', 'nauseaVomiting/S007C003P017R001A048_rgb', 'nauseaVomiting/S003C001P015R002A048_rgb', 'nauseaVomiting/S015C003P016R002A048_rgb', 'nauseaVomiting/S002C002P010R002A048_rgb', 'nauseaVomiting/S011C002P015R002A048_rgb', 'nauseaVomiting/S011C002P018R001A048_rgb', 'nauseaVomiting/S002C003P010R002A048_rgb', 'nauseaVomiting/S010C001P017R002A048_rgb', 'nauseaVomiting/S015C002P019R001A048_rgb', 'nauseaVomiting/S015C001P007R002A048_rgb', 'nauseaVomiting/S006C001P024R002A048_rgb', 'nauseaVomiting/S005C002P004R002A048_rgb', 'nauseaVomiting/S011C003P015R001A048_rgb', 'nauseaVomiting/S002C002P012R002A048_rgb', 'nauseaVomiting/S011C002P028R002A048_rgb', 'nauseaVomiting/S008C001P033R001A048_rgb', 'nauseaVomiting/S008C003P031R001A048_rgb', 'nauseaVomiting/S002C001P010R002A048_rgb', 'nauseaVomiting/S007C002P028R001A048_rgb', 'nauseaVomiting/S016C002P039R001A048_rgb', 'nauseaVomiting/S007C003P027R002A048_rgb', 'nauseaVomiting/S008C001P008R001A048_rgb', 'nauseaVomiting/S014C003P008R001A048_rgb', 'nauseaVomiting/S013C003P007R002A048_rgb', 'nauseaVomiting/S011C001P016R002A048_rgb', 'nauseaVomiting/S005C001P018R002A048_rgb', 'nauseaVomiting/S010C003P013R001A048_rgb', 'nauseaVomiting/S017C002P020R001A048_rgb', 'nauseaVomiting/S010C001P021R001A048_rgb', 'nauseaVomiting/S008C003P008R002A048_rgb', 'nauseaVomiting/S010C002P025R002A048_rgb', 'nauseaVomiting/S011C002P017R002A048_rgb', 'nauseaVomiting/S001C002P005R002A048_rgb', 'nauseaVomiting/S015C003P015R002A048_rgb', 'nauseaVomiting/S005C002P015R002A048_rgb', 'nauseaVomiting/S002C003P008R001A048_rgb', 'nauseaVomiting/S011C001P002R002A048_rgb', 'nauseaVomiting/S006C001P008R001A048_rgb', 'nauseaVomiting/S001C003P004R002A048_rgb', 'nauseaVomiting/S016C003P021R001A048_rgb', 'nauseaVomiting/S010C001P021R002A048_rgb', 'nauseaVomiting/S016C002P025R002A048_rgb', 'nauseaVomiting/S014C002P008R001A048_rgb', 'nauseaVomiting/S016C002P039R002A048_rgb', 'nauseaVomiting/S009C002P019R002A048_rgb', 'nauseaVomiting/S008C003P001R002A048_rgb', 'nauseaVomiting/S005C001P017R001A048_rgb', 'nauseaVomiting/S016C002P008R001A048_rgb', 'nauseaVomiting/S014C001P015R002A048_rgb', 'nauseaVomiting/S013C003P025R002A048_rgb', 'nauseaVomiting/S011C001P007R002A048_rgb', 'nauseaVomiting/S005C002P018R002A048_rgb', 'nauseaVomiting/S011C003P017R002A048_rgb', 
'nauseaVomiting/S010C001P016R001A048_rgb', 'nauseaVomiting/S016C001P008R001A048_rgb', 'nauseaVomiting/S003C003P018R001A048_rgb', 'nauseaVomiting/S012C003P007R002A048_rgb', 'nauseaVomiting/S014C003P015R002A048_rgb', 'nauseaVomiting/S008C002P008R001A048_rgb', 'nauseaVomiting/S008C001P019R001A048_rgb', 'nauseaVomiting/S006C001P024R001A048_rgb', 'nauseaVomiting/S002C003P007R002A048_rgb', 'nauseaVomiting/S007C001P019R001A048_rgb', 'nauseaVomiting/S013C001P037R001A048_rgb', 'nauseaVomiting/S006C001P015R002A048_rgb', 'nauseaVomiting/S004C002P020R002A048_rgb', 'nauseaVomiting/S011C003P027R001A048_rgb', 'nauseaVomiting/S001C002P008R001A048_rgb', 'nauseaVomiting/S003C001P019R001A048_rgb', 'nauseaVomiting/S008C003P019R002A048_rgb', 'nauseaVomiting/S010C003P015R002A048_rgb', 'nauseaVomiting/S013C002P027R002A048_rgb', 'nauseaVomiting/S012C001P027R001A048_rgb', 'nauseaVomiting/S012C001P025R001A048_rgb', 'nauseaVomiting/S017C001P003R002A048_rgb', 'nauseaVomiting/S012C003P016R001A048_rgb', 'nauseaVomiting/S001C003P001R002A048_rgb', 'nauseaVomiting/S009C001P025R001A048_rgb', 'nauseaVomiting/S010C002P007R002A048_rgb', 'nauseaVomiting/S012C003P017R002A048_rgb', 'nauseaVomiting/S013C002P027R001A048_rgb', 'nauseaVomiting/S007C001P025R001A048_rgb', 'nauseaVomiting/S014C001P039R001A048_rgb', 'nauseaVomiting/S002C002P007R001A048_rgb', 'nauseaVomiting/S013C001P016R001A048_rgb', 'nauseaVomiting/S011C001P019R001A048_rgb', 'nauseaVomiting/S006C003P017R001A048_rgb', 'nauseaVomiting/S009C001P008R002A048_rgb', 'nauseaVomiting/S001C001P006R001A048_rgb', 'nauseaVomiting/S005C001P015R002A048_rgb', 'nauseaVomiting/S016C001P021R001A048_rgb', 'nauseaVomiting/S002C001P008R001A048_rgb', 'nauseaVomiting/S001C001P003R002A048_rgb', 'nauseaVomiting/S011C003P019R002A048_rgb', 'nauseaVomiting/S010C002P019R001A048_rgb', 'nauseaVomiting/S008C003P007R002A048_rgb', 'nauseaVomiting/S002C002P003R001A048_rgb', 'nauseaVomiting/S013C001P019R002A048_rgb', 'nauseaVomiting/S014C002P019R002A048_rgb', 'nauseaVomiting/S010C003P017R001A048_rgb', 'nauseaVomiting/S016C002P040R002A048_rgb', 'nauseaVomiting/S007C001P027R001A048_rgb', 'nauseaVomiting/S008C001P001R001A048_rgb', 'nauseaVomiting/S001C003P005R002A048_rgb', 'nauseaVomiting/S010C002P016R002A048_rgb', 'nauseaVomiting/S007C002P018R002A048_rgb', 'nauseaVomiting/S005C003P015R001A048_rgb', 'nauseaVomiting/S013C003P027R002A048_rgb', 'nauseaVomiting/S012C003P028R001A048_rgb', 'nauseaVomiting/S008C001P025R001A048_rgb', 'nauseaVomiting/S008C001P001R002A048_rgb', 'nauseaVomiting/S011C001P028R002A048_rgb', 'nauseaVomiting/S007C001P008R002A048_rgb', 'nauseaVomiting/S013C002P007R002A048_rgb', 'nauseaVomiting/S011C003P018R002A048_rgb', 'nauseaVomiting/S009C001P016R002A048_rgb', 'nauseaVomiting/S007C001P026R002A048_rgb', 'nauseaVomiting/S008C002P035R001A048_rgb', 'nauseaVomiting/S007C001P016R001A048_rgb', 'nauseaVomiting/S008C002P035R002A048_rgb', 'nauseaVomiting/S012C001P028R001A048_rgb', 'nauseaVomiting/S003C003P015R001A048_rgb', 'nauseaVomiting/S008C003P032R002A048_rgb', 'nauseaVomiting/S012C002P015R001A048_rgb', 'nauseaVomiting/S007C003P025R001A048_rgb', 'nauseaVomiting/S007C002P019R001A048_rgb', 'nauseaVomiting/S007C001P007R001A048_rgb', 'nauseaVomiting/S012C002P025R001A048_rgb', 'nauseaVomiting/S005C002P016R002A048_rgb', 'nauseaVomiting/S004C003P007R001A048_rgb', 'nauseaVomiting/S015C001P019R001A048_rgb', 'nauseaVomiting/S015C002P019R002A048_rgb', 'nauseaVomiting/S002C002P013R002A048_rgb', 'nauseaVomiting/S014C003P007R002A048_rgb', 'nauseaVomiting/S006C001P016R001A048_rgb', 
'nauseaVomiting/S011C002P038R001A048_rgb', 'nauseaVomiting/S012C001P007R001A048_rgb', 'nauseaVomiting/S014C002P037R001A048_rgb', 'nauseaVomiting/S005C003P016R002A048_rgb', 'nauseaVomiting/S017C002P008R001A048_rgb', 'nauseaVomiting/S006C003P024R001A048_rgb', 'nauseaVomiting/S008C001P030R001A048_rgb', 'nauseaVomiting/S006C001P001R001A048_rgb', 'nauseaVomiting/S001C002P003R002A048_rgb', 'nauseaVomiting/S003C002P008R002A048_rgb', 'nauseaVomiting/S013C003P037R002A048_rgb', 'nauseaVomiting/S016C003P039R002A048_rgb', 'nauseaVomiting/S001C003P002R001A048_rgb', 'nauseaVomiting/S008C001P032R001A048_rgb', 'nauseaVomiting/S006C003P016R001A048_rgb', 'nauseaVomiting/S009C002P015R002A048_rgb', 'nauseaVomiting/S010C001P013R001A048_rgb', 'nauseaVomiting/S015C003P025R002A048_rgb', 'nauseaVomiting/S002C002P011R002A048_rgb', 'nauseaVomiting/S007C003P016R002A048_rgb', 'nauseaVomiting/S014C001P027R001A048_rgb', 'nauseaVomiting/S007C003P015R001A048_rgb', 'nauseaVomiting/S015C003P019R002A048_rgb', 'nauseaVomiting/S008C003P033R002A048_rgb', 'nauseaVomiting/S005C002P018R001A048_rgb', 'nauseaVomiting/S012C002P037R002A048_rgb', 'nauseaVomiting/S011C003P016R002A048_rgb', 'nauseaVomiting/S003C001P008R001A048_rgb', 'nauseaVomiting/S013C001P007R002A048_rgb', 'nauseaVomiting/S009C002P017R002A048_rgb', 'nauseaVomiting/S011C002P017R001A048_rgb', 'nauseaVomiting/S005C003P010R002A048_rgb', 'nauseaVomiting/S006C002P007R001A048_rgb', 'nauseaVomiting/S003C001P015R001A048_rgb', 'nauseaVomiting/S012C002P016R002A048_rgb', 'nauseaVomiting/S012C003P008R002A048_rgb', 'nauseaVomiting/S010C003P025R002A048_rgb', 'nauseaVomiting/S015C001P008R001A048_rgb', 'nauseaVomiting/S007C001P008R001A048_rgb', 'nauseaVomiting/S013C003P008R001A048_rgb', 'nauseaVomiting/S012C002P008R002A048_rgb', 'nauseaVomiting/S005C001P021R001A048_rgb', 'nauseaVomiting/S013C002P018R001A048_rgb', 'nauseaVomiting/S013C003P008R002A048_rgb', 'nauseaVomiting/S015C001P007R001A048_rgb', 'nauseaVomiting/S003C002P018R002A048_rgb', 'nauseaVomiting/S007C002P015R002A048_rgb', 'nauseaVomiting/S012C002P019R001A048_rgb', 'nauseaVomiting/S003C003P017R001A048_rgb', 'nauseaVomiting/S013C003P007R001A048_rgb', 'nauseaVomiting/S014C002P007R001A048_rgb', 'nauseaVomiting/S007C002P027R002A048_rgb', 'nauseaVomiting/S015C003P019R001A048_rgb', 'nauseaVomiting/S013C002P025R001A048_rgb', 'nauseaVomiting/S014C002P017R002A048_rgb', 'nauseaVomiting/S011C002P025R002A048_rgb', 'nauseaVomiting/S014C003P017R001A048_rgb', 'nauseaVomiting/S017C002P015R001A048_rgb', 'nauseaVomiting/S007C003P001R001A048_rgb', 'nauseaVomiting/S009C001P007R002A048_rgb', 'nauseaVomiting/S002C002P008R001A048_rgb', 'nauseaVomiting/S016C001P007R002A048_rgb', 'nauseaVomiting/S016C001P039R001A048_rgb', 'nauseaVomiting/S017C001P017R001A048_rgb', 'nauseaVomiting/S010C001P015R002A048_rgb', 'nauseaVomiting/S015C002P007R002A048_rgb', 'nauseaVomiting/S005C001P010R002A048_rgb', 'nauseaVomiting/S014C002P008R002A048_rgb', 'nauseaVomiting/S008C002P025R002A048_rgb', 'nauseaVomiting/S005C003P018R001A048_rgb', 'nauseaVomiting/S011C002P008R002A048_rgb', 'nauseaVomiting/S002C001P003R001A048_rgb', 'nauseaVomiting/S007C003P018R002A048_rgb', 'nauseaVomiting/S010C002P018R001A048_rgb', 'nauseaVomiting/S010C003P015R001A048_rgb', 'nauseaVomiting/S006C003P019R002A048_rgb', 'nauseaVomiting/S009C002P025R001A048_rgb', 'nauseaVomiting/S012C002P007R002A048_rgb', 'nauseaVomiting/S011C001P017R002A048_rgb', 'nauseaVomiting/S011C002P027R002A048_rgb', 'nauseaVomiting/S015C003P007R002A048_rgb', 'nauseaVomiting/S014C003P017R002A048_rgb', 
'nauseaVomiting/S001C002P001R002A048_rgb', 'nauseaVomiting/S017C001P009R001A048_rgb', 'nauseaVomiting/S008C001P029R001A048_rgb', 'nauseaVomiting/S008C001P036R002A048_rgb', 'nauseaVomiting/S002C002P010R001A048_rgb', 'nauseaVomiting/S010C001P019R001A048_rgb', 'nauseaVomiting/S010C001P025R001A048_rgb', 'nauseaVomiting/S001C001P004R001A048_rgb', 'nauseaVomiting/S001C001P001R001A048_rgb', 'nauseaVomiting/S002C002P014R002A048_rgb', 'nauseaVomiting/S006C003P023R002A048_rgb', 'nauseaVomiting/S003C002P015R002A048_rgb', 'nauseaVomiting/S017C002P015R002A048_rgb', 'nauseaVomiting/S005C001P018R001A048_rgb', 'nauseaVomiting/S007C003P008R002A048_rgb', 'nauseaVomiting/S012C001P016R002A048_rgb', 'nauseaVomiting/S012C002P037R001A048_rgb', 'nauseaVomiting/S001C002P002R001A048_rgb', 'nauseaVomiting/S009C002P025R002A048_rgb', 'nauseaVomiting/S003C001P001R001A048_rgb', 'nauseaVomiting/S017C001P017R002A048_rgb', 'nauseaVomiting/S006C003P008R001A048_rgb', 'nauseaVomiting/S014C003P037R001A048_rgb', 'nauseaVomiting/S016C003P008R001A048_rgb', 'nauseaVomiting/S002C002P014R001A048_rgb', 'nauseaVomiting/S005C002P013R001A048_rgb', 'nauseaVomiting/S012C003P016R002A048_rgb', 'nauseaVomiting/S005C003P017R002A048_rgb', 'nauseaVomiting/S006C003P008R002A048_rgb', 'nauseaVomiting/S012C001P008R001A048_rgb', 'nauseaVomiting/S013C001P016R002A048_rgb', 'nauseaVomiting/S012C003P025R001A048_rgb', 'nauseaVomiting/S011C002P007R002A048_rgb', 'nauseaVomiting/S016C002P040R001A048_rgb', 'nauseaVomiting/S012C003P015R002A048_rgb', 'nauseaVomiting/S002C003P013R001A048_rgb', 'nauseaVomiting/S007C003P001R002A048_rgb', 'nauseaVomiting/S005C003P018R002A048_rgb', 'nauseaVomiting/S003C001P007R002A048_rgb', 'nauseaVomiting/S006C002P008R001A048_rgb', 'nauseaVomiting/S013C003P015R001A048_rgb', 'nauseaVomiting/S017C001P015R001A048_rgb', 'nauseaVomiting/S012C001P007R002A048_rgb', 'nauseaVomiting/S009C003P015R001A048_rgb', 'nauseaVomiting/S011C003P008R001A048_rgb', 'nauseaVomiting/S012C002P008R001A048_rgb', 'nauseaVomiting/S007C002P025R001A048_rgb', 'nauseaVomiting/S016C003P007R002A048_rgb', 'nauseaVomiting/S011C002P038R002A048_rgb', 'nauseaVomiting/S015C003P037R001A048_rgb', 'nauseaVomiting/S003C003P019R002A048_rgb', 'nauseaVomiting/S014C001P015R001A048_rgb', 'nauseaVomiting/S005C002P015R001A048_rgb', 'nauseaVomiting/S008C001P036R001A048_rgb', 'nauseaVomiting/S002C001P008R002A048_rgb', 'nauseaVomiting/S001C002P004R002A048_rgb', 'nauseaVomiting/S011C001P016R001A048_rgb', 'nauseaVomiting/S009C001P025R002A048_rgb', 'nauseaVomiting/S013C002P008R002A048_rgb', 'nauseaVomiting/S013C003P028R002A048_rgb', 'nauseaVomiting/S007C002P017R002A048_rgb', 'nauseaVomiting/S016C003P008R002A048_rgb', 'nauseaVomiting/S002C001P010R001A048_rgb', 'nauseaVomiting/S013C003P027R001A048_rgb', 'nauseaVomiting/S008C003P025R002A048_rgb', 'nauseaVomiting/S012C001P019R002A048_rgb', 'nauseaVomiting/S001C001P004R002A048_rgb', 'nauseaVomiting/S009C003P015R002A048_rgb', 'nauseaVomiting/S008C003P015R001A048_rgb', 'nauseaVomiting/S003C001P001R002A048_rgb', 'nauseaVomiting/S002C003P011R002A048_rgb', 'nauseaVomiting/S012C003P027R002A048_rgb', 'nauseaVomiting/S003C001P002R002A048_rgb', 'nauseaVomiting/S012C003P027R001A048_rgb', 'nauseaVomiting/S003C002P002R002A048_rgb', 'nauseaVomiting/S013C001P019R001A048_rgb', 'nauseaVomiting/S003C002P001R001A048_rgb', 'nauseaVomiting/S017C002P008R002A048_rgb', 'nauseaVomiting/S007C001P025R002A048_rgb', 'nauseaVomiting/S017C001P016R002A048_rgb', 'nauseaVomiting/S005C002P017R001A048_rgb', 'nauseaVomiting/S011C003P028R001A048_rgb', 
'nauseaVomiting/S013C003P016R001A048_rgb', 'nauseaVomiting/S008C002P030R001A048_rgb', 'nauseaVomiting/S006C003P024R002A048_rgb', 'nauseaVomiting/S015C003P008R002A048_rgb', 'nauseaVomiting/S010C002P017R002A048_rgb', 'nauseaVomiting/S004C002P007R001A048_rgb', 'nauseaVomiting/S012C001P028R002A048_rgb', 'nauseaVomiting/S007C001P007R002A048_rgb', 'nauseaVomiting/S001C002P007R002A048_rgb', 'nauseaVomiting/S003C001P017R002A048_rgb', 'nauseaVomiting/S013C002P017R002A048_rgb', 'nauseaVomiting/S006C001P017R002A048_rgb', 'nauseaVomiting/S017C002P016R001A048_rgb', 'nauseaVomiting/S008C002P019R002A048_rgb', 'nauseaVomiting/S014C003P008R002A048_rgb', 'nauseaVomiting/S006C001P001R002A048_rgb', 'nauseaVomiting/S008C001P033R002A048_rgb', 'nauseaVomiting/S008C002P029R001A048_rgb', 'nauseaVomiting/S012C001P015R002A048_rgb', 'nauseaVomiting/S005C002P021R002A048_rgb', 'nauseaVomiting/S016C002P019R001A048_rgb', 'nauseaVomiting/S016C001P021R002A048_rgb', 'nauseaVomiting/S003C002P019R001A048_rgb', 'nauseaVomiting/S015C003P008R001A048_rgb', 'nauseaVomiting/S011C001P008R002A048_rgb', 'nauseaVomiting/S015C001P017R001A048_rgb', 'nauseaVomiting/S001C002P008R002A048_rgb', 'nauseaVomiting/S008C003P029R002A048_rgb', 'nauseaVomiting/S010C002P021R002A048_rgb', 'nauseaVomiting/S007C001P028R001A048_rgb', 'nauseaVomiting/S007C003P026R002A048_rgb', 'nauseaVomiting/S001C001P006R002A048_rgb', 'nauseaVomiting/S006C002P015R001A048_rgb', 'nauseaVomiting/S006C003P022R002A048_rgb', 'nauseaVomiting/S013C002P037R002A048_rgb', 'nauseaVomiting/S003C001P018R002A048_rgb', 'nauseaVomiting/S009C003P017R001A048_rgb', 'nauseaVomiting/S008C001P025R002A048_rgb', 'nauseaVomiting/S016C002P007R002A048_rgb', 'nauseaVomiting/S003C001P019R002A048_rgb', 'nauseaVomiting/S004C002P007R002A048_rgb', 'nauseaVomiting/S011C003P001R001A048_rgb', 'nauseaVomiting/S003C002P019R002A048_rgb', 'nauseaVomiting/S012C002P027R002A048_rgb', 'nauseaVomiting/S013C001P017R002A048_rgb', 'nauseaVomiting/S017C001P009R002A048_rgb', 'nauseaVomiting/S008C003P030R001A048_rgb', 'nauseaVomiting/S008C002P032R002A048_rgb', 'nauseaVomiting/S012C003P037R002A048_rgb', 'nauseaVomiting/S014C002P027R002A048_rgb', 'nauseaVomiting/S005C003P015R002A048_rgb', 'nauseaVomiting/S002C002P009R002A048_rgb', 'nauseaVomiting/S011C003P019R001A048_rgb', 'nauseaVomiting/S001C001P007R002A048_rgb', 'nauseaVomiting/S003C001P008R002A048_rgb', 'nauseaVomiting/S003C003P007R001A048_rgb', 'nauseaVomiting/S017C003P016R002A048_rgb', 'nauseaVomiting/S012C001P025R002A048_rgb', 'nauseaVomiting/S008C002P034R002A048_rgb', 'nauseaVomiting/S010C001P013R002A048_rgb', 'nauseaVomiting/S008C002P001R002A048_rgb', 'nauseaVomiting/S009C001P016R001A048_rgb', 'nauseaVomiting/S004C001P020R002A048_rgb', 'nauseaVomiting/S002C001P012R001A048_rgb', 'nauseaVomiting/S002C003P003R001A048_rgb', 'nauseaVomiting/S014C001P008R001A048_rgb', 'nauseaVomiting/S007C001P016R002A048_rgb', 'nauseaVomiting/S015C002P016R002A048_rgb', 'nauseaVomiting/S014C003P007R001A048_rgb', 'nauseaVomiting/S009C002P016R001A048_rgb', 'nauseaVomiting/S013C001P028R002A048_rgb', 'nauseaVomiting/S011C002P019R002A048_rgb', 'nauseaVomiting/S014C003P019R001A048_rgb', 'nauseaVomiting/S008C002P015R002A048_rgb', 'nauseaVomiting/S013C002P016R002A048_rgb', 'nauseaVomiting/S008C003P034R001A048_rgb', 'nauseaVomiting/S009C003P008R002A048_rgb', 'nauseaVomiting/S011C003P008R002A048_rgb', 'nauseaVomiting/S001C003P007R001A048_rgb', 'nauseaVomiting/S003C002P007R002A048_rgb', 'nauseaVomiting/S008C003P007R001A048_rgb', 'nauseaVomiting/S010C001P018R001A048_rgb', 
'nauseaVomiting/S002C001P009R001A048_rgb', 'nauseaVomiting/S003C002P016R001A048_rgb', 'nauseaVomiting/S012C001P018R001A048_rgb', 'nauseaVomiting/S017C002P017R001A048_rgb', 'nauseaVomiting/S003C002P001R002A048_rgb', 'nauseaVomiting/S006C001P022R001A048_rgb', 'nauseaVomiting/S014C001P025R002A048_rgb', 'nauseaVomiting/S014C002P037R002A048_rgb', 'nauseaVomiting/S003C002P017R001A048_rgb', 'nauseaVomiting/S016C001P019R002A048_rgb', 'nauseaVomiting/S017C002P017R002A048_rgb', 'nauseaVomiting/S006C003P017R002A048_rgb', 'nauseaVomiting/S017C003P020R001A048_rgb', 'nauseaVomiting/S007C003P007R002A048_rgb', 'nauseaVomiting/S007C002P016R002A048_rgb', 'nauseaVomiting/S011C003P038R002A048_rgb', 'nauseaVomiting/S006C002P024R001A048_rgb', 'nauseaVomiting/S006C002P015R002A048_rgb', 'nauseaVomiting/S003C003P001R002A048_rgb', 'nauseaVomiting/S001C002P002R002A048_rgb', 'nauseaVomiting/S008C003P029R001A048_rgb', 'nauseaVomiting/S002C001P014R002A048_rgb', 'nauseaVomiting/S008C003P025R001A048_rgb', 'nauseaVomiting/S001C003P006R001A048_rgb', 'nauseaVomiting/S017C001P008R001A048_rgb', 'nauseaVomiting/S008C002P032R001A048_rgb', 'nauseaVomiting/S006C002P008R002A048_rgb', 'nauseaVomiting/S017C002P016R002A048_rgb', 'nauseaVomiting/S016C002P019R002A048_rgb', 'nauseaVomiting/S006C001P008R002A048_rgb', 'nauseaVomiting/S013C003P017R001A048_rgb', 'nauseaVomiting/S002C002P009R001A048_rgb', 'nauseaVomiting/S003C001P017R001A048_rgb', 'nauseaVomiting/S010C001P007R002A048_rgb', 'nauseaVomiting/S012C002P015R002A048_rgb', 'nauseaVomiting/S005C003P016R001A048_rgb', 'nauseaVomiting/S005C003P021R001A048_rgb', 'nauseaVomiting/S012C002P019R002A048_rgb', 'nauseaVomiting/S014C002P027R001A048_rgb', 'nauseaVomiting/S010C003P018R002A048_rgb', 'nauseaVomiting/S009C002P017R001A048_rgb', 'nauseaVomiting/S007C002P018R001A048_rgb', 'nauseaVomiting/S014C001P008R002A048_rgb', 'nauseaVomiting/S012C003P019R001A048_rgb', 'nauseaVomiting/S010C002P013R001A048_rgb', 'nauseaVomiting/S008C003P035R001A048_rgb', 'nauseaVomiting/S005C002P021R001A048_rgb', 'nauseaVomiting/S005C002P010R002A048_rgb', 'nauseaVomiting/S010C003P013R002A048_rgb', 'nauseaVomiting/S007C002P008R001A048_rgb', 'nauseaVomiting/S007C001P028R002A048_rgb', 'nauseaVomiting/S017C002P007R002A048_rgb', 'nauseaVomiting/S006C001P022R002A048_rgb', 'nauseaVomiting/S006C003P007R001A048_rgb', 'nauseaVomiting/S015C002P025R001A048_rgb', 'nauseaVomiting/S009C001P015R002A048_rgb', 'nauseaVomiting/S004C003P008R002A048_rgb', 'nauseaVomiting/S003C001P007R001A048_rgb', 'nauseaVomiting/S012C003P028R002A048_rgb', 'nauseaVomiting/S014C001P017R002A048_rgb', 'nauseaVomiting/S008C003P033R001A048_rgb', 'nauseaVomiting/S005C001P016R001A048_rgb', 'nauseaVomiting/S011C001P018R001A048_rgb', 'nauseaVomiting/S007C002P028R002A048_rgb', 'nauseaVomiting/S011C003P007R002A048_rgb', 'nauseaVomiting/S002C002P003R002A048_rgb', 'nauseaVomiting/S016C001P007R001A048_rgb', 'nauseaVomiting/S015C002P008R001A048_rgb', 'nauseaVomiting/S008C002P019R001A048_rgb', 'nauseaVomiting/S009C002P007R002A048_rgb', 'nauseaVomiting/S008C003P001R001A048_rgb', 'nauseaVomiting/S001C001P002R002A048_rgb', 'nauseaVomiting/S008C001P007R002A048_rgb', 'nauseaVomiting/S011C001P002R001A048_rgb', 'nauseaVomiting/S014C002P015R002A048_rgb', 'nauseaVomiting/S010C001P018R002A048_rgb', 'nauseaVomiting/S011C001P038R002A048_rgb', 'nauseaVomiting/S008C003P032R001A048_rgb', 'nauseaVomiting/S015C003P025R001A048_rgb', 'nauseaVomiting/S013C002P028R001A048_rgb', 'nauseaVomiting/S004C001P008R001A048_rgb', 'nauseaVomiting/S013C001P025R002A048_rgb', 
'nauseaVomiting/S003C001P002R001A048_rgb', 'nauseaVomiting/S008C003P015R002A048_rgb', 'nauseaVomiting/S006C002P019R002A048_rgb', 'nauseaVomiting/S001C003P008R001A048_rgb', 'nauseaVomiting/S016C002P021R002A048_rgb', 'nauseaVomiting/S007C003P026R001A048_rgb', 'nauseaVomiting/S017C003P016R001A048_rgb', 'nauseaVomiting/S004C003P007R002A048_rgb', 'nauseaVomiting/S002C002P007R002A048_rgb', 'nauseaVomiting/S014C003P027R002A048_rgb', 'nauseaVomiting/S015C001P008R002A048_rgb', 'nauseaVomiting/S007C001P015R001A048_rgb', 'nauseaVomiting/S001C002P005R001A048_rgb', 'nauseaVomiting/S011C003P002R002A048_rgb', 'nauseaVomiting/S010C003P021R002A048_rgb', 'nauseaVomiting/S011C001P038R001A048_rgb', 'nauseaVomiting/S012C003P017R001A048_rgb', 'nauseaVomiting/S015C002P037R002A048_rgb', 'nauseaVomiting/S009C003P007R001A048_rgb', 'nauseaVomiting/S011C001P028R001A048_rgb', 'nauseaVomiting/S010C003P008R002A048_rgb', 'nauseaVomiting/S013C003P028R001A048_rgb', 'nauseaVomiting/S015C001P025R001A048_rgb', 'nauseaVomiting/S005C001P004R001A048_rgb', 'nauseaVomiting/S007C001P027R002A048_rgb', 'nauseaVomiting/S016C001P008R002A048_rgb', 'nauseaVomiting/S017C001P015R002A048_rgb', 'nauseaVomiting/S008C002P031R002A048_rgb', 'nauseaVomiting/S016C003P021R002A048_rgb', 'nauseaVomiting/S009C003P016R002A048_rgb', 'nauseaVomiting/S010C001P016R002A048_rgb', 'nauseaVomiting/S017C003P017R001A048_rgb', 'nauseaVomiting/S016C003P007R001A048_rgb', 'nauseaVomiting/S003C003P008R002A048_rgb', 'nauseaVomiting/S009C001P019R002A048_rgb', 'nauseaVomiting/S007C003P028R002A048_rgb', 'nauseaVomiting/S013C001P017R001A048_rgb', 'nauseaVomiting/S005C003P013R002A048_rgb', 'nauseaVomiting/S011C001P025R002A048_rgb', 'nauseaVomiting/S006C002P017R001A048_rgb', 'nauseaVomiting/S003C002P015R001A048_rgb', 'nauseaVomiting/S010C003P008R001A048_rgb', 'nauseaVomiting/S011C003P028R002A048_rgb', 'nauseaVomiting/S011C003P017R001A048_rgb', 'nauseaVomiting/S002C001P007R002A048_rgb', 'nauseaVomiting/S005C003P004R002A048_rgb', 'nauseaVomiting/S008C001P029R002A048_rgb', 'nauseaVomiting/S017C002P003R002A048_rgb', 'nauseaVomiting/S015C003P017R001A048_rgb', 'nauseaVomiting/S001C002P007R001A048_rgb', 'nauseaVomiting/S012C003P015R001A048_rgb', 'nauseaVomiting/S004C001P003R002A048_rgb', 'nauseaVomiting/S003C002P018R001A048_rgb', 'nauseaVomiting/S011C002P016R002A048_rgb', 'nauseaVomiting/S012C002P028R001A048_rgb', 'nauseaVomiting/S017C003P003R002A048_rgb', 'nauseaVomiting/S014C003P027R001A048_rgb', 'nauseaVomiting/S011C001P008R001A048_rgb', 'nauseaVomiting/S010C002P016R001A048_rgb', 'nauseaVomiting/S014C001P025R001A048_rgb', 'nauseaVomiting/S002C003P007R001A048_rgb', 'nauseaVomiting/S011C003P007R001A048_rgb', 'nauseaVomiting/S015C001P015R001A048_rgb', 'nauseaVomiting/S005C001P013R002A048_rgb', 'nauseaVomiting/S017C003P020R002A048_rgb', 'nauseaVomiting/S004C002P020R001A048_rgb', 'nauseaVomiting/S011C001P001R001A048_rgb', 'nauseaVomiting/S008C002P031R001A048_rgb', 'nauseaVomiting/S006C002P017R002A048_rgb', 'nauseaVomiting/S017C001P020R001A048_rgb', 'nauseaVomiting/S001C002P001R001A048_rgb', 'nauseaVomiting/S003C003P002R002A048_rgb', 'nauseaVomiting/S001C001P002R001A048_rgb', 'nauseaVomiting/S015C003P037R002A048_rgb', 'nauseaVomiting/S015C002P037R001A048_rgb', 'nauseaVomiting/S006C003P019R001A048_rgb', 'nauseaVomiting/S015C003P015R001A048_rgb', 'nauseaVomiting/S007C002P001R001A048_rgb', 'nauseaVomiting/S009C003P008R001A048_rgb', 'nauseaVomiting/S004C003P008R001A048_rgb', 'nauseaVomiting/S006C001P019R002A048_rgb', 'nauseaVomiting/S006C003P001R002A048_rgb', 
'nauseaVomiting/S011C003P018R001A048_rgb', 'nauseaVomiting/S013C001P018R002A048_rgb', 'nauseaVomiting/S011C002P018R002A048_rgb', 'nauseaVomiting/S009C003P025R001A048_rgb', 'nauseaVomiting/S005C001P013R001A048_rgb', 'nauseaVomiting/S012C003P008R001A048_rgb', 'nauseaVomiting/S012C001P018R002A048_rgb', 'nauseaVomiting/S008C002P033R001A048_rgb', 'nauseaVomiting/S006C003P023R001A048_rgb', 'nauseaVomiting/S013C003P037R001A048_rgb', 'nauseaVomiting/S017C003P009R001A048_rgb', 'nauseaVomiting/S010C002P018R002A048_rgb', 'nauseaVomiting/S010C003P016R001A048_rgb', 'nauseaVomiting/S010C002P007R001A048_rgb', 'nauseaVomiting/S006C002P001R001A048_rgb', 'nauseaVomiting/S014C001P017R001A048_rgb', 'nauseaVomiting/S013C001P008R002A048_rgb', 'nauseaVomiting/S016C001P040R002A048_rgb', 'nauseaVomiting/S011C002P016R001A048_rgb', 'nauseaVomiting/S013C003P016R002A048_rgb', 'nauseaVomiting/S014C002P015R001A048_rgb', 'nauseaVomiting/S001C003P008R002A048_rgb', 'nauseaVomiting/S013C001P015R002A048_rgb', 'nauseaVomiting/S001C003P006R002A048_rgb', 'nauseaVomiting/S011C001P027R002A048_rgb', 'nauseaVomiting/S011C001P015R002A048_rgb', 'nauseaVomiting/S017C003P017R002A048_rgb', 'nauseaVomiting/S005C002P010R001A048_rgb', 'nauseaVomiting/S008C001P031R001A048_rgb', 'nauseaVomiting/S013C002P028R002A048_rgb', 'nauseaVomiting/S002C003P010R001A048_rgb', 'nauseaVomiting/S013C001P027R002A048_rgb', 'nauseaVomiting/S014C001P007R002A048_rgb', 'nauseaVomiting/S001C003P007R002A048_rgb', 'nauseaVomiting/S008C003P019R001A048_rgb', 'nauseaVomiting/S016C003P039R001A048_rgb', 'nauseaVomiting/S012C003P018R002A048_rgb', 'nauseaVomiting/S008C003P035R002A048_rgb', 'nauseaVomiting/S014C002P025R002A048_rgb', 'nauseaVomiting/S014C002P039R001A048_rgb', 'nauseaVomiting/S013C002P007R001A048_rgb', 'nauseaVomiting/S011C003P002R001A048_rgb', 'nauseaVomiting/S013C002P016R001A048_rgb', 'nauseaVomiting/S005C002P016R001A048_rgb', 'nauseaVomiting/S014C001P037R002A048_rgb', 'nauseaVomiting/S003C003P019R001A048_rgb', 'nauseaVomiting/S013C002P018R002A048_rgb', 'nauseaVomiting/S004C002P008R002A048_rgb', 'nauseaVomiting/S003C001P016R001A048_rgb', 'nauseaVomiting/S013C002P019R002A048_rgb', 'nauseaVomiting/S013C001P015R001A048_rgb', 'nauseaVomiting/S001C002P006R002A048_rgb', 'nauseaVomiting/S004C001P003R001A048_rgb', 'nauseaVomiting/S011C002P002R002A048_rgb', 'nauseaVomiting/S015C002P015R001A048_rgb', 'fanSelf/S001C002P001R001A049_rgb', 'fanSelf/S001C002P001R002A049_rgb', 'fanSelf/S003C003P002R002A049_rgb', 'fanSelf/S004C003P020R002A049_rgb', 'fanSelf/S011C001P017R002A049_rgb', 'fanSelf/S011C001P002R002A049_rgb', 'fanSelf/S002C003P012R001A049_rgb', 'fanSelf/S015C003P019R002A049_rgb', 'fanSelf/S015C003P015R001A049_rgb', 'fanSelf/S011C003P002R001A049_rgb', 'fanSelf/S016C001P039R001A049_rgb', 'fanSelf/S004C003P008R002A049_rgb', 'fanSelf/S016C001P025R001A049_rgb', 'fanSelf/S013C003P016R001A049_rgb', 'fanSelf/S014C003P019R001A049_rgb', 'fanSelf/S015C001P015R001A049_rgb', 'fanSelf/S012C002P015R002A049_rgb', 'fanSelf/S008C003P015R002A049_rgb', 'fanSelf/S009C003P008R001A049_rgb', 'fanSelf/S016C001P007R001A049_rgb', 'fanSelf/S005C003P017R001A049_rgb', 'fanSelf/S017C002P016R001A049_rgb', 'fanSelf/S011C003P008R001A049_rgb', 'fanSelf/S007C003P025R001A049_rgb', 'fanSelf/S012C002P027R002A049_rgb', 'fanSelf/S013C001P027R001A049_rgb', 'fanSelf/S005C003P021R001A049_rgb', 'fanSelf/S008C002P033R001A049_rgb', 'fanSelf/S009C001P016R002A049_rgb', 'fanSelf/S007C002P028R002A049_rgb', 'fanSelf/S003C002P017R002A049_rgb', 'fanSelf/S002C003P003R001A049_rgb', 'fanSelf/S011C001P027R001A049_rgb', 
'fanSelf/S007C002P015R001A049_rgb', 'fanSelf/S008C002P001R002A049_rgb', 'fanSelf/S004C002P008R001A049_rgb', 'fanSelf/S010C003P019R001A049_rgb', 'fanSelf/S009C001P025R002A049_rgb', 'fanSelf/S010C003P016R001A049_rgb', 'fanSelf/S007C002P015R002A049_rgb', 'fanSelf/S017C001P008R002A049_rgb', 'fanSelf/S007C002P007R001A049_rgb', 'fanSelf/S012C001P019R002A049_rgb', 'fanSelf/S017C003P017R001A049_rgb', 'fanSelf/S007C003P016R001A049_rgb', 'fanSelf/S008C003P035R002A049_rgb', 'fanSelf/S007C003P017R001A049_rgb', 'fanSelf/S016C002P019R002A049_rgb', 'fanSelf/S011C002P027R002A049_rgb', 'fanSelf/S010C001P015R001A049_rgb', 'fanSelf/S002C002P012R001A049_rgb', 'fanSelf/S011C002P019R001A049_rgb', 'fanSelf/S012C001P018R001A049_rgb', 'fanSelf/S005C003P004R001A049_rgb', 'fanSelf/S007C001P008R001A049_rgb', 'fanSelf/S016C001P040R002A049_rgb', 'fanSelf/S002C003P013R001A049_rgb', 'fanSelf/S014C003P008R001A049_rgb', 'fanSelf/S005C003P015R002A049_rgb', 'fanSelf/S011C003P007R002A049_rgb', 'fanSelf/S012C001P017R002A049_rgb', 'fanSelf/S006C001P024R002A049_rgb', 'fanSelf/S011C002P028R002A049_rgb', 'fanSelf/S004C001P008R001A049_rgb', 'fanSelf/S011C002P015R002A049_rgb', 'fanSelf/S012C003P025R002A049_rgb', 'fanSelf/S013C001P017R002A049_rgb', 'fanSelf/S008C001P007R002A049_rgb', 'fanSelf/S005C001P021R002A049_rgb', 'fanSelf/S011C001P008R002A049_rgb', 'fanSelf/S015C002P037R002A049_rgb', 'fanSelf/S017C003P008R001A049_rgb', 'fanSelf/S013C003P028R001A049_rgb', 'fanSelf/S016C002P019R001A049_rgb', 'fanSelf/S014C003P015R002A049_rgb', 'fanSelf/S002C003P011R002A049_rgb', 'fanSelf/S014C001P027R002A049_rgb', 'fanSelf/S002C002P008R002A049_rgb', 'fanSelf/S005C001P018R001A049_rgb', 'fanSelf/S002C003P010R002A049_rgb', 'fanSelf/S011C001P001R002A049_rgb', 'fanSelf/S011C003P001R002A049_rgb', 'fanSelf/S008C003P007R001A049_rgb', 'fanSelf/S006C001P019R001A049_rgb', 'fanSelf/S006C002P022R001A049_rgb', 'fanSelf/S001C003P007R002A049_rgb', 'fanSelf/S001C001P005R001A049_rgb', 'fanSelf/S015C002P008R002A049_rgb', 'fanSelf/S013C002P025R002A049_rgb', 'fanSelf/S010C001P018R002A049_rgb', 'fanSelf/S013C003P027R001A049_rgb', 'fanSelf/S012C002P019R002A049_rgb', 'fanSelf/S004C002P007R001A049_rgb', 'fanSelf/S002C002P011R002A049_rgb', 'fanSelf/S015C001P008R002A049_rgb', 'fanSelf/S003C003P001R001A049_rgb', 'fanSelf/S010C002P025R002A049_rgb', 'fanSelf/S007C002P018R001A049_rgb', 'fanSelf/S002C002P009R001A049_rgb', 'fanSelf/S006C002P024R001A049_rgb', 'fanSelf/S007C003P019R002A049_rgb', 'fanSelf/S013C001P015R002A049_rgb', 'fanSelf/S016C001P040R001A049_rgb', 'fanSelf/S012C001P027R001A049_rgb', 'fanSelf/S007C001P007R001A049_rgb', 'fanSelf/S008C002P030R001A049_rgb', 'fanSelf/S006C003P016R002A049_rgb', 'fanSelf/S011C002P016R002A049_rgb', 'fanSelf/S016C001P019R001A049_rgb', 'fanSelf/S012C003P028R002A049_rgb', 'fanSelf/S001C001P007R002A049_rgb', 'fanSelf/S016C003P007R001A049_rgb', 'fanSelf/S017C001P003R001A049_rgb', 'fanSelf/S005C002P004R002A049_rgb', 'fanSelf/S013C002P019R002A049_rgb', 'fanSelf/S013C001P018R001A049_rgb', 'fanSelf/S012C003P008R002A049_rgb', 'fanSelf/S002C002P010R002A049_rgb', 'fanSelf/S008C002P035R001A049_rgb', 'fanSelf/S010C001P021R001A049_rgb', 'fanSelf/S015C003P016R001A049_rgb', 'fanSelf/S001C003P006R002A049_rgb', 'fanSelf/S014C001P017R001A049_rgb', 'fanSelf/S003C002P016R001A049_rgb', 'fanSelf/S011C001P025R002A049_rgb', 'fanSelf/S013C003P018R002A049_rgb', 'fanSelf/S006C002P015R001A049_rgb', 'fanSelf/S008C003P025R002A049_rgb', 'fanSelf/S008C001P036R001A049_rgb', 'fanSelf/S007C002P019R002A049_rgb', 'fanSelf/S011C001P008R001A049_rgb', 
'fanSelf/S013C001P037R001A049_rgb', 'fanSelf/S001C002P002R001A049_rgb', 'fanSelf/S008C002P015R002A049_rgb', 'fanSelf/S001C003P007R001A049_rgb', 'fanSelf/S014C003P025R001A049_rgb', 'fanSelf/S004C001P020R001A049_rgb', 'fanSelf/S006C003P015R001A049_rgb', 'fanSelf/S013C001P008R002A049_rgb', 'fanSelf/S017C003P008R002A049_rgb', 'fanSelf/S001C003P001R002A049_rgb', 'fanSelf/S008C003P036R001A049_rgb', 'fanSelf/S012C001P008R001A049_rgb', 'fanSelf/S006C003P001R001A049_rgb', 'fanSelf/S009C001P019R001A049_rgb', 'fanSelf/S006C002P019R001A049_rgb', 'fanSelf/S005C001P017R002A049_rgb', 'fanSelf/S006C003P008R002A049_rgb', 'fanSelf/S004C003P003R002A049_rgb', 'fanSelf/S003C001P008R001A049_rgb', 'fanSelf/S017C002P015R001A049_rgb', 'fanSelf/S007C003P025R002A049_rgb', 'fanSelf/S003C002P008R001A049_rgb', 'fanSelf/S015C001P008R001A049_rgb', 'fanSelf/S017C003P007R001A049_rgb', 'fanSelf/S012C003P007R002A049_rgb', 'fanSelf/S015C001P037R002A049_rgb', 'fanSelf/S010C001P007R002A049_rgb', 'fanSelf/S002C002P013R001A049_rgb', 'fanSelf/S013C003P008R002A049_rgb', 'fanSelf/S006C002P001R001A049_rgb', 'fanSelf/S002C001P011R001A049_rgb', 'fanSelf/S005C002P015R001A049_rgb', 'fanSelf/S014C003P037R002A049_rgb', 'fanSelf/S009C002P019R002A049_rgb', 'fanSelf/S014C002P019R001A049_rgb', 'fanSelf/S012C003P037R001A049_rgb', 'fanSelf/S001C001P006R001A049_rgb', 'fanSelf/S015C001P007R002A049_rgb', 'fanSelf/S011C002P015R001A049_rgb', 'fanSelf/S014C001P037R002A049_rgb', 'fanSelf/S007C002P025R002A049_rgb', 'fanSelf/S002C003P013R002A049_rgb', 'fanSelf/S013C002P016R001A049_rgb', 'fanSelf/S002C002P010R001A049_rgb', 'fanSelf/S014C001P037R001A049_rgb', 'fanSelf/S007C002P017R002A049_rgb', 'fanSelf/S008C001P019R002A049_rgb', 'fanSelf/S008C001P032R002A049_rgb', 'fanSelf/S008C002P034R002A049_rgb', 'fanSelf/S017C002P009R001A049_rgb', 'fanSelf/S006C003P008R001A049_rgb', 'fanSelf/S003C003P008R002A049_rgb', 'fanSelf/S011C001P002R001A049_rgb', 'fanSelf/S012C002P018R001A049_rgb', 'fanSelf/S005C003P021R002A049_rgb', 'fanSelf/S006C001P015R002A049_rgb', 'fanSelf/S015C002P025R002A049_rgb', 'fanSelf/S001C002P005R001A049_rgb', 'fanSelf/S012C002P007R001A049_rgb', 'fanSelf/S010C002P008R002A049_rgb', 'fanSelf/S007C003P015R001A049_rgb', 'fanSelf/S011C003P019R001A049_rgb', 'fanSelf/S015C001P015R002A049_rgb', 'fanSelf/S005C002P004R001A049_rgb', 'fanSelf/S006C003P001R002A049_rgb', 'fanSelf/S004C001P003R001A049_rgb', 'fanSelf/S016C001P025R002A049_rgb', 'fanSelf/S014C001P017R002A049_rgb', 'fanSelf/S007C003P017R002A049_rgb', 'fanSelf/S008C001P031R002A049_rgb', 'fanSelf/S008C001P015R001A049_rgb', 'fanSelf/S003C002P001R002A049_rgb', 'fanSelf/S008C001P001R002A049_rgb', 'fanSelf/S005C001P015R001A049_rgb', 'fanSelf/S010C002P008R001A049_rgb', 'fanSelf/S015C002P007R002A049_rgb', 'fanSelf/S003C003P002R001A049_rgb', 'fanSelf/S008C003P030R001A049_rgb', 'fanSelf/S014C003P027R002A049_rgb', 'fanSelf/S007C001P026R001A049_rgb', 'fanSelf/S008C002P008R001A049_rgb', 'fanSelf/S016C003P008R002A049_rgb', 'fanSelf/S005C003P013R002A049_rgb', 'fanSelf/S007C003P015R002A049_rgb', 'fanSelf/S016C001P019R002A049_rgb', 'fanSelf/S008C002P036R002A049_rgb', 'fanSelf/S013C002P018R001A049_rgb', 'fanSelf/S006C001P024R001A049_rgb', 'fanSelf/S006C001P023R001A049_rgb', 'fanSelf/S003C002P002R001A049_rgb', 'fanSelf/S011C002P019R002A049_rgb', 'fanSelf/S001C001P008R002A049_rgb', 'fanSelf/S007C002P016R001A049_rgb', 'fanSelf/S007C003P001R001A049_rgb', 'fanSelf/S001C003P005R002A049_rgb', 'fanSelf/S010C003P008R002A049_rgb', 'fanSelf/S008C002P001R001A049_rgb', 'fanSelf/S007C001P018R002A049_rgb', 
'fanSelf/S001C002P005R002A049_rgb', 'fanSelf/S003C001P002R002A049_rgb', 'fanSelf/S011C003P015R001A049_rgb', 'fanSelf/S010C003P017R002A049_rgb', 'fanSelf/S017C002P003R001A049_rgb', 'fanSelf/S011C003P017R001A049_rgb', 'fanSelf/S017C003P009R001A049_rgb', 'fanSelf/S013C002P008R002A049_rgb', 'fanSelf/S003C001P017R002A049_rgb', 'fanSelf/S014C001P007R002A049_rgb', 'fanSelf/S004C003P007R001A049_rgb', 'fanSelf/S011C001P015R002A049_rgb', 'fanSelf/S009C002P008R002A049_rgb', 'fanSelf/S008C003P032R002A049_rgb', 'fanSelf/S006C002P017R002A049_rgb', 'fanSelf/S015C001P016R001A049_rgb', 'fanSelf/S013C001P008R001A049_rgb', 'fanSelf/S011C002P025R002A049_rgb', 'fanSelf/S015C002P015R002A049_rgb', 'fanSelf/S001C001P001R001A049_rgb', 'fanSelf/S002C003P007R002A049_rgb', 'fanSelf/S007C002P001R002A049_rgb', 'fanSelf/S015C002P017R002A049_rgb', 'fanSelf/S012C002P017R002A049_rgb', 'fanSelf/S008C001P034R002A049_rgb', 'fanSelf/S015C002P007R001A049_rgb', 'fanSelf/S001C002P007R001A049_rgb', 'fanSelf/S008C001P029R002A049_rgb', 'fanSelf/S003C001P007R002A049_rgb', 'fanSelf/S016C001P021R001A049_rgb', 'fanSelf/S011C002P001R001A049_rgb', 'fanSelf/S013C002P037R001A049_rgb', 'fanSelf/S002C002P003R002A049_rgb', 'fanSelf/S001C002P003R002A049_rgb', 'fanSelf/S003C002P002R002A049_rgb', 'fanSelf/S014C001P008R001A049_rgb', 'fanSelf/S002C001P010R001A049_rgb', 'fanSelf/S009C003P016R001A049_rgb', 'fanSelf/S014C001P027R001A049_rgb', 'fanSelf/S011C003P017R002A049_rgb', 'fanSelf/S011C001P018R002A049_rgb', 'fanSelf/S004C002P020R002A049_rgb', 'fanSelf/S002C002P012R002A049_rgb', 'fanSelf/S013C003P027R002A049_rgb', 'fanSelf/S007C002P017R001A049_rgb', 'fanSelf/S012C002P019R001A049_rgb', 'fanSelf/S010C003P016R002A049_rgb', 'fanSelf/S011C003P027R001A049_rgb', 'fanSelf/S013C003P037R001A049_rgb', 'fanSelf/S007C003P026R001A049_rgb', 'fanSelf/S008C002P036R001A049_rgb', 'fanSelf/S013C001P025R002A049_rgb', 'fanSelf/S002C003P003R002A049_rgb', 'fanSelf/S013C001P016R001A049_rgb', 'fanSelf/S003C001P002R001A049_rgb', 'fanSelf/S001C002P003R001A049_rgb', 'fanSelf/S005C003P016R001A049_rgb', 'fanSelf/S007C002P016R002A049_rgb', 'fanSelf/S008C001P025R002A049_rgb', 'fanSelf/S010C001P016R001A049_rgb', 'fanSelf/S003C001P019R002A049_rgb', 'fanSelf/S005C003P010R001A049_rgb', 'fanSelf/S008C001P036R002A049_rgb', 'fanSelf/S014C002P008R002A049_rgb', 'fanSelf/S016C003P008R001A049_rgb', 'fanSelf/S012C002P008R002A049_rgb', 'fanSelf/S002C001P007R002A049_rgb', 'fanSelf/S003C001P007R001A049_rgb', 'fanSelf/S007C001P016R001A049_rgb', 'fanSelf/S011C003P016R001A049_rgb', 'fanSelf/S015C003P037R001A049_rgb', 'fanSelf/S008C002P025R001A049_rgb', 'fanSelf/S012C002P008R001A049_rgb', 'fanSelf/S012C003P019R001A049_rgb', 'fanSelf/S001C001P001R002A049_rgb', 'fanSelf/S005C003P010R002A049_rgb', 'fanSelf/S012C002P037R001A049_rgb', 'fanSelf/S007C001P026R002A049_rgb', 'fanSelf/S011C003P025R001A049_rgb', 'fanSelf/S012C001P025R001A049_rgb', 'fanSelf/S008C003P031R002A049_rgb', 'fanSelf/S012C003P018R001A049_rgb', 'fanSelf/S017C003P016R002A049_rgb', 'fanSelf/S013C002P025R001A049_rgb', 'fanSelf/S013C002P019R001A049_rgb', 'fanSelf/S002C002P003R001A049_rgb', 'fanSelf/S007C002P007R002A049_rgb', 'fanSelf/S002C002P009R002A049_rgb', 'fanSelf/S003C003P016R002A049_rgb', 'fanSelf/S014C003P015R001A049_rgb', 'fanSelf/S017C002P015R002A049_rgb', 'fanSelf/S014C003P017R002A049_rgb', 'fanSelf/S009C001P007R002A049_rgb', 'fanSelf/S017C003P016R001A049_rgb', 'fanSelf/S011C003P038R002A049_rgb', 'fanSelf/S015C003P015R002A049_rgb', 'fanSelf/S006C003P022R002A049_rgb', 'fanSelf/S002C002P007R001A049_rgb', 
'fanSelf/S014C002P027R001A049_rgb', 'fanSelf/S007C003P018R001A049_rgb', 'fanSelf/S009C003P017R002A049_rgb', 'fanSelf/S004C001P007R002A049_rgb', 'fanSelf/S006C001P015R001A049_rgb', 'fanSelf/S008C003P025R001A049_rgb', 'fanSelf/S011C002P018R002A049_rgb', 'fanSelf/S003C003P007R002A049_rgb', 'fanSelf/S013C002P028R001A049_rgb', 'fanSelf/S008C002P019R002A049_rgb', 'fanSelf/S016C001P007R002A049_rgb', 'fanSelf/S004C003P003R001A049_rgb', 'fanSelf/S007C003P008R001A049_rgb', 'fanSelf/S006C001P017R002A049_rgb', 'fanSelf/S005C003P017R002A049_rgb', 'fanSelf/S006C001P022R002A049_rgb', 'fanSelf/S017C002P017R001A049_rgb', 'fanSelf/S005C001P016R002A049_rgb', 'fanSelf/S006C003P024R001A049_rgb', 'fanSelf/S011C003P015R002A049_rgb', 'fanSelf/S013C003P028R002A049_rgb', 'fanSelf/S005C002P021R001A049_rgb', 'fanSelf/S012C003P008R001A049_rgb', 'fanSelf/S013C001P028R002A049_rgb', 'fanSelf/S005C001P013R001A049_rgb', 'fanSelf/S002C001P009R001A049_rgb', 'fanSelf/S015C002P008R001A049_rgb', 'fanSelf/S015C003P025R001A049_rgb', 'fanSelf/S001C003P004R002A049_rgb', 'fanSelf/S010C001P013R002A049_rgb', 'fanSelf/S009C002P016R002A049_rgb', 'fanSelf/S010C003P019R002A049_rgb', 'fanSelf/S002C003P014R002A049_rgb', 'fanSelf/S014C001P015R002A049_rgb', 'fanSelf/S008C002P031R001A049_rgb', 'fanSelf/S010C002P016R002A049_rgb', 'fanSelf/S011C003P025R002A049_rgb', 'fanSelf/S013C001P018R002A049_rgb', 'fanSelf/S011C001P025R001A049_rgb', 'fanSelf/S002C001P008R002A049_rgb', 'fanSelf/S003C002P015R001A049_rgb', 'fanSelf/S003C002P017R001A049_rgb', 'fanSelf/S006C003P017R001A049_rgb', 'fanSelf/S003C002P016R002A049_rgb', 'fanSelf/S007C001P008R002A049_rgb', 'fanSelf/S006C001P016R001A049_rgb', 'fanSelf/S006C002P023R001A049_rgb', 'fanSelf/S008C002P007R002A049_rgb', 'fanSelf/S002C003P009R002A049_rgb', 'fanSelf/S009C003P025R001A049_rgb', 'fanSelf/S010C003P017R001A049_rgb', 'fanSelf/S015C002P025R001A049_rgb', 'fanSelf/S013C003P037R002A049_rgb', 'fanSelf/S009C002P017R001A049_rgb', 'fanSelf/S016C003P007R002A049_rgb', 'fanSelf/S010C003P013R001A049_rgb', 'fanSelf/S015C002P019R001A049_rgb', 'fanSelf/S010C003P021R002A049_rgb', 'fanSelf/S010C003P015R002A049_rgb', 'fanSelf/S017C002P008R001A049_rgb', 'fanSelf/S014C001P039R001A049_rgb', 'fanSelf/S012C001P037R001A049_rgb', 'fanSelf/S013C002P028R002A049_rgb', 'fanSelf/S002C001P012R002A049_rgb', 'fanSelf/S006C002P015R002A049_rgb', 'fanSelf/S017C001P015R002A049_rgb', 'fanSelf/S011C002P017R002A049_rgb', 'fanSelf/S011C001P018R001A049_rgb', 'fanSelf/S015C002P037R001A049_rgb', 'fanSelf/S006C001P023R002A049_rgb', 'fanSelf/S009C003P017R001A049_rgb', 'fanSelf/S003C002P019R002A049_rgb', 'fanSelf/S008C002P035R002A049_rgb', 'fanSelf/S008C001P035R002A049_rgb', 'fanSelf/S010C001P025R002A049_rgb', 'fanSelf/S013C003P008R001A049_rgb', 'fanSelf/S005C003P018R001A049_rgb', 'fanSelf/S011C001P038R001A049_rgb', 'fanSelf/S012C002P016R001A049_rgb', 'fanSelf/S015C001P037R001A049_rgb', 'fanSelf/S006C002P001R002A049_rgb', 'fanSelf/S006C003P007R002A049_rgb', 'fanSelf/S003C003P017R001A049_rgb', 'fanSelf/S017C002P020R002A049_rgb', 'fanSelf/S012C003P015R001A049_rgb', 'fanSelf/S003C001P001R002A049_rgb', 'fanSelf/S012C001P007R002A049_rgb', 'fanSelf/S015C003P017R002A049_rgb', 'fanSelf/S006C001P008R002A049_rgb', 'fanSelf/S005C002P021R002A049_rgb', 'fanSelf/S002C001P008R001A049_rgb', 'fanSelf/S011C003P008R002A049_rgb', 'fanSelf/S003C001P019R001A049_rgb', 'fanSelf/S007C001P017R001A049_rgb', 'fanSelf/S006C002P023R002A049_rgb', 'fanSelf/S016C002P021R001A049_rgb', 'fanSelf/S017C001P007R001A049_rgb', 'fanSelf/S008C001P030R001A049_rgb', 
'fanSelf/S014C002P039R002A049_rgb', 'fanSelf/S016C003P019R001A049_rgb', 'fanSelf/S014C002P025R001A049_rgb', 'fanSelf/S002C001P003R001A049_rgb', 'fanSelf/S011C002P007R002A049_rgb', 'fanSelf/S016C003P021R001A049_rgb', 'fanSelf/S013C001P027R002A049_rgb', 'fanSelf/S002C002P011R001A049_rgb', 'fanSelf/S010C002P021R001A049_rgb', 'fanSelf/S011C001P019R001A049_rgb', 'fanSelf/S006C003P017R002A049_rgb', 'fanSelf/S007C002P008R002A049_rgb', 'fanSelf/S011C002P016R001A049_rgb', 'fanSelf/S017C003P020R001A049_rgb', 'fanSelf/S008C002P030R002A049_rgb', 'fanSelf/S013C002P017R002A049_rgb', 'fanSelf/S002C001P012R001A049_rgb', 'fanSelf/S011C003P027R002A049_rgb', 'fanSelf/S014C002P015R001A049_rgb', 'fanSelf/S013C003P017R002A049_rgb', 'fanSelf/S011C001P017R001A049_rgb', 'fanSelf/S002C001P013R001A049_rgb', 'fanSelf/S010C003P007R002A049_rgb', 'fanSelf/S012C003P016R002A049_rgb', 'fanSelf/S005C002P016R002A049_rgb', 'fanSelf/S008C003P008R001A049_rgb', 'fanSelf/S005C001P017R001A049_rgb', 'fanSelf/S001C001P002R002A049_rgb', 'fanSelf/S011C003P018R001A049_rgb', 'fanSelf/S008C003P029R002A049_rgb', 'fanSelf/S014C001P039R002A049_rgb', 'fanSelf/S009C002P017R002A049_rgb', 'fanSelf/S016C002P007R001A049_rgb', 'fanSelf/S007C003P008R002A049_rgb', 'fanSelf/S003C001P016R002A049_rgb', 'fanSelf/S012C001P037R002A049_rgb', 'fanSelf/S012C003P017R002A049_rgb', 'fanSelf/S016C002P025R001A049_rgb', 'fanSelf/S007C001P018R001A049_rgb', 'fanSelf/S010C001P018R001A049_rgb', 'fanSelf/S005C003P016R002A049_rgb', 'fanSelf/S004C003P020R001A049_rgb', 'fanSelf/S005C001P010R001A049_rgb', 'fanSelf/S005C002P016R001A049_rgb', 'fanSelf/S006C002P008R002A049_rgb', 'fanSelf/S006C003P015R002A049_rgb', 'fanSelf/S013C001P025R001A049_rgb', 'fanSelf/S015C003P016R002A049_rgb', 'fanSelf/S015C001P007R001A049_rgb', 'fanSelf/S006C002P019R002A049_rgb', 'fanSelf/S004C003P008R001A049_rgb', 'fanSelf/S008C002P033R002A049_rgb', 'fanSelf/S009C002P025R001A049_rgb', 'fanSelf/S003C003P015R002A049_rgb', 'fanSelf/S002C003P008R002A049_rgb', 'fanSelf/S013C003P015R001A049_rgb', 'fanSelf/S001C001P002R001A049_rgb', 'fanSelf/S005C001P021R001A049_rgb', 'fanSelf/S012C002P007R002A049_rgb', 'fanSelf/S017C001P017R001A049_rgb', 'fanSelf/S016C003P040R001A049_rgb', 'fanSelf/S007C001P007R002A049_rgb', 'fanSelf/S011C001P027R002A049_rgb', 'fanSelf/S015C003P019R001A049_rgb', 'fanSelf/S009C002P015R002A049_rgb', 'fanSelf/S014C001P015R001A049_rgb', 'fanSelf/S001C002P002R002A049_rgb', 'fanSelf/S017C002P017R002A049_rgb', 'fanSelf/S003C003P018R001A049_rgb', 'fanSelf/S010C002P017R002A049_rgb', 'fanSelf/S010C002P021R002A049_rgb', 'fanSelf/S017C003P003R001A049_rgb', 'fanSelf/S014C002P027R002A049_rgb', 'fanSelf/S012C002P015R001A049_rgb', 'fanSelf/S006C002P016R002A049_rgb', 'fanSelf/S014C001P025R002A049_rgb', 'fanSelf/S012C003P037R002A049_rgb', 'fanSelf/S011C002P007R001A049_rgb', 'fanSelf/S006C002P017R001A049_rgb', 'fanSelf/S002C002P008R001A049_rgb', 'fanSelf/S001C002P007R002A049_rgb', 'fanSelf/S013C002P037R002A049_rgb', 'fanSelf/S003C001P018R002A049_rgb', 'fanSelf/S012C003P019R002A049_rgb', 'fanSelf/S017C003P007R002A049_rgb', 'fanSelf/S006C001P001R002A049_rgb', 'fanSelf/S007C002P008R001A049_rgb', 'fanSelf/S015C002P015R001A049_rgb', 'fanSelf/S014C002P037R002A049_rgb', 'fanSelf/S001C002P004R001A049_rgb', 'fanSelf/S017C003P009R002A049_rgb', 'fanSelf/S006C001P019R002A049_rgb', 'fanSelf/S009C001P015R002A049_rgb', 'fanSelf/S007C003P026R002A049_rgb', 'fanSelf/S006C001P007R002A049_rgb', 'fanSelf/S011C002P008R002A049_rgb', 'fanSelf/S017C003P020R002A049_rgb', 'fanSelf/S015C003P007R001A049_rgb', 
'fanSelf/S006C002P022R002A049_rgb', 'fanSelf/S005C001P018R002A049_rgb', 'fanSelf/S004C001P003R002A049_rgb', 'fanSelf/S017C002P007R001A049_rgb', 'fanSelf/S006C002P007R002A049_rgb', 'fanSelf/S002C003P008R001A049_rgb', 'fanSelf/S011C003P019R002A049_rgb', 'fanSelf/S012C001P019R001A049_rgb', 'fanSelf/S015C002P016R002A049_rgb', 'fanSelf/S012C001P016R001A049_rgb', 'fanSelf/S013C002P007R002A049_rgb', 'fanSelf/S008C002P008R002A049_rgb', 'fanSelf/S007C003P001R002A049_rgb', 'fanSelf/S011C002P018R001A049_rgb', 'fanSelf/S010C002P018R002A049_rgb', 'fanSelf/S003C002P007R001A049_rgb', 'fanSelf/S015C003P008R001A049_rgb', 'fanSelf/S006C003P007R001A049_rgb', 'fanSelf/S008C003P019R001A049_rgb', 'fanSelf/S016C003P040R002A049_rgb', 'fanSelf/S007C001P028R002A049_rgb', 'fanSelf/S001C002P008R001A049_rgb', 'fanSelf/S011C003P007R001A049_rgb', 'fanSelf/S015C003P037R002A049_rgb', 'fanSelf/S009C002P015R001A049_rgb', 'fanSelf/S006C001P017R001A049_rgb', 'fanSelf/S009C001P008R001A049_rgb', 'fanSelf/S015C001P017R002A049_rgb', 'fanSelf/S007C002P028R001A049_rgb', 'fanSelf/S008C001P033R002A049_rgb', 'fanSelf/S002C003P012R002A049_rgb', 'fanSelf/S005C002P018R002A049_rgb', 'fanSelf/S006C002P024R002A049_rgb', 'fanSelf/S007C003P028R002A049_rgb', 'fanSelf/S017C003P017R002A049_rgb', 'fanSelf/S014C002P039R001A049_rgb', 'fanSelf/S005C001P015R002A049_rgb', 'fanSelf/S002C003P014R001A049_rgb', 'fanSelf/S013C003P019R002A049_rgb', 'fanSelf/S010C001P021R002A049_rgb', 'fanSelf/S013C002P016R002A049_rgb', 'fanSelf/S007C002P019R001A049_rgb', 'fanSelf/S007C001P015R002A049_rgb', 'fanSelf/S012C001P015R002A049_rgb', 'fanSelf/S007C002P027R001A049_rgb', 'fanSelf/S006C003P022R001A049_rgb', 'fanSelf/S002C001P010R002A049_rgb', 'fanSelf/S012C001P028R002A049_rgb', 'fanSelf/S014C003P039R001A049_rgb', 'fanSelf/S003C001P015R001A049_rgb', 'fanSelf/S011C002P027R001A049_rgb', 'fanSelf/S003C001P018R001A049_rgb', 'fanSelf/S015C002P017R001A049_rgb', 'fanSelf/S011C001P007R001A049_rgb', 'fanSelf/S001C003P003R002A049_rgb', 'fanSelf/S015C001P025R002A049_rgb', 'fanSelf/S015C002P019R002A049_rgb', 'fanSelf/S005C001P004R002A049_rgb', 'fanSelf/S008C002P007R001A049_rgb', 'fanSelf/S012C001P027R002A049_rgb', 'fanSelf/S004C002P003R001A049_rgb', 'fanSelf/S011C001P038R002A049_rgb', 'fanSelf/S011C003P018R002A049_rgb', 'fanSelf/S003C002P008R002A049_rgb', 'fanSelf/S008C001P007R001A049_rgb', 'fanSelf/S006C003P024R002A049_rgb', 'fanSelf/S017C001P017R002A049_rgb', 'fanSelf/S016C002P040R001A049_rgb', 'fanSelf/S008C001P029R001A049_rgb', 'fanSelf/S010C001P019R002A049_rgb', 'fanSelf/S011C001P028R002A049_rgb', 'fanSelf/S007C003P027R001A049_rgb', 'fanSelf/S001C003P001R001A049_rgb', 'fanSelf/S017C003P015R001A049_rgb', 'fanSelf/S007C001P001R002A049_rgb', 'fanSelf/S017C001P016R002A049_rgb', 'fanSelf/S006C002P007R001A049_rgb', 'fanSelf/S005C002P010R001A049_rgb', 'fanSelf/S010C003P025R002A049_rgb', 'fanSelf/S003C003P016R001A049_rgb', 'fanSelf/S002C001P007R001A049_rgb', 'fanSelf/S008C002P015R001A049_rgb', 'fanSelf/S012C003P027R001A049_rgb', 'fanSelf/S011C003P038R001A049_rgb', 'fanSelf/S006C003P023R001A049_rgb', 'fanSelf/S006C002P008R001A049_rgb', 'fanSelf/S014C001P025R001A049_rgb', 'fanSelf/S003C001P001R001A049_rgb', 'fanSelf/S003C003P019R001A049_rgb', 'fanSelf/S002C001P009R002A049_rgb', 'fanSelf/S014C001P007R001A049_rgb', 'fanSelf/S011C002P038R002A049_rgb', 'fanSelf/S009C001P016R001A049_rgb', 'fanSelf/S010C003P025R001A049_rgb', 'fanSelf/S008C001P034R001A049_rgb', 'fanSelf/S014C003P008R002A049_rgb', 'fanSelf/S002C002P014R002A049_rgb', 'fanSelf/S001C001P005R002A049_rgb', 
'fanSelf/S004C001P008R002A049_rgb', 'fanSelf/S011C003P028R002A049_rgb', 'fanSelf/S014C002P015R002A049_rgb', 'fanSelf/S017C002P016R002A049_rgb', 'fanSelf/S013C003P016R002A049_rgb', 'fanSelf/S017C002P003R002A049_rgb', 'fanSelf/S013C003P019R001A049_rgb', 'fanSelf/S014C002P017R002A049_rgb', 'fanSelf/S010C002P007R002A049_rgb', 'fanSelf/S008C003P001R002A049_rgb', 'fanSelf/S008C003P035R001A049_rgb', 'fanSelf/S015C001P016R002A049_rgb', 'fanSelf/S004C003P007R002A049_rgb', 'fanSelf/S013C001P037R002A049_rgb', 'fanSelf/S001C001P008R001A049_rgb', 'fanSelf/S001C003P003R001A049_rgb', 'fanSelf/S014C002P037R001A049_rgb', 'fanSelf/S008C003P015R001A049_rgb', 'fanSelf/S013C002P015R002A049_rgb', 'fanSelf/S009C001P008R002A049_rgb', 'fanSelf/S005C001P010R002A049_rgb', 'fanSelf/S008C003P032R001A049_rgb', 'fanSelf/S013C003P017R001A049_rgb', 'fanSelf/S013C001P015R001A049_rgb', 'fanSelf/S010C001P015R002A049_rgb', 'fanSelf/S009C002P025R002A049_rgb', 'fanSelf/S015C003P025R002A049_rgb', 'fanSelf/S011C001P001R001A049_rgb', 'fanSelf/S012C003P015R002A049_rgb', 'fanSelf/S009C003P019R001A049_rgb', 'fanSelf/S008C002P019R001A049_rgb', 'fanSelf/S012C002P028R002A049_rgb', 'fanSelf/S002C001P013R002A049_rgb', 'fanSelf/S015C001P019R002A049_rgb', 'fanSelf/S003C003P017R002A049_rgb', 'fanSelf/S007C002P026R001A049_rgb', 'fanSelf/S009C001P025R001A049_rgb', 'fanSelf/S010C001P017R002A049_rgb', 'fanSelf/S001C003P005R001A049_rgb', 'fanSelf/S005C002P017R002A049_rgb', 'fanSelf/S012C003P028R001A049_rgb', 'fanSelf/S008C003P029R001A049_rgb', 'fanSelf/S003C001P016R001A049_rgb', 'fanSelf/S007C001P027R002A049_rgb', 'fanSelf/S017C001P007R002A049_rgb', 'fanSelf/S007C003P007R001A049_rgb', 'fanSelf/S013C003P007R002A049_rgb', 'fanSelf/S010C002P016R001A049_rgb', 'fanSelf/S005C003P013R001A049_rgb', 'fanSelf/S009C001P019R002A049_rgb', 'fanSelf/S015C002P016R001A049_rgb', 'fanSelf/S010C002P013R002A049_rgb', 'fanSelf/S007C001P015R001A049_rgb', 'fanSelf/S007C002P026R002A049_rgb', 'fanSelf/S014C002P019R002A049_rgb', 'fanSelf/S007C001P028R001A049_rgb', 'fanSelf/S001C001P006R002A049_rgb', 'fanSelf/S007C001P019R002A049_rgb', 'fanSelf/S009C003P019R002A049_rgb', 'fanSelf/S014C002P025R002A049_rgb', 'fanSelf/S003C002P018R001A049_rgb', 'fanSelf/S005C001P013R002A049_rgb', 'fanSelf/S008C002P032R002A049_rgb', 'fanSelf/S002C002P014R001A049_rgb', 'fanSelf/S016C002P025R002A049_rgb', 'fanSelf/S010C002P015R001A049_rgb', 'fanSelf/S017C003P003R002A049_rgb', 'fanSelf/S008C003P034R002A049_rgb', 'fanSelf/S015C001P019R001A049_rgb', 'fanSelf/S003C001P017R001A049_rgb', 'fanSelf/S014C003P007R002A049_rgb', 'fanSelf/S007C002P027R002A049_rgb', 'fanSelf/S016C003P021R002A049_rgb', 'fanSelf/S010C001P013R001A049_rgb', 'fanSelf/S016C002P039R002A049_rgb', 'fanSelf/S002C001P014R002A049_rgb', 'fanSelf/S010C003P018R002A049_rgb', 'fanSelf/S010C003P013R002A049_rgb', 'fanSelf/S013C001P007R001A049_rgb', 'fanSelf/S010C002P015R002A049_rgb', 'fanSelf/S007C003P007R002A049_rgb', 'fanSelf/S009C002P007R001A049_rgb', 'fanSelf/S012C003P018R002A049_rgb', 'fanSelf/S002C002P007R002A049_rgb', 'fanSelf/S001C003P002R001A049_rgb', 'fanSelf/S002C003P009R001A049_rgb', 'fanSelf/S007C003P016R002A049_rgb', 'fanSelf/S012C001P007R001A049_rgb', 'fanSelf/S008C001P001R001A049_rgb', 'fanSelf/S001C001P003R002A049_rgb', 'fanSelf/S008C001P008R002A049_rgb', 'fanSelf/S007C001P025R002A049_rgb', 'fanSelf/S017C001P020R001A049_rgb', 'fanSelf/S014C002P007R001A049_rgb', 'fanSelf/S016C002P007R002A049_rgb', 'fanSelf/S005C002P017R001A049_rgb', 'fanSelf/S016C001P008R002A049_rgb', 'fanSelf/S005C001P016R001A049_rgb', 
'fanSelf/S001C001P004R002A049_rgb', 'fanSelf/S005C001P004R001A049_rgb', 'fanSelf/S007C001P001R001A049_rgb', 'fanSelf/S003C002P018R002A049_rgb', 'fanSelf/S008C003P008R002A049_rgb', 'fanSelf/S003C001P015R002A049_rgb', 'fanSelf/S011C001P016R001A049_rgb', 'fanSelf/S006C001P016R002A049_rgb', 'fanSelf/S011C001P016R002A049_rgb', 'fanSelf/S012C001P015R001A049_rgb', 'fanSelf/S016C003P025R002A049_rgb', 'fanSelf/S015C001P017R001A049_rgb', 'fanSelf/S002C002P013R002A049_rgb', 'fanSelf/S003C002P007R002A049_rgb', 'fanSelf/S011C003P028R001A049_rgb', 'fanSelf/S009C003P015R002A049_rgb', 'fanSelf/S010C003P018R001A049_rgb', 'fanSelf/S014C003P017R001A049_rgb', 'fanSelf/S005C002P010R002A049_rgb', 'fanSelf/S014C001P019R001A049_rgb', 'fanSelf/S008C003P036R002A049_rgb', 'fanSelf/S006C002P016R001A049_rgb', 'fanSelf/S007C003P028R001A049_rgb', 'fanSelf/S013C001P017R001A049_rgb', 'fanSelf/S010C003P007R001A049_rgb', 'fanSelf/S009C003P008R002A049_rgb', 'fanSelf/S017C002P009R002A049_rgb', 'fanSelf/S003C003P018R002A049_rgb', 'fanSelf/S014C001P019R002A049_rgb', 'fanSelf/S009C001P015R001A049_rgb', 'fanSelf/S006C003P019R001A049_rgb', 'fanSelf/S005C003P018R002A049_rgb', 'fanSelf/S006C001P007R001A049_rgb', 'fanSelf/S003C001P008R002A049_rgb', 'fanSelf/S004C001P020R002A049_rgb', 'fanSelf/S013C003P025R002A049_rgb', 'fanSelf/S012C003P007R001A049_rgb', 'fanSelf/S011C001P028R001A049_rgb', 'fanSelf/S012C003P025R001A049_rgb']
['sneezeCough/S014C001P019R001A041_rgb 41', 'sneezeCough/S016C003P021R001A041_rgb 41', 'sneezeCough/S008C001P001R001A041_rgb 41', 'sneezeCough/S013C003P016R001A041_rgb 41', 'sneezeCough/S009C001P015R002A041_rgb 41', 'sneezeCough/S010C002P021R002A041_rgb 41', 'sneezeCough/S001C003P006R001A041_rgb 41', 'sneezeCough/S016C001P019R001A041_rgb 41', 'sneezeCough/S016C002P025R002A041_rgb 41', 'sneezeCough/S008C002P030R001A041_rgb 41', 'sneezeCough/S010C001P016R002A041_rgb 41', 'sneezeCough/S011C002P015R001A041_rgb 41', 'sneezeCough/S017C002P008R002A041_rgb 41', 'sneezeCough/S010C001P017R002A041_rgb 41', 'sneezeCough/S006C003P022R002A041_rgb 41', 'sneezeCough/S012C003P027R001A041_rgb 41', 'sneezeCough/S007C002P019R001A041_rgb 41', 'sneezeCough/S011C001P017R001A041_rgb 41', 'sneezeCough/S008C001P030R002A041_rgb 41', 'sneezeCough/S007C002P015R002A041_rgb 41', 'sneezeCough/S003C001P002R001A041_rgb 41', 'sneezeCough/S008C003P030R002A041_rgb 41', 'sneezeCough/S003C003P016R002A041_rgb 41', 'sneezeCough/S010C002P007R002A041_rgb 41', 'sneezeCough/S017C001P007R001A041_rgb 41', 'sneezeCough/S017C003P008R001A041_rgb 41', 'sneezeCough/S005C001P018R001A041_rgb 41', 'sneezeCough/S003C003P018R002A041_rgb 41', 'sneezeCough/S016C002P021R001A041_rgb 41', 'sneezeCough/S001C003P003R001A041_rgb 41', 'sneezeCough/S008C001P029R002A041_rgb 41', 'sneezeCough/S003C001P018R002A041_rgb 41', 'sneezeCough/S008C002P035R001A041_rgb 41', 'sneezeCough/S006C002P015R002A041_rgb 41', 'sneezeCough/S010C002P019R001A041_rgb 41', 'sneezeCough/S012C003P025R001A041_rgb 41', 'sneezeCough/S005C001P015R001A041_rgb 41', 'sneezeCough/S008C002P033R001A041_rgb 41', 'sneezeCough/S008C001P034R002A041_rgb 41', 'sneezeCough/S007C002P016R002A041_rgb 41', 'sneezeCough/S007C002P027R001A041_rgb 41', 'sneezeCough/S007C003P018R001A041_rgb 41', 'sneezeCough/S013C002P017R001A041_rgb 41', 'sneezeCough/S005C002P004R002A041_rgb 41', 'sneezeCough/S006C001P008R001A041_rgb 41', 'sneezeCough/S001C001P007R002A041_rgb 41', 'sneezeCough/S003C002P018R002A041_rgb 41', 'sneezeCough/S007C003P016R001A041_rgb 41', 'sneezeCough/S003C001P002R002A041_rgb 41', 'sneezeCough/S016C001P019R002A041_rgb 41', 
'sneezeCough/S014C001P025R002A041_rgb 41', 'sneezeCough/S008C002P032R001A041_rgb 41', 'sneezeCough/S007C003P028R002A041_rgb 41', 'sneezeCough/S006C002P022R001A041_rgb 41', 'sneezeCough/S003C001P019R001A041_rgb 41', 'sneezeCough/S008C003P029R001A041_rgb 41', 'sneezeCough/S011C003P028R002A041_rgb 41', 'sneezeCough/S009C002P007R001A041_rgb 41', 'sneezeCough/S014C003P015R001A041_rgb 41', 'sneezeCough/S016C001P008R002A041_rgb 41', 'sneezeCough/S014C001P007R001A041_rgb 41', 'sneezeCough/S001C003P004R002A041_rgb 41', 'sneezeCough/S004C003P008R001A041_rgb 41', 'sneezeCough/S013C002P007R002A041_rgb 41', 'sneezeCough/S012C003P016R002A041_rgb 41', 'sneezeCough/S016C001P039R001A041_rgb 41', 'sneezeCough/S009C001P007R001A041_rgb 41', 'sneezeCough/S013C003P028R001A041_rgb 41', 'sneezeCough/S006C002P001R001A041_rgb 41', 'sneezeCough/S004C002P003R002A041_rgb 41', 'sneezeCough/S013C001P018R002A041_rgb 41', 'sneezeCough/S013C003P025R002A041_rgb 41', 'sneezeCough/S003C002P008R001A041_rgb 41', 'sneezeCough/S007C001P008R002A041_rgb 41', 'sneezeCough/S006C003P007R002A041_rgb 41', 'sneezeCough/S016C002P025R001A041_rgb 41', 'sneezeCough/S005C002P018R002A041_rgb 41', 'sneezeCough/S017C003P009R002A041_rgb 41', 'sneezeCough/S011C002P016R001A041_rgb 41', 'sneezeCough/S005C001P017R001A041_rgb 41', 'sneezeCough/S012C001P015R002A041_rgb 41', 'sneezeCough/S012C002P025R001A041_rgb 41', 'sneezeCough/S016C002P019R002A041_rgb 41', 'sneezeCough/S002C003P009R001A041_rgb 41', 'sneezeCough/S008C002P007R001A041_rgb 41', 'sneezeCough/S007C001P015R002A041_rgb 41', 'sneezeCough/S006C001P019R001A041_rgb 41', 'sneezeCough/S014C002P017R002A041_rgb 41', 'sneezeCough/S008C001P008R002A041_rgb 41', 'sneezeCough/S008C002P031R002A041_rgb 41', 'sneezeCough/S010C001P008R001A041_rgb 41', 'sneezeCough/S017C002P016R001A041_rgb 41', 'sneezeCough/S008C003P015R001A041_rgb 41', 'sneezeCough/S011C003P015R002A041_rgb 41', 'sneezeCough/S017C003P008R002A041_rgb 41', 'sneezeCough/S008C002P035R002A041_rgb 41', 'sneezeCough/S008C001P008R001A041_rgb 41', 'sneezeCough/S014C003P037R001A041_rgb 41', 'sneezeCough/S015C002P025R001A041_rgb 41', 'sneezeCough/S015C002P015R002A041_rgb 41', 'sneezeCough/S005C001P010R001A041_rgb 41', 'sneezeCough/S006C003P007R001A041_rgb 41', 'sneezeCough/S010C003P018R002A041_rgb 41', 'sneezeCough/S011C003P019R002A041_rgb 41', 'sneezeCough/S003C001P008R002A041_rgb 41', 'sneezeCough/S013C002P008R001A041_rgb 41', 'sneezeCough/S013C001P016R001A041_rgb 41', 'sneezeCough/S011C003P002R001A041_rgb 41', 'sneezeCough/S007C003P007R002A041_rgb 41', 'sneezeCough/S014C002P039R002A041_rgb 41', 'sneezeCough/S008C001P001R002A041_rgb 41', 'sneezeCough/S009C003P025R002A041_rgb 41', 'sneezeCough/S003C002P007R001A041_rgb 41', 'sneezeCough/S013C001P015R002A041_rgb 41', 'sneezeCough/S017C003P015R001A041_rgb 41', 'sneezeCough/S008C003P025R002A041_rgb 41', 'sneezeCough/S015C002P019R001A041_rgb 41', 'sneezeCough/S005C002P016R001A041_rgb 41', 'sneezeCough/S017C001P008R002A041_rgb 41', 'sneezeCough/S005C001P016R001A041_rgb 41', 'sneezeCough/S012C002P017R002A041_rgb 41', 'sneezeCough/S002C001P010R002A041_rgb 41', 'sneezeCough/S008C002P001R001A041_rgb 41', 'sneezeCough/S012C002P007R001A041_rgb 41', 'sneezeCough/S012C002P018R002A041_rgb 41', 'sneezeCough/S016C002P040R002A041_rgb 41', 'sneezeCough/S017C001P017R001A041_rgb 41', 'sneezeCough/S012C001P017R002A041_rgb 41', 'sneezeCough/S009C001P017R001A041_rgb 41', 'sneezeCough/S007C003P017R002A041_rgb 41', 'sneezeCough/S011C002P025R002A041_rgb 41', 'sneezeCough/S008C002P031R001A041_rgb 41', 
'sneezeCough/S005C001P015R002A041_rgb 41', 'sneezeCough/S006C003P017R002A041_rgb 41', 'sneezeCough/S017C001P003R001A041_rgb 41', 'sneezeCough/S015C001P037R001A041_rgb 41', 'sneezeCough/S007C003P027R001A041_rgb 41', 'sneezeCough/S006C003P015R001A041_rgb 41', 'sneezeCough/S004C003P020R002A041_rgb 41', 'sneezeCough/S006C001P017R002A041_rgb 41', 'sneezeCough/S012C003P025R002A041_rgb 41', 'sneezeCough/S006C002P019R002A041_rgb 41', 'sneezeCough/S012C001P016R001A041_rgb 41', 'sneezeCough/S002C003P012R002A041_rgb 41', 'sneezeCough/S016C003P040R002A041_rgb 41', 'sneezeCough/S007C003P015R001A041_rgb 41', 'sneezeCough/S017C002P017R002A041_rgb 41', 'sneezeCough/S002C002P008R001A041_rgb 41', 'sneezeCough/S008C002P019R001A041_rgb 41', 'sneezeCough/S010C001P015R002A041_rgb 41', 'sneezeCough/S013C001P016R002A041_rgb 41', 'sneezeCough/S014C003P027R002A041_rgb 41', 'sneezeCough/S017C003P015R002A041_rgb 41', 'sneezeCough/S003C002P019R001A041_rgb 41', 'sneezeCough/S011C003P025R001A041_rgb 41', 'sneezeCough/S001C003P002R001A041_rgb 41', 'sneezeCough/S016C002P019R001A041_rgb 41', 'sneezeCough/S007C001P018R002A041_rgb 41', 'sneezeCough/S005C002P017R001A041_rgb 41', 'sneezeCough/S005C001P018R002A041_rgb 41', 'sneezeCough/S001C003P008R002A041_rgb 41', 'sneezeCough/S008C003P008R001A041_rgb 41', 'sneezeCough/S009C001P015R001A041_rgb 41', 'sneezeCough/S015C002P019R002A041_rgb 41', 'sneezeCough/S012C002P027R002A041_rgb 41', 'sneezeCough/S014C002P007R001A041_rgb 41', 'sneezeCough/S003C002P019R002A041_rgb 41', 'sneezeCough/S006C001P023R002A041_rgb 41', 'sneezeCough/S013C002P015R002A041_rgb 41', 'sneezeCough/S017C003P017R001A041_rgb 41', 'sneezeCough/S002C002P012R002A041_rgb 41', 'sneezeCough/S007C002P019R002A041_rgb 41', 'sneezeCough/S002C002P011R001A041_rgb 41', 'sneezeCough/S011C001P038R001A041_rgb 41', 'sneezeCough/S007C001P001R001A041_rgb 41', 'sneezeCough/S017C003P016R001A041_rgb 41', 'sneezeCough/S011C001P015R001A041_rgb 41', 'sneezeCough/S011C002P019R002A041_rgb 41', 'sneezeCough/S013C001P007R002A041_rgb 41', 'sneezeCough/S008C003P033R001A041_rgb 41', 'sneezeCough/S008C002P008R002A041_rgb 41', 'sneezeCough/S014C002P017R001A041_rgb 41', 'sneezeCough/S007C002P025R001A041_rgb 41', 'sneezeCough/S003C002P001R002A041_rgb 41', 'sneezeCough/S007C003P019R001A041_rgb 41', 'sneezeCough/S010C001P013R001A041_rgb 41', 'sneezeCough/S010C003P013R001A041_rgb 41', 'sneezeCough/S008C002P036R001A041_rgb 41', 'sneezeCough/S006C001P023R001A041_rgb 41', 'sneezeCough/S001C003P005R002A041_rgb 41', 'staggering/S008C002P030R001A042_rgb 42', 'staggering/S008C003P031R002A042_rgb 42', 'staggering/S016C002P007R002A042_rgb 42', 'staggering/S008C003P007R002A042_rgb 42', 'staggering/S015C002P025R001A042_rgb 42', 'staggering/S008C003P015R001A042_rgb 42', 'staggering/S002C003P010R002A042_rgb 42', 'staggering/S016C001P040R001A042_rgb 42', 'staggering/S012C002P017R002A042_rgb 42', 'staggering/S002C001P012R002A042_rgb 42', 'staggering/S013C003P007R002A042_rgb 42', 'staggering/S012C003P019R001A042_rgb 42', 'staggering/S007C001P025R001A042_rgb 42', 'staggering/S014C002P037R001A042_rgb 42', 'staggering/S007C002P028R001A042_rgb 42', 'staggering/S001C002P003R002A042_rgb 42', 'staggering/S010C001P007R002A042_rgb 42', 'staggering/S012C003P027R002A042_rgb 42', 'staggering/S011C003P027R001A042_rgb 42', 'staggering/S010C003P007R001A042_rgb 42', 'staggering/S010C002P013R002A042_rgb 42', 'staggering/S001C001P006R002A042_rgb 42', 'staggering/S007C002P001R001A042_rgb 42', 'staggering/S013C002P025R002A042_rgb 42', 'staggering/S013C002P037R002A042_rgb 42', 
'staggering/S013C001P027R001A042_rgb 42', 'staggering/S015C003P017R001A042_rgb 42', 'staggering/S003C001P008R002A042_rgb 42', 'staggering/S005C001P021R002A042_rgb 42', 'staggering/S017C002P015R002A042_rgb 42', 'staggering/S013C001P019R001A042_rgb 42', 'staggering/S001C002P002R002A042_rgb 42', 'staggering/S015C002P016R001A042_rgb 42', 'staggering/S002C002P011R001A042_rgb 42', 'staggering/S006C002P008R001A042_rgb 42', 'staggering/S014C003P007R002A042_rgb 42', 'staggering/S006C001P022R002A042_rgb 42', 'staggering/S010C003P025R001A042_rgb 42', 'staggering/S013C002P007R001A042_rgb 42', 'staggering/S006C003P019R002A042_rgb 42', 'staggering/S011C001P038R001A042_rgb 42', 'staggering/S014C002P027R001A042_rgb 42', 'staggering/S007C001P008R002A042_rgb 42', 'staggering/S014C001P025R002A042_rgb 42', 'staggering/S008C003P019R002A042_rgb 42', 'staggering/S003C003P001R002A042_rgb 42', 'staggering/S005C003P021R001A042_rgb 42', 'staggering/S012C003P015R001A042_rgb 42', 'staggering/S017C003P015R001A042_rgb 42', 'staggering/S006C003P023R002A042_rgb 42', 'staggering/S012C002P037R002A042_rgb 42', 'staggering/S005C003P017R001A042_rgb 42', 'staggering/S012C003P016R002A042_rgb 42', 'staggering/S016C001P019R001A042_rgb 42', 'staggering/S008C003P001R001A042_rgb 42', 'staggering/S002C001P013R001A042_rgb 42', 'staggering/S012C002P027R002A042_rgb 42', 'staggering/S003C003P008R002A042_rgb 42', 'staggering/S011C002P016R001A042_rgb 42', 'staggering/S013C001P025R002A042_rgb 42', 'staggering/S005C002P021R002A042_rgb 42', 'staggering/S002C002P009R002A042_rgb 42', 'staggering/S013C002P008R002A042_rgb 42', 'staggering/S009C003P017R002A042_rgb 42', 'staggering/S007C003P017R001A042_rgb 42', 'staggering/S007C003P001R001A042_rgb 42', 'staggering/S005C003P015R002A042_rgb 42', 'staggering/S009C003P019R002A042_rgb 42', 'staggering/S008C002P034R001A042_rgb 42', 'staggering/S008C001P035R001A042_rgb 42', 'staggering/S009C001P007R002A042_rgb 42', 'staggering/S010C002P019R001A042_rgb 42', 'staggering/S009C001P019R001A042_rgb 42', 'staggering/S007C002P027R001A042_rgb 42', 'staggering/S015C002P017R001A042_rgb 42', 'staggering/S009C001P008R002A042_rgb 42', 'staggering/S007C002P017R002A042_rgb 42', 'staggering/S007C002P019R001A042_rgb 42', 'staggering/S002C002P007R001A042_rgb 42', 'staggering/S011C002P002R001A042_rgb 42', 'staggering/S015C002P037R001A042_rgb 42', 'staggering/S004C002P007R001A042_rgb 42', 'staggering/S003C002P015R001A042_rgb 42', 'staggering/S004C003P003R002A042_rgb 42', 'staggering/S006C001P019R002A042_rgb 42', 'staggering/S013C003P027R001A042_rgb 42', 'staggering/S003C001P002R002A042_rgb 42', 'staggering/S011C001P028R002A042_rgb 42', 'staggering/S010C001P017R002A042_rgb 42', 'staggering/S011C003P028R002A042_rgb 42', 'staggering/S014C003P007R001A042_rgb 42', 'staggering/S006C002P022R001A042_rgb 42', 'staggering/S007C001P027R002A042_rgb 42', 'staggering/S008C003P030R001A042_rgb 42', 'staggering/S015C001P016R002A042_rgb 42', 'staggering/S007C001P025R002A042_rgb 42', 'staggering/S016C002P019R001A042_rgb 42', 'staggering/S008C001P001R002A042_rgb 42', 'staggering/S002C001P011R001A042_rgb 42', 'staggering/S012C003P008R002A042_rgb 42', 'staggering/S011C001P028R001A042_rgb 42', 'staggering/S009C001P016R001A042_rgb 42', 'staggering/S012C001P015R001A042_rgb 42', 'staggering/S002C003P008R001A042_rgb 42', 'staggering/S006C003P007R002A042_rgb 42', 'staggering/S012C002P008R001A042_rgb 42', 'staggering/S006C003P015R002A042_rgb 42', 'staggering/S014C001P015R002A042_rgb 42', 'staggering/S013C002P017R001A042_rgb 42', 
'staggering/S013C002P017R002A042_rgb 42', 'staggering/S014C003P008R001A042_rgb 42', 'staggering/S002C002P009R001A042_rgb 42', 'staggering/S002C002P008R001A042_rgb 42', 'staggering/S010C002P018R002A042_rgb 42', 'staggering/S013C003P025R001A042_rgb 42', 'staggering/S012C002P018R001A042_rgb 42', 'staggering/S006C002P024R001A042_rgb 42', 'staggering/S013C001P017R001A042_rgb 42', 'staggering/S013C001P025R001A042_rgb 42', 'staggering/S013C001P008R002A042_rgb 42', 'staggering/S015C003P016R001A042_rgb 42', 'staggering/S011C002P018R001A042_rgb 42', 'staggering/S011C001P015R002A042_rgb 42', 'staggering/S009C002P015R002A042_rgb 42', 'staggering/S012C001P018R002A042_rgb 42', 'staggering/S002C002P010R002A042_rgb 42', 'staggering/S017C001P015R001A042_rgb 42', 'staggering/S007C003P025R002A042_rgb 42', 'staggering/S012C003P037R002A042_rgb 42', 'staggering/S010C002P015R001A042_rgb 42', 'staggering/S004C003P008R001A042_rgb 42', 'staggering/S006C002P007R002A042_rgb 42', 'staggering/S001C001P001R001A042_rgb 42', 'staggering/S007C001P026R001A042_rgb 42', 'staggering/S002C003P009R002A042_rgb 42', 'staggering/S017C003P017R001A042_rgb 42', 'staggering/S011C003P015R002A042_rgb 42', 'staggering/S003C003P007R001A042_rgb 42', 'staggering/S014C002P015R002A042_rgb 42', 'staggering/S011C002P002R002A042_rgb 42', 'staggering/S008C002P015R002A042_rgb 42', 'staggering/S001C003P008R002A042_rgb 42', 'staggering/S013C002P016R001A042_rgb 42', 'staggering/S013C003P019R002A042_rgb 42', 'staggering/S003C001P007R002A042_rgb 42', 'staggering/S006C003P022R002A042_rgb 42', 'staggering/S005C002P004R001A042_rgb 42', 'staggering/S008C003P032R001A042_rgb 42', 'staggering/S006C001P008R001A042_rgb 42', 'staggering/S001C001P004R001A042_rgb 42', 'staggering/S017C001P008R001A042_rgb 42', 'staggering/S016C003P039R001A042_rgb 42', 'staggering/S012C001P007R002A042_rgb 42', 'staggering/S011C002P008R002A042_rgb 42', 'staggering/S014C001P019R002A042_rgb 42', 'staggering/S002C003P014R001A042_rgb 42', 'staggering/S003C002P017R002A042_rgb 42', 'staggering/S012C003P028R002A042_rgb 42', 'staggering/S012C003P025R002A042_rgb 42', 'staggering/S010C003P018R001A042_rgb 42', 'staggering/S006C001P007R001A042_rgb 42', 'staggering/S005C002P010R002A042_rgb 42', 'staggering/S002C002P011R002A042_rgb 42', 'staggering/S017C003P003R002A042_rgb 42', 'staggering/S004C002P007R002A042_rgb 42', 'staggering/S012C003P017R001A042_rgb 42', 'staggering/S003C002P007R002A042_rgb 42', 'staggering/S017C003P003R001A042_rgb 42', 'staggering/S003C001P002R001A042_rgb 42', 'staggering/S008C002P033R002A042_rgb 42', 'staggering/S012C001P008R001A042_rgb 42', 'staggering/S010C003P016R002A042_rgb 42', 'staggering/S012C003P027R001A042_rgb 42', 'staggering/S001C001P008R001A042_rgb 42', 'staggering/S001C003P001R001A042_rgb 42', 'staggering/S017C001P009R001A042_rgb 42', 'staggering/S008C003P015R002A042_rgb 42', 'staggering/S005C001P013R002A042_rgb 42', 'staggering/S012C002P015R002A042_rgb 42', 'staggering/S013C002P016R002A042_rgb 42', 'staggering/S013C003P027R002A042_rgb 42', 'staggering/S017C001P017R001A042_rgb 42', 'staggering/S007C002P015R002A042_rgb 42', 'staggering/S014C003P039R002A042_rgb 42', 'staggering/S010C003P019R001A042_rgb 42', 'staggering/S013C002P028R002A042_rgb 42', 'staggering/S001C001P005R001A042_rgb 42', 'staggering/S009C002P025R002A042_rgb 42', 'staggering/S008C003P007R001A042_rgb 42', 'staggering/S003C001P016R002A042_rgb 42', 'fallingDown/S009C002P015R001A043_rgb 43', 'fallingDown/S010C003P016R001A043_rgb 43', 'fallingDown/S007C002P001R001A043_rgb 43', 
'fallingDown/S014C002P008R002A043_rgb 43', 'fallingDown/S005C003P013R002A043_rgb 43', 'fallingDown/S005C003P017R002A043_rgb 43', 'fallingDown/S014C001P008R001A043_rgb 43', 'fallingDown/S005C002P018R001A043_rgb 43', 'fallingDown/S005C002P004R002A043_rgb 43', 'fallingDown/S011C002P028R001A043_rgb 43', 'fallingDown/S014C003P025R001A043_rgb 43', 'fallingDown/S003C001P015R002A043_rgb 43', 'fallingDown/S003C002P015R002A043_rgb 43', 'fallingDown/S010C001P021R002A043_rgb 43', 'fallingDown/S013C002P028R001A043_rgb 43', 'fallingDown/S012C002P017R002A043_rgb 43', 'fallingDown/S012C002P016R002A043_rgb 43', 'fallingDown/S007C003P007R002A043_rgb 43', 'fallingDown/S011C003P038R001A043_rgb 43', 'fallingDown/S015C001P015R002A043_rgb 43', 'fallingDown/S002C001P011R001A043_rgb 43', 'fallingDown/S017C002P009R002A043_rgb 43', 'fallingDown/S008C003P019R002A043_rgb 43', 'fallingDown/S016C003P040R002A043_rgb 43', 'fallingDown/S015C002P037R002A043_rgb 43', 'fallingDown/S012C002P007R001A043_rgb 43', 'fallingDown/S010C002P025R001A043_rgb 43', 'fallingDown/S016C002P040R001A043_rgb 43', 'fallingDown/S010C002P019R002A043_rgb 43', 'fallingDown/S012C001P019R001A043_rgb 43', 'fallingDown/S009C002P019R001A043_rgb 43', 'fallingDown/S008C001P008R002A043_rgb 43', 'fallingDown/S013C003P037R002A043_rgb 43', 'fallingDown/S012C003P007R002A043_rgb 43', 'fallingDown/S014C001P037R001A043_rgb 43', 'fallingDown/S006C002P017R001A043_rgb 43', 'fallingDown/S002C001P007R001A043_rgb 43', 'fallingDown/S013C001P018R002A043_rgb 43', 'fallingDown/S009C002P019R002A043_rgb 43', 'fallingDown/S011C002P027R002A043_rgb 43', 'fallingDown/S002C002P013R001A043_rgb 43', 'fallingDown/S004C003P020R002A043_rgb 43', 'fallingDown/S003C001P002R001A043_rgb 43', 'fallingDown/S014C002P008R001A043_rgb 43', 'fallingDown/S009C003P015R002A043_rgb 43', 'fallingDown/S003C001P018R001A043_rgb 43', 'fallingDown/S007C001P007R001A043_rgb 43', 'fallingDown/S003C003P007R001A043_rgb 43', 'fallingDown/S001C001P005R002A043_rgb 43', 'fallingDown/S007C003P026R002A043_rgb 43', 'fallingDown/S009C003P019R002A043_rgb 43', 'fallingDown/S004C002P008R001A043_rgb 43', 'fallingDown/S013C002P027R002A043_rgb 43', 'fallingDown/S006C003P017R001A043_rgb 43', 'fallingDown/S013C003P028R002A043_rgb 43', 'fallingDown/S002C003P007R002A043_rgb 43', 'fallingDown/S002C002P010R001A043_rgb 43', 'fallingDown/S006C003P016R001A043_rgb 43', 'fallingDown/S005C002P013R001A043_rgb 43', 'fallingDown/S003C001P002R002A043_rgb 43', 'fallingDown/S008C001P015R002A043_rgb 43', 'fallingDown/S014C002P037R001A043_rgb 43', 'fallingDown/S003C002P016R001A043_rgb 43', 'fallingDown/S004C001P007R002A043_rgb 43', 'fallingDown/S002C001P014R002A043_rgb 43', 'fallingDown/S012C002P018R002A043_rgb 43', 'fallingDown/S007C001P018R002A043_rgb 43', 'fallingDown/S005C001P004R002A043_rgb 43', 'fallingDown/S001C002P005R002A043_rgb 43', 'fallingDown/S007C003P015R001A043_rgb 43', 'fallingDown/S007C001P015R002A043_rgb 43', 'fallingDown/S013C001P015R002A043_rgb 43', 'fallingDown/S009C001P008R001A043_rgb 43', 'fallingDown/S015C002P007R001A043_rgb 43', 'fallingDown/S006C003P023R002A043_rgb 43', 'fallingDown/S005C003P016R001A043_rgb 43', 'fallingDown/S003C003P017R002A043_rgb 43', 'fallingDown/S012C002P027R002A043_rgb 43', 'fallingDown/S010C002P008R001A043_rgb 43', 'fallingDown/S011C002P038R001A043_rgb 43', 'fallingDown/S010C003P019R001A043_rgb 43', 'fallingDown/S014C003P027R001A043_rgb 43', 'fallingDown/S007C002P025R001A043_rgb 43', 'fallingDown/S017C002P007R002A043_rgb 43', 'fallingDown/S011C003P018R002A043_rgb 43', 
'fallingDown/S001C003P007R001A043_rgb 43', 'fallingDown/S006C003P023R001A043_rgb 43', 'fallingDown/S015C002P019R002A043_rgb 43', 'fallingDown/S011C002P002R001A043_rgb 43', 'fallingDown/S001C002P004R002A043_rgb 43', 'fallingDown/S016C003P008R002A043_rgb 43', 'fallingDown/S013C002P018R001A043_rgb 43', 'fallingDown/S010C002P025R002A043_rgb 43', 'fallingDown/S013C001P028R002A043_rgb 43', 'fallingDown/S006C001P015R002A043_rgb 43', 'fallingDown/S010C001P021R001A043_rgb 43', 'fallingDown/S005C001P018R002A043_rgb 43', 'fallingDown/S017C003P007R002A043_rgb 43', 'fallingDown/S008C003P008R002A043_rgb 43', 'fallingDown/S010C003P021R002A043_rgb 43', 'fallingDown/S010C002P013R002A043_rgb 43', 'fallingDown/S006C003P015R002A043_rgb 43', 'fallingDown/S012C001P027R001A043_rgb 43', 'fallingDown/S011C001P027R001A043_rgb 43', 'fallingDown/S006C002P019R002A043_rgb 43', 'fallingDown/S002C002P014R001A043_rgb 43', 'fallingDown/S011C003P001R002A043_rgb 43', 'fallingDown/S013C003P019R002A043_rgb 43', 'fallingDown/S012C002P037R001A043_rgb 43', 'fallingDown/S011C003P019R002A043_rgb 43', 'fallingDown/S005C002P018R002A043_rgb 43', 'fallingDown/S014C002P025R001A043_rgb 43', 'fallingDown/S013C003P015R002A043_rgb 43', 'fallingDown/S013C003P027R001A043_rgb 43', 'fallingDown/S008C002P015R002A043_rgb 43', 'fallingDown/S015C001P007R002A043_rgb 43', 'fallingDown/S004C003P003R001A043_rgb 43', 'fallingDown/S005C001P016R001A043_rgb 43', 'fallingDown/S017C003P016R001A043_rgb 43', 'fallingDown/S001C002P007R002A043_rgb 43', 'fallingDown/S001C003P002R002A043_rgb 43', 'fallingDown/S009C003P016R002A043_rgb 43', 'fallingDown/S007C001P019R002A043_rgb 43', 'fallingDown/S002C002P012R002A043_rgb 43', 'fallingDown/S006C003P024R001A043_rgb 43', 'fallingDown/S010C001P016R001A043_rgb 43', 'fallingDown/S012C003P025R002A043_rgb 43', 'fallingDown/S011C003P038R002A043_rgb 43', 'fallingDown/S017C001P009R002A043_rgb 43', 'fallingDown/S011C001P008R002A043_rgb 43', 'fallingDown/S009C001P016R001A043_rgb 43', 'fallingDown/S005C003P016R002A043_rgb 43', 'fallingDown/S010C002P008R002A043_rgb 43', 'fallingDown/S007C003P017R001A043_rgb 43', 'fallingDown/S016C001P019R002A043_rgb 43', 'fallingDown/S013C001P019R001A043_rgb 43', 'fallingDown/S006C002P022R001A043_rgb 43', 'fallingDown/S015C003P025R002A043_rgb 43', 'fallingDown/S003C001P015R001A043_rgb 43', 'fallingDown/S002C003P008R001A043_rgb 43', 'fallingDown/S001C003P008R002A043_rgb 43', 'fallingDown/S015C003P025R001A043_rgb 43', 'fallingDown/S015C003P016R002A043_rgb 43', 'fallingDown/S010C001P008R001A043_rgb 43', 'fallingDown/S003C003P016R002A043_rgb 43', 'fallingDown/S002C001P003R001A043_rgb 43', 'fallingDown/S006C001P007R002A043_rgb 43', 'fallingDown/S014C002P019R002A043_rgb 43', 'fallingDown/S017C001P020R002A043_rgb 43', 'fallingDown/S008C002P035R001A043_rgb 43', 'fallingDown/S002C001P010R002A043_rgb 43', 'fallingDown/S013C003P037R001A043_rgb 43', 'fallingDown/S005C002P004R001A043_rgb 43', 'fallingDown/S013C002P018R002A043_rgb 43', 'fallingDown/S002C003P011R002A043_rgb 43', 'fallingDown/S016C002P008R001A043_rgb 43', 'fallingDown/S012C002P037R002A043_rgb 43', 'fallingDown/S006C001P017R001A043_rgb 43', 'fallingDown/S015C002P007R002A043_rgb 43', 'fallingDown/S014C003P008R002A043_rgb 43', 'fallingDown/S011C001P019R002A043_rgb 43', 'fallingDown/S003C003P017R001A043_rgb 43', 'fallingDown/S006C001P015R001A043_rgb 43', 'fallingDown/S003C003P015R001A043_rgb 43', 'fallingDown/S010C003P007R001A043_rgb 43', 'fallingDown/S009C002P008R001A043_rgb 43', 'fallingDown/S008C001P029R001A043_rgb 43', 
'fallingDown/S006C002P015R001A043_rgb 43', 'fallingDown/S010C002P017R002A043_rgb 43', 'fallingDown/S013C001P025R002A043_rgb 43', 'fallingDown/S008C002P032R001A043_rgb 43', 'fallingDown/S009C002P008R002A043_rgb 43', 'fallingDown/S011C003P027R002A043_rgb 43', 'fallingDown/S002C001P008R002A043_rgb 43', 'fallingDown/S007C001P017R001A043_rgb 43', 'fallingDown/S002C002P012R001A043_rgb 43', 'fallingDown/S012C003P028R001A043_rgb 43', 'fallingDown/S008C001P001R001A043_rgb 43', 'fallingDown/S002C003P010R002A043_rgb 43', 'fallingDown/S006C003P017R002A043_rgb 43', 'fallingDown/S013C001P017R002A043_rgb 43', 'fallingDown/S011C002P001R002A043_rgb 43', 'fallingDown/S009C002P017R002A043_rgb 43', 'fallingDown/S010C003P015R001A043_rgb 43', 'fallingDown/S008C001P036R001A043_rgb 43', 'fallingDown/S006C003P019R002A043_rgb 43', 'fallingDown/S007C003P025R002A043_rgb 43', 'fallingDown/S004C003P008R002A043_rgb 43', 'fallingDown/S015C001P017R002A043_rgb 43', 'headache/S013C001P016R002A044_rgb 44', 'headache/S014C002P019R001A044_rgb 44', 'headache/S007C003P016R001A044_rgb 44', 'headache/S006C002P015R001A044_rgb 44', 'headache/S005C002P021R001A044_rgb 44', 'headache/S010C001P007R001A044_rgb 44', 'headache/S013C001P018R002A044_rgb 44', 'headache/S011C003P018R002A044_rgb 44', 'headache/S002C003P010R001A044_rgb 44', 'headache/S010C001P015R002A044_rgb 44', 'headache/S017C003P016R001A044_rgb 44', 'headache/S014C002P027R002A044_rgb 44', 'headache/S002C003P009R001A044_rgb 44', 'headache/S002C002P012R001A044_rgb 44', 'headache/S015C002P008R002A044_rgb 44', 'headache/S003C003P002R001A044_rgb 44', 'headache/S003C003P001R002A044_rgb 44', 'headache/S011C003P016R001A044_rgb 44', 'headache/S001C002P008R001A044_rgb 44', 'headache/S003C001P016R001A044_rgb 44', 'headache/S013C002P037R002A044_rgb 44', 'headache/S006C002P023R002A044_rgb 44', 'headache/S013C001P019R002A044_rgb 44', 'headache/S012C003P025R002A044_rgb 44', 'headache/S015C003P017R001A044_rgb 44', 'headache/S012C002P019R002A044_rgb 44', 'headache/S014C001P027R001A044_rgb 44', 'headache/S016C003P040R001A044_rgb 44', 'headache/S010C002P025R001A044_rgb 44', 'headache/S006C001P016R002A044_rgb 44', 'headache/S001C002P007R001A044_rgb 44', 'headache/S005C001P015R002A044_rgb 44', 'headache/S006C002P001R001A044_rgb 44', 'headache/S014C001P008R001A044_rgb 44', 'headache/S005C001P010R002A044_rgb 44', 'headache/S016C003P040R002A044_rgb 44', 'headache/S001C002P003R002A044_rgb 44', 'headache/S008C001P015R002A044_rgb 44', 'headache/S016C002P007R002A044_rgb 44', 'headache/S006C003P024R002A044_rgb 44', 'headache/S007C001P016R002A044_rgb 44', 'headache/S012C001P017R001A044_rgb 44', 'headache/S007C003P016R002A044_rgb 44', 'headache/S016C002P008R001A044_rgb 44', 'headache/S002C002P010R002A044_rgb 44', 'headache/S001C003P008R001A044_rgb 44', 'headache/S017C003P017R002A044_rgb 44', 'headache/S008C002P001R001A044_rgb 44', 'headache/S017C001P017R001A044_rgb 44', 'headache/S014C002P017R001A044_rgb 44', 'headache/S016C001P008R001A044_rgb 44', 'headache/S016C002P021R001A044_rgb 44', 'headache/S013C001P028R002A044_rgb 44', 'headache/S003C001P007R001A044_rgb 44', 'headache/S012C001P037R002A044_rgb 44', 'headache/S012C002P018R001A044_rgb 44', 'headache/S010C001P008R001A044_rgb 44', 'headache/S008C002P033R001A044_rgb 44', 'headache/S005C002P010R001A044_rgb 44', 'headache/S007C002P015R001A044_rgb 44', 'headache/S011C003P019R002A044_rgb 44', 'headache/S005C003P018R002A044_rgb 44', 'headache/S011C002P018R002A044_rgb 44', 'headache/S008C001P035R002A044_rgb 44', 'headache/S008C002P029R002A044_rgb 44', 
'headache/S007C001P025R001A044_rgb 44', 'headache/S009C002P016R002A044_rgb 44', 'headache/S010C002P007R001A044_rgb 44', 'headache/S013C001P025R001A044_rgb 44', 'headache/S005C002P015R001A044_rgb 44', 'headache/S017C003P016R002A044_rgb 44', 'headache/S007C001P027R001A044_rgb 44', 'headache/S012C003P019R001A044_rgb 44', 'headache/S015C003P016R002A044_rgb 44', 'headache/S010C002P019R002A044_rgb 44', 'headache/S007C002P007R002A044_rgb 44', 'headache/S010C001P021R002A044_rgb 44', 'headache/S015C003P008R002A044_rgb 44', 'headache/S002C002P013R001A044_rgb 44', 'headache/S002C003P013R002A044_rgb 44', 'headache/S015C001P017R002A044_rgb 44', 'headache/S007C002P025R002A044_rgb 44', 'headache/S014C003P007R002A044_rgb 44', 'headache/S002C001P014R002A044_rgb 44', 'headache/S002C003P007R002A044_rgb 44', 'headache/S010C001P025R001A044_rgb 44', 'headache/S014C002P037R002A044_rgb 44', 'headache/S007C003P001R001A044_rgb 44', 'headache/S002C002P012R002A044_rgb 44', 'headache/S008C001P034R001A044_rgb 44', 'headache/S015C001P008R002A044_rgb 44', 'headache/S011C003P038R002A044_rgb 44', 'headache/S011C001P027R002A044_rgb 44', 'headache/S005C003P010R002A044_rgb 44', 'headache/S008C001P036R001A044_rgb 44', 'headache/S003C003P007R001A044_rgb 44', 'headache/S006C003P019R002A044_rgb 44', 'headache/S001C002P004R002A044_rgb 44', 'headache/S013C003P019R001A044_rgb 44', 'headache/S015C002P015R001A044_rgb 44', 'headache/S015C002P007R001A044_rgb 44', 'headache/S008C002P031R001A044_rgb 44', 'headache/S006C002P007R001A044_rgb 44', 'headache/S015C003P019R001A044_rgb 44', 'headache/S012C003P007R002A044_rgb 44', 'headache/S003C002P007R001A044_rgb 44', 'headache/S001C001P005R002A044_rgb 44', 'headache/S013C002P027R002A044_rgb 44', 'headache/S002C001P010R001A044_rgb 44', 'headache/S016C001P039R001A044_rgb 44', 'headache/S011C002P019R002A044_rgb 44', 'headache/S016C002P025R001A044_rgb 44', 'headache/S009C002P015R001A044_rgb 44', 'headache/S007C003P001R002A044_rgb 44', 'headache/S017C003P015R001A044_rgb 44', 'headache/S011C001P015R002A044_rgb 44', 'headache/S003C003P015R002A044_rgb 44', 'headache/S006C002P008R001A044_rgb 44', 'headache/S009C001P008R002A044_rgb 44', 'headache/S009C001P017R001A044_rgb 44', 'headache/S002C003P013R001A044_rgb 44', 'headache/S008C003P015R002A044_rgb 44', 'headache/S007C001P025R002A044_rgb 44', 'headache/S002C002P009R002A044_rgb 44', 'headache/S011C001P025R001A044_rgb 44', 'headache/S002C002P007R001A044_rgb 44', 'headache/S014C001P017R001A044_rgb 44', 'headache/S001C003P005R001A044_rgb 44', 'headache/S002C003P012R002A044_rgb 44', 'headache/S009C003P016R002A044_rgb 44', 'headache/S010C003P007R002A044_rgb 44', 'headache/S003C003P016R002A044_rgb 44', 'headache/S017C003P008R001A044_rgb 44', 'headache/S012C001P007R002A044_rgb 44', 'headache/S009C001P025R001A044_rgb 44', 'headache/S015C001P037R002A044_rgb 44', 'headache/S002C001P009R001A044_rgb 44', 'headache/S003C001P017R001A044_rgb 44', 'headache/S004C002P008R001A044_rgb 44', 'headache/S002C003P014R001A044_rgb 44', 'headache/S013C001P027R001A044_rgb 44', 'headache/S006C002P019R002A044_rgb 44', 'headache/S012C001P037R001A044_rgb 44', 'headache/S009C001P016R002A044_rgb 44', 'headache/S013C001P028R001A044_rgb 44', 'headache/S013C002P007R002A044_rgb 44', 'headache/S001C002P004R001A044_rgb 44', 'headache/S011C001P002R001A044_rgb 44', 'headache/S011C002P001R002A044_rgb 44', 'headache/S008C003P034R001A044_rgb 44', 'headache/S010C001P017R001A044_rgb 44', 'headache/S009C003P015R001A044_rgb 44', 'headache/S007C002P008R001A044_rgb 44', 'headache/S007C001P026R002A044_rgb 
44', 'headache/S001C001P006R002A044_rgb 44', 'headache/S017C003P007R001A044_rgb 44', 'headache/S010C002P021R002A044_rgb 44', 'headache/S013C001P016R001A044_rgb 44', 'headache/S016C003P025R001A044_rgb 44', 'headache/S008C001P019R001A044_rgb 44', 'headache/S010C003P018R001A044_rgb 44', 'headache/S014C001P025R002A044_rgb 44', 'headache/S008C003P008R002A044_rgb 44', 'headache/S006C002P023R001A044_rgb 44', 'headache/S012C003P037R002A044_rgb 44', 'headache/S009C003P007R002A044_rgb 44', 'headache/S013C001P008R001A044_rgb 44', 'headache/S008C002P007R001A044_rgb 44', 'headache/S006C002P016R002A044_rgb 44', 'headache/S012C002P016R001A044_rgb 44', 'headache/S017C002P007R002A044_rgb 44', 'headache/S014C003P015R002A044_rgb 44', 'headache/S012C002P027R001A044_rgb 44', 'headache/S009C001P017R002A044_rgb 44', 'headache/S008C003P025R002A044_rgb 44', 'headache/S011C003P025R002A044_rgb 44', 'headache/S008C001P025R001A044_rgb 44', 'headache/S004C003P008R001A044_rgb 44', 'headache/S017C001P008R001A044_rgb 44', 'headache/S011C002P015R001A044_rgb 44', 'headache/S013C003P018R001A044_rgb 44', 'headache/S010C003P019R002A044_rgb 44', 'headache/S013C003P008R002A044_rgb 44', 'headache/S001C003P007R001A044_rgb 44', 'headache/S012C002P008R002A044_rgb 44', 'headache/S015C002P015R002A044_rgb 44', 'headache/S008C002P008R001A044_rgb 44', 'headache/S007C002P026R002A044_rgb 44', 'headache/S001C003P001R001A044_rgb 44', 'chestPain/S012C002P027R002A045_rgb 45', 'chestPain/S008C003P007R001A045_rgb 45', 'chestPain/S013C002P018R001A045_rgb 45', 'chestPain/S015C002P015R001A045_rgb 45', 'chestPain/S002C003P007R002A045_rgb 45', 'chestPain/S007C001P001R001A045_rgb 45', 'chestPain/S006C002P022R001A045_rgb 45', 'chestPain/S010C003P015R001A045_rgb 45', 'chestPain/S016C001P008R002A045_rgb 45', 'chestPain/S008C001P033R001A045_rgb 45', 'chestPain/S017C002P008R002A045_rgb 45', 'chestPain/S011C003P017R002A045_rgb 45', 'chestPain/S009C003P019R001A045_rgb 45', 'chestPain/S001C001P002R002A045_rgb 45', 'chestPain/S008C003P025R002A045_rgb 45', 'chestPain/S007C002P001R001A045_rgb 45', 'chestPain/S010C002P016R002A045_rgb 45', 'chestPain/S009C003P016R002A045_rgb 45', 'chestPain/S002C001P007R002A045_rgb 45', 'chestPain/S015C001P016R001A045_rgb 45', 'chestPain/S009C001P019R002A045_rgb 45', 'chestPain/S017C001P009R002A045_rgb 45', 'chestPain/S009C001P007R001A045_rgb 45', 'chestPain/S010C001P017R001A045_rgb 45', 'chestPain/S011C001P015R002A045_rgb 45', 'chestPain/S012C002P037R002A045_rgb 45', 'chestPain/S010C002P016R001A045_rgb 45', 'chestPain/S005C003P018R001A045_rgb 45', 'chestPain/S017C003P020R001A045_rgb 45', 'chestPain/S003C001P008R002A045_rgb 45', 'chestPain/S012C001P017R001A045_rgb 45', 'chestPain/S011C001P018R002A045_rgb 45', 'chestPain/S009C001P015R001A045_rgb 45', 'chestPain/S012C002P028R001A045_rgb 45', 'chestPain/S014C003P027R001A045_rgb 45', 'chestPain/S012C003P016R001A045_rgb 45', 'chestPain/S012C002P025R002A045_rgb 45', 'chestPain/S008C003P019R002A045_rgb 45', 'chestPain/S014C003P025R002A045_rgb 45', 'chestPain/S012C001P019R002A045_rgb 45', 'chestPain/S009C001P008R002A045_rgb 45', 'chestPain/S011C003P016R002A045_rgb 45', 'chestPain/S001C003P004R001A045_rgb 45', 'chestPain/S011C003P038R002A045_rgb 45', 'chestPain/S016C002P019R001A045_rgb 45', 'chestPain/S014C003P027R002A045_rgb 45', 'chestPain/S008C001P025R001A045_rgb 45', 'chestPain/S015C002P016R001A045_rgb 45', 'chestPain/S006C002P001R001A045_rgb 45', 'chestPain/S016C001P008R001A045_rgb 45', 'chestPain/S002C003P010R001A045_rgb 45', 'chestPain/S011C002P038R001A045_rgb 45', 
'chestPain/S012C001P007R002A045_rgb 45', 'chestPain/S007C003P008R002A045_rgb 45', 'chestPain/S011C001P007R002A045_rgb 45', 'chestPain/S012C003P027R001A045_rgb 45', 'chestPain/S015C003P008R002A045_rgb 45', 'chestPain/S011C001P002R002A045_rgb 45', 'chestPain/S008C001P036R001A045_rgb 45', 'chestPain/S015C001P008R001A045_rgb 45', 'chestPain/S008C002P032R002A045_rgb 45', 'chestPain/S013C001P028R002A045_rgb 45', 'chestPain/S003C001P015R002A045_rgb 45', 'chestPain/S003C003P017R002A045_rgb 45', 'chestPain/S005C001P015R001A045_rgb 45', 'chestPain/S016C002P007R001A045_rgb 45', 'chestPain/S001C003P007R001A045_rgb 45', 'chestPain/S011C001P001R001A045_rgb 45', 'chestPain/S011C002P001R002A045_rgb 45', 'chestPain/S015C001P037R001A045_rgb 45', 'chestPain/S010C001P025R001A045_rgb 45', 'chestPain/S011C003P027R001A045_rgb 45', 'chestPain/S015C002P017R001A045_rgb 45', 'chestPain/S004C002P020R002A045_rgb 45', 'chestPain/S007C001P008R001A045_rgb 45', 'chestPain/S014C002P027R002A045_rgb 45', 'chestPain/S007C002P028R001A045_rgb 45', 'chestPain/S001C001P001R002A045_rgb 45', 'chestPain/S003C003P002R002A045_rgb 45', 'chestPain/S008C002P035R001A045_rgb 45', 'chestPain/S008C003P036R001A045_rgb 45', 'chestPain/S005C002P004R001A045_rgb 45', 'chestPain/S007C003P007R001A045_rgb 45', 'chestPain/S013C003P019R001A045_rgb 45', 'chestPain/S011C002P016R002A045_rgb 45', 'chestPain/S001C003P001R001A045_rgb 45', 'chestPain/S003C001P001R002A045_rgb 45', 'chestPain/S010C001P015R002A045_rgb 45', 'chestPain/S009C001P019R001A045_rgb 45', 'chestPain/S006C003P015R002A045_rgb 45', 'chestPain/S011C003P038R001A045_rgb 45', 'chestPain/S013C003P007R002A045_rgb 45', 'chestPain/S010C001P017R002A045_rgb 45', 'chestPain/S012C003P019R001A045_rgb 45', 'chestPain/S011C001P002R001A045_rgb 45', 'chestPain/S012C003P019R002A045_rgb 45', 'chestPain/S002C001P011R001A045_rgb 45', 'chestPain/S015C002P025R002A045_rgb 45', 'chestPain/S008C003P035R001A045_rgb 45', 'chestPain/S003C001P007R002A045_rgb 45', 'chestPain/S017C003P016R001A045_rgb 45', 'chestPain/S005C001P010R002A045_rgb 45', 'chestPain/S005C003P021R002A045_rgb 45', 'chestPain/S006C002P023R002A045_rgb 45', 'chestPain/S017C003P015R002A045_rgb 45', 'chestPain/S013C002P019R002A045_rgb 45', 'chestPain/S015C002P019R002A045_rgb 45', 'chestPain/S014C002P019R001A045_rgb 45', 'chestPain/S010C001P013R001A045_rgb 45', 'chestPain/S002C001P013R001A045_rgb 45', 'chestPain/S016C002P021R002A045_rgb 45', 'chestPain/S004C001P008R002A045_rgb 45', 'chestPain/S010C002P025R002A045_rgb 45', 'chestPain/S005C003P021R001A045_rgb 45', 'chestPain/S014C002P037R002A045_rgb 45', 'chestPain/S004C003P020R001A045_rgb 45', 'chestPain/S017C002P020R001A045_rgb 45', 'chestPain/S007C003P008R001A045_rgb 45', 'chestPain/S017C001P016R001A045_rgb 45', 'chestPain/S003C001P016R002A045_rgb 45', 'chestPain/S012C002P007R002A045_rgb 45', 'chestPain/S008C002P033R001A045_rgb 45', 'chestPain/S015C001P019R002A045_rgb 45', 'chestPain/S006C001P008R001A045_rgb 45', 'chestPain/S011C001P017R002A045_rgb 45', 'chestPain/S008C002P007R002A045_rgb 45', 'chestPain/S006C003P007R002A045_rgb 45', 'chestPain/S011C003P008R002A045_rgb 45', 'chestPain/S007C002P026R002A045_rgb 45', 'chestPain/S014C002P017R002A045_rgb 45', 'chestPain/S003C001P019R001A045_rgb 45', 'chestPain/S017C002P016R002A045_rgb 45', 'chestPain/S009C002P015R001A045_rgb 45', 'chestPain/S003C001P015R001A045_rgb 45', 'chestPain/S006C001P015R001A045_rgb 45', 'chestPain/S006C003P024R002A045_rgb 45', 'chestPain/S008C001P008R002A045_rgb 45', 'chestPain/S006C002P015R002A045_rgb 45', 
'chestPain/S011C001P016R001A045_rgb 45', 'chestPain/S007C001P026R001A045_rgb 45', 'chestPain/S013C002P007R001A045_rgb 45', 'chestPain/S002C001P008R002A045_rgb 45', 'chestPain/S015C001P037R002A045_rgb 45', 'chestPain/S014C002P039R002A045_rgb 45', 'chestPain/S007C002P017R001A045_rgb 45', 'chestPain/S015C001P007R001A045_rgb 45', 'chestPain/S006C002P019R002A045_rgb 45', 'chestPain/S002C003P014R002A045_rgb 45', 'chestPain/S015C002P007R002A045_rgb 45', 'chestPain/S002C003P010R002A045_rgb 45', 'chestPain/S013C002P027R001A045_rgb 45', 'chestPain/S008C003P001R002A045_rgb 45', 'chestPain/S013C001P025R001A045_rgb 45', 'chestPain/S007C001P025R001A045_rgb 45', 'chestPain/S005C002P017R001A045_rgb 45', 'chestPain/S013C001P015R001A045_rgb 45', 'chestPain/S013C003P028R002A045_rgb 45', 'chestPain/S012C002P019R001A045_rgb 45', 'chestPain/S006C001P023R001A045_rgb 45', 'chestPain/S014C003P007R002A045_rgb 45', 'chestPain/S009C002P017R001A045_rgb 45', 'chestPain/S015C001P017R001A045_rgb 45', 'chestPain/S012C001P028R001A045_rgb 45', 'chestPain/S010C001P019R002A045_rgb 45', 'chestPain/S011C002P028R002A045_rgb 45', 'chestPain/S010C003P016R002A045_rgb 45', 'chestPain/S002C002P011R002A045_rgb 45', 'chestPain/S014C001P019R002A045_rgb 45', 'chestPain/S001C001P003R001A045_rgb 45', 'chestPain/S014C002P039R001A045_rgb 45', 'chestPain/S004C002P003R002A045_rgb 45', 'chestPain/S008C001P008R001A045_rgb 45', 'chestPain/S012C002P008R002A045_rgb 45', 'chestPain/S008C003P008R002A045_rgb 45', 'chestPain/S001C002P007R001A045_rgb 45', 'chestPain/S003C003P008R001A045_rgb 45', 'chestPain/S003C002P018R001A045_rgb 45', 'chestPain/S015C003P017R002A045_rgb 45', 'chestPain/S016C002P040R001A045_rgb 45', 'chestPain/S004C002P007R001A045_rgb 45', 'chestPain/S015C002P016R002A045_rgb 45', 'chestPain/S009C002P008R001A045_rgb 45', 'chestPain/S009C003P025R001A045_rgb 45', 'chestPain/S003C003P019R001A045_rgb 45', 'chestPain/S014C001P017R002A045_rgb 45', 'chestPain/S007C002P001R002A045_rgb 45', 'chestPain/S016C003P007R001A045_rgb 45', 'chestPain/S014C003P037R002A045_rgb 45', 'chestPain/S007C001P015R002A045_rgb 45', 'chestPain/S016C002P040R002A045_rgb 45', 'backPain/S015C002P019R001A046_rgb 46', 'backPain/S015C001P008R002A046_rgb 46', 'backPain/S006C002P023R001A046_rgb 46', 'backPain/S011C001P008R002A046_rgb 46', 'backPain/S017C001P020R001A046_rgb 46', 'backPain/S015C003P017R001A046_rgb 46', 'backPain/S007C003P026R002A046_rgb 46', 'backPain/S003C002P017R002A046_rgb 46', 'backPain/S011C002P025R002A046_rgb 46', 'backPain/S012C002P016R001A046_rgb 46', 'backPain/S007C002P026R001A046_rgb 46', 'backPain/S006C003P019R001A046_rgb 46', 'backPain/S006C001P024R002A046_rgb 46', 'backPain/S016C003P019R001A046_rgb 46', 'backPain/S005C003P004R001A046_rgb 46', 'backPain/S002C003P003R002A046_rgb 46', 'backPain/S007C001P019R002A046_rgb 46', 'backPain/S012C002P008R002A046_rgb 46', 'backPain/S012C001P008R001A046_rgb 46', 'backPain/S011C002P007R001A046_rgb 46', 'backPain/S007C003P025R002A046_rgb 46', 'backPain/S005C003P018R002A046_rgb 46', 'backPain/S011C003P016R001A046_rgb 46', 'backPain/S001C001P003R002A046_rgb 46', 'backPain/S016C002P008R002A046_rgb 46', 'backPain/S010C002P018R001A046_rgb 46', 'backPain/S006C003P001R001A046_rgb 46', 'backPain/S011C002P027R001A046_rgb 46', 'backPain/S011C003P038R001A046_rgb 46', 'backPain/S002C003P014R001A046_rgb 46', 'backPain/S012C002P015R001A046_rgb 46', 'backPain/S012C001P017R002A046_rgb 46', 'backPain/S013C003P037R001A046_rgb 46', 'backPain/S008C001P030R002A046_rgb 46', 'backPain/S012C003P019R001A046_rgb 46', 
'backPain/S011C002P001R001A046_rgb 46', 'backPain/S010C002P008R002A046_rgb 46', 'backPain/S002C002P009R002A046_rgb 46', 'backPain/S011C001P002R002A046_rgb 46', 'backPain/S016C001P008R002A046_rgb 46', 'backPain/S004C002P003R002A046_rgb 46', 'backPain/S008C001P032R001A046_rgb 46', 'backPain/S010C003P013R001A046_rgb 46', 'backPain/S007C001P027R002A046_rgb 46', 'backPain/S008C001P029R002A046_rgb 46', 'backPain/S002C003P009R002A046_rgb 46', 'backPain/S013C003P027R001A046_rgb 46', 'backPain/S015C001P015R001A046_rgb 46', 'backPain/S005C003P013R002A046_rgb 46', 'backPain/S007C001P007R001A046_rgb 46', 'backPain/S012C003P028R002A046_rgb 46', 'backPain/S008C001P032R002A046_rgb 46', 'backPain/S009C003P025R002A046_rgb 46', 'backPain/S001C001P004R002A046_rgb 46', 'backPain/S001C003P004R001A046_rgb 46', 'backPain/S011C002P015R002A046_rgb 46', 'backPain/S001C003P005R002A046_rgb 46', 'backPain/S007C002P027R001A046_rgb 46', 'backPain/S009C003P025R001A046_rgb 46', 'backPain/S011C002P008R002A046_rgb 46', 'backPain/S012C003P008R002A046_rgb 46', 'backPain/S001C003P001R001A046_rgb 46', 'backPain/S004C003P007R002A046_rgb 46', 'backPain/S010C003P016R002A046_rgb 46', 'backPain/S001C003P006R002A046_rgb 46', 'backPain/S017C002P017R001A046_rgb 46', 'backPain/S013C002P025R001A046_rgb 46', 'backPain/S012C001P037R001A046_rgb 46', 'backPain/S004C003P008R002A046_rgb 46', 'backPain/S012C003P008R001A046_rgb 46', 'backPain/S014C003P025R001A046_rgb 46', 'backPain/S003C001P019R002A046_rgb 46', 'backPain/S015C003P025R001A046_rgb 46', 'backPain/S011C002P001R002A046_rgb 46', 'backPain/S015C003P008R001A046_rgb 46', 'backPain/S001C003P006R001A046_rgb 46', 'backPain/S007C003P028R002A046_rgb 46', 'backPain/S002C001P010R002A046_rgb 46', 'backPain/S009C001P008R002A046_rgb 46', 'backPain/S015C003P037R001A046_rgb 46', 'backPain/S003C001P007R002A046_rgb 46', 'backPain/S006C002P022R001A046_rgb 46', 'backPain/S003C002P015R002A046_rgb 46', 'backPain/S007C003P016R001A046_rgb 46', 'backPain/S005C002P010R002A046_rgb 46', 'backPain/S005C003P010R001A046_rgb 46', 'backPain/S008C002P030R002A046_rgb 46', 'backPain/S002C001P008R002A046_rgb 46', 'backPain/S014C001P039R001A046_rgb 46', 'backPain/S012C002P037R001A046_rgb 46', 'backPain/S014C003P019R002A046_rgb 46', 'backPain/S005C001P013R002A046_rgb 46', 'backPain/S015C002P017R001A046_rgb 46', 'backPain/S015C003P037R002A046_rgb 46', 'backPain/S008C002P029R002A046_rgb 46', 'backPain/S001C001P008R001A046_rgb 46', 'backPain/S008C003P025R002A046_rgb 46', 'backPain/S011C003P007R001A046_rgb 46', 'backPain/S007C003P027R001A046_rgb 46', 'backPain/S003C003P015R001A046_rgb 46', 'backPain/S017C002P020R001A046_rgb 46', 'backPain/S017C003P007R002A046_rgb 46', 'backPain/S017C002P008R002A046_rgb 46', 'backPain/S008C003P030R001A046_rgb 46', 'backPain/S007C003P025R001A046_rgb 46', 'backPain/S008C001P019R002A046_rgb 46', 'backPain/S011C003P016R002A046_rgb 46', 'backPain/S006C002P017R001A046_rgb 46', 'backPain/S013C003P019R002A046_rgb 46', 'backPain/S007C003P007R001A046_rgb 46', 'backPain/S005C003P021R002A046_rgb 46', 'backPain/S013C003P025R002A046_rgb 46', 'backPain/S013C003P008R002A046_rgb 46', 'backPain/S006C002P024R002A046_rgb 46', 'backPain/S007C001P025R002A046_rgb 46', 'backPain/S012C001P015R001A046_rgb 46', 'backPain/S010C001P018R002A046_rgb 46', 'backPain/S003C003P002R002A046_rgb 46', 'backPain/S010C002P016R002A046_rgb 46', 'backPain/S015C002P008R002A046_rgb 46', 'backPain/S014C003P037R002A046_rgb 46', 'backPain/S011C003P017R002A046_rgb 46', 'backPain/S003C001P016R001A046_rgb 46', 'backPain/S005C003P016R002A046_rgb 
46', 'backPain/S006C003P019R002A046_rgb 46', 'backPain/S014C002P027R002A046_rgb 46', 'backPain/S007C003P017R001A046_rgb 46', 'backPain/S002C001P012R002A046_rgb 46', 'backPain/S002C001P008R001A046_rgb 46', 'backPain/S016C002P039R001A046_rgb 46', 'backPain/S009C002P019R002A046_rgb 46', 'backPain/S013C001P017R001A046_rgb 46', 'backPain/S005C002P013R002A046_rgb 46', 'backPain/S006C003P001R002A046_rgb 46', 'backPain/S002C003P012R002A046_rgb 46', 'backPain/S012C003P007R002A046_rgb 46', 'backPain/S005C002P017R002A046_rgb 46', 'backPain/S001C001P008R002A046_rgb 46', 'backPain/S016C001P007R002A046_rgb 46', 'backPain/S005C003P017R002A046_rgb 46', 'backPain/S001C001P006R002A046_rgb 46', 'backPain/S009C003P017R001A046_rgb 46', 'backPain/S007C001P026R002A046_rgb 46', 'backPain/S017C001P015R001A046_rgb 46', 'backPain/S017C002P016R002A046_rgb 46', 'backPain/S006C003P016R001A046_rgb 46', 'backPain/S003C003P015R002A046_rgb 46', 'backPain/S012C001P017R001A046_rgb 46', 'backPain/S002C001P009R002A046_rgb 46', 'backPain/S008C001P001R002A046_rgb 46', 'backPain/S006C003P008R002A046_rgb 46', 'backPain/S016C002P039R002A046_rgb 46', 'backPain/S012C002P018R001A046_rgb 46', 'backPain/S013C001P016R002A046_rgb 46', 'backPain/S007C002P018R002A046_rgb 46', 'backPain/S013C002P015R002A046_rgb 46', 'backPain/S011C002P008R001A046_rgb 46', 'backPain/S001C002P002R002A046_rgb 46', 'backPain/S007C002P015R001A046_rgb 46', 'backPain/S010C003P013R002A046_rgb 46', 'backPain/S014C002P037R001A046_rgb 46', 'backPain/S013C001P007R001A046_rgb 46', 'backPain/S007C001P015R002A046_rgb 46', 'backPain/S009C001P007R001A046_rgb 46', 'backPain/S005C002P021R002A046_rgb 46', 'backPain/S004C003P007R001A046_rgb 46', 'backPain/S014C001P019R001A046_rgb 46', 'backPain/S003C003P016R002A046_rgb 46', 'backPain/S002C001P003R001A046_rgb 46', 'backPain/S016C003P039R002A046_rgb 46', 'backPain/S013C001P019R002A046_rgb 46', 'backPain/S008C001P007R001A046_rgb 46', 'backPain/S013C002P019R001A046_rgb 46', 'backPain/S012C002P016R002A046_rgb 46', 'backPain/S011C002P018R001A046_rgb 46', 'backPain/S014C001P017R001A046_rgb 46', 'backPain/S017C002P015R001A046_rgb 46', 'backPain/S006C002P016R001A046_rgb 46', 'backPain/S014C001P008R002A046_rgb 46', 'backPain/S008C002P008R001A046_rgb 46', 'backPain/S008C002P008R002A046_rgb 46', 'backPain/S012C001P027R002A046_rgb 46', 'backPain/S006C002P015R001A046_rgb 46', 'backPain/S008C003P025R001A046_rgb 46', 'backPain/S001C002P007R001A046_rgb 46', 'backPain/S008C001P019R001A046_rgb 46', 'backPain/S008C003P007R001A046_rgb 46', 'backPain/S010C001P008R002A046_rgb 46', 'backPain/S009C002P015R002A046_rgb 46', 'backPain/S010C003P019R001A046_rgb 46', 'neckPain/S014C003P037R002A047_rgb 47', 'neckPain/S008C001P036R002A047_rgb 47', 'neckPain/S008C002P007R001A047_rgb 47', 'neckPain/S008C003P030R002A047_rgb 47', 'neckPain/S012C003P019R001A047_rgb 47', 'neckPain/S008C003P001R002A047_rgb 47', 'neckPain/S012C003P008R002A047_rgb 47', 'neckPain/S003C002P007R001A047_rgb 47', 'neckPain/S003C003P016R002A047_rgb 47', 'neckPain/S016C003P021R001A047_rgb 47', 'neckPain/S010C003P025R002A047_rgb 47', 'neckPain/S013C001P037R002A047_rgb 47', 'neckPain/S001C003P001R001A047_rgb 47', 'neckPain/S003C001P008R001A047_rgb 47', 'neckPain/S017C003P015R002A047_rgb 47', 'neckPain/S015C002P008R001A047_rgb 47', 'neckPain/S011C002P001R002A047_rgb 47', 'neckPain/S002C002P003R001A047_rgb 47', 'neckPain/S008C001P031R002A047_rgb 47', 'neckPain/S008C001P015R002A047_rgb 47', 'neckPain/S015C003P025R001A047_rgb 47', 'neckPain/S008C001P034R001A047_rgb 47', 
'neckPain/S010C002P025R001A047_rgb 47', 'neckPain/S007C002P028R001A047_rgb 47', 'neckPain/S001C002P006R001A047_rgb 47', 'neckPain/S007C001P019R002A047_rgb 47', 'neckPain/S007C003P016R002A047_rgb 47', 'neckPain/S003C003P007R001A047_rgb 47', 'neckPain/S003C002P018R001A047_rgb 47', 'neckPain/S003C003P002R002A047_rgb 47', 'neckPain/S012C001P015R002A047_rgb 47', 'neckPain/S011C003P007R002A047_rgb 47', 'neckPain/S001C001P002R001A047_rgb 47', 'neckPain/S003C003P019R001A047_rgb 47', 'neckPain/S007C001P018R002A047_rgb 47', 'neckPain/S009C002P008R001A047_rgb 47', 'neckPain/S010C003P017R001A047_rgb 47', 'neckPain/S005C003P016R001A047_rgb 47', 'neckPain/S001C001P001R001A047_rgb 47', 'neckPain/S015C001P007R001A047_rgb 47', 'neckPain/S006C003P007R001A047_rgb 47', 'neckPain/S007C001P016R002A047_rgb 47', 'neckPain/S010C002P019R002A047_rgb 47', 'neckPain/S009C001P016R001A047_rgb 47', 'neckPain/S013C003P018R002A047_rgb 47', 'neckPain/S015C003P015R002A047_rgb 47', 'neckPain/S015C002P008R002A047_rgb 47', 'neckPain/S011C001P001R001A047_rgb 47', 'neckPain/S003C003P001R001A047_rgb 47', 'neckPain/S010C001P019R001A047_rgb 47', 'neckPain/S012C003P028R002A047_rgb 47', 'neckPain/S012C002P018R001A047_rgb 47', 'neckPain/S003C001P007R001A047_rgb 47', 'neckPain/S008C002P008R002A047_rgb 47', 'neckPain/S001C003P008R002A047_rgb 47', 'neckPain/S013C003P027R002A047_rgb 47', 'neckPain/S016C002P021R002A047_rgb 47', 'neckPain/S017C003P003R002A047_rgb 47', 'neckPain/S008C001P007R002A047_rgb 47', 'neckPain/S001C003P004R002A047_rgb 47', 'neckPain/S016C001P019R002A047_rgb 47', 'neckPain/S002C001P010R002A047_rgb 47', 'neckPain/S006C002P019R002A047_rgb 47', 'neckPain/S001C002P002R001A047_rgb 47', 'neckPain/S005C002P021R001A047_rgb 47', 'neckPain/S006C003P016R001A047_rgb 47', 'neckPain/S014C002P025R002A047_rgb 47', 'neckPain/S011C003P002R001A047_rgb 47', 'neckPain/S012C001P037R002A047_rgb 47', 'neckPain/S004C002P008R002A047_rgb 47', 'neckPain/S015C001P016R001A047_rgb 47', 'neckPain/S005C001P004R002A047_rgb 47', 'neckPain/S013C003P015R002A047_rgb 47', 'neckPain/S014C002P017R001A047_rgb 47', 'neckPain/S001C002P005R001A047_rgb 47', 'neckPain/S012C001P007R002A047_rgb 47', 'neckPain/S004C002P003R001A047_rgb 47', 'neckPain/S013C003P008R001A047_rgb 47', 'neckPain/S016C002P025R002A047_rgb 47', 'neckPain/S002C003P009R001A047_rgb 47', 'neckPain/S012C003P018R002A047_rgb 47', 'neckPain/S008C003P036R002A047_rgb 47', 'neckPain/S013C003P007R001A047_rgb 47', 'neckPain/S010C001P007R001A047_rgb 47', 'neckPain/S012C003P015R002A047_rgb 47', 'neckPain/S006C002P022R002A047_rgb 47', 'neckPain/S007C002P019R001A047_rgb 47', 'neckPain/S012C001P028R002A047_rgb 47', 'neckPain/S006C003P019R002A047_rgb 47', 'neckPain/S003C001P001R001A047_rgb 47', 'neckPain/S010C003P021R002A047_rgb 47', 'neckPain/S010C002P025R002A047_rgb 47', 'neckPain/S007C003P008R002A047_rgb 47', 'neckPain/S008C002P025R002A047_rgb 47', 'neckPain/S010C003P013R002A047_rgb 47', 'neckPain/S014C003P025R001A047_rgb 47', 'neckPain/S011C003P028R001A047_rgb 47', 'neckPain/S007C003P007R002A047_rgb 47', 'neckPain/S003C003P018R001A047_rgb 47', 'neckPain/S001C001P003R001A047_rgb 47', 'neckPain/S010C003P018R001A047_rgb 47', 'neckPain/S011C001P017R001A047_rgb 47', 'neckPain/S002C001P010R001A047_rgb 47', 'neckPain/S015C003P037R001A047_rgb 47', 'neckPain/S006C001P016R001A047_rgb 47', 'neckPain/S002C002P007R002A047_rgb 47', 'neckPain/S016C001P007R002A047_rgb 47', 'neckPain/S006C002P024R002A047_rgb 47', 'neckPain/S008C003P008R001A047_rgb 47', 'neckPain/S009C003P017R001A047_rgb 47', 'neckPain/S016C002P008R002A047_rgb 
47', 'neckPain/S009C003P025R001A047_rgb 47', 'neckPain/S008C003P035R002A047_rgb 47', 'neckPain/S006C001P016R002A047_rgb 47', 'neckPain/S005C002P004R002A047_rgb 47', 'neckPain/S002C003P007R002A047_rgb 47', 'neckPain/S003C001P001R002A047_rgb 47', 'neckPain/S010C003P019R002A047_rgb 47', 'neckPain/S011C003P019R001A047_rgb 47', 'neckPain/S006C002P022R001A047_rgb 47', 'neckPain/S003C003P008R002A047_rgb 47', 'neckPain/S009C003P007R001A047_rgb 47', 'neckPain/S001C003P005R002A047_rgb 47', 'neckPain/S011C001P015R001A047_rgb 47', 'neckPain/S002C002P011R001A047_rgb 47', 'neckPain/S012C002P037R002A047_rgb 47', 'neckPain/S010C001P013R002A047_rgb 47', 'neckPain/S007C001P019R001A047_rgb 47', 'neckPain/S009C002P015R002A047_rgb 47', 'neckPain/S009C003P025R002A047_rgb 47', 'neckPain/S008C002P033R001A047_rgb 47', 'neckPain/S015C001P015R001A047_rgb 47', 'neckPain/S009C001P019R002A047_rgb 47', 'neckPain/S007C001P008R001A047_rgb 47', 'neckPain/S011C001P027R001A047_rgb 47', 'neckPain/S015C001P008R002A047_rgb 47', 'neckPain/S002C001P012R002A047_rgb 47', 'neckPain/S012C001P007R001A047_rgb 47', 'neckPain/S011C002P015R002A047_rgb 47', 'neckPain/S012C001P019R002A047_rgb 47', 'neckPain/S009C003P008R002A047_rgb 47', 'neckPain/S014C001P019R002A047_rgb 47', 'neckPain/S004C001P007R001A047_rgb 47', 'neckPain/S011C001P018R001A047_rgb 47', 'neckPain/S011C003P008R002A047_rgb 47', 'neckPain/S008C002P031R002A047_rgb 47', 'neckPain/S008C002P030R002A047_rgb 47', 'neckPain/S005C002P017R001A047_rgb 47', 'neckPain/S012C002P007R002A047_rgb 47', 'neckPain/S003C002P018R002A047_rgb 47', 'neckPain/S013C001P027R002A047_rgb 47', 'neckPain/S013C003P028R002A047_rgb 47', 'neckPain/S013C001P025R001A047_rgb 47', 'neckPain/S007C002P026R001A047_rgb 47', 'neckPain/S017C001P016R001A047_rgb 47', 'neckPain/S005C001P013R001A047_rgb 47', 'neckPain/S016C003P007R002A047_rgb 47', 'neckPain/S011C001P038R002A047_rgb 47', 'neckPain/S008C001P008R002A047_rgb 47', 'neckPain/S003C001P019R001A047_rgb 47', 'neckPain/S002C001P007R001A047_rgb 47', 'neckPain/S011C003P008R001A047_rgb 47', 'neckPain/S015C003P017R002A047_rgb 47', 'neckPain/S011C002P016R001A047_rgb 47', 'neckPain/S016C002P019R001A047_rgb 47', 'neckPain/S002C002P009R002A047_rgb 47', 'neckPain/S003C002P002R001A047_rgb 47', 'neckPain/S017C002P020R002A047_rgb 47', 'neckPain/S008C001P032R001A047_rgb 47', 'neckPain/S010C003P015R001A047_rgb 47', 'neckPain/S009C003P008R001A047_rgb 47', 'neckPain/S009C001P016R002A047_rgb 47', 'neckPain/S015C002P016R002A047_rgb 47', 'neckPain/S009C002P017R002A047_rgb 47', 'neckPain/S010C002P018R002A047_rgb 47', 'neckPain/S017C003P008R002A047_rgb 47', 'neckPain/S001C001P008R001A047_rgb 47', 'neckPain/S014C001P025R002A047_rgb 47', 'neckPain/S013C002P027R002A047_rgb 47', 'neckPain/S003C003P007R002A047_rgb 47', 'neckPain/S006C003P022R001A047_rgb 47', 'neckPain/S001C001P005R002A047_rgb 47', 'neckPain/S009C002P016R002A047_rgb 47', 'neckPain/S016C003P008R001A047_rgb 47', 'neckPain/S009C003P015R002A047_rgb 47', 'neckPain/S017C002P016R002A047_rgb 47', 'neckPain/S014C003P019R002A047_rgb 47', 'neckPain/S017C001P008R002A047_rgb 47', 'neckPain/S011C002P016R002A047_rgb 47', 'neckPain/S013C002P015R001A047_rgb 47', 'nauseaVomiting/S016C003P040R002A048_rgb 48', 'nauseaVomiting/S015C002P017R001A048_rgb 48', 'nauseaVomiting/S008C001P031R002A048_rgb 48', 'nauseaVomiting/S008C002P007R002A048_rgb 48', 'nauseaVomiting/S015C001P017R002A048_rgb 48', 'nauseaVomiting/S012C001P015R001A048_rgb 48', 'nauseaVomiting/S001C001P008R001A048_rgb 48', 'nauseaVomiting/S010C002P015R002A048_rgb 48', 
'nauseaVomiting/S008C001P019R002A048_rgb 48', 'nauseaVomiting/S006C002P022R001A048_rgb 48', 'nauseaVomiting/S006C002P007R002A048_rgb 48', 'nauseaVomiting/S012C003P007R001A048_rgb 48', 'nauseaVomiting/S008C001P035R002A048_rgb 48', 'nauseaVomiting/S008C002P008R002A048_rgb 48', 'nauseaVomiting/S016C003P025R001A048_rgb 48', 'nauseaVomiting/S013C002P025R002A048_rgb 48', 'nauseaVomiting/S009C001P015R001A048_rgb 48', 'nauseaVomiting/S009C003P019R001A048_rgb 48', 'nauseaVomiting/S002C001P011R001A048_rgb 48', 'nauseaVomiting/S006C002P023R001A048_rgb 48', 'nauseaVomiting/S010C003P019R002A048_rgb 48', 'nauseaVomiting/S011C002P001R002A048_rgb 48', 'nauseaVomiting/S001C003P004R001A048_rgb 48', 'nauseaVomiting/S004C001P020R001A048_rgb 48', 'nauseaVomiting/S009C002P008R001A048_rgb 48', 'nauseaVomiting/S016C001P039R002A048_rgb 48', 'nauseaVomiting/S009C003P017R002A048_rgb 48', 'nauseaVomiting/S007C002P007R002A048_rgb 48', 'nauseaVomiting/S010C002P019R002A048_rgb 48', 'nauseaVomiting/S013C002P019R001A048_rgb 48', 'nauseaVomiting/S014C003P037R002A048_rgb 48', 'nauseaVomiting/S010C002P017R001A048_rgb 48', 'nauseaVomiting/S002C002P013R001A048_rgb 48', 'nauseaVomiting/S014C001P037R001A048_rgb 48', 'nauseaVomiting/S006C001P017R001A048_rgb 48', 'nauseaVomiting/S008C001P034R001A048_rgb 48', 'nauseaVomiting/S006C001P019R001A048_rgb 48', 'nauseaVomiting/S015C001P016R002A048_rgb 48', 'nauseaVomiting/S008C003P036R002A048_rgb 48', 'nauseaVomiting/S002C002P012R001A048_rgb 48', 'nauseaVomiting/S010C001P007R001A048_rgb 48', 'nauseaVomiting/S008C002P007R001A048_rgb 48', 'nauseaVomiting/S011C001P015R001A048_rgb 48', 'nauseaVomiting/S008C002P015R001A048_rgb 48', 'nauseaVomiting/S017C001P016R001A048_rgb 48', 'nauseaVomiting/S012C001P037R001A048_rgb 48', 'nauseaVomiting/S010C002P015R001A048_rgb 48', 'nauseaVomiting/S013C003P018R002A048_rgb 48', 'nauseaVomiting/S015C002P015R002A048_rgb 48', 'nauseaVomiting/S010C001P019R002A048_rgb 48', 'nauseaVomiting/S001C001P001R002A048_rgb 48', 'nauseaVomiting/S007C002P019R002A048_rgb 48', 'nauseaVomiting/S012C001P016R001A048_rgb 48', 'nauseaVomiting/S003C003P008R001A048_rgb 48', 'nauseaVomiting/S014C003P015R001A048_rgb 48', 'nauseaVomiting/S008C001P035R001A048_rgb 48', 'nauseaVomiting/S012C002P007R001A048_rgb 48', 'nauseaVomiting/S014C003P039R001A048_rgb 48', 'nauseaVomiting/S008C002P025R001A048_rgb 48', 'nauseaVomiting/S002C003P008R002A048_rgb 48', 'nauseaVomiting/S008C002P030R002A048_rgb 48', 'nauseaVomiting/S007C003P025R002A048_rgb 48', 'nauseaVomiting/S017C003P007R001A048_rgb 48', 'nauseaVomiting/S001C003P005R001A048_rgb 48', 'nauseaVomiting/S012C001P017R001A048_rgb 48', 'nauseaVomiting/S007C001P015R002A048_rgb 48', 'nauseaVomiting/S016C003P019R002A048_rgb 48', 'nauseaVomiting/S006C002P024R002A048_rgb 48', 'nauseaVomiting/S004C003P020R002A048_rgb 48', 'nauseaVomiting/S010C003P018R001A048_rgb 48', 'nauseaVomiting/S011C002P025R001A048_rgb 48', 'nauseaVomiting/S011C002P002R001A048_rgb 48', 'nauseaVomiting/S009C003P007R002A048_rgb 48', 'nauseaVomiting/S011C002P019R001A048_rgb 48', 'nauseaVomiting/S014C001P019R001A048_rgb 48', 'nauseaVomiting/S008C001P034R002A048_rgb 48', 'nauseaVomiting/S011C002P015R001A048_rgb 48', 'nauseaVomiting/S001C001P008R002A048_rgb 48', 'nauseaVomiting/S010C001P017R001A048_rgb 48', 'nauseaVomiting/S011C002P007R001A048_rgb 48', 'nauseaVomiting/S012C001P008R002A048_rgb 48', 'nauseaVomiting/S001C001P007R001A048_rgb 48', 'nauseaVomiting/S011C003P027R002A048_rgb 48', 'nauseaVomiting/S013C003P018R001A048_rgb 48', 'nauseaVomiting/S006C003P001R001A048_rgb 48', 
'nauseaVomiting/S002C001P013R002A048_rgb 48', 'nauseaVomiting/S007C001P001R002A048_rgb 48', 'nauseaVomiting/S002C003P009R001A048_rgb 48', 'nauseaVomiting/S012C002P025R002A048_rgb 48', 'nauseaVomiting/S017C003P008R001A048_rgb 48', 'nauseaVomiting/S014C003P039R002A048_rgb 48', 'nauseaVomiting/S008C003P034R002A048_rgb 48', 'nauseaVomiting/S012C003P019R002A048_rgb 48', 'nauseaVomiting/S015C002P008R002A048_rgb 48', 'nauseaVomiting/S011C001P017R001A048_rgb 48', 'nauseaVomiting/S003C002P008R001A048_rgb 48', 'nauseaVomiting/S014C002P039R002A048_rgb 48', 'nauseaVomiting/S013C001P018R001A048_rgb 48', 'nauseaVomiting/S006C001P015R001A048_rgb 48', 'nauseaVomiting/S002C003P003R002A048_rgb 48', 'nauseaVomiting/S017C002P009R001A048_rgb 48', 'nauseaVomiting/S010C002P025R001A048_rgb 48', 'nauseaVomiting/S008C002P029R002A048_rgb 48', 'nauseaVomiting/S005C001P021R002A048_rgb 48', 'nauseaVomiting/S008C002P036R002A048_rgb 48', 'nauseaVomiting/S006C002P022R002A048_rgb 48', 'nauseaVomiting/S004C003P020R001A048_rgb 48', 'nauseaVomiting/S010C003P016R002A048_rgb 48', 'nauseaVomiting/S008C002P033R002A048_rgb 48', 'nauseaVomiting/S003C003P001R001A048_rgb 48', 'nauseaVomiting/S004C001P007R002A048_rgb 48', 'nauseaVomiting/S015C002P017R002A048_rgb 48', 'nauseaVomiting/S017C003P007R002A048_rgb 48', 'nauseaVomiting/S017C003P003R001A048_rgb 48', 'nauseaVomiting/S004C002P003R001A048_rgb 48', 'nauseaVomiting/S014C003P019R002A048_rgb 48', 'nauseaVomiting/S017C001P007R001A048_rgb 48', 'nauseaVomiting/S014C002P007R002A048_rgb 48', 'nauseaVomiting/S006C003P007R002A048_rgb 48', 'nauseaVomiting/S006C003P015R001A048_rgb 48', 'nauseaVomiting/S005C003P013R001A048_rgb 48', 'nauseaVomiting/S007C003P027R001A048_rgb 48', 'nauseaVomiting/S012C002P017R001A048_rgb 48', 'nauseaVomiting/S016C002P025R001A048_rgb 48', 'nauseaVomiting/S005C001P010R001A048_rgb 48', 'nauseaVomiting/S008C003P031R002A048_rgb 48', 'nauseaVomiting/S002C001P009R002A048_rgb 48', 'nauseaVomiting/S014C002P025R001A048_rgb 48', 'nauseaVomiting/S007C001P017R002A048_rgb 48', 'nauseaVomiting/S001C003P002R002A048_rgb 48', 'nauseaVomiting/S004C003P003R002A048_rgb 48', 'nauseaVomiting/S005C001P004R002A048_rgb 48', 'nauseaVomiting/S005C002P017R002A048_rgb 48', 'nauseaVomiting/S013C002P037R001A048_rgb 48', 'nauseaVomiting/S011C003P001R002A048_rgb 48', 'nauseaVomiting/S007C003P019R002A048_rgb 48', 'nauseaVomiting/S010C002P008R001A048_rgb 48', 'nauseaVomiting/S003C003P016R002A048_rgb 48', 'nauseaVomiting/S013C002P015R001A048_rgb 48', 'nauseaVomiting/S005C003P021R002A048_rgb 48', 'nauseaVomiting/S002C002P008R002A048_rgb 48', 'nauseaVomiting/S002C003P012R001A048_rgb 48', 'nauseaVomiting/S010C002P013R002A048_rgb 48', 'nauseaVomiting/S011C003P025R002A048_rgb 48', 'nauseaVomiting/S006C003P022R001A048_rgb 48', 'nauseaVomiting/S002C001P007R001A048_rgb 48', 'nauseaVomiting/S010C001P015R001A048_rgb 48', 'nauseaVomiting/S002C001P014R001A048_rgb 48', 'nauseaVomiting/S007C001P026R001A048_rgb 48', 'nauseaVomiting/S009C002P007R001A048_rgb 48', 'nauseaVomiting/S008C001P032R002A048_rgb 48', 'nauseaVomiting/S002C001P012R002A048_rgb 48', 'nauseaVomiting/S007C003P007R001A048_rgb 48', 'nauseaVomiting/S013C003P015R002A048_rgb 48', 'nauseaVomiting/S007C002P007R001A048_rgb 48', 'nauseaVomiting/S001C001P005R001A048_rgb 48', 'nauseaVomiting/S017C001P007R002A048_rgb 48', 'nauseaVomiting/S006C002P016R002A048_rgb 48', 'nauseaVomiting/S010C003P007R002A048_rgb 48', 'nauseaVomiting/S002C003P009R002A048_rgb 48', 'nauseaVomiting/S004C002P003R002A048_rgb 48', 'nauseaVomiting/S007C003P018R001A048_rgb 48', 
'nauseaVomiting/S008C003P030R002A048_rgb 48', 'nauseaVomiting/S014C002P017R001A048_rgb 48', 'nauseaVomiting/S009C003P019R002A048_rgb 48', 'nauseaVomiting/S013C002P017R001A048_rgb 48', 'nauseaVomiting/S017C002P007R001A048_rgb 48', 'nauseaVomiting/S017C001P020R002A048_rgb 48', 'nauseaVomiting/S011C003P038R001A048_rgb 48', 'nauseaVomiting/S006C002P001R002A048_rgb 48', 'nauseaVomiting/S003C003P015R002A048_rgb 48', 'nauseaVomiting/S003C001P016R002A048_rgb 48', 'nauseaVomiting/S015C002P007R001A048_rgb 48', 'nauseaVomiting/S008C002P034R001A048_rgb 48', 'nauseaVomiting/S016C002P008R002A048_rgb 48', 'nauseaVomiting/S014C002P019R001A048_rgb 48', 'nauseaVomiting/S007C002P008R002A048_rgb 48', 'nauseaVomiting/S007C002P026R002A048_rgb 48', 'nauseaVomiting/S006C001P023R001A048_rgb 48', 'nauseaVomiting/S017C002P003R001A048_rgb 48', 'nauseaVomiting/S007C003P016R001A048_rgb 48', 'nauseaVomiting/S014C003P025R001A048_rgb 48', 'nauseaVomiting/S007C001P001R001A048_rgb 48', 'nauseaVomiting/S007C002P025R002A048_rgb 48', 'nauseaVomiting/S004C002P008R001A048_rgb 48', 'nauseaVomiting/S015C001P016R001A048_rgb 48', 'nauseaVomiting/S009C002P015R001A048_rgb 48', 'nauseaVomiting/S010C003P021R001A048_rgb 48', 'nauseaVomiting/S008C002P036R001A048_rgb 48', 'fanSelf/S013C003P018R001A049_rgb 49', 'fanSelf/S012C002P025R001A049_rgb 49', 'fanSelf/S004C002P008R002A049_rgb 49', 'fanSelf/S017C001P008R001A049_rgb 49', 'fanSelf/S012C003P027R002A049_rgb 49', 'fanSelf/S005C002P018R001A049_rgb 49', 'fanSelf/S013C001P028R001A049_rgb 49', 'fanSelf/S008C002P034R001A049_rgb 49', 'fanSelf/S010C003P015R001A049_rgb 49', 'fanSelf/S002C001P011R002A049_rgb 49', 'fanSelf/S008C003P007R002A049_rgb 49', 'fanSelf/S012C003P016R001A049_rgb 49', 'fanSelf/S011C002P002R001A049_rgb 49', 'fanSelf/S003C003P001R002A049_rgb 49', 'fanSelf/S016C001P021R002A049_rgb 49', 'fanSelf/S011C001P019R002A049_rgb 49', 'fanSelf/S001C001P007R001A049_rgb 49', 'fanSelf/S012C001P018R002A049_rgb 49', 'fanSelf/S007C002P018R002A049_rgb 49', 'fanSelf/S013C001P019R002A049_rgb 49', 'fanSelf/S014C002P008R001A049_rgb 49', 'fanSelf/S008C003P034R001A049_rgb 49', 'fanSelf/S010C001P008R001A049_rgb 49', 'fanSelf/S014C003P025R002A049_rgb 49', 'fanSelf/S013C002P018R002A049_rgb 49', 'fanSelf/S009C003P015R001A049_rgb 49', 'fanSelf/S004C002P007R002A049_rgb 49', 'fanSelf/S013C002P008R001A049_rgb 49', 'fanSelf/S011C002P008R001A049_rgb 49', 'fanSelf/S003C003P019R002A049_rgb 49', 'fanSelf/S011C001P007R002A049_rgb 49', 'fanSelf/S012C001P017R001A049_rgb 49', 'fanSelf/S012C002P016R002A049_rgb 49', 'fanSelf/S005C002P015R002A049_rgb 49', 'fanSelf/S013C003P007R001A049_rgb 49', 'fanSelf/S017C001P009R002A049_rgb 49', 'fanSelf/S010C002P019R001A049_rgb 49', 'fanSelf/S007C003P019R001A049_rgb 49', 'fanSelf/S010C003P021R001A049_rgb 49', 'fanSelf/S013C002P027R001A049_rgb 49', 'fanSelf/S014C002P007R002A049_rgb 49', 'fanSelf/S002C001P003R002A049_rgb 49', 'fanSelf/S003C002P019R001A049_rgb 49', 'fanSelf/S017C001P020R002A049_rgb 49', 'fanSelf/S010C002P018R001A049_rgb 49', 'fanSelf/S011C002P002R002A049_rgb 49', 'fanSelf/S013C002P007R001A049_rgb 49', 'fanSelf/S009C001P017R002A049_rgb 49', 'fanSelf/S006C001P022R001A049_rgb 49', 'fanSelf/S017C002P020R001A049_rgb 49', 'fanSelf/S007C001P027R001A049_rgb 49', 'fanSelf/S014C003P027R001A049_rgb 49', 'fanSelf/S003C002P001R001A049_rgb 49', 'fanSelf/S008C001P030R002A049_rgb 49', 'fanSelf/S008C001P008R001A049_rgb 49', 'fanSelf/S012C002P027R001A049_rgb 49', 'fanSelf/S006C003P016R001A049_rgb 49', 'fanSelf/S004C001P007R001A049_rgb 49', 'fanSelf/S001C003P004R001A049_rgb 49', 
'fanSelf/S012C002P017R001A049_rgb 49', 'fanSelf/S006C003P019R002A049_rgb 49', 'fanSelf/S002C003P011R001A049_rgb 49', 'fanSelf/S008C003P019R002A049_rgb 49', 'fanSelf/S003C003P008R001A049_rgb 49', 'fanSelf/S008C001P025R001A049_rgb 49', 'fanSelf/S013C002P027R002A049_rgb 49', 'fanSelf/S010C001P019R001A049_rgb 49', 'fanSelf/S008C003P030R002A049_rgb 49', 'fanSelf/S008C002P025R002A049_rgb 49', 'fanSelf/S010C002P017R001A049_rgb 49', 'fanSelf/S014C003P007R001A049_rgb 49', 'fanSelf/S001C003P006R001A049_rgb 49', 'fanSelf/S013C001P016R002A049_rgb 49', 'fanSelf/S008C001P035R001A049_rgb 49', 'fanSelf/S001C003P008R001A049_rgb 49', 'fanSelf/S011C002P038R001A049_rgb 49', 'fanSelf/S016C001P039R002A049_rgb 49', 'fanSelf/S009C002P007R002A049_rgb 49', 'fanSelf/S005C002P013R002A049_rgb 49', 'fanSelf/S007C002P001R001A049_rgb 49', 'fanSelf/S013C001P019R001A049_rgb 49', 'fanSelf/S001C001P004R001A049_rgb 49', 'fanSelf/S002C001P014R001A049_rgb 49', 'fanSelf/S001C003P002R002A049_rgb 49', 'fanSelf/S012C002P028R001A049_rgb 49', 'fanSelf/S008C003P031R001A049_rgb 49', 'fanSelf/S014C003P037R001A049_rgb 49', 'fanSelf/S007C001P016R002A049_rgb 49', 'fanSelf/S017C001P016R001A049_rgb 49', 'fanSelf/S004C002P003R002A049_rgb 49', 'fanSelf/S016C002P008R002A049_rgb 49', 'fanSelf/S017C001P009R001A049_rgb 49', 'fanSelf/S008C003P033R001A049_rgb 49', 'fanSelf/S017C001P003R002A049_rgb 49', 'fanSelf/S009C003P016R002A049_rgb 49', 'fanSelf/S012C002P025R002A049_rgb 49', 'fanSelf/S008C002P029R002A049_rgb 49', 'fanSelf/S013C003P025R001A049_rgb 49', 'fanSelf/S010C001P008R002A049_rgb 49', 'fanSelf/S017C002P007R002A049_rgb 49', 'fanSelf/S010C002P007R001A049_rgb 49', 'fanSelf/S017C002P008R002A049_rgb 49', 'fanSelf/S010C002P025R001A049_rgb 49', 'fanSelf/S011C002P025R001A049_rgb 49', 'fanSelf/S016C003P025R001A049_rgb 49', 'fanSelf/S008C001P031R001A049_rgb 49', 'fanSelf/S005C002P013R001A049_rgb 49', 'fanSelf/S010C001P016R002A049_rgb 49', 'fanSelf/S016C003P039R002A049_rgb 49', 'fanSelf/S001C002P006R001A049_rgb 49', 'fanSelf/S011C003P001R001A049_rgb 49', 'fanSelf/S012C001P016R002A049_rgb 49', 'fanSelf/S010C003P008R001A049_rgb 49', 'fanSelf/S003C003P015R001A049_rgb 49', 'fanSelf/S007C001P025R001A049_rgb 49', 'fanSelf/S012C001P028R001A049_rgb 49', 'fanSelf/S008C003P033R002A049_rgb 49', 'fanSelf/S002C003P007R001A049_rgb 49', 'fanSelf/S017C001P015R001A049_rgb 49', 'fanSelf/S007C003P018R002A049_rgb 49', 'fanSelf/S017C003P015R002A049_rgb 49', 'fanSelf/S001C003P008R002A049_rgb 49', 'fanSelf/S016C002P039R001A049_rgb 49', 'fanSelf/S001C001P003R001A049_rgb 49', 'fanSelf/S011C003P016R002A049_rgb 49', 'fanSelf/S010C001P007R001A049_rgb 49', 'fanSelf/S007C001P017R002A049_rgb 49', 'fanSelf/S008C001P033R001A049_rgb 49', 'fanSelf/S003C003P007R001A049_rgb 49', 'fanSelf/S009C002P008R001A049_rgb 49', 'fanSelf/S013C002P015R001A049_rgb 49', 'fanSelf/S008C002P031R002A049_rgb 49', 'fanSelf/S010C001P025R001A049_rgb 49', 'fanSelf/S009C001P007R001A049_rgb 49', 'fanSelf/S009C003P025R002A049_rgb 49', 'fanSelf/S008C002P032R001A049_rgb 49', 'fanSelf/S005C003P015R001A049_rgb 49', 'fanSelf/S006C001P008R001A049_rgb 49', 'fanSelf/S009C002P019R001A049_rgb 49', 'fanSelf/S011C002P028R001A049_rgb 49', 'fanSelf/S014C002P017R001A049_rgb 49', 'fanSelf/S012C002P037R002A049_rgb 49', 'fanSelf/S010C001P017R001A049_rgb 49', 'fanSelf/S008C001P019R001A049_rgb 49', 'fanSelf/S015C003P007R002A049_rgb 49', 'fanSelf/S009C002P016R001A049_rgb 49', 'fanSelf/S016C002P008R001A049_rgb 49', 'fanSelf/S010C002P019R002A049_rgb 49', 'fanSelf/S014C001P008R002A049_rgb 49', 'fanSelf/S015C001P025R001A049_rgb 49', 
'fanSelf/S014C003P019R002A049_rgb 49', 'fanSelf/S010C002P013R001A049_rgb 49', 'fanSelf/S016C003P039R001A049_rgb 49', 'fanSelf/S009C001P017R001A049_rgb 49', 'fanSelf/S007C001P019R001A049_rgb 49', 'fanSelf/S009C003P007R001A049_rgb 49', 'fanSelf/S008C001P015R002A049_rgb 49', 'fanSelf/S008C003P001R001A049_rgb 49', 'fanSelf/S016C002P021R002A049_rgb 49', 'fanSelf/S011C003P002R002A049_rgb 49', 'fanSelf/S014C003P039R002A049_rgb 49', 'fanSelf/S001C002P008R002A049_rgb 49', 'fanSelf/S007C002P025R001A049_rgb 49', 'fanSelf/S003C002P015R002A049_rgb 49', 'fanSelf/S013C002P017R001A049_rgb 49', 'fanSelf/S012C001P008R002A049_rgb 49', 'fanSelf/S007C003P027R002A049_rgb 49', 'fanSelf/S002C003P010R001A049_rgb 49', 'fanSelf/S011C001P015R001A049_rgb 49', 'fanSelf/S015C003P017R001A049_rgb 49', 'fanSelf/S011C002P017R001A049_rgb 49', 'fanSelf/S012C003P017R001A049_rgb 49', 'fanSelf/S012C002P018R002A049_rgb 49', 'fanSelf/S005C003P004R002A049_rgb 49', 'fanSelf/S004C002P020R001A049_rgb 49', 'fanSelf/S016C001P008R001A049_rgb 49', 'fanSelf/S008C001P032R001A049_rgb 49', 'fanSelf/S006C003P023R002A049_rgb 49', 'fanSelf/S008C002P029R001A049_rgb 49', 'fanSelf/S016C002P040R002A049_rgb 49', 'fanSelf/S006C001P001R001A049_rgb 49', 'fanSelf/S011C002P001R002A049_rgb 49', 'fanSelf/S013C003P015R002A049_rgb 49', 'fanSelf/S012C001P025R002A049_rgb 49', 'fanSelf/S001C002P004R002A049_rgb 49', 'fanSelf/S016C003P019R002A049_rgb 49', 'fanSelf/S015C003P008R002A049_rgb 49', 'fanSelf/S001C002P006R002A049_rgb 49', 'fanSelf/S013C001P007R002A049_rgb 49', 'fanSelf/S009C003P007R002A049_rgb 49']\n"
],
[
"# for group, videos in file_groups.items():\n# for video in videos:\n# # Get the parts.\n# print(video)\n# parts = video.split(os.path.sep)\n# classname = parts[0]\n# filename = parts[1].split(\" \")[0]",
"_____no_output_____"
],
[
"# print(classname)",
"_____no_output_____"
],
[
"# if not os.path.exists(os.path.join('/home/shared/workspace/rose_ntu_dataset/C3D/', group, classname)):\n# print(\"Creating folder for %s/%s\" % (group, classname))\n# os.makedirs(os.path.join('/home/shared/workspace/rose_ntu_dataset/C3D/'+group, classname))",
"_____no_output_____"
],
[
"# class_file = '/home/shared/workspace/rose_ntu_dataset/nturgb/'+ classname+'/'+filename+'.avi'\n# print(class_file)\n# if not os.path.exists(class_file):\n# print(\"/home/shared/workspace/rose_ntu_dataset/nturgb\"+classname+'/'+filename)\n# print(\"Can't find %s to move. Skipping.\" % (class_dir))\n \n# dest = os.path.join(\"/home/shared/workspace/rose_ntu_dataset/C3D/\",group, classname)\n# print(\"Copying %s to %s\" % (class_file, dest))\n# shutil.copy(class_file, dest)",
"_____no_output_____"
],
[
"def move_files(file_groups):\n \"\"\"This assumes all of our files are currently in _this_ directory.\n So move them to the appropriate spot. Only needs to happen once.\n \"\"\"\n # Do each of our groups.\n for group, videos in file_groups.items():\n \n # Do each of our videos.\n for video in videos:\n\n # Get the parts.\n parts = video.split(os.path.sep)\n classname = parts[0]\n filename = parts[1].split(\" \")[0]\n\n # Check if this class exists.\n if not os.path.exists(os.path.join('/home/shared/workspace/rose_ntu_dataset/C3D/'+ group, classname)):\n print(\"Creating folder for %s/%s\" % (group, classname))\n os.makedirs(os.path.join('/home/shared/workspace/rose_ntu_dataset/C3D/'+group, classname))\n\n # Check if we have already moved this file, or at least that it\n # exists to move.\n class_file = '/home/shared/workspace/rose_ntu_dataset/nturgb/'+ classname+'/'+filename+'.avi'\n if not os.path.exists(class_file):\n print(\"Can't find %s to move. Skipping.\" % (class_file))\n continue\n\n # Move it.\n dest = os.path.join(\"/home/shared/workspace/rose_ntu_dataset/C3D/\",group, classname)\n print(\"Moving %s to %s\" % (filename, dest))\n print(\"Copying %s to %s\" % (class_file, dest))\n shutil.copy(class_file, dest)",
"_____no_output_____"
],
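[
"# Hedged sketch (not part of the original pipeline): dry-run the path logic\n# used inside move_files() on a tiny made-up file_groups dict, so the split\n# rules are visible without touching the filesystem. Assumes os is imported.\ndemo_groups = {'train': ['headache/S001C001P001R001A044_rgb 44']}\nfor group, videos in demo_groups.items():\n    for video in videos:\n        parts = video.split(os.path.sep)\n        classname = parts[0]\n        filename = parts[1].split(\" \")[0]\n        print(group, classname, filename)  # train headache S001C001P001R001A044_rgb",
"_____no_output_____"
],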
[
"# move_files(file_groups)",
"_____no_output_____"
],
[
"pwd",
"_____no_output_____"
],
[
"#cd data",
"_____no_output_____"
],
[
"def extract_files():\n \"\"\"After we have all of our videos split between train and test, and\n all nested within folders representing their classes, we need to\n make a data file that we can reference when training our RNN(s).\n This will let us keep track of image sequences and other parts\n of the training process.\n\n We'll first need to extract images from each of the videos. We'll\n need to record the following data in the file:\n\n [train|test], class, filename, nb frames\n\n Extracting can be done with ffmpeg:\n `ffmpeg -i video.mpg image-%04d.jpg`\n \"\"\"\n data_file = []\n folders = ['train', 'test']\n\n for folder in folders:\n class_folders = glob.glob(os.path.join(folder, '*'))\n print(class_folders)\n for vid_class in class_folders:\n class_files = glob.glob(os.path.join(vid_class, '*.avi'))\n print(class_files)\n for video_path in class_files:\n # Get the parts of the file.\n video_parts = get_video_parts(video_path)\n\n train_or_test, classname, filename_no_ext, filename = video_parts\n print(train_or_test, classname, filename_no_ext, filename)\n\n # Only extract if we haven't done it yet. Otherwise, just get\n # the info.\n if not check_already_extracted(video_parts):\n # Now extract it.\n src = os.path.join(train_or_test, classname, filename)\n dest = os.path.join(train_or_test, classname,\n filename_no_ext + '-%04d.jpg')\n call([\"ffmpeg\", \"-i\", src, dest])\n\n # Now get how many frames it is.\n nb_frames = get_nb_frames_for_video(video_parts)\n\n data_file.append([train_or_test, classname, filename_no_ext, nb_frames])\n\n print(\"Generated %d frames for %s\" % (nb_frames, filename_no_ext))\n\n with open('data_file.csv', 'w') as fout:\n writer = csv.writer(fout)\n writer.writerows(data_file)\n\n print(\"Extracted and wrote %d video files.\" % (len(data_file)))",
"_____no_output_____"
],
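[
"# Hedged sketch: read data_file.csv back after extract_files() has run.\n# Each row is [train|test], class, filename (no extension), frame count --\n# the layout described in the docstring above. Assumes csv is already\n# imported (extract_files() itself uses csv.writer).\nwith open('data_file.csv') as fin:\n    for train_or_test, classname, filename_no_ext, nb_frames in csv.reader(fin):\n        print(train_or_test, classname, filename_no_ext, nb_frames)\n        break  # preview only the first row",
"_____no_output_____"
],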
[
"def get_video_parts(video_path):\n \"\"\"Given a full path to a video, return its parts.\"\"\"\n parts = video_path.split(os.path.sep)\n filename = parts[2]\n filename_no_ext = filename.split('.')[0]\n classname = parts[1]\n train_or_test = parts[0]\n\n return train_or_test, classname, filename_no_ext, filename",
"_____no_output_____"
],
[
"def check_already_extracted(video_parts):\n \"\"\"Check to see if we created the -0001 frame of this file.\"\"\"\n train_or_test, classname, filename_no_ext, _ = video_parts\n return bool(os.path.exists(os.path.join(train_or_test, classname,\n filename_no_ext + '-0001.jpg')))",
"_____no_output_____"
],
[
"def get_nb_frames_for_video(video_parts):\n \"\"\"Given video parts of an (assumed) already extracted video, return\n the number of frames that were extracted.\"\"\"\n train_or_test, classname, filename_no_ext, _ = video_parts\n generated_files = glob.glob(os.path.join(train_or_test, classname,\n filename_no_ext + '*.jpg'))\n return len(generated_files)",
"_____no_output_____"
],
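[
"# Illustrative check of get_video_parts() on a made-up relative path; the\n# other two helpers also need the extracted jpgs on disk, so only the pure\n# string parsing is demonstrated here.\nexample = os.path.join('train', 'headache', 'S001C001P001R001A044_rgb.avi')\nprint(get_video_parts(example))\n# ('train', 'headache', 'S001C001P001R001A044_rgb', 'S001C001P001R001A044_rgb.avi')",
"_____no_output_____"
],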
[
"# extract_files()",
"_____no_output_____"
],
[
"pwd",
"_____no_output_____"
],
[
"def train(data_type, seq_length, model, saved_model=None,\n class_limit=None, image_shape=None,\n load_to_memory=False, batch_size=32, nb_epoch=100):\n # Helper: Save the model.\n checkpointer = ModelCheckpoint(\n filepath=os.path.join('data', 'checkpoints', model + '-' + data_type + \\\n '.{epoch:03d}-{val_loss:.3f}.hdf5'),\n verbose=1,\n save_best_only=True)\n\n # Helper: TensorBoard\n tb = TensorBoard(log_dir=os.path.join('data', 'logs', model))\n\n # Helper: Stop when we stop learning.\n early_stopper = EarlyStopping(patience=5)\n\n # Helper: Save results.\n timestamp = time.time()\n csv_logger = CSVLogger(os.path.join('data', 'logs', model + '-' + 'training-' + \\\n str(timestamp) + '.log'))\n\n # Get the data and process it.\n if image_shape is None:\n data = DataSet(\n seq_length=seq_length,\n class_limit=class_limit\n )\n else:\n data = DataSet(\n seq_length=seq_length,\n class_limit=class_limit,\n image_shape=image_shape\n )\n\n # Get samples per epoch.\n # Multiply by 0.7 to attempt to guess how much of data.data is the train set.\n steps_per_epoch = (len(data.data) * 0.7) // batch_size\n\n if load_to_memory:\n # Get data.\n X, y = data.get_all_sequences_in_memory('train', data_type)\n X_test, y_test = data.get_all_sequences_in_memory('test', data_type)\n else:\n # Get generators.\n generator = data.frame_generator(batch_size, 'train', data_type)\n val_generator = data.frame_generator(batch_size, 'test', data_type)\n\n # Get the model.\n rm = call_c3d(seq_length, len(data.classes))\n\n # Fit!\n if load_to_memory:\n # Use standard fit.\n rm.fit(\n X,\n y,\n batch_size=batch_size,\n validation_data=(X_test, y_test),\n verbose=1,\n callbacks=[tb, early_stopper, csv_logger],\n epochs=nb_epoch)\n else:\n # Use fit generator.\n rm.fit(\n generator,\n steps_per_epoch=steps_per_epoch,\n epochs=nb_epoch,\n verbose=1,\n callbacks=[tb, early_stopper, csv_logger, checkpointer],\n validation_data=val_generator,\n validation_steps=40,\n workers=4)",
"_____no_output_____"
],
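[
"# Worked example of the steps_per_epoch guess in train(): the training log\n# further below reports 6825 train samples, i.e. 0.7 * 9750 total clips, so\n# with batch_size=64 the generator runs 6825 // 64 = 106 steps per epoch.\ntotal_clips, batch_size = 9750, 64\nprint(int(total_clips * 0.7) // batch_size)  # 106",
"_____no_output_____"
],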
[
"checkpointer = ModelCheckpoint(\n filepath=os.path.join('data', 'checkpoints', 'c3d' + '-' + 'image' + \\\n '.{epoch:03d}-{val_loss:.3f}.hdf5'),\n verbose=1,\n save_best_only=True)",
"_____no_output_____"
],
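[
"# Illustration of how Keras fills the checkpoint filename template above\n# (plain str.format, with hypothetical epoch/val_loss values):\nprint('c3d-image.{epoch:03d}-{val_loss:.3f}.hdf5'.format(epoch=1, val_loss=2.1347))\n# -> c3d-image.001-2.135.hdf5",
"_____no_output_____"
],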
[
"from keras.layers import Dense, Flatten, Dropout, ZeroPadding3D\nfrom keras.layers.recurrent import LSTM\nfrom keras.models import Sequential, load_model\nfrom tensorflow.keras.optimizers import Adam,RMSprop\nfrom keras.layers.wrappers import TimeDistributed\nfrom keras.layers.convolutional import (Conv2D, MaxPooling3D, Conv3D,\n MaxPooling2D)\nfrom collections import deque\nimport sys",
"_____no_output_____"
],
[
"def c3d(input_shape, num_classes):\n \"\"\"\n Build a 3D convolutional network, aka C3D.\n https://arxiv.org/pdf/1412.0767.pdf\n\n With thanks:\n https://gist.github.com/albertomontesg/d8b21a179c1e6cca0480ebdf292c34d2\n \"\"\"\n model = Sequential()\n\n # 1st layer group\n model.add(Conv3D(64, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv1a\",\n input_shape=input_shape))\n\n model.add(MaxPooling3D(pool_size=(1, 2, 2), strides=(1, 2, 2),\n padding='valid', name='pool1'))\n # 2nd layer group\n model.add(Conv3D(128, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv2a\"))\n\n model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n padding='valid', name='pool2'))\n # 3rd layer group\n model.add(Conv3D(256, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv3a\"))\n\n model.add(Conv3D(256, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv3b\"))\n\n model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n padding='valid', name='pool3'))\n # 4th layer group\n model.add(Conv3D(512, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv4a\"))\n\n model.add(Conv3D(512, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv4b\"))\n\n model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n padding='valid', name='pool4'))\n\n # 5th layer group\n model.add(Conv3D(512, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv5a\"))\n\n model.add(Conv3D(512, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv5b\"))\n\n #model.add(ZeroPadding3D(padding=(0, 1, 1)))\n model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n padding='valid', name='pool5'))\n model.add(Flatten())\n\n # FC layers group\n model.add(Dense(4096, activation='relu', name='fc6'))\n #model.add(Dropout(0.5))\n model.add(Dense(4096, activation='relu', name='fc7'))\n #model.add(Dropout(0.5))\n model.add(Dense(num_classes, activation='softmax', dtype=\"float32\"))\n\n return model",
"_____no_output_____"
],
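[
"# Hedged sanity check (assumes TensorFlow 2.x is installed): push a dummy\n# clip through the full C3D. Five poolings shrink (16, 80, 80) to (1, 2, 2)\n# with 512 channels, so Flatten sees 2048 features before the fc6/fc7 head.\nimport numpy as np\nm = c3d((16, 80, 80, 3), 9)  # 9 classes, as in this notebook\nprint(m.predict(np.zeros((1, 16, 80, 80, 3), dtype='float32')).shape)  # (1, 9)",
"_____no_output_____"
],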
[
"def shallow_c3d(input_shape, num_classes):\n \"\"\"\n Build a 3D convolutional network, aka C3D.\n https://arxiv.org/pdf/1412.0767.pdf\n\n With thanks:\n https://gist.github.com/albertomontesg/d8b21a179c1e6cca0480ebdf292c34d2\n \"\"\"\n model = Sequential()\n\n # 1st layer group\n model.add(Conv3D(32, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv1a\",\n input_shape=input_shape))\n\n model.add(MaxPooling3D(pool_size=(1, 2, 2), strides=(1, 2, 2),\n padding='valid', name='pool1'))\n # 2nd layer group\n model.add(Conv3D(64, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv2a\"))\n\n model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n padding='valid', name='pool2'))\n # 3rd layer group\n model.add(Conv3D(128, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv3a\"))\n\n model.add(Conv3D(128, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv3b\"))\n\n model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n padding='valid', name='pool3'))\n # 4th layer group\n model.add(Conv3D(256, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv4a\"))\n\n model.add(Conv3D(256, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv4b\"))\n\n model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n padding='valid', name='pool4'))\n\n# # 5th layer group\n# model.add(Conv3D(512, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv5a\"))\n\n# model.add(Conv3D(512, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv5b\"))\n\n# #model.add(ZeroPadding3D(padding=(0, 1, 1)))\n# model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n# padding='valid', name='pool5'))\n model.add(Flatten())\n\n # FC layers group\n model.add(Dense(1024, activation='relu', name='fc5'))\n model.add(Dropout(0.5))\n model.add(Dense(1024, activation='relu', name='fc6'))\n model.add(Dropout(0.5))\n model.add(Dense(num_classes, activation='softmax', dtype=\"float32\"))\n\n return model",
"_____no_output_____"
],
[
"def more_shallow_c3d(input_shape, num_classes):\n \"\"\"\n Build a 3D convolutional network, aka C3D.\n https://arxiv.org/pdf/1412.0767.pdf\n\n With thanks:\n https://gist.github.com/albertomontesg/d8b21a179c1e6cca0480ebdf292c34d2\n \"\"\"\n model = Sequential()\n\n # 1st layer group\n model.add(Conv3D(32, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv1a\",\n input_shape=input_shape))\n\n model.add(MaxPooling3D(pool_size=(1, 2, 2), strides=(1, 2, 2),\n padding='valid', name='pool1'))\n # 2nd layer group\n model.add(Conv3D(32, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv2a\"))\n\n model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n padding='valid', name='pool2'))\n model.add(Dropout(0.25))\n \n # 3rd layer group\n model.add(Conv3D(64, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv3a\"))\n\n model.add(Conv3D(64, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv3b\"))\n\n model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n padding='valid', name='pool3'))\n model.add(Dropout(0.25))\n\n # 4th layer group\n# model.add(Conv3D(256, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv4a\"))\n\n# model.add(Conv3D(256, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv4b\"))\n\n# model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n# padding='valid', name='pool4'))\n\n# # 5th layer group\n# model.add(Conv3D(512, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv5a\"))\n\n# model.add(Conv3D(512, kernel_size=(3,3,3), strides=(1,1,1), activation='relu', padding='same', name=\"conv5b\"))\n\n# #model.add(ZeroPadding3D(padding=(0, 1, 1)))\n# model.add(MaxPooling3D(pool_size=(2, 2, 2), strides=(2, 2, 2),\n# padding='valid', name='pool5'))\n model.add(Flatten())\n\n # FC layers group\n# model.add(Dense(1024, activation='relu', name='fc6'))\n# model.add(Dropout(0.5))\n model.add(Dense(512, activation='relu', name='fc4'))\n model.add(Dropout(0.5))\n model.add(Dense(num_classes, activation='softmax', dtype=\"float32\"))\n\n return model",
"_____no_output_____"
],
[
"def call_c3d(seq_length, num_classes):\n input_shape = (seq_length, 80, 80, 3)\n# model = c3d(input_shape, num_classes)\n model = more_shallow_c3d(input_shape, num_classes)\n# model = shallow_c3d(input_shape, num_classes)\n optimizer = Adam(learning_rate =1e-3, decay=1e-6)\n model.compile(loss='categorical_crossentropy', optimizer=optimizer, metrics=['accuracy'])\n model.summary()\n return model",
"_____no_output_____"
],
[
"train('images', 16, 'c3d', saved_model=None,\n class_limit=None, image_shape=(80,80,3),\n load_to_memory=False, batch_size=64, nb_epoch=20)",
"Model: \"sequential\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nconv1a (Conv3D) (None, 16, 80, 80, 32) 2624 \n_________________________________________________________________\npool1 (MaxPooling3D) (None, 16, 40, 40, 32) 0 \n_________________________________________________________________\nconv2a (Conv3D) (None, 16, 40, 40, 32) 27680 \n_________________________________________________________________\npool2 (MaxPooling3D) (None, 8, 20, 20, 32) 0 \n_________________________________________________________________\ndropout (Dropout) (None, 8, 20, 20, 32) 0 \n_________________________________________________________________\nconv3a (Conv3D) (None, 8, 20, 20, 64) 55360 \n_________________________________________________________________\nconv3b (Conv3D) (None, 8, 20, 20, 64) 110656 \n_________________________________________________________________\npool3 (MaxPooling3D) (None, 4, 10, 10, 64) 0 \n_________________________________________________________________\ndropout_1 (Dropout) (None, 4, 10, 10, 64) 0 \n_________________________________________________________________\nflatten (Flatten) (None, 25600) 0 \n_________________________________________________________________\nfc4 (Dense) (None, 512) 13107712 \n_________________________________________________________________\ndropout_2 (Dropout) (None, 512) 0 \n_________________________________________________________________\ndense (Dense) (None, 9) 4617 \n=================================================================\nTotal params: 13,308,649\nTrainable params: 13,308,649\nNon-trainable params: 0\n_________________________________________________________________\nCreating train generator with 6825 samples.\nEpoch 1/20\n"
]
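The layer shapes in the summary above can be checked by hand: for a `Conv3D` layer, params = in_channels × k³ × filters + filters, and `Flatten` multiplies out the post-`pool3` map of (4, 10, 10, 64). A minimal standalone sketch verifying the reported counts (names are illustrative):

```python
# Verify the Conv3D/Dense parameter counts printed by model.summary() above.
def conv3d_params(in_ch, out_ch, k=3):
    # weights (in_ch * k^3 per filter) plus one bias per filter
    return in_ch * k**3 * out_ch + out_ch

assert conv3d_params(3, 32) == 2624       # conv1a
assert conv3d_params(32, 32) == 27680     # conv2a
assert conv3d_params(32, 64) == 55360     # conv3a
assert conv3d_params(64, 64) == 110656    # conv3b
assert 4 * 10 * 10 * 64 == 25600          # Flatten output after pool3
assert 25600 * 512 + 512 == 13107712      # fc4 (Dense): inputs * units + biases
```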
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79f6fa57cfccad389a9d32ae0f53b65ec1214e1 | 27,842 | ipynb | Jupyter Notebook | data_processing.ipynb | annwhoorma/pmldl-project | bf8cc02849c70141927c8fb775af9fe76b370977 | [
"MIT"
] | null | null | null | data_processing.ipynb | annwhoorma/pmldl-project | bf8cc02849c70141927c8fb775af9fe76b370977 | [
"MIT"
] | null | null | null | data_processing.ipynb | annwhoorma/pmldl-project | bf8cc02849c70141927c8fb775af9fe76b370977 | [
"MIT"
] | null | null | null | 35.198483 | 365 | 0.489045 | [
[
[
"### Imports",
"_____no_output_____"
]
],
[
[
"from IPython.display import clear_output",
"_____no_output_____"
],
[
"!pip install path.py\n!pip install pytorch3d\nclear_output()",
"_____no_output_____"
],
[
"import numpy as np\nimport math\nimport random\nimport os\nimport plotly.graph_objects as go\nimport plotly.express as px\n\nimport torch\nfrom torch.utils.data import Dataset, DataLoader, Subset\nfrom torchvision import transforms, utils\n\nfrom path import Path\n\nrandom.seed = 42",
"_____no_output_____"
],
[
"!wget http://3dvision.princeton.edu/projects/2014/3DShapeNets/ModelNet10.zip\n!unzip -q ModelNet10.zip\n\npath = Path(\"ModelNet10\")\n\nfolders = [dir for dir in sorted(os.listdir(path)) if os.path.isdir(path/dir)]\n\nclear_output()\nclasses = {folder: i for i, folder in enumerate(folders)}\nclasses",
"_____no_output_____"
],
[
"def default_transforms():\n return transforms.Compose([\n PointSampler(1024),\n Normalize(),\n RandomNoise(),\n ToSorted(),\n ToTensor()\n ])\n\n!gdown https://drive.google.com/uc?id=1CVwVxdfUfP6TRcVUjjJvQeRcgCGcnSO_\nfrom helping import *\nclear_output()",
"_____no_output_____"
]
],
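The transform classes used in `default_transforms()` are imported from the downloaded `helping.py`, whose source is not shown here. As a rough guide, a minimal sketch of what `Normalize`, `RandomNoise`, and `ToTensor` plausibly do — centering plus unit-sphere scaling, Gaussian jitter, and NumPy-to-Torch conversion; the actual implementations in `helping.py` may differ:

```python
import numpy as np
import torch

class Normalize:
    # Center the cloud at the origin and scale it into the unit sphere.
    def __call__(self, pointcloud):
        pc = pointcloud - pointcloud.mean(axis=0)
        return pc / np.max(np.linalg.norm(pc, axis=1))

class RandomNoise:
    # Add small Gaussian jitter to every point (sigma is an assumed default).
    def __init__(self, sigma=0.02):
        self.sigma = sigma
    def __call__(self, pointcloud):
        return pointcloud + self.sigma * np.random.randn(*pointcloud.shape)

class ToTensor:
    # Convert the (N, 3) NumPy array into a float tensor.
    def __call__(self, pointcloud):
        return torch.from_numpy(pointcloud).float()
```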
[
[
"### Data Preprocessing (optional)",
"_____no_output_____"
]
],
[
[
"with open(path/\"dresser/train/dresser_0001.off\", 'r') as f:\n verts, faces = read_off(f)\n\ni, j, k = np.array(faces).T\nx, y, z = np.array(verts).T\n\n# len(x)",
"_____no_output_____"
],
[
"# visualize_rotate([go.Mesh3d(x=x, y=y, z=z, color='lightpink', opacity=0.50, i=i,j=j,k=k)]).show()\n# visualize_rotate([go.Scatter3d(x=x, y=y, z=z, mode='markers')]).show()",
"_____no_output_____"
],
[
"# pcshow(x, y, z)",
"_____no_output_____"
],
[
"pointcloud = PointSampler(1024)((verts, faces))\n# pcshow(*pointcloud.T)\n\nnorm_pointcloud = Normalize()(pointcloud)\n# pcshow(*norm_pointcloud.T)\n\nnoisy_pointcloud = RandomNoise()(norm_pointcloud)\n# pcshow(*noisy_pointcloud.T)\n\nrot_pointcloud = RandomRotation_z()(noisy_pointcloud)\n# pcshow(*rot_pointcloud.T)\n\nsorted_pointcloud = ToSorted()(rot_pointcloud)\n# pcshow(*sorted_pointcloud.T)\n\ntensor_pointcloud = ToTensor()(sorted_pointcloud)",
"_____no_output_____"
]
],
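`read_off` also comes from `helping.py`. For reference, ModelNet10 meshes use the OFF format — an `OFF` header line, a counts line (`n_verts n_faces n_edges`), then vertex coordinates and face index records. A minimal parser sketch under the assumption of that common layout (some ModelNet files fuse the counts onto the header line, which this does not handle):

```python
def read_off_sketch(file):
    # Header must be the literal string 'OFF'
    if file.readline().strip() != 'OFF':
        raise ValueError('Not a valid OFF file')
    n_verts, n_faces, _ = map(int, file.readline().strip().split())
    verts = [list(map(float, file.readline().strip().split()))
             for _ in range(n_verts)]
    # Each face line is "k i1 ... ik"; drop the leading vertex count k
    faces = [list(map(int, file.readline().strip().split()))[1:]
             for _ in range(n_faces)]
    return verts, faces
```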
[
[
"### Creating Loaders for Final Progress Report",
"_____no_output_____"
],
[
"#### Redefine classes",
"_____no_output_____"
]
],
[
[
"class PointCloudData(Dataset):\n def __init__(self, root_dir, valid=False, folder=\"train\", transform=default_transforms(), folders=None):\n self.root_dir = root_dir\n if not folders:\n folders = [dir for dir in sorted(os.listdir(root_dir)) if os.path.isdir(root_dir/dir)]\n self.classes = {folder: i for i, folder in enumerate(folders)}\n self.transforms = transform\n self.valid = valid\n self.pcs = []\n for category in self.classes.keys():\n new_dir = root_dir/Path(category)/folder\n for file in os.listdir(new_dir):\n if file.endswith('.off'):\n sample = {}\n with open(new_dir/file, 'r') as f:\n verts, faces = read_off(f)\n sample['pc'] = (verts, faces)\n sample['category'] = category\n self.pcs.append(sample)\n\n def __len__(self):\n return len(self.pcs)\n\n def __getitem__(self, idx):\n pointcloud = self.transforms(self.pcs[idx]['pc'])\n category = self.pcs[idx]['category']\n return pointcloud, self.classes[category]\n \nclass PointCloudDataPre(Dataset):\n def __init__(self, root_dir, valid=False, folder=\"train\", transform=default_transforms(), folders=None):\n self.root_dir = root_dir\n if not folders:\n folders = [dir for dir in sorted(os.listdir(root_dir)) if os.path.isdir(root_dir/dir)]\n self.classes = {folder: i for i, folder in enumerate(folders)}\n self.transforms = transform\n self.valid = valid\n self.pcs = []\n for category in self.classes.keys():\n new_dir = root_dir/Path(category)/folder\n for file in os.listdir(new_dir):\n if file.endswith('.off'):\n sample = {}\n with open(new_dir/file, 'r') as f:\n verts, faces = read_off(f)\n sample['pc'] = self.transforms((verts, faces))\n sample['category'] = category\n self.pcs.append(sample)\n\n def __len__(self):\n return len(self.pcs)\n\n def __getitem__(self, idx):\n pointcloud = self.pcs[idx]['pc']\n category = self.pcs[idx]['category']\n return pointcloud, self.classes[category]\n \n \nclass PointCloudDataBoth(Dataset):\n def __init__(self, root_dir, valid=False, folder=\"train\", static_transform=default_transforms(), later_transform=None, folders=None):\n self.root_dir = root_dir\n if not folders:\n folders = [dir for dir in sorted(os.listdir(root_dir)) if os.path.isdir(root_dir/dir)]\n self.classes = {folder: i for i, folder in enumerate(folders)}\n self.static_transform = static_transform\n self.later_transform = later_transform\n self.valid = valid\n self.pcs = []\n for category in self.classes.keys():\n new_dir = root_dir/Path(category)/folder\n for file in os.listdir(new_dir):\n if file.endswith('.off'):\n sample = {}\n with open(new_dir/file, 'r') as f:\n verts, faces = read_off(f)\n sample['pc'] = self.static_transform((verts, faces))\n sample['category'] = category\n self.pcs.append(sample)\n\n def __len__(self):\n return len(self.pcs)\n\n def __getitem__(self, idx):\n pointcloud = self.pcs[idx]['pc']\n if self.later_transform is not None:\n pointcloud = self.later_transform(pointcloud)\n category = self.pcs[idx]['category']\n return pointcloud, self.classes[category]",
"_____no_output_____"
],
[
"!mkdir drive/MyDrive/Thesis/dataloaders/final",
"_____no_output_____"
]
],
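The three classes differ only in *when* the pipeline runs: `PointCloudDataPre` transforms everything once in `__init__` (fast epochs, but every epoch sees identical samples), `PointCloudData` runs the full pipeline in `__getitem__` (fresh augmentation each access, but the expensive `PointSampler` re-runs every time), and `PointCloudDataBoth` caches the static part and applies only the cheap dynamic part per access. Conceptually, using the `static_trs`/`dynamic_trs` pipelines defined in the loader cells below:

```python
# Trade-off between the three dataset classes (conceptual sketch):
#   Pre  : sample = static(raw)           -> run once and cached; augmentation never changes
#   Dur  : sample = dynamic(static(raw))  -> run per access; fresh noise, but slow sampling
#   Both : cache static(raw) once, then apply only the cheap dynamic part per access
cached = static_trs((verts, faces))   # expensive: PointSampler + Normalize, once per mesh
epoch_sample = dynamic_trs(cached)    # cheap: RandomNoise + ToTensor, once per epoch
```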
[
[
"#### Overfitting - all augmentations applied before training",
"_____no_output_____"
]
],
[
[
"BATCH_SIZE = 48\n\ntrs = transforms.Compose([\n PointSampler(1024),\n ToSorted(),\n Normalize(),\n ToTensor()\n])\n\nbeds_train_dataset = PointCloudDataPre(path, folders=['bed'], transform=trs)\nbeds_valid_dataset = PointCloudDataPre(path, folder='test', folders=['bed'], transform=trs)\n\nbeds_train_loader = DataLoader(dataset=beds_train_dataset, shuffle=True, batch_size=BATCH_SIZE, drop_last=True)\nbeds_valid_loader = DataLoader(dataset=beds_valid_dataset, batch_size=BATCH_SIZE, drop_last=True)\n\n!mkdir dataloader_beds_pre\ntorch.save(beds_train_loader, 'dataloader_beds_pre/trainloader.pth')\ntorch.save(beds_valid_loader, 'dataloader_beds_pre/validloader.pth')\n\n!mkdir drive/MyDrive/Thesis/dataloaders/final\n!cp -r dataloader_beds_pre drive/MyDrive/Thesis/dataloaders/final",
"mkdir: cannot create directory ‘dataloader_beds_pre’: File exists\nmkdir: cannot create directory ‘drive/MyDrive/Thesis/dataloaders/final’: File exists\n"
]
],
[
[
"#### Underfitting - all augmentations applied during training",
"_____no_output_____"
]
],
[
[
"BATCH_SIZE = 48\n\ntrs = transforms.Compose([\n PointSampler(1024),\n ToSorted(),\n Normalize(),\n RandomNoise(),\n ToTensor()\n])\n\nbeds_train_dataset = PointCloudData(path, folders=['bed'], transform=trs)\nbeds_valid_dataset = PointCloudData(path, folder='test', folders=['bed'], transform=trs)\n\nbeds_train_loader = DataLoader(dataset=beds_train_dataset, num_workers=4, shuffle=True, batch_size=BATCH_SIZE, drop_last=True)\nbeds_valid_loader = DataLoader(dataset=beds_valid_dataset, num_workers=4, batch_size=BATCH_SIZE, drop_last=True)\n\n!mkdir dataloader_beds_dur\ntorch.save(beds_train_loader, 'dataloader_beds_dur/trainloader.pth')\ntorch.save(beds_valid_loader, 'dataloader_beds_dur/validloader.pth')\n\n!cp -r dataloader_beds_dur drive/MyDrive/Thesis/dataloaders/final",
"/usr/local/lib/python3.7/dist-packages/torch/utils/data/dataloader.py:481: UserWarning:\n\nThis DataLoader will create 4 worker processes in total. Our suggested max number of worker in current system is 2, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.\n\n"
]
],
[
[
"#### Both - static and dynamic transformations",
"_____no_output_____"
]
],
[
[
"BATCH_SIZE = 48\n\nstatic_trs = transforms.Compose([\n PointSampler(1024),\n ToSorted(),\n Normalize(),\n])\n\ndynamic_trs = transforms.Compose([\n RandomNoise(),\n ToTensor()\n])\n\nbeds_train_dataset = PointCloudDataBoth(path, folders=['bed'], static_transform=static_trs, later_transform=dynamic_trs)\nbeds_valid_dataset = PointCloudDataBoth(path, folder='test', folders=['bed'], static_transform=static_trs)\n\nbeds_train_loader = DataLoader(dataset=beds_train_dataset, shuffle=True, batch_size=BATCH_SIZE, drop_last=True)\nbeds_valid_loader = DataLoader(dataset=beds_valid_dataset, batch_size=BATCH_SIZE, drop_last=True)\n\n!mkdir dataloader_beds_both\ntorch.save(beds_train_loader, 'dataloader_beds_both/trainloader.pth')\ntorch.save(beds_valid_loader, 'dataloader_beds_both/validloader.pth')\n\n!cp -r dataloader_beds_both drive/MyDrive/Thesis/dataloaders/final",
"mkdir: cannot create directory ‘dataloader_beds_both’: File exists\n"
]
],
[
[
"#### Two classes: beds and tables",
"_____no_output_____"
]
],
[
[
"BATCH_SIZE = 48\n\nstatic_trs = transforms.Compose([\n PointSampler(1024),\n ToSorted(),\n Normalize(),\n])\n\ndynamic_trs = transforms.Compose([\n RandomNoise(),\n ToTensor()\n])\n\nbeds_train_dataset = PointCloudDataBoth(path, folders=['bed', 'table'], static_transform=static_trs, later_transform=dynamic_trs)\nbeds_valid_dataset = PointCloudDataBoth(path, folder='test', folders=['bed', 'table'], static_transform=trs)\n\nbeds_train_loader = DataLoader(dataset=beds_train_dataset, shuffle=True, batch_size=BATCH_SIZE, drop_last=True)\nbeds_valid_loader = DataLoader(dataset=beds_valid_dataset, batch_size=BATCH_SIZE, drop_last=True)\n\n!mkdir dataloader_beds_tables\ntorch.save(beds_train_loader, 'dataloader_beds_tables/trainloader.pth')\ntorch.save(beds_valid_loader, 'dataloader_beds_tables/validloader.pth')\n\n!cp -r dataloader_beds_tables drive/MyDrive/Thesis/dataloaders/final",
"_____no_output_____"
]
],
[
[
"### For 512",
"_____no_output_____"
]
],
[
[
"!mkdir drive/MyDrive/Thesis/dataloaders/final512",
"_____no_output_____"
]
],
[
[
"#### Overfitting - all augmentations applied before training",
"_____no_output_____"
]
],
[
[
"BATCH_SIZE = 48\n\ntrs = transforms.Compose([\n PointSampler(512),\n ToSorted(),\n Normalize(),\n ToTensor()\n])\n\nbeds_train_dataset = PointCloudDataPre(path, folders=['bed'], transform=trs)\nbeds_valid_dataset = PointCloudDataPre(path, folder='test', folders=['bed'], transform=trs)\n\nbeds_train_loader = DataLoader(dataset=beds_train_dataset, shuffle=True, batch_size=BATCH_SIZE, drop_last=True)\nbeds_valid_loader = DataLoader(dataset=beds_valid_dataset, batch_size=BATCH_SIZE, drop_last=True)\n\n!mkdir dataloader_beds_pre\ntorch.save(beds_train_loader, 'dataloader_beds_pre/trainloader.pth')\ntorch.save(beds_valid_loader, 'dataloader_beds_pre/validloader.pth')\n\n!mkdir drive/MyDrive/Thesis/dataloaders/final\n!cp -r dataloader_beds_pre drive/MyDrive/Thesis/dataloaders/final512",
"mkdir: cannot create directory ‘dataloader_beds_pre’: File exists\nmkdir: cannot create directory ‘drive/MyDrive/Thesis/dataloaders/final’: File exists\n"
]
],
[
[
"#### Underfitting - all augmentations applied during training",
"_____no_output_____"
]
],
[
[
"BATCH_SIZE = 48\n\ntrs = transforms.Compose([\n PointSampler(512),\n ToSorted(),\n Normalize(),\n ToTensor()\n])\n\nbeds_train_dataset = PointCloudData(path, folders=['bed'], transform=trs)\nbeds_valid_dataset = PointCloudData(path, folder='test', folders=['bed'], transform=trs)\n\nbeds_train_loader = DataLoader(dataset=beds_train_dataset, num_workers=4, shuffle=True, batch_size=BATCH_SIZE, drop_last=True)\nbeds_valid_loader = DataLoader(dataset=beds_valid_dataset, num_workers=4, batch_size=BATCH_SIZE, drop_last=True)\n\n!mkdir dataloader_beds_dur\ntorch.save(beds_train_loader, 'dataloader_beds_dur/trainloader.pth')\ntorch.save(beds_valid_loader, 'dataloader_beds_dur/validloader.pth')\n\n!cp -r dataloader_beds_dur drive/MyDrive/Thesis/dataloaders/final512",
"/usr/local/lib/python3.7/dist-packages/torch/utils/data/dataloader.py:481: UserWarning:\n\nThis DataLoader will create 4 worker processes in total. Our suggested max number of worker in current system is 2, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.\n\n"
]
],
[
[
"#### Both - static and dynamic transformations",
"_____no_output_____"
]
],
[
[
"BATCH_SIZE = 48\n\nstatic_trs = transforms.Compose([\n PointSampler(512),\n ToSorted(),\n Normalize(),\n])\n\ndynamic_trs = transforms.Compose([\n RandomNoise(),\n ToTensor()\n])\n\nbeds_train_dataset = PointCloudDataBoth(path, folders=['bed'], static_transform=static_trs, later_transform=dynamic_trs)\nbeds_valid_dataset = PointCloudDataBoth(path, folder='test', folders=['bed'], static_transform=static_trs)\n\nbeds_train_loader = DataLoader(dataset=beds_train_dataset, shuffle=True, batch_size=BATCH_SIZE, drop_last=True)\nbeds_valid_loader = DataLoader(dataset=beds_valid_dataset, batch_size=BATCH_SIZE, drop_last=True)\n\n!mkdir dataloader_beds_both\ntorch.save(beds_train_loader, 'dataloader_beds_both/trainloader.pth')\ntorch.save(beds_valid_loader, 'dataloader_beds_both/validloader.pth')\n\n!cp -r dataloader_beds_both drive/MyDrive/Thesis/dataloaders/final512",
"mkdir: cannot create directory ‘dataloader_beds_both’: File exists\n"
]
],
[
[
"#### Two classes: beds and tables",
"_____no_output_____"
]
],
[
[
"BATCH_SIZE = 48\n\nstatic_trs = transforms.Compose([\n PointSampler(512),\n ToSorted(),\n Normalize(),\n])\n\ndynamic_trs = transforms.Compose([\n RandomNoise(),\n ToTensor()\n])\n\nbeds_train_dataset = PointCloudDataBoth(path, folders=['bed', 'table'], static_transform=static_trs, later_transform=dynamic_trs)\nbeds_valid_dataset = PointCloudDataBoth(path, folder='test', folders=['bed', 'table'], static_transform=trs)\n\nbeds_train_loader = DataLoader(dataset=beds_train_dataset, shuffle=True, batch_size=BATCH_SIZE, drop_last=True)\nbeds_valid_loader = DataLoader(dataset=beds_valid_dataset, batch_size=BATCH_SIZE, drop_last=True)\n\n!mkdir dataloader_beds_tables\ntorch.save(beds_train_loader, 'dataloader_beds_tables/trainloader.pth')\ntorch.save(beds_valid_loader, 'dataloader_beds_tables/validloader.pth')\n\n!cp -r dataloader_beds_tables drive/MyDrive/Thesis/dataloaders/final",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79f7bcf9d87aa776ab28014d5db6053a0f9932d | 7,578 | ipynb | Jupyter Notebook | pyling/detectAuthors.ipynb | dirkroorda/explore | fa568d9999e33268714ee53a8f4fc7e26b51b3fc | [
"MIT"
] | null | null | null | pyling/detectAuthors.ipynb | dirkroorda/explore | fa568d9999e33268714ee53a8f4fc7e26b51b3fc | [
"MIT"
] | null | null | null | pyling/detectAuthors.ipynb | dirkroorda/explore | fa568d9999e33268714ee53a8f4fc7e26b51b3fc | [
"MIT"
] | null | null | null | 24.684039 | 136 | 0.485088 | [
[
[
"# Fill in full author references for mentions of authors\n\nFor example, if we find `Calderon`, we want to produce a string\n\n```\n<author><name key=\"cald\">Pedro Calderón de la Barca</name></author>\n```",
"_____no_output_____"
]
],
[
[
"testTexts = [\n \"Calderón de la Barca, Pedro\",\n \"CCCCCalderón\",\n \"Caldeeeeeerón\",\n \"Pedro Barca\",\n \"Pedro Barca\",\n \"Agustin Moreto\",\n \"A. Moreto\",\n \"Agustin\",\n \"Augustine\",\n]",
"_____no_output_____"
]
],
[
[
"## Triggers\n\nWe are going to find trigger strings for authors in the input texts.\n\nIn order to do that successfully, we normalize the text first:\n\n* we remove all accents from accented letters\n* we make everything lowercase",
"_____no_output_____"
],
[
"We need a function that can strip accents from characters.\n\nFrom [stackoverflow](https://stackoverflow.com/questions/517923/what-is-the-best-way-to-remove-accents-in-a-python-unicode-string)",
"_____no_output_____"
]
],
[
[
"import re\nimport unicodedata\n\ndef normalize(text):\n text = unicodedata.normalize('NFD', text)\n text = text.encode('ascii', 'ignore')\n text = text.decode(\"utf-8\")\n return text.lower().strip()",
"_____no_output_____"
],
[
"normalize(\"Calderón de la Barca, Pedro\")",
"_____no_output_____"
]
],
[
[
"## Authors\n\nWe compile a list of authors that we want to detect.\n\nFor each author we have a full name, a key, and a list of triggers.\n\nWe format the specficiation as a *yaml* file (which maps to a Python dictionary).",
"_____no_output_____"
]
],
[
[
"authorSpec = '''\ncald:\n full: Pedro Calderón de la Barca\n triggers:\n - calderon\n - barca\nmore:\n full: Agustín Moreto\n triggers:\n - moreto\n - agustin\n - augustine\n'''",
"_____no_output_____"
]
],
[
[
"In order to parse this file, you need to install pyyaml first\n\n``` sh\npip install yaml\n```\n\nor\n\n```\npip3 install yaml\n```",
"_____no_output_____"
]
],
[
[
"import yaml",
"_____no_output_____"
],
[
"authors = yaml.load(authorSpec, Loader=yaml.FullLoader)",
"_____no_output_____"
],
[
"authors",
"_____no_output_____"
]
],
[
[
"We need to compile the authors specification in such a way that we can use the triggers",
"_____no_output_____"
]
],
[
[
"triggers = {}\nfor (key, authorInfo) in authors.items():\n for trigger in authorInfo['triggers']:\n triggers[trigger] = key",
"_____no_output_____"
],
[
"triggers",
"_____no_output_____"
],
[
"def fillInAuthorDetails(text):\n normalized = normalize(text)\n output = None\n for trigger in triggers:\n if trigger in normalized:\n authorKey = triggers[trigger]\n authorFull = authors[authorKey][\"full\"]\n output = f\"\"\"<author><name key=\"{authorKey}\">{authorFull}</name></author>\"\"\"\n break\n if output is None:\n print(f\"!!! {normalized:<36} => NO AUTHOR DETECTED\")\n return output",
"_____no_output_____"
],
[
"for text in (testTexts):\n result = fillInAuthorDetails(text)\n if result is not None:\n print(f\"{text:<40} => {result}\")",
"Calderón de la Barca, Pedro => <author><name key=\"cald\">Pedro Calderón de la Barca</name></author>\nCCCCCalderón => <author><name key=\"cald\">Pedro Calderón de la Barca</name></author>\n!!! caldeeeeeeron => NO AUTHOR DETECTED\nPedro Barca => <author><name key=\"cald\">Pedro Calderón de la Barca</name></author>\nPedro Barca => <author><name key=\"cald\">Pedro Calderón de la Barca</name></author>\nAgustin Moreto => <author><name key=\"more\">Agustín Moreto</name></author>\nA. Moreto => <author><name key=\"more\">Agustín Moreto</name></author>\nAgustin => <author><name key=\"more\">Agustín Moreto</name></author>\nAugustine => <author><name key=\"more\">Agustín Moreto</name></author>\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
e79f7c86e937f0cdac99d706e12cfa0853d48eb1 | 14,489 | ipynb | Jupyter Notebook | q1_cell_based_qubicc_r2b5/source_code/commence_training_cross_validation-fold_2.ipynb | agrundner24/iconml_clc | a9f3547fae15593288066fe1d30631a99e4ccbeb | [
"MIT"
] | null | null | null | q1_cell_based_qubicc_r2b5/source_code/commence_training_cross_validation-fold_2.ipynb | agrundner24/iconml_clc | a9f3547fae15593288066fe1d30631a99e4ccbeb | [
"MIT"
] | null | null | null | q1_cell_based_qubicc_r2b5/source_code/commence_training_cross_validation-fold_2.ipynb | agrundner24/iconml_clc | a9f3547fae15593288066fe1d30631a99e4ccbeb | [
"MIT"
] | null | null | null | 34.497619 | 153 | 0.589068 | [
[
[
"## Cross-Validation\n\n1. We read the data from the npy files\n2. We combine the QUBICC and NARVAL data\n4. Set up cross validation\n\nDuring cross-validation:\n\n1. We scale the data, convert to tf data\n2. Plot training progress, model biases \n3. Write losses and epochs into file",
"_____no_output_____"
]
],
[
[
"# Ran with 800GB (750GB should also be fine)\n\nimport sys\nimport numpy as np\nimport time\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport os\nimport copy\nimport gc\n\n#Import sklearn before tensorflow (static Thread-local storage)\nfrom sklearn.preprocessing import StandardScaler\n\nimport tensorflow as tf\nfrom tensorflow.keras.models import load_model\nfrom tensorflow.keras.models import Sequential\nfrom tensorflow.keras.layers import Dense, Dropout, BatchNormalization\nfrom tensorflow.keras.regularizers import l1_l2\n\nfrom tensorflow.keras import backend as K\nfrom tensorflow.keras.layers import Activation\n\n# For Leaky_ReLU:\nfrom tensorflow import nn \n\nt0 = time.time()\npath = '/pf/b/b309170'\n\n# Add path with my_classes to sys.path\nsys.path.insert(0, path + '/workspace_icon-ml/cloud_cover_parameterization/')\n\n# Reloading custom file to incorporate changes dynamically\nimport importlib\nimport my_classes\nimportlib.reload(my_classes)\n\nfrom my_classes import read_mean_and_std\nfrom my_classes import TimeOut\n\n# Minutes per fold\ntimeout = 2120 \n\n# For logging purposes\ndays = 'all_days'\n\n# Maximum amount of epochs for each model\nepochs = 30 \n\n# Set seed for reproducibility\nseed = 10\ntf.random.set_seed(seed)\n\n# For store_mean_model_biases\nVERT_LAYERS = 31\n\ngpus = tf.config.experimental.list_physical_devices('GPU')\n# tf.config.experimental.set_visible_devices(gpus[3], 'GPU')",
"_____no_output_____"
],
[
"# Cloud Cover or Cloud Area?\noutput_var = 'cl_area' # Set output_var to one of {'clc', 'cl_area'}\n# QUBICC only or QUBICC+NARVAL training data? Always True for the paper\nqubicc_only = True\n\npath_base = os.path.join(path, 'workspace_icon-ml/cloud_cover_parameterization/grid_cell_based_QUBICC_R02B05')\npath_data = os.path.join(path, 'my_work/icon-ml_data/cloud_cover_parameterization/grid_cell_based_QUBICC_R02B05/based_on_var_interpolated_data')\n\nif output_var == 'clc':\n full_output_var_name = 'cloud_cover'\nelif output_var == 'cl_area':\n full_output_var_name = 'cloud_area'\n \nif qubicc_only:\n output_folder = '%s_R2B5_QUBICC'%full_output_var_name\nelse:\n output_folder = '%s_R2B5_QUBICC+NARVAL'%full_output_var_name\npath_model = os.path.join(path_base, 'saved_models', output_folder)\npath_figures = os.path.join(path_base, 'figures', output_folder)\nnarval_output_file = '%s_output_narval.npy'%full_output_var_name\nqubicc_output_file = '%s_output_qubicc.npy'%full_output_var_name",
"_____no_output_____"
],
[
"# Prevents crashes of the code\nphysical_devices = tf.config.list_physical_devices('GPU')\ntf.config.set_visible_devices(physical_devices[0], 'GPU')",
"_____no_output_____"
],
[
"# Allow the growth of memory Tensorflow allocates (limits memory usage overall)\nfor gpu in gpus:\n tf.config.experimental.set_memory_growth(gpu, True)",
"_____no_output_____"
],
[
"scaler = StandardScaler()",
"_____no_output_____"
]
],
[
[
"### Load the data",
"_____no_output_____"
]
],
[
[
"# input_narval = np.load(path_data + '/cloud_cover_input_narval.npy')\n# input_qubicc = np.load(path_data + '/cloud_cover_input_qubicc.npy')\n# output_narval = np.load(path_data + '/cloud_cover_output_narval.npy')\n# output_qubicc = np.load(path_data + '/cloud_cover_output_qubicc.npy')",
"_____no_output_____"
],
[
"input_data = np.concatenate((np.load(path_data + '/cloud_cover_input_narval.npy'), \n np.load(path_data + '/cloud_cover_input_qubicc.npy')), axis=0)\noutput_data = np.concatenate((np.load(os.path.join(path_data, narval_output_file)), \n np.load(os.path.join(path_data, qubicc_output_file))), axis=0)",
"_____no_output_____"
],
[
"samples_narval = np.load(path_data + '/cloud_cover_output_narval.npy').shape[0]",
"_____no_output_____"
],
[
"if qubicc_only:\n input_data = input_data[samples_narval:]\n output_data = output_data[samples_narval:]",
"_____no_output_____"
],
[
"(samples_total, no_of_features) = input_data.shape\n(samples_total, no_of_features)",
"_____no_output_____"
]
],
[
[
"*Temporal cross-validation*\n\nSplit into 2-weeks increments (when working with 3 months of data). It's 25 day increments with 5 months of data. <br>\n1.: Validate on increments 1 and 4 <br>\n2.: Validate on increments 2 and 5 <br>\n3.: Validate on increments 3 and 6\n\n--> 2/3 training data, 1/3 validation data",
"_____no_output_____"
]
],
[
[
"training_folds = []\nvalidation_folds = []\ntwo_week_incr = samples_total//6\n\nfor i in range(3):\n # Note that this is a temporal split since time was the first dimension in the original tensor\n first_incr = np.arange(samples_total//6*i, samples_total//6*(i+1))\n second_incr = np.arange(samples_total//6*(i+3), samples_total//6*(i+4))\n\n validation_folds.append(np.append(first_incr, second_incr))\n training_folds.append(np.arange(samples_total))\n training_folds[i] = np.delete(training_folds[i], validation_folds[i])",
"_____no_output_____"
]
],
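A quick way to confirm that this yields a clean 2/3–1/3 partition with disjoint folds is to run the same logic on a toy size (a sketch, not in the original notebook):

```python
import numpy as np

samples_total = 12  # toy size; divisible by 6 like the real splits assume
for i in range(3):
    first = np.arange(samples_total//6*i, samples_total//6*(i+1))
    second = np.arange(samples_total//6*(i+3), samples_total//6*(i+4))
    valid = np.append(first, second)
    train = np.delete(np.arange(samples_total), valid)
    assert len(valid) == samples_total // 3            # one third validation
    assert len(np.intersect1d(train, valid)) == 0      # folds are disjoint
```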
[
[
"### Define the model",
"_____no_output_____"
],
[
"Activation function for the last layer",
"_____no_output_____"
]
],
[
[
"def lrelu(x):\n return nn.leaky_relu(x, alpha=0.01)",
"_____no_output_____"
],
[
"# Create the model\nmodel = Sequential()\n\n# First hidden layer\nmodel.add(Dense(units=64, activation='tanh', input_dim=no_of_features, \n kernel_regularizer=l1_l2(l1=0.004749, l2=0.008732)))\n\n# Second hidden layer\nmodel.add(Dense(units=64, activation=nn.leaky_relu, kernel_regularizer=l1_l2(l1=0.004749, l2=0.008732)))\n# model.add(Dropout(0.221)) # We drop 18% of the hidden nodes\nmodel.add(BatchNormalization())\n\n# Third hidden layer\nmodel.add(Dense(units=64, activation='tanh', kernel_regularizer=l1_l2(l1=0.004749, l2=0.008732)))\n# model.add(Dropout(0.221)) # We drop 18% of the hidden nodes\n\n# Output layer\nmodel.add(Dense(1, activation='linear', kernel_regularizer=l1_l2(l1=0.004749, l2=0.008732)))",
"_____no_output_____"
]
],
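Note that `lrelu` fixes the slope at `alpha=0.01`, while the model above passes `nn.leaky_relu` directly, which uses TensorFlow's default `alpha=0.2`. If the 0.01 slope is the intended one, the custom function itself can be passed as the activation:

```python
# Using the custom 0.01 slope instead of tf.nn.leaky_relu's default alpha=0.2
model.add(Dense(units=64, activation=lrelu,
                kernel_regularizer=l1_l2(l1=0.004749, l2=0.008732)))
```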
[
[
"### 3-fold cross-validation",
"_____no_output_____"
]
],
[
[
"# By decreasing timeout we make sure every fold gets the same amount of time\n# After all, data-loading took some time (Have 3 folds, 60 seconds/minute)\n# timeout = timeout - 1/3*1/60*(time.time() - t0)\ntimeout = timeout - 1/60*(time.time() - t0)\nt0 = time.time()\n\n#We loop through the folds\nfor i in range(3): \n \n filename = 'cross_validation_cell_based_fold_%d'%(i+1)\n \n #Standardize according to the fold\n scaler.fit(input_data[training_folds[i]])\n\n #Load the data for the respective fold and convert it to tf data\n input_train = scaler.transform(input_data[training_folds[i]])\n input_valid = scaler.transform(input_data[validation_folds[i]]) \n output_train = output_data[training_folds[i]]\n output_valid = output_data[validation_folds[i]]\n \n # Clear memory (Reduces memory requirement to 151 GB)\n del input_data, output_data, first_incr, second_incr, validation_folds, training_folds\n gc.collect()\n \n # Column-based: batchsize of 128\n # Cell-based: batchsize of at least 512\n # Shuffle is actually very important because we start off with the uppermost layers with clc=0 basically throughout\n # This can push us into a local minimum, preferrably yielding clc=0.\n # The size of the shuffle buffer significantly impacts RAM requirements! Do not increase to above 10000.\n # Possibly better to use .apply(tf.data.experimental.copy_to_device(\"/gpu:0\")) before prefetch\n # We might want to cache before shuffling, however it seems to slow down training\n # We do not repeat after shuffle, because the validation set should be evaluated after each epoch\n train_ds = tf.data.Dataset.zip((tf.data.Dataset.from_tensor_slices(input_train), \n tf.data.Dataset.from_tensor_slices(output_train))) \\\n .shuffle(10**5, seed=seed) \\\n .batch(batch_size=1028, drop_remainder=True) \\\n .prefetch(1)\n \n # Clear memory\n del input_train, output_train\n gc.collect()\n \n # No need to add prefetch.\n # tf data with batch_size=10**5 makes the validation evaluation 10 times faster\n valid_ds = tf.data.Dataset.zip((tf.data.Dataset.from_tensor_slices(input_valid), \n tf.data.Dataset.from_tensor_slices(output_valid))) \\\n .batch(batch_size=10**5, drop_remainder=True)\n \n # Clear memory (Reduces memory requirement to 151 GB)\n del input_valid, output_valid\n gc.collect()\n \n #Feed the model. 
Increase the learning rate by a factor of 2 when increasing the batch size by a factor of 4\n model.compile(\n optimizer=tf.keras.optimizers.Adam(learning_rate=0.000433, epsilon=0.1),\n loss=tf.keras.losses.MeanSquaredError()\n )\n \n #Train the model\n# time_callback = TimeOut(t0, timeout*(i+1))\n time_callback = TimeOut(t0, timeout)\n history = model.fit(train_ds, validation_data=valid_ds, epochs=epochs, verbose=2, \n callbacks=[time_callback])\n# history = model.fit(train_ds, epochs=epochs, validation_data=valid_ds, callbacks=[time_callback])\n\n #Save the model \n #Serialize model to YAML\n model_yaml = model.to_yaml()\n with open(os.path.join(path_model, filename+\".yaml\"), \"w\") as yaml_file:\n yaml_file.write(model_yaml)\n #Serialize model and weights to a single HDF5-file\n model.save(os.path.join(path_model, filename+'.h5'), \"w\")\n print('Saved model to disk')\n \n #Plot the training history\n if len(history.history['loss']) > len(history.history['val_loss']):\n del history.history['loss'][-1]\n pd.DataFrame(history.history).plot(figsize=(8,5))\n plt.grid(True)\n plt.ylabel('Mean Squared Error')\n plt.xlabel('Number of epochs')\n plt.savefig(os.path.join(path_figures, filename+'.pdf'))\n \n with open(os.path.join(path_model, filename+'.txt'), 'a') as file:\n file.write('Results from the %d-th fold\\n'%(i+1))\n file.write('Training epochs: %d\\n'%(len(history.history['val_loss'])))\n file.write('Weights restored from epoch: %d\\n\\n'%(1+np.argmin(history.history['val_loss'])))",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79f83f0ef76c79caf74ff29f21459a62fa1a9ba | 509,377 | ipynb | Jupyter Notebook | Buy & Hold/Buy&Hold.ipynb | scastellanog/Walk-forward-optimization | 3fcd46e06066bd5db640964d3e0e0070b0f66186 | [
"MIT"
] | null | null | null | Buy & Hold/Buy&Hold.ipynb | scastellanog/Walk-forward-optimization | 3fcd46e06066bd5db640964d3e0e0070b0f66186 | [
"MIT"
] | null | null | null | Buy & Hold/Buy&Hold.ipynb | scastellanog/Walk-forward-optimization | 3fcd46e06066bd5db640964d3e0e0070b0f66186 | [
"MIT"
] | 1 | 2020-07-20T12:56:04.000Z | 2020-07-20T12:56:04.000Z | 388.244665 | 152,504 | 0.924704 | [
[
[
"# Performance metrics of Buy & Hold Strategy",
"_____no_output_____"
],
[
"The purpose of this notebook is to calculate performance metrics over the benchmark and compare it with results obtained in other papers. I will compare my results with two papers: \n- Hybrid Investment Strategy Based on Momentum and Macroeconomic Approach - Kamil Korzeń, Robert Ślepaczuk \n- Predicting prices of S&P500 index using classical methods and recurrent neural networks - Mateusz Kijewski, Robert Ślepaczuk",
"_____no_output_____"
]
],
[
[
"# Settings for notebook visualization\nfrom IPython.core.interactiveshell import InteractiveShell\nInteractiveShell.ast_node_interactivity = 'all'\n%matplotlib inline\nfrom IPython.core.display import HTML\nHTML(\"\"\"<style>.output_png img {display: block;margin-left: auto;margin-right: auto;text-align: center;vertical-align: middle;} </style>\"\"\")",
"_____no_output_____"
],
[
"# Necessary imports\nimport os\nimport numpy as np\nimport pandas as pd\nimport matplotlib as plt\nimport quantstats as qs\nprint(\"Libraries imported correctly\")",
"Libraries imported correctly\n"
],
[
"os.chdir(\"/Users/Sergio/Documents/Master_QF/Thesis/Code/Algorithmic Strategies\")\n%run Functions.ipynb",
"_____no_output_____"
]
],
[
[
"# Load data",
"_____no_output_____"
]
],
[
[
"%run Functions.ipynb\ndf = get_sp500_data(from_local_file=True, save_to_file=False)\ndf['Market_daily_ret'] = df['Close'].pct_change()\n\ndf = df.loc['1990':'2020', ['Close', 'Market_daily_ret']]\n\ndf.head()\ndf['Close'].plot(title='SP500')",
"_____no_output_____"
]
],
[
[
"# Paper from Kamil: Hybrid Investment Strategy Based on Momentum and Macroeconomic Approach",
"_____no_output_____"
],
[
"Data from 1991-01-03 to 2018-01-03 \nUses daily returns to calculate the metrics",
"_____no_output_____"
]
],
[
[
"from IPython.display import Image\nImage(filename='/Users/Sergio/Documents/Master_QF/Thesis/Papers/Performance metrics/K-Formulas.png')\n# Data from 1991-01-03:2018-01-03",
"_____no_output_____"
]
],
[
[
"We do the backtest of buy_and_hold strategy and compare metrics with the ones from the paper:",
"_____no_output_____"
]
],
[
[
"%run Functions.ipynb\ndf_1 = df.loc['1991-01-03':'2018-01-03', ['Close', 'Market_daily_ret']].copy()\ndf_1 = backtest_strat(df_1, buy_and_hold(df_1), commision=0)[0]\n\ndf_1.head(4)\n#df_1.tail(2)\n#df_1['Close'].plot(title='SP500', legend=True)",
"_____no_output_____"
]
],
[
[
"In this paper, the return from the first day of 1991 (January 2nd) seems to be not included. \nMetrics from paper:",
"_____no_output_____"
]
],
[
[
"from IPython.display import Image\nImage(filename='/Users/Sergio/Documents/Master_QF/Thesis/Papers/Performance metrics/K-Table.png')",
"_____no_output_____"
],
[
"metrics = ['AbsRet', 'ARC', 'IR', 'aSD', 'MD']\n\npaper_data = [[742.801, 8.222, 0.466, 17.652, 56.775]]\ndf_metrics = pd.DataFrame(data=paper_data, index=['Paper metrics'], columns=metrics)\n\nmetrics_row = calculate_performance_metrics(df_1, strat_name='Buy and Hold')\ndf_metrics = pd.concat([df_metrics, metrics_row], axis=0).drop_duplicates().round(3)\n\ndf_metrics",
"_____no_output_____"
]
],
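For reference, the compared metrics follow standard definitions on the daily (simple) return series with 252 trading days per year. A dependency-light sketch of what `calculate_performance_metrics` in `Functions.ipynb` presumably computes:

```python
import numpy as np

def performance_metrics_sketch(daily_ret):
    daily_ret = np.asarray(daily_ret)
    n = len(daily_ret)
    equity = np.cumprod(1 + daily_ret)                     # cumulative return curve
    abs_ret = (equity[-1] - 1) * 100                       # AbsRet (%)
    arc = (equity[-1] ** (252 / n) - 1) * 100              # annualized return compounded (%)
    asd = np.std(daily_ret) * np.sqrt(252) * 100           # annualized standard deviation (%)
    ir = arc / asd                                         # information ratio
    md = np.max(1 - equity / np.maximum.accumulate(equity)) * 100  # maximum drawdown (%)
    return abs_ret, arc, ir, asd, md
```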
[
[
"# Paper: \"Predicting prices of S&P500 index using classical methods and recurrent neural networks\"",
"_____no_output_____"
],
[
"Data from 2000-01-01 to 2020-05-02 \nUses log returns to calculate the metrics",
"_____no_output_____"
]
],
[
[
"from IPython.display import Image\nImage(filename='/Users/Sergio/Documents/Master_QF/Thesis/Papers/Performance metrics/M-Formulas.png')\n# Data from 2000 : 2020-05-02",
"_____no_output_____"
]
],
[
[
"We do the backtest of buy_and_hold strategy and compare metrics with the ones from the paper:",
"_____no_output_____"
]
],
[
[
"df_2 = df.loc['2000-01-01':'2020-05-02', ['Close', 'Market_daily_ret']].copy()\n\ndf_2 = backtest_strat(df_2, buy_and_hold(df_2), commision=0)[0]\n\ndf_2.head(4)\n#df_2.tail(2)\n#df_2['Close'].plot(title='SP500', legend=True)",
"_____no_output_____"
]
],
[
[
"Metrics from paper:",
"_____no_output_____"
]
],
[
[
"from IPython.display import Image\nImage(filename='/Users/Sergio/Documents/Master_QF/Thesis/Papers/Performance metrics/M-Table.png')\n# Data from 2000 : 2020-05-02",
"_____no_output_____"
],
[
"metrics = ['ARC', 'IR', 'aSD', 'MD', 'AMD', 'MLD', 'All Risk', 'ARCMD', 'ARCAMD', 'Num Trades', 'No signal']\npaper_data = [[3.23, 0.16, 19.95, 64.33, 17.34, 7.155, 15.92, 0.05, 0.18, 1, 0]]\ndf_metrics = pd.DataFrame(data=paper_data, index=['Paper metrics'], columns=metrics)\n\nmetrics_row = calculate_performance_metrics(df_2, strat_name='Buy and Hold')\ndf_metrics = pd.concat([df_metrics, metrics_row], axis=0).drop_duplicates().round(3)\n\ndf_metrics",
"_____no_output_____"
]
],
[
[
"## Demonstration of MD, MLD and AMD using quanstats library",
"_____no_output_____"
],
[
"Using data from paper 1 (1991-01-03 to 2018-01-03) \nFollowing code is to check drawdowns. \n- Paper 2 gave a MD of 64.33%, which seems to be wrong",
"_____no_output_____"
]
],
[
[
"dd = qs.stats.drawdown_details(qs.stats.to_drawdown_series(df_1['Market_cum_ret'])).sort_values(by='max drawdown', ascending=True)\ndd.head()",
"_____no_output_____"
]
],
[
[
"Maximum Loss Duration (in years):",
"_____no_output_____"
]
],
[
[
"dd = qs.stats.drawdown_details(qs.stats.to_drawdown_series(df_1['Market_cum_ret'])).sort_values(by='days', ascending=False)\ndd.insert(4, 'years', dd['days']/365.25)\ndd.head(5)",
"_____no_output_____"
]
],
[
[
"For MLD in years, I believe I should divide the number of days of MLD by 365.25, but result is more similar to the one from paper if I divide the number of days by 366. Kamil, how was it calculated on the paper?",
"_____no_output_____"
]
],
[
[
"from datetime import datetime\nmax_loss_dur = datetime(2007, 5, 30) - datetime(2000, 3, 27)\nprint(max_loss_dur.days)\nprint(\"{:.4f}\".format(max_loss_dur.days / 365))\nprint(\"{:.4f}\".format(max_loss_dur.days / 365.25))\nprint(\"{:.4f}\".format(max_loss_dur.days / 366))",
"2620\n7.1781\n7.1732\n7.1585\n"
]
],
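MLD itself — the longest stretch the equity curve spends below a previous running peak — can also be computed directly from the cumulative-return series. A sketch assuming a pandas Series indexed by date (the 365.25 divisor is one of the conventions tested above):

```python
import pandas as pd

def max_loss_duration_years(equity):
    # equity: pd.Series of cumulative returns indexed by timestamps
    peak_ts, running_max = equity.index[0], equity.iloc[0]
    longest = pd.Timedelta(0)
    for ts, v in equity.items():
        if v >= running_max:
            running_max, peak_ts = v, ts          # new high: reset the peak
        else:
            longest = max(longest, ts - peak_ts)  # still under water
    return longest.days / 365.25
```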
[
[
"To calculate AMD, I group returns by year and do the mean of the MD of each year:",
"_____no_output_____"
]
],
[
[
"print(\"AMD = {:.3f} %\".format(abs(df_2['Market_daily_ret'].groupby(by=df_2.index.year).apply(qs.stats.max_drawdown).mean()*100)))\ndf_2['Market_daily_ret'].groupby(by=df_2.index.year).apply(qs.stats.max_drawdown).mul(100).to_frame(name='MD (%)').abs().round(3).T",
"AMD = 16.520 %\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79f8bd04d984f05e68d18c46a2aa9882a1a7b57 | 36,815 | ipynb | Jupyter Notebook | notebooks/Model_Verification/Model_Verification_Regression.ipynb | hwiberg/OptiCL | f2512cfd408882700fe43c22d4e4b5c5f5c87b53 | [
"MIT"
] | 46 | 2021-11-04T20:17:32.000Z | 2022-03-25T17:12:02.000Z | notebooks/Model_Verification/Model_Verification_Regression.ipynb | hwiberg/OptiCL | f2512cfd408882700fe43c22d4e4b5c5f5c87b53 | [
"MIT"
] | null | null | null | notebooks/Model_Verification/Model_Verification_Regression.ipynb | hwiberg/OptiCL | f2512cfd408882700fe43c22d4e4b5c5f5c87b53 | [
"MIT"
] | 3 | 2021-11-12T02:57:54.000Z | 2022-01-14T15:45:55.000Z | 33.437784 | 592 | 0.384762 | [
[
[
"# Model Prediction Verification",
"_____no_output_____"
],
[
"This script demonstrates how to train a single model class, embed the model, and solve the optimization problem for *regression* problems (i.e., continuous outcome prediction). We fix a sample from our generated data and solve the optimization problem with all elements of $\\mathbf{x}$ equal to our data. In general, we might have some elements of $\\mathbf{x}$ that are fixed, called our \"conceptual variables,\" and the remaining indices are our decision variables. By fixing all elements of $\\mathbf{x}$, we can verify that the model prediction matches the original sklearn model.",
"_____no_output_____"
],
[
"## Load the relevant packages",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport numpy as np\nimport math\nfrom sklearn.utils.extmath import cartesian\nimport time\nimport sys\nimport os\nimport time\n\nfrom sklearn.metrics import roc_auc_score, r2_score, mean_squared_error\nfrom sklearn.cluster import KMeans",
"_____no_output_____"
],
[
"import opticl\nfrom pyomo import environ\nfrom pyomo.environ import *",
"_____no_output_____"
]
],
[
[
"## Initialize data\nWe will work with a basic dataset from `sklearn`.",
"_____no_output_____"
]
],
[
[
"from sklearn.datasets import make_regression\nfrom sklearn.model_selection import train_test_split\nX, y = make_regression(n_samples=200, n_features = 20, random_state=1)\nX_train, X_test, y_train, y_test = train_test_split(X, y,\n random_state=1)\nX_train = pd.DataFrame(X_train).add_prefix('col')\nX_test = pd.DataFrame(X_test).add_prefix('col')\nX_train",
"_____no_output_____"
]
],
[
[
"## Train the chosen model type",
"_____no_output_____"
]
],
[
[
"# alg = 'rf' \nalg = 'gbm'\ntask_type = 'continuous'",
"_____no_output_____"
]
],
[
[
"The user can optionally select a manual parameter grid for the cross-validation procedure. We implement a default parameter grid; see **run_MLmodels.py** for details on the tuned parameters. If you wish to use the default, leave ```parameter_grid = None``` (or do not specify any grid).",
"_____no_output_____"
]
],
[
[
"parameter_grid = None\n# parameter_grid = {'hidden_layer_sizes': [(5,),(10,)]}",
"_____no_output_____"
],
[
"s = 1\nversion = 'test'\noutcome = 'temp'\n\nmodel_save = 'results/%s/%s_%s_model.csv' % (alg, version, outcome)\n\nalg_run = alg if alg != 'rf' else 'rf_shallow'\nm, perf = opticl.run_model(X_train, y_train, X_test, y_test, alg_run, outcome, task = task_type, \n seed = s, cv_folds = 5, \n # The user can manually specify the parameter grid for cross-validation if desired\n parameter_grid = parameter_grid,\n save_path = model_save,\n save = False)",
"------------- Initialize grid ----------------\n------------- Running model ----------------\nAlgorithm = gbm, metric = None\nsaving... results/gbm_temp_trained.pkl\n------------- Model evaluation ----------------\n-------------------training evaluation-----------------------\nTrain MSE: 4314.00082576947\nTrain R2: 0.8939305706172604\n-------------------testing evaluation-----------------------\nTest MSE: 17814.940522763252\nTest R2: 0.6170544988313675\n"
]
],
[
[
"After training the model, we will save the trained model in the format needed for embedding the constraints. See **constraint_learning.py** for the specific format that is extracted per method. We also save the performance of the model to use in the automated model selection pipeline (if desired).\n\nWe also create the save directory if it does not exist.\n\n",
"_____no_output_____"
]
],
[
[
"if not os.path.exists('results/%s/' % alg):\n os.makedirs('results/%s/' % alg)\n \nconstraintL = opticl.ConstraintLearning(X_train, y_train, m, alg)\nconstraint_add = constraintL.constraint_extrapolation(task_type)\nconstraint_add.to_csv(model_save, index = False)\n\nperf.to_csv('results/%s/%s_%s_performance.csv' % (alg, version, outcome), index= False)",
"_____no_output_____"
],
[
"constraint_add",
"_____no_output_____"
]
],
[
[
"### Check: what should the result be for our sample observation, if all x are fixed?",
"_____no_output_____"
],
[
"#### Choose sample to test\nThis will be the observation (\"patient\") that we feed into the optimization model.",
"_____no_output_____"
]
],
[
[
"sample_id = 1\nsample = X_train.loc[sample_id:sample_id,:].reset_index(drop = True)",
"_____no_output_____"
]
],
[
[
"Calculate model prediction directly in sklearn.",
"_____no_output_____"
]
],
[
[
"m.predict(sample)",
"_____no_output_____"
]
],
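Because the GBM chosen above predicts via an additive sum of tree outputs, the same number can be reproduced stage by stage with sklearn's standard `staged_predict` API — the additive structure is exactly what the constraint extraction embeds:

```python
# The final boosting stage must agree with the full ensemble prediction.
staged = list(m.staged_predict(sample))   # one prediction per boosting stage
assert abs(staged[-1][0] - m.predict(sample)[0]) < 1e-9
print(staged[-1][0])
```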
[
[
"## Optimization formulation\nWe will embed the model trained above. The model could also be selected using the model selection pipeline, which we demonstrate in the WFP example script.\n\nIf manually specifying the model, as we are here, the key elements of the ``model_master`` dataframe are:\n- model_type: algorithm name.\n- outcome: name of outcome of interest; this is relevant in the case of multiple learned outcomes.\n- save_path: file name of the extracted model.\n- objective: the weight of the objective if it should be included as an additive term in the objective. A weight of 0 omits it from the objective entirely.\n- lb/ub: the lower (or upper) bound that we wish to apply to the learned outcome. If there is no bound, it should be set to ``None``.\n\nIn this case, we set the outcome to be our only objective term, which will allow us to verify that the predictions are consistent between the embedded model and the sklearn prediction function.",
"_____no_output_____"
]
],
[
[
"model_master = pd.DataFrame(columns = ['model_type','outcome','save_path','lb','ub','objective'])\n\nmodel_master.loc[0,'model_type'] = alg\nmodel_master.loc[0,'save_path'] = 'results/%s/%s_%s_model.csv' % (alg, version, outcome)\nmodel_master.loc[0,'outcome'] = outcome\nmodel_master.loc[0,'objective'] = 1\nmodel_master.loc[0,'ub'] = None\nmodel_master.loc[0,'lb'] = None\nmodel_master.loc[0,'task'] = task_type\nmodel_master['SCM_counterfactuals'] = None\nmodel_master['features'] = [[col for col in X_train.columns]]",
"_____no_output_____"
]
],
[
[
"#### Solve with Pyomo",
"_____no_output_____"
]
],
[
[
"model_pyo = ConcreteModel()\n\n## We will create our x decision variables, and fix them all to our sample's values for model verification.\nN = X_train.columns\nmodel_pyo.x = Var(N, domain=Reals)\n\ndef fix_value(model_pyo, index):\n return model_pyo.x[index] == sample.loc[0,index]\n\nmodel_pyo.Constraint1 = Constraint(N, rule=fix_value)\n\n## Specify any non-learned objective components - none here \nmodel_pyo.OBJ = Objective(expr=0, sense=minimize)",
"_____no_output_____"
],
[
"final_model_pyo = opticl.optimization_MIP(model_pyo, model_pyo.x, model_master, X_train, tr = False)\n# final_model_pyo.pprint()\nopt = SolverFactory('gurobi')\nresults = opt.solve(final_model_pyo) ",
"Embedding objective function for temp\n"
]
],
[
[
"### Check for equality between sklearn and embedded models",
"_____no_output_____"
]
],
[
[
"print(\"True outcome: %.3f\" % m.predict(sample)[0])\nprint(\"Pyomo output: %.3f\" % final_model_pyo.OBJ())",
"True outcome: 182.759\nPyomo output: 182.759\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79f93994032b206e07887e91a4ff48ca45bf9dc | 5,141 | ipynb | Jupyter Notebook | colab_drive.ipynb | renatopcamara/Colaboratory | d4c3a090c92d7fe7a97936e1ddc5f02eed7b5676 | [
"MIT"
] | null | null | null | colab_drive.ipynb | renatopcamara/Colaboratory | d4c3a090c92d7fe7a97936e1ddc5f02eed7b5676 | [
"MIT"
] | null | null | null | colab_drive.ipynb | renatopcamara/Colaboratory | d4c3a090c92d7fe7a97936e1ddc5f02eed7b5676 | [
"MIT"
] | null | null | null | 35.701389 | 320 | 0.538806 | [
[
[
"[View in Colaboratory](https://colab.research.google.com/github/renatopcamara/Colaboratory/blob/master/colab_drive.ipynb)",
"_____no_output_____"
]
],
[
[
"!apt-get install -y -qq software-properties-common python-software-properties module-init-tools\n!add-apt-repository -y ppa:alessandro-strada/ppa 2>&1 > /dev/null\n!apt-get update -qq 2>&1 > /dev/null\n!apt-get -y install -qq google-drive-ocamlfuse fuse\nfrom google.colab import auth\nauth.authenticate_user()\nfrom oauth2client.client import GoogleCredentials\ncreds = GoogleCredentials.get_application_default()\nimport getpass\n!google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL\nvcode = getpass.getpass()\n!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}",
"gpg: keybox '/tmp/tmpdpc2ug7s/pubring.gpg' created\ngpg: /tmp/tmpdpc2ug7s/trustdb.gpg: trustdb created\ngpg: key AD5F235DF639B041: public key \"Launchpad PPA for Alessandro Strada\" imported\ngpg: Total number processed: 1\ngpg: imported: 1\nWarning: apt-key output should not be parsed (stdout is not a terminal)\nPlease, open the following URL in a web browser: https://accounts.google.com/o/oauth2/auth?client_id=32555940559.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive&response_type=code&access_type=offline&approval_prompt=force\n··········\nPlease, open the following URL in a web browser: https://accounts.google.com/o/oauth2/auth?client_id=32555940559.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive&response_type=code&access_type=offline&approval_prompt=force\nPlease enter the verification code: Access token retrieved correctly.\n"
],
[
"# Create a directory and mount Google Drive using that directory.\n!mkdir -p drive\n!google-drive-ocamlfuse drive\n\nprint ('Files in Drive:')\n!ls drive/\n\n# Create a file in Drive.\n!echo \"This newly created file will appear in your Drive file list.\" > drive/created.txt",
"fuse: mountpoint is not empty\r\nfuse: if you are sure this is safe, use the 'nonempty' mount option\r\nFiles in Drive:\nAula08.zip\naula 5.pdf\naula 6.pdf\nClasse19-06.rar\nClassroom\nColab Notebooks\ncreated.txt\nDeep Learning.rar\nEfectos olvidados en el diseño de una campaña proselitista.pdf\nEnunciados.pdf\nestoque demanda fuzzy.pdf\nExercício_1.pdf\nICA\nINTEL\nLAB IBM\nPeriodos.docx\nPrograma Publicado.docx\nProjeto Data Mining.pptx\nroteirizacao.xlsx\nSample upload.txt\nScripts aulas 1 e 2.rar\nSparkExample.pdf\nteste.ipynb\n"
],
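The `ocamlfuse` flow above predates Colab's built-in Drive helper; the same mount is now a two-liner with the official API:

```python
from google.colab import drive
drive.mount('/content/drive')  # OAuth prompt, then Drive appears under /content/drive
```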
[
"",
"_____no_output_____"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
e79f975b722ccbf76323a6790cffd9d52b26fdb6 | 24,293 | ipynb | Jupyter Notebook | Visualizations/Trajectory.ipynb | yalamaddi/FCND-Term1-P2-3D-Motion-Planning-master | 8395cd0f8ff667bfe5f65137e516933a769c913d | [
"MIT"
] | 22 | 2018-05-31T22:54:15.000Z | 2022-03-03T12:57:48.000Z | Visualizations/Trajectory.ipynb | yalamaddi/FCND-Term1-P2-3D-Motion-Planning-master | 8395cd0f8ff667bfe5f65137e516933a769c913d | [
"MIT"
] | 3 | 2018-08-07T10:43:04.000Z | 2022-03-10T06:52:27.000Z | Visualizations/Trajectory.ipynb | yalamaddi/FCND-Term1-P2-3D-Motion-Planning-master | 8395cd0f8ff667bfe5f65137e516933a769c913d | [
"MIT"
] | 28 | 2018-03-26T17:19:57.000Z | 2022-02-28T04:29:01.000Z | 113.518692 | 17,756 | 0.846664 | [
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nfrom mpl_toolkits.mplot3d import Axes3D\n\n%matplotlib inline ",
"_____no_output_____"
],
[
"filename = '../colliders.csv'\ndata = np.loadtxt(filename,delimiter=',',dtype='Float64',skiprows=2)\nprint(data)",
"[[-310.2389 -439.2315 85.5 5. 5. 85.5 ]\n [-300.2389 -439.2315 85.5 5. 5. 85.5 ]\n [-290.2389 -439.2315 85.5 5. 5. 85.5 ]\n ...\n [ 257.8061 425.1645 1.75852 1.292725 1.292725 1.944791]\n [ 293.9967 368.3391 3.557666 1.129456 1.129456 3.667319]\n [ 281.5162 354.4156 4.999351 1.053772 1.053772 4.950246]]\n"
],
[
"import os, sys\nsys.path.append(os.path.dirname(os.path.dirname(os.path.abspath('planning_utils_from_seed_project.py'))))\nfrom planning_utils_from_seed_project import create_grid, a_star, heuristic, Action\nfrom udacidrone.frame_utils import global_to_local, local_to_global",
"_____no_output_____"
],
[
"TARGET_ALTITUDE = 5\nSAFETY_DISTANCE = 5\nglobal_home = np.array([-122.39745, 37.79248, 0.0])\nprint(f'Global Home => [lon, lat, alt] : {global_home}')\nglobal_position = np.array([-122.3974512, 37.7924799, 0.147])\nprint(f'Global Position => [lon, lat, alt] : {global_position}')\nlocal_position = global_to_local(global_position, global_home)\nprint(f'Local Position => [north, east, down] : {local_position}')",
"Global Home => [lon, lat, alt] : [-122.39745 37.79248 0. ]\nGlobal Position => [lon, lat, alt] : [-122.3974512 37.7924799 0.147 ]\nLocal Position => [north, east, down] : [-0.01177589 -0.10558296 -0.147 ]\n"
],
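`global_to_local` converts geodetic `[lon, lat, alt]` into a local NED frame relative to `global_home`. Udacidrone does this with a UTM projection; a sketch of the assumed conversion using the `utm` package:

```python
import numpy as np
import utm

def global_to_local_sketch(global_position, global_home):
    # utm.from_latlon(lat, lon) -> (easting, northing, zone_number, zone_letter)
    east_home, north_home, _, _ = utm.from_latlon(global_home[1], global_home[0])
    east, north, _, _ = utm.from_latlon(global_position[1], global_position[0])
    # NED: [north, east, down]; down is negative altitude above home
    return np.array([north - north_home,
                     east - east_home,
                     -(global_position[2] - global_home[2])])
```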
[
"grid, north_offset, east_offset = create_grid(data, TARGET_ALTITUDE, SAFETY_DISTANCE)\nprint(f'Grid offset : ({north_offset}, {east_offset})')\ngrid_start_north = int(np.ceil(local_position[0] - north_offset))\ngrid_start_east = int(np.ceil(local_position[1] - east_offset))\ngrid_start = (grid_start_north, grid_start_east)\nprint(f'Grid star : {grid_start}')",
"Grid offset : (-316, -445)\nGrid star : (316, 445)\n"
],
[
"grid_goal_north = grid_start_north + 400\ngrid_goal_east = grid_start_east + 400\ngrid_goal = (grid_goal_north, grid_goal_east)\nprint(f'Grid goal : {grid_goal}')\n\ngoal_local_position_north = grid_goal_north + north_offset\ngoal_local_position_east = grid_goal_east + east_offset\ngoal_local_position = np.array([goal_local_position_north, goal_local_position_east, local_position[2]])\nprint(f'Goal Local Position => [north, east, down] : {goal_local_position}')\n\ngoal_global_position = local_to_global(goal_local_position, global_home)\nprint(f'Global Global Position => [lon, lat, alt] : {goal_global_position}')\n\n",
"Grid goal : (716, 845)\nGoal Local Position => [north, east, down] : [ 4.00e+02 4.00e+02 -1.47e-01]\nGlobal Global Position => [lon, lat, alt] : [-122.39287756 37.79606176 0.147 ]\n"
],
[
"def visualize_path(g_start, g_goal, g, n_offset, e_offset):\n \"\"\"\n Visualize the path to get from `g_start` to `g_goal` defined by `waypoints` and the grid `g`\n \"\"\"\n path, cost = a_star(g, heuristic, grid_start, grid_goal)\n print(f'Path cost: {cost}')\n \n waypoints = np.array([[p[0] + n_offset, p[1] + e_offset, TARGET_ALTITUDE, 0] for p in path])\n print(f'Waypoint count : {waypoints.shape[0]}')\n \n fig = plt.figure(figsize=(20,10)) \n plt.imshow(g, origin='lower') \n plt.plot(g_start[1] + waypoints[:, 1], g_start[0] + waypoints[:, 0], 'g')\n plt.plot(g_start[1] + waypoints[:, 1], g_start[0] + waypoints[:, 0], 'oc')\n plt.plot(g_start[1], g_start[0], 'r+') \n plt.plot(g_goal[1], g_goal[0], 'r+')\n plt.xlabel('EAST')\n plt.ylabel('NORTH')\n plt.show()\n \n return waypoints\n \nwps = visualize_path(grid_start, grid_goal, grid, north_offset, east_offset)",
"Found a path.\nPath cost: 448160.0645114182\nWaypoint count : 1283\n"
]
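,
[
"# Hedged follow-up sketch (not in the original notebook): convert the first few\n# waypoints back to global (lon, lat, alt) coordinates. Assumes udacidrone's\n# local_to_global takes a [north, east, down] position, so down = -altitude.\nfor wp in wps[:3]:\n    print(local_to_global(np.array([wp[0], wp[1], -TARGET_ALTITUDE]), global_home))",
"_____no_output_____"
]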
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79fa5a034833a577e2d763397df2ba355d836fc | 18,903 | ipynb | Jupyter Notebook | Chapter08 AdaBoost.ipynb | GodLovesJonny/Statistical-Learning-Method-Code | a0d39a1ffc874bfd4669f5c571a41dcbf58ac2fe | [
"MIT"
] | 4 | 2019-09-02T08:51:14.000Z | 2019-09-29T14:06:16.000Z | Chapter08 AdaBoost.ipynb | JonnieWayy/Statistical-Learning-Method-Code | a0d39a1ffc874bfd4669f5c571a41dcbf58ac2fe | [
"MIT"
] | null | null | null | Chapter08 AdaBoost.ipynb | JonnieWayy/Statistical-Learning-Method-Code | a0d39a1ffc874bfd4669f5c571a41dcbf58ac2fe | [
"MIT"
] | 1 | 2020-03-12T12:50:40.000Z | 2020-03-12T12:50:40.000Z | 61.373377 | 9,036 | 0.699042 | [
[
[
"import numpy as np\nimport pandas as pd\nfrom sklearn.datasets import load_iris\nfrom sklearn.model_selection import train_test_split\nimport matplotlib.pyplot as plt\n%matplotlib inline",
"_____no_output_____"
],
[
"# data\ndef create_data():\n iris = load_iris()\n df = pd.DataFrame(iris.data, columns=iris.feature_names)\n df['label'] = iris.target\n df.columns = ['sepal length', 'sepal width', 'petal length', 'petal width', 'label']\n data = np.array(df.iloc[:100, [0, 1, -1]])\n for i in range(len(data)):\n if data[i,-1] == 0:\n data[i,-1] = -1\n # print(data)\n return data[:,:2], data[:,-1]",
"_____no_output_____"
],
[
"X, y = create_data()\nX_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)",
"_____no_output_____"
],
[
"plt.scatter(X[:50,0],X[:50,1], label='0')\nplt.scatter(X[50:,0],X[50:,1], label='1')\nplt.legend()",
"_____no_output_____"
],
[
"# AdaBoost\nclass AdaBoost:\n \n def __init__(self, n_estimators=50, learning_rate=1.0):\n self.clf_num = n_estimators\n self.learning_rate = learning_rate\n \n def init_args(self, datasets, labels):\n \n self.X = datasets\n self.Y = labels\n self.M, self.N = datasets.shape\n \n # set of weak clfs\n self.clf_sets = []\n \n # initialize weights\n self.weights = [1.0 / self.M] * self.M\n \n # coeff of G(x) alpha\n self.alpha = []\n \n def _G(self, features, labels, weights):\n m = len(features)\n error = 1000000000.0 # INF\n best_v = 0.0\n # 1-dim feature\n features_min = min(features)\n features_max = max(features)\n n_step = (features_max - features_min + self.learning_rate) // self.learning_rate\n direct, compare_array = None, None\n for i in range(1, int(n_step)):\n v = features_min + self.learning_rate * i \n if v not in features:\n # calculate the wrong clf\n compare_array_positive = np.array(\n [1 if features[k] > v else -1 for k in range(m)])\n weight_error_positive = sum([\n weights[k] for k in range(m)\n if compare_array_positive[k] != labels[k]\n ])\n\n compare_array_nagetive = np.array(\n [-1 if features[k] > v else 1 for k in range(m)])\n weight_error_nagetive = sum([\n weights[k] for k in range(m)\n if compare_array_nagetive[k] != labels[k]\n ])\n\n if weight_error_positive < weight_error_nagetive:\n weight_error = weight_error_positive\n _compare_array = compare_array_positive\n direct = 'positive'\n else:\n weight_error = weight_error_nagetive\n _compare_array = compare_array_nagetive\n direct = 'nagetive'\n\n # print('v:{} error:{}'.format(v, weight_error))\n if weight_error < error:\n error = weight_error\n compare_array = _compare_array\n best_v = v\n return best_v, direct, error, compare_array\n \n def _alpha(self, error):\n return 0.5 * np.log((1 - error) / error)\n \n def _Z(self, weights, a, clf):\n return sum([\n weights[i] * np.exp(-1 * a * self.Y[i] * clf[i])\n for i in range(self.M)\n ])\n \n # update weights\n def _w(self, a, clf, Z):\n for i in range(self.M):\n self.weights[i] = self.weights[i] * np.exp(\n -1 * a * self.Y[i] * clf[i]) / Z\n \n def G(self, x, v, direct):\n if direct == 'positive':\n return 1 if x > v else -1\n else:\n return -1 if x > v else 1\n \n def fit(self, X, y):\n self.init_args(X, y)\n\n for epoch in range(self.clf_num):\n best_clf_error, best_v, clf_result = 100000, None, None\n # choose the smallest error according to features\n for j in range(self.N):\n features = self.X[:, j]\n v, direct, error, compare_array = self._G(\n features, self.Y, self.weights)\n\n if error < best_clf_error:\n best_clf_error = error\n best_v = v\n final_direct = direct\n clf_result = compare_array\n axis = j\n\n if best_clf_error == 0:\n break\n\n a = self._alpha(best_clf_error)\n self.alpha.append(a)\n self.clf_sets.append((axis, best_v, final_direct))\n Z = self._Z(self.weights, a, clf_result)\n self._w(a, clf_result, Z)\n\n def predict(self, feature):\n result = 0.0\n for i in range(len(self.clf_sets)):\n axis, clf_v, direct = self.clf_sets[i]\n f_input = feature[axis]\n result += self.alpha[i] * self.G(f_input, clf_v, direct)\n # sign\n return 1 if result > 0 else -1\n\n def score(self, X_test, y_test):\n right_count = 0\n for i in range(len(X_test)):\n feature = X_test[i]\n if self.predict(feature) == y_test[i]:\n right_count += 1\n\n return right_count / len(X_test)",
"_____no_output_____"
],
[
"clf = AdaBoost(n_estimators=10, learning_rate=0.2)\nclf.fit(X_train, y_train)\nclf.score(X_test, y_test)",
"_____no_output_____"
],
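[
"# Hedged comparison sketch (not part of the original chapter): scikit-learn's\n# AdaBoostClassifier on the same split. Its decision stumps differ from the\n# implementation above, so the scores are only roughly comparable.\nfrom sklearn.ensemble import AdaBoostClassifier\n\nsk_clf = AdaBoostClassifier(n_estimators=10, learning_rate=0.2)\nsk_clf.fit(X_train, y_train)\nsk_clf.score(X_test, y_test)",
"_____no_output_____"
],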
[
"# 100次结果\nresult = []\nfor i in range(1, 101):\n X, y = create_data()\n X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33)\n clf = AdaBoost(n_estimators=100, learning_rate=0.2)\n clf.fit(X_train, y_train)\n r = clf.score(X_test, y_test)\n if i % 10 == 0:\n print(' * Processed {}/100 => Batch Score: {:.3f}'.format(i, r))\n result.append(r)\n\nprint('Average Score: {:.3f}%'.format(sum(result)))",
" * Processed 10/100 => Batch Score: 0.667\n * Processed 20/100 => Batch Score: 0.576\n * Processed 30/100 => Batch Score: 0.939\n * Processed 40/100 => Batch Score: 0.545\n * Processed 50/100 => Batch Score: 0.939\n * Processed 60/100 => Batch Score: 0.455\n * Processed 70/100 => Batch Score: 0.697\n * Processed 80/100 => Batch Score: 0.727\n * Processed 90/100 => Batch Score: 0.636\n * Processed 100/100 => Batch Score: 0.515\nAverage Score: 66.394%\n"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79facb3e189da4127f2e754da863c1e48499d52 | 25,914 | ipynb | Jupyter Notebook | Exercicios_unidade_4_Manipulacao_de_Dados_Parte_01.ipynb | felipemoreia/Data-Science-com-Python---Awari | 3262b02153a70f40fe524ef73cdd4e5004bd4261 | [
"MIT"
] | null | null | null | Exercicios_unidade_4_Manipulacao_de_Dados_Parte_01.ipynb | felipemoreia/Data-Science-com-Python---Awari | 3262b02153a70f40fe524ef73cdd4e5004bd4261 | [
"MIT"
] | null | null | null | Exercicios_unidade_4_Manipulacao_de_Dados_Parte_01.ipynb | felipemoreia/Data-Science-com-Python---Awari | 3262b02153a70f40fe524ef73cdd4e5004bd4261 | [
"MIT"
] | null | null | null | 30.451234 | 267 | 0.312109 | [
[
[
"<a href=\"https://colab.research.google.com/github/felipemoreia/Awari/blob/master/Exercicios_unidade_4_Manipulacao_de_Dados_Parte_01.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# Awari - Data Science\n\n## Exercícios Unidade 4 - Parte 1",
"_____no_output_____"
],
[
"Neste Jupyter notebook você irá resolver uma exercícios utilizando a linguagem Python e a biblioteca Pandas.\n\nTodos os datasets utilizados nos exercícios estão salvos na pasta *datasets*.\n\nTodo o seu código deve ser executado neste Jupyter Notebook. Por fim, se desejar, revise as respostas com o seu mentor.",
"_____no_output_____"
]
],
[
[
"import pandas as pd",
"_____no_output_____"
]
],
[
[
"### Passo 1. Importando os dados\n\nCarregue os dados salvos no arquivo ***datasets/users_dataset.txt***.\n\nEsse arquivo possui um conjunto de dados de trabalhadores com 5 colunas separadas pelo símbolo \"|\" (pipe) e 943 linhas.\n\n*Dica: utilize a função read_csv com os parâmetros sep e index_col*",
"_____no_output_____"
]
],
[
[
"users = pd.read_csv('users_dataset.txt',sep='|', index_col='user_id')",
"_____no_output_____"
]
],
[
[
"### Passo 2. Mostre as 25 primeiras linhas do dataset.\n\n*Dica: use a função head do DataFrame*",
"_____no_output_____"
]
],
[
[
"users.head(25)",
"_____no_output_____"
]
],
[
[
"### Passo 3. Mostre as 10 últimas linhas",
"_____no_output_____"
]
],
[
[
"users.tail(10)",
"_____no_output_____"
]
],
[
[
"### Passo 4. Qual o número de linhas e colunas do DataFrame?",
"_____no_output_____"
]
],
[
[
"users.shape",
"_____no_output_____"
]
],
[
[
"### Passo 5. Mostre o nome de todas as colunas.",
"_____no_output_____"
]
],
[
[
"users.columns",
"_____no_output_____"
]
],
[
[
"### Passo 6. Qual o tipo de dado de cada columa?",
"_____no_output_____"
]
],
[
[
"users.dtypes",
"_____no_output_____"
]
],
[
[
"### Passo 7. Mostre os dados da coluna *occupation*.",
"_____no_output_____"
]
],
[
[
"users['occupation']",
"_____no_output_____"
]
],
[
[
"### Passo 8. Quantas ocupações diferentes existem neste dataset?",
"_____no_output_____"
]
],
[
[
"len(users['occupation'].unique())",
"_____no_output_____"
]
],
[
[
"### Passo 9. Qual a ocupação mais frequente?",
"_____no_output_____"
]
],
[
[
"users['occupation'].value_counts()",
"_____no_output_____"
]
],
[
[
"### Passo 10. Qual a idade média dos usuários?",
"_____no_output_____"
]
],
[
[
"users['age'].mean()",
"_____no_output_____"
]
],
[
[
"Referências: https://github.com/guipsamora/pandas_exercises\n\n### Awari - <a href=\"https://awari.com.br/\"> awari.com.br</a>",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
e79faea9b88eb164f03adb6a736f0f2aae0a1743 | 13,108 | ipynb | Jupyter Notebook | my_modules/my_coding.ipynb | ARahmansetayesh/FeedbackAlignmentWithWeightNormalization | bd9e3d4f065f31ac75365395ef693aa1b79470f0 | [
"Apache-2.0"
] | null | null | null | my_modules/my_coding.ipynb | ARahmansetayesh/FeedbackAlignmentWithWeightNormalization | bd9e3d4f065f31ac75365395ef693aa1b79470f0 | [
"Apache-2.0"
] | null | null | null | my_modules/my_coding.ipynb | ARahmansetayesh/FeedbackAlignmentWithWeightNormalization | bd9e3d4f065f31ac75365395ef693aa1b79470f0 | [
"Apache-2.0"
] | null | null | null | 35.814208 | 534 | 0.442478 | [
[
[
"# Generating mutually exclusive n-hot coding\n\nSuppose the number of categories is $C$ and number of output neurons is $m$ ($ n \\cdot C \\leq m$). For generating mutually exclusive $n$-hot code vectors of size $m$ for each category, we started from the first category to the last one and successively for each category $c \\in \\{0,1,\\cdots,C-1\\}$ we initialized its code vector with zero elements and then randomly selected $n$ out of $m-c \\cdot n$ elements that were not equal to $1$ in any of the $c$ previously coded category vectors and set them equal to $1$. ",
"_____no_output_____"
]
],
[
[
"import random\nimport numpy as np\nimport torch\n\ndtype = torch.float\ndevice = torch.device('cuda' if torch.cuda.is_available() else 'cpu')\n\n\n",
"_____no_output_____"
],
[
"def Diff(li1, li2):\n return list(set(li1) - set(li2)) + list(set(li2) - set(li1))\n\n# coding_layers should be a list of number of neurons in each layer \n# ones_in_layes should be a list of number of ones in each coding vector\n \ndef get_n_hot_coding_map(coding_layers , ones_in_layes , number_of_categories ):\n\n if(type(coding_layers) != list ):\n raise Exception('coding_layers is not a list')\n if(type(ones_in_layes) != list ):\n raise Exception('ones_in_layes is not a list')\n\n\n if(type(number_of_categories) != int ):\n raise Exception('number_of_categories is not int') \n\n if( len(coding_layers) != len(ones_in_layes) ):\n raise Exception('inputs len mismatch') \n\n\n coding = []\n\n \n\n\n for i in range(len(coding_layers)):\n if ( number_of_categories*(ones_in_layes[i] ) > coding_layers[i] ):\n raise Exception('no a valide coding')\n\n\n\n coding.append( np.zeros([number_of_categories,coding_layers[i]]) )\n\n initial_indices_list = list(range(coding_layers[i]))\n for j in range(number_of_categories):\n\n indice = random.sample( initial_indices_list , ones_in_layes[i] )\n coding[i][j,indice] = 1\n k=0\n initial_indices_list = Diff(initial_indices_list, indice)\n \n\n\n coding[i] = torch.tensor(coding[i] , device=device, dtype=dtype, requires_grad=False )\n \n return coding\n\n# print(device)\n\n# coding = get_n_hot_coding_map([100] , [ 10 ] , 10 ) \n# # # print(coding)\n\n# m = np.zeros( len(coding[0][0]) )\n\n# for i in range(len(coding[0])):\n# m = m + np.array(coding[0][i])\n\n# print(m)\n\n# m = np.ones(len(coding[0][0]))\n# for i in range(len(coding[0])):\n# m = m * np.array(coding[0][i])\n\n# print(m)\n\n# # print( np.array(coding[1][0]) + np.array(coding[1][1]) + np.array(coding[1][2]) )\n# # print( np.array(coding[1][0]) * np.array(coding[1][1]) * np.array(coding[1][2]) )\n# # print( np.array([0,2,3]) * np.array([3,2,5]) )\n# # i=0\n\n# # print( torch.matmul( coding[i] , torch.transpose(coding[i],0, 1) ) )\n\n\n\n# # i=1\n\n# # print( torch.matmul( coding[i] , torch.transpose(coding[i],0, 1) ) )",
"_____no_output_____"
],
[
"# coding : list of tensors with size : ( number_of_categories , coding_layers[i]) , coding_layers[i] is number of neurons in layer i\n# category : 1:can be (Batch size , 1) 2:can be (Batch size ) \n\n\ndef code_category(coding , category):\n if(type(coding) != list ):\n raise Exception('coding is not a list')\n\n if(not torch.is_tensor(category) ): \n raise Exception('category is not a torch tensor')\n\n coded = []\n \n if (len(category.shape) == 1 or (len(category.shape) == 2 and (category.shape[1]) == 1)):\n\n category1 = category.view(-1).to(torch.int64 )\n for i in range( len(coding) ):\n coded.append(coding[i][ category1 , :])\n \n \n #if its one hot code\n else :\n raise Exception('category size mismach')\n\n return coded \n\n\n\n # for i in range(len(coding)):\n # output.append()\n\n# a=np.array( [2,1,1,1,1,1,1])\n# # print(a.shape)\n# category = torch.tensor(a).view([-1,1])\n\n# category = torch.tensor([ [0,0,1] , [1,0,0] , [1,0,0] , [1,0,1] ] , dtype=dtype )\n# # category = torch.tensor(a)\n# # category = a\n# # print(category.shape)\n# print(category)\n# # print(category.shape)\n\n\n# code = get_distributed_coding_map([5,3] , [ 2,1] , [1,0] , 3 ) \n# print(code)\n\n# code_category(code , category)",
"_____no_output_____"
],
[
"# coding : list of tensors with size : ( number_of_categories , coding_layers[i]) , coding_layers[i] is number of neurons in layer i\n# activity : list of activity of layers with size : ( batch_size , coding_layers[i])\n\n# return : tesor of size : ( batch_size , top )\n\ndef decode_category(coding , activity , top=1 , coef=None):\n if(type(coding) != list ):\n raise Exception('coding is not a list')\n\n if(type(activity) != list ):\n raise Exception('activity is not a list')\n\n result = torch.zeros( [ activity[0].shape[0] , coding[0].shape[0] ] , device = device , dtype=dtype, requires_grad=False )\n\n\n for i in range(len(coding)):\n if (coef==None):\n result = result + torch.matmul( activity[i] , torch.transpose( coding[i] , 0 , 1 ) ) \n\n else:\n result = result + coef[i] * torch.matmul( activity[i] , torch.transpose( coding[i] , 0 , 1 ) ) \n\n result2 = torch.zeros( [activity[0].shape[0] , top ] , device = device , dtype=dtype, requires_grad=False )\n\n\n for i in range(top): \n a,indece = torch.max(result ,dim=1)\n\n result2[np.arange(activity[0].shape[0]) , i ] = indece.view([-1]).to(dtype)\n result[np.arange(result.shape[0]) ,indece.view([-1]) ] = -1\n return result2\n \n\n\n# code = get_distributed_coding_map([5,3] , [ 2,1] , [1,0] , 3 ) \n\n# print(code[0])\n# print(code[1])\n\n# code1 = torch.tensor( [[1,0,0,1,0] , [0,1,0,0,0] , [0,0,1,0,0] ] )\n# code2 = torch.tensor( [[1,0,0] , [0,1,0] , [0,0,1] ] )\n\n\n\n# activity1 = torch.tensor( [[1,1,1,1,0] , [0,1,0,1,0] , [0,0,0,1,0] , [0,0,1,0,0] , [0,0,1,0,0] , [0,1,0,1,0]] )\n\n# activity2 = torch.tensor( [[1,0,1] , [0,0,0] , [0,1,0] , [0,0,1] , [0,1,0] , [0,1,0]] )\n\n# res = decode_category([code1,code2] , [activity1,activity2] ,top=2 , coef = [0.1 , 0.9] )\n\n# print(res)",
"_____no_output_____"
],
[
"\n# true_labels size : (batch size , 1) or (batch size)\ndef get_accuracy( true_labels , coding , activity , top=1 , coef=None ):\n\n if(type(activity) != list ):\n raise Exception('activity is not a list')\n\n if(true_labels.shape[0] != activity[0].shape[0] or len(true_labels.shape) > 1):\n raise Exception('true_labels shape mismatch')\n\n if(not torch.is_tensor(true_labels) ): \n raise Exception('true_labels is not a torch tensor')\n\n true_labels = true_labels.view([-1,1])\n\n\n predicted = decode_category(coding , activity , top=top , coef=coef)\n\n res = torch.sum(torch.clamp( torch.sum(predicted == true_labels , dim=1) , 0 ,1 )).to(dtype)/true_labels.shape[0]\n return res.item()\n\n\n\n# code1 = torch.tensor( [[1,0,0,1,0] , [0,1,0,0,0] , [0,0,1,0,0] ] )\n# code2 = torch.tensor( [[1,0,0] , [0,1,0] , [0,0,1] ] )\n\n\n\n# activity1 = torch.tensor( [[1,1,1,1,0] , [0,1,0,1,0] , [0,0,0,1,0] , [0,0,1,0,0] , [0,0,1,0,0] , [0,1,0,1,0]] )\n\n# activity2 = torch.tensor( [[1,0,1] , [0,0,0] , [0,1,0] , [0,0,1] , [0,1,0] , [0,1,0]] )\n\n# true_labels = torch.tensor([1,1,0,2,1,2])\n# get_accuracy( true_labels , [code1,code2] , [activity1,activity2] ,top=2)\n",
"_____no_output_____"
],
[
"# x list of size (batch , coding[0].shape[1] + coding[1].shape[1] + ... )\n\ndef seprate(coding , x):\n if(not torch.is_tensor(x) ): \n raise Exception('x is not a torch tensor')\n\n if(type(coding) != list ):\n raise Exception('coding is not a list')\n\n mlist = []\n form_ = 0\n til=0\n for i in range(len(coding)):\n til = til + coding[i].shape[1]\n mlist.append(x[: , form_ : til])\n form_ = form_ + coding[i].shape[1]\n if(til > x.shape[1]): \n raise Exception('x size mismatch')\n\n\n if(til != x.shape[1]): \n raise Exception('x size mismatch')\n\n return mlist\n\n\n# coding = get_distributed_coding_map([3,4] , [ 1 ,1 ] , [2,1] , 3 ) \n\n# x = torch.rand([5,7])\n\n# y =seprate(coding , x)\n# print(x)\n# print(y)\n\n# x = torch.tensor([[1,2,3.3],[5.4,1,0.2]])\n# print(x)\n# a,b = torch.max(x ,dim=1)\n\n# x[np.arange(x.shape[0]) , b ] = -1\n# print(b)\n\n# print(x)\n\n# a,b = torch.max(x ,dim=1)\n\n# x[np.arange(x.shape[0]) , b ] = -1\n# print(b)\n\n# print(x)",
"_____no_output_____"
],
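[
"# Hedged usage sketch of the helpers above: encode labels, then decode them back.\n# With mutually exclusive codes the round trip should be lossless (accuracy 1.0).\ncoding = get_n_hot_coding_map([12], [3], 4)\nlabels = torch.tensor([0, 1, 2, 3, 1], device=device)\nactivity = code_category(coding, labels)\nprint(get_accuracy(labels, coding, activity, top=1))",
"_____no_output_____"
],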
[
"",
"_____no_output_____"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79fb3286971293e3f9133c0ba9f157575dfd834 | 12,260 | ipynb | Jupyter Notebook | Content.ipynb | dev-aditya/QWorld_Summer_School_2021 | 1b8711327845617ca8dc32ff2a20f461d0ee01c7 | [
"Apache-2.0",
"CC-BY-4.0"
] | 1 | 2021-08-15T10:57:16.000Z | 2021-08-15T10:57:16.000Z | Content.ipynb | dev-aditya/QWorld_Summer_School_2021 | 1b8711327845617ca8dc32ff2a20f461d0ee01c7 | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | Content.ipynb | dev-aditya/QWorld_Summer_School_2021 | 1b8711327845617ca8dc32ff2a20f461d0ee01c7 | [
"Apache-2.0",
"CC-BY-4.0"
] | 3 | 2021-08-11T11:12:38.000Z | 2021-09-14T09:15:08.000Z | 40.065359 | 127 | 0.600734 | [
[
[
"<table width=\"100%\"><tr style=\"background-color:white;\">\n <td style=\"text-align:left;padding:0px;width:142px'\">\n <a href=\"https://qworld.net\" target=\"_blank\">\n <img src=\"qworld/images/QWorld.png\"></a></td>\n <td width=\"*\"> </td>\n <!-- ############################################# -->\n <td style=\"padding:0px;width:90px;\">\n <img align=\"right\" src=\"qworld/images/follow_us.png\" height=\"40px\"></td>\n <td style=\"padding:0px;width:40px;\">\n <a href=\"https://twitter.com/QWorld19\" target=\"_blank\">\n <img align=\"right\" src=\"qworld/images/Twitter.png\" width=\"40px\"></a> </td>\n <td style=\"padding:0px;width:5px;\"></td>\n <td style=\"padding:0px;width:40px;\">\n <a href=\"https://www.facebook.com/qworld19/\" target=\"_blank\">\n <img align=\"right\" src=\"qworld/images/Fb.png\"></a></td>\n <td style=\"padding:0px;width:5px;\"></td>\n <td style=\"padding:0px;width:40px;\">\n <a href=\"https://www.linkedin.com/company/qworld19\" target=\"_blank\">\n <img align=\"right\" src=\"qworld/images/LinkedIn.png\"></a></td>\n <td style=\"padding:0px;width:5px;\"></td>\n <td style=\"padding:0px;width:40px;\">\n <a href=\"https://youtube.com/QWorld19?sub_confirmation=1\" target=\"_blank\">\n <img align=\"right\" src=\"qworld/images/YT.png\"></a></td>\n <!-- ############################################# -->\n <td style=\"padding:0px;width:60px;\">\n <img align=\"right\" src=\"qworld/images/join.png\" height=\"40px\"></td>\n <td style=\"padding:0px;width:40px;\">\n <a href=\"https://discord.com/invite/akCvr7U87g\"\n target=\"_blank\">\n <img align=\"right\" src=\"qworld/images/Discord.png\"></a></td>\n <!-- ############################################# -->\n <td style=\"padding:0px;width:72px;\">\n <img align=\"right\" src=\"qworld/images/w3.png\" height=\"40px\"></td>\n <td style=\"padding:0px;width:40px;\">\n <a href=\"https://qworld.net\" target=\"_blank\">\n <img align=\"right\" src=\"qworld/images/www.png\"></a></td>\n</tr></table>",
"_____no_output_____"
],
[
"\n\n<h1 align=\"center\" style=\"color: #cd7f32;\"> QWorld Quantum Summer School 2021 | Content</h2>",
"_____no_output_____"
],
[
"---\n### Reference notebooks\n\n[Qiskit Reference](bronze/Q01_Qiskit_Reference.ipynb) | \n[Python Reference](python/Python04_Quick_Reference.ipynb) | \n[Drawing Reference](python/Python06_Drawing.ipynb)\n\n---",
"_____no_output_____"
],
[
"### Day 1 (26.07.2021) - Basics of classical systems\n\n[One Bit](classical-systems/CS04_One_Bit.ipynb) | \n[Coin Flipping](classical-systems/CS08_Coin_Flip.ipynb) | \n[Coin Flipping Game](classical-systems/CS12_Coin_Flip_Game.ipynb) | \n[Probabilistic States](classical-systems/CS16_Probabilistic_States.ipynb) | \n[Probabilistic Operators](classical-systems/CS20_Probabilistic_Operators.ipynb) | \n[Two Probabilistic Bits](classical-systems/CS24_Two_Probabilistic_Bits.ipynb)\n\n#### _Optional content_ \n[Exercises](classical-systems/Exercises_Probabilistic_Systems.ipynb) | \n[Problem Set](classical-systems/Problems_Probabilistic_Systems.ipynb) | \n[_Correlation (Extra)_](classical-systems/CS28_Correlation.ipynb) | \n[_Operators on Multiple Bits (Extra)_](classical-systems/CS40_Operators_on_Multiple_Bits.ipynb) \n",
"_____no_output_____"
],
[
"### Day 2 (27.07.2021) - Basics of quantum systems and Qiskit basics\n\n[Quantum Coin Flipping](photon/Photon20_Quantum_Coin_Flipping.ipynb)\n\n[First Quantum Programs with Qiskit](bronze/Q12_First_Quantum_Programs_with_Qiskit.ipynb)\n\n[Hadamard Operator](bronze/Q20_Hadamard.ipynb) | \n[One Qubit](bronze/Q24_One_Qubit.ipynb) | \n[Quantum State](bronze/Q28_Quantum_State.ipynb) | \n[Superposition and Measurement](bronze/Q36_Superposition_and_Measurement.ipynb) | \n[Visualization of a (Real-Valued) Qubit](bronze/Q32_Visualization_of_a_Qubit.ipynb)\n\n#### _Optional content_\n\n[Exercises](bronze/Exercises_Basics_of_Quantum_Systems.ipynb)",
"_____no_output_____"
],
[
"### Day 3 (28.07.2021) Quantum operators on a (real-valued) qubit and multiple qubits\n\n[Operations on the Unit Circle](bronze/Q40_Operations_on_the_Unit_Circle.ipynb) | \n[Rotations](bronze/Q44_Rotations.ipynb) | \n[Reflections](bronze/Q48_Reflections.ipynb) | \n[Quantum Tomography](bronze/Q52_Quantum_Tomography.ipynb) \n\n[Two Qubits](bronze/Q60_Two_Qubits.ipynb) | \n[Phase Kickback](bronze/Q64_Phase_Kickback.ipynb)\n\n#### _Optional content_\n\n[Exercises](bronze/Exercises_Quantum_Operators_on_a_Real-Valued_Qubit.ipynb)",
"_____no_output_____"
],
[
"### Day 4 (29.07.2021) Entanglement and protocols, multiqubit gates\n\n[Entanglement and Superdense Coding](bronze/Q72_Superdense_Coding.ipynb) | \n[Quantum Teleportation](bronze/Q76_Quantum_Teleportation.ipynb) | \n[Multiple Control Constructions](bronze/Q80_Multiple_Control_Constructions.ipynb) | \n\n#### _Optional content_\n\n[Exercises](bronze/Exercises_Quantum_Correlation.ipynb)",
"_____no_output_____"
],
[
"### Day 5 (30.07.2021) Grover's search algorithm\n\n[Inversion About the Mean](bronze/Q84_Inversion_About_the_Mean.ipynb) | \n[Grover's Search: One Qubit Representation](bronze/Q88_Grovers_Search_One_Qubit_Representation.ipynb) | \n[Grover's Search: Implementation](bronze/Q92_Grovers_Search_Implementation.ipynb)",
"_____no_output_____"
],
[
"---\n<h3 align=\"left\"> Projects You can check at the end of first week:</h3>\n\n*Difficulty levels:\neasy (<font size=\"+1\" color=\"7777ee\">★</font>), \nmedium (<font size=\"+1\" color=\"7777ee\">★★</font>), and\nhard (<font size=\"+1\" color=\"7777ee\">★★★</font>).*\n\n<font size=\"+1\" color=\"7777ee\"> ★</font> |\n[Correlation Game](projects/Project_Correlation_Game.ipynb) *on classical bits*\n<br>\n<font size=\"+1\" color=\"7777ee\"> ★</font> |\n[Swapping Quantum States](projects/Project_Swapping_Quantum_States.ipynb) *on qubits*\n<br>\n<font size=\"+1\" color=\"7777ee\"> ☆★</font> |\n[Simulating a Real-Valued Qubit](projects/Project_Simulating_a_RealValued_Qubit.ipynb)\n<br>\n<font size=\"+1\" color=\"7777ee\"> ★★</font> |\n[Quantum Tomography with Many Qubits](projects/Project_Quantum_Tomography_with_Many_Qubits.ipynb)\n<br>\n<font size=\"+1\" color=\"7777ee\"> ★★</font> |\n[Implementing Quantum Teleportation](projects/Project_Implementing_Quantum_Teleportation.ipynb)\n<br>\n<font size=\"+1\" color=\"7777ee\">☆★★</font> |\n[Communication via Superdense Coding](projects/Project_Communication_via_Superdense_Coding.ipynb)\n<br>\n<font size=\"+1\" color=\"7777ee\">★★★</font> |\n[Your Quantum Simulator](projects/Project_Your_Quantum_Simulator.ipynb)\n\n---",
"_____no_output_____"
],
[
"### Day 6 (02.08.2021) Introduction to Complex Numbers\n\n\n[Introduction to Qiskit](silver/A00_Qiskit_Introduction.ipynb) \n\n[Basics of complex numbers](silver/C01_Complex_Number_Basics.ipynb) | \n[Quantum states with complex numbers](silver/C02_Quantum_States_With_Complex_Numbers.ipynb) | \n[Mathematical notations](silver/C02_Mathematical_Notations.ipynb)\n\n[Quantum operators with complex numbers](silver/C03_Quantum_Operators_With_Complex_Numbers.ipynb)\n\n\n",
"_____no_output_____"
],
[
"### Day 7 (03.08.2021) Bloch Sphere and Operations with Complex Numbers\n\n[Global and local phase](silver/C05_Global_And_Local_Phase.ipynb) | \n[State representation conversion and visualization](silver/C06_State_Conversion_And_Visualization.ipynb) | \n[Bloch sphere](silver/C07_Bloch_Sphere.ipynb)\n \n\n[Quantum gates with complex numbers](silver/C04_Quantum_Gates_With_Complex_Numbers.ipynb) | \n[Operations on Bloch sphere](silver/C08_Operations_On_Bloch_Sphere.ipynb) | \n[Multiqubit operations](silver/C09_Multiqubit_Operations.ipynb)",
"_____no_output_____"
],
[
"### Day 8 (04.08.2021) Introduction to Cirq and Quantum Fourier Transform\n\n[Introduction to Cirq](silver/D00_Cirq_Introduction.ipynb) \n\n[Discrete Fourier Transform](silver/D01_Discrete_Fourier_Transform.ipynb) |\n[Quantum Fourier Transform](silver/D02_Quantum_Fourier_Transform.ipynb)",
"_____no_output_____"
],
[
"### Day 9 (05.08.2021) Applications of Quantum Fourier Transform\n\n\n\n[Phase Estimation](silver/D03_Phase_Estimation.ipynb) | \n[Order Finding Algorithm](silver/D04_Order_Finding_Algorithm.ipynb)",
"_____no_output_____"
],
[
"### Day 10 (06.08.2021) Shor's Algorithm\n\n[Shor's Algorithm](silver/D05_Shors_Algorithm.ipynb) | \n[Shor's Algorithm in more detail](silver/D06_Shors_Algorithm_In_More_Detail.ipynb) ",
"_____no_output_____"
],
[
"---\n\n[Credits](silver/S00_Credits.ipynb) | [References](silver/S01_References.ipynb)",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
e79fbb562a794881b49812b705fca32329f004ea | 8,067 | ipynb | Jupyter Notebook | solarpv/training/spot/train_solar_unet.ipynb | shivareddyiirs/solar-pv-global-inventory | 9940a454de88a39ca92dbabf07e98d8623f0ec8b | [
"MIT"
] | 64 | 2021-10-29T18:51:23.000Z | 2022-03-31T01:41:34.000Z | solarpv/training/spot/train_solar_unet.ipynb | shivareddyiirs/solar-pv-global-inventory | 9940a454de88a39ca92dbabf07e98d8623f0ec8b | [
"MIT"
] | 3 | 2021-11-25T06:00:38.000Z | 2021-11-28T16:06:46.000Z | solarpv/training/spot/train_solar_unet.ipynb | shivareddyiirs/solar-pv-global-inventory | 9940a454de88a39ca92dbabf07e98d8623f0ec8b | [
"MIT"
] | 9 | 2021-10-30T01:20:26.000Z | 2022-03-22T18:48:48.000Z | 31.389105 | 232 | 0.402256 | [
[
[
"# Train solar models",
"_____no_output_____"
]
],
[
[
"# import packages\nimport json\nimport logging\nimport matplotlib.pyplot as plt\nimport matplotlib.image as mpimg\n%matplotlib inline\nimport imp\nimport numpy as np\nimport os\nimport random\nimport rasterio\nimport shapely\nimport tensorflow as tf\n\nimport descarteslabs as dl",
"_____no_output_____"
],
[
"# Import local modules\nimport train\nimport generator\nimport transforms",
"_____no_output_____"
],
[
"# Define parameters\n# Note, setting epochs, steps to 2 for demonstration\n\n# For full training, use:\n# params = train.params\n\n# For testing, define the parameters here\nparams = {\n 'seed': 21, # for train/val data split \n\n # Training data specifications \n # DATASET METADATA # \n 'data_metadata': {\n 'products': ['airbus:oneatlas:spot:v2'],\n 'bands': ['red', 'green', 'blue', 'nir'],\n 'resolution': 1.5,\n 'start_datetime': '2016-01-01',\n 'end_datetime': '2018-12-31',\n 'tilesize': 512,\n 'pad': 0,\n },\n\n # GLOBAL METADATA # \n 'global_metadata': {\n 'local_ground': 'ground/', # directory containing image-target pairs \n 'local_model': 'model/', # directory to write this model \n },\n\n # MODEL METADATA \n 'model_name': 'solar_pv_airbus_spot_rgbn_v5',\n\n # TRAINING METADATA # \n # Metadata to define the training stage \n 'training_kwargs': {\n 'datalist': 'train_keys.txt',\n 'batchsize': 16,\n 'val_datalist': 'val_keys.txt',\n 'val_batchsize': 16,\n 'epochs': 1, #150,\n 'steps_per_epoch': 2,\n 'image_dim': (512, 512, 4) # This is the size of the training images \n },\n 'transforms': [\n transforms.CastTransform(feature_type='float32', target_type='bool'),\n transforms.SquareImageTransform(),\n transforms.AdditiveNoiseTransform(additive_noise=30.),\n transforms.MultiplicativeNoiseTransform(multiplicative_noise=0.3),\n transforms.NormalizeFeatureTransform(mean=128., std=1.),\n transforms.FlipFeatureTargetTransform(),\n ],\n}\n",
"_____no_output_____"
],
[
"print(params['training_kwargs'])",
"_____no_output_____"
],
[
"# Train the model\ntrain.train_from_document(params=params)",
"_____no_output_____"
],
[
"!cat 'model/train_solar_pv_airbus_spot_rgbn_v5.log'",
"_____no_output_____"
]
],
[
[
"## Load the model and predict on one training image",
"_____no_output_____"
]
],
[
[
"model = tf.keras.models.load_model('model/solar_pv_airbus_spot_rgbn_v5.hdf5')",
"_____no_output_____"
],
[
"trf = [\n transforms.CastTransform(feature_type='float32', target_type='bool'),\n transforms.SquareImageTransform(),\n transforms.NormalizeFeatureTransform(mean=128., std=1.),\n]",
"_____no_output_____"
],
[
"kw_train = params['training_kwargs']\ndata_list = os.path.join(params['global_metadata']['local_ground'], kw_train['datalist'])\n\ntrn_generator = generator.DataGenerator(data_list, batch_size=2, dim=(512,512, 4),\n shuffle=False, augment=True,\n transforms=trf,\n )",
"_____no_output_____"
],
[
"img, trg = trn_generator.__getitem__(0)",
"_____no_output_____"
],
[
"def img_plt(img):\n return np.clip((img+128).astype('uint8'), 0, 255)\n\nii=0\nfig, ax = plt.subplots(1,2, figsize=(10,8))\nax[0].imshow(img_plt(img[ii,:,:,:3]))\nax[1].imshow(img_plt(trg[ii,:,:,:].squeeze()))",
"_____no_output_____"
],
[
"proba = model.predict(img)",
"_____no_output_____"
],
[
"proba.shape",
"_____no_output_____"
],
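[
"# Hedged follow-up sketch: binarize the probability map with an assumed 0.5\n# threshold and compare it against the ground-truth target.\nmask = proba[0, ..., 0] > 0.5\nfig, ax = plt.subplots(1, 2, figsize=(10, 5))\nax[0].imshow(trg[0, :, :, :].squeeze())\nax[1].imshow(mask)",
"_____no_output_____"
],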
[
"plt.imshow(proba[0,...,0].squeeze())",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79fbc6db7e8ccb2814da31f45bc78c3462afd7b | 105 | ipynb | Jupyter Notebook | notebooks/test_bert.ipynb | megaelius/EPAA | 38900be53d6544becf8c51ed1de0501fd639b9c8 | [
"FTL"
] | null | null | null | notebooks/test_bert.ipynb | megaelius/EPAA | 38900be53d6544becf8c51ed1de0501fd639b9c8 | [
"FTL"
] | null | null | null | notebooks/test_bert.ipynb | megaelius/EPAA | 38900be53d6544becf8c51ed1de0501fd639b9c8 | [
"FTL"
] | null | null | null | 17.5 | 53 | 0.828571 | [
[
[
"empty"
]
]
] | [
"empty"
] | [
[
"empty"
]
] |
e79fc3627069f4b5e1c39d0a55d84951c35e45f1 | 47,171 | ipynb | Jupyter Notebook | nb/tests/models_taus.ipynb | kgb0255/provabgs | a2a787fde5b9f6f2e093b59801121c29d4867471 | [
"MIT"
] | 11 | 2020-12-11T21:06:53.000Z | 2022-03-16T17:20:57.000Z | nb/tests/models_taus.ipynb | kgb0255/provabgs | a2a787fde5b9f6f2e093b59801121c29d4867471 | [
"MIT"
] | 15 | 2020-11-25T05:06:26.000Z | 2021-04-07T15:34:52.000Z | nb/tests/models_taus.ipynb | kgb0255/provabgs | a2a787fde5b9f6f2e093b59801121c29d4867471 | [
"MIT"
] | 3 | 2021-01-10T15:20:26.000Z | 2021-11-07T21:17:13.000Z | 142.510574 | 19,060 | 0.891056 | [
[
[
"# $\\tau$ and delayed-$\\tau$ model sanity checks\nIn this notebook I will check that the SFH are sensible and integrate to 1. I will check that the average SSFR does not exceed $1/dt$",
"_____no_output_____"
]
],
[
[
"import numpy as np \nfrom provabgs import infer as Infer\nfrom provabgs import models as Models\nfrom astropy.cosmology import Planck13",
"_____no_output_____"
],
[
"# --- plotting --- \nimport corner as DFM\nimport matplotlib as mpl\nimport matplotlib.pyplot as plt\nmpl.rcParams['text.usetex'] = True\nmpl.rcParams['font.family'] = 'serif'\nmpl.rcParams['axes.linewidth'] = 1.5\nmpl.rcParams['axes.xmargin'] = 1\nmpl.rcParams['xtick.labelsize'] = 'x-large'\nmpl.rcParams['xtick.major.size'] = 5\nmpl.rcParams['xtick.major.width'] = 1.5\nmpl.rcParams['ytick.labelsize'] = 'x-large'\nmpl.rcParams['ytick.major.size'] = 5\nmpl.rcParams['ytick.major.width'] = 1.5\nmpl.rcParams['legend.frameon'] = False",
"_____no_output_____"
],
[
"tau_model = Models.FSPS(name='tau') # tau model\ndtau_model = Models.FSPS(name='delayed_tau') # delayed tau model ",
"_____no_output_____"
],
[
"zred = 0.01\ntage = Planck13.age(zred).value",
"_____no_output_____"
],
[
"prior = Infer.load_priors([\n Infer.UniformPrior(0., 0.), \n Infer.UniformPrior(0.3, 1e1), # tau SFH\n Infer.UniformPrior(0., 0.2), # constant SFH\n Infer.UniformPrior(0., tage-2.), # start time\n Infer.UniformPrior(0., 0.5), # fburst\n Infer.UniformPrior(0., tage), # tburst\n Infer.UniformPrior(1e-6, 1e-3), # metallicity\n Infer.UniformPrior(0., 4.)])",
"_____no_output_____"
]
],
[
[
"## Check SFH sensibility",
"_____no_output_____"
]
],
[
[
"np.random.seed(2)",
"_____no_output_____"
],
[
"theta = prior.sample()\nprint('tau = %.2f' % theta[1])\nprint('tstart = %.2f' % theta[3])\nprint('tburst = %.2f' % theta[5])",
"tau = 7.81\ntstart = 5.79\ntburst = 12.39\n"
],
[
"t1, sfh1 = tau_model.SFH(theta, zred)\nt2, sfh2 = dtau_model.SFH(theta, zred)",
"_____no_output_____"
],
[
"fig = plt.figure(figsize=(10,5))\nsub = fig.add_subplot(111)\nsub.plot(t1, sfh1, label=r'$\\tau$ model')\nsub.plot(t2, sfh2, label=r'delayed-$\\tau$ model')\nsub.legend(loc='upper left', fontsize=20)\nsub.set_xlabel(r'$t_{\\rm lookback}$', fontsize=25)\nsub.set_xlim(0., tage)",
"_____no_output_____"
]
],
[
[
"## check SFH normalization",
"_____no_output_____"
]
],
[
[
"for i in range(100): \n theta = prior.sample()\n t1, sfh1 = tau_model.SFH(theta, zred)\n t2, sfh2 = dtau_model.SFH(theta, zred) \n assert np.abs(np.trapz(sfh1, t1) - 1) < 1e-4, ('int(SFH) = %f' % np.trapz(sfh1, t1))\n assert np.abs(np.trapz(sfh2, t2) - 1) < 1e-4, ('int(SFH) = %f' % np.trapz(sfh2, t2))",
"_____no_output_____"
]
],
[
[
"## check average SFR calculation ",
"_____no_output_____"
]
],
[
[
"thetas = np.array([prior.sample() for i in range(50000)])\navgsfr1 = tau_model.avgSFR(thetas, zred, dt=0.1)\navgsfr2 = dtau_model.avgSFR(thetas, zred, dt=0.1)",
"_____no_output_____"
],
[
"fig = plt.figure(figsize=(10,5))\nsub = fig.add_subplot(111)\nsub.hist(np.log10(avgsfr1), range=(-13, -7), bins=100, alpha=0.5)\nsub.hist(np.log10(avgsfr2), range=(-13, -7), bins=100, alpha=0.5)\nsub.axvline(-8, color='k', linestyle='--')\nsub.set_xlabel(r'$\\log{\\rm SSFR}$', fontsize=25)\nsub.set_xlim(-13., -7.)",
"_____no_output_____"
],
[
"avgsfr1 = tau_model.avgSFR(thetas, zred, dt=1)\navgsfr2 = dtau_model.avgSFR(thetas, zred, dt=1)",
"_____no_output_____"
],
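[
"# Hedged sanity check (assuming avgSFR returns SSFR in 1/yr, matching the\n# axvline below): with dt = 1 Gyr, log10 SSFR should not exceed -9.\nprint(np.log10(avgsfr1).max(), np.log10(avgsfr2).max())\nassert np.log10(avgsfr1).max() <= -9 + 1e-6\nassert np.log10(avgsfr2).max() <= -9 + 1e-6",
"_____no_output_____"
],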
[
"fig = plt.figure(figsize=(10,5))\nsub = fig.add_subplot(111)\nsub.hist(np.log10(avgsfr1), range=(-13, -7), bins=100, alpha=0.5)\nsub.hist(np.log10(avgsfr2), range=(-13, -7), bins=100, alpha=0.5)\nsub.axvline(-9, color='k', linestyle='--')\nsub.set_xlabel(r'$\\log{\\rm SSFR}$', fontsize=25)\nsub.set_xlim(-13., -7.)",
"_____no_output_____"
]
],
[
[
"None exceed the theoretical SSFR limit",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
]
] |
e79fc68a4227515f824945492076c2d0496650d1 | 12,073 | ipynb | Jupyter Notebook | notebooks/11.Text_Feature_Extraction.ipynb | ogrisel/euroscipy-2019-scikit-learn-tutorial | e141cd8f3e600f35826516738188e87ac3480fc3 | [
"CC0-1.0"
] | 3 | 2019-08-20T17:47:43.000Z | 2019-10-05T06:55:08.000Z | notebooks/11.Text_Feature_Extraction.ipynb | ogrisel/euroscipy-2019-scikit-learn-tutorial | e141cd8f3e600f35826516738188e87ac3480fc3 | [
"CC0-1.0"
] | null | null | null | notebooks/11.Text_Feature_Extraction.ipynb | ogrisel/euroscipy-2019-scikit-learn-tutorial | e141cd8f3e600f35826516738188e87ac3480fc3 | [
"CC0-1.0"
] | null | null | null | 31.440104 | 819 | 0.617411 | [
[
[
"%matplotlib inline\nimport matplotlib.pyplot as plt\nimport numpy as np",
"_____no_output_____"
]
],
[
[
"# Methods - Text Feature Extraction with Bag-of-Words",
"_____no_output_____"
],
[
"In many tasks, like in the classical spam detection, your input data is text.\nFree text with variables length is very far from the fixed length numeric representation that we need to do machine learning with scikit-learn.\nHowever, there is an easy and effective way to go from text data to a numeric representation using the so-called bag-of-words model, which provides a data structure that is compatible with the machine learning aglorithms in scikit-learn.",
"_____no_output_____"
],
[
"<img src=\"figures/bag_of_words.svg\" width=\"100%\">\n",
"_____no_output_____"
],
[
"Let's assume that each sample in your dataset is represented as one string, which could be just a sentence, an email, or a whole news article or book. To represent the sample, we first split the string into a list of tokens, which correspond to (somewhat normalized) words. A simple way to do this to just split by whitespace, and then lowercase the word. \n\nThen, we build a vocabulary of all tokens (lowercased words) that appear in our whole dataset. This is usually a very large vocabulary.\nFinally, looking at our single sample, we could show how often each word in the vocabulary appears.\nWe represent our string by a vector, where each entry is how often a given word in the vocabulary appears in the string.\n\nAs each sample will only contain very few words, most entries will be zero, leading to a very high-dimensional but sparse representation.\n\nThe method is called \"bag-of-words,\" as the order of the words is lost entirely.",
"_____no_output_____"
]
],
[
[
"X = [\"Some say the world will end in fire,\",\n \"Some say in ice.\"]",
"_____no_output_____"
],
[
"len(X)",
"_____no_output_____"
],
[
"from sklearn.feature_extraction.text import CountVectorizer\n\nvectorizer = CountVectorizer()\nvectorizer.fit(X)",
"_____no_output_____"
],
[
"vectorizer.vocabulary_",
"_____no_output_____"
],
[
"X_bag_of_words = vectorizer.transform(X)",
"_____no_output_____"
],
[
"X_bag_of_words.shape",
"_____no_output_____"
],
[
"X_bag_of_words",
"_____no_output_____"
],
[
"X_bag_of_words.toarray()",
"_____no_output_____"
],
[
"vectorizer.get_feature_names()",
"_____no_output_____"
],
[
"vectorizer.inverse_transform(X_bag_of_words)",
"_____no_output_____"
]
],
[
[
"# tf-idf Encoding\nA useful transformation that is often applied to the bag-of-word encoding is the so-called term-frequency inverse-document-frequency (tf-idf) scaling, which is a non-linear transformation of the word counts.\n\nThe tf-idf encoding rescales words that are common to have less weight:",
"_____no_output_____"
]
],
[
[
"from sklearn.feature_extraction.text import TfidfVectorizer\n\ntfidf_vectorizer = TfidfVectorizer()\ntfidf_vectorizer.fit(X)",
"_____no_output_____"
],
[
"import numpy as np\nnp.set_printoptions(precision=2)\n\nprint(tfidf_vectorizer.transform(X).toarray())",
"_____no_output_____"
]
],
[
[
"tf-idfs are a way to represent documents as feature vectors. tf-idfs can be understood as a modification of the raw term frequencies (`tf`); the `tf` is the count of how often a particular word occurs in a given document. The concept behind the tf-idf is to downweight terms proportionally to the number of documents in which they occur. Here, the idea is that terms that occur in many different documents are likely unimportant or don't contain any useful information for Natural Language Processing tasks such as document classification. If you are interested in the mathematical details and equations, see this [external IPython Notebook](http://nbviewer.jupyter.org/github/rasbt/pattern_classification/blob/master/machine_learning/scikit-learn/tfidf_scikit-learn.ipynb) that walks you through the computation.",
"_____no_output_____"
],
[
"# Bigrams and N-Grams\n\nIn the example illustrated in the figure at the beginning of this notebook, we used the so-called 1-gram (unigram) tokenization: Each token represents a single element with regard to the splittling criterion. \n\nEntirely discarding word order is not always a good idea, as composite phrases often have specific meaning, and modifiers like \"not\" can invert the meaning of words.\n\nA simple way to include some word order are n-grams, which don't only look at a single token, but at all pairs of neighborhing tokens. For example, in 2-gram (bigram) tokenization, we would group words together with an overlap of one word; in 3-gram (trigram) splits we would create an overlap two words, and so forth:\n\n- original text: \"this is how you get ants\"\n- 1-gram: \"this\", \"is\", \"how\", \"you\", \"get\", \"ants\"\n- 2-gram: \"this is\", \"is how\", \"how you\", \"you get\", \"get ants\"\n- 3-gram: \"this is how\", \"is how you\", \"how you get\", \"you get ants\"\n\nWhich \"n\" we choose for \"n-gram\" tokenization to obtain the optimal performance in our predictive model depends on the learning algorithm, dataset, and task. Or in other words, we have consider \"n\" in \"n-grams\" as a tuning parameters, and in later notebooks, we will see how we deal with these.\n\nNow, let's create a bag of words model of bigrams using scikit-learn's `CountVectorizer`:",
"_____no_output_____"
]
],
[
[
"# look at sequences of tokens of minimum length 2 and maximum length 2\nbigram_vectorizer = CountVectorizer(ngram_range=(2, 2))\nbigram_vectorizer.fit(X)",
"_____no_output_____"
],
[
"bigram_vectorizer.get_feature_names()",
"_____no_output_____"
],
[
"bigram_vectorizer.transform(X).toarray()",
"_____no_output_____"
]
],
[
[
"Often we want to include unigrams (single tokens) AND bigrams, wich we can do by passing the following tuple as an argument to the `ngram_range` parameter of the `CountVectorizer` function:",
"_____no_output_____"
]
],
[
[
"gram_vectorizer = CountVectorizer(ngram_range=(1, 2))\ngram_vectorizer.fit(X)",
"_____no_output_____"
],
[
"gram_vectorizer.get_feature_names()",
"_____no_output_____"
],
[
"gram_vectorizer.transform(X).toarray()",
"_____no_output_____"
]
],
[
[
"Character n-grams\n=================\n\nSometimes it is also helpful not only to look at words, but to consider single characters instead. \nThat is particularly useful if we have very noisy data and want to identify the language, or if we want to predict something about a single word.\nWe can simply look at characters instead of words by setting ``analyzer=\"char\"``.\nLooking at single characters is usually not very informative, but looking at longer n-grams of characters could be:",
"_____no_output_____"
]
],
[
[
"X",
"_____no_output_____"
],
[
"char_vectorizer = CountVectorizer(ngram_range=(2, 2), analyzer=\"char\")\nchar_vectorizer.fit(X)",
"_____no_output_____"
],
[
"print(char_vectorizer.get_feature_names())",
"_____no_output_____"
]
],
[
[
"<div class=\"alert alert-success\">\n <b>EXERCISE</b>:\n <ul>\n <li>\n Compute the bigrams from \"zen of python\" as given below (or by ``import this``), and find the most common trigram.\nWe want to treat each line as a separate document. You can achieve this by splitting the string by newlines (``\\n``).\nCompute the Tf-idf encoding of the data. Which words have the highest tf-idf score? Why?\nWhat changes if you use ``TfidfVectorizer(norm=\"none\")``?\n </li>\n </ul>\n</div>",
"_____no_output_____"
]
],
[
[
"zen = \"\"\"Beautiful is better than ugly.\nExplicit is better than implicit.\nSimple is better than complex.\nComplex is better than complicated.\nFlat is better than nested.\nSparse is better than dense.\nReadability counts.\nSpecial cases aren't special enough to break the rules.\nAlthough practicality beats purity.\nErrors should never pass silently.\nUnless explicitly silenced.\nIn the face of ambiguity, refuse the temptation to guess.\nThere should be one-- and preferably only one --obvious way to do it.\nAlthough that way may not be obvious at first unless you're Dutch.\nNow is better than never.\nAlthough never is often better than *right* now.\nIf the implementation is hard to explain, it's a bad idea.\nIf the implementation is easy to explain, it may be a good idea.\nNamespaces are one honking great idea -- let's do more of those!\"\"\"",
"_____no_output_____"
],
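[
"# Hedged solution sketch for the exercise above (one possible approach;\n# not the official solutions/11_ngrams.py loaded in the next cell).\nlines = zen.split('\\n')\n\ntrigram_vect = CountVectorizer(ngram_range=(3, 3))\ncounts = trigram_vect.fit_transform(lines).sum(axis=0).A1\ntop = counts.argmax()\nprint(trigram_vect.get_feature_names()[top], counts[top])\n\ntfidf_vect = TfidfVectorizer()\ntfidf = tfidf_vect.fit_transform(lines).max(axis=0).toarray().ravel()\nfor idx in tfidf.argsort()[::-1][:5]:\n    print(tfidf_vect.get_feature_names()[idx], tfidf[idx])",
"_____no_output_____"
],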
[
"# %load solutions/11_ngrams.py",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
e79fcb43e8c1d8c389856cb9344285c127c97246 | 50,481 | ipynb | Jupyter Notebook | docs/examples/concept-drift-detection.ipynb | online-ml/creme | 60872844e6052b5ef20e4075aea30f9031377136 | [
"BSD-3-Clause"
] | 1,105 | 2019-01-24T15:15:30.000Z | 2020-11-10T18:27:00.000Z | docs/examples/concept-drift-detection.ipynb | online-ml/creme | 60872844e6052b5ef20e4075aea30f9031377136 | [
"BSD-3-Clause"
] | 328 | 2019-01-25T13:48:43.000Z | 2020-11-11T11:41:44.000Z | docs/examples/concept-drift-detection.ipynb | online-ml/creme | 60872844e6052b5ef20e4075aea30f9031377136 | [
"BSD-3-Clause"
] | 150 | 2019-01-29T19:05:21.000Z | 2020-11-11T11:50:14.000Z | 241.535885 | 21,568 | 0.911828 | [
[
[
"# Concept Drift\n\nIn the context of data streams, it is assumed that data can change over time. The change in the relationship between the data (features) and the target to learn is known as **Concept Drift**. As examples we can mention, the electricity demand across the year, the stock market, and the likelihood of a new movie to be successful. Let's consider the movie example: Two movies can have similar features such as popular actors/directors, storyline, production budget, marketing campaigns, etc. yet it is not certain that both will be similarly successful. What the target audience *considers* worth watching (and their money) is constantly changing and production companies must adapt accordingly to avoid \"box office flops\".\n\n## Impact of drift on learning\n\nConcept drift can have a significant impact on predictive performance if not handled properly. Most batch learning models will fail in the presence of concept drift as they are essentially trained on different data. On the other hand, stream learning methods continuously update themselves and adapt to new concepts. Furthermore, drift-aware methods use change detection methods (a.k.a. drift detectors) to trigger *mitigation mechanisms* if a change in performance is detected.\n\n## Detecting concept drift\n\nMultiple drift detection methods have been proposed. The goal of a drift detector is to signal an alarm in the presence of drift. A good drift detector maximizes the number of true positives while keeping the number of false positives to a minimum. It must also be resource-wise efficient to work in the context of infinite data streams.\n\nFor this example, we will generate a synthetic data stream by concatenating 3 distributions of 1000 samples each:\n\n- $dist_a$: $\\mu=0.8$, $\\sigma=0.05$\n- $dist_b$: $\\mu=0.4$, $\\sigma=0.02$\n- $dist_c$: $\\mu=0.6$, $\\sigma=0.1$.",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nfrom matplotlib import gridspec\n\n# Generate data for 3 distributions\nrandom_state = np.random.RandomState(seed=42)\ndist_a = random_state.normal(0.8, 0.05, 1000)\ndist_b = random_state.normal(0.4, 0.02, 1000)\ndist_c = random_state.normal(0.6, 0.1, 1000)\n\n# Concatenate data to simulate a data stream with 2 drifts\nstream = np.concatenate((dist_a, dist_b, dist_c))\n\n# Auxiliary function to plot the data\ndef plot_data(dist_a, dist_b, dist_c, drifts=None):\n fig = plt.figure(figsize=(7,3), tight_layout=True)\n gs = gridspec.GridSpec(1, 2, width_ratios=[3, 1])\n ax1, ax2 = plt.subplot(gs[0]), plt.subplot(gs[1])\n ax1.grid()\n ax1.plot(stream, label='Stream')\n ax2.grid(axis='y')\n ax2.hist(dist_a, label=r'$dist_a$')\n ax2.hist(dist_b, label=r'$dist_b$')\n ax2.hist(dist_c, label=r'$dist_c$')\n if drifts is not None:\n for drift_detected in drifts:\n ax1.axvline(drift_detected, color='red')\n plt.show()\n\nplot_data(dist_a, dist_b, dist_c)",
"_____no_output_____"
]
],
[
[
"### Drift detection test\n\nWe will use the ADaptive WINdowing (`ADWIN`) drift detection method. Remember that the goal is to indicate that drift has occurred after samples **1000** and **2000** in the synthetic data stream.",
"_____no_output_____"
]
],
[
[
"from river import drift\n\ndrift_detector = drift.ADWIN()\ndrifts = []\n\nfor i, val in enumerate(stream):\n drift_detector.update(val) # Data is processed one sample at a time\n if drift_detector.change_detected:\n # The drift detector indicates after each sample if there is a drift in the data\n print(f'Change detected at index {i}')\n drifts.append(i)\n drift_detector.reset() # As a best practice, we reset the detector\n\nplot_data(dist_a, dist_b, dist_c, drifts)",
"Change detected at index 1055\nChange detected at index 2079\n"
]
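,
[
"# Hedged sketch of the usual deployment pattern: monitor a model's (simulated)\n# per-sample correctness with a drift detector; 1 = correct prediction.\ndetector = drift.ADWIN()\ncorrect = np.concatenate((random_state.binomial(1, 0.9, 1000),\n                          random_state.binomial(1, 0.6, 1000)))\nfor i, is_correct in enumerate(correct):\n    detector.update(is_correct)\n    if detector.change_detected:\n        print(f'Performance drift detected at index {i}')\n        detector.reset()",
"_____no_output_____"
]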
],
[
[
"We see that `ADWIN` successfully indicates the presence of drift (red vertical lines) close to the begining of a new data distribution.\n\n\n---\nWe conclude this example with some remarks regarding concept drift detectors and their usage:\n\n- In practice, drift detectors provide stream learning methods with robustness against concept drift. Drift detectors monitor the model usually through a performance metric.\n- Drift detectors work on univariate data. This is why they are used to monitor a model's performance and not the data itself. Remember that concept drift is defined as a change in the relationship between data and the target to learn (in supervised learning).\n- Drift detectors define their expectations regarding input data. It is important to know these expectations to feed a given drift detector with the correct data.\n",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
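"code",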
"code"
],
[
"markdown"
]
] |
e79fd682e9864e8f2db6206960fcb8cb6455732a | 5,799 | ipynb | Jupyter Notebook | notebooks/13-org-ner-spacy.ipynb | sujitpal/content-engineering-tutorial | 7acb6ca846753426638bc3c29d90ce450a9e27bb | [
"Apache-2.0"
] | 6 | 2018-09-20T15:43:12.000Z | 2020-08-09T10:22:23.000Z | notebooks/13-org-ner-spacy.ipynb | sujitpal/content-engineering-tutorial | 7acb6ca846753426638bc3c29d90ce450a9e27bb | [
"Apache-2.0"
] | null | null | null | notebooks/13-org-ner-spacy.ipynb | sujitpal/content-engineering-tutorial | 7acb6ca846753426638bc3c29d90ce450a9e27bb | [
"Apache-2.0"
] | 1 | 2019-02-15T15:41:32.000Z | 2019-02-15T15:41:32.000Z | 29.891753 | 375 | 0.560441 | [
[
[
"## Extracting ORGs from papers using SpaCy\n\nThis notebook is based on the documentation on the [SpaCy Linguistic Features page](https://spacy.io/usage/linguistic-features#section-named-entities).\n\nWe try to extract ORG named entities from our papers dataset. These are likely to be universities and commercial research groups.",
"_____no_output_____"
]
],
[
[
"import os\nimport re\nimport spacy",
"_____no_output_____"
],
[
"DATA_DIR = \"../data\"\n\nTEXTFILES_ORG_DIR = os.path.join(DATA_DIR, \"textfiles_org\")\nORGS_SPACY_DIR = os.path.join(DATA_DIR, \"orgs_spacy\")",
"_____no_output_____"
]
],
[
[
"### Entity Extractor\n\nSpaCy entity extractor is __much faster__ compared to NLTK+Stanford.",
"_____no_output_____"
]
],
[
[
"def extract_entities(tagger, text):\n entities = []\n if text is None:\n return entities\n doc = tagger(text)\n for ent in doc.ents:\n if ent.label_ == \"ORG\":\n entities.append(ent.text)\n return entities\n \n \ntext = \"\"\"Yann Le Cun, a native of France was not even 30 when he joined AT&T \nBell Laboratories in New Jersey. At Bell Labs, LeCun developed a number of new \nmachine learning methods, including the convolutional neural network—modeled \nafter the visual cortex in animals. Today, he serves as chief AI scientist at\nFacebook, where he works tirelessly towards new breakthroughs.\"\"\"\ntext = text.replace(\"\\n\", \" \")\ntext = re.sub(\"\\s+\", \" \", text)\nprint(text)\n \nnlp = spacy.load(\"en\")\nentities = extract_entities(nlp, text)\nprint(entities)",
"Yann Le Cun, a native of France was not even 30 when he joined AT&T Bell Laboratories in New Jersey. At Bell Labs, LeCun developed a number of new machine learning methods, including the convolutional neural network—modeled after the visual cortex in animals. Today, he serves as chief AI scientist at Facebook, where he works tirelessly towards new breakthroughs.\n['AT&T Bell Laboratories', 'Bell Labs', 'Facebook']\n"
]
],
[
[
"## Apply to all (preprocessed) text files\n\nThe preprocessing was done in the `12-org-ner-nltk-stanford` notebook. It pulls the first 50 lines of the original file in an attempt to focus on the part of the text that are most likely to contain the ORGs we are interested in, ie, the affiliations of the authors.",
"_____no_output_____"
]
],
[
[
"if not os.path.exists(ORGS_SPACY_DIR):\n os.mkdir(ORGS_SPACY_DIR)",
"_____no_output_____"
],
[
"def get_text(textfile):\n lines = []\n f = open(textfile, \"r\")\n for line in f:\n lines.append(line.strip())\n f.close()\n text = \"\\n\".join(lines)\n return text\n\n\nnum_written = 0\nfor textfile in os.listdir(TEXTFILES_ORG_DIR):\n if num_written % 1000 == 0:\n print(\"orgs extracted from {:d} files\".format(num_written))\n doc_id = int(textfile.split(\".\")[0])\n orgfile = os.path.join(ORGS_SPACY_DIR, \"{:d}.org\".format(doc_id))\n if os.path.exists(orgfile):\n continue\n else:\n text = get_text(os.path.join(TEXTFILES_ORG_DIR, \"{:d}.txt\".format(doc_id)))\n entities = extract_entities(nlp, text)\n entities = list(set(entities))\n forgs = open(orgfile, \"w\")\n for entity in entities:\n forgs.write(\"{:s}\\n\".format(entity))\n forgs.close()\n num_written += 1\nprint(\"orgs extracted from {:d} files, COMPLETE\".format(num_written))",
"orgs extracted from 0 files\norgs extracted from 1000 files\norgs extracted from 2000 files\norgs extracted from 3000 files\norgs extracted from 4000 files\norgs extracted from 5000 files\norgs extracted from 6000 files\norgs extracted from 7000 files\norgs extracted from 7238 files, COMPLETE\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
e79fe0f15ebb7389bd2c90ec2d7ca4b913b75600 | 61,648 | ipynb | Jupyter Notebook | bitcoin-price-prediction-for-20-days-lstm-vs-gru.ipynb | benvictoria17/Machinelearning | 0586d70c5cbec90675b87c57ca062e33b3776b85 | [
"MIT"
] | null | null | null | bitcoin-price-prediction-for-20-days-lstm-vs-gru.ipynb | benvictoria17/Machinelearning | 0586d70c5cbec90675b87c57ca062e33b3776b85 | [
"MIT"
] | null | null | null | bitcoin-price-prediction-for-20-days-lstm-vs-gru.ipynb | benvictoria17/Machinelearning | 0586d70c5cbec90675b87c57ca062e33b3776b85 | [
"MIT"
] | null | null | null | 102.066225 | 41,212 | 0.794105 | [
[
[
"import numpy as np \nimport pandas as pd \nimport matplotlib.pyplot as plt\nfrom datetime import datetime\nfrom keras.models import Sequential\nfrom keras.layers import Dense, LSTM, Dropout, GRU\nfrom keras.layers import *\nfrom sklearn.preprocessing import MinMaxScaler\nfrom sklearn.metrics import mean_squared_error, mean_absolute_error\nfrom sklearn.model_selection import train_test_split\nfrom keras.callbacks import EarlyStopping\nfrom keras.optimizers import Adam, SGD",
"_____no_output_____"
],
[
"df = pd.read_csv(\"https://raw.githubusercontent.com/benvictoria17/AnalyzeStocks/master/dataset/BTC-USD-v2.csv\")\ndf = df.sort_values('Date').reset_index(drop=True)",
"_____no_output_____"
],
[
"df.head()",
"_____no_output_____"
],
[
"df.shape",
"_____no_output_____"
],
[
"df['Close'] = df['Close'].astype(float)\n\nplt.figure(figsize=(20,7))\nplt.plot(df['Date'].values, df['Close'].values, label = 'Bitcoin Stock Price', color = 'red')\nplt.xticks(np.arange(100,df.shape[0],200))\nplt.xlabel('Date')\nplt.ylabel('Close ($)')\nplt.legend()\nplt.show()",
"_____no_output_____"
],
[
"num_shape = 1800\n\ntrain = df.iloc[:num_shape, 1:2].values\ntest = df.iloc[num_shape:, 1:2].values",
"_____no_output_____"
],
[
"sc = MinMaxScaler(feature_range = (0, 1))\ntrain_scaled = sc.fit_transform(train)",
"_____no_output_____"
],
[
"X_train = []\n\n#Price on next day\ny_train = []\n\nwindow = 60\n\nfor i in range(window, num_shape):\n X_train_ = np.reshape(train_scaled[i-window:i, 0], (window, 1))\n X_train.append(X_train_)\n y_train.append(train_scaled[i, 0])\nX_train = np.stack(X_train)\ny_train = np.stack(y_train)",
"_____no_output_____"
],
[
"# Initializing the Recurrent Neural Network\nmodel = Sequential()\n#Adding the first LSTM layer with a sigmoid activation function and some Dropout regularization\n#Units - dimensionality of the output space\n\nmodel.add(LSTM(units = 50, return_sequences = True, input_shape = (X_train.shape[1], 1)))\nmodel.add(Dropout(0.2))\n\nmodel.add(LSTM(units = 50, return_sequences = True))\nmodel.add(Dropout(0.2))\n\nmodel.add(LSTM(units = 50, return_sequences = True))\nmodel.add(Dropout(0.2))\n\nmodel.add(LSTM(units = 50))\nmodel.add(Dropout(0.2))\n\n# Adding the output layer\nmodel.add(Dense(units = 1))\nmodel.summary()",
"Model: \"sequential\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nlstm (LSTM) (None, 60, 50) 10400 \n_________________________________________________________________\ndropout (Dropout) (None, 60, 50) 0 \n_________________________________________________________________\nlstm_1 (LSTM) (None, 60, 50) 20200 \n_________________________________________________________________\ndropout_1 (Dropout) (None, 60, 50) 0 \n_________________________________________________________________\nlstm_2 (LSTM) (None, 60, 50) 20200 \n_________________________________________________________________\ndropout_2 (Dropout) (None, 60, 50) 0 \n_________________________________________________________________\nlstm_3 (LSTM) (None, 50) 20200 \n_________________________________________________________________\ndropout_3 (Dropout) (None, 50) 0 \n_________________________________________________________________\ndense (Dense) (None, 1) 51 \n=================================================================\nTotal params: 71,051\nTrainable params: 71,051\nNon-trainable params: 0\n_________________________________________________________________\n"
],
[
"model.compile(optimizer = 'adam', loss = 'mean_squared_error')\nmodel.fit(X_train, y_train, epochs = 50, batch_size = 32);",
"Epoch 1/50\n55/55 [==============================] - 8s 14ms/step - loss: 0.0245\nEpoch 2/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0045\nEpoch 3/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0034\nEpoch 4/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0025\nEpoch 5/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0019\nEpoch 6/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0024\nEpoch 7/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0022\nEpoch 8/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0022\nEpoch 9/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0021\nEpoch 10/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0015\nEpoch 11/50\n55/55 [==============================] - 1s 16ms/step - loss: 0.0018\nEpoch 12/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0016\nEpoch 13/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0016\nEpoch 14/50\n55/55 [==============================] - 1s 18ms/step - loss: 0.0017\nEpoch 15/50\n55/55 [==============================] - 1s 14ms/step - loss: 0.0016\nEpoch 16/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0012\nEpoch 17/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0012\nEpoch 18/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0019\nEpoch 19/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0014\nEpoch 20/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0014\nEpoch 21/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0012\nEpoch 22/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0015\nEpoch 23/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0015\nEpoch 24/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0015\nEpoch 25/50\n55/55 [==============================] - 1s 14ms/step - loss: 0.0015\nEpoch 26/50\n55/55 [==============================] - 1s 15ms/step - loss: 0.0011\nEpoch 27/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0012\nEpoch 28/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0013\nEpoch 29/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0014\nEpoch 30/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0011\nEpoch 31/50\n55/55 [==============================] - 1s 14ms/step - loss: 0.0013\nEpoch 32/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0011\nEpoch 33/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0010\nEpoch 34/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0013\nEpoch 35/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0011\nEpoch 36/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0011\nEpoch 37/50\n55/55 [==============================] - 1s 13ms/step - loss: 8.6887e-04\nEpoch 38/50\n55/55 [==============================] - 1s 13ms/step - loss: 9.8452e-04\nEpoch 39/50\n55/55 [==============================] - 1s 13ms/step - loss: 8.4599e-04\nEpoch 40/50\n55/55 [==============================] - 1s 14ms/step - loss: 8.6039e-04\nEpoch 41/50\n55/55 [==============================] - 1s 15ms/step - loss: 0.0012\nEpoch 42/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0010\nEpoch 43/50\n55/55 [==============================] - 1s 
13ms/step - loss: 9.8633e-04\nEpoch 44/50\n55/55 [==============================] - 1s 13ms/step - loss: 7.9477e-04\nEpoch 45/50\n55/55 [==============================] - 1s 13ms/step - loss: 9.0478e-04\nEpoch 46/50\n55/55 [==============================] - 1s 13ms/step - loss: 0.0012\nEpoch 47/50\n55/55 [==============================] - 1s 13ms/step - loss: 9.0374e-04\nEpoch 48/50\n55/55 [==============================] - 1s 13ms/step - loss: 8.8376e-04\nEpoch 49/50\n55/55 [==============================] - 1s 13ms/step - loss: 8.4140e-04\nEpoch 50/50\n55/55 [==============================] - 1s 13ms/step - loss: 7.9677e-04\n"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e79fe835ee96b2abde029654477a07cf8439a1ef | 8,940 | ipynb | Jupyter Notebook | Jupyter/PythonCrashCourse2ndEdition/ch6_dictionaries.ipynb | awakun/LearningPython | 578f9290c8065df37ade49abe4b0ab4e6b35a1bd | [
"MIT"
] | null | null | null | Jupyter/PythonCrashCourse2ndEdition/ch6_dictionaries.ipynb | awakun/LearningPython | 578f9290c8065df37ade49abe4b0ab4e6b35a1bd | [
"MIT"
] | null | null | null | Jupyter/PythonCrashCourse2ndEdition/ch6_dictionaries.ipynb | awakun/LearningPython | 578f9290c8065df37ade49abe4b0ab4e6b35a1bd | [
"MIT"
] | null | null | null | 21.911765 | 497 | 0.49217 | [
[
[
"# Dictionaries\r\n\r\n## Working with Dictionaries\r\n\r\n* A collection of key-value pairs where each key is connected to a value.\r\n* Any object you can create in Python can be used as a value in a dictionary.\r\n* Defined with `{}` using `:` to match keys with values and `,` separates pairs:",
"_____no_output_____"
]
],
[
[
"alien_0 = {'color': 'green', 'points': 5}\r\nprint(alien_0)",
"{'color': 'green', 'points': 5}\n"
]
],
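    [
        [
            "* Values don't have to be simple types. For example (a quick illustration, not from the book), a list works as a value too:",
            "_____no_output_____"
        ]
    ],
    [
        [
            "alien_1 = {'color': 'red', 'weapons': ['laser', 'shield']}\r\nprint(alien_1['weapons'][0])",
            "_____no_output_____"
        ]
    ],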
[
[
"## Accessing Values in a Dictionary\r\n\r\n* Access a value by indexing to its key (only if key exists!):",
"_____no_output_____"
]
],
[
[
"print(alien_0['color'])",
"green\n"
],
[
"# Error\r\nprint(alien_0['origin'])",
"_____no_output_____"
]
],
[
[
"* Can also use `get()` with the key as an argument, will return `None` if the key doesn't exist:",
"_____no_output_____"
]
],
[
[
"print(alien_0.get('origin'))",
"None\n"
]
],
[
[
"* `get()` also accepts a second argument, which if provided, will be returned if the key provided as the first argument does not exist:",
"_____no_output_____"
]
],
[
[
"print(alien_0.get('origin','This alien has no origin!'))",
"This alien has no origin!\n"
]
],
[
[
"* Can add to a dictionary by indexing to a new key and assigning it a value:",
"_____no_output_____"
]
],
[
[
"alien_0['x_position'] = 0\r\nalien_0['y_position'] = 25\r\nprint(alien_0)",
"{'color': 'green', 'points': 5, 'x_position': 0, 'y_position': 25}\n"
]
],
[
[
"* Same to modify a value:",
"_____no_output_____"
]
],
[
[
"alien_0['x_position'] = 5\r\nprint(alien_0)",
"{'color': 'green', 'points': 5, 'x_position': 5, 'y_position': 25}\n"
]
],
[
[
"* Remove a key-value pair with `del`:",
"_____no_output_____"
]
],
[
[
"del alien_0['points']\r\nprint(alien_0)",
"{'color': 'green', 'x_position': 5, 'y_position': 25}\n"
]
],
[
[
"## Style\r\n\r\n* Multiline dictionaries:\r\n * Are created with the opening bracket on the first line\r\n * Have key-value pairs each on their own line and indented 1 level\r\n * Closing bracket is at the same indent level.\r\n * Include a comma after the last key-value pair too\r\n\r\n```python\r\nfavorite_languages = {\r\n 'jen': 'python',\r\n 'sarah': 'c',\r\n 'edward': 'ruby',\r\n 'phil': 'python',\r\n }\r\n\r\n# Matthes, Eric. Python Crash Course, 2nd Edition (p. 97). No Starch Press. Kindle Edition. \r\n```\r\n\r\n## Looping Through a Dictionary\r\n\r\n* Can loop through key-value pairs, keys, or values\r\n* To loop through key-value pairs, use `items()` which creates a list of key-value pairs and assign 2 variables to iterate:",
"_____no_output_____"
]
],
[
[
"user_0 = {\r\n 'username': 'dkong',\r\n 'first': 'donkey',\r\n 'last': 'kong',\r\n }\r\n\r\nfor key, value in user_0.items():\r\n print(f\"\\nKey: {key}\")\r\n print(f\"Value: {value}\")",
"\nKey: username\nValue: dkong\n\nKey: first\nValue: donkey\n\nKey: last\nValue: kong\n"
]
],
[
[
"* To loop through the keys of a dictionary, use `keys()`:",
"_____no_output_____"
]
],
[
[
"for key in user_0.keys():\r\n print(key)",
"username\nfirst\nlast\n"
]
],
[
[
"* OR, simply loop through the dictionary like it were a list, as looping through the keys is the default behavior in Python:",
"_____no_output_____"
]
],
[
[
"for key in user_0:\r\n print(key)",
"username\nfirst\nlast\n"
]
],
[
[
"* To loop through values, use the `values()` method:",
"_____no_output_____"
]
],
[
[
"for value in user_0.values():\r\n print(value)",
"dkong\ndonkey\nkong\n"
]
],
[
[
"## Sets\r\n\r\n* Sets are collections where the elements must be unique\r\n* Can use `set()` to return a copy of a list without duplicates\r\n* No specific order.",
"_____no_output_____"
]
],
[
[
"languages = {'python', 'ruby', 'python', 'c'}\r\nprint(set(languages))",
"{'ruby', 'c', 'python'}\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79fefcc854cb7d5e1a5965a1f1dea3d99cca9f0 | 933 | ipynb | Jupyter Notebook | HelloGithub.ipynb | kjarnutowska/ws_matrix | 44a744f0a1ccbb23be8dabcac210e7359eeec8a8 | [
"MIT"
] | null | null | null | HelloGithub.ipynb | kjarnutowska/ws_matrix | 44a744f0a1ccbb23be8dabcac210e7359eeec8a8 | [
"MIT"
] | null | null | null | HelloGithub.ipynb | kjarnutowska/ws_matrix | 44a744f0a1ccbb23be8dabcac210e7359eeec8a8 | [
"MIT"
] | null | null | null | 933 | 933 | 0.708467 | [
[
[
"print(\"Hello Github\")",
"Hello Github\n"
],
[
"",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code"
]
] |
e79ff9f7c13f6e90558d362081cfcd3704791500 | 154,148 | ipynb | Jupyter Notebook | seminar01soln.ipynb | AllenDowney/BayesSeminar | 648325d20507901d3bb7f1328f5a2ac5cb002b7c | [
"MIT"
] | 31 | 2017-03-15T20:30:08.000Z | 2021-07-25T14:09:24.000Z | seminar01soln.ipynb | AllenDowney/BayesSeminar | 648325d20507901d3bb7f1328f5a2ac5cb002b7c | [
"MIT"
] | null | null | null | seminar01soln.ipynb | AllenDowney/BayesSeminar | 648325d20507901d3bb7f1328f5a2ac5cb002b7c | [
"MIT"
] | 14 | 2017-04-07T04:43:01.000Z | 2022-01-03T08:50:35.000Z | 158.915464 | 19,654 | 0.899921 | [
[
[
"Bayesian Statistics Seminar\n===\n\nCopyright 2017 Allen Downey\n\nMIT License: https://opensource.org/licenses/MIT",
"_____no_output_____"
]
],
[
[
"from __future__ import print_function, division\n\n%matplotlib inline\n\nimport warnings\nwarnings.filterwarnings('ignore')\n\nimport math\nimport numpy as np\n\nfrom thinkbayes2 import Pmf, Suite\nimport thinkplot",
"_____no_output_____"
]
],
[
[
"Working with Pmfs\n---\nCreate a Pmf object to represent a six-sided die.",
"_____no_output_____"
]
],
[
[
"d6 = Pmf()",
"_____no_output_____"
]
],
[
[
"A Pmf is a map from possible outcomes to their probabilities.",
"_____no_output_____"
]
],
[
[
"for x in [1,2,3,4,5,6]:\n d6[x] = 1",
"_____no_output_____"
]
],
[
[
"Initially the probabilities don't add up to 1.",
"_____no_output_____"
]
],
[
[
"d6.Print()",
"1 1\n2 1\n3 1\n4 1\n5 1\n6 1\n"
]
],
[
[
"`Normalize` adds up the probabilities and divides through. The return value is the total probability before normalizing.",
"_____no_output_____"
]
],
[
[
"d6.Normalize()",
"_____no_output_____"
]
],
[
[
"Now the Pmf is normalized.",
"_____no_output_____"
]
],
[
[
"d6.Print()",
"1 0.166666666667\n2 0.166666666667\n3 0.166666666667\n4 0.166666666667\n5 0.166666666667\n6 0.166666666667\n"
]
],
[
[
"And we can compute its mean (which only works if it's normalized).",
"_____no_output_____"
]
],
[
[
"d6.Mean()",
"_____no_output_____"
]
],
[
[
"`Random` chooses a random value from the Pmf.",
"_____no_output_____"
]
],
[
[
"d6.Random()",
"_____no_output_____"
]
],
[
[
"`thinkplot` provides methods for plotting Pmfs in a few different styles.",
"_____no_output_____"
]
],
[
[
"thinkplot.Hist(d6)",
"_____no_output_____"
]
],
[
[
"**Exercise 1:** The Pmf object provides `__add__`, so you can use the `+` operator to compute the Pmf of the sum of two dice.\n\nCompute and plot the Pmf of the sum of two 6-sided dice.",
"_____no_output_____"
]
],
[
[
"# Solution\nthinkplot.Hist(d6+d6)",
"_____no_output_____"
]
],
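    [
        [
            "As a quick sanity check (not part of the original exercise): two fair dice should put probability 6/36 ≈ 0.167 on a sum of 7, the most likely outcome.",
            "_____no_output_____"
        ]
    ],
    [
        [
            "# probability that two dice sum to 7; should be 6/36\n(d6 + d6)[7]",
            "_____no_output_____"
        ]
    ],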
[
[
"**Exercise 2:** Suppose I roll two dice and tell you the result is greater than 3.\n\nPlot the Pmf of the remaining possible outcomes and compute its mean.",
"_____no_output_____"
]
],
[
[
"# Solution\n\npmf = d6 + d6\npmf[2] = 0\npmf[3] = 0\npmf.Normalize()\nthinkplot.Hist(pmf)\npmf.Mean()",
"_____no_output_____"
]
],
[
[
"The cookie problem\n---\nCreate a Pmf with two equally likely hypotheses.\n",
"_____no_output_____"
]
],
[
[
"cookie = Pmf(['Bowl 1', 'Bowl 2'])\ncookie.Print()",
"Bowl 1 0.5\nBowl 2 0.5\n"
]
],
[
[
"Update each hypothesis with the likelihood of the data (a vanilla cookie).",
"_____no_output_____"
]
],
[
[
"cookie['Bowl 1'] *= 0.75\ncookie['Bowl 2'] *= 0.5\ncookie.Normalize()",
"_____no_output_____"
]
],
[
[
"Print the posterior probabilities.",
"_____no_output_____"
]
],
[
[
"cookie.Print()",
"Bowl 1 0.6\nBowl 2 0.4\n"
]
],
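    [
        [
            "To see where 0.6 comes from, here is Bayes's theorem spelled out with these numbers: P(Bowl 1 | vanilla) = (0.5)(0.75) / [(0.5)(0.75) + (0.5)(0.5)] = 0.375 / 0.625 = 0.6.",
            "_____no_output_____"
        ]
    ],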
[
[
"**Exercise 3:** Suppose we put the first cookie back, stir, choose again from the same bowl, and get a chocolate cookie.\n\nHint: The posterior (after the first cookie) becomes the prior (before the second cookie).",
"_____no_output_____"
]
],
[
[
"# Solution\n\ncookie['Bowl 1'] *= 0.25\ncookie['Bowl 2'] *= 0.5\ncookie.Normalize()\ncookie.Print()",
"Bowl 1 0.428571428571\nBowl 2 0.571428571429\n"
]
],
[
[
"**Exercise 4:** Instead of doing two updates, what if we collapse the two pieces of data into one update?\n\nRe-initialize `Pmf` with two equally likely hypotheses and perform one update based on two pieces of data, a vanilla cookie and a chocolate cookie.\n\nThe result should be the same regardless of how many updates you do (or the order of updates).",
"_____no_output_____"
]
],
[
[
"# Solution\n\ncookie = Pmf(['Bowl 1', 'Bowl 2'])\ncookie['Bowl 1'] *= 0.75 * 0.25\ncookie['Bowl 2'] *= 0.5 * 0.5\ncookie.Normalize()\ncookie.Print()",
"Bowl 1 0.428571428571\nBowl 2 0.571428571429\n"
]
],
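    [
        [
            "Checking the collapsed update by hand: Bowl 1 gets 0.75 × 0.25 = 0.1875 and Bowl 2 gets 0.5 × 0.5 = 0.25; normalizing gives 0.1875 / 0.4375 ≈ 0.4286, matching the two-update result in either order.",
            "_____no_output_____"
        ]
    ],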
[
[
"## STOP HERE",
"_____no_output_____"
],
[
"## The Euro problem\n\n\n**Exercise 5:** Write a class definition for `Euro`, which extends `Suite` and defines a likelihood function that computes the probability of the data (heads or tails) for a given value of `x` (the probability of heads).\n\nNote that `hypo` is in the range 0 to 100. Here's an outline to get you started.",
"_____no_output_____"
]
],
[
[
"class Euro(Suite):\n \n def Likelihood(self, data, hypo):\n \"\"\" \n hypo is the prob of heads (0-100)\n data is a string, either 'H' or 'T'\n \"\"\"\n return 1",
"_____no_output_____"
],
[
"# Solution\n\nclass Euro(Suite):\n \n def Likelihood(self, data, hypo):\n \"\"\" \n hypo is the prob of heads (0-100)\n data is a string, either 'H' or 'T'\n \"\"\"\n x = hypo / 100\n if data == 'H':\n return x\n else:\n return 1-x",
"_____no_output_____"
]
],
[
[
"We'll start with a uniform distribution from 0 to 100.",
"_____no_output_____"
]
],
[
[
"euro = Euro(range(101))\nthinkplot.Pdf(euro)",
"_____no_output_____"
]
],
[
[
"Now we can update with a single heads:",
"_____no_output_____"
]
],
[
[
"euro.Update('H')\nthinkplot.Pdf(euro)",
"_____no_output_____"
]
],
[
[
"Another heads:",
"_____no_output_____"
]
],
[
[
"euro.Update('H')\nthinkplot.Pdf(euro)",
"_____no_output_____"
]
],
[
[
"And a tails:",
"_____no_output_____"
]
],
[
[
"euro.Update('T')\nthinkplot.Pdf(euro)",
"_____no_output_____"
]
],
[
[
"Starting over, here's what it looks like after 7 heads and 3 tails.",
"_____no_output_____"
]
],
[
[
"euro = Euro(range(101))\n\nfor outcome in 'HHHHHHHTTT':\n euro.Update(outcome)\n\nthinkplot.Pdf(euro)\neuro.MaximumLikelihood()",
"_____no_output_____"
]
],
[
[
"The maximum posterior probability is 70%, which is the observed proportion.\n\nHere are the posterior probabilities after 140 heads and 110 tails.",
"_____no_output_____"
]
],
[
[
"euro = Euro(range(101))\n\nevidence = 'H' * 140 + 'T' * 110\nfor outcome in evidence:\n euro.Update(outcome)\n \nthinkplot.Pdf(euro)",
"_____no_output_____"
]
],
[
[
"The posterior mean s about 56%",
"_____no_output_____"
]
],
[
[
"euro.Mean()",
"_____no_output_____"
]
],
[
[
"So is the value with maximum aposteriori probability (MAP).",
"_____no_output_____"
]
],
[
[
"euro.MAP()",
"_____no_output_____"
]
],
[
[
"The posterior credible interval has a 90% chance of containing the true value (provided that the prior distribution truly represents our background knowledge).",
"_____no_output_____"
]
],
[
[
"euro.CredibleInterval(90)",
"_____no_output_____"
]
],
[
[
"**Exercise 6** The following function makes a `Euro` object with a triangle prior.",
"_____no_output_____"
]
],
[
[
"def TrianglePrior():\n \"\"\"Makes a Suite with a triangular prior.\"\"\"\n suite = Euro(label='triangle')\n for x in range(0, 51):\n suite.Set(x, x)\n for x in range(51, 101):\n suite.Set(x, 100-x) \n suite.Normalize()\n return suite",
"_____no_output_____"
]
],
[
[
"And here's what it looks like.",
"_____no_output_____"
]
],
[
[
"euro1 = Euro(range(101), label='uniform')\neuro2 = TrianglePrior()\nthinkplot.Pdfs([euro1, euro2])\nthinkplot.Config(title='Priors')",
"_____no_output_____"
]
],
[
[
"Update `euro1` and `euro2` with the same data we used before (140 heads and 110 tails) and plot the posteriors.",
"_____no_output_____"
]
],
[
[
"# Solution\n\nevidence = 'H' * 140 + 'T' * 110\nfor outcome in evidence:\n euro1.Update(outcome)\n euro2.Update(outcome)\n\nthinkplot.Pdfs([euro1, euro2])\nthinkplot.Config(title='Posteriors')",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e79ffd32f509a526511598a08ab163a270f14d7a | 1,142 | ipynb | Jupyter Notebook | docs/notebooks/SeleniumKeywords.ipynb | martinRenou/robotkernel | 827435dbaa5a201db4640f92ac2c4a7ead8f50cd | [
"BSD-3-Clause"
] | 56 | 2019-01-19T22:46:34.000Z | 2022-03-03T07:28:19.000Z | docs/notebooks/SeleniumKeywords.ipynb | martinRenou/robotkernel | 827435dbaa5a201db4640f92ac2c4a7ead8f50cd | [
"BSD-3-Clause"
] | 42 | 2019-01-09T18:16:32.000Z | 2022-03-29T20:18:41.000Z | docs/notebooks/SeleniumKeywords.ipynb | admariner/robotkernel | 6b0c3334a4aeb3f7e4f735613cf6be04cd950fd6 | [
"BSD-3-Clause"
] | 9 | 2019-01-25T03:54:49.000Z | 2021-12-05T11:20:59.000Z | 20.763636 | 66 | 0.538529 | [
[
[
"*** Settings ***\n\nLibrary SeleniumLibrary",
"_____no_output_____"
],
[
"*** Keywords ***\n\nOpen singleton browser\n [Documentation]\n ... Open a new browser window on the first call\n ... and selects that window on the subsequent calls.\n [Arguments] ${url}=about:blank\n Open browser ${url} alias=singleton",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code"
]
] |
e79ffe50a6f748ef5a3ab1a400c273a3d3021502 | 221,042 | ipynb | Jupyter Notebook | figs/fig4.ipynb | voytekresearch/alphalogical | 76640ff8cc285b6c689fa91736d1fe4484a9b605 | [
"MIT"
] | null | null | null | figs/fig4.ipynb | voytekresearch/alphalogical | 76640ff8cc285b6c689fa91736d1fe4484a9b605 | [
"MIT"
] | null | null | null | figs/fig4.ipynb | voytekresearch/alphalogical | 76640ff8cc285b6c689fa91736d1fe4484a9b605 | [
"MIT"
] | null | null | null | 243.170517 | 26,712 | 0.910067 | [
[
[
"%matplotlib inline\nimport matplotlib.pyplot as plt\nimport numpy as np\n\nfrom fakespikes.rates import random_boxcars, bursts, boxcar\nfrom fakespikes.util import create_times\n\nfrom pykdf.kdf import save_kdf, load_kdf\nimport numpy as np\nfrom pprint import pprint\n\nfrom foof.util import create_psd\nfrom fakespikes.util import create_times\nfrom bluemass.model import run\nfrom pykdf.kdf import save_kdf, load_kdf\nfrom fakespikes.rates import bursts, constant\nfrom fakespikes.neurons import Spikes\nimport fakespikes.util as sp\n\n\n\nimport seaborn as sns\nsns.set_style('ticks')",
"_____no_output_____"
]
],
[
[
"\n# Examples\n\n## Intro cartoons",
"_____no_output_____"
]
],
[
[
"t = 500e-3\ndt = 1e-3\ntimes = create_times(t, dt)\n\ns = .5\nnoi = np.random.normal(0, s, size=times.shape[0])\n\nf = 10\nr = 1\nro = 2\nn = 1\nl = 11.7e-3\nn_bursts = 2\nstim = boxcar(times, r, 2, l, dt, offset=200e-3) + ro\n\nre = stim + noi\n\nm_post = np.logical_and(times > 0.39, times < 0.42)\nm_pre = np.logical_and(times > 0.36, times < 0.39)",
"_____no_output_____"
],
[
"# One perfect stim\nplt.figure(figsize=(2, 10))\nplt.subplot(311)\nplt.plot(times, stim, 'k', linestyle='--')\n# plt.plot(times, re, 'k', alpha=0.2)\nplt.axis('off')\nplt.ylim(-2, 10)\nplt.xlim(.15, 0.35)",
"_____no_output_____"
],
[
"# Three stim\nstim += boxcar(times, r*2, 2, l, dt, offset=250e-3) \nstim += boxcar(times, r*3, 2, l, dt, offset=300e-3) ",
"_____no_output_____"
],
[
"plt.figure(figsize=(2, 8))\nplt.subplot(311)\nplt.plot(times, stim, 'purple', alpha=0.5)\n# plt.plot(times, re, 'k', alpha=0.2)\nplt.axis('off')\nplt.ylim(-2, 10)\nplt.xlim(.15, 0.35)",
"_____no_output_____"
],
[
"t = 500e-3\ndt = 1e-3\ntimes = create_times(t, dt)\n\ns = 3\nnoi = np.random.normal(0, s, size=times.shape[0])\n\nf = 10\nr = 14\nro = 2\nn = 1\nl = 11.7e-3\nn_bursts = 2\nstim = boxcar(times, r, 2, l, dt, offset=400e-3) + ro\n\n\nre = stim + noi\n\nm_post = np.logical_and(times > 0.39, times < 0.42)\nm_pre = np.logical_and(times > 0.1, times < 0.35)",
"_____no_output_____"
],
[
"# Zooooooom \nplt.figure(figsize=(3, 2))\nplt.plot(times, re, 'k', alpha=0.2)\nplt.plot(times[m_pre], re[m_pre], 'grey', linewidth=4)\nplt.plot(times[m_post], re[m_post], 'purple', linewidth=4, alpha=0.4)\nplt.axvline(x=0.4, color='purple', alpha=0.6, linewidth=3, linestyle='-.')\nplt.axvline(x=0.412, color='purple', alpha=0.6, linewidth=3, linestyle='-.')\n\nplt.xlim(0.1, 0.5)\nplt.axis('off')",
"_____no_output_____"
]
],
[
[
"## Model examples\n\n### Random phase",
"_____no_output_____"
]
],
[
[
"%run /home/ejp/src/bluemass/bm.py ../data/fig4/ ../pars/fig4/mathewson_constant_osc_r72.2222222222.yaml -t 0.5 --sigma 3 --loc r_E\nres1 = load_kdf(\"../data/fig4/result.hdf5\")\nidx1 = load_kdf(\"../data/fig4/index.hdf5\")\n\n%run /home/ejp/src/bluemass/bm.py ../data/fig4/ ../pars/fig4/mathewson_constant_osc_r72.2222222222.yaml -t 0.5 --sigma 3 --loc r_E\nres2 = load_kdf(\"../data/fig4/result.hdf5\")\nidx2 = load_kdf(\"../data/fig4/index.hdf5\")\n\n%run /home/ejp/src/bluemass/bm.py ../data/fig4/ ../pars/fig4/mathewson_constant_osc_r72.2222222222.yaml -t 0.5 --sigma 3 --loc r_E\nres3 = load_kdf(\"../data/fig4/result.hdf5\")\nidx3 = load_kdf(\"../data/fig4/index.hdf5\")\n\n%run /home/ejp/src/bluemass/bm.py ../data/fig4/ ../pars/fig4/mathewson_constant_osc_r72.2222222222.yaml -t 0.5 --sigma 3 --loc r_E\nres4 = load_kdf(\"../data/fig4/result.hdf5\")\nidx4 = load_kdf(\"../data/fig4/index.hdf5\")",
"_____no_output_____"
],
[
"times = res1['times']\nstim = res1['stims'][:,0]\n\nys1 = res1['ys']\nys2 = res2['ys']\nys3 = res3['ys']\nys4 = res4['ys']\n\nre1 = ys1[:, idx1['r_E']]\nre2 = ys2[:, idx2['r_E']]\nre3 = ys3[:, idx3['r_E']]\nre4 = ys4[:, idx4['r_E']]",
"_____no_output_____"
],
[
"plt.figure(figsize=(3, 3))\nplt.plot(times, re1, color='k', linewidth=3)\nplt.plot(times, re2+5, color='k', linewidth=3)\nplt.plot(times, re3+10, color='k', linewidth=3)\nplt.plot(times, re4+15, color='k', linewidth=3)\nplt.axis('off')\nplt.ylim(3, 25)\nplt.xlim(0.2, 0.5)\nplt.axvline(x=0.4, color='purple', alpha=0.7, linewidth=3, linestyle='-.')\nplt.axvline(x=0.412, color='purple', alpha=0.7, linewidth=3, linestyle='-.')",
"_____no_output_____"
],
[
"plt.figure(figsize=(3, 3))\nplt.plot(times, res1['rates'][:, 0] / res1['rates'][:, 0].max() + 0, 'grey', linewidth=3, linestyle='--')\nplt.plot(times, res2['rates'][:, 0] / res2['rates'][:, 0].max() + 1, 'grey', linewidth=3, linestyle='--')\nplt.plot(times, res3['rates'][:, 0] / res3['rates'][:, 0].max() + 2, 'grey', linewidth=3, linestyle='--')\nplt.plot(times, res4['rates'][:, 0] / res4['rates'][:, 0].max() + 3, 'grey', linewidth=3, linestyle='--')\nplt.axvline(x=0.4, color='purple', alpha=0.7, linewidth=3, linestyle='-.')\nplt.axvline(x=0.412, color='purple', alpha=0.7, linewidth=3, linestyle='-.')\nplt.xlim(0.2, 0.5)\nplt.ylim(0, 4.1)\nplt.axis('off')",
"_____no_output_____"
]
],
[
[
"### Locked burst",
"_____no_output_____"
]
],
[
[
"%run /home/ejp/src/bluemass/bm.py ../data/fig4/ ../pars/fig4/mathewson_lockedburst_osc_r72.2222222222.yaml -t 0.7 --sigma 1 --loc r_E\nres = load_kdf(\"../data/fig4/result.hdf5\")\nidx = load_kdf(\"../data/fig4/index.hdf5\")",
"_____no_output_____"
],
[
"times = res['times']\nstim = res['stims'][:,0]\nys = res['ys']\nre = ys[:, idx['r_E']]\nrates = res['rates'][:, 0] / res['rates'][:, 0].max()",
"_____no_output_____"
],
[
"plt.figure(figsize=(4, 2))\nplt.plot(times, re, color='k', linewidth=2)\nplt.axvline(x=0.4, color='purple', alpha=0.7, linewidth=2, linestyle='-.')\nplt.axvline(x=0.412, color='purple', alpha=0.7, linewidth=2, linestyle='-.')\nplt.xlim(0.1, 0.9)\nplt.ylim(3, 10)\nplt.axis('off')",
"_____no_output_____"
],
[
"plt.figure(figsize=(4, 1))\nplt.plot(times, rates,'grey', linewidth=4, linestyle='--')\nplt.axvline(x=0.4, color='purple', alpha=0.7, linewidth=2, linestyle='-.')\nplt.axvline(x=0.412, color='purple', alpha=0.7, linewidth=2, linestyle='-.')\nplt.xlim(0.1, 0.9)\nplt.ylim(0, 1.1)\nplt.axis('off')",
"_____no_output_____"
],
[
"nrns_e = Spikes(125, 1, dt=1e-3, seed=42)\nnrns_i = Spikes(125, 1, dt=1e-3, seed=42+1)\n\ntimes = nrns_e.times\nr = 125\nr_osc = bursts(times, r, 10, 2, min_a=12, offset=0.35)\n\no_e = nrns_e.poisson(r_osc).sum(1)\no_i = nrns_i.poisson(r_osc).sum(1)\n\nplt.figure(figsize=(2, 1))\nplt.plot(times, o_e-o_i, color='k')\n# plt.ylim(-120, 120)\nplt.xlim(0.2, 0.6)\nplt.axvline(x=0.4, color='purple', alpha=0.7, linewidth=2, linestyle='-.')\nplt.axvline(x=0.412, color='purple', alpha=0.7, linewidth=2, linestyle='-.')\nplt.plot(times, r_osc/2 + 10, 'grey', linewidth=3, linestyle='--')\nsub = plt.subplot(111)\nsub.set_frame_on(False)\nsub.get_yaxis().set_visible(False)\nsub.get_xaxis().set_visible(False)",
"_____no_output_____"
]
],
[
[
"# Results\n\n## Phase experiments",
"_____no_output_____"
]
],
[
[
"a1 = load_kdf(\"../data/fig4/a_part1.hdf5\")\n# a2 = load_kdf(\"../data/fig4/a_part2.hdf5\")\na3 = load_kdf(\"../data/fig4/a_part3.hdf5\")\na4 = load_kdf(\"../data/fig4/a_part4.hdf5\")\n# a5 = load_kdf(\"../data/fig4/a_part5.hdf5\")\na6 = load_kdf(\"../data/fig4/a_part6.hdf5\")\n\nb1 = load_kdf(\"../data/fig4/b_part1.hdf5\")\n# b2 = load_kdf(\"../data/fig4/b_part2.hdf5\")\nb3 = load_kdf(\"../data/fig4/b_part3.hdf5\")\nb4 = load_kdf(\"../data/fig4/b_part4.hdf5\")\n# # b5 = load_kdf(\"../data/fig4/b_part5.hdf5\")\nb6 = load_kdf(\"../data/fig4/b_part6.hdf5\")",
"_____no_output_____"
],
[
"pprint(a3.keys())\npprint(a3['hits'].shape)",
"[u'n_stim',\n u'hits',\n u'stims',\n u'd_primes',\n u'misses',\n u'rates',\n u'false_alarms',\n u'correct_rejections']\n(360, 10)\n"
]
],
[
[
"## 2 SD threshold",
"_____no_output_____"
]
],
[
[
"plt.figure(figsize=(3, 3))\n\nr = a3['rates'] / 10.0 # 10 Hz is the noise level\nM = a3['d_primes'].mean(0)\nSD = a3['d_primes'].std(0)\nplt.plot(r, M, color='k', linewidth=3, label='!0 Hz oscillation')\nplt.fill_between(r, M+SD, M-SD, facecolor='black', alpha=0.1)\n\nr = a6['rates'] / 10\nM = a6['d_primes'].mean(0)\nSD = a6['d_primes'].std(0)\nplt.plot(r, M, color='grey', linewidth=3, label=\"Constant\")\nplt.fill_between(r, M+SD, M-SD, facecolor='grey', alpha=0.1)\n\nplt.xlabel(\"Input SNR\")\nplt.ylim(-3, 4)\nplt.ylabel(\"d'\")\nplt.legend(loc='upper left', fancybox=True, framealpha=0.5)\nplt.tight_layout()\nsns.despine()",
"_____no_output_____"
],
[
"plt.figure(figsize=(3, 3))\n\nr = b3['rates'] / 10.0 # 10 Hz is the noise level\nM = b3['d_primes'].mean(0)\nSD = b3['d_primes'].std(0)\nplt.plot(r, M, color='black', linewidth=3, label='2 cycle burst')\nplt.fill_between(r, M+SD, M-SD, facecolor='black', alpha=0.1)\n\nr = b6['rates'] / 10.0\nM = b6['d_primes'].mean(0)\nSD = b6['d_primes'].std(0)\nplt.plot(r, M, color='grey', linewidth=3, label=\"Constant\")\nplt.fill_between(r, M+SD, M-SD, facecolor='grey', alpha=0.1)\n\nplt.xlabel(\"Input SNR\")\nplt.ylim(-3, 4)\nplt.ylabel(\"d'\")\nplt.legend(loc='upper left', fancybox=True, framealpha=0.1)\nplt.tight_layout()\nsns.despine()",
"_____no_output_____"
]
],
[
[
"## 1 SD threshold",
"_____no_output_____"
]
],
[
[
"plt.figure(figsize=(3, 3))\n\n# osc\nr = a1['rates'] / 10.0 # 10 Hz is the noise level\nM = a1['d_primes'].mean(0)\nSD = a1['d_primes'].std(0)\nSEM = SD / np.sqrt(len(SD))\nplt.plot(r, M, color='k', linewidth=3, label='!0 Hz oscillation')\nplt.fill_between(r, M+SEM, M-SEM, facecolor='black', alpha=0.1)\n\n# const\nr = a4['rates'] / 10\nM = a4['d_primes'].mean(0)\nSD = a4['d_primes'].std(0)\nSEM = SD / np.sqrt(len(SD))\nplt.plot(r, M, color='grey', linewidth=3, label=\"Constant\")\nplt.fill_between(r, M+SEM, M-SEM, facecolor='grey', alpha=0.1)\n\n# labels, etc\nplt.xlabel(\"Input SNR\")\nplt.ylim(-1, 3)\nplt.ylabel(\"d'\")\nplt.legend(loc='lower right', fancybox=True, framealpha=0.5)\n\nplt.tight_layout()\nsns.despine()",
"_____no_output_____"
],
[
"plt.figure(figsize=(3, 3))\n\nr = b1['rates'] / 10.0 # 10 Hz is the noise level\nM = b1['d_primes'].mean(0)\nSD = b1['d_primes'].std(0)\nSEM = SD / np.sqrt(len(SD))\n\nplt.plot(r, M, color='black', linewidth=3, label='2 cycle burst')\nplt.fill_between(r, M+SEM, M-SEM, facecolor='black', alpha=0.1)\n\nr = b4['rates'] / 10.0\nM = b4['d_primes'].mean(0)\nSD = b4['d_primes'].std(0)\nplt.plot(r, M, color='grey', linewidth=3, label=\"Constant\")\nplt.fill_between(r, M+SEM, M-SEM, facecolor='grey', alpha=0.1)\n\nplt.xlabel(\"Input SNR\")\nplt.ylim(-1, 3)\nplt.ylabel(\"d'\")\nplt.legend(loc='upper left', fancybox=True, framealpha=0.1)\nplt.tight_layout()\nsns.despine()",
"_____no_output_____"
]
],
[
[
"## Amplitude experiments",
"_____no_output_____"
]
],
[
[
"res = load_kdf(\"../data/fig4/4p.hdf5\")",
"_____no_output_____"
],
[
"plt.figure(figsize=(3, 3))\n\np = res['powers2']\nM = res['d_primes'].mean(0)\nSD = res['d_primes'].std(0)\nSEM = SD / np.sqrt(len(SD))\n\nplt.plot(p, M, color='black', linewidth=3)\nplt.fill_between(p, M+SEM, M-SEM, facecolor='black', alpha=0.1)\nplt.xlabel(\"Rel. power (AU)\")\nplt.ylabel(\"d'\")\nplt.xlim(1, 3)\nplt.axhline(y=0, color='k', linewidth=.1)\nplt.tight_layout()\nsns.despine()",
"_____no_output_____"
],
[
"plt.figure(figsize=(3, 3))\n\np = res['powers2'] / res['pow1']\nleft = res['p_lefts'].mean(0)\nright = res['p_rights'].mean(0)\n\nplt.plot(p, left, color='purple', alpha=0.4, label='left')\nplt.plot(p, right, color='purple', alpha=1, label='right')\nplt.legend(loc='upper left', ncol=2)\nplt.xlabel(\"Rel. bias (AU)\")\nplt.ylabel(\"Choice probability\")\nplt.ylim(0, 1)\nplt.xlim(1, 3)\nplt.axhline(y=0.5, color='k', linewidth=.1)\nplt.tight_layout()\nsns.despine()",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
e7a003afeea450e1247f58506903b7ea27f33765 | 148,204 | ipynb | Jupyter Notebook | samples/notebooks/week06-04-introduction-to-gans.ipynb | gu-ma/ba_218_comppx_h1901 | 7e9a32e44b152c928698657b6699e13a8edd0492 | [
"MIT"
] | null | null | null | samples/notebooks/week06-04-introduction-to-gans.ipynb | gu-ma/ba_218_comppx_h1901 | 7e9a32e44b152c928698657b6699e13a8edd0492 | [
"MIT"
] | null | null | null | samples/notebooks/week06-04-introduction-to-gans.ipynb | gu-ma/ba_218_comppx_h1901 | 7e9a32e44b152c928698657b6699e13a8edd0492 | [
"MIT"
] | null | null | null | 185.486859 | 11,674 | 0.870233 | [
[
[
"# Reference\n\nThis example is taken from the book [DL with Python](https://www.manning.com/books/deep-learning-with-python) by F. Chollet. \n\nAll the notebooks from the book are available for free on [Github](https://github.com/fchollet/deep-learning-with-python-notebooks)\n\nIf you like to run the example locally follow the instructions provided on [Keras website](https://keras.io/#installation)\n\n---",
"_____no_output_____"
]
],
[
[
"import keras\nkeras.__version__",
"Using TensorFlow backend.\n"
]
],
[
[
"# Introduction to generative adversarial networks\n\nThis notebook contains the second code sample found in Chapter 8, Section 5 of [Deep Learning with Python](https://www.manning.com/books/deep-learning-with-python?a_aid=keras&a_bid=76564dff). Note that the original text features far more content, in particular further explanations and figures: in this notebook, you will only find source code and related comments.\n\n---\n[...]",
"_____no_output_____"
],
[
"## A schematic GAN implementation\n\n\nIn what follows, we explain how to implement a GAN in Keras, in its barest form -- since GANs are quite advanced, diving deeply into the \ntechnical details would be out of scope for us. Our specific implementation will be a deep convolutional GAN, or DCGAN: a GAN where the \ngenerator and discriminator are deep convnets. In particular, it leverages a `Conv2DTranspose` layer for image upsampling in the generator.\n\nWe will train our GAN on images from CIFAR10, a dataset of 50,000 32x32 RGB images belong to 10 classes (5,000 images per class). To make \nthings even easier, we will only use images belonging to the class \"frog\".\n\nSchematically, our GAN looks like this:\n\n* A `generator` network maps vectors of shape `(latent_dim,)` to images of shape `(32, 32, 3)`.\n* A `discriminator` network maps images of shape (32, 32, 3) to a binary score estimating the probability that the image is real.\n* A `gan` network chains the generator and the discriminator together: `gan(x) = discriminator(generator(x))`. Thus this `gan` network maps \nlatent space vectors to the discriminator's assessment of the realism of these latent vectors as decoded by the generator.\n* We train the discriminator using examples of real and fake images along with \"real\"/\"fake\" labels, as we would train any regular image \nclassification model.\n* To train the generator, we use the gradients of the generator's weights with regard to the loss of the `gan` model. This means that, at \nevery step, we move the weights of the generator in a direction that will make the discriminator more likely to classify as \"real\" the \nimages decoded by the generator. I.e. we train the generator to fool the discriminator.",
"_____no_output_____"
],
[
"## A bag of tricks\n\n\nTraining GANs and tuning GAN implementations is notoriously difficult. There are a number of known \"tricks\" that one should keep in mind. \nLike most things in deep learning, it is more alchemy than science: these tricks are really just heuristics, not theory-backed guidelines. \nThey are backed by some level of intuitive understanding of the phenomenon at hand, and they are known to work well empirically, albeit not \nnecessarily in every context.\n\nHere are a few of the tricks that we leverage in our own implementation of a GAN generator and discriminator below. It is not an exhaustive \nlist of GAN-related tricks; you will find many more across the GAN literature.\n\n* We use `tanh` as the last activation in the generator, instead of `sigmoid`, which would be more commonly found in other types of models.\n* We sample points from the latent space using a _normal distribution_ (Gaussian distribution), not a uniform distribution.\n* Stochasticity is good to induce robustness. Since GAN training results in a dynamic equilibrium, GANs are likely to get \"stuck\" in all sorts of ways. \nIntroducing randomness during training helps prevent this. We introduce randomness in two ways: 1) we use dropout in the discriminator, 2) \nwe add some random noise to the labels for the discriminator.\n* Sparse gradients can hinder GAN training. In deep learning, sparsity is often a desirable property, but not in GANs. There are two things \nthat can induce gradient sparsity: 1) max pooling operations, 2) ReLU activations. Instead of max pooling, we recommend using strided \nconvolutions for downsampling, and we recommend using a `LeakyReLU` layer instead of a ReLU activation. It is similar to ReLU but it \nrelaxes sparsity constraints by allowing small negative activation values.\n* In generated images, it is common to see \"checkerboard artifacts\" caused by unequal coverage of the pixel space in the generator. To fix \nthis, we use a kernel size that is divisible by the stride size, whenever we use a strided `Conv2DTranpose` or `Conv2D` in both the \ngenerator and discriminator.",
"_____no_output_____"
],
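        [
            "As a tiny, isolated illustration of the label-noise trick (a sketch mirroring the training loop later in this notebook, not text from the book):\n\n```python\nimport numpy as np\n\nbatch_size = 20\n# 1 for generated (\"fake\") images, 0 for real ones, then blur the targets slightly\nlabels = np.concatenate([np.ones((batch_size, 1)),\n                         np.zeros((batch_size, 1))])\nlabels += 0.05 * np.random.random(labels.shape)\n```",
            "_____no_output_____"
        ],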
[
"## The generator\n\n\nFirst, we develop a `generator` model, which turns a vector (from the latent space -- during training it will sampled at random) into a \ncandidate image. One of the many issues that commonly arise with GANs is that the generator gets stuck with generated images that look like \nnoise. A possible solution is to use dropout on both the discriminator and generator.",
"_____no_output_____"
]
],
[
[
"import keras\nfrom keras import layers\nimport numpy as np\n\nlatent_dim = 32\nheight = 32\nwidth = 32\nchannels = 3\n\ngenerator_input = keras.Input(shape=(latent_dim,))\n\n# First, transform the input into a 16x16 128-channels feature map\nx = layers.Dense(128 * 16 * 16)(generator_input)\nx = layers.LeakyReLU()(x)\nx = layers.Reshape((16, 16, 128))(x)\n\n# Then, add a convolution layer\nx = layers.Conv2D(256, 5, padding='same')(x)\nx = layers.LeakyReLU()(x)\n\n# Upsample to 32x32\nx = layers.Conv2DTranspose(256, 4, strides=2, padding='same')(x)\nx = layers.LeakyReLU()(x)\n\n# Few more conv layers\nx = layers.Conv2D(256, 5, padding='same')(x)\nx = layers.LeakyReLU()(x)\nx = layers.Conv2D(256, 5, padding='same')(x)\nx = layers.LeakyReLU()(x)\n\n# Produce a 32x32 1-channel feature map\nx = layers.Conv2D(channels, 7, activation='tanh', padding='same')(x)\ngenerator = keras.models.Model(generator_input, x)\ngenerator.summary()",
"Using TensorFlow backend.\n"
]
],
[
[
"## The discriminator\n\n\nThen, we develop a `discriminator` model, that takes as input a candidate image (real or synthetic) and classifies it into one of two \nclasses, either \"generated image\" or \"real image that comes from the training set\".",
"_____no_output_____"
]
],
[
[
"discriminator_input = layers.Input(shape=(height, width, channels))\nx = layers.Conv2D(128, 3)(discriminator_input)\nx = layers.LeakyReLU()(x)\nx = layers.Conv2D(128, 4, strides=2)(x)\nx = layers.LeakyReLU()(x)\nx = layers.Conv2D(128, 4, strides=2)(x)\nx = layers.LeakyReLU()(x)\nx = layers.Conv2D(128, 4, strides=2)(x)\nx = layers.LeakyReLU()(x)\nx = layers.Flatten()(x)\n\n# One dropout layer - important trick!\nx = layers.Dropout(0.4)(x)\n\n# Classification layer\nx = layers.Dense(1, activation='sigmoid')(x)\n\ndiscriminator = keras.models.Model(discriminator_input, x)\ndiscriminator.summary()\n\n# To stabilize training, we use learning rate decay\n# and gradient clipping (by value) in the optimizer.\ndiscriminator_optimizer = keras.optimizers.RMSprop(lr=0.0008, clipvalue=1.0, decay=1e-8)\ndiscriminator.compile(optimizer=discriminator_optimizer, loss='binary_crossentropy')",
"_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\ninput_2 (InputLayer) (None, 32, 32, 3) 0 \n_________________________________________________________________\nconv2d_5 (Conv2D) (None, 30, 30, 128) 3584 \n_________________________________________________________________\nleaky_re_lu_6 (LeakyReLU) (None, 30, 30, 128) 0 \n_________________________________________________________________\nconv2d_6 (Conv2D) (None, 14, 14, 128) 262272 \n_________________________________________________________________\nleaky_re_lu_7 (LeakyReLU) (None, 14, 14, 128) 0 \n_________________________________________________________________\nconv2d_7 (Conv2D) (None, 6, 6, 128) 262272 \n_________________________________________________________________\nleaky_re_lu_8 (LeakyReLU) (None, 6, 6, 128) 0 \n_________________________________________________________________\nconv2d_8 (Conv2D) (None, 2, 2, 128) 262272 \n_________________________________________________________________\nleaky_re_lu_9 (LeakyReLU) (None, 2, 2, 128) 0 \n_________________________________________________________________\nflatten_1 (Flatten) (None, 512) 0 \n_________________________________________________________________\ndropout_1 (Dropout) (None, 512) 0 \n_________________________________________________________________\ndense_2 (Dense) (None, 1) 513 \n=================================================================\nTotal params: 790,913\nTrainable params: 790,913\nNon-trainable params: 0\n_________________________________________________________________\n"
]
],
[
[
"## The adversarial network\n\nFinally, we setup the GAN, which chains the generator and the discriminator. This is the model that, when trained, will move the generator \nin a direction that improves its ability to fool the discriminator. This model turns latent space points into a classification decision, \n\"fake\" or \"real\", and it is meant to be trained with labels that are always \"these are real images\". So training `gan` will updates the \nweights of `generator` in a way that makes `discriminator` more likely to predict \"real\" when looking at fake images. Very importantly, we \nset the discriminator to be frozen during training (non-trainable): its weights will not be updated when training `gan`. If the \ndiscriminator weights could be updated during this process, then we would be training the discriminator to always predict \"real\", which is \nnot what we want!",
"_____no_output_____"
]
],
[
[
"# Set discriminator weights to non-trainable\n# (will only apply to the `gan` model)\ndiscriminator.trainable = False\n\ngan_input = keras.Input(shape=(latent_dim,))\ngan_output = discriminator(generator(gan_input))\ngan = keras.models.Model(gan_input, gan_output)\n\ngan_optimizer = keras.optimizers.RMSprop(lr=0.0004, clipvalue=1.0, decay=1e-8)\ngan.compile(optimizer=gan_optimizer, loss='binary_crossentropy')",
"_____no_output_____"
]
],
[
[
"## How to train your DCGAN\n\nNow we can start training. To recapitulate, this is schematically what the training loop looks like:\n\n```\nfor each epoch:\n * Draw random points in the latent space (random noise).\n * Generate images with `generator` using this random noise.\n * Mix the generated images with real ones.\n * Train `discriminator` using these mixed images, with corresponding targets, either \"real\" (for the real images) or \"fake\" (for the generated images).\n * Draw new random points in the latent space.\n * Train `gan` using these random vectors, with targets that all say \"these are real images\". This will update the weights of the generator (only, since discriminator is frozen inside `gan`) to move them towards getting the discriminator to predict \"these are real images\" for generated images, i.e. this trains the generator to fool the discriminator.\n```\n\nLet's implement it:",
"_____no_output_____"
]
],
[
[
"import os\nfrom keras.preprocessing import image\n\n# Load CIFAR10 data\n(x_train, y_train), (_, _) = keras.datasets.cifar10.load_data()\n\n# Select frog images (class 6)\nx_train = x_train[y_train.flatten() == 6]\n\n# Normalize data\nx_train = x_train.reshape(\n (x_train.shape[0],) + (height, width, channels)).astype('float32') / 255.\n\niterations = 10000\nbatch_size = 20\nsave_dir = '/home/ubuntu/gan_images/'\n\n# Start training loop\nstart = 0\nfor step in range(iterations):\n # Sample random points in the latent space\n random_latent_vectors = np.random.normal(size=(batch_size, latent_dim))\n\n # Decode them to fake images\n generated_images = generator.predict(random_latent_vectors)\n\n # Combine them with real images\n stop = start + batch_size\n real_images = x_train[start: stop]\n combined_images = np.concatenate([generated_images, real_images])\n\n # Assemble labels discriminating real from fake images\n labels = np.concatenate([np.ones((batch_size, 1)),\n np.zeros((batch_size, 1))])\n # Add random noise to the labels - important trick!\n labels += 0.05 * np.random.random(labels.shape)\n\n # Train the discriminator\n d_loss = discriminator.train_on_batch(combined_images, labels)\n\n # sample random points in the latent space\n random_latent_vectors = np.random.normal(size=(batch_size, latent_dim))\n\n # Assemble labels that say \"all real images\"\n misleading_targets = np.zeros((batch_size, 1))\n\n # Train the generator (via the gan model,\n # where the discriminator weights are frozen)\n a_loss = gan.train_on_batch(random_latent_vectors, misleading_targets)\n \n start += batch_size\n if start > len(x_train) - batch_size:\n start = 0\n\n # Occasionally save / plot\n if step % 100 == 0:\n # Save model weights\n gan.save_weights('gan.h5')\n\n # Print metrics\n print('discriminator loss at step %s: %s' % (step, d_loss))\n print('adversarial loss at step %s: %s' % (step, a_loss))\n\n # Save one generated image\n img = image.array_to_img(generated_images[0] * 255., scale=False)\n img.save(os.path.join(save_dir, 'generated_frog' + str(step) + '.png'))\n\n # Save one real image, for comparison\n img = image.array_to_img(real_images[0] * 255., scale=False)\n img.save(os.path.join(save_dir, 'real_frog' + str(step) + '.png'))",
"discriminator loss at step 0: 0.685675\nadversarial loss at step 0: 0.667591\ndiscriminator loss at step 100: 0.756201\nadversarial loss at step 100: 0.820905\ndiscriminator loss at step 200: 0.699047\nadversarial loss at step 200: 0.776581\ndiscriminator loss at step 300: 0.684602\nadversarial loss at step 300: 0.513813\ndiscriminator loss at step 400: 0.707092\nadversarial loss at step 400: 0.716778\ndiscriminator loss at step 500: 0.686278\nadversarial loss at step 500: 0.741214\ndiscriminator loss at step 600: 0.692786\nadversarial loss at step 600: 0.745891\ndiscriminator loss at step 700: 0.69771\nadversarial loss at step 700: 0.781026\ndiscriminator loss at step 800: 0.69236\nadversarial loss at step 800: 0.748769\ndiscriminator loss at step 900: 0.663193\nadversarial loss at step 900: 0.689923\ndiscriminator loss at step 1000: 0.706922\nadversarial loss at step 1000: 0.741314\ndiscriminator loss at step 1100: 0.682189\nadversarial loss at step 1100: 0.76548\ndiscriminator loss at step 1200: 0.687244\nadversarial loss at step 1200: 0.746018\ndiscriminator loss at step 1300: 0.697884\nadversarial loss at step 1300: 0.766032\ndiscriminator loss at step 1400: 0.691977\nadversarial loss at step 1400: 0.735184\ndiscriminator loss at step 1500: 0.696238\nadversarial loss at step 1500: 0.738426\ndiscriminator loss at step 1600: 0.698334\nadversarial loss at step 1600: 0.741093\ndiscriminator loss at step 1700: 0.70315\nadversarial loss at step 1700: 0.736702\ndiscriminator loss at step 1800: 0.693836\nadversarial loss at step 1800: 0.742768\ndiscriminator loss at step 1900: 0.69059\nadversarial loss at step 1900: 0.741162\ndiscriminator loss at step 2000: 0.696293\nadversarial loss at step 2000: 0.755151\ndiscriminator loss at step 2100: 0.686166\nadversarial loss at step 2100: 0.755129\ndiscriminator loss at step 2200: 0.692612\nadversarial loss at step 2200: 0.772408\ndiscriminator loss at step 2300: 0.704013\nadversarial loss at step 2300: 0.776998\ndiscriminator loss at step 2400: 0.693268\nadversarial loss at step 2400: 0.70731\ndiscriminator loss at step 2500: 0.684289\nadversarial loss at step 2500: 0.742162\ndiscriminator loss at step 2600: 0.700483\nadversarial loss at step 2600: 0.734719\ndiscriminator loss at step 2700: 0.699952\nadversarial loss at step 2700: 0.759745\ndiscriminator loss at step 2800: 0.697416\nadversarial loss at step 2800: 0.733726\ndiscriminator loss at step 2900: 0.697604\nadversarial loss at step 2900: 0.740891\ndiscriminator loss at step 3000: 0.698498\nadversarial loss at step 3000: 0.754564\ndiscriminator loss at step 3100: 0.695516\nadversarial loss at step 3100: 0.759486\ndiscriminator loss at step 3200: 0.693453\nadversarial loss at step 3200: 0.769369\ndiscriminator loss at step 3300: 1.5083\nadversarial loss at step 3300: 0.726621\ndiscriminator loss at step 3400: 0.686934\nadversarial loss at step 3400: 0.747121\ndiscriminator loss at step 3500: 0.689791\nadversarial loss at step 3500: 0.751882\ndiscriminator loss at step 3600: 0.71331\nadversarial loss at step 3600: 0.704916\ndiscriminator loss at step 3700: 0.690504\nadversarial loss at step 3700: 0.853764\ndiscriminator loss at step 3800: 0.688844\nadversarial loss at step 3800: 0.791077\ndiscriminator loss at step 3900: 0.679162\nadversarial loss at step 3900: 0.724979\ndiscriminator loss at step 4000: 0.676585\nadversarial loss at step 4000: 0.69554\ndiscriminator loss at step 4100: 0.693313\nadversarial loss at step 4100: 0.742666\ndiscriminator loss at step 4200: 0.678367\nadversarial loss 
at step 4200: 0.778793\ndiscriminator loss at step 4300: 0.699712\nadversarial loss at step 4300: 0.740457\ndiscriminator loss at step 4400: 0.697605\nadversarial loss at step 4400: 0.755847\ndiscriminator loss at step 4500: 0.710596\nadversarial loss at step 4500: 0.814832\ndiscriminator loss at step 4600: 0.706518\nadversarial loss at step 4600: 0.83636\ndiscriminator loss at step 4700: 0.687217\nadversarial loss at step 4700: 0.775736\ndiscriminator loss at step 4800: 0.769103\nadversarial loss at step 4800: 0.774639\ndiscriminator loss at step 4900: 0.692414\nadversarial loss at step 4900: 0.775192\ndiscriminator loss at step 5000: 0.715357\nadversarial loss at step 5000: 0.775003\ndiscriminator loss at step 5100: 0.703434\nadversarial loss at step 5100: 0.940242\ndiscriminator loss at step 5200: 0.704034\nadversarial loss at step 5200: 0.708327\ndiscriminator loss at step 5300: 0.698559\nadversarial loss at step 5300: 0.730377\ndiscriminator loss at step 5400: 0.684378\nadversarial loss at step 5400: 0.759259\ndiscriminator loss at step 5500: 0.693699\nadversarial loss at step 5500: 0.700122\ndiscriminator loss at step 5600: 0.715242\nadversarial loss at step 5600: 0.808961\ndiscriminator loss at step 5700: 0.689339\nadversarial loss at step 5700: 0.621725\ndiscriminator loss at step 5800: 0.679717\nadversarial loss at step 5800: 0.787711\ndiscriminator loss at step 5900: 0.700126\nadversarial loss at step 5900: 0.742493\ndiscriminator loss at step 6000: 0.692087\nadversarial loss at step 6000: 0.839669\ndiscriminator loss at step 6100: 0.677867\nadversarial loss at step 6100: 0.797158\ndiscriminator loss at step 6200: 0.70392\nadversarial loss at step 6200: 0.842135\ndiscriminator loss at step 6300: 0.688377\nadversarial loss at step 6300: 0.718633\ndiscriminator loss at step 6400: 0.781234\nadversarial loss at step 6400: 0.710833\ndiscriminator loss at step 6500: 0.682696\nadversarial loss at step 6500: 0.739674\ndiscriminator loss at step 6600: 0.693081\nadversarial loss at step 6600: 0.747336\ndiscriminator loss at step 6700: 0.681836\nadversarial loss at step 6700: 0.780143\ndiscriminator loss at step 6800: 0.728136\nadversarial loss at step 6800: 0.838522\ndiscriminator loss at step 6900: 0.660475\nadversarial loss at step 6900: 0.717434\ndiscriminator loss at step 7000: 0.672144\nadversarial loss at step 7000: 0.948783\ndiscriminator loss at step 7100: 0.692428\nadversarial loss at step 7100: 0.837047\ndiscriminator loss at step 7200: 0.731133\nadversarial loss at step 7200: 0.728315\ndiscriminator loss at step 7300: 0.671766\nadversarial loss at step 7300: 0.793155\ndiscriminator loss at step 7400: 0.712387\nadversarial loss at step 7400: 0.807759\ndiscriminator loss at step 7500: 0.68638\nadversarial loss at step 7500: 0.967421\ndiscriminator loss at step 7600: 0.690096\nadversarial loss at step 7600: 0.811904\ndiscriminator loss at step 7700: 0.702784\nadversarial loss at step 7700: 0.867017\ndiscriminator loss at step 7800: 0.674138\nadversarial loss at step 7800: 0.837909\ndiscriminator loss at step 7900: 0.674747\nadversarial loss at step 7900: 0.743664\ndiscriminator loss at step 8000: 0.680357\nadversarial loss at step 8000: 0.810859\ndiscriminator loss at step 8100: 0.688885\nadversarial loss at step 8100: 0.786809\ndiscriminator loss at step 8200: 0.671557\nadversarial loss at step 8200: 0.784159\ndiscriminator loss at step 8300: 0.70359\nadversarial loss at step 8300: 0.95692\ndiscriminator loss at step 8400: 0.720167\nadversarial loss at step 8400: 
1.14066\ndiscriminator loss at step 8500: 0.747376\nadversarial loss at step 8500: 0.630725\ndiscriminator loss at step 8600: 0.688931\nadversarial loss at step 8600: 0.849245\ndiscriminator loss at step 8700: 0.707559\nadversarial loss at step 8700: 0.713202\ndiscriminator loss at step 8800: 0.673593\nadversarial loss at step 8800: 0.832419\ndiscriminator loss at step 8900: 0.6777\nadversarial loss at step 8900: 0.773395\ndiscriminator loss at step 9000: 0.659887\nadversarial loss at step 9000: 0.77255\ndiscriminator loss at step 9100: 0.675182\nadversarial loss at step 9100: 0.749544\ndiscriminator loss at step 9200: 0.687147\nadversarial loss at step 9200: 0.836509\ndiscriminator loss at step 9300: 0.690807\nadversarial loss at step 9300: 0.829561\ndiscriminator loss at step 9400: 0.656649\nadversarial loss at step 9400: 0.788181\ndiscriminator loss at step 9500: 0.703494\nadversarial loss at step 9500: 0.78302\ndiscriminator loss at step 9600: 0.680718\nadversarial loss at step 9600: 0.813078\ndiscriminator loss at step 9700: 0.704956\nadversarial loss at step 9700: 0.761652\ndiscriminator loss at step 9800: 0.673504\nadversarial loss at step 9800: 0.853213\ndiscriminator loss at step 9900: 0.669288\nadversarial loss at step 9900: 0.677691\n"
]
],
[
[
"Let's display a few of our fake images:",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\n\n# Sample random points in the latent space\nrandom_latent_vectors = np.random.normal(size=(10, latent_dim))\n\n# Decode them to fake images\ngenerated_images = generator.predict(random_latent_vectors)\n\nfor i in range(generated_images.shape[0]):\n img = image.array_to_img(generated_images[i] * 255., scale=False)\n plt.figure()\n plt.imshow(img)\n \nplt.show()",
"_____no_output_____"
]
],
[
[
"Froggy with some pixellated artifacts.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
e7a0102cc615247f4bd1c115ff70b7c2a137af0a | 5,646 | ipynb | Jupyter Notebook | examples/private-ai-series/duet_basics/exercise/Exercise_Duet_Basics_Data_Scientist.ipynb | Bhuvan-21/PySyft | 77ffb4fc68f141f2cbf812264c472d7f67004ae1 | [
"Apache-2.0"
] | 2 | 2018-07-23T20:34:10.000Z | 2020-08-01T09:09:09.000Z | examples/private-ai-series/duet_basics/exercise/Exercise_Duet_Basics_Data_Scientist.ipynb | Bhuvan-21/PySyft | 77ffb4fc68f141f2cbf812264c472d7f67004ae1 | [
"Apache-2.0"
] | 5 | 2020-09-11T05:47:12.000Z | 2020-10-13T08:36:17.000Z | examples/private-ai-series/duet_basics/exercise/Exercise_Duet_Basics_Data_Scientist.ipynb | Bhuvan-21/PySyft | 77ffb4fc68f141f2cbf812264c472d7f67004ae1 | [
"Apache-2.0"
] | 1 | 2021-05-22T17:11:42.000Z | 2021-05-22T17:11:42.000Z | 26.018433 | 431 | 0.553489 | [
[
[
"import syft as sy",
"_____no_output_____"
]
],
[
[
"# Part 1: Join the Duet Server the Data Owner connected to\n",
"_____no_output_____"
]
],
[
[
"duet = sy.join_duet(loopback=True)",
"_____no_output_____"
]
],
[
[
"### <img src=\"https://github.com/OpenMined/design-assets/raw/master/logos/OM/mark-primary-light.png\" alt=\"he-black-box\" width=\"100\"/> Checkpoint 0 : Now STOP and run the Data Owner notebook until Checkpoint 1.",
"_____no_output_____"
],
[
"# Part 2: Search for Available Data\n",
"_____no_output_____"
]
],
[
[
"# The data scientist can check the list of searchable data in Data Owner's duet store\nduet.store.pandas",
"_____no_output_____"
],
[
"# Data Scientist finds that there are Heights and Weights of a group of people. There are some analysis he/she can do with them together.\n\nheights_ptr = duet.store[0]\nweights_ptr = duet.store[1]\n\n# heights_ptr is a reference to the height dataset remotely available on data owner's server\nprint(heights_ptr)\n\n# weights_ptr is a reference to the weight dataset remotely available on data owner's server\nprint(weights_ptr)",
"_____no_output_____"
]
],
[
[
"## Calculate BMI (Body Mass Index) and weight status\n\nUsing the heights and weights pointers of the people of Group A, calculate their BMI and get a pointer to their individual BMI. From the BMI pointers, you can check if a person is normal-weight, overweight or obese, without knowing their actual heights and weights and even BMI values.\n\n BMI from 19 to 24 - Normal \n BMI from 25 to 29 - Overweight\n BMI from 30 to 39 - Obese\n\n BMI = [weight (kg) / (height (cm)^2)] x 10,000\n Hint: run duet.torch and find the required operators",
"_____no_output_____"
],
[
"One amazing thing about pointers is that from a pointer to a list of items, we can get the pointers to each item in the list. As example, here we have weights_ptr pointing to the weight-list, but from that we can also get the pointer to each weight and perform computation on each of them without even the knowing the value! Below code will show you how to access the pointers to each weight and height from the list pointer.",
"_____no_output_____"
]
],
[
[
"for i in range(6):\n print(\"Pointer to Weight of person\", i + 1, weights_ptr[i])\n print(\"Pointer to Height of person\", i + 1, heights_ptr[i])",
"_____no_output_____"
],
[
"def BMI_calculator(w_ptr, h_ptr):\n \n bmi_ptr = 0\n\n ##TODO\n\n \"Write your code here for calculating bmi_ptr\"\n\n ###\n\n return bmi_ptr\n\n\ndef weight_status(w_ptr, h_ptr):\n\n status = None\n\n bmi_ptr = BMI_calculator(w_ptr, h_ptr)\n\n ##TODO\n\n \"\"\"Write your code here. \n Possible values for status: \n Normal, \n Overweight, \n Obese, \n Out of range\n \"\"\"\"\"\n\n ###\n \n return status",
"_____no_output_____"
],
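[
"# A hedged, plain-Python sketch of the BMI arithmetic on local numbers (no duet pointers),\n# added only to illustrate the formula above; the thresholds follow the ranges given there.\ndef bmi_local(weight_kg, height_cm):\n    return weight_kg / (height_cm ** 2) * 10000\n\nassert round(bmi_local(70, 175), 1) == 22.9  # 70 kg, 175 cm falls in the normal range",
"_____no_output_____"
],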
[
"for i in range(0, 6):\n bmi_ptr = BMI_calculator(weights_ptr[i], heights_ptr[i])",
"_____no_output_____"
],
[
"statuses = []\nfor i in range(0, 6):\n status = weight_status(weights_ptr[i], heights_ptr[i])\n print(\"Weight of Person\", i + 1, \"is\", status)\n statuses.append(status)",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
e7a011de1455ab5efade3b2eb44f882154404a6a | 79,208 | ipynb | Jupyter Notebook | HW4/HW4.ipynb | slowwavesleep/HSE_ML | 1b9efb19944d8d79f821433345f1684ffd100806 | [
"MIT"
] | 1 | 2020-12-09T16:25:02.000Z | 2020-12-09T16:25:02.000Z | HW4/HW4.ipynb | slowwavesleep/HSE_ML | 1b9efb19944d8d79f821433345f1684ffd100806 | [
"MIT"
] | null | null | null | HW4/HW4.ipynb | slowwavesleep/HSE_ML | 1b9efb19944d8d79f821433345f1684ffd100806 | [
"MIT"
] | null | null | null | 36.926807 | 15,840 | 0.559918 | [
[
[
"from IPython.core.display import display, HTML\ndisplay(HTML(\"<style>.container { width:85% !important; }</style>\"))",
"_____no_output_____"
]
],
[
[
"На нескольких алгоритмах кластеризации, умеющих работать с sparse матрицами, проверьте, что работает лучше Count_Vectorizer или TfidfVectorizer (попробуйте выжать максимум из каждого - попробуйте нграммы, символьные нграммы, разные значения max_features и min_df) (3 балла)\n\nНа нескольких алгоритмах кластеризации проверьте, какое матричное разложение (TruncatedSVD или NMF) работает лучше для кластеризации. (3 балла)\n\nС помощью алгоритмов, умеющих выделять выбросы, попробуйте найти необычные объявления (необычные - это такие, которые непонятно к какой категории можно вообще отнести, что-то с ошибками или вообще какая-то дичь). В этом задании можно использовать любую векторизацию. (4 балла)\n\nИспользуйте те же данные, что и в семинаре (колонки - title и category_name)\n\nДелайте соответствующие вашими ресурсам и потребностям алгоритма подвыборки из всего датасета. Для сравнения используйте любую из метрик, которые есть в семинаре. Оценивать на глаз тоже можно, но тогда нужно объяснить, почему вы считаете одну кластеризацию лучше.\n\nНЕ ЗАБЫВАЙТЕ подбирать параметры в кластеризации. За использование всех параметров по умолчанию, оценка будет снижаться (под использованием всех параметров по умолчанию я имею в виду что-то такое - cluster = DBSCAN())\n\nЕсли получится, используйте метод локтя. (1 бонусный балл)",
"_____no_output_____"
]
],
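[
[
"# An illustrative sketch (added; the parameter values here are arbitrary, not tuned):\n# character n-grams can be enabled in either vectorizer via analyzer='char_wb'.\nfrom sklearn.feature_extraction.text import TfidfVectorizer\n\nchar_tf = TfidfVectorizer(analyzer='char_wb', ngram_range=(3, 5), min_df=3, max_features=2000)",
"_____no_output_____"
]
],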
[
[
"from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer\nimport pandas as pd\nfrom sklearn.decomposition import TruncatedSVD, NMF\nfrom sklearn.cluster import AffinityPropagation, AgglomerativeClustering, DBSCAN, \\\n KMeans, MiniBatchKMeans, Birch, MeanShift, SpectralClustering\nfrom sklearn.metrics import adjusted_rand_score, adjusted_mutual_info_score, \\\n silhouette_score, homogeneity_score, completeness_score, \\\n v_measure_score\nimport matplotlib.pyplot as plt\n\nimport warnings\nwarnings.filterwarnings(\"ignore\")",
"_____no_output_____"
],
[
"data = pd.read_csv('data.csv')",
"_____no_output_____"
],
[
"data = data[['category_name', 'title']]",
"_____no_output_____"
]
],
[
[
"На нескольких алгоритмах кластеризации, умеющих работать с sparse матрицами, проверьте, что работает лучше Count_Vectorizer или TfidfVectorizer (попробуйте выжать максимум из каждого - попробуйте нграммы, символьные нграммы, разные значения max_features и min_df) <br>\n<b>(3 балла)</b>",
"_____no_output_____"
]
],
[
[
"def eval_clusterization(X, y, cluster_labels):\n silhouette = silhouette_score(X, cluster_labels)\n homogeneity = homogeneity_score(y, cluster_labels)\n completeness = completeness_score(y, cluster_labels)\n v_measure = v_measure_score(y, cluster_labels)\n adj_rand = adjusted_rand_score(y, cluster_labels)\n mi_score = adjusted_mutual_info_score(y, cluster_labels)\n print('Clusterization metrics')\n print(f'Silhouette score: {silhouette:.3f}')\n print(f'Homogeneity score: {homogeneity:.3f}')\n print(f'Completeness score: {completeness:.3f}')\n print(f'V-measure: {v_measure:.3f}')\n print(f'Ajusted Rand Index: {adj_rand:.3f}')\n print(f'Adjusted Mutual Information score: {mi_score:.3f}')",
"_____no_output_____"
],
[
"def fit_and_eval(X, y, clusterizer):\n global sample\n clusterizer.fit(X)\n labels = clusterizer.labels_\n eval_clusterization(X, y, labels)\n sample['cluster'] = labels",
"_____no_output_____"
]
],
[
[
"#### Affinity Propagation",
"_____no_output_____"
]
],
[
[
"sample = data.sample(frac=0.01)\ny = sample['category_name']",
"_____no_output_____"
]
],
[
[
"*TfidfVectorizer*",
"_____no_output_____"
]
],
[
[
"tf = TfidfVectorizer(min_df=2, max_df=0.9, max_features=500, ngram_range=(1, 2))\nX_tf = tf.fit_transform(sample['title'])",
"_____no_output_____"
],
[
"cluster = AffinityPropagation(damping=0.7, preference=-2, \n max_iter=400, verbose=2,\n convergence_iter=10)\n\nfit_and_eval(X_tf, y, cluster)",
"Did not converge\nClusterization metrics\nSilhouette score: 0.496\nHomogeneity score: 0.591\nCompleteness score: 0.389\nV-measure: 0.469\nAjusted Rand Index: -0.013\nAdjusted Mutual Information score: 0.198\n"
]
],
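[
[
"A quick check on how V-measure ties the two scores together (a worked example added for clarity, using the numbers above): $V = \\frac{2hc}{h+c} = \\frac{2 \\cdot 0.591 \\cdot 0.389}{0.591 + 0.389} \\approx 0.469$, which matches the reported value.",
"_____no_output_____"
]
],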
[
[
"*CountVectorizer*",
"_____no_output_____"
]
],
[
[
"cv = CountVectorizer(min_df=3, max_df=0.6, max_features=1000)\nX_cv = cv.fit_transform(sample['title'])",
"_____no_output_____"
],
[
"cluster = AffinityPropagation(damping=0.7, preference=-2, \n max_iter=400, verbose=2,\n convergence_iter=10)\n\nfit_and_eval(X_cv, y, cluster)",
"Converged after 244 iterations.\nClusterization metrics\nSilhouette score: 0.481\nHomogeneity score: 0.601\nCompleteness score: 0.371\nV-measure: 0.459\nAjusted Rand Index: -0.010\nAdjusted Mutual Information score: 0.155\n"
]
],
[
[
"Для этого алгоритма оба способа векторизации выдают близкие значения V-measure. У tf преимущество по silhouette score, а у cv по homogeneity. У tf выше показатели completeness и MI score, поэтому, на мой взгляд, в данном случае он лучше подходит для данного алгоритма.",
"_____no_output_____"
],
[
"#### K-means",
"_____no_output_____"
]
],
[
[
"sample = data.sample(frac=0.01)\ny = sample['category_name']",
"_____no_output_____"
]
],
[
[
"*TfidfVectorizer*",
"_____no_output_____"
]
],
[
[
"tf = TfidfVectorizer(min_df=2, max_df=0.8, max_features=500)\nX_tf = tf.fit_transform(sample['title'])",
"_____no_output_____"
],
[
"cluster = KMeans(n_clusters=47, n_jobs=-1, random_state=0)\nfit_and_eval(X_tf, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.224\nHomogeneity score: 0.318\nCompleteness score: 0.406\nV-measure: 0.357\nAjusted Rand Index: -0.001\nAdjusted Mutual Information score: 0.249\n"
]
],
[
[
"*CountVectorizer*",
"_____no_output_____"
]
],
[
[
"cv = CountVectorizer(min_df=3, max_df=0.4, max_features=1000)\nX_cv = cv.fit_transform(sample['title'])",
"_____no_output_____"
],
[
"cluster = KMeans(n_clusters=47, n_jobs=-1, random_state=0)\nfit_and_eval(X_cv, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.184\nHomogeneity score: 0.276\nCompleteness score: 0.405\nV-measure: 0.328\nAjusted Rand Index: 0.003\nAdjusted Mutual Information score: 0.212\n"
]
],
[
[
"У tf выше homogeneity, но ниже completeness. У tf значительно выше silhouette score. MI score также выше. tf здесь опять же лучше.",
"_____no_output_____"
],
[
"Если получится, используйте метод локтя. <br> <b>(1 бонусный балл)</b>",
"_____no_output_____"
]
],
[
[
"def elbow_method(X, clusterizer, left_boundary, right_boundary, step):\n scores = []\n for i in range(left_boundary, right_boundary, step):\n cluster = clusterizer(n_clusters=i, n_jobs=-1, random_state=0)\n cluster.fit(X)\n labels = cluster.labels_\n score = silhouette_score(X, labels)\n scores.append(score)\n plt.figure(figsize=(12, 5))\n plt.plot(list(range(left_boundary, right_boundary, step)), scores)\n return scores",
"_____no_output_____"
],
[
"elbow_method(X_tf, KMeans, 25, 1250, 75)",
"_____no_output_____"
]
],
[
[
"Видим, что в районе 1100 кластеров перестает расти silhouette score.",
"_____no_output_____"
]
],
[
[
"cluster = KMeans(n_clusters=1100, n_jobs=-1, random_state=0)\ncluster.fit(X_tf)\nc_labels = cluster.labels_\neval_clusterization(X_tf, y, c_labels)",
"Clusterization metrics\nSilhouette score: 0.627\nHomogeneity score: 0.742\nCompleteness score: 0.390\nV-measure: 0.511\nAjusted Rand Index: -0.003\nAdjusted Mutual Information score: 0.111\n"
]
],
[
[
"Все метрики действительно повысились. При этом не ясно как интерпретировать такое количество кластеров.",
"_____no_output_____"
],
[
"#### Spectral Clustering",
"_____no_output_____"
]
],
[
[
"sample = data.sample(frac=0.01)\ny = sample['category_name']",
"_____no_output_____"
]
],
[
[
"*TfidfVectorizer*",
"_____no_output_____"
]
],
[
[
"tf = TfidfVectorizer(min_df=2, max_df=0.8, max_features=500)\nX_tf = tf.fit_transform(sample['title'])",
"_____no_output_____"
],
[
"cluster = SpectralClustering(n_clusters=47, n_jobs=-1, random_state=0)\nfit_and_eval(X_tf, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.216\nHomogeneity score: 0.243\nCompleteness score: 0.377\nV-measure: 0.296\nAjusted Rand Index: -0.021\nAdjusted Mutual Information score: 0.171\n"
]
],
[
[
"*CountVectorizer*",
"_____no_output_____"
]
],
[
[
"cv = CountVectorizer(min_df=3, max_df=0.4, max_features=1000)\nX_cv = cv.fit_transform(sample['title'])",
"_____no_output_____"
],
[
"cluster = SpectralClustering(n_clusters=47, n_jobs=-1, random_state=0)\nfit_and_eval(X_cv, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.211\nHomogeneity score: 0.125\nCompleteness score: 0.593\nV-measure: 0.207\nAjusted Rand Index: 0.025\nAdjusted Mutual Information score: 0.086\n"
]
],
[
[
"На этом алгоритме видим явное преимущество tf почти по всем метрикам.",
"_____no_output_____"
],
[
"На нескольких алгоритмах кластеризации проверьте, какое матричное разложение (TruncatedSVD или NMF) работает лучше для кластеризации. <br>\n<b>(3 балла)</b>",
"_____no_output_____"
],
[
"#### Mean Shift",
"_____no_output_____"
]
],
[
[
"sample = data.sample(frac=0.01)\ny = sample['category_name']",
"_____no_output_____"
],
[
"cv = CountVectorizer(min_df=3, max_df=0.6, max_features=2000)\nX_cv = cv.fit_transform(sample['title'])\nsvd = TruncatedSVD(50, random_state=0)\nX_svd = svd.fit_transform(X_cv)",
"_____no_output_____"
]
],
[
[
"*SVD*",
"_____no_output_____"
]
],
[
[
"cluster = MeanShift(cluster_all=False, bandwidth=0.8, n_jobs=-1)\nfit_and_eval(X_svd, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.714\nHomogeneity score: 0.339\nCompleteness score: 0.377\nV-measure: 0.357\nAjusted Rand Index: -0.008\nAdjusted Mutual Information score: 0.212\n"
]
],
[
[
"*NMF*",
"_____no_output_____"
]
],
[
[
"cv = CountVectorizer(min_df=3, max_df=0.6, max_features=2000)\nX_cv = cv.fit_transform(sample['title'])\nnmf = NMF(50, random_state=0)\nX_nmf = nmf.fit_transform(X_cv)",
"_____no_output_____"
],
[
"cluster = MeanShift(cluster_all=False, bandwidth=0.8, n_jobs=-1)\nfit_and_eval(X_nmf, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.608\nHomogeneity score: 0.002\nCompleteness score: 0.307\nV-measure: 0.004\nAjusted Rand Index: 0.000\nAdjusted Mutual Information score: 0.001\n"
]
],
[
[
"Для данного алгоритма значительное преимущество у SVD разложения: V-measure, MI score.",
"_____no_output_____"
],
[
"#### Agglomerative Clustering",
"_____no_output_____"
]
],
[
[
"sample = data.sample(frac=0.05)\ny = sample['category_name']",
"_____no_output_____"
]
],
[
[
"*SVD*",
"_____no_output_____"
]
],
[
[
"cv = CountVectorizer(min_df=3, max_df=0.6, max_features=2000)\nX_cv = cv.fit_transform(sample['title'])\nsvd = TruncatedSVD(50, random_state=0)\nX_svd = svd.fit_transform(X_cv)",
"_____no_output_____"
],
[
"cluster = AgglomerativeClustering(n_clusters=47)\nfit_and_eval(X_svd, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.637\nHomogeneity score: 0.302\nCompleteness score: 0.370\nV-measure: 0.332\nAjusted Rand Index: -0.001\nAdjusted Mutual Information score: 0.282\n"
]
],
[
[
"*NMF*",
"_____no_output_____"
]
],
[
[
"cv = CountVectorizer(min_df=3, max_df=0.6, max_features=2000)\nX_cv = cv.fit_transform(sample['title'])\nnmf = NMF(50, random_state=0)\nX_nmf = nmf.fit_transform(X_cv)",
"_____no_output_____"
],
[
"cluster = AgglomerativeClustering(n_clusters=47)\nfit_and_eval(X_nmf, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.707\nHomogeneity score: 0.292\nCompleteness score: 0.365\nV-measure: 0.324\nAjusted Rand Index: -0.003\nAdjusted Mutual Information score: 0.272\n"
]
],
[
[
"В отличие от предыдущих алгоритмов, здесь NMF в некоторых метриках показывает результат лучше (silhouette score, v-measure), в остальных на почти на равных с SVD. Кажется, в данном случае сложно оценить какое разложение действительно лучше. ",
"_____no_output_____"
],
[
"#### DBSCAN",
"_____no_output_____"
]
],
[
[
"sample = data.sample(frac=0.05)\ny = sample['category_name']",
"_____no_output_____"
],
[
"cv = CountVectorizer(min_df=3, max_df=0.6, max_features=2000)\nX_cv = cv.fit_transform(sample['title'])\nsvd = TruncatedSVD(50, random_state=0)\nX_svd = svd.fit_transform(X_cv)",
"_____no_output_____"
]
],
[
[
"*SVD*",
"_____no_output_____"
]
],
[
[
"cluster = DBSCAN(min_samples=7, eps=0.4, n_jobs=-1)\nfit_and_eval(X_svd, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.672\nHomogeneity score: 0.301\nCompleteness score: 0.343\nV-measure: 0.321\nAjusted Rand Index: -0.010\nAdjusted Mutual Information score: 0.265\n"
]
],
[
[
"*NMF*",
"_____no_output_____"
]
],
[
[
"# с параметрами как у SVD не работает\ncluster = DBSCAN(min_samples=10, eps=0.3)\nfit_and_eval(X_nmf, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.507\nHomogeneity score: 0.001\nCompleteness score: 0.053\nV-measure: 0.002\nAjusted Rand Index: -0.000\nAdjusted Mutual Information score: -0.000\n"
]
],
[
[
"Здесь SVD лучше по всем параметрам.",
"_____no_output_____"
],
[
"Данный раздел можно резюмировать тем, что несмотря на то, что tfidfvectorizer лучше работает на алгоритмах с dense матрицами, в алгоритмах со sparse матрицами лучше себя показывает countvectorizer. Честно говоря, достаточно сложно оценить в чем причина такого различия, но это стоит иметь в виду.",
"_____no_output_____"
],
[
"С помощью алгоритмов, умеющих выделять выбросы, попробуйте найти необычные объявления (необычные - это такие, которые непонятно к какой категории можно вообще отнести, что-то с ошибками или вообще какая-то дичь). В этом задании можно использовать любую векторизацию. <br>\n<b>(4 балла)</b>",
"_____no_output_____"
],
[
"Алгоритмы **DBSCAN** и **Mean Shift** умеют выделять в кластеры те элементы, которые не удалось поместить ни в какие другие кластеры. Строго говоря, это не обязательно то же самое, что выбросы.",
"_____no_output_____"
],
[
"#### DBSCAN",
"_____no_output_____"
]
],
[
[
"sample = data.sample(frac=0.05)\ny = sample['category_name']",
"_____no_output_____"
],
[
"cv = CountVectorizer(min_df=4, max_df=0.6, max_features=2000)\nX_cv = cv.fit_transform(sample['title'])\nsvd = TruncatedSVD(50, random_state=0)\nX_svd = svd.fit_transform(X_cv)",
"_____no_output_____"
]
],
[
[
"Попробовал разные параметры. Увеличение требований к кластерам (больший `min_sample` и меньший `eps`) приводит к увеличению размера кластера `-1`. В результате получается от 600 до нескольких тысяч строк. Изменение параметра `leaf_size` для алгоритмов, поддерживающих его, не влияет на размер данного кластера.",
"_____no_output_____"
]
],
[
[
"cluster = DBSCAN(min_samples=6, eps=0.6, n_jobs=-1, algorithm='kd_tree', leaf_size=30)\nfit_and_eval(X_svd, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.431\nHomogeneity score: 0.253\nCompleteness score: 0.338\nV-measure: 0.289\nAjusted Rand Index: 0.007\nAdjusted Mutual Information score: 0.176\n"
]
],
[
[
"Не удалось подобрать такую комбинацию параметров, чтобы в кластере -1 было бы менее 600 строк.",
"_____no_output_____"
]
],
[
[
"len(sample.loc[sample.cluster == -1])",
"_____no_output_____"
]
],
[
[
"Кажется, что это обычные объявления. Сложно назвать это выбросами.",
"_____no_output_____"
]
],
[
[
"sample.loc[sample.cluster == -1].head(10)",
"_____no_output_____"
]
],
[
[
"#### Mean Shift",
"_____no_output_____"
]
],
[
[
"sample = data.sample(frac=0.01)\ny = sample['category_name']",
"_____no_output_____"
],
[
"cv = CountVectorizer(min_df=3, max_df=0.6, max_features=2000)\nX_cv = cv.fit_transform(sample['title'])\nsvd = TruncatedSVD(100, random_state=0)\nX_svd = svd.fit_transform(X_cv)",
"_____no_output_____"
],
[
"cluster = MeanShift(cluster_all=False, bandwidth=0.9, n_jobs=-1)\nfit_and_eval(X_svd, y, cluster)",
"Clusterization metrics\nSilhouette score: 0.595\nHomogeneity score: 0.412\nCompleteness score: 0.367\nV-measure: 0.388\nAjusted Rand Index: -0.013\nAdjusted Mutual Information score: 0.199\n"
]
],
[
[
"Опять же достаточно обычные объявления. Какой-то особой странности не наблюдается.",
"_____no_output_____"
]
],
[
[
"len(sample.loc[sample.cluster == -1])",
"_____no_output_____"
],
[
"sample.loc[sample.cluster == -1].head(10)",
"_____no_output_____"
]
],
[
[
"Вообще для определения именно выбросов есть специальные алгоритмы.",
"_____no_output_____"
]
],
[
[
"from sklearn.ensemble import IsolationForest\nfrom sklearn.neighbors import LocalOutlierFactor",
"_____no_output_____"
],
[
"iforest = IsolationForest(random_state=0)\nlof = LocalOutlierFactor(n_neighbors=30)",
"_____no_output_____"
],
[
"sample['forest'] = iforest.fit_predict(X_cv)\nsample['lof'] = lof.fit_predict(X_cv)",
"_____no_output_____"
]
],
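[
[
"# Note (added, illustrative): both estimators flag suspected outliers with the label -1.\n# If needed, the expected share of outliers can be tuned via the contamination parameter, e.g.:\n# iforest = IsolationForest(contamination=0.01, random_state=0)\n# lof = LocalOutlierFactor(n_neighbors=30, contamination=0.01)",
"_____no_output_____"
]
],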
[
[
"Однако, и они не выдают ничего интересного. IsolationForest в основном выбирает недвижимость.",
"_____no_output_____"
]
],
[
[
"sample.loc[sample.forest == -1].category_name.value_counts()",
"_____no_output_____"
],
[
"sample.loc[sample.forest == -1].head(10)",
"_____no_output_____"
]
],
[
[
"LocalOutlierFactor скорее стремится к одежде.",
"_____no_output_____"
]
],
[
[
"sample.loc[sample.lof == -1].category_name.value_counts()",
"_____no_output_____"
],
[
"sample.loc[sample.lof == -1].head(10)",
"_____no_output_____"
]
],
[
[
"Согласны они в небольшом количестве случаев.",
"_____no_output_____"
]
],
[
[
"sample.loc[(sample.lof == -1) & (sample.forest == -1)]",
"_____no_output_____"
]
],
[
[
"Как видим, ни одним из методов не получается достать ничего необычного.",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
e7a01dbed148981a9d50c937afab0c5ec31b93f0 | 73,163 | ipynb | Jupyter Notebook | Lab_Qa_R.ipynb | Andrealioli/Lab_aph_Qa_Sf | bfbd7f1f191a101edf159f380d690317ccd6a1e6 | [
"MIT"
] | null | null | null | Lab_Qa_R.ipynb | Andrealioli/Lab_aph_Qa_Sf | bfbd7f1f191a101edf159f380d690317ccd6a1e6 | [
"MIT"
] | null | null | null | Lab_Qa_R.ipynb | Andrealioli/Lab_aph_Qa_Sf | bfbd7f1f191a101edf159f380d690317ccd6a1e6 | [
"MIT"
] | null | null | null | 178.012165 | 12,860 | 0.872313 | [
[
[
"# Índice do Efeito de Empacotamento ($Q_{a}^*$)",
"_____no_output_____"
],
[
"Para estimar o índice do efeito de empacotamento da Bricaud et al. (2004):\n\n$${Q_{a}}(\\lambda) = a_{ph} (\\lambda) / a_{sol} (\\lambda) $$\n\nOnde $a_{ph} (\\lambda)$ seria medido da amostra e o $a_{sol} (\\lambda) $ seria o coeficiente e absorção caso os pigmentos estivessem na solução.\n\nMas antes é necessário estimar o $a_{sol}$, que é igual a somatória da concentração de cada pigmento vezes seu respectivo coeficiente de absorção especifico :\n\n$$ a_{sol} (\\lambda) = \\sum C_{i}.a_{sol, i}^*(\\lambda) $$",
"_____no_output_____"
],
[
"Gráfico dos $a_{sol, i}^*(\\lambda)$ de cada pigmento dado pelo HPLC:",
"_____no_output_____"
]
],
[
[
"library(repr)\n#Carregando o arquivo que tem os coeficientes de absorção de cada pigmento Bricaud et al (2004)\nbricaud_asol = read.csv(\"Bricaud_et_al_2004.csv\", skip=4, na=\"999\")\n#Padronizando os nomes dos pigmentos\nnames(bricaud_asol)=c(\"lambda\", \"Chla\", \"DVChla\", \"Chlb\", \"DVChlb\", \"Chlc12\", \"Fuco\", \"ButFuco\", \"HexFuco\", \"Perid\", \"Diad\", \"Zea\", \"Allox\", \"betacar\", \"acar\")\noptions(repr.plot.width=6, repr.plot.height=4)\n#Plotando todos os pigmentos no mesmo gráfico\nmatplot(bricaud_asol$lambda, bricaud_asol[,2:15], type=\"l\",ylab=\"\" ,\n xlab=\"Wavelength (nm)\", ylim=c(0,0.08), cex.lab=0.9, cex.axis=0.9, lwd=rep(2,14))\n\nmtext(side=2, line=2.5, expression(a[sol]^{'*'}~('m'^{2}~\"mg\"^{-1})), cex=0.9)",
"_____no_output_____"
]
],
[
[
"Agora vamos pegar um resultado de HPLC hipotético e estimar o $a_{sol}(\\lambda)$:",
"_____no_output_____"
]
],
[
[
"HPLC = data.frame(\"Chla\"=1, \"DVChla\"=0.001, \n \"Chlb\"=0.03, \"DVChlb\"=0.01, \"Chlc12\"=0.3, \"Fuco\"=1.1, \n \"ButFuco\"=0.001, \"HexFuco\"=0.005, \"Perid\"=0.5, \"Diad\"=0.01,\n \"Zea\"=0.05, \"Allox\"=0.5, \"betacar\"=1, \"acar\"=0.3)\nasol=data.frame(wv=bricaud_asol$lambda)\nfor (i in names(HPLC)){\n# multiplicado cada concentração e pigmento coms eu respectivo coeficiente de absorção específico\n asol=cbind(asol, HPLC[,i]*bricaud_asol[,i])\n}\n#Somatória dos asol de todos os pigmentos para cada comprimento de onda \nasol_t = rowSums(asol[,-1], na.rm = T)\noptions(repr.plot.width=6, repr.plot.height=4)\nplot(asol$wv, asol_t, type=\"l\", xlab=\"Wavelength (nm)\", ylab=\"\", cex.lab=0.9, cex.axis=0.9, lwd=2)\nmtext(side=2, line=2.5, expression(a[sol]~('m'^{-1})), cex=0.9)",
"_____no_output_____"
],
[
"aph=asol_t*0.8\noptions(repr.plot.width=6, repr.plot.height=4)\nmatplot(asol$wv, cbind(asol_t, aph), type=\"l\", lw=c(2,2), \n col=c(\"black\", \"green\"), lt=c(1,2), xlab=\"Wavelength (nm)\", ylab=\"\")\nmtext(side=2, line=2.5, expression(a~('m'^{-1})))\nlegend(\"topright\", legend=c(expression(a[sol]), \n expression(a[ph])),lt=c(1,2), lw=c(2,2), col=c(\"black\", \"green\"), y.intersp=2, bty=\"n\")",
"_____no_output_____"
]
],
[
[
"## Os termos faltantes na reconstrução da Bricaud et al. (2004) e a solução proposta\n\n\n",
"_____no_output_____"
],
[
"Os autores observaram que muitas vezes os espectros reconstruídos eram menores do que os espectros medidos nas amostras (o que não é esperado uma vez que daria resultados de $Q_a$ maiores do que 1). Como esses erros pareciam ser sistemáticos os autores hipotetizaram a existência de pigmentos faltantes, que não são obtidos pelo HPLC. \nDessa forma estimaram o termo faltante por meio de uma relação empírica com a Chl-*a*:\n\n$$a_{miss}(440)= 0.0525[TChla]^{0.855}$$\n\nO número de amostras usadas para chegar nessa relação empírica foi pequeno (n=14), por isso, os autores alertam que os valores estimados não devem ser considerados como $Q_a$ absolutos, por isso, foi usado o termo $Q_a^*$.",
"_____no_output_____"
],
[
"Estimando o $Q_{a}^*(440)$ com o termo faltante:",
"_____no_output_____"
]
],
[
[
"a_todos = data.frame(wv=seq(400,700,2), \"asol_t\"=asol_t)\na_sol_440 = a_todos[which(a_todos$wv==440), \"asol_t\"]+(0.0525*(HPLC$Chla + HPLC$DVChla)^0.855)\naph_440 = a_sol_440*0.8\n##Considerando o termo faltante\nQa_440_miss = aph_440/a_sol_440 \nQa_440_miss\n##Sem considerar o termo faltante\nQa_440 = aph_440/a_todos[which(a_todos$wv==440), \"asol_t\"]\nQa_440\n",
"_____no_output_____"
]
],
[
[
"## $Q_{a}$ em diferentes tamanhos de células ",
"_____no_output_____"
],
[
"Seguindo o que é apresentado do artigo Bricaud et al. (2004), plotar o $Q_{a}^*$ considerando coeficientes da absorção do conteúdo celular ($acm$) diferentes, nesse exemplo vamos considerar no $\\lambda = 440nm$",
"_____no_output_____"
]
],
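[
[
"For reference (an annotation added for clarity), the code below evaluates the homogeneous-sphere expressions used by Bricaud et al. (2004), with $\\rho' = a_{cm}d$:\n\n$$Q_{a}(\\rho') = 1 + \\frac{2e^{-\\rho'}}{\\rho'} + \\frac{2(e^{-\\rho'}-1)}{\\rho'^{2}}, \\qquad Q_{a}^{*}(440) = \\frac{3}{2}\\frac{Q_{a}(\\rho')}{\\rho'}$$",
"_____no_output_____"
]
],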
[
[
"#Coeficiente de absorção do conteudo celular no 440nm\nacm.440.1=5*10^4 #menos absorvente\nacm.440.2= 10^6 #mais absorvente\n\n#Intervalo de diametros das células\nd=(1:50)*10^-6 \n#Estimando o Qa (440) para os diferentes tamanhos\nQa.acm.1 = 1+(2*exp(-acm.440.1*d)/(acm.440.1*d)+2*(exp(-acm.440.1*d)-1)/(acm.440.1*d)^2)\nQa.acm.2 = 1+ (2*exp(-acm.440.2*d)/(acm.440.2*d)+2*(exp(-acm.440.2*d)-1)/(acm.440.2*d)^2)\n\nQa.1=(3/2)*Qa.acm.1/(acm.440.1*d)\nQa.2=(3/2)*Qa.acm.2/(acm.440.2*d)\n\ndf<- data.frame(d=1:50,Qa.1= Qa.1, Qa.2= Qa.2)\noptions(repr.plot.width=4, repr.plot.height=4)\n#Plotando as curvas\nmatplot(df$d, df[, c(\"Qa.1\", \"Qa.2\")], type=\"l\", log=\"xy\", xlab=\"\", ylab=\"\", lwd=c(2,2))\nmtext(side=2, line=2.5, expression(Q[a]^{'*'}~(440)))\nmtext(side=1, line=2.5, expression(\"Diameter\"~(mu~m)))\nlegend(\"bottomleft\", legend=c(expression(a[\"cm1\"]~\"(440)=\"~5~\".\"~10^{4}~(m^{-1})), expression(a[\"cm2\"]~\"(440)=\"~10^{6}~(m^{-1}))), \n col=c(\"black\", \"red\"), lty=c(1,2), lwd=c(2,2), bty=\"n\", y.intersp=2)\n \n\n",
"_____no_output_____"
]
],
[
[
"# Índice de tamanho ($S_{f}$)",
"_____no_output_____"
],
[
"O índice de tamanho foi elaborado com fundamento teórico que células apresentariam maior indice de empacotamento e teriam uma curva mais achatada para o coeficiente de absorção específico ($a_{ph}^*$) (Ciotti et al., 2002). Experimentalmente Ciotti et al. (2002) obtiveram curvas bases de referencia para amostras dominadas por picoplancton e outra para amostras dominadas por microplancton. Sendo o $S_{f}$ um indice que indicaria a proporção dessas classes de tamanho para a comunidade amostrada. Seguindo a seguinte equação:\n\n$$\\hat{a}_{ph} = [S_{f} . \\bar{a}_{pico}(\\lambda)]+[(1-S_{f}) . \\bar{a}_{micro}(\\lambda)]$$\n\nonde $\\hat{a}_{ph}$ é o coeficiente de absorção do fitoplancton normalizado e $\\bar{a}_{pico}$ e $\\bar{a}_{micro}$ são os vetores bases obtidos por Ciotti et al. (2002, 2006) para o pico e microplâncton, respectivamente.",
"_____no_output_____"
]
],
[
[
"#Vetores base para o pico e micro Ciotti et al(2002,2006)\npico =c(1.7439,1.8264,1.9128,1.9992,2.0895,2.1799,2.2702,2.3684,2.4666,2.5687,2.6669,2.7612,2.8437,2.9183,2.9890,3.0479,3.1029,3.1500,3.1854,3.2089,3.2247,3.2325,3.2286,3.2168,3.1932,3.1540,3.1029,3.0361,2.9576,2.8712,2.7848,2.6944,2.5137,2.4273,2.3488,2.2781,2.2486,2.2192,2.1720,2.1328,2.1013,2.0660,2.0267,1.9835,1.9285,1.8657,1.7989,1.7203,1.6339,1.5357,1.4336,1.3276,1.2176,1.1076,1.0016,0.8994,0.8013,0.7109,0.6284,0.5538,0.4870,0.4320,0.3782,0.3307,0.2875,0.2486,0.2137,0.1842,0.1599,0.1402,0.1233,0.1080,0.0935,0.0789,0.0656,0.0530,0.0424,0.0344,0.0290,0.0260,0.0258,0.0268,0.0304,0.0320,0.0331,0.0347,0.0355,0.0363,0.0382,0.0401,0.0416,0.0428,0.0432,0.0432,0.0432,0.0424,0.0416,0.0408,0.0408,0.0424,0.0452,0.0503,0.0562,0.0628,0.0695,0.0758,0.0821,0.0880,0.0939,0.1002,0.1060,0.1123,0.1178,0.1229,0.1261,0.1280,0.1288,0.1296,0.1308,0.1331,0.1371,0.1422,0.1493,0.1591,0.1728,0.1909,0.2137,0.2416,0.2757,0.3178,0.3692,0.4281,0.5499,0.6009,0.6324,0.6402,0.6324,0.6245,0.5892,0.5342,0.4674,0.3967,0.3276,0.2635,0.2078,0.1618,0.1249,0.0958,0.0746,0.0601,0.0503)\nmicro=c(1.574,1.584,1.600,1.617,1.633,1.654,1.669,1.674,1.684,1.697,1.708,1.710,1.716,1.737,1.763,1.793,1.812,1.827,1.830,1.834,1.824,1.800,1.771,1.741,1.712,1.685,1.667,1.650,1.641,1.631,1.631,1.623,1.616,1.606,1.592,1.568,1.542,1.509,1.481,1.459,1.437,1.415,1.399,1.387,1.377,1.367,1.349,1.338,1.319,1.301,1.271,1.242,1.222,1.196,1.169,1.141,1.118,1.096,1.075,1.057,1.035,1.013,0.992,0.977,0.959,0.944,0.927,0.909,0.888,0.868,0.847,0.826,0.806,0.785,0.764,0.737,0.711,0.682,0.653,0.626,0.604,0.580,0.555,0.535,0.514,0.501,0.487,0.478,0.475,0.468,0.464,0.459,0.452,0.452,0.449,0.443,0.433,0.424,0.416,0.406,0.401,0.400,0.403,0.408,0.416,0.429,0.443,0.458,0.473,0.487,0.495,0.499,0.504,0.514,0.521,0.525,0.532,0.535,0.534,0.535,0.532,0.528,0.526,0.528,0.538,0.549,0.574,0.605,0.655,0.720,0.798,0.889,0.979,1.068,1.147,1.207,1.243,1.249,1.227,1.174,1.096,1.004,0.893,0.767,0.635,0.516,0.409,0.323,0.253,0.200,0.158)\n#Intervalo do comprimento de ondas\nwv = seq(400,700,2)\n\ndf=data.frame(wv=wv, pico=pico, micro=micro)\n#Plotando o gráfico\nmatplot(df$wv, df[,c(\"pico\", \"micro\")],type=\"l\", xlab=\"\", ylab=\"\", lwd=2)\nmtext(side=2, line=2.5, expression(bar(a)~(\"dimessionless\")))\nmtext(side=1, line=2.5, expression(\"Wavelength\"~(nm)))\nlegend(x=620, y=3.3, legend=c(\"pico\",\"micro\"), col=c(\"black\",\"red\") , lty=c(1,2), bty=\"n\", y.intersp=2)",
"_____no_output_____"
],
[
"pico_esp = pico* 0.023 / 0.5892\nmicro_esp = micro * 0.0086 / 1.249 \ndf_esp = data.frame(wv=wv, pico=pico_esp, micro=micro_esp)\nmatplot(df_esp$wv, df_esp[,c(\"pico\", \"micro\")],type=\"l\", xlab=\"\", ylab=\"\", lwd=2)\nmtext(side=2, line=2.5, expression({a[ph]}^{\"*\"}~(m^{2}~mg^{-1})))\nmtext(side=1, line=2.5, expression(\"Wavelength\"~(nm)))\nlegend(x=620, y=0.13, legend=c(\"pico\",\"micro\"), col=c(\"black\",\"red\") , lty=c(1,2), bty=\"n\", y.intersp=2)",
"_____no_output_____"
]
],
[
[
"Considerando a relação estabelecida podemos simular um uma curva de $a_{ph}^*$ a partir dos vetores base:",
"_____no_output_____"
]
],
[
[
"Sf = 0.4\naph_simulado = (pico_esp*Sf) + ((1-Sf)*micro_esp)\nmatplot(df_esp$wv, df_esp[,c(\"pico\", \"micro\")],type=\"l\", xlab=\"\", ylab=\"\", lwd=2)\nmatlines(df_esp$wv, aph_simulado, col=\"green\", lwd=2)\nmtext(side=2, line=2.5, expression({a[ph]}^{\"*\"}~(m^{2}~mg^{-1})))\nmtext(side=1, line=2.5, expression(\"Wavelength\"~(nm)))\nlegend(x=600, y=0.13, legend=c(\"pico\",\"micro\", \"simulado\"), col=c(\"black\",\"red\", \"green\") , lty=c(1,2,1), lwd=rep(2,3), bty=\"n\", y.intersp=2)",
"_____no_output_____"
]
],
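[
[
"Conversely (a note added for clarity; this is just algebra on the equation above), at any single wavelength the size index can be recovered as $S_{f} = \\frac{\\hat{a}_{ph}(\\lambda) - \\bar{a}_{micro}(\\lambda)}{\\bar{a}_{pico}(\\lambda) - \\bar{a}_{micro}(\\lambda)}$; in practice the fit is typically done over the whole spectrum rather than at a single band.",
"_____no_output_____"
]
],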
[
[
"# Referências",
"_____no_output_____"
],
[
"1. Bricaud, A., Claustre, H., Ras, J., Oubelkheir, K., 2004. Natural variability of phytoplanktonic absorption in oceanic waters: Influence of the size structure of algal populations. J. Geophys. Res. Ocean. 109, 1–12. https://doi.org/10.1029/2004JC002419\n2. Ciotti, A.M., Bricaud, A., 2006. Retrievals of a size parameter for phytoplankton and spectral light absorption by colored detrital matter from water-leaving radiances at SeaWiFS channels in a continental shelf region off Brazil. Limnol. Oceanogr. Methods 4, 237–253. https://doi.org/10.4319/lom.2006.4.237\n3. Ciotti, A.M., Lewis, M.R., Cullen, J.J., 2002. Assessment of the relationships between dominant cell size in natural phytoplankton communities and the spectral shape of the absorption coefficient. Limnol. Oceanogr. 47, 404–417.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
]
] |
e7a03197e0bb0e8bdfec1d7e61ffd70c45c16996 | 49,970 | ipynb | Jupyter Notebook | notebooks/source/bayesian_imputation.ipynb | MarcoGorelli/numpyro | 77be4d8171b0570bcabbebf1456f70ef94f9d0e8 | [
"Apache-2.0"
] | null | null | null | notebooks/source/bayesian_imputation.ipynb | MarcoGorelli/numpyro | 77be4d8171b0570bcabbebf1456f70ef94f9d0e8 | [
"Apache-2.0"
] | null | null | null | notebooks/source/bayesian_imputation.ipynb | MarcoGorelli/numpyro | 77be4d8171b0570bcabbebf1456f70ef94f9d0e8 | [
"Apache-2.0"
] | 1 | 2020-09-18T11:33:33.000Z | 2020-09-18T11:33:33.000Z | 44.93705 | 596 | 0.447068 | [
[
[
"# Bayesian Imputation",
"_____no_output_____"
],
[
"Real-world datasets often contain many missing values. In those situations, we have to either remove those missing data (also known as \"complete case\") or replace them by some values. Though using complete case is pretty straightforward, it is only applicable when the number of missing entries is so small that throwing away those entries would not affect much the power of the analysis we are conducting on the data. The second strategy, also known as [imputation](https://en.wikipedia.org/wiki/Imputation_%28statistics%29), is more applicable and will be our focus in this tutorial.\n\nProbably the most popular way to perform imputation is to fill a missing value with the mean, median, or mode of its corresponding feature. In that case, we implicitly assume that the feature containing missing values has no correlation with the remaining features of our dataset. This is a pretty strong assumption and might not be true in general. In addition, it does not encode any uncertainty that we might put on those values. Below, we will construct a *Bayesian* setting to resolve those issues. In particular, given a model on the dataset, we will\n\n+ create a generative model for the feature with missing value\n+ and consider missing values as unobserved latent variables.",
"_____no_output_____"
]
],
[
[
"!pip install -q numpyro@git+https://github.com/pyro-ppl/numpyro",
"_____no_output_____"
],
[
"# first, we need some imports\nimport os\n\nfrom IPython.display import set_matplotlib_formats\nfrom matplotlib import pyplot as plt\nimport numpy as np\nimport pandas as pd\n\nfrom jax import numpy as jnp\nfrom jax import ops, random\nfrom jax.scipy.special import expit\n\nimport numpyro\nfrom numpyro import distributions as dist\nfrom numpyro.distributions import constraints\nfrom numpyro.infer import MCMC, NUTS, Predictive\n\nplt.style.use(\"seaborn\")\nif \"NUMPYRO_SPHINXBUILD\" in os.environ:\n set_matplotlib_formats(\"svg\")\n\nassert numpyro.__version__.startswith('0.6.0')",
"_____no_output_____"
]
],
[
[
"## Dataset",
"_____no_output_____"
],
[
"The data is taken from the competition [Titanic: Machine Learning from Disaster](https://www.kaggle.com/c/titanic) hosted on [kaggle](https://www.kaggle.com/). It contains information of passengers in the [Titanic accident](https://en.wikipedia.org/wiki/Sinking_of_the_RMS_Titanic) such as name, age, gender,... And our target is to predict if a person is more likely to survive.",
"_____no_output_____"
]
],
[
[
"train_df = pd.read_csv(\n \"https://raw.githubusercontent.com/agconti/kaggle-titanic/master/data/train.csv\"\n)\ntrain_df.info()\ntrain_df.head()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 891 entries, 0 to 890\nData columns (total 12 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 PassengerId 891 non-null int64 \n 1 Survived 891 non-null int64 \n 2 Pclass 891 non-null int64 \n 3 Name 891 non-null object \n 4 Sex 891 non-null object \n 5 Age 714 non-null float64\n 6 SibSp 891 non-null int64 \n 7 Parch 891 non-null int64 \n 8 Ticket 891 non-null object \n 9 Fare 891 non-null float64\n 10 Cabin 204 non-null object \n 11 Embarked 889 non-null object \ndtypes: float64(2), int64(5), object(5)\nmemory usage: 83.7+ KB\n"
]
],
[
[
"Look at the data info, we know that there are missing data at `Age`, `Cabin`, and `Embarked` columns. Although `Cabin` is an important feature (because the position of a cabin in the ship can affect the chance of people in that cabin to survive), we will skip it in this tutorial for simplicity. In the dataset, there are many categorical columns and two numerical columns `Age` and `Fare`. Let's first look at the distribution of those categorical columns:",
"_____no_output_____"
]
],
[
[
"for col in [\"Survived\", \"Pclass\", \"Sex\", \"SibSp\", \"Parch\", \"Embarked\"]:\n print(train_df[col].value_counts(), end=\"\\n\\n\")",
"0 549\n1 342\nName: Survived, dtype: int64\n\n3 491\n1 216\n2 184\nName: Pclass, dtype: int64\n\nmale 577\nfemale 314\nName: Sex, dtype: int64\n\n0 608\n1 209\n2 28\n4 18\n3 16\n8 7\n5 5\nName: SibSp, dtype: int64\n\n0 678\n1 118\n2 80\n5 5\n3 5\n4 4\n6 1\nName: Parch, dtype: int64\n\nS 644\nC 168\nQ 77\nName: Embarked, dtype: int64\n\n"
]
],
[
[
"## Prepare data",
"_____no_output_____"
],
[
"First, we will merge rare groups in `SibSp` and `Parch` columns together. In addition, we'll fill 2 missing entries in `Embarked` by the mode `S`. Note that we can make a generative model for those missing entries in `Embarked` but let's skip doing so for simplicity.",
"_____no_output_____"
]
],
[
[
"train_df.SibSp.clip(0, 1, inplace=True)\ntrain_df.Parch.clip(0, 2, inplace=True)\ntrain_df.Embarked.fillna(\"S\", inplace=True)",
"_____no_output_____"
]
],
[
[
"Looking closer at the data, we can observe that each name contains a title. We know that age is correlated with the title of the name: e.g. those with Mrs. would be older than those with `Miss.` (on average) so it might be good to create that feature. The distribution of titles is:",
"_____no_output_____"
]
],
[
[
"train_df.Name.str.split(\", \").str.get(1).str.split(\" \").str.get(0).value_counts()",
"_____no_output_____"
]
],
[
[
"We will make a new column `Title`, where rare titles are merged into one group `Misc.`.",
"_____no_output_____"
]
],
[
[
"train_df[\"Title\"] = (\n train_df.Name.str.split(\", \")\n .str.get(1)\n .str.split(\" \")\n .str.get(0)\n .apply(lambda x: x if x in [\"Mr.\", \"Miss.\", \"Mrs.\", \"Master.\"] else \"Misc.\")\n)",
"_____no_output_____"
]
],
[
[
"Now, it is ready to turn the dataframe, which includes categorical values, into numpy arrays. We also perform standardization (a good practice for regression models) for `Age` column.",
"_____no_output_____"
]
],
[
[
"title_cat = pd.CategoricalDtype(\n categories=[\"Mr.\", \"Miss.\", \"Mrs.\", \"Master.\", \"Misc.\"], ordered=True\n)\nembarked_cat = pd.CategoricalDtype(categories=[\"S\", \"C\", \"Q\"], ordered=True)\nage_mean, age_std = train_df.Age.mean(), train_df.Age.std()\ndata = dict(\n age=train_df.Age.pipe(lambda x: (x - age_mean) / age_std).values,\n pclass=train_df.Pclass.values - 1,\n title=train_df.Title.astype(title_cat).cat.codes.values,\n sex=(train_df.Sex == \"male\").astype(int).values,\n sibsp=train_df.SibSp.values,\n parch=train_df.Parch.values,\n embarked=train_df.Embarked.astype(embarked_cat).cat.codes.values,\n)\nsurvived = train_df.Survived.values\n# compute the age mean for each title\nage_notnan = data[\"age\"][jnp.isfinite(data[\"age\"])]\ntitle_notnan = data[\"title\"][jnp.isfinite(data[\"age\"])]\nage_mean_by_title = jnp.stack([age_notnan[title_notnan == i].mean() for i in range(5)])",
"_____no_output_____"
]
],
[
[
"## Modelling",
"_____no_output_____"
],
[
"First, we want to note that in NumPyro, the following models\n```python\ndef model1a():\n x = numpyro.sample(\"x\", dist.Normal(0, 1).expand([10])\n```\nand\n```python\ndef model1b():\n x = numpyro.sample(\"x\", dist.Normal(0, 1).expand([10].mask(False))\n numpyro.sample(\"x_obs\", dist.Normal(0, 1).expand([10]), obs=x)\n```\nare equivalent in the sense that both of them have\n\n+ the same latent sites `x` drawn from `dist.Normal(0, 1)` prior,\n+ and the same log densities `dist.Normal(0, 1).log_prob(x)`.\n\nNow, assume that we observed the last 6 values of `x` (non-observed entries take value `NaN`), the typical model will be\n```python\ndef model2a(x):\n x_impute = numpyro.sample(\"x_impute\", dist.Normal(0, 1).expand([4]))\n x_obs = numpyro.sample(\"x_obs\", dist.Normal(0, 1).expand([6]), obs=x[4:])\n x_imputed = jnp.concatenate([x_impute, x_obs])\n```\nor with the usage of `mask`,\n```python\ndef model2b(x):\n x_impute = numpyro.sample(\"x_impute\", dist.Normal(0, 1).expand([4]).mask(False))\n x_imputed = jnp.concatenate([x_impute, x[4:]])\n numpyro.sample(\"x\", dist.Normal(0, 1).expand([10]), obs=x_imputed)\n```",
"_____no_output_____"
],
[
"Both approaches to model the partial observed data `x` are equivalent. For the model below, we will use the latter method.",
"_____no_output_____"
]
],
[
[
"def model(age, pclass, title, sex, sibsp, parch, embarked, survived=None, bayesian_impute=True):\n b_pclass = numpyro.sample(\"b_Pclass\", dist.Normal(0, 1).expand([3]))\n b_title = numpyro.sample(\"b_Title\", dist.Normal(0, 1).expand([5]))\n b_sex = numpyro.sample(\"b_Sex\", dist.Normal(0, 1).expand([2]))\n b_sibsp = numpyro.sample(\"b_SibSp\", dist.Normal(0, 1).expand([2]))\n b_parch = numpyro.sample(\"b_Parch\", dist.Normal(0, 1).expand([3]))\n b_embarked = numpyro.sample(\"b_Embarked\", dist.Normal(0, 1).expand([3]))\n\n # impute age by Title\n isnan = np.isnan(age)\n age_nanidx = np.nonzero(isnan)[0]\n if bayesian_impute:\n age_mu = numpyro.sample(\"age_mu\", dist.Normal(0, 1).expand([5]))\n age_mu = age_mu[title]\n age_sigma = numpyro.sample(\"age_sigma\", dist.Normal(0, 1).expand([5]))\n age_sigma = age_sigma[title]\n age_impute = numpyro.sample(\n \"age_impute\", dist.Normal(age_mu[age_nanidx], age_sigma[age_nanidx]).mask(False)\n )\n age = ops.index_update(age, age_nanidx, age_impute)\n numpyro.sample(\"age\", dist.Normal(age_mu, age_sigma), obs=age)\n else: \n # fill missing data by the mean of ages for each title\n age_impute = age_mean_by_title[title][age_nanidx]\n age = ops.index_update(age, age_nanidx, age_impute)\n\n a = numpyro.sample(\"a\", dist.Normal(0, 1))\n b_age = numpyro.sample(\"b_Age\", dist.Normal(0, 1))\n logits = a + b_age * age\n logits = logits + b_title[title] + b_pclass[pclass] + b_sex[sex]\n logits = logits + b_sibsp[sibsp] + b_parch[parch] + b_embarked[embarked]\n numpyro.sample(\"survived\", dist.Bernoulli(logits=logits), obs=survived)",
"_____no_output_____"
]
],
[
[
"Note that in the model, the prior for `age` is `dist.Normal(age_mu, age_sigma)`, where the values of `age_mu` and `age_sigma` depend on `title`. Because there are missing values in `age`, we will encode those missing values in the latent parameter `age_impute`. Then we can replace `NaN` entries in `age` with the vector `age_impute`.",
"_____no_output_____"
],
[
"## Sampling",
"_____no_output_____"
],
[
"We will use MCMC with NUTS kernel to sample both regression coefficients and imputed values.",
"_____no_output_____"
]
],
[
[
"mcmc = MCMC(NUTS(model), num_warmup=1000, num_samples=1000)\nmcmc.run(random.PRNGKey(0), **data, survived=survived)\nmcmc.print_summary()",
"sample: 100%|██████████| 2000/2000 [00:18<00:00, 110.91it/s, 63 steps of size 6.48e-02. acc. prob=0.94] \n"
]
],
[
[
"To double check that the assumption \"age is correlated with title\" is reasonable, let's look at the infered age by title. Recall that we performed standarization on `age`, so here we need to scale back to original domain.",
"_____no_output_____"
]
],
[
[
"age_by_title = age_mean + age_std * mcmc.get_samples()[\"age_mu\"].mean(axis=0)\ndict(zip(title_cat.categories, age_by_title))",
"_____no_output_____"
]
],
[
[
"The infered result confirms our assumption that `Age` is correlated with `Title`:\n\n+ those with `Master.` title has pretty small age (in other words, they are children in the ship) comparing to the other groups,\n+ those with `Mrs.` title have larger age than those with `Miss.` title (in average).\n\nWe can also see that the result is similar to the actual statistical mean of `Age` given `Title` in our training dataset:",
"_____no_output_____"
]
],
[
[
"train_df.groupby(\"Title\")[\"Age\"].mean()",
"_____no_output_____"
]
],
[
[
"So far so good, we have many information about the regression coefficients together with imputed values and their uncertainties. Let's inspect those results a bit:\n\n+ The mean value `-0.44` of `b_Age` implies that those with smaller ages have better chance to survive.\n+ The mean value `(1.11, -1.07)` of `b_Sex` implies that female passengers have higher chance to survive than male passengers.",
"_____no_output_____"
],
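[
"As a quick scale check (an illustrative calculation added here, not part of the original analysis), logistic coefficients can be read as odds ratios: $e^{-0.44} \\approx 0.64$, i.e. each extra standard deviation of age multiplies the odds of survival by roughly 0.64, while the sex gap corresponds to $e^{1.11 - (-1.07)} \\approx 8.8$ times higher odds for female passengers.",
"_____no_output_____"
],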
[
"## Prediction",
"_____no_output_____"
],
[
"In NumPyro, we can use [Predictive](http://num.pyro.ai/en/stable/utilities.html#numpyro.infer.util.Predictive) utility for making predictions from posterior samples. Let's check how well the model performs on the training dataset. For simplicity, we will get a `survived` prediction for each posterior sample and perform the majority rule on the predictions.",
"_____no_output_____"
]
],
[
[
"posterior = mcmc.get_samples()\nsurvived_pred = Predictive(model, posterior)(random.PRNGKey(1), **data)[\"survived\"]\nsurvived_pred = (survived_pred.mean(axis=0) >= 0.5).astype(jnp.uint8)\nprint(\"Accuracy:\", (survived_pred == survived).sum() / survived.shape[0])\nconfusion_matrix = pd.crosstab(\n pd.Series(survived, name=\"actual\"), pd.Series(survived_pred, name=\"predict\")\n)\nconfusion_matrix / confusion_matrix.sum(axis=1)",
"Accuracy: 0.8271605\n"
]
],
[
[
"This is a pretty good result using a simple logistic regression model. Let's see how the model performs if we don't use Bayesian imputation here.",
"_____no_output_____"
]
],
[
[
"mcmc.run(random.PRNGKey(2), **data, survived=survived, bayesian_impute=False)\nposterior_1 = mcmc.get_samples()\nsurvived_pred_1 = Predictive(model, posterior_1)(random.PRNGKey(2), **data)[\"survived\"]\nsurvived_pred_1 = (survived_pred_1.mean(axis=0) >= 0.5).astype(jnp.uint8)\nprint(\"Accuracy:\", (survived_pred_1 == survived).sum() / survived.shape[0])\nconfusion_matrix = pd.crosstab(\n pd.Series(survived, name=\"actual\"), pd.Series(survived_pred_1, name=\"predict\")\n)\nconfusion_matrix / confusion_matrix.sum(axis=1)\nconfusion_matrix = pd.crosstab(\n pd.Series(survived, name=\"actual\"), pd.Series(survived_pred_1, name=\"predict\")\n)\nconfusion_matrix / confusion_matrix.sum(axis=1)",
"sample: 100%|██████████| 2000/2000 [00:11<00:00, 166.79it/s, 63 steps of size 7.18e-02. acc. prob=0.93] \n"
]
],
[
[
"We can see that Bayesian imputation does a little bit better here.",
"_____no_output_____"
],
[
"**Remark.** When using `posterior` samples to perform prediction on the new data, we need to marginalize out `age_impute` because those imputing values are specific to the training data:\n```python\nposterior.pop(\"age_impute\")\nsurvived_pred = Predictive(model, posterior)(random.PRNGKey(3), **new_data)\n```",
"_____no_output_____"
],
[
"## References\n\n1. McElreath, R. (2016). Statistical Rethinking: A Bayesian Course with Examples in R and Stan.\n2. Kaggle competition: [Titanic: Machine Learning from Disaster](https://www.kaggle.com/c/titanic)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
]
] |
e7a035d247732646179110fc95c8078074f59726 | 14,744 | ipynb | Jupyter Notebook | 2021Q1_DSF/5.- Spark/notebooks/spark_sql/respuestas/extra_02_spark_eda_review_con_respuestas.ipynb | serch86/binder-pyspark-DSF_2021Q1 | e22556125531eeab6113d60d3be010bc5d5c6961 | [
"MIT"
] | null | null | null | 2021Q1_DSF/5.- Spark/notebooks/spark_sql/respuestas/extra_02_spark_eda_review_con_respuestas.ipynb | serch86/binder-pyspark-DSF_2021Q1 | e22556125531eeab6113d60d3be010bc5d5c6961 | [
"MIT"
] | null | null | null | 2021Q1_DSF/5.- Spark/notebooks/spark_sql/respuestas/extra_02_spark_eda_review_con_respuestas.ipynb | serch86/binder-pyspark-DSF_2021Q1 | e22556125531eeab6113d60d3be010bc5d5c6961 | [
"MIT"
] | null | null | null | 14,744 | 14,744 | 0.570605 | [
[
[
"# Global data variables\nSANDBOX_NAME = 'fesc'# Sandbox Name\nDATA_PATH = \"/data/sandboxes/\" + SANDBOX_NAME + \"/data/\"",
"_____no_output_____"
]
],
[
[
" \n\n# Análisis de Datos Exploratorio",
"_____no_output_____"
],
[
"\n\n## Análisis Univariante",
"_____no_output_____"
]
],
[
[
"from pyspark.sql import functions as F",
"_____no_output_____"
],
[
"online_df = spark.read.csv(DATA_PATH + 'online_retail.csv', sep=';', header=True, inferSchema=True)",
"_____no_output_____"
],
[
"online_df.show(2)",
"+---------+---------+--------------------+--------+---------------+---------+----------+--------------+\n|InvoiceNo|StockCode| Description|Quantity| InvoiceDate|UnitPrice|CustomerID| Country|\n+---------+---------+--------------------+--------+---------------+---------+----------+--------------+\n| 536365| 85123A|WHITE HANGING HEA...| 6|01/12/2010 8:26| 2,55| 17850|United Kingdom|\n| 536365| 71053| WHITE METAL LANTERN| 6|01/12/2010 8:26| 3,39| 17850|United Kingdom|\n+---------+---------+--------------------+--------+---------------+---------+----------+--------------+\nonly showing top 2 rows\n\n"
],
[
"# Respuesta\nonline_df_2 = online_df.withColumn('timestamp', F.unix_timestamp(F.col('InvoiceDate'), 'dd/MM/yyyy HH:mm'))\nonline_df_2.show(2)",
"+---------+---------+--------------------+--------+---------------+---------+----------+--------------+----------+\n|InvoiceNo|StockCode| Description|Quantity| InvoiceDate|UnitPrice|CustomerID| Country| timestamp|\n+---------+---------+--------------------+--------+---------------+---------+----------+--------------+----------+\n| 536365| 85123A|WHITE HANGING HEA...| 6|01/12/2010 8:26| 2,55| 17850|United Kingdom|1291191960|\n| 536365| 71053| WHITE METAL LANTERN| 6|01/12/2010 8:26| 3,39| 17850|United Kingdom|1291191960|\n+---------+---------+--------------------+--------+---------------+---------+----------+--------------+----------+\nonly showing top 2 rows\n\n"
],
[
"# Respuesta\nonline_df_3 = online_df_2.withColumn('datetime', F.from_unixtime(F.col('timestamp')))\nonline_df_3.show(2)",
"+---------+---------+--------------------+--------+---------------+---------+----------+--------------+----------+-------------------+\n|InvoiceNo|StockCode| Description|Quantity| InvoiceDate|UnitPrice|CustomerID| Country| timestamp| datetime|\n+---------+---------+--------------------+--------+---------------+---------+----------+--------------+----------+-------------------+\n| 536365| 85123A|WHITE HANGING HEA...| 6|01/12/2010 8:26| 2,55| 17850|United Kingdom|1291191960|2010-12-01 08:26:00|\n| 536365| 71053| WHITE METAL LANTERN| 6|01/12/2010 8:26| 3,39| 17850|United Kingdom|1291191960|2010-12-01 08:26:00|\n+---------+---------+--------------------+--------+---------------+---------+----------+--------------+----------+-------------------+\nonly showing top 2 rows\n\n"
],
[
"# Answer\nonline_df.dtypes",
"_____no_output_____"
]
],
[
[
"\n\nFirst, identify the qualitative and quantitative variables.",
"_____no_output_____"
]
],
[
[
"# Answer\nquantitative_vars = [c for c,t in online_df.dtypes if t in ['int', 'double']]\nqualitative_vars = [c for c,t in online_df.dtypes if t in ['boolean', 'string']]",
"_____no_output_____"
],
[
"# Answer\nquantitative_vars",
"_____no_output_____"
],
[
"# Answer\nqualitative_vars",
"_____no_output_____"
]
],
[
[
"\n\n### Quantitative variables\n\nCompute metrics for a single column",
"_____no_output_____"
]
],
[
[
"# Answer\navgs = [F.avg(col).alias('avg_' + col) for col in quantitative_vars]\nmaxs = [F.max(col).alias('max_' + col) for col in quantitative_vars]\nmins = [F.min(col).alias('min_' + col) for col in quantitative_vars]\nstds = [F.stddev(col).alias('std_' + col) for col in quantitative_vars]",
"_____no_output_____"
],
[
"# Answer\noperations = avgs + stds + maxs + mins\noperations",
"_____no_output_____"
],
[
"# Answer\nresults = online_df.select(operations).first()\n\nfor col in quantitative_vars:\n    \n    avg = results['avg_' + col]\n    std = results['std_' + col]\n    maxi = results['max_' + col]\n    mini = results['min_' + col]\n    \n    print('{}: avg={}, std={}, min={}, max={}'.format(col, round(avg, 2), round(std, 2), mini, maxi))",
"_____no_output_____"
]
],
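
A one-pass alternative worth knowing (a sketch, assuming the same `online_df` from the cells above): Spark's built-in `DataFrame.describe()` returns count, mean, stddev, min and max for the requested columns in a single action.

```python
# Minimal sketch: the built-in summary covers the same metrics computed above.
summary_df = online_df.describe("Quantity", "CustomerID")
summary_df.show()
```
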
[
[
"\n\n### Qualitative variables\n\nFor qualitative variables, frequency tables are computed.",
"_____no_output_____"
],
[
"\n\nCompute the frequency table of the qualitative columns, and sort them in descending order.",
"_____no_output_____"
]
],
[
[
"# Answer\nonline_df.groupBy('Country').count().sort(F.col(\"count\").desc()).show()",
"_____no_output_____"
],
[
"# Answer\nonline_df.groupBy('Country', 'InvoiceDate').count().sort(F.col('count').desc()).show()",
"_____no_output_____"
]
],
[
[
"\n\n## Multivariate Analysis",
"_____no_output_____"
],
[
"\n\n__Correlation matrix__",
"_____no_output_____"
]
],
[
[
"# Answer\nfrom pyspark.mllib.linalg import Vectors\nfrom pyspark.mllib.stat import Statistics\nimport pandas as pd",
"_____no_output_____"
],
[
"# Answer\nonline_df.select(quantitative_vars).rdd.map(lambda v: Vectors.dense(v))",
"_____no_output_____"
],
[
"# Answer\ncorr_matrix = Statistics.corr(online_df.select(quantitative_vars).rdd.map(lambda v: Vectors.dense(v)),\n                              method='pearson')\ncorr_matrix",
"_____no_output_____"
]
],
[
[
"\n\n_Turn the matrix into a pandas DataFrame_",
"_____no_output_____"
]
],
[
[
"# Answer\ndf_corr_matrix = pd.DataFrame(corr_matrix, columns=quantitative_vars, index=quantitative_vars)\ndf_corr_matrix",
"_____no_output_____"
],
[
"# Answer\nimport numpy as np\nmask = np.zeros_like(corr_matrix, dtype=bool)\nmask[np.triu_indices_from(mask)] = True\nmask",
"_____no_output_____"
],
[
"# Answer\ndf_corr_matrix_reduced = df_corr_matrix.mask(mask)\ndf_corr_matrix_reduced",
"_____no_output_____"
],
[
"# Answer\nimport numpy as np\nfrom matplotlib import pyplot as plt\nimport seaborn as sns",
"_____no_output_____"
],
[
"# Answer\n%matplotlib inline",
"_____no_output_____"
],
[
"# Answer\nplt.figure(figsize=(8,7))\nsns.heatmap(df_corr_matrix, cmap='coolwarm', vmin=-1, vmax=1, annot=True, fmt='.2f')\nplt.show()",
"_____no_output_____"
]
],
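
Note that the upper-triangle `mask` built two cells above is never passed to `sns.heatmap`; a small sketch of how it would be used to hide the redundant half of the matrix (assumes `df_corr_matrix` and `mask` from the cells above):

```python
# Sketch: draw only the lower triangle of the correlation matrix.
plt.figure(figsize=(8, 7))
sns.heatmap(df_corr_matrix, mask=mask, cmap='coolwarm', vmin=-1, vmax=1, annot=True, fmt='.2f')
plt.show()
```
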
[
[
"\n\n# Outliers",
"_____no_output_____"
],
[
"\n\n### Outlier detection for variables that follow the normal distribution",
"_____no_output_____"
]
],
[
[
"# Answer\ndef remove_tukey_outliers(df, col):\n    \"\"\"\n    Returns a new dataframe with outliers removed on column 'col' using the Tukey test\n    \"\"\"\n    \n    q1, q3 = df.approxQuantile(col, [0.25, 0.75], 0.01)\n    IQR = q3 - q1\n    \n    min_thresh = q1 - 1.5 * IQR\n    max_thresh = q3 + 1.5 * IQR\n    \n    df_no_outliers = df.filter(F.col(col).between(min_thresh, max_thresh))\n    \n    return df_no_outliers",
"_____no_output_____"
],
[
"# Answer\nonline_df_no_outliers = remove_tukey_outliers(online_df, 'Quantity')",
"_____no_output_____"
],
[
"# Answer\nn_rows = online_df.count()",
"_____no_output_____"
],
[
"# Answer\nn_rows_no = online_df_no_outliers.count()\nperc_outliers = 100 * (n_rows - n_rows_no) / n_rows",
"_____no_output_____"
],
[
"# Answer\nprint('{} has {:.2f}% outliers'.format('Quantity', perc_outliers))",
"_____no_output_____"
]
],
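
The section title mentions variables that follow the normal distribution, while the function above applies the distribution-free Tukey (IQR) rule. For a variable assumed normal, a z-score cut is the usual alternative; a minimal sketch, assuming `online_df` and `F` are in scope:

```python
# Keep rows within 3 standard deviations of the mean of 'Quantity'.
stats = online_df.select(F.avg('Quantity').alias('mu'), F.stddev('Quantity').alias('sigma')).first()
online_df_zscore = online_df.filter(F.abs((F.col('Quantity') - stats['mu']) / stats['sigma']) < 3)
print(online_df_zscore.count())
```
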
[
[
"\n\n# Null values",
"_____no_output_____"
]
],
[
[
"# Answer\ndef remove_nulls(df):\n    df_no_nulls = df\n    \n    for element in df_no_nulls.columns:\n        if df_no_nulls.where(df_no_nulls[element].isNull()).count() != 0:\n            print('\\tThe column \"{}\" has null values'.format(element))\n            df_no_nulls = df_no_nulls.where(df_no_nulls[element].isNotNull())\n        if df_no_nulls.where(df_no_nulls[element].isNull()).count() == 0:\n            print('The column \"{}\" does not have null values'.format(element))\n    \n    return df_no_nulls",
"_____no_output_____"
],
[
"# Answer\ndef check_nulls(df):\n    \n    existing_nulls = False\n    \n    for element in df.columns:\n        if df.where(df[element].isNull()).count() != 0:\n            print('\\tThe column \"{}\" has null values'.format(element))\n            existing_nulls = True\n            break\n        if df.where(df[element].isNull()).count() == 0:\n            print('The column \"{}\" does not have null values'.format(element))\n    \n    return existing_nulls",
"_____no_output_____"
],
[
"# Answer\nprint(online_df.count())\nonline_df_no_nulls = remove_nulls(online_df)\nprint(online_df_no_nulls.count())",
"_____no_output_____"
]
]
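
Each `.count()` in the loops above launches a separate Spark job per column (twice per column, in fact). A sketch of counting the nulls of every column in a single aggregation, assuming `online_df` and `F` from above:

```python
# One Spark job: count null entries per column.
null_counts = online_df.select(
    [F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in online_df.columns]
)
null_counts.show()
```
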
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
e7a045fd9e06b44865500a8ce32def6a3c988dc7 | 715 | ipynb | Jupyter Notebook | lectures/index.ipynb | QuantEcon/python-lecture-sandpit.myst | 8e6c8cb971ab06af12e5364a4a2cb9d0d07ab925 | [
"BSD-3-Clause"
] | null | null | null | lectures/index.ipynb | QuantEcon/python-lecture-sandpit.myst | 8e6c8cb971ab06af12e5364a4a2cb9d0d07ab925 | [
"BSD-3-Clause"
] | 2 | 2020-12-14T07:16:37.000Z | 2021-10-05T00:06:51.000Z | lectures/index.ipynb | QuantEcon/python-lecture-sandpit.myst | 8e6c8cb971ab06af12e5364a4a2cb9d0d07ab925 | [
"BSD-3-Clause"
] | null | null | null | 16.627907 | 34 | 0.504895 | [
[
[
"# Python Lecture Sandpit\n\n```{tableofcontents}\n```",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown"
]
] |
e7a046fb49c4b5fb3694001e4d610afe9e70173e | 5,250 | ipynb | Jupyter Notebook | inteligencia_artificial/03-Variables.ipynb | edwinb-ai/intelicompu | 7ed4c51c789d0b71aac5f507d800cc57ba752fe4 | [
"Apache-2.0"
] | 1 | 2020-01-10T03:22:50.000Z | 2020-01-10T03:22:50.000Z | inteligencia_artificial/03-Variables.ipynb | edwinb-ai/intelicompu | 7ed4c51c789d0b71aac5f507d800cc57ba752fe4 | [
"Apache-2.0"
] | null | null | null | inteligencia_artificial/03-Variables.ipynb | edwinb-ai/intelicompu | 7ed4c51c789d0b71aac5f507d800cc57ba752fe4 | [
"Apache-2.0"
] | null | null | null | 28.846154 | 380 | 0.607429 | [
[
[
"# Variables and _placeholders_",
"_____no_output_____"
]
],
[
[
"import tensorflow as tf\nimport numpy as np",
"_____no_output_____"
]
],
[
[
"_Variables_ and _placeholders_ are the pillars of _TensorFlow_. However, to understand why this is, one must understand a bit more about the general structure of _TensorFlow_ and how it carries out the corresponding computations.",
"_____no_output_____"
],
[
"## _Dataflow_ programming\n\n[_Dataflow programming_](https://en.wikipedia.org/wiki/Dataflow_programming) is a computational _paradigm_ in which the operations, instructions, and everything that happens in a program is carried out on a [directed graph](https://en.wikipedia.org/wiki/Directed_graph).\n\nA directed graph is shown here.\n\n",
"_____no_output_____"
],
[
"_TensorFlow_ works this way, using instructions and tools such as _session_, _variables_, and _placeholders_. As seen earlier, none of these structures displays the data it holds, since it lives inside a graph. The moment the session is executed, the _overall instruction_ is given to carry out **all** of the operations in the graph.",
"_____no_output_____"
],
[
"## Example with _variables_",
"_____no_output_____"
]
],
[
[
"# Create a variable filled with zeros, of shape (3, 4)\nmy_var = tf.Variable(tf.zeros((3, 4)))\n# Start a session (this actually creates a computation/operation graph)\nsession = tf.Session()\n# Initialize the variables\ninits = tf.global_variables_initializer()\n# Run the whole graph\nsession.run(inits)",
"_____no_output_____"
]
],
[
[
"Although nothing is displayed, under the hood a directed **graph** was created, where one _node_ is the variable, and when the graph was initialized, all pending operations were carried out. An additional example with _placeholders_ is shown below, where this fact can be visualized better.",
"_____no_output_____"
],
[
"## Example with _placeholders_",
"_____no_output_____"
]
],
[
[
"# Create random numpy values\nx_vals = np.random.random_sample((2, 2))\nprint(x_vals)",
"[[0.05037086 0.01199036]\n [0.89214588 0.4766158 ]]\n"
],
[
"# Create a session; a computational graph\nsession = tf.Session()\n# The placeholder cannot have a shape other than (2,2)\nx = tf.placeholder(tf.float32, shape=(2,2))\n# identity returns a tensor with the same shape and contents as the data\n# structure it is given\ny = tf.identity(x)\n# Run the whole computational graph\nsession.run(y, feed_dict={x: x_vals})",
"_____no_output_____"
]
],
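
To make the lazy-evaluation point concrete, a small sketch in the same TF1-style API (assuming `session` and `x_vals` from the cells above): defining an operation only adds a node to the graph, and nothing is computed until `session.run()`.

```python
a = tf.placeholder(tf.float32, shape=(2, 2))
b = tf.matmul(a, a)   # just a graph node; nothing is computed yet
print(b)              # prints a Tensor description, not values
print(session.run(b, feed_dict={a: x_vals}))  # the graph actually runs here
```
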
[
[
"## Independent initialization of variables\n\nVariables do not always have to be initialized in a single way, all at the same time; they can be initialized one by one as convenient. An example is shown below.",
"_____no_output_____"
]
],
[
[
"# Create the session\nsession = tf.Session()\n# A first variable filled with zeros\nfirst_var = tf.Variable(tf.zeros((3, 4)))\n# And now initialize it\nsession.run(first_var.initializer)\n# A second variable filled with ones\nsecond_var = tf.Variable(tf.ones_like(first_var))\nsession.run(second_var.initializer)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
e7a055a4df058bda747e14d0a43f4fca715b98fe | 72,295 | ipynb | Jupyter Notebook | notebooks/Texas.ipynb | skesamsetty/Analysis-on-Mortality-Rate-with-Cause-a-as-Cancer-due-to-Climate-Changes | dc0f73cecc4a573ad76311a8c339bb3e3e42a305 | [
"Apache-2.0"
] | null | null | null | notebooks/Texas.ipynb | skesamsetty/Analysis-on-Mortality-Rate-with-Cause-a-as-Cancer-due-to-Climate-Changes | dc0f73cecc4a573ad76311a8c339bb3e3e42a305 | [
"Apache-2.0"
] | null | null | null | notebooks/Texas.ipynb | skesamsetty/Analysis-on-Mortality-Rate-with-Cause-a-as-Cancer-due-to-Climate-Changes | dc0f73cecc4a573ad76311a8c339bb3e3e42a305 | [
"Apache-2.0"
] | 2 | 2021-08-11T04:52:37.000Z | 2021-12-17T05:24:37.000Z | 67.628625 | 37,676 | 0.680614 | [
[
[
"# Dependencies and Setup\n\nimport pandas as pd\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"mortality_rate = pd.read_csv(\"../data/mortality_rate_by_US_state.csv\")\ntemperature_rate = pd.read_csv(\"../data/model_state.csv\")",
"_____no_output_____"
],
[
"cancer_rate = mortality_rate.loc[mortality_rate[\"Cause Name\"]==\"Cancer\"]\ncancer_rate = cancer_rate.rename(columns={\"Year\":\"year\"})\ncancer_rate",
"_____no_output_____"
],
[
"cancer_df = cancer_rate.loc[(cancer_rate[\"year\"]>1999) & (cancer_rate[\"State\"]==\"Texas\")]\ncancer_df",
"_____no_output_____"
],
[
"climate_change = pd.read_csv(\"../data/climdiv_state_year.csv\")\nclimate_df =climate_change.loc[(climate_change[\"year\"]>1999) & (climate_change[\"year\"]<2017) & (climate_change[\"fips\"]==48)]\nclimate_df",
"_____no_output_____"
],
[
"whole_data = cancer_df.merge(climate_df , how= 'outer', on=\"year\" )\nyearly_cancer= whole_data.groupby(whole_data[\"year\"]).sum([\"Deaths\"])\ntemperature_yearly = whole_data.groupby(whole_data[\"year\"]).mean([\"tempc\"])\nyearly_cancer_deaths= yearly_cancer[[\"Deaths\"]]\ntemperature_yearly_change = temperature_yearly[[\"tempc\"]]\nannual_change = temperature_yearly_change.merge(yearly_cancer_deaths, on=\"year\")\nannual_change",
"_____no_output_____"
],
[
"import matplotlib.pyplot as plt\n\n\nfig, ax = plt.subplots()\nfig.subplots_adjust(right=0.75)\nfig.set_figheight(10)\nfig.set_figwidth(20)\ntwin1 = ax.twinx()\n\n\nax.bar(annual_change.index, annual_change[\"tempc\"], color=\"r\", label=\"Temperature\",alpha=0.5 )\n\ntwin1.plot(annual_change.index, annual_change[\"Deaths\"], label=\"Deaths\")\n\n\nax.set_xlabel(\"Years\")\nax.set_ylabel(\"Texas Temperature\")\ntwin1.set_ylabel(\"Texas Deaths\")\n\ntkw = dict(size=4, width=1.5)\nax.tick_params(axis='x', **tkw)\n\nplt.savefig(\"../images/texas.png\")\nplt.show()",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e7a074376257fefb7ab8d03adfa6a5229eaa2814 | 3,091 | ipynb | Jupyter Notebook | docs/Tutorial/solar/solar_resource_data.ipynb | SarthakJariwala/nrel_dev_api | 2cb486c88c67aa9758427a76ec5c79f7ef7bc997 | [
"Apache-2.0"
] | null | null | null | docs/Tutorial/solar/solar_resource_data.ipynb | SarthakJariwala/nrel_dev_api | 2cb486c88c67aa9758427a76ec5c79f7ef7bc997 | [
"Apache-2.0"
] | 115 | 2021-03-08T00:50:11.000Z | 2022-03-30T19:18:55.000Z | docs/Tutorial/solar/solar_resource_data.ipynb | SarthakJariwala/nrel_dev_api | 2cb486c88c67aa9758427a76ec5c79f7ef7bc997 | [
"Apache-2.0"
] | null | null | null | 21.465278 | 151 | 0.571013 | [
[
[
"# Solar Resource Data\n\n> Get average Direct Normal Irradiance (avg_dni), average Global Horizontal Irradiance (avg_ghi), and average Tilt (avg_lat_tilt) for a location.",
"_____no_output_____"
],
[
"An example of getting solar resource data - average Direct Normal Irradiance, average Global Horizontal Irradiance, and average tilt - from NREL.",
"_____no_output_____"
],
[
"First, let's set our NREL API key.",
"_____no_output_____"
]
],
[
[
"import os\nfrom nrel_dev_api import set_nrel_api_key\nfrom nrel_dev_api.solar import SolarResourceData\n\nNREL_API_KEY = os.environ[\"DEMO_NREL_API_KEY\"]\n\nset_nrel_api_key(NREL_API_KEY)",
"_____no_output_____"
]
],
[
[
"> Alternatively, you can provide your NREL Developer API key with every call. Setting it globally is just for convenience.",
"_____no_output_____"
],
[
"Let's check available solar resource data for Seattle, WA.",
"_____no_output_____"
]
],
[
[
"solar_resource_data = SolarResourceData(lat=47, lon=-122)",
"_____no_output_____"
]
],
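
Following the note above about providing the key with every call, a hypothetical sketch (the `api_key` parameter name is an assumption, not confirmed by this document):

```python
# Passing the key explicitly instead of relying on the global setting.
solar_resource_data = SolarResourceData(api_key=NREL_API_KEY, lat=47, lon=-122)
```
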
[
[
"Outputs for solar resource data are available as the `outputs` attribute.",
"_____no_output_____"
]
],
[
[
"solar_resource_data.outputs",
"_____no_output_____"
]
],
[
[
"We can also provide the address to access the solar resource data.",
"_____no_output_____"
]
],
[
[
"address = \"Seattle, WA\"\n\nsolar_resource_data = SolarResourceData(address=address)",
"_____no_output_____"
]
],
[
[
"The complete response as a dictionary is available as the `response` attribute.",
"_____no_output_____"
]
],
[
[
"solar_resource_data.response",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e7a07fe778fb78fbab8d2399cc743d3c72aa85eb | 89,120 | ipynb | Jupyter Notebook | 2_tokamak_classifier/2_feat_selection.ipynb | djsegal/metis | 54b84108a58d3e95679b519eb361e6916a693709 | [
"MIT"
] | 1 | 2020-06-22T12:21:15.000Z | 2020-06-22T12:21:15.000Z | 2_tokamak_classifier/2_feat_selection.ipynb | djsegal/metis | 54b84108a58d3e95679b519eb361e6916a693709 | [
"MIT"
] | null | null | null | 2_tokamak_classifier/2_feat_selection.ipynb | djsegal/metis | 54b84108a58d3e95679b519eb361e6916a693709 | [
"MIT"
] | null | null | null | 55.734834 | 21,104 | 0.73088 | [
[
[
"import pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\n\nfrom ipywidgets import interact, interactive, fixed, interact_manual\nimport ipywidgets as widgets\n\nfrom math import ceil, floor\n\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.linear_model import RidgeCV, LassoCV, ElasticNetCV, LogisticRegression\nfrom sklearn.preprocessing import StandardScaler, PolynomialFeatures\n\nfrom collections import defaultdict\nfrom tqdm.auto import tqdm\n\nimport random\nfrom sklearn.ensemble import RandomForestClassifier\n\nimport pandas_profiling\nimport warnings\n\nfrom collections import Counter\nfrom sklearn.svm import LinearSVC\n\nfrom sklearn.model_selection import GridSearchCV",
"_____no_output_____"
],
[
"tokamak_data = pd.read_pickle(\"./final.pkl\")\n",
"_____no_output_____"
],
[
"skipped_columns = [\"tok\", \"is_good\", \"standard\", \"walmat_c\"]\n\nskipped_columns",
"_____no_output_____"
],
[
"tokamak_data.columns",
"_____no_output_____"
],
[
"cur_data = tokamak_data.copy()\n\ndesired_dict = {\n 'JET': 3144,\n 'ASDEX': 1250,\n 'AUG': 1250,\n 'CMOD': 750,\n 'D3D': 1250,\n 'JFT2M': 1000,\n 'JT60U': 250,\n 'TCV': 100,\n 'TDEV': 75,\n 'NSTX': 50,\n 'PBXM': 500,\n 'PDX': 500,\n 'TFTR': 500\n}\n\ntok_dict = {}\n\nfor cur_key in desired_dict.keys():\n tok_dict[cur_key] = np.sum(tokamak_data.tok == cur_key)\n\nfor cur_key, cur_value in desired_dict.items():\n print(cur_key, cur_value / tok_dict[cur_key])\n\nfor cur_key, cur_value in desired_dict.items():\n in_rows = cur_data[cur_data.tok == cur_key].copy()\n in_count = len(in_rows)\n\n cur_data = cur_data.append(in_rows.sample(cur_value % in_count))\n if cur_value < 2 * in_count: continue \n\n for cur_times in range(floor(cur_value/in_count)-1):\n cur_data = cur_data.append(in_rows.copy())\n",
"JET 1.0\nASDEX 2.0\nAUG 1.9592476489028212\nCMOD 8.720930232558139\nD3D 2.535496957403651\nJFT2M 1.8867924528301887\nJT60U 2.808988764044944\nTCV 4.761904761904762\nTDEV 7.5\nNSTX 7.142857142857143\nPBXM 1.893939393939394\nPDX 3.4965034965034967\nTFTR 4.8076923076923075\n"
],
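
A more compact variant of the oversampling above (a sketch, assuming only the per-tokamak target counts matter; unlike the replicate-then-top-up loop, it does not guarantee that every original row appears):

```python
# Rebuild the oversampled frame by sampling each tokamak up to its target size.
cur_data_alt = pd.concat(
    tokamak_data[tokamak_data.tok == tok].sample(n=n, replace=True, random_state=0)
    for tok, n in desired_dict.items()
)
```
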
[
"all_corr = tokamak_data.corr()",
"_____no_output_____"
],
[
"for cur_col in tokamak_data.columns:\n if cur_col in skipped_columns : continue\n if np.abs(all_corr[cur_col][\"is_good\"]) > 1e-2 : continue\n skipped_columns.append(cur_col)\n \nlen(skipped_columns)",
"_____no_output_____"
],
[
"for cur_index, cur_col in enumerate(tqdm(tokamak_data.columns)):\n if cur_col in skipped_columns : continue\n for other_col in tokamak_data.columns[0:cur_index]:\n if other_col in skipped_columns : continue\n \n assert all_corr[cur_col][other_col] == all_corr[other_col][cur_col]\n if np.abs(all_corr[cur_col][other_col]) < 0.99 : continue\n \n if np.abs(all_corr[cur_col][\"is_good\"]) > np.abs(all_corr[other_col][\"is_good\"]):\n skipped_columns.append(cur_col)\n else:\n skipped_columns.append(other_col)\n break\n \nlen(skipped_columns)",
"_____no_output_____"
],
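
The pairwise pruning above is an O(p^2) Python loop; a vectorized sketch of the same |corr| > 0.99 rule (with simpler tie-breaking: it always drops the later column of a pair instead of keeping the one better correlated with `is_good`):

```python
import numpy as np

corr_abs = all_corr.abs()
# keep only the strict upper triangle, so each pair is inspected once
upper = corr_abs.where(np.triu(np.ones(corr_abs.shape, dtype=bool), k=1))
to_drop = [c for c in upper.columns if (upper[c] > 0.99).any()]
```
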
[
"tmp_columns = [\n work_col for work_col in cur_data.columns if work_col not in skipped_columns\n]\n\ncur_X = cur_data[tmp_columns]\ncur_y = cur_data[\"is_good\"]\n\nscaler = StandardScaler().fit(cur_X)\ncur_X = pd.DataFrame(scaler.transform(cur_X), columns=cur_X.columns)\n",
"_____no_output_____"
],
[
"cur_clf = RandomForestClassifier(n_estimators=10000, random_state=0, n_jobs=-1)\ncur_clf.fit(cur_X, cur_y)\n",
"_____no_output_____"
],
[
"feat_labels = cur_X.columns\nbeat_val = 7.5 * np.max(cur_clf.feature_importances_) / 100\ncur_count = 0\n\nfor feature in reversed(sorted(zip(cur_clf.feature_importances_, feat_labels))):\n feat, importance = list(reversed(feature))\n assert importance >= 0\n if importance <= beat_val : continue\n cur_count += 1\n print(feat, \":\", importance)\n \nfor feature in reversed(sorted(zip(cur_clf.feature_importances_, feat_labels))):\n feat, importance = list(reversed(feature))\n assert importance >= 0\n if importance > beat_val : continue\n skipped_columns.append(feat)\n \nprint(cur_count)\n",
"prad : 0.014394738381617131\n_pressure_rgeo_vol : 0.00922429954227121\n_inv_abs_bt_inv_pressure_inv_vol : 0.009204016467752256\nq95 : 0.009048744755298639\nlog_inv_q95 : 0.008924060406044395\ninv_q95 : 0.0088926641694427\n_inv_pressure_inv_rgeo_2 : 0.007897110370144152\n_rgeo_tev_vol : 0.00783383835212216\n_inv_epsilon_inv_rgeo_inv_tev : 0.007662079946921427\n_inv_rgeo_inv_tev_inv_vol : 0.0075154354223034205\n_rgeo_2_tev : 0.0074964299305910614\n_pressure_tev_vol : 0.007484882459964529\n_inv_abs_ip_inv_tev_2 : 0.007129217099192277\n_inv_abs_bt_inv_rgeo_inv_tev : 0.007017990973480651\ndelta : 0.0061642078325764744\n_inv_rgeo_inv_tev : 0.006117610391297016\n_inv_abs_bt_inv_tev_inv_vol : 0.006060794622784743\npnbi : 0.006003886737372808\n_inv_pressure_inv_rgeo_nel : 0.005719620138219985\ntauth : 0.005509277681752673\ninv_tauth : 0.005424980082137503\nlog_inv_tauth : 0.005418732917221812\n_epsilon_inv_abs_ip_inv_tev : 0.005408562827386735\n_inv_abs_ip_inv_rgeo_inv_tev : 0.005342195267381596\n_abs_ip_inv_epsilon_rgeo : 0.00534133615828474\n_inv_abs_bt_2_nel : 0.005323338276893555\npinj : 0.005224682059717201\ntautot : 0.0052045475994646215\ninv_tautot : 0.005171900244943075\n_inv_abs_ip_pressure_vol : 0.00502671599972232\n_inv_abs_ip_inv_pressure_nel : 0.004831886585709376\npohm : 0.004722420503675061\nlog_pl : 0.004457072421821505\npl : 0.004438980047841214\ninv_pl : 0.004421755595720279\n_rgeo_tev_2 : 0.004142056115570486\n_inv_epsilon_2_inv_tev : 0.004118297185637147\ninv_time : 0.00410371271172558\nlog_time : 0.004099805995242189\n_abs_ip_inv_abs_bt_inv_epsilon : 0.004085404633486004\n_abs_bt_2_vol : 0.004074327894619803\n_nel_rgeo_vol : 0.004070629979572538\n_abs_bt_2_inv_abs_ip : 0.004029407679319101\n_inv_epsilon_2_rgeo : 0.004007345147602731\n_inv_abs_ip_inv_pressure_inv_vol : 0.004002856587887104\ninv_wmhd : 0.003985064172755349\n_abs_bt_epsilon_inv_rgeo : 0.003971191207120098\n_abs_ip_inv_abs_bt_rgeo : 0.0039697363105323236\ntime : 0.003966525419769529\nwmhd : 0.0039528078066139284\n_inv_tev_nel_vol : 0.003912540665623722\n_epsilon_inv_abs_ip_inv_vol : 0.0038904185285658518\n_inv_rgeo_pressure_vol : 0.0038809394351791452\n_epsilon_inv_rgeo : 0.0038377535746469695\nlog_inv_kappaa : 0.003834550137766331\n_inv_abs_ip_inv_vol_tev : 0.0038143129320720916\n_inv_abs_bt_pressure_vol : 0.0038089424533799955\n_inv_abs_ip_tev_vol : 0.0038060222931946383\nlog_wmhd : 0.0037685704276965216\n_inv_abs_bt_rgeo : 0.003758063099561299\nkappaa : 0.0036999742787289115\n_inv_abs_bt_nel_rgeo : 0.0036734446591392437\n_inv_abs_bt_inv_epsilon_2 : 0.0036517350449646177\n_inv_abs_bt_2_rgeo : 0.0036410877812787547\n_abs_bt_inv_abs_ip_rgeo : 0.0035846504818855484\n_inv_abs_bt_inv_nel_inv_vol : 0.003584042429477779\nabs_bt : 0.003579901075119212\n_nel_rgeo_2 : 0.0035776799432120873\nwtot : 0.0035773377854655597\n_epsilon_inv_abs_bt_2 : 0.003569430305988057\nwth : 0.0035337370263135654\ninv_abs_bt : 0.0035337103578387684\n_inv_epsilon_rgeo_2 : 0.0035276548940637643\n_inv_tev_pressure_vol : 0.0035165922454532712\n_abs_ip_inv_abs_bt : 0.003515851739449686\n_epsilon_tev : 0.003503118485575458\ninv_wth : 0.0035019002821287952\ninv_bt : 0.003479466948288331\n_abs_bt_2 : 0.003478861008165635\n_abs_bt_vol : 0.003476701205061918\n_inv_epsilon_inv_rgeo_2 : 0.0034527111571267238\nbt : 0.0034189275749674456\n_abs_ip_epsilon_inv_vol : 0.003374730663264989\nip : 0.0033703917175523486\n_abs_ip_inv_abs_bt_inv_vol : 0.003355371803449133\n_abs_bt_epsilon_rgeo : 0.0033546792486579425\n_abs_bt_rgeo_vol : 
0.0033490928644591368\n_inv_epsilon_inv_vol_rgeo : 0.0033183062121860058\n_inv_abs_bt_inv_rgeo_vol : 0.0033114231881346307\n_inv_epsilon_rgeo_vol : 0.003310977923492494\n_inv_epsilon_2_vol : 0.003307266625578318\ninv_kappa : 0.0032536707933222423\n_inv_abs_ip_inv_rgeo_tev : 0.003245491672333413\ninv_vol : 0.003238993729016743\n_inv_epsilon_inv_tev_2 : 0.003233488703129343\nvol : 0.003211858929166544\nkappa : 0.0032054749139987506\n_abs_ip_inv_abs_bt_inv_tev : 0.003194646141347037\n_inv_abs_bt_inv_epsilon : 0.003186607455782177\nabs_pohm : 0.003176730141019627\n_abs_bt_epsilon_inv_nel : 0.003174856590499723\n_inv_abs_bt_2_vol : 0.0031414774329729532\n_abs_ip_epsilon_inv_abs_bt : 0.0031266337397890926\n_inv_abs_bt_nel_tev : 0.0031160663372743944\n_inv_abs_bt_2_inv_epsilon : 0.003100117127664662\n_inv_abs_ip_inv_epsilon_inv_rgeo : 0.003088867355616264\nbetmhd : 0.003077484541233428\n_inv_epsilon_2_inv_rgeo : 0.0030631540037651863\ninv_betmhd : 0.0030382498651269157\ninv_epsilon : 0.003026467150754654\nlog_beili2 : 0.003023064705767\nlog_betmhd : 0.003022755150917698\ninv_ip : 0.0030162134735837068\nepsilon : 0.0030026640494062494\n_inv_abs_bt_inv_abs_ip_inv_rgeo : 0.0030018110314828732\ninv_beili2 : 0.002974433194673219\n_inv_abs_bt_inv_vol_tev : 0.002972123298095259\n_inv_abs_bt_2_inv_nel : 0.0029394327476120804\nabs_ip : 0.0029219131594954723\n_inv_abs_bt_inv_epsilon_inv_pressure : 0.002914489585074112\nbeili2 : 0.002909045950597703\n_inv_abs_ip_2_inv_tev : 0.002907726913408173\ninv_abs_ip : 0.0028790018174563426\n_inv_abs_ip_2_rgeo : 0.0028704495756821033\n_epsilon_inv_abs_bt : 0.0028681721114627917\n_abs_ip_inv_epsilon_inv_vol : 0.0028675659595831507\n_rgeo_3 : 0.0028464245790009914\n_abs_bt_inv_abs_ip_2 : 0.0028360080077567567\nrgeo : 0.0028185041684252927\n_abs_bt_inv_abs_ip_inv_tev : 0.002810804326610007\namin : 0.0028021843711022844\n_inv_abs_bt_tev_vol : 0.002773665807642179\ninv_rgeo : 0.002770126062708701\n_inv_abs_ip_vol : 0.002746080882453991\n_inv_pressure_2_inv_vol : 0.0027370577489995375\n_abs_bt_2_rgeo : 0.002724989457543148\n_inv_abs_ip_rgeo_vol : 0.002722292791261279\ninv_amin : 0.0026776898060684403\nlog_amin : 0.002672522218097746\n_abs_ip_inv_rgeo_2 : 0.0026579093094776006\n_inv_abs_ip_inv_tev_vol : 0.0026577655045816255\n_abs_bt_2_inv_pressure : 0.0025983551044889363\n_inv_abs_bt_inv_rgeo_tev : 0.0025748796169221388\n_inv_epsilon_inv_nel_tev : 0.0025628660431598993\n_abs_bt_rgeo_2 : 0.002556501900363311\n_abs_bt_inv_rgeo_2 : 0.0025369712998964708\n_inv_abs_bt_pressure_tev : 0.002533335610682074\n_inv_abs_ip_inv_epsilon_rgeo : 0.00252524797128246\n_epsilon_2_inv_abs_ip : 0.0025231294805760874\n_abs_bt_abs_ip_inv_vol : 0.002510773423460043\n_abs_ip_epsilon_2 : 0.0025054072416606037\nlog_inv_greenwald : 0.002500827000350069\n_abs_ip_inv_vol_rgeo : 0.0024940035822449383\n_inv_abs_bt_nel_vol : 0.0024682687175542704\n_epsilon_inv_abs_bt_inv_rgeo : 0.0024599001664125553\n_inv_pressure_tev_2 : 0.002444504894748539\n_epsilon_inv_abs_bt_rgeo : 0.0024364762990000573\ninv_greenwald : 0.0024361827589537673\ngreenwald : 0.0024248759948962335\n_imp_prad_tok_pbxm : 0.0024061320770906785\n_inv_abs_ip_2_vol : 0.0023976444224829275\n_inv_abs_bt_2_inv_pressure : 0.0023821583987866574\n_abs_ip_inv_epsilon_inv_tev : 0.002380110036275979\n_epsilon_2_inv_abs_bt : 0.0023561398081051173\n_abs_bt_inv_pressure_tev : 0.002344794533598633\n_inv_abs_bt_2_tev : 0.0023375049361393217\n_abs_bt_2_abs_ip : 0.002336728319752561\n_inv_abs_ip_2_nel : 0.002336342363546604\n_inv_abs_bt_inv_abs_ip_rgeo : 
0.0023314603164817684\n_abs_ip_2_inv_tev : 0.0023281883102828894\n_abs_ip_inv_epsilon : 0.0023013867659077186\n_inv_abs_bt_inv_epsilon_tev : 0.0022963874559798004\n_abs_ip_inv_epsilon_inv_rgeo : 0.002281227841028107\n_abs_ip_inv_nel_rgeo : 0.00226344712578815\n_inv_abs_ip_tev : 0.002260271887251295\n_inv_abs_bt_rgeo_tev : 0.002257133393987075\n_inv_epsilon_rgeo_tev : 0.0022384087284908314\n_abs_bt_epsilon_nel : 0.0022285552212045357\n_inv_abs_ip_nel_vol : 0.0022260466079764155\n_inv_abs_ip_nel_rgeo : 0.0022173003711277468\n_pressure_rgeo_tev : 0.0022166573505702286\n_inv_abs_bt_inv_rgeo : 0.0022165358905407526\n_inv_abs_bt_inv_abs_ip_inv_epsilon : 0.0022004147924759828\n_inv_epsilon_inv_tev_nel : 0.0021814269142475273\n_abs_bt_inv_tev : 0.0021793263848149886\n_abs_bt_inv_pressure_inv_rgeo : 0.0021757941321105955\n_inv_abs_bt_inv_nel_inv_tev : 0.0021734825181913917\n_inv_abs_ip_inv_tev_pressure : 0.0021567648083254992\n_epsilon_inv_abs_bt_tev : 0.00215512124055961\n_inv_nel_inv_tev_pressure : 0.002147574885779173\n_inv_nel_tev : 0.0021371290172520086\nimp_tauth : 0.002134125551501407\n_abs_bt_inv_tev_nel : 0.002126233124703597\n_abs_ip_inv_abs_bt_inv_nel : 0.0021185223095635634\n_inv_abs_bt_2_inv_tev : 0.002113732035506557\n_inv_abs_bt_tev_2 : 0.002110474902763798\n_inv_tev_2_nel : 0.0020908434900524836\n_abs_bt_abs_ip_inv_nel : 0.0020904094289111506\n_inv_abs_bt_inv_pressure_tev : 0.0020870991686363365\n_inv_epsilon_2_inv_nel : 0.002081668827864395\n_inv_nel_2_pressure : 0.002077606141718529\n_abs_bt_epsilon_inv_vol : 0.002072031945566217\n_inv_abs_ip_inv_tev_nel : 0.0020530494180488125\n_epsilon_inv_abs_bt_inv_abs_ip : 0.002009409829951437\n_inv_vol_tev : 0.002007894691554847\n_inv_epsilon_2_nel : 0.0019795601637362784\n_inv_abs_ip_2_pressure : 0.00195581538120834\n_inv_abs_ip_inv_vol_pressure : 0.0019439149924449478\n_abs_ip_inv_abs_bt_pressure : 0.0019392963236027313\n_inv_abs_ip_nel : 0.0019363404189324156\n_inv_rgeo_tev_2 : 0.0019319062321779047\n_inv_tev_vol : 0.0019302614069033979\n_inv_abs_ip_rgeo_tev : 0.001925449223622142\n_inv_epsilon_inv_pressure_tev : 0.001905758795511193\n_inv_tev_inv_vol_nel : 0.0018941096897614037\n_abs_ip_inv_pressure_nel : 0.0018925254599913887\n_inv_abs_bt_inv_abs_ip_tev : 0.001875957126487752\n_inv_epsilon_inv_pressure_nel : 0.0018674112787132127\n_epsilon_inv_abs_ip_nel : 0.0018669596593751457\n_inv_epsilon_inv_tev_pressure : 0.0018660912266885393\n_inv_abs_ip_pressure_rgeo : 0.0018649746614739487\n_inv_abs_ip_tev_2 : 0.0018630215186409569\n_abs_bt_pressure_rgeo : 0.0018583101847276249\n_abs_ip_epsilon_inv_pressure : 0.0018570210629298464\n_inv_tev_nel_2 : 0.0018418580441048745\n_inv_vol_rgeo_tev : 0.001835205111746962\n_inv_abs_bt_inv_tev_nel : 0.0018326342374278698\n_inv_abs_bt_inv_pressure : 0.001830870826294658\n_inv_abs_bt_nel : 0.0018158669456218247\n_abs_ip_inv_rgeo_tev : 0.001811953064033453\n_inv_pressure_inv_rgeo_tev : 0.0018050211144941585\n_inv_epsilon_nel : 0.0017909668652315313\n_inv_abs_bt_nel_pressure : 0.0017856631274619657\n_inv_abs_ip_pressure_tev : 0.001780231361097007\n_epsilon_inv_vol_tev : 0.0017703867107247706\n_inv_abs_bt_inv_nel : 0.0017626827354709944\n_abs_bt_inv_rgeo_tev : 0.001761812673000557\n_inv_abs_bt_inv_nel_pressure : 0.0017501180395301907\n_inv_abs_ip_nel_2 : 0.001749909492635693\n_inv_abs_bt_inv_abs_ip_pressure : 0.0017472768421963897\n_abs_bt_inv_nel_2 : 0.0017403120594679607\n_inv_abs_bt_inv_tev_vol : 0.0017391754691512523\n_inv_abs_ip_nel_tev : 0.0017335564658518315\n_inv_rgeo_2_tev : 
0.0017202712723567962\n_abs_ip_inv_abs_bt_inv_pressure : 0.0017096743298938872\n_inv_vol_tev_2 : 0.0017035536016071049\n_inv_epsilon_inv_rgeo_tev : 0.001702503949264842\n_inv_epsilon_nel_rgeo : 0.0016959698268116472\n_inv_vol_nel_rgeo : 0.0016777501723176655\n_epsilon_inv_abs_bt_nel : 0.0016737238706273987\n_inv_abs_ip_nel_pressure : 0.0016715635200607254\n_inv_rgeo_tev : 0.0016586515058914304\nnev : 0.0016540405983159867\n_inv_nel_rgeo_tev : 0.0016490286686939367\n_abs_bt_inv_nel_rgeo : 0.0016367204352790274\n_inv_rgeo_nel_vol : 0.0016197384278741137\n_inv_epsilon_2_tev : 0.0015935602071939473\nlog_nev : 0.0015825027440339921\n_inv_abs_ip_pressure : 0.0015804085486097557\n_inv_epsilon_inv_tev_rgeo : 0.0015666031722692342\n_inv_abs_bt_inv_epsilon_inv_tev : 0.0015544332477085698\ninv_meff : 0.0015486154988485067\nlog_inv_meff : 0.0015482230463468843\n_abs_bt_inv_abs_ip_inv_nel : 0.0015471259282448442\n_inv_epsilon_inv_nel : 0.0015300443888141275\nmeff : 0.001526685351925521\n_nel_tev : 0.0015248315096300928\n_inv_abs_bt_inv_nel_vol : 0.0015195341665897983\n_inv_rgeo_nel_tev : 0.0015149666919041542\n_abs_bt_nel_rgeo : 0.0015124008586568968\n_inv_abs_bt_inv_abs_ip_inv_tev : 0.001510630080169986\n_inv_abs_bt_inv_pressure_nel : 0.0015048143345973897\n_inv_pressure_nel_rgeo : 0.0014997678208022865\n_abs_ip_inv_rgeo_pressure : 0.0014969443992540014\n_nel_2_vol : 0.0014869247061516495\n_nel_pressure_rgeo : 0.0014773248102115722\n_epsilon_inv_tev_2 : 0.001475529771761424\n_inv_abs_ip_inv_rgeo_pressure : 0.0014750789686454931\n_epsilon_inv_tev : 0.0014743942441634657\n_inv_abs_bt_pressure : 0.0014725072244249535\n_inv_epsilon_inv_nel_pressure : 0.001472313886464971\n_epsilon_inv_nel_2 : 0.0014642498125655556\n_inv_epsilon_inv_nel_vol : 0.0014623150495245158\n_inv_vol_pressure_tev : 0.00146180162570142\n_inv_abs_ip_inv_nel_vol : 0.0014616363284452782\n_inv_abs_bt_inv_pressure_rgeo : 0.001450570517828962\n_epsilon_inv_abs_bt_inv_nel : 0.0014481344283154686\n_nel_pressure_vol : 0.0014456509853124095\n_inv_nel_2_inv_tev : 0.0014390646821083905\n_abs_bt_inv_epsilon_tev : 0.0014334216796430166\n_abs_ip_inv_tev_nel : 0.0014325225297139735\n_inv_tev_nel_pressure : 0.0014263945000551877\n_inv_abs_bt_inv_vol_pressure : 0.0014240093546251322\ninv_tev : 0.0014232898068024951\n_abs_ip_inv_tev_pressure : 0.0014211361810187774\n_inv_abs_bt_inv_nel_inv_pressure : 0.0014193672410975698\n_nel_rgeo_tev : 0.0014052249240032173\n_epsilon_inv_abs_ip_pressure : 0.0014030006063560454\n_inv_epsilon_inv_rgeo_nel : 0.0014000730132494997\n_inv_vol_nel_tev : 0.0013960621743822476\n_inv_abs_ip_2_inv_nel : 0.0013879024086734678\n_inv_vol_nel_2 : 0.0013871336375167857\n_inv_pressure_nel : 0.0013869887473915805\n_inv_abs_bt_inv_abs_ip_inv_nel : 0.001374348614135431\n_inv_epsilon_inv_nel_inv_tev : 0.0013702149814968654\n_inv_abs_bt_inv_tev : 0.0013637165239293529\n_inv_nel_pressure_tev : 0.0013566991842327088\n_inv_abs_bt_inv_tev_2 : 0.0013554188875436034\n_nel_2_pressure : 0.0013534736705230815\n_inv_rgeo_nel_pressure : 0.001347083060958684\n_inv_epsilon_2_inv_pressure : 0.0013440598290010261\n_inv_nel_2_inv_rgeo : 0.0013373839688365762\n_nel_tev_2 : 0.001336631826726494\n_inv_abs_bt_inv_pressure_inv_tev : 0.001335664495101238\ntev : 0.0013121610334097718\n_inv_abs_ip_inv_nel_inv_pressure : 0.0013093723668956593\nlog_tev : 0.0013092192528688547\n_inv_nel_inv_rgeo : 0.001304565305829546\n_inv_epsilon_inv_pressure_inv_rgeo : 0.0013018561831195884\n_inv_abs_bt_inv_abs_ip_inv_pressure : 0.0012999348670515714\n_nel_pressure : 
0.0012884108516358894\n_epsilon_inv_nel_inv_tev : 0.001285538540979432\n_inv_abs_ip_inv_nel_inv_tev : 0.0012766565232188354\n_inv_pressure_rgeo_tev : 0.0012761623887278858\n_inv_abs_ip_inv_pressure_2 : 0.0012735103975249544\n_epsilon_inv_abs_ip_inv_nel : 0.0012715209605861373\n_inv_abs_ip_pressure_2 : 0.0012672975093653354\n_inv_abs_ip_inv_pressure : 0.0012662675671129293\n_epsilon_inv_nel_inv_pressure : 0.0012563739224215693\n_inv_abs_bt_inv_pressure_vol : 0.001245664911437331\n_inv_abs_ip_inv_nel : 0.0012433962104054844\n_inv_abs_ip_inv_nel_rgeo : 0.0012324990948341917\n_inv_vol_nel_pressure : 0.0012249049425349942\n_epsilon_nel_rgeo : 0.0012238974477404093\n_epsilon_inv_abs_ip_inv_pressure : 0.0012220819938868449\n_abs_bt_inv_epsilon_pressure : 0.001217960252229364\n_imp_tauth_tok_aug : 0.001217414411618798\n_inv_epsilon_inv_nel_2 : 0.0012167001076246202\n_inv_abs_bt_inv_nel_2 : 0.0012100155206800735\n_abs_ip_nel_2 : 0.0012099998688141068\nimp_wth : 0.0012081277843344774\n_inv_epsilon_inv_nel_inv_pressure : 0.0011934294820791598\n_inv_abs_ip_2_inv_pressure : 0.001192722343563527\n_inv_epsilon_inv_pressure_inv_tev : 0.001182794800951796\n_inv_nel_2_rgeo : 0.001177033411370869\n_inv_abs_bt_inv_rgeo_pressure : 0.0011758264815158708\n_inv_nel_inv_pressure_2 : 0.0011694674912752845\n_inv_abs_bt_inv_pressure_2 : 0.0011689918929003354\n_abs_bt_inv_epsilon_inv_pressure : 0.001167092280795218\n_inv_vol_pressure_rgeo : 0.0011618622773151676\n_inv_rgeo_pressure_tev : 0.001161855130320882\n_epsilon_inv_rgeo_nel : 0.001161031305137033\n_inv_nel_rgeo_2 : 0.001141744609468245\n_inv_epsilon_inv_pressure_vol : 0.0011353417024591173\n_inv_nel_3 : 0.0011265889951726105\n_inv_rgeo_nel : 0.00112447372724121\n_inv_tev_pressure_2 : 0.0011010690956338716\n_nel_pressure_tev : 0.0010925377244007228\n_inv_abs_bt_pressure_2 : 0.0010808696720893958\n"
],
[
"tmp_columns = [\n work_col for work_col in cur_data.columns if work_col not in skipped_columns\n]\n\ncur_X = cur_data[tmp_columns]\ncur_y = cur_data[\"is_good\"]\n\nscaler = StandardScaler().fit(cur_X)\ncur_X = pd.DataFrame(scaler.transform(cur_X), columns=cur_X.columns)\n",
"_____no_output_____"
],
[
"tuned_parameters = {}\nsvc_clf = GridSearchCV(LinearSVC(C=5, penalty=\"l1\", dual=False, max_iter=10000, verbose=True), tuned_parameters, cv=4)\nsvc_clf.fit(cur_X, cur_y)\n",
"[LibLinear]"
],
[
"first_pass_columns = cur_X.columns[np.where(np.abs(svc_clf.best_estimator_.coef_) > 1e-5)[1]]\nlen(first_pass_columns)\n",
"_____no_output_____"
],
[
"cur_X = cur_data[first_pass_columns]\ncur_y = cur_data[\"is_good\"]\n\nscaler = StandardScaler().fit(cur_X)\ncur_X = pd.DataFrame(scaler.transform(cur_X), columns=cur_X.columns)\n",
"_____no_output_____"
],
[
"tuned_parameters = {}\nlog_clf = GridSearchCV(LogisticRegression(penalty='l1', C=1, solver='saga', max_iter=10000, verbose=True, n_jobs=-1), tuned_parameters, cv=4)\nlog_clf.fit(cur_X, cur_y)\n",
"[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 1 out of 1 | elapsed: 1.2min finished\n[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 1 out of 1 | elapsed: 2.3min finished\n[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 1 out of 1 | elapsed: 1.6min finished\n[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 1 out of 1 | elapsed: 2.7min finished\n[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n"
],
[
"second_pass_columns = cur_X.columns[np.where(np.abs(log_clf.best_estimator_.coef_) > 1e-5)[1]]\nlen(second_pass_columns)\n",
"_____no_output_____"
],
[
"cur_X = cur_data[second_pass_columns]\ncur_y = cur_data[\"is_good\"]\n\nscaler = StandardScaler().fit(cur_X)\ncur_X = pd.DataFrame(scaler.transform(cur_X), columns=cur_X.columns)\n",
"_____no_output_____"
],
[
"tuned_parameters = {}\nsvc_clf = GridSearchCV(LinearSVC(C=0.75, penalty=\"l1\", dual=False, max_iter=25000, verbose=True), tuned_parameters, cv=4)\nsvc_clf.fit(cur_X, cur_y)\n",
"[LibLinear]"
],
[
"third_pass_columns = cur_X.columns[np.where(np.abs(svc_clf.best_estimator_.coef_) > 1e-5)[1]]\nlen(third_pass_columns)\n",
"_____no_output_____"
],
[
"cur_X = cur_data[third_pass_columns]\ncur_y = cur_data[\"is_good\"]\n\nscaler = StandardScaler().fit(cur_X)\ncur_X = pd.DataFrame(scaler.transform(cur_X), columns=cur_X.columns)\n",
"_____no_output_____"
],
[
"tuned_parameters = {}\nlog_clf = GridSearchCV(LogisticRegression(penalty='l1', C=0.25, solver='saga', max_iter=10000, verbose=True, n_jobs=-1), tuned_parameters, cv=4)\nlog_clf.fit(cur_X, cur_y)\n",
"[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 1 out of 1 | elapsed: 14.1s finished\n[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 1 out of 1 | elapsed: 26.3s finished\n[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 1 out of 1 | elapsed: 10.6s finished\n[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 1 out of 1 | elapsed: 13.8s finished\n[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n"
],
[
"fourth_pass_columns = cur_X.columns[np.where(np.abs(log_clf.best_estimator_.coef_) > 1e-5)[1]]\nlen(fourth_pass_columns)\n",
"_____no_output_____"
],
[
"cur_X = cur_data[fourth_pass_columns]\ncur_y = cur_data[\"is_good\"]\n\nscaler = StandardScaler().fit(cur_X)\ncur_X = pd.DataFrame(scaler.transform(cur_X), columns=cur_X.columns)\n",
"_____no_output_____"
],
[
"cur_clf = RandomForestClassifier(n_estimators=10000, random_state=0, n_jobs=-1)\ncur_clf.fit(cur_X, cur_y)\n",
"_____no_output_____"
],
[
"features = fourth_pass_columns\nimportances = cur_clf.feature_importances_\nindices = reversed(np.argsort(importances))\n\ncount = 0\nnew_indices = []\nnames = []\nfor index in indices:\n if len(new_indices) >= 12 : break\n \n name = features[index]\n if name.startswith(\"_\") : \n new_indices.append(index)\n name = name.replace(\"abs_\", \"\")[1:]\n names.append(name)\n continue\n name = name.replace(\"inv_\", \"\")\n name = name.replace(\"log_\", \"\")\n \n if name in names : continue\n names.append(name)\n \n new_indices.append(index)\n \nindices = list(reversed(new_indices))\nnames = reversed(names)\n\nplt.title('Feature Importances (Random Forests)')\nplt.barh(range(len(indices)), importances[indices], color='b', align='center')\nplt.yticks(range(len(indices)), names)\nplt.xlabel('')\nplt.show()",
"_____no_output_____"
],
[
"def plot_1(ax):\n plt.sca(ax)\n \n features = fourth_pass_columns\n importances = log_clf.best_estimator_.coef_[0][np.where(np.abs(log_clf.best_estimator_.coef_) > 1e-5)[1]]\n indices = reversed(np.argsort(np.abs(importances)))\n\n count = 0\n new_indices = []\n names = []\n for index in indices:\n if len(new_indices) >= 12 : break\n\n name = features[index]\n if name.startswith(\"_\") : \n new_indices.append(index)\n name = name.replace(\"abs_\", \"\")[1:]\n names.append(name)\n continue\n name = name.replace(\"inv_\", \"\")\n name = name.replace(\"log_\", \"\")\n\n if name in names : continue\n names.append(name)\n\n new_indices.append(index)\n\n indices = list(reversed(new_indices))\n names = list(reversed(names))\n\n importances = importances[indices]\n indices = list(reversed(np.argsort((importances))))\n\n names = [names[i] for i in indices]\n\n # features = fourth_pass_columns\n # importances = log_clf.best_estimator_.coef_[0][np.where(np.abs(log_clf.best_estimator_.coef_) > 1e-5)[1]]\n # indices = reversed(np.argsort(np.abs(importances)))\n\n plt.title('Logistic Features')\n plt.barh(range(len(indices)), importances[indices], color='b', align='center')\n plt.yticks(range(len(indices)), names)\n plt.xlabel('')",
"_____no_output_____"
],
[
"cur_X = cur_data[fourth_pass_columns]\ncur_y = cur_data[\"is_good\"]\n\nscaler = StandardScaler().fit(cur_X)\ncur_X = pd.DataFrame(scaler.transform(cur_X), columns=cur_X.columns)\n\ntuned_parameters = {}\nsvc_clf = GridSearchCV(LinearSVC(C=10, penalty=\"l1\", dual=False, max_iter=10000, verbose=True), tuned_parameters, cv=4)\nsvc_clf.fit(cur_X, cur_y)\n",
"[LibLinear]"
],
[
"def plot_2(ax):\n plt.sca(ax)\n \n fifth_pass_columns = cur_X.columns[np.where(np.abs(svc_clf.best_estimator_.coef_) > 1e-5)[1]]\n\n features = fifth_pass_columns\n importances = svc_clf.best_estimator_.coef_[0][np.where(np.abs(svc_clf.best_estimator_.coef_) > 1e-5)[1]]\n indices = reversed(np.argsort(np.abs(importances)))\n\n count = 0\n new_indices = []\n names = []\n for index in indices:\n if len(new_indices) >= 12 : break\n\n name = features[index]\n if name.startswith(\"_\") : \n new_indices.append(index)\n name = name.replace(\"abs_\", \"\")[1:]\n names.append(name)\n continue\n name = name.replace(\"inv_\", \"\")\n name = name.replace(\"log_\", \"\")\n\n if name in names : continue\n names.append(name)\n\n new_indices.append(index)\n\n indices = list(reversed(new_indices))\n names = list(reversed(names))\n\n importances = importances[indices]\n indices = list(reversed(np.argsort((importances))))\n\n names = [names[i] for i in indices]\n\n # features = fourth_pass_columns\n # importances = log_clf.best_estimator_.coef_[0][np.where(np.abs(log_clf.best_estimator_.coef_) > 1e-5)[1]]\n # indices = reversed(np.argsort(np.abs(importances)))\n\n plt.title('SVM Features')\n plt.barh(range(len(indices)), importances[indices], color='b', align='center')\n plt.yticks(range(len(indices)), names)\n plt.xlabel('')\n",
"_____no_output_____"
],
[
"plt.figure(figsize=(12,6))\nax = plt.subplot(1,2,1)\nplot_1(ax)\nax = plt.subplot(1,2,2)\nax.yaxis.set_label_position(\"right\")\nax.yaxis.tick_right()\nplot_2(ax)",
"_____no_output_____"
],
[
"list(fourth_pass_columns)",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e7a0b590ac56f65dafaab36169345042e6ca5df3 | 3,783 | ipynb | Jupyter Notebook | Gridsearch Parameters.ipynb | BrittGeek/Time-Series-Forecasting | 91719c6ab9ff50f64ac062da1b6ceb27a8693c8f | [
"Apache-2.0"
] | null | null | null | Gridsearch Parameters.ipynb | BrittGeek/Time-Series-Forecasting | 91719c6ab9ff50f64ac062da1b6ceb27a8693c8f | [
"Apache-2.0"
] | null | null | null | Gridsearch Parameters.ipynb | BrittGeek/Time-Series-Forecasting | 91719c6ab9ff50f64ac062da1b6ceb27a8693c8f | [
"Apache-2.0"
] | null | null | null | 24.095541 | 87 | 0.510706 | [
[
[
"# Grid searching parameters\n\nhttps://machinelearningmastery.com/grid-search-arima-hyperparameters-with-python/",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nfrom pandas import read_csv\nimport numpy as np\nfrom datetime import datetime\nfrom pandas import Series\nfrom statsmodels.tsa.arima_model import ARIMA\nfrom sklearn.metrics import mean_squared_error\nimport warnings\nwarnings.filterwarnings(\"ignore\")",
"_____no_output_____"
],
[
"series = read_csv('female.csv', header=0, index_col=0, parse_dates=True, squeeze=True)",
"_____no_output_____"
],
[
"def evaluate_arima_model(X, arima_order):\n    # prepare training dataset\n    train_size = int(len(X) * 0.6)\n    train, test = X[0:train_size], X[train_size:]\n    history = [x for x in train]\n    # make predictions\n    predictions = list()\n    for t in range(len(test)):\n        # refit on the growing history so each forecast uses all past observations\n        model = ARIMA(history, order=arima_order)\n        model_fit = model.fit(disp=0)\n        yhat = model_fit.forecast()[0]\n        predictions.append(yhat)\n        history.append(test[t])\n    # calculate out of sample error\n    error = mean_squared_error(test, predictions)\n    return error",
"_____no_output_____"
],
[
"def evaluate_models(dataset, p_values, d_values, q_values):\n best_score, best_cfg = float(\"inf\"), None\n for p in p_values:\n for d in d_values:\n for q in q_values:\n order = (p,d,q)\n try:\n mse = evaluate_arima_model(dataset, order)\n if mse < best_score:\n best_score, best_cfg = mse, order\n print('ARIMA%s MSE=%.3f' % (order,mse))\n except:\n continue\n \n print('Best ARIMA%s MSE=%.3f' % (best_cfg, best_score))",
"_____no_output_____"
],
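
The bare `except: continue` silently swallows every fitting error, which is why the run below can end with `Best ARIMANone MSE=inf` without any hint of the cause. A small debugging sketch, assuming `series` was loaded above, that surfaces the first failure:

```python
# Try one order outside the grid search and print the underlying exception.
try:
    evaluate_arima_model(series.values, (1, 1, 0))
except Exception as err:
    print('ARIMA(1,1,0) failed:', err)
```
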
[
"p_values = [0, 1, 2]\nd_values = range(0, 2)\nq_values = range(0, 2)\nwarnings.filterwarnings(\"ignore\")\nevaluate_models(series, p_values, d_values, q_values)",
"Best ARIMANone MSE=inf\n"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
] |
e7a0d5ef027a1aeca1a9ece7aacf555ff86dc29d | 300,464 | ipynb | Jupyter Notebook | notebooks/Calculate_sensitivity_eventio.ipynb | pawel21/cta-lstchain | b7fc7c9aaeb952b85982981699b85701c37eb5e5 | [
"BSD-3-Clause"
] | null | null | null | notebooks/Calculate_sensitivity_eventio.ipynb | pawel21/cta-lstchain | b7fc7c9aaeb952b85982981699b85701c37eb5e5 | [
"BSD-3-Clause"
] | null | null | null | notebooks/Calculate_sensitivity_eventio.ipynb | pawel21/cta-lstchain | b7fc7c9aaeb952b85982981699b85701c37eb5e5 | [
"BSD-3-Clause"
] | null | null | null | 384.717029 | 44,692 | 0.930248 | [
[
[
"# Notebook to perform a sensitivity calculation\n\n**Content:**\n- Calculation of the collection area\n- Sensitivity calculation in energy bins\n- Sensitivity calculation in bins of gammaness and theta2 cuts\n- Optimization of the cuts using Nex/sqrt(Nbg) -> LiMa to be implemented\n- Plotting of the sensitivity in absolute values",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nfrom matplotlib.colors import LogNorm\nimport h5py\nimport pandas as pd\nimport math\nimport pyhessio\nfrom astropy import units as u\nimport eventio\nfrom eventio.simtel.simtelfile import SimTelFile",
"_____no_output_____"
],
[
"simtelfile_gammas = \"/home/queenmab/DATA/LST1/Gamma/gamma_20deg_0deg_run8___cta-prod3-lapalma-2147m-LaPalma-FlashCam.simtel.gz\"\nsimtelfile_protons = \"/home/queenmab/DATA/LST1/Proton/proton_20deg_0deg_run194___cta-prod3-lapalma-2147m-LaPalma-FlashCam.simtel.gz\"\nPATH_EVENTS = \"../../cta-lstchain-extra/reco/sample_data/dl2/\"\nfile_g = PATH_EVENTS+\"/reco_gammas.h5\" ##Same events but with reconstructed \nfile_p = PATH_EVENTS+\"/reco_protons.h5\"\nevents_g = pd.read_hdf(file_g)\nevents_p = pd.read_hdf(file_p)\nTriggered_Events_real_gammas = events_g.shape[0]\nTriggered_Events_real_protons = events_p.shape[0]",
"_____no_output_____"
],
[
"source_gammas = SimTelFile(simtelfile_gammas)\nsource_protons = SimTelFile(simtelfile_protons)",
"_____no_output_____"
],
[
"emin_g, emax_g = source_gammas.mc_run_headers[0]['E_range']*1e3 #GeV\nspectral_index_g = source_gammas.mc_run_headers[0]['spectral_index']\nnum_showers = source_gammas.mc_run_headers[0]['num_showers']\nnum_use = source_gammas.mc_run_headers[0]['num_use']\nSimulated_Events_g = num_showers * num_use\nMax_impact_g = source_gammas.mc_run_headers[0]['core_range'][1]*1e2 #cm\nArea_sim_g = math.pi * math.pow(Max_impact_g,2)\ncone_g = source_gammas.mc_run_headers[0]['viewcone'][1]\n\nemin_p, emax_p = source_protons.mc_run_headers[0]['E_range']*1e3 #GeV\nspectral_index_p = source_protons.mc_run_headers[0]['spectral_index']\nnum_showers = source_protons.mc_run_headers[0]['num_showers']\nnum_use = source_protons.mc_run_headers[0]['num_use']\nSimulated_Events_p = num_showers * num_use\nMax_impact_p = source_protons.mc_run_headers[0]['core_range'][1]*1e2 #cm\nArea_sim_p = math.pi * math.pow(Max_impact_p,2)\ncone_p = source_protons.mc_run_headers[0]['viewcone'][1]",
"_____no_output_____"
],
[
"energies_g = []\nenergies_p = []\nwith SimTelFile(simtelfile_gammas) as f:\n for i, event in enumerate(f.iter_mc_events()):\n energies_g.append(event['mc_shower']['energy']*1e3) #In GeV\nwith SimTelFile(simtelfile_protons) as f:\n for i, event in enumerate(f.iter_mc_events()):\n energies_p.append(event['mc_shower']['energy']*1e3)",
"_____no_output_____"
],
[
"e_trig_g = 10**events_g.mc_energy\ne_trig_p = 10**events_p.mc_energy\nTriggered_Events_g = e_trig_g.shape[0] \nTriggered_Events_p = e_trig_p.shape[0]",
"_____no_output_____"
],
[
"fig,ax = plt.subplots()\nax.hist(np.log10(energies_p),label = 'Simulated protons')\nax.hist(np.log10(energies_g),label='Simulated gammas')\nax.set_yscale(\"log\")",
"_____no_output_____"
],
[
"##### Binnings and constants######\n# Whenever implemented using simulated files, most of these values can be read from the simulations\needges = 6\nebins = eedges-1\nE = np.logspace(math.log10(emin_g),math.log10(emax_g),eedges)\nE_trig = np.logspace(math.log10(emin_p),math.log10(100000),eedges)\nEmed = np.sqrt(E[:-1] * E[1:])\nEmed_trig = np.sqrt(E_trig[:-1] * E_trig[1:])\ngammaness_bins = 3\ntheta2_bins = 3\n\nIndex_Crab = -2.62",
"_____no_output_____"
],
[
"##### Collection area calculation ######\ndef collection_area(Esim, Etrig):\n # Esim are all the simulated energies\n # Etrig are the energies after cuts\n area = []\n Nsim = np.power(Esim,Index_Crab-spectral_index_g)\n Ncuts = np.power(Etrig,Index_Crab-spectral_index_g)\n \n for i in range(0,ebins):\n Nsim_w = np.sum(Nsim[(Esim < E[i+1]) & (Esim > E[i])])\n Ntrig_w = np.sum(Ncuts[(Etrig < E[i+1]) & (Etrig > E[i])])\n if(Nsim_w == 0):\n print(\"You have not simulated any events in the energy range between %.3f GeV and %.3f GeV\" % (E[i],E[i+1]))\n area.append(0)\n else:\n area.append(Ntrig_w / Nsim_w * Area_sim_g) # cm^2\n\n return area\n",
"_____no_output_____"
],
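
For reference, the ratio computed by `collection_area` is the spectrum-weighted effective area per energy bin, with events re-weighted from the simulated index $\Gamma_{\mathrm{sim}}$ to the Crab index $\Gamma_{w}$:

$$A_{\mathrm{eff}}(E_i) = \frac{\sum_{\mathrm{trig},\; E \in \Delta E_i} E^{\,\Gamma_w - \Gamma_{\mathrm{sim}}}}{\sum_{\mathrm{sim},\; E \in \Delta E_i} E^{\,\Gamma_w - \Gamma_{\mathrm{sim}}}}\; A_{\mathrm{sim}}$$
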
[
"# Plot the collection area\narea = collection_area(energies_g, e_trig_g)\nfig, ax = plt.subplots()\nax.set_xlabel(\"Energy [GeV]\")\nax.set_ylabel(\"Collection area [cm$^2$]\")\nax.grid(ls='--',alpha=0.4)\nax.loglog(E[:-1], area)",
"_____no_output_____"
],
[
"gammaness_g = events_g.gammaness\ngammaness_p = events_p.gammaness\n\ntheta2_g = (events_g.src_x-events_g.src_x_rec)**2+(events_g.src_y-events_g.src_y_rec)**2\ntheta2_p = (events_p.src_x-events_p.src_x_rec)**2+(events_p.src_y-events_p.src_y_rec)**2",
"_____no_output_____"
],
[
"####### Sensitivity calculation ##########\n# We will first go for a implementation using Sig = Nex/sqrt(Nbg)\nobstime = 50 * 3600 # s (50 hours)",
"_____no_output_____"
],
[
"####### Weighting of the hadrons #####\n# No simulation, just take the gamma energy distribution and convert it to hadrons\n\n#Float_t ProtonTrueSpectralIndex = -2.70;\n#Float_t ProtonTrueNorm = 9.6e-9; // (cm2 sr s GeV)^-1 at ProtonEnorm \n#Float_t ProtonEnorm = 1000.; // GeV \n\nK = Simulated_Events_p*(1+spectral_index_p)/(emax_p**(1+spectral_index_p)-emin_p**(1+spectral_index_p))\ncone = cone_p * math.pi/180\nif(cone == 0):\n Omega = 1\nelse:\n Omega = 2*np.pi*(1-np.cos(cone))\n\nK_w = 9.6e-11 # GeV^-1 cm^-2 s^-1 \nindex_w_p = -2.7 \nE0 = 1000. # GeV \n \nInt_e1_e2 = K*E0**spectral_index_p \nNp_ = Int_e1_e2*(emax_p**(index_w_p+1)-emin_p**(index_w_p+1))/(E0**index_w_p)/(index_w_p+1) \nRp = K_w*Area_sim_p*Omega*(emax_p**(index_w_p+1)-emin_p**(index_w_p+1))/(E0**index_w_p)/(index_w_p+1) # Rate (in Hz)\nprint(\"The total rate of simulated proton events is %.1f Hz\" % Rp)\n",
"The total rate of simulated proton events is 8906573.8 Hz\n"
],
[
"####### Weighting of the gamma simulations #####\n\n# HEGRA Crab\n# TF1* CrabFluxHEGRA = new TF1(\"CrabFluxHEGRA\",\"[0]*pow(x/1000.,-[1])\",50,80000);\n# CrabFluxHEGRA->SetParameter(0,2.83e-11);\n# CrabFluxHEGRA->SetParameter(1,2.62);\n\nK = Simulated_Events_g*(1+spectral_index_g)/(emax_g**(1+spectral_index_g)-emin_g**(1+spectral_index_g)) \nArea_sim = math.pi * math.pow(Max_impact_g,2) # cm^2\ncone=0\nif(cone == 0):\n Omega = 1\nelse:\n Omega = 2*np.pi*(1-np.cos(cone))\n\n\nK_w = 2.83e-11 # GeV^-1 cm^-2 s^-1 \nindex_w_g = -2.62 \nE0 = 1000. # GeV \n \nInt_e1_e2 = K*E0**spectral_index_g \nN_ = Int_e1_e2*(emax_g**(index_w_g+1)-emin_g**(index_w_g+1))/(E0**index_w_g)/(index_w_g+1) \nR = K_w*Area_sim_g*Omega*(emax_g**(index_w_g+1)-emin_g**(index_w_g+1))/(E0**index_w_g)/(index_w_g+1) # Rate (in Hz)\nprint(\"The total rate of simulated gamma events is %.1f Hz\" % R)",
"The total rate of simulated gamma events is 17168202.3 Hz\n"
],
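
The rate `R` evaluated in these two weighting cells is the analytic integral of the target power law over the simulated energy range and solid angle:

$$R = K_w\, A_{\mathrm{sim}}\, \Omega \int_{E_{\min}}^{E_{\max}} \left(\frac{E}{E_0}\right)^{\gamma_w} \mathrm{d}E = \frac{K_w\, A_{\mathrm{sim}}\, \Omega}{\gamma_w + 1}\,\frac{E_{\max}^{\,\gamma_w+1} - E_{\min}^{\,\gamma_w+1}}{E_0^{\,\gamma_w}}$$
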
[
"energies_g = np.asarray(energies_g)\nenergies_p = np.asarray(energies_p)\ne_w = ((energies_g/E0)**(index_w_g-spectral_index_g))*R/N_\ne_trig_w = ((e_trig_g/E0)**(index_w_g-spectral_index_g))*R/N_\nep_w = ((energies_p/E0)**(index_w_p-spectral_index_p))*Rp/Np_\nep_trig_w = ((e_trig_p/E0)**(index_w_p-spectral_index_p))*Rp/Np_\n\nfig, (ax1, ax2) = plt.subplots(1, 2, figsize=(15,5))\n\nax1.hist(np.log10(energies_g),histtype=u'step',bins=20, density=1,label=\"Simulated\")\nax1.hist(np.log10(energies_g),histtype=u'step',bins=20,weights = e_w, density=1,label=\"Weighted to Crab\")\nax1.set_yscale('log')\n#plt.xscale('log')\nax1.set_xlabel(\"$log_{10}E (GeV)$\")\nax1.grid(ls='--',alpha=.5)\nax1.legend()\n\n#ax2.hist(np.log10(e),histtype=u'step',bins=20,label=\"Simulated rate\")\nax2.hist(np.log10(energies_g),histtype=u'step',bins=20,weights = e_w,label=\"Simulated rate weighted to Crab\")\nax2.hist(np.log10(e_trig_g),histtype=u'step',bins=20,weights = e_trig_w,label=\"Triggered rate weighted to Crab\")\nax2.hist(np.log10(energies_p),histtype=u'step',bins=20,weights = ep_w,label=\"Simulated Protons\")\nax2.hist(np.log10(e_trig_p),histtype=u'step',bins=20,weights = ep_trig_w,label=\"Triggered Protons\")\n\nax2.legend()\nax2.set_yscale('log')\nax2.set_xlabel(\"$log_{10}E (GeV)$\")\nax2.grid(ls='--',alpha=.5)\n#plt.xscale('log')",
"_____no_output_____"
],
[
"for i in range(0,ebins): # binning in energy\n e_w_sum = np.sum(e_w[(energies_g < E[i+1]) & (energies_g > E[i])])\n print(\"Rate of gammas between %.1f GeV and %.1f GeV: %.2f Hz\" % (E[i],E[i+1],e_w_sum))\nfor i in range(0,ebins): # binning in energy\n e_w_sum = np.sum(ep_w[(energies_p < E[i+1]) & (energies_p > E[i])])\n print(\"Rate of protons between %.1f GeV and %.1f GeV: %.2f Hz\" % (E[i],E[i+1],e_w_sum))\nfor i in range(0,ebins): # binning in energy\n e_w_sum = np.sum(e_trig_w[(e_trig_g < E_trig[i+1]) & (e_trig_g > E_trig[i])])\n print(\"Rate of triggered gammas between %.1f GeV and %.1f GeV: %.6f Hz\" % (E_trig[i],E_trig[i+1],e_w_sum))\nfor i in range(0,ebins): # binning in energy\n e_w_sum = np.sum(ep_trig_w[(e_trig_p < E_trig[i+1]) & (e_trig_p > E_trig[i])])\n print(\"Rate of triggered protons between %.1f GeV and %.1f GeV: %.6f Hz\" % (E_trig[i],E_trig[i+1],e_w_sum))",
"Rate of gammas between 3.0 GeV and 30.6 GeV: 16766451.12 Hz\nRate of gammas between 30.6 GeV and 311.7 GeV: 389846.83 Hz\nRate of gammas between 311.7 GeV and 3176.6 GeV: 9200.08 Hz\nRate of gammas between 3176.6 GeV and 32376.9 GeV: 215.26 Hz\nRate of gammas between 32376.9 GeV and 330000.0 GeV: 7.20 Hz\nRate of protons between 3.0 GeV and 30.6 GeV: 8609967.25 Hz\nRate of protons between 30.6 GeV and 311.7 GeV: 276792.36 Hz\nRate of protons between 311.7 GeV and 3176.6 GeV: 5428.90 Hz\nRate of protons between 3176.6 GeV and 32376.9 GeV: 107.92 Hz\nRate of protons between 32376.9 GeV and 330000.0 GeV: 1.07 Hz\nRate of triggered gammas between 4.0 GeV and 30.3 GeV: 8835.013896 Hz\nRate of triggered gammas between 30.3 GeV and 229.7 GeV: 8148.590852 Hz\nRate of triggered gammas between 229.7 GeV and 1741.1 GeV: 797.127299 Hz\nRate of triggered gammas between 1741.1 GeV and 13195.1 GeV: 48.895267 Hz\nRate of triggered gammas between 13195.1 GeV and 100000.0 GeV: 0.962434 Hz\nRate of triggered protons between 4.0 GeV and 30.3 GeV: 3.656702 Hz\nRate of triggered protons between 30.3 GeV and 229.7 GeV: 37.462429 Hz\nRate of triggered protons between 229.7 GeV and 1741.1 GeV: 16.755434 Hz\nRate of triggered protons between 1741.1 GeV and 13195.1 GeV: 1.736270 Hz\nRate of triggered protons between 13195.1 GeV and 100000.0 GeV: 0.160248 Hz\n"
],
[
"# Cut optimization for gammas and hadrons\n\nfinal_gamma = np.ndarray(shape=(ebins,gammaness_bins,theta2_bins))\nfinal_hadrons = np.ndarray(shape=(ebins,gammaness_bins,theta2_bins))\n\nfor i in range(0,eedges-1): # binning in energy\n e_w_binE = np.sum(e_w[(energies_g < E[i+1]) & (energies_g > E[i])])\n for g in range(0,gammaness_bins): # cut in gammaness\n Ngammas = []\n Nhadrons = []\n for t in range(0,theta2_bins): # cut in theta2\n e_trig_w_sum = np.sum(e_trig_w[(e_trig_g < E_trig[i+1]) & (e_trig_g > E_trig[i]) \\\n & (gammaness_g > 0.1*g) & (theta2_g < 0.05*(t+1))])\n # Just considering all the hadrons give trigger...\n ep_w_sum = np.sum(ep_trig_w[(e_trig_p < E_trig[i+1]) & (e_trig_p > E_trig[i]) \\\n & (gammaness_p > 0.1*g) & (theta2_p < 0.05*(t+1))])\n \n final_gamma[i][g][t] = e_trig_w_sum * obstime\n final_hadrons[i][g][t] = ep_w_sum * obstime\n",
"_____no_output_____"
],
[
"def Calculate_sensititity(Ng, Nh, alpha):\n significance = (Ng)/np.sqrt(Nh * alpha)\n sensitivity = 5/significance * 100 # percentage of Crab\n \n return sensitivity\nsens = Calculate_sensititity(final_gamma, final_hadrons, 1)",
"_____no_output_____"
],
[
"def fill_bin_content(ax,energy_bin):\n for i in range(0,gammaness_bins):\n for j in range(0,theta2_bins):\n text = ax.text((j+0.5)*(0.5/theta2_bins), (i+0.5)*(1/gammaness_bins), \"%.2E %%\" % sens[energy_bin][i][j],\n ha=\"center\", va=\"center\", color=\"w\")\n return ax",
"_____no_output_____"
],
[
"def format_axes(ax,pl):\n ax.set_aspect(0.5)\n\n ax.set_ylabel(r'Gammaness',fontsize=15)\n ax.set_xlabel(r'$\\theta^2$ (deg$^2$)',fontsize=15)\n \n starty, endy = ax.get_ylim()\n ax.yaxis.set_ticks(np.arange(endy, starty, 0.1)[::-1])\n startx, endx = ax.get_xlim()\n ax.xaxis.set_ticks(np.arange(startx, endx, 0.1))\n\n cbaxes = fig.add_axes([0.9, 0.125, 0.03, 0.755])\n cbar = fig.colorbar(pl,cax=cbaxes)\n cbar.set_label('Sensitivity (% Crab)',fontsize=15) ",
"_____no_output_____"
],
[
"# Sensitivity plots for different Energy bins\nfor ebin in range(0,ebins):\n fig, ax = plt.subplots(figsize=(8,8))\n pl = ax.imshow(sens[ebin], cmap='viridis', extent=[0., 0.5, 1., 0.])\n fill_bin_content(ax, ebin)\n\n format_axes(ax, pl)",
"_____no_output_____"
],
[
"def Crab_spectrum(x):\n MAGIC_par=[3.23e-11, -2.47, -0.24]\n #dFdE = MAGIC_par[0]*pow(x/1.,MAGIC_par[1]+MAGIC_par[2]*np.log10(x/1.))\n dFdE = MAGIC_par[0]*pow(x/1000.,MAGIC_par[1]+MAGIC_par[2]*np.log10(x/1000.))\n \n return dFdE",
"_____no_output_____"
],
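[
"# Editor's hedged sanity check: at x = 1000 GeV (1 TeV) the log10 term vanishes,\n# so Crab_spectrum should return the MAGIC normalisation, 3.23e-11.\nprint(Crab_spectrum(1000.))",
"_____no_output_____"
],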
[
"def format_axes_array(ax, arr_i,arr_j):\n ax.set_aspect(0.5)\n if ((arr_i == 0) and (arr_j == 0)):\n ax.set_ylabel(r'Gammaness',fontsize=15)\n if ((arr_i == 3) and (arr_j == 2)):\n ax.set_xlabel(r'$\\theta^2$ (deg$^2$)',fontsize=15)\n\n starty, endy = ax.get_ylim()\n ax.yaxis.set_ticks(np.arange(endy, starty, 0.1)[::-1])\n startx, endx = ax.get_xlim()\n ax.xaxis.set_ticks(np.arange(startx, endx, 0.1))\n \n cbaxes = fig.add_axes([0.91, 0.125, 0.03, 0.755])\n cbar = fig.colorbar(pl,cax=cbaxes)\n cbar.set_label('Sensitivity (% Crab)',fontsize=15) ",
"_____no_output_____"
],
[
"#fig, ax = plt.subplots(figsize=(8,8), )\nfig, axarr = plt.subplots(2,3, sharex=True, sharey=True, figsize=(13.2,18))\nindices=[]\nsensitivity = np.ndarray(shape=ebins)\n\nsens = sens+1e-6\n\nfor ebin in range(0,ebins):\n arr_i = int(ebin/3)\n arr_j = ebin-int(ebin/3)*3\n pl = axarr[arr_i,arr_j].imshow(sens[ebin], cmap='viridis_r', extent=[0., 0.5, 1., 0.]\n #vmin=sens.min(), vmax=sens.max())\n ,norm=LogNorm(vmin=sens.min(), vmax=sens.max()))\n format_axes_array(axarr[arr_i,arr_j],arr_i,arr_j)\n\n # gammaness/theta2 indices where the minimum in sensitivity is reached\n ind = np.unravel_index(np.argmin(sens[sens>1e-6][ebin], axis=None), sens[ebin].shape)\n indices.append(ind)\n sensitivity[ebin] = sens[ebin][ind]\n \nfig.subplots_adjust(hspace = 0, wspace = 0)\n#format_axes(ax)",
"/home/queenmab/anaconda3/envs/cta-dev/lib/python3.6/site-packages/matplotlib/cbook/deprecation.py:107: MatplotlibDeprecationWarning: Adding an axes using the same arguments as a previous axes currently reuses the earlier instance. In a future version, a new instance will always be created and returned. Meanwhile, this warning can be suppressed, and the future behavior ensured, by passing a unique label to each axes instance.\n warnings.warn(message, mplDeprecation, stacklevel=1)\n"
],
[
"def plot_Crab(ax, percentage=100, **kwargs):\n # factor is the percentage of Crab \n En = np.logspace(math.log10(100),math.log10(3.e4),40) # in TeV\n dFdE = percentage / 100. * Crab_spectrum(En)\n ax.loglog(En,dFdE * En/1.e3 * En/1.e3, color='gray', **kwargs)\n \n return ax",
"_____no_output_____"
],
[
"def format_axes(ax):\n ax.set_xscale(\"log\", nonposx='clip')\n ax.set_yscale(\"log\", nonposy='clip')\n ax.set_xlim(5e1,9.e4)\n ax.set_ylim(1.e-14,5.e-10)\n ax.set_xlabel(\"Energy [GeV]\")\n ax.set_ylabel(r'E$^2$ $\\frac{\\mathrm{dN}}{\\mathrm{dE}}$ [TeV cm$^{-2}$ s$^{-1}$]')\n ax.grid(ls='--',alpha=.5)",
"_____no_output_____"
],
[
"sensitivity",
"_____no_output_____"
],
[
"Emed = Emed_trig[sensitivity>0]\ndef plot_sensitivity(ax):\n dFdE = Crab_spectrum(Emed)\n ax.loglog(Emed, sensitivity[sensitivity>0] / 100 * dFdE * Emed/1.e3 * Emed/1.e3, label = 'Sensitivity')",
"_____no_output_____"
],
[
"#### SENSITIVITY PLOT ######\nfig, ax = plt.subplots()\nplot_sensitivity(ax)\n\nplot_Crab(ax, label=r'Crab')\n#plot_Crab(ax,10,ls='dashed',label='10% Crab')\nplot_Crab(ax,1,ls='dotted',label='1% Crab')\n\n\n#format_axes(ax)\nax.legend(numpoints=1,prop={'size':9},ncol=2,loc='upper right')",
"_____no_output_____"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e7a0d98877ea06103b0aa72076769f98e8f9dae3 | 660,929 | ipynb | Jupyter Notebook | word2vec-embeddings/Negative_Sampling_My_Solution.ipynb | iromeo/deep-learning-v2-pytorch | ace2983842a7cf6204c2a59f3fba561dfc64aad1 | [
"MIT"
] | null | null | null | word2vec-embeddings/Negative_Sampling_My_Solution.ipynb | iromeo/deep-learning-v2-pytorch | ace2983842a7cf6204c2a59f3fba561dfc64aad1 | [
"MIT"
] | null | null | null | word2vec-embeddings/Negative_Sampling_My_Solution.ipynb | iromeo/deep-learning-v2-pytorch | ace2983842a7cf6204c2a59f3fba561dfc64aad1 | [
"MIT"
] | null | null | null | 779.397406 | 623,012 | 0.944191 | [
[
[
"# Skip-gram Word2Vec\n\nIn this notebook, I'll lead you through using PyTorch to implement the [Word2Vec algorithm](https://en.wikipedia.org/wiki/Word2vec) using the skip-gram architecture. By implementing this, you'll learn about embedding words for use in natural language processing. This will come in handy when dealing with things like machine translation.\n\n## Readings\n\nHere are the resources I used to build this notebook. I suggest reading these either beforehand or while you're working on this material.\n\n* A really good [conceptual overview](http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/) of Word2Vec from Chris McCormick \n* [First Word2Vec paper](https://arxiv.org/pdf/1301.3781.pdf) from Mikolov et al.\n* [Neural Information Processing Systems, paper](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf) with improvements for Word2Vec also from Mikolov et al.\n\n---\n## Word embeddings\n\nWhen you're dealing with words in text, you end up with tens of thousands of word classes to analyze; one for each word in a vocabulary. Trying to one-hot encode these words is massively inefficient because most values in a one-hot vector will be set to zero. So, the matrix multiplication that happens in between a one-hot input vector and a first, hidden layer will result in mostly zero-valued hidden outputs.\n\nTo solve this problem and greatly increase the efficiency of our networks, we use what are called **embeddings**. Embeddings are just a fully connected layer like you've seen before. We call this layer the embedding layer and the weights are embedding weights. We skip the multiplication into the embedding layer by instead directly grabbing the hidden layer values from the weight matrix. We can do this because the multiplication of a one-hot encoded vector with a matrix returns the row of the matrix corresponding the index of the \"on\" input unit.\n\n<img src='assets/lookup_matrix.png' width=50%>\n\nInstead of doing the matrix multiplication, we use the weight matrix as a lookup table. We encode the words as integers, for example \"heart\" is encoded as 958, \"mind\" as 18094. Then to get hidden layer values for \"heart\", you just take the 958th row of the embedding matrix. This process is called an **embedding lookup** and the number of hidden units is the **embedding dimension**.\n \nThere is nothing magical going on here. The embedding lookup table is just a weight matrix. The embedding layer is just a hidden layer. The lookup is just a shortcut for the matrix multiplication. The lookup table is trained just like any weight matrix.\n\nEmbeddings aren't only used for words of course. You can use them for any model where you have a massive number of classes. A particular type of model called **Word2Vec** uses the embedding layer to find vector representations of words that contain semantic meaning.",
"_____no_output_____"
],
[
"---\n## Word2Vec\n\nThe Word2Vec algorithm finds much more efficient representations by finding vectors that represent the words. These vectors also contain semantic information about the words.\n\n<img src=\"assets/context_drink.png\" width=40%>\n\nWords that show up in similar **contexts**, such as \"coffee\", \"tea\", and \"water\" will have vectors near each other. Different words will be further away from one another, and relationships can be represented by distance in vector space.\n\n\nThere are two architectures for implementing Word2Vec:\n>* CBOW (Continuous Bag-Of-Words) and \n* Skip-gram\n\n<img src=\"assets/word2vec_architectures.png\" width=60%>\n\nIn this implementation, we'll be using the **skip-gram architecture** with **negative sampling** because it performs better than CBOW and trains faster with negative sampling. Here, we pass in a word and try to predict the words surrounding it in the text. In this way, we can train the network to learn representations for words that show up in similar contexts.",
"_____no_output_____"
],
[
"---\n## Loading Data\n\nNext, we'll ask you to load in data and place it in the `data` directory\n\n1. Load the [text8 dataset](https://s3.amazonaws.com/video.udacity-data.com/topher/2018/October/5bbe6499_text8/text8.zip); a file of cleaned up *Wikipedia article text* from Matt Mahoney. \n2. Place that data in the `data` folder in the home directory.\n3. Then you can extract it and delete the archive, zip file to save storage space.\n\nAfter following these steps, you should have one file in your data directory: `data/text8`.",
"_____no_output_____"
]
],
[
[
"# read in the extracted text file \nwith open('data/text8') as f:\n text = f.read()\n\n# print out the first 100 characters\nprint(text[:100])",
" anarchism originated as a term of abuse first used against early working class radicals including t\n"
]
],
[
[
"## Pre-processing\n\nHere I'm fixing up the text to make training easier. This comes from the `utils.py` file. The `preprocess` function does a few things:\n>* It converts any punctuation into tokens, so a period is changed to ` <PERIOD> `. In this data set, there aren't any periods, but it will help in other NLP problems. \n* It removes all words that show up five or *fewer* times in the dataset. This will greatly reduce issues due to noise in the data and improve the quality of the vector representations. \n* It returns a list of words in the text.\n\nThis may take a few seconds to run, since our text file is quite large. If you want to write your own functions for this stuff, go for it!",
"_____no_output_____"
]
],
[
[
"import utils\n\n# get list of words\nwords = utils.preprocess(text)\nprint(words[:30])",
"['anarchism', 'originated', 'as', 'a', 'term', 'of', 'abuse', 'first', 'used', 'against', 'early', 'working', 'class', 'radicals', 'including', 'the', 'diggers', 'of', 'the', 'english', 'revolution', 'and', 'the', 'sans', 'culottes', 'of', 'the', 'french', 'revolution', 'whilst']\n"
],
[
"# print some stats about this word data\nprint(\"Total words in text: {}\".format(len(words)))\nprint(\"Unique words: {}\".format(len(set(words)))) # `set` removes any duplicate words",
"Total words in text: 16680599\nUnique words: 63641\n"
]
],
[
[
"### Dictionaries\n\nNext, I'm creating two dictionaries to convert words to integers and back again (integers to words). This is again done with a function in the `utils.py` file. `create_lookup_tables` takes in a list of words in a text and returns two dictionaries.\n>* The integers are assigned in descending frequency order, so the most frequent word (\"the\") is given the integer 0 and the next most frequent is 1, and so on. \n\nOnce we have our dictionaries, the words are converted to integers and stored in the list `int_words`.",
"_____no_output_____"
]
],
[
[
"vocab_to_int, int_to_vocab = utils.create_lookup_tables(words)\nint_words = [vocab_to_int[word] for word in words]\n\nprint(int_words[:30])",
"[5233, 3080, 11, 5, 194, 1, 3133, 45, 58, 155, 127, 741, 476, 10571, 133, 0, 27349, 1, 0, 102, 854, 2, 0, 15067, 58112, 1, 0, 150, 854, 3580]\n"
]
],
[
[
"## Subsampling\n\nWords that show up often such as \"the\", \"of\", and \"for\" don't provide much context to the nearby words. If we discard some of them, we can remove some of the noise from our data and in return get faster training and better representations. This process is called subsampling by Mikolov. For each word $w_i$ in the training set, we'll discard it with probability given by \n\n$$ P(w_i) = 1 - \\sqrt{\\frac{t}{f(w_i)}} $$\n\nwhere $t$ is a threshold parameter and $f(w_i)$ is the frequency of word $w_i$ in the total dataset.\n\n> Implement subsampling for the words in `int_words`. That is, go through `int_words` and discard each word given the probablility $P(w_i)$ shown above. Note that $P(w_i)$ is the probability that a word is discarded. Assign the subsampled data to `train_words`.",
"_____no_output_____"
]
],
[
[
"from collections import Counter\nimport random\nimport numpy as np\n\nthreshold = 1e-5\nword_counts = Counter(int_words)\n#print(list(word_counts.items())[0]) # dictionary of int_words, how many times they appear\n\ntotal_count = len(int_words)\nfreqs = {word: count/total_count for word, count in word_counts.items()}\np_drop = {word: 1 - np.sqrt(threshold/freqs[word]) for word in word_counts}\n# discard some frequent words, according to the subsampling equation\n# create a new list of words for training\ntrain_words = [word for word in int_words if random.random() < (1 - p_drop[word])]\n\nprint(train_words[:30])",
"[5233, 741, 10571, 27349, 15067, 58112, 854, 10712, 19, 708, 2757, 5233, 248, 44611, 2877, 5233, 8983, 4147, 6437, 5233, 1818, 4860, 6753, 7573, 566, 247, 11064, 7088, 5948, 4861]\n"
]
],
[
[
"## Making batches",
"_____no_output_____"
],
[
"Now that our data is in good shape, we need to get it into the proper form to pass it into our network. With the skip-gram architecture, for each word in the text, we want to define a surrounding _context_ and grab all the words in a window around that word, with size $C$. \n\nFrom [Mikolov et al.](https://arxiv.org/pdf/1301.3781.pdf): \n\n\"Since the more distant words are usually less related to the current word than those close to it, we give less weight to the distant words by sampling less from those words in our training examples... If we choose $C = 5$, for each training word we will select randomly a number $R$ in range $[ 1: C ]$, and then use $R$ words from history and $R$ words from the future of the current word as correct labels.\"\n\n> **Exercise:** Implement a function `get_target` that receives a list of words, an index, and a window size, then returns a list of words in the window around the index. Make sure to use the algorithm described above, where you chose a random number of words to from the window.\n\nSay, we have an input and we're interested in the idx=2 token, `741`: \n```\n[5233, 58, 741, 10571, 27349, 0, 15067, 58112, 3580, 58, 10712]\n```\n\nFor `R=2`, `get_target` should return a list of four values:\n```\n[5233, 58, 10571, 27349]\n```",
"_____no_output_____"
]
],
[
[
"def get_target(words, idx, window_size=5):\n ''' Get a list of words in a window around an index. '''\n \n R = np.random.randint(1, window_size+1)\n start = idx - R if (idx - R) > 0 else 0\n stop = idx + R\n target_words = words[start:idx] + words[idx+1:stop+1]\n \n return list(target_words)",
"_____no_output_____"
],
[
"# test your code!\n\n# run this cell multiple times to check for random window selection\nint_text = [i for i in range(10)]\nprint('Input: ', int_text)\nidx=5 # word index of interest\n\ntarget = get_target(int_text, idx=idx, window_size=5)\nprint('Target: ', target) # you should get some indices around the idx",
"Input: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]\nTarget: [0, 1, 2, 3, 4, 6, 7, 8, 9]\n"
]
],
[
[
"### Generating Batches \n\nHere's a generator function that returns batches of input and target data for our model, using the `get_target` function from above. The idea is that it grabs `batch_size` words from a words list. Then for each of those batches, it gets the target words in a window.",
"_____no_output_____"
]
],
[
[
"def get_batches(words, batch_size, window_size=5):\n ''' Create a generator of word batches as a tuple (inputs, targets) '''\n \n n_batches = len(words)//batch_size\n \n # only full batches\n words = words[:n_batches*batch_size]\n \n for idx in range(0, len(words), batch_size):\n x, y = [], []\n batch = words[idx:idx+batch_size]\n for ii in range(len(batch)):\n batch_x = batch[ii]\n batch_y = get_target(batch, ii, window_size)\n y.extend(batch_y)\n x.extend([batch_x]*len(batch_y))\n yield x, y\n ",
"_____no_output_____"
],
[
"int_text = [i for i in range(20)]\nx,y = next(get_batches(int_text, batch_size=4, window_size=5))\n\nprint('x\\n', x)\nprint('y\\n', y)",
"x\n [0, 0, 1, 1, 1, 2, 2, 2, 3, 3]\ny\n [1, 2, 0, 2, 3, 0, 1, 3, 1, 2]\n"
]
],
[
[
"---\n## Validation\n\nHere, I'm creating a function that will help us observe our model as it learns. We're going to choose a few common words and few uncommon words. Then, we'll print out the closest words to them using the cosine similarity: \n\n<img src=\"assets/two_vectors.png\" width=30%>\n\n$$\n\\mathrm{similarity} = \\cos(\\theta) = \\frac{\\vec{a} \\cdot \\vec{b}}{|\\vec{a}||\\vec{b}|}\n$$\n\n\nWe can encode the validation words as vectors $\\vec{a}$ using the embedding table, then calculate the similarity with each word vector $\\vec{b}$ in the embedding table. With the similarities, we can print out the validation words and words in our embedding table semantically similar to those words. It's a nice way to check that our embedding table is grouping together words with similar semantic meanings.",
"_____no_output_____"
]
],
[
[
"def cosine_similarity(embedding, valid_size=16, valid_window=100, device='cpu'):\n \"\"\" Returns the cosine similarity of validation words with words in the embedding matrix.\n Here, embedding should be a PyTorch embedding module.\n \"\"\"\n \n # Here we're calculating the cosine similarity between some random words and \n # our embedding vectors. With the similarities, we can look at what words are\n # close to our random words.\n \n # sim = (a . b) / |a||b|\n \n embed_vectors = embedding.weight\n \n # magnitude of embedding vectors, |b|\n magnitudes = embed_vectors.pow(2).sum(dim=1).sqrt().unsqueeze(0)\n \n # pick N words from our ranges (0,window) and (1000,1000+window). lower id implies more frequent \n valid_examples = np.array(random.sample(range(valid_window), valid_size//2))\n valid_examples = np.append(valid_examples,\n random.sample(range(1000,1000+valid_window), valid_size//2))\n valid_examples = torch.LongTensor(valid_examples).to(device)\n \n valid_vectors = embedding(valid_examples)\n similarities = torch.mm(valid_vectors, embed_vectors.t())/magnitudes\n \n return valid_examples, similarities",
"_____no_output_____"
]
],
[
[
"---\n# SkipGram model\n\nDefine and train the SkipGram model. \n> You'll need to define an [embedding layer](https://pytorch.org/docs/stable/nn.html#embedding) and a final, softmax output layer.\n\nAn Embedding layer takes in a number of inputs, importantly:\n* **num_embeddings** – the size of the dictionary of embeddings, or how many rows you'll want in the embedding weight matrix\n* **embedding_dim** – the size of each embedding vector; the embedding dimension\n\nBelow is an approximate diagram of the general structure of our network.\n<img src=\"assets/skip_gram_arch.png\" width=60%>\n\n>* The input words are passed in as batches of input word tokens. \n* This will go into a hidden layer of linear units (our embedding layer). \n* Then, finally into a softmax output layer. \n\nWe'll use the softmax layer to make a prediction about the context words by sampling, as usual.",
"_____no_output_____"
],
[
"---\n## Negative Sampling\n\nFor every example we give the network, we train it using the output from the softmax layer. That means for each input, we're making very small changes to millions of weights even though we only have one true example. This makes training the network very inefficient. We can approximate the loss from the softmax layer by only updating a small subset of all the weights at once. We'll update the weights for the correct example, but only a small number of incorrect, or noise, examples. This is called [\"negative sampling\"](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf). \n\nThere are two modifications we need to make. First, since we're not taking the softmax output over all the words, we're really only concerned with one output word at a time. Similar to how we use an embedding table to map the input word to the hidden layer, we can now use another embedding table to map the hidden layer to the output word. Now we have two embedding layers, one for input words and one for output words. Secondly, we use a modified loss function where we only care about the true example and a small subset of noise examples.\n\n$$\n- \\large \\log{\\sigma\\left(u_{w_O}\\hspace{0.001em}^\\top v_{w_I}\\right)} -\n\\sum_i^N \\mathbb{E}_{w_i \\sim P_n(w)}\\log{\\sigma\\left(-u_{w_i}\\hspace{0.001em}^\\top v_{w_I}\\right)}\n$$\n\nThis is a little complicated so I'll go through it bit by bit. $u_{w_O}\\hspace{0.001em}^\\top$ is the embedding vector for our \"output\" target word (transposed, that's the $^\\top$ symbol) and $v_{w_I}$ is the embedding vector for the \"input\" word. Then the first term \n\n$$\\large \\log{\\sigma\\left(u_{w_O}\\hspace{0.001em}^\\top v_{w_I}\\right)}$$\n\nsays we take the log-sigmoid of the inner product of the output word vector and the input word vector. Now the second term, let's first look at \n\n$$\\large \\sum_i^N \\mathbb{E}_{w_i \\sim P_n(w)}$$ \n\nThis means we're going to take a sum over words $w_i$ drawn from a noise distribution $w_i \\sim P_n(w)$. The noise distribution is basically our vocabulary of words that aren't in the context of our input word. In effect, we can randomly sample words from our vocabulary to get these words. $P_n(w)$ is an arbitrary probability distribution though, which means we get to decide how to weight the words that we're sampling. This could be a uniform distribution, where we sample all words with equal probability. Or it could be according to the frequency that each word shows up in our text corpus, the unigram distribution $U(w)$. The authors found the best distribution to be $U(w)^{3/4}$, empirically. \n\nFinally, in \n\n$$\\large \\log{\\sigma\\left(-u_{w_i}\\hspace{0.001em}^\\top v_{w_I}\\right)},$$ \n\nwe take the log-sigmoid of the negated inner product of a noise vector with the input vector. \n\n<img src=\"assets/neg_sampling_loss.png\" width=50%>\n\nTo give you an intuition for what we're doing here, remember that the sigmoid function returns a probability between 0 and 1. The first term in the loss pushes the probability that our network will predict the correct word $w_O$ towards 1. In the second term, since we are negating the sigmoid input, we're pushing the probabilities of the noise words towards 0.",
"_____no_output_____"
]
],
[
[
"import torch\nfrom torch import nn\nimport torch.optim as optim",
"_____no_output_____"
],
[
"tmp_emb = nn.Embedding(5, 2)\nprint(tmp_emb.weight.shape)\ntmp_w = tmp_emb.weight\nprint(tmp_w)\nprint(tmp_w.data)\ntmp_w.data.uniform_(-1,1)\nprint(tmp_w.data)",
"torch.Size([5, 2])\nParameter containing:\ntensor([[-0.2173, 0.7378],\n [-0.3144, 0.1738],\n [ 0.7321, 0.1778],\n [-1.6267, 1.4683],\n [ 0.0070, 0.3207]], requires_grad=True)\ntensor([[-0.2173, 0.7378],\n [-0.3144, 0.1738],\n [ 0.7321, 0.1778],\n [-1.6267, 1.4683],\n [ 0.0070, 0.3207]])\ntensor([[ 0.2382, -0.8082],\n [-0.5666, 0.7127],\n [-0.1985, 0.3409],\n [-0.8994, -0.8410],\n [-0.7099, 0.9193]])\n"
],
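[
"# Editor's hedged demo (added for clarity, not in the original run): negative\n# samples are drawn from the unigram distribution raised to the 3/4 power, as in\n# Mikolov et al. The frequencies below are made up for illustration.\ntmp_freqs = torch.tensor([0.5, 0.3, 0.1, 0.07, 0.03])\ntmp_noise_dist = tmp_freqs**0.75 / (tmp_freqs**0.75).sum()\nprint(tmp_noise_dist)\nprint(torch.multinomial(tmp_noise_dist, 10, replacement=True))",
"_____no_output_____"
],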
[
"class SkipGramNeg(nn.Module):\n def __init__(self, n_vocab, n_embed, noise_dist=None):\n super().__init__()\n \n self.n_vocab = n_vocab\n self.n_embed = n_embed\n self.noise_dist = noise_dist\n \n # define embedding layers for input and output words\n self.in_embed = nn.Embedding(n_vocab, n_embed)\n self.out_embed = nn.Embedding(n_vocab, n_embed)\n \n # Initialize both embedding tables with uniform distribution\n self.in_embed.weight.data.uniform_(-1,1)\n self.in_embed.weight.data.uniform_(-1,1)\n \n # !! note: no linear layer / soft max\n \n def forward_input(self, input_words):\n # return input vector embeddings\n \n return self.in_embed(input_words)\n \n def forward_output(self, output_words):\n # return output vector embeddings\n\n return self.out_embed(output_words)\n \n def forward_noise(self, batch_size, n_samples):\n \"\"\" Generate noise vectors with shape (batch_size, n_samples, n_embed)\"\"\"\n if self.noise_dist is None:\n # Sample words uniformly\n noise_dist = torch.ones(self.n_vocab)\n else:\n noise_dist = self.noise_dist\n \n # Sample words from our noise distribution\n noise_words = torch.multinomial(noise_dist,\n batch_size * n_samples,\n replacement=True)\n \n device = \"cuda\" if model.out_embed.weight.is_cuda else \"cpu\"\n noise_words = noise_words.to(device)\n \n ## TODO: get the noise embeddings\n # reshape the embeddings so that they have dims (batch_size, n_samples, n_embed)\n noise_words = self.out_embed(noise_words).view(batch_size, n_samples, self.n_embed)\n \n return noise_words",
"_____no_output_____"
],
[
"class NegativeSamplingLoss(nn.Module):\n def __init__(self):\n super().__init__()\n\n def forward(self, input_vectors, output_vectors, noise_vectors):\n \n batch_size, embed_size = input_vectors.shape\n \n # Input vectors should be a batch of column vectors\n input_vectors = input_vectors.view(batch_size, embed_size, 1)\n \n # Output vectors should be a batch of row vectors\n output_vectors = output_vectors.view(batch_size, 1, embed_size) # here is (1, emb_size)to make output vector is transposed\n \n # bmm = batch matrix multiplication\n # correct log-sigmoid loss\n out_loss = torch.bmm(output_vectors, input_vectors).sigmoid().log()\n out_loss = out_loss.squeeze() # remove empty dimentions\n \n # incorrect log-sigmoid loss\n noise_loss = torch.bmm(noise_vectors.neg(), input_vectors).sigmoid().log()\n noise_loss = noise_loss.squeeze().sum(1) # sum the losses over the sample of noise vectors\n\n # negate and sum correct and noisy log-sigmoid losses\n # return average batch loss\n return -(out_loss + noise_loss).mean()",
"_____no_output_____"
]
],
[
[
"### Training\n\nBelow is our training loop, and I recommend that you train on GPU, if available.",
"_____no_output_____"
]
],
[
[
"device = 'cuda' if torch.cuda.is_available() else 'cpu'\n\n# Get our noise distribution\n# Using word frequencies calculated earlier in the notebook\nword_freqs = np.array(sorted(freqs.values(), reverse=True))\nunigram_dist = word_freqs/word_freqs.sum()\nnoise_dist = torch.from_numpy(unigram_dist**(0.75)/np.sum(unigram_dist**(0.75)))\n\n# instantiating the model\nembedding_dim = 300\nmodel = SkipGramNeg(len(vocab_to_int), embedding_dim, noise_dist=noise_dist).to(device)\n\n# using the loss that we defined\ncriterion = NegativeSamplingLoss() \noptimizer = optim.Adam(model.parameters(), lr=0.003)\n\nprint_every = 1500\nsteps = 0\nepochs = 5\n\n# train for some number of epochs\nfor e in range(epochs):\n \n # get our input, target batches\n for input_words, target_words in get_batches(train_words, 512):\n steps += 1\n inputs, targets = torch.LongTensor(input_words), torch.LongTensor(target_words)\n inputs, targets = inputs.to(device), targets.to(device)\n \n # input, outpt, and noise vectors\n input_vectors = model.forward_input(inputs)\n output_vectors = model.forward_output(targets)\n noise_vectors = model.forward_noise(inputs.shape[0], 5) # batchsise: inputs.shape[0]; 5 - number of noise vectors\n\n # negative sampling loss\n loss = criterion(input_vectors, output_vectors, noise_vectors)\n\n optimizer.zero_grad()\n loss.backward()\n optimizer.step()\n\n # loss stats\n if steps % print_every == 0:\n print(\"Epoch: {}/{}\".format(e+1, epochs))\n print(\"Loss: \", loss.item()) # avg batch loss at this point in training\n valid_examples, valid_similarities = cosine_similarity(model.in_embed, device=device)\n _, closest_idxs = valid_similarities.topk(6)\n\n valid_examples, closest_idxs = valid_examples.to('cpu'), closest_idxs.to('cpu')\n for ii, valid_idx in enumerate(valid_examples):\n closest_words = [int_to_vocab[idx.item()] for idx in closest_idxs[ii]][1:]\n print(int_to_vocab[valid_idx.item()] + \" | \" + ', '.join(closest_words))\n print(\"...\\n\")",
"_____no_output_____"
],
[
"# change the name, for saving multiple files\n# model_name = 'w2v_skip_gram_ns_5_epoch.net'\n\ncheckpoint = {'n_vocab': model.n_vocab,\n 'n_embed': model.n_embed,\n 'state_dict': model.state_dict(),\n 'noise_dist': model.noise_dist}\n\nwith open(model_name, 'wb') as f:\n torch.save(checkpoint, f)",
"_____no_output_____"
],
[
"with open('w2v_skip_gram_ns_5_epoch.net', 'rb') as f:\n checkpoint = torch.load(f, map_location=torch.device('cpu') )\n\nmodel = SkipGramNeg(checkpoint['n_vocab'], checkpoint['n_embed'], noise_dist=checkpoint['noise_dist']).to(device)\nmodel.load_state_dict(checkpoint['state_dict'])",
"_____no_output_____"
]
],
[
[
"## Visualizing the word vectors\n\nBelow we'll use T-SNE to visualize how our high-dimensional word vectors cluster together. T-SNE is used to project these vectors into two dimensions while preserving local stucture. Check out [this post from Christopher Olah](http://colah.github.io/posts/2014-10-Visualizing-MNIST/) to learn more about T-SNE and other ways to visualize high-dimensional data.",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\n%config InlineBackend.figure_format = 'retina'\n\nimport matplotlib.pyplot as plt\nfrom sklearn.manifold import TSNE",
"_____no_output_____"
],
[
"# getting embeddings from the embedding layer of our model, by name\nembeddings = model.in_embed.weight.to('cpu').data.numpy()",
"_____no_output_____"
],
[
"viz_words = 380\ntsne = TSNE()\nembed_tsne = tsne.fit_transform(embeddings[:viz_words, :])",
"_____no_output_____"
],
[
"fig, ax = plt.subplots(figsize=(16, 16))\nfor idx in range(viz_words):\n plt.scatter(*embed_tsne[idx, :], color='steelblue')\n plt.annotate(int_to_vocab[idx], (embed_tsne[idx, 0], embed_tsne[idx, 1]), alpha=0.7)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
e7a0e5d7251432973c33e0101d424203f7ce282c | 22,757 | ipynb | Jupyter Notebook | Q7_Asgn1.nbconvert.ipynb | sunil-dhaka/india-covid19-cases-and-vaccination-analysis | 8a7aa0d6a0bd47ee61230342b23e1400a834ca59 | [
"MIT"
] | null | null | null | Q7_Asgn1.nbconvert.ipynb | sunil-dhaka/india-covid19-cases-and-vaccination-analysis | 8a7aa0d6a0bd47ee61230342b23e1400a834ca59 | [
"MIT"
] | null | null | null | Q7_Asgn1.nbconvert.ipynb | sunil-dhaka/india-covid19-cases-and-vaccination-analysis | 8a7aa0d6a0bd47ee61230342b23e1400a834ca59 | [
"MIT"
] | null | null | null | 76.110368 | 13,555 | 0.703124 | [
[
[
"#### Importing modules",
"_____no_output_____"
]
],
[
[
"import json \nimport pandas as pd\nimport numpy as np",
"_____no_output_____"
]
],
[
[
"#### Read cowin csv file",
"_____no_output_____"
]
],
[
[
"data=pd.read_csv('cowin_vaccine_data_districtwise.csv')",
"/home/sunild/anaconda3/lib/python3.8/site-packages/IPython/core/interactiveshell.py:3169: DtypeWarning: Columns (6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,182,183,184,185,186,187,188,189,190,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,239,240,241,242,243,244,245,246,247,248,249,250,251,252,253,254,255,256,257,258,259,260,261,262,263,264,265,266,267,268,269,270,271,272,273,274,275,276,277,278,279,280,281,282,283,284,285,286,287,288,289,290,291,292,293,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,336,337,338,339,340,341,342,343,344,345,346,347,348,349,350,351,352,353,354,355,356,357,358,359,360,361,362,363,364,365,366,367,368,369,370,371,372,373,374,375,376,377,378,379,380,381,382,383,384,385,386,387,388,389,390,391,392,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,409,410,411,412,413,414,415,416,417,418,419,420,421,422,423,424,425,426,427,428,429,430,431,432,433,434,435,436,437,438,439,440,441,442,443,444,445,446,447,448,449,450,451,452,453,454,455,456,457,458,459,460,461,462,463,464,465,466,467,468,469,470,471,472,473,474,475,476,477,478,479,480,481,482,483,484,485,486,487,488,489,490,491,492,493,494,495,496,497,498,499,500,501,502,503,504,505,506,507,508,509,510,511,512,513,514,515,516,517,518,519,520,521,522,523,524,525,526,527,528,529,530,531,532,533,534,535,536,537,538,539,540,541,542,543,544,545,546,547,548,549,550,551,552,553,554,555,556,557,558,559,560,561,562,563,564,565,566,567,568,569,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,595,596,597,598,599,600,601,602,603,604,605,606,607,608,609,610,611,612,613,614,615,616,617,618,619,620,621,622,623,624,625,626,627,628,629,630,631,632,633,634,635,636,637,638,639,640,641,642,643,644,645,646,647,648,649,650,651,652,653,654,655,656,657,658,659,660,661,662,663,664,665,666,667,668,669,670,671,672,673,674,675,676,677,678,679,680,681,682,683,684,685,686,687,688,689,690,691,692,693,694,695,696,697,698,699,700,701,702,703,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,731,732,733,734,735,736,737,738,739,740,741,742,743,744,745,746,747,748,749,750,751,752,753,754,755,756,757,758,759,760,761,762,763,764,765,766,767,768,769,770,771,772,773,774,775,776,777,778,779,780,781,782,783,784,785,786,787,788,789,790,791,792,793,794,795,796,797,798,799,800,801,802,803,804,805,806,807,808,809,810,811,812,813,814,815,816,817,818,819,820,821,822,823,824,825,826,827,828,829,830,831,832,833,834,835,836,837,838,839,840,841,842,843,844,845,846,847,848,849,850,851,852,853,854,855,856,857,858,859,860,861,862,863,864,865,866,867,868,869,870,871,872,873,874,875,876,877,878,879,880,881,882,883,884,885,886,887,888,889,890
,891,892,893,894,895,896,897,898,899,900,901,902,903,904,905,906,907,908,909,910,911,912,913,914,915,916,917,918,919,920,921,922,923,924,925,926,927,928,929,930,931,932,933,934,935,936,937,938,939,940,941,942,943,944,945,946,947,948,949,950,951,952,953,954,955,956,957,958,959,960,961,962,963,964,965,966,967,968,969,970,971,972,973,974,975,976,977,978,979,980,981,982,983,984,985,986,987,988,989,990,991,992,993,994,995,996,997,998,999,1000,1001,1002,1003,1004,1005,1006,1007,1008,1009,1010,1011,1012,1013,1014,1015,1016,1017,1018,1019,1020,1021,1022,1023,1024,1025,1026,1027,1028,1029,1030,1031,1032,1033,1034,1035,1036,1037,1038,1039,1040,1041,1042,1043,1044,1045,1046,1047,1048,1049,1050,1051,1052,1053,1054,1055,1056,1057,1058,1059,1060,1061,1062,1063,1064,1065,1066,1067,1068,1069,1070,1071,1072,1073,1074,1075,1076,1077,1078,1079,1080,1081,1082,1083,1084,1085,1086,1087,1088,1089,1090,1091,1092,1093,1094,1095,1096,1097,1098,1099,1100,1101,1102,1103,1104,1105,1106,1107,1108,1109,1110,1111,1112,1113,1114,1115,1116,1117,1118,1119,1120,1121,1122,1123,1124,1125,1126,1127,1128,1129,1130,1131,1132,1133,1134,1135,1136,1137,1138,1139,1140,1141,1142,1143,1144,1145,1146,1147,1148,1149,1150,1151,1152,1153,1154,1155,1156,1157,1158,1159,1160,1161,1162,1163,1164,1165,1166,1167,1168,1169,1170,1171,1172,1173,1174,1175,1176,1177,1178,1179,1180,1181,1182,1183,1184,1185,1186,1187,1188,1189,1190,1191,1192,1193,1194,1195,1196,1197,1198,1199,1200,1201,1202,1203,1204,1205,1206,1207,1208,1209,1210,1211,1212,1213,1214,1215,1216,1217,1218,1219,1220,1221,1222,1223,1224,1225,1226,1227,1228,1229,1230,1231,1232,1233,1234,1235,1236,1237,1238,1239,1240,1241,1242,1243,1244,1245,1246,1247,1248,1249,1250,1251,1252,1253,1254,1255,1256,1257,1258,1259,1260,1261,1262,1263,1264,1265,1266,1267,1268,1269,1270,1271,1272,1273,1274,1275,1276,1277,1278,1279,1280,1281,1282,1283,1284,1285,1286,1287,1288,1289,1290,1291,1292,1293,1294,1295,1296,1297,1298,1299,1300,1301,1302,1303,1304,1305,1306,1307,1308,1309,1310,1311,1312,1313,1314,1315,1316,1317,1318,1319,1320,1321,1322,1323,1324,1325,1326,1327,1328,1329,1330,1331,1332,1333,1334,1335,1336,1337,1338,1339,1340,1341,1342,1343,1344,1345,1346,1347,1348,1349,1350,1351,1352,1353,1354,1355,1356,1357,1358,1359,1360,1361,1362,1363,1364,1365,1366,1367,1368,1369,1370,1371,1372,1373,1374,1375,1376,1377,1378,1379,1380,1381,1382,1383,1384,1385,1386,1387,1388,1389,1390,1391,1392,1393,1394,1395,1396,1397,1398,1399,1400,1401,1402,1403,1404,1405,1406,1407,1408,1409,1410,1411,1412,1413,1414,1415,1416,1417,1418,1419,1420,1421,1422,1423,1424,1425,1426,1427,1428,1429,1430,1431,1432,1433,1434,1435,1436,1437,1438,1439,1440,1441,1442,1443,1444,1445,1446,1447,1448,1449,1450,1451,1452,1453,1454,1455,1456,1457,1458,1459,1460,1461,1462,1463,1464,1465,1466,1467,1468,1469,1470,1471,1472,1473,1474,1475,1476,1477,1478,1479,1480,1481,1482,1483,1484,1485,1486,1487,1488,1489,1490,1491,1492,1493,1494,1495,1496,1497,1498,1499,1500,1501,1502,1503,1504,1505,1506,1507,1508,1509,1510,1511,1512,1513,1514,1515,1516,1517,1518,1519,1520,1521,1522,1523,1524,1525,1526,1527,1528,1529,1530,1531,1532,1533,1534,1535,1536,1537,1538,1539,1540,1541,1542,1543,1544,1545,1546,1547,1548,1549,1550,1551,1552,1553,1554,1555,1556,1557,1558,1559,1560,1561,1562,1563,1564,1565,1566,1567,1568,1569,1570,1571,1572,1573,1574,1575,1576,1577,1578,1579,1580,1581,1582,1583,1584,1585,1586,1587,1588,1589,1590,1591,1592,1593,1594,1595,1596,1597,1598,1599,1600,1601,1602,1603,1604,1605,1606,1607,1608,1609,1610,1611,1612,1613,1614,1615,1616,1617,1618,1619,1620,1621,1622,162
3,1624,1625,1626,1627,1628,1629,1630,1631,1632,1633,1634,1635,1636,1637,1638,1639,1640,1641,1642,1643,1644,1645,1646,1647,1648,1649,1650,1651,1652,1653,1654,1655,1656,1657,1658,1659,1660,1661,1662,1663,1664,1665,1666,1667,1668,1669,1670,1671,1672,1673,1674,1675,1676,1677,1678,1679,1680,1681,1682,1683,1684,1685,1686,1687,1688,1689,1690,1691,1692,1693,1694,1695,1696,1697,1698,1699,1700,1701,1702,1703,1704,1705,1706,1707,1708,1709,1710,1711,1712,1713,1714,1715,1716,1717,1718,1719,1720,1721,1722,1723,1724,1725,1726,1727,1728,1729,1730,1731,1732,1733,1734,1735,1736,1737,1738,1739,1740,1741,1742,1743,1744,1745,1746,1747,1748,1749,1750,1751,1752,1753,1754,1755,1756,1757,1758,1759,1760,1761,1762,1763,1764,1765,1766,1767,1768,1769,1770,1771,1772,1773,1774,1775,1776,1777,1778,1779,1780,1781,1782,1783,1784,1785,1786,1787,1788,1789,1790,1791,1792,1793,1794,1795,1796,1797,1798,1799,1800,1801,1802,1803,1804,1805,1806,1807,1808,1809,1810,1811,1812,1813,1814,1815,1816,1817,1818,1819,1820,1821,1822,1823,1824,1825,1826,1827,1828,1829,1830,1831,1832,1833,1834,1835,1836,1837,1838,1839,1840,1841,1842,1843,1844,1845,1846,1847,1848,1849,1850,1851,1852,1853,1854,1855,1856,1857,1858,1859,1860,1861,1862,1863,1864,1865,1866,1867,1868,1869,1870,1871,1872,1873,1874,1875,1876,1877,1878,1879,1880,1881,1882,1883,1884,1885,1886,1887,1888,1889,1890,1891,1892,1893,1894,1895,1896,1897,1898,1899,1900,1901,1902,1903,1904,1905,1906,1907,1908,1909,1910,1911,1912,1913,1914,1915,1916,1917,1918,1919,1920,1921,1922,1923,1924,1925,1926,1927,1928,1929,1930,1931,1932,1933,1934,1935,1936,1937,1938,1939,1940,1941,1942,1943,1944,1945,1946,1947,1948,1949,1950,1951,1952,1953,1954,1955,1956,1957,1958,1959,1960,1961,1962,1963,1964,1965,1966,1967,1968,1969,1970,1971,1972,1973,1974,1975,1976,1977,1978,1979,1980,1981,1982,1983,1984,1985,1986,1987,1988,1989,1990,1991,1992,1993,1994,1995,1996,1997,1998,1999,2000,2001,2002,2003,2004,2005,2006,2007,2008,2009,2010,2011,2012,2013,2014,2015,2016,2017,2018,2019,2020,2021,2022,2023,2024,2025,2026,2027,2028,2029,2030,2031,2032,2033,2034,2035,2036,2037,2038,2039,2040,2041,2042,2043,2044,2045,2046,2047,2048,2049,2050,2051,2052,2053,2054,2055,2056,2057,2058,2059,2060,2061,2062,2063,2064,2065,2066,2067,2068,2069,2070,2071,2072,2073,2074,2075,2076,2077,2078,2079,2080,2081,2082,2083,2084,2085,2086,2087,2088,2089,2090,2091,2092,2093,2094,2095,2096,2097,2098,2099,2100,2101,2102,2103,2104,2105,2106,2107,2108,2109,2110,2111,2112,2113,2114,2115,2116,2117,2118,2119,2120,2121,2122,2123,2124,2125,2126,2127,2128,2129,2130,2131,2132,2133,2134,2135,2136,2137,2138,2139,2140,2141,2142,2143,2144,2145,2146,2147,2148,2149,2150,2151,2152,2153,2154,2155,2156,2157,2158,2159,2160,2161,2162,2163,2164,2165,2166,2167,2168,2169,2170,2171,2172,2173,2174,2175,2176,2177,2178,2179,2180,2181,2182,2183,2184,2185,2186,2187,2188,2189,2190,2191,2192,2193,2194,2195,2196,2197,2198,2199,2200,2201,2202,2203,2204,2205,2206,2207,2208,2209,2210,2211,2212,2213,2214,2215,2216,2217,2218,2219,2220,2221,2222,2223,2224,2225,2226,2227,2228,2229,2230,2231,2232,2233,2234,2235,2236,2237,2238,2239,2240,2241,2242,2243,2244,2245,2246,2247,2248,2249,2250,2251,2252,2253,2254,2255,2256,2257,2258,2259,2260,2261,2262,2263,2264,2265,2266,2267,2268,2269,2270,2271,2272,2273,2274,2275,2276,2277,2278,2279,2280,2281,2282,2283,2284,2285,2286,2287,2288,2289,2290,2291,2292,2293,2294,2295,2296,2297,2298,2299,2300,2301,2302,2303,2304,2305,2306,2307,2308,2309,2310,2311,2312,2313,2314,2315,2316,2317,2318,2319,2320,2321,2322,2323,2324,2325,2326,2327,2328,2329,2330,2331,2332,2333,233
4,2335,2336,2337,2338,2339,2340,2341,2342,2343,2344,2345,2346,2347,2348,2349,2350,2351,2352,2353,2354,2355,2356,2357,2358,2359,2360,2361,2362,2363,2364,2365,2366,2367,2368,2369,2370,2371,2372,2373,2374,2375,2376,2377,2378,2379,2380,2381,2382,2383,2384,2385,2386,2387,2388,2389,2390,2391,2392,2393,2394,2395,2396,2397,2398,2399,2400,2401,2402,2403,2404,2405,2406,2407,2408,2409,2410,2411,2412,2413,2414,2415,2416,2417,2418,2419,2420,2421,2422,2423,2424,2425,2426,2427,2428,2429,2430,2431,2432,2433,2434,2435,2436,2437,2438,2439,2440,2441,2442,2443,2444,2445,2446,2447,2448,2449,2450,2451,2452,2453,2454,2455,2456,2457,2458,2459,2460,2461,2462,2463,2464,2465,2466,2467,2468,2469,2470,2471,2472,2473,2474,2475,2476,2477,2478,2479,2480,2481,2482,2483,2484,2485,2486,2487,2488,2489,2490,2491,2492,2493,2494,2495,2496,2497,2498,2499,2500,2501,2502,2503,2504,2505,2506,2507,2508,2509,2510,2511,2512,2513,2514,2515,2516,2517,2518,2519,2520,2521,2522,2523,2524,2525,2526,2527,2528,2529,2530,2531,2532,2533,2534,2535,2536,2537,2538,2539,2540,2541,2542,2543,2544,2545,2546,2547,2548,2549,2550,2551,2552,2553,2554,2555,2556,2557,2558,2559,2560,2561,2562,2563,2564,2565,2566,2567,2568,2569,2570,2571,2572,2573,2574,2575,2576,2577,2578,2579,2580,2581,2582,2583,2584,2585,2586,2587,2588,2589,2590,2591,2592,2593,2594,2595,2596,2597,2598,2599,2600,2601,2602,2603,2604,2605,2606,2607,2608,2609,2610,2611,2612,2613,2614,2615,2616,2617,2618,2619,2620,2621,2622,2623,2624,2625,2626,2627,2628,2629,2630,2631,2632,2633,2634,2635,2636,2637,2638,2639,2640,2641,2642,2643,2644,2645,2646,2647,2648,2649,2650,2651,2652,2653,2654,2655,2656,2657,2658,2659,2660,2661,2662,2663,2664,2665,2666,2667,2668,2669,2670,2671,2672,2673,2674,2675,2676,2677,2678,2679,2680,2681,2682,2683,2684,2685,2686,2687,2688,2689,2690,2691,2692,2693,2694,2695,2696,2697,2698,2699,2700,2701,2702,2703,2704,2705,2706,2707,2708,2709,2710,2711,2712,2713,2714,2715,2716,2717,2718,2719,2720,2721,2722,2723,2724,2725,2726,2727,2728,2729,2730,2731,2732,2733,2734,2735,2736,2737,2738,2739,2740,2741,2742,2743,2744,2745,2746,2747,2748,2749,2750,2751,2752,2753,2754,2755,2756,2757,2758,2759,2760,2761,2762,2763,2764,2765,2766,2767,2768,2769,2770,2771,2772,2773,2774,2775,2776,2777,2778,2779,2780,2781,2782,2783,2784,2785,2786,2787,2788,2789,2790,2791,2792,2793,2794,2795,2796,2797,2798,2799,2800,2801,2802,2803,2804,2805,2806,2807,2808,2809,2810,2811,2812,2813,2814,2815,2816,2817,2818,2819,2820,2821,2822,2823,2824,2825,2826,2827,2828,2829,2830,2831,2832,2833,2834,2835,2836,2837,2838,2839,2840,2841,2842,2843,2844,2845,2846,2847,2848,2849,2850,2851,2852,2853,2854,2855,2856,2857,2858,2859,2860,2861,2862,2863,2864,2865,2866,2867,2868,2869,2870,2871,2872,2873,2874,2875,2876,2877,2878,2879,2880,2881,2882,2883,2884,2885,2886,2887,2888,2889,2890,2891,2892,2893,2894,2895) have mixed types.Specify dtype option on import or set low_memory=False.\n has_raised = await self.run_ast_nodes(code_ast.body, cell_name,\n"
]
],
[
[
"#### Load modified json file from Q1",
"_____no_output_____"
]
],
[
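[
"# Editor's hedged aside: the DtypeWarning above comes from mixed-type columns in\n# the ~2900-column CSV; an explicit dtype avoids it (nrows keeps this demo cheap).\npd.read_csv('cowin_vaccine_data_districtwise.csv', dtype=str, nrows=5).head()",
"_____no_output_____"
],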
[
"## district level\n# district data from json\nf=open('neighbor-districts-modified.json')\ndistricts_data=json.load(f)\ndistrict_names=[]\ndistrict_ids=[]\nfor key in districts_data:\n district_names.append(key.split('/')[0])\n district_ids.append(key.split('/')[1])\nDistricts=data['District_Key'].str.lower()",
"_____no_output_____"
]
],
[
[
"#### Prepare data frames for covaxin and covishield vaccine numbers",
"_____no_output_____"
]
],
[
[
"## dose1=Covaxin\n## dose2=CoviShield\n\ndata_dose1=data.loc[:,(data.loc[0,]=='Covaxin (Doses Administered)')].iloc[1:,:].fillna(0)\nfirst_date_dose1=data_dose1.iloc[:,0]\ndata_dose1=data_dose1.astype(int).diff(axis=1)\ndata_dose1.iloc[:,0]=first_date_dose1\ndata_dose1['District']=data['District_Key']\ndata_dose2=data.loc[:,(data.loc[0,]=='CoviShield (Doses Administered)')].iloc[1:,:].fillna(0)\nfirst_date_dose2=data_dose2.iloc[:,0]\ndata_dose2=data_dose2.astype(int).diff(axis=1)\ndata_dose2.iloc[:,0]=first_date_dose2\ndata_dose2['District']=data['District_Key']",
"_____no_output_____"
]
],
[
[
"#### To get asked dates binary list\n",
"_____no_output_____"
]
],
[
[
"import datetime\ntrue_false_female=[]\nfor key in data_dose1.columns[:-1]:\n changed_key=key.split('.')[0].split('/')\n true_false_female.append(datetime.date(int(changed_key[2]),int(changed_key[1]),int(changed_key[0]))<datetime.date(2021,8,15))\n \ntrue_false_female.append(False)\n\ntrue_false_male=[]\nfor key in data_dose2.columns[:-1]:\n changed_key=key.split('.')[0].split('/')\n true_false_male.append(datetime.date(int(changed_key[2]),int(changed_key[1]),int(changed_key[0]))<datetime.date(2021,8,15))\n \ntrue_false_male.append(False)",
"_____no_output_____"
]
],
[
[
"### Districts",
"_____no_output_____"
]
],
[
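[
"# Editor's hedged sketch (not from the original notebook): a vectorised\n# alternative to the double loop in the next cell -- sum each dose over the\n# masked date columns once per district row, then take the ratio.\ntot_covishield = data_dose2.loc[:, np.array(mask_dose2)].astype(int).sum(axis=1)\ntot_covaxin = data_dose1.loc[:, np.array(mask_dose1)].astype(int).sum(axis=1)\n(tot_covishield / tot_covaxin.replace(0, np.nan)).round(3).head()",
"_____no_output_____"
],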
[
"districtids=[]\nstateids=[]\nratio=[]\ncount1=[]\ncount2=[]\nfor i in range(len(district_ids)):\n for j in range(data_dose1.shape[0]):\n if district_ids[i]==data_dose1['District'][j+1]: # why there is 'j+1 'in this line :: due to that NaN in first raw\n districtids.append(district_ids[i])\n stateids.append(district_ids[i].split('_')[0])\n count1.append(((data_dose2.loc[data_dose2['District']==district_ids[i],np.array(true_false_male)]).astype(int).sum().sum()))\n count2.append(((data_dose1.loc[data_dose1['District']==district_ids[i],np.array(true_false_male)]).astype(int).sum().sum()))\n if count2[-1]==0:\n ratio.append(np.nan) # ratio=NaN for when covaxin count = 0\n else:\n ratio.append(np.round(count1[-1]/count2[-1],3))\n break\nratio_df=pd.DataFrame({'districtid':districtids, 'vaccinationratio':ratio})\nratio_df.sort_values('vaccinationratio', axis = 0, ascending = True, kind='mergesort',inplace=True)\nratio_df.reset_index(inplace=True,drop=True)\nratio_df.to_csv('district-vaccine-type-ratio.csv',index=False)",
"_____no_output_____"
]
],
[
[
"### States",
"_____no_output_____"
]
],
[
[
"ratio_df1=pd.DataFrame({'districtid':districtids,'covaxin':count2,'covishield':count1,'stateid':stateids})\nunique_state_codes=np.array(np.unique(stateids))\nstateid=[]\nratio_state=[]\ncovaxin_count=[]\ncovishield_count=[]\nfor i in range(len(unique_state_codes)):\n stateid.append(unique_state_codes[i])\n foo_df=ratio_df1.loc[ratio_df1.stateid==unique_state_codes[i]]\n covaxin_count.append(foo_df.covaxin.astype(int).sum())\n covishield_count.append(foo_df.covishield.astype(int).sum())\n if covaxin_count[-1]==0:\n ratio_state.append(np.nan) # ratio=NaN for when covaxin count = 0\n \n else:\n ratio_state.append(np.round(covishield_count[-1]/covaxin_count[-1],3))\n \nratio_df_states=pd.DataFrame({'stateid':stateid, 'vaccinationratio':ratio_state})\nratio_df_states.sort_values('vaccinationratio', axis = 0, ascending = True, kind='mergesort',inplace=True)\nratio_df_states.reset_index(inplace=True,drop=True)\nratio_df_states.to_csv('state-vaccine-type-ratio.csv',index=False)\nratio_df_states.to_csv('state-vaccine-type-ratio.csv',index=False)",
"_____no_output_____"
]
],
[
[
"### Overall",
"_____no_output_____"
]
],
[
[
"# overall\noverall_df=pd.DataFrame({'overallid':['IN'], 'vaccinationratio':[np.round(sum(covishield_count)/sum(covaxin_count),3)]})\noverall_df.to_csv('overall-vaccine-type-ratio.csv',index=False)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
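"code",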
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
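"code",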
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e7a0eb240e4d7ca7ff9a09cc6dbfcf710ca48b6d | 5,351 | ipynb | Jupyter Notebook | Assignments/Assignment_3.ipynb | eftropiosk94/Python_Course | ff93f6c8605b97a2734cd615667d6c52eb315bef | [
"Apache-2.0"
] | 32 | 2020-08-29T02:58:46.000Z | 2022-03-31T09:57:41.000Z | Assignments/Assignment_3.ipynb | eftropiosk94/Python_Course | ff93f6c8605b97a2734cd615667d6c52eb315bef | [
"Apache-2.0"
] | 1 | 2020-09-26T18:53:51.000Z | 2020-09-29T21:50:16.000Z | Assignments/Assignment_3.ipynb | eftropiosk94/Python_Course | ff93f6c8605b97a2734cd615667d6c52eb315bef | [
"Apache-2.0"
] | 73 | 2020-07-30T23:24:35.000Z | 2022-03-09T20:27:24.000Z | 32.23494 | 613 | 0.577275 | [
[
[
"## Third Assignment",
"_____no_output_____"
],
[
"#### 1) Create a function named **swap ()**, which receives two tuples **a** and **b**, with two elements each. Your code should cause the last elements of the two tuples to be exchanged, and then return the two tuples. ",
"_____no_output_____"
],
[
"#### 2) Create the function **dist ()** that takes as an argument two tuples that represent Cartesian coordinates of two points. Your function must return a number that corresponds to the Cartesian distance between these two points.",
"_____no_output_____"
],
[
"#### 3) Create the class **Ball**, to represent a sphere filled with water (weighing 1000g per cubic meter) with a radius **r**. Depending on the surface painting, it can have different weights: blue - weight 1g per square meter: yellow - 2g per square meter or red - 3g per square meter. The class parameter is a tuple **(r, color)**, an integer and a string, respectively. The **weight()** method of the class should return the total weight in kg of the ball (the weight of the water with the external weight of the surface). See example:\n\n>```python\n>>>> Ball((2, \"red\")).weight()\n> 33.66111808566343\n>\n>>>> Ball((3, \"blue\")).weight()\n> 113.21043286476177\n\nHints: \n- Use $\\pi$ = 3.14159\n- Sphere volume: $\\frac43 \\pi r^3$\n- Surface area: $4\\pi r^2$",
"_____no_output_____"
],
[
"#### 4) Roman numbers are occasionally used for time stamps or other purposes, but their manual conversion is somewhat laborious. Your task is to implement two functions, ***int_to_Roman() ]*** which converts an integer to Roman (in string) and **Roman_to_int()** which does the reverse. See examples below:\n\n>```python\n>>>> int_to_Roman(1)\n> I\n>>>> int_to_Roman(3000)\n> MMM\n>\n>>>> Roman_to_int('MMMCMLXXXVI')\n> 3986\n>>>> Roman_to_int('C')\n> 100\n\nNote: All test cases will be less than 4000, so you do not have to worry about the characters with bars above them, used in some versions of the Roman numbering system.",
"_____no_output_____"
],
[
"#### 5) The **area()** function receives a list **l** of ordered pairs. These ordered pairs represent the vertices of a convex polygon in the Cartesian plane traversed in a single direction. There is a mathematical method that, given the coordinates of the vertices of a polygon, the area can be calculated. You can find more details about the method by clicking [here](https://www.mathopenref.com/coordpolygonarea.html). Your code should make the function return the number that corresponds to the area of the polygon represented by the entry (rounded to two decimal places). See the following examples:\n\n>```python\n>>>> area([(0,0),(5,0),(13,8)])\n> 20.00\n>\n>>>> area([(2,0),(6,0),(10,4),(0,4)])\n> 28.00\n```",
"_____no_output_____"
],
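[
"*The linked method is the shoelace formula; below is a hedged sketch of a direct transcription (one way to do it, not necessarily the intended solution):*\n\n```python\ndef area(l):\n    # sum of cross-products over consecutive vertex pairs, wrapping around\n    s = sum(x1 * y2 - x2 * y1 for (x1, y1), (x2, y2) in zip(l, l[1:] + l[:1]))\n    return round(abs(s) / 2, 2)\n\nprint(area([(0, 0), (5, 0), (13, 8)]))  # 20.0\nprint(area([(2, 0), (6, 0), (10, 4), (0, 4)]))  # 28.0\n```",
"_____no_output_____"
],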
[
"## Challenge\n\n#### 6) The function **matches()** receives a list **l** of ordered distinct integers and an integer **n**. Its must return a list of tuples, each tuple with **n** elements, containing all possible combinations **n** by **n** (ordered, without repetitions) of the elements in the list **l**. You should only return a list **r** containing the generated tuples. Make sure that tuples are ordered in the list. See the examples below:\n\n>```python\n>>>> matches([1,2,3,4],2)\n> [(1,2),(1,3),(1,4),(2,3),(2,4),(3,4)]\n>\n>>>> matches([1,2,3,4],3)\n> [(1, 2, 3), (1, 2, 4), (1, 3, 4), (2, 3, 4)]\n",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
e7a0f103f265afd4a53199ede0e9ff3783f08757 | 14,301 | ipynb | Jupyter Notebook | src/01_pubmed_db/.ipynb_checkpoints/02_pubmed_parser-checkpoint.ipynb | brandonleekramer/the-growth-of-diversity | 27a7313a0aaba01f72fcc37b26da613d3ebad0ca | [
"MIT"
] | null | null | null | src/01_pubmed_db/.ipynb_checkpoints/02_pubmed_parser-checkpoint.ipynb | brandonleekramer/the-growth-of-diversity | 27a7313a0aaba01f72fcc37b26da613d3ebad0ca | [
"MIT"
] | null | null | null | src/01_pubmed_db/.ipynb_checkpoints/02_pubmed_parser-checkpoint.ipynb | brandonleekramer/the-growth-of-diversity | 27a7313a0aaba01f72fcc37b26da613d3ebad0ca | [
"MIT"
] | null | null | null | 37.733509 | 689 | 0.632823 | [
[
[
"<div style=\"font-size:30px\" align=\"center\"> <b> Downloading MEDLINE/PubMed Data and Posting to PostgreSQL </b> </div>\n\n<div style=\"font-size:18px\" align=\"center\"> <b> Brandon L. Kramer - University of Virginia's Bicomplexity Institute </b> </div>\n\n<br>\n\nThis notebook detail the process of downloading all of [PubMed's MEDLINE data](https://www.nlm.nih.gov/databases/download/pubmed_medline.html) and posting it to a PostgresSQL database ([UVA's Rivanna OpenOnDemand](https://rivanna-portal.hpc.virginia.edu/)). To do this, we use the terminal to download all of the data into Rivanna. Next, we use the [PubMedPortable](https://github.com/KerstenDoering/PubMedPortable) package through the Python shell to parse all of the data and build up a database. \n\n### Step 1: Download PubMed Data\n\nFirst, we download all of the data from [here](ftp://ftp.ncbi.nlm.nih.gov/pubmed/baseline/) using `wget`.",
"_____no_output_____"
]
],
[
[
"cd /scratch/kb7hp/pubmed_new \nwget --recursive --no-parent ftp://ftp.ncbi.nlm.nih.gov/pubmed/baseline/",
"_____no_output_____"
]
],
[
[
"### Step 2: Download PubMedPortable\n\nSecond, we will clone [PubMedPortable package from GitHub](https://github.com/KerstenDoering/PubMedPortable). ",
"_____no_output_____"
]
],
[
[
"cd /home/kb7hp/git/\ngit clone https://github.com/KerstenDoering/PubMedPortable.git\ncd PubMedPortable",
"_____no_output_____"
]
],
[
[
"### Step 3: Populate Tables in PostgreSQL Database \n\nGo to the [PubMedPortable](https://github.com/KerstenDoering/PubMedPortable/wiki#build-up-a-relational-database-in-postgresql) protocol:\n - Skip the part on making a superuser named parser and use Rivanna login and pwd instead \n - Since `PubMedPortable` is written with the login/pwd of parser/parser you have to update lines 704-750 of `PubMedDB.py` \n - Add `import psycopg2 as pg` to the beginning of the file\n - Update all the connections to: `con = 'postgresql+psycopg2://login:pwd@postgis1/sdad'`\n - Update all the `print` statements to `print()` (e.g. line 728)\n\nGo to [Rivanna OpenOnDemand](https://rivanna-portal.hpc.virginia.edu/), click on Clusters > Rivanna Shell Access and then create a new schema using the following commands:",
"_____no_output_____"
]
],
[
[
"psql -U login -d sdad -h postgis1\nCREATE SCHEMA pubmed_2021; ",
"_____no_output_____"
]
],
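[
[
"For reference, the edited pieces of `PubMedDB.py` end up looking roughly like the sketch below. This is a hedged illustration only: `login`/`pwd` stand in for your Rivanna credentials, and the exact line numbers differ between versions.\n\n```python\nimport psycopg2 as pg  # added at the top of PubMedDB.py\nfrom sqlalchemy import create_engine\n\n# replaces every hard-coded parser/parser connection string\ncon = 'postgresql+psycopg2://login:pwd@postgis1/sdad'\nengine = create_engine(con)\n\nprint(engine)  # print() with parentheses, per the Python 3 updates\n```",
"_____no_output_____"
]
],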
[
[
"Then return to the Python terminal and run this to populate the new schema:",
"_____no_output_____"
]
],
[
[
"cd /home/kb7hp/git/PubMedPortable\npython PubMedDB.py -d pubmed_2021",
"_____no_output_____"
]
],
[
[
"Go back to the Rivanna PostgreSQL shell to check if that worked:",
"_____no_output_____"
]
],
[
[
"\\dt pubmed_2021.*",
"_____no_output_____"
]
],
[
[
"Looks like it did so now we can start parsing.\n\n### Step 4: Testing MEDLINE Data Upload \n\nWe don't want to start dumping all 1062 files, so let's just start with one. We will create a pm_0001 folder and download just one of the .xml files from PubMed. Next, we had to debug the `PubMedParser.py` file by updating all of the `con` and `print` statements as we did above and update `_next` to `__next__` on line 65. After doing this, we ran the following code to upload our first test file.\n\n#### Batch 1 (0001)\n\nLet's give this a try:",
"_____no_output_____"
]
],
[
[
"cd /home/kb7hp/git/PubMedPortable/data\nmkdir pm_0001 \ncd pm_0001\nwget ftp://ftp.ncbi.nlm.nih.gov/pubmed/baseline/pubmed21n0001.xml.gz\ncd /home/kb7hp/git/PubMedPortable/\npython PubMedParser.py -i data/pm_0001/ ",
"_____no_output_____"
]
],
[
[
"It took about 8 minutes to run this one file. \n\n### Step 5: Uploading the Rest of the MEDLINE Dataset to PostgreSQL Database in Batches \n\nLet's add the rest of the data to Postgres. Ideally, we would just dump the whole thing at once, but Rivanna limits the amount of data we can store locally (for some reason `PubMedPortable` does not like absolute paths). Thus, we will only copy part of the data from the `\\scratch` folder to our temporary local folders.\n\n#### Batch 2 (0002-0011)",
"_____no_output_____"
]
],
[
[
"# move all the .xml.gz files to their own folder\ncd /scratch/kb7hp/\nmkdir pubmed_gz\ncd /scratch/kb7hp/pubmed_new/ftp.ncbi.nlm.nih.gov/pubmed/baseline/\nmv *.gz /scratch/kb7hp/pubmed_gz\n\n# and copy 10 of those files to that new folder \ncd /scratch/kb7hp/pubmed_gz/\ncp pubmed21n{0002..0011}.xml.gz /home/kb7hp/git/PubMedPortable/data/pm_0002_0011\n\n# and then we add those 10 files to our existing database \ncd /home/kb7hp/git/PubMedPortable/data/\npython PubMedParser.py -i data/pm_0002_0011/ -c -p 4",
"_____no_output_____"
]
],
[
[
"While I intially thought this process would take ~80 minutes, running these 10 files only look ~22 minutes because of the 4 cores that essentially cut the timing by a quarter. Thus, we spun an instance with 5 cores (1 extra as directed by the Rivanna admins) and ran the next ~90 files with this new allocation. When I checked the `pubmed_2021.tbl_abstract` table, we had 146,854 rows, which seemed low. Yet, the notification from the `PubMedParser.py` file indicated that all files were parsed. I would late come to realize that there are fewer abstracts than total records, which can be validated in the `pubmed_2021.tbl_medline_citation` table. \n\n#### Batch 3 (0012-0100)\n\nLet's dump the next batch of citations (0012-0100). We will copy over the next batch of data and with multiprocessing accounted for this should take ~3 hours to complete.",
"_____no_output_____"
]
],
[
[
"cd /scratch/kb7hp/pubmed_gz/\ncp pubmed21n{0012..0100}.xml.gz /home/kb7hp/git/PubMedPortable/data/pm_0012_0100\ncd /home/kb7hp/git/PubMedPortable/data/\npython PubMedParser.py -i data/pm_0012_0100/ -c -p 4",
"_____no_output_____"
]
],
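[
[
"While a long batch runs (or right after it finishes), the row counts can also be spot-checked from Python instead of the psql shell. A hedged sketch, reusing the same placeholder credentials as above:\n\n```python\nimport psycopg2\n\nconn = psycopg2.connect(dbname='sdad', user='login', password='pwd', host='postgis1')\ncur = conn.cursor()\ncur.execute('SELECT COUNT(*) FROM pubmed_2021.tbl_abstract;')\nprint(cur.fetchone()[0])  # e.g. 146854 after the first 11 files\nconn.close()\n```",
"_____no_output_____"
]
],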
[
[
"And indeed it did! We have loaded the first 100 files and it took just over 3 hours (13:19:19-16:22:52). \n\n#### Batch 4 (0101-0500)\n\nNow, let's get a bit more ambitious. Given its now night time, we are can boost the allocation to 9 cores and try ~400 files. This should take around around 7 hours to complete (400 files * 8 mins/file with 8 cores). ",
"_____no_output_____"
]
],
[
[
"# first we will clean up the local directory \ncd /home/kb7hp/git/PubMedPortable/data/\nrm -r pm_0001 \nrm -r pm_0002_0011\nrm -r pm_0012_0100\n\n# copy over our new files \ncd /scratch/kb7hp/pubmed_gz \ncp pubmed21n{0101..0500}.xml.gz /home/kb7hp/git/PubMedPortable/data/pm_0101_0500\n\n# and then run the script for the next 400 files \ncd /home/kb7hp/git/PubMedPortable/\npython PubMedParser.py -i data/pm_0101_0500/ -c -p 8",
"_____no_output_____"
]
],
[
[
"After parsing the pm_101_500 files, I woke up to a minor error, but it looks like the program continued running up through the very last citation of the last file. I checked the `pubmed_2021.tbl_abstract` table and had 6,388,959 entries while `pubmed_2021.tbl_medline_citation` had 13,095,000, which almost half of the 26 million advertised on [MEDLINE's website](https://www.nlm.nih.gov/bsd/medline.html). Thus, it does seem like everything parsed without any serious problems. I decided to finsih up the rest of the file parsing since (1) I cannot address any problem in a systematic way and (2) a full database with problems is still better than a half database with problems. \n\n#### Batch 5 (0501-0750)\n\nWith the space limitations, let's take a conservative approach and post the next 250 files to the database (once again using 9 cores on Rivanna). ",
"_____no_output_____"
]
],
[
[
"cd /home/kb7hp/git/PubMedPortable/data\nrm -r pm_0101_0500\nmkdir pm_0501_0750\ncd /scratch/kb7hp/pubmed_gz \ncp pubmed21n{0501..0750}.xml.gz /home/kb7hp/git/PubMedPortable/data/pm_0501_0750\ncd /home/kb7hp/git/PubMedPortable/\npython PubMedParser.py -i data/pm_0501_0750/ -c -p 8",
"_____no_output_____"
]
],
[
[
"This took just over 4 hours (08:34:23-13:00:31) and worked flawlessly (no errors whatsoever). At this point, we have 12,158,748 abstracts in the `pubmed_2021.tbl_abstract` table. \n\n#### Batch 6 (0751-0900)\n\nWhile I thought this would be the last batch, I ran out of space again trying to dump 750-1062. Let's do up to 900 and do the last batch later today. ",
"_____no_output_____"
]
],
[
[
"cd /home/kb7hp/git/PubMedPortable/data\nrm -r pm_0501_0750\nmkdir pm_0751_0900\ncd /scratch/kb7hp/pubmed_gz\ncp pubmed21n{0751..0900}.xml.gz /home/kb7hp/git/PubMedPortable/data/pm_0751_0900\ncd /home/kb7hp/git/PubMedPortable/\npython PubMedParser.py -i data/pm_0751_0900/ -c -p 8",
"_____no_output_____"
]
],
[
[
"That took __ hours and once again ran without errors. \n\n#### Batch 7 (0901-1062)\n\nWe dumped the last batch with this code and we were done! ",
"_____no_output_____"
]
],
[
[
"cd /home/kb7hp/git/PubMedPortable/data\nrm -r pm_0751_0900\nmkdir pm_0901_1062\ncd /scratch/kb7hp/pubmed_gz\ncp pubmed21n{0901..1062}.xml.gz /home/kb7hp/git/PubMedPortable/data/pm_0901_1062\ncd /home/kb7hp/git/PubMedPortable/\npython PubMedParser.py -i data/pm_0901_1062/ -c -p 8\n# started this around 8:50am ",
"_____no_output_____"
]
],
[
[
"### On to the Next Step in Your Research Project\n\nOverall, this was a surprisingly easy process. A major kudos goes out to PubMedPortable for this fantastic package. Now, let's get to text mining! \n\n### References \n\nDöring, K., Grüning, B. A., Telukunta, K. K., Thomas, P., & Günther, S. (2016). PubMedPortable: a framework for supporting the development of text mining applications. Plos one, 11(10), e0163794. (Link to [Article](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0163794) and [GitHub Repo](https://github.com/KerstenDoering/PubMedPortable))\n\nNational Library of Medicine. (2021). Download MEDLINE/PubMed Data. Link to [Data](ftp://ftp.ncbi.nlm.nih.gov/pubmed/baseline) and [Documentation](https://www.nlm.nih.gov/databases/download/pubmed_medline.html)",
"_____no_output_____"
]
],
[
[
"SELECT xml_file_name, COUNT(fk_pmid) \nFROM pubmed_2021.tbl_pmids_in_file\nGROUP BY xml_file_name; \n\n--- looks like there was some kind of problem parsing these files\n--- affected 0816, 0829, 0865, 0866, 0875, 0879, 0884, 0886, 0891\n--- all of the rest were in the high 29,000s or at 30000 \n--- i think i parse 900:1062 and come back to these problems later \n--- the best approach would be to create a view where the fk_pmids of all \n--- those files is removed across all the tables and then started anew \n--- another way is just to remove all duplicates from each table later \n--- https://stackoverflow.com/questions/6583916/delete-duplicate-rows-from-small-table\n python PubMedParser.py -i data/pubmed/ -c ",
"_____no_output_____"
],
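[
"# Hedged sketch of the duplicate-removal idea from the Stack Overflow\n# link above, run from Python. 'tbl_abstract' and the 'fk_pmid' key are\n# illustrative; repeat per affected table, and back up the table first.\nimport psycopg2\n\nconn = psycopg2.connect(dbname='sdad', user='login', password='pwd', host='postgis1')\ncur = conn.cursor()\ncur.execute('''\n    DELETE FROM pubmed_2021.tbl_abstract a\n    USING pubmed_2021.tbl_abstract b\n    WHERE a.ctid < b.ctid AND a.fk_pmid = b.fk_pmid;\n''')\nconn.commit()\nconn.close()",
"_____no_output_____"
],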
[
"# ran 0001-0100 first \npython PubMedParser.py -i data/pubmed_data/ -c -p 12\n# took an hour and 10 mins \n# downloaded 0101-0500\npython PubMedParser.py -i data/after400/ -c -p 12\npython PubMedParser.py -i data/the700s/ -c -p 12",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
e7a0f3ab725e929597a7488e50f76e350e2a56b7 | 1,109 | ipynb | Jupyter Notebook | python-skylark/skylark/notebooks/Faster Least Squares.ipynb | xdata-skylark/libskylark | 89c3736136a24d519c14fc0738c21f37f1e10360 | [
"Apache-2.0"
] | 86 | 2015-01-20T03:12:46.000Z | 2022-01-10T04:05:21.000Z | python-skylark/skylark/notebooks/Faster Least Squares.ipynb | xdata-skylark/libskylark | 89c3736136a24d519c14fc0738c21f37f1e10360 | [
"Apache-2.0"
] | 48 | 2015-05-12T09:31:23.000Z | 2018-12-05T14:45:46.000Z | python-skylark/skylark/notebooks/Faster Least Squares.ipynb | xdata-skylark/libskylark | 89c3736136a24d519c14fc0738c21f37f1e10360 | [
"Apache-2.0"
] | 25 | 2015-01-18T23:02:11.000Z | 2021-06-12T07:30:35.000Z | 17.603175 | 58 | 0.511271 | [
[
[
"import El\nA, B, X = (El.Matrix(), El.Matrix(), El.Matrix())\nEl.Gaussian(A, 10000, 100)\nEl.Gaussian(B, 10000, 1)",
"_____no_output_____"
],
[
"import skylark.nla\nskylark.nla.faster_least_squares(A, B, X)\n",
"_____no_output_____"
],
[
"\n",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code"
]
] |
e7a100ffb8e6f3f1668eeb9976ac81695629724c | 21,431 | ipynb | Jupyter Notebook | Homework Notebooks/Econ126_Winter2021_Homework_02_blank.ipynb | letsgoexploring/econ126 | 05f50d2392dd1c7c38b14950cb8d7eff7ff775ee | [
"MIT"
] | 2 | 2020-12-12T16:28:44.000Z | 2021-02-24T12:11:04.000Z | Homework Notebooks/Econ126_Winter2021_Homework_02_blank.ipynb | letsgoexploring/econ126 | 05f50d2392dd1c7c38b14950cb8d7eff7ff775ee | [
"MIT"
] | 1 | 2019-04-29T08:50:41.000Z | 2019-04-29T08:51:05.000Z | Homework Notebooks/Econ126_Winter2021_Homework_02_blank.ipynb | letsgoexploring/econ126 | 05f50d2392dd1c7c38b14950cb8d7eff7ff775ee | [
"MIT"
] | 19 | 2019-03-08T18:49:19.000Z | 2022-03-07T23:27:16.000Z | 35.075286 | 395 | 0.387616 | [
[
[
"# Homework 2\n\n**Instructions:** Complete the notebook below. Download the completed notebook in HTML format. Upload assignment using Canvas.\n\n**Due:** Jan. 19 at **2pm**.",
"_____no_output_____"
],
[
"## Exercise: NumPy Arrays\n\nFollow the instructions in the following cells.",
"_____no_output_____"
]
],
[
[
"# Import numpy\n",
"_____no_output_____"
],
[
"# Use the 'np.arange()' function to create a variable called 'numbers1' that stores the integers \n# 1 through (and including) 10\n\n\n# Print the value of 'numbers1'\n",
"_____no_output_____"
],
[
"# Use the 'np.arange()' function to create a variable called 'numbers2' that stores the numbers \n# 0 through (and including) 1 with a step increment of 0.01\n\n\n# Print the value of 'numbers2'\n",
"_____no_output_____"
],
[
"# Print the 5th value of 'numbers2'. (Remember that the index starts counting at 0)\n",
"_____no_output_____"
],
[
"# Print the last value of 'numbers2'.\n",
"_____no_output_____"
],
[
"# Print the first 12 values of 'numbers2'.\n",
"_____no_output_____"
],
[
"# Print the last 12 values of 'numbers2'.\n",
"_____no_output_____"
],
[
"# Use the 'np.zeros()' function to create a variable called 'zeros' that is an array of 20 zeros\n\n\n# Print the value of 'zeros'\n",
"_____no_output_____"
],
[
"# Change the second value of 'zeros' to 1 and print \n\n\n# Print the value of 'zeros'\n",
"_____no_output_____"
]
],
[
[
"## Exercise: Random Numbers\n\nFollow the instructions in the following cells.",
"_____no_output_____"
]
],
[
[
"# Set the seed of NumPy's random number generator to 126\n\n\n# Create a variable called 'epsilon' that is an array containing 25 draws from\n# a normal distribution with mean 4 and standard deviation 2\n\n\n# Print the value of epsilon\n",
"_____no_output_____"
],
[
"# Print the mean of 'epsilon'\n",
"_____no_output_____"
],
[
"# Print the standard deviation of 'epsilon'\n",
"_____no_output_____"
]
],
[
[
"## Exercise: The Cobb-Douglas Production Function\n\nThe Cobb-Douglas production function can be written in per worker terms as :\n \\begin{align}\n y & = A k^{\\alpha},\n \\end{align}\nwhere $y$ denotes output per worker, $k$ denotes capital per worker, and $A$ denotes total factor productivity or technology.",
"_____no_output_____"
],
[
"### Part (a)\n\nOn a single axis: plot the Cobb-Douglas production for $A$ = 0.8, 1, 1.2, and 1.4 with $\\alpha$ = 0.35 and $k$ ranging from 0 to 10. Each line should have a different color. Your plot must have a title and axis labels. The plot should also contain a legend that clearly indicates which line is associated with which value of $A$ and does not cover the plotted lines.",
"_____no_output_____"
]
],
[
[
"# Import the pyplot module from Matplotlib as plt\n\n\n# Select the Matlplotlib style sheet to use (Optional)\n\n\n# Use the '%matplotlib inline' magic command to ensure that Matplotlib plots are displayed in the Notebook\n",
"_____no_output_____"
],
[
"# Set capital share (alpha)\n\n\n# Create an array of capital values\n\n\n# Plot production function for each of the given values for A\n\n\n# Add x- and y-axis labels\n\n\n# Add a title to the plot\n\n# Create a legend\n\n# Add a grid\n",
"_____no_output_____"
]
],
[
[
"**Question**\n\n1. *Briefly* explain in words how increasing $A$ affects the shape of the production function.",
"_____no_output_____"
],
[
"**Answer**\n\n1. ",
"_____no_output_____"
],
[
"### Part (b)\n\nOn a single axis: plot the Cobb-Douglas production for $\\alpha$ = 0.1, 0.2, 0.3, 0.4, and 0.5 with $A$ = 1 and $k$ ranging from 0 to 10. Each line should have a different color. Your plot must have a title and axis labels. The plot should also contain a legend that clearly indicates which line is associated with which value of $\\alpha$ and does not cover the plotted lines.",
"_____no_output_____"
]
],
[
[
"# Set TFP (A)\n\n\n# Plot production function for each of the given values for alpha\n\n\n# Add x- and y-axis labels\n\n\n# Add a title to the plot\n\n# Create a legend\n\n# Add a grid\n",
"_____no_output_____"
]
],
[
[
"**Question**\n\n1. *Briefly* explain in words how increasing $\\alpha$ affects the shape of the production function.",
"_____no_output_____"
],
[
"**Answer**\n\n1. ",
"_____no_output_____"
],
[
"### Exercise: The Cardioid\n\nThe cardioid is a shape described by the parametric equations:\n\n \\begin{align}\n x & = a(2\\cos \\theta - \\cos 2\\theta), \\\\\n y & = a(2\\sin \\theta - \\sin 2\\theta).\n \\end{align}\n \nConstruct a well-labeled graph of the cardiod for $a=4$ and $\\theta$ in $[0,2\\pi]$. Your plot must have a title and axis labels.",
"_____no_output_____"
]
],
[
[
"# Construct data for x and y\n\n\n# Plot y against x\n\n\n# Create x-axis label\n\n\n# Create y-axis label\n\n\n# Create title for plot\n\n\n# Add a grid to the plot\n",
"_____no_output_____"
]
],
[
[
"## Exercise: Unconstrained optimization\n\nConsider the quadratic function:\n \n\\begin{align}\nf(x) & = -7x^2 + 930x + 30\n\\end{align}\n \nYou will use analytic (i.e., pencil and paper) and numerical methods to find the the value of $x$ that maximizes $f(x)$. Another name for $x$ that maximizes $f(x)$ is the *argument of the maximum* value $f(x)$.\n\n### Part (a): Analytic solution\n\nUse standard calculus methods to solve for the value of $x$ that maximizes $f(x)$ to **five decimal places**. Use your answer to complete the sentence in the next cell.",
"_____no_output_____"
],
[
" The value of $x$ that maximizes $f(x)$ is: ",
"_____no_output_____"
],
[
"### Part (b): Numerical solution\n\nIn the cells below, you will use NumPy to try to compute the argument of the maximum of $f(x)$.",
"_____no_output_____"
]
],
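[
[
"Before working with $f(x)$ itself, here is a hedged warm-up on an unrelated toy function showing how `np.argmax` recovers an approximate maximizer on a grid (the toy function and grid are illustrative only):\n\n```python\nimport numpy as np\n\n# Toy example (not the homework function): g(x) = -(x - 2)**2 peaks at x = 2\nx = np.arange(0, 4, 0.001)\ng = -(x - 2)**2\nprint(x[np.argmax(g)])  # approximately 2.0\n```",
"_____no_output_____"
]
],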
[
[
"# Use np.arange to create a variable called 'x' that is equal to the numbers 0 through 100 \n# with a spacing between numbers of 0.1\n\n\n# Create a variable called 'f' that equals f(x) at each value of the array 'x' just defined\n\n\n# Use np.argmax to create a variable called xstar equal to the value of 'x' that maximizes the function f(x).\n\n\n# Print the value of xstar\n",
"_____no_output_____"
],
[
"# Use np.arange to create a variable called 'x' that is equal to the numbers 0 through 100 \n# with a spacing between numbers of 0.001\n\n\n# Create a variable called 'f' that equals f(x) at each value of the array 'x' just defined\n\n\n# Use np.argmax to create a variable called xstar equal to the value of 'x' that maximizes the function f(x).\n\n\n# Print the value of xstar\n",
"_____no_output_____"
],
[
"# Use np.arange to create a variable called 'x' that is equal to the numbers 0 through *50*\n# with a spacing between numbers of 0.001\n\n\n# Create a variable called 'f' that equals f(x) at each value of the array 'x' just defined\n\n\n# Use np.argmax to create a variable called xstar equal to the value of 'x' that maximizes the function f(x).\n\n\n# Print the value of xstar\n",
"_____no_output_____"
]
],
[
[
"### Part (c): Evaluation\n\nProvide answers to the follow questions in the next cell.\n\n**Questions**\n\n1. How did the choice of step size in the array `x` affect the accuracy of the computed results in the first two cells of Part (b)?\n\n2. What do you think is the drawback to decreasing the stepsize in `x`?\n\n3. In the previous cell, why did NumPy return value for `xstar` that is so different from the solution you derived in Part (a)?",
"_____no_output_____"
],
[
"**Answers**\n\n1. \n\n2. \n\n3. ",
"_____no_output_____"
],
[
"## Exercise: Utility Maximization\n\nRecall the two good utility maximization problem from microeconomics. Let $x$ and $y$ denote the amount of two goods that a person consumes. The person receives utility from consumption given by:\n \\begin{align}\n u(x,y) & = x^{\\alpha}y^{\\beta} \n \\end{align}\nThe person has income $M$ to spend on the two goods and the price of the goods are $p_x$ and $p_y$. The consumer's budget constraint is:\n \\begin{align}\n M & = p_x x + p_y y\n \\end{align}\nSuppose that $M = 100$, $\\alpha=0.25$, $\\beta=0.75$, $p_x = 1$. and $p_y = 0.5$. The consumer's problem is to maximize their utility subject to the budget constraint. While this problem can easily be solved by hand, we're going to use a computational approach. You can also solve the problem by hand to verify your solution.",
"_____no_output_____"
],
[
"### Part (a)\n\nUse the budget constraint to solve for $y$ in terms of $x$, $p_x$, $p_y$, and $M$. Use the result to write the consumer's utility as a function of $x$ only. Create a variable called `x` equal to an array of values from 0 to 80 with step size equal to 0.001 and a variable called `utility` equal to the consumer's utility. Plot the consumer's utility against $x$.",
"_____no_output_____"
]
],
[
[
"# Assign values to the constants alpha, beta, M, px, py\n\n\n# Create an array of x values\n\n\n# Create an array of utility values\n\n\n# Plot utility against x.\n\n\n# x- and y-axis labels\n\n\n# Title\n\n\n# Add grid\n",
"_____no_output_____"
]
],
[
[
"### Part (b)\n\nThe NumPy function `np.max()` returns the highest value in an array and `np.argmax()` returns the index of the highest value. Print the highest value and index of the highest value of `utility`.",
"_____no_output_____"
]
],
[
[
"\n",
"_____no_output_____"
]
],
[
[
"### Part (c)\n\nUse the index of the highest value of utility to find the value in `x` with the same index and store value in a new variable called `xstar`. Print the value of `xstar`.",
"_____no_output_____"
]
],
[
[
"# Create variable 'xstar' equal to value in 'x' that maximizes utility\n\n\n# Print value of 'xstar'\n",
"_____no_output_____"
]
],
[
[
"### Part (d)\n\nUse the budget constraint to the find the implied utility-maximizing vaue of $y$ and store this in a variable called `ystar`. Print `ystar`.",
"_____no_output_____"
]
],
[
[
"# Create variable 'ystar' equal to value in 'y' that maximizes utility\n\n\n# Print value of 'xstar'\n",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
e7a10a0082aecf006e8ab62a8d400e44404628c4 | 29,953 | ipynb | Jupyter Notebook | cloud/notebooks/python_sdk/deployments/spark/cars-4-you/Use Spark to recommend mitigation for car rental company.ipynb | muthukumarbala07/watson-machine-learning-samples | ecc66faf7a7c60ca168b9c7ef0bca3c766babb94 | [
"Apache-2.0"
] | null | null | null | cloud/notebooks/python_sdk/deployments/spark/cars-4-you/Use Spark to recommend mitigation for car rental company.ipynb | muthukumarbala07/watson-machine-learning-samples | ecc66faf7a7c60ca168b9c7ef0bca3c766babb94 | [
"Apache-2.0"
] | null | null | null | cloud/notebooks/python_sdk/deployments/spark/cars-4-you/Use Spark to recommend mitigation for car rental company.ipynb | muthukumarbala07/watson-machine-learning-samples | ecc66faf7a7c60ca168b9c7ef0bca3c766babb94 | [
"Apache-2.0"
] | null | null | null | 29.250977 | 765 | 0.577004 | [
[
[
"# Use Spark to recommend mitigation for car rental company with `ibm-watson-machine-learning`",
"_____no_output_____"
],
[
"This notebook contains steps and code to create a predictive model, and deploy it on WML. This notebook introduces commands for pipeline creation, model training, model persistance to Watson Machine Learning repository, model deployment, and scoring.\n\nSome familiarity with Python is helpful. This notebook uses Python 3.8 and Apache® Spark 2.4.\n\nYou will use **car_rental_training** dataset.\n\n\n## Learning goals\n\nThe learning goals of this notebook are:\n\n- Load a CSV file into an Apache® Spark DataFrame.\n- Explore data.\n- Prepare data for training and evaluation.\n- Create an Apache® Spark machine learning pipeline.\n- Train and evaluate a model.\n- Persist a pipeline and model in Watson Machine Learning repository.\n- Deploy a model for online scoring using Wastson Machine Learning API.\n- Score sample scoring data using the Watson Machine Learning API.\n\n\n## Contents\n\nThis notebook contains the following parts:\n1. [Setup](#setup)\n2. [Load and explore data](#load)\n3. [Create an Apache Spark machine learning model](#model)\n4. [Store the model in the Watson Machine Learning repository](#upload)\n5. [Deploy the model in the IBM Cloud](#deploy)\n6. [Score](#score)\n7. [Clean up](#cleanup)\n8. [Summary and next steps](#summary)",
"_____no_output_____"
],
[
"**Note:** This notebook works correctly with kernel `Python 3.6 with Spark 2.4`, please **do not change kernel**.",
"_____no_output_____"
],
[
"<a id=\"setup\"></a>\n## 1. Set up the environment\n\nBefore you use the sample code in this notebook, you must perform the following setup tasks:\n\n- Create a <a href=\"https://console.ng.bluemix.net/catalog/services/ibm-watson-machine-learning/\" target=\"_blank\" rel=\"noopener no referrer\">Watson Machine Learning (WML) Service</a> instance (a free plan is offered and information about how to create the instance can be found <a href=\"https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/ml-service-instance.html?context=analytics\" target=\"_blank\" rel=\"noopener no referrer\">here</a>).",
"_____no_output_____"
],
[
"### Connection to WML\n\nAuthenticate the Watson Machine Learning service on IBM Cloud. You need to provide platform `api_key` and instance `location`.\n\nYou can use [IBM Cloud CLI](https://cloud.ibm.com/docs/cli/index.html) to retrieve platform API Key and instance location.\n\nAPI Key can be generated in the following way:\n```\nibmcloud login\nibmcloud iam api-key-create API_KEY_NAME\n```\n\nIn result, get the value of `api_key` from the output.\n\n\nLocation of your WML instance can be retrieved in the following way:\n```\nibmcloud login --apikey API_KEY -a https://cloud.ibm.com\nibmcloud resource service-instance WML_INSTANCE_NAME\n```\n\nIn result, get the value of `location` from the output.",
"_____no_output_____"
],
[
"**Tip**: Your `Cloud API key` can be generated by going to the [**Users** section of the Cloud console](https://cloud.ibm.com/iam#/users). From that page, click your name, scroll down to the **API Keys** section, and click **Create an IBM Cloud API key**. Give your key a name and click **Create**, then copy the created key and paste it below. You can also get a service specific url by going to the [**Endpoint URLs** section of the Watson Machine Learning docs](https://cloud.ibm.com/apidocs/machine-learning). You can check your instance location in your <a href=\"https://console.ng.bluemix.net/catalog/services/ibm-watson-machine-learning/\" target=\"_blank\" rel=\"noopener no referrer\">Watson Machine Learning (WML) Service</a> instance details.\n\nYou can also get service specific apikey by going to the [**Service IDs** section of the Cloud Console](https://cloud.ibm.com/iam/serviceids). From that page, click **Create**, then copy the created key and paste it below.\n\n**Action**: Enter your `api_key` and `location` in the following cell.",
"_____no_output_____"
]
],
[
[
"api_key = 'PASTE YOUR PLATFORM API KEY HERE'\nlocation = 'PASTE YOUR INSTANCE LOCATION HERE'",
"_____no_output_____"
],
[
"wml_credentials = {\n \"apikey\": api_key,\n \"url\": 'https://' + location + '.ml.cloud.ibm.com'\n}",
"_____no_output_____"
]
],
[
[
"### Install and import the `ibm-watson-machine-learning` package\n**Note:** `ibm-watson-machine-learning` documentation can be found <a href=\"http://ibm-wml-api-pyclient.mybluemix.net/\" target=\"_blank\" rel=\"noopener no referrer\">here</a>.",
"_____no_output_____"
]
],
[
[
"!pip install -U ibm-watson-machine-learning",
"_____no_output_____"
],
[
"from ibm_watson_machine_learning import APIClient\n\nclient = APIClient(wml_credentials)",
"_____no_output_____"
]
],
[
[
"### Working with spaces\n\nFirst of all, you need to create a space that will be used for your work. If you do not have space already created, you can use [Deployment Spaces Dashboard](https://dataplatform.cloud.ibm.com/ml-runtime/spaces?context=cpdaas) to create one.\n\n- Click New Deployment Space\n- Create an empty space\n- Select Cloud Object Storage\n- Select Watson Machine Learning instance and press Create\n- Copy `space_id` and paste it below\n\n**Tip**: You can also use SDK to prepare the space for your work. More information can be found [here](https://github.com/IBM/watson-machine-learning-samples/blob/master/cloud/notebooks/python_sdk/instance-management/Space%20management.ipynb).\n\n**Action**: Assign space ID below",
"_____no_output_____"
]
],
[
[
"space_id = 'PASTE YOUR SPACE ID HERE'",
"_____no_output_____"
]
],
[
[
"You can use `list` method to print all existing spaces.",
"_____no_output_____"
]
],
[
[
"client.spaces.list(limit=10)",
"_____no_output_____"
]
],
[
[
"To be able to interact with all resources available in Watson Machine Learning, you need to set **space** which you will be using.",
"_____no_output_____"
]
],
[
[
"client.set.default_space(space_id)",
"_____no_output_____"
]
],
[
[
"**Note**: Please restart the kernel (Kernel -> Restart)",
"_____no_output_____"
],
[
"### Test Spark",
"_____no_output_____"
]
],
[
[
"try:\n from pyspark.sql import SparkSession\nexcept:\n print('Error: Spark runtime is missing. If you are using Watson Studio change the notebook runtime to Spark.')\n raise",
"_____no_output_____"
]
],
[
[
"<a id=\"load\"></a>\n## 2. Load and explore data",
"_____no_output_____"
],
[
"In this section you will load the data as an Apache Spark DataFrame and perform a basic exploration.",
"_____no_output_____"
],
[
"Read data into Spark DataFrame from DB2 database and show sample record.",
"_____no_output_____"
],
[
"### Load data",
"_____no_output_____"
]
],
[
[
"import os\nfrom wget import download\n\nsample_dir = 'spark_sample_model'\nif not os.path.isdir(sample_dir):\n os.mkdir(sample_dir)\n \nfilename = os.path.join(sample_dir, 'car_rental_training_data.csv')\nif not os.path.isfile(filename):\n filename = download('https://github.com/IBM/watson-machine-learning-samples/raw/master/cloud/data/cars-4-you/car_rental_training_data.csv', out=sample_dir)",
"_____no_output_____"
],
[
"spark = SparkSession.builder.getOrCreate()\n\ndf_data = spark.read\\\n .format('org.apache.spark.sql.execution.datasources.csv.CSVFileFormat')\\\n .option('header', 'true')\\\n .option('inferSchema', 'true')\\\n .option(\"delimiter\", \";\")\\\n .load(filename)\ndf_data.take(3)",
"_____no_output_____"
]
],
[
[
"### Explore data",
"_____no_output_____"
]
],
[
[
"df_data.printSchema()",
"_____no_output_____"
]
],
[
[
"As you can see, the data contains eleven fields. `Action` field is the one you would like to predict using feedback data in `Customer_Service` field.",
"_____no_output_____"
]
],
[
[
"print(\"Number of records: \" + str(df_data.count()))",
"_____no_output_____"
]
],
[
[
"As you can see, the data set contains 243 records.",
"_____no_output_____"
]
],
[
[
"df_data.select('Business_area').groupBy('Business_area').count().show()",
"_____no_output_____"
],
[
"df_data.select('Action').groupBy('Action').count().show(truncate=False)",
"_____no_output_____"
]
],
[
[
"<a id=\"model\"></a>\n## 3. Create an Apache Spark machine learning model\n\nIn this section you will learn how to:\n\n- [3.1 Prepare data for training a model](#prep)\n- [3.2 Create an Apache Spark machine learning pipeline](#pipe)\n- [3.3 Train a model](#train)",
"_____no_output_____"
],
[
"<a id=\"prep\"></a>\n### 3.1 Prepare data for training a model\n\nIn this subsection you will split your data into: train and test data set.",
"_____no_output_____"
]
],
[
[
"train_data, test_data = df_data.randomSplit([0.8, 0.2], 24)\n\nprint(\"Number of training records: \" + str(train_data.count()))\nprint(\"Number of testing records : \" + str(test_data.count()))",
"_____no_output_____"
]
],
[
[
"### 3.2 Create the pipeline<a id=\"pipe\"></a>",
"_____no_output_____"
],
[
"In this section you will create an Apache Spark machine learning pipeline and then train the model.",
"_____no_output_____"
]
],
[
[
"from pyspark.ml.feature import OneHotEncoder, StringIndexer, IndexToString, VectorAssembler, HashingTF, IDF, Tokenizer\nfrom pyspark.ml.classification import DecisionTreeClassifier\nfrom pyspark.ml.evaluation import MulticlassClassificationEvaluator\nfrom pyspark.ml import Pipeline, Model",
"_____no_output_____"
]
],
[
[
"In the following step, use the StringIndexer transformer to convert all the string fields to numeric ones.",
"_____no_output_____"
]
],
[
[
"string_indexer_gender = StringIndexer(inputCol=\"Gender\", outputCol=\"gender_ix\")\nstring_indexer_customer_status = StringIndexer(inputCol=\"Customer_Status\", outputCol=\"customer_status_ix\")\nstring_indexer_status = StringIndexer(inputCol=\"Status\", outputCol=\"status_ix\")\nstring_indexer_owner = StringIndexer(inputCol=\"Car_Owner\", outputCol=\"owner_ix\")\nstring_business_area = StringIndexer(inputCol=\"Business_Area\", outputCol=\"area_ix\")",
"_____no_output_____"
],
[
"assembler = VectorAssembler(inputCols=[\"gender_ix\", \"customer_status_ix\", \"status_ix\", \"owner_ix\", \"area_ix\", \"Children\", \"Age\", \"Satisfaction\"], outputCol=\"features\")",
"_____no_output_____"
],
[
"string_indexer_action = StringIndexer(inputCol=\"Action\", outputCol=\"label\").fit(df_data)",
"_____no_output_____"
],
[
"label_action_converter = IndexToString(inputCol=\"prediction\", outputCol=\"predictedLabel\", labels=string_indexer_action.labels)",
"_____no_output_____"
],
[
"dt_action = DecisionTreeClassifier()",
"_____no_output_____"
],
[
"pipeline_action = Pipeline(stages=[string_indexer_gender, string_indexer_customer_status, string_indexer_status, string_indexer_action, string_indexer_owner, string_business_area, assembler, dt_action, label_action_converter])",
"_____no_output_____"
],
[
"model_action = pipeline_action.fit(train_data)",
"_____no_output_____"
],
[
"predictions_action = model_action.transform(test_data)\npredictions_action.select('Business_Area','Action','probability','predictedLabel').show(2)",
"_____no_output_____"
],
[
"evaluator = MulticlassClassificationEvaluator(labelCol=\"label\", predictionCol=\"prediction\", metricName=\"accuracy\")\naccuracy = evaluator.evaluate(predictions_action)\n\nprint(\"Accuracy = %g\" % accuracy)",
"_____no_output_____"
]
],
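[
[
"Accuracy alone can hide class imbalance, so it may be worth looking at a few more metrics. As a hedged aside (not part of the original flow), the same evaluator class can report them by switching `metricName`, reusing the `predictions_action` DataFrame from above:\n\n```python\nfor metric in ['f1', 'weightedPrecision', 'weightedRecall']:\n    e = MulticlassClassificationEvaluator(labelCol='label', predictionCol='prediction', metricName=metric)\n    print(metric, e.evaluate(predictions_action))\n```",
"_____no_output_____"
]
],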
[
[
"<a id=\"upload\"></a>\n## 4. Persist model",
"_____no_output_____"
],
[
"In this section you will learn how to store your pipeline and model in Watson Machine Learning repository by using python client libraries.",
"_____no_output_____"
],
[
"**Note**: Apache® Spark 2.4 is required.",
"_____no_output_____"
],
[
"#### Save training data in your Cloud Object Storage",
"_____no_output_____"
],
[
"ibm-cos-sdk library allows Python developers to manage Cloud Object Storage (COS).",
"_____no_output_____"
]
],
[
[
"import ibm_boto3\nfrom ibm_botocore.client import Config",
"_____no_output_____"
]
],
[
[
"**Action**: Put credentials from Object Storage Service in Bluemix here.",
"_____no_output_____"
]
],
[
[
"cos_credentials = {\n \"apikey\": \"***\",\n \"cos_hmac_keys\": {\n \"access_key_id\": \"***\",\n \"secret_access_key\": \"***\"\n },\n \"endpoints\": \"***\",\n \"iam_apikey_description\": \"***\",\n \"iam_apikey_name\": \"***\",\n \"iam_role_crn\": \"***\",\n \"iam_serviceid_crn\": \"***\",\n \"resource_instance_id\": \"***\"\n }",
"_____no_output_____"
],
[
"connection_apikey = cos_credentials['apikey']\nconnection_resource_instance_id = cos_credentials[\"resource_instance_id\"]\nconnection_access_key_id = cos_credentials['cos_hmac_keys']['access_key_id']\nconnection_secret_access_key = cos_credentials['cos_hmac_keys']['secret_access_key']",
"_____no_output_____"
]
],
[
[
"**Action**: Define the service endpoint we will use. <br>\n**Tip**: You can find this information in Endpoints section of your Cloud Object Storage intance's dashbord.",
"_____no_output_____"
]
],
[
[
"service_endpoint = 'https://s3.us.cloud-object-storage.appdomain.cloud'",
"_____no_output_____"
]
],
[
[
"You also need IBM Cloud authorization endpoint to be able to create COS resource object.",
"_____no_output_____"
]
],
[
[
"auth_endpoint = 'https://iam.cloud.ibm.com/identity/token'",
"_____no_output_____"
]
],
[
[
"We create COS resource to be able to write data to Cloud Object Storage.",
"_____no_output_____"
]
],
[
[
"cos = ibm_boto3.resource('s3',\n ibm_api_key_id=cos_credentials['apikey'],\n ibm_service_instance_id=cos_credentials['resource_instance_id'],\n ibm_auth_endpoint=auth_endpoint,\n config=Config(signature_version='oauth'),\n endpoint_url=service_endpoint)",
"_____no_output_____"
]
],
[
[
"Now you will create bucket in COS and copy `training dataset` for model from **car_rental_training_data.csv**.",
"_____no_output_____"
]
],
[
[
"from uuid import uuid4\n\nbucket_uid = str(uuid4())\n\nscore_filename = \"car_rental_training_data.csv\"\nbuckets = [\"car-rental-\" + bucket_uid]",
"_____no_output_____"
],
[
"for bucket in buckets:\n if not cos.Bucket(bucket) in cos.buckets.all():\n print('Creating bucket \"{}\"...'.format(bucket))\n try:\n cos.create_bucket(Bucket=bucket)\n except ibm_boto3.exceptions.ibm_botocore.client.ClientError as e:\n print('Error: {}.'.format(e.response['Error']['Message']))",
"_____no_output_____"
],
[
"bucket_obj = cos.Bucket(buckets[0])\n\nprint('Uploading data {}...'.format(score_filename))\nwith open(filename, 'rb') as f:\n bucket_obj.upload_fileobj(f, score_filename)\nprint('{} is uploaded.'.format(score_filename))",
"_____no_output_____"
]
],
[
[
"### Create connections to a COS bucket",
"_____no_output_____"
]
],
[
[
"datasource_type = client.connections.get_datasource_type_uid_by_name('bluemixcloudobjectstorage')\n\nconn_meta_props= {\n client.connections.ConfigurationMetaNames.NAME: \"COS connection - spark\",\n client.connections.ConfigurationMetaNames.DATASOURCE_TYPE: datasource_type,\n client.connections.ConfigurationMetaNames.PROPERTIES: {\n 'bucket': buckets[0],\n 'access_key': connection_access_key_id,\n 'secret_key': connection_secret_access_key,\n 'iam_url': auth_endpoint,\n 'url': service_endpoint\n }\n}\n\nconn_details = client.connections.create(meta_props=conn_meta_props)",
"_____no_output_____"
]
],
[
[
"**Note**: The above connection can be initialized alternatively with `api_key` and `resource_instance_id`. \nThe above cell can be replaced with:\n\n\n```\nconn_meta_props= {\n client.connections.ConfigurationMetaNames.NAME: f\"Connection to Database - {db_name} \",\n client.connections.ConfigurationMetaNames.DATASOURCE_TYPE: client.connections.get_datasource_type_uid_by_name(db_name),\n client.connections.ConfigurationMetaNames.DESCRIPTION: \"Connection to external Database\",\n client.connections.ConfigurationMetaNames.PROPERTIES: {\n 'bucket': bucket_name,\n 'api_key': cos_credentials['apikey'],\n 'resource_instance_id': cos_credentials['resource_instance_id'],\n 'iam_url': 'https://iam.cloud.ibm.com/identity/token',\n 'url': 'https://s3.us.cloud-object-storage.appdomain.cloud'\n }\n}\n\nconn_details = client.connections.create(meta_props=conn_meta_props)\n\n```",
"_____no_output_____"
]
],
[
[
"connection_id = client.connections.get_uid(conn_details)",
"_____no_output_____"
]
],
[
[
"### 4.2 Save the pipeline and model<a id=\"save\"></a>",
"_____no_output_____"
]
],
[
[
"training_data_references = [\n {\n \"id\":\"car-rental-training\",\n \"type\": \"connection_asset\",\n \"connection\": {\n \"id\": connection_id\n },\n \"location\": {\n \"bucket\": buckets[0],\n \"file_name\": score_filename,\n }\n }\n ]",
"_____no_output_____"
],
[
"saved_model = client.repository.store_model(\n model=model_action, \n meta_props={\n client.repository.ModelMetaNames.NAME:\"CARS4U - Action Recommendation Model\",\n client.repository.ModelMetaNames.TYPE: \"mllib_2.4\",\n client.repository.ModelMetaNames.SOFTWARE_SPEC_UID: client.software_specifications.get_id_by_name('spark-mllib_2.4'),\n client.repository.ModelMetaNames.TRAINING_DATA_REFERENCES: training_data_references,\n client.repository.ModelMetaNames.LABEL_FIELD: \"Action\",\n }, \n training_data=train_data, \n pipeline=pipeline_action)",
"_____no_output_____"
]
],
[
[
"Get saved model metadata from Watson Machine Learning.",
"_____no_output_____"
]
],
[
[
"published_model_id = client.repository.get_model_uid(saved_model)\n\nprint(\"Model Id: \" + str(published_model_id))",
"_____no_output_____"
]
],
[
[
"**Model Id** can be used to retrive latest model version from Watson Machine Learning instance.",
"_____no_output_____"
],
[
"Below you can see stored model details.",
"_____no_output_____"
]
],
[
[
"client.repository.get_model_details(published_model_id)",
"_____no_output_____"
]
],
[
[
"<a id=\"deploy\"></a>\n## 5. Deploy model in the IBM Cloud",
"_____no_output_____"
],
[
"You can use following command to create online deployment in cloud.",
"_____no_output_____"
]
],
[
[
"deployment_details = client.deployments.create(\n published_model_id, \n meta_props={\n client.deployments.ConfigurationMetaNames.NAME: \"CARS4U - Action Recommendation model deployment\",\n client.deployments.ConfigurationMetaNames.ONLINE: {}\n }\n)",
"_____no_output_____"
],
[
"deployment_details",
"_____no_output_____"
]
],
[
[
"<a id=\"score\"></a>\n## 6. Score",
"_____no_output_____"
]
],
[
[
"fields = ['ID', 'Gender', 'Status', 'Children', 'Age', 'Customer_Status','Car_Owner', 'Customer_Service', 'Business_Area', 'Satisfaction']\nvalues = [3785, 'Male', 'S', 1, 17, 'Inactive', 'Yes', 'The car should have been brought to us instead of us trying to find it in the lot.', 'Product: Information', 0]",
"_____no_output_____"
],
[
"import json\n\npayload_scoring = {\"input_data\": [{\"fields\": fields,\"values\": [values]}]}\nscoring_response = client.deployments.score(client.deployments.get_id(deployment_details), payload_scoring)\n\nprint(json.dumps(scoring_response, indent=3))",
"_____no_output_____"
]
],
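[
[
"The response nests the result inside `predictions[0]` with parallel `fields` and `values` lists. A hedged convenience sketch for pulling out just the recommended action (the `predictedLabel` field name follows the response structure printed above):\n\n```python\npred = scoring_response['predictions'][0]\nrecord = dict(zip(pred['fields'], pred['values'][0]))\nprint(record.get('predictedLabel'))\n```",
"_____no_output_____"
]
],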
[
[
"<a id=\"cleanup\"></a>\n## 7. Clean up",
"_____no_output_____"
],
[
"If you want to clean up all created assets:\n- experiments\n- trainings\n- pipelines\n- model definitions\n- models\n- functions\n- deployments\n\nplease follow up this sample [notebook](https://github.com/IBM/watson-machine-learning-samples/blob/master/cloud/notebooks/python_sdk/instance-management/Machine%20Learning%20artifacts%20management.ipynb).",
"_____no_output_____"
],
[
"<a id=\"summary\"></a>\n## 8. Summary and next steps ",
"_____no_output_____"
],
[
" You successfully completed this notebook! You learned how to use Apache Spark machine learning as well as Watson Machine Learning for model creation and deployment. Check out our [Online Documentation](https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/ml-service-instance.html?context=analytics) for more samples, tutorials, documentation, how-tos, and blog posts. ",
"_____no_output_____"
],
[
"### Authors\n\n**Amadeusz Masny**, Python Software Developer in Watson Machine Learning at IBM",
"_____no_output_____"
],
[
"Copyright © 2020, 2021 IBM. This notebook and its source code are released under the terms of the MIT License.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
e7a10d0cdc75ca48e02ca4d3b044ace3faa95550 | 65,378 | ipynb | Jupyter Notebook | notebooks/RTN Classification - Active.ipynb | exowanderer/BadPixelDetector | 1dd30eaec6f2a1c6edd40322cde395ac2cd06626 | [
"BSD-3-Clause"
] | null | null | null | notebooks/RTN Classification - Active.ipynb | exowanderer/BadPixelDetector | 1dd30eaec6f2a1c6edd40322cde395ac2cd06626 | [
"BSD-3-Clause"
] | 1 | 2020-06-25T10:46:56.000Z | 2020-06-25T10:46:56.000Z | notebooks/RTN Classification - Active.ipynb | exowanderer/BadPixelDetector | 1dd30eaec6f2a1c6edd40322cde395ac2cd06626 | [
"BSD-3-Clause"
] | null | null | null | 28.536883 | 174 | 0.544617 | [
[
[
"%matplotlib inline\nimport numpy as np\nfrom pylab import *\nfrom astropy.io import fits\nfrom statsmodels.robust import scale\nfrom sklearn import preprocessing as pp\nfrom time import time\n\nfrom seaborn import *\nimport statsmodels.api as sm\nfrom sklearn.neighbors import KernelDensity\n\nstyle.use('fivethirtyeight')\nfrom IPython import display\nrcParams['axes.grid'] = False\nrcParams['lines.linewidth'] = 1.0",
"_____no_output_____"
],
[
"from ipywidgets import widgets\n# from IPython.display import display\nfrom IPython import display",
"_____no_output_____"
],
[
"mask0 = fits.open('NRCNRCALONG-DARK-53421914341_1_485_SE_2015-12-08T21h45m34_mask.fits')\ndarks0 = fits.open('NRCNRCALONG-DARK-53421914341_1_485_SE_2015-12-08T21h45m34.fits')",
"_____no_output_____"
],
[
"nQuirks = sum(mask0[0].data[0] != 0)\nnQuirks",
"_____no_output_____"
],
[
"mask = mask0[0].data[0]\ndarks = darks0[0].data",
"_____no_output_____"
],
[
"flags_loc = transpose(np.where(mask != 0))\nflags_loc",
"_____no_output_____"
],
[
"quirks_store = np.loadtxt('quirks_FINAL_for_cnaw_mask_CV3_dark_frames.txt')\nclasses_store = np.loadtxt('myclasses_new_FINAL_for_cnaw_mask_CV3_dark_frames.txt').astype(int)",
"_____no_output_____"
]
],
[
[
"Check If All Class 1 Bad Pixels Are Indeed Just Noisy Pixels\n---",
"_____no_output_____"
]
],
[
[
"quirks_store[classes_store == 1].shape",
"_____no_output_____"
],
[
"fig = figure()#figsize=(6,6))\nax = fig.add_subplot(111)\n# ax.plot([nan,nan])\ncorrections = []\nfor cnow in np.where(classes_store == 1)[0]:\n # ax.lines.pop()\n ax.clear()\n ax.plot(quirks_store[cnow] - median(quirks_store[cnow]))\n ax.set_title('Entry:' + str(cnow) + '/ Class:' + str(classes_store[cnow]))\n fig.canvas.draw()\n display.display(plt.gcf())\n display.clear_output(wait=True)\n# checkClass = input('Is this a Noisy Pixel? ');\n# if checkClass != '':\n# corrections.append([cnow, checkClass])\n# for cnow in np.where(classes_store == 1)[0]:\n# plt.plot(quirks_store[cnow])\n# display.clear_output(wait=True)\n# display.display(plt.gcf())\n# checkClass = input('Is this a Noisy Pixel? ');print(checkClass)\n# plt.clf()\n# display.clear_output(wait=True)",
"_____no_output_____"
]
],
[
[
"Check If All Class 4 Bad Pixels Are Indeed Just CR Pixels\n---",
"_____no_output_____"
]
],
[
[
"quirks_store[classes_store == 4].shape",
"_____no_output_____"
]
],
[
[
"fig = figure()#figsize=(6,6))\nax = fig.add_subplot(111)\n\nCRs = np.where(classes_store == 4)[0]\ncorrections = []\nfor cnow in :\n # ax.lines.pop()\n ax = fig.add_subplot(111)\n ax.plot((quirks_store[cnow] - min(quirks_store[cnow])) / (max(quirks_store[cnow]) - min(quirks_store[cnow])), lw=2)\n ax.set_title('Entry:' + str(cnow) + '/ Class:' + str(classes_store[cnow]))\n ax.annotate(str(cnow), [110, 0.5], fontsize=50)\n fig.canvas.draw()\n display.display(plt.gcf())\n time.sleep(.05)\n display.clear_output(wait=True)\n ax.lines.pop()\n ax.texts.pop()\n if cnow > 500 and cnow < 1000:\n display.display(plt.clf())\n ax = fig.add_subplot(111)\n# checkClass = input('Is this a Cosmic Ray? ');\n# if checkClass != '':\n# corrections.append([cnow, checkClass])\n# for cnow in np.where(classes_store == 1)[0]:\n# plt.plot(quirks_store[cnow])\n# display.clear_output(wait=True)\n# display.display(plt.gcf())\n# checkClass = input('Is this a Noisy Pixel? ');print(checkClass)\n# plt.clf()\n# display.clear_output(wait=True)",
"_____no_output_____"
],
[
"corrections",
"_____no_output_____"
]
],
[
[
"np.where(classes_store == 6)[0]",
"_____no_output_____"
]
],
[
[
"classes_store[[140,260, 380]] = 2",
"_____no_output_____"
]
],
[
[
"plot(quirks_store[140]);\nplot(quirks_store[260]);\nplot(quirks_store[380]);",
"_____no_output_____"
],
[
"((quirks_store.T - np.min(quirks_store,axis=1)) / (np.max(quirks_store,axis=1) - np.min(quirks_store, axis=1))).shape",
"_____no_output_____"
],
[
"((quirks_store.T - np.min(quirks_store,axis=1)) / (np.max(quirks_store,axis=1) - np.min(quirks_store, axis=1))).T[classes_store == 4].T.shape",
"_____no_output_____"
],
[
"np.sum(classes_store == 2) // 100",
"_____no_output_____"
],
[
"quirk_store_norm = ((quirks_store.T - np.min(quirks_store,axis=1)) / (np.max(quirks_store,axis=1) - np.min(quirks_store, axis=1))).T",
"_____no_output_____"
],
[
"classNow = 4\nk = 1\nstepsize = 100\nquirksNow = quirk_store_norm[classes_store == classNow][k*stepsize:(k+1)*stepsize].T\nquirksNow.shape",
"_____no_output_____"
],
[
"classes_store_bak = np.copy(classes_store)",
"_____no_output_____"
]
],
[
[
"classNow = 5\nstepsize = 50\nfig = figure(figsize=(16,30))\nfor k in range( np.sum(classes_store == classNow) // stepsize):\n quirksNow = quirk_store_norm[classes_store == classNow][k*stepsize:(k+1)*stepsize]\n# upper = np.where(quirksNow[:,-1] > 0.5)[0]\n lower = np.where(quirksNow[:,-1] < 0.5)[0]\n classes_store[classes_store == classNow][lower] = np.ones(len(classes_store[classes_store == classNow][lower]))*6\n# ax = fig.add_subplot(np.int(np.ceil(np.sum(classes_store == classNow) // stepsize / 2)), 2, k+1)\n# plot(quirksNow[lower].T);",
"_____no_output_____"
]
],
[
[
"fig = figure(figsize=(16,8))\nax1 = fig.add_subplot(121)\nax2 = fig.add_subplot(122)\nax1.plot(quirk_store_norm[classes_store == 5].T, lw=1);\nylims = ax1.get_ylim()\nxlims = ax1.get_xlim()\nxyNow = [np.min(xlims) + 0.5*diff(xlims),\n np.min(ylims) + 0.5*diff(ylims)]\nax1.annotate(str(5), xyNow, fontsize=75)\nax2.plot(quirk_store_norm[classes_store == 6].T, lw=1);\nylims = ax2.get_ylim()\nxlims = ax2.get_xlim()\nxyNow = [np.min(xlims) + 0.5*diff(xlims),\n np.min(ylims) + 0.5*diff(ylims)]\nax2.annotate(str(6), xyNow, fontsize=75)\n",
"_____no_output_____"
]
],
[
[
"classes_store_new = np.copy(classes_store)\nclasses_store_new[(classes_store == 5)*(quirk_store_norm[:,-1] < 0.5)] = 6\n# classes_store_new[(classes_store == 5)*(quirk_store_norm[:,-1] >= 0.5)] = classes_store[(classes_store == 5)*(quirk_store_norm[:,-1] >= 0.5)]\nclasses_store_new[classes_store_new == 6]\nnp.savetxt('myclasses_new_FINAL_for_cnaw_mask_CV3_dark_frames.txt', classes_store_new.astype(int), fmt='%d')",
"_____no_output_____"
]
],
[
[
"darks.shape",
"_____no_output_____"
],
[
"darks_trnspsd = np.transpose(darks, axes=(1,2,0))",
"_____no_output_____"
],
[
"for irow in range(len(quirks_store)):\n quirk_pp = pp.scale(quirks_store[irow])\n # print(std(quirk_pp), scale.mad(quirk_pp))\n plot(quirk_pp, alpha=0.5)# - median(darks_trnspsd[icol,irow])))\n# darks_scaled = pp.scale(darks,axis=0)",
"_____no_output_____"
],
[
"darks.shape, darks_trnspsd.shape",
"_____no_output_____"
],
[
"darks_reshaped = darks_trnspsd.reshape(darks_trnspsd.shape[0]*darks_trnspsd.shape[1], darks_trnspsd.shape[2])",
"_____no_output_____"
],
[
"darks_reshaped.shape",
"_____no_output_____"
],
[
"icol,irow = np.random.randint(0,2048,2)\npp.scale(darks_trnspsd[icol,irow] / median(darks_trnspsd[icol,irow]))",
"_____no_output_____"
],
[
"darks_norm = darks / median(darks, axis=0)",
"_____no_output_____"
],
[
"darks_std = std(darks_norm, axis=0)\ndarks_std.shape",
"_____no_output_____"
],
[
"darks_med_std = median(darks_std)",
"_____no_output_____"
],
[
"darks_flat = []\nfor irow in range(darks_reshaped.shape[0]):\n limit_check = std(darks_reshaped[irow] / median(darks_reshaped[irow])-1) < 2*darks_med_std\n # print(limit_check, std(darks_reshaped[irow] / median(darks_reshaped[irow])), darks_med_std)\n if limit_check:\n darks_flat.append(darks_reshaped[irow])\n\nnNormals = len(darks_flat)\nnNormals",
"_____no_output_____"
],
[
"darks_flat = np.array(darks_flat)\ndarks_flat.shape",
"_____no_output_____"
]
],
[
[
"darks_flat = darks_trnspsd[darks_std < 2*darks_med_std]\nnNormals = len(darks_flat)\nnNormals",
"_____no_output_____"
]
],
[
[
"darks_norm_trnspsd = np.transpose(darks_norm, axes=(1,2,0))\ndarks_norm_flat = darks_norm_trnspsd[darks_std < 2*darks_med_std]",
"_____no_output_____"
],
[
"darks_norm_flat.shape",
"_____no_output_____"
],
[
"darks_norm_flat.shape[0]",
"_____no_output_____"
]
],
[
[
"Simulate RTNs because the CV3 training data has None\n---",
"_____no_output_____"
]
],
[
[
"np.random.seed(42)\nsaturation = 2**16\ndynRange = 2**9\nnSamps = 1000\nnSig = 4.0\nnFrames = darks_norm_flat.shape[1]\nrtn_syn = np.zeros((nSamps, nFrames))\nrtn_classes = np.zeros(nSamps)\nmaxRTNs = np.int(0.9*nFrames)\nmaxWidth = 50\nminWidth = 10\nrtnCnt = 0\n\ndark_inds = np.arange(darks_reshaped.shape[0])\nframe_inds= np.arange(darks_reshaped.shape[1])\nfor irow in np.random.choice(dark_inds,nSamps,replace=False):\n rtn_syn[rtnCnt] = np.copy(darks_reshaped[irow])\n if darks_reshaped[irow].std() > 50:\n print(darks_reshaped[irow].std())\n nRTNs = np.random.randint(maxRTNs)\n \n coinflip = np.random.randint(0, 2)\n sign_rand = np.random.choice([-1,1])\n minJump = nSig*std(rtn_syn[rtnCnt] - median(rtn_syn[rtnCnt]))\n jump = abs(np.random.normal(minJump,dynRange) - minJump) + minJump\n if coinflip:\n rtn_classes[rtnCnt] = 0\n RTN_locs = np.random.choice(frame_inds, nRTNs, replace=False)\n for iRTN in RTN_locs:\n rtn_syn[rtnCnt][iRTN] += sign_rand*jump\n else:\n randWidth = np.random.randint(minWidth, maxWidth + 1)\n randStart = np.random.randint(minWidth, nFrames - randWidth - minWidth + 1)\n rtn_syn[rtnCnt][randStart:randStart+randWidth] += sign_rand*jump\n rtn_classes[rtnCnt] = 1\n \n rtn_syn[rtnCnt][rtn_syn[rtnCnt] > saturation] = saturation\n # if not rtnCnt % 100:\n plot(rtn_syn[rtnCnt] - median(rtn_syn[rtnCnt]))\n \n rtnCnt = rtnCnt + 1\n\nxlim(-1,110);\n# ylim(-100,100);",
"_____no_output_____"
],
[
"darks_flat_med_axis0 = np.median(darks_flat,axis=0)\ndarks_flat_med_axis1 = np.median(darks_flat,axis=1)",
"_____no_output_____"
],
[
"darks_flat_std_axis0 = np.std(darks_flat,axis=0)\ndarks_flat_std_axis1 = np.std(darks_flat,axis=1)",
"_____no_output_____"
],
[
"darks_flat_med_axis0_norm = darks_flat_med_axis0[1:] / median(darks_flat_med_axis0[1:])",
"_____no_output_____"
],
[
"classLabels = {1:'Noisy', 2:'HP', 3:'IHP', 4:'LHP', 5:'SHP', 6:'CR', 7:'RTN0', 8:'RTN1'}\nfor k in range(1,9):\n print(k, classLabels[k])",
"_____no_output_____"
],
[
"def kde_sklearn(x, x_grid, bandwidth=0.2, **kwargs):\n \"\"\"Kernel Density Estimation with Scikit-learn\"\"\"\n kde_skl = KernelDensity(bandwidth=bandwidth, **kwargs)\n kde_skl.fit(x[:, np.newaxis])\n # score_samples() returns the log-likelihood of the samples\n log_pdf = kde_skl.score_samples(x_grid[:, np.newaxis])\n return np.exp(log_pdf)",
"_____no_output_____"
],
[
"flags_loc[:,0]",
"_____no_output_____"
],
[
"# x0, y0 = flags_loc[2187]\n\nfig = figure(figsize=(16,6))\nax1 = fig.add_subplot(121)\nax2 = fig.add_subplot(122)\nax1.clear()\n\nrtnNow = 3\n\nleave1out = np.zeros(rtn_syn[rtnNow].size)\nfor k in range(rtn_syn[rtnNow].size):\n leave1out[k] = np.std(hstack([rtn_syn[rtnNow][:k],rtn_syn[rtnNow][k+1:]]))\n\n\ntestRand1 = np.random.normal(rtn_syn[rtnNow].mean(), 0.1*rtn_syn[rtnNow].std(), rtn_syn[rtnNow].size)\ntestRand2 = np.random.normal(rtn_syn[rtnNow].mean(), rtn_syn[rtnNow].std(), rtn_syn[rtnNow].size)\n\nleave1out1 = np.zeros(rtn_syn[rtnNow].size)\nleave1out2 = np.zeros(rtn_syn[rtnNow].size)\nfor k in range(rtn_syn[rtnNow].size):\n leave1out1[k] = np.std(hstack([testRand1[:k],testRand1[k+1:]]))\n leave1out2[k] = np.std(hstack([testRand2[:k],testRand2[k+1:]]))\n\nl1o_diffRTN = np.std(rtn_syn[rtnNow]) - leave1out\nl1o_diffTR1 = np.std(testRand1) - leave1out1\nl1o_diffTR2 = np.std(testRand2) - leave1out2\n\nax1.hist(pp.scale(darks_flat_med_axis0), bins=20, normed=True, alpha=0.25, label='DarksMed Rescaled')\nkde2 = sm.nonparametric.KDEUnivariate(pp.scale(darks_flat_med_axis0))\nkde2.fit(kernel='uni', fft=False)\nax1.plot(kde2.support, kde2.density, lw=2, color=rcParams['axes.color_cycle'][1])\nax1.hist((l1o_diffRTN - l1o_diffRTN.mean()) / l1o_diffRTN.std(), bins=20, label='Leave 1 Out Std RTN', normed=True,alpha=0.5);\nax1.hist((l1o_diffTR1 - l1o_diffTR1.mean()) / l1o_diffTR1.std(), bins=20, label='Leave 1 Out rand 1', normed=True,alpha=0.5);\nax1.hist((l1o_diffTR2 - l1o_diffTR2.mean()) / l1o_diffTR2.std(), bins=20, label='Leave 1 Out rand 2', normed=True,alpha=0.5);\n\nax2.hist((rtn_syn[rtnNow] - rtn_syn[rtnNow].mean()) / rtn_syn[rtnNow].std(), bins=20, label='RTN Now', normed=True,alpha=0.5);\nax2.hist((testRand1 - testRand1.mean()) / testRand1.std(), bins=20, label='testRand1', normed=True,alpha=0.5);\nax2.hist((testRand2 - testRand2.mean()) / testRand2.std() , bins=20, label='testRand2', normed=True,alpha=0.5);\nax1.legend(loc=0)\nax2.legend(loc=0)\nfig.canvas.draw()\n# display.display(plt.gcf())\n# display.clear_output(wait=True)",
"_____no_output_____"
],
[
"# x0, y0 = flags_loc[2187]\n\nfig = figure()\nax = fig.add_subplot(111)\nrtnNow = 1\nfor rtnNow in range(len(rtn_syn)):\n ax.clear()\n leave1out = np.zeros(rtn_syn[rtnNow].size)\n for k in range(rtn_syn[rtnNow].size):\n leave1out[k] = np.std(hstack([rtn_syn[rtnNow][:k],rtn_syn[rtnNow][k+1:]]))\n \n ax.plot(rescale(np.std(rtn_syn[rtnNow]) - leave1out), label='Leave 1 Out Std');\n ax.plot(rescale(rtn_syn[rtnNow])+1, label='RTN Now');\n ax.legend(loc=0)\n fig.canvas.draw()\n display.display(plt.gcf())\n display.clear_output(wait=True)",
"_____no_output_____"
],
[
"x0, y0 = flags_loc[2808]#413, 176\nprint(x0, y0)\nmeandark = np.mean([darks[:,x0+1, y0+0]*std(darks[:,x0+1, y0+0]),darks[:,x0-1, y0+0]*std(darks[:,x0-1, y0+0]), \\\n darks[:,x0+0, y0+1]*std(darks[:,x0+0, y0+1]),darks[:,x0+0, y0-1]*std(darks[:,x0+0, y0-1])], axis=0)\n# 160 335\n# 159 335\n# 161 335\n# 160 334\n# 160 336\n# meandark = np.mean([darks[:,159, 335],darks[:,161, 335], darks[:,160, 334],darks[:,160, 336]],axis=0)\n\n# meddark = np.median([darks[:,159, 335]*std(darks[:,159, 335]),darks[:,161, 335]*std(darks[:,161, 335]), \\\n# darks[:,160, 334]*std(darks[:,160, 334]),darks[:,160, 336]*std(darks[:,160, 336])],axis=0)\n\n# meandark = np.mean([darks[:,159, 335]*std(darks[:,159, 335]),darks[:,161, 335]*std(darks[:,161, 335]), \\\n# darks[:,160, 334]*std(darks[:,160, 334]),darks[:,160, 336]*std(darks[:,160, 336])], axis=0)\n\nfig = figure(figsize=(12,6))\nplot((darks[:,x0, y0] - np.min(darks[:,x0, y0])), lw=4)\n# plot((meandark / np.min(meandark)), lw=4)\n\n# plot((meddark - meddark.min()) / (meddark.max() - meddark.min() ), lw=4)\n\nplot((darks[:,x0+1, y0+0]) - np.min((darks[:,x0+1, y0+0])))\nplot((darks[:,x0-1, y0+0]) - np.min((darks[:,x0-1, y0+0])))\nplot((darks[:,x0+0, y0+1]) - np.min((darks[:,x0+0, y0+1])))\nplot((darks[:,x0+0, y0-1]) - np.min((darks[:,x0+0, y0-1])))\nxlim(-1,110)",
"_____no_output_____"
],
[
"x0, y0 = flags_loc[2808]#413, 176\nprint(x0, y0)\nmeandark = np.mean([darks[:,x0+1, y0+0]*std(darks[:,x0+1, y0+0]),darks[:,x0-1, y0+0]*std(darks[:,x0-1, y0+0]), \\\n darks[:,x0+0, y0+1]*std(darks[:,x0+0, y0+1]),darks[:,x0+0, y0-1]*std(darks[:,x0+0, y0-1])], axis=0)\n# 160 335\n# 159 335\n# 161 335\n# 160 334\n# 160 336\n# meandark = np.mean([darks[:,159, 335],darks[:,161, 335], darks[:,160, 334],darks[:,160, 336]],axis=0)\n\n# meddark = np.median([darks[:,159, 335]*std(darks[:,159, 335]),darks[:,161, 335]*std(darks[:,161, 335]), \\\n# darks[:,160, 334]*std(darks[:,160, 334]),darks[:,160, 336]*std(darks[:,160, 336])],axis=0)\n\n# meandark = np.mean([darks[:,159, 335]*std(darks[:,159, 335]),darks[:,161, 335]*std(darks[:,161, 335]), \\\n# darks[:,160, 334]*std(darks[:,160, 334]),darks[:,160, 336]*std(darks[:,160, 336])], axis=0)\nfig = figure(figsize=(12,12))\n# plot(rescale(darks[:,x0, y0]), lw=4)\n# plot(rescale(meandark), lw=4)\n\n# plot((meddark - meddark.min()) / (meddark.max() - meddark.min() ), lw=4)\n\naxvline(argmax(diff(rescale(darks[:,x0, y0])))+1,lw=4)\nplot(rescale(darks[:,x0+1, y0+0]))\nplot(rescale(darks[:,x0-1, y0+0]))\nplot(rescale(darks[:,x0+0, y0+1]))\nplot(rescale(darks[:,x0+0, y0-1]))",
"_____no_output_____"
],
[
"def rescale(arr):\n return (arr - arr.min()) / (arr.max() - arr.min())",
"_____no_output_____"
],
[
"# quirkCheck = np.zeros(len(quirks_store))",
"_____no_output_____"
],
[
"np.savetxt('quirkCheck_save_bkup.txt', quirkCheck)",
"_____no_output_____"
],
[
"quirkCheck[np.where(quirkCheck ==0)[0].min()-1] = 0.0",
"_____no_output_____"
],
[
"fig = figure(figsize=(15,15))\n\nax1 = fig.add_subplot(221)\nax2 = fig.add_subplot(222)\nax3 = fig.add_subplot(223)\nax4 = fig.add_subplot(224)\n\n# fig = figure(figsize=(15,5))\n# ax1 = fig.add_subplot(141)\n# ax2 = fig.add_subplot(142)\n# ax3 = fig.add_subplot(143)\n# ax4 = fig.add_subplot(144)\n\ndarkMed = darks_flat_med_axis0 - np.min(darks_flat_med_axis0)\ndarksMed_scaled = darks_flat_med_axis0 / median(darks_flat_med_axis0)# pp.scale(darks_flat_med_axis0)\n\ndiff_darks_flat_med_axis0 = np.zeros(darks_flat_med_axis0.size)\ndiff_darks_flat_med_axis0[1:] = diff(darks_flat_med_axis0)\n\nclassLabels = {1:'Noisy', 2:'HP', 3:'IHP', 4:'LHP', 5:'SHP', 6:'CR', 7:'RTN0', 8:'RTN1'}\nnp.savetxt('quirkCheck_save_bkup.txt', quirkCheck)\nfor iQuirk, quirkNow in enumerate(quirks_store):\n if quirkCheck[iQuirk]:\n continue\n \n ax1.clear()\n ax2.clear()\n ax3.clear()\n ax4.clear()\n \n classNow = classes_store[iQuirk]\n classOut = classNow\n if classNow == 3:\n classOut = 7\n if classNow == 4:\n classOut = 6\n if classNow == 5:\n classOut = 5\n if classNow == 6:\n classOut = 3\n \n # ax1.plot(darks_reshaped[irow][1:] / median(darks_reshaped[irow][1:]) - darks_flat_med_axis0_norm);\n \n # Plot Subtraction frame: (Darknow - min(Darknow)) - (DarkMed - min(DarkMed))\n # darkNowMinusMed = (darks_reshaped[irow][1:] - np.min(darks_reshaped[irow][1:])) - \\\n # (darks_flat_med_axis0[1:] - np.min(darks_flat_med_axis0[1:]))\n \n quirkNow_scaled = pp.scale(quirkNow)\n \n quirkMinusMed = (quirkNow_scaled - np.min(quirkNow_scaled)) - (darksMed_scaled - np.min(darksMed_scaled))\n \n quirkNowRescaled = pp.scale((quirkNow - np.min(quirkNow))-(darks_flat_med_axis0 - np.min(darks_flat_med_axis0)))\n _, xhist = np.histogram(quirkNowRescaled, bins=20, normed=True)#, alpha=0.50)\n \n kde1 = sm.nonparametric.KDEUnivariate(quirkNowRescaled)\n kde1.fit(kernel='uni', bw=0.33*median(diff(xhist)), fft=False)\n \n ax1.plot(kde1.support, rescale(kde1.density), lw=2, color=rcParams['axes.color_cycle'][0], label='QuirkNow Rescaled')\n \n # if classNow == 1:\n #ax1.hist(pp.scale(darks_flat_med_axis0), bins=20, normed=True, alpha=0.25, label='DarksMed Rescaled')\n kde2 = sm.nonparametric.KDEUnivariate(pp.scale(darks_flat_med_axis0))\n kde2.fit(kernel='uni', fft=False)\n ax1.plot(kde2.support, rescale(kde2.density), lw=2, color=rcParams['axes.color_cycle'][1], label='DarksMed Rescaled')\n \n leave1out = np.zeros(nFrames)\n for k in range(nFrames):\n leave1out[k] = np.std(hstack([quirkNow[:k],quirkNow[k+1:]]))\n \n #ax1.hist(pp.scale(np.std(quirkNow) - leave1out), bins=20, alpha=0.5, normed=True, label='Leave1Out Rescaled');\n kde3 = sm.nonparametric.KDEUnivariate(pp.scale(np.std(quirkNow) - leave1out))\n kde3.fit(kernel='uni', fft=False)\n ax1.plot(kde3.support, rescale(kde3.density), lw=2, color=rcParams['axes.color_cycle'][2], label='Leave1Out Rescaled')\n ax1.legend()\n # else:\n # ax1.hist((darksAvg - np.median(darksAvg)), bins=20, normed=True, alpha=0.25)\n # kde2 = sm.nonparametric.KDEUnivariate((darksAvg - np.median(darksAvg)))\n # kde2.fit(kernel='uni', fft=False)\n # ax1.plot(kde2.support, kde2.density, lw=2, color=rcParams['axes.color_cycle'][1])\n\n ylims = ax1.get_ylim()\n xlims = ax1.get_xlim()\n xyNow1 = [np.min(xlims) + 0.1*diff(xlims),\n np.min(ylims) + 0.9*diff(ylims)]\n ax1.annotate(str(classOut) + ': ' + classLabels[classOut], xyNow1, fontsize=75)\n # ax1.plot()\n # ax1.axvline(median(rtnNow), linestyle='--', color='k')\n ax1.set_xlabel('Subtraction Hist')\n # ax1.plot(darks_reshaped[irow][1:] / 
median(darks_reshaped[irow][1:]) / darks_flat_med_axis0_norm - 1);\n \n # Plot Normalized Frame: DarkNow vs DarMed\n ax2.plot((quirkNow - np.min(quirkNow))/darksMed_scaled,'o-');\n ylims = ax2.get_ylim()\n xlims = ax2.get_xlim()\n xyNow2 = [np.\n \n min(xlims) + 0.1*diff(xlims),\n np.min(ylims) + 0.9*diff(ylims)]\n ax2.annotate(str(classOut) + ': ' + classLabels[classOut], xyNow2, fontsize=75)\n #ax2.plot(darksMed_scaled,'o-')\n ax2.set_xlabel('Normalized Frame')\n \n # Plot Common Mode Correlation Frame\n # ax3.plot((quirkNow - np.min(quirkNow))-(darks_flat_med_axis0 - np.min(darks_flat_med_axis0)), darksMed_scaled,'o')\n # ax3.plot(darksMed_scaled, darksMed_scaled,'o')\n ax3.plot(rescale(diff(quirkNow)),'o-')\n ax3.plot(rescale(diff_darks_flat_med_axis0), 'o-', alpha=0.25)\n ax3.axhline(np.median(rescale(diff_darks_flat_med_axis0)), c='k', lw=1)\n ax3.axhline(np.median(rescale(diff_darks_flat_med_axis0))+np.std(rescale(diff_darks_flat_med_axis0)), c='k', lw=1,ls='--')\n ax3.axhline(np.median(rescale(diff_darks_flat_med_axis0))-np.std(rescale(diff_darks_flat_med_axis0)), c='k', lw=1,ls='--')\n ylims = ax3.get_ylim()\n xlims = ax3.get_xlim()\n xyNow3 = [np.min(xlims) + 0.1*diff(xlims),\n np.min(ylims) + 0.9*diff(ylims)]\n ax3.annotate(str(classOut) + ': ' + classLabels[classOut], xyNow3, fontsize=75)\n ax3.set_xlabel('Diff Mode')\n \n # Plot Raw DN minus Min Dark Ramp: DarkNow - min(DarkNow) vs DarkMed - min(DarkMed)\n flagNow = flags_loc[iQuirk]\n dark0 = quirkNow - np.min(quirkNow)\n # ax4.plot(rescale((dark0 + diff_darks_flat_med_axis0)),'o-', color=rcParams['axes.color_cycle'][0])\n ax4.plot(rescale((dark0 - diff_darks_flat_med_axis0)),'o-', color=rcParams['axes.color_cycle'][0])\n \n avgCnt = 0\n darksAvg = np.zeros(quirkNow.size)\n if flagNow[0] > 0:\n avgCnt += 1\n darksAvg += (darks[:,flagNow[0]-1, flagNow[1]+0] - diff_darks_flat_med_axis0) * std(darks[:,flagNow[0]-1, flagNow[1]+0])\n \n # ax4.plot(rescale(darks[:,flagNow[0]-1, flagNow[1]+0]),'o-')\n if flagNow[0] + 1 < darks.shape[1]:\n avgCnt += 1\n darksAvg += (darks[:,flagNow[0]+1, flagNow[1]+0] - diff_darks_flat_med_axis0) * std(darks[:,flagNow[0]+1, flagNow[1]+0])\n \n # ax4.plot(rescale(darks[:,flagNow[0]+1, flagNow[1]+0]),'o-')\n if flagNow[1] > 0:\n avgCnt += 1\n darksAvg += (darks[:,flagNow[0]+0, flagNow[1]-1] - diff_darks_flat_med_axis0) * std(darks[:,flagNow[0]+0, flagNow[1]-1])\n \n # ax4.plot(rescale(darks[:,flagNow[0]+0, flagNow[1]-1]),'o-')\n if flagNow[1] + 1 < darks.shape[1]:\n avgCnt += 1\n darksAvg += (darks[:,flagNow[0]+0, flagNow[1]+1] - diff_darks_flat_med_axis0) * std(darks[:,flagNow[0]+0, flagNow[1]+1])\n \n # ax4.plot(rescale(darks[:,flagNow[0]+0, flagNow[1]+1]),'o-',lw=4)\n \n darksAvg = darksAvg / avgCnt\n \n ax4.plot(rescale(darksAvg), 'o-', color=rcParams['axes.color_cycle'][3])\n # ax2.plot((darksAvg - np.min(darksAvg))/darksMed_scaled,'o-');\n ylims = ax4.get_ylim()\n xlims = ax4.get_xlim()\n xyNow4 = [np.min(xlims) + 0.1*diff(xlims),\n np.min(ylims) + 0.9*diff(ylims)]\n ax4.annotate(str(classOut) + ': ' + classLabels[classOut], xyNow4, fontsize=75)\n \n ax4.set_xlabel('Rescaled Nearby Pixels ' + str(flagNow[0]) + ',' + str(flagNow[1]))\n # ax4.set_ylim(-5,5)\n # ax.plot(darks_flat_med_axis0[1:] / median(darks_flat_med_axis0[1:]))\n fig.suptitle('iQuirk: ' + str(iQuirk) + ' / ' + str(len(quirks_store)), fontsize=20)\n # ax1.set_ylim(ax2.get_ylim())\n fig.canvas.draw()\n display.display(plt.gcf())\n \n inputNow = input('[1:Noisy, 2:HP, 3:IHP, 4:LHP, 5:SHP, 6:CR, 7:RTN0, 8:RTN1]? 
')\n \n # inputNowBak = np.copy(inputNow)\n quirkCheck[iQuirk] = int(classOut)\n if inputNow == '':\n pass\n else:\n classOut = int(inputNow)\n \n doubleCheck = input(str(classNow) + \" -> \" + str(classOut) + \"? \")\n \n #doubleCheck != '' and {'y':True, 'n':False}[doubleCheck.lower()[0]]:\n if doubleCheck.lower()[0] == 'y':\n print('Changed '+ str(classNow) + ': ' + classLabels[classNow] + \" to \" + str(classOut) + ': ' + classLabels[classOut] + \"!\")\n quirkCheck[iQuirk] = int(classNow)\n \n display.clear_output(wait=True)",
"_____no_output_____"
],
[
"np.sum(quirkCheck == 0)",
"_____no_output_____"
],
[
"fig = figure(figsize=(20,5))\nax1 = fig.add_subplot(141)\nax2 = fig.add_subplot(142)\nax3 = fig.add_subplot(143)\nax4 = fig.add_subplot(144)\n\ndarksMed_scaled = darks_flat_med_axis0 / median(darks_flat_med_axis0)# pp.scale(darks_flat_med_axis0)\n\nrtnCheck = []\nfor iRTN, rtnNow in enumerate(rtn_syn):\n ax1.clear()\n ax2.clear()\n ax3.clear()\n ax4.clear()\n # ax1.plot(darks_reshaped[irow][1:] / median(darks_reshaped[irow][1:]) - darks_flat_med_axis0_norm);\n \n # Plot Subtraction frame: (Darknow - min(Darknow)) - (DarkMed - min(DarkMed))\n # darkNowMinusMed = (darks_reshaped[irow][1:] - np.min(darks_reshaped[irow][1:])) - \\\n # (darks_flat_med_axis0[1:] - np.min(darks_flat_med_axis0[1:]))\n \n rtnNow_scaled = pp.scale(rtnNow_scaled)\n \n rtnMinusMed = (rtnNow_scaled - np.min(rtnNow_scaled)) - (darksMed_scaled - np.min(darksMed_scaled))\n \n ax1.hist((rtnNow - np.min(rtnNow))-(darks_flat_med_axis0 - np.min(darks_flat_med_axis0)), bins=20, normed=True)\n kde1 = sm.nonparametric.KDEUnivariate((rtnNow - np.min(rtnNow))-(darks_flat_med_axis0 - np.min(darks_flat_med_axis0)))\n # kde2 = sm.nonparametric.KDEUnivariate(rtnNow)\n kde1.fit()\n # kde2.fit()\n ax1.plot(kde1.support, kde1.density)\n \n # ax1.plot()\n # ax1.axvline(median(rtnNow), linestyle='--', color='k')\n ax1.set_title('Subtraction Hist')\n # ax1.plot(darks_reshaped[irow][1:] / median(darks_reshaped[irow][1:]) / darks_flat_med_axis0_norm - 1);\n \n # Plot Normalized Frame: DarkNow vs DarMed\n ax2.plot((rtnNow - np.min(rtnNow))/darksMed_scaled,'o-');\n #ax2.plot(darksMed_scaled,'o-')\n ax2.set_title('Normalized Frame')\n \n # Plot Common Mode Correlation Frame\n ax3.plot((rtnNow - np.min(rtnNow))-(darks_flat_med_axis0 - np.min(darks_flat_med_axis0)), darksMed_scaled,'o')\n # ax3.plot(darksMed_scaled, darksMed_scaled,'o')\n ax3.set_title('Common Mode')\n \n # Plot Raw DN minus Min Dark Ramp: DarkNow - min(DarkNow) vs DarkMed - min(DarkMed)\n # ax4.plot(rtnNow - np.min(rtnNow),'o-')\n ax4.plot((rtnNow - np.min(rtnNow)),'o-')\n ax4.plot((darks_flat_med_axis0 - np.min(darks_flat_med_axis0)),'o-')\n ax4.set_title('Raw DN - Min')\n \n # ax.plot(darks_flat_med_axis0[1:] / median(darks_flat_med_axis0[1:]))\n fig.suptitle('Row:' + str(irow) + ' iRTN: ' + str(iRTN) + ' / ' + str(len(rtn_syn)))\n # ax1.set_ylim(ax2.get_ylim())\n fig.canvas.draw()\n display.display(plt.gcf())\n display.clear_output(wait=True)\n rtnCheck.append(input('[1:Noisy, 2:HP, 3:IHP, 4:LHP, 5:SHP, 6:CR, 7:RTN0, 8:RTN1]? '))",
"_____no_output_____"
],
[
"np.random.seed(42)\nnFlatDarks = 5000\ndf_inds = np.arange(darks_flat.shape[0])\ndf_sample = np.random.choice(df_inds, nFlatDarks, replace=False)\ndarks_flat_sample = np.copy(darks_flat[df_sample])",
"_____no_output_____"
],
[
"plot(darks_flat_sample.T - median(darks_flat_sample,axis=1));",
"_____no_output_____"
],
[
"darks_flat_sample.shape",
"_____no_output_____"
],
[
"for k in range(darks_flat_sample.shape[0]):\n if (darks_flat_sample[k] - median(darks_flat_sample[k]) > 200).any():\n print(k)",
"_____no_output_____"
],
[
"std(abs(darks_flat_sample[850] / median(darks_flat_sample[850]))-1), 2*darks_med_std",
"_____no_output_____"
],
[
"darks_flat_sample0 = darks_flat_sample.copy()\ndarks_flat_sample = vstack([darks_flat_sample0[:850], darks_flat_sample0[851:]])#.shape",
"_____no_output_____"
],
[
"plot(np.mean(darks_flat,axis=0))\nplot(np.median(darks_flat,axis=0))",
"_____no_output_____"
],
[
"np.median(darks_flat,axis=0).shape",
"_____no_output_____"
],
[
"darks_flat_std_axis1.shape",
"_____no_output_____"
],
[
"darks_flat_med_axis0.shape,darks_flat_med_axis1.shape",
"_____no_output_____"
],
[
"darks_flat_sample_med1 = np.median(darks_flat_sample,axis=1)",
"_____no_output_____"
],
[
"print(darks_flat_med_axis0.shape, darks_flat_sample_med1.shape)",
"_____no_output_____"
],
[
"darks_flat_sample.shape",
"_____no_output_____"
],
[
"plot(pp.scale(darks_flat_sample).T - pp.scale(darks_flat_med_axis0));",
"_____no_output_____"
]
],
[
[
"Remove Common Mode variations from frame to frame (time series)\n---\nProbably related to bias drifting",
"_____no_output_____"
]
],
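[
[
"A minimal sketch (added for illustration, not part of the original analysis) of one possible common-mode correction, assuming `darks_reshaped` and `darks_flat_med_axis0` as defined above; the variable names below are only illustrative.",
"_____no_output_____"
]
],
[
[
"## Hypothetical sketch: treat the median ramp as the common mode and divide it\n## out of every ramp, suppressing bias-drift-like structure shared by all pixels.\ncommon_mode = darks_flat_med_axis0 / np.median(darks_flat_med_axis0)\ndarks_cm_removed = darks_reshaped / common_mode\nplot(darks_cm_removed[::5000].T);",
"_____no_output_____"
]
],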
[
[
"plot(darks_flat_med_axis0)",
"_____no_output_____"
],
[
"quirks_store_smooth = np.copy(quirks_store) #/ darks_flat_med_axis0\nrtn_syn_smooth = np.copy(rtn_syn) #/ darks_flat_med_axis0\ndarks_flat_sample_smooth = np.copy(darks_flat_sample) #/ darks_flat_med_axis0",
"_____no_output_____"
],
[
"print(quirks_store_smooth.shape, rtn_syn_smooth.shape, darks_flat_sample_smooth.shape)",
"_____no_output_____"
],
[
"plot(((quirks_store_smooth.T - np.min(quirks_store_smooth,axis=1)) / (np.max(quirks_store_smooth,axis=1) - np.min(quirks_store_smooth,axis=1))));",
"_____no_output_____"
],
[
"plot(((rtn_syn_smooth.T - np.min(rtn_syn_smooth,axis=1)) / (np.max(rtn_syn_smooth,axis=1) - np.min(rtn_syn_smooth,axis=1))));",
"_____no_output_____"
],
[
"quirksNoisy = quirks_store_smooth[classes_store==1]\nplot(((quirksNoisy.T - np.min(quirksNoisy,axis=1)) / (np.max(quirksNoisy,axis=1) - np.min(quirksNoisy,axis=1))));",
"_____no_output_____"
],
[
"plot(((darks_flat_sample_smooth.T - np.min(darks_flat_sample_smooth,axis=1)) / (np.max(darks_flat_sample_smooth,axis=1) - np.min(darks_flat_sample_smooth,axis=1))));",
"_____no_output_____"
]
],
[
[
"Random Forest Classification\n---",
"_____no_output_____"
],
[
"Load Sci-kit Learn Libraries",
"_____no_output_____"
]
],
[
[
"from sklearn.ensemble import RandomForestClassifier\nfrom sklearn.utils import shuffle\nfrom sklearn.cross_validation import train_test_split\nfrom sklearn.externals import joblib",
"_____no_output_____"
]
],
[
[
"darks_classes = np.zeros(darks_flat_sample_smooth.shape[0],dtype=int)\nrtn_classes = rtn_classes + 3\nsamples_train_set = vstack([quirks_store_smooth, rtn_syn_smooth, darks_flat_sample_smooth])\nclasses_train_set = vstack([classes_store[:,None], rtn_classes[:,None], darks_classes[:,None]])[:,0]",
"_____no_output_____"
]
],
[
[
"classes_store[np.where(classes_store > 3)] += 1",
"_____no_output_____"
],
[
"darks_classes = np.zeros(darks_flat_sample_smooth.shape[0],dtype=int)\nrtn_classes = rtn_classes + 3\nsamples_train_set = vstack([quirks_store, rtn_syn, darks_flat_sample])\nclasses_train_set = vstack([classes_store[:,None], rtn_classes[:,None], darks_classes[:,None]])[:,0]",
"_____no_output_____"
],
[
"samples_train_set.shape",
"_____no_output_____"
]
],
[
[
"sts_inds = np.arange(samples_train_set.shape[0])\nunsort_sts = np.random.choice(sts_inds, sts_inds.size, replace=False)",
"_____no_output_____"
]
],
[
[
"samples_train_set_resort = shuffle(np.copy(samples_train_set), random_state=42)\nclasses_train_resort = shuffle(np.copy(classes_train_set), random_state=42)",
"_____no_output_____"
]
],
[
[
"Rescaled all samples from 0 to 1",
"_____no_output_____"
]
],
[
[
"samples_train_set_resort.shape",
"_____no_output_____"
],
[
"samples_train_set_resort_scaled = (( samples_train_set_resort.T - np.min(samples_train_set_resort,axis=1)) / \\\n (np.max(samples_train_set_resort,axis=1) - np.min(samples_train_set_resort,axis=1))).T",
"_____no_output_____"
],
[
"samples_train_set_resort_scaled.shape",
"_____no_output_____"
],
[
"plot(samples_train_set_resort_scaled.T);",
"_____no_output_____"
]
],
[
[
"Establish Random Forest Classification\n- 1000 trees\n- OOB Score\n- Multiprocessing",
"_____no_output_____"
]
],
[
[
"rfc = RandomForestClassifier(n_estimators=1000, oob_score=True, n_jobs=-1, random_state=42, verbose=True)",
"_____no_output_____"
]
],
[
[
"rfc2 = RandomForestClassifier(n_estimators=1000, oob_score=True, n_jobs=-1, random_state=42, verbose=True)",
"_____no_output_____"
]
],
[
[
"Split Samples into 75% Train and 25% Test",
"_____no_output_____"
]
],
[
[
"X_train, X_test, Y_train, Y_test = train_test_split(samples_train_set_resort_scaled.T, classes_train_resort, test_size = 0.25, random_state=42)",
"_____no_output_____"
],
[
"X_train.shape, X_test.shape, Y_train.shape, Y_test.shape",
"_____no_output_____"
]
],
[
[
"Shuffle Training Data Set",
"_____no_output_____"
]
],
[
[
"X_train, Y_train = shuffle(X_train, Y_train, random_state=42)",
"_____no_output_____"
]
],
[
[
"Train Classifier with `rfc.fit`",
"_____no_output_____"
]
],
[
[
"rfc.fit(X_train, Y_train)",
"_____no_output_____"
]
],
[
[
"rfc.fit(samples_train_set_resort_scaled, classes_train_resort)",
"_____no_output_____"
]
],
[
[
"Score Classifier with Test Data Score",
"_____no_output_____"
]
],
[
[
"rfc.score(samples_train_set_resort_scaled, classes_train_resort)",
"_____no_output_____"
]
],
[
[
"Score Classifier with Out-of-Bag Error",
"_____no_output_____"
]
],
[
[
"rfc.oob_score_",
"_____no_output_____"
]
],
[
[
"Save Random Forest Classifier becuse 98% is AWESOME!",
"_____no_output_____"
]
],
[
[
"joblib.dump(rfc, 'trained_RF_Classifier/random_forest_classifier_trained_on_resorted_samples_train_set_RTN_CR_HP_Other_Norm.save')\njoblib.dump(dict(samples=samples_train_set_resort_scaled.T, classes=classes_train_resort), 'trained_RF_Classifier/RTN_CR_HP_Other_Norm_resorted_samples_train_set.save')",
"_____no_output_____"
]
],
[
[
"joblib.dump(rfc2, 'trained_RF_Classifier/random_forest_classifier_trained_full_set_on_resorted_samples_train_set_RTN_CR_HP_Other_Norm.save')",
"_____no_output_____"
]
],
[
[
"darks_reshaped.shape",
"_____no_output_____"
],
[
"step = 0\nskipsize = 100\nchunkNow = arange(step*darks_reshaped.shape[0]//skipsize,min((step+1)*darks_reshaped.shape[0]//skipsize, darks_reshaped.shape[0]))",
"_____no_output_____"
],
[
"darks_reshaped_chunk = darks_reshaped[chunkNow]\ndarks_reshaped_chunk_smooth = darks_reshaped_chunk #/ darks_flat_med_axis0\ndarks_reshaped_chunk_scaled = ((darks_reshaped_chunk_smooth.T - np.min(darks_reshaped_chunk_smooth,axis=1)) / \\\n ((np.max(darks_reshaped_chunk_smooth,axis=1) - np.min(darks_reshaped_chunk_smooth,axis=1)))).T",
"_____no_output_____"
],
[
"samples_train_set_resort_scaled.shape, darks_reshaped_chunk_smooth.shape",
"_____no_output_____"
],
[
"plot((darks_reshaped_chunk_scaled[::100]).T);",
"_____no_output_____"
],
[
"rfc_pred = np.zeros(darks_reshaped.shape[0])\nrfc2_pred= np.zeros(darks_reshaped.shape[0])\nstep = 0\nskipsize = 50\n\ngapSize = rfc_pred.size//skipsize\n\nstart = time()\nfor step in range(skipsize):\n chunkNow = arange(step*gapSize,min((step+1)*gapSize, rfc_pred.size))\n print(chunkNow.min(), chunkNow.max(), end=\" \")\n #\n # darks_reshaped_k_scaled = ((darks_reshaped[chunkNow].T - np.min(darks_reshaped[chunkNow],axis=1)) / \\\n # ((np.max(darks_reshaped[chunkNow],axis=1) - np.min(darks_reshaped[chunkNow],axis=1)))).T\n # darks_reshaped_chunk = darks_reshaped[chunkNow]\n # darks_reshaped_chunk_smooth = darks_reshaped_chunk #/ darks_flat_med_axis0\n darks_reshaped_chunk_scaled = ((darks_reshaped[chunkNow].T - np.min(darks_reshaped[chunkNow],axis=1)) / \\\n ((np.max(darks_reshaped[chunkNow],axis=1) - np.min(darks_reshaped[chunkNow],axis=1)))).T\n #\n rfc_pred[chunkNow] = rfc.predict(darks_reshaped_chunk_scaled)\n badPix = rfc_pred != 0.0\n numBad = badPix.sum()\n percentBad = str(numBad / rfc_pred.size * 100)[:5] + '%'\n print(percentBad, rfc_pred[badPix][::numBad//10])\n\nprint('Operation Took ' + str(time() - start) + ' seconds')",
"_____no_output_____"
],
[
"print(str((rfc_pred != 0.0).sum() / rfc_pred.size * 100)[:5] + '%')",
"_____no_output_____"
],
[
"for k in range(6):\n print(k,sum(rfc_pred == k))",
"_____no_output_____"
],
[
"rfc_pred_train = rfc.predict(samples_train_set_resort_scaled)",
"_____no_output_____"
],
[
"1-np.abs(rfc_pred_train - classes_train_resort).sum() / classes_train_resort.size, rfc.score(samples_train_set_resort_scaled, classes_train_resort)",
"_____no_output_____"
],
[
"fig = figure()#figsize=(6,6))\nax = fig.add_subplot(111)\n# ax.plot([nan,nan])\nclass3Check = []\nfor iRTN, irow in enumerate(np.where(rfc_pred == 3)[0]):\n # ax.lines.pop()\n ax.clear()\n ax.plot(darks_reshaped[irow][1:]);\n ax.set_title('Row:' + str(irow) + ' iRTN:' + str(iRTN) + ' / ' + str(sum(rfc_pred == 3\n )))\n fig.canvas.draw()\n display.display(plt.gcf())\n display.clear_output(wait=True)\n class3Check.append(input('Could this be an RTN? '))",
"_____no_output_____"
],
[
"class3Check_array = np.zeros(len(class3Check), dtype=bool)\nfor k, c3cNow in enumerate(class3Check):\n if c3cNow.lower() in ['', 'y', 'd', 'yu']:\n class3Check_array[k] = True\n elif c3cNow.lower() in ['o']:\n class3Check_array[k-1] = True\n elif c3cNow.lower() in ['n']:\n class3Check_array[k] = False\n else:\n print(k, c3cNow)",
"_____no_output_____"
],
[
"class3Check_array.size, len(class3Check)",
"_____no_output_____"
],
[
"plot(darks_flat_med_axis0 / median(darks_flat_med_axis0))",
"_____no_output_____"
],
[
"darks_flat_med_axis0_norm = darks_flat_med_axis0[1:] / median(darks_flat_med_axis0[1:])",
"_____no_output_____"
],
[
"fig = figure(figsize=(20,5))\nax1 = fig.add_subplot(141)\nax2 = fig.add_subplot(142)\nax3 = fig.add_subplot(143)\nax4 = fig.add_subplot(144)\n\n# ax.plot([nan,nan])\nclass1Check = []\nfor iNoisy, irow in enumerate(np.where(rfc_pred == 1)[0]):\n ax1.clear()\n ax2.clear()\n ax3.clear()\n ax4.clear()\n # ax1.plot(darks_reshaped[irow][1:] / median(darks_reshaped[irow][1:]) - darks_flat_med_axis0_norm);\n \n # Plot Subtraction frame: (Darknow - min(Darknow)) - (DarkMed - min(DarkMed))\n # darkNowMinusMed = (darks_reshaped[irow][1:] - np.min(darks_reshaped[irow][1:])) - \\\n # (darks_flat_med_axis0[1:] - np.min(darks_flat_med_axis0[1:]))\n \n darkNowMinusMed = pp.scale(darks_reshaped[irow][1:]) - pp.scale(darks_flat_med_axis0_norm)\n ax1.plot(darkNowMinusMed,'o-')\n ax1.axhline(median(darkNowMinusMed), linestyle='--', color='k')\n ax1.set_title('Subtraction Frame')\n # ax1.plot(darks_reshaped[irow][1:] / median(darks_reshaped[irow][1:]) / darks_flat_med_axis0_norm - 1);\n \n # Plot Normalized Frame: DarkNow vs DarMed\n ax2.plot(pp.scale(darks_reshaped[irow][1:]),'o-');\n ax2.plot(pp.scale(darks_flat_med_axis0_norm),'o-')\n ax2.set_title('Normalized Frame')\n \n # Plot Common Mode Correlation Frame\n ax3.plot(darks_reshaped[irow][1:] / median(darks_reshaped[irow][1:]), darks_flat_med_axis0_norm,'o')\n ax3.set_title('Common Mode')\n \n # Plot Raw DN minus Min Dark Ramp: DarkNow - min(DarkNow) vs DarkMed - min(DarkMed)\n ax4.plot(darks_reshaped[irow][1:] - np.min(darks_reshaped[irow][1:]),'o-')\n ax4.plot(darks_flat_med_axis0[1:] - np.min(darks_flat_med_axis0[1:]),'o-')\n ax4.set_title('Raw DN - Min')\n \n # ax.plot(darks_flat_med_axis0[1:] / median(darks_flat_med_axis0[1:]))\n #ax1.set_title('Row:' + str(irow) + ' iNoisy: ' + str(iNoisy) + ' / ' + str(sum(rfc_pred == 1)))\n fig.suptitle('Row:' + str(irow) + ' iNoisy: ' + str(iNoisy) + ' / ' + str(sum(rfc_pred == 1)))\n # ax1.set_ylim(ax2.get_ylim())\n fig.canvas.draw()\n display.display(plt.gcf())\n display.clear_output(wait=True)\n class1Check.append(input('Could this be an RTN? '))",
"_____no_output_____"
],
[
"fig = figure()#figsize=(6,6))\nax = fig.add_subplot(111)\n# ax.plot([nan,nan])\nrtn_synCheck = []\nfor iRTN, rtnNow in enumerate(rtn_syn):\n # ax.lines.pop()\n ax.clear()\n ax.plot(rtnNow);\n ax.set_title('iRTN:' + str(iRTN) + ' / ' + str(len(rtn_syn)))\n fig.canvas.draw()\n display.display(plt.gcf())\n display.clear_output(wait=True)\n #class3Check.append(input('Could this be an RTN? '))",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"raw",
"code",
"raw",
"code",
"raw",
"code",
"raw",
"code",
"raw",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"raw",
"code",
"raw",
"code",
"markdown",
"code",
"markdown",
"code",
"raw",
"markdown",
"raw",
"markdown",
"raw",
"markdown",
"raw",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"raw",
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"raw",
"raw"
],
[
"code"
],
[
"raw"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"raw"
],
[
"code"
],
[
"raw"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"raw"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
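[
"markdown"
],
[
"code"
],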
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"raw"
],
[
"code",
"code",
"code"
],
[
"raw"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"raw"
],
[
"markdown"
],
[
"raw",
"raw"
],
[
"markdown"
],
[
"raw"
],
[
"markdown"
],
[
"raw"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"raw"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
e7a11fb7b1d04b4349a53a76584dcb66fb04c65f | 14,822 | ipynb | Jupyter Notebook | Geocoding_Colleges.ipynb | stkbailey/WSJ_CollegeRankings2018 | 4a5a70740e10dbdd87c63ded39297ea544cbb2af | [
"Apache-2.0"
] | 1 | 2017-11-09T07:27:33.000Z | 2017-11-09T07:27:33.000Z | Geocoding_Colleges.ipynb | stkbailey/WSJ_CollegeRankings2018 | 4a5a70740e10dbdd87c63ded39297ea544cbb2af | [
"Apache-2.0"
] | null | null | null | Geocoding_Colleges.ipynb | stkbailey/WSJ_CollegeRankings2018 | 4a5a70740e10dbdd87c63ded39297ea544cbb2af | [
"Apache-2.0"
] | null | null | null | 36.06326 | 289 | 0.432668 | [
[
[
"## Getting Geo-coordinates for WSJ Colleges\nHere we are going to use a couple of Python tools to make a database of the Latitude / Longitude locations for the different schools contained in the report. I'm doing this to compare the speed and accuracy of the included Power BI ArcGIS maps with a hard-coding the coordinates. \n\nOur strategy is:\n- Create a search string using the college name and city.\n- Use `requests` to query Google Maps API.\n- Save the database as a new file.\n\nFirst, we read in the WSJ data and create a search string.",
"_____no_output_____"
]
],
[
[
"geodf.head()",
"_____no_output_____"
],
[
"import pandas as pd\n\nwsj = pd.read_csv('wsj_data.csv')\n\nimport os\n\nif os.path.exists('wsj_locs.csv'):\n geodf = pd.read_csv('wsj_locs.csv', index_col='loc_string')\nelse:\n geodf = pd.DataFrame()\n geodf.index.name = 'loc_string'\n \nwsj.head()\n",
"_____no_output_____"
]
],
[
[
"For each college, we're going to create a search string as if we were looking it up in Google Maps. It's important to include as much information as we have so that the location service doesn't get confused with institutions in other countries, for example.",
"_____no_output_____"
]
],
[
[
"overwrite_loc_string = None\nif overwrite_loc_string:\n wsj['loc_string'] = wsj.apply(lambda s: '{}, {}, USA'.format(s.college, s.city_state), axis=1)\n wsj.to_csv('wsj_data.csv', encoding='utf-8', index=None)\n\nprint(wsj.loc_string[0:5])",
"0 Harvard University, Cambridge, MA, USA\n1 Columbia University, New York, NY, USA\n2 Massachusetts Institute of Technology, Cambrid...\n3 Stanford University, Stanford, CA, USA\n4 Duke University, Durham, NC, USA\nName: loc_string, dtype: object\n"
],
[
"def getCoords(search_string):\n '''Takes a search term, queries Google and returns the geocoordinates.'''\n import requests\n \n try:\n query = search_string.replace(' ', '+')\n response = requests.get('https://maps.googleapis.com/maps/api/geocode/json?address={}'.format(query))\n response_from_google = response.json()\n \n address = response_from_google['results'][0]['formatted_address']\n latitude = response_from_google['results'][0]['geometry']['location']['lat']\n longitude = response_from_google['results'][0]['geometry']['location']['lng']\n \n return pd.Series(name=search_string, \\\n data={'Address': address, 'Latitude': latitude, 'Longitude': longitude})\n except:\n return pd.Series(name=search_string, data={'Address': None, 'Latitude': None, 'Longitude': None})",
"_____no_output_____"
],
[
"for ind, school in wsj.loc_string.iteritems():\n if (not school in geodf.index) or (geodf.loc[school, 'Address'] == None):\n data = getCoords(school)\n geodf.loc[school] = data\n print(school, '\\n\\t\\t ', data)",
"Gonzaga University, Spokane, WA, USA \n\t\t Address 502 E Boone Ave, Spokane, WA 99202, USA\nLatitude 47.6672\nLongitude -117.402\nName: Gonzaga University, Spokane, WA, USA, dtype: object\nCampbell University, Buies Creek, NC, USA \n\t\t Address 143 Main St, Buies Creek, NC 27506, USA\nLatitude 35.4083\nLongitude -78.7394\nName: Campbell University, Buies Creek, NC, USA, dtype: object\nSUNY New Paltz, New Paltz NY, USA \n\t\t Address 1 Hawk Dr, New Paltz, NY 12561, USA\nLatitude 41.739\nLongitude -74.0852\nName: SUNY New Paltz, New Paltz NY, USA, dtype: object\n"
],
[
"geodf.to_csv('wsj_locs.csv', encoding='utf-8')\n",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
e7a132f482f6a8870711cdec2da37e55728027bd | 65,810 | ipynb | Jupyter Notebook | Session9/Day4/workbook_globalsignals.ipynb | rmorgan10/LSSTC-DSFP-Sessions | 1d0b3c28fe7f6f93e00e332e74873e6d1ec29d0b | [
"MIT"
] | null | null | null | Session9/Day4/workbook_globalsignals.ipynb | rmorgan10/LSSTC-DSFP-Sessions | 1d0b3c28fe7f6f93e00e332e74873e6d1ec29d0b | [
"MIT"
] | null | null | null | Session9/Day4/workbook_globalsignals.ipynb | rmorgan10/LSSTC-DSFP-Sessions | 1d0b3c28fe7f6f93e00e332e74873e6d1ec29d0b | [
"MIT"
] | null | null | null | 128.786693 | 49,524 | 0.87356 | [
[
[
"import numpy as np\nimport scipy.fftpack as fftpack\nimport matplotlib.pyplot as plt\nimport matplotlib.font_manager as font_manager\n%matplotlib inline\nfont_prop = font_manager.FontProperties(size=16)",
"_____no_output_____"
]
],
[
[
"# Global Signals in Time Series Data\n\nBy Abigail Stevens",
"_____no_output_____"
],
[
"# Problem 1: Timmer and Koenig algorithm",
"_____no_output_____"
],
[
"The algorithm outlined in Timmer & Koenig 1995 lets you define the shape of your power spectrum (a power law with some slope, a Lorentzian, a sum of a couple Lorentzians and a power law, etc.) then generate the random phases and amplitudes of the Fourier transform to simulate light curves defined by the power spectral shape. This is a great simulation tool to have in your back pocket (or, \"maybe useful someday\" github repo).",
"_____no_output_____"
],
[
"#### Define some basic parameters for the power spectrum and resultant light curve",
"_____no_output_____"
]
],
[
[
"n_bins = 8192 ## number of total frequency bins in a FT segment; same as number of time bins in the light curve\ndt = 1./16. # time resolution of the output light curve\ndf = 1. / dt / n_bins",
"_____no_output_____"
]
],
[
[
"## 1a. Make an array of Fourier frequencies\nYes you can do this with scipy, but the order of frequencies in a T&K power spectrum is different than what you'd get by default from a standard FFT of a light curve.\nYou want the zero frequency to be in the middle (at index n_bins/2) of the frequency array. The positive frequencies should have two more indices than the negative frequencies, because of the zero frequency and nyquist frequency. You can either do this with `np.arange` or with special options in `fftpack.fftfreq`.",
"_____no_output_____"
]
],
[
[
"#freq = fftpack.fftfreq(n_bins, d=df)\nfreqs = np.arange(float(-n_bins/2)+1, float(n_bins/2)+1) * df\npos_freq = freqs[np.where(freqs >= 0)]\n## Positive should have 2 more than negative, \n## because of the 0 freq and the nyquist freq\nneg_freq = freqs[np.where(freqs < 0)]\nnyquist = pos_freq[-1]\nlen_pos = len(pos_freq)",
"_____no_output_____"
]
],
[
[
"## 1b. Define a Lorentzian function and power law function for the shape of the power spectrum",
"_____no_output_____"
]
],
[
[
"def lorentzian(v, v_0, gamma):\n \"\"\" Gives a Lorentzian centered on v_0 with a FWHM of gamma \"\"\"\n numerator = gamma / (np.pi * 2.0)\n denominator = (v - v_0) ** 2 + (1.0/2.0 * gamma) ** 2\n L = numerator / denominator\n return L\n\ndef powerlaw(v, beta):\n \"\"\"Gives a powerlaw of (1/v)^-beta \"\"\"\n pl = np.zeros(len(v))\n pl[1:] = v[1:] ** (-beta)\n pl[0] = np.inf\n return pl",
"_____no_output_____"
]
],
[
[
"## Now the T&K algorithm. I've transcribed the 'recipe' section of the T&K95 paper, which you will convert to lines of code. ",
"_____no_output_____"
],
[
"## 1c. Choose a power spectrum $S(\\nu)$. \nWe will use a sum of one Lorentzians (a QPO with a centroid frequency of 0.5 Hz and a FWHM of 0.01 Hz), and a Poisson-noise power law. The QPO should be 100 times larger amplitude than the power-law.",
"_____no_output_____"
]
],
[
[
"power_shape = 100 * lorentzian(pos_freq, 0.5, 0.01) + powerlaw(pos_freq, 0)",
"_____no_output_____"
]
],
[
[
"## 1d. For each Fourier frequency $\\nu_i$ draw two gaussian-distributed random numbers, multiply them by $$\\sqrt{\\frac{1}{2}S(\\nu_i)}$$ and use the result as the real and imaginary part of the Fourier transform $F$ of the desired data.\nIn the case of an even number of data points, for reason of symmetry $F(\\nu_{Nyquist})$ is always real. Thus only one gaussian distributed random number has to be drawn.",
"_____no_output_____"
]
],
[
[
"from numpy.random import randn",
"_____no_output_____"
],
[
"np.random.seed(3)\nrand_r = np.random.standard_normal(len_pos)\nrand_i = np.random.standard_normal(len_pos-1)\nrand_i = np.append(rand_i, 0.0) # because the nyquist frequency should only have a real value\n\n## Creating the real and imaginary values from the lists of random numbers and the frequencies\nr_values = rand_r * np.sqrt(0.5 * power_shape)\ni_values = rand_i * np.sqrt(0.5 * power_shape)\nr_values[np.where(pos_freq == 0)] = 0\ni_values[np.where(pos_freq == 0)] = 0",
"_____no_output_____"
]
],
[
[
"## 1e. To obtain a real valued time series, choose the Fourier components for the negative frequencies according to $F(-\\nu_i)=F*(\\nu_i)$ where the asterisk denotes complex conjugation. \n\nAppend to make one fourier transform array. Check that your T&K fourier transform has length `n_bins`. Again, for this algorithm, the zero Fourier frequency is in the middle of the array, the negative Fourier frequencies are in the first half, and the positive Fourier frequencies are in the second half.",
"_____no_output_____"
]
],
[
[
"FT_pos = r_values + i_values*1j\nFT_neg = np.conj(FT_pos[1:-1]) \nFT_neg = FT_neg[::-1] ## Need to flip direction of the negative frequency FT values so that they match up correctly\nFT = np.append(FT_pos, FT_neg)\n",
"_____no_output_____"
]
],
[
[
"## 1f. Obtain the time series by backward Fourier transformation of $F(\\nu)$ from the frequency domain to the time domain.\n\nNote: I usually use `.real` after an iFFT to get rid of any lingering 1e-10 imaginary factors.",
"_____no_output_____"
]
],
[
[
"lc = fftpack.ifft(FT).real",
"_____no_output_____"
]
],
[
[
"Congratulations! \n## 1g. Plot the power spectrum of your FT (only the positive frequencies) next to the light curve it makes. \nRemember: $$P(\\nu_i)=|F(\\nu_i)|^2$$",
"_____no_output_____"
]
],
[
[
"fig, (ax1, ax2) = plt.subplots(1,2, figsize=(12, 5))\nax1.loglog(pos_freq, np.abs(FT_pos)**2)\n\nax2.plot(np.linspace(0, len(lc), len(lc)), lc)\nax2.set_xlim(0, 200)\nfig.show()",
"/Users/rmorgan/anaconda3/lib/python3.7/site-packages/matplotlib/figure.py:457: UserWarning: matplotlib is currently using a non-GUI backend, so cannot show the figure\n \"matplotlib is currently using a non-GUI backend, \"\n"
]
],
[
[
"You'll want to change the x scale of your light curve plot to be like 20 seconds in length, and only use the positive Fourier frequencies when plotting the power spectrum.",
"_____no_output_____"
],
[
"Yay!",
"_____no_output_____"
],
[
"## 1h. Play around with your new-found simulation powers (haha, it's a pun!) \nMake more power spectra with different features -- try at least 5 or 6, and plot each of them next to the corresponding light curve. Try red noise, flicker noise, a few broad Lorentzians at lower frequency, multiple QPOs, a delta function, etc. \n\nHere are some other functions you can use to define shapes of power spectra. This exercise is to help build your intuition of what a time signal looks like in the Fourier domain and vice-versa.",
"_____no_output_____"
]
],
[
[
"def gaussian(v, mean, std_dev):\n \"\"\"\n Gives a Gaussian with a mean of mean and a standard deviation of std_dev\n FWHM = 2 * np.sqrt(2 * np.log(2))*std_dev\n \"\"\"\n exp_numerator = -(v - mean)**2\n exp_denominator = 2 * std_dev**2\n G = np.exp(exp_numerator / exp_denominator)\n return G\n\ndef powerlaw_expdecay(v, beta, alpha):\n \"\"\"Gives a powerlaw of (1/v)^-beta with an exponential decay e^{-alpha*v} \"\"\"\n pl_exp = np.where(v != 0, (1.0 / v) ** beta * np.exp(-alpha * v), np.inf) \n return pl_exp\n\ndef broken_powerlaw(v, v_b, beta_1, beta_2):\n \"\"\"Gives two powerlaws, (1/v)^-beta_1 and (1/v)^-beta_2 \n that cross over at break frequency v_b.\"\"\"\n c = v_b ** (-beta_1 + beta_2) ## scale factor so that they're equal at the break frequency\n pl_1 = v[np.where(v <= v_b)] ** (-beta_1)\n pl_2 = c * v[np.where(v > v_b)] ** (-beta_2)\n pl = np.append(pl_1, pl_2)\n return pl",
"_____no_output_____"
]
],
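[
[
"For instance, here is an illustrative sketch (reusing `rand_r`, `rand_i`, and `pos_freq` from above) of a broken power law with slopes 1 and 2 breaking at 0.1 Hz, which yields a red-noise-like light curve. The exercise still asks you to try several shapes yourself.",
"_____no_output_____"
]
],
[
[
"## Illustrative sketch only -- try your own power spectral shapes too.\npower_shape_bpl = broken_powerlaw(pos_freq, 0.1, 1.0, 2.0)\nFT_pos_bpl = (rand_r + rand_i*1j) * np.sqrt(0.5 * power_shape_bpl)\nFT_pos_bpl[np.where(pos_freq == 0)] = 0  ## zero out the mean term, as before\nFT_bpl = np.append(FT_pos_bpl, np.conj(FT_pos_bpl[1:-1])[::-1])\nlc_bpl = fftpack.ifft(FT_bpl).real\n\nfig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 5))\nax1.loglog(pos_freq[1:], np.abs(FT_pos_bpl[1:])**2)\nax2.plot(lc_bpl[:320])  ## 320 bins * dt = 20 seconds\nfig.show()",
"_____no_output_____"
]
],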
[
[
"# 2. More realistic simulation with T&K\nNow you're able to simulate the power spectrum of a single segment of a light curve. However, as you learned this morning, we usually use multiple (~50+) segments of a light curve, take the power spectrum of each segment, and average them together.",
"_____no_output_____"
],
[
"## 2a. Turn the code from 1d to 1e into a function `make_TK_seg`\nMake it so that you can give a different random seed to each segment.",
"_____no_output_____"
],
[
"## 2b. Make the Fourier transform for a given power shape (as in Problem 1)\nUse a Lorentzian QPO + Poisson noise power shape at a centroid frequency of 0.5 Hz and a full width at half maximum (FWHM) of 0.01 Hz. Make the QPO 100 time stronger than the Poisson noise power-law.",
"_____no_output_____"
],
[
"## 2c. Put `make_TK_seg` in a loop to do for 50 segments. \nMake an array of integers that can be your random gaussian seed for the TK algorithm (otherwise, you run the risk of creating the exact same Fourier transform every time, and that will be boring).\n\nKeep a running average of the power spectrum of each segment (like we did this morning in problem 2).",
"_____no_output_____"
],
[
"## 2d. Compute the error on the average power\nThe error on the power at index $i$ is\n$$ \\delta P_i = \\frac{P_i}{\\sqrt{M}} $$\nwhere `M` is the number of segments averaged together.",
"_____no_output_____"
],
[
"## 2e. Use the re-binning algorithm described in the morning's workbook to re-bin the power spectrum by a factor of 1.05.",
"_____no_output_____"
],
[
"## Plot the average power spectrum\nRemember to use log scale for the y-axis and probably the x-axis too!",
"_____no_output_____"
]
],
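[
[
"One possible sketch of 2a--2e is given below (added for illustration; `make_TK_seg` is only a guess at the intended function, and the geometric re-binning scheme follows the morning workbook). It defines the `rb_freq` and `rb_pow` arrays used by the plotting cell that follows.",
"_____no_output_____"
]
],
[
[
"## Illustrative sketch only; your own solution may differ.\ndef make_TK_seg(power_shape, seed):\n    \"\"\"Timmer & Koenig: draw the positive-frequency FT of one segment.\"\"\"\n    np.random.seed(seed)\n    rand_r = np.random.standard_normal(len_pos)\n    rand_i = np.random.standard_normal(len_pos)\n    rand_i[-1] = 0.0  ## the Nyquist bin must be real\n    seg_FT = (rand_r + rand_i*1j) * np.sqrt(0.5 * power_shape)\n    seg_FT[np.where(pos_freq == 0)] = 0\n    return seg_FT\n\nn_seg = 50\npower_shape = 100 * lorentzian(pos_freq, 0.5, 0.01) + powerlaw(pos_freq, 0)\navg_pow = np.zeros(len_pos)\nfor seed in range(n_seg):  ## a different random seed per segment\n    seg_FT = make_TK_seg(power_shape, seed)\n    avg_pow += np.abs(seg_FT)**2 / n_seg\nerr_pow = avg_pow / np.sqrt(n_seg)  ## error on the average power\n\n## Geometric re-binning by a factor of 1.05\nrb_freq, rb_pow = [], []\nlo = 1  ## skip the zero-frequency bin\nwhile lo < len_pos:\n    hi = max(int(lo * 1.05), lo + 1)\n    rb_freq.append(np.mean(pos_freq[lo:hi]))\n    rb_pow.append(np.mean(avg_pow[lo:hi]))\n    lo = hi\nrb_freq = np.asarray(rb_freq)\nrb_pow = np.asarray(rb_pow)",
"_____no_output_____"
]
],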
[
[
"fig, ax = plt.subplots(1,1, figsize=(8,5))\nax.plot(rb_freq, rb_pow, linewidth=2.0)\nax.set_xscale('log')\nax.set_yscale('log')\nax.set_xlabel(r'Frequency (Hz)', fontproperties=font_prop)\nax.tick_params(axis='x', labelsize=16, bottom=True, top=True, \n labelbottom=True, labeltop=False)\nax.tick_params(axis='y', labelsize=16, left=True, right=True, \n labelleft=True, labelright=False)\nplt.show()",
"_____no_output_____"
]
],
[
[
"## 2f. Re-do 2b through the plot above but slightly changing the power spectrum shape in each segment. \nMaybe you change the centroid frequency of the QPO, or the normalizing factors between the two components, or the slope of the power-law.",
"_____no_output_____"
],
[
"# Bonus problems:\n\n## 1. Use a different definition of the Lorentzian (below) to make a power spectrum. \nFollow the same procedure. Start off with just one segment. Use the rms as the normalizing factor.\n\n## 2. Using what you learned about data visualization earlier this week, turn the plots in this notebook (and the QPO one, if you're ambitious) into clear and easy-to-digest, publication-ready plots.",
"_____no_output_____"
]
],
[
[
"def lorentz_q(v, v_peak, q, rms): \n \"\"\" \n Form of the Lorentzian function defined in terms of \n peak frequency v_peak and quality factor q\n q = v_peak / fwhm\n with the integrated rms of the QPO as the normalizing factor.\n e.g. see Pottschmidt et al. 2003, A&A, 407, 1039 for more info\n \"\"\"\n f_res = v_peak / np.sqrt(1.0+(1.0/(4.0*q**2)))\n r = rms / np.sqrt(0.5-np.arctan(-2.0*q)/np.pi)\n lorentz = ((1/np.pi)*2*r**2*q*f_res) / (f_res**2+(4*q**2*(v-f_res)**2))\n return lorentz",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
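[
"markdown"
],
[
"code"
],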
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
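[
"markdown"
],
[
"code"
],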
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
]
] |
e7a13a0da8921e4e2c031bd6f410b4905929914b | 8,181 | ipynb | Jupyter Notebook | onem2m-05-accesscontrol.ipynb | lovele0107/oneM2M-jupyter | fc2b536b027187487ced1fa62e4e90bb2da9921c | [
"BSD-3-Clause"
] | 1 | 2020-06-24T14:21:20.000Z | 2020-06-24T14:21:20.000Z | onem2m-05-accesscontrol.ipynb | lovele0107/oneM2M-jupyter | fc2b536b027187487ced1fa62e4e90bb2da9921c | [
"BSD-3-Clause"
] | 1 | 2020-06-24T13:52:00.000Z | 2020-06-24T13:52:00.000Z | onem2m-05-accesscontrol.ipynb | lovele0107/oneM2M-jupyter | fc2b536b027187487ced1fa62e4e90bb2da9921c | [
"BSD-3-Clause"
] | null | null | null | 31.832685 | 257 | 0.443589 | [
[
[
"# oneM2M - Access Control\n\nThis notebook demonstrates how access control to resources can be done in oneM2M.\n\n- Create an <ACP> resource with different credentials for a new originator\n- Create a second <AE> resource with the new access controls policy\n- Succeed to add a <Container> to the second ≶AE> resource\n- Fail to update the second <AE> resource\n\n## Intitialization\nThe section does import necessary modules and configurations.",
"_____no_output_____"
]
],
[
[
"%run init.py",
"_____no_output_____"
]
],
[
[
"## Create an <ACP> Resource\n\nAccess Control Policies are used to associate access control with credentials. They define the rules to for access control to resources. Each <ACP> resource has two sections:\n\n- **pv (Privileges)** : The actual privileges defined by this policy.\n- **pvs (Self-Privileges)** : This defines the privileges necessary to access and control the <ACP> resource itself.\n\nEach section has at least the following two parameters:\n\n- **acor (accessControlOriginators)** : This list includes the Originator information. The parameter comprises a list of domain, CSE-IDs, AE-IDs, the resource-ID of a <Group> resource that contains <AE> or <remoteCSE> as member or Role-ID.\n- **acop (accessControlOperations)** : This number represents a bit-field of privileges. The following table shows the mapping:\n\n| Value | Interpretation |\n|-------|----------------|\n| 1 | CREATE |\n| 2 | RETRIEVE |\n| 4 | UPDATE |\n| 8 | DELETE |\n| 16 | NOTIFY |\n| 32 | DISCOVERY |\n\nThe following request creates a new <ACP> that allows the originator *abc:xyz* only to send CREATE, RETRIEVE, NOTIFY, DELETE, and DISCOVERY requests to resources that have this <ACP> resource assigned.",
"_____no_output_____"
]
],
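[
[
"As a quick illustration (added here, not part of the original tutorial): the *acop* value 59 used below is just the bit-wise OR of the five permitted operations.",
"_____no_output_____"
]
],
[
[
"## Illustrative sketch: the accessControlOperations (acop) bit-field.\n## 59 = CREATE + RETRIEVE + DELETE + NOTIFY + DISCOVERY, i.e. everything except UPDATE.\nops = {'CREATE': 1, 'RETRIEVE': 2, 'UPDATE': 4, 'DELETE': 8, 'NOTIFY': 16, 'DISCOVERY': 32}\nprint(ops['CREATE'] | ops['RETRIEVE'] | ops['DELETE'] | ops['NOTIFY'] | ops['DISCOVERY'])  ## -> 59",
"_____no_output_____"
]
],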
[
[
"CREATE ( # CREATE request\n url,\n \n # Request Headers\n {\n 'X-M2M-Origin' : originator, # Set the originator\n 'X-M2M-RI' : '0', # Request identifier\n 'Accept' : 'application/json', # Response shall be JSON\n 'Content-Type' : 'application/json;ty=1' # Content is JSON, and represents an <ACP> resource\n },\n \n # Request Body\n '''\n {\n \"m2m:acp\": {\n \"rn\":\"Notebook-ACP\",\n \"pv\": {\n \"acr\": [{\n \"acor\": [ \"abcxyz\" ],\n \"acop\": 59\n }]\n },\n \"pvs\": {\n \"acr\": [{\n \"acor\": [ \"%s\" ],\n \"acop\": 63\n }]\n }\n }\n }\n ''' % originator\n)",
"_____no_output_____"
]
],
[
[
"\n\n## Create a second <AE> Resource with the new <ACP>\n\nWe now create a new <AE> resource that uses the just created <ACP>.\n\n**This should succeed.**",
"_____no_output_____"
]
],
[
[
"CREATE ( # CREATE request\n url,\n \n # Request Headers\n {\n 'X-M2M-Origin' : 'C', # Set the originator\n 'X-M2M-RI' : '0', # Request identifier\n 'Accept' : 'application/json', # Response shall be JSON\n 'Content-Type' : 'application/json;ty=2' # Content is JSON, and represents an <AE> resource\n },\n \n # Request Body\n '''\n { \n \"m2m:ae\": {\n \"rn\": \"Notebook-AE_2\",\n \"api\": \"AE\",\n \"acpi\" : [ \n \"%s/Notebook-ACP\" \n ],\n \"rr\": true,\n \"srv\": [\"3\"]\n }\n }\n ''' % basename\n)",
"_____no_output_____"
]
],
[
[
"## Try to Create <Container> under the second <AE> Resource\n\nWe will update a <Container> resource under the second <AE> resource with the originator of *abc:xyz*. \n\n**This should work** since this originator is allowed to send CREATE requests.",
"_____no_output_____"
]
],
[
[
" CREATE ( # CREATE request\n url + '/Notebook-AE_2',\n \n # Request Headers\n {\n 'X-M2M-Origin' : \"abcxyz\", # Set the originator\n 'X-M2M-RI' : '0', # Request identifier\n 'Accept' : 'application/json', # Response shall be JSON\n 'Content-Type' : 'application/json;ty=3' # Content is JSON, and represents an <Container> resource\n },\n \n # Request Body\n '''\n {\n \"m2m:cnt\": {\n \"rn\":\"Container\",\n \"acpi\": [\n \"%s/Notebook-ACP\"\n ]\n }\n }\n ''' % basename\n)",
"_____no_output_____"
]
],
[
[
"## Try to Update the second <AE> Resource\n\nNow we try to update the new <AE> resource (add a *lbl* attribute) with the other originator, *abc:xyz*. \n\n**<span style='color:red'>This should fail</span>**, since the associated <ACP> doesn't allow UPDATE requests.",
"_____no_output_____"
]
],
[
[
"UPDATE ( # UPDATE request\n url + '/Notebook-AE_2',\n \n # Request Headers\n {\n 'X-M2M-Origin' : 'abcxyz', # Set the originator\n 'X-M2M-RI' : '0', # Request identifier\n 'Accept' : 'application/json', # Response shall be JSON\n 'Content-Type' : 'application/json;ty=2' # Content is JSON, and represents an <AE> resource\n },\n \n # Request Body\n {\n \"m2m:ae\": {\n \"lbl\": [\n \"test:test\"\n ]\n }\n }\n)",
"_____no_output_____"
]
],
[
[
" ",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
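[
"markdown"
],
[
"code"
],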
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
e7a14bbfb6fb95355bcebd48ebbd866139317c88 | 860,566 | ipynb | Jupyter Notebook | kaggle_tgs_salt_identification.ipynb | JacksonIsaac/colab_notebooks | a49be32209de5e13f73b7e5dcadfbfde3f40cf20 | [
"MIT"
] | 1 | 2018-09-19T14:05:50.000Z | 2018-09-19T14:05:50.000Z | kaggle_tgs_salt_identification.ipynb | a-parida12/colab_notebooks | a49be32209de5e13f73b7e5dcadfbfde3f40cf20 | [
"MIT"
] | 1 | 2018-10-28T10:54:24.000Z | 2018-10-28T10:54:24.000Z | kaggle_tgs_salt_identification.ipynb | a-parida12/colab_notebooks | a49be32209de5e13f73b7e5dcadfbfde3f40cf20 | [
"MIT"
] | 2 | 2018-10-28T10:14:54.000Z | 2018-12-16T13:58:05.000Z | 113.128171 | 117,202 | 0.76677 | [
[
[
"[View in Colaboratory](https://colab.research.google.com/github/JacksonIsaac/colab_notebooks/blob/master/kaggle_tgs_salt_identification.ipynb)",
"_____no_output_____"
],
[
"### Kaggle notebook\nFor *TGS Salt identification* competition:\n\nhttps://www.kaggle.com/c/tgs-salt-identification-challenge",
"_____no_output_____"
],
[
"# Setup kaggle and download dataset",
"_____no_output_____"
]
],
[
[
"!pip install kaggle",
"Requirement already satisfied: kaggle in /usr/local/lib/python3.6/dist-packages (1.4.7.1)\nRequirement already satisfied: python-dateutil in /usr/local/lib/python3.6/dist-packages (from kaggle) (2.5.3)\nRequirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from kaggle) (2.18.4)\nRequirement already satisfied: certifi in /usr/local/lib/python3.6/dist-packages (from kaggle) (2018.8.24)\nRequirement already satisfied: python-slugify in /usr/local/lib/python3.6/dist-packages (from kaggle) (1.2.6)\nRequirement already satisfied: urllib3<1.23.0,>=1.15 in /usr/local/lib/python3.6/dist-packages (from kaggle) (1.22)\nRequirement already satisfied: six>=1.10 in /usr/local/lib/python3.6/dist-packages (from kaggle) (1.11.0)\nRequirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (from kaggle) (4.26.0)\nRequirement already satisfied: idna<2.7,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->kaggle) (2.6)\nRequirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->kaggle) (3.0.4)\nRequirement already satisfied: Unidecode>=0.04.16 in /usr/local/lib/python3.6/dist-packages (from python-slugify->kaggle) (1.0.22)\n"
],
[
"## Load Kaggle config JSON\nfrom googleapiclient.discovery import build\nimport io, os\nfrom googleapiclient.http import MediaIoBaseDownload\nfrom google.colab import auth\n\nauth.authenticate_user()\n\ndrive_service = build('drive', 'v3')\nresults = drive_service.files().list(\n q=\"name = 'kaggle.json'\", fields=\"files(id)\").execute()\nkaggle_api_key = results.get('files', [])\n\nfilename = \"/content/.kaggle/kaggle.json\"\nos.makedirs(os.path.dirname(filename), exist_ok=True)\n\nrequest = drive_service.files().get_media(fileId=kaggle_api_key[0]['id'])\nfh = io.FileIO(filename, 'wb')\ndownloader = MediaIoBaseDownload(fh, request)\ndone = False\nwhile done is False:\n status, done = downloader.next_chunk()\n print(\"Download %d%%.\" % int(status.progress() * 100))\nos.chmod(filename, 600)\n\n!mkdir ~/.kaggle\n!cp /content/.kaggle/kaggle.json ~/.kaggle/kaggle.json",
"Download 100%.\n"
],
[
"!kaggle competitions download -c tgs-salt-identification-challenge",
"Downloading depths.csv to /content\n\r 0% 0.00/322k [00:00<?, ?B/s]\n100% 322k/322k [00:00<00:00, 62.8MB/s]\nDownloading sample_submission.csv to /content\n 0% 0.00/264k [00:00<?, ?B/s]\n100% 264k/264k [00:00<00:00, 70.5MB/s]\nDownloading train.csv to /content\n 0% 0.00/922k [00:00<?, ?B/s]\n100% 922k/922k [00:00<00:00, 92.9MB/s]\nDownloading test.zip to /content\n 94% 154M/163M [00:00<00:00, 150MB/s]\n100% 163M/163M [00:00<00:00, 186MB/s]\nDownloading train.zip to /content\n 92% 35.0M/37.9M [00:00<00:00, 42.2MB/s]\n100% 37.9M/37.9M [00:00<00:00, 78.5MB/s]\n"
],
[
"!ls",
"adc.json sample_data\t\t test.zip train.zip\ndepths.csv sample_submission.csv train.csv\n"
],
[
"!unzip -q train.zip",
"_____no_output_____"
]
],
[
[
"# Install Dependencies",
"_____no_output_____"
]
],
[
[
"!pip install -q imageio\n!pip install -q torch",
"tcmalloc: large alloc 1073750016 bytes == 0x590d2000 @ 0x7fa2b66e71c4 0x46d6a4 0x5fcbcc 0x4c494d 0x54f3c4 0x553aaf 0x54e4c8 0x54f4f6 0x553aaf 0x54efc1 0x54f24d 0x553aaf 0x54efc1 0x54f24d 0x553aaf 0x54efc1 0x54f24d 0x551ee0 0x54e4c8 0x54f4f6 0x553aaf 0x54efc1 0x54f24d 0x551ee0 0x54efc1 0x54f24d 0x551ee0 0x54e4c8 0x54f4f6 0x553aaf 0x54e4c8\n"
],
[
"!pip install -q ipywidgets",
"_____no_output_____"
],
[
"import os\nimport numpy as np\n\nimport imageio\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport torch\n\nfrom torch.utils import data",
"_____no_output_____"
]
],
[
[
"# Create class for input dataset",
"_____no_output_____"
]
],
[
[
"class TGSSaltDataSet(data.Dataset):\n def __init__(self, root_path, file_list):\n self.root_path = root_path\n self.file_list = file_list\n \n def __len__(self):\n return len(self.file_list)\n \n def __getitem__(self, index):\n file_id = self.file_list[index]\n \n # Image folder\n image_folder = os.path.join(self.root_path, 'images')\n image_path = os.path.join(image_folder, file_id+ '.png')\n \n # Label folder\n mask_folder = os.path.join(self.root_path, 'masks')\n mask_path = os.path.join(mask_folder, file_id+ '.png')\n \n image = np.array(imageio.imread(image_path), dtype=np.uint8)\n mask = np.array(imageio.imread(mask_path), dtype=np.uint8)\n \n return image, mask",
"_____no_output_____"
]
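,
[
"# Hypothetical smoke test (editor's sketch, not part of the original run):\n# fetch one sample by id to confirm the images/ and masks/ folders unpacked\n# from train.zip are readable. '000e218f21' is one of the training ids.\ndemo = TGSSaltDataSet('./', ['000e218f21'])\nimg, mask = demo[0]\nprint(img.shape, mask.shape)",
"_____no_output_____"
]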
],
[
[
"# Load dataset csv",
"_____no_output_____"
]
],
[
[
"train_mask = pd.read_csv('train.csv')\ndepth = pd.read_csv('depths.csv')\n\ntrain_path = './'\n\nfile_list = list(train_mask['id'].values)\n\ndataset = TGSSaltDataSet(train_path, file_list)",
"_____no_output_____"
]
],
[
[
"# Visualize dataset",
"_____no_output_____"
]
],
[
[
"def plot2x2array(image, mask):\n fig, axs = plt.subplots(1, 2)\n axs[0].imshow(image)\n axs[1].imshow(mask)\n \n axs[0].grid()\n axs[1].grid()\n \n axs[0].set_title('Image')\n axs[1].set_title('Mask')",
"_____no_output_____"
],
[
"for i in range(5):\n image, mask = dataset[np.random.randint(0, len(dataset))]\n plot2x2array(image, mask)",
"_____no_output_____"
],
[
"plt.figure(figsize = (6, 6))\nplt.hist(depth['z'], bins = 50)\nplt.title('Depth distribution')",
"_____no_output_____"
]
],
[
[
"# Convert RLE Mask to matrix",
"_____no_output_____"
]
],
[
[
"def rle_to_mask(rle_string, height, width):\n rows, cols = height, width\n try:\n rle_numbers = [int(numstr) for numstr in rle_string.split(' ')]\n rle_pairs = np.array(rle_numbers).reshape(-1, 2)\n img = np.zeros(rows * cols, dtype=np.uint8)\n \n for idx, length in rle_pairs:\n idx -= 1\n img[idx:idx+length] = 255\n \n img = img.reshape(cols, rows)\n \n img = img.T\n \n except:\n img = np.zeros((cols, rows))\n \n return img",
"_____no_output_____"
],
[
"def salt_proportion(img_array):\n try:\n unique, counts = np.unique(img_array, return_counts=True)\n return counts[1]/10201.\n except:\n return 0.0",
"_____no_output_____"
]
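,
[
"# A minimal worked example (editor's sketch): decode a hand-made RLE string\n# on a 4x4 grid. RLE pairs are (1-based start, run length) over a column-major\n# flattening, which is why rle_to_mask reshapes to (width, height) and then\n# transposes. '1 3 10 2' marks rows 0-2 of column 0 and rows 1-2 of column 2.\ndemo_mask = rle_to_mask('1 3 10 2', 4, 4)\nprint(demo_mask)\n# salt_proportion divides by the hard-coded 101*101 = 10201, so the value is\n# only meaningful for real competition masks; shown here just for the call.\nprint(salt_proportion(demo_mask))",
"_____no_output_____"
]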
],
[
[
"# Create training mask",
"_____no_output_____"
]
],
[
[
"train_mask['mask'] = train_mask['rle_mask'].apply(lambda x: rle_to_mask(x, 101, 101))\ntrain_mask['salt_proportion'] = train_mask['mask'].apply(lambda x: salt_proportion(x))",
"_____no_output_____"
]
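,
[
"# Hypothetical consistency check (editor's addition): the decoded RLE mask\n# should match the mask PNG shipped with the data. Both use the same row\n# index because file_list was built from train_mask['id'] above.\n# (Panel titles read 'Image'/'Mask'; here both panels are masks.)\ni = 0\n_, mask_png = dataset[i]\nplot2x2array(mask_png, train_mask['mask'].iloc[i])",
"_____no_output_____"
]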
],
[
[
"# Let's merge the mask and depths",
"_____no_output_____"
]
],
[
[
"merged = train_mask.merge(depth, how='left')\nmerged.head()",
"_____no_output_____"
],
[
"plt.figure(figsize=(12, 6))\nplt.scatter(merged['salt_proportion'], merged['z'])\nplt.title('Proportion of salt vs depth')",
"_____no_output_____"
],
[
"print(\"Correlation: \", np.corrcoef(merged['salt_proportion'], merged['z'])[0, 1])",
"Correlation: 0.10361580365557428\n"
]
],
[
[
"# Setup Keras and Train",
"_____no_output_____"
]
],
[
[
"from keras.models import Model, load_model\nfrom keras.layers import Input\nfrom keras.layers.core import Lambda, RepeatVector, Reshape\nfrom keras.layers.convolutional import Conv2D, Conv2DTranspose\nfrom keras.layers.pooling import MaxPooling2D\nfrom keras.layers.merge import concatenate\nfrom keras.callbacks import EarlyStopping, ModelCheckpoint, ReduceLROnPlateau\nfrom keras import backend as K",
"Using TensorFlow backend.\n"
],
[
"im_width = 128\nim_height = 128\nborder = 5\nim_chan = 2 # Number of channels: first is original and second cumsum(axis=0)\nn_features = 1 # Number of extra features, like depth\n#path_train = '../input/train/'\n#path_test = '../input/test/'",
"_____no_output_____"
],
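[
"# Editor's sketch of the second input channel: a vertical cumulative sum of\n# the mean-centred image (the loading loop below also normalises by the std\n# of a border-cropped window; this toy 3x3 example skips that detail).\ntoy = np.array([[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]])\nprint((toy - toy.mean()).cumsum(axis=0))\n# -> [[-4. -3. -2.]\n#     [-5. -3. -1.]\n#     [-3.  0.  3.]]",
"_____no_output_____"
],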
[
"# Build U-Net model\ninput_img = Input((im_height, im_width, im_chan), name='img')\ninput_features = Input((n_features, ), name='feat')\n\nc1 = Conv2D(8, (3, 3), activation='relu', padding='same') (input_img)\nc1 = Conv2D(8, (3, 3), activation='relu', padding='same') (c1)\np1 = MaxPooling2D((2, 2)) (c1)\n\nc2 = Conv2D(16, (3, 3), activation='relu', padding='same') (p1)\nc2 = Conv2D(16, (3, 3), activation='relu', padding='same') (c2)\np2 = MaxPooling2D((2, 2)) (c2)\n\nc3 = Conv2D(32, (3, 3), activation='relu', padding='same') (p2)\nc3 = Conv2D(32, (3, 3), activation='relu', padding='same') (c3)\np3 = MaxPooling2D((2, 2)) (c3)\n\nc4 = Conv2D(64, (3, 3), activation='relu', padding='same') (p3)\nc4 = Conv2D(64, (3, 3), activation='relu', padding='same') (c4)\np4 = MaxPooling2D(pool_size=(2, 2)) (c4)\n\n# Join features information in the depthest! layer\nf_repeat = RepeatVector(8*8)(input_features)\nf_conv = Reshape((8, 8, n_features))(f_repeat)\np4_feat = concatenate([p4, f_conv], -1)\n\nc5 = Conv2D(128, (3, 3), activation='relu', padding='same') (p4_feat)\nc5 = Conv2D(128, (3, 3), activation='relu', padding='same') (c5)\n\nu6 = Conv2DTranspose(64, (2, 2), strides=(2, 2), padding='same') (c5)\n#check out this skip connection thooooo\nu6 = concatenate([u6, c4])\nc6 = Conv2D(64, (3, 3), activation='relu', padding='same') (u6)\nc6 = Conv2D(64, (3, 3), activation='relu', padding='same') (c6)\n\nu7 = Conv2DTranspose(32, (2, 2), strides=(2, 2), padding='same') (c6)\nu7 = concatenate([u7, c3])\nc7 = Conv2D(32, (3, 3), activation='relu', padding='same') (u7)\nc7 = Conv2D(32, (3, 3), activation='relu', padding='same') (c7)\n\nu8 = Conv2DTranspose(16, (2, 2), strides=(2, 2), padding='same') (c7)\nu8 = concatenate([u8, c2])\nc8 = Conv2D(16, (3, 3), activation='relu', padding='same') (u8)\nc8 = Conv2D(16, (3, 3), activation='relu', padding='same') (c8)\n\nu9 = Conv2DTranspose(8, (2, 2), strides=(2, 2), padding='same') (c8)\nu9 = concatenate([u9, c1], axis=3)\nc9 = Conv2D(8, (3, 3), activation='relu', padding='same') (u9)\nc9 = Conv2D(8, (3, 3), activation='relu', padding='same') (c9)\n\noutputs = Conv2D(1, (1, 1), activation='sigmoid') (c9)\n\nmodel = Model(inputs=[input_img, input_features], outputs=[outputs])\nmodel.compile(optimizer='adam', loss='binary_crossentropy') #, metrics=[mean_iou]) # The mean_iou metrics seens to leak train and test values...\nmodel.summary()",
"__________________________________________________________________________________________________\nLayer (type) Output Shape Param # Connected to \n==================================================================================================\nimg (InputLayer) (None, 128, 128, 2) 0 \n__________________________________________________________________________________________________\nconv2d_77 (Conv2D) (None, 128, 128, 8) 152 img[0][0] \n__________________________________________________________________________________________________\nconv2d_78 (Conv2D) (None, 128, 128, 8) 584 conv2d_77[0][0] \n__________________________________________________________________________________________________\nmax_pooling2d_17 (MaxPooling2D) (None, 64, 64, 8) 0 conv2d_78[0][0] \n__________________________________________________________________________________________________\nconv2d_79 (Conv2D) (None, 64, 64, 16) 1168 max_pooling2d_17[0][0] \n__________________________________________________________________________________________________\nconv2d_80 (Conv2D) (None, 64, 64, 16) 2320 conv2d_79[0][0] \n__________________________________________________________________________________________________\nmax_pooling2d_18 (MaxPooling2D) (None, 32, 32, 16) 0 conv2d_80[0][0] \n__________________________________________________________________________________________________\nconv2d_81 (Conv2D) (None, 32, 32, 32) 4640 max_pooling2d_18[0][0] \n__________________________________________________________________________________________________\nconv2d_82 (Conv2D) (None, 32, 32, 32) 9248 conv2d_81[0][0] \n__________________________________________________________________________________________________\nmax_pooling2d_19 (MaxPooling2D) (None, 16, 16, 32) 0 conv2d_82[0][0] \n__________________________________________________________________________________________________\nconv2d_83 (Conv2D) (None, 16, 16, 64) 18496 max_pooling2d_19[0][0] \n__________________________________________________________________________________________________\nfeat (InputLayer) (None, 1) 0 \n__________________________________________________________________________________________________\nconv2d_84 (Conv2D) (None, 16, 16, 64) 36928 conv2d_83[0][0] \n__________________________________________________________________________________________________\nrepeat_vector_5 (RepeatVector) (None, 64, 1) 0 feat[0][0] \n__________________________________________________________________________________________________\nmax_pooling2d_20 (MaxPooling2D) (None, 8, 8, 64) 0 conv2d_84[0][0] \n__________________________________________________________________________________________________\nreshape_5 (Reshape) (None, 8, 8, 1) 0 repeat_vector_5[0][0] \n__________________________________________________________________________________________________\nconcatenate_21 (Concatenate) (None, 8, 8, 65) 0 max_pooling2d_20[0][0] \n reshape_5[0][0] \n__________________________________________________________________________________________________\nconv2d_85 (Conv2D) (None, 8, 8, 128) 75008 concatenate_21[0][0] \n__________________________________________________________________________________________________\nconv2d_86 (Conv2D) (None, 8, 8, 128) 147584 conv2d_85[0][0] \n__________________________________________________________________________________________________\nconv2d_transpose_17 (Conv2DTran (None, 16, 16, 64) 32832 conv2d_86[0][0] \n__________________________________________________________________________________________________\nconcatenate_22 (Concatenate) 
(None, 16, 16, 128) 0 conv2d_transpose_17[0][0] \n conv2d_84[0][0] \n__________________________________________________________________________________________________\nconv2d_87 (Conv2D) (None, 16, 16, 64) 73792 concatenate_22[0][0] \n__________________________________________________________________________________________________\nconv2d_88 (Conv2D) (None, 16, 16, 64) 36928 conv2d_87[0][0] \n__________________________________________________________________________________________________\nconv2d_transpose_18 (Conv2DTran (None, 32, 32, 32) 8224 conv2d_88[0][0] \n__________________________________________________________________________________________________\nconcatenate_23 (Concatenate) (None, 32, 32, 64) 0 conv2d_transpose_18[0][0] \n conv2d_82[0][0] \n__________________________________________________________________________________________________\nconv2d_89 (Conv2D) (None, 32, 32, 32) 18464 concatenate_23[0][0] \n__________________________________________________________________________________________________\nconv2d_90 (Conv2D) (None, 32, 32, 32) 9248 conv2d_89[0][0] \n__________________________________________________________________________________________________\nconv2d_transpose_19 (Conv2DTran (None, 64, 64, 16) 2064 conv2d_90[0][0] \n__________________________________________________________________________________________________\nconcatenate_24 (Concatenate) (None, 64, 64, 32) 0 conv2d_transpose_19[0][0] \n conv2d_80[0][0] \n__________________________________________________________________________________________________\nconv2d_91 (Conv2D) (None, 64, 64, 16) 4624 concatenate_24[0][0] \n__________________________________________________________________________________________________\nconv2d_92 (Conv2D) (None, 64, 64, 16) 2320 conv2d_91[0][0] \n__________________________________________________________________________________________________\nconv2d_transpose_20 (Conv2DTran (None, 128, 128, 8) 520 conv2d_92[0][0] \n__________________________________________________________________________________________________\nconcatenate_25 (Concatenate) (None, 128, 128, 16) 0 conv2d_transpose_20[0][0] \n conv2d_78[0][0] \n__________________________________________________________________________________________________\nconv2d_93 (Conv2D) (None, 128, 128, 8) 1160 concatenate_25[0][0] \n__________________________________________________________________________________________________\nconv2d_94 (Conv2D) (None, 128, 128, 8) 584 conv2d_93[0][0] \n__________________________________________________________________________________________________\nconv2d_95 (Conv2D) (None, 128, 128, 1) 9 conv2d_94[0][0] \n==================================================================================================\nTotal params: 486,897\nTrainable params: 486,897\nNon-trainable params: 0\n__________________________________________________________________________________________________\n"
],
[
"import sys\nfrom tqdm import tqdm\nfrom keras.preprocessing.image import ImageDataGenerator, array_to_img, img_to_array, load_img\nfrom skimage.transform import resize\n\n\ntrain_ids = next(os.walk(train_path+\"masks\"))[2]\n\n\n# Get and resize train images and masks\nX = np.zeros((len(train_ids), im_height, im_width, im_chan), dtype=np.float32)\ny = np.zeros((len(train_ids), im_height, im_width, 1), dtype=np.float32)\nX_feat = np.zeros((len(train_ids), n_features), dtype=np.float32)\nprint('Getting and resizing train images and masks ... ')\nsys.stdout.flush()\nfor n, id_ in tqdm(enumerate(train_ids), total=len(train_ids)):\n path = train_path\n \n # Depth\n #X_feat[n] = depth.loc[id_.replace('.png', ''), 'z']\n \n # Load X\n img = load_img(path + 'images/' + id_, grayscale=True)\n x_img = img_to_array(img)\n x_img = resize(x_img, (128, 128, 1), mode='constant', preserve_range=True)\n \n # Create cumsum x\n x_center_mean = x_img[border:-border, border:-border].mean()\n x_csum = (np.float32(x_img)-x_center_mean).cumsum(axis=0)\n x_csum -= x_csum[border:-border, border:-border].mean()\n x_csum /= max(1e-3, x_csum[border:-border, border:-border].std())\n\n # Load Y\n mask = img_to_array(load_img(path + 'masks/' + id_, grayscale=True))\n mask = resize(mask, (128, 128, 1), mode='constant', preserve_range=True)\n\n # Save images\n X[n, ..., 0] = x_img.squeeze() / 255\n X[n, ..., 1] = x_csum.squeeze()\n y[n] = mask / 255\n\nprint('Done!')",
"Getting and resizing train images and masks ... \n"
],
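[
"# A minimal training sketch (editor's addition; epochs, batch size and the\n# checkpoint filename are assumptions, not from the original run). It wires\n# up the callbacks imported earlier. X_feat is all zeros here because the\n# depth lookup in the loading loop above is commented out.\ncallbacks = [\n    EarlyStopping(patience=5, verbose=1),\n    ReduceLROnPlateau(factor=0.1, patience=3, min_lr=1e-5, verbose=1),\n    ModelCheckpoint('model-tgs-salt.h5', save_best_only=True, verbose=1)\n]\nresults = model.fit({'img': X, 'feat': X_feat}, y, batch_size=16, epochs=30,\n                    validation_split=0.1, callbacks=callbacks)",
"_____no_output_____"
],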
[
"!ls ./masks",
"000e218f21.png\t35499d9dd6.png\t6957579961.png\t99f52af0ea.png\tcc876896b5.png\n003c477d7c.png\t356fa48ac4.png\t697af33de9.png\t9a1fbef3e3.png\tcc97e87765.png\n00441f1cf2.png\t3577258d6b.png\t697f80cece.png\t9a3abd0ba6.png\tccc41663c6.png\n0050766ae2.png\t358b9acfb5.png\t697fee88c5.png\t9a478d8cf2.png\tccd6841e90.png\n005b452274.png\t359128303c.png\t6994342067.png\t9a48617407.png\tccdd1c542f.png\n0061281eea.png\t3598c246d7.png\t699608ddb7.png\t9a4864ff06.png\tcced2adf43.png\n008a50a2ec.png\t35b34a9da1.png\t69bb149e2d.png\t9a4b15919d.png\tccfb270edb.png\n00950d1627.png\t35ba6f2faf.png\t69cddf572c.png\t9a4db0d7aa.png\tcd1b2e49f4.png\n00a3af90ab.png\t35bbf1302b.png\t69ee613976.png\t9a4f625408.png\tcd2046479c.png\n00cda0328c.png\t35f3cb48d6.png\t6a0f5a8b42.png\t9a672bfb51.png\tcd249859fa.png\n0108518d1e.png\t35fd991255.png\t6a105e7555.png\t9a6c280d8d.png\tcd292b7c8d.png\n010ee525b6.png\t361d5e6834.png\t6a1783be17.png\t9a6cf411ff.png\tcd5c9532e4.png\n0115703825.png\t362cec7849.png\t6a1fd3c67e.png\t9a71e8563b.png\tcd5cd074f1.png\n01323211a0.png\t3634f29aac.png\t6a1fe1a81e.png\t9a91290888.png\tcd6140444f.png\n016fc8031c.png\t363a171d90.png\t6a29fbd74b.png\t9a9ff1beca.png\tcd6320c6eb.png\n019afb4b4e.png\t365abbdf77.png\t6a3d92ce59.png\t9aa65d393a.png\tcd69ee839f.png\n01b5362cce.png\t365bdf0ae1.png\t6a3f8feaea.png\t9ad32f3c66.png\tcd6ed7690d.png\n01c033e116.png\t3662fffe30.png\t6a57dfd211.png\t9ade7d33f2.png\tcdc7d77e77.png\n01c2045d03.png\t367c6fb412.png\t6a6b42a6f2.png\t9ae45b19d3.png\tcddb50fb4b.png\n020376e68e.png\t3685aa0e78.png\t6a6ebb2255.png\t9b09626522.png\tcde14bb8d2.png\n020678ec1b.png\t368767c9fd.png\t6a7506aa1e.png\t9b1f59b89c.png\tcdf9422054.png\n02117a400e.png\t3687839ecc.png\t6a89c7a85d.png\t9b29ca561d.png\tce06c157c1.png\n021494f3aa.png\t368f635c4c.png\t6ab003835b.png\t9b31db4e58.png\tce0e9def63.png\n02206b234e.png\t36a6901cae.png\t6abe55deaf.png\t9b3d4b0379.png\tce14d335bf.png\n022b1b01be.png\t36aa21019e.png\t6ac91c75fc.png\t9b450671aa.png\tce15cc1d1a.png\n023d486fba.png\t36ad52a2e8.png\t6af48dfcdd.png\t9b4aa94aed.png\tce16b0d0bc.png\n0243477802.png\t36b768bfa7.png\t6aff151236.png\t9b585a2ea0.png\tce1f9711bc.png\n0249c49180.png\t36bc8044f8.png\t6b0688f33f.png\t9b5bed5c6f.png\tce31ab9b97.png\n026b509cd4.png\t36db23ceb8.png\t6b2f79f6ff.png\t9b656c339a.png\tce3e6bba2e.png\n026f97575b.png\t36e68393b6.png\t6b2f7ba83d.png\t9b80cf8992.png\tce9c3510a8.png\n0280db420c.png\t36f332bd11.png\t6b377327b1.png\t9b9c107a62.png\tceb69201df.png\n0280deb8ae.png\t36f9ab3ee0.png\t6b4d65ac6a.png\t9ba564058f.png\tceca04aa2a.png\n029c321179.png\t370763b8a9.png\t6b5b7c0ff3.png\t9bbb4c7c34.png\tced8d25446.png\n02adf272e9.png\t37251064bd.png\t6b95bc6c5f.png\t9bc9c403e0.png\tceeaafc4fc.png\n02d40993ab.png\t3734601c15.png\t6b996daecc.png\t9bd58061d6.png\tceef85bdc0.png\n03049b14ca.png\t376e55e566.png\t6baf05d61e.png\t9bdfa54912.png\tcef03959d8.png\n0304e9ac37.png\t3782748f51.png\t6bc4c91c27.png\t9befc65890.png\tcef624f327.png\n03511989ac.png\t3792be0845.png\t6bffabbd91.png\t9bf022776e.png\tcf11876547.png\n0389fc6012.png\t3794cd4d73.png\t6c0a545d06.png\t9bf8a38b92.png\tcf1ffe14e0.png\n03a9a9f615.png\t37aa9fc79a.png\t6c25735f24.png\t9bf8f84094.png\tcf241124b1.png\n03be56aa9a.png\t37ba215c62.png\t6c3a9009c8.png\t9bf982cf65.png\tcf706fb35a.png\n03c9b5ffe8.png\t37c95ff7df.png\t6c40978ddf.png\t9c08332067.png\tcf7abfc8bb.png\n03eed26d3e.png\t37df75f3a2.png\t6c4568e51c.png\t9c298b9575.png\tcfb8e5c36e.png\n0401ae15d3.png\t37e16a5404.png\t6c45d80d1e.png\t9c30bb0a5f.png\tcfcffeda9e.png\n04182ced8e.png\t38063f77a8.png\t6c
92c7.png\tf1e4acdd7b.png\n2775c5f447.png\t5b836d2444.png\t8d9b8d724a.png\tbdb9ba462f.png\tf20521b7f5.png\n27870bb184.png\t5b86544456.png\t8daa7e48d1.png\tbe18a24c49.png\tf21593cfa8.png\n2787bfe603.png\t5b8dc086ff.png\t8daf098aa9.png\tbe1c8cbf1f.png\tf2259e86f4.png\n27a240a570.png\t5bb48ef0ea.png\t8db0a68a4c.png\tbe1f425eac.png\tf234066134.png\n27b5526adb.png\t5bb976391e.png\t8db5bf9be2.png\tbe2dd1db16.png\tf2598805e6.png\n27c035c5ae.png\t5bbcd70d2f.png\t8db9dc4a62.png\tbe4db8ff8a.png\tf26e6cffd6.png\n27c45fef01.png\t5be9925bca.png\t8dc946b34d.png\tbe565e046b.png\tf288e998f5.png\n27c4850501.png\t5bf5d82829.png\t8ddaea54ce.png\tbe6cf5cf86.png\tf28b483859.png\n27d0d9af30.png\t5bffe3b52f.png\t8e13060816.png\tbe7014887d.png\tf2c640742a.png\n27de423ad3.png\t5c0cde01b7.png\t8e280f2c87.png\tbe90ab3e56.png\tf2c869e655.png\n27e12cbcea.png\t5c28eec331.png\t8e2e3afdb5.png\tbe9ce6a9d7.png\tf2d2fefde3.png\n27ec51c877.png\t5c297b5dcd.png\t8e3228d793.png\tbea9b0571b.png\tf2d668f7e7.png\n27efc3195e.png\t5c33b65396.png\t8e48de3b3b.png\tbebae78c6a.png\tf2e9faa785.png\n27f6632fd1.png\t5c51c23a7b.png\t8e7324c8ed.png\tbebf415bd1.png\tf2f0278d02.png\n27f923aac9.png\t5c616dea00.png\t8e84b940e3.png\tbec8f132d0.png\tf2f67e14a6.png\n280dcd0b2d.png\t5c65a9775b.png\t8e93b66386.png\tbedb558d15.png\tf2fcceee25.png\n2821f70a5a.png\t5c70da6c2d.png\t8e9f2b9b61.png\tbefdfcbe5b.png\tf2ffcf3de5.png\n28553d5d42.png\t5c7f310937.png\t8ebceef687.png\tbf00a0ab71.png\tf30c36bf6b.png\n285b9ba056.png\t5c87971507.png\t8ec207d7df.png\tbf05a52a6b.png\tf30e47cb36.png\n285f4b2e82.png\t5c8ce5869b.png\t8ee20f502e.png\tbf294b9661.png\tf3220c3590.png\n2874150952.png\t5c9326f1e4.png\t8ef76ed16e.png\tbf71bff284.png\tf328eccbf3.png\n287aea3a3b.png\t5cada3d3f1.png\t8f07ef8585.png\tbf72590547.png\tf3356c9ee4.png\n287b0f197f.png\t5cb2496125.png\t8f08f4e9f2.png\tbf73cacab5.png\tf342603c76.png\n287d035464.png\t5ce554e890.png\t8f16b2685d.png\tbf76c8aff3.png\tf34ae8f3d4.png\n2886e27fb6.png\t5cefe5b52c.png\t8f1d6045fb.png\tbfa7ee102e.png\tf362d474b7.png\n288c4c1941.png\t5cf1f1e7f0.png\t8f1fbec1e6.png\tbfac8e63cb.png\tf37556e4da.png\n28948eeb9c.png\t5d07e84138.png\t8f2087a1f5.png\tbfb0e22b53.png\tf3ee165cd0.png\n289af097b0.png\t5d0a9c5b0e.png\t8f39d964c3.png\tbfbb9b9149.png\tf3fb95295e.png\n28a1187a4e.png\t5d4dcb9d8b.png\t8f407b64ae.png\tbfbea83faa.png\tf3fd020f27.png\n28a9c6e5fe.png\t5d600057f5.png\t8f5af7f42a.png\tbfc47bbe13.png\tf409d55fee.png\n28ad4e60c5.png\t5d6126f459.png\t8f7b7b9980.png\tbfcd0fb5ac.png\tf41a025b6c.png\n28ca5ff16f.png\t5d7424eab2.png\t8fb68e99e8.png\tbfd1d7caa1.png\tf425a67987.png\n28d42de29f.png\t5d752d6d4a.png\t8fd4178b11.png\tc00c4b8814.png\tf429b8e6f8.png\n28df927e87.png\t5df9874c5b.png\t90194208b9.png\tc02071548b.png\tf42a7e65f2.png\n28e3c1128c.png\t5dfa4cfa2d.png\t902d01b7dd.png\tc023a62b96.png\tf43f1e7456.png\n28f6f52c8f.png\t5e09ef27a6.png\t903180c447.png\tc026a17632.png\tf44f67efae.png\n28f865caaa.png\t5e0d2e6c25.png\t9031ddc924.png\tc038f6fbe4.png\tf454efbd20.png\n2903f6a1db.png\t5e350fa662.png\t903206854b.png\tc0565001a3.png\tf456e3161d.png\n290e9d9748.png\t5e3c2aa2bb.png\t904f3fc07a.png\tc06ce71f9a.png\tf45dbe6541.png\n29162a3471.png\t5e58ed4adb.png\t9053aa6508.png\tc073b8930c.png\tf4638e7428.png\n296a04bc06.png\t5e85356a28.png\t9067effd34.png\tc092a99d23.png\tf463f6b15b.png\n296de39be2.png\t5e95137d5f.png\t90704ffe69.png\tc09f4e95e8.png\tf4784e4f1f.png\n2993a272be.png\t5e98f349f3.png\t90720e8172.png\tc0a37fef54.png\tf4e6928ad4.png\n299ba3cf34.png\t5ebe9e404d.png\t90791c2417.png\tc0aa0c7b66.png\tf4ed280779.png\n299ef808f
6.png\t5ec4b8eff9.png\t9086ac3a68.png\tc0b6fafc47.png\tf4f0fd932e.png\n29a1b18e79.png\t5ec6db5c9d.png\t9090f8f97b.png\tc0cbdabcc3.png\tf4feed1635.png\n29b25b5f03.png\t5ecf1f5cb4.png\t90946392e6.png\tc0ef23e386.png\tf503aaf699.png\n29b5ee6cf5.png\t5edb37f5a8.png\t90990cd2ba.png\tc0fe58323e.png\tf507d4f122.png\n29beb70925.png\t5edca1bb89.png\t90a57bacf0.png\tc0ffa90c4a.png\tf52c144f24.png\n29c033e8e4.png\t5ee2d40bbb.png\t90b14e3f04.png\tc100d07aa9.png\tf566a7cbc6.png\n29d8bf3a4c.png\t5ef096aeac.png\t90b1a59d1b.png\tc10ff1398a.png\tf57cb32d6d.png\n29df15c1ce.png\t5f1aa014c5.png\t90b1ba3bb5.png\tc112748161.png\tf57e741b5b.png\n29ea4f92c3.png\t5f1df5e4a8.png\t90bf0125af.png\tc1190c9c37.png\tf59821d067.png\n29fd4ba626.png\t5f248cf0d6.png\t90c62cd13b.png\tc137712b61.png\tf599fad413.png\n2a00ecc904.png\t5f397e0074.png\t90d2cdb32d.png\tc1431cfade.png\tf5a0acc480.png\n2a070f3dc6.png\t5f3b26ac68.png\t90f37a2a28.png\tc1503fb0ec.png\tf5a92d1356.png\n2a16af2ae7.png\t5f43816c42.png\t91127f8cc2.png\tc156fd6d44.png\tf5b2861c9b.png\n2a26fb616b.png\t5f45291a7a.png\t911efbb175.png\tc1c54ad4ba.png\tf5c25276d2.png\n2a484baa26.png\t5f48c8c123.png\t911f05128d.png\tc1c5602088.png\tf5c2e66754.png\n2a49169951.png\t5f51a19f9e.png\t9126bbe228.png\tc1c6a1ebad.png\tf5d1176e33.png\n2a5805ad13.png\t5f7c5c59f5.png\t9128b824e5.png\tc1ed7e8344.png\tf5e72111d0.png\n2a6477aee8.png\t5f8370ed60.png\t912b573a9e.png\tc1f92fd149.png\tf5ffe74512.png\n2a71003f75.png\t5f8b4540fd.png\t9146f2afd1.png\tc20069b110.png\tf6009f764c.png\n2a9190ed84.png\t5f8f40ce31.png\t916aff36ae.png\tc202c25557.png\tf62410bce9.png\n2ae425e5a3.png\t5f98029612.png\t917ef84e64.png\tc20c019650.png\tf625984494.png\n2af008d773.png\t5fa536d1fd.png\t91823ecb1f.png\tc23090da64.png\tf641699848.png\n2af240d261.png\t5fa87fc4cd.png\t918548e472.png\tc230cf2a94.png\tf64190c47e.png\n2af3c055b2.png\t5fabfdc0b6.png\t919157e393.png\tc2395da60e.png\tf646560aa9.png\n2b160a1291.png\t5fc8bb94aa.png\t919bc0e2ba.png\tc23d957c59.png\tf6576d4cbe.png\n2b1e71c2a3.png\t5fd2a3ef64.png\t91a0ac314c.png\tc2485cb6c3.png\tf65a52f375.png\n2b2fc71ee1.png\t5fd411e302.png\t91b91fd79b.png\tc256f6e713.png\tf6a89b6f89.png\n2b30313d2c.png\t5fd8ed2395.png\t91bb8b5dbb.png\tc26d515fdb.png\tf6c49acac0.png\n2b39f9a88f.png\t5fe333fd26.png\t91bfe61493.png\tc26ff00bfd.png\tf6c784904d.png\n2b405e7016.png\t5fe4e3f5e6.png\t91c003779c.png\tc27409a765.png\tf6e87c1458.png\n2b55bdbda1.png\t5fe599e1a5.png\t91dfba8133.png\tc2973c16f1.png\tf70f3b7b95.png\n2b723b8474.png\t5ff89814f5.png\t91ea3f1227.png\tc29c36a1e8.png\tf712d8b870.png\n2b76b47947.png\t5ffd928d1e.png\t92107f9559.png\tc2ca0ed857.png\tf724f78b1a.png\n2b77f82b42.png\t6010b9b84d.png\t921b60e76d.png\tc2e9f328e3.png\tf7380099f6.png\n2bc179b78c.png\t602dc99f10.png\t923249032c.png\tc307912941.png\tf75842e215.png\n2bc8a3db3c.png\t6040cbf086.png\t923473085f.png\tc313a1ad7d.png\tf76c6bc5d0.png\n2bca4fe522.png\t60584b31d8.png\t9258e48804.png\tc32752c00f.png\tf78361aa48.png\n2bce09096c.png\t60638605b1.png\t9260b4f758.png\tc33df3d7ad.png\tf78674d646.png\n2bd87e0587.png\t60698e8fd2.png\t9267f6cf9c.png\tc33e3a20a4.png\tf7933b266c.png\n2bd913a165.png\t6074c5b0bf.png\t92895cb54f.png\tc33faa5d81.png\tf7b1bd7c8a.png\n2be45a7fa3.png\t608567ed23.png\t92a20ccbe2.png\tc352f6069b.png\tf7c147ac61.png\n2bf5343f03.png\t608574722a.png\t92a6b22052.png\tc353af5098.png\tf7d4d6fb0d.png\n2bfa664017.png\t6092afaec7.png\t92c25915f1.png\tc356f76804.png\tf7dc2b2c9a.png\n2c3c64402e.png\t60a0ca8d0e.png\t92c91d1494.png\tc3589905df.png\tf7e555a0aa.png\n2c45b152f1.png\t60b3d0d613.png\t92cc29d923.p
ng\tc37e74bffc.png\tf7e8060b57.png\n2c707479f9.png\t60b7303d20.png\t92d116836e.png\tc37f7c51e9.png\tf7e855fc40.png\n2ce9c36a98.png\t60bbd166a4.png\t92d9025a37.png\tc385af3da9.png\tf7ff576dc4.png\n2d20ee4eff.png\t60d3607be2.png\t92e617896e.png\tc387a012fc.png\tf807851f5a.png\n2d3c0a68f9.png\t60dccbc52f.png\t930183efe5.png\tc38e2a42e3.png\tf839d20b06.png\n2d4e7a937b.png\t61321814ce.png\t930a21d485.png\tc39f317ad0.png\tf841c6dee7.png\n2d5f4b0213.png\t6138f82415.png\t930c8dd0e4.png\tc3a963f5e3.png\tf85054273a.png\n2d64b6a8cf.png\t613fb95edf.png\t930cbb8ccb.png\tc42abd17f3.png\tf85a85710a.png\n2d7a83381c.png\t6150f39f8d.png\t930dcd571a.png\tc42d63b0e6.png\tf870e41e88.png\n2d8a3a262b.png\t6153c227de.png\t93196366e1.png\tc43b8533c4.png\tf87154ea7b.png\n2da763222e.png\t617c746fca.png\t932529c83d.png\tc450f9d7f4.png\tf87365827d.png\n2da8da57e1.png\t618f843c7a.png\t9347c39706.png\tc45c9617bf.png\tf8921580bb.png\n2da9564e53.png\t619edf8941.png\t934cd14205.png\tc47c5ea1c3.png\tf8b721f632.png\n2e035505d6.png\t61b42f8f0e.png\t9358b1fa7a.png\tc4911615e1.png\tf8ba90c991.png\n2e69b6eb16.png\t61bd2ee822.png\t935cba59ad.png\tc4a681819d.png\tf8ca1f5c5a.png\n2e7921981a.png\t61cfb43abf.png\t937ea43a65.png\tc4adb6e891.png\tf8e67987c6.png\n2e8bf3f7b3.png\t61df7e1f4d.png\t9382a047a8.png\tc4b05c2eaa.png\tf8f7ed86fc.png\n2eae73b03a.png\t61eb86f1ad.png\t9385125b12.png\tc4b4099179.png\tf950879320.png\n2ebe2eba03.png\t623565cb2c.png\t938d11586e.png\tc4ed7cdb1c.png\tf959f79b64.png\n2eca5f70b1.png\t624bd59549.png\t939232310f.png\tc4ee594685.png\tf9629f4e29.png\n2ecb2b0797.png\t6252a820da.png\t939879034e.png\tc4f2799234.png\tf9645c1acf.png\n2edf2e3e05.png\t62715133c5.png\t939c88a964.png\tc4f5a0fa36.png\tf992390442.png\n2ee6e81e4b.png\t6272ae54bc.png\t93a0cb5111.png\tc50ba562ef.png\tf9a3d103d7.png\n2eea9d51a4.png\t6285015437.png\t93a1541218.png\tc5327d784b.png\tf9b7196dfc.png\n2f00b33427.png\t628ab35fe8.png\t93a8901635.png\tc5493cc7b0.png\tf9d03271d6.png\n2f20c083c2.png\t628ea36ba6.png\t93b51f6335.png\tc56524ded9.png\tf9d8f014c2.png\n2f2e86de0b.png\t628f1de539.png\t93b8ec180f.png\tc571804a93.png\tf9ef613aec.png\n2f35ac4b87.png\t62966dffd3.png\t93c7358e41.png\tc5745e04ff.png\tf9fc7746fb.png\n2f3eff4192.png\t62a0ee9480.png\t93d21b7636.png\tc576ef893a.png\tfa01eb7fdf.png\n2f493f72ff.png\t62a8f89a19.png\t93f7759d52.png\tc58124fdc4.png\tfa08ffa6fb.png\n2f4fb95bb7.png\t62aad7556c.png\t93f8f9e663.png\tc58b6277c7.png\tfa25724a62.png\n2f5377ea9d.png\t62d30854d7.png\t941d0865da.png\tc593c662f1.png\tfa2ebd1083.png\n2f746f8726.png\t62d6a9b673.png\t942d3bfbf3.png\tc59e5f46c0.png\tfa3e7cb3f5.png\n2f8b36c386.png\t62de954435.png\t9447159eb7.png\tc5a9c13a77.png\tfa50e86a72.png\n2f8e4f82fd.png\t62df3565cd.png\t944b46458e.png\tc5c4479af6.png\tfa56377e58.png\n2f917b1124.png\t62e608259a.png\t94551e800f.png\tc5caa57f2f.png\tfa59461848.png\n2fa330e8fd.png\t62e72fe39a.png\t9458b40464.png\tc60eaa0c75.png\tfa70e3da06.png\n2fa4260cde.png\t62f822b156.png\t9466e21e0d.png\tc613a090a4.png\tfa752c35ca.png\n2fa889172a.png\t6306dd3a8e.png\t9472731dfd.png\tc61e0f2507.png\tfa92bed2b1.png\n2fb152d905.png\t6313ab72dc.png\t948d73d7c2.png\tc6612326fa.png\tfa9ad1caeb.png\n2fb6791298.png\t63193879b4.png\t9492db510d.png\tc6648ea2ca.png\tfa9b3f2aa9.png\n2fbde8630d.png\t631dacfc0f.png\t9494906f54.png\tc66a8f70e4.png\tfa9da6a455.png\n2fc6fb555f.png\t633c7d5c80.png\t94b3a9ed9c.png\tc66a9ea31d.png\tfad11bf2fa.png\n2fd6d25adb.png\t6348c487a4.png\t94b4f44698.png\tc673bebe29.png\tfadd6aa446.png\n2fe0292eea.png\t63532a858b.png\t94c2d96e56.png\tc67832ba86.png\tfadd701a33.png\
n2fe186a2c6.png\t638d8572e9.png\t94c8aad517.png\tc68ee0308f.png\tfb14beb69f.png\n2fe4adfd48.png\t638e7a825e.png\t94c94ee7b6.png\tc69962cd9d.png\tfb17f161d1.png\n2ff32a1188.png\t6397715bb4.png\t94e7558e32.png\tc70153909b.png\tfb1b82dde6.png\n2ffe02e882.png\t63bed65d65.png\t9511856b03.png\tc701db91e2.png\tfb1e8578cf.png\n2ffea0c397.png\t63d3cf63be.png\t9522339a9a.png\tc70619fb09.png\tfb3392fee0.png\n3004f41177.png\t63e32c0cfb.png\t9539d280a6.png\tc72b546073.png\tfb44090bc7.png\n30082e87d9.png\t63feb8de2b.png\t954502fb12.png\tc72c5ee731.png\tfb47e8e74e.png\n302ea1ac81.png\t640ceb328a.png\t956c91fe94.png\tc7408f1d38.png\tfb58510ec6.png\n30307d5103.png\t6419fd454b.png\t958a3dd464.png\tc749d05181.png\tfb5adc8df0.png\n30491b379f.png\t641d0227e7.png\t95a60e2f42.png\tc74ef130f0.png\tfb81781b44.png\n306e6e7743.png\t641ef8ba11.png\t95a87f8ad9.png\tc756ff4e1f.png\tfb903c3b68.png\n3072ae3119.png\t642c62a158.png\t95af8ca744.png\tc78996aa02.png\tfb9b58527c.png\n30787ad7df.png\t6457057cec.png\t95bdc8b506.png\tc78c89577c.png\tfba32825cd.png\n308a9348e0.png\t645abba5ec.png\t95d3841a53.png\tc79d52ffc3.png\tfba91bfec8.png\n30badffdd5.png\t6460ce2df7.png\t95f6e2b2d1.png\tc7a92a6707.png\tfbb8eba26d.png\n30bb4cd749.png\t6493c2cbab.png\t96049af037.png\tc7ab9848b9.png\tfbc98aedc0.png\n30cb1002fb.png\t64baad8029.png\t960f92cf81.png\tc7c3a8a806.png\tfbc9a1149e.png\n30cc3d3982.png\t64bfdd0b9a.png\t961b1310e9.png\tc7f54785b8.png\tfbcd92f03f.png\n30d4fb116b.png\t64e050f295.png\t96216dae3b.png\tc80f09bee6.png\tfbd61072f4.png\n30e4d22141.png\t64e79513a3.png\t962348be7a.png\tc824b8afaa.png\tfbf380831b.png\n30e5cfec08.png\t64f41e36f2.png\t963189b39c.png\tc82ae5f941.png\tfbf5c91d9e.png\n30ff4f38cf.png\t651b2c01bc.png\t963210b127.png\tc82dabf9b1.png\tfbf64b5ea0.png\n3116a9d54c.png\t653c7e9f6c.png\t964030c032.png\tc82f7676ac.png\tfbf73cb975.png\n31250cef62.png\t65408225fc.png\t964be849b0.png\tc83d9529bd.png\tfbfa863bd4.png\n312c4ce754.png\t6545c6235b.png\t96523f824a.png\tc8404c2d4f.png\tfc13dabcbb.png\n314db606fd.png\t6558dcde17.png\t9667824566.png\tc89eda8701.png\tfc1ca2fbc8.png\n316074fdc4.png\t6576184a08.png\t966fb0cdbb.png\tc8adc706f5.png\tfc250f574c.png\n31683f8a18.png\t657f2c34f7.png\t9678fb935e.png\tc8bd99cbb0.png\tfc4c223773.png\n3187bf5c2d.png\t659193a103.png\t967d30a3fd.png\tc8ce151a18.png\tfc6e9d0385.png\n31a4341eb6.png\t65a55b69b7.png\t96857148ca.png\tc8d70d9c3d.png\tfc87224914.png\n31aaa04c63.png\t65aadb6f53.png\t9693c84080.png\tc901f669f8.png\tfc8f0668fb.png\n31b003483c.png\t65cb9cbc05.png\t9694a662dc.png\tc9142b341c.png\tfc944f4e1f.png\n31b1314c62.png\t65d05b81c6.png\t96998d1ad9.png\tc920246511.png\tfcb49356a5.png\n31b7874f49.png\t65dc362682.png\t96aeb8d018.png\tc92c37ecd7.png\tfcbc5cef9d.png\n31ca053bec.png\t65e01bd65f.png\t96cc5caec0.png\tc93d0ecd1d.png\tfcce81906d.png\n31d35c7c1d.png\t65e2650a80.png\t96d1d6138a.png\tc95d3f1510.png\tfccf774196.png\n32062fff08.png\t65eca0ad70.png\t96d917e53d.png\tc9695bff51.png\tfcdf2756aa.png\n3218afdc84.png\t65ef110246.png\t96f26f6397.png\tc98504a8ab.png\tfcf5137a10.png\n321f21e02f.png\t65fe1a0efc.png\t96f8fd0d4b.png\tc98dfd50ba.png\tfcf61d1883.png\n32273eef45.png\t66073ddce1.png\t9704b1ba61.png\tc9a1fd48f6.png\tfd0d237b01.png\n3228f203a3.png\t6636021b35.png\t9718102ef6.png\tc9a2990f76.png\tfd1be18f7d.png\n323882f716.png\t664e3eb07f.png\t971c96134a.png\tc9c0097eff.png\tfd257af5d2.png\n324eb2ec65.png\t665e84dc42.png\t9722dedee7.png\tc9da9c3d42.png\tfd423b9cbf.png\n32599d97bb.png\t6682af0131.png\t9736389261.png\tc9f7f4020f.png\tfd4741392b.png\n325cde4fa9.png\t66891ec970.png\t97
37133eff.png\tc9ff0d072b.png\tfd51e63c83.png\n3276e900a7.png\t668b41c03e.png\t973a4f6dad.png\tc9ffc53f25.png\tfd5c3c20d6.png\n3281bb7282.png\t6698a570b8.png\t9747413253.png\tca0d3ea2f5.png\tfd63516ff4.png\n32887c2719.png\t669cb42011.png\t974f5b6876.png\tca2e414e83.png\tfd949b5ab7.png\n32afbf53f6.png\t66b4280f4e.png\t97515a958d.png\tca393f0eaf.png\tfd998a993b.png\n32f1039c61.png\t66c603a013.png\t976eb05465.png\tca511bc319.png\tfd9db12ca0.png\n32f4fd66ae.png\t66c61616cc.png\t976fcded4d.png\tca58996278.png\tfdabdc4399.png\n33041df256.png\t66cd32da94.png\t977f897477.png\tca6bdbc405.png\tfdb7d132be.png\n33159fbb9b.png\t66cd349922.png\t9790f7880b.png\tca78405cea.png\tfdc06fea10.png\n331ec135dd.png\t66ced55b7d.png\t97970b5e2c.png\tca82310ee1.png\tfdc1ac926a.png\n3320eb35e4.png\t66cf41c563.png\t97a10a3ba0.png\tca84b9e8d4.png\tfdc28cafdc.png\n335474f8ec.png\t66cfa202e5.png\t97a653e83c.png\tcaa039b231.png\tfdd73141d6.png\n3358462c1b.png\t66e922c8d6.png\t97adf32d4d.png\tcab2f6614c.png\tfde2b51fb2.png\n336e1995cd.png\t66eeb09788.png\t97b058f254.png\tcabe5c94b2.png\tfdff017e77.png\n3371d166f0.png\t66fed16154.png\t97bd8b22b1.png\tcaccd6708f.png\tfe0ff502fb.png\n337b30caff.png\t670d9655d5.png\t97e2c6b5c5.png\tcadb46b378.png\tfe11d93a24.png\n33820f50d8.png\t6738d474a3.png\t97e78a5448.png\tcadfbf8011.png\tfe37eddf8e.png\n33887a0ae7.png\t6752a6815f.png\t980570ed7e.png\tcae1a9f23f.png\tfe4f6ff284.png\n3391f0befc.png\t675cab373f.png\t9819277faa.png\tcae4ec68c0.png\tfe6e74532c.png\n33b50159a7.png\t675fcfe29a.png\t981f73e811.png\tcafb1294ae.png\tfe801cdc47.png\n33ba7dd9a1.png\t6760543d8f.png\t9842f69f8d.png\tcb0d3e8399.png\tfe85864f8f.png\n33dfce3a76.png\t676d100540.png\t9874f75819.png\tcb2698f715.png\tfe9e558605.png\n33dfe8b243.png\t67a8e23baa.png\t98870eab9e.png\tcb2acfd639.png\tfea1ff8505.png\n33f32ff406.png\t67bc6ac1f4.png\t9895f188e8.png\tcb2c20898a.png\tfeadf0e85d.png\n3404341cc6.png\t67d3486ac8.png\t9898ce93d6.png\tcb36193e2f.png\tfeb90d8376.png\n3406ef6042.png\t67db09fda5.png\t989ac5a9de.png\tcb4f7abe67.png\tfebd1d2a67.png\n340c808130.png\t67dfa4bcf5.png\t98b1f3fb9c.png\tcb677fab7a.png\tfee35c9fa9.png\n340fe814a3.png\t680ce5b666.png\t98c6496076.png\tcb68c2431a.png\tfefe01ac32.png\n341f6694f1.png\t681ee85a7b.png\t98f91cd1b8.png\tcb6c6c15ce.png\tff038e87b1.png\n34205cc12b.png\t684555a30a.png\t9917602913.png\tcb7cd324a5.png\tff1e8ed948.png\n343b0359b5.png\t687dd94fee.png\t992572448c.png\tcb974e60ca.png\tff238df1c4.png\n346358e652.png\t68a2df3195.png\t99405de5c5.png\tcb9d520cf9.png\tff24769ea3.png\n3480629331.png\t68b593f903.png\t994af8ec2a.png\tcba568b15f.png\tff2c353279.png\n34831dcd0c.png\t68b6abcefc.png\t99817a653e.png\tcbad345488.png\tff532274cd.png\n34b32c5b5d.png\t68bad6a787.png\t99909324ed.png\tcbbf379ef8.png\tff89e5e3cd.png\n34bd0facd2.png\t68d00224e0.png\t999ae650f6.png\tcbd42ee63f.png\tff8bf1417c.png\n34bfcfa11b.png\t68d9f77646.png\t999cb0ea3b.png\tcbe33c57c4.png\tff9b7d0d2e.png\n34d2af0f86.png\t69013a70ab.png\t99aefa9443.png\tcbeab7807d.png\tff9d2e9ba7.png\n34e51dba6a.png\t690281a93d.png\t99b602dde4.png\tcbed0283ff.png\tffb2ad3f94.png\n34f61e9a54.png\t690514a3e4.png\t99bc5efa33.png\tcc15d94784.png\tffce5bbb9a.png\n3501da04b2.png\t69098163e1.png\t99bf605daa.png\tcc216a861c.png\tffe228a7e3.png\n3507f73405.png\t690b1ce5f7.png\t99c726d528.png\tcc2da77c17.png\tfff2abcaf9.png\n351ea99ec3.png\t691dc0bc42.png\t99d3fbf9f1.png\tcc4344175e.png\tfff4eb4941.png\n3525fcea1e.png\t691fec8ce2.png\t99d7f0c538.png\tcc6abe337d.png\tfff6522bd1.png\n3531c9193f.png\t6940237693.png\t99ee31b5bc.png\tcc70a60681.png\tfff98
7cdb3.png\n"
],
[
"!ls ./images",
"0005bb9630.png\t3474172578.png\t6723526c1a.png\t990468763b.png\tcc2c7789b8.png\n000a68e46c.png\t3474bba21d.png\t6724070ec9.png\t990a672905.png\tcc2da77c17.png\n000c8dfb2a.png\t347686c1b9.png\t672b6c08a1.png\t990d808b85.png\tcc2fc654f5.png\n000d0a5f6c.png\t3476cc2c7c.png\t672bb9b070.png\t9911b2b904.png\tcc30dc52ee.png\n000e218f21.png\t347d3a7801.png\t6730d663de.png\t9913721ee5.png\tcc31e0720c.png\n001ef8fc87.png\t3480629331.png\t67320e41a5.png\t9917602913.png\tcc32da1b7d.png\n002124aa19.png\t348232af8d.png\t67358e3002.png\t991d9bf376.png\tcc383f3d4d.png\n002af5d1e8.png\t34831dcd0c.png\t6736047f57.png\t991ec143b1.png\tcc3cd53279.png\n00323f1910.png\t3484c36b51.png\t67377e2624.png\t991f1104ec.png\tcc3fa7233a.png\n00329dc15c.png\t34882b481e.png\t6738994f95.png\t991f239daf.png\tcc4153a0ba.png\n0035c56490.png\t348b3df20a.png\t6738d474a3.png\t9922cd597b.png\tcc4344175e.png\n003c477d7c.png\t348f475161.png\t673d620dca.png\t9923de1c7f.png\tcc435c1f0b.png\n0041cb8c49.png\t349516c6c0.png\t673ec36187.png\t992572448c.png\tcc45eeeafb.png\n0043a01a19.png\t3496c1d5b3.png\t674653f58a.png\t9926e8e837.png\tcc46103f98.png\n00441f1cf2.png\t34990ac2e4.png\t6747f7bb35.png\t992bf0d046.png\tcc4a7f0884.png\n0050766ae2.png\t349f4316aa.png\t67500eb66d.png\t992d1a1434.png\tcc4cb99e50.png\n00565e793d.png\t34a2e84c01.png\t6751604e2d.png\t9930673328.png\tcc4d4d7351.png\n005855cd72.png\t34a38ef34a.png\t6752a6815f.png\t9931dd3141.png\tcc4dd6c823.png\n005b02dd7c.png\t34a3c779b2.png\t6754a6fdac.png\t9935d5f9e9.png\tcc52462e9d.png\n005b452274.png\t34a7b93098.png\t67550b39f3.png\t9938699a58.png\tcc5b03f643.png\n0061281eea.png\t34a94a77bd.png\t675cab373f.png\t9938ffcf0f.png\tcc636fcb13.png\n0066451ce3.png\t34aa18cb4f.png\t675d0ed1c7.png\t993950daeb.png\tcc63e9b444.png\n00690a4185.png\t34ab125f26.png\t675fcfe29a.png\t993a7eca7a.png\tcc6723c303.png\n006d9df08a.png\t34acb105f7.png\t6760543d8f.png\t993edf78fa.png\tcc6a06550b.png\n00707c7864.png\t34adc69da6.png\t6761377ad9.png\t99405de5c5.png\tcc6abe337d.png\n007176e42a.png\t34aec621f6.png\t67686e8e07.png\t994217fb87.png\tcc6dc526c6.png\n007c7c7dcf.png\t34b01f1639.png\t6768e73c8b.png\t9946b858fc.png\tcc6f3913b0.png\n00801127b0.png\t34b32c5b5d.png\t6768f2c7cd.png\t994af8ec2a.png\tcc6f8b821b.png\n008315cbd8.png\t34b4732d20.png\t676b7b2d48.png\t9950526250.png\tcc70a60681.png\n008a50a2ec.png\t34b4cd1648.png\t676d100540.png\t9951ffaa5a.png\tcc718eada4.png\n008f8c8aaa.png\t34b54fdbba.png\t676d53ad70.png\t9952099fd0.png\tcc71a2479b.png\n0091fb2dab.png\t34b8d44aef.png\t676ea2ece0.png\t99525132fd.png\tcc76384dfc.png\n0092c53387.png\t34bcf08008.png\t677149a1a7.png\t995c49297f.png\tcc770c1a5e.png\n0094822667.png\t34bcfef7e6.png\t6774a12a64.png\t9969eb6d37.png\tcc827009db.png\n00950d1627.png\t34bd0facd2.png\t6778b7675a.png\t996c4dc2a9.png\tcc859576a3.png\n009bfb3a78.png\t34bf180903.png\t677cf459d1.png\t9970a663e7.png\tcc876896b5.png\n009d3365bc.png\t34bfcfa11b.png\t677d02e7f1.png\t99817a653e.png\tcc8e029e21.png\n009e9c5f22.png\t34c0247f34.png\t677dbfd1c1.png\t9981ad3aaf.png\tcc90212371.png\n00a052d822.png\t34c4081c9b.png\t677e132321.png\t998e781159.png\tcc924f5758.png\n00a3af90ab.png\t34c5b8d814.png\t677f4cbc88.png\t99909324ed.png\tcc97e87765.png\n00a5318192.png\t34cfc626af.png\t67804854cd.png\t9990d72d8d.png\tcc9a95b13f.png\n00a6bfc7a7.png\t34d2af0f86.png\t6781b6e297.png\t9991480992.png\tcc9fae6415.png\n00a738b887.png\t34d49de018.png\t67840ed90a.png\t9991d31572.png\tcca0ef0b2b.png\n00a9b78f49.png\t34d78a619f.png\t678716dff8.png\t9993baa4e5.png\tcca35c91c7.png\n00aa6a3958.png\t34d898e6d9.png\t67
8be9f387.png\t999ac34f4c.png\tcca5d24d54.png\n00aaa5b7f4.png\t34dba0581e.png\t678e0aa3ad.png\t999ae650f6.png\tcca88b97da.png\n00b3ad08c8.png\t34dbf84916.png\t67913364b6.png\t999cb0ea3b.png\tcca8cdfe66.png\n00b68675b4.png\t34dcdfb4c8.png\t679833ddad.png\t999d21abf1.png\tccace37b00.png\n00b6d3a31f.png\t34e51dba6a.png\t6798d3af4a.png\t99a661efaf.png\tccae28c451.png\n00bbed5966.png\t34e956d7e8.png\t6798e841e7.png\t99ac198170.png\tccc41663c6.png\n00bf25f9f9.png\t34ee4db3e9.png\t67998dfb60.png\t99aefa9443.png\tccc645a996.png\n00c06147b5.png\t34ef14a0f7.png\t679a0030aa.png\t99b014b973.png\tcccb72adc0.png\n00c153007b.png\t34eff695a2.png\t679b9fbf54.png\t99b09414cf.png\tccceb23dda.png\n00c473f654.png\t34f0687bbc.png\t679ee788d7.png\t99b27c5760.png\tccd02673de.png\n00c6225718.png\t34f399a2e5.png\t67a413f082.png\t99b4599831.png\tccd33818b2.png\n00c6536758.png\t34f546117e.png\t67a8e23baa.png\t99b602dde4.png\tccd4db34f0.png\n00cbbf2293.png\t34f61e9a54.png\t67ae021997.png\t99b6682fcc.png\tccd4f37dae.png\n00cda0328c.png\t34f944baa4.png\t67b499db6a.png\t99b6ecc1c8.png\tccd5f44474.png\n00cde1fc96.png\t34fc4627ff.png\t67b4e6e068.png\t99bc5efa33.png\tccd6841e90.png\n00cdefa5d6.png\t34fcc75290.png\t67b6a5a9de.png\t99bf605daa.png\tccd8906fbf.png\n00d26ea627.png\t34fee0c28d.png\t67ba09a89e.png\t99c1cc31ca.png\tccd8af9426.png\n00d51ae232.png\t34ffa4722d.png\t67ba6c0b0f.png\t99c504fd40.png\tccdd1c542f.png\n00e68bfb1b.png\t3501da04b2.png\t67bc6ac1f4.png\t99c726d528.png\tccdf60d661.png\n00e6e260a2.png\t35053e8edb.png\t67c6eccc7d.png\t99cf856ff7.png\tcce73f7dd4.png\n00e900afb7.png\t3507dd1cf1.png\t67cb2d8463.png\t99cfeaba69.png\tccec631d75.png\n00e9cff0f2.png\t3507f73405.png\t67ccf95c75.png\t99d1164d5f.png\tcced2adf43.png\n00eea9cc56.png\t351168f0b3.png\t67ce646767.png\t99d18ed201.png\tccf829551b.png\n00efd96e2c.png\t35155f4bb1.png\t67d21b9e92.png\t99d3fbf9f1.png\tccfb270edb.png\n00f053ec96.png\t351d8ccfd9.png\t67d3486ac8.png\t99d7f0c538.png\tccfb9693c3.png\n00f12566b9.png\t351ea99ec3.png\t67d600cccc.png\t99de133bc7.png\tccfd0dd652.png\n00f1af5e69.png\t3521b38d49.png\t67da0e77b4.png\t99e00a4567.png\tcd01f01ac2.png\n00f342f248.png\t3525fcea1e.png\t67db09fda5.png\t99e331e803.png\tcd0352c2f3.png\n00f6108ccb.png\t352749994a.png\t67dcb0ce6e.png\t99ea7f1c07.png\tcd09e4859e.png\n010082e36a.png\t35277a6516.png\t67dfa4bcf5.png\t99ee31b5bc.png\tcd0ce1b45b.png\n0100bf4bf9.png\t3527e03e5d.png\t67e00c705e.png\t99f14c4b3c.png\tcd12ccf73c.png\n010452a624.png\t352e661350.png\t67e0d71e9f.png\t99f2dd50d8.png\tcd134bd0dd.png\n01066c1de0.png\t3531c9193f.png\t67e2ba4499.png\t99f40c4b94.png\tcd179a8e54.png\n0108518d1e.png\t35341c7a45.png\t67e4fdc167.png\t99f52af0ea.png\tcd1b2e49f4.png\n010c27432a.png\t3535fe3ff5.png\t67e59dec89.png\t99f5360246.png\tcd2046479c.png\n010ee525b6.png\t3536b5d252.png\t67e8e9f5a2.png\t99f86c993d.png\tcd21f9a20e.png\n0113339745.png\t35371045fb.png\t67efda4b14.png\t99fb0ccafa.png\tcd22ea7236.png\n0115319420.png\t3538c8a7db.png\t67efef3228.png\t99fb331dc9.png\tcd249859fa.png\n0115703825.png\t353bca9171.png\t67f35d28b4.png\t99fcb67e17.png\tcd2567246d.png\n01166ff99c.png\t353e010b7b.png\t67fa36c65c.png\t99fddf686c.png\tcd260624b5.png\n0123eeda35.png\t353e34b6a9.png\t67fce30109.png\t9a0aa00799.png\tcd292b7c8d.png\n01258c92e7.png\t3540080bfe.png\t67fdaf03af.png\t9a0fef6e76.png\tcd34f93d5a.png\n0129200dc4.png\t3542a81a70.png\t67fefe864d.png\t9a14689f82.png\tcd3841173f.png\n012b7b5cd0.png\t35452780b5.png\t6805e1ed9f.png\t9a16cf829d.png\tcd3b363bfa.png\n012c3ed07e.png\t354780e9fb.png\t680ce5b666.png\t9a171b49a9.png\tcd3b8
db4a0.png\n012daf4256.png\t3548f9bbca.png\t680ec6e447.png\t9a1a93a217.png\tcd3dd039a7.png\n012ee0802b.png\t35493007b0.png\t6813cd8a9a.png\t9a1d9405c1.png\tcd42b00c6c.png\n01323211a0.png\t35499d9dd6.png\t681612cea3.png\t9a1fbef3e3.png\tcd433757bb.png\n013259c337.png\t354c3a5d44.png\t6817506a91.png\t9a21f73ad4.png\tcd48ef3def.png\n013974bce2.png\t354d881033.png\t6817f42eba.png\t9a28814a64.png\tcd4db8b133.png\n013b2d2dd1.png\t355952537c.png\t68186883f1.png\t9a30620053.png\tcd5168b2db.png\n013e4e716a.png\t355a7edcb8.png\t681910e8fb.png\t9a32f8d94c.png\tcd52989e69.png\n013f2c7bef.png\t355ce54530.png\t681ee85a7b.png\t9a346d8216.png\tcd570d2764.png\n014681457b.png\t355de57e2c.png\t68206e6536.png\t9a354f8017.png\tcd573faeae.png\n01495a638f.png\t356334a716.png\t6821a0013a.png\t9a39b3caad.png\tcd5c9532e4.png\n014a31d53f.png\t3567a5235d.png\t68220c08c2.png\t9a3a802f4c.png\tcd5cd074f1.png\n014cc353bd.png\t3568a238ab.png\t68241a010a.png\t9a3abd0ba6.png\tcd5d4d603c.png\n014df49b3c.png\t356a6e85dd.png\t6832de9b9c.png\t9a4222ba49.png\tcd5ebe51b6.png\n0155f4541f.png\t356abb91d5.png\t68350ac997.png\t9a478d8cf2.png\tcd6140444f.png\n01572cfbc2.png\t356d5e4e49.png\t68383a41a5.png\t9a48617407.png\tcd6320c6eb.png\n0167d67b9e.png\t356dd015ba.png\t683fb67c92.png\t9a4864ff06.png\tcd68b00149.png\n01694e04e2.png\t356f233d08.png\t6840d101a6.png\t9a4b15919d.png\tcd69ee839f.png\n016a8b9b9a.png\t356fa48ac4.png\t684187d7b8.png\t9a4c770131.png\tcd6ed7690d.png\n016ec66cd4.png\t35714fa202.png\t6842e3fc27.png\t9a4d1749d4.png\tcd7b5459c9.png\n016fc8031c.png\t35741a470b.png\t6843ce1763.png\t9a4d43460c.png\tcd7e41ca62.png\n01721bbf24.png\t3575b15285.png\t684555a30a.png\t9a4db0d7aa.png\tcd81de3128.png\n0172b71f58.png\t3577258d6b.png\t6849022dab.png\t9a4f625408.png\tcd85f4b39b.png\n0175551830.png\t357ce270a6.png\t684a14ff53.png\t9a51433e42.png\tcd8765b15a.png\n0176d2ace5.png\t357e13bb04.png\t684d0d84d4.png\t9a59a4f4ae.png\tcd8c487734.png\n017d84a5a6.png\t35832ed326.png\t68519b8d65.png\t9a5cc7e8dc.png\tcd8ed0c05d.png\n0189ff5265.png\t35856d2bdf.png\t6852ac83d6.png\t9a5d8e5f37.png\tcd90c05864.png\n01914d82ca.png\t35883e7b28.png\t6856dc43a1.png\t9a5e7fb405.png\tcd93859edc.png\n0192693ea2.png\t3589572823.png\t686113cc67.png\t9a5f20c2b6.png\tcd9d726e26.png\n01926f809a.png\t358b9acfb5.png\t686437c2cb.png\t9a613530e7.png\tcda102a81b.png\n019342f169.png\t358c91b7ce.png\t68666e7f45.png\t9a616b906b.png\tcda1d37159.png\n019766b43c.png\t359128303c.png\t6868649862.png\t9a672bfb51.png\tcda5087210.png\n019afb4b4e.png\t3598c246d7.png\t6868facb17.png\t9a6aec2179.png\tcdaa28bbd0.png\n01a40c3405.png\t359e6a043b.png\t6869e641ab.png\t9a6c280d8d.png\tcdaa2d3f02.png\n01a48b1544.png\t359fa522b7.png\t686e1f28fb.png\t9a6cf411ff.png\tcdaa35be38.png\n01a70b45ed.png\t35a48ea5cf.png\t687093f687.png\t9a704ae142.png\tcdab88a890.png\n01a9e8454c.png\t35aa9f76ff.png\t68709951de.png\t9a7167039d.png\tcdac58c984.png\n01aa774656.png\t35ae3107a6.png\t68715e75f4.png\t9a71e8563b.png\tcdb1d2185f.png\n01abaa26e8.png\t35b34a9da1.png\t68724d8662.png\t9a74757d79.png\tcdb440a4e1.png\n01af0a6ee9.png\t35b544e74b.png\t687a034e8b.png\t9a76952b63.png\tcdb488ab27.png\n01b134167c.png\t35b730549d.png\t687dd94fee.png\t9a79d93047.png\tcdb4e3510f.png\n01b47afbf4.png\t35ba6f2faf.png\t688b56cd43.png\t9a7fc35eab.png\tcdb6298286.png\n01b5362cce.png\t35bbf1302b.png\t688e20fd00.png\t9a804d3e93.png\tcdbc142c7e.png\n01c033e116.png\t35c527afd0.png\t68904129ca.png\t9a808f321d.png\tcdbec42931.png\n01c2045d03.png\t35c5c1e757.png\t68915bcc5a.png\t9a81355819.png\tcdc7d77e77.png\n01c5af2643.png\t35cb0dbb
51.png\t689a03ea18.png\t9a836057bc.png\tcdca082980.png\n01d6be9b57.png\t35d158258c.png\t689de6e32e.png\t9a85353794.png\tcdd5f9eb0c.png\n01d782ab9a.png\t35d30e4e6d.png\t689edd4185.png\t9a91290888.png\tcdd7ef59a9.png\n01dd84b7dd.png\t35d3bc7c5e.png\t68a1443ac6.png\t9a954c3db7.png\tcdda3b1f5e.png\n01e136f700.png\t35d93d979c.png\t68a2df3195.png\t9a95d359d9.png\tcddb50fb4b.png\n01ea536900.png\t35da3d536c.png\t68a469ef7e.png\t9a99e91fa3.png\tcddd1a5805.png\n01eab004dd.png\t35da47b3e6.png\t68a5d5d7ce.png\t9a9afab314.png\tcde14bb8d2.png\n01ee06523b.png\t35e7acda04.png\t68a8521c79.png\t9a9fa8ad20.png\tcde189305a.png\n01ee4f83f9.png\t35e9810619.png\t68aa2f3bf4.png\t9a9ff1beca.png\tcde2cf1ae9.png\n01f2262bf6.png\t35f1574879.png\t68af6c5f57.png\t9aa58dfee8.png\tcde797dd61.png\n01f2b031ac.png\t35f233e10a.png\t68b593f903.png\t9aa65d393a.png\tcde7db0718.png\n01f32c502c.png\t35f3cb48d6.png\t68b6abcefc.png\t9aa679bd58.png\tcdf00572fe.png\n01f43eb833.png\t35f8e59694.png\t68b826491f.png\t9aa744c0d6.png\tcdf9422054.png\n01f537721e.png\t35fd991255.png\t68b892b80c.png\t9aa82de10d.png\tcdff8ed1ca.png\n01f5dedbf0.png\t35ff11761b.png\t68b94c24d5.png\t9aad618687.png\tce00ad751e.png\n01f91c6197.png\t35ff3457e7.png\t68bad6a787.png\t9aae327222.png\tce04be18c5.png\n01fc11f637.png\t3603aae294.png\t68c0ec9724.png\t9ab1087c65.png\tce062742bf.png\n01fd9581f6.png\t3606703e6b.png\t68c1093ddc.png\t9ab3a897e8.png\tce06c157c1.png\n020376e68e.png\t3606ccb4e2.png\t68c205d1e4.png\t9ab6cfe256.png\tce0e9def63.png\n0203970177.png\t36118090a1.png\t68c24f3d46.png\t9abb77605e.png\tce13104347.png\n020678ec1b.png\t3614095a0e.png\t68ca6d4b5d.png\t9abcd70258.png\tce146a156c.png\n0207954fdc.png\t361d5e6834.png\t68cc289774.png\t9ac1664234.png\tce14d335bf.png\n0208be7868.png\t361efa9093.png\t68cd094920.png\t9ac1f214d8.png\tce15cc1d1a.png\n020a0335b5.png\t3621dd35dc.png\t68cd77317d.png\t9ac4e6c7fc.png\tce16b0d0bc.png\n020b586ddf.png\t3627e1a769.png\t68ce559aca.png\t9ac5fd325e.png\tce1ed41e9a.png\n020e6211ad.png\t362cec7849.png\t68d00224e0.png\t9accde1974.png\tce1f9711bc.png\n02117a400e.png\t362d82ca7b.png\t68d047de80.png\t9acd310f25.png\tce23065eac.png\n021494f3aa.png\t362de7e871.png\t68d1cc2e33.png\t9ad15b2d25.png\tce269253a0.png\n021594d8ee.png\t3634f29aac.png\t68d3e8314d.png\t9ad32f3c66.png\tce2c3ed5c1.png\n02206b234e.png\t363a171d90.png\t68d5a944e7.png\t9ad3911361.png\tce2f737ee1.png\n02208cc8e1.png\t363a8a9701.png\t68d70623a8.png\t9ad6e2b35a.png\tce31ab9b97.png\n0224643e71.png\t363cf09985.png\t68d8ee613f.png\t9ad7e627fa.png\tce32dfe7c3.png\n0227a27435.png\t3648353086.png\t68d9f77646.png\t9ade7d33f2.png\tce36e2a0ae.png\n0227b5b765.png\t36488934b7.png\t68da71599b.png\t9ae45b19d3.png\tce3d0fa3fa.png\n0228455db0.png\t364b6ecf01.png\t68dd1c44ae.png\t9ae5ecca79.png\tce3e6bba2e.png\n022b1b01be.png\t364c09b563.png\t68de95fb39.png\t9ae739a3ad.png\tce4047f646.png\n0230002b65.png\t364e353a82.png\t68deafa4a7.png\t9ae81a029e.png\tce4424106b.png\n023350c4e2.png\t3653f28f67.png\t68dfacddd1.png\t9ae931f21a.png\tce4593cd3e.png\n02337f94a9.png\t3659fb63e1.png\t68e24bffc0.png\t9aea81d01d.png\tce47492e46.png\n023a4a8fc1.png\t365abbdf77.png\t68e3ae2fd2.png\t9aebdf433e.png\tce48548fa6.png\n023b29090a.png\t365b34260e.png\t68eaa96a49.png\t9af260ba1f.png\tce4a1c6a57.png\n023d486fba.png\t365bdf0ae1.png\t68efd1662e.png\t9af3b0c9ad.png\tce4ceaa5ed.png\n024332e85b.png\t365c346528.png\t68f0a9f47d.png\t9afa3dea27.png\tce53de2451.png\n0243477802.png\t365dec6186.png\t68ff47165b.png\t9afd0533dd.png\tce54f2defa.png\n02446ab7b5.png\t365f78e91a.png\t69013a70ab.png\t9afd54b9a1.
png\tce5710c27b.png\n02446da877.png\t3660856523.png\t690281a93d.png\t9b0239d22c.png\tce573caad1.png\n0245364ec1.png\t3662fffe30.png\t6902873ebf.png\t9b024c5953.png\tce59600cfa.png\n0246257f2b.png\t36667d3e96.png\t690514a3e4.png\t9b095d1144.png\tce5b9c1de1.png\n0249c49180.png\t3666f04356.png\t6908a36e6a.png\t9b09626522.png\tce5fd7f840.png\n024a6d0c47.png\t3666fea8c3.png\t69098163e1.png\t9b10a74575.png\tce61662084.png\n02504aecbe.png\t366dbb8cb4.png\t6909be8eb1.png\t9b11d67e37.png\tce6314ca42.png\n02564072c0.png\t367099de12.png\t690ad6b01a.png\t9b1403e161.png\tce66fda32b.png\n0259b01cfe.png\t367b744d2d.png\t690b1ce5f7.png\t9b1b4b7481.png\tce6760a4f3.png\n025f694e91.png\t367c6fb412.png\t690b29e4f5.png\t9b1efe2798.png\tce67c2e8dc.png\n02607a1d6e.png\t367eac65c5.png\t690fab72e2.png\t9b1f59b89c.png\tce69bd4ece.png\n02658f5ae3.png\t3680dada70.png\t6919c6895e.png\t9b24c1eb70.png\tce6a32fc51.png\n02677351df.png\t36841d78ba.png\t691a004407.png\t9b29c91d73.png\tce6b910fe3.png\n0267db6d11.png\t3685aa0e78.png\t691dc0bc42.png\t9b29ca561d.png\tce6e39406d.png\n026b509cd4.png\t368767c9fd.png\t691fec8ce2.png\t9b2ad9f14a.png\tce7515118f.png\n026f97575b.png\t3687839ecc.png\t692207d70f.png\t9b303e8370.png\tce78d7d613.png\n02782eb6f2.png\t36886c22c7.png\t69233da61a.png\t9b3112b0b6.png\tce7addb51e.png\n0279a4db6e.png\t368b385f61.png\t6923e04956.png\t9b31db4e58.png\tce7b81ea3d.png\n0279cf7419.png\t368f635c4c.png\t6924dd75cb.png\t9b3614da4a.png\tce7bb8f52b.png\n027f49e160.png\t368fc1bf25.png\t69253f7efc.png\t9b3d407035.png\tce7de72524.png\n0280db420c.png\t368fff8ff3.png\t6926d43d4a.png\t9b3d4b0379.png\tce8add65b4.png\n0280deb8ae.png\t36951bc157.png\t692f785564.png\t9b40ed4c71.png\tce900e9c54.png\n0283add18e.png\t3695558548.png\t69319596b6.png\t9b42e0bfb8.png\tce96a07785.png\n02910e3652.png\t369f8a6f01.png\t69357d53fb.png\t9b450671aa.png\tce96b58bee.png\n0291a0c49b.png\t36a2715c09.png\t693689526d.png\t9b496d8233.png\tce98269a83.png\n0291e643ee.png\t36a6901cae.png\t6937514d1c.png\t9b4aa94aed.png\tce9b8cc11e.png\n0294ae08af.png\t36a7c1effe.png\t693ed8cd99.png\t9b4ae543b5.png\tce9c3510a8.png\n029612d30a.png\t36aa21019e.png\t6940237693.png\t9b525b08e7.png\tcea291eedc.png\n02999b160b.png\t36aaae86d3.png\t6945362fd4.png\t9b5538c283.png\tceaa808817.png\n029c321179.png\t36aac13e58.png\t6947dbc4f4.png\t9b55f81daa.png\tceab85e43a.png\n029dd6f51f.png\t36ad52a2e8.png\t694cf352f7.png\t9b585a2ea0.png\tceadfc98bd.png\n02a3ec26b3.png\t36b0906d62.png\t694e451a45.png\t9b5b0e60a1.png\tceb58f3460.png\n02a4e3a82e.png\t36b2e8672d.png\t6952ef1a14.png\t9b5bed5c6f.png\tceb69201df.png\n02a4f7c7de.png\t36b2ed5c09.png\t6955cae168.png\t9b61c3ff73.png\tceba75e611.png\n02a73b97e5.png\t36b768bfa7.png\t6957579961.png\t9b622b10f4.png\tcebc1b3f7d.png\n02ad64d551.png\t36b83e5530.png\t6959af3cc3.png\t9b656c339a.png\tcebd6f3ebb.png\n02adf272e9.png\t36bc8044f8.png\t6959f9efff.png\t9b66bcf23c.png\tcec3aace1e.png\n02b0a55486.png\t36bd172c8d.png\t695d72762c.png\t9b67bcf412.png\tceca04aa2a.png\n02b224b871.png\t36c0313b9e.png\t695da8b59b.png\t9b6a41e977.png\tcecb9a3bd3.png\n02b2e006c8.png\t36c0d6063f.png\t696552712d.png\t9b6cb72794.png\tcecf808839.png\n02b705b112.png\t36c2188143.png\t696a866b71.png\t9b73299424.png\tced201b37c.png\n02be61df2b.png\t36c3c517ad.png\t696d91b37e.png\t9b76ac365d.png\tced71f3641.png\n02c1fb48e4.png\t36c461dcb6.png\t69700e2869.png\t9b77f7802e.png\tced8d25446.png\n02c9416d86.png\t36cb81c54e.png\t6973e71920.png\t9b7ade011b.png\tceda8750a4.png\n02caed72b7.png\t36ccc5a8f9.png\t69768c0d00.png\t9b7e9f8a91.png\tcedb73a437.png\n02d40993ab.png
\t36d1b74131.png\t697af33de9.png\t9b80cf8992.png\tcede9158fa.png\n02d63087cc.png\t36d9ea343f.png\t697f80cece.png\t9b82ce06a2.png\tcee1ebd36d.png\n02d8b415fd.png\t36db23ceb8.png\t697fee88c5.png\t9b838f638a.png\tcee5f80b1c.png\n02dae1c38d.png\t36dcec0996.png\t69840acbc6.png\t9b86b8a5fe.png\tceeaafc4fc.png\n02dd21181c.png\t36ddf39bf0.png\t69874e7963.png\t9b901e92e4.png\tceedc60808.png\n02e1654bef.png\t36df9ee508.png\t699169e3a1.png\t9b90e921ec.png\tceef85bdc0.png\n02e3c926d1.png\t36e1354d65.png\t6994342067.png\t9b933aea52.png\tcef03959d8.png\n02e651b7bd.png\t36e1d2c441.png\t699608ddb7.png\t9b952b1af0.png\tcef5623670.png\n02edcee8af.png\t36e68393b6.png\t6997ae1756.png\t9b965db66d.png\tcef624f327.png\n02f0e1dc32.png\t36e6e20bd8.png\t699924356a.png\t9b97dee78e.png\tcf023db098.png\n02f19130c1.png\t36e8817ef5.png\t699c07648b.png\t9b9bddfb54.png\tcf02a20d32.png\n02f36ca5ec.png\t36eaaa64e2.png\t699c9334c5.png\t9b9c0f8eb2.png\tcf063b6a84.png\n02f62a671c.png\t36eb8d9d53.png\t699e9c511c.png\t9b9c107a62.png\tcf11876547.png\n02fedd0c79.png\t36f087bbb8.png\t69a3b60b11.png\t9b9c2086f4.png\tcf14982a21.png\n02ff065ed6.png\t36f332bd11.png\t69a52adc65.png\t9b9d62b4f2.png\tcf14a9e3d0.png\n02ff55da9e.png\t36f364c4fb.png\t69a8775edd.png\t9ba1866dfa.png\tcf19041b8e.png\n0300ad09f3.png\t36f3c069ac.png\t69ab424098.png\t9ba360fe0d.png\tcf1ffe14e0.png\n03049b14ca.png\t36f9549e06.png\t69ac72804f.png\t9ba53387b4.png\tcf22da8b0b.png\n0304e028d9.png\t36f9ab3ee0.png\t69ad43554c.png\t9ba564058f.png\tcf241124b1.png\n0304e9ac37.png\t36febb2cf5.png\t69adeb575d.png\t9bb238c99f.png\tcf36d61397.png\n0310766b6c.png\t37062e1138.png\t69af036eb7.png\t9bba9f0981.png\tcf38172321.png\n03138aebc4.png\t370746aa1b.png\t69b96fdadb.png\t9bbb4c7c34.png\tcf39a7aef0.png\n031709c0fa.png\t370763b8a9.png\t69bb149e2d.png\t9bc0027019.png\tcf3c075df1.png\n031fc7e30d.png\t3709acd3d8.png\t69bc7087d5.png\t9bc43ac1d9.png\tcf3c36117a.png\n0320c3c633.png\t370ef12310.png\t69bf9df3a8.png\t9bc6bf1c23.png\tcf4162011a.png\n032177aab1.png\t371bf59030.png\t69c4e1da35.png\t9bc6f25a04.png\tcf44d77ca0.png\n032535bbd8.png\t371ee426c2.png\t69c5f6be60.png\t9bc988c8b7.png\tcf495190b9.png\n033029547c.png\t371fb700c0.png\t69c7de5d26.png\t9bc9c403e0.png\tcf4b289c17.png\n033bf5105e.png\t37247922ec.png\t69cb0653e5.png\t9bcf7863cd.png\tcf4daafaff.png\n033c0ecfb6.png\t372483a22d.png\t69cd6cdd28.png\t9bd1fdb232.png\tcf52335730.png\n034034d17a.png\t3724c63ddd.png\t69cddf572c.png\t9bd349312c.png\tcf58cfcd0f.png\n034210b100.png\t37251064bd.png\t69cec1ddde.png\t9bd58061d6.png\tcf5be83765.png\n03422321bb.png\t37260c432f.png\t69d4d52bb3.png\t9bde066865.png\tcf5d2ec0bd.png\n034a84a1a9.png\t37260d998a.png\t69d62b2f52.png\t9bdfa54912.png\tcf5d5915c1.png\n034aaba5a3.png\t372df8e65c.png\t69d62e153c.png\t9be3796b6c.png\tcf5dcf33f2.png\n034f55162e.png\t372fbddaf4.png\t69d9cb1c48.png\t9be54cd6a2.png\tcf5e9f93ab.png\n0350a9e3bd.png\t372fc2e2ad.png\t69d9edeccb.png\t9be67d911c.png\tcf63fbd55d.png\n03511989ac.png\t3731205282.png\t69da7346f5.png\t9be8405130.png\tcf671605eb.png\n0352ae7ea3.png\t37335631a0.png\t69db2d04da.png\t9beb6c1184.png\tcf6e41f329.png\n035922489b.png\t3734601c15.png\t69dcaf2ada.png\t9bef2899b0.png\tcf706fb35a.png\n035c07744f.png\t37377158c5.png\t69dd4b8705.png\t9befc65890.png\tcf7ab56ccc.png\n035d178826.png\t3739ae2d77.png\t69dde91f51.png\t9bf022776e.png\tcf7abfc8bb.png\n0360e6f681.png\t373b220a7b.png\t69deef503a.png\t9bf7bee899.png\tcf8042ccb0.png\n0366c64dd9.png\t373c27c20a.png\t69e0c0ac2d.png\t9bf8a38b92.png\tcf8108a1fc.png\n03672c1b75.png\t373e714f3d.png\t69eabf5f29.png\t9
bf8f84094.png\tcf856ef539.png\n03733663ac.png\t3746667eb4.png\t69ee613976.png\t9bf982cf65.png\tcf8acde8a6.png\n0373bd8d6c.png\t374a1dbf0c.png\t69f188eb44.png\t9bfb38a26f.png\tcf8ad2b1eb.png\n0381cdf44e.png\t374b8c1563.png\t69f47cbc7b.png\t9c08332067.png\tcf8ed7e7b4.png\n03821e9586.png\t3755a4f1c8.png\t69fbd42bce.png\t9c0b85b5b4.png\tcf8f782c1a.png\n03837c9b48.png\t375661a521.png\t69fe14835b.png\t9c14cb5581.png\tcf92834b28.png\n038617ab0d.png\t375a8da879.png\t69ff232673.png\t9c1a87e7fa.png\tcf93be642e.png\n0389fc6012.png\t37617be9c2.png\t6a0282da60.png\t9c26e28710.png\tcf9c283f60.png\n038c4d7a3d.png\t3767362bcf.png\t6a05025151.png\t9c276c8606.png\tcfa895be5c.png\n038e53260b.png\t37676b8fa4.png\t6a050e000f.png\t9c298b9575.png\tcfad7b2185.png\n03933bbd6d.png\t376826eeb0.png\t6a06625d6a.png\t9c2cab0be5.png\tcfb0294fdf.png\n03948bcdc9.png\t3768877c25.png\t6a0c145773.png\t9c2e45bf79.png\tcfb2babeb8.png\n03969a67e1.png\t376b6b7e94.png\t6a0f5a8b42.png\t9c30bb0a5f.png\tcfb8e5c36e.png\n039929151f.png\t376e1a060f.png\t6a105e7555.png\t9c3383eb85.png\tcfbc5030dd.png\n039b7c2b74.png\t376e55e566.png\t6a1703471f.png\t9c36db226b.png\tcfc651fc58.png\n03a2677daf.png\t3772d8349e.png\t6a1783be17.png\t9c37d6fe35.png\tcfcd55d96c.png\n03a92bda0e.png\t37742e5f10.png\t6a19a8ccc3.png\t9c4468a160.png\tcfce7a0920.png\n03a9683c39.png\t3778554f16.png\t6a1beb1f77.png\t9c44868c96.png\tcfcffeda9e.png\n03a98b53cc.png\t377953344d.png\t6a1defeaa5.png\t9c462097d0.png\tcfd2c7ab97.png\n03a9a9f615.png\t377b6c18d5.png\t6a1fd3c67e.png\t9c4c524983.png\tcfd6f05061.png\n03aa815dd1.png\t3780e66035.png\t6a1fe1a81e.png\t9c4cf9f5f1.png\tcfd9af336f.png\n03ab71b9d7.png\t3782748f51.png\t6a272dacd8.png\t9c4d9fdb92.png\tcfdc385955.png\n03ae7e298c.png\t37880e817f.png\t6a2861f265.png\t9c4e59c725.png\tcfddb8231f.png\n03b1288247.png\t3790459cd0.png\t6a2862b831.png\t9c4fb66a3b.png\tcfe4dd9f7e.png\n03b2e0b652.png\t3791842a6d.png\t6a29fbd74b.png\t9c4fed0493.png\tcfe57ea3fa.png\n03b4f1dc2a.png\t3792be0845.png\t6a2a9f1e1d.png\t9c50f550c4.png\tcfe755cfcd.png\n03b8f5bfa7.png\t37940b652c.png\t6a2d493fb8.png\t9c53811ac1.png\tcfeaf79010.png\n03be56aa9a.png\t37941ad573.png\t6a2dd17071.png\t9c5406268d.png\tcff4eaf93f.png\n03c0db5f66.png\t37949f5dd0.png\t6a2de1c736.png\t9c5567124a.png\tcff690800f.png\n03c247c74b.png\t3794cd4d73.png\t6a3d92ce59.png\t9c57a7ab91.png\tcff78ba6f7.png\n03c7e68cbb.png\t3797fc31e1.png\t6a3da583b7.png\t9c58e81c46.png\tcff87534c4.png\n03c9b5ffe8.png\t37a866550e.png\t6a3f8feaea.png\t9c59b70ad0.png\tcff9f980d7.png\n03cf2a80bc.png\t37aa9fc79a.png\t6a436d7460.png\t9c5c26e2ba.png\tcffbb49dc1.png\n03d1a4e518.png\t37ab36b8b8.png\t6a443ecf2a.png\t9c5cd7cf24.png\tcffbfab33b.png\n03d9a31cd0.png\t37aba9fcb3.png\t6a484cdfa8.png\t9c60208784.png\td0030ee6be.png\n03dcdf8b24.png\t37ae46fab5.png\t6a49954583.png\t9c6b016819.png\td006b6ef23.png\n03dee840bf.png\t37afd06dfa.png\t6a4a8f9166.png\t9c6d46eac1.png\td00d1ea52b.png\n03e9a82817.png\t37b5123c8f.png\t6a4c68f4bc.png\t9c6f8e4676.png\td0107a1229.png\n03eed26d3e.png\t37ba215c62.png\t6a5070c07d.png\t9c726c07cc.png\td011233664.png\n03f4b4a6ce.png\t37bfa26044.png\t6a50b924bd.png\t9c740fdea9.png\td0190aaf62.png\n03f991ef8f.png\t37c25c3268.png\t6a54c9a21f.png\t9c75c05f12.png\td01a30c77a.png\n03faead0bb.png\t37c400489b.png\t6a57dfd211.png\t9c76cd7a3b.png\td01b8993e1.png\n04013dfaae.png\t37c8167b81.png\t6a5d2c2738.png\t9c82805504.png\td01e3d8f7c.png\n0401ae15d3.png\t37c95ff7df.png\t6a65fb7ddb.png\t9c86b981a8.png\td0244d6c38.png\n0402a7ba0c.png\t37ca434162.png\t6a6b42a6f2.png\t9c8e69221c.png\td024745342.png\n0406
d1fda9.png\t37db34423c.png\t6a6ebb2255.png\t9c90708fb3.png\td024b8d5d9.png\n040af02c81.png\t37dd11b8cb.png\t6a72b1d26d.png\t9c936ae7ff.png\td0257d7254.png\n040cf0d9ed.png\t37ddf7de3c.png\t6a73082b55.png\t9ca11c3696.png\td028ae45d9.png\n040e8b3eb6.png\t37df75f3a2.png\t6a7506aa1e.png\t9ca1ebe351.png\td029316467.png\n0413ae3f5c.png\t37e16a5404.png\t6a79ccb43e.png\t9ca2242c33.png\td02a7dc593.png\n0414cc3198.png\t37e459b035.png\t6a7bf9f1c6.png\t9ca520f895.png\td02bbb7461.png\n0416ba85bb.png\t37e819830e.png\t6a7d046783.png\t9ca84742ef.png\td0319f0ec5.png\n04182ced8e.png\t37e8e0a012.png\t6a7e072e7a.png\t9ca858814a.png\td034695ac3.png\n041925f04c.png\t37e8fbecdc.png\t6a86f75986.png\t9ca9eae6c7.png\td037cfacb1.png\n041b1900de.png\t37f1f5388e.png\t6a8890377a.png\t9cacb628db.png\td0380e138d.png\n041bda3b04.png\t37f78a8697.png\t6a89c7a85d.png\t9cb3e373b5.png\td03df054cf.png\n041bdffed0.png\t37ff6cca4f.png\t6a8b6c77c4.png\t9cb42834b3.png\td04b72eb09.png\n041d8acbf6.png\t380209106c.png\t6a8f9bd70a.png\t9cb55834da.png\td04d504efe.png\n041f48019f.png\t38049a710d.png\t6a92fa6b2e.png\t9cb9185a70.png\td05176b53c.png\n04234be388.png\t38063f77a8.png\t6a936a6de9.png\t9cba387403.png\td054be6e3d.png\n0429d9759b.png\t380bf23f4b.png\t6a98b95754.png\t9cbca0af7a.png\td059fcb2fd.png\n042c3d4e03.png\t380dea2a5a.png\t6a9c9598eb.png\t9cbd5ddba4.png\td05ad715e2.png\n04300886c1.png\t380e59ba25.png\t6aa0d8db82.png\t9cc2bd46f2.png\td05c1dc1ad.png\n04306197e8.png\t380ff30e30.png\t6aa39d32e5.png\t9cc7c33d95.png\td05f21e2ad.png\n0430d7ea49.png\t38178b0ded.png\t6aa4963e22.png\t9ccba7ec23.png\td063e9625c.png\n04315b9a64.png\t381ef9a1af.png\t6aad1b361e.png\t9ccbea0d6a.png\td064049a0b.png\n0432c28b41.png\t3820797dab.png\t6aad94af7d.png\t9cccb3141e.png\td067d519f9.png\n0433c39f5d.png\t3821e89b66.png\t6aaddb7b41.png\t9cced7a09c.png\td06d2ba008.png\n04384ff2b9.png\t3822f73c1c.png\t6ab003835b.png\t9ccf50ab0e.png\td06f237a79.png\n043bcccc79.png\t38232d1fe3.png\t6abab2e4a1.png\t9ccf9ff1c3.png\td0712900eb.png\n043c15a9af.png\t3823ddd8cf.png\t6abe55deaf.png\t9cd6d11627.png\td07279fb65.png\n043cefa1b0.png\t3825b66859.png\t6abfffdd6d.png\t9cd8e831c6.png\td074403298.png\n043ed7fe5e.png\t3825eb8722.png\t6ac91c75fc.png\t9cdb3805a1.png\td075ac58be.png\n0441017e49.png\t38263b36df.png\t6acd09c3a1.png\t9ce4cbaefa.png\td078c2ee8a.png\n044106f759.png\t38272bc266.png\t6acd730f72.png\t9ce55dc2c9.png\td07b4cef46.png\n0443387784.png\t38285ef00d.png\t6ad29a5c0e.png\t9ce70fe066.png\td07b5f804a.png\n04467827d1.png\t382b3f0c24.png\t6ad49a6e14.png\t9cea9923bf.png\td07cc055fa.png\n04495e40e6.png\t382c39da65.png\t6ad87d8a2d.png\t9cead7930d.png\td082aa9b68.png\n044d6800ff.png\t3831b3ff1c.png\t6adbbca343.png\t9cf5dd8561.png\td095994640.png\n0455d56bca.png\t3835ff55a0.png\t6ade8abb39.png\t9cf74c6432.png\td09c8b55e8.png\n045c5516a2.png\t3839b132a3.png\t6adfb31724.png\t9cf75e089f.png\td09f670a22.png\n0460ed00e4.png\t383c73a408.png\t6ae09ed91b.png\t9cf8fe7f96.png\td0a0a50b27.png\n04614bf5ac.png\t383e701524.png\t6ae3ffe059.png\t9cfa22e826.png\td0a0d48334.png\n0461a2fb30.png\t384050c4e5.png\t6ae646336d.png\t9d0b07cd6d.png\td0a367e216.png\n0462491d87.png\t38413a2fac.png\t6aee0e5621.png\t9d0f0a2488.png\td0a54afbc5.png\n04697a9083.png\t38416b901f.png\t6aee62c3eb.png\t9d12bb050d.png\td0a8505343.png\n046cba545c.png\t38438eab6c.png\t6aeee32e59.png\t9d1516b5b9.png\td0aaadfad8.png\n0470c449ca.png\t38447389af.png\t6af3f7f682.png\t9d156a0efa.png\td0aeae2658.png\n0470d6aba4.png\t38471001af.png\t6af48dfcdd.png\t9d173ec22a.png\td0aeb260e8.png\n04710d36cf.png\t384a44c635.png\t6afae09
aa9.png\t9d218888de.png\td0b188118f.png\n04711cb760.png\t384d50db49.png\t6aff151236.png\t9d22a9e14a.png\td0bb00f25e.png\n0473e496e2.png\t385580a2f7.png\t6b030ae5e8.png\t9d23bc70b8.png\td0bbe4fd97.png\n0475626c02.png\t3856d9cc21.png\t6b06770322.png\t9d2a8ea7d1.png\td0bdfc3217.png\n048763f628.png\t38599bb2b5.png\t6b0688f33f.png\t9d2eafece8.png\td0c287e257.png\n0488eb2f20.png\t3860513b01.png\t6b090a7068.png\t9d302f4085.png\td0c4763c9b.png\n04893c47bb.png\t3862950849.png\t6b0f472ce3.png\t9d31575469.png\td0c9bf6f02.png\n048b6423d4.png\t3863eafbfa.png\t6b13e87edf.png\t9d354bc9e7.png\td0c9e7f9c6.png\n048ced84d2.png\t38673ade97.png\t6b1694099b.png\t9d39009d77.png\td0cba16846.png\n0493bb7894.png\t386b1e501f.png\t6b1a1e50c5.png\t9d3ec04eac.png\td0cd29a61b.png\n0497f2d447.png\t386d47cbe0.png\t6b21f429a8.png\t9d45516026.png\td0cf409725.png\n0498d1fd98.png\t387265cc0b.png\t6b22f89f38.png\t9d462381b0.png\td0d908374e.png\n04a21e7ce0.png\t387aa87ae2.png\t6b22fa7059.png\t9d48cf230c.png\td0da4f7de7.png\n04a5bb16d7.png\t387b6956c4.png\t6b26e7364e.png\t9d4c306750.png\td0df1a121f.png\n04a60b7e45.png\t387f07cf75.png\t6b279fcb79.png\t9d518b6b65.png\td0e720b57b.png\n04aec42282.png\t388099f75d.png\t6b28e2f3f5.png\t9d55e4ccc9.png\td0e8c9833f.png\n04b029690c.png\t388385f4be.png\t6b2b54455e.png\t9d5617b842.png\td0e95cbd2a.png\n04b529dd46.png\t3885101128.png\t6b2d8a1c1e.png\t9d567894cb.png\td0eaa2adbe.png\n04b90090f8.png\t389139a9d0.png\t6b2ee6c253.png\t9d57cce77c.png\td0eb51a04f.png\n04bcc0f6c4.png\t3893cc2455.png\t6b2f49cad6.png\t9d5aeb5104.png\td0ec6da193.png\n04be99d480.png\t389a9f7ad6.png\t6b2f79f6ff.png\t9d5d4bf0f1.png\td0ef5d91e0.png\n04c0c5f664.png\t389ceae5ba.png\t6b2f7ba83d.png\t9d64ebd0cc.png\td0ef6f681d.png\n04c1375204.png\t389d097af2.png\t6b3478573c.png\t9d64f7778a.png\td0f06eee13.png\n04c31e19f4.png\t389d9615ec.png\t6b36c0a954.png\t9d65420fcc.png\td0f903ce16.png\n04c48c9743.png\t389e82d252.png\t6b377327b1.png\t9d65b4b8c6.png\td0f922c629.png\n04c4a4acdc.png\t389eeae597.png\t6b387438d4.png\t9d7165c9c0.png\td0fa5460a9.png\n04cbf52244.png\t38a0a84353.png\t6b3c84dc5a.png\t9d72658a31.png\td0fa976e23.png\n04cd9599a3.png\t38a15dee44.png\t6b3dd7f999.png\t9d7482beb6.png\td0ffb5b3f6.png\n04d05430ab.png\t38a53b6b7c.png\t6b3f024cad.png\t9d7600e578.png\td100338a78.png\n04d2ead1b4.png\t38a5cc707d.png\t6b42489773.png\t9d7815c64e.png\td101256aee.png\n04d3cd2444.png\t38a89c763c.png\t6b45297549.png\t9d8178a4cb.png\td106946488.png\n04d7031258.png\t38b090e4a3.png\t6b464d56a4.png\t9d9232634b.png\td107a8e18e.png\n04d92b5a13.png\t38b5d46689.png\t6b4b989279.png\t9d96a157ca.png\td1115a51da.png\n04e021f699.png\t38bb5d43f4.png\t6b4bdc9134.png\t9d98768f19.png\td1154e5122.png\n04e1d8f7ca.png\t38bda63402.png\t6b4d65ac6a.png\t9d98ce444e.png\td11b0da6cb.png\n04e5d3d26a.png\t38bffa5fa7.png\t6b59edf680.png\t9d9988ac9d.png\td11e86362a.png\n04e5ecfbde.png\t38c0d9ca8e.png\t6b5b7c0ff3.png\t9d9a5cfece.png\td11ee7fec1.png\n04e7e791e2.png\t38c805312f.png\t6b5cc16048.png\t9d9b6f17b7.png\td123f81e7d.png\n04ecf2b040.png\t38c852ce16.png\t6b5dc56ee8.png\t9d9f348b03.png\td125d1f8f4.png\n04f48c4646.png\t38cc48d708.png\t6b5f03996d.png\t9da19a126d.png\td129dd38da.png\n04f7141ff7.png\t38d61f0fef.png\t6b5f72af67.png\t9da3f4e2ff.png\td12d3125e7.png\n04f7a73f2b.png\t38d9fe1df5.png\t6b677614ea.png\t9da48c77c3.png\td12d5894b8.png\n04fbced677.png\t38daf12dba.png\t6b67b187e3.png\t9da56ef334.png\td12e6ed617.png\n04fdd70962.png\t38dfaadc68.png\t6b69a22bc3.png\t9da5d14ca9.png\td12faedcfc.png\n04fe99f3e4.png\t38e018ea0f.png\t6b6a7b9756.png\t9da76dae32.png\td13105c407
.png\n04ff5c14f7.png\t38e744be5e.png\t6b6aef9a45.png\t9da95ae912.png\td1368abea3.png\n0505759310.png\t38e9ea8346.png\t6b6e268379.png\t9daa6c133f.png\td13ede5f14.png\n0507f8672e.png\t38ea88abd7.png\t6b74af57ef.png\t9dab37614e.png\td13f2aa25a.png\n050bf9ca30.png\t38ec59117e.png\t6b75879bf2.png\t9dada384f3.png\td14060225a.png\n050da9c564.png\t38ed2aa7cb.png\t6b7945d817.png\t9daed1e88d.png\td1408f9935.png\n050f3600c1.png\t38edaacd98.png\t6b7f4a1a91.png\t9db11d4414.png\td14105ad2e.png\n0510482257.png\t38f374d26f.png\t6b81fba94a.png\t9db4f0eb40.png\td145118221.png\n0510fec103.png\t38fee950c3.png\t6b83d97821.png\t9db87ffdaa.png\td145884178.png\n05139ad3e5.png\t390285454f.png\t6b8ac65560.png\t9dba0d933d.png\td14ad5cb55.png\n0514122b6e.png\t390355fd73.png\t6b8b1e7ad6.png\t9dba372ece.png\td14b77170a.png\n05163a6e41.png\t390a558cbb.png\t6b8be32bca.png\t9dc2a6667a.png\td14be11409.png\n051bfc19fc.png\t390b6e7e7b.png\t6b95bc6c5f.png\t9dc4ce9178.png\td14da73251.png\n051c0a2ddb.png\t390c158f57.png\t6b9716ffbd.png\t9dc508d95d.png\td14db20055.png\n0521e30a6c.png\t390dfe1688.png\t6b97b024aa.png\t9dc604318f.png\td155b45b62.png\n052c50e4e6.png\t3911bcfb02.png\t6b996daecc.png\t9dcb043fa9.png\td155d709dc.png\n052de39787.png\t3912a7c40b.png\t6b9fb67be5.png\t9dcb6ffffa.png\td156089f4a.png\n052f53097b.png\t391aa43e9d.png\t6ba0601416.png\t9dd492d6e6.png\td15e59193a.png\n0530b67189.png\t3922843c4e.png\t6ba0ff2b77.png\t9dd4d99ab5.png\td15eeb222b.png\n0532916483.png\t3924307553.png\t6ba5e65f26.png\t9dd88fbb40.png\td16591547d.png\n053293110f.png\t39294d0534.png\t6baf05d61e.png\t9ddc91e5bb.png\td1665744c3.png\n0532a37194.png\t3929965e55.png\t6bb26e2bd5.png\t9dde6475f9.png\td167981f98.png\n0534b9c761.png\t392b7bc50a.png\t6bb333fe7b.png\t9dde6c3755.png\td168218bac.png\n053566ac5a.png\t393048bc6c.png\t6bb88deab5.png\t9ddf722913.png\td169f9f991.png\n053a6f6704.png\t39316d8a46.png\t6bc1550a56.png\t9de0eaded3.png\td16c0e2f8b.png\n053e7d3043.png\t39377f4740.png\t6bc1aa8213.png\t9de7ed8672.png\td16f246753.png\n0540ea22cd.png\t393941ea97.png\t6bc1b58b5d.png\t9deb934708.png\td175b3b3cc.png\n05454fe6a8.png\t393b86d929.png\t6bc1d1227c.png\t9decfafbf3.png\td17759d2d1.png\n054c764e3a.png\t393f2db0b0.png\t6bc43978cd.png\t9df2466f89.png\td17e7b28c2.png\n05562883e7.png\t39400e0443.png\t6bc4c91c27.png\t9dfb81996b.png\td17e8f536b.png\n05572c229f.png\t394070b0b0.png\t6bc69c62e6.png\t9e0f3ef394.png\td17eee846e.png\n0557701ace.png\t39438a4c45.png\t6bce96251f.png\t9e16b88203.png\td17f801011.png\n0558ee87d3.png\t394956fd49.png\t6bcf6c19bb.png\t9e1719a077.png\td1835bf888.png\n055ec918b9.png\t394d2280fc.png\t6bd0e156a6.png\t9e17d61db9.png\td184edb816.png\n055f01a5c9.png\t394f8072bb.png\t6bd3e38587.png\t9e191d17e0.png\td185255316.png\n0568930a95.png\t39538ce2b0.png\t6bd528bc92.png\t9e1d7b968e.png\td1857e0577.png\n056bfa0cc8.png\t395726930f.png\t6bd9f7f4c9.png\t9e2432ac74.png\td187151bc2.png\n056d1b3b59.png\t39579da136.png\t6bdc887531.png\t9e26b0563f.png\td18868570d.png\n056d7bd8cd.png\t395b3fdd18.png\t6bdcbd78dd.png\t9e26c87dad.png\td18a64f984.png\n0570579cc9.png\t395b9609f3.png\t6bdcd9b280.png\t9e28ecb2b1.png\td18d6ca1b5.png\n0573ed7a7b.png\t395bab08f4.png\t6bdd15f74a.png\t9e2cb9d6cc.png\td18dec87b4.png\n057754e031.png\t395e668f22.png\t6bde41d5b5.png\t9e2e88c82b.png\td18e1a3ffe.png\n057b1c6840.png\t396a5f2049.png\t6be54e99b0.png\t9e2f4c7946.png\td1917291dc.png\n057b54820c.png\t396ae389d8.png\t6be77d7154.png\t9e31e1c64f.png\td19827b32b.png\n057eae4f34.png\t396c918307.png\t6be85fecc3.png\t9e346a910c.png\td19c8929dd.png\n0589f557d7.png\t396f334f22.pn
g\t6becbd1796.png\t9e3ed814dc.png\td1a15c65a2.png\n0593bd99d0.png\t3970b732a5.png\t6bece6177e.png\t9e426ec524.png\td1a34003b1.png\n0599374285.png\t3974e9d32e.png\t6bedfa8083.png\t9e457f2983.png\td1a7ba274d.png\n0599c13063.png\t3975043a11.png\t6bf20054b6.png\t9e45e84bac.png\td1a903690d.png\n059ae59b4f.png\t3977f3e349.png\t6bf32019f9.png\t9e45ead18b.png\td1ab12608e.png\n059f99344f.png\t397f00cf61.png\t6bf65c3d06.png\t9e46a3352e.png\td1ac97d3ec.png\n05a54ab210.png\t398177485c.png\t6bf6f7cd9d.png\t9e48a469e8.png\td1adf6a8d8.png\n05a61e5799.png\t3983d723b8.png\t6bfa33ab79.png\t9e4e6733bc.png\td1aedb7466.png\n05af913dbe.png\t3984ed8d4e.png\t6bffabbd91.png\t9e4f9640fa.png\td1b150a0d0.png\n05b69f83bf.png\t398808ab1f.png\t6c00d07fe9.png\t9e54251a89.png\td1b2555927.png\n05b824cf58.png\t398a4d0c7f.png\t6c05e03941.png\t9e545de5b0.png\td1b35d319f.png\n05b9a246c6.png\t398c285d13.png\t6c0a545d06.png\t9e54e0ee92.png\td1b3ac82c3.png\n05bb19ebcd.png\t398c678c04.png\t6c0a68de6c.png\t9e5ca2e27f.png\td1baafb170.png\n05be526826.png\t399315184c.png\t6c0ae2e809.png\t9e60534bef.png\td1bc239adf.png\n05be75afd9.png\t399694c381.png\t6c0f139b5b.png\t9e64d5f765.png\td1bc57f92d.png\n05bf647e6d.png\t3998ea2fb2.png\t6c10ab3d93.png\t9e64fc57aa.png\td1c5e7f75b.png\n05bfa6624d.png\t399ac12902.png\t6c126643dc.png\t9e699c4bd5.png\td1c6b43e5e.png\n05c164b212.png\t399f7b1816.png\t6c147e8265.png\t9e6b377ea4.png\td1c8242b80.png\n05c33e6854.png\t39a05bf91f.png\t6c16057cd7.png\t9e7a9ccc6f.png\td1c9be7e25.png\n05c7033694.png\t39a599078a.png\t6c1a8906dc.png\t9e7d89d4c0.png\td1cfe0c1f6.png\n05cdc83902.png\t39a5b5f33a.png\t6c1ff4fc83.png\t9e80f1de5b.png\td1d46ef2de.png\n05d175bd91.png\t39aa1f3a2b.png\t6c217077e5.png\t9e83c04d5c.png\td1d7ed06cf.png\n05d9351a35.png\t39ae6fcec8.png\t6c21a33935.png\t9e866473b7.png\td1d882f7de.png\n05e30ce8af.png\t39aeeceb53.png\t6c237adaba.png\t9e86e03bec.png\td1dcbcfea6.png\n05e47b3be5.png\t39b07bf42a.png\t6c25735f24.png\t9e8723bca5.png\td1dff8de14.png\n05e59f3c7e.png\t39b10799fb.png\t6c25956bc4.png\t9e88bf3ab1.png\td1e15abcbe.png\n05e6941139.png\t39b1d2294d.png\t6c292f738a.png\t9e897a76ea.png\td1e2aed114.png\n05e7f129ee.png\t39b6ef3f29.png\t6c29463345.png\t9e8e175c01.png\td1e5eeedf1.png\n05e8012e5e.png\t39b8cfa930.png\t6c2c6de49a.png\t9e8fbecf82.png\td1e97641e8.png\n05ef6e7154.png\t39bf2e6358.png\t6c2e63198d.png\t9e91369659.png\td1f0237831.png\n05f09b380c.png\t39c3747602.png\t6c33bebba4.png\t9e9200aab0.png\td1f53c0fa1.png\n05fb522022.png\t39c47a3f54.png\t6c34c04e12.png\t9e966d733c.png\td1f7d2876b.png\n06050de815.png\t39c584b10a.png\t6c35d8d95b.png\t9e9707e9ef.png\td1f8ba2f02.png\n060cdfded5.png\t39c765b268.png\t6c36352343.png\t9e97c676a0.png\td1fa6d0e0a.png\n060cfb2d18.png\t39cd06da7d.png\t6c39c17251.png\t9e9ce2e1ce.png\td1fb1a76f2.png\n060d966da4.png\t39d6fbba1b.png\t6c39cad9de.png\t9e9d2d23b9.png\td204ea9f09.png\n060e0dd7d4.png\t39d7ccca10.png\t6c3a9009c8.png\t9e9f0266ea.png\td20516eb19.png\n060f14fcee.png\t39d958da04.png\t6c40978ddf.png\t9e9f1a10c8.png\td20568b2d6.png\n0611452742.png\t39dd1c43bd.png\t6c41e98267.png\t9e9f3940a9.png\td2066b2414.png\n0611c441b8.png\t39e91d3247.png\t6c4568e51c.png\t9ea0525027.png\td20d6f41dd.png\n0614268874.png\t39eeff5b3d.png\t6c45c3cae0.png\t9ea14bcead.png\td20de74c75.png\n0614cf0b6c.png\t39f0728ddf.png\t6c45d80d1e.png\t9ea14f1617.png\td20e50b054.png\n0616cb9c1f.png\t39f101a807.png\t6c467ece12.png\t9ea1b3c663.png\td21012f5c1.png\n061a8ca0db.png\t39f1a4f12f.png\t6c48112d73.png\t9ea1b4c239.png\td2164759b2.png\n061f2729bd.png\t39f345dee2.png\t6c49b277aa.png\t9ea6ebf3d1.png\t
d2169330bb.png\n061ff2990f.png\t39f87fe0c3.png\t6c4b8e468b.png\t9ea7bef973.png\td21d09dd73.png\n0621ebf7b5.png\t39f8da5892.png\t6c50e0b05e.png\t9ea97ebc02.png\td21e3d7843.png\n0623c74e99.png\t39f9ba348e.png\t6c53696981.png\t9eab4428db.png\td223809084.png\n06289f69f8.png\t39fc7213e1.png\t6c557220d3.png\t9eaf536a63.png\td22a045c82.png\n062ff07243.png\t39fcc00d01.png\t6c5770c91c.png\t9eb4a10b98.png\td22f0a172a.png\n0634142132.png\t39fd17e017.png\t6c5b6e5733.png\t9eb54c97be.png\td230bf4a24.png\n063c439dac.png\t39fe9cb4e4.png\t6c5b8b3950.png\t9eb7f34982.png\td23419186e.png\n063cf2a23a.png\t3a00144329.png\t6c61beb5dd.png\t9ec2ecb34b.png\td2353f04b3.png\n063d449cd2.png\t3a003f211d.png\t6c654f0642.png\t9ec5530e38.png\td2355eb92b.png\n06403b80fa.png\t3a00f81f99.png\t6c65ab86f5.png\t9ec6ead0e2.png\td236c0b381.png\n0642bbbc5f.png\t3a0168afba.png\t6c676d1605.png\t9ec8ddce2e.png\td238d7beab.png\n0642eb11c7.png\t3a068284ed.png\t6c6c1cbae9.png\t9ec9469778.png\td23d16fa85.png\n06446b9d8c.png\t3a06849378.png\t6c6f886709.png\t9ecc1f5457.png\td24045efa2.png\n064a8eb3dd.png\t3a0922aeea.png\t6c705b0da4.png\t9ed9a2feed.png\td244dc0b1c.png\n064b4962e9.png\t3a0e15d4bc.png\t6c725b1b62.png\t9edc9cd726.png\td247469515.png\n06523f9719.png\t3a10265232.png\t6c72d0dfd0.png\t9edcfbe543.png\td248e4ccc2.png\n065c07cc06.png\t3a174d4c0b.png\t6c74777cbf.png\t9ee2a96d6f.png\td24b00781f.png\n065c6cd6ff.png\t3a1c84fe7c.png\t6c793e5879.png\t9eeb6f090f.png\td24e24eea9.png\n065ce5590e.png\t3a1e053394.png\t6c7b1a041d.png\t9ef364d6de.png\td2511dfca8.png\n065d95e8dc.png\t3a1e264383.png\t6c81929afd.png\t9ef54fa6eb.png\td2522cfc93.png\n065dae68d5.png\t3a1e404bbc.png\t6c82a04337.png\t9ef73b5eba.png\td2531a4d49.png\n06665f87d0.png\t3a226d196a.png\t6c88ce3622.png\t9ef7777145.png\td254cf287c.png\n066803e978.png\t3a23c0ad1f.png\t6c89234f73.png\t9ef9b40de2.png\td2557e5d59.png\n066c01d461.png\t3a25d72ad5.png\t6c8ad32f36.png\t9efa93ff14.png\td25a2d736a.png\n066e0ce4c3.png\t3a272cabc2.png\t6c8e7ad4c0.png\t9efedb3b86.png\td25aa6c068.png\n066f18a55c.png\t3a310860a3.png\t6c91546a57.png\t9f024839e6.png\td268ae57b5.png\n0670f16651.png\t3a3281b37d.png\t6c91aa602d.png\t9f0249442a.png\td26e890704.png\n067f82832a.png\t3a33e1eb89.png\t6c94e054e6.png\t9f0685e8a2.png\td26f77256f.png\n0680eb5e6e.png\t3a35c0728f.png\t6c95018d5c.png\t9f06b4bbd9.png\td274664a41.png\n06814a8ba4.png\t3a37009a34.png\t6c95a61d2c.png\t9f07d6a050.png\td2776b82d5.png\n0681b94208.png\t3a39bd2a91.png\t6c9c69cf63.png\t9f081a61d7.png\td27831fef2.png\n068b75c080.png\t3a3e1e824d.png\t6c9d4e4786.png\t9f0a0921e0.png\td27b85e7c0.png\n068f7a5267.png\t3a3e8fd696.png\t6c9e7df548.png\t9f0bf03c7e.png\td27ceec04a.png\n069035a9b5.png\t3a40695e2d.png\t6ca0287c4a.png\t9f0c9f3b90.png\td27db3dcc0.png\n0690bc9579.png\t3a41611201.png\t6ca74c085b.png\t9f0ccd7979.png\td27dc449a5.png\n0692b505d2.png\t3a449e5df9.png\t6ca8842553.png\t9f0d446457.png\td27e49b95a.png\n069780eb80.png\t3a45a51f21.png\t6cabd17489.png\t9f0d6b4bbe.png\td28614094e.png\n069b571f97.png\t3a479d449b.png\t6caca4af05.png\t9f0ebbd210.png\td2862ced4d.png\n069c7fbb03.png\t3a489c31f9.png\t6cad0fd382.png\t9f224c71b1.png\td288995f93.png\n06a361fc57.png\t3a4a98aadd.png\t6caec01e67.png\t9f252722f5.png\td289692ab8.png\n06a79609c9.png\t3a51d8fb4b.png\t6cafc8475b.png\t9f29dfa66d.png\td29316fd33.png\n06aa4ac96d.png\t3a5330c79d.png\t6cb11aa807.png\t9f2b973eef.png\td298918b62.png\n06aeec54bd.png\t3a58606814.png\t6cb2eca8e7.png\t9f2ca5078b.png\td29a58d133.png\n06b111b02d.png\t3a58ed89fe.png\t6cc3e319bf.png\t9f2cc46573.png\td29c7b4b50.png\n06b664b866.png\t3a5
bdd55ef.png\t6cc50bb10c.png\t9f2e5a8c24.png\td29d1424ce.png\n06b9cddeb5.png\t3a5d046537.png\t6cc7ba6c56.png\t9f2ffd1c09.png\td2a65f4435.png\n06c19e4c07.png\t3a5e5f4202.png\t6ccaafe665.png\t9f32747c28.png\td2a6a4e67c.png\n06c2c49688.png\t3a613d003a.png\t6cce97a933.png\t9f350005b5.png\td2aa4238f7.png\n06c6968589.png\t3a61ff125e.png\t6cd1de8ef4.png\t9f3b8d0186.png\td2aac3d56d.png\n06cdaef7b0.png\t3a63fec26c.png\t6ce16dbcd4.png\t9f3cc74e77.png\td2ae89c1a9.png\n06cf6a9e15.png\t3a690dbd1b.png\t6ce2fbd852.png\t9f3d03c145.png\td2b3f8b7da.png\n06d21d76c4.png\t3a6bc1e413.png\t6cea60aa06.png\t9f4001d8ea.png\td2b40a2781.png\n06d2cdd10b.png\t3a6caba2e9.png\t6ceb5f2342.png\t9f436ce32f.png\td2b77b255b.png\n06d3201dd0.png\t3a6f5a2675.png\t6cf284fb9e.png\t9f4471dcd6.png\td2b9a9beea.png\n06d4212fca.png\t3a6f7f1320.png\t6cf33a5e20.png\t9f459445b6.png\td2bb65bbd0.png\n06d9c22d31.png\t3a76e22552.png\t6cf620a2f0.png\t9f4d859a6d.png\td2bbf82139.png\n06da5a2b39.png\t3a76eda882.png\t6cf6889e33.png\t9f5029183b.png\td2bc78e5ce.png\n06decb795c.png\t3a7868e7e4.png\t6cf6befa6a.png\t9f54b45a74.png\td2bd2fe97f.png\n06df6aa4bb.png\t3a7981d706.png\t6cf706f764.png\t9f54d7807c.png\td2c08be257.png\n06e009a7a4.png\t3a7a0b6a9d.png\t6cf84a4340.png\t9f56da8998.png\td2c3155dc4.png\n06e0ccbc02.png\t3a7bbd73af.png\t6cf8fec79a.png\t9f5bb5383e.png\td2c4ef10c7.png\n06e2664df1.png\t3a7f0d0bc1.png\t6cfc72c21d.png\t9f60bf97a9.png\td2c53f4b7b.png\n06ef1122ec.png\t3a81cff745.png\t6cff622a5e.png\t9f6abef1a7.png\td2c728aaf3.png\n06f0b6ef91.png\t3a81fa6cb5.png\t6d00a14fdf.png\t9f6bc7a549.png\td2c800e9f5.png\n06f311ebb5.png\t3a828fd486.png\t6d03f91961.png\t9f6d0b9e30.png\td2c8d1b1d3.png\n06f467360f.png\t3a8641b598.png\t6d08896012.png\t9f7155bd98.png\td2c9311644.png\n06f549734d.png\t3a8925ef48.png\t6d0a23599d.png\t9f75213aa3.png\td2cb4356bb.png\n06fa0b053b.png\t3a8da5dd51.png\t6d0ab73c29.png\t9f757eb55b.png\td2d362db2e.png\n06fe4d8223.png\t3a8facd639.png\t6d0b7e8c5e.png\t9f75a6366c.png\td2d97caad6.png\n070047d518.png\t3a8fcdba16.png\t6d0e37599c.png\t9f7e064e9f.png\td2d9c7f757.png\n0702b8700f.png\t3a9c63e8f4.png\t6d0ea586a4.png\t9f7e293e89.png\td2db3c72fe.png\n070394af17.png\t3a9fd030f2.png\t6d17d3be2d.png\t9f7f3b3c88.png\td2dcce97b1.png\n0703a8437b.png\t3aa0a9ed5b.png\t6d1afa9d46.png\t9f7f6d76db.png\td2e14828d5.png\n0703b799a0.png\t3aad1de5ac.png\t6d23b68142.png\t9f88bc6651.png\td2e44953e6.png\n0703c49856.png\t3ab1c71dba.png\t6d25e0f3a7.png\t9f89caa7f3.png\td2e494a718.png\n0706b7541a.png\t3ab3158a27.png\t6d27f1faf7.png\t9f91c31994.png\td2e4b1e381.png\n070a913f3e.png\t3ab326340b.png\t6d2a5a2fe2.png\t9f95d4f2cd.png\td2e5b43517.png\n070ab527b1.png\t3ab7b7d831.png\t6d2b421f7a.png\t9f96821d81.png\td2e5dc1a0b.png\n070da35501.png\t3abde75a97.png\t6d3209c6d5.png\t9f97b2925e.png\td2ea97ca4e.png\n0711675899.png\t3ac5dc16c1.png\t6d34f414ed.png\t9f9f947775.png\td2edf62bfa.png\n0712535c2b.png\t3ad1d37b83.png\t6d3a5ecdab.png\t9f9fd9cd6e.png\td2ee323c45.png\n0712ea6aab.png\t3ad3fd1b91.png\t6d3bb1c3d6.png\t9fa6f8e4d7.png\td2ee33bc03.png\n07143897b2.png\t3ad9500e43.png\t6d3bd2f30b.png\t9fa72ddc6a.png\td2f131dba8.png\n0717a21efc.png\t3adea2aede.png\t6d40520212.png\t9fabca6b17.png\td2f25a78a6.png\n071c81c9ad.png\t3ae2766077.png\t6d412459ba.png\t9fac28f3df.png\td2f6881bda.png\n071d5aa17f.png\t3ae4bfa0de.png\t6d42396edf.png\t9fb226174a.png\td2f82ead6a.png\n0723cf2280.png\t3ae6793cf7.png\t6d428b7d07.png\t9fb88ef17c.png\td2fbaaeae5.png\n07243dbcbf.png\t3ae90bbf94.png\t6d46f994ac.png\t9fbd3ca3b5.png\td2fe1189c4.png\n07244009e3.png\t3ae98f57f6.png\t6d4837b498.png\t9fc4b3
704c.png\td3012444d9.png\n0727c8045d.png\t3aecf1edfc.png\t6d49d84013.png\t9fd1beec08.png\td3034e7a60.png\n072bfb6a9c.png\t3aef003939.png\t6d4ae7243e.png\t9fdb6afc56.png\td303e1774a.png\n0731f87679.png\t3af0b6fef4.png\t6d4fca6a35.png\t9fe3459bb2.png\td30597f09f.png\n0736a1aa17.png\t3af482b98c.png\t6d5077d085.png\t9fe6cf976f.png\td308b0b2aa.png\n073a59f653.png\t3b00ad8dae.png\t6d545391d1.png\t9fe707f837.png\td308ec69d3.png\n073eeb58c8.png\t3b02981f8d.png\t6d55cabc8d.png\t9fe905c12d.png\td30ad7d936.png\n07424fb781.png\t3b05fe7a3f.png\t6d593a19d1.png\t9fe9aa0449.png\td30e73de0c.png\n074535c849.png\t3b1004b99a.png\t6d5ad4ab9e.png\t9fecf3b7b0.png\td3124d74bd.png\n074673a5f1.png\t3b1a9d94cf.png\t6d5bf5fde0.png\t9fed64924e.png\td31407ea3b.png\n074a4a918a.png\t3b215fbf05.png\t6d5ed262e4.png\t9ff01fb94a.png\td3174677ee.png\n074c92dc0e.png\t3b2362a168.png\t6d61d71a38.png\t9ff430c52b.png\td31b7bca55.png\n0753d119b6.png\t3b23772288.png\t6d633872b5.png\t9ff59d834b.png\td31c506913.png\n0754ae5da8.png\t3b24607460.png\t6d68b91615.png\t9ff904c385.png\td31cbc0033.png\n0759b4c0a6.png\t3b282ca9fb.png\t6d69267940.png\t9ffa8a1f08.png\td31e3fbfdc.png\n075c30e2f5.png\t3b2be010c4.png\t6d707a2713.png\t9ffb5d6034.png\td31eb8057d.png\n075c99bcfc.png\t3b2e64b4bb.png\t6d710ec327.png\t9ffe9b70c4.png\td3247e8f62.png\n075e7d7937.png\t3b2f902ec6.png\t6d75fcd108.png\t9fff0319cd.png\td3302e82e5.png\n0762a99eb4.png\t3b2ff53a06.png\t6d7774165a.png\t9fff138ea3.png\td3332b8273.png\n0762de37c3.png\t3b32255caa.png\t6d7b9dbbe5.png\t9fffec824a.png\td33ac09d91.png\n076670cc1c.png\t3b3647008a.png\t6d7f1db1d5.png\ta0018e4f71.png\td33fce2c44.png\n076c17198a.png\t3b36b4cfc7.png\t6d84c92dd5.png\ta003c71571.png\td342b98570.png\n076eb6998e.png\t3b3e29c628.png\t6d8817f092.png\ta009da75d1.png\td343f6974e.png\n077074bba9.png\t3b3e3a9a97.png\t6d88e3341d.png\ta00a1f5133.png\td3444844ea.png\n07707ab8b2.png\t3b403b3f01.png\t6d8a8e2e6d.png\ta00b575195.png\td34c539d4e.png\n0775d2543e.png\t3b408e43d0.png\t6d8b46c501.png\ta00d2b180c.png\td34c5a3e21.png\n077cb8696f.png\t3b44dcf59b.png\t6d8e425cc3.png\ta00d39a13e.png\td353a74b66.png\n0786b42c85.png\t3b474dfed2.png\t6d91979be6.png\ta01134fa8e.png\td35a7e87f5.png\n078f36a006.png\t3b48cce005.png\t6d93c0a95e.png\ta011acc5d9.png\td35af24206.png\n0794347c0c.png\t3b4be071a8.png\t6d94eea16d.png\ta01b8c4af5.png\td3618783eb.png\n0794c37f5a.png\t3b4f30c92e.png\t6d975190d4.png\ta01dfd5b11.png\td361e24d92.png\n0795655fb6.png\t3b5368ede9.png\t6d9b5ae77a.png\ta01ec79fb7.png\td36234ebd5.png\n079629bc48.png\t3b5653f6b3.png\t6d9b6a2c4b.png\ta0219e8fa4.png\td367a6e845.png\n07993dc1d4.png\t3b5774226e.png\t6d9baf0d42.png\ta022b9c3ac.png\td3696f45f9.png\n079e05a311.png\t3b57b1ef9d.png\t6d9bbf4c1e.png\ta024514486.png\td369e6b550.png\n07a70fdbf3.png\t3b58fbbf7b.png\t6d9be2f492.png\ta024bd28fa.png\td36a3ca700.png\n07a81dccdc.png\t3b5aad6cdc.png\t6da01cc490.png\ta026d16f0b.png\td36e380a0b.png\n07ac7e530f.png\t3b5c0e9308.png\t6da04e4c20.png\ta02d22da7b.png\td37250bba7.png\n07adb8811b.png\t3b5ca1ae0a.png\t6daeaa31c7.png\ta03090943e.png\td3757f9885.png\n07b06234e7.png\t3b5d5dd52b.png\t6db1a21e46.png\ta035f3c359.png\td376de0a55.png\n07b344fbb5.png\t3b5e9c6e05.png\t6db4d1afbe.png\ta0372e4e59.png\td376f8676a.png\n07b92247d3.png\t3b5f671680.png\t6db821bf5f.png\ta03848e48d.png\td377b39581.png\n07bb074759.png\t3b65804dd5.png\t6db8231d14.png\ta038a2d781.png\td3829ca10a.png\n07c3553ef7.png\t3b680d50c9.png\t6dba47cc58.png\ta03c96ed05.png\td3841e6c80.png\n07cbc65347.png\t3b69920d03.png\t6dc107059b.png\ta042e73fee.png\td384ce1a9c.png\n07d631693
c.png\t3b6e0e45c7.png\t6dc1cff938.png\ta04349cfc4.png\td385c199a3.png\n07d94f047a.png\t3b6ed42600.png\t6dc362f696.png\ta0447d64e2.png\td39077b54d.png\n07d9da93c6.png\t3b6f36dc01.png\t6dc54e0021.png\ta046868003.png\td390b1cbef.png\n07dafc9877.png\t3b71213533.png\t6dc67726cb.png\ta049e7255e.png\td390da215f.png\n07e08eee37.png\t3b7121a267.png\t6dc7a51dd9.png\ta04d442538.png\td3947fc006.png\n07e0ac7743.png\t3b73d2e8af.png\t6dca333081.png\ta056040228.png\td395e56a46.png\n07e48a0c94.png\t3b77ba573d.png\t6dcb56d4db.png\ta056a19d45.png\td397b6d85a.png\n07e73a3201.png\t3b7ceb9c90.png\t6dcf25329a.png\ta059a8be41.png\td39f51a99c.png\n07e7427152.png\t3b7fac390a.png\t6dd12af82e.png\ta05a91dc20.png\td3a1329a49.png\n07e99af922.png\t3b80fffae9.png\t6dd1746fdf.png\ta05ae39815.png\td3a7f52f0a.png\n07f00d6967.png\t3b85447c61.png\t6ddd6a2ec1.png\ta05da102a2.png\td3aaf303a1.png\n07faa3527e.png\t3b881cf889.png\t6de14923ed.png\ta05dea8cff.png\td3aba83506.png\n07fc2c8e9a.png\t3b88cf99b3.png\t6de3a6d961.png\ta05fe50177.png\td3acb2d561.png\n07fef26d18.png\t3b8c8f0a19.png\t6de89340a5.png\ta0637bbf74.png\td3ad12287c.png\n08017b8e25.png\t3b8d185300.png\t6dea52e54d.png\ta06546807e.png\td3ada660bd.png\n0804088f79.png\t3b92f96f2f.png\t6ded3d4b61.png\ta067c4ea5b.png\td3b47c38e2.png\n080577b945.png\t3b956837ed.png\t6df052d3ad.png\ta06f18de38.png\td3b4bcf92a.png\n080b5e2407.png\t3b9586e2ff.png\t6df6139389.png\ta070422df0.png\td3b6c08868.png\n0813911355.png\t3b9d95d5df.png\t6df9a9d1f5.png\ta0716df5df.png\td3b9892977.png\n08173ef66c.png\t3ba256ba46.png\t6dfaae4996.png\ta0723cef81.png\td3be640ed4.png\n081bdb0060.png\t3ba7e5581e.png\t6dfbacf10b.png\ta0729c7a02.png\td3c0ed351b.png\n082041a8eb.png\t3baa95e545.png\t6dfcbdbade.png\ta076699b79.png\td3c27f59da.png\n0822e176b2.png\t3bac4bd5af.png\t6dfd2f101f.png\ta07b40cd8a.png\td3c3988e54.png\n0823602363.png\t3bacefecc1.png\t6e00792276.png\ta07c1fd090.png\td3c6689f39.png\n0823abbc29.png\t3bb083c7b3.png\t6e030e29a9.png\ta07f9159e0.png\td3c87f3e56.png\n08274e82fb.png\t3bb4585abc.png\t6e03492fe6.png\ta081425976.png\td3c8980685.png\n0827ac0e62.png\t3bb4884ac5.png\t6e05b60c72.png\ta08783f5fd.png\td3c9529867.png\n0828fed727.png\t3bb5a46a23.png\t6e1232c356.png\ta089157e91.png\td3ccdbe2be.png\n082908c425.png\t3bb7347dab.png\t6e1433766d.png\ta0891f7e3e.png\td3cdf99eba.png\n0829b44678.png\t3bba888579.png\t6e1444d88d.png\ta089728353.png\td3ce9e576f.png\n082cb33f51.png\t3bbbe5e44d.png\t6e166d6284.png\ta08f3a707f.png\td3d19422ca.png\n082d2910a8.png\t3bbf82fc28.png\t6e1bb2e64a.png\ta099efaadf.png\td3d258ee8f.png\n082ea078ea.png\t3bc1fe41fc.png\t6e1c857c00.png\ta09eea259b.png\td3d5626477.png\n08371860e0.png\t3bc23c3ac7.png\t6e1e4abee9.png\ta0a13ec7de.png\td3d6ced0d3.png\n0839306166.png\t3bc80b1c12.png\t6e1f24469c.png\ta0a1b2118c.png\td3d7b32db0.png\n0839b6cf1a.png\t3bc996d7b7.png\t6e23b5482f.png\ta0a349ec44.png\td3d8adb2c2.png\n083c487e4a.png\t3bceaa0303.png\t6e262f320f.png\ta0a3f03553.png\td3d97e53d4.png\n083ffe7f02.png\t3bd0ebda02.png\t6e28a340f7.png\ta0a978309f.png\td3dc15fbc2.png\n0841984941.png\t3bd3216e16.png\t6e293e90b2.png\ta0ae63c9bc.png\td3e042cc9f.png\n084a5e8648.png\t3bde8847ad.png\t6e2ca15d41.png\ta0b00e7922.png\td3e4630469.png\n084cb5f737.png\t3bde973437.png\t6e339205f1.png\ta0b1d96545.png\td3e51484a2.png\n084cdc19f0.png\t3be0c0be8e.png\t6e3634da35.png\ta0b47461d9.png\td3e581916e.png\n084d3ed9c8.png\t3be3f006c8.png\t6e37ae1fde.png\ta0b4f9b664.png\td3edd4d0d2.png\n0852944481.png\t3be5267c36.png\t6e3a11fa65.png\ta0bd7123fa.png\td3f2336daf.png\n08539d8fd6.png\t3be52b19e1.png\t6e3a815b3e.p
ng\ta0c156a4c9.png\td3f3db7bd1.png\n085484b27f.png\t3be7193c50.png\t6e3bb2a882.png\ta0c20610e2.png\td3fb9d5fef.png\n0854a2336d.png\t3bf1419151.png\t6e3c46c2d3.png\ta0c27256fd.png\td3fbedf392.png\n085e7a048e.png\t3bf1dc1ffc.png\t6e3d4665bd.png\ta0c2b4a329.png\td3fdfb9e8f.png\n085ef8448c.png\t3bf345cd29.png\t6e3d814fd3.png\ta0c81d2a79.png\td3fec84000.png\n0860568fe5.png\t3bf70b71cd.png\t6e409af2cb.png\ta0cc288f83.png\td3ff38ba9b.png\n08640ede7a.png\t3bfe771dc8.png\t6e409df907.png\ta0cc846d2f.png\td4033ae7f2.png\n086c242b55.png\t3c0012cf47.png\t6e416c1a91.png\ta0cd55b2f4.png\td4069da6d3.png\n086deb3ccb.png\t3c01f5e5cf.png\t6e470db51c.png\ta0cfcb6546.png\td410a74e8c.png\n086df8ab93.png\t3c03ac9b56.png\t6e4a4f8843.png\ta0d21fb232.png\td422a9eb8f.png\n087075f0f2.png\t3c04beaa5b.png\t6e4ac641d6.png\ta0d3c0da28.png\td4259417a6.png\n087192e2ec.png\t3c0ae755c9.png\t6e4fe449ce.png\ta0d7c5dcbb.png\td42e7ee9cd.png\n087b30cbd3.png\t3c0b4f4c0e.png\t6e55491120.png\ta0d9eb23e1.png\td430cdaadc.png\n0881e78cac.png\t3c0f1d465a.png\t6e58134639.png\ta0e0452f1c.png\td431312cb9.png\n088a402727.png\t3c1097c378.png\t6e58a88e4d.png\ta0e18ad460.png\td43149afb9.png\n088baa4c44.png\t3c10a71f62.png\t6e5c564258.png\ta0e6d8b0a7.png\td433300064.png\n088f1c6ae1.png\t3c16374ea3.png\t6e5e055e6f.png\ta0e9cd5453.png\td43d1a6f66.png\n0892b7c9f6.png\t3c1abef969.png\t6e5f8caed6.png\ta0ea88f6a5.png\td43e2382c9.png\n0893024db8.png\t3c1d07fc67.png\t6e634a6fd2.png\ta0eca9a853.png\td43f845db5.png\n0898e36ee6.png\t3c1ed5cc1f.png\t6e65695d49.png\ta0ef83e8f7.png\td443579521.png\n08993551ae.png\t3c1ee5b008.png\t6e67571b91.png\ta0f050600e.png\td446acc411.png\n08a17ff7fb.png\t3c246c464d.png\t6e691f1825.png\ta0f1f0b1f8.png\td446b467a4.png\n08aa284d8a.png\t3c24e4dc89.png\t6e70299b98.png\ta0f5234b07.png\td44a4a62dd.png\n08af8b0df4.png\t3c2ea1a63f.png\t6e72c79c29.png\ta0f76db166.png\td451f0645b.png\n08b117294d.png\t3c2f5ba174.png\t6e72e53dda.png\ta0f7d1db58.png\td453812357.png\n08b3c441eb.png\t3c30b8e27b.png\t6e79a01f0b.png\ta0fa330a03.png\td454f0d70c.png\n08b4743736.png\t3c35e3ad94.png\t6e7bef0170.png\ta0fa989668.png\td45b4f3b85.png\n08b5985634.png\t3c3a01dc7a.png\t6e7e9b98a4.png\ta0fefcdca0.png\td45c69bd9d.png\n08b63f1442.png\t3c3dedc0cc.png\t6e807dbef1.png\ta100455c2c.png\td4602e9eb0.png\n08bafa2eee.png\t3c3e74faf6.png\t6e8334b5ce.png\ta10b932fb3.png\td462737ce8.png\n08c2070b20.png\t3c4352ff4f.png\t6e86ef999f.png\ta112a63d2d.png\td465b396f2.png\n08c5eccda6.png\t3c44e011af.png\t6e86fed535.png\ta1196a546e.png\td468d401ab.png\n08caf9890f.png\t3c4acab200.png\t6e8d40dc65.png\ta11a5bab72.png\td46c133c20.png\n08ce8991f1.png\t3c4c3153af.png\t6e9137c28d.png\ta11b83ca27.png\td4703bff2c.png\n08d073b404.png\t3c4df17a32.png\t6e9164c1f3.png\ta11c4322e8.png\td4711d9cd4.png\n08d4933465.png\t3c4eb3116f.png\t6e91f1eba3.png\ta11da27e7b.png\td4762e96b3.png\n08d6cb7e93.png\t3c50c38749.png\t6e9484138d.png\ta11f564ac5.png\td476a47369.png\n08dc3c9a23.png\t3c50ddbc65.png\t6e98da23b9.png\ta12ab05d74.png\td47c78058a.png\n08de6285b8.png\t3c565576bb.png\t6e99d91e17.png\ta12c152cac.png\td480770008.png\n08e529fbea.png\t3c58a45fc6.png\t6e9aceff2f.png\ta12c9895f2.png\td48134038d.png\n08ea9862c9.png\t3c5e3124c7.png\t6ea0a105e6.png\ta12cbb17e5.png\td484fbb030.png\n08eb8cc9e5.png\t3c66e5ab54.png\t6ea5191e2a.png\ta12d2c10a9.png\td488ebc31b.png\n08f2e9fcc4.png\t3c6ea4c16d.png\t6ea89a9eaa.png\ta12f7f7f94.png\td4939a8bb6.png\n08f3346b8a.png\t3c6fe4b7ba.png\t6eb24b0585.png\ta1323b0391.png\td4952a2854.png\n08f3e1580b.png\t3c77445aa6.png\t6eb2d4d1e3.png\ta1351b1979.png\td49547ef96.png\
n08f5aef292.png\t3c77cbb39a.png\t6eb4b1e1b0.png\ta139ee59f1.png\td49a9c4664.png\n090075ab14.png\t3c7ce000d3.png\t6eb7db0910.png\ta13bd1e1fa.png\td4a1f3d6cd.png\n09020d5904.png\t3c7d16efb4.png\t6ebb6826bd.png\ta13e062836.png\td4a60589bb.png\n0902703ce8.png\t3c7e2d60a7.png\t6ec1f0c03e.png\ta13ede84dd.png\td4a84662ee.png\n0905843ee0.png\t3c7ee1e49e.png\t6ec4386cbb.png\ta141ed0c99.png\td4acb30303.png\n090f67b75d.png\t3c80de29aa.png\t6ec4de436b.png\ta1465a9036.png\td4afd342c1.png\n0910a5f8d6.png\t3c84d40142.png\t6ec6e26ddd.png\ta149b73993.png\td4b07eba7b.png\n0910edcf69.png\t3c89b5b635.png\t6ecb722809.png\ta14d2ea3b1.png\td4b3a58d5e.png\n0914ccfb51.png\t3c8cf08665.png\t6ecba60505.png\ta14daf13a3.png\td4b9cb057f.png\n09152018c4.png\t3c90e0da69.png\t6ed0ef14db.png\ta14dfc9c23.png\td4be3d7e50.png\n0915403c23.png\t3c916033c2.png\t6ed4e5215a.png\ta1526d3c6e.png\td4bef9a35f.png\n091aaece26.png\t3c91c4805d.png\t6ed63018b1.png\ta15415f3c9.png\td4c0bfa755.png\n091b46d588.png\t3c9218e3d2.png\t6ed76bb3fd.png\ta157bacfbe.png\td4c21d0aa2.png\n0923f7bef9.png\t3c937cd300.png\t6edb010212.png\ta1585a58b8.png\td4c235a41e.png\n0926af3a6c.png\t3c95ea6205.png\t6edba13f1b.png\ta1593593f2.png\td4c351a04c.png\n0926b7396b.png\t3c975c06cc.png\t6edecbefa0.png\ta15ac41235.png\td4c67575c4.png\n0929430755.png\t3c97c52d0f.png\t6ee68db123.png\ta15b73d4bb.png\td4ce2a2530.png\n092a81dc1c.png\t3c9931bca2.png\t6ee6f38517.png\ta15dc1875a.png\td4d2ed6bd2.png\n092c6bc422.png\t3c9c8cdc54.png\t6ee73fdc3e.png\ta15eff2958.png\td4d34af4f7.png\n092d7bc330.png\t3ca09d9e5c.png\t6ee755c67d.png\ta163126d96.png\td4d825fe95.png\n093433e9d7.png\t3ca18312a0.png\t6eeddeeb29.png\ta1646cbb62.png\td4d964f62d.png\n0937b1df3f.png\t3ca252f526.png\t6eeeda7f4a.png\ta1660f15a4.png\td4db8ada5e.png\n09393f1bb5.png\t3ca6956aac.png\t6eef02f2b2.png\ta16611557b.png\td4dcc3fb31.png\n0939b133e1.png\t3ca7a95c42.png\t6eef596fc8.png\ta16b0e43e9.png\td4e045055f.png\n093e16bfdc.png\t3cae60b2e3.png\t6ef46555e2.png\ta16bbd70da.png\td4e2400b25.png\n093e17ad49.png\t3cb4906b6e.png\t6efc5fc61d.png\ta16cf0126e.png\td4e723b4c7.png\n093f8b1d6e.png\t3cb59a4fdc.png\t6efd749560.png\ta16f6fa6f9.png\td4e9588d7c.png\n0942db4eed.png\t3cbad9e965.png\t6f07ae886b.png\ta173685785.png\td4eaffbe2f.png\n094511c4a5.png\t3cbade83c1.png\t6f091c04b8.png\ta17a67f57e.png\td4ec7810c4.png\n09452c6408.png\t3cbbc5ed1d.png\t6f0c88962b.png\ta17be3fe58.png\td4ed19a4d4.png\n09456ecccf.png\t3cc01963e1.png\t6f0d7bdd29.png\ta17c068d3a.png\td4f2ed2ce2.png\n09493c684a.png\t3cc03621b5.png\t6f124c4dd2.png\ta17c4dc16f.png\td4ff78ea8a.png\n094a5b30fa.png\t3cc59342bd.png\t6f18215e39.png\ta18073f2a2.png\td5023a3662.png\n094cf586ca.png\t3cc902c1b4.png\t6f19a15faa.png\ta18b7d9b6e.png\td502c7113a.png\n0951f85203.png\t3cca3f86b6.png\t6f19d11809.png\ta19163fc11.png\td50b6c3267.png\n09541ce79b.png\t3ccbc353af.png\t6f2071d2b5.png\ta19987ca28.png\td50c9d0919.png\n095473ab35.png\t3cced1ea46.png\t6f26c4be9e.png\ta19a483998.png\td50e55dc71.png\n0958fdc7bf.png\t3ccf256507.png\t6f2a1691c4.png\ta19dbb48cd.png\td512386a1b.png\n095b44b0ce.png\t3ccf3f7a15.png\t6f2bded59b.png\ta19df661b7.png\td514a30db3.png\n0960ae5218.png\t3cd0ce6e18.png\t6f2f98381f.png\ta1a0480cd9.png\td514b10317.png\n0962d90501.png\t3cd4d01b7d.png\t6f334750e4.png\ta1a950ce34.png\td51a4d6c04.png\n0965a4cfdf.png\t3cd84afe62.png\t6f39c615f9.png\ta1abc4d779.png\td51ef97f33.png\n096788e6c4.png\t3cd8674cb2.png\t6f40647181.png\ta1ac36929f.png\td520e24c96.png\n0969783d07.png\t3cdbe07a1a.png\t6f438c26f8.png\ta1ad0767f1.png\td526a6bf6a.png\n096e56929a.png\t3cdd50ddf0.png\t6f
452a72e9.png\ta1ae077026.png\td528409626.png\n097031e22e.png\t3ce13de7cb.png\t6f4612137e.png\ta1b071d24f.png\td529ba7f74.png\n0970725471.png\t3ce41108fe.png\t6f48dd2861.png\ta1b5786252.png\td53032aabd.png\n0971f6b351.png\t3ce657b8f7.png\t6f4fce059a.png\ta1bd5af287.png\td531fa41aa.png\n0977568b54.png\t3ce8a8bdf4.png\t6f509c9790.png\ta1bd9ad852.png\td5342efd47.png\n097989f097.png\t3ce95c74d5.png\t6f533a271d.png\ta1c1dad851.png\td538594e14.png\n09799a998e.png\t3cea2b2f8b.png\t6f533e9f3e.png\ta1cf9fdfb9.png\td53dabc268.png\n097f02867c.png\t3ced3ed63e.png\t6f582d4023.png\ta1d305862c.png\td53df2727c.png\n0981026c4f.png\t3cee398114.png\t6f5865ccfb.png\ta1d381f0fd.png\td53f293494.png\n098194923a.png\t3ceec05d58.png\t6f5968f78b.png\ta1dae195e7.png\td53feca9be.png\n09821df016.png\t3cf0203cfb.png\t6f5c2712b1.png\ta1db1ba4e9.png\td542d8517b.png\n0988338304.png\t3cf05e05b7.png\t6f5da258a5.png\ta1dc7c8cab.png\td54615afef.png\n098c5f3e75.png\t3cf20ce659.png\t6f5eebb896.png\ta1dd97eff3.png\td547702609.png\n099090a5ac.png\t3cf7c01481.png\t6f6ed73e26.png\ta1df99dffd.png\td548a1b2e9.png\n09923f804b.png\t3cffb3227b.png\t6f7396cb07.png\ta1e16dc7d6.png\td553c91194.png\n099342689f.png\t3d033df4c0.png\t6f750b8a0d.png\ta1e1b90a18.png\td553d1a918.png\n099b96c480.png\t3d059ab2cf.png\t6f755aad25.png\ta1e32a6815.png\td55dcb72ec.png\n099bce60e0.png\t3d05dc9c29.png\t6f75868949.png\ta1e3e1af04.png\td55f9bcd21.png\n09a2862301.png\t3d08ee7fe7.png\t6f770862bf.png\ta1e640f0b7.png\td56050d3ab.png\n09a53387aa.png\t3d0acb5c90.png\t6f78817a2d.png\ta1e931d1da.png\td566980876.png\n09a7af5a1b.png\t3d0eb2ac5b.png\t6f79e6d54b.png\ta1ebb51291.png\td567492570.png\n09a9bd4ffc.png\t3d0f8c81ce.png\t6f843d8fdd.png\ta1eca69525.png\td567ba2b49.png\n09aa9bde24.png\t3d0fec4a1e.png\t6f8f441aaa.png\ta1ece99af1.png\td56b977756.png\n09b712b3a2.png\t3d12cf2051.png\t6f90ec390c.png\ta1ee362660.png\td56c4da533.png\n09b7182dc5.png\t3d158db770.png\t6f917328a7.png\ta1ef786b98.png\td56d366f80.png\n09b9330300.png\t3d18153e16.png\t6f931611f7.png\ta1efe19ede.png\td56dbda1b5.png\n09be9b3f94.png\t3d19cc455e.png\t6f9541ff15.png\ta1f432ecc5.png\td5777e4789.png\n09bf956831.png\t3d1e2064c8.png\t6f96fea158.png\ta1f51211b1.png\td57917d8d8.png\n09c4201b4a.png\t3d21e1dc58.png\t6f97112511.png\ta1f55a188e.png\td57ab9cd68.png\n09ce1453ea.png\t3d254d5311.png\t6f9e3bb2f7.png\ta1f8f97ec0.png\td57d30d6a1.png\n09ce3f15ee.png\t3d2b243494.png\t6fa955a605.png\ta1fcd43f33.png\td583d291ea.png\n09cf21afa7.png\t3d2bc98c66.png\t6fabf8d88a.png\ta1fe3de7bb.png\td58653f803.png\n09dbb503a8.png\t3d329cdce2.png\t6fad2e31a1.png\ta20cced9fb.png\td588077d84.png\n09e29fbb74.png\t3d3359afd8.png\t6fb061814b.png\ta20f3d52d0.png\td58addc3c9.png\n09e2c7bfa3.png\t3d339265c4.png\t6fb1cab30a.png\ta2116f6ae2.png\td590a3a3ba.png\n09e6cfad6b.png\t3d348e6dc0.png\t6fb1ecf627.png\ta218c04841.png\td5992222c3.png\n09e83b7ec9.png\t3d40103812.png\t6fb29028b3.png\ta21ae8cb6b.png\td59fe2fb62.png\n09eb37dca0.png\t3d4093dcdb.png\t6fb4c35e0e.png\ta21afccc50.png\td5a283c553.png\n09ee30a08d.png\t3d414f6b50.png\t6fb522de93.png\ta21e1cf8d4.png\td5a82ebebf.png\n09ee5831f4.png\t3d42b00d47.png\t6fbf6c40cd.png\ta227e7f1da.png\td5aaeec7b0.png\n09ef36d784.png\t3d477a4128.png\t6fc195d3de.png\ta23192d1eb.png\td5afcea9a2.png\n09f0f1f414.png\t3d49c23446.png\t6fc2b9d6fd.png\ta23223b144.png\td5b11bbfe0.png\n09f1675cfb.png\t3d4bc58424.png\t6fc88fcbed.png\ta2322778e0.png\td5b1a635a9.png\n09f5a0410c.png\t3d53b0ae57.png\t6fcb932d11.png\ta2329d9497.png\td5b2dbbbf6.png\n09fbbb4b20.png\t3d558ae0c2.png\t6fcf6097f2.png\ta233a60deb.png\td5b2e
07776.png\n09fbd62e87.png\t3d57917af7.png\t6fd4ac5516.png\ta2350575a3.png\td5b547eaf2.png\n09ff0bf1b8.png\t3d5d820fa4.png\t6fd673ec85.png\ta235adfab3.png\td5b845a346.png\n09ff67448c.png\t3d5f63e0be.png\t6fd888afc5.png\ta2393894ff.png\td5b94e470f.png\n09ff9f2dc2.png\t3d6452e27a.png\t6fe2b681e4.png\ta239bfe21a.png\td5bcc73a21.png\n0a050d38ca.png\t3d64a3515d.png\t6fe3868d58.png\ta23b79953c.png\td5c50fa71c.png\n0a072e6c9b.png\t3d64b5afb9.png\t6fe4017680.png\ta23ce0105e.png\td5c8c3579c.png\n0a0814464f.png\t3d65204ea2.png\t6fe5709361.png\ta24080b4a0.png\td5cb770b8c.png\n0a0cc52eca.png\t3d653d9577.png\t6fe5d74176.png\ta2437f978d.png\td5d4ad6b25.png\n0a13effcb6.png\t3d65b48db5.png\t6fe6f537f0.png\ta245c6d001.png\td5d599c225.png\n0a1742c740.png\t3d6a897901.png\t6fe9bc9132.png\ta2460b1e74.png\td5da2cf109.png\n0a18b314fc.png\t3d6d400816.png\t6fea77b2a6.png\ta248b69897.png\td5da791643.png\n0a194d3828.png\t3d6e96149e.png\t6febaa8dc2.png\ta24e316abb.png\td5dbfe8494.png\n0a19821a16.png\t3d73f32e1e.png\t6fef9bc131.png\ta25158dec8.png\td5dcc34739.png\n0a1b35bb54.png\t3d74dea2f3.png\t700614dfbb.png\ta2530b4c45.png\td5dd97abfb.png\n0a1b6ae0d5.png\t3d7736db3e.png\t700a2f62f1.png\ta25b549073.png\td5dec361aa.png\n0a1ea2ce4f.png\t3d781526c2.png\t7010d3e8d9.png\ta25e0da562.png\td5e433e1bd.png\n0a23a32d1a.png\t3d796eaf07.png\t701138cbde.png\ta25e9e12a7.png\td5e5df065d.png\n0a29abfa13.png\t3d7e68aca2.png\t70147d1d9d.png\ta265577181.png\td5e72925d7.png\n0a31d7553c.png\t3d80641a99.png\t7019c22c5e.png\ta266a2a9df.png\td5e7b5a6a4.png\n0a39c86312.png\t3d84ed9c15.png\t701c8ddc8a.png\ta26ee36d15.png\td5ec79a1d9.png\n0a3a8a5f37.png\t3d86cf245b.png\t701cfebd25.png\ta2728f5a2b.png\td5f32fb9d8.png\n0a3b7696cf.png\t3d882b8d41.png\t702209f0b3.png\ta27600b0b2.png\td5f7d673f3.png\n0a4082a7d0.png\t3d897625dd.png\t702315d0b7.png\ta27e64c2f7.png\td5f877964c.png\n0a41de5e3b.png\t3d8e486c58.png\t70239a8c97.png\ta27e70b9ad.png\td5fbaeab9c.png\n0a45628c37.png\t3d960d5148.png\t7039307d0a.png\ta282128493.png\td5fdf520e1.png\n0a4d368a58.png\t3d969bc3cb.png\t703cc2cdc7.png\ta282e85e8d.png\td600ee1df1.png\n0a4fd33486.png\t3d99801810.png\t7049da9998.png\ta28b050491.png\td6063c665b.png\n0a52e1bf02.png\t3d9c671ce1.png\t704de84fa8.png\ta28c2ff4e2.png\td6066d27a9.png\n0a59145585.png\t3d9cc215e5.png\t70538e4c47.png\ta28e9a56a4.png\td606f4a578.png\n0a5b5df54b.png\t3da64841d8.png\t7053b28de2.png\ta291ef5d83.png\td607bab433.png\n0a5eac8775.png\t3da729cae9.png\t7054ae209d.png\ta29480d00e.png\td607fd7e0b.png\n0a5f49ac6e.png\t3da8005edf.png\t7054f9226c.png\ta2976c4ec4.png\td60db2ca4a.png\n0a64d578b5.png\t3da8ee4c09.png\t7056ecb2cc.png\ta2a012f128.png\td61250af0c.png\n0a6605d064.png\t3dac65a9fd.png\t70596c3dd2.png\ta2a2ec508b.png\td615842865.png\n0a66231f08.png\t3dac7c4094.png\t705eb5d67a.png\ta2a8d65144.png\td61a3ab831.png\n0a69d50363.png\t3dacf22e6d.png\t70618c36ee.png\ta2a9c0c00a.png\td61b459e61.png\n0a6b2cea49.png\t3dae882971.png\t706321cca9.png\ta2acd2c458.png\td61b895fcd.png\n0a746dfb52.png\t3db26cb791.png\t706b40bddc.png\ta2ad71b576.png\td61e5c8296.png\n0a75bbbb74.png\t3db89d9c92.png\t706bee4c9c.png\ta2b2d4fe99.png\td621c19bb8.png\n0a797ef290.png\t3dbbc237c5.png\t706f5ea90f.png\ta2b3dffa6f.png\td623222fb0.png\n0a7c09181a.png\t3dbd807584.png\t707361a693.png\ta2b7af2907.png\td62486399c.png\n0a7e067255.png\t3dbf4d3e57.png\t7074999e14.png\ta2b82c3cf3.png\td625028eb4.png\n0a7fa5c5e1.png\t3dc1fea85e.png\t7074e402ca.png\ta2bac51579.png\td626773552.png\n0a8fbf4379.png\t3dc2ae7514.png\t70764e7a6d.png\ta2bca0bbcf.png\td6273a678a.png\n0a90963914.png\t3dc343c5
03.png\t70779a50f1.png\ta2bce9d94a.png\td6275b7ae5.png\n0a9712e8a5.png\t3dc4b77bb5.png\t707ae290b6.png\ta2c1aaa15d.png\td629a56d3c.png\n0a98fc89ab.png\t3dc568193f.png\t707c0b494f.png\ta2c697968b.png\td62ad254c6.png\n0a9992ce50.png\t3dc8ca137c.png\t707da802b9.png\ta2c8d323cc.png\td6344fd906.png\n0a9fc67f17.png\t3dcc2fdb76.png\t707f14c59a.png\ta2ca5eec91.png\td63629d428.png\n0aa2cfd2e2.png\t3dd0afef2c.png\t7089f09d5b.png\ta2ce75daaa.png\td63778e374.png\n0aa3bfe21d.png\t3dd350a6f2.png\t708da79f31.png\ta2d7f0563a.png\td6379c1f18.png\n0aa49c1f5b.png\t3ddb2ff7a4.png\t708dcf36f9.png\ta2da67afff.png\td639f4c3ae.png\n0aa90143db.png\t3ddc3ad0b5.png\t70958cacd4.png\ta2e625f222.png\td63e5779da.png\n0aa92c32d3.png\t3ddf2a9990.png\t709b6f8868.png\ta2e850fa2a.png\td6437d0c25.png\n0aab0afa9c.png\t3de0c773e1.png\t709c9537c9.png\ta30307864e.png\td6471dc0a3.png\n0aabdb423e.png\t3de4ee4e0c.png\t709f59e611.png\ta3079aded1.png\td648667e46.png\n0aada349b6.png\t3de6277060.png\t709fcc80e5.png\ta30967eb65.png\td649ce6484.png\n0ab06437bc.png\t3de99572f3.png\t70a32506ab.png\ta30a23a310.png\td64b16435d.png\n0ab43360f6.png\t3df20bcdfb.png\t70a4a1eb64.png\ta310519345.png\td64b58516c.png\n0ab5e14937.png\t3df2736429.png\t70a891a3b9.png\ta317a82463.png\td6547548a1.png\n0ab8828eb5.png\t3df34446db.png\t70ae581728.png\ta31d85fa70.png\td65973824b.png\n0ab8b23845.png\t3df62de11d.png\t70afc514d2.png\ta31e485287.png\td65ad5c777.png\n0ab9e80e8c.png\t3df9f3edc2.png\t70b3951b2f.png\ta325af7cd6.png\td65df3a8cf.png\n0abe6c137d.png\t3dfca0b816.png\t70b5a9919a.png\ta328d04452.png\td65f579350.png\n0ac4df785f.png\t3dfd1c311a.png\t70b6a2347f.png\ta32a4a2256.png\td665b7d9c2.png\n0ace75e142.png\t3dfdde9c1a.png\t70b7e1c459.png\ta32b0d0cd7.png\td666f65d34.png\n0ad0876244.png\t3dfef75f8e.png\t70b96ce692.png\ta3328ab6d1.png\td669ab34d4.png\n0ad15731c4.png\t3dff1a7114.png\t70be68d0a1.png\ta333c87428.png\td66bb0c151.png\n0ad38c8f25.png\t3dff831f3c.png\t70c082d59e.png\ta336eeacdf.png\td66e51d456.png\n0ad7dd5b44.png\t3e01678748.png\t70c273f62a.png\ta33e7de189.png\td66e58294c.png\n0adb490de5.png\t3e04a6cc24.png\t70c40f648b.png\ta33ef5ed68.png\td67b2eaa3c.png\n0adb8d3c2c.png\t3e06571ef3.png\t70c6682b69.png\ta33fe0952a.png\td67dd7da95.png\n0adcdbf194.png\t3e0743ab02.png\t70c6e41609.png\ta340d1d4ab.png\td67e3a11d8.png\n0ae08cc1b2.png\t3e11cc14db.png\t70c7bb7616.png\ta342e04707.png\td68a5abbe9.png\n0aea45ba8f.png\t3e1b6580f5.png\t70c7ead798.png\ta344f31c96.png\td68c08baec.png\n0aebb572f8.png\t3e1c11cfa8.png\t70c8e14243.png\ta346964cc6.png\td694c84b1f.png\n0aedf52acf.png\t3e1cfd2ccb.png\t70c92bc80f.png\ta34b03c4db.png\td69556b746.png\n0aef3dff6a.png\t3e1d9b88f7.png\t70ca23bd65.png\ta35093b0f8.png\td6987190ea.png\n0af4a2ad0b.png\t3e1e8416e0.png\t70ca8ebf7b.png\ta350bf9ce2.png\td699f66ea1.png\n0af60a2408.png\t3e227bf413.png\t70cb03f04c.png\ta35792b156.png\td69abbca50.png\n0afbccca04.png\t3e2400afbf.png\t70cc45bc3a.png\ta35ba23a41.png\td69b835df1.png\n0afc8c6d3f.png\t3e2d048107.png\t70cef9fcb6.png\ta35cab481b.png\td69c382be5.png\n0afffbf3a7.png\t3e3431be2f.png\t70cf57ee68.png\ta35dded5b4.png\td69cf8151e.png\n0b03165d15.png\t3e34b5b49b.png\t70cf807d0e.png\ta3605816c1.png\td69d9cf365.png\n0b0b0362be.png\t3e34c60f5f.png\t70d9a282d1.png\ta36673b867.png\td69dc03388.png\n0b0e4210b3.png\t3e34caf528.png\t70db04a203.png\ta366e0eeae.png\td69fe4565f.png\n0b10d5525d.png\t3e3d120907.png\t70e5bd9d27.png\ta36735c684.png\td6a0d579f1.png\n0b118ea7b7.png\t3e44639703.png\t70e6f8f3bc.png\ta369e4ece1.png\td6ad336745.png\n0b11c85eb7.png\t3e47574443.png\t70e9442b06.png\ta36c68a4fa.
png\td6b36a5082.png\n0b16ac0482.png\t3e47fdb169.png\t71022f79b0.png\ta371ec6aee.png\td6b4003095.png\n0b176124c4.png\t3e4bd3b6e5.png\t71033f6fbd.png\ta37249665e.png\td6b48fc0b9.png\n0b1b18a835.png\t3e4ea2e9b8.png\t71080dbb80.png\ta37df3d8bc.png\td6b87a521f.png\n0b1b207e5e.png\t3e503f55a8.png\t710fc67627.png\ta37e566b4b.png\td6b9a9ab7e.png\n0b1c21a106.png\t3e53d236d3.png\t71108e4178.png\ta388900be9.png\td6bcecfcfe.png\n0b1f1059ca.png\t3e56432fea.png\t7110db12b9.png\ta38c6bd807.png\td6cae22fc0.png\n0b20585c5a.png\t3e59ba039c.png\t7113090556.png\ta38ec5925e.png\td6cba9f9de.png\n0b293581b5.png\t3e5e5bc3be.png\t7114209cf0.png\ta39388bd3f.png\td6cfbda4bf.png\n0b2a0711a2.png\t3e5eac5296.png\t711470dabc.png\ta393f5609b.png\td6d62e9b41.png\n0b2fbd6e26.png\t3e60aff57c.png\t71150da04b.png\ta3945cc9c8.png\td6dc342f83.png\n0b37e26016.png\t3e613b4906.png\t71158cec61.png\ta3982b9d36.png\td6dcd4da21.png\n0b4093a820.png\t3e635c129e.png\t711c478c93.png\ta39866f50c.png\td6ddf45bf5.png\n0b40a9afe5.png\t3e6a6c3c9b.png\t711c505cf4.png\ta39c858e2c.png\td6e01e97bd.png\n0b410a0dc2.png\t3e6b6146a4.png\t711d860055.png\ta39dc1c7d1.png\td6e2bb1586.png\n0b41b59f85.png\t3e70505ba4.png\t711e8ac5ba.png\ta39f01e911.png\td6e385a87e.png\n0b42061620.png\t3e70579f6a.png\t711ef6f24e.png\ta3a6c8f3af.png\td6e3f898b0.png\n0b449cb641.png\t3e71bbc1d5.png\t7127a65f6f.png\ta3a7f49081.png\td6e41f3c45.png\n0b45162089.png\t3e72a48ef4.png\t7128b1506e.png\ta3a8190a03.png\td6eb3700b1.png\n0b45bde756.png\t3e741873b1.png\t71311cdce9.png\ta3ad1422f5.png\td6ec32a8b3.png\n0b4b307e28.png\t3e75a0f7d4.png\t71319b35e8.png\ta3adfce43e.png\td6f318bd6f.png\n0b4b5e2bb1.png\t3e7f8f6b33.png\t7134e4acac.png\ta3b2bb97ac.png\td6fdb80bae.png\n0b4d1f5071.png\t3e7fa000c7.png\t7135ca553a.png\ta3b37496d0.png\td7099eb334.png\n0b584a37c8.png\t3e821f8642.png\t713faefb03.png\ta3b4801717.png\td70a2f1f23.png\n0b5988d7b0.png\t3e84576386.png\t71408d1c56.png\ta3b91acb96.png\td710d1af61.png\n0b59f5ec03.png\t3e85318809.png\t7140b36ff6.png\ta3bbf5c708.png\td711226cda.png\n0b5da62449.png\t3e8651d250.png\t71416f3e05.png\ta3bea8f266.png\td71590e987.png\n0b633fad17.png\t3e871346f4.png\t71432420ed.png\ta3bfc5a549.png\td7196745e5.png\n0b64e69d0e.png\t3e87545802.png\t71438f26ec.png\ta3c26d86cb.png\td71ca6d32c.png\n0b651bd775.png\t3e8ab52b8a.png\t7145f7e956.png\ta3c332fe5a.png\td71e91efc0.png\n0b6bb5c85a.png\t3e8b104f48.png\t71468b522d.png\ta3c6ed959e.png\td7215eb198.png\n0b70d67563.png\t3e8b5729dc.png\t714a530736.png\ta3c895cf5e.png\td72554e86a.png\n0b73b427d1.png\t3e8ca34142.png\t71507ccc96.png\ta3cc6ecf52.png\td728765746.png\n0b7459a0af.png\t3e8cdc7e6a.png\t71510358a2.png\ta3cfc9ec3a.png\td72a290951.png\n0b75c88e6c.png\t3e8f4f7c8f.png\t715295d6ab.png\ta3d7e18515.png\td72bc1190c.png\n0b7b00ebfd.png\t3e95d4417b.png\t715dd853a3.png\ta3de137406.png\td72c5b3623.png\n0b83548c34.png\t3e991d73fa.png\t715df66304.png\ta3debd62c6.png\td72dedfa75.png\n0b835fe552.png\t3e9e8b4fc3.png\t715e37b25b.png\ta3df74dd31.png\td72f61a21b.png\n0b86387a02.png\t3e9fee1df1.png\t715faa9d3f.png\ta3dfaa5fab.png\td73189228f.png\n0b8ec6ac7a.png\t3ea3db144d.png\t7161ad1940.png\ta3e067edd8.png\td734013742.png\n0b925d4e8d.png\t3ea50ff362.png\t7161efbf0a.png\ta3e0a0c779.png\td7398cfded.png\n0b9424abbb.png\t3eabf280ed.png\t7163dee1f4.png\ta3e5e1d301.png\td73e3ccdec.png\n0b9686de0e.png\t3eaee773ac.png\t71644debd8.png\ta3e85ba277.png\td7470e757f.png\n0b9829c5db.png\t3eb3e346ca.png\t716575974d.png\ta3e89a273d.png\td7506a6a38.png\n0b9874fd4f.png\t3eb7ad3107.png\t716eb65725.png\ta3eedc216e.png\td752eaa82a.png\n0b9981aba1.png
\t3eb810a03f.png\t717270753b.png\ta3eff91b5e.png\td7579484b5.png\n0b9a1a7a83.png\t3eb9c63046.png\t7172c509f6.png\ta3f3157a83.png\td76292283c.png\n0b9cb1205a.png\t3ebce107cb.png\t71737fb196.png\ta3f4d92597.png\td7652e36f2.png\n0b9fbd9b42.png\t3ec0423139.png\t7173e8bb6c.png\ta3f894216b.png\td765e53b37.png\n0ba253bf47.png\t3ec16830a7.png\t7177b9576f.png\ta3f8992e72.png\td768234bb5.png\n0ba2681782.png\t3ec30a713c.png\t717bd5d6b1.png\ta3f984c72f.png\td76ab972db.png\n0ba541766e.png\t3ec79673da.png\t7187f4c02c.png\ta3ff13b1e1.png\td76b5e2a7f.png\n0bab031d59.png\t3ecb0db5bf.png\t71890f2e7c.png\ta401ccdd89.png\td76c91c5b2.png\n0bb14b8c19.png\t3ecd9b4931.png\t718d3ec452.png\ta4066d42fc.png\td76fdc0c66.png\n0bbaa6d56a.png\t3ed16c7933.png\t718f18f71e.png\ta40813ad63.png\td7702122e6.png\n0bbbd7b589.png\t3ed46948ce.png\t71930621da.png\ta40e1112da.png\td7757dc339.png\n0bbee760e3.png\t3ed6168b22.png\t7194629d17.png\ta40e944791.png\td7778398ed.png\n0bc80a2050.png\t3ed6d1c7df.png\t7195bd6223.png\ta4104dec88.png\td778928df6.png\n0bca43f7ff.png\t3ed7b7577b.png\t71991595fa.png\ta41122dbd8.png\td77eb315ac.png\n0bcde7fb0f.png\t3edc3edf92.png\t71a00195c6.png\ta41339b815.png\td78179dc1c.png\n0bcf0f39d5.png\t3ee086e2ac.png\t71a31f4356.png\ta41357f553.png\td7826e339b.png\n0bcfa63f88.png\t3ee1cfb6a6.png\t71a360306a.png\ta41685b927.png\td782a6ea7d.png\n0bcfadfd28.png\t3ee3375dce.png\t71a3a44fc8.png\ta425d50aac.png\td78c366a0d.png\n0bcfcf34d7.png\t3ee4de57f8.png\t71a5b1da57.png\ta42a45b613.png\td7927f11dd.png\n0bd08d4d2d.png\t3ee59e7485.png\t71a7c521ef.png\ta433189cb9.png\td792df5590.png\n0bd1e720bc.png\t3ee7e5d5ec.png\t71a8710bfe.png\ta43568e310.png\td795565f86.png\n0bd264d1a0.png\t3ee87f1ea6.png\t71aa85d5a4.png\ta438d5e984.png\td79c094faf.png\n0bda89116e.png\t3ee9a8ddb1.png\t71abe009c6.png\ta43ae84282.png\td79d1a972c.png\n0bdd44d530.png\t3eea2cd3e4.png\t71b381b1bb.png\ta43ca2ef73.png\td7a43dab4a.png\n0bdf279a26.png\t3eeb599831.png\t71b7cc2fdc.png\ta43ca4f01d.png\td7a53f942a.png\n0be00ec340.png\t3eef3e190f.png\t71ba21f967.png\ta43ec97f2a.png\td7a9a97b4a.png\n0be104dd37.png\t3eefb6f02d.png\t71bab9f311.png\ta44270ae5e.png\td7b2a04db3.png\n0be16f65f8.png\t3efcef3f21.png\t71bdecc536.png\ta442c472c5.png\td7b84ee2cd.png\n0be517d0e6.png\t3efd2d1d03.png\t71c0f47b6c.png\ta443637ddf.png\td7bf279db2.png\n0be5323c3b.png\t3eff3c0e27.png\t71c0fd5516.png\ta443a58ff2.png\td7c4d4e1b2.png\n0be8956e00.png\t3eff6468cc.png\t71c5288092.png\ta4442fef48.png\td7c57f676e.png\n0be923c49b.png\t3f01db328a.png\t71cd2e2d3d.png\ta444fdc56e.png\td7ca0bfe02.png\n0bec31a789.png\t3f05ca9eda.png\t71ce444528.png\ta4457eaa05.png\td7cc4296b2.png\n0bed5f2ace.png\t3f0d544cae.png\t71ce93a7b7.png\ta44587b9ca.png\td7cf3bf4e9.png\n0bf3bff34d.png\t3f0debff7e.png\t71cebdc720.png\ta446aa0ac8.png\td7d3ead58e.png\n0bf4710ee2.png\t3f0e97ff77.png\t71d10659ad.png\ta447f49818.png\td7d6a4e742.png\n0bfed731a5.png\t3f12023654.png\t71d1424c5e.png\ta453819463.png\td7d6cb4a8e.png\n0c02f95a08.png\t3f18e1afb2.png\t71d34fc906.png\ta4546fe71f.png\td7de9e7fd6.png\n0c0610be6a.png\t3f1fe84513.png\t71d3bf0ea8.png\ta4554beb3c.png\td7e63b028a.png\n0c065a063b.png\t3f20bbb54d.png\t71d46b2a4b.png\ta45a69e464.png\td7e6e1323a.png\n0c06b7f8b4.png\t3f22f2ef0e.png\t71d5d55932.png\ta45db71245.png\td7e72cfc7e.png\n0c089f7c1b.png\t3f26445878.png\t71d7482240.png\ta45f297827.png\td7e93ff9c2.png\n0c096431ce.png\t3f26d3ea52.png\t71de4aefac.png\ta468149f40.png\td7eb8b4a95.png\n0c0f001366.png\t3f277a3bb2.png\t71e17c98ff.png\ta4707da97f.png\td7ecb1e319.png\n0c10b00c44.png\t3f2865df45.png\t71e2382100.png\ta
470f0c83a.png\td7f329deb1.png\n0c14b0fe28.png\t3f325903d7.png\t71e5181d4f.png\ta4763177ef.png\td7f48d294e.png\n0c14b502e8.png\t3f358013ac.png\t71e937b285.png\ta478665286.png\td7f7a9f48a.png\n0c16d79223.png\t3f35c6241a.png\t71ed7325d3.png\ta48358f1bc.png\td7fef0da4f.png\n0c187c0ec0.png\t3f377767ea.png\t71ef380e21.png\ta4876c155d.png\td800abe6a4.png\n0c19639aeb.png\t3f37d8bfd6.png\t71ef3fc1c9.png\ta487cbafbd.png\td800b91a5f.png\n0c19fd77ca.png\t3f39e6e1e1.png\t71efaabf89.png\ta48b98d992.png\td802b933e3.png\n0c1a420b5c.png\t3f3ac0573d.png\t71f0e0759f.png\ta48b9989ac.png\td809054e30.png\n0c25be1d66.png\t3f404c5022.png\t71f16b2619.png\ta48d0b7d22.png\td80d8067d8.png\n0c26ab8168.png\t3f43f2850f.png\t71f4235df5.png\ta48ffd1f27.png\td80d9c1445.png\n0c2778b69e.png\t3f44516dda.png\t71f6fcb48a.png\ta490a68472.png\td80ead8703.png\n0c29a22fe5.png\t3f4a3d99ea.png\t71f724c7ef.png\ta4928cc0c4.png\td811dea143.png\n0c2f4cf57b.png\t3f4b677dec.png\t71f7425387.png\ta4932322ce.png\td81265587a.png\n0c348fdf9d.png\t3f4d7ad434.png\t71f7d85904.png\ta49465e1af.png\td81b111a52.png\n0c36a4de54.png\t3f57a66451.png\t71f7ef5864.png\ta495e667ac.png\td81b53a8c1.png\n0c376f6728.png\t3f57f5288e.png\t71fd51fbe8.png\ta496d57c85.png\td81f668af4.png\n0c39e78c19.png\t3f587ca621.png\t71fd607e6b.png\ta4975f26d0.png\td81f6c3432.png\n0c3cdc4486.png\t3f58e8ba5a.png\t7205efa791.png\ta498311665.png\td81f87949c.png\n0c426c1510.png\t3f5adbdcc7.png\t7208f75841.png\ta498e414e9.png\td8200d88b1.png\n0c43ec36f9.png\t3f68cc0fcd.png\t720b810ecc.png\ta4990e5d39.png\td827064439.png\n0c444101f4.png\t3f73575f28.png\t720d581c59.png\ta49edf33ae.png\td82b7bc89b.png\n0c44f43875.png\t3f76f25042.png\t720ed9c0e9.png\ta4a1e89b8e.png\td82b80a555.png\n0c45122390.png\t3f77083653.png\t720fde436b.png\ta4a209dc52.png\td82ec932ab.png\n0c476ba662.png\t3f8634704b.png\t72181494e8.png\ta4a2921863.png\td830c44d12.png\n0c4a4fe6ec.png\t3f88795b50.png\t721b6d7dda.png\ta4a5eca0c8.png\td832b3d757.png\n0c4aea0dc8.png\t3f8a8f4715.png\t721e2e1811.png\ta4a75dd824.png\td835d3e6f3.png\n0c4b3b3838.png\t3f8dab943f.png\t722155ef59.png\ta4ae0c0e35.png\td83a74a082.png\n0c53156b05.png\t3f90de437f.png\t722574af9e.png\ta4af4ec79e.png\td83d9f26b8.png\n0c58be00dc.png\t3f94b6d38b.png\t722806521a.png\ta4b189748f.png\td83fe9fa7a.png\n0c59027602.png\t3f9678c6f9.png\t722e556818.png\ta4b1b30e7b.png\td8448c2a06.png\n0c59da2eea.png\t3f99903e74.png\t723124abad.png\ta4b3dbc634.png\td84535e5f6.png\n0c5ea2df41.png\t3f9b5ad0e0.png\t72317af9c6.png\ta4b5244207.png\td84b3632d2.png\n0c600207b1.png\t3f9df26c0c.png\t72328599e2.png\ta4b54abbf7.png\td84c0443ef.png\n0c6036a506.png\t3fa0352e1a.png\t7233849a7a.png\ta4bab2ef2d.png\td84d285b26.png\n0c61b75e90.png\t3fa5b770f9.png\t7234856751.png\ta4bdd53fb9.png\td84e52cb6e.png\n0c62dfc234.png\t3fa643b8fb.png\t7238a260cf.png\ta4bf108f7b.png\td84e84d367.png\n0c62ed4494.png\t3fa715098c.png\t723ee6c018.png\ta4bf75108b.png\td84ed53d33.png\n0c66d31a1c.png\t3faab13d3e.png\t7242ab00b6.png\ta4c5fdf614.png\td84f620e57.png\n0c6874fde9.png\t3fb234589e.png\t724541cf3b.png\ta4c734f1a8.png\td853da3c22.png\n0c6e42492c.png\t3fb7ca35c9.png\t7246da723f.png\ta4cede6f59.png\td859e715d4.png\n0c6ec35477.png\t3fb8c2ce8a.png\t7252a44a02.png\ta4d2a5df9f.png\td859ee2145.png\n0c6f58357e.png\t3fba8a6660.png\t72544ea5b6.png\ta4d44e2c7d.png\td864cf03a1.png\n0c71c122bc.png\t3fbe62aa50.png\t72559e426b.png\ta4d621da2d.png\td86ca1254a.png\n0c73d146b1.png\t3fc444f3c1.png\t725a0ae373.png\ta4d9822ec8.png\td86dc56bdc.png\n0c75a9cbad.png\t3fc4d6b6ed.png\t725c079100.png\ta4db311648.png\td879349c6e.png\n0c76
31165a.png\t3fc79d2ece.png\t725d92524a.png\ta4db94dace.png\td87bfe2c9a.png\n0c79f66275.png\t3fd0ef2d38.png\t725e4a6ba0.png\ta4dc4e0266.png\td87eca88a6.png\n0c7a052374.png\t3fd3788e31.png\t725f623fa0.png\ta4dd1d2900.png\td8807e8abf.png\n0c7fe0c644.png\t3fd616ef00.png\t725fe78bcc.png\ta4dffd26d3.png\td884aa82ab.png\n0c82e01a9f.png\t3fd8443a93.png\t72606ea7e2.png\ta4e04b5bc8.png\td885813aa8.png\n0c842f5207.png\t3fd9c0ffed.png\t7262d7ac24.png\ta4e2723074.png\td88eaf77d3.png\n0c85a86c36.png\t3fdc35b832.png\t7266925b66.png\ta4e5b9902b.png\td89124eac2.png\n0c89c4c0f7.png\t3fde792ba8.png\t7268aedee7.png\ta4e5bfb182.png\td89411d9d1.png\n0c8d55a780.png\t3fe0ac38f0.png\t7269030c21.png\ta4e8ba21d1.png\td8993ae908.png\n0c91260657.png\t3fe1578a95.png\t726a0d854d.png\ta4e90eef39.png\td89c8c4e3b.png\n0c94055b5b.png\t3fe3373b67.png\t726c429e57.png\ta4ee1a0e9d.png\td8a4c11479.png\n0c9c49e734.png\t3fea5ef77d.png\t726c743e97.png\ta4f642c2d7.png\td8ad9966ba.png\n0c9ce1e818.png\t3feeed910d.png\t726f7e5450.png\ta4f7eb0929.png\td8aebd0f86.png\n0c9f8a7a42.png\t3fefb4fa2d.png\t7270c3b9e3.png\ta4f86ae62b.png\td8af997922.png\n0ca07a2b06.png\t3ff3881428.png\t7271cef48a.png\ta4fb7072d6.png\td8b065d0eb.png\n0ca984e5b4.png\t3ff987c2e4.png\t7276a4aa1b.png\ta4fec99d30.png\td8b1bbbfdf.png\n0cb2f8b29c.png\t3ffa242fa1.png\t7277ef5fef.png\ta5005ea567.png\td8b24dc645.png\n0cb5f92f6c.png\t3ffbcbfb40.png\t727bd3a3b7.png\ta5050ea371.png\td8b2dc2e8a.png\n0cb8a3d070.png\t40009f61ce.png\t727c1d5dbb.png\ta5055b4e51.png\td8b5e842cf.png\n0cbd2440cd.png\t400714c26c.png\t72826bcbb3.png\ta5079cb93c.png\td8b68009d5.png\n0cbebcad99.png\t400e66f34e.png\t7285733d96.png\ta50a98656c.png\td8b8dacc7e.png\n0cbf2fdd4b.png\t400f822d93.png\t72862eb350.png\ta50ac7d1e2.png\td8bed49320.png\n0cc152d572.png\t4011974d7a.png\t728890d8f8.png\ta515cb395b.png\td8bf9761f8.png\n0cc1d0e4c4.png\t401310594d.png\t728917ca07.png\ta51b08d882.png\td8c00f1f19.png\n0cc25fed1c.png\t4015200239.png\t728a6d743c.png\ta51ccd2f00.png\td8c4b4d618.png\n0cc3c31d1a.png\t401718a5ca.png\t728b06f90b.png\ta51e0ecc70.png\td8ca3bb8c8.png\n0cc5805a92.png\t40186e2c64.png\t728b34e175.png\ta521c311e0.png\td8d1ff3d49.png\n0cc93c1754.png\t401b591c07.png\t7294001ccb.png\ta52298e252.png\td8d207d58f.png\n0ccce05a04.png\t4022a53ed1.png\t7296069bfc.png\ta524090258.png\td8d5895a6f.png\n0ccee86f32.png\t4024a7e03d.png\t7299f61039.png\ta52470ae8e.png\td8d7a8ab4d.png\n0cd43eb1d8.png\t4025c0ac8e.png\t729c382e20.png\ta527fe6bd1.png\td8da34b265.png\n0cd4ad5bae.png\t4026885307.png\t729f4f3a44.png\ta5291b23f1.png\td8dcf9f0be.png\n0ce2690857.png\t40278e32e2.png\t72a6095db8.png\ta5299290bb.png\td8e410e039.png\n0ce329b5f9.png\t402b2c241c.png\t72abaed3bd.png\ta52cf1d9ea.png\td8e549f16a.png\n0ce88ce9c7.png\t4030fcfc51.png\t72aea0aaef.png\ta52d7a6cd8.png\td8e57f29b6.png\n0ce8a6ae62.png\t4031759c78.png\t72b009939a.png\ta52f9258e6.png\td8e7f62b4e.png\n0ceb9ed723.png\t4031850bb9.png\t72b3086d05.png\ta531b78648.png\td8e9f9d7e4.png\n0cf4d64a89.png\t4037f4ba93.png\t72bb7e268d.png\ta535f4e908.png\td8ea97b26c.png\n0cf56f5ff9.png\t403a7b5de7.png\t72bc5edd34.png\ta536f382ec.png\td8edb39aa3.png\n0cf7155eb6.png\t403b12a81f.png\t72c2499de9.png\ta5375b180b.png\td8eefce4ca.png\n0cf880cdf5.png\t403b47ec61.png\t72c9b3ca5b.png\ta539c08702.png\td8f44271c2.png\n0cf9c803b8.png\t403cb8f4b3.png\t72ccfcef34.png\ta53ce65d9f.png\td8f465c713.png\n0cfa7b3b64.png\t4045285603.png\t72d5769c67.png\ta53cf7d9d9.png\td8f80479ac.png\n0cfc3f08a5.png\t40452bd035.png\t72d5c2e914.png\ta53de091e0.png\td8fa1cbf13.png\n0d01cacbc9.png\t404b62a4d5.png\t72db0c3
a2f.png\ta53e9abcef.png\td8fa7027a9.png\n0d059aeae3.png\t404db84e12.png\t72dbd4a9bc.png\ta53f55e67b.png\td8fca64ebf.png\n0d0762e2f1.png\t40512c6d02.png\t72e05a5a33.png\ta54115fadc.png\td9044ee3c6.png\n0d091ab909.png\t40525a740e.png\t72e1a1b225.png\ta543591b0c.png\td90473fc86.png\n0d0a173bfe.png\t405a1991a0.png\t72e7f5a583.png\ta5471f53d8.png\td905ab6a67.png\n0d0a5591c7.png\t405e04b9bc.png\t72e90f437c.png\ta54ae707dd.png\td9076d4917.png\n0d10332602.png\t4062ebb8f8.png\t72ed4033ee.png\ta54bf46d21.png\td90e533ce3.png\n0d12341d7c.png\t406ba2a864.png\t72ef4718f3.png\ta54d55c572.png\td910edd2e7.png\n0d14aedcd5.png\t40709ec6d8.png\t72f5a4b1f0.png\ta54d582262.png\td9140cdbc2.png\n0d151d1bbe.png\t4071dd4858.png\t72fb8a9d26.png\ta54da7eb9e.png\td914ed0736.png\n0d15b3011a.png\t4071fae726.png\t72fe314276.png\ta5536ee0bb.png\td916ac7864.png\n0d19f89dc9.png\t4074911972.png\t72fed38a43.png\ta553f6ecec.png\td91a887ed2.png\n0d1aaab6b8.png\t407798f0de.png\t72ff77dcdc.png\ta554d2d4a3.png\td91cf89370.png\n0d1bf337ce.png\t4079c73376.png\t7300c9fdf1.png\ta557fd3376.png\td91fb67e81.png\n0d215e9cc4.png\t407aa4d1a3.png\t7304dcacee.png\ta55c6c1d0b.png\td923825e05.png\n0d2238f294.png\t407bd44d31.png\t7304ffb567.png\ta560bed6e4.png\td926e006d0.png\n0d247ee571.png\t408083de74.png\t73053e4d03.png\ta562619124.png\td927c15fb4.png\n0d265e3a4c.png\t4085f8522d.png\t7308c64525.png\ta562fff73f.png\td928dc5068.png\n0d2789f1a0.png\t40876f27d4.png\t730bddcc02.png\ta5657750c8.png\td9299579aa.png\n0d2bff49ea.png\t4087d80276.png\t730eee262c.png\ta5659fd2cc.png\td92a3528f6.png\n0d2e0095df.png\t408b37703b.png\t7310863c7b.png\ta568cc8273.png\td92da71c3e.png\n0d2e3e9a4c.png\t4090bd3129.png\t73152cd72f.png\ta56e87840f.png\td93241186b.png\n0d2f16deee.png\t4096a4741f.png\t7317846e74.png\ta56ff4758a.png\td934a53d9b.png\n0d31971e2f.png\t4098e4f4ef.png\t731bd710ad.png\ta579c775bb.png\td9377ce4f8.png\n0d3b5ec2a8.png\t409f102f88.png\t7327c7cad1.png\ta57d19240f.png\td93cea85bb.png\n0d3cdf8ada.png\t40a1a3ddcf.png\t732fb7faf6.png\ta57f92a611.png\td93d713c55.png\n0d3dd984a6.png\t40a31f799b.png\t733f591b93.png\ta5805697c1.png\td93f654b79.png\n0d3e8f9868.png\t40a571cf0f.png\t734bcacca1.png\ta5822773db.png\td941b83446.png\n0d3eb7681c.png\t40a6d01026.png\t734df57cdf.png\ta5833d6cc4.png\td943ef59b4.png\n0d4038338b.png\t40a7b6b08b.png\t734e92abd7.png\ta583b90630.png\td9443b3376.png\n0d41d039cc.png\t40a7f422e2.png\t73500d8f72.png\ta58470d2e0.png\td9443d6374.png\n0d4c2ee69a.png\t40aaac97b1.png\t73547e4381.png\ta588bcb650.png\td944700bae.png\n0d4d304c79.png\t40ab4aa81b.png\t735aa737ea.png\ta589985aa4.png\td9457c3b1c.png\n0d51b207aa.png\t40b45f1871.png\t735ff8c333.png\ta58b03118a.png\td946d589ff.png\n0d52ed9aad.png\t40b72f14c4.png\t736061632f.png\ta58d3b12bc.png\td9475049de.png\n0d53163eb1.png\t40b8ea6ae7.png\t7361f4e5bc.png\ta58f7bf642.png\td94be52713.png\n0d55880203.png\t40bb5a9cbe.png\t73621db013.png\ta5902ff190.png\td952078a85.png\n0d5957b64a.png\t40bfc63823.png\t73623061fa.png\ta59385755d.png\td953b0b60b.png\n0d5b43b41a.png\t40c2c80414.png\t73658121f9.png\ta594d86f88.png\td95ad63bb9.png\n0d5d3cadd5.png\t40c65da2ad.png\t736628f96a.png\ta595c6296e.png\td967181724.png\n0d5ec46059.png\t40ca4c4ff2.png\t7366dd0525.png\ta5964e0dac.png\td9692cf925.png\n0d61c54a7f.png\t40ccdfe09d.png\t736e3874c8.png\ta5a99f3913.png\td96a45425b.png\n0d64aa7de2.png\t40d01e18a0.png\t7370b31cd0.png\ta5a9e48a1b.png\td96af7c9cc.png\n0d6a468b92.png\t40d7f084ce.png\t7374d22ad8.png\ta5ae14f500.png\td96c4efcda.png\n0d6db8f166.png\t40d925130f.png\t73788e166d.png\ta5b0ce8395.png\td96f1003cf
[output truncated: directory listing of several thousand hash-named .png image files]
e2a70ffdf6.png\n16bee00afc.png\t4a7ac48295.png\t7c9c98aa11.png\taef4309602.png\te2a8145eae.png\n16c2c3cbae.png\t4a7c0517d4.png\t7ca0a71ed2.png\taefe24c5ae.png\te2aa5063cc.png\n16c3e76481.png\t4a800ed06e.png\t7ca30f5c04.png\taeff4535bc.png\te2b655987d.png\n16cebad726.png\t4a822016ff.png\t7ca4629653.png\taf02eda909.png\te2b680135b.png\n16d09f6351.png\t4a82c89850.png\t7ca695f191.png\taf03b50e82.png\te2b7d2f1ca.png\n16d12d560d.png\t4a830422cb.png\t7ca8574a96.png\taf08b134ef.png\te2b83ec618.png\n16d444da1d.png\t4a85d2fb19.png\t7ca96e0c9f.png\taf0c3f3591.png\te2b86b766e.png\n16d83d1861.png\t4a8d55bdf5.png\t7caf112c28.png\taf0e101aae.png\te2ba31c4a5.png\n16d8e1d384.png\t4a91733da3.png\t7caf9364fd.png\taf1140dc2f.png\te2bb21efd1.png\n16da227a77.png\t4a91a7ed71.png\t7cb042c4eb.png\taf171a3570.png\te2c081fd15.png\n16db20b5e8.png\t4a96727bdf.png\t7cb2165703.png\taf1826b80d.png\te2c1c771e9.png\n16dea35a1c.png\t4a9ca949fa.png\t7cb4bf1ca6.png\taf18915f0b.png\te2c5c7991c.png\n16e0dd1047.png\t4aa24fbabb.png\t7cb65cc6ea.png\taf1b38af1e.png\te2c6d37f9f.png\n16e308dbc3.png\t4aa5b85fb1.png\t7cb7fb97d6.png\taf1d47494c.png\te2ca341f4d.png\n16e3b0564d.png\t4aa6feb688.png\t7cb8401bb1.png\taf1d588df1.png\te2cc98b860.png\n16f4daba7d.png\t4aaa140390.png\t7cbc8d92ab.png\taf226b2973.png\te2cec6cc04.png\n16f514d1ee.png\t4ab1ae3516.png\t7cbfebdf9d.png\taf26a47641.png\te2cff6312d.png\n16fb12f4ce.png\t4ab227936f.png\t7cc1022e2c.png\taf2901932b.png\te2d3957c95.png\n16fb21800b.png\t4ab53d1795.png\t7cc53fe88b.png\taf296d1c0a.png\te2d4588c65.png\n16fe0e5148.png\t4ab596b394.png\t7ccccf5f30.png\taf29e02e15.png\te2dd6ebd6b.png\n1702c5fa6e.png\t4abf00b33b.png\t7ccedd8742.png\taf3015e30b.png\te2de2ce77a.png\n1702e1317f.png\t4ac19fb269.png\t7cd2918c27.png\taf3340c64a.png\te2e1a8c419.png\n17050aa15e.png\t4ac6530567.png\t7cd4898363.png\taf34547bf2.png\te2e4fcf83e.png\n1705ade299.png\t4acda6877e.png\t7cdba71b29.png\taf3bc2fb94.png\te2ea7ff45b.png\n170791a419.png\t4ad00e8b5b.png\t7cdc6291db.png\taf3bc56edb.png\te2f17cf732.png\n170a1cb9c7.png\t4ad2ddacfc.png\t7cdc7ed0b9.png\taf3d81a42d.png\te2f397d206.png\n17111dba6b.png\t4ad823e2d3.png\t7cdd7df705.png\taf3e29f617.png\te2fade489b.png\n17113fd0bd.png\t4ad8a63538.png\t7ce5d97648.png\taf445bc114.png\te30535a390.png\n17170c3fb1.png\t4ae16a3b40.png\t7ce6463bc8.png\taf45265429.png\te306f89a02.png\n1717b6750a.png\t4ae24ea066.png\t7ce692067e.png\taf4794de08.png\te30e29d767.png\n1717c31a33.png\t4aec150e7b.png\t7ced00a298.png\taf4a92584e.png\te31144fbe3.png\n17234b98a7.png\t4aee25ccf3.png\t7ceede7d55.png\taf600f2a65.png\te3131891f8.png\n1724889841.png\t4af50b7c23.png\t7cf26c9dea.png\taf6dd3433a.png\te3134028ed.png\n172d24f19c.png\t4afc4b57ad.png\t7cf386709a.png\taf6fe9a9c8.png\te31364901d.png\n173073f1c8.png\t4b019a8354.png\t7cf3f7f133.png\taf71ca4ab3.png\te31b6fc92f.png\n17312df29e.png\t4b02d1de2d.png\t7cf4d3dacf.png\taf7343b7e3.png\te321020b2e.png\n17394fc203.png\t4b04101375.png\t7cf4dcc68e.png\taf73ae71b9.png\te323635585.png\n1739b1634d.png\t4b05075e91.png\t7cf6272d08.png\taf773ea8ed.png\te32452b10d.png\n173a077701.png\t4b05f60263.png\t7cf7445350.png\taf77e1d769.png\te325a27d31.png\n173abc7490.png\t4b07eed60b.png\t7cf980df36.png\taf795e857d.png\te325b6388c.png\n173b15430f.png\t4b0bf2f4a1.png\t7cfbd6059d.png\taf7cdf8534.png\te3261dbace.png\n173b50b4e0.png\t4b0c5aad63.png\t7cfcdfb76e.png\taf7d70119d.png\te327e9714d.png\n173dd2f369.png\t4b0d23912d.png\t7d048ae83d.png\taf7e0cdd0d.png\te32a58e85d.png\n1744bd61ef.png\t4b0e52eb9c.png\t7d04e34530.png\taf7e9d7648.png\te32ae6e436.png\n17470dd9fc.png\t4b1
5912a41.png\t7d0ab4664f.png\taf84304233.png\te32b064308.png\n174789f4b8.png\t4b15cebb3c.png\t7d0b2f2046.png\taf85ba8c8c.png\te3349e2125.png\n174b478747.png\t4b173d8f06.png\t7d0c3ea808.png\taf85d6429d.png\te335542c17.png\n174bc625a0.png\t4b17d5fb8f.png\t7d0cea3489.png\taf86467c71.png\te337021e6a.png\n174bd4ed02.png\t4b1ad60300.png\t7d0d470e33.png\taf8d5de7eb.png\te33d753279.png\n174ebb49df.png\t4b1bf13b46.png\t7d14f8cfe3.png\taf8fe66d63.png\te340e7bfca.png\n1750f60b23.png\t4b1c5e085d.png\t7d180ef5a2.png\taf9230b78b.png\te34832b696.png\n1751290ca2.png\t4b20e32091.png\t7d185a151b.png\taf92444d7f.png\te3493e23f0.png\n1751413556.png\t4b2294dfcc.png\t7d187bf4bb.png\taf9b763988.png\te34c1220b1.png\n175420a7ac.png\t4b2522e257.png\t7d19a1bc43.png\tafa6b76f77.png\te34eaa10e1.png\n175540fbbf.png\t4b264383e4.png\t7d2561517f.png\tafaa14f2a2.png\te3519d5537.png\n175804ae16.png\t4b276b23ec.png\t7d2a057276.png\tafad34be59.png\te3532bddf2.png\n175ad73f3a.png\t4b27a19834.png\t7d2a222b14.png\tafb0329994.png\te353ae4dd5.png\n175c9c9e62.png\t4b2af55ae6.png\t7d2ad19738.png\tafb3847686.png\te3577b8d4d.png\n175cf3b2e2.png\t4b3502ca99.png\t7d2eec203e.png\tafb4b98b8f.png\te358121d19.png\n175cf79569.png\t4b354208b1.png\t7d35569f4b.png\tafc247387c.png\te35bb189f6.png\n175f61a717.png\t4b366ad75d.png\t7d3b22a889.png\tafc4844f6a.png\te35d17412e.png\n1760bece6d.png\t4b3bee9051.png\t7d41b0a6e1.png\tafc56c2c28.png\te360f895de.png\n176ad39bc7.png\t4b3f1d969f.png\t7d420cc2d4.png\tafc76fb439.png\te361151ef3.png\n176bd974e0.png\t4b43990c7b.png\t7d42cc767d.png\tafc846e1e5.png\te364ebd859.png\n176e1cf804.png\t4b44c94892.png\t7d48474434.png\tafc996abf3.png\te366f9d62b.png\n1770691c51.png\t4b473f0e4a.png\t7d51af6ef3.png\tafca9aebdb.png\te367a6ca8b.png\n1774fc54d9.png\t4b478173aa.png\t7d52a376a7.png\tafcddf4696.png\te36949a00d.png\n17759743db.png\t4b4b441e64.png\t7d5345b3aa.png\tafce0bd7dc.png\te3699b796a.png\n1775baeb70.png\t4b4b7e4655.png\t7d5409ff4b.png\tafd0b385f2.png\te369caf81e.png\n1777e9be51.png\t4b4bb5ca9a.png\t7d561f388e.png\tafd4f54ea5.png\te36cafd036.png\n1780e96886.png\t4b4ce588a9.png\t7d58c958b7.png\tafd7bcd985.png\te37147b293.png\n17815a50e0.png\t4b4d404c82.png\t7d58fd7771.png\tafd984cece.png\te372009e84.png\n178bd2185d.png\t4b4dd814e7.png\t7d594dd079.png\tafde39c848.png\te37551bed6.png\n178c67d992.png\t4b50ad0e65.png\t7d596c01eb.png\tafe5e42216.png\te3755d5769.png\n178d777950.png\t4b522e501e.png\t7d5c34a95a.png\tafeced8b5f.png\te37a266061.png\n178f33f196.png\t4b532df3df.png\t7d5f9abb1c.png\taff52da041.png\te37a3da5b2.png\n1790f3520c.png\t4b59ccd023.png\t7d60e50a98.png\taff87f7ba7.png\te37b8208fa.png\n17972a6517.png\t4b643eb181.png\t7d65bb33ed.png\taff8ae9d4d.png\te38299f1fc.png\n179b18eb1b.png\t4b64c96dbe.png\t7d6e95d2d5.png\taffc79d07d.png\te38510a800.png\n179de94b7e.png\t4b6931306e.png\t7d6ec23f08.png\tb001a5381c.png\te388ed5fb9.png\n17a2b55a01.png\t4b6bc93e6c.png\t7d73751ac9.png\tb00430ef37.png\te3900fce68.png\n17a5a58850.png\t4b72e35b8b.png\t7d761f02c2.png\tb0078f9be2.png\te39ca54d2b.png\n17a6639049.png\t4b82adea21.png\t7d7656d684.png\tb00ae599a6.png\te39ded223d.png\n17a8b7e7f3.png\t4b86ca5deb.png\t7d785d5296.png\tb00c0fdd0e.png\te39f959114.png\n17a938e17d.png\t4b873df7a2.png\t7d87ab4aa8.png\tb00e1265b6.png\te3a437ca0d.png\n17aa492868.png\t4b8f0dd391.png\t7d8b219e1c.png\tb00fd9fcb9.png\te3a53f0510.png\n17aabc4f34.png\t4b950b6227.png\t7d8e29261b.png\tb011b540e8.png\te3ada5840a.png\n17ac6c1165.png\t4b9862566c.png\t7d8e81e137.png\tb012e9ebb0.png\te3b1b48b5a.png\n17b344c505.png\t4b99b98243.png\t7d90c184ab.png\tb013de
1e71.png\te3b8f5e02b.png\n17bb0d4b3e.png\t4b9b8c2e44.png\t7d933efecd.png\tb014624db2.png\te3bb9c881a.png\n17be1887b9.png\t4b9c05eb80.png\t7d963e0f5d.png\tb019443c23.png\te3bd0d2365.png\n17bfb214ad.png\t4ba3207c38.png\t7d9995c4cb.png\tb01d2e5375.png\te3bf961b41.png\n17bfcdb967.png\t4ba8d5d08b.png\t7d9b2f9402.png\tb01e16574a.png\te3c11882be.png\n17c1981f7d.png\t4bacc96aa3.png\t7d9f0dacd0.png\tb01e1cd19c.png\te3c14583d7.png\n17c5d2464c.png\t4bb0fd9591.png\t7dad1b3905.png\tb01ed5d099.png\te3cced6b6a.png\n17c6355815.png\t4bb7ebf911.png\t7db35be44e.png\tb01ffa12f5.png\te3ccf38664.png\n17c6a6e9b8.png\t4bb8260218.png\t7db7a5fb8f.png\tb02595bf3e.png\te3cded005d.png\n17ca457cff.png\t4bbc410ec5.png\t7db83b64e3.png\tb02ad730ec.png\te3ce4247d8.png\n17cc9250e9.png\t4bbd0a683c.png\t7db9b53d89.png\tb0308268a8.png\te3ce840258.png\n17d27ccb9e.png\t4bbe1afbc2.png\t7dc2f2afb6.png\tb035452c45.png\te3cf4481ae.png\n17d4b5e4e6.png\t4bbeaf32f1.png\t7dc633c2c9.png\tb04510ede2.png\te3d9ffaacc.png\n17d4d42273.png\t4bbf7cc4b9.png\t7dcd975aab.png\tb04c03e3d4.png\te3dca32ca5.png\n17d70fee5c.png\t4bc3530a30.png\t7dd01cb655.png\tb052410a6a.png\te3dcd28ec0.png\n17dd87aefa.png\t4bc7c8cca1.png\t7dd0a19b0a.png\tb054b1c29c.png\te3e09fce08.png\n17dde2138e.png\t4bc87727b5.png\t7dd0a9a509.png\tb05514e5fe.png\te3e1afe993.png\n17df03088f.png\t4bc99aed9e.png\t7dd17aac9d.png\tb059eb7570.png\te3e55d9e16.png\n17df5aa486.png\t4bcd073e5e.png\t7dd235711d.png\tb05d265497.png\te3e70129bc.png\n17e67f9ce4.png\t4bcefa3819.png\t7dd73361c1.png\tb05d4e02d2.png\te3e7a65574.png\n17e8052ff2.png\t4bcfc0f9bd.png\t7ddcf83303.png\tb0608ba635.png\te3e92cc747.png\n17e9169813.png\t4bd29d99f5.png\t7de09d894b.png\tb06aa2ee78.png\te3ecb649c6.png\n17ea34c5d9.png\t4bd2d2981c.png\t7de3a34407.png\tb06dde0ed9.png\te3ee715d4e.png\n17ef7c6253.png\t4bd8042755.png\t7de77dc41f.png\tb06fe13f54.png\te3ef53a7be.png\n17f3075bd2.png\t4bd819a267.png\t7de8a54df3.png\tb07015747f.png\te3f1908c35.png\n17f320bc00.png\t4bd876a71c.png\t7de95e0115.png\tb0732b724a.png\te3f1cd2bb1.png\n17f6adbfed.png\t4bd87bd3da.png\t7dea0df8c7.png\tb07d4da735.png\te3f6e4b032.png\n17fb9d852c.png\t4bdff7af87.png\t7deaf30c4a.png\tb07fc3b87c.png\te3fc1a3e14.png\n17ff51e78c.png\t4be04e10aa.png\t7df36788a9.png\tb08170c5f3.png\te3fc3bd709.png\n18046df8e3.png\t4be0764138.png\t7df398e4ba.png\tb081cf2ca0.png\te3fc4ea5e8.png\n1804a4c064.png\t4be11bded3.png\t7df431274e.png\tb08328c011.png\te402f53276.png\n1804b11f9a.png\t4be61a973e.png\t7df490b209.png\tb0869b33ce.png\te4096090e6.png\n1806e34f68.png\t4be85a3110.png\t7dfdf6eeb8.png\tb0912218d7.png\te40a6b75af.png\n18078e8284.png\t4be90bdeff.png\t7dfe39d7ae.png\tb091620f53.png\te40a992524.png\n180d2ec2cc.png\t4be9ea64fe.png\t7e009749e7.png\tb09456d59c.png\te414c58d11.png\n180e38fa4b.png\t4bea09bfd4.png\t7e02747598.png\tb095375d19.png\te4159ed1de.png\n180f0a1854.png\t4beb590412.png\t7e0279884d.png\tb0955cb6a5.png\te4177132c1.png\n1811721401.png\t4bef5a7d3a.png\t7e0d970af8.png\tb097a1f9e0.png\te4186029c2.png\n1811afb8fb.png\t4bf2e5b78e.png\t7e0eddd4a7.png\tb0998be638.png\te41a05735d.png\n1813a61a7c.png\t4bf4b35f56.png\t7e138bf911.png\tb09aaa94c1.png\te41b2925a7.png\n18150f9ca6.png\t4bf7663d30.png\t7e1b387b7c.png\tb09d843d5f.png\te423a30632.png\n1817464894.png\t4bfa0bfa8b.png\t7e1b79b767.png\tb09f0e743d.png\te426261c0a.png\n18184347cf.png\t4bfdc5e5b3.png\t7e1e743cdf.png\tb0a1399f41.png\te428876d1a.png\n1819f215f7.png\t4c02cb44d5.png\t7e21858e69.png\tb0a18e86c7.png\te42e6250bd.png\n181be07ca5.png\t4c075a6454.png\t7e23d561cd.png\tb0a1a539d0.png\te431838b7a.png\n181f0917b
2.png\t4c09e631c1.png\t7e26f45eb9.png\tb0a2fe2a63.png\te432bbadaf.png\n1821c518da.png\t4c0efe9735.png\t7e2f5c3a0b.png\tb0a33197f5.png\te435eabb31.png\n1822127483.png\t4c0f207a50.png\t7e325e7a9d.png\tb0a3baf70e.png\te437c3912d.png\n1825fadf99.png\t4c0f3277e7.png\t7e3425f0f9.png\tb0a5d211a8.png\te439ae5e8c.png\n182891f8a6.png\t4c1c8b3774.png\t7e35e90693.png\tb0a7056a04.png\te445d83dd1.png\n182a63515e.png\t4c1fa719dc.png\t7e37c5b66e.png\tb0a7ac630a.png\te446b0b4b6.png\n182ab836ce.png\t4c249c63a1.png\t7e38f83eb1.png\tb0a9170b6f.png\te44dcaeb74.png\n182bfc6862.png\t4c26629cdf.png\t7e3f31ef05.png\tb0ac47bf6d.png\te451bbd1a8.png\n182ea1798b.png\t4c2c29b221.png\t7e40cdf1f3.png\tb0b6a38e0b.png\te451d2193d.png\n182eb84479.png\t4c2eb1cd66.png\t7e42729efd.png\tb0beff63b4.png\te4534b82b1.png\n182ef04220.png\t4c33b2dd89.png\t7e453c8ed7.png\tb0bfdd3d3c.png\te45649bcc1.png\n1832ab22f3.png\t4c36178b41.png\t7e4af53bfd.png\tb0c03d1040.png\te45c509f80.png\n1835fd08ce.png\t4c38a9702e.png\t7e4fbf9bc2.png\tb0c2b9a9f2.png\te45c6ac356.png\n183b58df29.png\t4c3a64cb0c.png\t7e584d50b4.png\tb0c68deb98.png\te45d4eb76a.png\n184356736e.png\t4c407be403.png\t7e5879117e.png\tb0ce227ed6.png\te46324b167.png\n18449e54f8.png\t4c429ee782.png\t7e58c5a1f6.png\tb0d008f6df.png\te46625376f.png\n184a687046.png\t4c43b24562.png\t7e5a6e5013.png\tb0d0386a1c.png\te468cbe171.png\n184cf35640.png\t4c43d99f68.png\t7e5cdca0b7.png\tb0d08bde82.png\te468f76d96.png\n185172060e.png\t4c445e5206.png\t7e5ffe9006.png\tb0d0d70817.png\te469859cd3.png\n185444ffcf.png\t4c4556fff0.png\t7e61b8c897.png\tb0d14aeedc.png\te46d4284d5.png\n1858148001.png\t4c48b962ff.png\t7e67ef84e1.png\tb0d8f26ad8.png\te4705d28d5.png\n18590f686a.png\t4c4991aaa9.png\t7e6815b7b2.png\tb0dbde2430.png\te4715f9546.png\n185bf0a1fc.png\t4c4ccb59ed.png\t7e68df3981.png\tb0e07dd157.png\te476971090.png\n185ff08d47.png\t4c4d0194fc.png\t7e6d80e407.png\tb0e3178d6f.png\te476b868ef.png\n18606906bc.png\t4c4dbdeaac.png\t7e713574ed.png\tb0e3dba964.png\te47c5d83fc.png\n18619c2342.png\t4c5482697f.png\t7e72764080.png\tb0e479668d.png\te47e42d98b.png\n18654f824e.png\t4c5937c9e2.png\t7e7276d088.png\tb0e7a894c3.png\te47f1a1be8.png\n186ab92516.png\t4c5ba6aee9.png\t7e737a6cc6.png\tb0eb953a1d.png\te480079ae6.png\n186d861771.png\t4c5f25ed11.png\t7e7579b833.png\tb0ef4ce251.png\te483ed9136.png\n186e591d03.png\t4c64716fa8.png\t7e777df9d5.png\tb0f23f7415.png\te484db5a63.png\n186ec3704a.png\t4c67f13454.png\t7e78a47cee.png\tb0f7c95ac9.png\te484ea9ad2.png\n186edd93a3.png\t4c691b6847.png\t7e79bb71dc.png\tb0f7f890c8.png\te4872a93ba.png\n1870bc0ec0.png\t4c6c90518d.png\t7e8201578d.png\tb0fa04add4.png\te487752343.png\n18729df07b.png\t4c76db3834.png\t7e84c340bd.png\tb0fda65353.png\te4883bfdf0.png\n1873b8f6a8.png\t4c7781ae43.png\t7e893b59b5.png\tb10fa280bf.png\te48923a845.png\n1873c5eced.png\t4c7e3c71e6.png\t7e8db86731.png\tb11110b854.png\te48aff427f.png\n1874b0614a.png\t4c7f2b2c18.png\t7e92ab6fe8.png\tb112d9993c.png\te48c8294d2.png\n18758d8f8f.png\t4c7f5de583.png\t7e94eb71be.png\tb1144d629d.png\te48c970217.png\n187ecc32d5.png\t4c7ffb89ae.png\t7e9501ad1c.png\tb116d83f00.png\te48ce37527.png\n187fa49f49.png\t4c8132dc73.png\t7e9b6a1687.png\tb1221bb8d2.png\te49090fa59.png\n188598e41e.png\t4c822aca6a.png\t7e9c4e3932.png\tb129f24bcd.png\te4955211ab.png\n188da27ff6.png\t4c82bc66c9.png\t7e9e46174c.png\tb12a52e09f.png\te4970114f2.png\n188de9320e.png\t4c89d6ba5c.png\t7ea0fd3c88.png\tb12a5a2b61.png\te499f3c6ee.png\n188ec986d9.png\t4c89e94e36.png\t7ea197b7df.png\tb12bc217f6.png\te49ab753ed.png\n188f970073.png\t4c8bc90762.png\t7eaa5900b9.p
ng\tb12c491f2a.png\te49b5c4be8.png\n18902bd56d.png\t4c8dab3f1a.png\t7eab4d8284.png\tb132d4111e.png\te49d683e92.png\n18914f0797.png\t4c92645a67.png\t7eae0b1bb1.png\tb13437be49.png\te4a1869906.png\n18925b68f8.png\t4c99a92c2b.png\t7eb296821f.png\tb138330f9f.png\te4a276f1cb.png\n1892c3d197.png\t4c9a332e58.png\t7eb34773ab.png\tb13910ee1b.png\te4a2ebc155.png\n189357775a.png\t4c9b3dc4c8.png\t7eb352ecfe.png\tb13be19418.png\te4a49040ed.png\n1894f7c35a.png\t4c9d8d500c.png\t7eb42fe5ec.png\tb13f0dbc84.png\te4a5a7227d.png\n189dcac15c.png\t4c9eef5a61.png\t7eb6261dcd.png\tb142db0966.png\te4ab79406f.png\n189f543aa7.png\t4ca00f3f9b.png\t7eba79439a.png\tb1430fc9bf.png\te4ada88a55.png\n18a167d8a9.png\t4ca0c2e20f.png\t7ebbf0b3dc.png\tb144a72dc9.png\te4afc92300.png\n18ac660fbb.png\t4ca0fd980f.png\t7ebd3f0cdd.png\tb14736ed95.png\te4b6946a28.png\n18ac7a5b88.png\t4ca1a99fcf.png\t7ebfc0aee3.png\tb148e3561f.png\te4b76c6f69.png\n18aea1f624.png\t4ca4737f4a.png\t7ec369d890.png\tb149724d80.png\te4b9664ba5.png\n18b147861a.png\t4ca8350f11.png\t7ec58d0743.png\tb14f73e81c.png\te4b97067dc.png\n18b26e968c.png\t4caad13660.png\t7ec64d2b8e.png\tb15313d4e8.png\te4bcdceb27.png\n18bbbabadd.png\t4caf753363.png\t7ec7eda82f.png\tb15561ae18.png\te4be64c1fb.png\n18bcca2e96.png\t4cb49fd736.png\t7ec803b58a.png\tb15b7eb177.png\te4bfd07346.png\n18bd2e6277.png\t4cb8474e4b.png\t7ec839af45.png\tb15d57ba45.png\te4c74c0573.png\n18c219ecb7.png\t4cb9bdf555.png\t7ec97427ca.png\tb1667e0603.png\te4c7895444.png\n18c28327b4.png\t4cc08d6a80.png\t7ed0489985.png\tb169f405fb.png\te4c862a4a5.png\n18c4cdefec.png\t4cc9c51c09.png\t7ed2a9b1f5.png\tb16b507725.png\te4c8dc59fa.png\n18cd0215e3.png\t4cca1e8763.png\t7ed44899db.png\tb16b9fa3da.png\te4cc57f730.png\n18cd26bc05.png\t4ccc7e2691.png\t7ed6ac2f41.png\tb16bea9a38.png\te4ce609688.png\n18cd36e491.png\t4ccccc2ff3.png\t7ed6de6ae9.png\tb16c004483.png\te4cf54768b.png\n18cedfd6ea.png\t4ccd4ad68a.png\t7ed79f8f56.png\tb16f497480.png\te4d2bf6036.png\n18cf38b412.png\t4cde340a69.png\t7ede03dede.png\tb172bd2032.png\te4d536fec6.png\n18d013fced.png\t4cded6760d.png\t7edef17361.png\tb1734f57e7.png\te4d6a54b19.png\n18da13aaaf.png\t4cdf314bda.png\t7ee0e76934.png\tb173d500a9.png\te4d9d1178c.png\n18e2325a5b.png\t4cdff73936.png\t7ee1b2f49b.png\tb174e802d6.png\te4dbb99aa2.png\n18e4bdc8d4.png\t4ce0d52b0e.png\t7ee3778aaa.png\tb178921d26.png\te4dbcffde6.png\n18ea3802ca.png\t4ce2883ba3.png\t7ee401ed76.png\tb180ac0113.png\te4e893bc98.png\n18ef5f86ab.png\t4ce34a9fb5.png\t7ee71ee874.png\tb183b2ddc4.png\te4eb890966.png\n18f3626b0f.png\t4ce3eb12cd.png\t7eeebb8145.png\tb184282a6a.png\te4ecc82bb4.png\n18f3cb6bd1.png\t4ce60abe5f.png\t7efc8f9954.png\tb18cb84bca.png\te4ede1a7f3.png\n18f73b05ae.png\t4ce77cc568.png\t7efd271bba.png\tb18f11ea40.png\te4ee492bd5.png\n18fc51d517.png\t4ce9c87fee.png\t7eff5ec2dd.png\tb192ef99f8.png\te4f1c5b15c.png\n190ac23ab2.png\t4cec6bf0aa.png\t7f012d3661.png\tb1935eef13.png\te4f2783759.png\n190ea39ec4.png\t4cee41f355.png\t7f01757d4c.png\tb194549062.png\te4fa7c6a7d.png\n190febf6cd.png\t4cf72e5ba6.png\t7f03a3995f.png\tb1956b1703.png\te4fce4b587.png\n19112f71b9.png\t4cf7f2cbc9.png\t7f03bfdbdf.png\tb19888b1d4.png\te4ff4bbfd3.png\n1915af8856.png\t4cf99abf91.png\t7f0825a2f0.png\tb199772417.png\te50041b85d.png\n19167725cd.png\t4cf9a64333.png\t7f09587094.png\tb19db02e8e.png\te50212245b.png\n1917722d17.png\t4cfcacd4f9.png\t7f0a6fb2f3.png\tb1a2903d1c.png\te50303e172.png\n1918e72177.png\t4cfe2d4763.png\t7f0bdb5e31.png\tb1a8e768af.png\te506b4b589.png\n191bfc2fae.png\t4cfe70e0ef.png\t7f0e67d5ec.png\tb1a9b71eb1.png\te509b9c869.png\
n1925c11fce.png\t4cfee4b097.png\t7f0ee9c30a.png\tb1ac5ed0b3.png\te50a5a077a.png\n1926bbd7a9.png\t4d00f5b453.png\t7f107a23d4.png\tb1b0090486.png\te50d3d420d.png\n1931feb9b8.png\t4d03329aa4.png\t7f14677274.png\tb1b4b687f7.png\te510198a84.png\n1937cfedc5.png\t4d04dcd959.png\t7f196f6711.png\tb1b8290b61.png\te5121874bb.png\n193a5e4c04.png\t4d0a24c425.png\t7f1c1802d4.png\tb1b9ce9b86.png\te51487b92e.png\n193f8f285e.png\t4d0d7b3b8c.png\t7f1c9e5f69.png\tb1ba475d8d.png\te51599adb5.png\n19436cc9b9.png\t4d119162bf.png\t7f1d5f223c.png\tb1ba756a79.png\te519d5dc25.png\n194e149187.png\t4d137bd91d.png\t7f22aad9b8.png\tb1be1fa682.png\te51bea30d0.png\n194ebc0b86.png\t4d163fda07.png\t7f23ba4e4a.png\tb1c30ea1dc.png\te51e614146.png\n195420ac11.png\t4d1825be4e.png\t7f24dfceef.png\tb1cb5be1ce.png\te5265b3f94.png\n195450d08d.png\t4d19875184.png\t7f2642a14a.png\tb1cc24274f.png\te5333c6911.png\n1957bd75f5.png\t4d1b2c230e.png\t7f2694ca3f.png\tb1ceebc1fa.png\te536d6ff17.png\n195b9ae705.png\t4d1d5d9d62.png\t7f30236ca4.png\tb1d038b961.png\te53934a7c9.png\n195f997af2.png\t4d1efb19d7.png\t7f34898162.png\tb1d1f35104.png\te53c2e8771.png\n19611e0df2.png\t4d1f5bd37e.png\t7f380f4095.png\tb1d7dde27b.png\te53ca23b39.png\n1963a9e03a.png\t4d21a8e11c.png\t7f38341dd9.png\tb1dc23a427.png\te53d96b2ce.png\n1965c20ff5.png\t4d2778b117.png\t7f3a6d74ed.png\tb1de48342b.png\te53f18894a.png\n196638fe77.png\t4d27ef11d6.png\t7f3e8a8526.png\tb1e590d1c9.png\te5416b5ff6.png\n196976ff21.png\t4d2c664efd.png\t7f43266580.png\tb1e6235974.png\te544c8249f.png\n196b082463.png\t4d2ede6b7a.png\t7f45830e33.png\tb1e7ed420a.png\te5463f2c0f.png\n196cffdb4b.png\t4d33311a1e.png\t7f45ef3c24.png\tb1ec2f27e4.png\te547ff9d88.png\n196e75547a.png\t4d341171a4.png\t7f4ccadf3f.png\tb1f0e104f5.png\te54944e5bf.png\n196e901a92.png\t4d3a5d287e.png\t7f54fb7d0e.png\tb1f47c5be7.png\te54b5f7985.png\n1973a628b1.png\t4d3bff540d.png\t7f558ed116.png\tb1fcf57156.png\te54c815e85.png\n197f03e18a.png\t4d3cdc8697.png\t7f5737c5b3.png\tb1fd612a60.png\te54d72e6b7.png\n19809089fd.png\t4d3e7f3eb4.png\t7f5914b74a.png\tb1fd7719e3.png\te556a7c183.png\n19843b4e95.png\t4d3eeda971.png\t7f59a43fb8.png\tb1ff4879c6.png\te55ce10074.png\n1986fcadda.png\t4d4271753e.png\t7f5a72fa2c.png\tb202b6b3d4.png\te55f192294.png\n198925c04a.png\t4d468ea464.png\t7f5b4e0cd1.png\tb2039444f4.png\te560423302.png\n198b605afd.png\t4d4769981f.png\t7f5df12942.png\tb203e59c04.png\te5632bb2f1.png\n198daefb7f.png\t4d493854a1.png\t7f60438768.png\tb2063e6bb6.png\te563323c18.png\n19902af6bd.png\t4d49c3bdc3.png\t7f62bd4856.png\tb20787e2e7.png\te563ae7e9c.png\n19932718b7.png\t4d4adc5623.png\t7f6ad34bcf.png\tb208e23ba1.png\te5699c01e6.png\n19a0c4e9b5.png\t4d4db48d0d.png\t7f6d43202d.png\tb20a70b6b7.png\te56d476cf0.png\n19a19742ce.png\t4d522a60c2.png\t7f73304dca.png\tb20eb9e813.png\te573b432c3.png\n19a274548c.png\t4d5306ecf9.png\t7f76751c6b.png\tb20f6fc0eb.png\te576adfb4f.png\n19a46e881a.png\t4d5860a0df.png\t7f7b6a6a6c.png\tb211bcbb47.png\te578390adc.png\n19a6580aed.png\t4d58e8d8ff.png\t7f7ced694f.png\tb212dd2177.png\te57b99c9b4.png\n19a9d37c4c.png\t4d5b2c8fe7.png\t7f7fcf0a7b.png\tb21810165a.png\te57d4e9c04.png\n19b25d6700.png\t4d5bd7f272.png\t7f80ac6447.png\tb221ece24b.png\te57ef95bb2.png\n19b7d2e2bf.png\t4d6121dae0.png\t7f8112b31e.png\tb223958781.png\te585ec8541.png\n19bd16cc80.png\t4d630eb6ce.png\t7f826d312e.png\tb2277ba07e.png\te589937bda.png\n19bd49e478.png\t4d640602cc.png\t7f82fe9f71.png\tb2293c79c5.png\te589972dfc.png\n19c0520d70.png\t4d6522470e.png\t7f841928bb.png\tb2294d186b.png\te58a0083f9.png\n19c666a371.png\t4d66fb9825.png\t7f
8cc01860.png\tb22bf3694c.png\te58c0adca4.png\n19c8c5aa98.png\t4d684d99cc.png\t7f8df64318.png\tb22e02807a.png\te58d16b268.png\n19cc12c7fe.png\t4d6ef1a9bb.png\t7f954be7ab.png\tb23429fadf.png\te59003dfc4.png\n19cc342b91.png\t4d762be5e6.png\t7f996bc3ed.png\tb2343c89d7.png\te594ef45fc.png\n19d0384e1f.png\t4d774cb2a4.png\t7f99abfc11.png\tb239eb33f5.png\te59526d6bf.png\n19d06934db.png\t4d77975b6f.png\t7f9bb4a93a.png\tb240582de8.png\te599fabb46.png\n19d2303888.png\t4d7b42314d.png\t7f9e61c079.png\tb240778420.png\te59a73943c.png\n19d821e8e4.png\t4d7d9cad2d.png\t7f9e856b9a.png\tb244d7991e.png\te59cd0d10e.png\n19dc3b6402.png\t4d7f9af393.png\t7f9ec4856a.png\tb249094b5a.png\te5a158c743.png\n19dfee3fcd.png\t4d7ff23978.png\t7f9f059f8e.png\tb24947c8d3.png\te5a22d144d.png\n19e749d0b1.png\t4d8007c75e.png\t7f9f13c0d0.png\tb249b1f476.png\te5a61707cf.png\n19edb31599.png\t4d808c34c8.png\t7f9f7a2bbc.png\tb24c0b91a3.png\te5a6b92463.png\n19fbc4280d.png\t4d83ad1538.png\t7fa12eba13.png\tb24d3673e1.png\te5a8368ee6.png\n19fdb270d6.png\t4d8c1e0580.png\t7fa49d7624.png\tb254feec1a.png\te5a9192e3d.png\n1a013e5ca3.png\t4d8fc28f50.png\t7fa55c8e4f.png\tb258011964.png\te5aa4a03b4.png\n1a0230832a.png\t4d92d49210.png\t7fa5927702.png\tb259d12150.png\te5aefdd3da.png\n1a05259ac0.png\t4d9b845652.png\t7fa6578978.png\tb259e7fa97.png\te5af87d93c.png\n1a05d9c2a2.png\t4d9caf0bed.png\t7fa74fdeba.png\tb26035f279.png\te5b07450e3.png\n1a091013a6.png\t4d9fa2d97f.png\t7fa98fc471.png\tb268ce180e.png\te5b47aa145.png\n1a0975b0fa.png\t4da67126ce.png\t7fadb8efe8.png\tb270ffbcd8.png\te5b8905fea.png\n1a0996a817.png\t4da725a703.png\t7fadebf068.png\tb2776c7bf3.png\te5bd8ae238.png\n1a0a056dbd.png\t4da9fb1be9.png\t7faea04242.png\tb2782e3901.png\te5c26c8634.png\n1a12d639de.png\t4dae4a3072.png\t7faefd69a8.png\tb2789223d4.png\te5c3b78df0.png\n1a1383df07.png\t4dae5d1e28.png\t7fb0f3f722.png\tb278cc6326.png\te5cd6bc55c.png\n1a14ae3468.png\t4dae9c390c.png\t7fb1a8a0c5.png\tb27c83519f.png\te5d26987a9.png\n1a157f9551.png\t4db03fc647.png\t7fb8b363c5.png\tb27dcf087e.png\te5d4791389.png\n1a1b0c1549.png\t4db1587bed.png\t7fb929c05f.png\tb28690f7ad.png\te5d57c3fbd.png\n1a1c2eb4c7.png\t4db23d2bc3.png\t7fb97671b4.png\tb2899a1188.png\te5d64a2aea.png\n1a2196365e.png\t4db3811fbf.png\t7fbb12b1ac.png\tb28aeb78e0.png\te5db78c10b.png\n1a241a8926.png\t4db68c7d25.png\t7fbc88375b.png\tb28b22e057.png\te5dd8bbfd7.png\n1a279da1f9.png\t4db87ce39b.png\t7fc1ce1cb4.png\tb28dee166b.png\te5df7166bc.png\n1a288f27bf.png\t4dbd529a9a.png\t7fd5baaee8.png\tb28f64e243.png\te5e6a428b6.png\n1a28f06897.png\t4dc10dd415.png\t7fd682b979.png\tb2958f2747.png\te5e9e5fcff.png\n1a2918c9b4.png\t4dc26b1265.png\t7fdbce1fca.png\tb2a0365c9d.png\te5f5d12e89.png\n1a2b9fc0ef.png\t4dc47d5ffd.png\t7fdd18017e.png\tb2a129276d.png\te5fb8e1029.png\n1a2d060823.png\t4dc913ff73.png\t7fde5cb281.png\tb2a2fd2e0b.png\te5fc8cb081.png\n1a37af90c1.png\t4dd463ca9e.png\t7fe5c45bd7.png\tb2a7f34fa1.png\te5fd31d394.png\n1a407db63b.png\t4dd47d353e.png\t7fefadd513.png\tb2aa9a8705.png\te5fdeba224.png\n1a4278325d.png\t4dd7ed43a0.png\t7ff1d00b47.png\tb2aa9c7f1c.png\te5fdf421a1.png\n1a4411bd5c.png\t4de659b0f5.png\t7ff5700a23.png\tb2afc5f0b4.png\te609ac2104.png\n1a44c4e591.png\t4dee03c883.png\t7ff659b264.png\tb2b0cd95a4.png\te60a29d658.png\n1a47928acb.png\t4dee25677d.png\t7ffccc46c5.png\tb2b49923ea.png\te60b73512b.png\n1a4a10e974.png\t4df40162f4.png\t7ffd320769.png\tb2ba08c21b.png\te60cd100f1.png\n1a4cd0fd59.png\t4df5b725e4.png\t7fff116e01.png\tb2bb8b1c2c.png\te60e1201f5.png\n1a4ffa4415.png\t4df85f0d5a.png\t800b9f8b72.png\tb2c26edd9c.png\te60ec
55223.png\n1a505d91b8.png\t4df895ff96.png\t800d311316.png\tb2c6b8cf57.png\te612ab0326.png\n1a52f342a7.png\t4df9af6dd8.png\t800daf6ac1.png\tb2ce3e9581.png\te616634db7.png\n1a5c093750.png\t4dfc862dff.png\t8011d054e7.png\tb2d02516e1.png\te619160edb.png\n1a5c58280d.png\t4e04982929.png\t8011db07e4.png\tb2d2042049.png\te61d42e6cf.png\n1a5d0c150d.png\t4e0acb57df.png\t8016056c46.png\tb2d2951f0e.png\te61d5c8b3a.png\n1a66074498.png\t4e0f9ba44c.png\t8016da014b.png\tb2d4c41f68.png\te61d98500a.png\n1a67b08604.png\t4e10dfcd54.png\t8020c6cc91.png\tb2d7897c35.png\te61e9bcb65.png\n1a6ae27d2b.png\t4e13bb0edb.png\t8023f35b58.png\tb2d7e9d9b0.png\te62622675a.png\n1a6fd02308.png\t4e176e4a6d.png\t802b3e6be9.png\tb2d7eb0937.png\te6285f497c.png\n1a72cc3d3e.png\t4e1c46b1ed.png\t802df9bb5e.png\tb2d9c9184b.png\te62b69e2d6.png\n1a73abcf3e.png\t4e1d253099.png\t802fa1a9f6.png\tb2eb29f6a6.png\te62cc3f52d.png\n1a73f9ce89.png\t4e206da365.png\t8032860ce9.png\tb2eec54a46.png\te62ec0b893.png\n1a7497f2c6.png\t4e22f17d32.png\t80363d0458.png\tb2ef982b4b.png\te63196d890.png\n1a7b40b40e.png\t4e23fc92f3.png\t8043caee17.png\tb2f220fad7.png\te634f95937.png\n1a7f8bd454.png\t4e3378fc69.png\t80444c3dab.png\tb2f2f933b7.png\te639615636.png\n1a8026456f.png\t4e3ccf37ee.png\t804580b1af.png\tb2f87059da.png\te63a0991d5.png\n1a80491edd.png\t4e41f7a5df.png\t804aeabe78.png\tb2f93873c6.png\te63b9c1cd4.png\n1a8a17f220.png\t4e42f26a2f.png\t804c7d7108.png\tb2fbaf80d5.png\te63ba604c4.png\n1a8af411c3.png\t4e434e64a5.png\t804def1efb.png\tb2fc87dc4e.png\te642037975.png\n1a8b38f45f.png\t4e44708d4b.png\t804fcfaf4b.png\tb302fbe079.png\te643e16a33.png\n1a8c4fd6ea.png\t4e47ff086f.png\t8054361dfd.png\tb303943f66.png\te64711cf41.png\n1a8cbd3a80.png\t4e4cadf03f.png\t80552d0e8f.png\tb3075bc7c0.png\te64b6d27bf.png\n1a948a1a2d.png\t4e4fab3bf3.png\t80590647c7.png\tb30982f994.png\te659f781ee.png\n1a997b9106.png\t4e502a4a1f.png\t805a279a36.png\tb30a514b50.png\te65cb587e2.png\n1aa56af36c.png\t4e50b795b0.png\t805cb4d316.png\tb3102e4e0d.png\te65d1f0012.png\n1aa5baa761.png\t4e50c41f4e.png\t805e8f004a.png\tb314d90af5.png\te6674212ce.png\n1aac5dad59.png\t4e516a255d.png\t8063c65b57.png\tb318dfdca7.png\te667f4d290.png\n1aaffdb790.png\t4e51caa68e.png\t80666396cb.png\tb319973d58.png\te66b084eba.png\n1ab49f29e9.png\t4e54c259a0.png\t80668fbcee.png\tb31d571a94.png\te66b194457.png\n1abe5e8a3d.png\t4e583649f3.png\t8068909c83.png\tb31e7d63e4.png\te6703f6191.png\n1abf3d7f60.png\t4e596a41b0.png\t80693d458f.png\tb31ed95288.png\te6714fd3e7.png\n1ac058f0a4.png\t4e5a439e1c.png\t80694aa70f.png\tb3244169d5.png\te671d3f744.png\n1ac1e18079.png\t4e5c620706.png\t807262dbb7.png\tb325cf9099.png\te6735cfd80.png\n1ac21f6704.png\t4e5d643b2f.png\t8077c06f68.png\tb329875de2.png\te6757ac486.png\n1ac220b834.png\t4e5eaec9e7.png\t808c282801.png\tb329b390e4.png\te6784c9c5b.png\n1ac3babe57.png\t4e6855dbae.png\t808c63ed8f.png\tb32be89c4a.png\te67a038cf7.png\n1ac4fd8022.png\t4e6fd7ab9c.png\t808cbefd71.png\tb32c994c11.png\te67ad4fc61.png\n1ac6523c3a.png\t4e70e4d96c.png\t808fb3d46c.png\tb332da5919.png\te67c706413.png\n1ac7f45711.png\t4e7930ac09.png\t80988b32b4.png\tb33771a471.png\te683e8ae69.png\n1acac10cfe.png\t4e7d4502bc.png\t809cb34e8d.png\tb337d81063.png\te68d553bdf.png\n1ad16d5693.png\t4e80e2cc2c.png\t809ce44053.png\tb3396387a6.png\te68e53a654.png\n1ad2135efa.png\t4e8167fe05.png\t809d3a2b71.png\tb33b1f9db6.png\te69330b40a.png\n1ad3ccd797.png\t4e8235d649.png\t80a253818f.png\tb33d57c4be.png\te694b95fa4.png\n1ad819c49a.png\t4e84e470fe.png\t80a458a2b6.png\tb33e244157.png\te697e2d9c7.png\n1ad9ccb904.png\t4e862586
46.png\t80a60e50be.png\tb342a4959d.png\te698b4859b.png\n1adad51ef8.png\t4e86e06594.png\t80a63d5fa6.png\tb34410a696.png\te699288e54.png\n1adcc5d526.png\t4e89a48b43.png\t80ae0d1b54.png\tb344da6616.png\te69bb6ca73.png\n1adf735507.png\t4e89a919ed.png\t80b0fa63e4.png\tb3459251e5.png\te69c6b5d79.png\n1aeaa85dae.png\t4e8f4b7b6e.png\t80b2ea548c.png\tb345b1f290.png\te69e307dfd.png\n1aec922970.png\t4e8f536122.png\t80b7b5c6f6.png\tb349b57230.png\te6a6cb08c5.png\n1aed9a9daa.png\t4e961dd0ed.png\t80bc3aa015.png\tb354751edd.png\te6a78e36b5.png\n1aef19ae12.png\t4e98c975bd.png\t80bc5a57a3.png\tb355ed119c.png\te6a986f6c8.png\n1aef65e24b.png\t4ea67ddd37.png\t80bdc2584b.png\tb35a524585.png\te6aa3a20b4.png\n1af44a9db0.png\t4ea94e932a.png\t80bded4d93.png\tb35ac76f29.png\te6ab700d15.png\n1af68faa92.png\t4eacd42b51.png\t80bdf2dfe0.png\tb35b1b412b.png\te6abda423b.png\n1af8728ece.png\t4ead0643ad.png\t80bea8ddbe.png\tb3609791f5.png\te6ac2a35cd.png\n1af88d781e.png\t4ead77ca14.png\t80bef8d68f.png\tb365432652.png\te6b5506c1f.png\n1af9e4bbd8.png\t4eaf4060af.png\t80c17491af.png\tb365cdf41f.png\te6b58b9a6e.png\n1afb27f632.png\t4eb102cd2e.png\t80c275e4ad.png\tb3680c53d9.png\te6b6755d97.png\n1afb60e200.png\t4eb539b65b.png\t80c2769443.png\tb368971dc7.png\te6b76e030e.png\n1afddc8196.png\t4eba5550d9.png\t80cb33ede2.png\tb36d540a8d.png\te6c0408a1f.png\n1b01a457d9.png\t4ebe8a59b6.png\t80cb3e7f7d.png\tb36f7e07bb.png\te6c62abf69.png\n1b073494ab.png\t4ebed82264.png\t80d05ebf2c.png\tb3740228b0.png\te6ca6a00d4.png\n1b0d74b359.png\t4ebf1a7500.png\t80d16f5eb3.png\tb3777ff467.png\te6caf5eb81.png\n1b0dc93eec.png\t4ec803a4a1.png\t80d492f994.png\tb37ebb4c18.png\te6d58ad89b.png\n1b0eec121b.png\t4eccb0cb9d.png\t80d529997b.png\tb380fcfa80.png\te6d88fe842.png\n1b0ff56708.png\t4eceea0d2a.png\t80d7cdbbed.png\tb385f3a649.png\te6d955fffa.png\n1b1783a2a8.png\t4ed082016b.png\t80db571e2b.png\tb3861c9620.png\te6ddf7d21f.png\n1b1bf5bd6b.png\t4ed1276470.png\t80dc850f25.png\tb387191390.png\te6e3e58c43.png\n1b1f719ff8.png\t4ed12ec89a.png\t80ddb7476e.png\tb38d8d7402.png\te6e478cbc6.png\n1b203471b7.png\t4ed2467e7c.png\t80de565868.png\tb38ee97247.png\te6f2179d84.png\n1b279deb93.png\t4ed639c85f.png\t80de9489b2.png\tb38fa501ed.png\te6f52ddb72.png\n1b29749de6.png\t4ed65877ea.png\t80e1e37c49.png\tb38fc0bc4d.png\te6f8434e55.png\n1b2a33253f.png\t4eda5fee4a.png\t80eae07846.png\tb393a57640.png\te6f86c83e2.png\n1b2bc9e64c.png\t4edadcf4c4.png\t80ec035d24.png\tb393cb32a7.png\te70983bf79.png\n1b2c24dfdb.png\t4edb0938ab.png\t80edfca7d3.png\tb3961ddea6.png\te71380e985.png\n1b2c817dd0.png\t4edca0711e.png\t80efbeb7e2.png\tb3962e504c.png\te719377b14.png\n1b2daa0f13.png\t4ee453173a.png\t80f65d6da6.png\tb397172c0c.png\te71e4852db.png\n1b2fa5d4f5.png\t4ee4b3485e.png\t80f930d541.png\tb398085048.png\te71eb5dbaa.png\n1b313539f1.png\t4ee4b387e1.png\t80faacb8c6.png\tb398e877c2.png\te723bcb7ae.png\n1b346bfba2.png\t4ee5d2c629.png\t810309305f.png\tb39932655c.png\te72b32e044.png\n1b3aa9a328.png\t4eeb9a13a4.png\t810d63ce0c.png\tb39b153e93.png\te72c42736a.png\n1b44199942.png\t4eec9d2907.png\t810dacccc3.png\tb39b67db8a.png\te72ec9aa6a.png\n1b46f6af5c.png\t4ef0559016.png\t810e964fb7.png\tb39fe87a25.png\te72efea1d1.png\n1b4836f485.png\t4ef618552a.png\t811261bfe4.png\tb3a1353d0e.png\te730c9c4e8.png\n1b4c67f0cc.png\t4ef64308cf.png\t8113b5279c.png\tb3a135bd66.png\te7319918f8.png\n1b50bad7d1.png\t4ef659421f.png\t8116fa64d4.png\tb3a4656e16.png\te733c8e845.png\n1b51d7eb4c.png\t4ef8852462.png\t8118c2c6e0.png\tb3a5061e67.png\te73532b450.png\n1b554577bf.png\t4ef8c0897a.png\t811ebdcc4b.png\tb3a6ed3046.
png\te735f7b212.png\n1b5991119e.png\t4efbd07c56.png\t811fb98143.png\tb3aabe7b9a.png\te739540fa2.png\n1b5a148cd3.png\t4efc5efc79.png\t812492066c.png\tb3ad0f3dc0.png\te739fe4fbf.png\n1b5c4051d4.png\t4eff1c1469.png\t8126503a2e.png\tb3ad13d9e9.png\te73ed6e7f2.png\n1b611f81d9.png\t4f02bab289.png\t812a494ff0.png\tb3ad85e8a9.png\te7442307b1.png\n1b61453b52.png\t4f02e56080.png\t812c804456.png\tb3b28f26eb.png\te7489b0c3c.png\n1b61e02423.png\t4f05c105c6.png\t813593f7e0.png\tb3b37c152d.png\te7527c0183.png\n1b64862494.png\t4f06185ed0.png\t8135ca6dde.png\tb3b3e761fe.png\te7551e9a26.png\n1b6678d415.png\t4f069d6740.png\t8138c79081.png\tb3b9fd43cc.png\te755b8f086.png\n1b6919f87c.png\t4f0f086203.png\t813d6ad4ba.png\tb3be6d0b23.png\te760d65679.png\n1b6afd84fb.png\t4f105993c5.png\t81466d1890.png\tb3bf74f77d.png\te764404579.png\n1b6db79ff5.png\t4f136093cb.png\t8147d3558c.png\tb3c3f6847b.png\te764cac378.png\n1b70c24eb0.png\t4f15ac8be7.png\t81491224d7.png\tb3c53c57c0.png\te764e3ede0.png\n1b71d67404.png\t4f18b39baa.png\t814a711be4.png\tb3c74119ae.png\te766c650a3.png\n1b7304112e.png\t4f1bb674c2.png\t814b6d6ac1.png\tb3c8932dc8.png\te767158a8e.png\n1b741079c5.png\t4f1c44860a.png\t814bd19e43.png\tb3c8b426fb.png\te7688312ec.png\n1b7497985f.png\t4f1c683f81.png\t814d0b9a3e.png\tb3c901c03b.png\te7723f6276.png\n1b7ac7bb31.png\t4f1fe6ebef.png\t81545f724c.png\tb3cbe4ea5d.png\te77514fa9d.png\n1b7adc4406.png\t4f21ee4e02.png\t81560cbf2b.png\tb3d612db82.png\te775f485b6.png\n1b818dd025.png\t4f26bde95e.png\t81563b7b32.png\tb3df47aa9f.png\te77611a140.png\n1b818ee74a.png\t4f28c879e1.png\t81574fb1cf.png\tb3e1c43576.png\te778f4c95f.png\n1b8d100712.png\t4f29c4cd04.png\t81575289eb.png\tb3e448572a.png\te77d46b5e7.png\n1b8d23755b.png\t4f30a97219.png\t815d94120d.png\tb3e666b13e.png\te78041acf8.png\n1b97d2054b.png\t4f3368a855.png\t8164beeaf5.png\tb3eb95cd5b.png\te78df7b2c3.png\n1b9865501f.png\t4f349b5ab8.png\t8169a50aec.png\tb3ee456eb5.png\te7923bdabf.png\n1b99e91068.png\t4f34a8f76a.png\t816c9d6d5d.png\tb3ef8aa64a.png\te799fcac15.png\n1b9d714ef4.png\t4f34b06dc8.png\t8174aba668.png\tb3f377a7e4.png\te79aa5ff58.png\n1ba354e42a.png\t4f34c7f252.png\t817a1259a3.png\tb3f6affa39.png\te7a09512ea.png\n1ba6ec8c09.png\t4f359f7110.png\t817a69e036.png\tb3fa13682c.png\te7a5e5e55f.png\n1bae2bef94.png\t4f373d4a73.png\t817dc95214.png\tb3ffb97bd0.png\te7afd37c7f.png\n1bb08d1d48.png\t4f37c11814.png\t81809318af.png\tb4043d90c5.png\te7be02fbab.png\n1bb2b1fbf4.png\t4f381902e8.png\t8181166ca9.png\tb40457631c.png\te7c38ef150.png\n1bb3688821.png\t4f3c549609.png\t8182bdb020.png\tb404aafd56.png\te7c396bf28.png\n1bb7369c68.png\t4f3c55a643.png\t818c8334ad.png\tb404ffdbf7.png\te7c54b5f4e.png\n1bbf1cdaba.png\t4f3ebfa6fa.png\t818ccc0ca7.png\tb40512ff28.png\te7c55756c7.png\n1bc1ec2325.png\t4f42f2193a.png\t818de0feb2.png\tb40844d201.png\te7cbcd855e.png\n1bc532153f.png\t4f43acf03a.png\t81989809aa.png\tb4089e711c.png\te7cd8cdcb8.png\n1bc7d41154.png\t4f47a5553c.png\t819ac703be.png\tb413657974.png\te7ceb8a1f0.png\n1bca27f98f.png\t4f4d5a2c20.png\t81a05f393f.png\tb41a0ef572.png\te7d582936b.png\n1bcab21a24.png\t4f4f1baf90.png\t81a2b120a6.png\tb41beccab3.png\te7d950f0b1.png\n1bcc5ea50a.png\t4f5a6f8c74.png\t81aa49a9fc.png\tb42453f94b.png\te7d963d69a.png\n1bd1c8c771.png\t4f5df40ab2.png\t81ab5dc2d6.png\tb427cb176e.png\te7da2d7800.png\n1bd88b6b5d.png\t4f62a5bb14.png\t81ab8c6bb7.png\tb42ae73708.png\te7e1dca0aa.png\n1bd9b1a1c8.png\t4f6328bf93.png\t81abf40136.png\tb42b881141.png\te7e9165d9b.png\n1bde2dd64c.png\t4f633e2d63.png\t81ad4fed2b.png\tb42c01d7fb.png\te7ed227dda.png\n1bde429ee9.png
\t4f64946e08.png\t81b1d7ed17.png\tb42d968164.png\te7ee2c6dce.png\n1be10e344a.png\t4f66cd03ef.png\t81b5783353.png\tb42eb8028b.png\te7eeb87ebe.png\n1bea5426f5.png\t4f684f7764.png\t81b6dacaba.png\tb42ffc97d4.png\te7ef2c54aa.png\n1bebafb9f3.png\t4f6c2e18f6.png\t81b84bb9c9.png\tb433dba580.png\te7f0e30585.png\n1befe1ba44.png\t4f718f6704.png\t81babf8586.png\tb435f169a7.png\te7f271e490.png\n1bf062bda2.png\t4f738a869e.png\t81bded8b9b.png\tb4382ccb5d.png\te7f3808bd0.png\n1bf3d27d55.png\t4f74d3b977.png\t81be859449.png\tb43c7545ec.png\te7f63a219f.png\n1bf9e7785d.png\t4f77eb811c.png\t81c1149163.png\tb43c8c37a1.png\te7feb2d438.png\n1bfba17615.png\t4f79ec0ba8.png\t81c4fb6418.png\tb43dba4531.png\te800a212f6.png\n1bfe4907c1.png\t4f7b28dcba.png\t81c77c1b6c.png\tb43e1ad347.png\te8042ee36c.png\n1bff16cb1e.png\t4f7ba782e6.png\t81c9c9365f.png\tb440f01d8c.png\te80ac14b0f.png\n1c02222ef8.png\t4f84cfd807.png\t81d37cb5fd.png\tb44168da26.png\te8135a1a96.png\n1c093fde09.png\t4f856fc09d.png\t81d51f7eaf.png\tb447288bcf.png\te813f3d1bf.png\n1c098a50aa.png\t4f89afc1e5.png\t81d87e9c69.png\tb448f0fae6.png\te81bc8050f.png\n1c0b2ceb2f.png\t4f8a095e23.png\t81da51a1b2.png\tb44efbf44d.png\te81dd36733.png\n1c16231286.png\t4f8b7cf7dd.png\t81e673345b.png\tb452af04cd.png\te82162c83c.png\n1c165d61d6.png\t4f8c82df73.png\t81e9af5ee2.png\tb4538661b4.png\te82421363e.png\n1c1aa330aa.png\t4f92b8d674.png\t81ede1ca77.png\tb455bee299.png\te82a9be847.png\n1c1c2983ca.png\t4f9748f56b.png\t81f6c814e2.png\tb455f3a725.png\te82b7a2abb.png\n1c1d062bca.png\t4f9c0b8c34.png\t81fa3d59b8.png\tb45ad3932e.png\te82fddb779.png\n1c20fc4477.png\t4f9f89ee2a.png\t81fc619c7f.png\tb45b8b6a5f.png\te83222698a.png\n1c21b0e9bb.png\t4fa49ff7f5.png\t81fd475861.png\tb45dd34dfb.png\te8325b54e1.png\n1c224a1b4d.png\t4fa6ae9db4.png\t81fdac0b58.png\tb45f9383e9.png\te835ac232b.png\n1c23bcab44.png\t4fb0d65a85.png\t81fde400a8.png\tb4602ed79f.png\te83c7943f9.png\n1c23f6817c.png\t4fb16587b5.png\t8200fffe39.png\tb461a5b584.png\te843caea8f.png\n1c2931e548.png\t4fb2778316.png\t8201005b79.png\tb46364e795.png\te846787720.png\n1c298a2807.png\t4fbce04350.png\t820b99b513.png\tb4650d576e.png\te84940ac6a.png\n1c2ad559cc.png\t4fbda008c7.png\t820c4570b1.png\tb465487b3e.png\te849a8a057.png\n1c31c535a0.png\t4fc4759294.png\t82197f0f0a.png\tb467416ab9.png\te84a489d0e.png\n1c3397d0ef.png\t4fcbac3606.png\t821bab3386.png\tb468a3a0f5.png\te84a4d5380.png\n1c34dafbcc.png\t4fce657ed6.png\t821c0444b3.png\tb46c37a0a2.png\te84aa09005.png\n1c376a8287.png\t4fcfcdafb4.png\t821c36cd97.png\tb46cb148e9.png\te853c3322a.png\n1c37e37fbf.png\t4fd13dd205.png\t821e0217f0.png\tb46cdbca5d.png\te855773fbc.png\n1c3a92013c.png\t4fd3d56da4.png\t821f0bcb1d.png\tb4759edcc7.png\te855b705bc.png\n1c3df8172c.png\t4fd8abbdb5.png\t822043d604.png\tb475e5ad7a.png\te85a720f91.png\n1c3ef17382.png\t4fd99d2b52.png\t8221d9b2b3.png\tb479520ed1.png\te85ba9f671.png\n1c4012bff4.png\t4fda00e2d1.png\t82238ffb93.png\tb47ad209d4.png\te85ca402ee.png\n1c41821d13.png\t4fdc882e4b.png\t8229c8feac.png\tb47f514539.png\te86203f4f1.png\n1c427dd881.png\t4fe184616a.png\t82300bf473.png\tb47f64fc94.png\te862c4a1dc.png\n1c458ae740.png\t4fe35f1f53.png\t82301d3f95.png\tb482310066.png\te86361f5d1.png\n1c45ad9906.png\t4fed31718b.png\t8232db2b76.png\tb4836d1c53.png\te865014246.png\n1c46fe53df.png\t4ff0eb9512.png\t82385cd8e9.png\tb484e89591.png\te86524fcbc.png\n1c4d3b5dc2.png\t4ff25e6999.png\t823ad213d9.png\tb4863138c2.png\te86690dd2b.png\n1c4d649995.png\t4ff2b5959b.png\t824152cf1c.png\tb4875b1537.png\te86a7ecaa5.png\n1c51a79860.png\t4ff2ff620d.png\t8245366dcf.png\tb
48a0af1f1.png\te86bc3782c.png\n1c5e4acb6d.png\t4ff5c5ef47.png\t824e666e6f.png\tb48d5d0a49.png\te8757626a1.png\n1c60c13d20.png\t4ffb50da29.png\t824ef90b9b.png\tb4916183a5.png\te875c9a960.png\n1c6237ae58.png\t50000fada5.png\t825204d8a8.png\tb49666ccc4.png\te8769e4043.png\n1c674cc105.png\t50006ac207.png\t825aa65d80.png\tb49679bab9.png\te8799c2322.png\n1c687fea93.png\t5001f015a4.png\t825b2f52f0.png\tb497566ac3.png\te880e84886.png\n1c6f0527ae.png\t500267ba02.png\t825db130f4.png\tb49894747f.png\te8827bc832.png\n1c6f7aac78.png\t5002e0163a.png\t82612016ee.png\tb49a70d0c3.png\te888c2eb4d.png\n1c78d88cb8.png\t5005257da5.png\t826120ef4e.png\tb49fe5308d.png\te88b8de75c.png\n1c7acddca2.png\t500683ce7e.png\t826698f6b8.png\tb4a2b76dfe.png\te88de6b8fc.png\n1c7bd9200a.png\t50094e6744.png\t82670a1711.png\tb4a849594f.png\te891e8cdad.png\n1c7c8f02c5.png\t500d909b87.png\t8267c5f862.png\tb4a88043be.png\te89272a29b.png\n1c82f67993.png\t500e912121.png\t826805973e.png\tb4aff6e970.png\te8957ff25d.png\n1c85715bf7.png\t5010a5cb65.png\t826b1b2ec1.png\tb4b1840b53.png\te895c207eb.png\n1c8830d82b.png\t501b9807d2.png\t826bb24d25.png\tb4b225aaa8.png\te89784ccbf.png\n1c8f6881ac.png\t501c68e571.png\t826dac705d.png\tb4b45105cd.png\te8a0999879.png\n1c900e9782.png\t501da23c5b.png\t826f5573b8.png\tb4b9723956.png\te8a4f0dd80.png\n1c920e3604.png\t502360ae46.png\t8270908d6a.png\tb4c5f0d153.png\te8ad4c4398.png\n1c9a942d22.png\t50251a9572.png\t82724f83ce.png\tb4cc869bd0.png\te8b694951f.png\n1c9fa3dff4.png\t50290e75f9.png\t827360080a.png\tb4cdeee946.png\te8b730fe92.png\n1ca865b041.png\t502a8b84ea.png\t827491498a.png\tb4d509de34.png\te8b83b2005.png\n1cac4d17ac.png\t502b97b954.png\t8277c7e8eb.png\tb4dd1f458f.png\te8bc4c8384.png\n1cad8c327b.png\t502bb755af.png\t8279b390ec.png\tb4e067c10e.png\te8bda05582.png\n1cb2fde971.png\t502d93b515.png\t827bc20b7a.png\tb4e23ae450.png\te8bec3476b.png\n1cb4733828.png\t503aba826b.png\t8280397ff7.png\tb4e61836df.png\te8c3aa3269.png\n1cb68f59e7.png\t503c47f8ce.png\t8280dbed52.png\tb4e8386ff1.png\te8c835e104.png\n1cb6d0d0ce.png\t503f08307e.png\t82821aa473.png\tb4eb3ee501.png\te8c919c8ec.png\n1cb9549d2c.png\t504c4a71d2.png\t828d543df2.png\tb4ed4a112b.png\te8ce5691ef.png\n1cbb39d623.png\t505483d249.png\t828f7f524f.png\tb4f0255ba5.png\te8cf12563d.png\n1cbfb9a597.png\t50563e717b.png\t82904b406b.png\tb4f68d9a41.png\te8d4c1d49b.png\n1cc2a7346e.png\t505692120b.png\t8291aa0945.png\tb4f6d6f1aa.png\te8d5563783.png\n1cc428e57e.png\t5057536c73.png\t829451e2cf.png\tb4f806f74e.png\te8d6412ed8.png\n1cc45d2a1f.png\t50580fb78e.png\t8295d7b1ed.png\tb4f8f7e136.png\te8d8b35601.png\n1cc76fe868.png\t505ceabce1.png\t82991308ed.png\tb4fa92bf4a.png\te8e0ac9215.png\n1ccc46e36f.png\t506313852a.png\t8299b97f38.png\tb4fc2ce0be.png\te8e330c390.png\n1ccd0edac4.png\t5064f20af7.png\t829ebfbb32.png\tb506085af9.png\te8e42ebc71.png\n1cd0858d58.png\t506bc2ec20.png\t82a15ecee7.png\tb50aa287f2.png\te8e897a8ae.png\n1cd35a5d9d.png\t5073b62291.png\t82a285227e.png\tb50dbf7dc1.png\te8e8b96a84.png\n1cdd74608b.png\t507dbc9444.png\t82a5a9bf4a.png\tb50f5e59ba.png\te8ebd88bb5.png\n1ce1580804.png\t508062c055.png\t82a6a0e66d.png\tb50f98299f.png\te8f06500ea.png\n1ce78adad6.png\t5081bad441.png\t82adfbc0f1.png\tb5137b993b.png\te8f2ace48b.png\n1cec04bb12.png\t508213f8ca.png\t82b1901574.png\tb51a512daf.png\te8f3e42b14.png\n1cf2b64739.png\t50822696b4.png\t82b5b3d64f.png\tb51c130238.png\te8f48a2038.png\n1cf33ec55a.png\t5084c38739.png\t82b62ee3be.png\tb51c6afa98.png\te8f7ceaf00.png\n1cf783890d.png\t5086050a38.png\t82b7a32ad4.png\tb51eed936a.png\te8f8ab82ce.png\n1cfa
ab36d8.png\t508f77d60d.png\t82bc39d826.png\tb521c9ddf3.png\te8fda6896b.png\n1cfb3a3092.png\t5092434e2a.png\t82c316179b.png\tb523c067be.png\te902cfed5a.png\n1cfc959ec2.png\t509634d9d9.png\t82c340f4e5.png\tb525824dfc.png\te9032dee29.png\n1d05eff605.png\t5097350725.png\t82c3cfff3d.png\tb52ac8f64e.png\te904dc08db.png\n1d085fd979.png\t5098537745.png\t82c7ad300d.png\tb535380255.png\te9077013c8.png\n1d0a131f99.png\t509e612741.png\t82c95a0ab4.png\tb538164346.png\te907b958e2.png\n1d0c2fa004.png\t50a91adce8.png\t82c95cdb56.png\tb538ad97f7.png\te90e55d803.png\n1d10ae96f5.png\t50ac0f2474.png\t82cf8b58e3.png\tb538b5e2dc.png\te9147272b4.png\n1d1447a1b6.png\t50ac343646.png\t82d226eb19.png\tb53cd73f5e.png\te91679e234.png\n1d1ca0e373.png\t50acc0ee96.png\t82d65941ec.png\tb53e07d18a.png\te91e8917f7.png\n1d1e3f95d3.png\t50b21b5dba.png\t82d8b50c5d.png\tb540435ae7.png\te91eaf191a.png\n1d1f8ae141.png\t50b21b7971.png\t82db07ac64.png\tb541e2c336.png\te91f473198.png\n1d2382ac3f.png\t50b3aef4c4.png\t82dc840568.png\tb542d75a78.png\te92024548a.png\n1d23853728.png\t50b9046e67.png\t82dc91ec22.png\tb542fd75b6.png\te92155faf2.png\n1d241376fb.png\t50bc1a2eb4.png\t82de84fe54.png\tb5448cb40a.png\te925b2a8ff.png\n1d2bbee88b.png\t50bf331c09.png\t82e0f71730.png\tb544b666a4.png\te927f91b12.png\n1d2bf3cc94.png\t50bf94a59a.png\t82e4332ae3.png\tb5469f2eb9.png\te929ca0e34.png\n1d2c8152bd.png\t50c056c994.png\t82e87e1ab6.png\tb54ac21864.png\te92f67b3c7.png\n1d2cd614be.png\t50c1a76871.png\t82e88d973f.png\tb54c04f07f.png\te93456f706.png\n1d2d268bbc.png\t50c5efed92.png\t82eb6c93d3.png\tb54c20b864.png\te938a48293.png\n1d2e024c1f.png\t50cb6ae8b3.png\t82ee3a1413.png\tb54c2fdf03.png\te938c76532.png\n1d35fb9bb8.png\t50cd98fea9.png\t82f186d2ac.png\tb54c9cc80b.png\te93981eb89.png\n1d3b12f176.png\t50d1ebb96f.png\t82f412d210.png\tb54da3837d.png\te93ddc6ba7.png\n1d40181f56.png\t50d3073821.png\t82f50352d2.png\tb54e6cba44.png\te93e2f83d6.png\n1d417cc926.png\t50d3daf954.png\t82f686d860.png\tb552fb0d9d.png\te94023f24e.png\n1d49878a48.png\t50d689da2c.png\t82f8e5737e.png\tb553dbb375.png\te947cf77db.png\n1d4e60eafa.png\t50d7c206cc.png\t82fae7d0cd.png\tb5544cfe5f.png\te9497af3d7.png\n1d5095d325.png\t50e1d5bf28.png\t82fd0dd4ae.png\tb55a1e3033.png\te94b092cc1.png\n1d56be5402.png\t50e3e10afc.png\t830509e296.png\tb55aa09a18.png\te94ed59b74.png\n1d5ff29422.png\t50e3fe9bc1.png\t8306f847a0.png\tb5672d535a.png\te9531eada5.png\n1d614f22d8.png\t50e42267af.png\t830b0c4dc7.png\tb569096e82.png\te953b8fb9d.png\n1d6266dfe6.png\t50e4b38d46.png\t830f5d9628.png\tb56f9e78e6.png\te962689604.png\n1d630e8527.png\t50e92f4a67.png\t8311fce851.png\tb570655083.png\te964212470.png\n1d660098ac.png\t50f78681db.png\t8313bd80b5.png\tb5768c5204.png\te969baeebd.png\n1d685060b2.png\t50fa8bbc7e.png\t8320911258.png\tb57a03686d.png\te96a030ded.png\n1d6a8f9dcc.png\t50fbd415fe.png\t83232c5c15.png\tb57b8d5c7e.png\te96b8f5e28.png\n1d6d730817.png\t50fc16ff18.png\t8329088275.png\tb57e98a275.png\te96dc1f42e.png\n1d6e08ee8d.png\t50fd2ca447.png\t8329ec9395.png\tb57ecf9e0e.png\te97033b42c.png\n1d6f51de05.png\t510097c524.png\t832a05c627.png\tb5821bfa07.png\te974741e66.png\n1d72c26633.png\t5103df3d79.png\t832c709b04.png\tb5850af687.png\te97a3a3e47.png\n1d73d18a1f.png\t510579c24b.png\t832dd54aed.png\tb586705391.png\te97a3e5acc.png\n1d77187496.png\t5106b883d7.png\t832ebfaa73.png\tb5869ced23.png\te981a42d6e.png\n1d798aa5d5.png\t510805c1d2.png\t83349a5042.png\tb58742de47.png\te98557c9f3.png\n1d7a812c76.png\t5108785aaf.png\t833993f5c2.png\tb58913573a.png\te9909298a9.png\n1d7bb8a0b3.png\t510e8c243a.png\t833a450
da1.png\tb5893fc541.png\te991f05a67.png\n1d7eaa9036.png\t510f0a1180.png\t833ff7ed7e.png\tb5934d2a3b.png\te99311dcb3.png\n1d80de2ec9.png\t5113517acf.png\t83424cab00.png\tb596f5428a.png\te995d7343b.png\n1d8164b941.png\t51141fa392.png\t8343ad9ed8.png\tb5970ee4d5.png\te9a1ccaee4.png\n1d89f24721.png\t51176b9d90.png\t834437456a.png\tb5995bbf96.png\te9a25067ba.png\n1d8a5c0fb1.png\t51179a1815.png\t8346a9c748.png\tb59bb4f42b.png\te9a4c4ee10.png\n1d8b53d9df.png\t5119fdbfde.png\t8346af8d52.png\tb59cadbfbe.png\te9a6d47f94.png\n1d8c40e024.png\t511b704f00.png\t834861f1b6.png\tb5a137afa4.png\te9adb58e5f.png\n1d8cfc6954.png\t511e4a0eb1.png\t834a9e2087.png\tb5a15d7bf1.png\te9afdf107c.png\n1d93614392.png\t5122692cce.png\t834c1f2d71.png\tb5a206a00d.png\te9b058f0f0.png\n1d93cf8169.png\t5125a5c599.png\t83558aa4b8.png\tb5a47c646f.png\te9b05be83e.png\n1d955f8103.png\t5125c3b025.png\t8356ba4507.png\tb5a52c35fc.png\te9c131be99.png\n1d9bc2591e.png\t512d7e9115.png\t835a573b7d.png\tb5a76532d0.png\te9c1c14939.png\n1d9d2189a2.png\t512d8d9997.png\t835fd78e54.png\tb5af74d77f.png\te9c2184f8b.png\n1d9f4e875a.png\t512e5b5681.png\t83645a3b6f.png\tb5b511dc06.png\te9c73652a5.png\n1da21e7498.png\t51302b1585.png\t8367b54eac.png\tb5b762cc9e.png\te9c780ce9e.png\n1da9e3d14a.png\t51308c8b03.png\t8369ecd82d.png\tb5b767492b.png\te9ca391fbe.png\n1dab5be479.png\t513685023c.png\t836bb06e23.png\tb5b9b37a9b.png\te9cce9c914.png\n1dac95d5b7.png\t5137243b27.png\t836eedd4a7.png\tb5bdff9392.png\te9ce8188b9.png\n1db00d7590.png\t513a596d8b.png\t836f781562.png\tb5c096003d.png\te9d149a452.png\n1db40838ce.png\t513a7fde16.png\t8370703e18.png\tb5c4939835.png\te9d39cddd7.png\n1db85dc47c.png\t51417cc1ba.png\t83711cf4a8.png\tb5c6caa273.png\te9d622dba7.png\n1db9722702.png\t51423622fd.png\t837342a089.png\tb5cb0a5337.png\te9d8c832f0.png\n1dbc694c5d.png\t514a1c19da.png\t8373567bb1.png\tb5dd684e36.png\te9dad4676f.png\n1dcece25d7.png\t514b6302ee.png\t837543dddb.png\tb5ddc25ebf.png\te9deeee398.png\n1dcf541ad8.png\t514ece6215.png\t837c793261.png\tb5e1371b3b.png\te9edc8ee67.png\n1dd78bd5d5.png\t514f87c157.png\t837d92f194.png\tb5e32a1b6f.png\te9f2b199ef.png\n1ddca41ffc.png\t51519aaa77.png\t838581e54e.png\tb5e9a12412.png\te9f338cd6e.png\n1ddcb58933.png\t5155437a9d.png\t838e2cfbb1.png\tb5ed436d27.png\te9f4699c1e.png\n1ddce7211e.png\t5155ad847f.png\t838f5e53b9.png\tb5f1babe63.png\te9f4d6f927.png\n1dde15c583.png\t515798b1cd.png\t83967f4314.png\tb5f2602f5b.png\te9fafd9af3.png\n1ddf6944a7.png\t5157fb22d8.png\t8399158374.png\tb5f5cb0885.png\te9fb2fa04f.png\n1de2a09cca.png\t515b9e1052.png\t83a2adae40.png\tb5f79edcc3.png\te9fe017cf3.png\n1de9b66b64.png\t515ed7b6de.png\t83a48671bd.png\tb5f8880d44.png\te9feb4fcc4.png\n1deb46ea81.png\t516ae893fc.png\t83aaa6313e.png\tb5fc6574cc.png\te9ffc56d5d.png\n1dec8a70c7.png\t5171c905da.png\t83b062b608.png\tb5fd02b2f7.png\tea0b716b6e.png\n1df0d002f5.png\t51767ccd9d.png\t83b28f856a.png\tb600f6e4fd.png\tea0c6893a5.png\n1df8119d15.png\t517872435e.png\t83b58112e7.png\tb607345314.png\tea0ceb154d.png\n1df8d91491.png\t51806f45ce.png\t83b78b9386.png\tb608bfc9f4.png\tea0d2f8ff1.png\n1dfcbf4bca.png\t51846d4c96.png\t83bb4578eb.png\tb60fbb288b.png\tea0df883d2.png\n1dfdd132ca.png\t5186f6e1b4.png\t83be4b6836.png\tb6100c2809.png\tea122de9b3.png\n1e048fd40e.png\t51870e8500.png\t83be87dae4.png\tb614e76921.png\tea1270a8b2.png\n1e0b9a424b.png\t5187928643.png\t83c0564224.png\tb618cae630.png\tea127c793e.png\n1e10c8b214.png\t5187e13789.png\t83d1c277dd.png\tb619b88818.png\tea12c68da6.png\n1e178288ac.png\t518b6d0f17.png\t83d4aae732.png\tb61cb2c555.png\tea1332b03d
[cell output truncated: long directory listing of .png image files omitted]
f355185623.png\n2775c5f447.png\t5ab7c36d9b.png\t8d9314a9f4.png\tbfbb9b9149.png\tf35f60b7a3.png\n277678b119.png\t5abac94653.png\t8d932981b5.png\tbfbc7fe76d.png\tf362d474b7.png\n2777901446.png\t5aca18a605.png\t8d9355e7d2.png\tbfbe0808b1.png\tf3638e69a3.png\n27797053ff.png\t5accb79219.png\t8d97feafe8.png\tbfbe6491cb.png\tf369ff14ab.png\n277bd3a647.png\t5acfbf7ae1.png\t8d9905f7d5.png\tbfbea83faa.png\tf371086cfb.png\n277be83e7e.png\t5ad24ea101.png\t8d9ad80697.png\tbfbeb93dd8.png\tf37556e4da.png\n277bfa9b8f.png\t5ad4b1d1ff.png\t8d9b8d724a.png\tbfc44d82f0.png\tf375ddc5a6.png\n27838f7f46.png\t5ad7172ca7.png\t8d9d768dfb.png\tbfc47bbe13.png\tf37a2afb5b.png\n2785d33497.png\t5adb4649bf.png\t8d9e8acc95.png\tbfc495e840.png\tf37d00e098.png\n27870bb184.png\t5adcc5ba11.png\t8daa7e48d1.png\tbfc9276507.png\tf37f2eb2f8.png\n2787bfe603.png\t5ae05ab401.png\t8daf098aa9.png\tbfc9ef388f.png\tf38473c1dd.png\n27882d27ff.png\t5ae1291c95.png\t8db0766923.png\tbfcd0fb5ac.png\tf385ab1b8a.png\n2789ca643e.png\t5ae57cf40e.png\t8db0a68a4c.png\tbfcdcd2720.png\tf389e1ad02.png\n278c81f164.png\t5ae916d52c.png\t8db280a3b0.png\tbfd167b98f.png\tf38bc86fea.png\n2794d6bb71.png\t5ae9fe9e4a.png\t8db5bf9be2.png\tbfd1d7caa1.png\tf38cdb799e.png\n2794ebdad9.png\t5aeabfc312.png\t8db6e29dd8.png\tbfd3a072a8.png\tf38d7d303f.png\n279999e213.png\t5aec15741f.png\t8db9dc4a62.png\tbfd6f21136.png\tf38e999102.png\n279f39950a.png\t5aece0449d.png\t8dbf4c5515.png\tbfd73ece7a.png\tf3a13b06bd.png\n27a240a570.png\t5aee63c515.png\t8dc946b34d.png\tbfd7a73deb.png\tf3a6fd21f7.png\n27a79442e2.png\t5af256bd64.png\t8dd1373b0a.png\tbfd8b4e975.png\tf3aae8d8d5.png\n27aa644839.png\t5af5682dcc.png\t8dd3227410.png\tbfe12bdf70.png\tf3ab43b528.png\n27aad9dc3a.png\t5af7237d5c.png\t8dd3e8622c.png\tbfe7e5c004.png\tf3ab5cd2c9.png\n27ac879f52.png\t5af890206e.png\t8ddaea54ce.png\tbfeb7b1e93.png\tf3ae873adc.png\n27b4cbac34.png\t5afce5bd51.png\t8ddf87b5bd.png\tbfecadcc92.png\tf3b0873046.png\n27b5526adb.png\t5b01231632.png\t8de3dc60ac.png\tbfed194c18.png\tf3b0c5b57a.png\n27b98d2fb7.png\t5b01ba4ec1.png\t8de69c8b40.png\tbff8269656.png\tf3b3d14146.png\n27c035c5ae.png\t5b087e5813.png\t8debc383b3.png\tbffa15aaa5.png\tf3b62e9e0a.png\n27c28dd8be.png\t5b0dc2a3ca.png\t8debc4ebc6.png\tbffb67fc14.png\tf3b6e11340.png\n27c2b373be.png\t5b1a3faaa8.png\t8df60c2b48.png\tc003af5c7d.png\tf3b9a8601b.png\n27c45fef01.png\t5b217529e7.png\t8df7b2c99e.png\tc005e0beae.png\tf3bcec6b35.png\n27c4850501.png\t5b2282b51e.png\t8df9c84f20.png\tc009838721.png\tf3bcfb2955.png\n27c890d8b3.png\t5b25607782.png\t8dfcdb2b3c.png\tc00a5cdd62.png\tf3bf3050d0.png\n27d080fb1c.png\t5b281d742f.png\t8e0c252445.png\tc00c4b8814.png\tf3c02959b3.png\n27d0d9af30.png\t5b28bc540c.png\t8e115cbec4.png\tc0106cb856.png\tf3c0c384d1.png\n27d39dc44f.png\t5b28c5ba50.png\t8e13060816.png\tc0111d3fb6.png\tf3c5b41f11.png\n27d6ed1bb0.png\t5b2a625adb.png\t8e131567d6.png\tc0173ae190.png\tf3cd70744d.png\n27dc4eed59.png\t5b31f5b38e.png\t8e132e492e.png\tc0188d737e.png\tf3ce9ff03e.png\n27dd841e7a.png\t5b3a40bffb.png\t8e13cda6fb.png\tc01b9a4db8.png\tf3d195ee69.png\n27ddc1e08c.png\t5b3b86341f.png\t8e149fe591.png\tc0200e7c89.png\tf3d457ccce.png\n27de423ad3.png\t5b3cecb9af.png\t8e154a02a7.png\tc02071548b.png\tf3de47863a.png\n27e12cbcea.png\t5b3e049596.png\t8e1a651214.png\tc021826839.png\tf3df114520.png\n27e33cfeed.png\t5b40470a20.png\t8e1be11efb.png\tc023a62b96.png\tf3e27d394c.png\n27e4bd625d.png\t5b4089af99.png\t8e20641658.png\tc0240a49cb.png\tf3e2d515f5.png\n27e7cffaa9.png\t5b435fad9d.png\t8e224affa8.png\tc026a17632.png\tf3e6109c05.png\n27ec51c877.png\t5b4
cec17bf.png\t8e280f2c87.png\tc02894016b.png\tf3e67c0d77.png\n27ed8e3d5f.png\t5b59adc91c.png\t8e2b00b033.png\tc02940f66e.png\tf3ee165cd0.png\n27efc3195e.png\t5b5b4e2805.png\t8e2e3afdb5.png\tc02e3de473.png\tf3efb5f509.png\n27f1c2b267.png\t5b6501b1e0.png\t8e3228d793.png\tc03056f1aa.png\tf3f127c01a.png\n27f4d59295.png\t5b6aa79adc.png\t8e38d229cc.png\tc032d1b936.png\tf3fb230499.png\n27f63c56a0.png\t5b6d177013.png\t8e3a461a6a.png\tc038f6fbe4.png\tf3fb50e182.png\n27f6632fd1.png\t5b72ade49e.png\t8e3cdc41e0.png\tc0390b37bf.png\tf3fb95295e.png\n27f923aac9.png\t5b72ebcad6.png\t8e429ed84f.png\tc039fc26ad.png\tf3fd020f27.png\n27fbb734ab.png\t5b780293cd.png\t8e43b91b7a.png\tc040483242.png\tf3fd36d099.png\n280150526a.png\t5b7c160d0d.png\t8e45622c55.png\tc0434afd21.png\tf3fdd9e147.png\n28029fa2e6.png\t5b7d9ed799.png\t8e479eaedd.png\tc04c9a4bba.png\tf409d55fee.png\n2803a5cf55.png\t5b7ebdc259.png\t8e48845dd4.png\tc04dbddedd.png\tf40b354125.png\n2806b89050.png\t5b807c2235.png\t8e48de3b3b.png\tc04fb6068a.png\tf40bfdeb45.png\n2807129884.png\t5b808ba57b.png\t8e4e5f12f2.png\tc05245c247.png\tf40ce50235.png\n280911f6fd.png\t5b836d2444.png\t8e544dd9be.png\tc0565001a3.png\tf411b6b9c3.png\n280a7b7630.png\t5b841772e1.png\t8e546f0a9e.png\tc0572f0f52.png\tf411c146f2.png\n280c59adc5.png\t5b86544456.png\t8e58618fe6.png\tc0581f8747.png\tf413f0ea19.png\n280dcd0b2d.png\t5b89d0ef0b.png\t8e58dad54d.png\tc0593f7735.png\tf413fd0931.png\n281050471b.png\t5b8bb095d3.png\t8e5c3abaf7.png\tc05c593480.png\tf417d9657c.png\n28194875d8.png\t5b8c857ffc.png\t8e632c2239.png\tc0660b7172.png\tf4199a74bf.png\n2821f70a5a.png\t5b8ccc9cb6.png\t8e667cec9c.png\tc068228c1e.png\tf419aaa1b8.png\n2822b81a5a.png\t5b8dc086ff.png\t8e6a55c1e7.png\tc06881f692.png\tf41a025b6c.png\n2822fd3839.png\t5b979e06c1.png\t8e7324c8ed.png\tc06acdcf08.png\tf41e46eedd.png\n2824065636.png\t5ba2224fff.png\t8e76bf194e.png\tc06ce71f9a.png\tf41faf6e43.png\n282f1c2a78.png\t5ba378eb07.png\t8e7f953305.png\tc073b8930c.png\tf420886e62.png\n2832423eb9.png\t5ba442f1cf.png\t8e80f40570.png\tc07a44eb90.png\tf4259be373.png\n283563892f.png\t5ba4a336bd.png\t8e84576858.png\tc07a6c2da6.png\tf425a67987.png\n283a0f28ed.png\t5ba8ea8437.png\t8e84b940e3.png\tc07ed81d25.png\tf4297d5e4d.png\n283aea40d1.png\t5bb0ca2e6b.png\t8e8ddd8b7b.png\tc08283e078.png\tf429b8e6f8.png\n284894a681.png\t5bb0dceac1.png\t8e900e1e4f.png\tc086a756ec.png\tf42a7e65f2.png\n2849772dd6.png\t5bb48ef0ea.png\t8e914b81fe.png\tc08e29a1dd.png\tf42f9a65c3.png\n284f7dd9f9.png\t5bb5258677.png\t8e919b610c.png\tc092a99d23.png\tf433483dba.png\n28532f0b9b.png\t5bb5684d12.png\t8e93b66386.png\tc0933729ea.png\tf4341b924b.png\n28553d5d42.png\t5bb56c179a.png\t8e950278f2.png\tc0936aa1c4.png\tf4365363ef.png\n2855669f79.png\t5bb976391e.png\t8e96420dca.png\tc096fab229.png\tf43882a92c.png\n28574ad175.png\t5bbbade03d.png\t8e99199e35.png\tc09b502837.png\tf4394e72c5.png\n2858abba07.png\t5bbcd70d2f.png\t8e99928455.png\tc09b5e92cb.png\tf439b7a428.png\n285b9ba056.png\t5bbff609ed.png\t8e9a51812e.png\tc09f4e95e8.png\tf43c6999ba.png\n285c426094.png\t5bc134fb2d.png\t8e9df4c791.png\tc09ff6deab.png\tf43de2bb2a.png\n285d4eed14.png\t5bc16d8dd1.png\t8e9f2b9b61.png\tc0a045726d.png\tf43f1e7456.png\n285d6a6e97.png\t5bc5e4df88.png\t8ea492a7ee.png\tc0a37fef54.png\tf447853ad2.png\n285f4b2e82.png\t5bc61454ef.png\t8ea497ed9d.png\tc0a6b5b312.png\tf448297676.png\n28604880bb.png\t5bcea69e9b.png\t8ea5a5cf57.png\tc0aa0c7b66.png\tf44850f63d.png\n2862c77d0c.png\t5bd4b0d116.png\t8eaa59df0a.png\tc0aa15a26a.png\tf448b518f6.png\n28639f0d4e.png\t5bdd3618c8.png\t8eab008735.png\tc0ac5c
2522.png\tf44be3e2d7.png\n2864cd871f.png\t5be0000be2.png\t8eb9bd0e65.png\tc0b0e1782b.png\tf44f67efae.png\n286978d145.png\t5be7dbad99.png\t8eb9fca4f3.png\tc0b11fc61f.png\tf450f9c067.png\n286f309694.png\t5be84e4ce5.png\t8ebceef687.png\tc0b1d5f314.png\tf452c01d97.png\n28708ca79f.png\t5be9925bca.png\t8ebdcba27d.png\tc0b31ebf26.png\tf454efbd20.png\n28710277a9.png\t5beb3977b4.png\t8ec207d7df.png\tc0b6fafc47.png\tf456e3161d.png\n2872ffab1e.png\t5bef437307.png\t8ec7e755e6.png\tc0b9277880.png\tf45dbe6541.png\n2874150952.png\t5bf185daf4.png\t8ec847ca45.png\tc0bcd07a7b.png\tf45fa444e5.png\n287457855d.png\t5bf4cc95a7.png\t8ecdb4163c.png\tc0be35aea2.png\tf460ceb65d.png\n287aea3a3b.png\t5bf538ffa6.png\t8ed7bbe728.png\tc0c4b98415.png\tf4638e7428.png\n287b0f197f.png\t5bf5d82829.png\t8edcb8b48c.png\tc0c83b4e07.png\tf463f6b15b.png\n287d035464.png\t5bf6104412.png\t8ee1bbbff2.png\tc0ca73758a.png\tf4676634e5.png\n2886e27fb6.png\t5bf8deec0f.png\t8ee20f502e.png\tc0cbdabcc3.png\tf478171003.png\n2886fdf93f.png\t5bf950d526.png\t8ee22588c5.png\tc0cc10dcaf.png\tf4784e4f1f.png\n288afaadba.png\t5bfa612299.png\t8ee672b4e6.png\tc0cc9532c4.png\tf47c7a0fd2.png\n288c260e08.png\t5bfd9e78d0.png\t8ee70d5955.png\tc0cda22d64.png\tf47fa89a52.png\n288c4c1941.png\t5bffe3b52f.png\t8eedc347f2.png\tc0cf5a03ea.png\tf4809326f6.png\n2890995b32.png\t5c06d96310.png\t8ef3d8c73c.png\tc0d50876eb.png\tf486d3c0e5.png\n28948eeb9c.png\t5c0acc3f31.png\t8ef46c404f.png\tc0d5f29125.png\tf48701d932.png\n2897b099fe.png\t5c0cde01b7.png\t8ef76ed16e.png\tc0d95beb5f.png\tf48b8dab45.png\n289af097b0.png\t5c122ecb96.png\t8ef841cae8.png\tc0dc514a7c.png\tf49380471e.png\n289e85588f.png\t5c18909f08.png\t8ef8466217.png\tc0de45a4d5.png\tf49f7aab7a.png\n28a1187a4e.png\t5c18ce998b.png\t8ef9856c81.png\tc0df1c6bc5.png\tf4a0c6315d.png\n28a5f65cc2.png\t5c21830f2c.png\t8efaac0550.png\tc0e06c8b5f.png\tf4a2cfefb3.png\n28a9c6e5fe.png\t5c283e9cad.png\t8efd177b4b.png\tc0e7b93c04.png\tf4a6b37b7c.png\n28abb87539.png\t5c28eec331.png\t8f02627082.png\tc0e834f392.png\tf4aae34a84.png\n28ad4e60c5.png\t5c297b5dcd.png\t8f03ee383d.png\tc0ea282c4f.png\tf4b56e86fa.png\n28b3d997ce.png\t5c2a235f47.png\t8f06e655a4.png\tc0ef23e386.png\tf4ba0ceaa9.png\n28b5ab7b37.png\t5c2c196072.png\t8f07ef8585.png\tc0f8ef8904.png\tf4c0be9dbe.png\n28b600699e.png\t5c2ee31bad.png\t8f08f4e9f2.png\tc0fe58323e.png\tf4c228a2cb.png\n28b61cb50a.png\t5c33b65396.png\t8f09a3c64f.png\tc0ffa90c4a.png\tf4c478c3c9.png\n28bfbb1894.png\t5c46f520d2.png\t8f0a738345.png\tc100d07aa9.png\tf4c8e27cbb.png\n28c176bebe.png\t5c51c23a7b.png\t8f0abdabcc.png\tc104f63014.png\tf4db6fadac.png\n28c49076d1.png\t5c520f8046.png\t8f0cd99812.png\tc10a04cb9c.png\tf4de1207a1.png\n28c4c062c6.png\t5c5ecbc728.png\t8f10fd62fb.png\tc10ff1398a.png\tf4e1756bea.png\n28c9b3239f.png\t5c616dea00.png\t8f1145f250.png\tc111da300a.png\tf4e6470f2b.png\n28ca5ff16f.png\t5c635d62ae.png\t8f135ade8b.png\tc112748161.png\tf4e6928ad4.png\n28d365fb50.png\t5c641d8b06.png\t8f13f14cf0.png\tc115806ff8.png\tf4ed280779.png\n28d42de29f.png\t5c65a9775b.png\t8f1649cfba.png\tc1169d47ce.png\tf4ef003a58.png\n28da93197f.png\t5c665e8088.png\t8f16b2685d.png\tc1190c9c37.png\tf4f0625dbf.png\n28dc7e6de7.png\t5c70da6c2d.png\t8f17217adc.png\tc11adadc15.png\tf4f0fd932e.png\n28df927e87.png\t5c78acfc62.png\t8f1d6045fb.png\tc11c6a374c.png\tf4f94415aa.png\n28e1de6f4e.png\t5c7e257630.png\t8f1fbec1e6.png\tc120a8aeb0.png\tf4fb218df6.png\n28e3c1128c.png\t5c7f310937.png\t8f2087a1f5.png\tc123ed5f24.png\tf4fd4f5ebe.png\n28e55ba73b.png\t5c82266b71.png\t8f22918dcf.png\tc12eecef20.png\tf4fee61024.png\n28f031b89
e.png\t5c87971507.png\t8f25e6d0d6.png\tc13346444f.png\tf4feed1635.png\n28f2e72621.png\t5c8ce5869b.png\t8f29c5d348.png\tc1335a0107.png\tf4ff08a9ff.png\n28f383ba58.png\t5c8f7b5ec8.png\t8f3009b2cb.png\tc137712b61.png\tf50030f52a.png\n28f5a828d0.png\t5c91f8d9c0.png\t8f384b5525.png\tc137884306.png\tf503aaf699.png\n28f6f52c8f.png\t5c93067d41.png\t8f399c3cc6.png\tc13a0c92d1.png\tf507d4f122.png\n28f865caaa.png\t5c9326f1e4.png\t8f39d964c3.png\tc1431cfade.png\tf50c1f8bce.png\n28fb4363ea.png\t5c964e5765.png\t8f3c9b3a6f.png\tc144359991.png\tf50ca50044.png\n28fbed93e4.png\t5c9745bcd2.png\t8f3d13f940.png\tc1466f194c.png\tf518cd6d26.png\n290082cd0a.png\t5c97dd7b21.png\t8f3ef2f080.png\tc147830d3d.png\tf5199961a1.png\n2901743d7d.png\t5c9de2a022.png\t8f407b64ae.png\tc1494cafd8.png\tf51a3cd14b.png\n2903f6a1db.png\t5c9ff6624c.png\t8f45dcab28.png\tc14978dca9.png\tf51a494b10.png\n290510ddab.png\t5ca3ff4be9.png\t8f4a26738e.png\tc14ab2b105.png\tf51afa7501.png\n2907c5cca5.png\t5ca4a83636.png\t8f4b5f835f.png\tc14e1cad20.png\tf51f522f3c.png\n290a374cd4.png\t5ca5bc9009.png\t8f4e0ee37f.png\tc1503fb0ec.png\tf526e6c2dc.png\n290adb76ea.png\t5ca5ec45da.png\t8f4e3c0d7c.png\tc156fd6d44.png\tf52c144f24.png\n290b6aeb20.png\t5ca63e94e9.png\t8f57714896.png\tc157e34751.png\tf52e44f960.png\n290e9d9748.png\t5cac2dde69.png\t8f5af7f42a.png\tc165af8d0d.png\tf537c28b83.png\n290f496ffd.png\t5cada3d3f1.png\t8f5d3d5cec.png\tc16e8ff6c3.png\tf53940cb48.png\n291464ee6d.png\t5caec95eb3.png\t8f60ff0312.png\tc16f939a6f.png\tf5395fd3cd.png\n29162a3471.png\t5caecaa345.png\t8f6454562a.png\tc172f3ca08.png\tf5431c0d2c.png\n291a923318.png\t5cafb50b66.png\t8f64d370c5.png\tc173b95448.png\tf54aaaacd9.png\n291c027b3f.png\t5cb1c7a644.png\t8f6b83329f.png\tc175c77179.png\tf55a5f089d.png\n291c5dc477.png\t5cb2496125.png\t8f73748cda.png\tc176666556.png\tf560c14550.png\n292ea3b61f.png\t5cb7627bee.png\t8f73820ee7.png\tc17cc9f45c.png\tf5613d5371.png\n29309e32c6.png\t5cbaa0c09f.png\t8f7ad537ab.png\tc17dd92387.png\tf5648f98db.png\n29345e1a87.png\t5cbb8cb416.png\t8f7b79768c.png\tc17e3409af.png\tf566a7cbc6.png\n293636b7d5.png\t5cbdfef7b3.png\t8f7b7b9980.png\tc180f4858b.png\tf56e5373a1.png\n29372dcce7.png\t5cbf79259f.png\t8f7fd55a76.png\tc18a60afdc.png\tf57a53da45.png\n2937b14b90.png\t5cc486a204.png\t8f80ef62b2.png\tc18ed6f057.png\tf57cb32d6d.png\n2937fed09f.png\t5cc4cc40f5.png\t8f8458c343.png\tc190d5a96a.png\tf57e741b5b.png\n293a7d72e7.png\t5cc6b64727.png\t8f86c7b6f9.png\tc192a33608.png\tf580cbbe8c.png\n293dccb805.png\t5cca3a3213.png\t8f8dcb2506.png\tc19559637d.png\tf5865bd6ff.png\n29422d4ed6.png\t5cca94091b.png\t8f90213bc5.png\tc195df04f5.png\tf58804bcfc.png\n294983be8c.png\t5cd181b41f.png\t8f9027e401.png\tc19963fb19.png\tf597d72b6d.png\n294d7c8fa9.png\t5cd2e24eb2.png\t8f91b34e5b.png\tc19973511d.png\tf59821d067.png\n294dd884ff.png\t5cd89f2d4c.png\t8f955e935c.png\tc19e72fb1f.png\tf599fad413.png\n2958ce0087.png\t5cdaefe6c0.png\t8f9be04cd4.png\tc1ab73f5b2.png\tf59ae94c33.png\n2959cf9916.png\t5cdb791f34.png\t8fa22efd9b.png\tc1b12c9cbe.png\tf5a0acc480.png\n295e52d03d.png\t5ce21107fa.png\t8fa238ffc7.png\tc1b28f261a.png\tf5a92d1356.png\n29603f9405.png\t5ce554e890.png\t8fa4415858.png\tc1b3a9912c.png\tf5ab2a12f3.png\n29605d469a.png\t5ce6499aa2.png\t8fa6d81b29.png\tc1b4f6c978.png\tf5ab6d0102.png\n2964d1edb5.png\t5ce76a572d.png\t8fa7391a85.png\tc1b9ea094d.png\tf5abd9a08a.png\n29669db85f.png\t5cedcdb47a.png\t8fa965a12c.png\tc1bdf548c9.png\tf5b281f27c.png\n296a04bc06.png\t5cee7efb7a.png\t8fb16c02e4.png\tc1be8dcffe.png\tf5b2861c9b.png\n296de39be2.png\t5cefe5b52c.png\t8fb2ab4e8b.p
ng\tc1bf512092.png\tf5b747d45b.png\n2975e0a15e.png\t5cf1f1e7f0.png\t8fb440fbad.png\tc1c23f6048.png\tf5bae9cdc5.png\n297646fb99.png\t5cf4d33d4f.png\t8fb449f918.png\tc1c54ad4ba.png\tf5bd321e00.png\n2980d1bf07.png\t5cf7cc53bf.png\t8fb4b30658.png\tc1c5602088.png\tf5bf21ccc3.png\n298246d0a2.png\t5cfa3dc715.png\t8fb4c44b8b.png\tc1c62f32e5.png\tf5c25276d2.png\n2984673259.png\t5cfc14855c.png\t8fb68e99e8.png\tc1c6a1ebad.png\tf5c2e66754.png\n2985c57bb7.png\t5cfcd47740.png\t8fb8e7784c.png\tc1d8ab2c28.png\tf5c5ee96f3.png\n29878a94ca.png\t5d0410b2b2.png\t8fbd93c9b8.png\tc1d9f5d8d1.png\tf5d0bc1b0d.png\n298bda8c6b.png\t5d049a35b3.png\t8fc26c0caa.png\tc1dbd8af8d.png\tf5d1176e33.png\n2992bd33c2.png\t5d079b66dc.png\t8fc3eccd37.png\tc1de072b83.png\tf5d2d4d974.png\n2993a272be.png\t5d07e84138.png\t8fc42d8cc9.png\tc1e2411006.png\tf5d30fbf24.png\n29940401c6.png\t5d0a9c5b0e.png\t8fcf8fd268.png\tc1ed7e8344.png\tf5d9f8e9b0.png\n2998f3f29b.png\t5d126bd7d8.png\t8fd4178b11.png\tc1f15b6967.png\tf5ddab9cab.png\n29994023fc.png\t5d127af974.png\t8fd6d6de42.png\tc1f6155789.png\tf5e19b10bd.png\n2999c30683.png\t5d15e26742.png\t8fd7efc82d.png\tc1f92fd149.png\tf5e331a0f0.png\n299ba3cf34.png\t5d1909b602.png\t8fd80add2e.png\tc1fbb3149c.png\tf5e72111d0.png\n299c46c3d2.png\t5d1934fb12.png\t8fde701098.png\tc1fc605426.png\tf5efd88c24.png\n299ef808f6.png\t5d1be3b238.png\t8fe9c92090.png\tc20069b110.png\tf5f5901c55.png\n29a1b18e79.png\t5d2a7997d3.png\t8fef989622.png\tc202c25557.png\tf5f9636464.png\n29a27c9903.png\t5d2c7161fc.png\t8ff5e93573.png\tc20570220b.png\tf5fa8299bc.png\n29a3dc18b0.png\t5d318847b6.png\t8ffb4376c0.png\tc20578d4b6.png\tf5fbe0ec32.png\n29a4296e74.png\t5d36a9659e.png\t8ffb761299.png\tc207a2983d.png\tf5ffdccdcd.png\n29a734bae8.png\t5d43b6e684.png\t8ffef526e0.png\tc20c019650.png\tf5ffe74512.png\n29b017cb20.png\t5d4578efd7.png\t9001151a5c.png\tc20fd75377.png\tf6009f764c.png\n29b0235e9a.png\t5d45c03e9d.png\t9002e4bbcb.png\tc215d4d831.png\tf6011fc398.png\n29b25b5f03.png\t5d47b1f360.png\t9003e2ecd0.png\tc215f25359.png\tf6065d0142.png\n29b2a38316.png\t5d4ae4af66.png\t90090d08ae.png\tc216c094d3.png\tf60e495e7c.png\n29b42ba485.png\t5d4dcb9d8b.png\t900a53ca3e.png\tc2171fde8a.png\tf61358ecbf.png\n29b5ee6cf5.png\t5d4ebda29c.png\t900da4d5f2.png\tc218b0d187.png\tf6143e3cf8.png\n29bc65465e.png\t5d5a0c70a2.png\t900eb82c33.png\tc21acbb2b2.png\tf614477668.png\n29bd0c2d25.png\t5d5bbffb59.png\t9010a05d5e.png\tc221839b8f.png\tf61edc2659.png\n29beb70925.png\t5d600057f5.png\t9010e76e79.png\tc2228f38aa.png\tf61f590543.png\n29bf18e23c.png\t5d6126f459.png\t90113ed6bc.png\tc223b6cd87.png\tf62410bce9.png\n29c033e8e4.png\t5d62a981e7.png\t90117af5fb.png\tc227dad701.png\tf625984494.png\n29c6630f61.png\t5d6b433645.png\t9011eda704.png\tc229e55bfa.png\tf62cb9db70.png\n29cd49e08e.png\t5d6be3e74c.png\t9017195c55.png\tc22ccd5186.png\tf62d2298d6.png\n29ceeed519.png\t5d6e5fe652.png\t9018e08735.png\tc22d9b32b0.png\tf6322a46a2.png\n29cf979b92.png\t5d7424eab2.png\t90194208b9.png\tc22f677a67.png\tf63a24c964.png\n29cfb788f9.png\t5d752d6d4a.png\t901c578a64.png\tc22fae7598.png\tf63b28a739.png\n29d32c7561.png\t5d81dbf5ea.png\t901cf47850.png\tc23090da64.png\tf63dc1e3be.png\n29d8817f97.png\t5d83036e50.png\t901f697be4.png\tc230cf2a94.png\tf63f03c926.png\n29d8bf3a4c.png\t5d85874fbb.png\t9029480222.png\tc230ea7815.png\tf641699848.png\n29db281c47.png\t5d89e1ab89.png\t902d01b7dd.png\tc23175cb53.png\tf64190c47e.png\n29ddc2a2aa.png\t5d91db33fd.png\t902f970c1c.png\tc236669b05.png\tf6442fd9c6.png\n29df15c1ce.png\t5d934f890d.png\t903180c447.png\tc238f22e8a.png\tf646560aa9.png\
n29e9c79a7e.png\t5d98069797.png\t9031ddc924.png\tc2395da60e.png\tf64705a472.png\n29e9f52665.png\t5d9a434f6f.png\t903206854b.png\tc23a260780.png\tf64b762060.png\n29ea4f92c3.png\t5d9c2a8469.png\t90364abd01.png\tc23be3df99.png\tf64c261570.png\n29f2d6e111.png\t5d9d0112ad.png\t903ed53422.png\tc23d957c59.png\tf64e69ccfe.png\n29f3b6dc8e.png\t5d9d31868c.png\t903f9f033c.png\tc2419712de.png\tf650532780.png\n29f66371c7.png\t5da5ddb2e2.png\t9041bc2468.png\tc24839469c.png\tf653451c51.png\n29f9ee1ab4.png\t5dacdb1080.png\t90436aed27.png\tc2485cb6c3.png\tf6576d4cbe.png\n29fd4ba626.png\t5dafe5bb61.png\t9047a592c9.png\tc24b739128.png\tf65a52f375.png\n29fdaa3c4e.png\t5db18c66d8.png\t9048ab9f44.png\tc24e12d0cf.png\tf6663b4aad.png\n2a00ecc904.png\t5db75dc1f1.png\t904a5115ed.png\tc2568dbdcf.png\tf66b2e3432.png\n2a0583533f.png\t5dbeace688.png\t904f3fc07a.png\tc256f6e713.png\tf66b4e58d5.png\n2a070f3dc6.png\t5dc352fbfc.png\t905082b7fb.png\tc2582be202.png\tf66e94caad.png\n2a091413ea.png\t5dc3703d30.png\t90519d19c7.png\tc25acd1ab5.png\tf66f46d2bc.png\n2a0ae73183.png\t5dc37f7b2a.png\t9051d9037e.png\tc25cf4d8e9.png\tf6736eaab1.png\n2a0c3c2aee.png\t5dcb83b69e.png\t9053aa6508.png\tc25ec5b066.png\tf678d23811.png\n2a0c638678.png\t5dd1246d5b.png\t9061243d58.png\tc25f0275e8.png\tf67ca9b0ff.png\n2a0d1a296e.png\t5dd42247a1.png\t9067effd34.png\tc268afb4a1.png\tf67e8204ca.png\n2a16af2ae7.png\t5dda6082c7.png\t906c1ff22e.png\tc26c927bb8.png\tf6838859d7.png\n2a18b725e4.png\t5de263d7ea.png\t906cdb0f91.png\tc26cee2f37.png\tf68ae19c90.png\n2a25afc47f.png\t5deab3e059.png\t90704ffe69.png\tc26d515fdb.png\tf68ae5fc4f.png\n2a26a80b9a.png\t5df0d7bceb.png\t90720e8172.png\tc26df751cb.png\tf68b35ebd2.png\n2a26fb616b.png\t5df1adaa10.png\t90746d35da.png\tc26ff00bfd.png\tf694707d78.png\n2a2b51e873.png\t5df6f69b48.png\t90791c2417.png\tc27409a765.png\tf6964b62ae.png\n2a2dd02617.png\t5df9874c5b.png\t90859e7e87.png\tc279e2e81b.png\tf6a63121f1.png\n2a3ae9f7a4.png\t5dfa4cfa2d.png\t9086ac3a68.png\tc2849b2261.png\tf6a71f87fd.png\n2a3b0dba1c.png\t5dff1ecb41.png\t908b0475f4.png\tc2879bc657.png\tf6a757cbf6.png\n2a3db7bdf8.png\t5e008b90b6.png\t908bacfbae.png\tc28f0753df.png\tf6a89b6f89.png\n2a40c4adeb.png\t5e01df0d66.png\t908d9627df.png\tc2916143c9.png\tf6a8da7ffb.png\n2a44b1aa4c.png\t5e06154276.png\t908ee82dde.png\tc29736e475.png\tf6a9cdc607.png\n2a44c7f603.png\t5e07f381cc.png\t9090f8f97b.png\tc2973c16f1.png\tf6abb2285e.png\n2a481972a1.png\t5e09ef27a6.png\t90912de036.png\tc29c36a1e8.png\tf6b8b2edef.png\n2a484baa26.png\t5e0be40c1f.png\t9091630666.png\tc29d85de16.png\tf6b8be7626.png\n2a48ce5e78.png\t5e0d2e6c25.png\t9091c7ef1d.png\tc2a16e2360.png\tf6b8bf64b4.png\n2a49169951.png\t5e10672652.png\t9092dfb9de.png\tc2a2ad1fa0.png\tf6b94122b4.png\n2a4a5d2b71.png\t5e1656cfcb.png\t90946392e6.png\tc2a4700caf.png\tf6ba31fbf3.png\n2a4ceb76ae.png\t5e16e537a2.png\t90990cd2ba.png\tc2a808b758.png\tf6bf2f264d.png\n2a5805ad13.png\t5e2793aa94.png\t909a223b2a.png\tc2ad307cd2.png\tf6c03c0af4.png\n2a59f72960.png\t5e29776755.png\t90a3f2a030.png\tc2af313f48.png\tf6c2cf9470.png\n2a5db8e4f7.png\t5e2a86e209.png\t90a57bacf0.png\tc2b183bdc0.png\tf6c49acac0.png\n2a6209adae.png\t5e350fa662.png\t90a5cd7288.png\tc2b6747f68.png\tf6c784904d.png\n2a6477aee8.png\t5e3679f159.png\t90ac7f0cc8.png\tc2b94268bc.png\tf6ca628f38.png\n2a6a13e3ed.png\t5e3ac6f983.png\t90b140c01e.png\tc2bcde700c.png\tf6cb4b93e6.png\n2a6d18ab5d.png\t5e3c2aa2bb.png\t90b14e3f04.png\tc2bd5f07c3.png\tf6cba2e890.png\n2a701aa3e7.png\t5e3ddc6340.png\t90b1a59d1b.png\tc2be737b81.png\tf6d3bcf420.png\n2a71003f75.png\t5e400955f7.png\t90
b1ba3bb5.png\tc2c78b53d6.png\tf6d5677450.png\n2a7336b189.png\t5e41dd0388.png\t90b385321a.png\tc2c8c4bcfa.png\tf6d6caf753.png\n2a75f20754.png\t5e4cd3fb1c.png\t90b7053440.png\tc2ca0ed857.png\tf6d7d74093.png\n2a7b5e1f16.png\t5e4f66c987.png\t90b7f38c5c.png\tc2cf683cdc.png\tf6d8b8393d.png\n2a7d747db0.png\t5e51d0070c.png\t90b8109bc7.png\tc2d0892947.png\tf6e2c5da67.png\n2a7fcf7da7.png\t5e52f098d9.png\t90b82efe82.png\tc2db0d0a66.png\tf6e56e8246.png\n2a82ebb062.png\t5e5447c74c.png\t90b8394c22.png\tc2e24433d2.png\tf6e8117887.png\n2a9190ed84.png\t5e583467a8.png\t90b83d9260.png\tc2e34aa6b9.png\tf6e87c1458.png\n2a9805a861.png\t5e58ed4adb.png\t90bf0125af.png\tc2e472a205.png\tf6ea999684.png\n2a9d434aa5.png\t5e5ee7f8c2.png\t90c0054bfe.png\tc2e6f57d03.png\tf6ece7409c.png\n2a9f891518.png\t5e6a728cf3.png\t90c067de3d.png\tc2e9f328e3.png\tf6f32bb5f5.png\n2a9ff688e3.png\t5e713bea12.png\t90c214f1e5.png\tc2eae5cd30.png\tf6f5806e2a.png\n2aa30fc1d4.png\t5e721cd3d9.png\t90c2ceb801.png\tc2ec2c9de4.png\tf6fc4f5177.png\n2aa3aed6f0.png\t5e8440e745.png\t90c3e5604e.png\tc2eed8269f.png\tf70176789c.png\n2aa42366f9.png\t5e85356a28.png\t90c62cd13b.png\tc2f2f9fb78.png\tf707d4c4ba.png\n2aa4bae403.png\t5e8f17069e.png\t90c830116a.png\tc2fbac0e66.png\tf70a411f05.png\n2aa5771b5e.png\t5e95137d5f.png\t90cd0edbf4.png\tc2ff5a5721.png\tf70b438480.png\n2aa72c7903.png\t5e95c5418c.png\t90cec09744.png\tc2ff775372.png\tf70f3b7b95.png\n2aa93d33e9.png\t5e96338a97.png\t90d2cdb32d.png\tc304dcd3c7.png\tf70f6a5336.png\n2aa996939f.png\t5e98f349f3.png\t90d9677c56.png\tc304ef9ecb.png\tf712d8b870.png\n2aac273cdb.png\t5e9c32ead8.png\t90dfe4f673.png\tc307912941.png\tf716197b94.png\n2ab2a682bc.png\t5ea1a7508e.png\t90e92b9fce.png\tc30d453614.png\tf7234bb47d.png\n2ab5284049.png\t5ea1fe5010.png\t90e9b53a63.png\tc30dc2fb88.png\tf724f78b1a.png\n2ababa7cf5.png\t5ea247195e.png\t90ebf56f6e.png\tc30f196026.png\tf7282eff57.png\n2abbb11546.png\t5ea341abf2.png\t90edc7fb0a.png\tc313a1ad7d.png\tf72a9df651.png\n2abe2d917b.png\t5eafa5297b.png\t90ede84c79.png\tc316ff21b8.png\tf72d64ecdb.png\n2ac0b5c25c.png\t5eb7ee9522.png\t90eef5ba6a.png\tc31a3c9ae4.png\tf72edf3035.png\n2ac1aebb6d.png\t5eb8ee3c04.png\t90efc61382.png\tc32048ab2e.png\tf731d8930f.png\n2ac25d2e83.png\t5ebb6bc370.png\t90f086d17e.png\tc3205a8130.png\tf734abe528.png\n2acb8bbe84.png\t5ebe9e404d.png\t90f37a2a28.png\tc320fb7641.png\tf73561cee2.png\n2acc196bd1.png\t5ec431887a.png\t90fb6f4825.png\tc3214a300e.png\tf737613072.png\n2acc6caa30.png\t5ec4b8eff9.png\t90fbd7163f.png\tc32590b06f.png\tf7380099f6.png\n2ad1ace6a6.png\t5ec6db5c9d.png\t90fd5ae87d.png\tc32752c00f.png\tf738309aed.png\n2ad4bf0d9c.png\t5ec71d2e6e.png\t90fd7812ed.png\tc328588e36.png\tf73d73ec75.png\n2ad5cb95c8.png\t5eca033d76.png\t91002b54e4.png\tc329b6d198.png\tf73e93ed41.png\n2add3264c5.png\t5ecf1f5cb4.png\t91062f1f24.png\tc32c64b949.png\tf741649c22.png\n2adf56faa5.png\t5ed1c2294d.png\t9106efa654.png\tc32e8869bc.png\tf741becc08.png\n2ae328c427.png\t5ed25d35df.png\t910f26f2e7.png\tc3327c163a.png\tf744325ef1.png\n2ae425e5a3.png\t5ed9ba604c.png\t9110d9e4d0.png\tc333977b5c.png\tf74833dc1c.png\n2ae74e9ece.png\t5edb37f5a8.png\t91115cfc77.png\tc3363afbc4.png\tf74a97c2ff.png\n2aeaa3a43e.png\t5edb47ab35.png\t91127f8cc2.png\tc33df3d7ad.png\tf74beabd47.png\n2af008d773.png\t5edca1bb89.png\t911efbb175.png\tc33e3a20a4.png\tf74f2c5dec.png\n2af01517c2.png\t5edcec5115.png\t911f05128d.png\tc33faa5d81.png\tf74f620d68.png\n2af240d261.png\t5edea5c7cd.png\t9121e2f1dd.png\tc345789eb2.png\tf74ff67ea7.png\n2af3c055b2.png\t5ee2d40bbb.png\t9126bbe228.png\tc34a4aa5b3.png\tf7500
b963a.png\n2af94f9c4c.png\t5ee3fbc1d2.png\t9128b824e5.png\tc34d5dfb1d.png\tf753515bb8.png\n2afa754199.png\t5ee5607228.png\t9129261f99.png\tc3507d4ac6.png\tf756a811de.png\n2afd8cbff7.png\t5eef0d1f4d.png\t91297fe545.png\tc352c269de.png\tf75842e215.png\n2afe092e2c.png\t5ef096aeac.png\t912b573a9e.png\tc352f6069b.png\tf760baa181.png\n2afe87d240.png\t5ef14acc13.png\t912fb85555.png\tc353af5098.png\tf761f7c273.png\n2b05065d47.png\t5ef4baab15.png\t9132b8b239.png\tc353f941ca.png\tf762ed84a5.png\n2b068f2f40.png\t5ef61f604d.png\t9134be08ab.png\tc356f76804.png\tf7677ef77c.png\n2b0d71a390.png\t5ef66a020f.png\t913f5c6470.png\tc3589905df.png\tf76ac8de40.png\n2b0de46b94.png\t5ef842ff3e.png\t913fc84bb4.png\tc35a01102a.png\tf76af6728e.png\n2b0e163a9d.png\t5eff7a6eb2.png\t91439aab61.png\tc35f12ef87.png\tf76c6bc5d0.png\n2b0ff86c1b.png\t5f041131a1.png\t91458415ab.png\tc3682a463c.png\tf777740f86.png\n2b11f08d45.png\t5f0b4e4389.png\t914604b3ae.png\tc36bb4c231.png\tf78361aa48.png\n2b125e9fc8.png\t5f112087b7.png\t9146f2afd1.png\tc36ea2b661.png\tf783725948.png\n2b13786b5e.png\t5f15393348.png\t91471464a2.png\tc36f7b3920.png\tf78391fcd1.png\n2b157ab430.png\t5f15a72005.png\t914945be12.png\tc37134e5b8.png\tf784443dc3.png\n2b160a1291.png\t5f15a9ec4c.png\t914df9f8f8.png\tc37685f2ad.png\tf78674d646.png\n2b185ef472.png\t5f168cd2d5.png\t914f9a2e4e.png\tc376dbe602.png\tf78ea315ec.png\n2b1bc39770.png\t5f171def5f.png\t915238dbf7.png\tc3789dbed6.png\tf78fcd6c45.png\n2b1e71c2a3.png\t5f1aa014c5.png\t9156a46309.png\tc37a17f712.png\tf791482ed1.png\n2b221fc0d7.png\t5f1bd25287.png\t9159c5f8fe.png\tc37ab09296.png\tf7933b266c.png\n2b23c2e75a.png\t5f1bec06fe.png\t915ad56faf.png\tc37b8373f3.png\tf799c8d19f.png\n2b2cafa2f0.png\t5f1df5e4a8.png\t915be642cc.png\tc37e74bffc.png\tf79b4370a0.png\n2b2fc71ee1.png\t5f1e22d81c.png\t915ead8b25.png\tc37f7c51e9.png\tf7a341bb7a.png\n2b2fdaaa9b.png\t5f240e936f.png\t9163f15e89.png\tc3825792c5.png\tf7a539ee62.png\n2b30313d2c.png\t5f248cf0d6.png\t916aff36ae.png\tc385af3da9.png\tf7a8c1442e.png\n2b31c73245.png\t5f28207f7a.png\t916c472945.png\tc387a012fc.png\tf7a9edb985.png\n2b34eb8436.png\t5f28c579a7.png\t916e95dae3.png\tc38c3f0a49.png\tf7aabef468.png\n2b39f9a88f.png\t5f2b6a66d1.png\t9175c41a74.png\tc38e2a42e3.png\tf7adc6c4a8.png\n2b3a5ae3ea.png\t5f37e756a1.png\t9175e7d74c.png\tc38f7daf18.png\tf7b1bd7c8a.png\n2b3efd7d7f.png\t5f384be473.png\t91791bd48c.png\tc39f317ad0.png\tf7b2fe49a6.png\n2b3fa35f16.png\t5f397e0074.png\t917971eeb8.png\tc3a749a6a7.png\tf7b409e684.png\n2b4049f833.png\t5f3b26ac68.png\t917a0e6868.png\tc3a963f5e3.png\tf7b5458d2a.png\n2b405e7016.png\t5f3bc00104.png\t917e80f94e.png\tc3b07f663a.png\tf7b7615fb8.png\n2b40a03d1d.png\t5f3bc6338a.png\t917ef84e64.png\tc3b0fd8a6d.png\tf7b9f874cb.png\n2b41b80931.png\t5f3f6d6ca6.png\t91823ecb1f.png\tc3b1a8efa1.png\tf7c147ac61.png\n2b420cf6f6.png\t5f405e1970.png\t918548e472.png\tc3b31898df.png\tf7c45731d2.png\n2b4c4aeecc.png\t5f41db0529.png\t918d76601c.png\tc3b42ca60c.png\tf7c67e98e7.png\n2b4dbfc252.png\t5f43816c42.png\t918e6952a0.png\tc3bafa1d78.png\tf7c685315e.png\n2b5388e7b9.png\t5f43ce7f89.png\t919157e393.png\tc3c242a14d.png\tf7c8709aad.png\n2b55bdbda1.png\t5f45291a7a.png\t9197adb6f3.png\tc3c25408ae.png\tf7cb3559f9.png\n2b562b2ea8.png\t5f487741b7.png\t919b7a9683.png\tc3c257491a.png\tf7d4d6fb0d.png\n2b624269dd.png\t5f48c8c123.png\t919bc0e2ba.png\tc3c41a510a.png\tf7d6830beb.png\n2b62bc74fd.png\t5f4c315e78.png\t919f18579b.png\tc3cb500dfb.png\tf7d7577dff.png\n2b641b5aee.png\t5f51a19f9e.png\t919f72b795.png\tc3ce830a05.png\tf7dc2b2c9a.png\n2b65712f62.png\t5f53c9d2
52.png\t919fdc4e34.png\tc3ceb42632.png\tf7e09f79e1.png\n2b65f74905.png\t5f5445bcd8.png\t91a0ac314c.png\tc3d0ede216.png\tf7e384e255.png\n2b66898740.png\t5f54df3753.png\t91a92088b4.png\tc3dc03f42e.png\tf7e555a0aa.png\n2b66c196d7.png\t5f553d9bc2.png\t91b0697d86.png\tc3ea16ed8c.png\tf7e58da74f.png\n2b6f27a34d.png\t5f55c759cc.png\t91b10fffe2.png\tc3edae73f9.png\tf7e8060b57.png\n2b723b8474.png\t5f56aedf85.png\t91b12546b9.png\tc3ef1a2a7a.png\tf7e855fc40.png\n2b761449ef.png\t5f5b77cc1b.png\t91b88142fa.png\tc3f52220da.png\tf7ed9b7ab7.png\n2b76b47947.png\t5f5badbc35.png\t91b90e946d.png\tc3f533acee.png\tf7f8a24612.png\n2b77f82b42.png\t5f5c584d56.png\t91b91fd79b.png\tc3f7c12107.png\tf7ff576dc4.png\n2b7baf7374.png\t5f64a4011b.png\t91ba84f784.png\tc3fba1c34e.png\tf803e91a54.png\n2b80686da1.png\t5f664c10a2.png\t91bb8b5dbb.png\tc3feb6b712.png\tf807851f5a.png\n2b80a1a753.png\t5f69ff419b.png\t91bfb63ca7.png\tc3ff8a15a4.png\tf8088ca24a.png\n2b85abe20c.png\t5f6a9df8e3.png\t91bfe61493.png\tc4117da764.png\tf80ccad902.png\n2b8903d2db.png\t5f6cf01dd6.png\t91c003779c.png\tc41bf71d37.png\tf80f1932f1.png\n2b8d74fcb3.png\t5f6d235f00.png\t91c0886bbe.png\tc424ef2d35.png\tf8121521b6.png\n2b9d6bad0d.png\t5f6eb0452b.png\t91c3046e29.png\tc425404682.png\tf81413f017.png\n2b9fc720f8.png\t5f706cc210.png\t91c3a0e8a8.png\tc427e869c9.png\tf819844efd.png\n2ba0764fe1.png\t5f723327ef.png\t91c4dc1e59.png\tc4291c0396.png\tf81a3e3c9c.png\n2ba7fd429b.png\t5f77a1ba95.png\t91c68481cd.png\tc42a96a61c.png\tf81a4dd023.png\n2baa4cda28.png\t5f7859c06f.png\t91cae16c03.png\tc42abd17f3.png\tf81b02f6ee.png\n2bbae68be2.png\t5f78fdc4fb.png\t91d104a6b1.png\tc42d63b0e6.png\tf81e574205.png\n2bbba74c54.png\t5f7a888a88.png\t91d4f1574a.png\tc42e2dbf2a.png\tf81f9cead3.png\n2bbe5967d8.png\t5f7c5c59f5.png\t91dfba8133.png\tc430c464eb.png\tf82cc7ffe3.png\n2bc179b78c.png\t5f7e85ed97.png\t91ea3f1227.png\tc43159d789.png\tf82f19a681.png\n2bc8a3db3c.png\t5f80ba82d5.png\t91eca562cd.png\tc43594084d.png\tf836633c5d.png\n2bc9bc7c1b.png\t5f8370ed60.png\t91eccaa550.png\tc435b4f4f8.png\tf836b0746c.png\n2bca466c25.png\t5f86057ba1.png\t91f07b5bb6.png\tc437d4a4cd.png\tf83756c308.png\n2bca4fe522.png\t5f8b4540fd.png\t91f0dc8f97.png\tc43ad110ce.png\tf83888b84a.png\n2bcae75a7b.png\t5f8f40ce31.png\t91f656f295.png\tc43b8533c4.png\tf83931131f.png\n2bce09096c.png\t5f98029612.png\t91f6bcd7a7.png\tc440ceda26.png\tf839d20b06.png\n2bd492e362.png\t5f9879a8bc.png\t91f7d4e92f.png\tc4467ac3b7.png\tf83ecd0d61.png\n2bd5791def.png\t5f9c7e9503.png\t91fbe9ad11.png\tc44a16349c.png\tf83ff82a40.png\n2bd6da4508.png\t5fa15f3d6e.png\t91ffedcfd5.png\tc4505b8d57.png\tf841c6dee7.png\n2bd82f8fd4.png\t5fa536d1fd.png\t920d7f4c5a.png\tc4507c616a.png\tf844ef3537.png\n2bd87e0587.png\t5fa87fc4cd.png\t920dd2feba.png\tc450f9d7f4.png\tf84783b31a.png\n2bd913a165.png\t5fabfdc0b6.png\t92107f9559.png\tc453e5d3fb.png\tf85054273a.png\n2be0c0384d.png\t5fadeae354.png\t9211f9a1aa.png\tc455f7be1b.png\tf851b1959f.png\n2be45a7fa3.png\t5fae53bf33.png\t9213d2bca3.png\tc457249c34.png\tf85573609a.png\n2be52b518b.png\t5fafc03e52.png\t92165511b1.png\tc45bc4c599.png\tf85632cd95.png\n2be9a84d1b.png\t5fb0953e0b.png\t921b60e76d.png\tc45c9617bf.png\tf858319382.png\n2bec322251.png\t5fb39d0e7f.png\t921f62423d.png\tc463a7bc45.png\tf8594cc56f.png\n2bec33a46d.png\t5fb9409afd.png\t921fe866ac.png\tc467bf46da.png\tf85a321be3.png\n2becc508a4.png\t5fbbe16b53.png\t92290c13f0.png\tc4687713db.png\tf85a85710a.png\n2bf2a9037f.png\t5fbcb45ed1.png\t922f4b1692.png\tc46ec7a3a5.png\tf85c15d050.png\n2bf5343f03.png\t5fbeb50b5a.png\t922f50a838.png\tc470fd3fa4.
png\tf85d300376.png\n2bf60e2330.png\t5fbf274e64.png\t923249032c.png\tc47357ae6a.png\tf8621065c1.png\n2bf72b17e6.png\t5fbfc69a6c.png\t923473085f.png\tc47a2bb9a3.png\tf869349ffe.png\n2bfa664017.png\t5fbff21973.png\t9234e203f8.png\tc47b1ac061.png\tf86b6b05b6.png\n2bfc5326d4.png\t5fc4559c16.png\t92357f8a9c.png\tc47c5ea1c3.png\tf86c86466c.png\n2bfe79c7c6.png\t5fc8bb94aa.png\t9238f8f461.png\tc47f022330.png\tf870e41e88.png\n2bfe87d488.png\t5fcb672d96.png\t923aca4789.png\tc47f69debb.png\tf87154ea7b.png\n2c04e84b3d.png\t5fce13f8e1.png\t923afac48b.png\tc483125bdc.png\tf8723f9dd2.png\n2c07634387.png\t5fd24230c4.png\t923f8be483.png\tc4902fb3f1.png\tf87365827d.png\n2c0820cfb4.png\t5fd2a3ef64.png\t9241807175.png\tc4911615e1.png\tf876ebe60b.png\n2c0cdab53e.png\t5fd411e302.png\t9244c80d9b.png\tc491ef1f9f.png\tf8777febea.png\n2c0da8535c.png\t5fd6a2da0d.png\t92497ac74d.png\tc49addaf55.png\tf87ef87ca2.png\n2c0dca7cbd.png\t5fd8556f1b.png\t924c0e9cef.png\tc49ef24cd2.png\tf8853d0a66.png\n2c0e79d94a.png\t5fd8ed2395.png\t924c3d6a9c.png\tc4a1a78146.png\tf889a20f30.png\n2c0fd3a542.png\t5fdd07c869.png\t92564491d9.png\tc4a41f979d.png\tf88e532e4a.png\n2c157df08f.png\t5fe333fd26.png\t9256e2b7ba.png\tc4a43ba621.png\tf88f72e343.png\n2c17660e01.png\t5fe4e3f5e6.png\t9258e48804.png\tc4a681819d.png\tf89138fa75.png\n2c1d9a7284.png\t5fe599e1a5.png\t925d995276.png\tc4a70f7cba.png\tf8921580bb.png\n2c1eaa1a47.png\t5fe661243a.png\t9260b4f758.png\tc4a7680264.png\tf89a479c21.png\n2c22f20699.png\t5fe816a330.png\t92629f769b.png\tc4adb6e891.png\tf89b63c425.png\n2c27610d37.png\t5fe8923649.png\t9267408c45.png\tc4b05c2eaa.png\tf8a05eac67.png\n2c2e72caf5.png\t5fe905145d.png\t9267f6cf9c.png\tc4b4099179.png\tf8a1f29b78.png\n2c312d9f1f.png\t5fe9fb5c4e.png\t9269b8617c.png\tc4b4370ea7.png\tf8a2cb1d13.png\n2c321af2dd.png\t5fea3149b3.png\t926a4a511c.png\tc4b802d1d1.png\tf8aac62301.png\n2c327a06e2.png\t5febd30d82.png\t9271befaa3.png\tc4b84025e1.png\tf8ae2a094d.png\n2c34919bdf.png\t5febd850e1.png\t927218abd3.png\tc4b9e32030.png\tf8af98960f.png\n2c3585a547.png\t5fee7724fc.png\t927ac50fa4.png\tc4bc6757ce.png\tf8afabcb4c.png\n2c38be5eea.png\t5fefd5edd1.png\t927c12b972.png\tc4c0cb57f0.png\tf8b721f632.png\n2c3c64402e.png\t5ff89814f5.png\t9281a1fa33.png\tc4d1e821d1.png\tf8b9fa402a.png\n2c436eb10f.png\t5ffd928d1e.png\t92895cb54f.png\tc4d31c5319.png\tf8ba90c991.png\n2c45b152f1.png\t6010b9b84d.png\t928ba68f99.png\tc4d777c228.png\tf8c1b2b3b1.png\n2c47d41861.png\t60135a8a90.png\t929063c568.png\tc4e6a80118.png\tf8ca1f5c5a.png\n2c4b9599df.png\t6013c6b879.png\t9290c42ee0.png\tc4e956d952.png\tf8cd0940ab.png\n2c4d015b90.png\t6015110caa.png\t929322a227.png\tc4e9c0dda0.png\tf8d07a5ecf.png\n2c4e9612cc.png\t60235a3230.png\t9293eee645.png\tc4ed7cdb1c.png\tf8d083fbd9.png\n2c4ec05c50.png\t60269a895b.png\t92944e2aab.png\tc4ee594685.png\tf8d3fb6b13.png\n2c501a8b77.png\t602cf815b4.png\t929bb32e72.png\tc4f145aef7.png\tf8d8c9be54.png\n2c51e2a97b.png\t602dc99f10.png\t929f4f4c47.png\tc4f24495e3.png\tf8e06d0345.png\n2c55aedecd.png\t603069d579.png\t92a20ccbe2.png\tc4f2799234.png\tf8e39bd61f.png\n2c5df02c7f.png\t6031814bd6.png\t92a2165923.png\tc4f2855630.png\tf8e3b0b7c6.png\n2c5ee8638a.png\t6032385d55.png\t92a6b22052.png\tc4f2b6ed96.png\tf8e67987c6.png\n2c5fd36516.png\t603549682a.png\t92a88f5641.png\tc4f2c790fb.png\tf8e7ee80bc.png\n2c61e169be.png\t60373566f9.png\t92abe333f4.png\tc4f3cced43.png\tf8ea0667b8.png\n2c67cc6d42.png\t6040cbf086.png\t92ad1bf3e3.png\tc4f5a0fa36.png\tf8ea96a1ed.png\n2c6dbbc57c.png\t6040f6b5af.png\t92af56fd8a.png\tc4f8acef95.png\tf8f3fa162f.png\n2c6e23e219.png
\t6047a56db1.png\t92b1a7eec7.png\tc4f9fffbad.png\tf8f7ed86fc.png\n2c707479f9.png\t6048597c55.png\t92b23cfe90.png\tc4fcfdce00.png\tf8f9e5ba3b.png\n2c755bf200.png\t604c708fe5.png\t92b5f47694.png\tc5098d265e.png\tf8fb47e8e1.png\n2c76837632.png\t604dafb545.png\t92b7e5def2.png\tc50a04f278.png\tf8fba159c0.png\n2c7f7f93a7.png\t604dc2c69c.png\t92b9a0c1a6.png\tc50affbe08.png\tf8ff29eba9.png\n2c8148ce76.png\t604e5bbf9b.png\t92c25915f1.png\tc50ba562ef.png\tf8ff7058b6.png\n2c84e49a39.png\t604e99b0c6.png\t92c6b79a7c.png\tc50d5ef081.png\tf905b4ed4a.png\n2c89a54ba5.png\t6056e713c4.png\t92c6e998bf.png\tc51173f352.png\tf906c96b0b.png\n2c8a53b559.png\t60584b31d8.png\t92c77ff4df.png\tc51176fe44.png\tf90accfd37.png\n2c8a7e025d.png\t6059cedbfb.png\t92c91d1494.png\tc5141ec49f.png\tf90ae4ac62.png\n2c9058bc9e.png\t60611407d3.png\t92caf3281a.png\tc514a65854.png\tf90d64d4bf.png\n2c9cb402d7.png\t6061f711e6.png\t92cb8028e8.png\tc51572a089.png\tf90d686e01.png\n2c9e776fd0.png\t60638605b1.png\t92cbec90ac.png\tc515bc227f.png\tf91185cc34.png\n2ca6256249.png\t60668ceb17.png\t92cc29d923.png\tc516803d41.png\tf9193e9a6d.png\n2ca82d707e.png\t6066dbfca9.png\t92cdbbb3f5.png\tc5172e6af4.png\tf91af813d6.png\n2cb5045cba.png\t60698e8fd2.png\t92ceaa55e4.png\tc51ad843f1.png\tf91da5124f.png\n2cc1211c36.png\t606ca6621d.png\t92cef4936f.png\tc51edbadc7.png\tf928d746c3.png\n2cc9698c4c.png\t606cf6645d.png\t92cfea3778.png\tc529870d8c.png\tf92998b2d2.png\n2cc9b18030.png\t606ef9d9f3.png\t92d0113e12.png\tc52c863224.png\tf92e022f08.png\n2cc9e8a198.png\t606efcc5ae.png\t92d116836e.png\tc52f664819.png\tf93e29bff0.png\n2ccc636a08.png\t606f5503e8.png\t92d1de717f.png\tc52faa0d3a.png\tf93f231ea4.png\n2ccf5a4fcb.png\t60738ba31e.png\t92d9025a37.png\tc531c994cd.png\tf9467f033a.png\n2cd315fb22.png\t6074c5b0bf.png\t92d9119491.png\tc5327d784b.png\tf947b41164.png\n2cd4095182.png\t60764d08d3.png\t92deb56177.png\tc539d447bf.png\tf94f2ddfcd.png\n2cd6db6fcd.png\t607728ac81.png\t92e3428d1e.png\tc54217c477.png\tf94f8cce10.png\n2cdc2c12a3.png\t6080c9f07c.png\t92e4968815.png\tc543bb359b.png\tf950879320.png\n2ce0ef6162.png\t608567ed23.png\t92e4cc072a.png\tc5493cc7b0.png\tf953e11b4c.png\n2ce1812846.png\t608574722a.png\t92e617896e.png\tc54981fe30.png\tf959f79b64.png\n2ce7a9e81b.png\t6086dd952b.png\t92eaa4c1c5.png\tc54a1a835a.png\tf959ffc6d9.png\n2ce8354e3c.png\t608e51e49b.png\t92eb4aa6e4.png\tc54ccd9f94.png\tf95b20e4b5.png\n2ce903774a.png\t6090b5c911.png\t92eb99caa8.png\tc551ed5fad.png\tf95cb64587.png\n2ce9c36a98.png\t6092afaec7.png\t92ec5cd1a3.png\tc555df6518.png\tf96265d40e.png\n2ced1f6e8f.png\t60969ee8a0.png\t92f0273998.png\tc5592544a0.png\tf9629f4e29.png\n2cf2b06f33.png\t609afffbe8.png\t92fa1137d9.png\tc55beb0582.png\tf963d0da02.png\n2cf50a51a0.png\t609c37ba03.png\t92fa3084e7.png\tc55f5b7931.png\tf9645c1acf.png\n2cf5acb2c2.png\t609d7d1c49.png\t92fb5b47ac.png\tc5627d8c35.png\tf96854d4b7.png\n2cf5e31383.png\t60a06c3078.png\t92fe1d923f.png\tc563ab4d1f.png\tf96ce17553.png\n2cf988af12.png\t60a0ca8d0e.png\t93007a203b.png\tc56524ded9.png\tf96f608c5a.png\n2cf9b202bb.png\t60a54aae83.png\t930183efe5.png\tc56c59f691.png\tf970a44702.png\n2cfa33fde4.png\t60a5591917.png\t930666ce82.png\tc571804a93.png\tf97372fd50.png\n2cfae3f373.png\t60a686709b.png\t930939bcaa.png\tc573df4315.png\tf976134657.png\n2d05b0403b.png\t60adaaeeef.png\t930a21d485.png\tc5745e04ff.png\tf97e33f4a0.png\n2d05dbfff2.png\t60affd7e45.png\t930ace946d.png\tc5749817aa.png\tf981f39d37.png\n2d069cef1f.png\t60b08464de.png\t930c8dd0e4.png\tc575484a95.png\tf9829bf4d2.png\n2d08095342.png\t60b26ed15e.png\t930cbb8ccb.png\tc
576a01a52.png\tf98a87f651.png\n2d0eaa0a25.png\t60b3d0d613.png\t930dcd571a.png\tc576ef893a.png\tf98e740c98.png\n2d177a217d.png\t60b574ccdd.png\t930e785c7d.png\tc5779b8a86.png\tf990b164fd.png\n2d1846ae12.png\t60b7303d20.png\t93196366e1.png\tc58124fdc4.png\tf992390442.png\n2d1a6bb739.png\t60b7f7060d.png\t931f8f7623.png\tc58269ccfe.png\tf99a792e1c.png\n2d1bc23321.png\t60ba38f87b.png\t9321025f07.png\tc5843a2b20.png\tf99a7a0250.png\n2d209b870d.png\t60baa3002f.png\t9324c530ca.png\tc585215aef.png\tf9a3d103d7.png\n2d20ee4eff.png\t60bb53f935.png\t932529c83d.png\tc58b431079.png\tf9a438cdfd.png\n2d230a0550.png\t60bbd166a4.png\t9327ed7fc3.png\tc58b6277c7.png\tf9a787b1ec.png\n2d29478050.png\t60bc283617.png\t932d0af9a5.png\tc58cbe572d.png\tf9aa42899b.png\n2d2b303e07.png\t60bdccece7.png\t932ec40ab4.png\tc58edbac0a.png\tf9ac19b3e4.png\n2d2f28610a.png\t60bf7c5fbd.png\t93353de86c.png\tc593c4a497.png\tf9b563d232.png\n2d36bc16c9.png\t60c1bb4390.png\t9337b5e168.png\tc593c662f1.png\tf9b7196dfc.png\n2d37d558a3.png\t60c4d1d597.png\t933b9adba0.png\tc5959875cd.png\tf9bb50511b.png\n2d3c0a68f9.png\t60c70d9d2b.png\t933d90f822.png\tc59cbc2c2c.png\tf9becf46d4.png\n2d3e3db6b9.png\t60cad4fd1e.png\t9347c39706.png\tc59e29f167.png\tf9c42b8d50.png\n2d3ff9e083.png\t60ce9113f9.png\t9347eca0c8.png\tc59e59474e.png\tf9c9f273a1.png\n2d44cabf72.png\t60cfe5b3e0.png\t9348c2319b.png\tc59e5f46c0.png\tf9cc9a649d.png\n2d450d63a8.png\t60d3607be2.png\t934a1de79c.png\tc5a37a86fb.png\tf9d03271d6.png\n2d4e7a937b.png\t60d7d451e2.png\t934ac92102.png\tc5a4c4e6d1.png\tf9d5a850cf.png\n2d50edd604.png\t60d9423890.png\t934c53d70e.png\tc5a4dbdd1a.png\tf9d5bb76c5.png\n2d55a5841b.png\t60dccbc52f.png\t934cd14205.png\tc5a9c13a77.png\tf9d8f014c2.png\n2d57242565.png\t60dfd16b68.png\t934cd199f2.png\tc5af6a21e7.png\tf9d932e912.png\n2d5846820b.png\t60e0902746.png\t934fcb3879.png\tc5afb8174e.png\tf9e510b5d4.png\n2d5933d5f6.png\t60e17a8944.png\t93506ed6e7.png\tc5b4548ff2.png\tf9e8baab8d.png\n2d5ca11aae.png\t60e40140e7.png\t9351220cec.png\tc5c0d14356.png\tf9eb5a18dd.png\n2d5f4b0213.png\t60e7b5d738.png\t935329585f.png\tc5c4479af6.png\tf9ebff804b.png\n2d64b6a8cf.png\t60e9913cf8.png\t93577f1313.png\tc5c7f765d9.png\tf9ef613aec.png\n2d65eed666.png\t60ed5847a1.png\t9358b1fa7a.png\tc5caa57f2f.png\tf9f0be34b7.png\n2d6a27493c.png\t60fa9f985c.png\t935bd57df6.png\tc5cae37fca.png\tf9f6588f79.png\n2d6b59a83b.png\t60fb99942f.png\t935cba59ad.png\tc5cc962173.png\tf9f779bc32.png\n2d6c9ecb08.png\t61023e1d99.png\t935e8e43fc.png\tc5ce3b7c0f.png\tf9fb359f7d.png\n2d6d0b8ed3.png\t6102cfaeb3.png\t9360f588c1.png\tc5d0701e78.png\tf9fc7746fb.png\n2d6f3c5ea4.png\t610522a8ea.png\t93614d4b95.png\tc5d1c2b8dc.png\tf9fd089684.png\n2d6fbe4e0e.png\t61071d8a4c.png\t9366fccb95.png\tc5d6528a35.png\tfa01eb7fdf.png\n2d744191c8.png\t6107902d61.png\t936d863260.png\tc5d65cc917.png\tfa02cf34fd.png\n2d75d6bc0b.png\t610a3254fc.png\t936fa8e3e4.png\tc5d6fe7b88.png\tfa03ff2472.png\n2d76db6f66.png\t610d6fa9e0.png\t9375886364.png\tc5d8f51521.png\tfa04dab924.png\n2d777bcac5.png\t610f0279fa.png\t937ac0c55c.png\tc5dad43641.png\tfa05e43818.png\n2d7a83381c.png\t611103499c.png\t937b4cd8e7.png\tc5dbddb17e.png\tfa061fde8d.png\n2d7e704551.png\t61138805e1.png\t937bf3bde5.png\tc5dc55c970.png\tfa08ffa6fb.png\n2d7ff66d35.png\t6119e97074.png\t937d435c0c.png\tc5dded34ae.png\tfa0ae3c3f9.png\n2d8a3a262b.png\t611d13abcd.png\t937ea43a65.png\tc5df6c89df.png\tfa0beb5007.png\n2d9abe1e25.png\t61291bc911.png\t937f1253a0.png\tc5e23e3c24.png\tfa0daef792.png\n2d9b4d01d0.png\t612c946296.png\t9382a047a8.png\tc5e3415a42.png\tfa14635802.png\n2da7
63222e.png\t612e3372ce.png\t9385125b12.png\tc5e7ec3c34.png\tfa17bef99a.png\n2da79e62f9.png\t612eab5d71.png\t9388190e1d.png\tc5e9aa26af.png\tfa1c58a6c2.png\n2da8adf16a.png\t6131d2d8b3.png\t938ad4fd40.png\tc5eb095a74.png\tfa1edfd24c.png\n2da8da57e1.png\t61321814ce.png\t938d11586e.png\tc5ef2c8428.png\tfa2269d3e6.png\n2da9564e53.png\t6134054093.png\t938d4036a9.png\tc5ef41f73e.png\tfa230ac409.png\n2dad6bef5f.png\t6138f82415.png\t939232310f.png\tc5f1688f84.png\tfa233440c7.png\n2db3730f05.png\t6139ed81ab.png\t939855018d.png\tc5f3085b12.png\tfa25724a62.png\n2db8392cd9.png\t613bde123c.png\t939879034e.png\tc5f335d44c.png\tfa2c779db2.png\n2db89bd54e.png\t613ce5c75c.png\t9399b45f19.png\tc5f5614e87.png\tfa2d86387e.png\n2dbaca14cf.png\t613d4bb70f.png\t939c88a964.png\tc5f64bb650.png\tfa2e0ec852.png\n2dbbe7bf53.png\t613da61fd8.png\t939e2fceef.png\tc5f6d81074.png\tfa2ebd1083.png\n2dbcfec0b3.png\t613fb95edf.png\t939ec7a920.png\tc5f7b76607.png\tfa3575af98.png\n2dc58e4bd0.png\t614270933e.png\t939fbf65d8.png\tc5f83feb7d.png\tfa37b3bbb5.png\n2dc9e78155.png\t61427e5f8b.png\t93a0cb5111.png\tc5fbc0612f.png\tfa3d9abeb9.png\n2dcae23744.png\t61438a146a.png\t93a1541218.png\tc6013519e9.png\tfa3e7cb3f5.png\n2dcf7dc762.png\t61493ca044.png\t93a8901635.png\tc602d02313.png\tfa3e8609bb.png\n2dd21f355b.png\t6149ec3408.png\t93a9e01734.png\tc60bdbbcae.png\tfa420ff909.png\n2dd318a0c7.png\t614afa7224.png\t93ae9f8df9.png\tc60eaa0c75.png\tfa44a921f8.png\n2dd75dd10c.png\t614cd43381.png\t93b51f6335.png\tc60ef7505e.png\tfa45b2168b.png\n2ddb9d0b5e.png\t614dd9f857.png\t93b58cb146.png\tc60f31657e.png\tfa461678e9.png\n2ddfdf7a00.png\t614e08f3f8.png\t93b8ec180f.png\tc60ff9fb7d.png\tfa467c163e.png\n2de086d292.png\t6150f39f8d.png\t93badd2207.png\tc613a090a4.png\tfa4c444dd5.png\n2de62031ee.png\t61538a4a19.png\t93bcc09673.png\tc614988286.png\tfa4f1b94b2.png\n2dea78df15.png\t6153c227de.png\t93bcc1b466.png\tc616ac5178.png\tfa503f3f93.png\n2deae378ef.png\t61541269c2.png\t93be99ac1f.png\tc61e0f2507.png\tfa50e86a72.png\n2df17e4028.png\t6158817b25.png\t93c17f9615.png\tc620332bc0.png\tfa56377e58.png\n2df8965962.png\t615b89367a.png\t93c3bd5f5c.png\tc621f020d9.png\tfa56b04a29.png\n2df90133e5.png\t6169381013.png\t93c58d5dd5.png\tc6239930ec.png\tfa59461848.png\n2e035505d6.png\t616afb7dc0.png\t93c7358e41.png\tc62768ea85.png\tfa5f0040ee.png\n2e0ae8fa98.png\t616d7f08a8.png\t93c8f958ce.png\tc627f6c6c5.png\tfa62872b1c.png\n2e0bbaa822.png\t616dc8b3e1.png\t93cb37cdd3.png\tc62a370303.png\tfa63971218.png\n2e0bd0472c.png\t616e0b20ba.png\t93d0068f10.png\tc62b3e15f4.png\tfa65fcaf12.png\n2e10314df1.png\t616ebf67c3.png\t93d0ed6abd.png\tc62cffd089.png\tfa68bdde45.png\n2e10e5fdf0.png\t616ef89f45.png\t93d21b7636.png\tc62fb52445.png\tfa69566591.png\n2e16e6397a.png\t6172891da4.png\t93d3950d7b.png\tc6339fc9f8.png\tfa6d82463a.png\n2e1884a536.png\t61772569c7.png\t93d82c5591.png\tc63683e0af.png\tfa6da5cb9d.png\n2e19b36968.png\t6177ff1ad4.png\t93d95729b9.png\tc638c588fe.png\tfa70e3da06.png\n2e1af0654d.png\t617af65d86.png\t93ec7f697f.png\tc63a278bac.png\tfa74fdc04c.png\n2e1f0b7a74.png\t617c746fca.png\t93f3cf7644.png\tc645f9fb9d.png\tfa752c35ca.png\n2e21ca5ebe.png\t617cee13bc.png\t93f7759d52.png\tc6466c2752.png\tfa78260d1c.png\n2e24b94e5d.png\t617e3ea776.png\t93f8f9e663.png\tc646a94e06.png\tfa87eb3962.png\n2e3ab718cd.png\t617f00b8de.png\t93facfbb35.png\tc64b87ba5a.png\tfa899a655b.png\n2e3b9098b8.png\t617f259ef0.png\t93ff5d63de.png\tc65a7ae03a.png\tfa92bed2b1.png\n2e3c7838bb.png\t617fff6c35.png\t9402477e2e.png\tc65a9fa7e1.png\tfa93747e87.png\n2e40cf4b88.png\t6182e7e891.png\t940eb1b
be4.png\tc6612326fa.png\tfa976875f7.png\n2e482fb818.png\t618bace044.png\t940fce5c20.png\tc6648ea2ca.png\tfa9ad1caeb.png\n2e4a592cac.png\t618f843c7a.png\t9411951a0d.png\tc6650767c2.png\tfa9aec9e27.png\n2e4b5494c7.png\t61934728ec.png\t9416e65d9e.png\tc66a8f70e4.png\tfa9b3f2aa9.png\n2e54db05a5.png\t6193839fa8.png\t941c82187f.png\tc66a9ea31d.png\tfa9da6a455.png\n2e55225481.png\t61991ac001.png\t941d0865da.png\tc66acd78a9.png\tfa9e8e851e.png\n2e55ac15c6.png\t619a549372.png\t9427ca3902.png\tc66ae04671.png\tfaa5a454bc.png\n2e5811bc80.png\t619edf8941.png\t942b755ac6.png\tc66e8dcd88.png\tfaa76d8a7c.png\n2e616e1cde.png\t619fa9083c.png\t942bf47b58.png\tc66fd86269.png\tfaa833d1e6.png\n2e61d980c7.png\t61a0a17dcd.png\t942d3bfbf3.png\tc6722496fc.png\tfaabfd1272.png\n2e69b6eb16.png\t61a526dba0.png\t942f9aee0a.png\tc673bebe29.png\tfaaf883ec2.png\n2e6d2556b4.png\t61a61114b9.png\t9431677217.png\tc67832ba86.png\tfab01bfcae.png\n2e749cfd59.png\t61ab0aa78b.png\t94348a4334.png\tc67d804ce4.png\tfab4191314.png\n2e7921981a.png\t61ac649947.png\t9435647390.png\tc687189501.png\tfab48de08f.png\n2e7d8297e7.png\t61b42f8f0e.png\t943754ba03.png\tc6885a38e6.png\tfab534d1ce.png\n2e7e2de23f.png\t61b6c452ad.png\t9437ba261e.png\tc689bc0cbb.png\tfab9e5a46a.png\n2e80aec171.png\t61b8d3debd.png\t943daf2fcb.png\tc68b2ca953.png\tfabad6b46a.png\n2e87e350af.png\t61ba42e450.png\t943df308b6.png\tc68d428ccb.png\tfabb5223dc.png\n2e8bf3f7b3.png\t61bd2ee822.png\t9447159eb7.png\tc68ee0308f.png\tfabf77ade7.png\n2e9396a8df.png\t61c0129a58.png\t944b46458e.png\tc69942be7a.png\tfac01da8d0.png\n2ea086caa3.png\t61c1a0a4ee.png\t944d2d6bb8.png\tc69962cd9d.png\tfac33dff05.png\n2ea0b129a3.png\t61c2d72207.png\t94500b7464.png\tc6a07c1bf0.png\tfac66f7b11.png\n2ea5f9e7d9.png\t61c5b61567.png\t945317d107.png\tc6a614c6fb.png\tfacbcfa109.png\n2eaa0e2925.png\t61c79657d1.png\t94548cba37.png\tc6a9583845.png\tfaced45569.png\n2eae67f243.png\t61cfb43abf.png\t94551e800f.png\tc6aaba4d7e.png\tfad11bf2fa.png\n2eae73b03a.png\t61d15e4861.png\t9458b40464.png\tc6aaf86f7f.png\tfadacca720.png\n2eaecdf8b5.png\t61d244b0e1.png\t945b8261c9.png\tc6ac3cb5f1.png\tfadd6aa446.png\n2eaf33b9f0.png\t61d395dbd4.png\t9461c11c60.png\tc6acc6edb0.png\tfadd701a33.png\n2eb194fca4.png\t61db631085.png\t9465348644.png\tc6b0fcca43.png\tfadefe071c.png\n2eb37ccc68.png\t61df7e1f4d.png\t9466e21e0d.png\tc6b2feddd7.png\tfae3d31a65.png\n2eb9de549b.png\t61e325ac65.png\t946fe1467b.png\tc6b59300c7.png\tfae8a1af69.png\n2ebc0fb07c.png\t61e932cc71.png\t9472731dfd.png\tc6ba202202.png\tfae8a6874d.png\n2ebc20ce2a.png\t61eb7aa298.png\t94759a6603.png\tc6bc5851f4.png\tfaf72a780b.png\n2ebe2eba03.png\t61eb86f1ad.png\t9476c5ca8d.png\tc6c3e6a015.png\tfaf8e27e42.png\n2eca5f70b1.png\t61ec0b181a.png\t9478850fcc.png\tc6c65a92a2.png\tfb03170eac.png\n2ecb2b0797.png\t61eddad5ff.png\t9478fcc031.png\tc6c66ccb3c.png\tfb08e351bd.png\n2ecd04a9f8.png\t61f1a55dbf.png\t947bd72542.png\tc6ccbc78ff.png\tfb09c214ae.png\n2ecd05d83c.png\t61fa3e07b2.png\t9480d31795.png\tc6ccf15b4e.png\tfb0a643ce3.png\n2ecfb10439.png\t61fbd7bcd9.png\t9484a42649.png\tc6cebf6829.png\tfb0b7686a2.png\n2ecfd04874.png\t6204577291.png\t94856b278e.png\tc6d17f5835.png\tfb131f3cd5.png\n2ed02fc499.png\t6205973a3e.png\t94857fe622.png\tc6d5346205.png\tfb14beb69f.png\n2ed7843982.png\t6206b55810.png\t948b2bed44.png\tc6d577c616.png\tfb17f161d1.png\n2edf2e3e05.png\t6208c7711e.png\t948d73d7c2.png\tc6d9c13cea.png\tfb1977dbf4.png\n2ee235848e.png\t62135221f1.png\t948e82d61e.png\tc6e3a3feea.png\tfb1add496b.png\n2ee6e81e4b.png\t6213dd23d6.png\t948eba1e86.png\tc6e6e51ec1.png\tfb1b82dde6
.png\n[... output truncated: directory listing of thousands of PNG image filenames omitted ...]\n"
],
[
"from sklearn.model_selection import train_test_split\n\nX_train, X_valid, X_feat_train, X_feat_valid, y_train, y_valid = train_test_split(X, X_feat, y, test_size=0.15, random_state=42)",
"_____no_output_____"
],
[
"callbacks = [\n EarlyStopping(patience=5, verbose=1),\n ReduceLROnPlateau(patience=3, verbose=1),\n ModelCheckpoint('model-tgs-salt-2.h5', verbose=1, save_best_only=True, save_weights_only=False)\n]",
"_____no_output_____"
],
[
"results = model.fit({'img': X_train, 'feat': X_feat_train}, y_train, batch_size=16, epochs=50, callbacks=callbacks,\n validation_data=({'img': X_valid, 'feat': X_feat_valid}, y_valid))",
"Train on 3400 samples, validate on 600 samples\nEpoch 1/50\n3400/3400 [==============================] - 20s 6ms/step - loss: 0.5518 - val_loss: 0.4322\n\nEpoch 00001: val_loss improved from inf to 0.43223, saving model to model-tgs-salt-2.h5\nEpoch 2/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.4242 - val_loss: 0.3713\n\nEpoch 00002: val_loss improved from 0.43223 to 0.37132, saving model to model-tgs-salt-2.h5\nEpoch 3/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.3469 - val_loss: 0.4525\n\nEpoch 00003: val_loss did not improve from 0.37132\nEpoch 4/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.2967 - val_loss: 0.2642\n\nEpoch 00004: val_loss improved from 0.37132 to 0.26422, saving model to model-tgs-salt-2.h5\nEpoch 5/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.2647 - val_loss: 0.2500\n\nEpoch 00005: val_loss improved from 0.26422 to 0.24997, saving model to model-tgs-salt-2.h5\nEpoch 6/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.2499 - val_loss: 0.2329\n\nEpoch 00006: val_loss improved from 0.24997 to 0.23293, saving model to model-tgs-salt-2.h5\nEpoch 7/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.2382 - val_loss: 0.2179\n\nEpoch 00007: val_loss improved from 0.23293 to 0.21790, saving model to model-tgs-salt-2.h5\nEpoch 8/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.2212 - val_loss: 0.2293\n\nEpoch 00008: val_loss did not improve from 0.21790\nEpoch 9/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.2315 - val_loss: 0.2604\n\nEpoch 00009: val_loss did not improve from 0.21790\nEpoch 10/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.2138 - val_loss: 0.2087\n\nEpoch 00010: val_loss improved from 0.21790 to 0.20865, saving model to model-tgs-salt-2.h5\nEpoch 11/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.2069 - val_loss: 0.1930\n\nEpoch 00011: val_loss improved from 0.20865 to 0.19302, saving model to model-tgs-salt-2.h5\nEpoch 12/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1927 - val_loss: 0.1934\n\nEpoch 00012: val_loss did not improve from 0.19302\nEpoch 13/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1904 - val_loss: 0.1890\n\nEpoch 00013: val_loss improved from 0.19302 to 0.18904, saving model to model-tgs-salt-2.h5\nEpoch 14/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1763 - val_loss: 0.1816\n\nEpoch 00014: val_loss improved from 0.18904 to 0.18155, saving model to model-tgs-salt-2.h5\nEpoch 15/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1799 - val_loss: 0.1826\n\nEpoch 00015: val_loss did not improve from 0.18155\nEpoch 16/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1706 - val_loss: 0.2017\n\nEpoch 00016: val_loss did not improve from 0.18155\nEpoch 17/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1635 - val_loss: 0.1926\n\nEpoch 00017: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n\nEpoch 00017: val_loss did not improve from 0.18155\nEpoch 18/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1396 - val_loss: 0.1777\n\nEpoch 00018: val_loss improved from 0.18155 to 0.17767, saving model to model-tgs-salt-2.h5\nEpoch 19/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1324 - val_loss: 
0.1845\n\nEpoch 00019: val_loss did not improve from 0.17767\nEpoch 20/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1297 - val_loss: 0.1800\n\nEpoch 00020: val_loss did not improve from 0.17767\nEpoch 21/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1275 - val_loss: 0.1814\n\nEpoch 00021: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n\nEpoch 00021: val_loss did not improve from 0.17767\nEpoch 22/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1232 - val_loss: 0.1822\n\nEpoch 00022: val_loss did not improve from 0.17767\nEpoch 23/50\n3400/3400 [==============================] - 17s 5ms/step - loss: 0.1228 - val_loss: 0.1820\n\nEpoch 00023: val_loss did not improve from 0.17767\nEpoch 00023: early stopping\n"
],
[
"!ls",
"adc.json images model-tgs-salt-1.h5 sample_submission.csv\ttrain.csv\ndepths.csv masks sample_data\t\t test.zip\t\ttrain.zip\n"
],
[
"!unzip -q test.zip -d test",
"replace test/images/8cf16aa0f5.png? [y]es, [n]o, [A]ll, [N]one, [r]ename: N\n"
]
],
[
[
"# Predict\nRef: https://www.kaggle.com/jesperdramsch/intro-to-seismic-salt-and-how-to-geophysics",
"_____no_output_____"
]
],
[
[
"path_test='./test/'\n\ntest_ids = next(os.walk(path_test+\"images\"))[2]\n\nX_test = np.zeros((len(test_ids), im_height, im_width, im_chan), dtype=np.uint8)\nX_test_feat = np.zeros((len(test_ids), n_features), dtype=np.float32)\n\nsizes_test = []\nprint('Getting and resizing test images ... ')\nsys.stdout.flush()\nfor n, id_ in tqdm(enumerate(test_ids), total=len(test_ids)):\n path = path_test\n \n img = load_img(path + 'images/' + id_, grayscale=True)\n x_img = img_to_array(img)\n x_img = resize(x_img, (128, 128, 1), mode='constant', preserve_range=True)\n \n # Create cumsum x\n x_center_mean = x_img[border:-border, border:-border].mean()\n x_csum = (np.float32(x_img)-x_center_mean).cumsum(axis=0)\n x_csum -= x_csum[border:-border, border:-border].mean()\n x_csum /= max(1e-3, x_csum[border:-border, border:-border].std())\n\n \n # Save images\n X_test[n, ..., 0] = x_img.squeeze() / 255\n X_test[n, ..., 1] = x_csum.squeeze()\n \n #img = load_img(path + '/images/' + id_)\n #x = img_to_array(img)[:,:,1]\n sizes_test.append([x_img.shape[0], x_img.shape[1]])\n #x = resize(x, (128, 128, 1), mode='constant', preserve_range=True)\n #X_test[n] = x\n\nprint('Done!')\n\n\n\n#test_mask = pd.read_csv('test.csv')\n#file_list = list(train_mask['id'].values)\n#dataset = TGSSaltDataSet(train_path, file_list)",
"Getting and resizing test images ... \n"
],
[
"X_train.shape",
"_____no_output_____"
],
[
"X_test.shape",
"_____no_output_____"
],
[
"!ls -al",
"total 210516\ndrwxr-xr-x 1 root root 4096 Sep 14 21:18 .\ndrwxr-xr-x 1 root root 4096 Sep 14 16:19 ..\n-rw-r--r-- 1 root root 2556 Sep 14 17:05 adc.json\ndrwxr-xr-x 1 root root 4096 Sep 14 17:05 .config\n-rw-r--r-- 1 root root 329525 Sep 14 17:06 depths.csv\ndrwxr-xr-x 2 root root 790528 Sep 14 19:16 images\ndrwxr-xr-x 2 root root 4096 Sep 14 17:05 .kaggle\ndrwxr-xr-x 2 root root 135168 Jul 16 19:26 masks\n-rw-r--r-- 1 root root 2030528 Sep 14 21:13 model-tgs-salt-1.h5\ndrwxr-xr-x 2 root root 4096 Sep 13 17:28 sample_data\n-rw-r--r-- 1 root root 270012 Sep 14 17:06 sample_submission.csv\ndrwxr-xr-x 3 root root 4096 Sep 14 21:18 test\n-rw-r--r-- 1 root root 171262199 Sep 14 17:06 test.zip\n-rw-r--r-- 1 root root 943702 Sep 14 17:06 train.csv\n-rw-r--r-- 1 root root 39757560 Sep 14 17:06 train.zip\n"
],
[
"preds_test = model.predict([X_test, X_test_feat], verbose=1)",
"18000/18000 [==============================] - 23s 1ms/step\n"
],
[
"preds_test_t = (preds_test > 0.5).astype(np.uint8)",
"_____no_output_____"
],
[
"from tqdm import tnrange",
"_____no_output_____"
],
[
"# Create list of upsampled test masks\npreds_test_upsampled = []\nfor i in tnrange(len(preds_test)):\n preds_test_upsampled.append(resize(np.squeeze(preds_test[i]), \n (sizes_test[i][0], sizes_test[i][1]), \n mode='constant', preserve_range=True))",
"_____no_output_____"
],
[
"def RLenc(img, order='F', format=True):\n \"\"\"\n img is binary mask image, shape (r,c)\n order is down-then-right, i.e. Fortran\n format determines if the order needs to be preformatted (according to submission rules) or not\n\n returns run length as an array or string (if format is True)\n \"\"\"\n bytes = img.reshape(img.shape[0] * img.shape[1], order=order)\n runs = [] ## list of run lengths\n r = 0 ## the current run length\n pos = 1 ## count starts from 1 per WK\n for c in bytes:\n if (c == 0):\n if r != 0:\n runs.append((pos, r))\n pos += r\n r = 0\n pos += 1\n else:\n r += 1\n\n # if last run is unsaved (i.e. data ends with 1)\n if r != 0:\n runs.append((pos, r))\n pos += r\n r = 0\n\n if format:\n z = ''\n\n for rr in runs:\n z += '{} {} '.format(rr[0], rr[1])\n return z[:-1]\n else:\n return runs\n",
"_____no_output_____"
],
[
"def rle_encode(im):\n '''\n im: numpy array, 1 - mask, 0 - background\n Returns run length as string formated\n '''\n pixels = im.flatten(order = 'F')\n pixels = np.concatenate([[0], pixels, [0]])\n runs = np.where(pixels[1:] != pixels[:-1])[0] + 1\n print(runs)\n runs = np.unique(runs)\n runs = np.sort(runs)\n print(runs)\n runs[1::2] -= runs[::2]\n print(runs)\n #print(type(runs))\n #runs = sorted(list(set(runs)))\n return ' '.join(str(x) for x in runs)",
"_____no_output_____"
],
[
"from tqdm import tqdm_notebook",
"_____no_output_____"
],
[
"#pred_dict = {fn[:-4]:RLenc(np.round(preds_test_upsampled[i])) for i,fn in tqdm_notebook(enumerate(test_ids))}",
"_____no_output_____"
],
[
"def downsample(img):# not used\n if img_size_ori == img_size_target:\n return img\n return resize(img, (img_size_ori, img_size_ori), mode='constant', preserve_range=True)",
"_____no_output_____"
],
[
"threshold_best = 0.77\nimg_size_ori = 101",
"_____no_output_____"
],
[
"pred_dict = {idx: rle_encode(np.round(downsample(preds_test[i]) > threshold_best)) for i, idx in enumerate(tqdm_notebook(test_df.index.values))}",
"_____no_output_____"
],
[
"sub = pd.DataFrame.from_dict(pred_dict,orient='index')\nsub.index.names = ['id']\nsub.columns = ['rle_mask']\nsub.to_csv('submission.csv')",
"_____no_output_____"
],
[
"sub.head()",
"_____no_output_____"
],
[
"!ls",
"adc.json masks\t\t sample_data\t\ttest\t train.zip\ndepths.csv model-tgs-salt-1.h5 sample_submission.csv\ttest.zip\nimages\t model-tgs-salt-2.h5 submission.csv\t\ttrain.csv\n"
],
[
"!kaggle competitions submit -c tgs-salt-identification-challenge -f submission.csv -m \"Re-Submission with sorted rle_mask\"",
"Successfully submitted to TGS Salt Identification Challenge"
]
],
[
[
"# Predict\nRef: https://www.kaggle.com/shaojiaxin/u-net-with-simple-resnet-blocks",
"_____no_output_____"
]
],
[
[
"callbacks = [\n EarlyStopping(patience=5, verbose=1),\n ReduceLROnPlateau(patience=3, verbose=1),\n ModelCheckpoint('model-tgs-salt-new-1.h5', verbose=1, save_best_only=True, save_weights_only=True)\n]",
"_____no_output_____"
],
[
"#results = model.fit({'img': [X_train, X_train], 'feat': X_feat_train}, y_train, batch_size=16, epochs=50, callbacks=callbacks,\n# validation_data=({'img': [X_valid, X_valid], 'feat': X_feat_valid}, y_valid))\n\nepochs = 50\nbatch_size = 16\n\nhistory = model.fit(X_train, y_train,\n validation_data=[X_valid, y_valid], \n epochs=epochs,\n batch_size=batch_size,\n callbacks=callbacks)",
"_____no_output_____"
],
[
"def predict_result(model,x_test,img_size_target): # predict both orginal and reflect x\n x_test_reflect = np.array([np.fliplr(x) for x in x_test])\n preds_test1 = model.predict(x_test).reshape(-1, img_size_target, img_size_target)\n preds_test2_refect = model.predict(x_test_reflect).reshape(-1, img_size_target, img_size_target)\n preds_test2 = np.array([ np.fliplr(x) for x in preds_test2_refect] )\n preds_avg = (preds_test1 +preds_test2)/2\n return preds_avg",
"_____no_output_____"
],
[
"train_df = pd.read_csv(\"train.csv\", index_col=\"id\", usecols=[0])\ndepths_df = pd.read_csv(\"depths.csv\", index_col=\"id\")\ntrain_df = train_df.join(depths_df)\ntest_df = depths_df[~depths_df.index.isin(train_df.index)]",
"_____no_output_____"
],
[
"img_size_target = 101\n\nx_test = np.array([(np.array(load_img(\"./test/images/{}.png\".format(idx), grayscale = True))) / 255 for idx in tqdm(test_df.index)]).reshape(-1, img_size_target, img_size_target, 1)",
"\n 0%| | 0/18000 [00:00<?, ?it/s]\u001b[A\n 1%| | 132/18000 [00:00<00:13, 1313.41it/s]\u001b[A\n 1%|▏ | 263/18000 [00:00<00:13, 1310.71it/s]\u001b[A\n 2%|▏ | 392/18000 [00:00<00:13, 1302.42it/s]\u001b[A\n 3%|▎ | 505/18000 [00:00<00:14, 1244.96it/s]\u001b[A\n 4%|▎ | 635/18000 [00:00<00:13, 1259.55it/s]\u001b[A\n 4%|▍ | 754/18000 [00:00<00:13, 1233.94it/s]\u001b[A\n 5%|▍ | 871/18000 [00:00<00:14, 1212.41it/s]\u001b[A\n 6%|▌ | 994/18000 [00:00<00:13, 1216.93it/s]\u001b[A\n 6%|▌ | 1116/18000 [00:00<00:13, 1215.84it/s]\u001b[A\n 7%|▋ | 1239/18000 [00:01<00:13, 1217.53it/s]\u001b[A\n 8%|▊ | 1362/18000 [00:01<00:13, 1218.71it/s]\u001b[A\n 8%|▊ | 1482/18000 [00:01<00:13, 1211.21it/s]\u001b[A\n 9%|▉ | 1608/18000 [00:01<00:13, 1222.77it/s]\u001b[A\n 10%|▉ | 1732/18000 [00:01<00:13, 1227.49it/s]\u001b[A\n 10%|█ | 1854/18000 [00:01<00:13, 1209.75it/s]\u001b[A\n 11%|█ | 1975/18000 [00:01<00:13, 1203.63it/s]\u001b[A\n 12%|█▏ | 2096/18000 [00:01<00:13, 1191.35it/s]\u001b[A\n 12%|█▏ | 2215/18000 [00:01<00:13, 1178.53it/s]\u001b[A\n 13%|█▎ | 2333/18000 [00:01<00:13, 1122.94it/s]\u001b[A\n 14%|█▎ | 2453/18000 [00:02<00:13, 1143.91it/s]\u001b[A\n 14%|█▍ | 2574/18000 [00:02<00:13, 1160.97it/s]\u001b[A\n 15%|█▌ | 2701/18000 [00:02<00:12, 1191.57it/s]\u001b[A\n 16%|█▌ | 2825/18000 [00:02<00:12, 1203.74it/s]\u001b[A\n 16%|█▋ | 2950/18000 [00:02<00:12, 1216.94it/s]\u001b[A\n 17%|█▋ | 3072/18000 [00:02<00:12, 1204.15it/s]\u001b[A\n 18%|█▊ | 3196/18000 [00:02<00:12, 1212.49it/s]\u001b[A\n 18%|█▊ | 3318/18000 [00:02<00:12, 1195.16it/s]\u001b[A\n 19%|█▉ | 3438/18000 [00:02<00:12, 1177.27it/s]\u001b[A\n 20%|█▉ | 3559/18000 [00:02<00:12, 1185.17it/s]\u001b[A\n 20%|██ | 3686/18000 [00:03<00:11, 1207.48it/s]\u001b[A\n 21%|██ | 3807/18000 [00:03<00:12, 1162.31it/s]\u001b[A\n 22%|██▏ | 3927/18000 [00:03<00:12, 1171.63it/s]\u001b[A\n 23%|██▎ | 4053/18000 [00:03<00:11, 1195.80it/s]\u001b[A\n 23%|██▎ | 4177/18000 [00:03<00:11, 1206.12it/s]\u001b[A\n 24%|██▍ | 4300/18000 [00:03<00:11, 1211.63it/s]\u001b[A\n 25%|██▍ | 4424/18000 [00:03<00:11, 1219.11it/s]\u001b[A\n 25%|██▌ | 4547/18000 [00:03<00:11, 1202.87it/s]\u001b[A\n 26%|██▌ | 4668/18000 [00:03<00:11, 1184.55it/s]\u001b[A\n 27%|██▋ | 4790/18000 [00:03<00:11, 1193.36it/s]\u001b[A\n 27%|██▋ | 4914/18000 [00:04<00:10, 1205.16it/s]\u001b[A\n 28%|██▊ | 5038/18000 [00:04<00:10, 1213.10it/s]\u001b[A\n 29%|██▊ | 5160/18000 [00:04<00:10, 1212.18it/s]\u001b[A\n 29%|██▉ | 5282/18000 [00:04<00:10, 1185.64it/s]\u001b[A\n 30%|███ | 5403/18000 [00:04<00:10, 1192.64it/s]\u001b[A\n 31%|███ | 5523/18000 [00:04<00:10, 1193.88it/s]\u001b[A\n 31%|███▏ | 5647/18000 [00:04<00:10, 1206.58it/s]\u001b[A\n 32%|███▏ | 5768/18000 [00:04<00:10, 1191.70it/s]\u001b[A\n 33%|███▎ | 5888/18000 [00:04<00:10, 1182.28it/s]\u001b[A\n 33%|███▎ | 6007/18000 [00:05<00:10, 1177.15it/s]\u001b[A\n 34%|███▍ | 6127/18000 [00:05<00:10, 1183.36it/s]\u001b[A\n 35%|███▍ | 6246/18000 [00:05<00:09, 1182.25it/s]\u001b[A\n 35%|███▌ | 6373/18000 [00:05<00:09, 1205.00it/s]\u001b[A\n 36%|███▌ | 6498/18000 [00:05<00:09, 1218.10it/s]\u001b[A\n 37%|███▋ | 6625/18000 [00:05<00:09, 1232.15it/s]\u001b[A\n 37%|███▋ | 6749/18000 [00:05<00:09, 1200.64it/s]\u001b[A\n 38%|███▊ | 6870/18000 [00:05<00:09, 1201.28it/s]\u001b[A\n 39%|███▉ | 6991/18000 [00:05<00:09, 1189.44it/s]\u001b[A\n 40%|███▉ | 7111/18000 [00:05<00:09, 1191.96it/s]\u001b[A\n 40%|████ | 7232/18000 [00:06<00:09, 1193.46it/s]\u001b[A\n 41%|████ | 7353/18000 [00:06<00:08, 1196.36it/s]\u001b[A\n 42%|████▏ | 7478/18000 [00:06<00:08, 1209.26it/s]\u001b[A\n 42%|████▏ | 7602/18000 
[00:06<00:08, 1216.05it/s]\u001b[A\n 43%|████▎ | 7728/18000 [00:06<00:08, 1227.74it/s]\u001b[A\n 44%|████▎ | 7854/18000 [00:06<00:08, 1234.87it/s]\u001b[A\n 44%|████▍ | 7978/18000 [00:06<00:08, 1225.99it/s]\u001b[A\n 45%|████▌ | 8101/18000 [00:06<00:08, 1198.78it/s]\u001b[A\n 46%|████▌ | 8222/18000 [00:06<00:09, 1064.27it/s]\u001b[A\n 46%|████▋ | 8332/18000 [00:06<00:09, 1035.88it/s]\u001b[A\n 47%|████▋ | 8454/18000 [00:07<00:08, 1083.85it/s]\u001b[A\n 48%|████▊ | 8578/18000 [00:07<00:08, 1124.86it/s]\u001b[A\n 48%|████▊ | 8701/18000 [00:07<00:08, 1151.91it/s]\u001b[A\n 49%|████▉ | 8827/18000 [00:07<00:07, 1178.43it/s]\u001b[A\n 50%|████▉ | 8953/18000 [00:07<00:07, 1199.84it/s]\u001b[A\n 50%|█████ | 9079/18000 [00:07<00:07, 1214.03it/s]\u001b[A\n 51%|█████ | 9202/18000 [00:07<00:07, 1206.97it/s]\u001b[A\n 52%|█████▏ | 9324/18000 [00:07<00:07, 1202.22it/s]\u001b[A\n 52%|█████▏ | 9445/18000 [00:07<00:07, 1197.38it/s]\u001b[A\n 53%|█████▎ | 9567/18000 [00:08<00:07, 1202.52it/s]\u001b[A\n 54%|█████▍ | 9688/18000 [00:08<00:07, 1185.81it/s]\u001b[A\n 54%|█████▍ | 9807/18000 [00:08<00:06, 1185.09it/s]\u001b[A\n 55%|█████▌ | 9933/18000 [00:08<00:06, 1205.39it/s]\u001b[A\n 56%|█████▌ | 10057/18000 [00:08<00:06, 1214.92it/s]\u001b[A\n 57%|█████▋ | 10184/18000 [00:08<00:06, 1227.62it/s]\u001b[A\n 57%|█████▋ | 10307/18000 [00:08<00:06, 1220.90it/s]\u001b[A\n 58%|█████▊ | 10430/18000 [00:08<00:06, 1198.18it/s]\u001b[A\n 59%|█████▊ | 10550/18000 [00:08<00:06, 1193.52it/s]\u001b[A\n 59%|█████▉ | 10670/18000 [00:08<00:06, 1179.47it/s]\u001b[A\n 60%|█████▉ | 10789/18000 [00:09<00:06, 1171.99it/s]\u001b[A\n 61%|██████ | 10915/18000 [00:09<00:05, 1195.37it/s]\u001b[A\n 61%|██████▏ | 11036/18000 [00:09<00:05, 1199.32it/s]\u001b[A\n 62%|██████▏ | 11158/18000 [00:09<00:05, 1202.91it/s]\u001b[A\n 63%|██████▎ | 11279/18000 [00:09<00:05, 1165.83it/s]\u001b[A\n 63%|██████▎ | 11403/18000 [00:09<00:05, 1185.39it/s]\u001b[A\n 64%|██████▍ | 11525/18000 [00:09<00:05, 1191.24it/s]\u001b[A\n 65%|██████▍ | 11645/18000 [00:09<00:05, 1183.82it/s]\u001b[A\n 65%|██████▌ | 11764/18000 [00:09<00:05, 1169.47it/s]\u001b[A\n 66%|██████▌ | 11882/18000 [00:09<00:05, 1169.31it/s]\u001b[A\n 67%|██████▋ | 12001/18000 [00:10<00:05, 1174.34it/s]\u001b[A\n 67%|██████▋ | 12126/18000 [00:10<00:04, 1193.84it/s]\u001b[A\n 68%|██████▊ | 12247/18000 [00:10<00:04, 1196.16it/s]\u001b[A\n 69%|██████▊ | 12369/18000 [00:10<00:04, 1201.45it/s]\u001b[A\n 69%|██████▉ | 12491/18000 [00:10<00:04, 1205.84it/s]\u001b[A\n 70%|███████ | 12615/18000 [00:10<00:04, 1214.80it/s]\u001b[A\n 71%|███████ | 12737/18000 [00:10<00:04, 1166.26it/s]\u001b[A\n 71%|███████▏ | 12855/18000 [00:10<00:04, 1158.43it/s]\u001b[A\n 72%|███████▏ | 12972/18000 [00:10<00:04, 1116.91it/s]\u001b[A\n 73%|███████▎ | 13090/18000 [00:10<00:04, 1134.90it/s]\u001b[A\n 73%|███████▎ | 13210/18000 [00:11<00:04, 1152.86it/s]\u001b[A\n 74%|███████▍ | 13331/18000 [00:11<00:03, 1167.44it/s]\u001b[A\n 75%|███████▍ | 13449/18000 [00:11<00:03, 1170.98it/s]\u001b[A\n 75%|███████▌ | 13572/18000 [00:11<00:03, 1186.09it/s]\u001b[A\n 76%|███████▌ | 13697/18000 [00:11<00:03, 1200.23it/s]\u001b[A\n 77%|███████▋ | 13820/18000 [00:11<00:03, 1206.01it/s]\u001b[A\n 77%|███████▋ | 13941/18000 [00:11<00:03, 1189.26it/s]\u001b[A\n 78%|███████▊ | 14061/18000 [00:11<00:03, 1182.56it/s]\u001b[A\n 79%|███████▉ | 14180/18000 [00:11<00:03, 1140.16it/s]\u001b[A\n 79%|███████▉ | 14297/18000 [00:12<00:03, 1146.85it/s]\u001b[A\n 80%|████████ | 14421/18000 [00:12<00:03, 1172.69it/s]\u001b[A\n 81%|████████ | 14544/18000 
[00:12<00:02, 1188.64it/s]\u001b[A\n 81%|████████▏ | 14669/18000 [00:12<00:02, 1205.19it/s]\u001b[A\n 82%|████████▏ | 14790/18000 [00:12<00:02, 1174.99it/s]\u001b[A\n 83%|████████▎ | 14908/18000 [00:12<00:02, 1145.69it/s]\u001b[A\n 83%|████████▎ | 15023/18000 [00:12<00:02, 1134.13it/s]\u001b[A\n 84%|████████▍ | 15137/18000 [00:12<00:02, 1120.79it/s]\u001b[A\n 85%|████████▍ | 15250/18000 [00:12<00:02, 1107.78it/s]\u001b[A\n 85%|████████▌ | 15369/18000 [00:12<00:02, 1129.11it/s]\u001b[A\n 86%|████████▌ | 15493/18000 [00:13<00:02, 1159.30it/s]\u001b[A\n 87%|████████▋ | 15610/18000 [00:13<00:02, 1114.33it/s]\u001b[A\n 87%|████████▋ | 15732/18000 [00:13<00:01, 1142.74it/s]\u001b[A\n 88%|████████▊ | 15859/18000 [00:13<00:01, 1175.24it/s]\u001b[A\n 89%|████████▉ | 15979/18000 [00:13<00:01, 1181.05it/s]\u001b[A\n 89%|████████▉ | 16104/18000 [00:13<00:01, 1200.19it/s]\u001b[A\n 90%|█████████ | 16225/18000 [00:13<00:01, 1201.55it/s]\u001b[A\n 91%|█████████ | 16346/18000 [00:13<00:01, 1185.45it/s]\u001b[A\n 91%|█████████▏| 16465/18000 [00:13<00:01, 1178.94it/s]\u001b[A\n 92%|█████████▏| 16587/18000 [00:13<00:01, 1189.22it/s]\u001b[A\n 93%|█████████▎| 16713/18000 [00:14<00:01, 1209.39it/s]\u001b[A\n 94%|█████████▎| 16835/18000 [00:14<00:00, 1207.80it/s]\u001b[A\n 94%|█████████▍| 16958/18000 [00:14<00:00, 1212.98it/s]\u001b[A\n 95%|█████████▍| 17080/18000 [00:14<00:00, 1186.57it/s]\u001b[A\n 96%|█████████▌| 17199/18000 [00:14<00:00, 1172.81it/s]\u001b[A\n 96%|█████████▌| 17320/18000 [00:14<00:00, 1183.50it/s]\u001b[A\n 97%|█████████▋| 17444/18000 [00:14<00:00, 1199.52it/s]\u001b[A\n 98%|█████████▊| 17565/18000 [00:14<00:00, 1184.75it/s]\u001b[A\n 98%|█████████▊| 17684/18000 [00:14<00:00, 1172.91it/s]\u001b[A\n 99%|█████████▉| 17802/18000 [00:14<00:00, 1172.46it/s]\u001b[A\n100%|█████████▉| 17927/18000 [00:15<00:00, 1194.24it/s]\u001b[A\n100%|██████████| 18000/18000 [00:15<00:00, 1187.66it/s]\u001b[A"
],
[
"def rle_encode(im):\n '''\n im: numpy array, 1 - mask, 0 - background\n Returns run length as string formated\n '''\n pixels = im.flatten(order = 'F')\n pixels = np.concatenate([[0], pixels, [0]])\n runs = np.where(pixels[1:] != pixels[:-1])[0] + 1\n runs[1::2] -= runs[::2]\n return ' '.join(str(x) for x in runs)",
"_____no_output_____"
],
[
"preds_test = predict_result(model,x_test,img_size_target)",
"_____no_output_____"
],
[
"",
"_____no_output_____"
]
],
[
[
"# Save output to drive",
"_____no_output_____"
]
],
[
[
"from google.colab import drive\ndrive.mount('/content/gdrive')",
"_____no_output_____"
],
[
"!ls /content/gdrive/My\\ Drive/kaggle_competitions",
"catsvsdogs colab_test.py digit_recognizer kaggle.json tgs_salt\n"
],
[
"!cp model-tgs-salt-1.h5 /content/gdrive/My\\ Drive/kaggle_competitions/tgs_salt/\n!cp model-tgs-salt-2.h5 /content/gdrive/My\\ Drive/kaggle_competitions/tgs_salt/\n!cp submission.csv /content/gdrive/My\\ Drive/kaggle_competitions/tgs_salt/submission.csv",
"_____no_output_____"
],
[
"",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
e7a15a17f45056a3b0062bde59ed8b0f53c94263 | 258,527 | ipynb | Jupyter Notebook | 1-Lessons/Lesson19/Lab19/.src/Lab19_WS.ipynb | dustykat/engr-1330-psuedo-course | 3e7e31a32a1896fcb1fd82b573daa5248e465a36 | [
"CC0-1.0"
] | null | null | null | 1-Lessons/Lesson19/Lab19/.src/Lab19_WS.ipynb | dustykat/engr-1330-psuedo-course | 3e7e31a32a1896fcb1fd82b573daa5248e465a36 | [
"CC0-1.0"
] | null | null | null | 1-Lessons/Lesson19/Lab19/.src/Lab19_WS.ipynb | dustykat/engr-1330-psuedo-course | 3e7e31a32a1896fcb1fd82b573daa5248e465a36 | [
"CC0-1.0"
] | null | null | null | 119.08199 | 27,008 | 0.836764 | [
[
[
"# Laboratory 18: Linear Regression\n",
"_____no_output_____"
],
[
"## Full name: \n## R#: \n## HEX: \n## Title of the notebook\n## Date: ",
"_____no_output_____"
],
[
" <br>\n",
"_____no_output_____"
],
[
"#### The human brain is amazing and mysterious in many ways. Have a look at these sequences. You, with the assistance of your brain, can guess the next item in each sequence, right? <br>\n\n- A,B,C,D,E, ____ ?\n- 5,10,15,20,25, ____ ?\n- 2,4,8,16,32 ____ ?\n- 0,1,1,2,3, ____ ?\n- 1, 11, 21, 1211,111221, ____ ?\n\n <br>\n <br>\n <br>\n <br>\n <br>\n <br>\n <br>\n <br>\n <br>\n\n\n#### But how does our brain do this? How do we 'guess | predict' the next step? Is it that there is only one possible option? is it that we have the previous items? or is it the relationship between the items?<br>\n\n#### What if we have more than a single sequence? Maybe two sets of numbers? How can we predict the next \"item\" in a situation like that? <br>\n <br>\n#### Blue Points? Red Line? Fit? Does it ring any bells? <br>\n\n <br>\n\n---\n---\n---\n# Problem 1 (5 pts)\nThe table below contains some experimental observations.\n\n|Elapsed Time (s)|Speed (m/s)|\n|---:|---:|\n|0 |0|\n|1.0 |3|\n|2.0 |7|\n|3.0 |12|\n|4.0 |20|\n|5.0 |30|\n|6.0 | 45.6| \n|7.0 | 60.3 |\n|8.0 | 77.7 |\n|9.0 | 97.3 |\n|10.0| 121.1|\n\n1. Plot the speed vs time (speed on y-axis, time on x-axis) using a scatter plot. Use blue markers. \n2. Plot a red line on the scatterplot based on the linear model $f(x) = mx + b$ \n3. By trial-and-error find values of $m$ and $b$ that provide a good visual fit (i.e. makes the red line explain the blue markers).\n4. Using this data model estimate the speed at $t = 15~\\texttt{sec.}$\n---\n---\n---\n <br>\n\n\n\n",
"_____no_output_____"
],
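 [
  "#### Before the worked solution later in this lab, here is a minimal sketch of the trial-and-error fit that Problem 1 asks for. The values m = 12 and b = -17 are illustrative guesses only, not a graded answer: tweak them and re-run until the red line tracks the blue markers.",
  "_____no_output_____"
 ]
],
[
 [
  "# A minimal trial-and-error sketch for Problem 1 (illustrative guesses only).\nimport numpy as np\nimport matplotlib.pyplot as plt\n\nt = np.array([0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])\nv = np.array([0, 3, 7, 12, 20, 30, 45.6, 60.3, 77.7, 97.3, 121.2])\n\nm, b = 12.0, -17.0  # trial slope and intercept -- adjust and re-run\nplt.scatter(t, v, color='blue')      # observations (blue markers)\nplt.plot(t, m * t + b, color='red')  # linear data model f(x) = mx + b\nplt.xlabel('Elapsed Time (s)')\nplt.ylabel('Speed (m/s)')\nplt.show()\nprint('estimated speed at t = 15 s:', m * 15 + b)  # part 4 of the problem",
  "_____no_output_____"
 ]
],
[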
[
"### Let's go over some important terminology:\n\n Linear Regression:\n a basic predictive analytics technique that uses historical data to predict an output variable.\n\n The Predictor variable (input):\n the variable(s) that help predict the value of the output variable. It is commonly referred to as X.\n \n The Output variable:\n the variable that we want to predict. It is commonly referred to as Y.\n \n#### To estimate Y using linear regression, we assume the equation: $Ye = βX + α$\n*where Yₑ is the estimated or predicted value of Y based on our linear equation.* <br>\n\n#### Our goal is to find statistically significant values of the parameters α and β that minimise the difference between Y and Yₑ. If we are able to determine the optimum values of these two parameters, then we will have the line of best fit that we can use to predict the values of Y, given the value of X. <br>\n\n## So, how do we estimate α and β? <br>\n <br>\n\n#### We can use a method called Ordinary Least Squares (OLS). <br>\n <br>\n\n#### The objective of the least squares method is to find values of α and β that minimise the sum of the squared difference between Y and Yₑ (distance between the linear fit and the observed points). We will not go through the derivation here, but using calculus we can show that the values of the unknown parameters are as follows: <br>\n <br>\n#### where X̄ is the mean of X values and Ȳ is the mean of Y values. β is simply the covariance of X and Y (Cov(X, Y) devided by the variance of X (Var(X)). <br>\n\n Covariance:\n In probability theory and statistics, covariance is a measure of the joint variability of two random variables. If the greater values of one variable mainly correspond with the greater values of the other variable, and the same holds for the lesser values, (i.e., the variables tend to show similar behavior), the covariance is positive. In the opposite case, when the greater values of one variable mainly correspond to the lesser values of the other, (i.e., the variables tend to show opposite behavior), the covariance is negative. The sign of the covariance therefore shows the tendency in the linear relationship between the variables. The magnitude of the covariance is not easy to interpret because it is not normalized and hence depends on the magnitudes of the variables. The normalized version of the covariance, the correlation coefficient, however, shows by its magnitude the strength of the linear relation.\n <br>\n <br>\n \n The Correlation Coefficient:\n Correlation coefficients are used in statistics to measure how strong a relationship is between two variables. There are several types of correlation coefficient, but the most popular is Pearson’s. Pearson’s correlation (also called Pearson’s R) is a correlation coefficient commonly used in linear regression.Correlation coefficient formulas are used to find how strong a relationship is between data. The formulat for Pearson’s R:\n <br>\n \n The formulas return a value between -1 and 1, where:\n <br>\n\n 1 : A correlation coefficient of 1 means that for every positive increase in one variable, there is a positive increase of a fixed proportion in the other. For example, shoe sizes go up in (almost) perfect correlation with foot length.\n -1: A correlation coefficient of -1 means that for every positive increase in one variable, there is a negative decrease of a fixed proportion in the other. 
For example, the amount of gas in a tank decreases in (almost) perfect correlation with speed.\n 0 : Zero means that for every increase, there isn’t a positive or negative increase. The two just aren’t related.",
"_____no_output_____"
],
[
"### Example 1: Let's have a look at the Problem 1 from Exam II<br>\n\n#### We had a table of recoded times and speeds from some experimental observations:\n\n|Elapsed Time (s)|Speed (m/s)|\n|---:|---:|\n|0 |0|\n|1.0 |3|\n|2.0 |7|\n|3.0 |12|\n|4.0 |20|\n|5.0 |30|\n|6.0 | 45.6| \n|7.0 | 60.3 |\n|8.0 | 77.7 |\n|9.0 | 97.3 |\n|10.0| 121.1|",
"_____no_output_____"
],
[
"#### First let's create a dataframe:\n",
"_____no_output_____"
]
],
[
[
"# Load the necessary packages\nimport numpy as np\nimport pandas as pd\nimport statistics \nfrom matplotlib import pyplot as plt\n\n# Create a dataframe:\ntime = [0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]\nspeed = [0, 3, 7, 12, 20, 30, 45.6, 60.3, 77.7, 97.3, 121.2]\ndata = pd.DataFrame({'Time':time, 'Speed':speed})\ndata",
"_____no_output_____"
]
],
[
[
"#### Now, let's explore the data:\n",
"_____no_output_____"
]
],
[
[
"data.describe()",
"_____no_output_____"
],
[
"time_var = statistics.variance(time)\nspeed_var = statistics.variance(speed)\n\nprint(\"Variance of recorded times is \",time_var)\nprint(\"Variance of recorded times is \",speed_var)",
"Variance of recorded times is 11.0\nVariance of recorded times is 1697.7759999999998\n"
]
],
[
[
"#### Is there a relationship ( based on covariance, correlation) between time and speed?",
"_____no_output_____"
]
],
[
[
"# To find the covariance \ndata.cov() ",
"_____no_output_____"
],
[
"# To find the correlation among the columns \n# using pearson method \ndata.corr(method ='pearson') ",
"_____no_output_____"
]
],
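[
 [
  "#### As a quick sanity check, the sketch below computes Pearson's r for Time vs. Speed directly from its definition, r = Cov(X, Y) / (σX · σY), reusing the `time` and `speed` lists defined above. It should reproduce the pandas result (about 0.96).",
  "_____no_output_____"
 ]
],
[
 [
  "# Pearson's r from its definition: r = Cov(X, Y) / (std(X) * std(Y)).\n# Reuses the `time` and `speed` lists created earlier in this notebook.\nx = np.array(time)\ny = np.array(speed)\ncov_xy = ((x - x.mean()) * (y - y.mean())).sum() / (len(x) - 1)  # sample covariance\nr = cov_xy / (x.std(ddof=1) * y.std(ddof=1))                     # sample std devs\nprint('Pearson r =', r)  # should match data.corr() above",
  "_____no_output_____"
 ]
],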
[
[
"#### Let's do linear regression with primitive Python:\n#### To estimate \"y\" using the OLS method, we need to calculate \"xmean\" and \"ymean\", the covariance of X and y (\"xycov\"), and the variance of X (\"xvar\") before we can determine the values for alpha and beta. In our case, X is time and y is Speed.",
"_____no_output_____"
]
],
[
[
"# Calculate the mean of X and y\nxmean = np.mean(time)\nymean = np.mean(speed)\n\n# Calculate the terms needed for the numator and denominator of beta\ndata['xycov'] = (data['Time'] - xmean) * (data['Speed'] - ymean)\ndata['xvar'] = (data['Time'] - xmean)**2\n\n# Calculate beta and alpha\nbeta = data['xycov'].sum() / data['xvar'].sum()\nalpha = ymean - (beta * xmean)\nprint(f'alpha = {alpha}')\nprint(f'beta = {beta}')\n",
"alpha = -16.78636363636363\nbeta = 11.977272727272727\n"
]
],
[
[
"#### We now have an estimate for alpha and beta! Our model can be written as Yₑ = 11.977 X -16.786, and we can make predictions:",
"_____no_output_____"
]
],
[
[
"X = np.array(time)\n\nypred = alpha + beta * X\nprint(ypred)",
"[-16.78636364 -4.80909091 7.16818182 19.14545455 31.12272727\n 43.1 55.07727273 67.05454545 79.03181818 91.00909091\n 102.98636364]\n"
]
],
[
[
"#### Let’s plot our prediction ypred against the actual values of y, to get a better visual understanding of our model:",
"_____no_output_____"
]
],
[
[
"# Plot regression against actual data\nplt.figure(figsize=(12, 6))\nplt.plot(X, ypred, color=\"red\") # regression line\nplt.plot(time, speed, 'ro', color=\"blue\") # scatter plot showing actual data\nplt.title('Actual vs Predicted')\nplt.xlabel('Time (s)')\nplt.ylabel('Speed (m/s)')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"#### The red line is our line of best fit, Yₑ = 11.977 X -16.786. We can see from this graph that there is a positive linear relationship between X and y. Using our model, we can predict y from any values of X! <br>\n#### For example, if we had a value X = 20, we can predict that:",
"_____no_output_____"
]
],
[
[
"ypred_20 = alpha + beta * 20\nprint(ypred_20)",
"222.7590909090909\n"
]
],
[
[
"#### Linear Regression with statsmodels:\n#### First, we use statsmodels’ ols function to initialise our simple linear regression model. This takes the formula y ~ X, where X is the predictor variable (Time) and y is the output variable (Speed). Then, we fit the model by calling the OLS object’s fit() method.",
"_____no_output_____"
]
],
[
[
"import statsmodels.formula.api as smf\n\n# Initialise and fit linear regression model using `statsmodels`\nmodel = smf.ols('Speed ~ Time', data=data)\nmodel = model.fit()",
"_____no_output_____"
]
],
[
[
"#### We no longer have to calculate alpha and beta ourselves as this method does it automatically for us! Calling model.params will show us the model’s parameters:",
"_____no_output_____"
]
],
[
[
"model.params",
"_____no_output_____"
]
],
[
[
"#### In the notation that we have been using, α is the intercept and β is the slope i.e. α =-16.786364 and β = 11.977273.",
"_____no_output_____"
]
],
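[
 [
  "#### Beyond the parameters, the fitted statsmodels results object also reports goodness-of-fit diagnostics; a short sketch:",
  "_____no_output_____"
 ]
],
[
 [
  "# Goodness-of-fit diagnostics from the fitted OLS results object.\nprint(model.rsquared)   # R-squared: fraction of Speed variance explained by Time\nprint(model.summary())  # full table: coefficients, std errors, p-values, R-squared",
  "_____no_output_____"
 ]
],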
[
[
"# Predict values\nspeed_pred = model.predict()\n\n# Plot regression against actual data\nplt.figure(figsize=(12, 6))\nplt.plot(data['Time'], data['Speed'], 'o') # scatter plot showing actual data\nplt.plot(data['Time'], speed_pred, 'r', linewidth=2) # regression line\nplt.xlabel('Time (s)')\nplt.ylabel('Speed (m/s)')\nplt.title('model vs observed')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"#### How good do you feel about this predictive model? Will you trust it?",
"_____no_output_____"
],
[
"### Example 2: Advertising and Sells! <br>\n#### This is a classic regression problem. we have a dataset of the spendings on TV, Radio, and Newspaper advertisements and number of sales for a specific product. We are interested in exploring the relationship between these parameters and answering the following questions:\n- Can TV advertising spending predict the number of sales for the product?\n- Can Radio advertising spending predict the number of sales for the product?\n- Can Newspaper advertising spending predict the number of sales for the product?\n- Can we use the three of them to predict the number of sales for the product? | Multiple Linear Regression Model\n- Which parameter is a better predictor of the number of sales for the product?",
"_____no_output_____"
]
],
[
[
"# Import and display first rows of the advertising dataset\ndf = pd.read_csv('advertising.csv')\ndf.head()",
"_____no_output_____"
],
[
"# Describe the df\ndf.describe()",
"_____no_output_____"
],
[
"tv = np.array(df['TV'])\nradio = np.array(df['Radio'])\nnewspaper = np.array(df['Newspaper'])\nnewspaper = np.array(df['Sales'])\n",
"_____no_output_____"
],
[
"# Get Variance and Covariance - What can we infer?\ndf.cov()",
"_____no_output_____"
],
[
"# Get Correlation Coefficient - What can we infer?\ndf.corr(method ='pearson') ",
"_____no_output_____"
],
[
"# Answer the first question: Can TV advertising spending predict the number of sales for the product?\nimport statsmodels.formula.api as smf\n\n# Initialise and fit linear regression model using `statsmodels`\nmodel = smf.ols('Sales ~ TV', data=df)\nmodel = model.fit()\nprint(model.params)",
"Intercept 7.032594\nTV 0.047537\ndtype: float64\n"
],
[
"# Predict values\nTV_pred = model.predict()\n\n# Plot regression against actual data - What do we see?\nplt.figure(figsize=(12, 6))\nplt.plot(df['TV'], df['Sales'], 'o') # scatter plot showing actual data\nplt.plot(df['TV'], TV_pred, 'r', linewidth=2) # regression line\nplt.xlabel('TV advertising spending')\nplt.ylabel('Sales')\nplt.title('Predicting with TV spendings only')\n\nplt.show()",
"_____no_output_____"
],
[
"# Answer the second question: Can Radio advertising spending predict the number of sales for the product?\nimport statsmodels.formula.api as smf\n\n# Initialise and fit linear regression model using `statsmodels`\nmodel = smf.ols('Sales ~ Radio', data=df)\nmodel = model.fit()\nprint(model.params)",
"Intercept 9.311638\nRadio 0.202496\ndtype: float64\n"
],
[
"# Predict values\nRADIO_pred = model.predict()\n\n# Plot regression against actual data - What do we see?\nplt.figure(figsize=(12, 6))\nplt.plot(df['Radio'], df['Sales'], 'o') # scatter plot showing actual data\nplt.plot(df['Radio'], RADIO_pred, 'r', linewidth=2) # regression line\nplt.xlabel('Radio advertising spending')\nplt.ylabel('Sales')\nplt.title('Predicting with Radio spendings only')\n\nplt.show()",
"_____no_output_____"
],
[
"# Answer the third question: Can Newspaper advertising spending predict the number of sales for the product?\nimport statsmodels.formula.api as smf\n\n# Initialise and fit linear regression model using `statsmodels`\nmodel = smf.ols('Sales ~ Newspaper', data=df)\nmodel = model.fit()\nprint(model.params)",
"Intercept 12.351407\nNewspaper 0.054693\ndtype: float64\n"
],
[
"# Predict values\nNP_pred = model.predict()\n\n# Plot regression against actual data - What do we see?\nplt.figure(figsize=(12, 6))\nplt.plot(df['Newspaper'], df['Sales'], 'o') # scatter plot showing actual data\nplt.plot(df['Newspaper'], NP_pred, 'r', linewidth=2) # regression line\nplt.xlabel('Newspaper advertising spending')\nplt.ylabel('Sales')\nplt.title('Predicting with Newspaper spendings only')\n\nplt.show()",
"_____no_output_____"
],
[
"# Answer the fourth question: Can we use the three of them to predict the number of sales for the product?\n# This is a case of multiple linear regression model. This is simply a linear regression model with more than one predictor:\n# and is modelled by: Yₑ = α + β₁X₁ + β₂X₂ + … + βₚXₚ , where p is the number of predictors.\n# In this case: Sales = α + β1*TV + β2*Radio + β3*Newspaper\n# Multiple Linear Regression with scikit-learn:\nfrom sklearn.linear_model import LinearRegression\n\n# Build linear regression model using TV,Radio and Newspaper as predictors\n# Split data into predictors X and output Y\npredictors = ['TV', 'Radio', 'Newspaper']\nX = df[predictors]\ny = df['Sales']\n\n# Initialise and fit model\nlm = LinearRegression()\nmodel = lm.fit(X, y)",
"_____no_output_____"
],
[
"print(f'alpha = {model.intercept_}')\nprint(f'betas = {model.coef_}')",
"alpha = 2.9388893694594085\nbetas = [ 0.04576465 0.18853002 -0.00103749]\n"
],
[
"# Therefore, our model can be written as:\n#Sales = 2.938 + 0.046*TV + 0.1885*Radio -0.001*Newspaper\n# we can predict sales from any combination of TV and Radio and Newspaper advertising costs! \n#For example, if we wanted to know how many sales we would make if we invested \n# $300 in TV advertising and $200 in Radio advertising and $50 in Newspaper advertising\n#all we have to do is plug in the values:\nnew_X = [[300, 200,50]]\nprint(model.predict(new_X))",
"[54.32241174]\n"
],
[
"# Answer the final question : Which parameter is a better predictor of the number of sales for the product?\n# How can we answer that?\n# WHAT CAN WE INFER FROM THE BETAs ?\n",
"_____no_output_____"
]
],
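[
 [
  "#### A note before answering: the raw betas above are not directly comparable, because TV, Radio, and Newspaper spending sit on different scales. One common approach -- sketched below, and not the only valid one -- is to refit on standardized (z-scored) predictors so each beta reads as 'sales per one standard deviation of spending'.",
  "_____no_output_____"
 ]
],
[
 [
  "# Refit on z-scored predictors so the betas share one scale.\n# This is a sketch of one common comparison; p-values and domain knowledge\n# should also inform any 'best predictor' conclusion.\nX_std = (X - X.mean()) / X.std()   # X is the predictor DataFrame from above\nlm_std = LinearRegression().fit(X_std, y)\nfor name, coef in zip(predictors, lm_std.coef_):\n    print(name, round(coef, 3))",
  "_____no_output_____"
 ]
],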
[
[
" <br>\n\n*This notebook was inspired by a several blogposts including:* \n- __\"Introduction to Linear Regression in Python\"__ by __Lorraine Li__ available at* https://towardsdatascience.com/introduction-to-linear-regression-in-python-c12a072bedf0 <br>\n- __\"In Depth: Linear Regression\"__ available at* https://jakevdp.github.io/PythonDataScienceHandbook/05.06-linear-regression.html <br>\n- __\"A friendly introduction to linear regression (using Python)\"__ available at* https://www.dataschool.io/linear-regression-in-python/ <br>\n\n*Here are some great reads on linear regression:* \n- __\"Linear Regression in Python\"__ by __Sadrach Pierre__ available at* https://towardsdatascience.com/linear-regression-in-python-a1d8c13f3242 <br>\n- __\"Introduction to Linear Regression in Python\"__ available at* https://cmdlinetips.com/2019/09/introduction-to-linear-regression-in-python/ <br>\n- __\"Linear Regression in Python\"__ by __Mirko Stojiljković__ available at* https://realpython.com/linear-regression-in-python/ <br>\n\n*Here are some great videos on linear regression:* \n- __\"StatQuest: Fitting a line to data, aka least squares, aka linear regression.\"__ by __StatQuest with Josh Starmer__ available at* https://www.youtube.com/watch?v=PaFPbb66DxQ&list=PLblh5JKOoLUIzaEkCLIUxQFjPIlapw8nU <br>\n- __\"Statistics 101: Linear Regression, The Very Basics\"__ by __Brandon Foltz__ available at* https://www.youtube.com/watch?v=ZkjP5RJLQF4 <br>\n- __\"How to Build a Linear Regression Model in Python | Part 1\" and 2,3,4!__ by __Sigma Coding__ available at* https://www.youtube.com/watch?v=MRm5sBfdBBQ <br>",
"_____no_output_____"
],
[
"### Exercise 1: In the \"CarsDF.csv\" file, you will find a dataset with information about cars and motorcycles including thier age, kilometers driven (mileage), fuel economy, enginer power, engine volume, and selling price. Follow the steps and answer the questions. <br>\n\n- Step1: Read the \"CarsDF.csv\" file as a dataframe. Explore the dataframe and in a markdown cell breifly describe it in your own words. <br>\n- Step2: Calculate and compare the correlation coefficient of the \"selling price\" with all the other parameters (execpt for \"name\", of course!). In a markdown cell, explain the results and state which parameters have the strongest and weakest relationship with \"selling price\" of a vehicle. \n- Step3: Use linear regression modeling in primitive python and VISUALLY assess the quality of a linear fit with Age as the predictor, and selling price as outcome. Explain the result of this analysis in a markdown cell.\n- Step4: Use linear regression modeling with statsmodels and VISUALLY assess the quality of a linear fit with fuel economy as the predictor, and selling price as outcome. Explain the result of this analysis in a markdown cell.\n- Step5: Use linear regression modeling with statsmodels and VISUALLY assess the quality of a linear fit with engine volume as the predictor, and selling price as outcome. Explain the result of this analysis in a markdown cell.\n- Step6: In a markdown cell, explain which of the three predictors in steps 3,4, and 5, was a better predictor (resulted in a better fit ) for selling price?\n- Step7: Use multiple linear regression modeling with scikit-learn and use all the parameters (execpt for \"name\", of course!) to predict selling price. Then, use this model to predict the selling price of a car that has the following charactristics and decide whether this prediction is reliable in your opinion: \n - 2 years old\n - has gone 17000 km\n - has fuel economy measure of 24.2 kmpl\n - has an engine power of 74 bhp\n - has en engine volume of 1260 CC",
"_____no_output_____"
]
],
[
[
"# Step1:\nvdf = pd.read_csv('CarsDF.csv')\nvdf.head()",
"_____no_output_____"
],
[
"vdf.describe()",
"_____no_output_____"
]
],
[
[
"On Step1: [Double-Click to edit]",
"_____no_output_____"
]
],
[
[
"# Step2:.\nvdf.corr()",
"_____no_output_____"
]
],
[
[
"On Step2: [Double-Click to edit]",
"_____no_output_____"
]
],
[
[
"#Step3:\n# Calculate the mean of X and y\nxmean = np.mean(vdf['Age'])\nymean = np.mean(vdf['selling_price'])\n\n# Calculate the terms needed for the numator and denominator of beta\nvdf['xycov'] = (vdf['Age'] - xmean) * (vdf['selling_price'] - ymean)\nvdf['xvar'] = (vdf['Age'] - xmean)**2\n\n# Calculate beta and alpha\nbeta = vdf['xycov'].sum() / vdf['xvar'].sum()\nalpha = ymean - (beta * xmean)\nprint(f'alpha = {alpha}')\nprint(f'beta = {beta}')\n",
"alpha = 1172124.2284155204\nbeta = -86818.48239180523\n"
],
[
"X = np.array(vdf['Age'])\nY = np.array(vdf['selling_price'])\n\nypred = alpha + beta * X\n# Plot regression against actual data\nplt.figure(figsize=(12, 6))\nplt.plot(X, Y, 'ro', color=\"blue\") # scatter plot showing actual data\nplt.plot(X, ypred, color=\"red\") # regression line\nplt.title('Actual vs Predicted')\nplt.xlabel('Age')\nplt.ylabel('selling price')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"On Step3: [Double-Click to edit]",
"_____no_output_____"
]
],
[
[
"# Step4:\nimport statsmodels.formula.api as smf\n\n# Initialise and fit linear regression model using `statsmodels`\nmodel = smf.ols('selling_price ~ FuelEconomy_kmpl', data=vdf)\nmodel = model.fit()\nmodel.params",
"_____no_output_____"
],
[
"# Predict values\nFE_pred = model.predict()\n\n# Plot regression against actual data\nplt.figure(figsize=(12, 6))\nplt.plot(vdf['FuelEconomy_kmpl'], vdf['selling_price'], 'o') # scatter plot showing actual data\nplt.plot(vdf['FuelEconomy_kmpl'], FE_pred, 'r', linewidth=2) # regression line\nplt.xlabel('FuelEconomy_kmpl')\nplt.ylabel('selling price')\nplt.title('model vs observed')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"On Step4: [Double-Click to edit]",
"_____no_output_____"
]
],
[
[
"# Step5:\nimport statsmodels.formula.api as smf\n\n# Initialise and fit linear regression model using `statsmodels`\nmodel = smf.ols('selling_price ~ engine_v', data=vdf)\nmodel = model.fit()\nmodel.params",
"_____no_output_____"
],
[
"# Predict values\nEV_pred = model.predict()\n\n# Plot regression against actual data\nplt.figure(figsize=(12, 6))\nplt.plot(vdf['engine_v'], vdf['selling_price'], 'o') # scatter plot showing actual data\nplt.plot(vdf['engine_v'], EV_pred, 'r', linewidth=2) # regression line\nplt.xlabel('engine_v')\nplt.ylabel('selling price')\nplt.title('model vs observed')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"On Step5: [Double-Click to edit]",
"_____no_output_____"
],
[
"On Step6: [Double-Click to edit]",
"_____no_output_____"
]
],
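[
 [
  "#### One way to make the Step 6 comparison quantitative (a sketch, assuming the `vdf` dataframe from Step 1 is still in memory) is to check the R-squared of each single-predictor fit:",
  "_____no_output_____"
 ]
],
[
 [
  "# R-squared for each single-predictor model of selling_price (Steps 3-5).\n# Assumes the `vdf` dataframe loaded in Step 1.\nfor col in ['Age', 'FuelEconomy_kmpl', 'engine_v']:\n    fit = smf.ols(f'selling_price ~ {col}', data=vdf).fit()\n    print(col, 'R-squared:', round(fit.rsquared, 3))",
  "_____no_output_____"
 ]
],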
[
[
"#Step7:\n# Multiple Linear Regression with scikit-learn:\nfrom sklearn.linear_model import LinearRegression\n\n# Build linear regression model using TV,Radio and Newspaper as predictors\n# Split data into predictors X and output Y\npredictors = ['Age', 'km_driven', 'FuelEconomy_kmpl','engine_p','engine_v']\nX = vdf[predictors]\ny = vdf['selling_price']\n\n# Initialise and fit model\nlm = LinearRegression()\nmodel = lm.fit(X, y)\nprint(f'alpha = {model.intercept_}')\nprint(f'betas = {model.coef_}')",
"alpha = -349182.4300874341\nbetas = [-6.39699185e+04 -2.61785873e+00 6.27862738e+03 4.17269445e+03\n 7.62703638e+02]\n"
],
[
"new_X = [[2, 17000,24.2,74,1260]]\nprint(model.predict(new_X))\n",
"[900102.89014124]\n"
]
],
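[
 [
  "#### Before trusting the Step 7 prediction, a quick sanity check is the model's R-squared on the data it was fit to. A held-out train/test split would be a stronger check; this sketch only scores the training data:",
  "_____no_output_____"
 ]
],
[
 [
  "# R-squared of the multiple regression on its training data.\n# (A held-out test split would give a less optimistic estimate.)\nprint('R-squared:', model.score(X, y))",
  "_____no_output_____"
 ]
],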
[
[
"On Step7: [Double-Click to edit]",
"_____no_output_____"
],
[
" <br>\n",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
]
] |
e7a15a6c40c23fec9ed3a9ca2c0425f7d5c5ab75 | 4,283 | ipynb | Jupyter Notebook | demo_control_side_sep_16.ipynb | mengzaiqiao/TVBR | cdac86a753c41f8f3c55a025be8d88dd305325f5 | [
"MIT"
] | null | null | null | demo_control_side_sep_16.ipynb | mengzaiqiao/TVBR | cdac86a753c41f8f3c55a025be8d88dd305325f5 | [
"MIT"
] | null | null | null | demo_control_side_sep_16.ipynb | mengzaiqiao/TVBR | cdac86a753c41f8f3c55a025be8d88dd305325f5 | [
"MIT"
] | null | null | null | 31.036232 | 223 | 0.519262 | [
[
[
"### Import packages",
"_____no_output_____"
]
],
[
[
"import os\nimport sys\nimport time\nfrom datetime import datetime\n\nimport GPUtil\nimport psutil\n\n#######################\n# run after two days\n# time.sleep(172800)\n\n#######################\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"0,1,2,3\"\nsys.path.append(\"../\")",
"_____no_output_____"
],
[
"def gpu_free(max_gb):\n gpu_id = GPUtil.getFirstAvailable(\n order=\"memory\"\n ) # get the fist gpu with the lowest load\n GPU = GPUtil.getGPUs()[gpu_id[0]]\n GPU_load = GPU.load * 100\n GPU_memoryUtil = GPU.memoryUtil / 2.0 ** 10\n GPU_memoryTotal = GPU.memoryTotal / 2.0 ** 10\n GPU_memoryUsed = GPU.memoryUsed / 2.0 ** 10\n GPU_memoryFree = GPU.memoryFree / 2.0 ** 10\n print(\n \"-- total_GPU_memory: %.3fGB;init_GPU_memoryFree:%.3fGB init_GPU_load:%.3f%% GPU_memoryUtil:%d%% GPU_memoryUsed:%.3fGB\"\n % (GPU_memoryTotal, GPU_memoryFree, GPU_load, GPU_memoryUtil, GPU_memoryUsed)\n )\n if GPU_memoryFree > max_gb:\n return True\n return False\n\n\ndef memery_free(max_gb):\n available_memory = psutil.virtual_memory().free / 2.0 ** 30\n if available_memory > max_gb:\n return True\n return False",
"_____no_output_____"
],
[
"for item_fea_type in [\n \"random\",\n \"cate\",\n \"cate_word2vec\",\n \"cate_bert\",\n \"cate_one_hot\",\n \"random_word2vec\",\n \"random_bert\",\n \"random_one_hot\",\n \"random_bert_word2vec_one_hot\",\n \"random_cate_word2vec\",\n \"random_cate_bert\",\n \"random_cate_one_hot\",\n \"random_cate_bert_word2vec_one_hot\",\n]:\n while True:\n if gpu_free(4) and memery_free(10):\n os.environ[\"CUDA_VISIBLE_DEVICES\"] = \"0,1,2,3\"\n gpu_id = GPUtil.getAvailable(order=\"memory\", limit=4)[\n 0\n ] # get the fist gpu with the lowest load\n print(\"GPU memery and main memery availale, start a job\")\n date_time = datetime.now().strftime(\"%Y_%m_%d_%H_%M_%S\")\n command = f\"CUDA_VISIBLE_DEVICES=0,1,2,3; /home/zm324/anaconda3/envs/beta_rec/bin/python run_tvbr.py --item_fea_type {item_fea_type} --device cuda:{gpu_id} >> ./logs/{date_time}_{item_fea_type}.log &\"\n os.system(command)\n time.sleep(120)\n break\n else:\n print(\"GPU not availale, sleep for 10 min\")\n time.sleep(600)\n continue",
"-- total_GPU_memory: 10.761GB;init_GPU_memoryFree:10.760GB init_GPU_load:0.000% GPU_memoryUtil:0% GPU_memoryUsed:0.001GB\nGPU memery and main memery availale, start a job\n-- total_GPU_memory: 10.761GB;init_GPU_memoryFree:10.757GB init_GPU_load:0.000% GPU_memoryUtil:0% GPU_memoryUsed:0.004GB\nGPU memery and main memery availale, start a job\n"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code"
]
] |